October 2004 // Volume 42 // Number 5 // Feature Articles // 5FEA8


Evaluating Software Development: A Case Study with Pasture Land Management (PLMS) Grazing Software

Abstract
A process for evaluating and improving public domain software is presented for agents and faculty who author software and Web-based training. Extension, education, and conservation employees participated in workshops to learn about a Pasture Land Management System software program that enables farmers to experiment with alternative grazing methods. Users were questioned at initial workshop training and again 6 months later. The workshop evaluation showed concern about the software complexity. The follow-up questionnaire revealed the respondents' priorities for technical improvements. The authors used the participants' feedback to evaluate existing problems and prioritize improvements in the usability and functionality of the software.


John M. Galbraith
Assistant Professor
Crop and Soil Environmental Sciences Department
John.Galbraith@vt.edu

Gordon E. Groover
Extension Economist, Farm Management
Agricultural and Applied Economics Department

Franklin A. (Lex) Bruce, Jr.
Extension Specialist
Agric. & Extension Education Department

Nicholas D. Stone
Associate Professor
International Institute for Information Technology

Gordon B. (Brinkley) Benson
Research Associate
Virginia Polytechnic Institute and State University
Blacksburg, Virginia


Introduction

Controlled or rotational grazing has been widely recognized among educators, Extension agents, USDA-Natural Resource Conservation Service (NRCS), and Soil and Water Conservation District employees as a management strategy that provides benefits to farmers and society through profitable and sound ecological management of grazing land and livestock. The economic benefits of controlled over continuous grazing at high stocking rates include improved productivity and harvest efficiency, improved forage and pasture quality (Dalrymple, Rogers, & Ingram, 1996; Hoveland, McCann, & Hill, 1997; Walton, Martinez, & Bailey, 1981), and more uniform distribution and recycling of animal waste (Joost, 1997). Controlled grazing also lowers the risk of soil erosion and nutrient runoff into surface water compared to continuous grazing at high stocking rates (Faulkner & Boyer, 1993; Faulkner, Kinvig, & Boyer, 1994; Faulkner, Boyer, & Dalton, 2000).

Despite the broad range of benefits described from the use of controlled grazing, only a small number of producers have adopted it. In Virginia, just 5% of all beef cattle operations (Virginia Forage and Grasslands Council, 1998) and 11% of all dairies currently use management-intensive rotational grazing (Groover, 1998). Controlled grazing has not been widely accepted because it is difficult for some producers to plan and manage, and because of uncertainty about the initial investment required to convert a farm to a controlled grazing system.

Recently, decision support system (DSS) software programs have made planning easier and allowed users to test potential management benefits without making capital investments. Producers who use DSS computer programs can improve their economic efficiency, easily evaluate complex decisions, and benefit from appropriate use of science-based information.

Developing public domain DSS software is difficult because short usability-testing periods preclude the collection of meaningful user feedback about the user-friendliness, functionality, accuracy, and potential acceptance of the software. Many public domain DSS software prototypes are developed under short-term funding contracts that do not allow enough time to identify weaknesses and implement appropriate modifications. Funding for marketing, sales, beta-version testing, and distribution studies is seldom included in grants used to develop DSS software.

The adoption of any DSS software depends on how easy it is to learn and use, its reliability and technical accuracy, the likelihood that it will receive long-term development and technical support, and its cost compared to the benefits it provides. The software must also fill users' needs that are not being met by a competing DSS. Incorporating user input during the initial stages of software development increases the likelihood that the final product will be adopted and will meet the needs of its users.

While a number of beef and dairy grazing management software packages are available from commercial sources and academic institutions in the U.S., none has risen to become an industry leader. The lack of success for public domain software seems to be due to the software makers' failure to meet user needs, failure to provide programs that work outside of specific applications or regions, or lack of sustained funding for maintenance and improvement.

Government agencies are reluctant to pay to collect the user feedback that can be critical for software acceptance, and they do not allow the software products to be sold for profit. The lack of continued income limits the options for improving software after the initial distribution and makes collecting user feedback during the development stage even more critical.

The Pasture Land Management System (PLMS) (Information Systems and Insect Studies, 2002) is a DSS software program that has been in development since 1998, with funding by the Environmental Protection Agency's (EPA) Ruminant Livestock Efficiency Program (RLEP) and Sustainable Agricultural Research and Education Program (SARE). A partnership between Virginia Polytechnic Institute and State University (Virginia Tech) and NRCS provided the knowledge base and design specification for the program. PLMS is also an educational program that allows users to compare and contrast alternative management strategies by showing visually the relationships between forage supply and demand and the effects of changes on profitability and efficiency (Stone, Benson, Groover, Venuto, & Cline, 2000).

The authors of PLMS believe that evaluations after initial training and subsequent software use can provide important information for identifying training and program strengths and weaknesses that would not be available through conventional software development methods. This article presents, for consideration by other public domain software developers, a case study of an evaluation process that collected pertinent information about the PLMS software from participants at training workshops and again 6 months later, after the participants had time to test the software with potential users.

Methods and Materials

Two training sessions for using the PLMS software were conducted in December 2001 and January 2002 at Virginia Tech. Forty-four Extension agents, educators, and conservationists from Pennsylvania, West Virginia, Virginia, and North Carolina participated. Session activities and instructional resources included in each workshop were the PLMS prototype software, two case studies, climate and Geographic Information Systems (GIS) maps, and a user's guide. Participants received instruction on how to use the discussion forum and bug report sites on the PLMS Web site and how to download the training materials and the user's guide.

Workshop activities included hands-on instruction consisting of program theory and background assumptions, data sources and input, downloading and installation practice, basic program operation, and case studies of actual beef, dairy, and stocker farms. All participants were asked to design a new farm plan/grazing system and to present and discuss the results with other participants. Participants were asked to complete a Web-based questionnaire before leaving the workshop to provide feedback on the instructional techniques used in the workshop and the instructors' ability to communicate important details involved with using PLMS.

A follow-up questionnaire was developed and sent to workshop participants 6 months after they had completed their respective workshops. The questionnaire was aimed at assessing how much the participants had used the PLMS system and/or Web site after the initial training and, more particularly, to gather input from the participants to prioritize shortfalls and anticipated needs of the overall PLMS system and training.

Results and Discussion

Instructional Evaluations

Workshop participants identified their roles relating to working with farmers and forage/animal systems as 47% "Education-teaching principles," 36% "Service-assisting design and implementation," and 17% "Administration of Programs and Compliance." Overall, the participants felt that the training they received was very good to excellent and that the instructors were well prepared and very knowledgeable (Table 1). Comments concerning training weaknesses and program difficulty for first-time participants were offset by positive responses with almost opposite opinions (Table 2).

Table 1.
Training Session Evaluation Results (Scale is from 1 to 5, where 1 = "Excellent" and 5 = "Poor".)

Mean | C.V.¹ | Question
1.58 | 0.32 | How would you rate the organization of the presentations?
1.37 | 0.36 | The instructors' knowledge of the subject seemed to be...
1.47 | 0.35 | The instructors' ability to explain information clearly was...
1.11 | 0.29 | The instructors' attitude toward the participants was...
1.58 | 0.32 | I rate the quality of reference materials presented as...
1.32 | 0.36 | The availability of individual help was...
1.53 | 0.34 | Overall, I considered this training session to be...
4.05 | 0.19 | PLMS is too complicated for the work I am asked to perform.
1.89 | 0.35 | PLMS will help me educate farmers about design and management of forage/animal systems.
2.78 | 0.32 | PLMS will reduce the time I spend designing forage system for livestock producers.
2.58 | 0.42 | PLMS would be a tool that farmers would routinely use to help design and implement a new grazing system.
2.21 | 0.29 | Having completed the PLMS training, I am confident that I can use PLMS to help farmers evaluate grazing and forage management alternatives.

¹ Coefficient of variation
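As a point of reference for the Mean and C.V. columns in Table 1, the sketch below shows one way such summary statistics can be computed. It is illustrative only, not the authors' analysis code, and the ratings list is hypothetical rather than actual survey data.

```python
# Minimal sketch (illustrative only): computing the mean and coefficient of
# variation for one Table 1 question, assuming the 1-5 workshop ratings are
# stored as a simple list of integers. The ratings below are hypothetical.
from statistics import mean, stdev

# Hypothetical responses on the 1 ("Excellent") to 5 ("Poor") scale.
ratings = [1, 2, 1, 1, 2, 3, 1, 2]

m = mean(ratings)
cv = stdev(ratings) / m  # coefficient of variation = standard deviation / mean

print(f"Mean = {m:.2f}, C.V. = {cv:.2f}")
```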

 

Table 2.
Positive and Negative Training Session Evaluation Comments

Question 1 - What were the most negative aspects of the training?

Responses to Question 1

  • "Multiple needs of audience; NRCS needs one thing, Extension needs something else..."
  • "This is a BIG program! Going to take some time to get comfortable with it!"
  • "After the enhancements and changes have been implemented, I cannot see any negatives."
  • "Bugs still need to be worked out-though it's hard to find the bugs until you have multiple people working with the program. This wasn't really a negative aspect."

Question 2 - What were the most positive aspects of the training?

Responses to Question 2

  • "Very easy to understand. Appears easy to use with some training. Good that maps are incorporated, makes it easy to show farmer what's going on."
  • "Easy to use program. Good computer lab. I see potential benefits for current systems that are not set up ideally (in addition to new systems). I will be able to help producers make changes based on actual field info." 
  • "I think this will be a good tool to use to set up pasture based programs."
  • "The most positive aspect was that the program has the potential to be used to help design grazing systems. Also, apparently to is possible to expand it as GPS data becomes available. I think the development of this software shows a lot of effort and ingenuity."

Follow-up Questionnaire

Participants were mailed a questionnaire about 6 months after participating in a workshop. The questionnaire included questions pertaining to using the PLMS Web site and user's manual, general use and application of the PLMS program, PLMS functional problems, and opinions about the PLMS system in general.

Questionnaire Response Rate

Nineteen of the 43 workshop participants returned usable questionnaires for a response rate of 44%. It was assumed that the 24 participants not returning questionnaires were uninterested in the PLMS System and would not be using it. Therefore, a response rate of 44% seemed acceptable in the attempt to gather additional information after the training sessions.

Using the PLMS Web Site

The PLMS Web site provided 1) a discussion forum; 2) a frequently asked questions section; 3) a bug report and change request page; 4) resources for PLMS training; and 5) a place to post suggestions and/or problems on the Bug Report and Change Request page. Most of the participants (72%) said that they had visited the Web site, an average of a little over 5 times (one participant had visited it 10 times). The second most visited section of the Web site was the Resources for PLMS Training page, which was visited by 44% of the participants (Table 3).

Table 3.
Questionnaire Results Concerning Use of the PLMS Web Site

Since your training in Blacksburg, have you: | N¹ | No | Yes | If yes: Mean (SD) | Min. | Max. | N¹
a. Visited the PLMS Web site? | 18 | 5 (28%) | 13 (72%) | 5.27 (2.9) | 2 | 10 | 11
b. Visited the Web site's Discussion Forum? | 18 | 15 (83%) | 3 (17%) | 2 (1) | 1 | 3 | 3
c. Visited the Web site's Frequently Asked Questions section? | 16 | 12 (75%) | 4 (25%) | 1.75 (0.96) | 1 | 3 | 4
d. Visited the Web site's Bug Report & Change Request page? | 18 | 14 (78%) | 4 (22%) | 2.67 (2.88) | 1 | 6 | 3
e. Visited the Web site's Resources for PLMS Training page? | 18 | 10 (56%) | 8 (44%) | 1.75 (0.50) | 1 | 2 | 4
f. Posted suggestions and/or problems on Bug Report & Change Request page? | 18 | 16 (89%) | 2 (11%) | 2 (n/a) | n/a | - | 1

¹ Number of respondents

User's Manual

The next section of the questionnaire pertained to the usefulness of the PLMS User's Manual. Only 2 of the 15 participants said that they had actually used the hard copy manual; however, most (77%) said that they planned to use it but had not had time to do so. Because PLMS is a computer program, not using the hard copy manual is somewhat understandable: PLMS users could be expected to want all directions, assistance, and/or tutorials included within the computer program.

Because all participants had used the manual in their respective workshops, any comments they made about the manual were considered valid, even if they said they had not used it within the last 6 months. One participant commented that the manual was a bit complicated for him because he was a beginning computer user. Another participant commented that he had used the manual in explaining aspects of the PLMS to producers. Two other participants commented that the case studies within the manual were helpful.

General Use/Application of the PLMS Program

Special attention was paid to the area of inquiry in the questionnaire pertaining to general use and application of the PLMS. If participants had used the system in the last 6 months, it was assumed that they would have more insight than someone who had not. However, even if a participant had not used the system outside of the workshop setting, his comments were still considered meaningful with regard to PLMS functions and/or problems and difficulties.

Twenty-one percent of the participants (9) said that they had used the PLMS. The predominant reason given by five participants who had not used the system was "lack of time." The remaining participants' reasons for not using the system included lack of computer access or inability to load the system on a computer (3); the system still needing refinement (1); insufficient pasture or grazing land (1); and their agency not making the program available or another program being available (2).

The nine participants who said that they had used the system were asked how often they had used it, how many cooperators they had shown it to, and the reaction they had received from those cooperators. Several of the respondents said that they had not shown the system to any producers but had shown it to other employees in their agency and had used it several times themselves.

Respondents said they had actually shown the system to anywhere from 1 to 63 cooperators. The respondents reported mostly positive reactions from producers regarding the system, and one respondent reported signing up 18 producers to learn more about the PLMS software. However, a few unspecified negative reactions to the system came from producers, one producer being concerned about the accuracy of the yield database.

PLMS Functional Problems

Fifteen PLMS functional problems were listed for respondents to either agree or disagree with by using a 1-4 rating scale (1=Strongly Disagree, 2=Disagree, 3=Agree, and 4=Strongly Agree) (Tables 4 and 5). To make the statements easier for the respondents to read, all of them were written as statements with negative connotation (e.g., Errors occur in growth curves of certain forages).

Table 4.
Statement Agreement Results Concerning PLMS Functional Problem Statements
(Statement agreement, ranked from most to least; Mean > 2.5)

PLMS Functional Problems | Mean (SD²) | N¹ | Strongly Disagree | Disagree | Agree | Strongly Agree
Errors occur in growth curves of certain forages | 2.94 (1.06) | 16 | 1 (6%) | 5 (31%) | 5 (31%) | 4 (25%)
Forage growth insensitive to pH and temperature changes | 2.85 (0.55) | 13 | - | 3 (23%) | 9 (69%) | 1 (8%)
Limited choices of forages, interseeding, and double-cropping for southern states | 2.85 (0.80) | 14 | - | 5 (36%) | 5 (36%) | 3 (21%)
Inability to specify levels of farm management and supplementation limit simulation accuracy | 2.80 (0.41) | 15 | - | 3 (20%) | 12 (80%) | -
Program functions, assumptions, or default values are not all technically accurate | 2.73 (0.59) | 15 | - | 5 (33%) | 9 (60%) | 1 (7%)
Entering the field data for a farm too tedious without a copy or paste function | 2.73 (0.70) | 15 | 1 (7%) | 3 (20%) | 10 (67%) | 1 (7%)
Interface not user-friendly enough | 2.67 (0.62) | 15 | - | 6 (40%) | 8 (53%) | 1 (7%)
Confusing method of selecting and changing baselines and alternatives | 2.56 (0.63) | 16 | - | 8 (50%) | 7 (44%) | 1 (6%)

¹ Number of respondents
² Standard deviation

 

Table 5.
Statement Disagreement Results Concerning PLMS Functional Problem Statements
(Statement disagreement, ranked from most to least; Mean < 2.5)

PLMS Functional Problems | Mean (SD²) | N¹ | Strongly Disagree | Disagree | Agree | Strongly Agree
Crashes too frequently and easily | 2 (0.38) | 15 | 1 (7%) | 13 (87%) | 1 (7%) | -
Field data inputs too difficult to gather or determine | 2 (0.54) | 15 | 2 (13%) | 11 (73%) | 2 (13%) | -
Cannot have both continuous and rotational grazing on the same farm | 2.18 (0.73) | 17 | 2 (12%) | 11 (65%) | 3 (18%) | 1 (6%)
Difficult to generate summary reports and graphics | 2.25 (0.68) | 16 | 1 (6%) | 11 (69%) | 3 (19%) | 1 (6%)
Errors occur in map and field display window when selecting "pan" and "zoom" options | 2.29 (0.73) | 14 | 1 (7%) | 9 (64%) | 3 (21%) | 1 (7%)
Difficult to understand or read the supply and demand graphs | 2.33 (0.49) | 15 | - | 10 (67%) | 5 (33%) | -
Not similar enough to real-world grazing operations | 2.33 (0.49) | 15 | - | 10 (67%) | 5 (33%) | -

¹ Number of respondents
² Standard deviation

Mean values were calculated for each of the 15 functions listed, using the 1-4 scaled values (Tables 4 and 5). Lower means indicated disagreement with a statement, while higher means indicated agreement. The mean value of each PLMS function question was interpreted as indicating disagreement if it was less than 2.5 and agreement if it was greater than 2.5. None of the mean values was exactly 2.5.
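As an illustration of the scoring procedure just described, the sketch below computes the mean and standard deviation for one functional-problem statement on the 1-4 scale and applies the 2.5 cutoff. It is a minimal sketch, not the authors' analysis code, and the response list is hypothetical rather than actual survey data.

```python
# Minimal sketch (illustrative only): scoring one functional-problem statement
# on the 1-4 agreement scale used in Tables 4 and 5 and classifying it with the
# 2.5 cutoff described above. The responses below are hypothetical.
from statistics import mean, stdev

# 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree
responses = [3, 3, 2, 4, 3, 2, 3, 4, 3, 2, 3, 3, 3, 4, 2, 3]

m = mean(responses)
sd = stdev(responses)
verdict = "agreement" if m > 2.5 else "disagreement"

print(f"Mean = {m:.2f} (SD = {sd:.2f}) -> overall {verdict} with the statement")
```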

Respondents agreed with 8 of the 15 statements, confirming what were thought to be problems within the PLMS. The statements with the most agreement were, first, Errors occur in growth curves of certain forages (Mean=2.94, SD=1.06); second, Forage growth insensitive to pH and temperature changes (Mean=2.85, SD=0.555); third, Limited choices of forages, interseeding, and double-cropping for southern states (Mean=2.85, SD=0.801); and fourth, Inability to specify levels of farm management and supplementation limit simulation accuracy (Mean=2.80, SD=0.414). The statements with the most disagreement (indicating functions that were working well) were, first, Crashes too frequently and easily and Field data inputs too difficult to gather or determine (Mean=2, SD=0.378 and 0.535, respectively); and second, Cannot have both continuous and rotational grazing on the same farm (Mean=2.18, SD=0.728).

Overall Opinions About PLMS

In the last section of the survey, respondents were first asked to list their top three problems/difficulties with the PLMS (Table 6). Common themes were found in each of the three rankings. Therefore, all the problems/difficulties mentioned by the respondents were combined into five themes: Limitations/Specific Problems; Reporting; Time to Use the System; Computer Related; and Other.

Table 6.
Ranking of the Importance of the Statement to the User and Trainer (Ranked from 1 to 5, with #1 being the most important item, #2 the next most important, etc.)

Statement | Importance Factor: 1 | 2 | 3 | 4 | 5
User-friendliness of the menus and online support | 7 | 2 | 5 | 1 | 1
Technical accuracy of the existing program functions | 4 | 4 | 4 | 2 | 2
Features/options that simulate true grazing systems | 2 | 4 | 3 | 3 | 3
Amount of time it takes to learn how to use and teach the program | 2 | 2 | 2 | 4 | 5
Amount of time/difficulty it takes to input the initial farm data | - | 3 | 2 | 6 | 4

The most frequently mentioned problems and/or difficulties with the system fell under the theme of limitations and/or specific problems within the system. While lack of user-friendliness was cited by several respondents, most comments dealt with specific items such as plant growth curves, forage growth patterns, and setting baselines. One comment asked for additional training; this suggestion seemed to be a good idea in light of the eclectic nature and specificity of the comments in general. The second most frequently mentioned problems with the system had to do with the reporting functions, including printing reports, incorrect information within a report, and having more options for creating and printing summary reports.

The last two themes (which had fewer comments) centered on not having time to get acquainted with and/or use the system and on either not having a computer available or the seemingly complicated process of loading the system.

Also included in the Opinions About the PLMS section of the questionnaire were five statements pertaining to existing problems with PLMS that needed to be addressed. Respondents were asked to rank these problems in order of importance. One-half (50%) of the respondents ranked User-friendliness of the menus and online support as the most important issue to be addressed, followed by Technical accuracy of the existing program functions (29%). The remaining three statements (Features/options that simulate true grazing systems, Amount of time it takes to learn how to use and teach the program, and Amount of time/difficulty it takes to input the initial farm data) were also considered important by the respondents, but fewer respondents ranked them as most important (24%, 24%, and 12%, respectively).
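The ranking summary in Table 6 and the percentages above come from simple tallying of the rank assignments. The sketch below illustrates the idea with hypothetical ranking data; it is not the authors' analysis code, and the statements and ranks shown are placeholders.

```python
# Minimal sketch (illustrative only): tallying how many respondents ranked each
# statement first, as summarized in Table 6 and discussed above. The rankings
# dictionary is hypothetical, keyed by statement with one rank (1-5) per respondent.
from collections import Counter

rankings = {
    "User-friendliness of the menus and online support": [1, 1, 3, 2, 1, 5, 1],
    "Technical accuracy of the existing program functions": [2, 1, 3, 4, 2, 1, 3],
}

for statement, ranks in rankings.items():
    counts = Counter(ranks)
    share_first = 100 * counts[1] / len(ranks)  # percent who ranked it #1
    print(f"{statement}: ranked #1 by {share_first:.0f}% of respondents")
```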

The final question on the questionnaire gave the respondents an opportunity to make comments on the PLMS. Comments tended to repeat many of the statements that had already been made. Several comments described the system as a good or great program whose bugs need to be fixed (and noted that the program had a long way to go). The need to conduct another training session was mentioned. There was a comment about the program's potential usefulness to other agencies. Finally, there were several comments encouraging the PLMS researchers to keep working on the system, noting that the system is needed and that it has great potential.

Conclusions

Even though the PLMS has met with limited acceptance and use in the 6 months since the first training, developers of this and other public domain software can learn from the procedures used to obtain participant feedback on the software's development. User follow-up is critical for developers operating on limited budgets or seeking grant funds to continue the development process, for agents who develop informational Web pages, and for faculty who develop Web-based curricula. Obtaining user feedback, with a ranking of priorities to address the needs of the targeted users, provides a cost-effective means to direct programming efforts. The authors have identified the following issues and tools that can help developers of public domain software and Web pages on limited budgets to direct their resources wisely.

  • Involve a selected group of potential users during software development beginning at the initial stages.

  • Software must be objectively and rigorously tested for reliability before training activities start.

  • Trainers must have a working knowledge of both the software and the subject area and have expertise in the practical applications of the software.

  • Training must be targeted at the end users to assist in their delivery of programs.

  • Targeted users must have access to the Internet and reliable computer hardware.

  • Targeted users with subject expertise but lacking sufficient general computer operations knowledge should be identified and trained outside of the software-training program.

  • Onsite evaluations are necessary to identify the success or failure of the training program.

  • Development of Web-based tools for users to interact with developers (discussion forums, bug reports, and change requests) will help identify new problems, but they will not take the place of direct user contact.

  • Web access to all resources, materials, data files, teaching examples, and user's guides provides users with a central location for materials, which is especially important if they are infrequent users.

  • Follow-up surveys are strongly recommended to provide feedback on problems, frequency of use, and priorities for additions and/or modifications to software and resource materials.

Finally, the software's future implications are reflected in many participants saying they were glad there was a PLMS system and complimenting the researchers who were developing it. Comments included "the system has an overall potential," "could be used within other agencies," and that "if the bugs were worked out, it could provide needed assistance for producers." The feedback received from these methods will be used to improve the functionality, accuracy, and user-friendliness of the PLMS software, and the approach can be used by other public domain DSS software developers to improve their chances of user adoption.

References

Dalrymple, R. L., Rogers, J., & Ingram, S. (1996). Comparison of "good" continuous stocking versus controlled rotation grazing of a cereal rye-wheat-annual ryegrass mixture. p. 14-18. In: Proceedings of the American Forage and Grassland Council. Vancouver, BC, Canada. 13-15 June 1996.

Faulkner, D., & Boyer, D. (1993). Cow/calf operation case study of the conservation effects of intensive rotational grazing on Danny and Twyla Boyer's farm in Grayson County. 4 p. USDA-NRCS, Richmond, VA.

Faulkner, D., Kinvig, K., & Boyer, D. (1994). Case study of the conservation effects of intensive rotational grazing on Mike and Marion Goldwasser's beef cattle farm in Virginia's Grayson and Carroll Counties. 6 p. USDA-NRCS, Richmond, VA.

Faulkner, D., Boyer, D., & Dalton, S. (2000). Case study of the effects of intensive rotational grazing on Sanford and Teresa Dalton's dairy farm in Carroll County, Virginia. USDA-NRCS, Richmond, VA.

Groover, G. E. (1998). Management practices on Virginia dairy farms. Virginia Cooperative Extension Publication 448-232. Virginia Polytechnic Institute and State University, Blacksburg, VA.

Hoveland, C. S., McCann, M. A., & Hill, N. S. (1997). Rotational vs. continuous stocking of beef cows and calves on mixed endophyte-free tall fescue-bermudagrass pasture. Journal of Production Agriculture 10:245-250.

Information Systems and Insect Studies. (2002). Pasture Land Management System. Virginia Polytechnic Institute and State University. Available at: http://clic.cses.vt.edu/PLMS/index.html

Joost, R. (1997). Pasture soil fertility management. In: 1997 Missouri grazing manual. (Gerrish & Roberts, eds.) University of Missouri, Columbia, pgs.35-44.

Stone, N., Benson, G. B., Groover, G., Venuto, J., & Cline, B. E. (2000). Pasture Land Management System (PLMS). Proceedings of the National Conference on Grazing Lands (NCGL). Las Vegas, Nevada. December 2000. pp 261-271.

Virginia Forage and Grasslands Council. (1998). Unpublished survey data, livestock management survey, Virginia Forage and Grasslands Council. Crop and Soil Environmental Sciences Department, Center for Survey Research, Virginia Polytechnic Institute and State University, Blacksburg, Virginia.

Walton, P.D., Martinez, R., & Bailey, A. W. (1981). A comparison of continuous and rotational grazing. Journal of Range Management 34:19-21.