February 2001 // Volume 39 // Number 1 // Tools of the Trade // 1TOT5


Questionnaires for Evaluating On-Farm Field Days

Abstract
On-farm field days are a traditional educational tool used by Cooperative Extension Agricultural Agents/Educators. The field day is generally a day-long event held at a local cooperating farm and typically includes demonstrations of specific management practices. One of the most common ways to evaluate the impact of on-farm field days is with a post-event questionnaire. But the most challenging aspect of evaluating the field day is determining what to measure. Field day questionnaires often attempt to do too much. This article presents three straightforward categories of questions as well as suggestions for questionnaire design and delivery.


Robin Shepard
Department of Life Sciences Communication
College of Agricultural and Life Sciences
University of Wisconsin-Extension
Madison, Wisconsin
Internet Address: Rlshepar@facstaff.wisc.edu


Cooperative Extension agents often take advantage of on-farm field days in their educational programming (Norman, Freyenberger, & Schurle, 1997; Bouare & Bowen, 1990). A field day is generally a day-long event held at a local cooperating farm, and it typically includes demonstrations of specific management practices. Field days are often used to compare traditional practices with a new practice or practices (Seevers et al., 1997; van den Ban & Hawkins, 1996; Rollins, Bruening, & Radhakrishna, 1991).

How the educational communication process is structured for a field day may vary depending upon which techniques are to be featured and the communication expertise of the event organizers. Field days can range from structured presentations about the practices and their impacts to more informal events where participants walk through field plots or view implements at their own pace (Lionberger & Gwin, 1991). To evaluate educational experiences such as field days, organizers often deploy questionnaires the day of the event (Taylor-Powell & Renner, 2000).

This article presents three straightforward categories of questions for an effective field day questionnaire, as well as suggestions for questionnaire design and delivery.

The Field Day Questionnaire--What to Measure

Perhaps the most difficult challenge in evaluating field day events is determining what to measure. It is important to understand what can realistically be attributed to a one-time event like a field day. To be most effective, field day questionnaires need to focus on their formative value and include questions that will allow organizers to improve future field days. Questionnaires can be designed to assess participant reactions to the practices being shown as well as to organizational aspects of the event. The results can tell educators what about the event worked and what didn't. When used in this way, field day questionnaires assume formative research qualities (van den Ban & Hawkins, 1996).

At its most basic, a questionnaire should tell event organizers whether they have attracted the appropriate target audience to the demonstration. To accomplish this, three categories of questions should be incorporated into an overall field-day survey.

Information Sources

The first category should address participants' preferred information sources and how they heard about the event (See Field Day Questionnaire, Section A). These questions reveal which channels reached participants and therefore how future events should be publicized. Questions about information preferences may also address why the farmer came to the event, so that materials and information can be tailored to specific needs and/or current levels of understanding and use of practices.

Demographics

The second category should address the demographics of field day participants (See Field Day Questionnaire, Section B). This helps ensure that notice of future events gets to the people who need the information most. It also tells organizers participants' farm type and size and the distance they traveled.

Applicability of Practices

Finally, the third category of questions should address how the practices being demonstrated apply to those attending the field day (See Field Day Questionnaire, Section C). These questions should record both the extent to which the demonstrated practices are or are not being used and participants' confidence that the practices are practical and can be used on other farms.

Questions in this category can also provide a baseline of information from which to measure adoption. Research shows that field days are more likely to attract farmers who want to adapt the practices they view to their own farming systems rather than adopt them exactly as demonstrated. This indicates that farmers attending field days come to the event with existing knowledge of the practices and dispels the commonly held assumption that farmers need basic instruction in the practices being demonstrated (Hakanson, 1992). Understanding the target audience's knowledge base allows educators to tailor future programs to specific needs.

While assessing knowledge levels is appropriate, attitude change is more difficult to measure and is not considered a reliable impact indicator when measured immediately following an event (Harmon & Jones, 1997). It is also not appropriate to simply ascribe behavior change to field day attendance (Scarborough et al., 1997). Behavior change may be a laudable ultimate goal, but the best event organizers can hope to measure at the time of the event is "intent to use" the demonstrated practices. A separate follow-up survey should be sent to participants after a reasonable amount of time has passed in order to gauge actual practice adoption (Barao, 1992; Warnock, 1992). Where the goal is to evaluate change, such as the behavior associated with practice adoption, it is essential to first assess existing behavior.

Considerations in Questionnaire/Survey Design and Delivery

Simplicity

Field days usually offer little time or space for farmers to read detailed information or to sit down and fill out a questionnaire, so the questions should be clear and concise. In most cases, field day questionnaires should take less than fifteen minutes to complete. Directions should be easy to follow, and questions and answers should be clearly worded. Pre-testing the questionnaire with a small group of farmers, or even at a similar but smaller event, allows you to assess the readability of the questions and whether their intent is understood.

Avoiding Bias, Rhetorical Questions, and Poorly Worded Response Categories

Biased or poorly worded questions, and questions whose response categories present no real differences, are common stumbling blocks in questionnaire design. Questions with pre-determined categories as answer choices help simplify the process, but the categories must be clearly defined. Instructions such as "check all that apply" or "check one only" should be included for each question. It is usually a good idea to show the questions to colleagues, particularly people who have experience designing similar instruments. Input from others can help you avoid unclear, leading, or misleading questions and statements; it also provides extra proofreading. In practice, a questionnaire should be reviewed every time it is used.

Presentation and Appearance

It is essential that the questionnaire have a professional appearance and attract the farmer's attention; it must be interesting and eye-catching to compete in today's graphical information age. White space, graphics, and color grab attention and give the instrument a professional look. Furthermore, plot diagrams and field maps can not only increase understanding but also draw the reader to the questionnaire and encourage its completion.

Providing Time to Fill It Out

It is often difficult to decide the best time to deploy your questionnaire. Expecting farmers to take it with them and mail it back or return it later will guarantee a low response rate. If the event features an introduction or key speaker, pass the questionnaire out while the main speaker is waiting for the crowd to gather. Describe the purpose of the questionnaire, and even designate a specific time when people can complete it. Past experience has shown that response rates increase when the need for such information is recognized publicly (Coffey, Jennings, & Humenik, 1998).

However, be cautious not to spend too much time on it at this point because farmers will want to be off to view the demonstration plots. Scheduling a specific time during the day to fill out the questionnaire will stress its importance and will also prevent people from filling it out during the talks, which can be distracting or even inconsiderate to speakers.

This method may not work in informal situations such as pasture walks, where there are no scheduled speakers or sessions. In those instances, an alternative may be to place interviewers at entry or exit points to the demonstration plots. In this case, a selection scheme, such as choosing every fifth person, should be devised. That way you will ensure results that truly represent the diversity of field day attendees rather than just those respondents who are willing to volunteer.
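
Choosing every fifth person is an example of simple systematic sampling. The short sketch below is a hypothetical illustration of that selection scheme, not part of the original field day materials; the interval of five and the attendee names are assumptions used only for the example.

    # Hypothetical sketch of an "every fifth person" selection scheme for
    # intercepting attendees at an entry or exit point (systematic sampling).
    def select_every_kth(attendees, k=5):
        """Yield every k-th attendee from the stream of people passing by."""
        for count, person in enumerate(attendees, start=1):
            if count % k == 0:
                yield person

    # Example with a made-up list of 30 attendees passing an exit point.
    attendees = ["attendee_%d" % i for i in range(1, 31)]
    for respondent in select_every_kth(attendees, k=5):
        print("Ask %s to complete the questionnaire" % respondent)

Any fixed interval can be substituted for five; the point is that selection is determined by position in the stream rather than by who volunteers.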

Using Incentives While Protecting Confidentiality

It is very important to protect respondents' confidentiality. However, to ensure high response rates and minimize non-response bias, a little creativity may be in order. For example, the questionnaire may be designed with a space for the respondent's name on a detachable tab, which can be removed and entered in a drawing. This way, respondents place their questionnaires in one box and their name tabs in another, preserving the confidentiality of their responses. Another option might be to offer a small gift, such as a calendar, farm-management publication, or newsletter, for turning in a completed questionnaire.

Conclusions

Field day questionnaires can provide general information about the influence an event has had on farmers' knowledge or their opinions and perceptions of the demonstrations. However, there is limited research showing a relationship between how people perceive such events or information and what they ultimately do with the information (Dixon, 1990; Le Rouzic & Cusick, 1998). Although they cannot realistically measure future outcomes such as behavior change or practice adoption, field day questionnaires can help us improve on-farm learning. Their major strength may be to tell us how to expand the educational experience, making it more interesting, relevant, and specific to farmer needs.

References

Barao, S. M. (1992). Behavioral aspects of technology adoption. Journal of Extension [On-line]. 30(2).
Available: http://www.joe.org/joe/1992summer/a4.html

Bouare, D., & Bowen, D. E. (1990). Communications methods used by agricultural Extension agents. Journal of Applied Communications, 74(1), 1-7.

Coffey, S. W., Jennings, G. D., & Humenik, F. J. (1998). Collection of information about farm management practices. Journal of Extension [On-line]. 36(2).
Available: http://www.joe.org/joe/1998april/a4.html

Dixon, N. M. (1990). The relationship between trainee responses on participant reaction forms and posttest scores. Human Resource Development Quarterly, 1(2), 129-137.

Hakanson, K. I. (1992). An evaluation of the sustainable agriculture demonstration program of Wisconsin. Master's thesis, University of Wisconsin, Madison, Wisconsin.

Harmon, A. H., & Jones, S. B. (1997). Forestry demonstrations: What good is a walk in the woods? Journal of Extension [On-line]. 35(1).
Available: http://www.joe.org/joe/1997february/rb3.html

Le Rouzic, V., & Cusick, M. C. (1998). Immediate evaluation of training events at the Economic Development Institute of the World Bank: Measuring reactions, self-efficacy and learning in a worldwide context. Paper presented at the annual Evaluation Association Meeting, Chicago, IL.

Lionberger, H. F., & Gwin, P. H. (1991). Technology transfer. Columbia, MO: University of Missouri.

Norman, D., Freyenberger, S., & Schurle, B. (1997). County Extension agents and on-farm research work: Results of a Kansas survey. Journal of Extension [On-line]. 35(5).
Available: http://www.joe.org/joe/1997october/a4.html

Rollins, T. J., Bruening, T. B., & Radhakrishna, R. B. (1991). Journal of Applied Communications, 75(2), 1-9.

Scarborough, V., Killough, S., Johnson, D. A., & Farrington, J. (1997). Farmer-led Extension. Southampton Row, London: Intermediate Technology Publications, in association with World Neighbors, Oklahoma City, Oklahoma.

Seevers, B., Graham, D., Gamon, J., & Conklin, N. (1997). Education through Cooperative Extension. Albany, NY: Delmar Publishers.

Taylor-Powell, E., & Renner, M. (2000). Collecting evaluation data: End-of-session questionnaires (Publication No. G3658-11). Madison, Wisconsin: University of Wisconsin-Extension.

Van den Ban, A. W., & Hawkins, H. S. (1996). Agricultural Extension. Cambridge, MA: Blackwell Science.

Warnock, P. (1992). Surveying client satisfaction. Journal of Extension [On-line]. 30(1).
Available: http://www.joe.org/joe/1992spring/a1.html