August 2016 // Volume 54 // Number 4 // Tools of the Trade // v54-4tt3
Participatory Data Collection Technique for Capturing Beginning Farmer Program Outcomes
Abstract
This article describes an innovative evaluation plan we employed to capture outcomes of a multiyear beginning farmer program and, specifically, highlights the facilitation technique we used to document short-term and intermediate goals of the program that matched U.S. Department of Agriculture grant requirements and Extension administration priorities. Developing a comprehensive, two-phase evaluation plan based on a well-conceived logic model was a key factor in the success of the New FARM program. Our midterm and end-of-program evaluations addressed often sought, but sometimes difficult to obtain, intermediate goals from the logic model and demonstrated program effectiveness to a variety of funders.
Introduction
From November 2009 to April 2012, our team—four early-career Michigan State University Extension professionals—developed and completed the multiyear New FARM (Farmer Assistance and Resource Management) program, which was designed to enhance the success of beginning specialty crop farmers in northwest Michigan (Sirrine, Eschbach, Lizotte, & Rothwell, 2016). Forty-two beginning farmers participated in 20 unique learning events over 30 months. Content-focused workshops with expert speakers fostered new knowledge and regional networking. Topic-focused activities, delivered through meetings and tours, were designed to support participant skill development and provide interaction with program partners. Year 1 activities included educational programs on public speaking, media relations, risk management, business basics, marketing, and estate and tax planning; group attendance at an international fruit conference; and meetings with legislators and policy makers at the state capitol. Participants toured farms, packinghouses, processing plants, and retail operations to better understand specialty crop food chains. Activities in years 2 and 3 included educational programs on labor issues, environmental stewardship, public policy, communication, and agritourism. Participants toured the Great Lakes region to view value-added agricultural enterprises and had an opportunity to participate in a capstone 2-week international trip to New Zealand to better understand their role in the global food system.
Tool of the Trade: A Comprehensive Participatory Evaluation Plan
Our tool of the trade is the innovative evaluation plan we used to capture outcomes of the New FARM program. Specifically, we highlight the facilitation technique we used to document short-term and intermediate goals of the program that matched U.S. Department of Agriculture (USDA) grant requirements and Extension administration priorities as part of a state plan of work. Extension professionals are increasingly under pressure to seek competitive funding, rigorously evaluate program impacts, and demonstrate peer-reviewed scholarship (Adams, Harrell, Maddy, & Weigel, 2005; Braverman & Engle, 2009). Making evaluation techniques participatory has also been recommended (Franz, 2013). The New FARM program was funded by participant fees, USDA federal grants, local charity funds, and commodity- or industry-provided sponsorships. A comprehensive evaluation plan was the key tool we used for documenting outcomes and sharing them with funders and program stakeholders.
Evaluation Methodology
Because the New FARM program was multiyear, we were able to construct a two-phase (formative and summative) evaluation plan. During the formative evaluation phase in year 1, we evaluated each educational event to identify potential program improvements and to monitor the quality of program implementation; in years 2 and 3, we shifted to a summative evaluation focused on outcomes. During year 1, nine process evaluations were conducted after major educational events, providing us with data on short-term outcomes. For each evaluation, a paper or online survey was administered to New FARM participants. Because some participants were not online frequently, the survey method resulted in low response rates, and attendance and participation were strongest at in-person program events. Therefore, in years 2 and 3, we integrated evaluation time into program meetings to increase engagement with and responses to our evaluations and used several methods proposed for a data party (Franz, 2013). Midterm and end-of-program evaluation meetings measured intermediate outcomes from the logic model via a participatory data collection technique.
Participatory Midterm Evaluation
In January 2011, 23 New FARM participants provided feedback on the program on the basis of the USDA Beginning Farmer and Rancher Development Program draft reporting guide outcomes. Participants were asked to determine whether they changed their behavior or farming practices as a result of the education, resources, and networking they gained from the New FARM program. Four primary outcomes measured changes in farming or land management practices, marketing practices, business strategies, and farm plans. More specifically, in a group setting, New FARM participants were asked whether an outcome applied to them (yes or no). As the facilitator led the group through a series of questions, participants generated a list of examples of how the outcome was true for them. Information was recorded on flip charts in front of the group, and clarification or additional information was gathered as outcomes were defined. As participants developed a better understanding of the outcome on the basis of the group conversation, they were able to change their responses if needed. This iterative process allowed the facilitator to determine a final percentage of participants who agreed with the outcome.
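For readers interested in the arithmetic behind those final percentages, the minimal Python sketch below tallies hypothetical yes/no responses for a single outcome after the group discussion has settled. The participant identifiers and responses are illustrative only; in the program, responses were recorded by hand on flip charts and revised as the conversation clarified each outcome.

```python
# A minimal sketch of the arithmetic behind the facilitated tally.
# Participant identifiers and responses are hypothetical.

def percent_agreement(responses):
    """Return the percentage of participants whose final answer was 'yes'."""
    yes_count = sum(1 for answered_yes in responses.values() if answered_yes)
    return round(100 * yes_count / len(responses))

# Final (post-discussion) yes/no responses for one outcome,
# e.g., "Developed or revised a farm plan."
final_responses = {
    "participant_01": True,
    "participant_02": False,
    "participant_03": True,
    "participant_04": True,
}

print(f"{percent_agreement(final_responses)}% reported this change")
```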
Participatory Summative Evaluation
At the final session and program graduation in March 2012, 21 participants engaged in a facilitated group evaluation similar to the midterm evaluation process. The same outcomes were again tallied, and percentages were developed through discussion and consensus. In 2015, a 3-year follow-up survey was administered online and included program outcome questions similar to those used for the midterm and end-of-program evaluations. Although the response rate was lower than we had hoped (N = 14), the responses included rich qualitative information suggesting lasting program impacts. The low response rate also reinforced the benefit of in-person, participatory evaluation techniques.
Benefits of the Evaluation Plan
Table 1 highlights the outcomes documented across the program life. The participatory method created a quantitative report (percentage) and yielded specific examples of what the outcomes meant for the participants. Repeating the participatory technique allowed comparisons across time on common measures important to both USDA and Extension administration. Program evaluation results were provided to participants and stakeholders, including community partners, Extension administrators, and funders.
Table 1. Outcomes Documented Across the Program Life

Change in behavior or practice | Midterm results, January 2011 (N = 23) | End-of-program results, March 2012 (N = 21) | 3-year follow-up results, July 2015 (N = 14)
Modified or expanded current marketing practices; started producing value-added crops | 100% | 57% | 86%
Changed farming/growing operations or land management practices, including purchasing, leasing, or taking over of family farming operations | 61% | 43% | 79%
Developed or revised a farm plan, including acting on land sustainability | 43% | 50% | 71%
Changed business practices and/or applied practical knowledge to improve sustainability of farming operations | 26% | 89% | 86%
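Because each evaluation point produced a simple agreement percentage per outcome, the cross-time comparison is straightforward arithmetic. The sketch below is not part of the original analysis; it simply reuses the Table 1 values, with shortened outcome labels, to show one way the three waves could be tabulated for reporting. Note that the respondent groups differ in size across waves.

```python
# Illustrative only: the published comparison in Table 1 was compiled by hand
# from the facilitated tallies and the follow-up survey. Values below are the
# agreement percentages reported in Table 1 (midterm, end of program,
# 3-year follow-up); outcome labels are shortened for display.
outcomes = {
    "Marketing practices / value-added crops": (100, 57, 86),
    "Farming operations / land management": (61, 43, 79),
    "Farm plan / land sustainability": (43, 50, 71),
    "Business practices / sustainability": (26, 89, 86),
}

for name, (midterm, end_of_program, follow_up) in outcomes.items():
    change = follow_up - midterm
    print(f"{name}: midterm {midterm}%, end of program {end_of_program}%, "
          f"follow-up {follow_up}% ({change:+d} points since midterm)")
```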
Conclusion
Developing a comprehensive, two-phase evaluation plan based on a well-conceived logic model was a key factor in the success of the New FARM program. By making more advanced use of program theory and logic models in our evaluation planning, we were able to respond to our stakeholders effectively and economically (Braverman & Engle, 2009). If a program's life span allows, we encourage the use of an evaluation plan with a two-phase data collection process to improve the efficiency and quality of program evaluations. Among the evaluation methods we used, the facilitated participatory evaluation process generated the most feedback. It also offered participants the opportunity to build relationships and learn about one another's farming operations. We found the USDA outcomes-based reporting guide for the Beginning Farmer and Rancher Development Program helpful in framing our evaluation approach, especially after we operationalized its concepts and documented changes through the facilitated participatory technique. Our midterm and end-of-program evaluations addressed often sought, but sometimes difficult to obtain, intermediate goals from the logic model and demonstrated program effectiveness to a variety of funders.
Acknowledgments
This project was supported by the Agriculture and Food Research Initiative of the National Institute of Food and Agriculture, Grant #2010-49400-21734, Rotary Charities of Traverse City, The Leelanau Conservancy, Grand Traverse Fruit Growers Council, Leelanau Horticultural Society, Cherry Marketing Institute, and New FARM member participant funds.
References
Adams, R. G., Harrell, R. M., Maddy, D. J., & Weigel, D. (2005). A diversified portfolio of scholarship: The making of a successful Extension educator. Journal of Extension [online], 43(4), Article 4COM2. Available at: http://www.joe.org/joe/2005august/comm2.php
Braverman, M., & Engle, M. (2009). Theory and rigor in Extension program evaluation planning. Journal of Extension [online], 47(3), Article 3FEA1. Available at: http://www.joe.org/joe/2009june/a1.php
Franz, N. (2013). The data party: Involving stakeholders in meaningful data analysis. Journal of Extension [online], 51(1), Article 1IAW2. Available at: http://www.joe.org/joe/2013february/iw2.php
Sirrine, J. R., Eschbach, C. L., Lizotte, E., & Rothwell, N. L. (2016). The northwest Michigan New FARM program: A model for supporting diverse emerging farmers and early-career Extension professionals. Journal of Extension [online], 54(4), Article 4FEA1. Available at: http://www.joe.org/joe/2016august/a1.php