The Journal of Extension - www.joe.org

December 2020 // Volume 58 // Number 6 // Feature // v58-6a2

Measuring the Aggregated Public Value of Extension

Abstract
Extension program participants tell story after story of the impact of Cooperative Extension on their lives, their families, and their businesses. Despite this wealth of qualitative data, very little quantitative data exist showing the aggregated public value of Extension programs—especially across program areas. The lack of data leaves Extension administrators high and dry when they are asked to show public value, a circumstance sometimes resulting in reductions in human and financial resources. A simple yet powerful Extension public value instrument was developed and used in Georgia along with analysis methods designed to showcase the short-, medium-, and long-term impacts of Extension programs.


Alexa J. Lamm
Associate Professor
University of Georgia

Adam Rabinowitz
Assistant Professor
Auburn University

Kevan W. Lamm
Assistant Professor
University of Georgia

Kisha Faulk
Extension Program Development Coordinator
University of Georgia

Introduction

Proving the public value of the programs offered by the national Extension system has been a topic discussed by legislators, funding agencies, university administrators, and Extension professionals for decades (Lamm et al., 2013). Public value as a concept describes the "value," beyond simple monetary costs and benefits, added by a given policy, program, or agency (Moore, 1995). The first action taken to ensure that public value was at the forefront of Extension funding came in 1977, when the Food and Agriculture Act mandated that the Secretary of Agriculture examine the social and economic value of Extension programs. The resulting report identified the work of Extension to be "short on impacts" (Warner & Christenson, 1984, p. 17). Despite pressure over the past 40 years, Extension has repeatedly fallen short, lacking data that prove its worth (Andrews, 1983; Chapman-Novakofski et al., 1997; Radhakrishna & Relado, 2009), and state and federal Extension organizations have repeatedly been found to be especially "inadequate at reporting programmatic successes at the medium- and long-term level" (Lamm et al., 2013, para. 7).

Several state Extension systems have examined their hiring practices, recognizing that they do not hire Extension professionals with the training needed to evaluate effectively. Previous efforts to address this issue included recruiting individuals with formal training in program and evaluation design. However, this approach severely limited the pool of eligible applicants and was quickly found to result in an unsustainable Extension workforce. In addition, applicants with suitable evaluation experience often did not have the formal training needed in specific subject matter areas. This left professional development as the only viable option for building evaluation skills among existing Extension professionals. Extension organizations have worked to build evaluation capacity within their ranks (Arnold, 2002; Baughman et al., 2010; Diaz et al., 2019; Franz & Archibald, 2018; Rennekamp & Arnold, 2009; Silliman, 2016), but impact statements and federal reports continue to supply numbers of Extension participants with little data indicating how Extension efforts have made citizens' lives better, businesses more efficient, or communities stronger.

Rather than rely on extensive training, several Extension evaluation specialists have suggested that measuring medium-term (behavior or practice) changes and long-term (social, economic, and environmental) changes would be more attainable when Extension professionals work in groups (Lamm et al., 2011). Lamm et al. (2013) found that Extension professionals' working together allows for distribution of tasks such as "creating detailed plans, establishing instruments to measure behavior and social, economic, and environmental condition changes, and conducting data analysis for reporting purposes," thereby "reducing the pressure and time commitment felt by a single individual" (para. 20). Despite best efforts, a single measure of Extension impact has remained unattainable. We believe this is largely because Extension programs cover a variety of topics, including, but not limited to, production crops, animal sciences, gardening, pest management, water conservation, human nutrition, leadership development, community resilience and vitality, and youth development. Accordingly, individual program areas report measurements of impact in diverse ways, leaving no way to aggregate the data to tell a single Extension story of public value.

The list of Extension programs, and of the clientele they reach, is long and varied, with a diverse array of expected outcomes. However, the one thing most of these endeavors have in common is that we, as Extension professionals, expect clientele to gain some sort of knowledge and then change their behavior. Therefore, if we can focus evaluation on the broad changes we expect (knowledge gain and behavior change) and link those changes to economic value, we can begin to tell an aggregated Extension impact story. We explored this concept by developing an evaluation instrument that can be customized for various audiences, who are asked to connect self-reported knowledge gain to intent to change behavior and anticipated economic value. We then collected exploratory results to showcase the power of aggregated Extension impact.

Instrument Development

Extension professionals across the United States have reported that it is extremely difficult to follow up with participants of one-time Extension programs (L. Perry Johnson, personal communication, August 1, 2018). Considering the difficulty associated with collecting data from participants, we designed an Extension public value instrument to be administered once at the conclusion of an Extension program so that program participants were present and no follow-up would be necessary. We took into account the one-time-application feature throughout the instrument design process. Through a review of the literature, we determined that self-report methods, although not 100% reliable, do provide insights into participants' perceptions of the value of a program and can be reported as such (e.g., Gonyea, 2005).

A panel of experts reviewed the instrument to ensure content and face validity. The panel included Extension professionals with programmatic expertise in agricultural production, agricultural economics, family and consumer sciences, community and leadership development, and survey development. We then had the instrument reviewed by the state Extension director to ensure that the data obtained would be useful in discussing the public value of Extension. Finally, we conducted pilot testing of the instrument with selected representative Extension program participants.

Self-Reported Knowledge Gain

The instrument begins with a two-part Likert-type question designed to capture self-reported knowledge gained through use of a retrospective post-then-pre design (see Figure 1).

Figure 1.
Question for Measuring Self-Reported Knowledge Gain

Intent to Change Behavior

A behavior change question on the instrument measures intention to use the information presented, as Ajzen (1991) identified intent as the primary indicator of actual behavioral change. The question is presented as categorical (see Figure 2).

Figure 2.
Question for Measuring Intent to Change Behavior

Self-Reported Economic Value

The complexities of programmatic intent come into play when trying to determine, through self-report, participants' perceptions of a program's benefits to an individual, society, or the environment, and whether those benefits are social or financial. Recognizing that one size does not fit all in Extension programming, we created and tested multiple iterations of the self-report economic value questions, although we used a fairly consistent approach for all such items.

At a statewide Extension strategic planning session, Georgia agriculture and natural resources Extension professionals expressed that they were interested in determining the economic benefit of their Extension programs (M. McCann, personal communication, July 10, 2018); however, they were uncomfortable asking participants for specific financial information (L. Perry Johnson, personal communication, August 1, 2018). Changes in farm gate values over time are not an adequate measure of change resulting from Extension programming because they can vary as a result of many influences in addition to an Extension intervention (e.g., extreme weather events, changes to import/export regulation, competition from other countries/states). On the other hand, Extension program participants are fairly aware of their crop and herd values and what they spend on production inputs. Therefore, program participants have an idea of the cost or savings to their business likely to result from implementing a new practice or behavioral change as well as the potential increase in revenue that such a change can generate. Taking into consideration all these circumstances, we developed two questions to capture self-reported economic gain/savings values (see Figure 3). The example in the figure shows the questions in terms of acreage, but the questions could be slightly altered using a different measurement unit, such as "per head" for animal production.

Figure 3.
Questions for Measuring Self-Reported Economic Gain/Savings Associated With an Agriculture or Natural Resources Extension Program

At the same statewide Extension strategic planning session described previously, leadership and community development and family and consumer sciences Extension professionals also expressed interest in the economic gains or savings participants attributed to their programs (L. Perry Johnson, personal communication, August 1, 2018). However, the factors used in the questions presented in Figure 3 were not well suited for their audiences. Therefore, we developed a single question to capture leadership and community development participants' self-reported economic benefit (see Figure 4). Further, we expected that some Extension program participants might struggle to think about how the information presented in a program could assist them economically over a coming year but would be able to think about how it could assist them over a coming month; for example, this scenario might apply to individuals participating in family and consumer sciences programs focused on household budgeting. Therefore, we slightly adapted the question shown in Figure 4 to develop two questions applicable to family and consumer sciences audiences (see Figure 5). In the adaptations, the 1-month phrasing alters the way the resulting economic value data can be reported and aggregated with results from questions that cover a full year. Consequently, one would need to either discuss this result separately or make an assumption regarding how representative a subsequent month is of an entire year in order to aggregate the 1-month data with annual data.

Figure 4.
Question for Measuring Self-Reported Economic Gain/Savings Associated With a Leadership and Community Development Extension Program

Figure 5.
Questions for Measuring Self-Reported Economic Gain/Savings in Subsequent Month Associated With a Family and Consumer Sciences Extension Program

Data Collection

Agriculture and Natural Resources

Institutional review board approval was obtained from the University of Georgia Human Subjects Office (Project00000044). Data were collected January to March 2019 at county crop production meetings throughout Georgia. Crop production meetings occur annually to prepare participants for the upcoming growing season. They last approximately 2 hr and include two or three state Extension specialists presenting on a range of crop-specific topics, including agronomy, plant pathology, entomology, and economics. These meetings have become a popular venue for participants to receive information on the latest scientific research and market updates so that they can make well-informed farm management decisions. Using the questions from Figures 1, 2, and 3, we collected 1,501 completed questionnaires from cotton, peanut, and blueberry producers. The largest group of respondents was cotton producers (n = 851).

Leadership and Community Development

Institutional review board approval was obtained from the University of Georgia Human Subjects Office (Project00006723). Data from statewide leadership and community development programs held throughout Georgia were collected from April to August 2019. Leadership and community development programs last from 2 to 8 hr, with a state Extension specialist delivering educational material. Specific topics include leadership development, strategic planning, organizational development, community resilience, community planning, and rural stress. We were able to collect 217 completed questionnaires using the questions from Figures 1, 2, and 4.

Family and Consumer Sciences

Institutional review board approval was obtained from the University of Georgia Human Subjects Office (Project00000045). Data from family and consumer sciences programs conducted in the northwest district of Georgia, which includes the Atlanta metro area, were collected from December 2018 to September 2019. Family and consumer sciences programs last approximately 1 to 3 hr, with an Extension professional (agent or program assistant) delivering educational material in English, Spanish, or both. Specific topics addressed included nutrition, weight loss, financial management, cancer prevention, chronic disease control, canning, and establishing healthy relationships. We were able to collect 1,592 completed questionnaires using the questions from Figures 1, 2, and 5.

Data Analysis

Self-Reported Knowledge Gain

The question measuring knowledge gain was designed to allow an Extension professional to compare a respondent's perception of their knowledge of the information presented before the program to their perception of their knowledge after the program. We analyzed the collected data by assigning a numerical score to the response to each part of the item ("BEFORE," "AFTER"): 1 = no knowledge, 2 = some knowledge, 3 = fairly knowledgeable, 4 = very knowledgeable, 5 = extremely knowledgeable. To determine whether a significant change in knowledge occurred, we compared the before and after mean scores of the program participants using a paired t test (p < .05). We then used Cohen's d to determine effect size.
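As an illustration, this analysis can be scripted in a few lines. The sketch below is a minimal Python example using hypothetical scores; the pooled-standard-deviation form of Cohen's d shown here is one common formulation.

```python
# Minimal sketch of the knowledge-gain analysis (hypothetical data).
import numpy as np
from scipy import stats

# Each position holds one respondent's BEFORE and AFTER scores
# (1 = no knowledge ... 5 = extremely knowledgeable).
before = np.array([2, 3, 1, 2, 4, 2, 3, 2])
after = np.array([4, 4, 3, 4, 5, 3, 4, 4])

# Paired t test comparing before and after mean scores (alpha = .05).
t_stat, p_value = stats.ttest_rel(before, after)

# Cohen's d using the pooled standard deviation of the two score sets;
# the sign is negative when knowledge increases, as in Table 1.
pooled_sd = np.sqrt((before.std(ddof=1) ** 2 + after.std(ddof=1) ** 2) / 2)
cohens_d = (before.mean() - after.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```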

Intent to Change Behavior

The intent to change behavior question is presented as categorical; therefore, the results are reported as a frequency count or percentage of respondents indicating each category. For the data we collected, we calculated proportions of respondents who selected each answer choice.
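For example, the counts and proportions can be tallied directly from the raw responses (a minimal sketch with hypothetical data):

```python
# Minimal sketch: frequencies and proportions for the categorical
# intent-to-change-behavior item (hypothetical responses).
from collections import Counter

responses = [
    "I will definitely use this information",
    "I will probably use this information",
    "I will definitely use this information",
    "I have not decided if I will use this information",
]

counts = Counter(responses)
n = len(responses)
for option, count in counts.most_common():
    print(f"{option}: {count} ({100 * count / n:.1f}%)")
```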

Self-Reported Economic Value

For the agriculture and natural resources questions, we multiplied the mean value of the dollar amount range a respondent selected for the first question by the mean value of the acreage range the respondent selected for the second question. For example, if a respondent indicated believing they would save or gain between $1.01 and $2.00, we would assign a value of $1.50 for that response. Then, if the respondent indicated believing that 10 to 49 ac would be affected by the program, we would assign an average value of 30 ac. Multiplying the two figures, we would determine that the individual ascribed an economic value of $45 to the information gained at the Extension program for the next growing season. We then summed the economic values assigned to the participants to reach an overall value for the program.

For responses from leadership and community development and family and consumer sciences program participants, we used the mean value of each of the selected dollar amounts. For example, for the questions that examined self-reported economic gain/savings anticipated for the coming month, if a respondent indicated believing they would gain or save between $50 and $99, we would assign a value of $75 for that response. We then summed the values assigned to the participants to determine what the overall value of the Extension program to its participants would be in the coming year or month (depending on wording of the question).
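A minimal sketch of both calculations follows. The range labels and midpoint values in the dictionaries are illustrative assumptions on our part; the instrument's actual answer choices appear in Figures 3–5, and the sketch follows the worked examples given above ($1.50 per acre across 30 ac yielding $45, and a $50–$99 response assigned $75).

```python
# Minimal sketch of the self-reported economic value calculations.
# Range labels and midpoints below are illustrative; the instrument's
# actual answer choices appear in Figures 3-5.

# Agriculture and natural resources: midpoint of the dollar-per-acre
# range times midpoint of the acreage range, summed over respondents.
DOLLARS_PER_ACRE = {"$0.01-$1.00": 0.50, "$1.01-$2.00": 1.50, "$2.01-$3.00": 2.50}
ACRES_AFFECTED = {"1-9": 5.0, "10-49": 30.0, "50-99": 75.0}

anr_responses = [("$1.01-$2.00", "10-49"), ("$2.01-$3.00", "50-99")]
anr_total = sum(DOLLARS_PER_ACRE[d] * ACRES_AFFECTED[a] for d, a in anr_responses)
# First respondent: $1.50 x 30 ac = $45, the worked example from the text.

# Leadership and community development / family and consumer sciences:
# midpoint of the selected dollar range, summed over respondents.
DOLLAR_RANGES = {"$1-$49": 25.0, "$50-$99": 75.0, "$100-$499": 300.0}

fcs_responses = ["$50-$99", "$100-$499", "$1-$49"]
fcs_total = sum(DOLLAR_RANGES[r] for r in fcs_responses)
# A "$50-$99" response is assigned $75, the worked example from the text.

print(f"Program-level value (ANR): ${anr_total:,.2f}")
print(f"Program-level value (FCS): ${fcs_total:,.2f}")
```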

Results

Self-Reported Knowledge Gain

Regardless of programmatic area, respondents reported a statistically significant change in knowledge from before to after the Extension program (see Table 1).

Table 1.
Summary of Self-Reported Knowledge Gained as a Result of Extension Programming

Program area                           n      Pre M (SD)    Post M (SD)   t         Cohen's d
Agriculture and natural resources      1,324  3.72 (1.13)   4.93 (.74)    −42.64**  −1.26
Leadership and community development   212    2.40 (.83)    3.58 (.71)    −50.41**  −1.97
Family and consumer sciences           1,369  2.75 (.92)    4.19 (.74)    −55.20**  −1.73
Aggregated                             2,905  3.17 (1.14)   4.48 (.86)    −71.54**  −1.28
Note. Values for scaled response options were as follows: 1 = no knowledge, 2 = some knowledge, 3 = fairly knowledgeable, 4 = very knowledgeable, 5 = extremely knowledgeable.
**p < .01.

Intent to Change Behavior

Respondents participating in the agriculture and natural resources Extension programs were most likely to say that they definitely would use the information provided, followed by family and consumer sciences participants and then leadership and community development participants (see Table 2). Regardless of program area, over 90% of all Extension program participants reported that they probably or definitely would use the information they received to change their behavior.

Table 2.
Summary of Intent to Change Behavior as a Result of Extension Programming

Program area                           n      Definitely     Probably       Undecided  Probably   Definitely
                                              will not (%)   will not (%)   (%)        will (%)   will (%)
Agriculture and natural resources      1,490  .5             .8             2.5        20.6       75.6
Leadership and community development   217    .0             1.4            5.1        35.3       58.1
Family and consumer sciences           1,592  .1             1.2            2.6        19.1       74.4
Aggregated                             3,256  .3             1.0            2.7        21.1       74.8
Note. Response options were "I definitely will not use this information," "I probably will not use this information," "I have not decided if I will use this information," "I will probably use this information," and "I will definitely use this information."

Self-Reported Economic Value

Through self-reports, respondents from agriculture and natural resources programs associated the information they received at crop production meetings with over $13 million of value relative to the subsequent growing season (see Table 3 at the end of this section). We divided this figure by the number of respondents to determine that the average self-reported economic value of Extension information to agriculture and natural resources audience members was $10,272. Analyzing the data for respondents individually revealed that the self-reported economic values ranged from $0 to $375,000 per person.

Through self-reports, respondents from leadership and community development programs associated the information they received from Extension with over $1,205,000 of financial benefit they would realize in the subsequent year (see Table 3 at the end of this section). We divided this figure by the number of respondents to determine that the average self-reported economic value for the coming year was $8,310. Analyzing the data for respondents individually, we found that the self-reported economic values ranged from $0 to $350,000 per person.

Through self-reports, respondents from family and consumer sciences programs associated the information they received with $83,843 that would be saved or gained over the subsequent month (see Table 3 at the end of this section). We divided this figure by the number of respondents to determine that the average self-reported economic value relative to the subsequent month was $65.52. Analyzing the data for respondents individually, we found that the self-reported economic values ranged from $0 to $1,000 per person. If we assume that the responses obtained are representative of a typical month, we can extrapolate to determine that the economic value for the subsequent year would be $1,006,116.

Finally, we examined the data in aggregated form to calculate the total value participants would realize over the course of the subsequent year. Self-reports of participants in the programs studied indicated that the information they received from Georgia Cooperative Extension would be worth $15,493,789 to them over the subsequent year.

Table 3.
Summary of Self-Reported Economic Value of Extension Programming

Program area                           n      Saved/gained      Production      Saved/gained      Value of information     Financial benefit over
                                              per acre (total)  acres impacted  over subsequent   over subsequent          subsequent year
                                                                                month             growing season
Agriculture and natural resources      1,293  $16,542           1,009,328       —                 $13,282,673 a            —
Leadership and community development   145    —                 —               —                 —                        $1,205,000 b
Family and consumer sciences           1,274  —                 —               $83,843           —                        $1,006,116 c
Aggregated total                       2,712  —                 —               —                 —                        $15,493,789 d
a Calculated by multiplying the individually reported amount expected to be saved/gained per acre by the same individual's reported size of production acres affected and then summing the values.
b Sum of the responses provided by leadership and community development Extension program participants.
c Calculated by multiplying the amount expected to be saved/gained over the subsequent month by 12.
d Sum of the total value identified within the three program areas studied.

Discussion

While the economic value gives us a starting figure for estimating the public value of the Extension programs under inquiry, limitations must be recognized. First, the economic value takes into account only one growing season for agriculture and natural resources program participants. The information provided at agriculture and natural resources county crop production meetings in Georgia often leads to a change in best management practice or the adoption of a new technology that will be implemented across many growing seasons; therefore, the impact is expected to extend beyond a single season. The economic value also reflects only those individuals who responded to the economic question; it does not account for nonrespondents, as missing data were not imputed. Furthermore, one should consider the indirect effects of information transfer that occur when farmers share the knowledge they gained with other farmers or when consultants obtain information and share it with their clients. Because not all individuals receiving information from the Extension programs studied were accounted for, the economic value underestimates actual public value and should be thought of as a lower bound estimate.

In addition, the values associated with impact perceived by family and consumer sciences program participants do not acknowledge that the information likely will have a broader effect. Participants in family and consumer sciences programs often represent the influencer in a family unit. Therefore, the captured behavior adoption or knowledge gain is restricted to the effect on the participant and does not reflect the change in condition of other family members or dependents, both immediately and over time.

Finally, other limitations should be acknowledged when interpreting the results. First, there are limitations associated with self-reporting. When responding, program participants draw on their perceptions at the moment the data are captured and may be overreporting or underreporting their actual knowledge levels, behavioral intention, and/or associated economic value. Second, the instrument measures only behavioral intent and not actual behavior change. Although it is recognized that an individual must have behavioral intent in order to change behavior (Ajzen, 1991), behavioral intent does not necessarily result in actual behavioral change or long-term adoption of a behavior.

Despite these limitations, use of the Extension public value instrument provided us with a way to measure aggregated knowledge gain, intent to change behavior (short-term outcomes leading to medium-term outcomes), and self-reported economic value (long-term outcome) of Extension programs. Given that Extension has repeatedly fallen short, with a lack of data proving its worth (Andrews, 1983; Chapman-Novakofski et al., 1997; Lamm et al., 2013; Radhakrishna & Relado, 2009), the Extension public value instrument could be a first move in Extension's ability to report aggregated short-, medium-, and long-term programmatic successes.

Next steps include broadening the use of the Extension public value instrument to encompass more diverse Extension programs, identifying ways to fill in missing financial data, and exploring sampling methods to extrapolate to the broader population while reducing the burden of survey fatigue on Extension program participants. Once these methods are identified and tested, expanding data collection outside the state of Georgia would allow for regional and national programs to measure aggregated effects. Extension leaders should also consider developing standardized evaluation tools for signature Extension programs that include economic evaluation measures to further capture objective data above and beyond the information obtained by this tool. If public value is measured and reported, Extension will continue to be seen as a valuable asset to the United States, worthy of additional human and financial investment.

Author Note

We would like to thank Dr. Jared Whitaker, Dr. Scott Monfort, and the family and consumer sciences agents in the Northwest Extension District of Georgia for their assistance in collecting the data needed to test and verify the use of the Extension public value instrument.

Correspondence concerning this article should be addressed to Alexa J. Lamm. Email: alamm@uga.edu

References

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T

Andrews, M. (1983). Evaluation: An essential process. Journal of Extension, 21(5), 8–13. http://www.joe.org/joe/1983september/83-5-a1.pdf

Arnold, M. E. (2002). Be "logical" about program evaluation: Begin with learning assessment. Journal of Extension, 40(3), Article 3FEA4. http://www.joe.org/joe/2002june/a4.php

Baughman, S., Arnold, M., Boyd, H. H., Franz, N. K., Mead, J. P., Rowe, E., & Silliman, B. (2010). Evaluating for impact: Professional development educational content delivery through learning communities. Journal of Extension, 48(3), Article v48-3tt3. https://www.joe.org/joe/2010june/tt3.php

Chapman-Novakofski, K., Boeckner, L. S., Canton, R., Clark, C. D., Keim, K., Britten, P., & McClelland, J. (1997). Evaluating evaluation: What we've learned. Journal of Extension, 35(1), Article IRIB2. http://www.joe.org/joe/1997february/rb2.php

Diaz, J., Kumar Chaudhary, A., Jayaratne, K. S. U., & Warner, L. A. (2019). Program evaluation challenges and obstacles faced by new Extension agents: Implications for capacity building. Journal of Extension, 57(4), Article v57-4a1. https://www.joe.org/joe/2019august/a1.php

Franz, N. K., & Archibald, T. (2018). Four approaches to building Extension program evaluation capacity. Journal of Extension, 56(4), Article v56-4tt5. https://www.joe.org/joe/2018august/tt5.php

Gonyea, R. (2005). Self-reported data in institutional research: Review and recommendations. New Directions for Institutional Research, 127, 73–89.

Lamm, A. J., Harder, A., Israel, G. D., & Diehl, D. (2011). Team-based evaluation of Extension programs. EDIS Publication #WC118. University of Florida. http://edis.ifas.ufl.edu/wc118

Lamm, A. J., Israel, G. D., & Diehl, D. (2013). A national perspective on the current evaluation activities in Extension. Journal of Extension, 51(1), Article v51-1a1. https://www.joe.org/joe/2013february/a1.php

Moore, M. H. (1995). Creating public value: Strategic management in government. Harvard University Press.

Radhakrishna, R. B., & Relado, R. Z. (2009). A framework to link evaluation questions to program outcomes. Journal of Extension, 47(3), Article v47-3tt2. http://www.joe.org/joe/2009june/tt2.php

Rennekamp, R. A., & Arnold, M. E. (2009). What progress, program evaluation? Reflections on a quarter-century of Extension evaluation practice. Journal of Extension, 47(3), Article v47-3comm1. http://www.joe.org/joe/2009june/comm1.php

Silliman, B. (2016). E-basics: Online basic training in program evaluation. Journal of Extension, 54(1), Article v54-1tt1. https://www.joe.org/joe/2016february/tt1.php

Warner, P. D., & Christenson, J. A. (1984). The Cooperative Extension Service: A national assessment. Westview Press.