The Journal of Extension - www.joe.org

April 2016 // Volume 54 // Number 2 // Tools of the Trade // v54-2tt1

From Knowledge to Action: Tips for Encouraging and Measuring Program-Related Behavior Change

Abstract
It is challenging to document the behavior changes that result from Extension programming. This article describes an evaluation method we call the "action items method." Unlike other approaches for measuring behavior change, this method requires program participants to define their own action plans as part of a program and then asks them about completing these goals several months after program completion. To the extent that we help participants identify specific behavioral changes that move them beyond their individual lives, we also exemplify the public value of Extension programming.


Scott Chazdon
Evaluation and Research Specialist, Community Vitality
University of Minnesota Extension
St. Paul, Minnesota
schazdon@umn.edu

Jody Horntvedt
Extension Educator, Leadership and Civic Engagement
University of Minnesota Extension
Roseau, Minnesota
hornt001@umn.edu

Elizabeth Templin
Extension Educator, Community Economics
University of Minnesota Extension
Andover, Minnesota
templin@umn.edu

Much of our effort as Extension educators and specialists focuses on providing people with credible, research-based information in the hope that it will ultimately lead to behavior change. Although we spend considerable time developing curricula and learning objectives, and sometimes even ways of measuring those objectives, the task of actually finding out whether people have moved from knowledge to action is often perceived as too labor intensive or difficult.

This article offers suggestions for following up with program participants and measuring behavior change several months after program completion. The context for these recommendations stems from Extension's community development programming, but these strategies could easily be applied to other content areas. The particular method discussed in this article is called the "action items method." This method, unlike other approaches for measuring behavior change, requires program participants to define their own action plans as part of the program and then asks them about completing these goals several months after program completion.

Before providing a description of the action items method, it is worth noting the context for behavior change measurement in Extension education. Bennett (1975) noted in the Journal of Extension that behavior change was among the highest levels of evidence for evaluating Extension education. Since then, Extension has made progress in measuring behavior change, and many examples of effective behavior change evaluation have been published in this journal (Clements, 1999; Garst & Bruce, 2003; Garton et al., 2003; Jayaratne, Harrison, & Bales, 2009; Koszewski, Sehi, Behrends, & Tuttle, 2011). Workman and Scheer's (2012) analysis of evaluation articles published in the Journal of Extension found that about 27% of articles focused on behavior change. Yet, as they noted, "Too often, Extension personnel fail to document impact of programs by collecting real evidence of behavior change or greater end results that benefit society" (Problem Statement, Purpose, and Objectives section, para. 1).

Most important, perhaps, is that the National Institute of Food and Agriculture continues to push for impacts that affect conditions rather than simply knowledge changes. Its effort to collect impacts from across the country encourages evaluation specialists to look beyond knowledge change (National Institute of Food and Agriculture, 2015).

The Action Items Method

The University of Minnesota's Extension Center for Community Vitality is currently developing an action items method to measure behavior change in several of its leadership workshops and in one leadership cohort program. To clarify the action items approach, several core components of the method were identified and are shown in Table 1. The table provides information on how the method is conducted in two distinct program contexts. The components of the method include program delivery enhancements, program evaluation enhancements, postprogram enhancements, and a feedback loop:

  • Program delivery enhancements. People cannot be asked to identify action items if they have not been prepared for the task. The educator must devote instructional time to helping participants think about their goals and the specific steps they will take to achieve those goals.
  • Program evaluation enhancements. It is important for participants to leave the program with a copy of their action items. There are many ways to achieve this, such as by using carbonless forms or by scanning forms after the session and sending them to participants. The action items must be typed into a database and linked to each participant so that they can be used later for a follow-up survey.
  • Postprogram enhancements. Engagement with participants after a program requires time, but strategies such as using technology, enlisting the help of support staff, or creating alumni-oriented programs can be helpful. Online survey software, such as Qualtrics, makes it possible to integrate customized data into an online survey so that participants can report their progress on their own specific items (a sketch of one way to set up these data follows this list).
  
  • Feedback loop. The information collected, including the action items themselves, provides a rich source of data that can be used for program development.
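
To make the second and third components more concrete, the sketch below shows one way the action items collected on paper forms might be typed into a simple participant-level dataset and exported for a customized follow-up survey. It is a minimal illustration, not the authors' actual workflow: the column names, file name, and three-item limit are assumptions, and any survey tool that supports merged or embedded data could stand in for Qualtrics.

    # Minimal sketch: enter each participant's action items into a flat dataset
    # and export a contact list that a survey tool can use to personalize a
    # follow-up survey. Column names and file names are illustrative only.
    import csv

    # One record per participant, typed in from the collected action item forms.
    participants = [
        {
            "Email": "owner@example.com",
            "FirstName": "Pat",
            "action_item_1": "Set up a Facebook business page by June 1",
            "action_item_2": "Ask three regular customers for online reviews",
            "action_item_3": "",
        },
        # ...one row per participant...
    ]

    # Write the contact list. Extra columns can be treated as embedded data by
    # survey platforms, so the follow-up survey can show each respondent their
    # own items ("You said you would: <action_item_1>. Did you complete this?").
    fieldnames = ["Email", "FirstName", "action_item_1", "action_item_2", "action_item_3"]
    with open("followup_contact_list.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(participants)

In Qualtrics, for example, extra columns in an uploaded contact list can be stored as embedded data and piped into question text, which is how a follow-up survey can ask each participant about the exact items they wrote down.
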
Table 1.
Components of the Action Items Method for a Workshop and a Cohort Program

The table describes each component of the method in two program contexts:
  • One-time workshop. Example: eMarketing workshops (target audience = small, locally owned retail and service businesses).
  • Cohort program. Example: Red River Valley Emerging Leadership Program (target audience = 30- to 45-year-olds, together for four sessions during a 5-month period).

Program delivery enhancements

One-time workshop:
  Prior to 2015:
  • Evaluations focused on knowledge gain. Participants were asked to identify action items, but there was limited focus in the curriculum on behavior change or action goals.
  Beginning in 2015:
  • Prompted by training on brain research and adult education theory, the team examined how curricula could better identify action steps for workshop participants.

Cohort program:
  Prior to 2013:
  • Encouraged participants to set personal goals as they completed the program.
  Beginning in 2013:
  • Started collecting "action items" at the end of the program. Participants were asked to list 3–5 items.
  In 2014–15:
  • Added a 2-hr workshop to encourage participants to reflect on leadership learning.

Program evaluation enhancements

One-time workshop:
  • A form, prepared in duplicate, is completed at the end of the meeting.
  • The participant tears off a copy to take away and turns in a copy to Extension staff.
  • At the end of the event, participants are asked the following questions:
    • Thinking about what you've learned here today, what specific actions do you intend to take in the next few months?
    • Is there anything you decided not to do as a result of this session?
    • Are there other individuals you plan to share this information with? If so, who? (Example: my employees, my city council, my banker, etc.)

Cohort program:
  • A worksheet, My Action Items, is provided to participants to record details about their action steps. The worksheet includes examples (to jump-start their thinking) and requires a signature (to instill a sense of ownership).
  • Staff collect the forms, scan them, and mail the originals back to participants within 1 month.
  • The worksheet completed by participants includes the following prompts:
    WHAT. Here's what I will do (I will use/practice...)
    WHERE. Here's where (in my family, work, community...)
    WHEN and HOW. Here's when and how (by [date] and by [examples]...)
    WHO. Here's who will help me (my mentor(s) and/or motivator(s) are...)

Postprogram enhancements

One-time workshop (evaluator-led):
  • A Qualtrics survey is emailed to participants, customized with their action items. Qualtrics allows the evaluator to send the survey under the name of the educator, which increases response rates.
  • The survey is typically sent 3 months after the end of the program. (Each community economics offering has a slightly different time frame.)

Cohort program (educator-led):
  • Educators send regular mail (letters and/or postcards) monthly and follow up with personal emails monthly.
  • Educators connect with participants via social media (a closed Facebook group for participants) and through one-on-one contacts initiated by participants.
  • Educators use a Qualtrics survey (16 months after the program) for evaluation, customized with each participant's action items.

Feedback loop

One-time workshop:
  • The actions that participants list alert staff to topics for new curricula.
  • The actions that participants list help determine whether marketing materials were clear regarding learning objectives.
  • Comparing action items with what people actually did helps identify where more detailed instruction is needed in the curriculum.

Cohort program:
  • Educators review action plans to identify areas in which participants might need support and then use that information to design an alumni retreat (3 months after the session) for continued learning.
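
The feedback loop row of Table 1 implies a simple analysis step: once follow-up responses come back, tallying how many action items were completed, and collecting the items participants did not complete, points educators toward topics that need more detailed instruction or postprogram support. The sketch below is one hypothetical way to do that tally; the file layout and status labels are assumptions rather than the programs' actual survey export.

    # Hypothetical tally of follow-up survey responses on action item completion.
    # Assumes the survey export has, for each of up to three action items, the
    # item text and a status column ("Completed", "In progress", "Did not start").
    import csv
    from collections import Counter

    status_counts = Counter()
    items_needing_support = []

    with open("followup_responses.csv", newline="") as f:
        for row in csv.DictReader(f):
            for i in (1, 2, 3):
                item = row.get(f"action_item_{i}", "").strip()
                status = row.get(f"action_item_{i}_status", "").strip()
                if not item or not status:
                    continue
                status_counts[status] += 1
                if status != "Completed":
                    # Keep the participant's own wording for curriculum review.
                    items_needing_support.append(item)

    print("Action item follow-through:", dict(status_counts))
    print("Items to consider for added instruction or support:")
    for item in items_needing_support:
        print(" -", item)
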

Lessons Learned

  • The action items method must be embedded into the entire program delivery cycle (before, during, and after) for maximum effectiveness.
  • Participants are much more likely to respond to postprogram surveys if the staff has built a relationship with them during the program. Cohort programs allow for this to happen more readily than one-time workshops.
  • A program's brevity or low intensity is not an impediment to measuring behavior change: even a one-time, 1-hr workshop can produce measurable behavior change.
  • Having a postprogram strategy to engage participants supports the action items method. Ideas include using online social media groups to share ongoing learning and insights, creating online book clubs, sending regular postcards, and hosting special gatherings for program alumni.

By encouraging participants to identify personal action items, we remind ourselves, as well as the participants, of the enduring value of applying the knowledge gained, of evaluation, and of the importance of adding value to their lives, organizations, and communities. To the extent that we help participants identify specific behavioral changes that move them beyond their individual lives, we also exemplify the public value of Extension programming (Chazdon & Paine, 2014; Franz, 2011; Kalambokidis, 2004).

References

Bennett, C. (1975). Up the hierarchy. Journal of Extension [online], 13(2), 7–12. Available at: http://www.joe.org/joe/1975march/1975-2-a1.pdf

Chazdon, S., & Paine, N. (2014). Evaluating for public value: Clarifying the relationship between public value and program evaluation. Journal of Human Sciences and Extension, 2(2), 100–119. Retrieved from http://media.wix.com/ugd/c8fe6e_8b2458db408640e580cfbeb5f8c339ca.pdf

Clements, J. (1999). Results? Behavior change! Journal of Extension [online], 37(2) Article 2COM1. Available at: http://www.joe.org/joe/1999april/comm1.php

Franz, N. K. (2011). Advancing the public value movement: Sustaining Extension during tough times. Journal of Extension [online], 49(2) Article 2COM2. Available at: http://www.joe.org/joe/2011april/comm2.php

Garst, B. A., & Bruce, F. A. (2003). Identifying 4-H camping outcomes using a standardized evaluation process across multiple 4-H educational centers. Journal of Extension [online], 41(3) Article 3RIB2. Available at: http://www.joe.org/joe/2003june/rb2.php

Garton, M., Hicks, K., Leatherman, M., Miltenberger, M., Mulkeen, P., Nelson-Mitchell, L., & Winland, C. (2003). Newsletters: Treasures or trash? Parenting newsletter series results in positive behavior changes. Journal of Extension [online], 41(1) Article 1RIB5. Available at: http://www.joe.org/joe/2003february/rb5.php

Jayaratne, K. S. U., Harrison, J. A., & Bales, D. W. (2009). Impact evaluation of food safety self-study Extension programs: Do changes in knowledge relate to changes in behavior of program participants? Journal of Extension [online], 47(3) Article 3RIB1. Available at: http://www.joe.org/joe/2009june/rb1.php

Kalambokidis, L. (2004). Identifying the public value in Extension programs. Journal of Extension [online], 42(2) Article 2FEA1. Available at: http://www.joe.org/joe/2004april/a1.php

Koszewski, W., Sehi, N., Behrends, D., & Tuttle, E. (2011). The impact of SNAP-ED and EFNEP on program graduates 6 months after graduation. Journal of Extension [online], 49(5) Article 5RIB6. Available at: http://www.joe.org/joe/2011october/rb6.php

National Institute of Food and Agriculture. (2015). NIFA 2015 impact report. Retrieved from http://nifa.usda.gov/sites/default/files/resource/NIFA%202015%20Impact%20Report%20Web%20Version.pdf

Workman, J. D., & Scheer, S. D. (2012). Evidence of impact: Examination of evaluation studies published in the Journal of Extension. Journal of Extension [online], 50(2) Article 2FEA1. Available at: http://www.joe.org/joe/2012april/a1.php