June 2020 // Volume 58 // Number 3 // Tools of the Trade // v58-3tt2
Evaluation Tool for Collecting Statewide Outcomes for Single-Session Programs
Abstract
Evaluation is critical to demonstrating program value and impact and to better communicating outcomes to stakeholders. Purdue Extension Health and Human Sciences (HHS Extension) created an evaluation tool to meet the need to collect statewide metrics via a standardized set of questions addressing the topics of food, family, money, and health. This evaluation tool, Survey Builder, allows Extension educators to customize evaluations for single-session programs using a streamlined online approach. Data from Survey Builder allow HHS Extension to demonstrate the collective outcomes of statewide programming efforts. Survey Builder was developed to be used by other organizations as well.
Introduction
Purdue Extension Health and Human Sciences (HHS Extension) has created an evaluation instrument called Survey Builder to collect statewide metrics via a standardized set of questions. HHS Extension educators use program-specific evaluation tools for multisession statewide health and human sciences programs but previously lacked an efficient and effective way to collect and aggregate outcomes of single-session, community-based programs in the content areas of food, family, money, and health. Common outcomes equip Extension to better communicate programming impact and value to stakeholders (Wise, 2017).
Framework/Methodology
Survey Builder is an online system that we in HHS Extension developed to allow educators to create, implement, and enter data for their own program evaluations. It provides collective outcomes related to knowledge gain and intended behavior change as a result of one-time educational sessions that are typically less than 1 hr and delivered to adult learners by HHS Extension educators across the state.
Evaluations developed in Survey Builder involve a retrospective post-then-pre design. This design is an effective way to assess learners' self-reported changes in knowledge and behaviors (Klatt & Taylor-Powell, 2005). The retrospective post-then-pre design minimizes biases that result from pretest overestimation or underestimation (Lam & Bengo, 2003; Raidl et al., 2004). Survey Builder includes two types of questions: (a) those that educators are able to choose from a standardized list and (b) those that automatically appear on every evaluation. The standardized questions include intended behavior questions that focus on skills and actions the participant will or may take after attending the program and knowledge questions that evaluate self-reported knowledge gained while attending the program. The default (automatic) questions assess demographics and transformational learning and allow participants to share an example of how the program made or will make a difference to them.
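To illustrate how responses to a retrospective post-then-pre item might be summarized, the sketch below computes the mean self-reported change for each question. The rating scale, field names, and scoring logic are assumptions for illustration only and do not represent Survey Builder's actual implementation.

```python
# Illustrative sketch only: scale values, field names, and scoring are
# assumptions, not Survey Builder's actual implementation.
from statistics import mean

# Each record holds one participant's retrospective ratings for one question:
# "after" = self-rating after the program, "before" = recalled rating for
# before the program; both are collected on the same post-program survey.
responses = [
    {"question": "I know how to reduce my risk of developing a chronic disease.",
     "before": 2, "after": 4},
    {"question": "I know how to reduce my risk of developing a chronic disease.",
     "before": 3, "after": 5},
]

def summarize(responses):
    """Group post-then-pre ratings by question and report mean self-reported change."""
    by_question = {}
    for r in responses:
        by_question.setdefault(r["question"], []).append(r["after"] - r["before"])
    return {q: mean(changes) for q, changes in by_question.items()}

print(summarize(responses))
# {'I know how to reduce my risk of developing a chronic disease.': 2.0}
```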
Transformational learning theory suggests that beyond a gain in knowledge, changes will occur in one's thoughts, feelings, attitudes, and/or behaviors. In turn, these changes can affect understanding of oneself or how one engages with other people and systems and/or responds to an environment (Simsek, 2012). Survey Builder is intended to collect short-term impacts only; longitudinal tracking of participants is not possible. Therefore, the addition of the two transformational learning–related default questions to Survey Builder allows for some conclusions regarding transformational learning as a result of attending HHS Extension programs. Participant responses to these statements ("I learned information that I will share with others" and "I learned information that will improve my life in a positive way") give insight into the value of the program and how the knowledge gained from HHS Extension programming affects participants' lives after a program concludes.
Development
We used a collaborative process to develop learner objectives and questions in Survey Builder. Educators worked with HHS Extension specialists who lead the content areas associated with food, family, money, and health to brainstorm educational topics taught in those content areas. Learner objectives and questions (Table 1) were identified that assessed changes in knowledge and intended behavior relating to the content areas. The questions were reviewed to ensure face validity and comprehension. Translation of Survey Builder from English to Spanish was conducted by a third-party vendor.
Table 1.
Examples of Learner Objectives and Corresponding Questions

Learner objective | Question
---|---
Knowledge |
The learner indicates increased knowledge in PHYSICAL HEALTH. | I know how to reduce my risk of developing a chronic disease.
The learner indicates increased knowledge in RELATIONSHIPS/SOCIAL HEALTH. | I know how to create positive experiences with others.
Intended behavior |
The learner intends to improve their FINANCIAL EMPOWERMENT. | I plan to save money for emergencies.
The learner intends to improve their FOOD SAFETY. | I intend to use a food thermometer when cooking and reheating food.
Technical Design
Survey Builder is a custom-built evaluation instrument. Access to Survey Builder is password protected to help ensure data security. Upon entry into Survey Builder, educators start the process of creating a survey by entering basic information about the program they will be evaluating. Next they select which of the approximately 130 knowledge and intended behavior questions to include on the survey on the basis of the learner objectives they have designated for the program. Once all questions have been added to the survey, the educator is able to print the survey in English or Spanish or make it available electronically via short message service (SMS) text or web link.
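As a rough illustration of the kind of survey record this workflow implies, the following sketch models a program survey assembled from a standardized question bank plus automatically included default questions. The class names and fields are hypothetical and are not Survey Builder's actual schema.

```python
# Hypothetical data model for a single-session program survey; names and
# fields are illustrative assumptions, not Survey Builder's actual schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    qid: str
    text: str
    category: str  # e.g., "knowledge" or "intended behavior"

@dataclass
class Survey:
    program_title: str
    county: str
    educator: str
    selected: List[Question] = field(default_factory=list)

    def add_from_bank(self, bank: List[Question], qids: List[str]) -> None:
        """Pull the educator-selected questions from the standardized bank."""
        self.selected.extend(q for q in bank if q.qid in qids)

# Default questions (demographics and the two transformational learning
# statements) would be appended automatically to every survey.
```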
The survey is administered to participants following a face-to-face or virtual educational program. The educator collects the completed paper surveys from the participants and enters the data into Survey Builder. Electronic participant entries automatically populate into Survey Builder, thus eliminating the need for manual entry of the survey responses. Survey Builder generates custom reports by county, educator, or program title.
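A minimal sketch of how entered responses might be aggregated into reports grouped by county, educator, or program title appears below; the record layout and grouping keys are assumptions made only to illustrate the reporting idea.

```python
# Minimal aggregation sketch; record layout and grouping keys are assumptions.
from collections import defaultdict

records = [
    {"county": "Tippecanoe", "educator": "A", "program": "Food Safety 101",
     "before": 2, "after": 4},
    {"county": "Marion", "educator": "B", "program": "Food Safety 101",
     "before": 3, "after": 4},
]

def report(records, group_by):
    """Average self-reported change, grouped by the chosen field."""
    totals = defaultdict(lambda: [0, 0])  # group -> [sum of changes, count]
    for r in records:
        t = totals[r[group_by]]
        t[0] += r["after"] - r["before"]
        t[1] += 1
    return {group: change_sum / count for group, (change_sum, count) in totals.items()}

print(report(records, "program"))  # {'Food Safety 101': 1.5}
print(report(records, "county"))   # {'Tippecanoe': 2.0, 'Marion': 1.0}
```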
Application
Survey Builder has been used since 2014 to evaluate 1,542 programs, with responses from 20,563 individuals. The goal of Survey Builder is to better communicate programming impact and value to stakeholders. Each programming year, aggregated results from Survey Builder are featured in various formats, including an infographic. This infographic highlights key findings from programming efforts, as well as examples of increased knowledge and intended behavior change among program participants (Figure 1). HHS Extension uses the infographic to communicate impact and value to board members, elected officials, leaders, and other partners.
Figure 1.
Survey Builder Infographic Excerpt
Although Survey Builder was designed for HHS Extension, it could easily be adapted for use with different educational content in other areas of Extension, such as agriculture and natural resources or community development. The standardized question categories also have the potential to extend beyond knowledge and intended behavior to include questions centered on attitudes or beliefs, for example. Survey Builder was developed to be used by other organizations. Institutions that are interested should contact a member of our author team for more information.
Conclusion
The goal of Survey Builder is to better communicate the impact and value of single-session, community-based programs to stakeholders. Through its development and use by Extension educators, this goal is being achieved. Survey Builder captures demographic data of program participants. This information helps HHS Extension identify gaps in audiences served and direct efforts toward recruiting more diverse audiences. Data also reveal the breadth of content being delivered to program participants. Being able to show positive changes among program participants reflects the role and value these programs have in improving the health, well-being, and quality of life of individuals and families across the state. The ability to articulate outcomes and impacts identified via Survey Builder enables HHS Extension to be competitive in seeking funds to expand programming and increase the number of people served.
Acknowledgments
We wish to acknowledge the contributors to the development and implementation of Survey Builder, including HHS Extension administration, specialists, and educators; former HHS Extension web developer Russell Query and independent contractor Isodomum LLC; and Purdue Extension strategic initiatives coordinator Julie Huetteman.
References
Klatt, J., & Taylor-Powell, E. (2005). Using the retrospective post-then-pre design (Quick Tips #27). Retrieved from https://fyi.uwex.edu/programdevelopment/files/2016/04/Tipsheet27.pdf
Lam, T., & Bengo, P. (2003). A comparison of three retrospective self-reporting methods of measuring change in instructional practice. American Journal of Evaluation, 24(1), 65–80. doi:10.1177/109821400302400106
Raidl, M., Johnson, S., Gardiner, K., Denham, M., Spain, K., Lanting, R., . . . Barron, K. (2004). Use retrospective surveys to obtain complete data sets and measure impact in Extension programs. Journal of Extension, 42(2), Article 2RIB2. Available at: https://www.joe.org/joe/2004april/rb2.php
Simsek, A. (2012). Transformational learning. In Encyclopedia of the sciences of learning (Vol. 2, pp. 3341–3344). Boston, MA: Springer.
Wise, D. (2017). Evaluating Extension impact on a nationwide level: Focus on programs or concepts? Journal of Extension, 55(1), Article v55-1comm1. Available at: https://www.joe.org/joe/2017february/comm1.php