February 2016 // Volume 54 // Number 1 // Tools of the Trade // v54-1tt1
E-Basics: Online Basic Training in Program Evaluation
Abstract
E-Basics is an online training in program evaluation concepts and skills designed for youth development professionals, especially those working in nonformal science education. The 10 hours of training, organized into seven modules, prepare participants for mentoring and applied practice, mastery, and/or team leadership in program evaluation. In this article, implications for practice, policy, and research are reviewed.
Context for Professional Development in Evaluation
Program evaluation skills are critical to Extension professionals (Arnold et al., 2008; Harder, Place, & Scheer, 2010; McClure, Fuhrman, & Morgan, 2012; Rodgers, Hillaker, Haas, & Peters, 2012), yet resources, including time, tools, and expertise, are shrinking as budgets decrease and workloads increase (Lamm & Israel, 2013). Two capacity-building initiatives found that most Extension staff identified themselves as "novices" or "beginners" in evaluation skills, at least before training (Douglah, Boyd, & Gundermann, 2003 [as cited in Arnold, 2006]). Academic courses and professional workshops are typically more effective in promoting knowledge gain than skill mastery, which requires field experience and mentoring approaches (Arnold, 2006; Dillman, 2013). However, most novice practitioners find knowledge of basic concepts a valuable prerequisite for applied learning experiences (Arnold, 2006).
Youth professionals face time, cost, and skill-learning pressures in keeping up with the demands of their field (Lekies & Bennett, 2011). Online professional development provides a convenient, low-cost, and effective alternative to face-to-face training (Fishman et al., 2013). Online platforms offer flexible, self-directed learning that accommodates the scheduling, topical, and delivery needs of community-based professionals (Archer, Bruns, & Heaney, 2007; McCann, 2007; McClure et al., 2012; Zint, Dowd, & Covitt, 2011).
Needs Assessment and Design of E-Basics
Building on needs assessment data (National Association of Extension 4-H Agents, 2006) and on professional development and evaluation competency frameworks (National Professional Development Task Force, 2004; Arnold et al., 2008), North Carolina State University, in collaboration with National 4-H Council and National 4-H Headquarters and with support from the Noyce Foundation, initiated development of an online basic training in program evaluation. Six webinar-based needs assessment forums with members of 12 4-H state science teams identified strong interest in content focused on basic skills and featuring practical examples, especially content organized in brief, logically sequenced units, with self-paced learning supported by electronic record keeping (Wagstaff & Silliman, 2012). These preferences, consistent with best practices for online learning (Salas, Tannenbaum, Kraiger, & Smith-Jentsch, 2012), were incorporated into the design of the online training program E-Basics. In addition to a PowerPoint teaching video, each of the 33 units in the seven modules includes an outline with links to related websites and fact sheets, content-based quizzes, exercises that allow individuals or teams to reflect on and apply evaluation concepts, and access to an online evaluation glossary. A discussion board and periodic webinars also are planned. Module and unit content is summarized in Table 1.
Table 1. Module and Unit Content

Module Number | Module | Unit(s) | Story Line
0 | Orientation | Preview of Content, Format | Sample slides from training
1 | Planning | Program Development; Planning Tools; Defining Evaluation; Purposes of Evaluation; Theories of Change | Club plans for competition
2 | Focusing | Questions and Indicators; SMART Objectives; Evaluation Management; Evaluating Maturing Programs; Evaluation Standards | Coordinating activities with measures
3 | Designing | Timelines; Types and Timing of Data Collection; Design Strategies, Sampling; Real-World Evaluation | Club plans work, takes field trip, and observes experiments
4 | Selecting Methods | Measurement Mindset; Checklists and Rubrics; Tests and Surveys; Interviews, Focus Groups; Portfolio and Participatory Evaluation | Reflecting on matching goals to measures; samples from club activities
5 | Collecting Data | Issues and Ethics; Protocols for Gathering Data; Organizing and Managing Data; Perspective on the Data Collection Process | Reflecting on data gathering process
6 | Analyzing Data | Taking Inventory on Data; Basic Statistics; Content Analysis; Applying Evaluation Findings; Visualizing Success | Reflecting on data analysis
7 | Communicating Results | Communicating Results; Writing a Report; Proofreading and Improving; Communicating Live; Communicating Value | Reflecting on reporting; examples of club reporting
— | Epilogue | Dramas and Dilemmas of Evaluating | Limits and potential of evaluation process
Story Line: Practical Examples
Modules 1 through 4 illustrate evaluation concepts and skills through scenes of a 4-H club of middle school youth preparing for a robotics competition. For instance, in the logic model segment, club members consider what building materials, work space, and expert help (inputs) they will need to prepare for a competition. The use of illustrations in context is rooted in the 4-H science mandate, in assessment recommendations calling for relevant examples, and in guiding values that connect evaluation to experiential learning.
In Modules 5 and 6, examples expand to include adult professional contexts as well as youth events relevant to formal data gathering, hiring of an evaluation consultant, and analysis and interpretation of program results. For instance, viewers are challenged to consider how they might integrate reports of several robotics clubs and other types of groups and activities into a coherent overarching report of local program activities and outcomes. Module 7 closes with tips for communicating public value. The brief Epilogue module reminds learners that the value of youth programs can be measured in different ways but that some of their most enduring value is beyond measurement.
Access and Use of E-Basics
Participants currently enroll at no cost and then complete a self-assessment and user profile in preparation for online learning. Quizzes are available at the end of each unit and module. A posttraining self-assessment offers a self-check on progress, whereas a survey on training experience and future needs supports consumer voice. Completion of seven module quizzes at 80% competence qualifies a user for an achievement certificate. Participants are expected to complete the training within 6 months of their start date but can re-enter or repeat units on their own timetables. Access to E-Basics is available through the North Carolina Cooperative Extension Youth Development portal (http://youthdevelopment.ces.ncsu.edu/evaluation-2/).
Implications and Future Plans
Practice: Professional Development and Programming
E-Basics offers an accessible, self-paced, and cost-effective introduction to basic evaluation knowledge that, although focused on youth science, might also benefit a wide range of community-based professionals. E-Basics is nested in a web portal that features links to related resources in Extension (e.g., Children, Youth, and Families Education and Research Network [https://cyfernetsearch.org/]; University of Wisconsin Extension [http://www.uwex.edu/ces/pdande/evaluation/]), professional organizations (e.g., American Evaluation Association), university-based programs (e.g., Program in Education, Afterschool, and Resiliency [http://www.pearweb.org/]), and private providers (e.g., David P. Weikart Center for Youth Program Quality [http://cypq.org/]).
Although the training examples feature volunteers and youth, the program's length and breadth make it most appropriate for professionals. In the future, volunteers might benefit from brief lessons focused on incorporating evaluation in experiential learning processes and managing and using data. Knowledge about evaluation is but one element of organizational capacity building, which includes coaching for skill mastery and effective management and use of evaluation (Taylor-Powell & Boyd, 2008). Within professional development alone, a discussion forum and periodic webinars might link experientially and geographically diverse learners in a community of practice and mutual assistance.
Organizational Policies and Strategies
E-Basics expands training in an area of need, particularly for systems with limited evaluation capacity. Although current costs are manageable, sustainability, improvement, and expansion may require fees or sponsorships. E-Basics operates without external mandates or incentives, either of which has the potential to increase participation . . . or resentment.
Research Opportunities
E-Basics monitors user needs, skills, motivation, and growth. Future investigations may contribute to understanding individual and organizational learning and application of evaluation skills and strategies for online learning in Extension and beyond.
Acknowledgments
E-Basics was funded by National 4-H Council, in collaboration with National 4-H Headquarters, with the support of the Noyce Foundation. Support and guidance from Suzanne Le Menestrel, Jill Walahoski, Jessica Bauman, and Ed Bender; useful feedback from Extension colleagues on the needs assessment webinars; and assistance from North Carolina State University student workers Nathaniel Conti and Iris Wagstaff and North Carolina State University IT staff member Leigh Jay Temple are gratefully acknowledged.
References
Archer, T. M., Bruns, K., & Heaney, C. A. (2007). SAMMIE: Using technology for a one-stop program evaluation resource. Journal of Extension [Online], 45(5), Article 5TOT1. Available at: http://www.joe.org/joe/2007october/tt1.php
Arnold, M. E. (2006). Developing evaluation capacity in Extension 4-H field faculty: A framework for success. American Journal of Evaluation, 27, 257–269.
Arnold, M. E., Calvert, M. C., Cater, M. D., Evans, W., Le Menestrel, S., Silliman, B., & Walahoski, J. S. (2008). Evaluating for impact: Educational content for professional development. Washington, DC: National 4-H Learning Priorities Project, Cooperative State Research, Education, and Extension Service, U.S. Department of Agriculture. Retrieved from http://www.national4-hheadquarters.gov/library/indicators_4h_mm.pdf
Dillman, L. M. (2013). Evaluator skill acquisition: Linking educational experiences to competencies. American Journal of Evaluation, 34(2), 270–285.
Fishman, B., Konstantopoulos, S., Kubitskey, B. W., Vath, R., Park, G., Johnson, H., & Edelson, D. C. (2013). Comparing the impact of online and face-to-face professional development in the context of curriculum implementation. Journal of Teacher Education, 64, 426–438.
Harder, A., Place, N. T., & Scheer, S. D. (2010). Towards a competency-based Extension education curriculum: A Delphi study. Journal of Agricultural Education, 51(3), 44–52.
Lamm, A. J., & Israel, G. D. (2013). A national examination of Extension professionals' use of evaluation: Does intended use improve effort? Journal of Human Sciences and Extension, 1(1), 49–62.
Lekies, K. S., & Bennett, A. M. (2011). Evaluation attitudes and practices of 4-H educators. Journal of Extension [Online], 49(1), Article 1RIB2. Available at: http://www.joe.org/joe/2011february/rb2.php
McCann, B. (2007). The effectiveness of Extension in-service training by distance: Perception versus reality. Journal of Extension [Online], 45(1), Article 1FEA4. Available at: http://www.joe.org/joe/2007february/a4p.shtml
McClure, M. M., Fuhrman, N. E., & Morgan, A. C. (2012). Program evaluation competencies of Extension professionals: Implications for continuing professional development. Journal of Agricultural Education, 53(4), 85–97.
National Association of Extension 4-H Agents. (2006). NAE4-HA membership survey results. Public Relations and Information and Research, Evaluation and Programs Committees, United States Department of Agriculture. (For a copy of the report, contact Dr. Suzanne Le Menestrel, National Program Leader, Youth Development Research; e-mail: slemenestrel@csrees.usda.gov.)
National Professional Development Task Force. (2004). New foundations for the 4-H youth development profession. Washington, DC: National 4-H Council.
Rodgers, M. S., Hillaker, B. D., Haas, B. E., & Peters, C. (2012). Taxonomy for assessing evaluation competencies in Extension. Journal of Extension [Online], 50(4), Article 4FEA2. Available at: http://www.joe.org/joe/2012august/a2.php
Salas, E., Tannenbaum, S. I., Kraiger, K., & Smith-Jentsch, K. A. (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13(2), 74–101.
Taylor-Powell, E., & Boyd, H. H. (2008). Evaluation capacity building in complex organizations. In M. T. Braverman, M. Engle, M. E. Arnold, & R. A. Rennekamp (Eds.), Program evaluation in a complex organizational system: Lessons from Cooperative Extension (New Directions for Evaluation, No. 120, pp. 55–69). Wiley Periodicals.
Wagstaff, I., & Silliman, B. (2012). National 4-H science training needs assessment. Raleigh, NC: North Carolina Cooperative Extension. (Available from second author.)
Zint, M. T., Dowd, P. F., & Covitt, B. A. (2011). Enhancing environmental educators' evaluation competencies: Insights from an examination of the effectiveness of the My Environmental Education Evaluation Resource Assistant (MEERA) website. Environmental Education Research, 17(4), 471–497.