December 2007 // Volume 45 // Number 6 // Tools of the Trade // 6TOT1
Reporting Program Impacts: Slaying the Dragon of Resistance
Abstract
Virginia Cooperative Extension has responded to today's environment of enhanced accountability by improving the organization's program impact reporting. Successful strategies to enhance the quantity and quality of program impact reports include new hires, training faculty and administration, individual and small group technical assistance, development of reporting tools, and tying impact reporting to performance and recognition. This holistic approach resulted in enhanced reporting and use of program impacts as well as improved program design and evaluation.
Introduction
Many Extension faculty and administrators struggle with reporting substantive program impacts and organizational excellence (Archer et al., 2007). Increased program accountability has magnified this "struggle to report" (Hoffman & Grabowski, 2004). Virginia Cooperative Extension (VCE) responded to this situation with a holistic approach focused specifically on program development, implementation, and evaluation of outcomes-based programming.
Intended outcomes of this improved reporting and accountability effort included the:
- Increased ability of faculty to measure and report program impacts;
- Increased quality and quantity of program impact statements;
- Increased usefulness of impact statements for faculty, clients, and administration; and
- Improved program design and evaluation.
New Faculty Hires
Over the years, VCE's program leadership positions were eroded by budget cuts. In 2005, VCE administrators committed funds to fully staff program leadership. By mid-2006, five state and six district program leaders were in place, and a Program Development Specialist position was filled. These positions focus on outcomes-based programming through faculty training, technical assistance, and other support.
Administration and Faculty Training
An organizational assessment revealed that training and technical assistance for program impact reporting were the highest priority. Twenty-six training sessions for faculty were conducted around the state in districts, on campus, and at Agricultural Research and Extension Centers. Extension administrators also participated in impact reporting training at the invitation of the dean of the college. District Directors participated in their district's training.
Each of the 26 training sessions, taught by program leadership and the Director of VCE, included:
- The purpose and audiences for impact reporting,
- An impact reporting formula,
- Tips and tools for impact reporting,
- Review of sample reports,
- Writing or rewriting personal impact reports, and
- Critique of impact reports.
The sessions ranged from 1 to 4 hours in length, with 8 to 35 faculty attending each.
Individual and Small Group Technical Assistance
Interviews with faculty indicated individuals rarely applied concepts and tools from training sessions. Therefore, individual and small group technical assistance was implemented following impact reporting training. Program leadership worked with faculty to develop logic models for program development, design program evaluations, and develop program impact reports.
Development of Reporting Tools
The VCE Program Development Specialist and the College of Agriculture and Life Sciences Director of Communications and Marketing created a college-wide document on impact reporting guidelines <http://www.cals.vt.edu/communications/writingimpactstatements.html>. This document covers the purposes of impact reporting, target reporting audiences, types of impact, an impact reporting formula, and report examples for faculty to emulate. Ewert's matrix of approaches to Extension work was adapted to help faculty mesh program evaluation and impact reporting methods with program approaches (McDowell, 2001).
Tying Impact Reporting to Performance and Recognition
Administrators deliberately began tying employee performance and recognition to program impact and reporting. This enhanced faculty readiness to engage in improved program development, evaluation, and reporting. The change was supported by a new online faculty annual reporting system that produces reports administrators use for faculty performance review.
The VCE director initiated two new recognition systems to enhance program impact reporting. The first included district and state program awards for agents in program evaluation, program impact, Extension leadership council development, program marketing, interdisciplinary programming, and new program initiatives. The second was program excellence grants for faculty, focused on outcomes-based programs that highlight agent/specialist relationships.
So What?
Positive effects of VCE's impact reporting approach quickly became evident. A process evaluation of these efforts was conducted using quick whips, listening posts, and focus group interviews. The data show that the 339 faculty who attended trainings found them valuable for gaining clear expectations about impact writing, useful tools, and reflection on programming.
These stories also emerged:
- Faculty voluntarily rewrote promotion materials to focus on program impact;
- Faculty rewrote award applications/nominations to highlight program impact;
- Faculty used the impact writing formula for other reporting;
- Faculty asked for additional training on related topics; and
- Technical assistance on program development, evaluation, and reporting increased.
One participant wrote, "My camping partners and I are planning ahead of time this year to offer pre/post tests to our campers on various areas of learning that will take place at camp. The importance of these types of tests became evident to me at the training."
A survey was sent to faculty and administrators who received impact reporting training and technical assistance. Of the 339 participants, 131 responded (a 39% response rate) and indicated that as a result of the training and/or technical assistance:
- 96% made impact statements more useful for themselves, their clients, and administration;
- 95% improved their ability to measure and report impacts;
- 93% increased the quality and quantity of program impact reports; and
- 71% improved their program design and evaluation.
Faculty also highly valued the simple reporting formula, small group work, example impact statements, practical/hands-on approach, writing and feedback, and better understanding of impact reporting and expectations.
Lessons Learned
VCE's program leadership learned the following lessons about supporting faculty impact reporting.
- Effective faculty impact reporting requires training not just on basic tips and techniques but also on actual writing and critiquing of impact reports, with related follow-up technical assistance so faculty can personally apply training concepts and tools.
- Some faculty find it difficult to move from technical writing to reporting impacts in lay language for public consumption.
- Key administrative champions for impact reporting result in faculty buy-in and in attitude and behavior change.
- Impact reporting culture change should take place across the functions of the land-grant university. At Virginia Tech, the dean for the College of Agriculture and Life Sciences held a training for department heads on impact reporting for research, teaching, and Extension.
- Faculty want a wide variety of program impact statement examples to work from.
- Program leadership and campus specialist positions providing technical assistance are key to quality impact data collection and reporting.
- Tying impact reporting to performance and recognition motivates faculty to enhance the number, quality, and use of impact reports.
- Increased attention to impact reporting enhances critical reflection on better aligning programs with Extension's mission, faculty and organizational capacity, and local needs.
- Improving impact reporting can focus on the product and/or the process. Virginia Cooperative Extension focused on the product of outcomes-based programming to force faculty to more closely scrutinize and evaluate their current programming models, mechanics, and outcomes. This resulted in heightened awareness of the importance of good program development and evaluation.
- Focusing on good organizational citizens rather than resisters enhances reporting quality.
These efforts to slay the dragon of resistance around impact writing are not easy, but they are necessary to enhance Extension's ability to cope with and excel in today's environment of enhanced accountability.
References
Archer, T., Warner, P., Miller, W., Clark, C., James, S., Cummings, S., & Adamu, U. (2007). Can we define and measure excellence in Extension? Journal of Extension [On-line], 45(1), Article 1COM1. Available at: http://www.joe.org/joe/2007february/comm1.shtml
Hoffman, B., & Grabowski, B. (2004). Smith-Lever 3(d) Extension evaluation and outcome reporting--A scorecard to assist federal program leaders. Journal of Extension [On-line], 42(6). Available at: http://www.joe.org/joe/2004december/a1.shtml
McDowell, G. R. (2001). Land-grant universities and Extension into the 21st century. Ames, IA: Iowa State University Press.