August 2003 // Volume 41 // Number 4 // Tools of the Trade // 4TOT4
Using a Retrospective Pre-Post Questionnaire to Determine Program Impact
Abstract
This article describes how Extension program impact was documented using
a retrospective pretest. The method, employed with 35 economic development
professionals involved in a traditional Extension educational program,
illustrated change in knowledge, skills, attitudes, and behavior. Characteristics
of this type of program evaluation are discussed in relation to its implementation.
Introduction
Evaluating program impact is important for all Extension educators in today's political economy. If passage of the Government Performance and Results Act of 1993 has not yet placed a renewed emphasis upon Extension's program effectiveness (Richardson, Gamble, & Mustian, 1998; O'Neill, 1998), shrinking Extension budgets most certainly will. Diem (2003) indicated that documenting such impact is not only a requirement of the agencies and political bodies that provide Extension funding but also a way to build and maintain credibility and to justify the use of limited resources. For these reasons and others, evaluation and documentation of Extension programming impact are beginning to receive increased emphasis in Extension work (Arnold, 2002).
A brief review of recent JOE articles on the topic revealed that a variety of methods and techniques can be used for program evaluation, including logic modeling, children's drawings, formal qualitative and quantitative methods, and the retrospective pretest. This article describes how the retrospective pretest methodology was used to determine change in the knowledge, skills, and attitudes toward organizational strategic planning of 35 economic development professionals involved in a traditional Extension educational program.
The Retrospective Pretest
One Administration
Documenting changes in knowledge and behavior can be done simply and efficiently using the retrospective pretest evaluation (Rockwell & Kohn, 1989; Stevens & Lodl, 1999). According to Rockwell and Kohn, this tool "is specifically useful for evaluating the impact of Extension programs by asking participants to report actual changes in behavior" (as cited in Stevens & Lodl, 1999). Unlike the typical pretest-posttest, the retrospective pretest design is administered only once. Given our time limitations, this characteristic made the method more appealing both to my audience and to me as the administrator of the instrument. Only a few minutes were required to complete the 13-item questionnaire.
Improved Accuracy
With the retrospective pretest, participants are asked to report the knowledge of, or attitude toward, a particular subject that they held both before and after some experience, program, or treatment. When participants respond to a question about how much they know about a subject after they have acquired some basic knowledge of it, they can more accurately gauge the degree of change in their knowledge or attitude (Rockwell & Kohn, 1989). Furthermore, respondents often overestimate their level of knowledge when the traditional pretest-posttest is used (Pratt, McGuigan, & Katzev, 2000). With the retrospective pretest methodology, respondents have an opportunity to learn how much they actually know about a subject before responding to the questionnaire.
My audience indicated that they had some experience with the topic prior to the program. Enabling them to assess their baseline level of understanding more accurately after the program allowed them to better illustrate the degree of change resulting from the program and provided me (and ultimately my stakeholders) with more meaningful data.
Using the Retrospective Pretest to Measure Change
For this evaluation effort, a one-page questionnaire was used. The front contained four background questions designed to collect basic data: the role played in economic development, the number of years of experience in that role, the population of the community on which these efforts are focused, and the frequency of formal organizational strategic planning processes undertaken. The reverse side contained the retrospective pretest.
The retrospective pretest was designed with instructions at the top, an example, and nine statements. The statements were developed from the learning objectives for the strategic planning workshop. Participants were asked to indicate their level of agreement with each statement, both before and after the workshop, using a six-point, Likert-type scale (1 = strongly disagree to 6 = strongly agree).
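To make the scoring concrete, below is a minimal sketch of how responses to such an instrument could be coded for analysis. The statement labels and ratings are hypothetical illustrations, not the actual workshop data:

```python
# Hypothetical coding of retrospective pretest responses.
# Each respondent rates every statement twice on the six-point scale
# (1 = strongly disagree ... 6 = strongly agree): once for "before"
# the workshop and once for "after".
responses = [
    # (statement label, before rating, after rating)
    ("basic awareness of strategic planning", 3, 5),
    ("know the key components",               4, 5),
    ("could facilitate a planning process",   2, 4),
]

for label, before, after in responses:
    # The item-level change score is simply after minus before.
    print(f"{label}: {before} -> {after} (change {after - before:+d})")
```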
Administration
Participants were asked to complete the one-page questionnaire at the conclusion of the workshop. The instructor made a conscious attempt to downplay the instrument, and no verbal instruction was provided for completing the two-part questionnaire. Participants were simply asked to place their completed questionnaires on a table at the back of the room as they exited. Of the 35 workshop participants, 32 completed questionnaires.
Data Input/Analysis
Questionnaire data were analyzed using SPSS 10.1 to determine whether participation in the workshop affected participant knowledge, awareness, confidence, and attitude. Although the software offers numerous data analysis procedures, the analysis here was limited to testing whether the before-to-after change on each item was statistically significant (the p values in Table 1) and to examining the group means (before and after).
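For readers without SPSS, this kind of analysis can be reproduced in a few lines. The article does not name the specific test behind the p values in Table 1; a paired-samples t test is the conventional choice for before/after ratings from the same respondents. A minimal sketch, using invented ratings rather than the workshop data:

```python
from statistics import mean
from scipy import stats

# Invented before/after ratings for one statement (six-point scale);
# each pair is one respondent's retrospective "before" and "after".
before = [3, 4, 2, 3, 5, 3, 4, 2, 3, 4]
after  = [5, 5, 4, 4, 6, 4, 5, 4, 4, 5]

# Paired-samples t test: are the before/after means reliably different?
t, p = stats.ttest_rel(after, before)

print(f"Pre mean  = {mean(before):.1f}")
print(f"Post mean = {mean(after):.1f}")
print(f"t = {t:.2f}, p = {p:.4f}")
```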
Results and Discussion
Responses to the retrospective pretest's nine workshop indicators revealed that participants experienced a positive change in knowledge, awareness, confidence, and attitudes; the before-to-after difference was statistically significant (p < .05) for every item. Eight of the nine indicators registered positive change for at least one third of the respondents, and the overall mean for the nine items increased from 3.9 (before) to 4.9 (after) (Table 1).
Table 1. Before and After Ratings on the Nine Workshop Indicators (1 = strongly disagree, 6 = strongly agree; n = 32)

| Variable | Pre Mean | Pre SD | Post Mean | Post SD | p |
|---|---|---|---|---|---|
| I have a basic awareness of the mechanics of strategic planning. | 3.7 | 1.6 | 4.7 | 1.0 | <.05 |
| I know what the key components of strategic planning are. | 3.5 | 1.4 | 4.7 | 1.1 | <.05 |
| I think I could facilitate a strategic planning process. | 3.3 | 1.5 | 4.5 | 1.1 | <.05 |
| I have the skills necessary to facilitate a strategic planning process. | 3.5 | 1.4 | 4.4 | 1.2 | <.05 |
| Strategic planning can provide direction to an organization's efforts. | 4.4 | 1.6 | 5.2 | 1.1 | <.05 |
| I would like to try facilitating a strategic planning process at some point. | 3.6 | 1.6 | 4.6 | 1.3 | <.05 |
| I will attempt some form of strategic planning process in the future. | 4.0 | 1.8 | 5.0 | 1.1 | <.05 |
| Thinking strategically is a worthwhile practice. | 4.6 | 1.6 | 5.4 | 1.0 | <.05 |
| Strategic planning is an ideal way to guide an organization's economic development efforts. | 4.4 | 1.7 | 5.3 | 1.0 | <.05 |
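As a quick arithmetic check, the overall means reported above follow directly from the nine item means in Table 1:

```python
# Item means from Table 1 (pre and post, nine statements).
pre  = [3.7, 3.5, 3.3, 3.5, 4.4, 3.6, 4.0, 4.6, 4.4]
post = [4.7, 4.7, 4.5, 4.4, 5.2, 4.6, 5.0, 5.4, 5.3]

print(f"Overall pre mean:  {sum(pre) / len(pre):.1f}")    # 3.9
print(f"Overall post mean: {sum(post) / len(post):.1f}")  # 4.9
```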
Conclusions and Recommendations
I found this program evaluation tool to provide rich data with a modest investment of time relative to more traditional pretest-posttest evaluative measures. Program participants had little difficulty understanding and completing the questionnaire, and they were able to complete the instrument quickly, yielding very useful data compared to other evaluation tools requiring a similar investment of time. The data gathered were relatively easy to analyze and communicated change in knowledge, awareness, confidence, and attitudes as ably as more complex and involved evaluative measures.
In short, I found the retrospective pretest a useful tool for evaluating this traditional Extension program. While my use of the instrument focused primarily on immediate impact, the tool could also be used for demonstrating intermediate and long-term outcomes of Extension programs.
References
Arnold, M. E. (2002). Be "Logical" about program evaluation: Begin with learning assessment. Journal of Extension [On-line], 40(3). Available at: http://www.joe.org/joe/2002june/a4.html
Diem, K. G. (2003). Program development in a political world--It's all about impact! Journal of Extension [On-line], 41(1). Available at: http://www.joe.org/joe/2003february/a6.shtml
Diem, K. G. (2002). Using research methods to evaluate your Extension program. Journal of Extension [On-line], 40(6). Available at: http://www.joe.org/joe/2002december/a1.shtml
O'Neill, B. (1998). Money Talks: Documenting the economic impact of Extension personal finance programs. Journal of Extension [On-line], 36(5). Available at: http://www.joe.org/joe/1998october/a2.html
Pratt, C. C., McGuigan, W. M., & Katzev, A. R. (2000). Measuring program outcomes: Using retrospective pretest methodology. American Journal of Evaluation, 21(3).
Richardson, J. G., Gamble, K. J., & Mustian, R. (1998). Creation of a Web-based accomplishment reporting system. Journal of Extension [On-line], 36(2). Available at: http://www.joe.org/joe/1998april/a1.html
Rockwell, S. K., & Kohn, H. (1989). Post-then-pre evaluation. Journal of Extension [On-line], 27(2). Available at: http://www.joe.org/joe/1989summer/a5.html
Stevens, G. L., & Lodl, K. A. (1999). Community coalitions: Identifying changes in coalition members as a result of training. Journal of Extension [On-line], 37(2). Available at: http://www.joe.org/joe/1999april/rb2.html