October 2008 // Volume 46 // Number 5 // Research in Brief // 5RIB4

Better Extension Programming Through Statistics: Using Statistical Analysis of Program Evaluations to Guide Program Development

Abstract
Evaluation can not only inform changes in content and delivery, but also help Extension educators tailor future programs to specific target audiences. This article focuses on post-program evaluation of the Iowa Planning Officials Academy and describes how statistical analysis of the results yielded valuable information related to the skills and educational needs of the various groups of program participants. The experience suggests that collecting the right background information on the evaluation instrument and using statistical analysis to look for distinctions between participant subgroups can provide rich information for improving future programming efforts and targeting specific Extension clientele.


Gary D. Taylor
Assistant Professor & Extension Specialist
Department of Community & Regional Planning
Iowa State University
Ames, Iowa
gtaylor@iastate.edu


Introduction

A number of articles have appeared in JOE focusing on the value of using evaluation to guide future program development. Bush, Mullis, and Mullis (1995) argue that evaluation is a means of assessing not only whether program objectives have been met, but also the program strategies and techniques producing those results. Brown and Kiernan (1998) suggest that evaluation data should be used to modify content and delivery to make subsequent programs more effective. Chapman-Novakofski et al. (2004) provide valuable evidence of this concept in practice, showing how evaluation led to positive revisions in University of Illinois Extension's Dining with Diabetes program.

Evaluation can not only inform changes in content and delivery, but also help Extension educators tailor future programs to specific target audiences. This is especially beneficial for new programming efforts, when the link between program and audience is untested. This article focuses on evaluation results of the newly developed Iowa Planning Officials Academy and describes how this feedback is being used to create future programs focused more directly on specific clientele.

The Iowa Planning Officials Academy

Iowa State University Extension has a long history of providing assistance to Iowa communities on land use planning matters. Until recently, however, ISU Extension had not consistently offered an integrated series of educational workshops addressing land use planning and development. Programming was recently initiated to fill that gap. In Spring 2005, workshops were offered around the state to provide local officials with basic training on the principles and practices of land use planning, zoning, and land subdivision. Introduction to Iowa Planning and Zoning was a 3-hour program delivered to over 600 participants at 12 locations. Beginning in 2008, this program will be updated and repeated in the spring of even-numbered years.

The second component of the series is the Iowa Planning Officials Academy (IPOA), an intensive 12-hour program designed to provide participants with a more thorough exposure to the land development process. The program follows a real development project from proposal to final approval. Eight case study scenarios focus on different stages in the development review process. Participants are provided the community's comprehensive plan, zoning and subdivision ordinances, the master plan and subdivision plat maps, photos of the site, and staff reports. They are asked to work in small groups to review plans, ordinances, and plat maps; discuss problems posed by the scenarios; and propose alternatives and solutions.

The IPOA was first offered during Spring 2007 in four locations across Iowa. One hundred thirteen registrants attended. The first 9 hours of the program were delivered in 1 1/2-day sessions (Friday afternoon through all day Saturday). The final 3 hours were later broadcast to the four original locations via the Iowa Communications Network, a statewide fiber-optic network that allows two-way interactive audio and video conferencing. The intended audience was local elected and appointed board and commission members, although roughly 40% of registered attendees were zoning administrators and city managers. (This is discussed in further detail below.)

Evaluation Design

Program Logic Model

A logic model was used to develop the IPOA program and the plan for evaluation. As explained by Arnold (2002), logic modeling can help Extension educators identify program inputs and activities, as well as short-, medium-, and long-term program outcomes. The logic model also can provide a roadmap for evaluation to help isolate key components for assessment.

The logic model for the IPOA set out several desired learning objectives, behavioral changes, and ultimate impacts (Figure 1). The learning objectives were developed in consultation with a seven-member program advisory committee made up of city and county professional staff, elected and appointed officials, staff from the Iowa League of Cities, and ISU Extension educators. The eight scenarios were developed in tandem with the learning objectives, with each scenario designed to address no more than three learning objectives.

Figure 1.
Relevant Components of the IPOA Logic Model

Outputs--Activities:

  • Develop reference materials
  • Conduct workshops

Outputs--Participation:

  • Local government elected officials
  • Appointed commissioners and board members

Outcomes--Short-term (learning objectives):

  • Purposes of the preapplication conference.
  • Purposes of comprehensive plan review during the rezoning process.
  • Things to look for during a site visit.
  • Things to look for on a subdivision plat.
  • [14 others not listed here]

Outcomes--Medium-term (behavioral change): changed local practices:

  • Make better use of plans and ordinances.
  • Conduct more effective site plan review.
  • Increase use of site visits.
  • Conduct efficient hearings.
  • [5 others not listed here]

Outcomes--Long-term (ultimate impacts):

  • Better developments "on the ground": meeting local government standards, aesthetically pleasing, and compatible with surrounding land uses.
  • Less land use litigation.
  • Less contentious public hearings.
  • [2 others not listed here]


Retrospective Pre-Test Evaluation

A retrospective pre-test was used in the end-of-program questionnaire to assess short-term learning outcomes. The retrospective pre-test assesses learning outcomes by asking participants, at the end of a program, to rate their knowledge of a given topic both prior to and after participation. The retrospective pre-test is recognized as a valid method for capturing perceived changes in knowledge (Pratt, McGuigan, & Katzev, 2000; Griner-Hill & Betz, 2005). The advantages of using retrospective pre-tests in evaluating Extension programs include the following.

  • The evaluation is implemented at only one point in time on a single instrument, making it easier for participants to complete and educators to administer (Davis, 2003).

  • The incidence of incomplete data sets is reduced (Raidl et al., 2004).

  • It is not subject to "response shift bias" (when participants overestimate their pre-intervention level of knowledge), which can occur with pre- then post-testing (Rockwell & Kohn, 1989).

The evaluation instrument for the IPOA asked participants background questions about jurisdiction type and size, their planning-related position (e.g., planning commissioner, board of adjustment member, city manager), and years of service in planning-related positions. The instrument included 18 questions corresponding to the identified learning objectives (Figure 2). At the conclusion of each day, participants were asked to assess their level of understanding of the concepts covered during that day. Of 113 workshop participants, 80 returned fully completed evaluations.
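
To make the analysis described below concrete, a minimal sketch in Python (pandas) is offered here. The file name and column layout are assumptions for illustration--one row per respondent, with background fields plus paired ratings q1_pre/q1_post through q18_pre/q18_post on the 1-6 scale--not the actual IPOA data set. The sketch simply screens out incomplete evaluations, mirroring the reduction from 113 attendees to 80 usable responses.

    import pandas as pd

    # Hypothetical layout: one row per respondent, with background fields
    # (e.g., years_service, position) and paired ratings q1_pre/q1_post
    # through q18_pre/q18_post on the 1-6 scale.
    df = pd.read_csv("ipoa_evaluations.csv")

    rating_cols = [f"q{i}_{t}" for i in range(1, 19) for t in ("pre", "post")]

    # Keep only fully completed evaluations (80 of 113 in the actual program).
    complete = df.dropna(subset=rating_cols)
    print(f"{len(complete)} of {len(df)} evaluations were fully completed")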

Figure 2.
Sample of IPOA Program Evaluation Questions

Please rate what you believe to be your level of understanding of the following topics before and after the workshop (circle your responses):

1 = Little or no understanding of this topic; 6 = Thorough understanding of this topic

Topic                                                      Before Academy   After Academy
The purpose of the preapplication meeting                  1 2 3 4 5 6      1 2 3 4 5 6
The purpose of comprehensive plan review during rezoning   1 2 3 4 5 6      1 2 3 4 5 6
Things to look for during a site visit                     1 2 3 4 5 6      1 2 3 4 5 6
Ex parte contacts--what is/is not permissible              1 2 3 4 5 6      1 2 3 4 5 6
How to handle conflicts of interest at a public hearing    1 2 3 4 5 6      1 2 3 4 5 6
[13 others not listed here]

Analyzing Evaluation Results

Perceived Changes in Knowledge

Mean pre- and post-program scores were calculated for all participants (N=80) and for several subgroups. The subgroups were categorized based on years of service in planning-related positions and on whether participants served in elected/appointed positions or in professional positions (city administrators and zoning administrators) (Table 1). Perceived positive change was reported on each of the 18 questions for the all-participants group and for each subgroup. The differences in means for the entire group ranged from 1.0 on a question concerning the use of parliamentary procedure to 2.13 on a question about the differences between legislative and quasi-judicial hearings. The largest difference in means (2.32) for any subgroup was recorded for the group with 2 years or less of experience on a question about preapplication meetings. The smallest difference (0.93) was recorded for the hired professionals on a question related to meeting management. The results were largely consistent with our impression of the effectiveness of the different scenarios:

  • Scenarios providing hands-on use of comprehensive plans and ordinances and practice reading site plans and plat maps were well received.

  • It proved difficult to create a scenario that accurately reflects the hearings process in the limited time available. The mean differences were lower on the learning objectives related to parliamentary procedure, motions, and other process issues.

Table 1.
Pre- Versus Post-IPOA Scores--All Questions
(1 = Little or no understanding of this topic; 6 = Thorough understanding of this topic.)

Participant Group                   Participant Subgroup           Mean "Before IPOA"   Mean "After IPOA"   Diff.
All participants                    --                             3.39                 4.98                1.59
Years of planning-related service   2 years or less                3.09                 4.89                1.80
                                    5 years or less                3.23                 4.95                1.72
                                    More than 5 years              3.66                 5.08                1.42
Planning-related position           Elected and appointed          3.12                 4.84                1.71
                                    Hired professional positions   3.73                 5.17                1.44
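
A minimal sketch of how such subgroup summaries can be computed, continuing the hypothetical data frame from the earlier sketch (years_service and position are assumed background fields, not the instrument's actual variable names):

    import pandas as pd

    # Per-respondent mean rating across all 18 questions, before and after.
    pre_cols = [f"q{i}_pre" for i in range(1, 19)]
    post_cols = [f"q{i}_post" for i in range(1, 19)]
    complete["mean_pre"] = complete[pre_cols].mean(axis=1)
    complete["mean_post"] = complete[post_cols].mean(axis=1)
    complete["diff"] = complete["mean_post"] - complete["mean_pre"]

    # Overlapping experience bands, mirroring the rows of Table 1.
    subgroups = {
        "All participants": pd.Series(True, index=complete.index),
        "2 years or less": complete["years_service"] <= 2,
        "5 years or less": complete["years_service"] <= 5,
        "More than 5 years": complete["years_service"] > 5,
    }
    for label, mask in subgroups.items():
        sub = complete[mask]
        print(f"{label}: before {sub['mean_pre'].mean():.2f}, "
              f"after {sub['mean_post'].mean():.2f}, diff {sub['diff'].mean():.2f}")

    # Elected/appointed vs. hired professional positions.
    print(complete.groupby("position")[["mean_pre", "mean_post", "diff"]].mean())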

Assessing Perceived Magnitude of Change--Comparing Groups

While the descriptive statistics were instructive about the effectiveness of each scenario, the collection of background information also made it possible to compare groups and assess whether some subgroups reported greater changes in perceived knowledge than others. Comparisons were made by statistically analyzing, for each question, the differences between groups in mean pre- versus post-program change. A number of two-sample t-tests were run on these difference scores, primarily looking for differences between groups based on experience or on official position. Two examples are provided in Table 2.

Table 2.
Examples of Comparisons in Differences of Means, Pre- Versus Post-IPOA

Example 1: Participants with 2 years or less of experience vs. more than 2 years of experience
Q1: Purpose of the preapplication meeting
  2 years or less     N = 38   Difference in means (pre vs. post) = 2.22   Std. dev. = 1.40
  More than 2 years   N = 42   Difference in means (pre vs. post) = 1.50   Std. dev. = 1.50
  Significance (2-tailed): p = 0.032

Example 2: Elected/appointed officials vs. professional staff
Q1: Purpose of the preapplication meeting
  Elected/appointed   N = 42   Difference in means (pre vs. post) = 1.90   Std. dev. = 1.57
  Professional        N = 41   Difference in means (pre vs. post) = 1.68   Std. dev. = 1.47
  Significance (2-tailed): p = 0.51
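
A sketch of one such comparison, again using the hypothetical columns from the earlier sketches: a two-sample t-test (scipy.stats.ttest_ind) on each question's pre/post difference scores, comparing participants with 2 years or less of experience against the rest and flagging the questions that reach p < .05.

    from scipy import stats

    # Continues the `complete` data frame from the earlier sketches.
    novice = complete["years_service"] <= 2  # hypothetical background field

    for i in range(1, 19):
        diff = complete[f"q{i}_post"] - complete[f"q{i}_pre"]
        t_stat, p_val = stats.ttest_ind(diff[novice], diff[~novice])
        if p_val < 0.05:
            print(f"Q{i}: mean change {diff[novice].mean():.2f} vs. "
                  f"{diff[~novice].mean():.2f} (p = {p_val:.3f})")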

The subgroup comparisons yielded a number of interesting results. As expected, the greatest number of statistically significant (p < .05) differences was found when comparing participants with relatively few years of experience to those with greater experience. Comparing participants with two years or less of experience to all others yielded statistically significant differences on six of the eighteen questions. This simple comparison, however, masks some subtle but important differences in perceived learning that were reported by experience subgroups. Those differences can be generally summarized as follows.

  • Participants with the least experience (1 year or less; N=19) reported statistically significant differences from more experienced participants on five "end of the development process" issues, including the proper sequence of events at a public hearing, handling written and oral testimony, and writing clear, legally defensible decisions. Generally, these are the most immediate and most visible (to the public) aspects of these positions, so it can be surmised that this group was more aware of its pre-program limitations in these areas.

  • When participants with slightly more experience were included (producing a subgroup representing 0-3 years of experience), significantly different responses emerged on six "beginning of the development process" issues, such as the purpose of the preapplication meeting, the purpose of comprehensive plan review, and things to look for during a site visit. These are the next-level skills that land use officials need to develop to increase competency after becoming familiar with the immediate tasks.

  • Participants with 5 years or less of experience reported greater perceived learning than those with more than 5 years (p = .032) on how to handle conflicts of interest, suggesting that this issue remains a vexing one for planning officials well beyond their earliest years of service.

The comparison of elected and appointed officials with professional staff was informative precisely for its lack of statistically significant differences. The only question producing a statistically significant distinction between these subgroups concerned knowledge of a legally technical device--zoning development agreements--generally negotiated by professional staff. As noted above, the significant number of zoning administrators and city managers signing up for the program came as a surprise, given that the program was primarily targeted at elected and appointed officials. The concern was that these officials would have little use for much of the content because of their daily exposure to the zoning process. In hindsight, these concerns were unwarranted.

The vast majority of city managers and zoning administrators in small and mid-size Iowa communities have never received formal training in planning and zoning. Zoning administration is just one of many responsibilities they carry. Daily exposure to the permitting process does not necessarily translate into an understanding of how zoning and subdivision regulations fit into the larger context of community development. The following comment was typical of those received from the professional staff participants: "Very helpful--Now I understand better why I am doing the things I do."

Using Results to Guide Future Programs

Evaluation of the IPOA has resulted in more informed discussions with stakeholders about the future direction of ISU Extension's planning and zoning programming. The present plan is to reorganize the programming into two tracks, and to make explicit in promotional materials the distinctions between the two.

  • The Introduction to Planning and Zoning course will be the foundational program for both tracks. It will be targeted at professional staff and newly elected/appointed local officials with less than 2 years of experience. The course is being redesigned in a case study scenario format similar to the IPOA. Several of the "end of the development process" learning objectives considered critical by the most inexperienced IPOA participants will receive more attention in this course.

  • The IPOA will be the second step in an educational track designed specifically for elected and appointed board and commission members. Moving some of the end-of-process issues to the introductory course will allow for increased attention to the use of the comprehensive plan and ordinances, decision-making issues, and conflicts of interest.

  • A second track--Zoning Administrators Certification Training--is now planned that includes the introductory course and a second course targeted specifically at professional staff. A 1-day course for zoning administrators covering the basics of permit processing, handling zoning violations, proper notice procedures, and similar practical functions had long been contemplated. Recognizing the need and demand for a more comprehensive treatment of planning and zoning generally, however, the plan now is to create a 2-day course that covers "big picture" questions of planning and zoning and advanced skills, as well as the practical functions.

Conclusion

The absence of any history of offering planning and zoning programs meant that ISU Extension was "flying blind" on a number of questions of content, delivery, and audience. The evaluation of the IPOA yielded valuable information related not only to content and delivery, but also to the skills and educational needs of the program participants. This experience suggests that collecting the right background information on the evaluation instrument and using statistical analysis to look for distinctions between participant subgroups can provide rich information for future programming efforts.

References

Arnold, M. E. (2002). Be "logical" about program evaluation: Begin with learning assessment. Journal of Extension [On-line], 40(3). Available at: http://www.joe.org/joe/2002june/a4.html

Brown, J. L., & Kiernan, N. E. (1998). A model for integrating program development and evaluation. Journal of Extension [On-line], 36(3). Available at: http://www.joe.org/joe/1998june/rb5.html

Bush, C., Mullis, R., & Mullis, A. (1995). Evaluation: An afterthought or an integral part of program development. Journal of Extension [On-line], 33(2). Available at: http://www.joe.org/joe/1995april/a4.html

Chapman-Novakofski, K., DeBruine, V., Derrick, B., Karduck, J., Todd, J., & Todd, S. (2004). Using program evaluation to guide program content: Diabetes Education. Journal of Extension [On-line], 42(3). Available at: http://www.joe.org/joe/2004june/iw1.shtml

Davis, G. A. (2003). Using a retrospective pre-post questionnaire to determine program impact. Journal of Extension [On-line], 41(4). Available at: http://www.joe.org/joe/2003august/tt4.shtml

Griner-Hill, L., & Betz, D. L. (2005). Revisiting the retrospective pretest. American Journal of Evaluation, 26, 501-517.

Pratt, C. C., McGuigan, W. M., & Katzev, A. R. (2000). Measuring program outcomes: Using retrospective pretest methodology. American Journal of Evaluation, 21(3), 341-349.

Raidl, M., Johnson, S., Gardiner, K., Denham, M., Spain, K., & Lanting, R. (2004). Use retrospective surveys to obtain complete data sets and measure impact in Extension programs. Journal of Extension [On-line], 42(2). Available at: http://www.joe.org/joe/2004april/rb2.shtml

Rockwell, S. K., & Kohn, H. (1989). Post-then-pre evaluation. Journal of Extension [On-line], 27(2). Available at: http://www.joe.org/joe/1989summer/a5.html