December 2007 // Volume 45 // Number 6 // Tools of the Trade // 6TOT2


A Practical Tool for the Evaluation of Extension Programs Presented to Older Adults

Abstract
This article presents a practical tool for evaluating Extension programs presented to older adults. Written surveys are often impractical for collecting evaluation data from older adults because of physical limitations. The evaluation tool is a specially designed box used with plastic tokens. The evaluation box was tested and found accurate and easy to use. It is a practical alternative to the survey method in special situations and is appropriate for collecting evaluation data from older adults and low-literacy audiences. The tool can also be used to evaluate exhibits.


K. S. U. Jayaratne
State Leader for Program Evaluation and Assistant Professor
Department of Agricultural and Extension Education
North Carolina State University
Raleigh, North Carolina
jay_jayaratne@ncsu.edu


Introduction

The growing older adult population is a significant client group for Extension in the U.S. There were 35.9 million people aged 65 years or older in the U.S. in 2003 (Wan, Sengupta, Velkoff, & DeBarros, 2005). Extension reaches this older adult population with educational programs on topics such as nutrition, food safety, diabetes management, and fraud prevention.

Evaluating Extension programs presented to older adults is often challenging because some participants are physically frail or have limited eyesight. Extension agents sometimes print evaluation surveys in large fonts to accommodate older adults' limited eyesight, but for some older adults, responding to a written survey remains a difficult task. Because of these limitations and the difficulty of collecting data with a printed survey, Extension programs presented to older adults sometimes go unevaluated.

It is difficult to secure continuing funding support for Extension programming without being accountable for the resources used. For this reason, there is a continuing demand for program evaluation in Extension (Radhakrishna, 1999). This situation led to the development of a practical tool for collecting evaluation data from older adults in Extension programs. The tool was tested by Georgia Cooperative Extension, and Family and Consumer Sciences Extension agents in Georgia use it to evaluate educational programs presented to older adults. This article describes the evaluation tool.

Description of the Evaluation Tool

The evaluation tool was developed with older adults' limited physical ability to respond to a survey in mind. It is a specially designed wooden box; its dimensions are shown in Figure 1. The box is partitioned into three sections, and on one side there are three holes, each opening into a separate section of the box. Just above the holes is a sliding transparent plastic cover into which a printed sheet can be slipped. The top lid of the box is also a sliding cover.

Figure 1.
Evaluation Box

How Does the Evaluation Box Work?

The evaluation box is used with plastic tokens, each about the size of a nickel. The method is especially well suited to measuring participants' aspirations--their readiness to apply learned practices--at the conclusion of an Extension workshop. It can also be used to measure participants' perceived knowledge gain, skill development, and attitude change.

When participants' aspirations are assessed, the holes should be labeled from left to right as "More Likely," "Undecided," and "Less Likely." If the target audience is a low-literacy group, it is important to use a simpler set of response choices, such as "Yes," "Maybe," and "No." The targeted behavior change should be typewritten in a large font and inserted into the transparent sliding frame on the side of the box.

For example, if the workshop emphasizes low-fat dairy products, the behavior change would be typewritten as follows: "As a result of today's workshop, how likely are you to consume low-fat dairy products?" At the end of the training session, the evaluation box and tokens are passed to the participants, who are asked to drop a token into the hole that represents the direction of their intended behavior change. A participant who intends to consume low-fat dairy products would drop a token into the "More Likely" (or "Yes") hole.

At the end of the training, the Extension agent who presented the workshop can evaluate the outcome simply by counting the tokens in each section of the box. For example, if 25 older adults attended the program and 15 indicated that they were more likely to consume low-fat dairy products, the outcome would be that 60% of the participants intended to adopt the desired dietary habit. The method is very user-friendly, and because it keeps respondents anonymous, it complies with the rules and regulations governing human subjects research.
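The outcome calculation is simple enough to do by hand, but the short Python sketch below illustrates how the token counts translate into outcome percentages. The counts used here are hypothetical and simply mirror the 25-participant example above.

    # Minimal sketch of the outcome calculation described above.
    # The token counts are hypothetical; in practice they come from counting
    # the tokens found in each section of the evaluation box.
    token_counts = {"More Likely": 15, "Undecided": 6, "Less Likely": 4}

    total = sum(token_counts.values())  # number of participants who responded

    for label, count in token_counts.items():
        percentage = 100 * count / total
        print(f"{label}: {count} of {total} participants ({percentage:.0f}%)")

With these counts, the sketch reports that 15 of 25 participants (60%) were more likely to adopt the desired practice, matching the example in the text.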

Applications and Recommendations

When written surveys are not practical, this method can be used to collect evaluation data. Collecting evaluation data from older adults is just one application; there are two other possible applications of the method in Extension.

First, the method can be used to evaluate programs presented to low-literacy audiences, for whom the written survey method is not appropriate.

The second important application is evaluating the outcomes of educational exhibits. Extension educators present many exhibits but rarely evaluate their outcomes. The evaluation box can be used to document an exhibit's outcome by placing the box and tokens beside the exhibit and asking viewers to drop a token indicating the direction of their learning outcome.

References

Radhakrishna, R. (1999). Program evaluation and accountability: Training needs of Extension agents. Journal of Extension [On-line], 37(3). Available at: http://www.joe.org/joe/1999june/rb1.html

Wan, H., Sengupta, M., Velkoff, V. A., & DeBarros, K. A. (2005). U.S. Census Bureau, Current Population Reports, 65+ in the United States: 2005. Washington, DC: U.S. Government Printing Office. Retrieved April 13, 2007 from: http://www.census.gov/prod/2006pubs/p23-209.pdf