Spring 1991 // Volume 29 // Number 1 // Ideas at Work // 1IAW2


Integrating Evaluation Into Teaching

Abstract
Building an evaluation device into the program delivery method is a way to reduce negative participant reactions to pre-post evaluation instruments. Making the evaluation a part of the teaching can provide clear evidence of results. The students like it. By participating, they receive individualized attention and information they can use.


Sonia W. Butler
Extension Home Economist
Rutgers Cooperative Extension of Ocean County
Toms River, New Jersey


One traditional Extension evaluation tool has been the pre- and post-test. Yet a knowledge increase is traditionally less important than a positive behavior change.

The characteristics of learners attending classes can make traditional evaluation inappropriate. Participants come to learn, not to be studied. Some have limited ability for taking standardized tests. Some find the testing patronizing and insulting.

Building an evaluation device into the program delivery method is a way to reduce negative participant reactions to pre-post evaluation instruments. The following describes three home economics programs that used an integrated evaluation method.

The first program involved 37 senior citizens who attended a two-part series about dietary guidelines. At the last class, the participants evaluated their own diets and set specific behavior goals for improvement. The participants and I each kept a copy. Six weeks later, each received a letter listing that individual's goals, with "yes" or "no" to be circled to indicate whether each goal had been met. The response was good (68%), and 92% of respondents had met one or more goals.

Next, 10 previously homeless welfare mothers, in a transitional housing program, attended six monthly sessions on budgeting. After the first class, each wrote short- and long-term financial goals. For most, getting to the end of the month with food to eat was the first goal. After four months, seven were reaching this goal and at the end of six months, all but two were also meeting some of their other short-term goals.

Finally, in the same group of 10 mothers, eight kept a three-day food intake record that was analyzed by computer on a database provided by the university. Although there was evidence of very poor diets, the women hadn't been interested in nutrition information. Their interest greatly increased when they saw their individual printouts. "I knew my diet was bad, but I didn't know it was this bad," was a typical response.

Six weeks later, the eight were asked, unexpectedly, to do a food recall of the previous day, which was also analyzed. The two women with the worst diets the first time had greatly improved. The other six were either slightly worse or much worse. In examining whether the two who improved had been treated differently from the others, I found the coordinator in the housing program had reinforced the importance of nutrition with one.

The nutrition segment of this program is being changed. The final recall was taken at the end of the month, and although all the women ate a reasonable quantity of food, the quality was poor. Therefore, as future groups reach their goal of having enough money to buy food at the end of the month, I need to look at how they're doing it. Teaching nutrition to this group wasn't very successful. However, because an evaluation tool was used, I can change the way I teach the nutrition and budgeting segments and monitor students more closely.

Making the evaluation a part of the teaching can provide clear evidence of results. The students like it. By participating, they receive individualized attention and information they can use. So far, I've only tried it with very small groups, but as methods are developed and standardized, adaptation to larger groups will be possible.