Spring 1989 // Volume 27 // Number 1 // Feature Articles // 1FEA2


Evaluating Across State Lines

Dorothea Cudaback
Human Relations Specialist
Cooperative Extension
University of California-Berkeley


Abstract


National Initiatives and multistate issues programming will require new forms of collaboration that extend across state lines. It's one thing to talk about multistate cooperation, but it's quite another to actually do it. It clearly takes a lot of effort, but is the effort worthwhile? A multistate evaluation by family life specialists offers both lessons and caveats that may be relevant to any multistate Extension effort.

Collaboration Begins

The seeds of this multistate evaluation were planted in 1981 at a national Cooperative Extension family life specialists conference. About a dozen of us got together to share ideas about our use of age-paced, parent-education, home-learning programs. Each program is a series of four- to eight-page booklets of information about pregnancy, infant development, and parenting, distributed monthly, usually by mail, to new parents. The information in each booklet is keyed to the baby's age in months, so parents receive information when they most want and need it.

Each of us in this group was using some form of this program. We decided that we wanted to learn more about how these programs were used nationally and the impact they had on the parents who received them. That was the first and last time we met together face to face; from then on, communication was by phone or mail.

National Survey

Our first year, we surveyed all state Cooperative Extension family life specialists to learn about their use, if any, of these home-learning, parent-education programs. We found that 19 states were using the programs, reaching about 100,000 families annually. We also learned that although the specialists were pleased with the programs, only a few had tried to evaluate them.1

Joint Evaluation

By now, it was 1983 and our team had shrunk to five specialists.2 Intrigued by the results of our survey, we decided to do a multistate evaluation to determine more fully the impact of the programs on recipients and to learn which kinds of parents found the series most useful. We invited family life specialists from each of the other user states to join us in this venture. Eight accepted our invitation; five were able to follow through.3 Our study consisted of evaluating programs in 10 states: five served by the invited specialists and five by the team members.

With one exception, each state's series differed from the rest in length, format, style, and reading level, but a content analysis showed that content and goals were generally similar. We agreed on common program objectives, devised a common post-series questionnaire, and determined a standard method of administering the questionnaire and eliciting returns.

Each of the 10 state specialists duplicated and mailed her own questionnaires. My staff and I coded, tabulated, and analyzed the data. Computer costs came from the University of California Cooperative Extension budget.

By mid-1985, we had analyzed the 2,263 usable questionnaires received, a 58% return rate. Our analysis gave us aggregate and state data about the kinds of parents who received the series, the extent to which they read and shared the booklets, their rating of the usefulness of each subject, and their view of the impact of the series on their parenting attitudes and practices. Our analysis also identified those kinds of parents who were most likely to report changes in parenting as a result of receiving the series.
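To make the mechanics concrete, the short sketch below shows how a return rate and the aggregate and state-by-state breakdowns described here could be tabulated with today's tools. The file name, the mailing count, and the column names are hypothetical, and the sketch is illustrative only; it does not reproduce the coding or analysis actually used in the study.

# Illustrative sketch only: "questionnaires.csv", the mailed count, and the
# columns "state" and "reported_change" (1 if the parent reported a change
# in parenting, else 0) are assumptions, not details from the study.
import pandas as pd

responses = pd.read_csv("questionnaires.csv")  # one row per usable questionnaire

mailed = 3900                      # hypothetical number of questionnaires mailed
usable = len(responses)
print(f"Return rate: {usable / mailed:.0%} ({usable} of {mailed})")

# Aggregate view across all participating states
print(f"Reported a change in parenting: {responses['reported_change'].mean():.0%}")

# State-by-state view, comparable to the per-state printouts sent to specialists
by_state = responses.groupby("state")["reported_change"].mean()
print(by_state.sort_values(ascending=False))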

Using the Results

State printouts and aggregate data reports were sent to each of the 10 participating state specialists to use as she wanted. A few months later, and again a year later, we contacted family life specialists in each of the 10 states to find out if and how they had used the information.

We learned that our data didn't end up filed and forgotten. All 10 family life specialists had used their state data in national and state reports and had sent summaries of it to their county staff. Three had used the data in reports to funding agencies; two had sent material based on the data to key state legislators.

All but one state specialist reported that the evaluation results had had a significant impact on her state's parenting program. Four states extended their home-learning series to reach parents of older children. Two improved the content of their series, using top-rated items from other states. Three state specialists increased their distribution of the series to those parents shown by our evaluation to be most responsive to the program (teenage, Hispanic, and low-income parents). Three state specialists shifted from batch mailing to monthly mailing of their booklets because our analysis showed that parents who received the booklets monthly were most likely to show improved parenting practices. Seven state specialists successfully used the results of our study to promote increased program funding for postage, printing, and/or series revision.

Implications

Multistate evaluations bring with them benefits and challenges. Collaborating takes time. Any one of us working alone could have evaluated our program more quickly. It took time to come to agreement on evaluation goals, direction, and methodology. Work was delayed by team members' heavy work schedules, study leaves, vacations, etc. To produce multistate evaluations on a tight schedule, all participants would need to commit some unencumbered time to the evaluation process; this isn't always easy to do.

On the other hand, our multistate evaluation gave us more credible and useful data than we could have obtained through individual state evaluations. Aggregating our data gave us convincing numbers, ones we could work with statistically. Jointly, we received over 2,000 evaluation responses; had we done separate state evaluations, each of us would probably have received only about one-tenth of this number. The common evaluation also gave us the opportunity to compare the impact of slightly different home-learning series. We also saved time and money by tabulating and analyzing our data in one location.
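The statistical advantage of pooling can be made concrete with a rough calculation. The sketch below compares the approximate 95% margin of error for an estimated proportion at the pooled sample size with that at roughly one-tenth the size; the 0.5 proportion is a worst-case placeholder, not a figure from the study.

# Back-of-the-envelope sketch: the margin of error for a proportion shrinks
# roughly with the square root of the sample size, so pooling ten states'
# responses gives noticeably tighter estimates than any one state alone.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for an estimated proportion."""
    return z * math.sqrt(p * (1 - p) / n)

pooled = 2263          # usable questionnaires across all ten states
single = pooled // 10  # roughly what one state alone might have received

print(f"Pooled (n={pooled}):  +/-{margin_of_error(pooled):.1%}")
print(f"Single (n={single}): +/-{margin_of_error(single):.1%}")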

Our much discussed and revised evaluation methodology and instruments are probably better than any we could have devised individually. Working together, we prodded each other to keep the evaluation moving, pondered jointly our evaluation dilemmas, and, when the results were in, shared ideas on interpreting the data and using the information to improve our programs. Finally, and maybe most importantly, our joint endeavor has encouraged us to continue to work together on other issues, share program ideas, and struggle with common problems.

Extension is moving toward increased national issue-oriented programming that will likely mean more cooperative state programming to address national priority problems. As part of this effort, we'll need to measure the extent to which these programs succeed in reducing the problems they address, and thus evaluation will transcend state borders. Our experience has convinced us that this kind of multistate evaluation is practical, effective, and exciting.

Footnotes

1. For more information on the results of this survey see: Dorothea Cudaback and others, "Becoming Successful Parents: Can Age-Paced Newsletters Help?" Family Relations, XXXIV (April 1985), 271-75 and Patricia Nelson and Dorothea Cudaback, "Catch Them When You Can: Sequencing Newsletters to Capture the Teachable Moment," Journal of Extension, XXIII (Summer 1985), 13-15.

2. State family life specialists on the team were: Dorothea Cudaback, California; Cindy Darden, Georgia; Dorothy Labensohn, Iowa; Patricia Nelson, Delaware; and Emily Wiggins, South Carolina.

3. The following five state family life specialists collaborated with our team in this evaluation: Alberta Johnson, Arizona; Sally Kees, Nevada; Martha Lamberts, Washington; Frances Wagner, North Carolina; and Evelyn Rooks-Weir, Florida.