June 1994 // Volume 32 // Number 1 // Feature Articles // 1FEA6

A Family Life Program Accountability Tool

The Cooperative Extension Program Evaluation Survey (CEPES) was created as an accountability tool for research in family life programs. CEPES measures seven dependent variables concerning family life, collects demographic information, and controls for potential changes in family strain levels. As a pilot, a pretest and posttest were given to 13 different audiences (N = 244) in two states. In the pilot test, family life programs in parenting, communication, conflict resolution, stress management, and balancing work and family were evaluated. Results from CEPES can be used by program leaders, department heads, specialists, and county agents to assess current programs. The assessments can be used to validate programs to clientele, county commissioners, state and national legislators, and funding agencies.

Robert J. Fetsch
Professor and Extension Specialist
Human Development and Family Studies
Colorado State University Cooperative Extension
Internet address: fetsch@lamor.colostate.edu

Deb Gebeke
Family Science Specialist
Cooperative Extension Service
North Dakota State University

Nationwide, Cooperative Extension professionals are responding to the public's requests for parenting, communication, and other social and economic programs. In offering these educational programs, Extension must be accountable and measure results. We need to establish results that speak to the people's needs and to stakeholders' expectations.

Although the Cooperative Extension System has a long history of helping people help themselves, many Extension professionals have difficulty demonstrating the results and impacts of their educational programs. If Extension faculty can aggregate and compare preventative educational program results both within and across state lines, we can begin to provide the kind of data that our supporters at the state and national levels can use to speak with conviction about our results to funders. This article describes the ongoing testing of a pretest-posttest tool and a method for providing valid, reliable data about family life programs to Extension clientele and other stakeholders.


Across the United States, a need exists for research-based, social and economic educational programs that strengthen families. In a recent national study, when asked how critical and urgent they considered "strengthening the family" as a national priority, Americans ranked it seventh most critical of 33 social and economic issues (Jenson & Warstadt, 1990). According to a statewide Colorado needs assessment, "strengthening the family should be a national priority" was ranked first of 33 social and economic issues by a sample of selected local people (n = 344) and third most critical by a sample of Extension advisory committee members (n = 384) (Weigel, Fetsch, Jenson, Yang, & Rogers, 1992).

Because strengthening families was a critical issue in Colorado and North Dakota, the authors responded to specific requests by providing preventative educational programs (parenting, communication, problem-solving, balancing work and family, stress and time management) and by evaluating program results using a new tool as pretest and posttest measures of program participant changes.

One of the Cooperative Extension System's special contributions to our nation is its research-based educational programs that invite families to identify their strengths and marshal their resources so they can meet family members' changing needs. Another is its use of land grant university program evaluation resources to develop tools and methods that determine which programs work best with whom.


The objective of this study was to pilot test the Cooperative Extension Program Evaluation Survey (CEPES) (Fetsch, 1994), which provides results data on seven dependent variables: behavioral changes, support for the use of tax dollars, family coping, quality of life, self-esteem, stress levels, and depression levels. Validity coefficients of the three subscales by Hamilton McCubbin (family strain, family coping, and quality of life) are .87, .80, and .82, respectively; reliability indexes are .69, .71, and .76, respectively (McCubbin, 1987; McCubbin, Olson, Lavee, & Patterson, 1985). CEPES also provides demographic information and controls for potential changes in family strain levels.
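The reliability indexes cited above are internal-consistency estimates. As an illustration of how such an index can be computed, the sketch below implements Cronbach's alpha in Python; the function name and the item responses are hypothetical assumptions for demonstration, not CEPES data or the McCubbin scoring procedure.

```python
# Cronbach's alpha: an internal-consistency reliability estimate for a
# multi-item subscale. The item data below are purely illustrative.
def cronbach_alpha(items):
    """items: list of per-item response lists, one value per respondent."""
    k = len(items)      # number of items in the subscale
    n = len(items[0])   # number of respondents

    def variance(xs):   # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(item) for item in items)
    # Each respondent's total score across all items in the subscale.
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Hypothetical example: three items answered identically by four respondents
# give perfect internal consistency (alpha = 1.0).
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```

In practice, an alpha near the .69-.76 range reported for the McCubbin subscales indicates acceptable reliability for group-level program evaluation.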


The problem we faced was how to evaluate the legitimate results of family life programs. Traditionally, Extension educators have used fairly "soft" instruments to obtain knowledge, attitudinal, and behavioral results according to the Bennett hierarchy (Bennett, 1975, 1980). These were acceptable for convincing many people inside Extension of the legitimacy of family life programs. Today, however, as revenues become scarcer, Extension faculty are questioned about the differences their programs make. Legislators who have to decide which programs to fund are becoming more sophisticated about legitimate quasi-experimental methodology.


For more than a decade, Cooperative Extension family life specialists have collaborated to develop, test, and share methods for obtaining results of family life programs (Fetsch, 1991). A number of state specialists worked with Hamilton I. McCubbin, Dean, School of Family Resources and Consumer Sciences at the University of Wisconsin-Madison, to identify and test short, sensitive, common, valid, and reliable instruments to collect empirical data documenting family life program impacts. McCubbin gave permission for Extension family life specialists to use his copyrighted instruments. We used four questions to guide our selection from among these instruments:

  1. Which concepts are we likely to teach in our stress management and balancing work and family programs?

  2. Which subscales have acceptable validity and reliability levels?

  3. Which subscales and items appear sensitive to the impacts of our educational programs?

  4. What impact data can we aggregate across state lines?

From these instruments, the authors developed the Cooperative Extension Program Evaluation Survey (CEPES) (Fetsch, 1994). To pilot test CEPES, we evaluated family life programs in parenting, communication, conflict resolution, stress management, and balancing work and family with 13 different audiences (N = 244) in two states. We used CEPES pretests early during the educational programs. For immediate feedback from participants during the workshops, we guided them in scoring two subtests (family coping-coherence and quality of life). We provided nationwide norms reported by McCubbin, Olson, Lavee, and Patterson (1985) so participants could see how their scores compared with those of others. We used a 10-Step Method to incorporate the self-assessment component into the content of the educational program, to guide participants through the evaluation process, and to "sell" them on completing and returning posttests (Fetsch, 1993). We mailed out CEPES posttests two to five months later.
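A pretest-posttest design like the one described above is typically analyzed by comparing each participant's matched scores. The Python sketch below shows one common approach, a paired comparison yielding the mean change and a paired t statistic; the scores are hypothetical (the actual CEPES data are not reproduced in this article), and this is one standard analysis, not necessarily the authors' exact procedure.

```python
import math

def paired_t(pre, post):
    """Mean change and paired t statistic on posttest - pretest differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the differences (n - 1 denominator).
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    se = math.sqrt(var / n)          # standard error of the mean difference
    return mean, mean / se

# Hypothetical matched subscale scores for eight workshop participants.
pre  = [12, 15, 11, 14, 13, 10, 16, 12]
post = [15, 16, 14, 15, 16, 13, 18, 14]
mean_change, t_stat = paired_t(pre, post)   # mean_change is 2.25 here
```

Because each posttest is matched to the same participant's pretest, the paired design controls for stable individual differences, which is why posttests mailed two to five months later must be linkable back to the original pretests.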

Evaluation and Impact Data

So what? How effective are our programs? According to the Bennett hierarchy (Bennett, 1975, 1980), a critical indicator of the effectiveness of an educational workshop is whether or not participants improve their behaviors as a result. Programs at all 13 sites resulted in self-reports of positive behavioral changes as a direct result of the program. Between 50 and 88 percent of respondents reported making one to three positive behavioral changes. They changed in four major ways, as indicated by positive responses to the following statements: (a) we seek encouragement and support from friends, (b) we've improved our financial management practices, (c) I manage conflict better, and (d) we've increased our use of effective family coping strategies. Participants in the programs also indicated positive attitudes about the use of tax dollars to support such educational programs; support across the groups ranged from 73 to 100 percent.


The Land Grant and Cooperative Extension Systems must address social and economic conditions; the needs of the people demand that we do so. At the same time, we must be accountable for programming results. Extension professionals interested in addressing the social and economic issues of their constituents can now assess their program results using CEPES, which is timely, has good reliability, and is superior to some of our previously used program evaluation tools.

CEPES is useful to county agents and specialists with family life programming responsibilities. It could be used by 4-H youth development professionals to determine their leader training results. CEPES could be applied by agricultural professionals who see the potential for enhancing the human relationship skills of farm and ranch families.

CEPES may also be useful to Cooperative Extension program leaders and Department Heads who want to assist their specialists and agents in being accountable and in comparing and aggregating the results of programming efforts. These evaluation results may then be used to drop or modify weak, ineffective programs, to strengthen strong ones, and to report results to clientele, county commissioners, state and national legislators. CEPES can help us do what we do best--provide effective research-based programs.

References

Bennett, C. F. (1975). Up the hierarchy. Journal of Extension, XII(March/April), 13-22.

Bennett, C. F. (1980). Teaching materials on "Seven levels of evidence": A guide for extension workers (ESC-575, Supplement 1). Washington, DC: United States Department of Agriculture.

Fetsch, R. J. (1991, November). Preliminary Colorado results using cooperative extension program evaluation survey (CEPES). Paper presented at the National Meeting of Extension Specialists NCFR Pre-Conference, Denver, CO.

Fetsch, R. J. (1993). Help us help you: 10 steps to results. Unpublished paper, Colorado State University, Department of Human Development & Family Studies, Fort Collins.

Fetsch, R. J. (1994). Cooperative extension program evaluation survey: Pretest & posttest. Unpublished paper, Colorado State University, Department of Human Development & Family Studies, Fort Collins.

Jenson, G. O., & Warstadt, T. (1990). A ranking of critical issues facing American families. Logan: Utah State University Cooperative Extension.

McCubbin, H. I. (1987). FIRA-G family index of regenerativity and adaptation--general. In H. I. McCubbin & A. I. Thompson (Eds.), Family assessment inventories for research and practice (pp. 285-302). Madison: University of Wisconsin.

McCubbin, H. I., Olson, D. H., Lavee, Y., & Patterson, J. M. (1985). The family paradigm album: Family invulnerability test stress, strengths and adaptation. St. Paul: University of Minnesota.

Weigel, R. R., Fetsch, R. J., Jenson, G. O., Yang, R. K., & Rogers, D. L. (1992). Issues validation: A new environmental scanning technique for family life educators. Family Relations, 41, 251-255.