October 2001 // Volume 39 // Number 5 // Feature Articles // 5FEA2


Two (or More) Heads Are Better Than One: An Application of Group Process to Developing Extension Evaluation Tools

Abstract
This article describes a process used to design a statewide evaluation tool for parenting education programs. Domains of successful parenting were identified using a nominal group process. Indicators were developed for five domains and were pilot tested in six counties. The resulting instrument was easy to use and produced reliable results that could be aggregated at the state level. Several shortcomings were identified and will be addressed. The involvement of county faculty in this process was a key to its success.


Millie Ferrer
Associate Professor, Department of Family, Youth, and Community Sciences
University of Florida
Gainesville, Florida
Internet Address: ferrer@mail.ifas.ufl.edu

Steve Jacob
Assistant Professor, Department of Family, Youth, and Community Sciences
University of Florida
Gainesville, Florida
Internet Address: sgj@ufl.edu

Theresa M. Ferrari
Assistant Professor, Department of Human and Community Resource Development
The Ohio State University
Columbus, Ohio
Internet Address: tferrari@postoffice.ag.ohio-state.edu


Background and Rationale

Evaluation has become critical to the functioning of Extension programs. Through the Agricultural Research, Extension, and Education Reform Act (AREERA) of 1998, the Federal government requires Extension systems to report state-level outcomes that inform funding decisions. Many states are placing similar demands on their Extension systems. Additionally, many states and counties have moved to performance-based budgeting, which requires state-level evaluation data for benchmarking purposes.

One way that the Florida Cooperative Extension Service works to achieve accountability is through state major programs and design teams (Taylor & Summerhill, 1994). One such major program is Successful Parenting and Family Development. A significant number of Florida counties provide educational programs to address the needs of parents. The Successful Parenting and Family Development Design Team provides leadership to activities that support these efforts, such as:

  1. Identifying or developing educational materials,
  2. Providing in-service training, and
  3. Facilitating and supporting evaluation activities.

The membership of the design team consists of Extension specialists and county faculty from each of the five districts in Florida.

In the spring of 1999, Florida sent representatives from the Successful Parenting and Family Development Design Team to the Southern Region Accountability Workshop held in Atlanta, Georgia. Extension professionals representing 13 states met to develop a performance-based accountability model for parenting education programs to be used by Extension professionals. Such a performance plan would provide a basis for communicating the nationwide impacts of Extension's parent education programs more effectively.

In this regional meeting, participants first discussed major issues facing children and families throughout the nation. Subsequently, sub-groups of state representatives met and worked together to:

  • Prioritize the most relevant issues,
  • Share program ideas, and
  • Collectively formulate a set of strategic objectives from which to measure performance.

After each sub-group shared its ideas, individual states met to draft a list of possible core domains for their parent education programs. Each state then reported to the larger group the domains it was willing to work on after returning to its respective state.

The Southern Region Accountability Workshop gave Florida the impetus to continue working toward a statewide evaluation strategy. Our goal was to refine these core domains, with indicators that were general enough to be used with most programming related to successful parenting and family development. The evaluation strategy was also to be based upon the following programming parameters:

  1. Current program delivery efforts of county faculty,
  2. County and state faculty identification of state major program objectives, and
  3. Current national and state models for parent/family education.

Methods

In addition to the draft of the core domains identified at the Atlanta workshop, other possible domains were obtained from various sources. These sources included the National Extension Parent Education Model (Smith, Cudaback, Goddard, & Myers-Walls, 1994), Parent Training Today (Alvy, 1994), Alabama Children's Trust Fund Evaluation Manual (Goddard, Smith, Mize, White, & White, 1994), and Nurturing Program for Parents and Children 4 to 12 Years (Family Development Resources, no date). These source materials are typically used by Extension professionals.

Several common themes emerged from these resources. These themes, shown in the first column of Table 1, include:

  • Stress management and support,
  • Age-appropriate behavior,
  • Discipline,
  • Communication and intellectual development,
  • Connecting with community,
  • Anger management,
  • Physical care giving, and
  • Healthy self-esteem.

The members of the state major program team needed a great deal of input if these domains were to be useful at all levels: county, state, and federal. Previous efforts to develop state-level evaluation tools had failed because there was insufficient input from county faculty and because the process and measures were too complex. Specifically, the questions were often written at too high a reading comprehension level for the audience being served, and far too many questions were developed. The large number of questions enhanced the reliability of the measures, but it made the instruments too long to use effectively with the audience.

The county faculty quickly realized this and refused to use the state-provided evaluation tool. At a statewide in-service training, county faculty were introduced to the draft of the core domains. By having the agents participate in the process of refining the initial domains formulated in Atlanta and by possibly adding additional domains based on their identified teaching objectives, we hoped to be more successful this time in developing a statewide evaluation tool.

Table 1
Summary of Potential Core Domains of Successful Parenting


The following process, based on the nominal group technique (Delbecq, Van de Ven, & Gustafson, 1986), was used.

  1. Faculty representing 26 counties voted on the top five (out of eight) domains of successful parenting that best represented their programming objectives (see Table 1, Column 1).

  2. State specialists compiled the first wave of county faculty input and shared the overall results. Several domains received tie votes, so county faculty participated in a second wave of voting and again chose their top five. From this process we identified the five core domains receiving the highest degree of support: Stress Management and Support, Age-Appropriate Behavior, Discipline, Communication, and Healthy Self-Esteem. (A sketch of this tallying appears after the list.)

  3. County faculty elected a representative from each of the five districts in Florida to assist with preparing an evaluation tool to address the five core domains.

  4. The committee met and identified five to seven specific indicators for each parenting domain. These indicators were derived from the academic literature, from existing indicators used by county faculty, and from brainstorming by the committee members. The indicators were generic enough to be appropriate for use with various parenting curricula. Particular effort was made to keep the questions simple and easy to understand, to accommodate varying levels of literacy among program participants. To ensure readability, committee members shared draft question items with target audience members. After revision, the resulting instrument contained 27 specific indicators (see Table 2). The county faculty supported a post-only design because of its simplicity for respondents and ease of administration. The response categories were designed so program participants could indicate to what extent their behavior had changed compared to the start of the educational program (better, about the same, or worse).
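As a rough illustration of the tallying in steps 1 and 2, the sketch below shows one way such a two-wave vote could be counted. The ballots shown and the tie rule are hypothetical stand-ins, not the actual Florida data.

```python
from collections import Counter

# Hypothetical first-wave ballots: each county selects five of the
# eight candidate domains from Table 1. In the actual process,
# faculty representing 26 counties voted.
first_wave = [
    {"Stress Management and Support", "Discipline", "Communication",
     "Healthy Self-Esteem", "Age-Appropriate Behavior"},
    {"Stress Management and Support", "Discipline", "Anger Management",
     "Healthy Self-Esteem", "Connecting with Community"},
    # ... one ballot per county
]

# Count how many counties selected each domain.
tally = Counter(domain for ballot in first_wave for domain in ballot)
ranked = tally.most_common()

# A tie at the fifth-place cutoff cannot be resolved from the votes
# alone; in the Florida process, ties triggered a second wave of voting.
if len(ranked) > 5 and ranked[4][1] == ranked[5][1]:
    print("Tie at the cutoff: hold a second wave of voting.")
else:
    print("Top five domains:", [domain for domain, votes in ranked[:5]])
```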

Following the development of the new statewide instrument, six counties, four urban (central and south Florida) and two rural (north Florida), volunteered to pilot test the instrument in their parent education programs. Consistent with their plans of work, county faculty conducted programs primarily with low socioeconomic status audiences consisting of single parents, court-ordered parents, and Head Start families. Faculty selected the indicators reflecting the domains that matched their educational objectives. Consequently, the total number of respondents per item ranged from 370 to 402, depending on the specific indicators included (Table 2).
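Because each county administered only the indicators matching its objectives, per-indicator totals vary when responses are pooled statewide. The sketch below shows one way such pooling could be computed; the indicator names and responses are invented for illustration.

```python
import pandas as pd

# Hypothetical county data: one row per respondent, one column per
# indicator that county actually used. Responses are "better",
# "about the same", or "worse".
county_a = pd.DataFrame({
    "identify_stress_sources": ["better", "better", "about the same"],
    "avoid_spanking_yelling":  ["better", "worse", "better"],
})
county_b = pd.DataFrame({
    "identify_stress_sources": ["better", "about the same"],
    "praise_child_efforts":    ["better", "better"],
})

# Pooling leaves NaN where a county did not use an indicator, so each
# indicator's n reflects only the respondents who answered it (370 to
# 402 in the actual pilot).
pooled = pd.concat([county_a, county_b], ignore_index=True)
n = pooled.notna().sum()
pct_better = (pooled == "better").sum() / n * 100

print(pd.DataFrame({"n": n, "% better": pct_better.round(1)}))
```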

Results of the Pilot Test

The results of the pilot test are summarized in Table 2.

Stress Management and Support

Among the five items, respondents reported the highest incidence of doing better with "identifying sources of stress in your life" (88%). About 7 in 10 respondents reported that they were doing better with dealing with stress and practicing positive stress management techniques (70% and 68%, respectively). Participants were less likely to report doing better with taking a break when needed (57%) and asking for and receiving help from others (42%). The five items, when combined into a scale, produced a Cronbach's alpha reliability coefficient of .71.
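For reference, Cronbach's alpha for a scale of k items is

```latex
% Cronbach's alpha: sigma^2_{Y_i} is the variance of item i and
% sigma^2_X is the variance of the summed scale score.
\[
  \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
\]
```

Alpha approaches 1 as the items covary strongly relative to their individual variances. Values around .70 are conventionally regarded as acceptable for short scales, which is why five to seven items per domain still yielded usable reliabilities.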

Age-Appropriate Behavior

Over 93% of respondents reported doing better with understanding their children's behavior, and nearly 90% reported doing better with knowing what to expect at their child's age. Over 70% reported doing better at creating an environment where their children can grow and learn. About half of the respondents reported doing better with not comparing their child with other children and with matching learning activities to developmental level (58% and 52%, respectively). The Cronbach's alpha reliability coefficient for the five age-appropriate behavior items was .72.

Discipline

About 8 in 10 respondents reported doing better at understanding the causes of a child's misbehavior and responding to their child's positive behavior. About 60% reported doing better with avoiding spanking and yelling. For the remaining items (being consistent with rules, guiding their children's behavior, setting reasonable limits, and providing choices to make decisions), 45% to 55% of respondents reported doing better. The seven discipline items produced an alpha reliability of .70.

Communication

About 80% of respondents reported doing better with not criticizing their children and with listening to their children. About three-quarters reported doing better at communicating as a family. Over half reported doing better at sharing feelings. Finally, only 40% reported doing better at letting their children express their feelings respectfully. The Cronbach's alpha reliability for the five communication items was .68.

Healthy Self-Esteem

About 87% of parents reported doing better at praising their children's efforts and helping them feel better about themselves. Over 70% reported doing better with providing opportunities for children to experience success. About 60% reported doing better with spending special time with each child and doing things together to create happy memories of childhood. The alpha reliability for the five self-esteem items was .77.

Table 2
Summary of Pilot Test Results


Discussion and Conclusions

The state major program team members were very encouraged by the results and the feedback from the process. County faculty who participated in the pilot test reported that the instrument was extremely easy to use. It also provided them with useful information that they could incorporate into their annual state accomplishment reports.

From a state-level perspective, the instrument provided information that can be aggregated even when county faculty use different curricula. Although one may question the validity of aggregating such data, it is acceptable here because the instrument measures general parenting domains rather than curriculum-specific content; the indicators are generic enough to be used with various parenting curricula.

In addition, the outcomes were very useful for demonstrating accountability. These results move beyond documenting numbers and program satisfaction, and measure improvement in parenting practices as a result of the educational programs.

Furthermore, the information obtained can help us shape topics for future parenting educational materials. For example, if the audience does not indicate a high level of improvement in one of the domains, new teaching materials could be developed to strengthen instruction in those skills.

Having a valid and reliable instrument that measures statewide impacts and is acceptable to county faculty is a long-awaited goal of the state major program team. However, there are some shortcomings with the instrument's response categories. In particular, the "about the same" category is vague. It is not clear if respondents did not adopt the behavior or if they were already doing the behavior before the class. From a program development perspective, it would be useful to know this information in order to recommend programmatic changes such as increasing time devoted to the topic or incorporating different teaching strategies. Also, we were unable to tell how much "better" respondents were doing when they responded "better."

Our next step is to make the necessary revisions without making the evaluation tool too complex or cumbersome for program participants. Again, county faculty will have input at our next statewide in-service training. Our goal is to revise the response categories, to consider adding domains and indicators, and to increase the number of counties using the instrument. We will continue to use the group process outlined here to achieve these goals.

In summary, from this experience we want to emphasize three main points.

First, the process used to generate the domains would be applicable to most program areas in Extension. Incorporating county faculty input can significantly improve the relevance of evaluation instruments. Because the nominal group process involves the very people who will use the instrument, the result is validated by them and, consequently, is more relevant to their program needs. This also helps ensure that the instrument will be adopted, which has been a significant problem in the past.

Second, the instrument developed proved to be very easy to use, yet sensitive enough to show program areas that are in need of improvement.

Third, the process has greatly advanced the statewide evaluation efforts of parent education programs, and we will continue to refine and add tools as we proceed into the new 4-year plan of work. When it comes to designing usable evaluation tools, two heads (or more) are better than one.

References

Alvy, K. T. (1994). Parent training today. Studio City, CA: Center for the Improvement of Child Caring.

Delbecq, A., Van de Ven, A., & Gustafson, D. (1986). Group techniques for program planning: A guide to nominal group and Delphi processes. Middleton, WI: Green Briar Press.

Family Development Resources. (No date). Nurturing programs. Park City, UT: Author.

Goddard, H. W., Smith, B. L., Mize, J., White, M. B., & White, C. P. (1994). The Alabama Children's Trust Fund evaluation manual. Auburn, AL: Alabama Cooperative Extension Service.

Smith, C. A., Cudaback, D., Goddard, H. W., & Myers-Walls, J. (1994). National Extension Parent Education Model. Manhattan, KS: Kansas Cooperative Extension Service.

Taylor, C. L., & Summerhill, W. (1994). Concept of state major programs and design teams (Fact Sheet PE-56). Gainesville, FL: Florida Cooperative Extension Service.