Winter 1993 // Volume 31 // Number 4 // Feature Articles // 4FEA4


Why Programs Aren't Implemented as Planned

Abstract
Rather than bemoan the seemingly unmanageable situation and berate those who don't accomplish their stated work plans, perhaps we should embrace and strive to better understand the current planning practice as more purposeful and productive in a multi-influence, constantly changing environment. ... Given the dynamic, multi-influential environment in which learners reside and the educational experience takes place, implementing programs as originally planned may, in fact, be counterproductive to Extension goals.


Lorilee R. Sandmann
Director, Outreach Program Development
Michigan State University-East Lansing
Internet address: sandmann@msu.edu


Much attention is devoted in adult and Extension education literature, teaching, and practice to the planning phase of program development. A tremendous amount of time, resources, and energy goes into situational analysis, problem identification, and needs assessment to produce Extension plans of work. This emphasis has recently accelerated with the focus on emerging issues management and multiyear planning cycles.

Yet we know little about the effectiveness of these planning processes. For example, a study of a randomly selected sample of 36 front-line Extension officers from four Caribbean national extension services found that 89% of the officers had well-written, specific, and action-oriented work plans.1 In analyzing the resulting programs, however, only 8% of the programs were implemented as specified in the written work plan. In most cases, Extension officers modified their written plans. Although this study was done in an international setting, the results are similar to those observed within the U.S. Cooperative Extension System and provide an instructive basis for analyzing our planning process.

Why, then, is there such a discrepancy between planning and implementation? Diagnosing the problem as faulty planning may be too simplistic. Clues come from more fully understanding the 1990s community-based programming milieu widely applied in Extension.

Implementation Impediments

Constraints to implementation of work plans identified in the Caribbean study included:

  • Shifts in staffing patterns (vacancies, extended leaves, transfers, inexperienced officers).
  • Budgetary constraints.
  • Unplanned activities (poor organization, unrealistic projection of time, special assignments).
  • Inadequate supervision.
  • Lack of organizational supports (infrastructure, specialist assistance, educational materials).
  • Unfavorable climatic conditions.
  • Import or export market uncertainty.
  • Changes in support from a cooperating agency or commodity group.
  • Lack of participation, sustained interest, and commitment from constituents/learners.

These and other factors, usually described by programmers as pressures, problem sources, or resources, influence educational program decision making throughout the life cycle of a program. The four-quadrant framework presented in Figure 1, adapted from Forest,2 represents these multiple influences on community-based Extension programming and educators.

Figure 1. Sources of influence on program development decisions.

In the Caribbean study, these influences were shown to change frequently, and were often unpredictable. As one Extension officer said, "The ministry might come up with something and throw things off balance. You can't complete things as you have planned."

Congruent or Competing Influences

To fully understand current programming practice, one more dimension must be considered. In addition to the nature and number of dominant influences, the congruency or competitiveness of those influences also affects programs. Influences can range from congruous and mutually supportive to competitive and conflicting. The extent to which the influences are congruent or competitive affects the Extension educator's practice and the clarity, fluidity, and outcomes of the resulting program.

The pattern of interrelationships among influences is depicted by the congruence-contingency model shown in Figure 2. The level of dominance of individual influences appears on a vertical continuum from even (many dominant influences) to uneven (one or a few dominant influences); the level of congruence among influences appears on a horizontal continuum from congruent to incongruent. The relative size of the circles and their physical relationship (that is, the way in which the circles mesh) represent the different relationships among programming influences. The four cells describe the influence configurations and the type of program that results.

Figure 2. Relationship of influence to programming practice.

In a program in which the influences are both congruent and evenly distributed, as shown in Cell 1, all four influence systems contribute equally to promoting the program. In some programs, the dominance of the influences may be even, but the influences compete with one another. As shown in Cell 2, the result is a program that may not live up to its potential. For example, in one Caribbean Extension weed control program, the influences appeared to be congruent in the beginning. However, halfway through the program, the major community employer began to compete with the agent's goals and with clientele participation. The program died.

In some programs, the influences may be congruent, but one influence dominates the others, as shown in Cell 3. In these programs, the programmer leads through the dominant approach, while negotiating and incorporating the other influences when necessary and appropriate. A rabbit production program is an example of a dominant institutional approach: it was mandated, funded, and staffed by the Extension unit. Although the Extension officer started from that institutional base, she also developed local technology, clientele need, and markets, as well as personal expertise.

Yet in other programs, one influence is dominant and the other, competing influences are disregarded, as illustrated in Cell 4. An example is an institutionally imposed program that doesn't "fit" the community or for which markets aren't aligned.

Fluid, Generative Programming

Again, why the discrepancy between work plans and implementation? Based on this understanding of the current programming model, at least three interpretations can be offered. One interpretation is that Extension educators are unable to do contemporary planning. It could be hypothesized that the more competently and thoroughly the four influence areas are analyzed, in both their current and anticipated status, the more likely the resulting plans would be implemented.

Another interpretation is that written work plans are an organizational artifact, designed to meet the needs of the institution. Given the multi-influence world confronting programmers, deviations from the plan would be expected as the program is made more congruent with other influences.

Finally, another view is that written plans of work are based on a more linear, rational process than Extension educators actually follow. In fact, by tracing particular programs through the experience of their initiators, as the Caribbean study did, insights are gained into the day-to-day business of Extension education. Program decision making was seldom the logical, systematic process suggested by educational planning manuals and management textbooks.3 The educators studied didn't see program development as a step-by-step process; instead, they practiced it as an interactive process, one that was constantly changing and shaping their programs.

During the life of the programs, adult educators were constantly making decisions about how to define and implement their programs, as well as about the future directions those programs should take. They spent much of their time trying to alleviate the tensions that resulted from conflicts between the time available and inevitable interruptions, and among differing organizational, clientele, community, and personal goals. These Extension officers operated in an open system characterized by complex and unpredictable interdependencies; they worked with contradictions, ambiguity, and change. As a result, many of them proceeded in a generative manner. Under these conditions, deviations from the plans are to be expected.

Flexible Planning Model

Rather than bemoan the seemingly unmanageable situation and berate those who don't accomplish their stated work plans, perhaps we should embrace and strive to better understand the current planning practice as more purposeful and productive in a multi-influence, constantly changing environment.

Extension needs an organizational belief statement that effective programming involves continual formative evaluation of both the process (planning and implementation) and the product (the plan). To support this belief, a program planning approach is also needed that accepts the uncertainty and opportunities present in the program delivery environment, such as the one represented by the "Flex" Program Planning Model4 (see Table 1). This conceptualization incorporates strategic implementation considerations into an ongoing program planning process. It gives Extension educators permission to "flex" a program in differing directions, while remaining true to the organizational mission and agreed-on or adjusted outcomes. In many respects, this model formalizes how skillful educational programmers already work, though they are often perceived as mavericks for doing so.

Table 1. "FLEX" program planning model.

First: Connect to mission and clarify

What do the organization's mission, record of achievement, and current administrative direction suggest that programs should primarily accomplish:
  1. A specific behavioral change by a specific audience?
  2. The delivery of a program at a certain level of quality?
  3. The delivery of a certain quantity of programs to a certain quantity of audience members?

Second: Evaluate, commit, prepare resources for program use

  1. What program provider staff and support staff are available for this program, and when?
  2. What financial resources, instructional resources, space, and support of cooperating organizations are available, and when?
  3. Is a staff development process required prior to, or during, program delivery to meet program objectives?
  4. For behavioral change objectives, what audience preparation and commitment development activities need to take place prior to "start up"? What type of activities and resources will be needed for reinforcing program audiences during the course of the program?

Third: Plan according to expectations, realism, and alternative program delivery possibilities

  1. Develop program objectives to match the outcomes sought for the program.
  2. Determine whether the program delivery approach will be fully predetermined or will be "free form," with a starting process but flexibility to vary delivery methods.
  3. Develop "realistic" program implementation timetable and responsibility assignments in accordance with resource availability and capability.
  4. Develop evaluation criteria, process, and timetable in accordance with program objectives.

Fourth: Develop a "program flex" strategy to accompany the program plan

  1. Identify special opportunities or situations that might occur during program implementation and that could require program expansion or alteration.
  2. Identify other organizational priorities or needs that may divert resources from the program.
  3. Develop one or more strategies by which the program could expand, retract, or refocus, based on a change of situation.
  4. Confirm the method by which permission to refocus should be obtained; negotiate and obtain approval for both program plan and "flex" strategy.

Fifth: Execute the program plan with constant monitoring to determine degree of "flex" needed

  1. Develop benchmarks and indicators as to when program "flex" should be considered.
  2. Execute appropriate changes in program according to "flex" situation, and with timely communication to administration.
  3. Fully document program changes.
  4. Review program evaluation design criteria to determine impact of program "flex."
  5. Identify and document the learning from program "flex" situations that should be reflected in the next planning cycle.

Given the dynamic, multi-influential environment in which learners reside and the educational experience takes place, implementing programs as originally planned may, in fact, be counterproductive to Extension goals.

Footnotes

1. L. R. Sandmann, Educational Program Development Approaches Associated with Eastern Caribbean Extension Programs (Ph.D. dissertation, University of Wisconsin-Madison, 1989).

2. L. Forest, Working with Our Publics: Module 4: Situation Analysis (Raleigh: North Carolina State University, 1988).

3. M. W. McCall, Jr., and R. E. Kaplan, Whatever It Takes: Decision Makers at Work (Englewood Cliffs, New Jersey: Prentice-Hall, 1985).

4. L. Granger and L. R. Sandmann, The Five-Step "Flex" Program Planning Model (Unpublished concept paper, Michigan State University, East Lansing, 1993).