October 2003 // Volume 41 // Number 5 // Feature Articles // 5FEA4
Success Outcome Markers in Extension (SOME): Evaluating the Effects of Transformational Learning Programs
Abstract
Success outcome markers (SOMs), and the process of creating them, offer
Extension a new approach to plan, monitor, and evaluate programs. Generating
success outcome markers helps to carefully determine all partners (including
beneficiaries) who may need to change to accomplish program goals and
identifies steps to continuously track incremental successes. Hard-to-measure
human behaviors become more concrete when success outcome markers are
listed. To use SOMs successfully, one must (a) create a vivid and compelling
vision, (b) list the WHOs, (c) write an outcome challenge for each WHO,
and (d) determine SOMs, and then decide how to monitor and report on each
SOM.
Introduction
Transformational Extension Programs--Educational programs that transform people's lives, as well as the health and well-being of a community.
SOME--A five-step process for assessing the contributions that transformational programs make in achieving outcomes.
Transformational Extension programs are essentially about people relating to each other and their environment. They go beyond service, technology transfer, and facilitation to concentrated, in-depth programs that help individuals develop and grow. These programs address complex and interrelated issues in social, economic, political, and/or technological contexts. In transformational learning situations, people's behaviors, relationships, actions, and/or activities change to improve their own lives, as well as the health and well-being of a community.
Relationships between educational approaches and the subject matter context (Figure 1) illustrate Extension programming that ranges from short-term service programs to in-depth programs focusing on transformational learning (Williams, Dickey, & Hergert, 2001). Service activities tend to be specific responses to focused questions or information passed through an educator to the general public; they include answering clientele questions, promoting educational offerings, and providing information in crisis situations such as floods and droughts. Technology transfer programs provide an awareness of issues along with a more in-depth level of educational information to the learner through such efforts as publications, field days, health fairs, festivals, training events, invited presentations, Web page information, newsletters, personalized media columns, and special news feature stories.
Facilitation efforts bring together parties who see different aspects of a problem, constructively explore their differences, and search for (and implement) solutions that go beyond their own limited vision of what is possible (Taylor-Powell, Rosing, & Geran, 1998). For simplicity, each identified quadrant represents an important facet of Extension programming. However, these quadrants are interrelated, and when they are brought together into in-depth programming, transformational learning occurs with corresponding behavioral changes.
Figure 1.
Model of Extension Programming Adapted by Dean of Nebraska Cooperative
Extension, Elbert Dickey. (Original source unknown.)
Because of the "people factor" in the transformational quadrant, Extension has encountered numerous challenges in assessing and reporting transformational outcomes. The Extension system is under pressure to demonstrate that transformational programs produce significant and lasting changes in their clientele, yet these outcomes may result from multiple agency efforts, and no single agency can claim sole credit. Although assessing transformational outcomes is problematic, Extension needs to learn how to measure its contribution to program results.
To address this issue, Extension staff working in the areas of welfare-to-work, youth mentoring, capacity building for youth and families, juvenile diversion, and coalition building formed a small working group to Tell Extension's Success Stories (TESS). They started with the Outcome Engineering approach (Kibel, 1999, 2000a, 2000b) and used an appreciative inquiry process (Watkins & Mohr, 2001) to explore how the concepts might apply in Extension's transformational programming.
The TESS group used an iterative process over a 2-year period to think systematically, question their assumptions and mental models, engage in meaningful dialogue, and create visions that energized action. TESS also collaborated with sister agencies to test concepts in complementary types of programming. Success Outcome Markers in Extension (SOME) emerged (Jha, 2001).
Developing Success Outcome Markers
SOME assesses contributions that transformational programs make to the achievement of outcomes. While SOME can be used for monitoring at the project, program, or organizational level, it can also be used to evaluate ongoing or completed activities. SOME significantly alters the way a program understands its goals and assesses its performance and results. SOME uses a five-step model to confirm a vision of a social, economic, or environmental condition to which a program hopes to contribute (Figure 2).
For the actors within the program's sphere of influence, SOME first identifies the vision leaders have for a program; the mission then identifies how the vision will be carried out. Program partners and program beneficiaries (i.e., WHOs) are then listed. Outcome challenges are written for each WHO, and success outcome markers (SOMs), identifiable actions or behaviors that indicate successful accomplishment of the outcome, are established. Transformation is accomplished through fundamental behavioral changes in clientele; therefore, behavior change is the central concept of SOME (Kibel, 2000b; Rockwell, Jha, Williams, & Thayer, 2000; Jha, 2001).
Figure 2.
Five-Step Process for Identifying Success Outcome Markers (SOMs)
Step One: Create the Program Vision
A program vision is a vivid and compelling description of a transformed reality one intends to be a partner in creating. It uses present tense to describe the optimum social, economic, or environmental condition the program hopes to help bring about, as well as a broad behavioral change in the primary clientele.
The vision goes deeper than program objectives, is broader in its scope, and extends over a longer term. The vision represents the ideal social, economic, or environmental condition the program wants to support; it should be inspirational and broad enough to remain relevant over time. The vision statement is used throughout the programming cycle to ensure that activities are consistent with its intent (Kibel, 2000b; Rockwell et al., 2000; Jha, 2001).
While achieving the vision usually lies beyond the program's potential, program activities contribute to, and facilitate, the transformed reality. Evaluation will measure the program's contribution to the vision, not the achievement of the vision.
Example of a Vision Statement
At-risk youth who participate in Juvenile Diversion, a community-based program to divert first-time offenders from the court system, are successful in school, do not re-offend, and contribute to the community.
Step Two: Describe the Mission
The mission statement tells how the program will carry out the vision. It describes the domain in which the program supports the vision rather than specific activities the program will use. It's an ideal statement that describes how the program will contribute as it supports the vision (Kibel, 2000b; Rockwell et al., 2000; Jha, 2001).
Example of a Mission Statement
The Juvenile Diversion program supports the vision by connecting first-time juvenile offenders and their parents with a multi-disciplinary professional team from various community agencies. It is built on research-based, relevant information that identifies appropriate activities to modify behaviors so youth can make positive life choices.
Step Three: List All WHOs
WHOs are individuals, groups, or organizations who work together to achieve program success. WHOs (comparable to the term "stakeholders") include those who can influence a program, as well as those the program directly targets. If the program does not directly interact with a WHO, it determines the persons the program can influence who will, in turn, interact with the WHO.
In this way, the program stays within its sphere of influence, but with a broader vision (Kibel, 2000b; Rockwell et al., 2000; Jha, 2001). For example, a Juvenile Diversion program may not be able to interact with the entire police force directly, but it can interact with the police chief who can influence the police force. Therefore, the police chief would be included in the list of WHOs, but the police force would not.
When listing WHOs, the program includes partners as well as program beneficiaries. Generally, WHOs fall into four categories (illustrated in the sketch after the example below):
- Primary Beneficiaries: the program's target population for whom the program
works to improve social, economic, or environmental conditions.
- Partners: individuals, agencies, or organizations that cooperate or offer
interrelated services to the same primary beneficiaries. Extension partners
can include educators, specialists, program assistants, and volunteers. Other
community partners may be from education, government, or human service agencies/organizations.
- Catalysts & Overseers: individuals, groups, or organizations that have
the power to promote, block, or otherwise influence how the primary beneficiaries
are reached and affected. Included in this group are program funders, advisory
boards, as well as others to whom change agents may report.
- Change Agents: persons who develop or implement best practices. Included in this list are those who design or teach research-based programs to primary beneficiaries in response to identified needs (Kibel, 2000b; Rockwell et al., 2000; Jha, 2001).
Example of WHOs for the Juvenile Diversion Program
First-time youth offenders
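For program teams that keep their planning records electronically, the short Python sketch below shows one way the four WHO categories might be recorded. The category names follow the list above; the specific Juvenile Diversion entries are hypothetical illustrations added for this sketch and are not part of the SOME method itself.

```python
from dataclasses import dataclass
from enum import Enum


class WhoCategory(Enum):
    """The four categories of WHOs described above."""
    PRIMARY_BENEFICIARY = "Primary Beneficiary"
    PARTNER = "Partner"
    CATALYST_OR_OVERSEER = "Catalyst & Overseer"
    CHANGE_AGENT = "Change Agent"


@dataclass
class Who:
    """A single WHO: a person, group, or organization the program works with."""
    name: str
    category: WhoCategory


# Hypothetical WHO list for a Juvenile Diversion program (illustrative only).
whos = [
    Who("First-time youth offenders", WhoCategory.PRIMARY_BENEFICIARY),
    Who("Parents of participating youth", WhoCategory.PARTNER),
    Who("County attorney's office", WhoCategory.CATALYST_OR_OVERSEER),
    Who("Extension educator teaching diversion classes", WhoCategory.CHANGE_AGENT),
]

for who in whos:
    print(f"{who.category.value}: {who.name}")
```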
Step Four: Write an Outcome Challenge for Each WHO
Outcome challenges describe the intended impacts on key program partners: how patterns of behaviors, procedures, or actions of individuals, groups, or institutions will change if the program is extremely successful. They should focus on behavioral change and be idealistic but realistic. They are phrased so they capture how the actor will behave and relate to others if the program reaches its full potential as a facilitator of change. Outcome challenges typically have three distinct parts (see the sketch after the example below):
- Identification of the program partner (WHO),
- A clause describing successful attainment of a desired change (i.e., what beneficiaries gain from the program), and
- A behavioral intention that represents a significant attainment for the person or group targeted (Kibel, 2000b; Rockwell et al., 2000; Jha, 2001).
Example Outcome Challenge for a Youth in a Juvenile Diversion Program
We expect to see youth who attend juvenile diversion classes take the necessary steps to make better life choices and become responsible community citizens.
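As a minimal illustration of these three parts, the Python sketch below assembles the example outcome challenge from a WHO, a desired change, and a behavioral intention. The wording is taken from the Juvenile Diversion example above; the variable names are assumptions made for the sketch.

```python
# Minimal sketch: the three parts of an outcome challenge (wording drawn from
# the Juvenile Diversion example; variable names are illustrative assumptions).
who = "youth who attend juvenile diversion classes"
desired_change = "take the necessary steps to make better life choices"
behavioral_intention = "become responsible community citizens"

outcome_challenge = (
    f"We expect to see {who} {desired_change} and {behavioral_intention}."
)
print(outcome_challenge)
```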
Step Five: Write Success Outcome Markers for Each WHO
Success outcome markers (SOMs) are similar to indicators because they are identifiable actions or behaviors that lead to successful accomplishment of the outcome challenge. They advance in degree from simple participatory activities to complex, life-changing behaviors. SOMs are listed at three levels: EXPECT to see, LIKE to see, and LOVE to see (Kibel, 1999).
- EXPECT to see SOMs identify behaviors that must occur
before there can be any successful program outcomes. They usually focus on
participation activities because the WHOs need to be engaged in the
program activity before they can begin to react to the subject matter and
change their behavior patterns to be consistent with new knowledge, attitudes,
skills, or aspirations promoted by the program.
- LIKE to see SOMs identify behaviors that come after, or start
to emerge from, the 'expect to see successes'. They are the more immediate
behaviors, or new practices, program beneficiaries adopt as they start to
apply new knowledge and skills, or alter their attitudes or aspirations in
their work and life situations. Typically a change at this stage needs to
be sustained for at least six months. LIKE to see SOMs may be the
highest level that many program participants ever attain.
- LOVE to see SOMs are longer-term or higher-order behavior changes that come after the 'like to see successes'. They are new practices that program partners adopt as they use new skills to affect their own lives or the environments in which they live, work, or play. They are sustained over extended periods of time and become indicators of transformational change. Although some program participants may never achieve LOVE to see SOMs, this should not be viewed as a program failure (Rockwell & Bennett, 2000; Rockwell et al., 2000).
Examples of SOMs That Flow from an Outcome Challenge in a Juvenile Diversion Program
Outcome Challenge: We expect to see youth who attend juvenile diversion classes take the necessary steps to make better life choices and become responsible community citizens.
EXPECT TO SEE YOUTH WHO:
LIKE TO SEE YOUTH WHO:
LOVE TO SEE YOUTH WHO:
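To show how an outcome challenge and its three levels of SOMs fit together on paper, the brief Python sketch below nests EXPECT, LIKE, and LOVE marker lists under a single outcome challenge. The specific Juvenile Diversion markers are hypothetical examples added for illustration; they are not drawn from the original program.

```python
# Minimal sketch: one outcome challenge with its tiered success outcome markers.
# The markers listed below are hypothetical illustrations, not the program's own.
outcome_challenge = {
    "who": "First-time youth offenders",
    "challenge": (
        "We expect to see youth who attend juvenile diversion classes take the "
        "necessary steps to make better life choices and become responsible "
        "community citizens."
    ),
    "soms": {
        "EXPECT to see": [
            "enroll in and attend every diversion class session",
            "complete assigned community service hours",
        ],
        "LIKE to see": [
            "improve school attendance during the following semester",
            "avoid any new law violations for at least six months",
        ],
        "LOVE to see": [
            "remain offense-free through high school graduation",
            "volunteer regularly with a community organization",
        ],
    },
}

print(outcome_challenge["challenge"])
for level, markers in outcome_challenge["soms"].items():
    print(f"{level} youth who:")
    for marker in markers:
        print(f"  - {marker}")
```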
Monitoring the SOMs
Transformational change in the program's end users is the program goal, and SOMs are a way to monitor achievements that contribute to the transformational outcome. Each SOM is important individually and can be viewed as a sample indicator of behavioral change, but it is the cumulative power of the SOMs that summarizes the transformational change identified in the outcome challenge.
Establishing a way to track progress is an important step in the SOM process. How SOMs will be measured--simple counts, observation, surveys, interviews, focus groups, specific instruments--and who will be responsible for gathering the information are important considerations. Identifying which SOMs are most likely to describe program outcomes and concentrating on appropriate monitoring and evaluation methods for tracking them is an essential part of monitoring success by using the Success Outcome Markers in Extension (SOME) strategy.
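As one simple way to picture this kind of tracking, the Python sketch below tallies how many participants have reached each SOM level from a handful of hypothetical monitoring records. The record format, field names, and data are assumptions made for the example, not part of SOME.

```python
from collections import Counter

# Hypothetical monitoring records: the highest SOM level each participant has
# reached so far (record format and data are illustrative assumptions).
records = [
    {"participant": "Youth A", "highest_level": "EXPECT to see"},
    {"participant": "Youth B", "highest_level": "LIKE to see"},
    {"participant": "Youth C", "highest_level": "EXPECT to see"},
    {"participant": "Youth D", "highest_level": "LOVE to see"},
]

counts = Counter(record["highest_level"] for record in records)

for level in ("EXPECT to see", "LIKE to see", "LOVE to see"):
    print(f"{level}: {counts.get(level, 0)} participant(s)")
```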
Summary
Using Kibel's basic outcome engineering theory (1999), a number of agencies and organizations are rethinking ways to target outcomes in complex programming. Some evaluators, such as those in the International Development Research Centre, are applying the theory in an outcome mapping context to large, complex international development grants (Earl et al., 2001); others, such as those in Cooperative Extension, are applying the theory, or parts of it, at the project, program, or organizational level to evaluate ongoing or completed transformational programs.
For Extension, Success Outcome Markers (SOMs), and the process of creating them, offer a new approach to plan, monitor, and evaluate programs. Generating success outcome markers helps to carefully determine all partners (including beneficiaries) who may need to change to accomplish program goals and identifies steps to continuously track incremental successes. Hard-to-measure human behaviors become more concrete when success outcome markers are listed. To use SOMs successfully, one must:
- Create a vivid and compelling vision,
- List the WHOs,
- Write an outcome challenge for each WHO, and
- Determine SOMs.
Then decide how to monitor (i.e., simple counts, observations, surveys, interviews, focus groups, etc.) and report on each SOM.
References
Earl, S., Carden, F., & Smutylo, T. (2001). Outcome mapping: Building learning and reflection into development programs. Ottawa, Canada: International Development Research Centre.
Jha, L. R. (2001). Using appreciative inquiry to test the application of outcome engineering in extension programs (Doctoral dissertation, University of Nebraska-Lincoln, 2001). Dissertation Abstracts International, 62, 2656.
Kibel, B. (1999). Outcome engineering. Unpublished document, Pacific Institute for Research and Evaluation (P.I.R.E.), Chapel Hill, NC.
Kibel, B. (2000a). Accounting for spirit: A guide for organizations and programs that aim to make a deep difference in people's lives. Unpublished document, Pacific Institute for Research and Evaluation (P.I.R.E.), Chapel Hill, NC.
Kibel, B. (2000b, September). Outcome engineering toolbox: User manual. Retrieved June 15, 2001 from Pacific Institute for Research and Evaluation Web site: http://www.pire.org/resultmapping/homepage.htm
Rockwell, S. K. & Bennett, C. F. (2000). Targeting outcomes of programs (TOP): A hierarchy for targeting outcomes and evaluating their achievement. Retrieved July, 2002 from University of Nebraska, Institute of Agriculture and Natural Resources Web site: http://deal.unl.edu/TOP/
Rockwell, S. K., Jha, L. R., Williams, S. & Thayer, C. (November 2000). Using success markers for programming in Extension education. Presented to the American Evaluation Association's Annual Meeting, Honolulu, Hawaii. Available at: http://danr.ucop.edu/eee-aea/using_success_markers.htm
Taylor-Powell, E., Rosing, B. & Geran, J. (1998). Evaluating collaboratives: Reaching the potential. Retrieved from the University of Wisconsin-Extension Web site: http://cecommerce.uwex.edu/pdfs/G3658_8.PDF
Watkins, J. M. & Mohr, B. J. (2001). Appreciative inquiry. Jossey-Bass/Pfeifer. San Francisco, CA.
Williams, S. N., Dickey, E. C., Hergert, G. W. (2001). Excerpt from the 2001 Unit Accomplishment Guidelines in Cooperative Extension. Unpublished guidelines for Cooperative Extension at the University of Nebraska, Lincoln.