The Journal of Extension - www.joe.org

October 2012 // Volume 50 // Number 5 // Feature // v50-5a1

A Model for Evaluating eXtension Communities of Practice

Abstract
As Americans shift their work and leisure activities online, Extension seeks to remain viable by delivering programs through a website known as eXtension. eXtension is predicated on the voluntary labor of Extension specialists and educators who form Communities of Practice (CoPs) to create and deliver content through the website. Evaluation of eXtension CoPs can be effectively executed using Patton's (2011) developmental evaluation model. A flow of activities for evaluating eXtension CoPs using this approach is presented, along with a case study of the Grape CoP evaluation.


Kathleen D. Kelsey
Professor
Oklahoma State University
Stillwater, Oklahoma
Kathleen.kelsey@okstate.edu

Eric T. Stafne
Assistant Extension Professor-Fruit Crops
Mississippi State University
Poplarville, Mississippi
estafne@ext.msstate.edu

Introduction to eXtension Communities of Practice

eXtension.org was launched in 2008 to meet the public's expectations of a relevant and accessible Cooperative Extension Service (CES). The goal of eXtension was to become a "centrally managed, but locally delivered state-of-the-art, full-service program that uses technology and new organizational processes such as Communities of Practice (CoPs), Frequently Asked Questions (FAQ), and various Wikis" (Grace & Lambur, 2009, p. 5).

eXtension is predicated on the voluntary labor of CoP members, who organize to create and deliver content around their areas of expertise. Most eXtension CoPs consist of Extension specialists and educators who focus on a specific content area such as horses (see <http://www.extension.org/horses> as an example of one of the most popular CoPs).

CoPs were outlined by Lave and Wenger (1991) as informal professional networks that exist to enhance professional development, mentoring, and expertise through observation, interaction, discourse, and practice. CoPs provide the structure for professionals to learn, problem solve, and create a space for interaction around a specific focus.

The benefits of participating in a CoP are networking with peers, working across disciplines, working in multi-state programs, learning from peers, teaching peers, reducing redundancy, and engaging the discipline in a more innovative and in-depth manner (Wenger, McDermott, & Snyder, 2002). Sobrero (2008) discussed the value of virtual CoPs in relation to eXtension and concluded that engaged universities should rely on virtual learning environments to "stay on the cutting edge in our disciplines, areas of expertise and issues valued by learners" (para. 5). Sobrero encouraged Extension staff to use and contribute to eXtension to address local issues as well as engage with other Internet-based resources.

Conditions necessary for a CoP to exist include:

  • A mix of "newcomers" and "old-timers" to the profession (Lave & Wenger, 1991)
  • A knowledge base
  • A lived-in world where knowledge is socially constructed
  • Shared values
  • A process for cycling in newcomers by allowing them access and cycling out old-timers through displacement
  • Intrinsic motivation among members to perpetuate the CoP.

Learning happens as a result of "being active participants in the practices of social communities and constructing identities in relation to these communities" (Wenger, 1998, p. 4); hence, the term "Communities of Practice." Learning occurs as individuals participate in their CoPs, as learning is a socially mediated experience involving active participation in an engaged and dynamic community. Wenger (1998) admits that the notion of a CoP is "neither new nor old" and has the "eye-opening character of novelty and the forgotten familiarity of obviousness" (p. 7). His contribution has been in naming what we intuitively knew about learning, yet had no vocabulary for articulating the process.

While Lave and Wenger (1991) and Wenger (1998) have outlined the structure, organizational flow, and activities of CoPs, little research exists on the methods of evaluating their effectiveness. This article provides a streamlined process for evaluating eXtension CoPs using the Grape CoP as a case study.

A Model for Evaluating eXtension Communities of Practice

Evaluation often brings to mind the classic definition of judging the merit and worth of a program using defensible criteria with the assumption that the evaluator will serve as the judge who seeks to measure program goal attainment, outcomes, and impacts (Fitzpatrick, Sanders, & Worthen, 2010). Developmental evaluation was conceived to serve a very different role and function, where the evaluator acts as a critical friend to the leadership team for the purpose of "developing the intervention or program" from infancy to maturity (Patton, 2011, p. 116).

Patton (2011, pp. 21-22) outlined five uses of developmental evaluation:

  1. Ongoing development or adapting an intervention to new conditions.
  2. Adapting effective general principles to a new context.
  3. Developing a rapid response to a major change.
  4. Preformative development of a potentially scalable innovation, or getting an intervention ready for summative evaluation.
  5. Major systems change and cross-scale evaluation to provide feedback about how the intervention is unfolding and how it may need to be adapted for broader application.

Using this framework for evaluating a CoP, the evaluator might focus on the third use, developing a rapid response to a major change. Data can be collected and fed back to the leadership team, formally and informally, for the purpose of rapid learning about how the innovation is unfolding.

In developmental evaluation, the leadership team consists of social innovators who are working to bring about change and use evaluation as a tool for learning about the innovation as it unfolds. Innovations are defined as non-linear, emergent, dynamic, adaptive, uncertain, and co-evolutionary (Patton, 2011). Using this definition, eXtension is an innovation.

Role of the Evaluator

A developmental evaluator must be situated as an insider, part of the design and leadership team, fully participating in decisions and "facilitating discussion about how to evaluate whatever happens" (Patton, 2011, p. 116). The developmental evaluator is deeply involved in improving the CoP within eXtension and uses evaluative techniques to support data-driven decision-making. "Evaluator credibility depends on a mutually respectful relationship" with the project principals as the evaluator strives for "reality-testing, results-focused, and learning-oriented leadership" within the team (Patton, 2011, p. 25).

Ideally, as a group of content experts moves toward creating a CoP for eXtension, the group will recruit an evaluator to co-write the project logic model and evaluation plan. The evaluator should participate in project milestones, provide rapid-response feedback to project leaders, and work closely with the project manager to identify emerging issues and options for resolution.

Evaluative judgments (did the program achieve its goals, and what are its impacts?) are not the goal of developmental evaluation; rather, the goal is to use evaluation logic and reasoning to collect data for rapid feedback to the management team to improve the intervention in real time. The evaluation should result in "effective principles that can inform practice and minimum specifications that can be adapted to local context" (Patton, 2011, p. 26).

The Adaptive Cycle

The adaptive cycle was first introduced by Ludwig, Jones, and Holling (1978) to explain forest ecology and was used by Patton (2011) as a metaphor for understanding the life cycle of social programs and appropriate evaluation frameworks.

The adaptive cycle has four phases:

  1. Creative destruction, termination or release, such as a forest fire, resulting in chaos and uncertainty.
  2. Reorganization or renewal resulting in emerging ideas, creative problem solving, and generation of new programs.
  3. Exploitation resulting in a period of trying out new ideas and pilot programs, where the best ideas and programs emerge and lesser ideas and programs disappear.
  4. Conservation or dominance of a few programs resulting in a mature system such as a hardwood forest with stability, efficiency, and scale.

Developmental evaluation is an ideal choice for evaluating eXtension CoPs because they are considered an innovation in the reorganization phase of the adaptive cycle (Patton, 2011).

eXtension as Reorganization of Extension

eXtension was positioned at the reorganization phase, following the creative destruction of the Extension system over the past two decades (Morse et al., 2009). Over 85% of state Extension programs have experienced significant budget reductions, resulting in reorganization, including staff cuts and program elimination. Morse et al. reported that since 1982 county educator FTEs have declined 18% and area agent FTEs 29%. Meanwhile, state specialist FTEs have increased 44% and the number of administrators and supervisors has increased 74% over the same time frame, resulting in a shift from grassroots delivery of Extension programs to a top-down, administratively heavy organization that seeks to deliver content via the Internet.

Extension is under pressure to improve the quality of its programs and reach a wider, more diverse audience while simultaneously adjusting to budget cuts. Meanwhile, the public is turning away from Extension, evidenced by a rapidly declining rate of public participation in Extension programs (Morse et al., 2009). King and Boehlje (2000) suggested that Extension is on the brink of extinction, and in a more detailed analysis, McDowell (2001) postulated that Extension has run its course, served its purpose, and will disappear altogether.

Developmental Evaluation Framework: Collaboration for Innovation

Patton (2011, p. 227) outlined inquiry frameworks to help the evaluator ask the right questions in the right evaluation context. Generally, developmental evaluation asks the following:

  1. "What is being developed?"
  2. "How is what's being developed and what's emerging to be judged?"
  3. "What's next?"

More specifically, the evaluator must match evaluation questions to the specific context and be responsive to the leadership team. When evaluating CoPs, it is recommended that the following questions guide the evaluation:

  • "What can we do together that we can't do separately?"
  • "What will each of us contribute to the whole?"
  • "How will we work together?"
  • "What difference will we make together?" (Patton, 2011, p. 244).

In short, what is the collaboration "actually doing and achieving?" Evaluators should provide feedback regarding how interactions are unfolding, what the collaboration is accomplishing, how the collaborators are working together, how collaborators see themselves and their shared effort in regard to degrees of engagement, and whether the aspirations of the collaboration actually emerge from the project (Patton, 2011).

Data Collection and Reporting

Evaluative data collection using the developmental model begins as soon as the project commences, through observation and participation in project milestones. The first step the evaluator takes is to become a member of the leadership team, present at all meetings and part of the decision-making process. Conversations among the group may focus on the developmental nature of the CoP, how the CoP is operating, and hoped-for outcomes. Key sensitizing concepts will emerge from the meetings to further guide the evaluation, such as motives and barriers for participating in the CoP and how members perceive their role in the CoP. From the key sensitizing concepts, the evaluator can move toward developing data collection instruments such as surveys and interview protocols.

Data are reported informally and continuously. The goal of developmental evaluation is rapid response and feedback to inform decision-making. The evaluator should consider reporting as a conversation among the leadership team, discuss emerging theories of action, and provide recommendations for changes in direction on the fly. The evaluator should not wait for an annual formal report, much as a critical friend might inform a neighbor that his barn door is open and the cows are escaping.

The developmental evaluator exits the project when it has reached maturity, or the conservation stage of development. When an innovation is no longer an innovation, has attained its goals, served stakeholders, and obtained measurable outcomes, the evaluator's role may shift to the traditional function of summative evaluation or judgment of a mature program.

Case Study of the Grape CoP

In 2009, a grape community of practice (GCoP) was formed with eXtension as the hosting entity. The vision for the GCoP was to become a comprehensive online resource for research-based information by providing viticulturists access to content addressing all aspects of grape production. The GCoP comprises a North America-based group of professionals with expertise in commercial grape production. To date, the GCoP has 72 members from 30 states and Canada, who interact through a variety of online methods to collaboratively create content on the eXtension site. The GCoP is open to other disciplines and welcomes any interested participant. The group was funded by a United States Department of Agriculture, National Institute of Food and Agriculture, Specialty Crops Research Initiative grant and is considered a national priority by the National Grape and Wine Initiative.

The developmental evaluation model was used to evaluate the formation of the Grape CoP in years one and two. The evaluator assumed the role of developmental evaluator as described previously. Data were collected throughout the life of the project by participating in all milestones, including meetings, conference calls, and shadowing the emerging CoP on the eXtension website. The evaluator and the leadership team maintained a friendly relationship, engaging in informal conversations about observations and emerging trends that might serve to build a stronger CoP. The project manager used the evaluator as a sounding board to air frustrations and brainstorm ideas for engaging members. In addition to providing informal feedback on the fly, the evaluator compiled a formal report to share first and second year accomplishments with all members of the CoP.

To gather perceptions about the functionality of the CoP, all members were invited to complete an open-ended survey (original collaboration team plus those who joined over the year = 38). Twenty members returned the survey, for a 53% response rate. The data were analyzed using a qualitative data analysis tool, NVivo 8®, and reported as themes to address the following evaluation questions:

  • How were interactions unfolding within the CoP?
  • What was the CoP accomplishing?
  • How well did CoP members work together?
  • How did CoP members see themselves and their shared effort in regard to degrees of engagement?
  • Did the aspirations of the collaboration emerge?

CoP members experienced a high degree of success with interactions. Monthly conference calls, individual team meetings, and weekly newsletters were essential for creating a sense of community and fostering positive interactions.

The CoP accomplished all stated goals for its first year of operation. The majority of CoP members were productive and committed to the project. Eighty-two percent of members felt that decisions were made to advance the CoP.

However, only two of 17 believed that the group was functioning at optimal capacity, the primary reason being a lack of participation by the majority of CoP members. Thirty-seven percent of members reported not having created any content for the CoP; some cited a lack of time as the reason. A third of the members appeared to have little focus or direction within the CoP.

The majority of the members felt the collaboration was creating a shared identity as a CoP focused on an area of concern. Patton (2011) outlined degrees of working together as a continuum with five levels: (1) networking, or sharing information and ideas; (2) cooperating, or helping members accomplish their separate individual goals; (3) coordinating, or working separately on shared goals; (4) collaborating, or working together toward a common goal while maintaining separate resources and responsibilities; and (5) partnering, with shared goals, shared decisions, and shared resources within a single entity. The CoP members appeared to be working at level 4, collaborating toward a common goal but maintaining separate resources and responsibilities, not fully integrated as partners with a shared vision, mission, and goals.

Members saw themselves as content experts (not novices). Contributing to the Grape CoP was in line with their professional values and practice. They saw the CoP as another tool for practicing Extension. Members joined to engage with other grape experts and desired what the CoP promised to offer. Members were generally forward thinking and hopeful that the CoP would create a one-stop shop for viticulture information, a sense of accomplishment, professional recognition, and region-specific information for grape growers.

The majority of members were positive about their participation in the CoP. However, there were deep concerns and internal conflicts that prevented full commitment from all members. A lack of time, cited by 94% of members, was the primary reason for non-engagement (indicative of a lack of priority). The foremost concern, expressed by 47% of members, was the amount of time and effort devoted to the CoP versus the perceived reward for doing so. In particular, members were concerned about how to document contributions to the CoP for performance reviews.

Overall, the developmental evaluation found the aspirations of the CoP were realized. The project was well organized and well managed, and all goals for years one and two were achieved. The founding team envisioned a national repository for viticulture information. The CoP had become a tangible product, the infrastructure was in place to grow the repository, and a true Community of Practice emerged for members to benefit from professionally.

Future Evaluation of eXtension CoPs

Because eXtension depends on the voluntary labor of content experts to form CoPs, it is important to understand the functioning of these teams using evaluation. Ideal functioning of a CoP includes a dynamic mix of individuals who teach and learn from each other for professional development. Patton's (2011) developmental evaluation framework is an excellent model for evaluating eXtension CoPs because it assumes that eXtension is an innovation in its early stages of evolution (reorganization) and positions the evaluator as a critical friend for organizational learning, rather than measuring and judging outcomes.

Adopting developmental evaluation methods when evaluating the Grape CoP was successful because it allowed for rapid feedback on emerging issues. A suggested flow of activities is presented in Figure 1 to guide the developmental evaluation process.

Figure 1.
Flow of Activities for Developmental Evaluation


References

Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2010). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Prentice Hall.

Grace, P., & Lambur, M. (2009). How is eXtension enhancing and impacting the Cooperative Extension system? Retrieved from: http://about.extension.org/mediawiki/files/0/03/EXtension_Lit_Review_8_09.pdf

King, D. A., & Boehlje, M. D. (2000). Extension: On the brink of extinction or distinction? Journal of Extension [On-line], 38(5), Article 5COM1. Available at: http://www.joe.org/joe/2000october/comm1.php

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

Ludwig, D., Jones, D. D., & Holling, C. S. (1978). Qualitative analysis of insect outbreak systems: The spruce budworm and forest. Journal of Animal Ecology, 47(1), 315-332.

McDowell, G. R. (2001). Land grant universities and Extension into the 21st century: Renegotiating or abandoning a social contract. Ames: Iowa State Press.

Morse, G. W., Markell, J., O'Brien, P., Ahmed, A., Klein, T., & Coyle, L. (2009). The Minnesota response: Cooperative Extension's money and mission crisis. Bloomington: iUniverse.

Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York: The Guilford Press.

Sobrero, P. M. (2008). Social learning through virtual teams and communities. Journal of Extension [On-line], 46(3), Article 3FEA1. Available at: http://www.joe.org/joe/2008june/a1.php

Wenger, E. (1998). Communities of practice: Learning, meaning and identity. New York: Cambridge University Press.

Wenger, E., McDermott, R., & Snyder, W. M. (2002). Cultivating communities of practice. Boston: Harvard Business School Press.