April 2004 // Volume 42 // Number 2 // Feature Articles // 2FEA2


Evidence-Based Extension

Abstract
This article argues that Extension should embrace the evidence-based practice movement, which links scientific evidence and practice. This movement entails a thorough scientific review of the research literature, the identification of the most effective interventions or strategies, and a commitment to translating the results into guidelines for practice. This process corresponds closely to the goals of USDA CSREES. We offer several ways in which Extension can connect with ongoing evidence-based activities in relevant areas. By doing so, Extension can improve its use of research-based practice and also inform and advance the ongoing evidence-based work occurring in the scientific community.


Rachel Dunifon
Assistant Professor
Cornell University
Department of Policy Analysis and Management
red26@cornell.edu

Michael Duttweiler
Assistant Director
Cornell Cooperative Extension
mwd1@cornell.edu

Karl Pillemer
Professor
Cornell University
Department of Human Development
kap6@cornell.edu

Donald Tobias
Associate Professor
Cornell University
Department of Policy Analysis and Management
djt3@cornell.edu

William M.K. Trochim
Professor
Cornell University
Department of Policy Analysis and Management
wmt1@cornell.edu

Ithaca, New York


Introduction

Since its inception, one of the central goals of the USDA Cooperative State Research, Education, and Extension Service (CSREES) has been to translate the best of current research into practice. In establishing Land Grant Universities and the Cooperative Extension Service, the Smith-Lever Act of 1914 stated that its "...work shall consist of the development of practical applications of research knowledge...," and the recent Kellogg Commission review of the land-grant system included the need for "conscious efforts to bring the resources and expertise at our institutions to bear on community, state, national, and international problems in a coherent way" (Kellogg Commission, 2000, p. 6).

Historically, Extension faculty have conducted research with an expectation that the knowledge generated would be disseminated through local offices to address the issues and problems of communities. This tradition of research synthesis, translation, and dissemination in Cooperative Extension is consonant with the recent evolution of "evidence-based" research, which includes a thorough scientific review of the research literature, the identification of the most effective interventions or strategies, and a commitment to translating the results of this process into guidelines for practice.

This article summarizes the evidence-based research movement, provides an example of work designed to translate evidence-based research to practice, and considers some of the implications for Extension. We offer several ways in which Extension can connect with ongoing evidence-based activities in relevant areas. By doing so, Extension can further improve its use of research-based practice and also inform and advance the ongoing evidence-based work occurring in the scientific community.

A Brief Review of Evidence-Based Practice

The term "evidence-based," when used to describe the conjunction of research and practice, comes originally from medical research (Antes, 1998), where it is termed "evidence-based medicine" (EBM) or sometimes more generically "evidence-based practice" (EBP).

Features of Evidence-Based Practice

There is no clear or universally accepted definition of "evidence-based," but the following features generally characterize such approaches:

  • Identification and definition of a topic that is important for practice.
  • Systematic identification of all published research addressing this topic and screening of identified studies for quality and appropriateness. This is done by developing a detailed instrument in which each relevant study is evaluated against established criteria. Such criteria focus on the research design of the study (e.g., the use of control groups or longitudinal data), the sample size, effect sizes, and other important factors.
  • Summary and analysis of the selected studies with recommendations for practice. This typically involves a combination of formal statistical meta-analysis and review by a panel of researchers.
  • Development of guidelines that summarize evidence-based practices in a manner that is accessible to practitioners, indicating recommended practices and identifying areas where scientific evidence is currently insufficient.
  • Diffusion and dissemination of evidence-based practice guidelines, programs, or treatment protocols and evaluation of changes in practice and outcomes that result.
What makes the EBP movement unique and sets it apart from other systems for moving science to practice is the emphasis on statistical analyses of qualified existing studies and the formation of guidelines that have been developed through a rigorous process of analysis and review, all set within a framework that views the science-to-practice continuum as a formal system for diffusion of research (Rogers, 2003).
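To make the screening and pooling steps concrete, the following minimal sketch filters studies against simple inclusion criteria and then combines the survivors with a fixed-effect, inverse-variance meta-analysis, one common way such quantitative syntheses are computed. All study data, field names, and thresholds below are invented for illustration and do not reflect any actual review group's instrument.

```python
# A minimal sketch of two core EBP steps: screening studies against
# inclusion criteria, then pooling the survivors with a fixed-effect
# (inverse-variance) meta-analysis. All data and thresholds are
# hypothetical.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Study:
    name: str
    effect: float     # standardized effect size (e.g., Cohen's d)
    variance: float   # variance of the effect estimate
    randomized: bool  # were participants randomly assigned?
    n: int            # total sample size

def passes_screen(study: Study, min_n: int = 30) -> bool:
    """Illustrative quality screen: design and sample-size criteria."""
    return study.randomized and study.n >= min_n

def pool(studies: list[Study]) -> tuple[float, float]:
    """Fixed-effect pooled estimate: weight each study by 1/variance."""
    included = [s for s in studies if passes_screen(s)]
    weights = [1.0 / s.variance for s in included]
    pooled = sum(w * s.effect for w, s in zip(weights, included)) / sum(weights)
    se = sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, se

studies = [
    Study("Trial A", effect=0.42, variance=0.020, randomized=True, n=120),
    Study("Trial B", effect=0.15, variance=0.045, randomized=True, n=60),
    Study("Survey C", effect=0.60, variance=0.030, randomized=False, n=200),
]
estimate, se = pool(studies)  # Survey C is screened out: not randomized
print(f"Pooled effect: {estimate:.2f} (95% CI +/- {1.96 * se:.2f})")
```

Actual reviews add many refinements (random-effects models, heterogeneity tests, publication-bias checks), but the basic weighting logic is the same: more precise studies count for more.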

The Cochrane Collaboration

The "granddaddy" of review systems comes from medicine and is known as the Cochrane Collaboration (http://www.cochrane.org). It consists of numerous review groups from across the spectrum of medical specialties and involves hundreds of researchers who collaborate on systematic reviews. These reviews follow a very specific methodology for selecting and analyzing studies, and the results are published in The Cochrane Database of Systematic Reviews. The Cochrane Library (http://www.cochrane.org/reviews/clibintro.htm) is the online resource that publishes the completed reviews.

A controversial aspect of the Cochrane Collaboration is the almost exclusive emphasis it places on randomized experiments, a research design in which participants are randomly assigned to treatment or control groups. Because random assignment makes these two groups statistically equivalent, any improved outcome among those receiving the treatment can be attributed to the treatment itself. This type of design is well established in medical research but often more challenging to implement in applied contexts (and is rare in evaluations of Extension programs).

Evidence-based approaches such as the Cochrane Collaboration have influenced virtually every area of medical practice. For instance, public health has developed evidence-based efforts in areas ranging from physical activity to tobacco control (Brownson, Baker, Leet, & Gillespie, 2002). In recent years, the idea of EBP has been spreading rapidly to new fields outside of medicine and public health. In 1999, the Campbell Collaboration (Schuerman et al., 2002) (http://www.campbellcollaboration.org/) was created as a counterpart to the Cochrane Collaboration and focuses on social, behavioral, and education arenas. Other organizations, such as the Centers for Disease Control, the National Cancer Institute, and Child Trends, are actively undertaking evidence-based reviews of research and the publication of resulting guidelines for practice.

Arguments in Favor of Evidence-Based Practice

There are several compelling arguments in favor of EBP. The use of formal methods and reliance on panels of scientists to review results encourage a more thorough and rigorous review of research than lone-investigator literature reviews tend to produce. Additionally, formal recommendations, "best practices," and guidelines help to assure a higher degree of consonance between the system of science-based knowledge generation and the world of practice.

In an age where information overload is a significant concern, it is often difficult for practitioners to distinguish legitimate science claims from pseudo-science. Additionally, practitioners too often develop programs and policies based in whole or in part on anecdotal evidence and intuition, uninformed by the most recent science.

EBP offers a systematic approach for summarizing the best that current science has to offer in an area and packaging the programs and interventions that were actually tested in a manner that is accessible to the practitioner. Proponents maintain that EBP strategies have been transformative, have improved practice, and have produced a paradigm shift in the education of practitioners (Davidoff, 1999; Hoge, Jacobs, Belitsky, & Migdole, 2002).

Criticisms of Evidence-Based Practice

The development of EBP has not been without significant challenges. Identifying all aspects of the published literature applicable to the practices being studied is difficult (Robinson & Dickersin, 2002). Further, it is hard to maintain the infrastructure and funding necessary to ensure high-quality, consistent reviews (Laupacis, 2002). Additionally, there is often disagreement about the methods used to score the quality of studies (Juni, Witschi, Bloch, & Egger, 1999).

Critics in medicine have argued that EBP threatens the autonomy of the physician practitioner (Armstrong, 2002; Hampton, 2002), is in opposition to a patient-centered model of care (Armstrong, 2002), and is simply the latest methodological fad (Bauchner, 1999). Others question whether medicine needs an evidence "base" at all (Upshur, 2002). Some argue that applied programs in medical practice (Rothwell, 2002) or public health (Rychetnik, Frommer, Hawe, & Shiell, 2002) are too complex and context-dependent to be well described by EBP syntheses. One can reasonably expect that analogous arguments will also be raised by practitioners in other fields.

This criticism could pose a major barrier to widespread adoption of evidence-based approaches in Extension. The obstacle lies in the culture of Extension work. That is, limiting Extension programs to only those for which there is sound empirical evidence of effectiveness (especially evidence based on randomized controlled trials) would be perceived as foreign to many in the Extension system, in part because many programs are developed in collaboration with communities rather than delivered in standardized form.

It is highly unlikely that evidence-based requirements will be applied so stringently to Extension work. However, this perspective does offer a challenge to Extension professionals: to rigorously examine what basis, if any, in empirical research exists for the programs they promote and to design new programs based on those proven effective through evidence-based reviews.

Evidence-Based Example: Promotion of Physical Activity

To illustrate more concretely what an evidence-based approach looks like and how the results could be used within Extension, we present an example of evidence-based research endeavors in the area of the promotion of physical activity. This is an area in which Extension is active, as shown by the "Healthy People, Healthy Communities" initiative. In this area, evidence-based syntheses and reviews have already been completed, and significant effort has already been expended on the development of guidelines for practitioners.

Development of Guidelines

The evidence-based review on physical activity was undertaken as part of a larger project, the Guide to Community Preventive Services: Systematic Reviews and Evidence-Based Recommendations (the Guide) (Briss et al., 2000). The Guide (available online at http://www.thecommunityguide.org/) is designed to be an evidence-based resource for community public health practitioners.

The steps used by the Guide Task Force to review and synthesize evidence and generate recommendations were:

  1. "forming multidisciplinary chapter development teams,
  2. developing a conceptual approach to organizing, grouping, selecting and evaluating the interventions in each chapter;
  3. selecting interventions to be evaluated;
  4. searching for and retrieving evidence;
  5. assessing the quality of and summarizing the body of evidence of effectiveness;
  6. translating the body of evidence of effectiveness into recommendations;
  7. considering information on evidence other than effectiveness; and
  8. identifying and summarizing research gaps."

Step 5 constituted the heart of the research synthesis. It involved carefully coding each research study on physical activity that was considered relevant and synthesizing results across similar studies through simple statistical analysis. The Guide did not require that studies be limited only to randomized experiments; in applied community-based public health interventions, such a requirement would likely prove too restrictive.

The end result of this process is a set of guidelines for practitioners, summarized in Table 1. Physical activity interventions were classified as having strong evidence of effectiveness, having sufficient evidence, or not having enough evidence, based on factors such as the number of studies of the intervention, the study designs, and whether results were replicated across many studies (Centers for Disease Control and Prevention, 2001); a simple sketch of this kind of grading rule follows Table 1. Interested readers are referred to the Guide Web site for more details. Considerable effort went into making the recommendations as concise, readable, and straightforward as possible, while ensuring the maximum level of scientific accuracy.

Table 1.
Interventions to Increase Physical Activity: Recommendations from an Evidence-Based Review

Informational approaches to increasing physical activity

  • Community-wide campaigns: Recommended (Strong Evidence)
  • "Point-of-decision" prompts: Recommended (Sufficient Evidence)
  • Classroom-based health education focused on information provision: Insufficient Evidence to Determine Effectiveness
  • Mass media campaigns: Insufficient Evidence to Determine Effectiveness

Behavioral and social approaches to increasing physical activity

  • School-based physical education: Recommended (Strong Evidence)
  • Non-family social support: Recommended (Strong Evidence)
  • Individually-adapted health behavior change: Recommended (Strong Evidence)
  • Health education with a TV/video game turnoff component: Insufficient Evidence to Determine Effectiveness
  • College-age physical education/health education: Insufficient Evidence to Determine Effectiveness
  • Family-based social support: Insufficient Evidence to Determine Effectiveness

Environmental and policy approaches to increasing physical activity

  • Creation of and/or enhanced access to places for physical activity, combined with informational outreach activities: Recommended (Strong Evidence)
  • Transportation policy and infrastructure changes to promote non-motorized transit: In Progress
  • Urban planning approaches (zoning and land use): In Progress

Complete results are available at http://www.thecommunityguide.org/pa/default.htm
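To illustrate how a body of evidence might be mapped to the recommendation categories in Table 1, the following deliberately simplified sketch encodes one plausible grading rule. The thresholds are hypothetical; the actual Task Force weighs design suitability, study execution, number of studies, consistency of results, and effect sizes in far more detail.

```python
# A simplified, hypothetical sketch of grading a body of evidence into
# the recommendation categories used in Table 1. The thresholds below
# are invented for illustration, not the Task Force's actual rules.

def grade_evidence(num_studies: int, well_designed: int, consistent: bool) -> str:
    """Map simple features of an evidence base to a recommendation label."""
    if num_studies >= 5 and well_designed >= 2 and consistent:
        return "Recommended (Strong Evidence)"
    if num_studies >= 2 and well_designed >= 1 and consistent:
        return "Recommended (Sufficient Evidence)"
    return "Insufficient Evidence to Determine Effectiveness"

# A hypothetical intervention with several consistent, well-designed studies
print(grade_evidence(num_studies=6, well_designed=3, consistent=True))
# -> Recommended (Strong Evidence)
```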

Putting Guidelines into Practice

The physical activity chapter of the Guide is an example of practice guidelines developed through an evidence-based meta-analysis. But the development of guidelines alone is not sufficient to ensure their adoption in practice. Recent efforts have been directed at filling this gap.

The Translating Research into Improved Outcomes (TRIO) program (Kerner & Vinson, 2002) is a collaborative initiative coordinated through the National Cancer Institute that includes a variety of activities designed to translate guidelines into actual practice.

One of the most important and innovative of the TRIO activities is the PLANET (Plan, Link, Act, Network with Evidence-based Tools) program (Kerner, Vinson, & Cynkin, 2003). PLANET is a Web site (http://cancercontrolplanet.cancer.gov/) that provides public health practitioners and researchers with a simple five-step process for developing local programs (in this case, cancer control programs).

The PLANET Web site represents an ambitious, state-of-the-art effort to link evidence-based research and practice in the area of cancer control and to evaluate the results of such dissemination efforts. Its five steps, detailed below, are highly relevant to Extension educators seeking to identify, obtain funding for, and initiate new, research-based programs in their communities.

  1. Assess program priorities. This is similar to performing a needs assessment in a state, county, or community. The PLANET Web site contains a detailed database for performing state and county needs assessments in the area of cancer incidence, for example.

  2. Identify potential partners. The PLANET Web site provides contact information for local agencies working in the cancer prevention area, allowing practitioners to identify potential partners with whom to work and fill gaps, where they exist, in program service delivery.

  3. Determine effectiveness of different intervention approaches. This provides a direct link to the Guide to Community Preventive Services containing the latest evidence-based synthesis of the science examining various programs or interventions. This allows practitioners to learn what the most effective programs are in each specific area.

  4. Find examples of research-tested intervention programs and products. The PLANET Web site links to the Research-Tested Intervention Programs (RTIPs) Web site (http://cancercontrol.cancer.gov/rtips/index.asp), which offers programs that have been developed from scientifically based studies and shown to be effective. The database is organized to make it easy to find and compare intervention programs that address areas of interest, whether a particular cancer site, a demographic group, a delivery setting, or another concern. For many of these programs, practitioners can also download all the program components for local use.

  5. Plan and evaluate your program. This step links to resources on planning and evaluating evidence-based interventions.

Applications for Extension

Steps 3 and 4 of this process are especially notable for Extension educators and represent a real innovation in the ways research can inform Extension programs. Once educators identify an area in which they would like to intervene, they can determine which approaches are most effective and then choose from several specific programs that were used in the original scientific research and demonstrated to be effective.

For example, educators seeking to promote physical activity in their community would learn that developing a school-based physical education program is likely to be more effective than a mass media campaign. They could then identify specific physical education programs in the PLANET database that have been shown to be effective in the research.

This is a change from the typical ways in which research has been used to inform Extension. Although Extension currently benefits from research summaries, "best practice" guidelines, and syntheses of research, an evidence-based Extension program would make such reviews more systematic than those currently available. Rigorous, agreed-upon standards would be applied to reviews of a series of relevant topics; the reviews would be coordinated across states and universities and would link more directly with cutting-edge research communities.

Instead of creating their own programs based on general conclusions gleaned from research summaries produced by one or more Extension faculty members familiar with the literature, educators can draw on a database of programs that have been systematically analyzed. This is not meant to preclude local adaptation of such programs or the need to tailor them appropriately. It simply suggests that the starting point for local program development would be closer to the actual programs on which the scientific evidence is based.

Implications of Evidence-Based Practice for Extension

The emerging movement of evidence-based research is likely to have a significant impact on USDA-CSREES and State Cooperative Extension Systems. Many of the practice areas addressed in evidence-based practice, such as the promotion of physical activity, fall directly within the purview of Extension. As evidence-based research moves into a broader array of applied fields, there is likely to be a corresponding increase in the number of Extension-relevant reviews.

How might Extension get involved in evidence-based efforts like these? We envision several possibilities.

Initiators

First, Extension educators and faculty can act as initiators of new evidence-based projects, playing a leadership role in identifying topics where there is both the greatest need for and the greatest feasibility of accomplishing high-quality evidence-based reviews. As noted by Schuerman et al. (2002), evidence-based research groups such as the Campbell Collaboration rely on volunteers to help identify topics for review. Extension organizations have both the experience and the access to play a key role in identifying topics--through surveys, issue scanning, concept mapping, or other means--that could benefit from rigorous review and in communicating this information to the review groups.

Collaborators

Second, Extension faculty could actively participate in the evidence review process, working as collaborators with other organizations on evidence-based reviews and guideline development. The Extension system is an ideal mechanism for identifying and pulling together a nationwide network of researchers who can collaborate on evidence-based reviews of relevant topics. With faculty at major research universities across the United States, Extension can identify specialists in specific fields and tap their expertise to contribute to evidence-based reviews in those areas.

Disseminators

Third, Extension could serve a dissemination role, making use of the national network of Extension offices. A key role of Extension faculty would be to categorize and package evidence-based information and disseminate it to educators, who could then use it in their existing programs and in developing and obtaining funding for new initiatives.

For example, the National Cancer Institute is developing training on PLANET (Kerner et al., 2003) to help Extension educators learn how to implement and evaluate evidence-based cancer control programs (e.g., disease prevention, early detection, and survivorship). These efforts could, at relatively little expense, be expanded nationwide, encouraging broad consistency in programs across the Extension system and helping assure that practice is linked to the most up-to-date scientific research. Web-based portals could give Extension educators access to relevant evidence-based reviews and practice guidelines.

There are some potential roadblocks for Extension in translating evidence-based reviews into practice guidelines. In particular, such guidelines will only be useful and inform actual practice if educators feel that the contexts of the research studies are relevant to the contexts in which they work. Therefore, efforts must be made to illustrate the ways in which research studies generalize to a larger population and to point out the ways in which results are specific to the contexts in which the research occurred. Attention to such factors should be both a key feature and a unique contribution of Evidence-Based Extension (EBE) efforts and would differentiate EBE from other evidence-based endeavors.

Evaluators

Finally, Extension is in an ideal position to play a key role in evaluating the effects of evidence-based practice guidelines and programs. Extension has the experience and the local presence throughout the nation to coordinate, on the ground, the distribution of evidence-based programs and interventions and the collection of relevant outcome data. Extension educators have a rich tradition of implementing programs and interventions and gathering evaluative data about their effectiveness. In short, Extension is a broad-based, existing natural laboratory that can be used to implement evidence-based results and to evaluate the effectiveness of such efforts.

This would require, minimally, an organizational commitment to coupling the dissemination of evidence-based results with systems for collecting process and outcome evaluation data and synthesizing the results. This commitment would need to take place at both the national and state levels to ensure systematic dissemination of results and collection of outcomes. Evaluation is perhaps both the greatest challenge to and the greatest opportunity for a major role for Extension in the evidence-based endeavor.

For the example of physical activity discussed here, this might include the following:

  • Studies that document dissemination of the guidelines through the Extension system, including use of the Guide or PLANET.
  • Outcome assessments of changes in practitioner knowledge, attitudes, and behavior as a result of the dissemination.
  • Cost-effectiveness and cost-benefit studies.
  • Studies of the long-term impact of the use of evidence-based guidelines.

On some of these evaluations, Extension could take a primary role or even be the exclusive evaluator. But in many others, Extension would partner with organizations such as the National Cancer Institute, the Centers for Disease Control and Prevention, and the American Cancer Society, providing expertise in evaluation, access to the Extension network, and a local program and evaluation presence.

Summary and Conclusion

This article argues that, despite the potential barriers, Extension should embrace this new movement to link scientific evidence and practice. Evidence-based practice entails a thorough scientific review of the research literature, the identification of the most effective interventions or strategies, and a commitment to translating the results of this process into guidelines for practice.

This process corresponds closely to the goals of USDA CSREES. We have offered several ways in which Extension can connect with ongoing EBP activities in relevant areas. By doing so, Extension can further improve its use of research-based practice, and also inform and advance the ongoing EBP work occurring in the scientific community.

References

Antes, G. (1998). Evidence-based medicine. Internist, 39(9), 899-908.

Armstrong, D. (2002). Clinical autonomy, individual and collective: the problem of changing doctors' behaviour. Soc Sci Med, 55(10), 1771.

Bauchner, H. (1999). Evidence-based medicine: A new science or an epidemiologic fad? Pediatrics, 103(5), 1029-1031.

Briss, P. A., Zaza, S., Pappaioanou, M., Fielding, J., Wright-De Aguero, L., Truman, B. I., et al. (2000). Developing an evidence-based Guide to Community Preventive Services--methods. Am J Prev Med, 18(1S), 35-43.

Brownson, R. C., Baker, E. A., Leet, T. L., & Gillespie, K. N. (Eds.). (2002). Evidence-Based Public Health. Oxford University Press.

Centers for Disease Control and Prevention (2001). Increasing physical activity: A report on recommendations of the Task Force on Community Preventive Services. MMWR, 50(RR-18).

Davidoff, F. (1999). In the teeth of the evidence: The curious case of evidence-based medicine. Mount Sinai Journal of Medicine, 66(2), 75-83.

Hampton, J. R. (2002). Evidence-based medicine, opinion-based medicine, and real-world medicine. Perspect Biol Med, 45(4), 549-568.

Hoge, M. A., Jacobs, S., Belitsky, R., & Migdole, S. (2002). Graduate education and training for contemporary behavioral health practice. Administration and Policy in Mental Health, 29(4-5), 335-357.

Juni, P., Witschi, A., Bloch, R., & Egger, M. (1999). The hazards of scoring the quality of clinical trials for meta-analysis. JAMA: The Journal of the American Medical Association, 282(11), 1054-1060.

Kellogg Commission on the Future of State and Land-Grant Universities (2000). Renewing the Covenant: Learning, Discovery and Engagement in a New Age and a Different World. Available at: http://www.nasulgc.org/publications/Kellogg/Kellogg2000_covenant.pdf

Kerner, J., & Vinson, C. (2002). Informing research dissemination & diffusion with audience input through concept mapping. Paper presented at the 130th Annual Meeting of the American Public Health Association, Philadelphia, PA.

Kerner, J., Vinson, C., & Cynkin, L. (2003). Cancer Control PLANET: Plan, Link and Act with Evidence-based Tools. Presented at the 2003 Priester Conference, Phoenix, Arizona.

Laupacis, A. (2002). The Cochrane Collaboration - how is it progressing? Statistics in Medicine, 21(19), 2815-2822.

Robinson, K. A., & Dickersin, K. (2002). Development of a highly sensitive search strategy for the retrieval of reports of controlled trials using PubMed. International Journal of Epidemiology, 31(1), 150-153.

Rogers, E. M. (2003). Diffusion of Innovations (5th ed.). New York, NY: Free Press.

Rothwell, P. M. (2002). Why do clinicians sometimes find it difficult to use the results of systematic reviews in routine clinical practice? Evaluation & the Health Professions, 25(2), 200-209.

Rychetnik, L., Frommer, M., Hawe, P., & Shiell, A. (2002). Criteria for evaluating evidence on public health interventions. Journal of Epidemiology and Community Health, 56(2), 119-127.

Schuerman, J., Soydan, H., Macdonald, G., Forslund, M., de Moya, D., & Boruch, R. (2002). The Campbell collaboration. Research on Social Work Practice, 12(2), 309-317.

Upshur, R. E. G. (2002). If not evidence, then what? Or does medicine really need a base? Journal of Evaluation in Clinical Practice, 8(2), 113-119.