The Journal of Extension - www.joe.org

December 2017 // Volume 55 // Number 6 // Feature // v55-6a2

Establishing a Common Language: The Meaning of Research-Based and Evidence-Based Programming (in the Human Sciences)

Abstract
This article describes the development, implementation, and exploratory evaluation of a professional development series that addressed educators' knowledge and use of the terms research-based and evidence-based within Human Sciences Extension and Outreach at one university. Respondents to a follow-up survey were more likely to select correctly the commonly accepted standard for each term, and they reported asking more questions, talking with others, examining programs' evidence bases, and placing more value on fidelity and evaluation following participation in the professional development series. Educator reactions to the series were generally positive, although researchers interested in designing similar programs might consider engaging educators within the context of their preexisting knowledge levels.


Debra M. Sellers
Associate Dean and Director, Human Sciences Extension and Outreach
Human Development and Family Studies
dsellers@iastate.edu

Lisa M. Schainker
Scientist
Partnerships in Prevention Science Institute
lschain@iastate.edu

Peggy Lockhart
Doctoral Student
Human Development and Family Studies
peglock@iastate.edu

Hsiu Chen Yeh
Assistant Scientist
Partnerships in Prevention Science Institute
hyeh@iastate.edu

Iowa State University
Ames, Iowa

The provision of evidence-based programming and its value within Extension has increasingly become a topic of discussion among Extension professionals (Crawford, Riffe, Trevisan, & Adesope, 2014; Dunifon, Duttweiler, Pillemer, Tobias, & Trochim, 2004; Perkins, Chilenski, Olson, Mincemoyer, & Spoth, 2014). The use of evidence-based programming provides a level of assurance that the work of Extension educators will improve the lives of individuals and families and create a public health effect (Fetsch, MacPhee, & Boyer, 2012; Spoth et al., 2015). Researchers have suggested that Extension educators value evidence-based programs and support using them in the communities they serve (Perkins et al., 2014) but that it is also necessary to find a balance between emphasizing evidence-based programming and responding to individual community needs (Olson, Welsh, & Perkins, 2015). Additionally, Extension educators have reported that evidence-based programs developed outside their systems are not necessarily better than those they can develop in-house (Hamilton, Chen, Pillemer, & Meador, 2013).

As the use of evidence-based programming is likely to become more critical to Extension practice (Perkins et al., 2014), Extension educators should be positioned to select and provide programs within their specialty areas that meet commonly accepted standards (Downey, Peterson, LeMenestrel, Leatherman, & Lang, 2015; Dunifon et al., 2004). By most of these standards, programs become evidence-based after evaluation with a randomized controlled trial and demonstration of positive outcomes, whereas research-based programs contain research-based content but have not necessarily been evaluated (Cooney, Huser, Small, & O'Connor, 2007). Little is known about Extension educators' attitudes toward and knowledge of research- and evidence-based programming. Also not well known is whether professional development on this topic is available to educators and in what form.

Background

In December 2013, Human Sciences Extension and Outreach faculty and staff ("educators") at Iowa State University, who align their work with the College of Human Sciences and within the three areas of family life, family finance, and nutrition and wellness, developed three fundamental principles to guide their work. The first principle highlighted a commitment to research-based and evidence-based educational opportunities. As discussion surrounding the initial creation of the fundamental principles ensued, it became apparent that educators were not using the terms research-based and evidence-based in the same way, or in ways that were consistent with accepted standards. To address this circumstance, Human Sciences Extension and Outreach leadership requested the assistance of the Promoting School-Community-University Partnerships to Enhance Resilience (PROSPER) Network Organization. The PROSPER Network Organization is composed of prevention scientists and program implementation specialists at the Partnerships in Prevention Science Institute, also at Iowa State, who are well positioned to provide professional development and technical assistance related to evidence-based programming (Partnerships in Prevention Science Institute, 2015; Spoth & Greenberg, 2011; Spoth et al., 2015). This group uses prevention science principles, as well as standards of evidence consistent with those of the Society for Prevention Research (Flay et al., 2005; Mincemoyer et al., 2008), to guide their work with Extension systems across the United States. Representatives from the PROSPER Network Organization, in consultation with Human Sciences Extension and Outreach leadership, developed and implemented a series of professional development opportunities to facilitate a shared understanding among educators of the commonly accepted standards of evidence. This article describes the development, implementation process, and exploratory evaluation of this series. Presented herein are (a) a description of participants, (b) findings from a baseline survey administered prior to provision of the series, (c) a description of the series, (d) findings from a follow-up survey administered at the conclusion of the series, and (e) implications for Extension.

Participants

We identified potential participants from an email distribution list maintained within Human Sciences Extension and Outreach at Iowa State. State-level educators were automatically included; county-level educators who previously elected to be included on the distribution list also were eligible. All professional development opportunities in the series were voluntary, as were the baseline and follow-up surveys. The surveys were anonymous to protect the participants' privacy, so although there likely was some overlap in the respondents across the two surveys, they were treated as independent samples. The Iowa State University Institutional Review Board provided approval prior to data collection activities.

Baseline Survey

We sent via email an invitation to participate in the web-based baseline survey 1 week before the first professional development opportunity (February 2014). Eighty-one percent of the 67 state-level educators and 48% of the 108 county-level educators who were on the distribution list at the time of survey administration participated in the survey.

Learning Methods

We asked respondents to indicate what types of learning methods would provide them with the most knowledge by rating 12 learning strategies using an 11-point scale ranging from 0 (least preferred) to 10 (most preferred). Table 1 provides the results for this set of items.

Table 1.
Respondents' Preferences for Learning Methods

Learning method No. of respondents M (SD) Min
1. Experiential learning that includes practice by doing/teaching others 89 8.1 (1.9) 1
2. Problem solving/brainstorming using new information 90 7.7 (1.8) 1
3. Practical, problem-centered situational learning (rather than content learning) 88 7.7 (1.9) 2
4. Dialogue/discussion group 89 7.5 (1.7) 2
5. Experiential learning/educational games 89 7.3 (2.3) 0
6. Role play/simulation/demonstration 89 6.9 (2.5) 0
7. Case studies that build on my expertise 88 6.9 (1.9) 2
8. Audio-visual and virtual interaction 89 6.8 (2.3) 0
9. Continuous learning that is self-directed based on monitoring feedback 88 6.6 (2.0) 1
10. Reading and studying with a group (shared responsibility for learning) 88 6.2 (2.1) 1
11. Reading and studying independently (self-directed learning) 88 5.5 (2.3) 0
12. Lecture 82 4.9 (2.3) 0
Note. Min = minimum preference rating given by at least one respondent.

Results indicated that on average respondents rated most of the learning methods highly; however, at least one respondent rated each method unfavorably as illustrated by the minimum preference ratings in Table 1. There appeared to be a preference for experiential, applied learning, which is consistent with the original mission of the land-grant university and Cooperative Extension as well as with adult learning theory (Knowles, 1984, 1986; Morrill Land Grant Act of 1862; Smith-Lever Act of 1914). The varied responses across these items reinforced the need to incorporate different learning methods into the series.
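For readers working with similar survey exports, the short Python sketch below shows how the descriptive statistics reported in Table 1 (M, SD, and Min) can be computed for a single 0-to-10 rating item. The ratings used are hypothetical placeholders, not data from this study.

```python
# Minimal sketch: descriptive statistics for one 0-10 learning-method rating item.
# The ratings below are hypothetical placeholders, not the study's raw data.
from statistics import mean, stdev

ratings = [8, 9, 7, 10, 6, 8, 9, 7, 8, 5]  # one rating per respondent

n = len(ratings)            # "No. of respondents" column
m = mean(ratings)           # "M" column
sd = stdev(ratings)         # sample standard deviation, "SD" column
minimum = min(ratings)      # "Min" column: lowest rating given by any respondent

print(f"n = {n}, M = {m:.1f} (SD = {sd:.1f}), Min = {minimum}")
```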

Understanding of Research- and Evidence-Based Programs Before Professional Development

We presented the respondents with a set of five descriptions (Table 2) and asked them to select the description that met the standard for being a research-based program and the description that met the standard for being an evidence-based program.

Table 2.
Standards of Evidence Provided to Survey Respondents in Order from Lowest to Highest

Order # Standard of evidence
1 A program that has been well attended and for which participants report having had a positive experience
2 A program with a design based on relevant theories/research
3 A program based on relevant theories/research, with preprogram and postprogram data showing that participants show gains
4 A program based on relevant theories/research, with positive results from follow-up data comparing participant and nonparticipant groups, but not based on random assignment to groups
5 A program based on specified theory, evaluated with a randomized controlled study that shows positive outcomes
Note. Order: 1 = lowest, 5 = highest.

Although the majority (63%) of respondents correctly selected the commonly accepted standard for research-based programs (#2 in Table 2), 33% selected a more rigorous standard. There was much less clarity around the commonly accepted standard for evidence-based programs (#5 in Table 2), with only 21% of the respondents selecting the correct description. The results from the baseline survey reinforced the need to provide educators with professional development training related to identifying the commonly accepted definitions of both a research-based program and an evidence-based program and determining the standard of evidence for current and potential programs in their specialty areas.

Professional Development Series

We used three existing mechanisms within the Human Sciences Extension and Outreach infrastructure for dissemination of information and delivery of the series (Table 3). The rationale for using these preexisting mechanisms was that doing so would enable communication with all educators engaged with human sciences programming, encourage participation in the professional development opportunities by offering a variety of engagement methods, and assist with message repetition.

Table 3.
Delivery Mechanisms for the Series of Professional Development Opportunities

Mechanism Description
Weekly internal newsletter ("Community Chat") A PDF digest of relevant information for educators is sent via email each Friday. Components include an article from the director, announcements from Extension administration, commentary from various representatives, and upcoming and available professional development opportunities. The PDF also is available via a link, and past issues are archived for later viewing.
Monthly professional development webinar ("First Thursday") These webinars are virtual 1-hr meetings or presentations that provide information and education across a variety of topics and often include guest presenters. Sessions are recorded and posted on a staff-only page for accessibility and later viewing. Dissemination of the date, time, subject, and instructions for joining each webinar and a link to the recording occurs via the weekly internal newsletter. Participation is voluntary.
In-service ("Professional Development Days") These 2-day face-to-face professional development meetings occur twice yearly. Communication about the date, time, subject, and location takes place via the weekly internal newsletter. Participation is highly encouraged for state-level educators; county-level educators are invited. There is no registration fee.

We developed a series of six interrelated professional development opportunities related to research- and evidence-based programming and delivered the series via the mechanisms described in Table 3. The series design was intended to model best practices, increase understanding of the key concepts, and build educators' capacity to select the programs in their respective areas that have the most evidence supporting their effectiveness (Abell, Cummings, Duke, & Marshall, 2015; Gagnon, Franz, Garst, & Bumpus, 2015). Baseline data and the adult learning literature informed and guided development of the series (Pereira, Taylor, & Jones, 2009); key messages were reinforced through sequential learning opportunities spaced throughout the year that incorporated a variety of learning methods for maximum engagement. This approach was consistent with suggestions that Extension should use adult learning principles more effectively (Brower, 1964; Cummings, Andrews, Weber, & Postert, 2015; Franz, 2007; Ota, DiCarlo, Burts, Laird, & Gioe, 2006; Seevers, 1995). Table 4 provides a description of each opportunity and maps the types of learning methods used to the preferences expressed by baseline survey respondents (presented in Table 1). We designed the opportunities to be accessible and usable by other Extension systems for possible replication.

Table 4.
Professional Development Opportunities and Corresponding Learning Methods

Month Description Mapping of learning methods from Table 1
February First Thursday included content on the commonly accepted standards for research- and evidence-based programs, discussions among participants about their understanding of these standards, and application of the standards to two existing human sciences programs. 2, 4, 8, 10, 12
February Community Chat provided a link to the recording of the First Thursday presentation and included questions to guide self-directed learning. 8, 11
March Community Chat provided an asynchronous learning opportunity that involved viewing two videos, answering questions, and interacting virtually with other educators via a discussion board. 4, 8
April Professional Development Days included an interactive session during which the commonly accepted standards of research- and evidence-based programs were reviewed, current Human Sciences Extension and Outreach programs were placed on a continuum of evidence, and information was shared regarding how to evaluate and select programs based on the rigor of their evidence. 2, 4, 8, 12
July Community Chat provided a summary of and access to a research article articulating the concepts of research- and evidence-based programming and their importance for Extension. 11
November Community Chat included a presentation educators could use as self-study and share with external partners. 1, 8
Note. The events identified occurred in 2014.

Follow-Up Survey

We sent via email an invitation to participate in the web-based follow-up survey 1 week after the final professional development opportunity was completed. Seventy-four percent of the 61 state-level educators and 38% of the 85 county-level educators who were still on the distribution list at the time of follow-up survey administration participated in the survey.

Participation in Professional Development Opportunities

We asked survey respondents to report on whether they participated in each of the six professional development opportunities (Table 5). Respondents reported relatively higher participation rates for the First Thursday webinar and Professional Development Days, both synchronous learning opportunities, and generally lower participation in asynchronous and self-directed learning activities. Additionally, state-level educators were more likely than county-level educators to have participated in each professional development opportunity. It is important to note that the participation rates shown in Table 5 represent only the group of educators who participated in the follow-up survey; therefore, these rates do not necessarily reflect participation by all educators.

Table 5.
Self-Reported Participation in Professional Development Opportunities by Follow-Up Survey Respondents

Professional development opportunity Self-reported participation
"First Thursday" presentation (February) 54%
Link to "First Thursday" recording (February) 16%
Video and discussion board (March) 26%
Professional Development Days (April) 55%
Newsletter link to research article (July) 33%
Self-study (November) 26%

Understanding and Application of Research- and Evidence-Based Standards After Professional Development

In the follow-up survey, we presented the respondents with the five descriptions included in the baseline survey (see Table 2) and asked them to select the description that met the standard for being a research-based program and the description that met the standard for being an evidence-based program. Results from the follow-up survey indicated that 68% of respondents (vs. 63% of the baseline respondents) correctly selected the commonly accepted standard for research-based programming; in addition, 27% of follow-up survey respondents (vs. 33% of the baseline respondents) incorrectly indicated that "research-based" implied a higher standard of evidence than it does. Also, follow-up survey respondents were more likely to select the commonly accepted standard for evidence-based programs (35%) as compared to baseline respondents (21%); however, there was less clarity around this standard at both time points when compared to the research-based standard, for which there was generally a higher level of understanding. These findings suggest a trend toward greater understanding of the commonly accepted standards for evidence-based programs among the follow-up survey respondents as compared to the baseline survey respondents. The differences associated with research-based programs were smaller, but were also in the expected direction.
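These comparisons are reported descriptively, and the anonymous baseline and follow-up surveys were treated as independent samples. Readers wishing to gauge whether a shift of this kind exceeds chance could apply a two-proportion z-test to the two samples, as in the minimal Python sketch below; the counts shown are hypothetical placeholders, not the study's actual sample sizes.

```python
# Minimal sketch: two-proportion z-test comparing the share of respondents selecting
# the commonly accepted evidence-based standard at baseline vs. follow-up.
# Counts below are hypothetical placeholders, not the study's actual sample sizes.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(successes1, n1, successes2, n2):
    """Return (z, two_sided_p) for H0: p1 == p2 using the pooled-variance z-test."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: 21% of 100 baseline respondents vs. 35% of 80 follow-up respondents.
z, p = two_proportion_z(successes1=21, n1=100, successes2=28, n2=80)
print(f"z = {z:.2f}, two-sided p = {p:.3f}")
```

Because individual responses could not be linked across the two anonymous surveys, an independent-samples comparison of this form is the natural choice; a paired analysis would require identifiable repeated measures from the same educators.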

Finally, we asked respondents to select the category that best described most Human Sciences Extension and Outreach programs from the following options: research-based, evidence-based, or neither research-based nor evidence-based. We included this item in both the baseline and follow-up surveys (Table 6) to explore how the series might have affected educators' beliefs that currently offered programs are research-based or evidence-based.

Table 6.
Selection of the Category Characterizing Most Human Sciences Extension and Outreach Programs by Baseline and Follow-Up Survey Respondents

Standard Baseline survey respondents Follow-up survey respondents Difference
Research-based 70% 76% +6%
Evidence-based 25% 19% −6%
Neither 6% 5% −1%

The majority of baseline and follow-up survey respondents reported that most of the Human Sciences Extension and Outreach programs were research-based; slightly fewer follow-up survey respondents selected the evidence-based category, and slightly more selected the research-based category. This finding might indicate that the series helped educators begin to better assess the quantity and quality of the evidence behind the programs they are offering.

Participant Feedback on Professional Development Opportunities

Three of us analyzed qualitative data from the follow-up survey through open and focused coding processes and inductively generated codes from the data. We individually developed codes related to each question and then discussed and compared the individually generated codes to come to consensus. When disagreements occurred, we examined the codes in detail and debated until we reached agreement. We then grouped codes into broader themes via consensus. We present results from four open-ended questions included in the follow-up survey in Tables 7–10.

Table 7.
Most Frequently Mentioned Themes in Response to "What Did You Learn from Participating in This Series of Educational Activities About Research- and Evidence-Based Programs?"

Theme Description Example
Lack of consensus among group Existing confusion about the meaning of the terms research-based and evidence-based across educators and human sciences disciplines "The subject matter [teams] define these terms differently and also have different understandings."
Distinction between definitions Understanding of the difference between the definitions "[I better understand] the difference between research and evidence based programs."
Applicability of the effort Ability to apply definitions to programming opportunities "I better understand the terms, what they mean, and how it applies to our programming."

The question "Have you taken any action or changed the way you perform your job as a result of participating in this series of educational activities about research- and evidence-based programs?" was a quantitative item to which 37% of the follow-up survey respondents responded affirmatively. An open-ended follow-up question asked these respondents to describe those actions or changes (Table 8).

Table 8.
Most Frequently Mentioned Themes in Response to "If Yes, What?" Follow-Up Question Related to Action Taken or Changes Made

Theme Description Example
Ask more questions Individual inquiry related to programming opportunities "I ask a lot more questions about the programs we are currently offering and ideas for new programs."
Talk with others Engagement in discussions with others related to programming opportunities "I am more [vocal] about reminding people on work teams about research-based versus evidence-based programming."
Examine programs offered Individual investigation related to programming opportunities ". . . [I] looked for evidence-based programming in my area."
Value fidelity and evaluation Individual use of program fidelity and evaluation concepts "I strive to be more proactive when collecting data and determining objectives."

Table 9.
Most Frequently Mentioned Themes in Response to "What Do You Feel Was Most Helpful About This Series of Educational Activities?"

Theme Description Example
Development and delivery of the content Establishment and dissemination of key concepts across Human Sciences Extension and Outreach ". . . the fact that it existed is the most helpful thing—the fact that we're trying to pay attention to the issue."
Provision of a variety of opportunities Appreciation for having different options for learning ". . . the comprehensive opportunities to better understand."
Repetition of key message Reinforcement of the same message across the series as a learning method ". . . hearing it more than once has helped me to understand it better."
Encouragement to think conceptually Application beyond day-to-day tasks "It helped me to think outside of the box and look at what other states are doing as innovative best practice."

Table 10.
Most Frequently Mentioned Themes in Response to "What Do You Feel Was Least Helpful?"

Theme Description Example
Unclear communication regarding the offerings as a series Opportunities not identified as being part of a series ". . . I was unaware of some of the activities, and that there was a clearly-planned strategy . . ."
Judgmental training context Feelings of being "judged" experienced by some individuals relative to certain aspects of the series ". . . assuming that no-one knew this information. It felt judgmental at times."

Finally, 98% of the follow-up survey respondents indicated that they would recommend the series of educational activities to a colleague.

Limitations

The project we present here was exploratory and process-oriented, and thus there are limitations to the use and generalizability of the findings. Seventeen percent of eligible baseline survey participants left the Extension system by the time of the follow-up survey, and only 53% of the 145 eligible Human Sciences Extension and Outreach–affiliated educators participated in the follow-up survey. Given these circumstances, it is unclear how representative the self-reported participation rates are for the professional development opportunities. Additionally, the themes that arose from the open-ended follow-up survey items may have differed had more educators responded. However, the process used and the results presented herein do add to the scant body of current literature related to knowledge levels about research- and evidence-based programming and may have implications for other human sciences units and/or Extension systems.

Implications for Extension

The catalyst for this professional development series was the perception that educators were not using the terms research-based and evidence-based in ways consistent with the literature (Cooney et al., 2007). In particular, the term research-based was perceived as so integral to the identity of Human Sciences Extension and Outreach that it appeared to have become ubiquitous in its use, yet it lacked explicit meaning and implication. This experience suggests that it is improbable that all educators within the human sciences will decide and agree on one standard of evidence to apply when selecting programs or educational opportunities.

It is important to note that educator reactions to the series on this topic varied, ranging from appreciating the opportunity to learn about the topic to feeling judged. Additional research could provide information related to fully engaging all members of the workforce in a professional development series of this type in ways that are meaningful to them, given their positions within the organization, responsibilities, educational backgrounds, and preferred learning methods. Researchers might consider designing an alternative delivery mechanism for related professional development that uses the strengths and knowledge of current faculty and staff to elevate the knowledge of the entire unit. The varying levels of preexisting knowledge and engagement with the series across educators suggest that a train-the-trainer model could be a practical implementation strategy. For example, a baseline survey might be developed and used to identify educators with the prescribed level of knowledge on research- and evidence-based standards. These individuals would then receive a standardized training to fill any gaps in their knowledge before being deployed to train other educators. This approach might be particularly useful to Extension systems that do not have resident experts such as the PROSPER Network Organization available on their campuses.

Conclusion

Herein, we described the development, implementation process, and exploratory evaluation of a series of professional development opportunities designed to address educators' knowledge and use of the terms research-based and evidence-based within Human Sciences Extension and Outreach at one university. As compared to those who responded to the baseline survey conducted before implementation of the professional development series, respondents to the post-series follow-up survey were more likely to select correctly the commonly accepted standard for evidence-based programs, and to a lesser extent, the commonly accepted standard for research-based programs. In addition, educators reported translating what they learned via the series into action. Specifically, they reported asking more questions, talking to others about the topics, examining the evidence base of current program offerings, and placing more value on fidelity and evaluation within programming. Educator reactions to the series were generally positive; however, researchers interested in designing alternatives might consider engaging educators within the context of their preexisting knowledge levels. A train-the-trainer model involving educators who already have a firm understanding of the concepts might be a good implementation strategy if outside experts are not readily available.

Acknowledgments

We thank Eugenia Hartsook, Richard Spoth, and Jane Todey for their assistance in the development of the professional development series described here and Cleve Redmond for his assistance with the preparation of this article.

References

Abell, E., Cummings, R., Duke, A., & Marshall, J. W. (2015). A framework for identifying implementation issues affecting Extension human sciences programming. Journal of Extension, 53(5), Article 5FEA2. Available at: http://www.joe.org/joe/2015october/a2.php

Brower, S. L. (1964). Dilemma of adult educators. Journal of Extension, 2(2). Available at: http://www.joe.org/joe/1964summer/1964-2-a7.pdf

Cooney, S. M., Huser, M., Small, S., & O'Connor, C. (2007). Evidence-based programs: An overview. What Works, Wisconsin Research to Practice Series, 6. Madison, WI: University of Wisconsin–Madison/Extension.

Crawford, J. K., Riffe, J., Trevisan, D. A., & Adesope, O. O. (2014). Buffering negative impacts of divorce on children: Evaluating impact of divorce education. Journal of Extension, 52(4), Article 4RIB3. Available at: http://www.joe.org/joe/2014august/pdf/JOE_v52_4rb3.pdf

Cummings, S. R., Andrews, K. B., Weber, K. M., & Postert, B. (2015). Developing Extension professionals to develop Extension programs: A case study for the changing face of Extension. Journal of Human Sciences and Extension, 3(2), 132–155.

Downey, L. H., Peterson, D. J., LeMenestrel, S., Leatherman, J., & Lang, J. (2015). The systematic screening and assessment method: An introduction and application. Journal of Extension, 53(2), Article 2IAW2. Available at: http://www.joe.org/joe/2015april/iw2.php

Dunifon, R., Duttweiler, M., Pillemer, K., Tobias, D., & Trochim, W. M. K. (2004). Evidence-based Extension. Journal of Extension, 42(2), Article 2FEA2. Available at: http://www.joe.org/joe/2004april/a2.php

Fetsch, R. J., MacPhee, D., & Boyer, L. K. (2012). Evidence-based programming: What is a process an Extension agent can use to evaluate a program's effectiveness? Journal of Extension, 50(5), Article 5FEA2. Available at: http://www.joe.org/joe/2012october/a2.php

Flay, B. R., Biglan, A., Boruch, R. F., Castro, F. G., Gottfredson, D., Kellam, S., . . . Ji, P. (2005). Standards of evidence: Criteria for efficacy, effectiveness, and dissemination. Prevention Science, 6, 151–175.

Franz, N. (2007). Adult education theories: Informing Cooperative Extension's transformation. Journal of Extension, 45(1), Article 1FEA1. Available at: https://joe.org/joe/2007february/a1.php

Gagnon, R. J., Franz, N., Garst, B. A., & Bumpus, M. F. (2015). Factors impacting program delivery: The importance of implementation research in Extension. Journal of Human Sciences and Extension, 3(2), 68–82.

Hamilton, S. F., Chen, E. K., Pillemer, K., & Meador, R. H. (2013). Research use by Cooperative Extension educators in New York State. Journal of Extension, 51(3), Article 3FEA2. Available at: http://www.joe.org/joe/2013june/a2.php

Knowles, M. (1984). The adult learner: A neglected species (3rd ed.). Houston, TX: Gulf Publishing Co.

Knowles, M. (1986). Using learning contracts: Practical approaches to individualizing and structuring learning. San Francisco, CA: Jossey-Bass Inc.

Mincemoyer, C., Perkins, D., Ang, P. M., Greenberg, M. T., Spoth, R. L., Redmond, C., & Feinberg, M. (2008). Improving the reputation of Cooperative Extension as a source of prevention education for youth and families: The effects of the PROSPER model. Journal of Extension, 46(1), Article 1FEA6. Available at: https://joe.org/joe/2008february/a6.php

Morrill Land Grant Act of 1862, ch. 130, 12 Stat. 503, 7 U.S.C. §§ 301 et seq.

Olson, J. R., Welsh, J. A., & Perkins, D. F. (2015). Evidence-based programming within Cooperative Extension: How can we maintain program fidelity while adapting to meet local needs? Journal of Extension, 53(3), Article 3FEA3. Available at: http://www.joe.org/joe/2015june/a3.php

Ota, C., DiCarlo, C. F., Burts, D. C., Laird, R., & Gioe, C. (2006). Training and the needs of adult learners. Journal of Extension, 44(6), Article 6TOT5. Available at: http://www.joe.org/joe/2006december/tt5.php

Partnerships in Prevention Science Institute. (2015). PROSPER Network Organization—the PNO. Retrieved from https://prosper-ppsi.sws.iastate.edu/about-us

Pereira, C., Taylor, J., & Jones, M. (2009). Less learning, more often: The impact of spacing effect in an adult e-learning environment. Journal of Adult and Continuing Education, 15, 17–28.

Perkins, D. F., Chilenski, S. M., Olson, J. R., Mincemoyer, C. C., & Spoth, R. (2014). Knowledge, attitudes, and commitment concerning evidence-based prevention programs: Differences between family and consumer sciences and 4-H youth development educators. Journal of Extension, 52(3), Article 3FEA6. Available at: http://www.joe.org/joe/2014june/a6.php

Seevers, B. S. (1995). Extensionists as adult educators: A look at teaching style preference. Journal of Extension, 33(3), Article 3RIB2. Available at: http://www.joe.org/joe/1995june/rb2.php

Smith-Lever Act of 1914, ch. 79, 38 Stat. 372, 7 U.S.C. §§ 341 et seq.

Spoth, R., & Greenberg, M. (2011). Impact challenges in community science-with-practice: Lessons from PROSPER on transformative practitioner–scientist partnerships and prevention infrastructure development. American Journal of Community Psychology, 48(1–2), 106–119.

Spoth, R., Schainker, L. M., Redmond, C., Ralston, E., Yeh, H. C., & Perkins, D. F. (2015). Mixed picture of readiness for adoption of evidence-based prevention programs in communities: Exploratory surveys of state program delivery systems. American Journal of Community Psychology, 55, 253–265.