April 2020 // Volume 58 // Number 2 // Tools of the Trade // v58-2tt2
Publishing Extension Evaluations in Academic Research Journals: Some Recommendations
Abstract
Extension evaluation studies can often provide the basis for valuable disciplinary contributions. This article presents recommendations for developing academic journal manuscripts from Extension evaluations. A journal article is built around research questions that may be distinctly different from the evaluation questions that drove the original study. Those research questions should reflect current issues and debates within the discipline rather than serve as a perfunctory report of the evaluation's outcomes. Other recommendations involve selecting a target journal, optimizing methodological rigor in the analysis, and generating multiple manuscripts from the same evaluation study. The article includes illustrations from my evaluation work.
Introduction
The Extension system is a hub of program innovation and evaluation across numerous disciplinary areas. Evaluation sheds light on Extension programs and informs decisions about whether to expand them, improve them, deliver them to new audiences, or move on to something else. But evaluations also can contribute to disciplinary knowledge about program theory, program implementation, and translational science. For these purposes, Extension academics publish their evaluation findings in peer-reviewed journals, including Journal of Extension. In addition, scholarship is frequently an expectation of Extension position descriptions. Thus, publishing is an important part of the job for many Extension academics (Alter, 2003; Culp, 2009; Franz & Stovall, 2012).
In this article I offer some reflections and recommendations related to Extension scholarship, particularly the process of turning Extension evaluations into research papers for disciplinary journals. These recommendations stem from my own experiences and those of my colleagues on a variety of evaluation projects that we believed had potential to contribute to the research literature. I have found that a change in approach is needed as one shifts focus from the evaluation's stakeholder audience(s) to a broader, national research audience. My recommendations reflect that shift. The appendix provides an illustrative case study from one of my evaluation projects.
Reflections and Recommendations
It's all about the research question. The flip side of this recommendation is this: It's not about your original evaluation questions, and it's not about reporting on the program's overall effectiveness. A common misconception is that if a program has been found to be effective, that result will necessarily be the core of a publishable paper. On the contrary, evidence of program effectiveness will be welcome news for your local stakeholders but will probably be of little interest to a national audience unless it relates to something that audience wants to know. Build the paper around a current question for which "the field" desires an answer.
You need to know the research literature. How do you determine what "the field" wants to know? There is no substitute: Read the relevant journals, and try to determine whether some element within your data set addresses a current research debate to which your study can contribute.
The research paper might cover only a limited component of your original evaluation study. Your manuscript might focus on certain statistical relationships among your program outcomes, or an innovation you introduced in measuring those outcomes, or some surprising insights about the program's target audience. If your intervention is on the cutting edge of program theory, your paper might focus on the intervention itself, but you must make the case for why it is an innovation compared to standard practice. Make sure your paper contributes a new idea—that should be the paper's core.
You should pick your target journal before you start writing. The choice of journal makes an enormous difference for how you will write your paper. First, regarding logistics, it will dictate how long the manuscript can be. You do not want to write prodigiously and then discover you need to cut much of what you have written. In addition, most journals, including Journal of Extension, have different submission categories, so part of your decision process might involve whether to submit your paper as a regular article or a "brief report." Second, deciding where to publish will constitute a decision about what kinds of audiences you wish to reach. That decision will affect elements of your writing, such as your assumptions regarding readers' primary interests (relevant for how you frame the study's significance) and their prior knowledge about the subject (relevant for the amount of introductory background information you may need to provide). Third, consider the prestige of your candidate journals. The world of academic publishing has distinct pecking orders, and journals within a discipline are ranked, formally and informally, in terms of prominence. One common metric is the journal's impact factor, a measure of how frequently its articles are cited (Braverman, 2018). To make this decision, you must make a cold, hard assessment of the quality of your paper, including its potential contribution and its methodological rigor. My personal approach is to aim high at the outset with regard to journal quality and prominence. The most significant potential price of this strategy will be a time delay and an ego blow if the manuscript gets rejected. That cost, however, can be balanced by receiving reviewer comments that will help you improve the paper for your next go-round.
Methodological rigor is necessary but not sufficient. Whether your analyses are quantitative, qualitative, or both, methodological rigor is of paramount importance. The expectations for rigor are typically higher for academic journals than for your local evaluation stakeholders. Therefore, it may be advisable to return to your data set to conduct new analyses. This might mean revisiting questions covered in your original evaluation, adding supplemental analyses to cover those questions more thoroughly, and/or addressing new research questions that differ from your previous evaluation questions. For example, if your primary analysis was an assessment of differences between program and control groups, you might add selected covariates for greater statistical power or conduct an attrition analysis to test whether your results are biased due to some participants' having quit the program early. If your prior results do not hold up with greater rigor, they probably were not on firm ground to begin with. After all, that is why rigor is valued. And if they do hold up, the justification for your research conclusions will be strengthened. Consult a colleague with strong methodological expertise on these questions. You might consider inviting that colleague onto your writing team, in which case he or she could lead the reanalysis and be listed as a coauthor.
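To make the reanalysis step concrete, here is a minimal sketch in Python, using pandas and statsmodels, of the two supplemental analyses just described: a covariate-adjusted group comparison and an attrition check. The data file and every variable name (outcome, group, age, baseline_score, dropped_out) are hypothetical placeholders, not elements of any particular evaluation.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical evaluation data set with one row per participant.
df = pd.read_csv("evaluation_data.csv")

# Original analysis: a simple comparison of program and control groups.
simple = smf.ols("outcome ~ group", data=df).fit()
print(simple.summary())

# Supplemental analysis 1: add covariates to reduce error variance and
# gain statistical power for detecting the group effect.
adjusted = smf.ols("outcome ~ group + age + baseline_score", data=df).fit()
print(adjusted.summary())

# Supplemental analysis 2: attrition check. If baseline characteristics
# predict who quit the program early, completer-only results may be biased.
attrition = smf.logit("dropped_out ~ group + age + baseline_score",
                      data=df).fit()
print(attrition.summary())
```

If the group effect holds up under the covariate adjustment, and early dropout proves unrelated to group membership and baseline characteristics, the original conclusions rest on firmer ground, which is exactly the reasoning described above.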
It is perfectly OK to get multiple papers from the same evaluation study. Evaluation studies are often wide-ranging, with multiple questions, outcomes, measurement strategies, and analyses. By contrast, journal articles must be tightly focused and are almost always strictly limited in length. Given that an article is built around a specific research question, the paper must closely follow the logic of that question, including only the data and analyses that are directly relevant. Therefore, if you are fortunate enough to have multiple potential research questions embedded in your evaluation, they can be addressed in separate papers.
Conclusion
With these recommendations in mind, I encourage Extension academics to be alert for opportunities to turn Extension evaluations into rigorous scholarship. Not all evaluations will fit this bill, because evaluations are conducted for many different purposes. But many evaluations will have relevance, at least in part, for the broader research literature. I wish you success in your publishing endeavors!
References
Alter, T. R. (2003). Where is Extension scholarship falling short, and what can we do about it? Journal of Extension, 41(6), Article 6COM2. Available at: https://www.joe.org/joe/2003december/comm2.php
Braverman, M. T. (2018). The evolving landscape for academic publishing: Essential knowledge for Extension scholars. Journal of Extension, 56(3), Article v56-3tt1. Available at: https://joe.org/joe/2018june/tt1.php
Braverman, M. T., Geldhof, G. J., Hoogesteger, L. A., & Johnson, J. A. (2018). Predicting students' noncompliance with a smoke-free university campus policy. Preventive Medicine, 114, 209–216. doi:10.1016/j.ypmed.2018.07.002
Braverman, M. T., Hoogesteger, L. A., & Johnson, J. A. (2015). Predictors of support among students, faculty and staff for a smoke-free university campus. Preventive Medicine, 71, 114–120. doi:10.1016/j.ypmed.2014.12.018
Culp, K., III. (2009). The scholarship of Extension: Practical ways for Extension professionals to share impact. Journal of Extension, 47(6), Article v47-6comm1. Available at: https://joe.org/joe/2009december/comm1.php
Franz, N. K., & Stovall, C. E. (2012). JOE's niche in the Extension scholarship movement. Journal of Extension, 50(5), Article v50-5comm2. Available at: https://joe.org/joe/2012october/comm2.php
Appendix
Publishing Evaluation Studies: A Case Study
The Intervention and Its Setting
Oregon State University (OSU) instituted a policy in fall 2012 that made its main campus in Corvallis completely smoke free. Previously, smoking was prohibited indoors and within 30 feet of building entrances, but the new policy prohibited smoking and vaping (use of e-cigarettes) anywhere on campus.
The Original Evaluation
In spring 2013 I was part of a team that conducted an online survey of all students, faculty, and staff on campus to evaluate the success of the policy in its first year. The evaluation addressed five primary questions, which included these two:
- What are the levels of support for the new OSU policy among students, faculty, and staff?
- What has been the level of compliance with the smoke-free campus policy?
The data collection was completed in June 2013. Our responding sample included 5,698 students and 2,055 faculty and staff. Among the major findings were these results:
- 72% of students and 77% of faculty expressed the opinion that the OSU campus should be 100% smoke free.
- 57% of student smokers and 80% of faculty/staff smokers had not smoked on campus at all since the policy went into effect.
We reported our findings and conclusions to the primary stakeholder audiences on campus, and an executive summary of the evaluation was posted online. The findings showed that the new policy was very well supported on campus but that there was room for improvement in policy compliance, especially among students.
The Research Journal Papers
Once we had reported our results and conclusions to primary stakeholders and used them to inform policy implementation, we considered whether some of our findings might be worth sharing in disciplinary journals. We did not presume that our overall findings (e.g., the raw sample percentages listed above) would be of broad interest to researchers beyond our campus. Rather, we shaped research papers around new research questions.
Paper 1: An Analysis of Patterns of Policy Support
Our first paper from the data set (Braverman, Hoogesteger, & Johnson, 2015) analyzed the predictors of support for the smoke-free policy.
- Primary research question: What are the correlates of policy support among students, faculty, and staff? That is, what factors tend to distinguish respondents who support the smoke-free policy from those who oppose it?
- New data analyses: We conducted regression analyses that identified significant predictors of policy support.
- Potential contribution of this paper: An increased understanding of the factors that underlie policy support. If similar patterns are found on other campuses, health administrators and researchers can use this type of information to inform tobacco policy decisions and develop effective information campaigns.
Paper 2: An Analysis of Policy Violations by Student Smokers
A later paper (Braverman, Geldhof, Hoogesteger, & Johnson, 2018) focused exclusively on students within our sample who reported being tobacco users. We sought to examine why some student tobacco users had smoked on campus, in violation of the policy.
- Primary research question: What are the correlates of policy violation (smoking on campus) by student smokers?
- New data analyses: We conducted regression analyses that identified significant predictors of students' policy violation. This included the creation of a new variable (presence of tobacco cravings) based on the coding of students' qualitative comments (see the illustrative sketch after this list).
- Potential contribution of this paper: An increased understanding of why students violate a smoke-free campus policy, particularly with regard to the potential role of tobacco dependence and addiction. This type of information can contribute to developing effective policy education, enforcement strategies, and smoking cessation services.
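As an illustration only, the sketch below shows in Python the general shape of deriving a binary indicator from participants' comments and entering it into a logistic regression predicting policy violation. The actual coding of qualitative comments in a study like this would be done by human coders; the keyword flag here is merely a stand-in, and every name in the sketch (student_smokers.csv, comment, violated_policy, daily_smoker, lives_on_campus, craving_mentioned) is hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file of survey responses from student tobacco users.
smokers = pd.read_csv("student_smokers.csv")

# Stand-in for the human coding step: flag comments mentioning cravings.
craving_terms = ["craving", "urge", "withdrawal"]
smokers["craving_mentioned"] = (
    smokers["comment"]
    .fillna("")
    .str.lower()
    .apply(lambda text: int(any(term in text for term in craving_terms)))
)

# Logistic regression: does the derived variable predict smoking on
# campus (policy violation), alongside other hypothetical predictors?
model = smf.logit(
    "violated_policy ~ craving_mentioned + daily_smoker + lives_on_campus",
    data=smokers,
).fit()
print(model.summary())
```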
Illustrative Points from This Case Study
Several points from this example illustrate the recommendations presented previously:
- Each paper was focused on a small part of the overall data set.
- The research questions for each paper differed from the original evaluation study's primary questions. Indeed, the later analyses did not appear in the evaluation reports because they were not germane to the evaluation questions being answered.
- Each paper required a new round of data analysis, which was more complex than the analyses in the original evaluation. The evaluation had made use primarily of descriptive data (percentages who support and oppose, etc.), but the research papers required multivariate analyses to build an understanding of those outcomes.
- For Paper 2, we brought a colleague with strong methodological expertise onto our team—after we had identified the new research question that we wanted to examine.