The Journal of Extension - www.joe.org

October 2018 // Volume 56 // Number 6 // Feature // v56-6a2

Readying Extension for the Systematic Analysis of Large Qualitative Data Sets

Abstract
Land-grant Extension institutions face increasing expectations to use data to communicate value and drive program and organizational development. In this article, we introduce the University of Wisconsin–Extension Data Jam Initiative, an integrated qualitative software, methods, and data analysis curriculum. The Data Jam Initiative is an evaluation capacity building framework for collaborative, mentorship-based analysis sessions across an institution and across disciplines. Through sharing exemplar applications of this curriculum, we illustrate how the Data Jam Initiative prepares Extension institutions for using qualitative data in service of communication to stakeholders, program development, and organizational growth.


Christian Schmieder
Qualitative Research Specialist
christian.schmieder@ces.uwex.edu

Kyrie E. H. Caldwell
Qualitative Research Assistant
kyrie.caldwell@ces.uwex.edu
@DataJamsUWEX

Ellen Bechtol
Qualitative Research Assistant
ellenbechtol@gmail.com

University of Wisconsin–Extension
Madison, Wisconsin

The Imperative to Use Data

Using data in program development, organizational development, and external communications has become increasingly imperative for Extension institutions. Since the 1970s, the threat of budget cuts has been one driver of this trend (Andrews, 1983; Rennekamp & Arnold, 2009; Rennekamp & Engle, 2008). Technological developments in digital data collection and storage have further accelerated the imperative to use data in organizations striving for societal change (Fruchterman, 2016).

Today, Extension professionals are expected to ground their programming decisions in the realities of the people they serve, on both a team level and a program area level. On a state level, Extension institutions have an increasing need to communicate their value to partners, stakeholders, and funders as indicated by data-derived insights (Lamm & Israel, 2013; Taylor-Powell & Boyd, 2008). On a regional level, organizations such as the North Central Cooperative Extension Association strive to empirically understand, conceptualize, and communicate the work the Extension system does. Although quantitative measures can provide common indicators across programs and state Extension organizations, stakeholders also require and are responsive to the stories and patterns behind the numbers. Funding partners from local to federal levels are asking for programmatic narratives and descriptions of concrete impact. Producing these narratives and descriptions requires consistent and contextual narrative data. In other words, working with both quantitative and qualitative data is important at all levels of Extension.

Scaling up the analysis of large textual data sets is not as straightforward as scaling up the analysis of quantitative data. Qualitative data lead to highly context-sensitive, rich, and complex insights but require careful, systematic analysis. Generating and storing large amounts of textual data (e.g., impact narratives, research reports, federal reports, evaluation reports, program development documentation) has, at least technically, become much easier in the past decade. Yet the sheer availability of data does not mean that an organization is prepared to analyze those data. This is especially true when such data are collected through various sources across a complex institution and accumulated over time.

We believe that fruitfully using qualitative data on a broad scale in Extension requires analytic collaboration resting on broadly distributed data literacy and analytic capacity. We believe this for three reasons: (a) sound analysis of large amounts of qualitative data is time intensive; (b) consistent analysis of complex qualitative data from a multifaceted institution requires that multiple organizational perspectives be represented during the analysis; and (c) involving more analysts, rather than fewer, in the analysis of qualitative data is methodologically stronger because it helps control for subjective biases. Thus, we assume that distributed, collaborative, and decentralized data analysis is an answer to the data imperative, leading us to consider what needs to be in place to carry out such analysis.

The Data Jam Initiative

Through the University of Wisconsin–Extension Data Jam Initiative, we foster analytic skills and build institution-wide capacity in using digital tools that aid in the analysis of large amounts of textual data. We focus on connecting teams, researchers, evaluators, and educators with similar needs regarding qualitative analysis. This approach in turn fosters commonly shared organizational concepts and analytic skills that we view as the prerequisite to developing our programming consistently as an institution and powerfully communicating how Cooperative Extension at University of Wisconsin–Extension affects the lives of the people we serve.

We drew inspiration for the Data Jam model from game jams, in which video game developers meet for a short amount of time to produce game prototypes. According to Preston, Chastine, O'Donnell, Tseng, and MacIntyre (2012), "jams provide participants an opportunity to improve their skills, collaborate with their peers, and advance research and creativity" (p. 51). The goal of our Data Jams is for participants to produce concrete write-ups, models, initial theories, and visualizations through collaboration. We then share these products with colleagues, partners, and relevant stakeholders via our blog and newsletters.

Such a collaborative and hands-on approach is vital for the success of evaluation capacity building in Extension institutions (Taylor-Powell & Boyd, 2008, p. 59). Our initiative aims at a core goal of evaluation capacity building as an intentional effort "to continuously create and sustain overall organizational processes that make quality evaluation and its uses routine" (Compton, Baizerman, & Stockdill, 2002, p. 14, as cited in Taylor-Powell & Boyd, 2008, p. 56). Following Taylor-Powell and Boyd's (2008) specification of evaluation capacity building in Cooperative Extension, our initiative fosters "general awareness, skills, resources, and infrastructures to support evaluation, that is, the organizational processes that embed evaluative inquiry into the organization" (p. 56). We believe that the Data Jam Initiative addresses these needs as follows: We use a shared and actively supported technical tool ("skills, resources, and infrastructures"), shared analytic strategies and processes ("skills, resources"), and collaborative work based on mentorship across the organization (all of the above specifications).

The pedagogical backbone of the Data Jam Initiative is a flexible curriculum designed to give facilitators the tools to create spaces in which colleagues collaboratively analyze real evaluation data. The technical backbone of this effort is qualitative data analysis (QDA) software. We use the software package MAXQDA, but the curriculum is not limited to a specific product. QDA software supports analysts in organizing, annotating, and sorting qualitative data (Gibbs, 2014; Schmieder, 2014). It does not perform analysis but acts as a powerful workbench for annotating and sorting data, creating complex retrievals, and documenting and visualizing analytic processes. QDA software serves as a stable retainer and organizing device for data sets that are used institution-wide (Schmieder, 2018). We also use it as a teamwork platform and teaching tool for qualitative analysis as colleagues create the previously discussed research or evaluation products.
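
To make the role of this "workbench" concrete, the following minimal Python sketch illustrates how coded segments might be stored and then retrieved by code. The sketch is our own illustration under stated assumptions: the class, code labels, and file names are hypothetical examples and are not features of MAXQDA or of any particular QDA package.

# Illustrative sketch only: a toy model of what QDA software manages for analysts.
# The class, code labels, and file names below are hypothetical, not MAXQDA features.
from collections import Counter
from dataclasses import dataclass

@dataclass
class CodedSegment:
    document: str   # source document (e.g., a focus group transcript)
    code: str       # analytic code applied to the segment
    text: str       # the quoted passage
    memo: str = ""  # analyst's annotation

segments = [
    CodedSegment("focus_group_1.txt", "barriers/cost", "Seed is too expensive...", "raised by two farmers"),
    CodedSegment("focus_group_1.txt", "motivations/soil_health", "We saw less erosion..."),
    CodedSegment("focus_group_2.txt", "barriers/cost", "Can't justify it this year..."),
]

def retrieve(segments, code_prefix, document=None):
    """Return segments whose code starts with a prefix, optionally limited to one document."""
    return [s for s in segments
            if s.code.startswith(code_prefix)
            and (document is None or s.document == document)]

# A simple "retrieval": all passages coded under 'barriers', with counts per code.
barriers = retrieve(segments, "barriers")
print(Counter(s.code for s in barriers))
for s in barriers:
    print(f"{s.document}: {s.text}")

Dedicated QDA packages layer memo writing, visualizations, and team workflows on top of this basic annotate-and-retrieve model, which is what makes them useful as an institution-wide organizing device.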

We highlight our use of QDA software because we believe it is a helpful but underused organizational tool. To our knowledge, the extent and the nature of the use of QDA software in the Extension system have not yet been specifically documented. Our search for common QDA software programs in the Journal of Extension online database indicated that Extension professionals have used these programs for decades. Over the past half dozen or so years, Journal of Extension article authors have increasingly mentioned QDA software, perhaps indicating increased use in Extension institutions. Prior to 2010, only 10 Journal of Extension articles mentioned QDA software packages (e.g., Fitzpatrick, Gagne, Jones, Lobley, & Phelps, 2005; Jemison, Wilson, & Graham, 2004), compared with 19 articles from 2010 to 2016 (e.g., Baughman, Boyd, & Kelsey, 2012; Inwood, 2015; Van Offelen, Schroeder, Leines, Roth-Yousey, & Reicks, 2011). We hope that our use of QDA software in the Data Jams and associated reporting herein adds to this discourse and to the discourse on the use of any software for organizational learning and management in Extension institutions.

Another critical element of the Data Jam Initiative's pedagogical approach is a mentorship model of teamwork, rather than an expert model of delivery. As groups analyze data together, experienced users mentor less experienced users. Our emphasis lies not on knowledge transfer but on creating spaces in which colleagues can be both mentees and mentors around qualitative analyses. For example, an educator comes to a Data Jam and practices the basics of analyzing qualitative data with the software. At a later Data Jam, this colleague acts as a mentor for a new colleague. Over time, the colleague's analytic skills improve through both being mentored by more experienced colleagues and mentoring less experienced ones.

This approach is in accordance with findings from Ghimire and Martin's 2011 and 2013 studies of evaluation competence in Extension educators, through which the authors suggested that capacity building in evaluation should be driven by experiential learning that requires mentorship, teamwork, and safe spaces for collaboration. We designed the initiative to establish an analytic culture that emphasizes work in and across teams. We emphasize equality in analysis teams; we encourage the integration of different perspectives; and we model techniques for arriving at consensus when we analyze.

Examples of Data Jams

The Data Jam Initiative includes monthly 1-day Data Jams for individual colleagues and multiday Data Jams for research teams, and we have used the curriculum in a graduate methods class at the University of Wisconsin–Madison. Supported by an eXtension fellowship, we developed the prototype of an online learning platform around Data Jams for Extension professionals nationwide (https://fyi.uwex.edu/datajams/) that contains curriculum and training materials. At the time of this writing, University of Minnesota Extension, University of Washington Extension, and University of California Cooperative Extension also had conducted Data Jams. Since 2016, we have held close to 50 full days of training in multiple formats. These varying formats expand the kinds of organizational goals and purposes Data Jams can fulfill.

Cover Crops Research: Data Jams as a Framework for Supporting Analysis

Data Jams are our educational framework for supporting research teams in their use of qualitative data. After data have been collected by colleagues (a process that we also support separately), they contact us regarding analysis. Together, we then plan and schedule an initial Data Jam as a launching point for the project's analytic phase. After the first Data Jam, the team works independently on its data. We then meet again with the team for subsequent Data Jams, if needed. As projects progress, teams plan and conduct these analysis sessions increasingly independently.

In one such project, colleagues had conducted four 2-hr focus group sessions on the use of cover crops in Wisconsin. In these sessions, farmers were asked why they did or did not use cover crops and what kinds of resources they would need to expand their use of them. The three analysis team members had limited backgrounds in qualitative methods, as their training was primarily in agronomy and related fields. In the first Data Jam, we all read data from one of the focus group sessions aloud and developed a qualitative coding scheme. Through this initial Data Jam, our colleagues developed skills in analyzing data together as a group. They also learned how to use MAXQDA for their analysis. The products of that session were a framework and project management plan through which the team independently analyzed further data. When the first round of independent analysis was complete, the team met with us again for a second Data Jam. At that time, they continued and deepened their collaborative analysis, which included creating an inventory of core themes directly related to their research questions.

Thus, the team members were able to work collaboratively and critically with complex qualitative data, even without extensive backgrounds in qualitative evaluation. From this effort, the team presented its results at a conference and has begun work on multiple articles and reports. In addition, one of the analysis team members from the cover crops project went on to support a different project, where she helped colleagues by managing data, providing insights into the analytic process, and assisting with project management planning. These outcomes suggest that the Data Jam curriculum has provided participants with tools to create scholarly and informative work and to take on mentorship roles in qualitative evaluation.

Supplemental Nutrition Assistance Program: Data Jams as an Opportunity for Building Capacity in Evaluation

The Supplemental Nutrition Assistance Program at University of Wisconsin–Extension (FoodWIse) has undertaken efforts to build and strengthen its culture of analysis and evaluation. In the context of statewide training efforts regarding data input and writing, FoodWIse evaluation specialists conducted three separate Data Jams for 18 educators and FoodWIse coordinators. In this instance, the Data Jam format was a vehicle for engaging specific colleagues with impact narratives that were collected from all colleagues in FoodWIse. The goals were to create shared skills in analyzing these narratives and to create a common understanding of current trends and challenges in the FoodWIse program. Through analyzing data together, coordinators and educators learned and reflected on their collective work.

The educators and coordinators produced reports in which they identified and summarized common outcomes of FoodWIse programming, including how FoodWIse programming supports the work of coalitions and partner organizations in communities. These reports were shared with all FoodWIse colleagues. Additionally, these results were used in statewide specialist meetings, informing discussions around evaluation requirements and guidance for evaluation and reporting. End-of-session evaluations also indicated that working with data hands-on increased participants' understanding of our statewide reporting systems. This result suggests that the Data Jam format affords broader institutional benefits that go beyond practical experience with analytic processes and tools.

Statewide Programming Outcomes: Data Jams as a Mechanism for Producing Timely Reports to Leadership

Our associate deans requested the analysis of our colleagues' 2017 programming outcome narratives to support leadership in (a) building a new programming approach (designated as Extension Programs) based on an understanding of our existing programming inventory, (b) identifying efforts and outcome areas in misalignment with existing Extension Programs designs, and (c) identifying areas in which program development or planning needed to occur. The associate deans identified several analysts with deep knowledge of all programmatic areas to be involved in a cross-disciplinary analysis across five of our existing Extension Programs.

On the first day, we coanalyzed a data set as a group and built a draft coding scheme. In subsequent days, teams analyzed separately while regularly checking in on the analytic frame. Over the course of 5 days, we analyzed 285 narratives of programming outcomes. This analysis was the basis for five reports describing programming audiences, approaches, and outcomes, as well as the programs' contributions to federally planned programs. The reports have been used in onboarding new programming leaders, redesigning our statewide programming, and discussing our collective work with colleagues. The findings also were integrated into our 2017 Federal Report.

Because all colleagues involved in the analysis had been part of one or more Data Jams in the past, all were familiar with the curriculum's analytic process. They were used to working in analysis teams, and they were familiar with using the MAXQDA software. Thus, as an institution, we were able to tap into a ready pool of analysts with shared experience regarding qualitative analysis processes. This pool consisted of approximately 70 to 100 colleagues from different disciplines and locations across the state. The analyses that can be produced by such a group illustrate the impact and strength of the Data Jam Initiative: As an organization, we can respond quickly and efficiently to leadership's requests to analyze large amounts of data from outcome reports, plans of work, and other textual material.

A Response to the Data Imperative

Data Jams build capacity in analyzing textual data. However, the intent of this initiative goes beyond individual learners. In Data Jams, participants learn how to use the same software and perform the same processes, enabling them to analyze qualitative data as a team. Engaging in collaborative analyses using similar tools and analytic frameworks allows Extension colleagues to align with one another's concepts and viewpoints. In the cover crops example, the Data Jams allowed the team to identify main themes in their focus group results. In the FoodWIse example, colleagues created a common understanding of statewide programming. In the programming outcomes example, the Data Jam process generated a framework for understanding audiences, efforts, and outcomes of various statewide programs.

We see these outcomes as progress toward the development of a shared, consensus-based interpretation of institutional data and the development of shared understandings of programmatic issues and opportunities. These results in turn are prerequisites for being able to analyze and use large existing data sets on an institutional level in service of communication to stakeholders, program development, and organizational growth.

Yet even with such an approach, data use will continue to be a challenge for Extension institutions; increasing amounts of data only increase the challenge. Although we are encouraged by the impact of the Data Jam Initiative in our institution, we emphasize that it is not a quick fix for structural issues around data collection and analysis. The initiative addresses the core needs for evaluation capacity building with regard to professional development, which are (a) training, (b) technical assistance, (c) collaborative evaluation projects, (d) mentoring and coaching, and (e) communities of practice (Taylor-Powell & Boyd, 2008, p. 58). However, as Taylor-Powell and Boyd (2008) have pointed out, success of evaluation capacity building in Extension is also dependent on institutional resource allocation and support, as well as on organizational leadership, demand, incentives, structures, policies, and procedures.

Author Note

Author Ellen Bechtol was no longer affiliated with University of Wisconsin–Extension at the time of this article's preparation.

Acknowledgments

The Data Jam Initiative is a collaborative effort led by us that began in February 2016. Major contributions to the conceptualization and execution of the initiative come from former members of the Program Development and Evaluation unit, current members of the Office of Program Support Services, and members of the Qualitative Analysis Community of Practice in Cooperative Extension at University of Wisconsin–Extension: Eloisa Gomez, Maria Habib, Larry Jones, Jennifer Kushner, Jeffrey Lewis, Bridget Mouchon, Samuel Pratsch, Joe VanRossum, and Kerry Zaleski. We also would like to thank our colleagues Matthew Calvert, Jay Dampier, and Jenna Klink for providing extensive feedback on earlier versions of this article. Special thanks go to the cover crops analysis team of Liz Binversie, Heidi Johnson, and Kevin Shelley; Josset Gauley, Jen Park-Mroch, and the FoodWIse Data Jammers; and the programming outcomes analysis team of Josset Gauley, Maria Habib, Lorre Kolb, Travis Olson, Paul Roback, and Kadi Row. And, of course, we would like to thank all the brave Data Jammers for going with us on this exciting journey!

References

Andrews, M. (1983). Evaluation: An essential process. Journal of Extension, 21(5). Available at: https://www.joe.org/joe/1983september/83-5-a1.pdf

Baughman, S., Boyd, H. H., & Kelsey, K. D. (2012). The impact of the Government Performance and Results Act (GPRA) on two state Cooperative Extension Systems. Journal of Extension, 50(1), Article 1FEA3. Available at: https://www.joe.org/joe/2012february/pdf/JOE_v50_1a3.pdf

Fitzpatrick, C., Gagne, K. H., Jones, R., Lobley, J., & Phelps, L. (2005). Life skills development in youth: Impact research in action. Journal of Extension, 43(3), Article 3RIB1. Available at: https://www.joe.org/joe/2005june/rb1.php

Fruchterman, J. (2016, Summer). Using data for action and for impact. Stanford Social Innovation Review, 30–35. Retrieved from https://ssir.org/articles/entry/using_data_for_action_and_for_impact

Ghimire, N. R., & Martin, R. A. (2011). A professional competency development model: Implications for Extension educators. Journal of International Agricultural and Extension Education, 18(2). doi:10.5191/jiaee.2011.18201

Ghimire, N. R., & Martin, R. A. (2013). Does evaluation competence of Extension educators differ by their program area of responsibility? Journal of Extension, 51(6), Article 6RIB1. Available at: https://www.joe.org/joe/2013december/rb1.php

Gibbs, G. R. (2014). Using software in qualitative analysis. In U. Flick (Ed.), The SAGE handbook of qualitative data analysis (pp. 277–295). London, UK: Sage. Retrieved from http://eprints.hud.ac.uk/14873/

Inwood, S. (2015). Opportunities for Extension: Linking health insurance and farm viability. Journal of Extension, 53(3), Article 3FEA1. Available at: https://joe.org/joe/2015june/a1.php

Jemison, J. M., Jr., Wilson, L., & Graham, J. (2004). Effecting land-use changes through education and implementation: Assessing the effectiveness of the Watershed Stewards Program. Journal of Extension, 42(3), Article 3RIB4. Available at: https://www.joe.org/joe/2004june/rb4.php

Lamm, A., & Israel, G. (2013). A national examination of Extension professionals' use of evaluation: Does intended use improve effort? Journal of Human Sciences and Extension, 1(1). Retrieved from https://www.researchgate.net/publication/299412532_National_Examination_of_Extension_Professionals%27_Use_of_Evaluation_A_National_Examination_of_Extension_Professionals%27_Use_of_Evaluation_Does_Intended_Use_Improve_Effort

Preston, J. A., Chastine, J., O'Donnell, C., Tseng, T., & MacIntyre, B. (2012). Game jams: Community, motivations, and learning among jammers. International Journal of Game-Based Learning, 2(3), 51–70. https://doi.org/10.4018/ijgbl.2012070104

Rennekamp, R. A., & Arnold, M. E. (2009). What progress, program evaluation? Reflections on a quarter-century of Extension evaluation practice. Journal of Extension, 47(3), Article 3COM1. Available at: https://www.joe.org/joe/2009june/comm1.php

Rennekamp, R. A., & Engle, M. (2008). A case study in organizational change: Evaluation in Cooperative Extension. New Directions for Evaluation, 2008(120), 15–26. https://doi.org/10.1002/ev.273

Schmieder, C. (2014). Zur Wahl von QDA-Software. Hintergründe, Funktionalität, Hilfestellungen [On choosing QDA software: Backgrounds, functionality, guiding aids]. In J. Kruse (Ed.), Einführung in rekonstruktive Interviewforschung. Weinheim, Germany: Juventa.

Schmieder, C. (2018). Constructing and using a large organizational dataset: Identifying equity practices in an institutional civil rights database. In N. Woolf & C. Silver (Eds.), Qualitative analysis using MAXQDA: The five level QDA method. New York, NY: Routledge.

Taylor-Powell, E., & Boyd, H. H. (2008). Evaluation capacity building in complex organizations. New Directions for Evaluation, 2008(120), 55–69. https://doi.org/10.1002/ev.276

Van Offelen, S. J., Schroeder, M. M., Leines, D. R., Roth-Yousey, L., & Reicks, M. M. (2011). Go Wild with Fruits and Veggies: Engaging children in nutrition education and physical activity with animal characters. Journal of Extension, 49(2), Article 2RIB6. Available at: https://joe.org/joe/2011april/rb6.php