The Journal of Extension - www.joe.org

February 2014 // Volume 52 // Number 1 // Feature // v52-1a7

Integrating Digital Response Systems Within a Diversity of Agricultural Audiences

Abstract
Extension educators have new computer-assisted tools, such as audience response systems (clickers), for increasing educational effectiveness and improving assessment by facilitating client input. From 2010 to 2012, 26 sessions involving 1,093 participants in six diverse client categories demonstrated wide audience acceptance and the suitability of clickers in agricultural and horticultural programming. Farmers, ag students, and Master Gardeners provided anonymous information using wireless clickers, and the analyzed data was shared in each session. Such user-friendly technology improved pedagogy through rapid and sustained learner engagement and enhanced peer-to-peer instruction. Pre-post assessment and re-teaching techniques provided documentation for group demographics, educational evaluation, and programmatic impacts.


William Sciarappa
Associate Professor
Rutgers, The State University - NJAES
New Brunswick, New Jersey
Sciarappa@njaes.rutgers.edu

Vivian Quinn
Agricultural Assistant
Rutgers Cooperative Extension - Monmouth County
Freehold, New Jersey 07728
vquinn@njaes.rutgers.edu

Introduction

Extension educators are dedicated to creating changes in behavior and documenting programmatic impacts. Our university Extension faculty incorporate logic models to plan program pathways, including how to assess various teaching methods and evaluate educational outcomes. Pertinent questions and answers about client adoption, attitudes, and knowledge gains are important indicators of the success of an educational outreach. Traditionally, numerous manual surveys and polls have been used in this regard. These evaluative methods can be tedious, time-consuming, educationally disruptive, and/or have poor response rates. Educators now have a time-saving tool to evaluate programming and assess client needs through wireless computer-assisted systems (Gustafson & Crane, 2005; MacGeorge et al., 2008).

Many Extension educators first utilized or experienced audience response systems (ARS) at our National Association of County Agricultural Agents (NACAA) meeting in 2009, where session moderators incorporated this new technology into several professional improvement sessions. Program and presenter assessments were quickly polled via computers equipped with PowerPoint® software along with supplemental software from Turning Technologies, Inc.® (Turning Technologies website). Utilizing ARS at these sessions was a formative introduction to a potentially powerful new technology.

The roles of Extension educators continue to rapidly evolve and modernize. Integrating wireless, digital response systems (known as "clickers" and by acronyms such as ARS for audience response systems, SRS for student response systems, PRS for personal response systems, and CRS for classroom response systems) has demonstrated efficiency and educational improvement, especially in large lecture halls, for taking attendance, evaluating group opinions, determining depth of experience, assessing level of knowledge, and quantifying amount of learning (Barker & Killian, 2011; Hatch, Jensen, & Moore, 2005).

This article outlines an Extension program that applied and tested such promising digital technology from 2010 to 2012 among a wide diversity of agricultural audiences and topics. Participants attended commercial vegetable meetings, pesticide training sessions, Master Gardener classes and conferences, primary and secondary school science seminars, university guest lectures in International Agriculture and Horticultural Science, workshop presentations in educational technology (NACAA, Rutgers University-School of Environmental & Biological Sciences, Turning Technologies, Inc.), and a university class in Organic Farming.

Background

As Extension and specialist staff numbers dwindle in the new national economy, agents and educators are being asked to do "more and more" with "less and less." Often they are "closing the ranks" and engaging audiences that differ significantly from their previous clientele experiences. Faced with a diversity of frequently changing audiences, an educator first needs to know whom they are instructing in order to tailor the talk to the crowd. Clickers were said to meet these instructional needs (Sevian & Robinson, 2011). In utilizing ARS technology with new clients, the educator can quickly "break the ice" and utilize valuable feedback as to the group's interests, occupations, demographics, and educational levels. Knowing one's audience has been shown to improve educational outreach in any area and will assist Extension staff in serving an increasingly diverse audience. Such specific information enhances educational effectiveness. Previous work also suggested that ARS helps Extension personnel build programmatic impact data, document professional improvement, and aid the promotional process (Bird & McClelland, 2010; Salmon & Stahl, 2005).

Purpose

The general purposes of the extended case study of an Extension outreach program presented here were to:

  • Integrate new digital ARS technology into a diversity of Ag Science sessions.
  • Document and assess the utility of those tools to assist and test learning in real time.
  • Determine suitability for the range of Extension classes and gauge participant satisfaction.

Underlying objectives were to:

  • Create demographic and content questions designed for different categories of clients.
  • Assess opportunities to quickly re-teach areas of deficient factual knowledge or comprehension.
  • Measure typical Extension impacts, such as the level of knowledge gained and adoption.

A final milestone was to share and test this instructional tool and its educational concepts with other Extension educators in the agricultural and horticultural arena.

Methods

One hundred ARS responder units (ResponseCard®) were obtained from Turning Technologies® and utilized with a USB receiver unit inserted into either a Sony Vaio® or Dell® computer running PowerPoint® 2007. Audience response system technology (Figure 1, photo Sciarappa) was used whenever appropriate and convenient in various Extension sessions among the different categories of audiences. Survey questions concerning participant demographics, class content, current issues, learning assessment, and adoption were created by the instructor for classroom polling and pre-post assessment.

Questions were developed based upon relevancy to the specific group and the specific topics being taught. More academic questions were used for the college students, more applied agriculture questions were used for the farmers, and more horticultural questions were used for the Master Gardeners. For farmers, response questions investigated sensitive issues of age, acreage, attitudes, crops, crop damage, and income. For Master Gardeners, the purpose was to share experiences and foster socialization by publicly comparing background interests and knowledge. For university students, priorities were to more effectively reveal the current level of knowledge, opinions, academic levels, and travel logistics. For younger students, the main goal was to immediately engage emotionally, create a comfortable learning environment, and then rapidly develop and sustain intellectual attention. These customized questions remained the same for each session for each specific group of learners. The questions were often used before the educational session and immediately afterwards to assess knowledge gain and provide an opportunity for re-teaching.

Figure 1.
Agriculture and Horticulture Students Enjoy Anonymous Feedback Through Wireless Audience Response Devices (ARS)


A minimum of three session replications for each of five categories of learners over a 3-year period was the goal. Response results were always taken anonymously, and then the cumulative results were shared with the audience. The sharing of this analyzed data and the various opinions and answers was designed to stimulate more peer-to-peer interaction within the class, with the educator facilitating the discussion (Smith et al., 2009). Analyzed data was displayed on large projection screens as bar charts or pie charts showing the percentage of correct and incorrect answers received in each of the three to seven multiple-choice categories. These results were concurrently saved to the computer hard drive in files named according to the date of the specific class.
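To make this tally-and-display step concrete, the short Python sketch below shows one way anonymous responses could be compiled into the on-screen percentages. It is purely illustrative: in these sessions the Turning Technologies® software performed this compilation automatically within PowerPoint®, and every name in the sketch is hypothetical.

```python
# Minimal sketch of compiling anonymous clicker answers into percentages.
# Illustrative only; the actual sessions used TurningPoint® inside PowerPoint®.
from collections import Counter

def summarize_responses(responses, choices):
    """Tally anonymous clicker answers into a percentage for each choice."""
    counts = Counter(responses)
    total = len(responses)
    return {choice: 100.0 * counts.get(choice, 0) / total for choice in choices}

# Example: 30 anonymous answers to a four-choice question
responses = ["A"] * 14 + ["B"] * 8 + ["C"] * 8
percentages = summarize_responses(responses, ["A", "B", "C", "D"])
for choice in ["A", "B", "C", "D"]:
    pct = percentages[choice]
    # A text bar chart standing in for the projected bar/pie graphic
    print(f"{choice}: {pct:5.1f}%  {'#' * int(pct // 2)}")
```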

A suitability scale was constructed by the instructor to rate the overall utility of ARS in various learning situations (as suggested by Shaffer & Collura, 2009) and represents a combination of educational factors and rubrics emphasized by current leaders in the field of ARS technology. A scale of 1 to 100, with 100 being best, was divided into quartiles: four main scoring categories of 1 to 25 points each (Table 1). The categories, devised from previous training in lesson plan components and evaluation as well as outreach experience, were audience satisfaction, instructor satisfaction, pedagogical support, and Extension utility.

Table 1.
Overview of Extension Clientele Using ARS

Category                   Sessions  People  Average Age  Student Satisfaction  Instructor Satisfaction  Pedagogical Synergy  Extension Utility  Total Score
Farmers                        6       430      54.3             21.7                   23.1                    23.0                23.7             91.5
Ag Students                    4       113      19.9             23.5                   23.3                    24.0                23.8             94.5
Ag Students (Guest)            6       150      20.7             22.5                   22.5                    23.5                23.8             92.3
Master Gardeners               5       165      48.8             24.0                   24.0                    24.0                24.0             96.0
Schools (Grades 5 & 12)        2       150      12.5             22.5                   22.5                    24.0                24.0             94.5
Faculty Peers                  3        85      39.7             22.7                   22.7                    24.1                24.3             94.3
Grand Totals                  26      1093      28.3             22.7                   22.7                    23.9                23.9             93.6
Note: A combined scale of 100 total points was divided into four main scoring sub-categories of 1 to 25 points each: student satisfaction, instructor satisfaction, pedagogical synergy, and Extension utility.

A general rubric for each separate category provided baseline guidance and consistency, with scoring ranges of 21-25 points in the "A" range, 16-20 points in the "B" range, 11-15 in the "C" range, etc. The audience satisfaction rubric considered parameters such as ease of use and the value of the information sharing, polling, and fact checking compiled from ARS surveys, Rutgers University online evaluations, general survey comments, and frequent conversations (MacGeorge et al., 2008). The rubric for instructor satisfaction, determined by the educator, considered logistics, mechanics, utility, and informational value. The pedagogical support category ranked parameters such as class dynamics, discussion enhancement, peer-to-peer instruction, and re-teaching opportunities, as learned at seminars by Harvard Professor Eric Mazur and Dr. Jeff Borden, Director of the Center for Online Learning at Pearson. The Extension utility category considered the ease of evaluation and the value of pre-post testing in measuring knowledge gain and documenting programmatic impact.
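As a worked illustration of this scoring arithmetic, the brief Python sketch below sums the four 25-point sub-categories into the 100-point composite and maps each sub-score to its letter range. The function names and dictionary keys are hypothetical conveniences, not part of the study's procedure.

```python
# Sketch of the suitability-score arithmetic described above.
# Hypothetical helper code, not part of the study's actual workflow.

def letter_range(sub_score: float) -> str:
    """Map a 1-25 sub-category score to its letter range per the rubric."""
    if sub_score >= 21:
        return "A"  # 21-25 points
    if sub_score >= 16:
        return "B"  # 16-20 points
    if sub_score >= 11:
        return "C"  # 11-15 points
    return "below C"

def composite_score(sub_scores: dict) -> float:
    """Sum the four 25-point sub-categories into the 100-point total."""
    return sum(sub_scores.values())

# Example using the Farmers row from Table 1
farmers = {
    "student satisfaction": 21.7,
    "instructor satisfaction": 23.1,
    "pedagogical synergy": 23.0,
    "extension utility": 23.7,
}
print(round(composite_score(farmers), 1))                      # 91.5
print({name: letter_range(s) for name, s in farmers.items()})  # all "A"
```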

Results

Program Participants

Participants who were surveyed with ARS totaled 1,093, divided among six different categories of learners in 26 sessions. These selected sessions represented 35.6% of the agent's 73 agricultural presentations from 2010 to 2012. Class size ranged from 24 to 75, except in one case with class size exceeding 100, where the units were utilized in a shared mode that allowed two users to enter independent responses with one responder. The diversity of agriculturally oriented classes exceeded the initial goal, and ARS integration was seamless. Classes included three commercial vegetable grower meetings, three pesticide applicator trainings, five Master Gardener classes, two primary and secondary school sessions, three educational technology workshops (at NACAA, RU-SEBS, and Turning Technologies), six university guest lectures in Horticultural Science and International Agriculture, and four undergraduate university classes in Organic Farming.

ARS Suitability

From 2010 to 2012, the Farmer category of agricultural sessions had six samples with 430 respondents. Three sessions were on vegetable production, and three were on pesticide training. The average age ranged from 53 to 55 years old. ARS suitability on the composite score system was rated as highly positive, averaging 91.5%, placing this category in the "A" zone along with all five of the following categories (Table 1).

The Ag Student category for the Organic Farming class, where the agent was the primary instructor, had four samples with 113 respondents. The average student age was 20 years old (mostly juniors and seniors), and ARS ratings averaged 94.5%. The Ag Student category where the agent was a guest lecturer had six sessions totaling 150 people, with an average class age of 21 (mostly seniors). ARS suitability again ranked in the "A" zone, averaging 92.3%.

The Master Gardener group had five sessions in 3 years with 165 respondents. Their average age was 49 years old, and ARS suitability was 96.0%. The two local schools in Monmouth County totaled 150 students in the Schools category but were composed of quite different ages, averaging 16 and 9 years old. The older group had the lowest suitability average of 89%, while the youngest had the highest at 100%, possibly because the High Tech High School students were not as impressed with the simple "high tech" devices as were the elementary students or the Master Gardeners. The last category of Peers averaged 39.7 years of age, with a total of 85 participants. Their suitability ratings of ARS technology also reached the "A" zone, with an average value of 94.3%.

In all six client categories, student satisfaction, instructor satisfaction, pedagogy, and Extension utility were ranked at the grade "A" level, averaging 93.6%, despite a wide range of ages (9-55 years old), agricultural interests, occupations, and life stages. Additionally, many interested professionals were exposed to these outreach sessions and intended to pursue ARS systems. A few educators borrowed our ARS system for their own programs and reported successful experiences. Through this direct sharing of technical knowledge and equipment, the approach was adopted by three other agricultural agents, one Extension specialist, and two county trainers. As a whole, these various results met the study objectives and substantiated the wide utility, suitability, and practicality of ARS systems throughout a diversity of agricultural Extension audiences.

Audience Responses

In all cases except one class, over 95% of the participants surveyed were first-time users of ARS, and they quickly adapted to this new technology. Participants reported that these hand-held clicker devices were easy and entertaining to use, without any problems. Demographic polling data enabled the instructor to proactively adjust teaching style and the level of technical instruction, meeting key study goals of increasing teaching effectiveness and knowledge gain. Empirical data was publicly analyzed to show knowledge gains or reveal learning deficiencies through wireless testing before and after instruction. Selected examples of learning results related to the study objectives from the three main Extension categories—Farmers, University Ag Students, and Master Gardeners—are provided in Figures 2-5. As a whole, this data shows the flexibility of the ARS software and the data compilation process to document knowledge gain and illustrate what the learning audience views.

Figure 2.
Farmer Meeting—Anonymous Single Answer for Public Viewing


Figure 3.
Vegetable Conference—Multiple Answers for Farm Crops


The compiled data shown in Figures 2 and 3 compare ARS analysis of a single-answer question with that of a multiple-answer question. Figure 2 shows that the attending farmers were a mixed group of small-, medium-, and large-sized farms. Figure 3 shows the primary crops being grown, compared as percentages and as slices of the pie chart. Both survey results allow ag educators to focus their remarks on the specific audience.

Figure 4.
Pesticide Training Session—70 Participants—Multiple-Choice Question: "To determine the correct respiratory protection to wear when applying pesticides, ___."


Figure 5.
Pesticide Training Session—70 Participants—True/False Question: "When applying pesticides, it's OK to re-use the chemical cartridges on a respirator for several days in a row."


The compiled data in Figures 4 and 5, from our post-test, gauged group learning after instruction. Figure 4 shows that 98% of applicators answered correctly—education accomplished. On the same post-test, Figure 5 shows that 36% answered incorrectly on a true/false question with a 50/50 chance of being correct. Time to re-teach. Re-teaching boosts knowledge gain, which meets the Extension and study goals of increasing and quantifying teaching effectiveness.
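The re-teach decision implied by Figures 4 and 5 can be expressed as a simple threshold rule, sketched below. The 20% cutoff and all names are assumptions made for illustration; the study set no formal numeric threshold.

```python
# Illustrative sketch of the re-teach decision drawn from Figures 4 and 5.
# The 20% threshold is an assumption for illustration, not a value from the study.

RETEACH_THRESHOLD = 20.0  # re-teach when more than 20% of answers are wrong

def needs_reteaching(percent_incorrect: float) -> bool:
    """Flag a question whose post-test error rate is too high to move on."""
    return percent_incorrect > RETEACH_THRESHOLD

post_test_errors = {
    "respirator selection (Figure 4)": 2.0,   # 98% answered correctly
    "cartridge re-use (Figure 5)": 36.0,      # 36% answered incorrectly
}
for question, pct_wrong in post_test_errors.items():
    action = "re-teach" if needs_reteaching(pct_wrong) else "move on"
    print(f"{question}: {pct_wrong:.0f}% incorrect -> {action}")
```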

Table 2.
Example of Pre- and Post-Test Results

Question 1: In most instances, soil sampling for fertility testing should be taken to a depth of:

Answer                                  Pre-Test (n)  Pre-Test (%)  Post-Test (n)  Post-Test (%)
6 to 8 inches (correct)                      14          46.6%           28           95.5%
2 to 4 inches                                 8          26.6%            0            0%
Down to 12 inches                             8          26.6%            1            3.4%
Only surface samples should be taken          0           0%              0            0%
TOTALS                                       30         100%             29          100%

Question 2: The soil profile horizon found between the topsoil and parent material is known as:

Answer                                  Pre-Test (n)  Pre-Test (%)  Post-Test (n)  Post-Test (%)
Bedrock                                       1           3.4%            2            6.9%
Subsoil (correct)                            13          44.8%           26           89.6%
S-horizon                                     9          31.0%            1            3.4%
Mid-level                                     6          20.6%            0            0%
TOTALS                                       29         100%             29          100%

Table 2 shows an example of pre-post test results. For questions 1 and 2, the percentage of the class providing correct answers was 46.6% and 44.8%, respectively, in the pre-test, while the post-test correct results were 95.5% and 89.6%. These same questions can then appear in the final exam to measure long-term retention. This ARS approach shows multiple advantages over the traditional system of passing out forms, providing pencils or pens, and correcting papers afterwards. In addition to these time-saving advantages and the increased convenience in grading results, the students get one final and very important reinforcement of learning goals, a pedagogical principle that is often not incorporated in traditional methods. With a few minutes of subsequent class dialogue, knowledge gain is greatly improved, which is our Extension impact measure for the teaching task. This experience clearly demonstrates how such technology can facilitate teaching effectiveness, precisely measure group learning, and use results to solidify take-home messages.
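The knowledge-gain arithmetic behind Table 2 can be sketched as follows. This hypothetical helper code simply reproduces the percentage calculations for question 2; the study itself compiled these figures with the ARS software.

```python
# Sketch of the pre/post knowledge-gain calculation behind Table 2.
# Hypothetical helper code, not the actual ARS reporting workflow.

def percent_correct(correct: int, total: int) -> float:
    """Percentage of respondents choosing the correct answer."""
    return 100.0 * correct / total

# Question 2 from Table 2: "Subsoil" is the correct answer
pre = percent_correct(13, 29)   # ~44.8% correct before instruction
post = percent_correct(26, 29)  # ~89.7% correct after instruction
gain = post - pre               # ~44.8 percentage points of knowledge gain

print(f"pre-test {pre:.1f}%, post-test {post:.1f}%, gain {gain:.1f} points")
```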

Discussion

The utilization of ARS responders in these diverse learning situations clearly and consistently served to improve educational efforts in two basic ways: by providing the Extension educator or agent with an overview of the audience and by quickly providing feedback to the participants as to their comprehension level. This mutual feedback function developed greater educator-to-student interaction and fostered more student-to-student, or peer-to-peer, learning (Shaffer & Collura, 2009; Mazur, 1991-2012). Quick and anonymous sharing of the analyzed data was not intrusive to the educational flow and actually enlivened the human dynamics of the classroom. The analysis also stimulated more creative group discussion in which minority thoughts from shy participants had equal access to the digital stage. This complete feedback from the entire class population was clearly helpful to the instructor in profiling the audience and targeting topics for re-teaching. This empirical evidence and objective experience formed the factual basis for assessing ARS integration into Extension outreach (Table 1).

Conclusion

The educational studies reported here demonstrated that new digital student response tools are easily integrated within a diversity of Extension classes. The ARS system allowed an educator to quickly gauge the diversity of both adult and pre-adult class levels with a preliminary survey and pre-test, and then to accurately quantify class learning with a post-test given within class time. The educational process could be flexibly adjusted according to class experience. The quantification of knowledge gained, adoption, or behavior change provided empirical data to support programmatic impact needs.

The ARS devices were just as effective in satisfying class needs, as evidenced in student evaluations. Students eagerly utilized an intriguing and interactive piece of new technology. Having these wireless remote responders in participants' hands rapidly engaged and bonded any new class. "Clickers" worked consistently well within a PowerPoint® framework, providing an instant graphical analysis of anonymous classroom responses displayed to the entire class on a large screen. The hardware and software options were cost effective, easily portable, very dependable, and usable by other colleagues.

Classroom instruction with the clickers encompassed a wide diversity of agricultural audiences in a quick and convenient manner, regardless of age, interests, or occupation. Creating appropriate survey questions was a relatively simple matter. Instant analysis of student responses was quite valuable and insightful, especially with data representing demographics and pre-post evaluations. This feedback fostered positive group dynamics and allowed non-linear instruction. Pre- and post-testing was quite a time-saver that documented key Extension impacts, i.e., knowledge gain and anticipated behavioral change. Ample opportunities were found to re-teach deficient areas of factual knowledge and comprehension. These findings, extracted from a diversity of classroom case studies, support similar positive claims by several other non-agricultural but science-oriented educators (Ribbens, 2007; Mazur, 1991-2012). The initial investment of time, effort, and money to integrate this technology into a diversity of ag-science courses was quite rewarding to both the educator and the learner.

Educational Implications

During my 40-year career in ag-science presentations, I have utilized an abundance of new educational technology. Some systems quickly come and go, while others have remained standards for quite a while. Judging from my basic pedagogical training as a certified high school science teacher, it is my opinion that ARS technology will last because the system can immediately improve the educational effectiveness and programmatic impact of almost any professional Extension educator grounded in technical content and basic outreach techniques.

Such digital devices offer expanded opportunities for Extension agents seeking to increase interactive outreach and to invoke humanistic education principles. This tool is really not an issue of new versus old technology or traditional versus progressive philosophy, but simply an effective application for Extension education in general (Dewey, 1938). For Extension educators, applications seen in these sessions worked equally well in agricultural conference centers and university classrooms. Future applications in the field or online will develop more fully as smartphone adoption increases within the farming community. Extension outreach will continue to expand with mobility into the field environment, allowing remote polling in conjunction with farming webinars and real-time interaction at a distance.

Acknowledgments

The educational program described here was partially supported by a Rutgers University School of Environmental and Biological Sciences grant (ICF for New Digital Technology). Training tips were provided by Turning Technologies®, Inc. representatives and the technical service group. Special thanks to Agricultural Assistants Brian Hulme and Kevin Soldo.

References

Barker, W., & Killian, E. (2011). Tips and tools: The art of virtual program evaluation – measuring what we do with pizzazz. Journal of Extension [On-line], 49(1), Article 1TOT4. Available at: http://www.joe.org/joe/2011february/tt4.php

Bird, C., & McClelland, J. (2010). Have you used clickers in programming? Journal of Extension [On-line], 48(5), Article 5TOT9. Available at: http://www.joe.org/joe/2010october/tt9.php

Dewey, J. (1938). Experience and education. New York, NY: Kappa Delta Pi.

Gustafson, C., & Crane, L. (2005). Polling your audience with wireless technology. Journal of Extension [On-line], 43(6), Article 6TOT3. Available at: http://www.joe.org/joe/2005december/tt3.php

Hatch, J., Jensen, M., & Moore, R. (2005). Manna from heaven or "clickers" from hell. Journal of College Science Teaching, 34(7), 36-39.

MacGeorge, E., Homan, S. R., Dunning, J. B., Jr., Elmore, D., Bodie, G., Evans, E., Khichadia, S., Lichti, S. M., Feng, B., & Geddes, B. (2008). Student evaluation of audience response technology in large lecture classes. Educational Technology Research & Development, 56(2), 125-145.

Mazur, E. (1991-2012). The Mazur Group: Peer instruction, technology and education, learning science, Project Galileo. Retrieved from: http://mazur.harvard.edu/education/educationmenu.php

Ribbens, R. (2007). Why I like clicker personal response systems. Journal of College Science Teaching, 37(2), 60-62.

Salmon, T. P., & Stahl, J. N. (2005). Wireless audience response system: Does it make a difference? Journal of Extension [On-line], 43(3), Article 3RIB10. Available at: http://www.joe.org/joe/2005june/rb10.php

Sevian, H., & Robinson, W. E. (2011). Clickers promote learning in all kinds of classes—small and large, graduate and undergraduate, lecture and lab. Journal of College Science Teaching, 40(3), 14-18.

Shaffer, D. M., & Collura, M. J. (2009). Evaluating the effectiveness of a personal response system in the classroom. Teaching of Psychology, 36(4), 273-277.

Smith, M. K., Wood, W. B., Adams, W. K., Wieman, C., Knight, J. K., Guild, N., & Su, T. T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323(5910), 122-124.

Turning Technologies, Inc. (2013). Assessment delivery & data collection solutions: Solutions for higher education. Retrieved January 2013 from: http://www.turningtechnologies.com