The Journal of Extension - www.joe.org

October 2012 // Volume 50 // Number 5 // Tools of the Trade // v50-5tt5

Testing a New Generation: Implementing Clickers as an Extension Data Collection Tool

Abstract
Using clickers to gauge student understanding in large classrooms is well documented. Less well known is the effectiveness of using clickers with youth for test taking in large-scale Extension programs. This article describes the benefits and challenges of collecting evaluation data using clickers with a third-grade population participating in a childhood obesity prevention program.


Sondra M. Parmer
Project Manager, Nutrition Education Program
Alabama Cooperative Extension System
parmesm@auburn.edu

Greg Parmer
Master IT Specialist
Alabama Cooperative Extension System
gparmer@aces.edu

Barb Struempler
Professor
Department of Nutrition, Dietetics and Hospitality Management
struebj@auburn.edu

Auburn University
Auburn, Alabama

Introduction

Understanding the impact of programming is an essential function of Extension efforts (Bailey & Deen, 2002; Franz & Townson, 2008; Stup, 2003). While Extension professionals know that evaluating programming is important, they may be less familiar with the tools available to evaluate programs resourcefully. "Clickers" may offer an effective method for collecting impact data on Extension programming.

Bird and McClelland (2010) introduced clickers for use in Extension programming. They theorized that participant names could be paired with clicker device numbers to overcome certain data collection limitations. To test the feasibility of this suggestion, the study reported here examined using clickers to collect individual impact data at multiple time points and with large numbers of participants and data points.

Project Description

In Alabama, Extension implemented an ambitious evaluation strategy for a childhood obesity prevention initiative, Body Quest: Food of the Warrior. Body Quest was a 17-week intervention that collected pre-, post-, and weekly nutrition evaluation data (behavior, preference, and intent) from more than 2,000 third graders. Each student responded to 500 questions over the course of the intervention, producing more than one million data points to collect and analyze. In a typical Extension evaluation, each student would respond to questions on a paper test, and the responses would then be keyed into an electronic format for grading and analysis. Given the scope of Body Quest, a more efficient method was needed to reduce the burden at the county level, and clickers were chosen to collect the test data.

Why Use Clickers?

Clickers are traditionally used in college classrooms to help instructors gauge students' knowledge of a subject (Wood, 2004; Bergtrom, 2006). Using clickers to collect test data from youth, however, is not well documented in the literature. Although collecting and transferring data with clickers is a novel process, Body Quest researchers chose them because:

  • A large amount of data could be collected and electronically transferred without a substantial manpower commitment at the county level.
  • A portable system that required limited setup time was needed. Extension educators were moving between classrooms each day, with no time between classes.
  • No Internet access was required at the data collection point.
  • Clickers would integrate a "game approach" that was anticipated to engage the students more than traditional pencil and paper testing.

Data Collection Process

To use clickers as a data collection mechanism, the following process was designed. Before classroom use, the educator assigned a specific clicker to each student. At each class meeting, students answered evaluation questions by pressing the number on their assigned clicker that corresponded to their answer on a written assessment tool. A handheld receiver carried by the educator recorded these answers in the classroom. After class, the educator downloaded the student responses into a Turning Technologies® software package (ResponseCard AnyWhere Desktop) that acted as a data repository. A second Turning Technologies® software package (TurningPoint) was then used to join the students' response data to their demographic data, as illustrated in the sketch below. After this joining, all data were electronically uploaded to a central SharePoint site for analysis.
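
To make the joining step concrete, the minimal sketch below pairs a receiver export with a clicker roster. It is an illustration only: the actual join was performed inside TurningPoint, and the file names and column names (device_id, student_id, and so on) are assumptions rather than the real export schema.

    # Hypothetical sketch of the joining step; column names are assumed,
    # not the actual TurningPoint/ResponseCard export schema.
    import pandas as pd

    # Export from the handheld receiver: one row per student response.
    responses = pd.read_csv("session_export.csv")   # device_id, question, answer

    # Roster built when clickers were assigned: one row per student.
    roster = pd.read_csv("clicker_roster.csv")      # device_id, student_id, school

    # Pair each response with the student assigned to that clicker.
    merged = responses.merge(roster, on="device_id", how="left")

    # Flag responses from unassigned or misread device IDs for follow-up.
    unmatched = merged["student_id"].isna().sum()
    if unmatched:
        print(f"{unmatched} responses could not be matched to a student")

    # Write the joined file, ready to upload centrally for analysis.
    merged.to_csv("session_joined.csv", index=False)

Keying the join on the clicker's device ID is what makes the pre-assignment step essential: a response from an unassigned or mistyped device ID cannot be attributed to a student.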

Lessons Learned

After a school year of collecting data with clickers, many lessons were learned. The use of clickers with youth for testing met expectations, and the results show that clickers offer a viable method for data collection in Extension programs.

Clickers allowed large amounts of data to be collected with minimal manpower. This time-saving data management process is estimated to have freed 25% more of each educator's time to invest in direct classroom education. This time savings was the greatest benefit of the process.

Portability requirements also were met, and classroom setup for assessment was relatively simple. This was primarily due to the choice of a handheld receiver rather than a USB receiver, which requires a computer as the collection device. Moreover, students responded to questions on written tests rather than questions shown in a PowerPoint presentation, which further reduced the educator's setup time in the classroom.

Because Internet access in rural public schools is limited, a process that did not depend on the Internet was key. A stand-alone system was implemented so that data could be uploaded when the educator returned to the office, where Internet access was available.

Even in a testing situation, students and teachers viewed the clickers as digital toys that were "fun to use." Adding a technology aspect to testing helped create a more engaging tool for today's digital natives.

Limitations

Using clickers had limitations. The first resulted from using a handheld receiver: demographic data could not be matched to student responses without an additional software package, and working between two software packages was cumbersome for county educators. However, the handheld receiver did allow educators to move freely and quickly between classrooms.

Second, third-grade students faced a learning curve in applying the finger pressure needed to press the buttons. This limitation may not hold for older students, who are more physically developed and may possess stronger fine motor skills.

Third, the potential for "glitches" is inherent in any technology used in a classroom. Getting Extension educators comfortable with the technology was an additional step that traditional pencil and paper testing would not require.

Summary

Clickers provided an effective and replicable method for collecting and transferring data to evaluate a large-scale, statewide Extension program. Future efforts should explore using clickers for testing with older students and using a USB receiver to address some of the limitations experienced in this program. This original process has significant ramifications for Extension evaluations and has been a revolutionary accomplishment for Body Quest.

Acknowledgments

This project was funded by SNAP-Ed through the Alabama Cooperative Extension System and the Alabama Department of Human Resources.

References

Bailey, S. J., & Deen, M. Y. (2002). A framework for introducing program evaluation to Extension faculty and staff. Journal of Extension [On-line], 40(2), Article 2IAW1. Available at: https://www.joe.org/joe/2002april/iw1.php

Bergtrom, G. (2006). Clicker sets as learning objects. Interdisciplinary Journal of Knowledge and Learning Objects, 2, 105-110. Available at: http://www.ijello.org/Volume2/v2p105-110Bergtrom.pdf

Bird, C., & McClelland, J. (2010). Have you used clickers in programming? Journal of Extension [On-line], 48(5), Article 5TOT9. Available at: https://www.joe.org/joe/2010october/tt9.php

Franz, N. K., & Townson, L. (2008). The nature of complex organizations: The case of Cooperative Extension. In M. T. Braverman, M. Engle, M. E. Arnold, & R. A. Rennekamp (Eds.), Program evaluation in a complex organizational system: Lessons from Cooperative Extension. New Directions for Evaluation, 120, 5-14.

Stup, R. (2003). Program evaluation: Use it to demonstrate value to potential clients. Journal of Extension [On-line], 41(4), Article 4COM1. Available at: https://www.joe.org/joe/2003august/comm1.php

Wood, W. B. (2004). Clickers: A teaching gimmick that works. Developmental Cell, 7(6), 796-798.