The Journal of Extension - www.joe.org

February 2014 // Volume 52 // Number 1 // Tools of the Trade // v52-1tt3

Using Turning Point to Conduct an Extension Needs Assessment

Abstract
Turning Point can be used to collect market research data from an audience; however, this use is not supported by the software, making analysis of the data challenging. A method for downloading the raw data into a spreadsheet was identified. Once in the spreadsheet, the data could be cleaned and sorted. Although the process is not straightforward and is somewhat time consuming, it is believed to be superior to using paper surveys.


Bradley M. Carlson
Extension Educator and Associate Extension Professor
University of Minnesota Extension
Mankato, Minnesota
bcarlson@umn.edu

Introduction

University of Minnesota Extension Crops Systems conducted a needs assessment around issues relating to crop production, agricultural drainage, and water quality in Minnesota. Audience Response Devices (ARDs) were examined as a low-input (cost, time, staff) alternative to printed surveys (Kay & LeSage, 2009). Turning Point 2008 (Turning Technologies, Youngstown, OH) was chosen because it was already in use in Minnesota and many of its attributes were attractive for this purpose. The Turning Point system uses wireless response devices (referred to as "clickers") and runs within Microsoft PowerPoint. Two main uses of Turning Point by Extension have been to facilitate audience participation and to collect evaluation data at the conclusion of an event (Bird & McClelland, 2010; deKoff, 2013). A literature review found no documented use of Turning Point for market research, and Turning Technologies does not claim this capability.

A key feature of any ARD is the anonymity of responses. This can be a problem with diverse audiences when the response data must be segmented. In this case, farmer responses were of specific interest, making it necessary to separate them from those of other groups. It was surmised that Turning Point was capable of this because it claims the ability to merge data from multiple locations and it has a reporting feature that allows for demographic separation.

A set of 22 questions was delivered prior to presentations made in the winter of 2011–2012. The first two questions were demographic in nature, asking participants their role in agriculture and the number of acres they farm or advise. Data collection occurred at 24 separate events, with a combined audience of over 1,000 individuals. Participants were instructed not to answer if they had already responded at a previous event. An attempt was made to give a clicker to each individual at each event, but there was no way of ensuring that the clickers were used. A total of 696 individuals responded to at least one question. It is unknown whether the non-responders elected not to answer or abstained because they had already answered at a different event. Because of this, the exact response rate cannot be determined, but it was at least 70%.

Analyzing the Data

Users of Turning Point 2008 typically generate summary reports that load into Microsoft Excel. As mentioned above, Turning Point has an option to merge data. When this was attempted for the survey reported here, the merged data indicated only 121 responses, compared to the actual number of 696. In addition, the demographic report did not function properly because the program failed to identify the first question as being demographic in nature. Turning Point's help line was contacted for assistance, but ultimately it was determined that Turning Point would not be able to perform the type of analysis desired, which consisted of multiple sorts of the data as well as tests of statistical association between responses. Therefore, a novel approach was needed to download, sort, and analyze the data.

To complete the analysis, the data needed to be exported into a spreadsheet. Turning Point 2008 does not support this directly, so the "Export Session XML File" function was used, because the resulting file could be opened by Microsoft Excel with the data in separate cells.

The Turning Point program saves data on a per-question basis, including the user ID (unique to each clicker), points (not used in this application), a time signature (not of significance for this application), and the response to the question. When the .xml file is opened, each of these values is recorded in a separate cell of the spreadsheet, along with a large amount of superfluous information (primarily HTML code).
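For readers with access to scripting tools, the per-question records could also be pulled out of the exported file programmatically rather than by hand. The following is a minimal Python sketch of that idea; the file name (session.xml) and the element and attribute names (question, response, userid, value) are hypothetical stand-ins, because the actual structure of a Turning Point 2008 session export differs and is not documented here.

```python
import xml.etree.ElementTree as ET
import csv

# Hypothetical layout: one <question> element per survey item, each containing
# <response> elements that carry a clicker (device) ID and the chosen answer.
# The real Turning Point 2008 export uses its own element and attribute names.
tree = ET.parse("session.xml")  # file produced by "Export Session XML File"
root = tree.getroot()

rows = []
for q_number, question in enumerate(root.iter("question"), start=1):
    for response in question.iter("response"):
        rows.append({
            "question": q_number,
            "device_id": response.get("userid"),  # unique to each clicker
            "answer": response.get("value"),      # the selected answer choice
        })

# Write the per-question records to a CSV file for further cleaning.
with open("responses_per_question.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["question", "device_id", "answer"])
    writer.writeheader()
    writer.writerows(rows)
```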

The simplest way to manage this data was to cut and paste the user IDs and responses into a separate spreadsheet. The data then needed to be converted from responses per question to responses per user ID. While the identity of the individual who provided the responses is unknown, it is necessary to know every response associated with a given individual. This is complicated by the fact that not everyone answered every question. Data from the XML file is contained in columns and sorted by user ID, and each question has a different number of data rows depending on its response rate (meaning that not every user ID appears under every question).

The only way to manage this problem was to go through the data manually and insert a blank cell for each unanswered question. When this "data cleaning" was completed, there was one row of data for each individual respondent, including blank cells for unanswered questions. It should be noted that Turning Point 5 (released since this analysis was conducted) will export report data on a per-respondent basis, simplifying, but not completely eliminating, this step.
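Where scripting is available, this reshaping could be done without inserting blanks by hand. The sketch below assumes the per-question records produced in the earlier sketch (columns question, device_id, and answer, which are hypothetical names) and pivots them so that each row is one clicker and each column is one question; unanswered questions simply become missing values, playing the role of the blank cells described above.

```python
import pandas as pd

# Per-question records: one row per (device, question) pair that was answered.
long_form = pd.read_csv("responses_per_question.csv")

# Pivot to one row per respondent (device ID) and one column per question.
# Assumes each clicker answered a given question at most once.
wide_form = long_form.pivot(index="device_id", columns="question", values="answer")

# Respondents who skipped a question receive a missing value automatically,
# equivalent to the manually inserted blank cells.
wide_form.to_csv("responses_per_respondent.csv")
```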

Analysis of the data was accomplished by simple data sorts in Excel. Data was separated based on whether the respondent was a farmer, an input supplier, an independent crop consultant, or other. A second-level sort was used to examine farmer data based on farm size and on the response to a question that indicated attitude toward environmental issues. Simple summary statistics such as totals, percentages, and means were used. SigmaPlot was used to conduct a chi-squared test to determine the association between responses.
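The same sorting, summary statistics, and chi-squared test could also be scripted. The sketch below is one illustration under stated assumptions: it reads the per-respondent table built above, treats question 1 as the role demographic, uses question 5 purely as a placeholder for an attitude question, uses "Farmer" as an example category label, and substitutes SciPy's chi-squared test of independence for the test run in SigmaPlot.

```python
import pandas as pd
from scipy.stats import chi2_contingency

wide = pd.read_csv("responses_per_respondent.csv", index_col="device_id")

# Question 1 was the demographic item (farmer, input supplier, consultant, other).
# "Farmer" is an example label; the actual answer codes depend on the survey.
farmers = wide[wide["1"] == "Farmer"]
print("Farmer respondents:", len(farmers))

# Simple summary statistics (counts and percentages) for one question of
# interest; "5" is only a placeholder for one of the 22 survey questions.
counts = farmers["5"].value_counts()
print(counts)
print(counts / counts.sum() * 100)

# Chi-squared test of independence between role (question 1) and the
# placeholder attitude question (question 5).
table = pd.crosstab(wide["1"], wide["5"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```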

Conclusion

Analysis of the data was straightforward once the method for doing so was discovered. The process was tedious and time consuming, yet it is believed to have been more efficient than using printed surveys. Using the most recent version of Turning Point will further streamline the process. While it was not the intent to compare an ARD-generated survey to a printed one, the relatively high response rate suggests that using ARDs may result in a larger data set. The final conclusion is that Turning Point can be used for collecting this kind of information, but it was not designed for this purpose and lacks reporting tools to support this use.

References

Bird, C., & McClelland, J. (2010). Have you used clickers in programming? Journal of Extension [On-line], 48(5), Article 5TOT. Available at: http://www.joe.org/joe/2010october/tt9.php

deKoff, J. P. (2013). Using audience response devices for Extension programming. Journal of Extension [On-line], 51(3), Article 3TOT4. Available at: http://www.joe.org/joe/2013june/tt4.php

Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education, 53, 819-827.