The Journal of Extension - www.joe.org

October 2019 // Volume 57 // Number 5 // Tools of the Trade // v57-5tt3

Using Online Panels to Inform Extension Programming

Abstract
Extension personnel must measure public stakeholder behavior, perceptions, and preferences to inform program development. At the same time, many lack the financial resources necessary for acquiring generalizable and statistically representative samples. To reconcile these challenges, our team purchased an online survey panel from Qualtrics. Throughout the process of gathering and analyzing data, our team gained insights that may be of interest to others considering the use of online survey panels to sample broad stakeholder populations. Additional findings related to survey design and implementation also provide guidance for those interested in using this sampling methodology.


Miriah Russo Kelly
Assistant Extension Educator
University of Connecticut
Storrs, Connecticut
Miriah.Kelly@uconn.edu

Tessa Getchis
Extension Educator
University of Connecticut
Avery Point–Groton, Connecticut
Tessa.Getchis@uconn.edu

Anoushka Concepcion
Assistant Extension Educator in Residence
University of Connecticut
Avery Point–Groton, Connecticut
Anoushka.Concepcion@uconn.edu

John Bovay
Assistant Professor
University of Connecticut
Storrs, Connecticut
John.Bovay@uconn.edu

Background

Extension professionals are looking for new and different ways to uncover the interests, needs, behaviors, and preferences of diverse stakeholder groups. In this technological age, it is becoming easier to quickly tap into stakeholder feedback, and the costs for doing so are getting lower (Hays, Liu, & Kapteyn, 2015; Hill, 2013; Monroe & Adams, 2012). Using online survey panels can help Extension professionals better understand their stakeholders. This article presents guidance on how best to implement this method for use in Extension programming.

The use of online panels is a recognized sampling methodology that has become popular in the past 5 years. As response rates to traditional survey methodologies (i.e., paper mailings) decline, researchers are looking for new ways to gather the quality data needed to make informed decisions (Craig et al., 2013). Additionally, for those without a background in statistics or survey research, panel companies are beneficial because they handle important responsibilities such as recruiting an adequate pool of participants, verifying identities, anonymizing responses, and compensating participants in a timely manner (Craig et al., 2013).

Baker et al. (2010) found that many options for using survey panels exist and that the methods used to procure panel lists vary widely across providers. The most important consideration in choosing the right panel is determining whether a probability sample is required or a nonprobability sample is sufficient. Probability samples provide a higher level of accuracy and are more statistically representative, and hence more generalizable (Baker et al., 2010; Kennedy et al., 2016). Unfortunately, they also come with a higher price tag: Vaske (in press) noted that probability sampling costs about $50 per participant whereas nonprobability sampling costs about $10 per participant. Nonprobability samples are likely to be sufficient in many Extension contexts; however, how the data will be used, and the implications of that use, must be carefully considered before deciding that a nonprobability sample will suffice.
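
For a rough sense of what these per-respondent rates imply for a project budget, the following Python sketch compares estimated costs for the two sampling approaches. The rates are those cited by Vaske (in press); the target sample size is hypothetical and used only for illustration.

# Rough cost comparison for probability vs. nonprobability panels,
# using the approximate per-respondent rates cited by Vaske (in press).
# The target sample size below is hypothetical.

PROBABILITY_RATE = 50     # approx. dollars per respondent
NONPROBABILITY_RATE = 10  # approx. dollars per respondent

target_sample = 1500  # illustrative target number of completed responses

prob_cost = target_sample * PROBABILITY_RATE
nonprob_cost = target_sample * NONPROBABILITY_RATE

print(f"Probability panel:    ~${prob_cost:,}")
print(f"Nonprobability panel: ~${nonprob_cost:,}")
print(f"Difference:           ~${prob_cost - nonprob_cost:,}")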

Insights from Our Approach

After much consideration, our team opted to purchase a Qualtrics online nonprobability panel to inform local aquaculture programming. We collected 1,756 responses over a 6-week period. The survey addressed public stakeholders' behavior, knowledge, perceptions, and preferences related to local aquaculture and willingness to pay for locally sourced aquaculture products. Working with experts in Qualtrics software, survey design, and aquaculture subject matter, our team developed an instrument tailored specifically for distribution to a panel-based audience. In doing so, we learned a great deal about how to effectively use online panels to help inform Extension programming.

Lesson 1: Optimize for Mobile or Avoid It Altogether

When developing an online survey, researchers must consider whether the design and layout of the instrument will allow it to be implemented on both desktop and mobile devices. Vannette (2015) explained that the vast majority of participants will attempt to take a survey on a phone and that this is especially true of younger respondents. However, the quality of data acquired from mobile applications tends to be lower than that of data acquired via desktop applications (Baker-Prewitt, 2013; Peterson, Mechling, LaFrance, Swinehart, & Ham, 2013). Consequently, our team chose to implement a desktop-only instrument. Those who do want to gather responses using mobile modes should (a) keep the content simple and avoid any unnecessary wording, (b) position positive Likert scale responses at the top of the set of response options and negative responses at the bottom, (c) insert page breaks to avoid extended scrolling down the page, and (d) avoid the need for horizontal scrolling (Qualtrics Customer Support, personal communication, November 17, 2017; Vannette, 2015).

Lesson 2: Pilot, Pilot, Pilot, and Then Pilot Again

Given that the survey we developed was complex—consisting of 100 questions organized in six blocks and involving integration of display and skip logic, randomization, and embedded piped data—it was vital for us to pilot test the survey several times prior to the official launch (Vaske, in press). This process ensured that the survey was functioning properly and yielding the data expected, and it allowed us to establish an acceptable range of completion times for the survey. Participants whose completion times fell outside that range were excluded from the final survey results.
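
The following Python sketch illustrates the general screening step this lesson describes: use pilot completion times to set a minimum acceptable completion time and flag live responses that fall below it. The half-the-median speeder rule and all times shown are assumptions for illustration, not our study's actual criteria.

from statistics import median

# Completion times (in seconds) collected during pilot testing (illustrative values).
pilot_times = [612, 540, 705, 488, 590, 660, 530, 575]

# A common speeder rule sets the cutoff at half the median pilot completion time.
# (Assumed heuristic for illustration; the study's actual range is not specified.)
speeder_cutoff = median(pilot_times) / 2

# Completion times from the live panel (illustrative values).
live_responses = {"R001": 598, "R002": 143, "R003": 640, "R004": 95}

retained = {rid: t for rid, t in live_responses.items() if t >= speeder_cutoff}
excluded = {rid: t for rid, t in live_responses.items() if t < speeder_cutoff}

print(f"Speeder cutoff: {speeder_cutoff:.0f} seconds")
print(f"Retained: {sorted(retained)}")
print(f"Excluded as speeders: {sorted(excluded)}")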

Lesson 3: Use Attention Filters but Do Not Get Carried Away

Vaske (in press) has emphasized the importance of using attention filters. An attention filter is a question designed to check whether participants are actually reading the questions. We included one filter within a matrix-type question. The filter read "This is an attention filter. If you are still with us, please click x." If a participant neglected to select the correct choice, he or she was redirected to the end of the survey and was not counted in our final sample numbers. Seven percent of participants who tried to complete the survey failed the attention filter question, indicating that this is another important step for ensuring the quality of the data collected. However, survey designers should use caution when implementing this technique, as Qualtrics Customer Support professionals (personal communication, March 7, 2018) explained that having too many filters can erode trust with participants and cause them to drop out of the survey early.
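
As a minimal sketch of this screening step, the following Python code drops respondents who failed the attention filter and reports the failure rate. The field names and records are hypothetical; in our study, Qualtrics redirected failing participants before their responses reached the final data set.

# Minimal sketch of post-collection screening on an attention-filter item.
# Field names and records are hypothetical placeholders.

responses = [
    {"id": "R001", "attention_check": "x", "q1": 4},
    {"id": "R002", "attention_check": "strongly agree", "q1": 2},
    {"id": "R003", "attention_check": "x", "q1": 5},
]

CORRECT_CHOICE = "x"  # the option the filter instructed participants to select

passed = [r for r in responses if r["attention_check"] == CORRECT_CHOICE]
failed = [r for r in responses if r["attention_check"] != CORRECT_CHOICE]

failure_rate = len(failed) / len(responses) * 100
print(f"{len(passed)} retained, {len(failed)} excluded "
      f"({failure_rate:.0f}% failed the attention filter)")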

Lesson 4: Use Quota Sampling in Nonprobability Sample Contexts

We relied on a quota sampling approach for our evaluation research. Initially we set strict quotas for age, income, and gender categories (based on 2010 census data); however, Qualtrics was unable to obtain the number of respondents in each of the categories we set. Therefore, we were required to expand the quota categories to help achieve representativeness of the sample. Additionally, we were unable to set quotas for race and ethnicity, and the resulting sample did not well represent the Hispanic/Latino community, which was of particular interest to us. Furthermore, it has been found that many nonprobability samples include high percentages of adults without children in their households (Kennedy et al., 2016), and data on our participant pool supported these findings. Using quotas can help produce a more generalizable sample in a nonprobability context. However, to ensure generalizability and representativeness of responses, Extension professionals using online panel methodology should consider the use of probability samples where possible.
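
The following Python sketch shows the kind of quota check this lesson implies: comparing the demographic makeup of a completed sample against census-based targets to see which categories are under- or overrepresented. The category labels and target shares are hypothetical placeholders rather than the quotas we actually used.

# Compare sample composition against census-based quota targets.
# Category labels and target proportions are hypothetical placeholders.

quota_targets = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}   # desired shares by age
sample_counts = {"18-34": 310, "35-54": 560, "55+": 410}       # completed responses

total = sum(sample_counts.values())

for category, target_share in quota_targets.items():
    actual_share = sample_counts.get(category, 0) / total
    gap = actual_share - target_share
    status = "over" if gap > 0 else "under"
    print(f"{category}: target {target_share:.0%}, actual {actual_share:.0%} "
          f"({status} by {abs(gap):.1%})")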

Conclusion

In the process of planning for, implementing, and collecting responses from an online panel survey, our team learned a great deal about how best to use this sampling methodology as well as its limitations. Given the growing interest in this strategy, we hope to provide others with insights they might find useful in taking a similar path. Use of the best practices presented here can help Extension professionals optimize the survey participant experience while gathering the data needed to meaningfully inform Extension programming.

Acknowledgments

Our work was supported by the U.S. Department of Agriculture and the National Oceanic and Atmospheric Administration.

References

Baker, R., Blumberg, S. J., Brick, M., Couper, M. P., Dennis, M., Dillman, D., . . . Zahs, D. (2010). AAPOR report on online panels. Public Opinion Quarterly, 74(4), 711–781.

Baker-Prewitt, J. (2013, March 6). Mobile research risk: What happens to data quality when respondents use a mobile device for a survey designed for a PC? Retrieved from https://burke.com/Library/Conference/CASRO%20Online%20Research%20Conference%202013%20-%20Baker-Prewitt%20paper_5.13.pdf

Craig, B., Hays, R., Pickard, A., Cella, D., Revicki, D., & Reeve, B. (2013). Comparison of US panel vendors for online surveys. Journal of Medical Internet Research, 15(11).

Hays, R., Liu, H., & Kapteyn, A. (2015). Use of Internet panels to conduct surveys. Behavior Research Methods, 47(3), 685–690.

Hill, P. (2013). Real, fast, feedback. Journal of Extension, 51(1), Article 1IAW4. Available at: https://joe.org/joe/2013february/iw4.php

Kennedy, C., Mercer, A., Keeter, S., Hatley, N., McGeeney, K., & Gimenez, A. (2016). Evaluating online nonprobability surveys: Vendor choice matters; widespread errors found for estimates based on Blacks and Hispanics. Retrieved from https://www.pewresearch.org/methods/2016/05/02/evaluating-online-nonprobability-surveys/

Monroe, M. C., & Adams, D. C. (2012). Increasing response rates to web-based surveys. Journal of Extension, 50(6), Article 6TOT7. Available at: https://joe.org/joe/2012december/tt7.php

Peterson, G., Mechling, J., LaFrance, J., Swinehart, J., & Ham, G. (2013). Solving the unintentional mobile challenge. Retrieved from https://c.ymcdn.com/sites/www.casro.org/resource/collection/0A81BA94-3332-4135-97F6-6BE6F6CEF475/Paper_-_Gregg_Peterson_-_Market_Strategies_International.pdf

Vannette, D. (2015). 4 ways to optimize your survey for mobile devices. Retrieved from https://www.qualtrics.com/support/survey-platform/survey-module/mobile-survey-optimization/

Vaske, J. J. (In press). Survey research and analysis: Applications in parks, recreation, and human dimensions. Champaign, IL: Sagamore Publishing.