December 2018 // Volume 56 // Number 7 // Tools of the Trade // v56-7 tt1
Intercept Surveys: An Overlooked Method for Data Collection
Abstract
Intercept surveys are a tool Extension educators can use to capture local data quickly and with minimal cost. We used intercept surveys at city farmers' markets to test the efficacy of food safety signage. From our experience with the intercept survey process, we identified a set of best practices that can benefit other Extension educators interested in developing and implementing this type of research.
Introduction
Intercept surveys (ISs) are a tool that Extension educators can use to capture local data quickly and with minimal cost. Educators can turn local community events into research venues to collect data that improve programs, to explore specific client needs, and to capture perceptions about situations, products, or services (Dillman, Smyth, & Christian, 2014; Kelley & Wehry, 2006). With the appropriate research questions, a well-thought-out research design, and dedicated time and energy, educators can use the IS approach with confidence.
We found the literature on the use of public ISs to be limited, with the exception of articles that deal with ISs in natural resources settings (Flint et al., 2016). This article addresses that gap by highlighting design and implementation practices educators should consider when ISs are to be used for data collection.
Case Example
To illustrate the use of ISs, we present an overview of our use of the tool for evaluating the efficacy of food safety signage at two city farmers' markets over the course of 4 weeks. The project was approved by the University of Maryland Institutional Review Board (IRB). Census data were used to match two experimental and two control farmers' markets (City-Data, 2016). The two control farmers' markets did not receive the food safety signage or associated giveaway item until after the study period had ended. (The giveaway item was a promotional combination food peeler/scrub brush intended to encourage people to wash produce and brush produce having netted surfaces, such as cantaloupe.) Applicable census data and the timeline for the study are shown in Tables 1 and 2.
Table 1. Census Demographics for the Experimental and Control Farmers' Market Sites

| Demographic | Experimental site 1 | Experimental site 2 | Control site 1 | Control site 2 |
|---|---|---|---|---|
| Unemployment (%) | 15.3 | 4.7 | 8.8 | 10.6 |
| Female (%) | 42.6 | 51.7 | 54.0 | 56.8 |
| White (%) | 24.5 | 6.3 | 52.5 | 32.3 |
| Black (%) | 65.0 | 10.1 | 36.0 | 52.4 |
| Bachelor's degree or higher (%) | 29.9 | 80.9 | 53.2 | 45.4 |
| Median age (years) | 31.3 | 31.0 | 37.9 | 29.5 |
Table 2. Study Timeline

| Site type | August 6–28, 2016 | September 3–20, 2016 |
|---|---|---|
| Experimental | Displayed signage and food peeler | Administered experimental surveys |
| Control | Displayed no signage or food peeler | Administered control surveys |
At the end of the intervention period, we asked patrons at the experimental and control farmers' markets to participate in a brief (3- to 5-min) survey. Upon survey completion, each respondent received a $5 farmers' market token as compensation. Before data collection, we calculated the numbers of completed surveys needed from the experimental and control farmers' markets to detect a statistically significant difference (p < .05) during analysis. Overall, respondents were willing to take part in the survey.
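The article does not report the specific behaviors or effect size behind that calculation, but the sketch below illustrates one common way such a target could be set: a two-proportion power analysis in Python using statsmodels. The control and experimental proportions shown are purely illustrative assumptions, not values from the study.

```python
# Hypothetical sample-size sketch for comparing a behavior between control and
# experimental markets; the proportions below are illustrative assumptions.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

# Assumed share of patrons reporting the target food safety behavior
# at control vs. experimental sites (illustrative values only).
p_control, p_experimental = 0.40, 0.60

effect_size = proportion_effectsize(p_experimental, p_control)  # Cohen's h

# Completed surveys needed per group for alpha = .05 and 80% power.
n_per_group = NormalIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"Completed surveys needed per group: {round(n_per_group)}")
```

A calculation like this, run before fieldwork, tells interviewers how many completed surveys they must collect at each site type and whether additional survey dates may be needed.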
Development and Implementation Best Practices
From our experience with ISs, we identified a set of best practices that can benefit Extension educators interested in using this survey mode. Here we emphasize best practices for developing and implementing an IS tool.
IS Development
- Match the research need to the research mode. Research problems that need more than six easy-to-answer questions or include sensitive information are not suitable for ISs. If the research problem requires several open-ended questions, ISs are not appropriate because of the time needed for the participant to answer (verbally or in writing).
- Decide on the population to be sampled on the basis of the research question. This decision will lead to choices of venue(s) and time(s) appropriate for accessing the population. In the case study presented here, the desired population was a mix of people in urban areas representing diverse socioeconomic characteristics, and farmers' markets were a venue where a large and diverse population interested in produce could be accessed. Review the census data for your venue to determine whether the survey will need to be offered in multiple languages. Plan to sample participants from the starting time to the ending time of the event to reduce time-of-day sampling bias (one simple scheduling approach is sketched after this list).
- Design the survey instrument to be convenient for respondents to answer. Respect the time of participants who agree to help you by making sure the consent process and survey together take only a few minutes to complete. Keep in mind that written consent is less appealing to participants than verbal consent. Design the survey so that it can easily be read aloud to participants, especially if there is a concern about participants' literacy levels. Avoid "choose all that apply" questions; respondents may default to choosing every option because it is easy, which can mask trends during data analysis. Conduct pilot testing of the survey tool to assess timing, the consent process, and ease of use.
- Decide in advance how you will accommodate two or more people who want to complete the survey at the same time.
- Visit the chosen venue(s) to review access, layout, and other logistics to help you prepare your implementation plan.
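One straightforward way to sample across an event's full hours, as noted above, is to schedule evenly spaced interview start times in advance rather than interviewing only when the market is busiest. The minimal sketch below illustrates that idea; the market hours and number of slots are assumptions for illustration, not values from the study.

```python
# Hypothetical sketch: spread interview attempts evenly across event hours
# so that no time of day is over- or under-sampled.
from datetime import datetime

def interview_slots(start, end, n_slots):
    """Return evenly spaced times at which to begin approaching patrons."""
    step = (end - start) / n_slots
    return [start + i * step for i in range(n_slots)]

market_open = datetime(2016, 9, 3, 8, 0)    # assumed market hours
market_close = datetime(2016, 9, 3, 13, 0)
for slot in interview_slots(market_open, market_close, 10):
    print(slot.strftime("%I:%M %p"))
```

Printing or sharing such a schedule with each interviewer keeps the team sampling on the same plan throughout the event.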
IS Implementation
- Choose interviewers with the necessary skills if you are asking others to help. Interviewers have to be comfortable walking up to strangers and asking for an interview and need to understand what to do if somebody declines to be interviewed. Consult your university's IRB guidelines to ensure that you follow all necessary rules for using volunteers to conduct research.
- Prepare for the event. Know whether dates can be added if you do not get all the needed surveys on the initial interview date. Organization will be critical on the day of the event. Organize a "materials bag" so that each interviewer knows where the incentives and completed and uncompleted surveys are kept. Interviewers should be able to quickly access materials and should have extra pens, clipboards, battery chargers, and surveys.
- Confirm approval to be at the site. Contact the site or event manager, and check local zoning and permitting regulations. You may want to obtain a letter of approval that you can present should any participants question your right to be at the site. In the case example presented here, in addition to gaining management approval at the market, we bought incentives that could be spent at the farmers' market.
- Implement the ISs. Dress appropriately for weather conditions, wear something with your Extension branding, and look friendly. Have your university identification with you. Position yourself in a high-traffic area such as an entrance, an exit, or long lines where people are already waiting. Introduce yourself to vendors or on-site personnel; they can become advocates for you.
Discussion and Lessons Learned
Our case example introduces ISs as a research tool for Extension educators and suggests valuable lessons learned for future implementation. Applying a systematic approach to sampling will improve the quality of the data obtained. The presence of Extension at local events provides many opportunities to collect valuable information, and ISs are a tool Extension professionals can use to garner local perspectives that can be applied in a variety of ways, from incorporating community insights into needs assessments to modifying existing programs. ISs cut across all Extension programs, and the value of this tool in Extension deserves further investigation.
References
City-Data. (2016). Baltimore, Maryland (MD), zip code map locations, demographics. Retrieved from http://www.city-data.com/city/Maryland.html
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method (4th ed.). Hoboken, NJ: Wiley.
Flint, C. G., Mascher, C., Oldroyd, Z., Valle, P. A., Wynn, E., Cannon, Q., & Unger, B. (2016). Public intercept interviews and surveys for gathering place-based perceptions: Observations from community water research in Utah. Journal of Rural Social Sciences, 31(3), 105–125.
Kelley, K. M., & Wehry, R. H. (2006). Consumer interest in gardening topics and preferred information sources. Journal of Extension, 44(2), Article 2RIB7. Retrieved from https://www.joe.org/joe/2006april/rb7.php