The Journal of Extension - www.joe.org

December 2012 // Volume 50 // Number 6 // Tools of the Trade // v50-6tt7

Increasing Response Rates to Web-Based Surveys

Abstract
We review a popular method for collecting data: Web-based surveys. Although Web surveys are popular, one major concern is their typically low response rates. Using the Dillman et al. (2009) approach, we designed, pre-tested, and implemented a survey on climate change with Extension professionals in the Southeast. The Dillman approach worked well, and we generated response rates as high as 79%. However, the method was not problem-free. We share several lessons learned and recommendations for increasing response rates to Web-based surveys, drawing attention to the importance of personalized and repeated contact.


Martha C. Monroe
Professor
mcmonroe@ufl.edu

Damian C. Adams
Assistant Professor
dcadams@ufl.edu 

School of Forest Resources & Conservation
University of Florida
Gainesville, Florida

Introduction and Background

Web-based (online) surveys, typically involving email requests with Web survey links, are popular for collecting data on program evaluation and attitudes. There are several benefits to online surveys, including low cost, wide availability of survey design and implementation tools, ease of implementation (including reminders), and built-in features that facilitate data cleaning and improve the survey experience for respondents and researchers (Boyer, Adams, & Lucero, 2010; Dillman, Smyth, & Christian, 2009; Israel, 2011).

Participation in online surveys is thought to be easy for frequent computer users (Israel, 2011) and those with high-speed Internet access (Archer, 2003). However, one major concern is online surveys' typically low response rates (Archer, 2008; Miller & Smith, 1983; Wiseman, 2003). On average, online survey response rates are 11% below those of mail and phone surveys, and rates as low as 2% have been reported (Petchenik & Watermolen, 2011).

A variety of factors, such as poor survey design, excessive survey length, and lack of respondent interest, hurt response rates (Dillman et al., 2009). For example, needs assessments tend to get lower response rates than evaluations (Archer, 2008). Several strategies can increase response rates to online surveys. Our recent survey of Extension professionals in seven Southeastern states followed the Dillman guidelines and resulted in response rates of 62% to 79%. We describe the survey implementation and provide suggestions for using online surveys.

Our Process

We designed and pre-tested our survey with Extension professionals (n=32) following Dillman et al. (2009). Our objective was to assess Extension professionals' perceptions of climate change. Based on pre-test feedback, we refined the survey so that it could be completed in about 15 minutes and emphasized the confidentiality of responses.

The Dillman approach relies on personalized, repeated contact to boost response rates, which online surveys can facilitate. To implement our survey online, we recruited collaborators from each state's Extension system to provide email lists, administrators' support for the survey, and logos for their system(s), and to review their state's survey. Collaborators received a copy of their state's survey, an implementation timeline, our approved IRB protocol, and drafts of the expected communications to respondents.

We personalized the survey implementation by (1) addressing each person by name in our messages (e.g., "Dear Sam"), (2) tailoring each survey with state-specific introductory information, logos, and demographic questions, (3) including collaborators' and Extension administrators' names on all communications and the survey, and (4) including contact information for collaborators and project organizers. The survey also explained how responses would benefit the state and regional Extension program.

For repeated contact, we included: (1) an introductory email informing potential respondents of the upcoming survey; (2) an email with a personalized survey link; (3) reminder emails, with personalized links, to partial- and non-respondents over a 4-week period; and (4) two reminder emails sent by Extension administrators to their list at about weeks two and four of implementation.
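To make the timeline concrete, the following Python sketch (purely illustrative) lays out send dates for each contact from an assumed launch date. The weekly spacing of researcher reminders and the three-day lead for the introductory email are assumptions for illustration; the study specifies only a 4-week reminder window with administrator reminders at about weeks two and four.

    # Illustrative sketch: one possible layout of the repeated-contact schedule.
    # The weekly reminder spacing and the 3-day lead for the introductory email
    # are assumptions, not details reported in the study.
    from datetime import date, timedelta

    def contact_schedule(launch: date) -> list[tuple[str, date]]:
        return [
            ("introductory email announcing the survey", launch - timedelta(days=3)),
            ("survey email with personalized link", launch),
            ("reminder to partial- and non-respondents", launch + timedelta(weeks=1)),
            ("administrator reminder", launch + timedelta(weeks=2)),
            ("reminder to partial- and non-respondents", launch + timedelta(weeks=3)),
            ("administrator reminder", launch + timedelta(weeks=4)),
        ]

    if __name__ == "__main__":
        for step, when in contact_schedule(date(2012, 3, 5)):  # hypothetical launch date
            print(when.isoformat(), step)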

The introductory email was sent using MS Word's mail merge tool; other messages were sent via SurveyMonkey with personalized links, or via mail merge with static (non-personalized) survey links for those who "opted out" of SurveyMonkey contact. Respondents who completed the survey were not contacted again, and partial respondents were reminded that they could continue where they left off. The personalized and repeated approach was successful, with one respondent commenting, "How did my boss know I hadn't filled out your survey yet?"
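The mail-merge mechanics can be pictured with a short script. The Python sketch below is illustrative only: it assumes a hypothetical contacts.csv file with name, email, and survey_link columns and access to an institutional SMTP relay (smtp.example.edu), whereas the study itself used MS Word's mail merge and SurveyMonkey rather than custom code.

    # Illustrative mail-merge-style send of personalized survey invitations.
    # contacts.csv, the SMTP host, and the sender address are hypothetical.
    import csv
    import smtplib
    from email.message import EmailMessage

    SMTP_HOST = "smtp.example.edu"              # assumed institutional relay
    FROM_ADDR = "extension-survey@example.edu"  # assumed sender address

    BODY = (
        "Dear {name},\n\n"
        "We invite you to complete our survey on climate change; it takes "
        "about 15 minutes. Your personalized link is:\n{link}\n\n"
        "Responses are confidential.\n"
    )

    def send_invitations(csv_path: str) -> None:
        with open(csv_path, newline="") as f, smtplib.SMTP(SMTP_HOST) as smtp:
            for row in csv.DictReader(f):
                msg = EmailMessage()
                msg["From"] = FROM_ADDR
                msg["To"] = row["email"]
                msg["Subject"] = "Extension climate change survey"
                msg.set_content(BODY.format(name=row["name"], link=row["survey_link"]))
                smtp.send_message(msg)

    if __name__ == "__main__":
        send_invitations("contacts.csv")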

Lessons Learned

Our approach worked well, and we learned important lessons. First, personalized links allow tracking of respondent status (e.g., partial respondent), but if a link is forwarded and more than one person responds to it, responses can be overwritten. This can be handled by adjusting SurveyMonkey's settings to prevent multiple responses to personalized links. Second, we observed a gradual increase in responses overall, with the administrators' reminders causing a noticeable and important bump in responses (Figure 1). Third, a small percentage of potential respondents in each state (0% to 9%) opted out of receiving SurveyMonkey messages. To overcome this, we identified the opt-outs and relied on mail merge and static survey links to reach them. Fourth, a small percentage (6%) of potential respondents contacted us at the email address we provided for feedback and questions. Some complained about the content of the survey, while others offered additional information. Fewer than 1% of respondents expressed concerns about completing an online survey.
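The respondent-status bookkeeping behind these lessons can also be sketched in code. The example below is a hedged illustration that assumes a hypothetical status export (status_export.csv with email, status, and opted_out columns) rather than SurveyMonkey's actual export format: completed respondents receive no further contact, partial and non-respondents are queued for personalized reminders, and opt-outs are routed to the static-link mail merge.

    # Illustrative reminder-targeting logic; the export file and its columns
    # (email, status, opted_out) are assumptions, not SurveyMonkey's format.
    import csv
    from collections import defaultdict

    def plan_reminders(status_csv: str) -> dict[str, list[str]]:
        plan: dict[str, list[str]] = defaultdict(list)
        with open(status_csv, newline="") as f:
            for row in csv.DictReader(f):
                if row["status"] == "complete":
                    continue  # completed respondents are not contacted again
                if row["opted_out"] == "yes":
                    plan["static_link_mail_merge"].append(row["email"])
                else:
                    plan["personalized_reminder"].append(row["email"])
        return plan

    if __name__ == "__main__":
        for channel, emails in plan_reminders("status_export.csv").items():
            print(channel, len(emails))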

Finally, our ability to contact potential respondents was influenced by the quality of the email lists. It is critical "to know exactly whom a mailing list does or does not include, and to develop different ways of dealing with any deficiencies" (Dillman et al., 2009, p. 50). There was little consistency in lists from state to state. In some cases, 1862 and 1890 Extension faculty were in separate lists; in others, staff who do not actively engage in Extension programming were included along with faculty, specialists, and agents.

Figure 1.
Response Rate from Georgia; Red Bars Denote Reminder Messages

Recommendations for Increasing Online Survey Response Rates

  • Determine if an online survey is a viable option and whether accurate email addresses, preferably with names, are available. This enables you to send personalized reminders and follow up with respondents or non-respondents as needed (e.g., to assess survey bias). Relying on an organization to forward your survey link hinders the personalized, repeated approach and reduces the response rate.
  • Ask respected leaders (e.g., the Associate Dean for Extension) to allow you to use their names on the "from" and signature lines of messages and to send their own messages encouraging responses. Introductory alerts increase response rates (Dillman et al., 2009), and such an alert from an authority figure is even more powerful.
  • Make sure your survey works well with your population. Pilot test it online to identify problems with question mechanics, formatting, question wording, skip logic, survey length, and how the survey displays in different browsers.

Dillman (2000) famously suggests that "there is no other method of collecting survey data that offers so much potential for so little cost" (p. 400). Using this tool effectively, in a way that generates adequate response rates, could significantly improve our ability to understand needs and evaluate programs.

References

Archer, T. M. (2003). Web-based surveys. Journal of Extension [Online], 41(4), Article 4TOT6. Available at: http://www.joe.org/joe/2003august/tt6.php

Archer, T. M. (2008). Response rates to expect from Web-based surveys and what to do about it. Journal of Extension [Online], 46(3), Article 3RIB3. Available at: http://www.joe.org/joe/2008june/rb3.php

Boyer, C. N., Adams, D. C., & Lucero, J. (2010). Rural coverage bias in online surveys?: Evidence from Oklahoma water managers. Journal of Extension [Online], 48(3), Article 3TOT5. Available at: http://www.joe.org/joe/2010june/tt5.php

Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York: John Wiley and Sons.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). New York: John Wiley and Sons.

Israel, G. D. (2011). Strategies for obtaining survey responses for Extension clients: Exploring the role of e-mail requests. Journal of Extension [Online], 49(3), Article 3FEA7. Available at: http://www.joe.org/joe/2011june/a7.php

Miller, L. E., & Smith, K. L. (1983). Handling nonresponse issues. Journal of Extension [Online], 21(5). Available at: http://www.joe.org/joe/1983september/83-5-a7.pdf

Petchenik, J., & Watermolen, D. J. (2011). A cautionary note on using the Internet to survey recent hunter education graduates. Human Dimensions of Wildlife, 16(3), 216-218.

Wiseman, F. (2003). On the reporting of response rates in Extension research. Journal of Extension [Online], 41(3), Article 3COM1. Available at: http://www.joe.org/joe/2003june/comm1.php