The Journal of Extension - www.joe.org

June 2011 // Volume 49 // Number 3 // Feature // v49-3a7

Strategies for Obtaining Survey Responses from Extension Clients: Exploring the Role of E-Mail Requests

Abstract
Extension professionals want to use the Web for conducting surveys, but studies show using the Web alone introduces significant bias. The study reported here compared strategies for obtaining responses that might minimize cost and bias. E-mail and postal invitations to the Web-hosted survey version were compared to the postal mail-only standard. The response rate was highest when using an e-mailed invitation, followed by the traditional mail-only mode and the postal invitation/Web-hosted mode. It appears the best strategy for minimizing the cost of collecting data and maximizing representativeness is to use e-mail invitations when available and postal mail for those without e-mail.


Glenn D. Israel
University of Florida
Gainesville, Florida
gdisrael@ufl.edu

Introduction

Surveys are widely used to collect information to identify client needs and evaluate the quality and impact of Extension programs. Many Extension professionals hope that Web-hosted surveys can be used as a low-cost method for collecting data. For example, Web-hosted surveys have been implemented using e-mail addresses obtained from specialized audiences with nearly universal Internet access (e.g., Malone, Herbert, & Kuhar, 2005; West, 2007). Recent studies suggest, however, that relying on the Web alone may introduce significant bias in the data when there is not universal access (Israel, 2010; Messer & Dillman, 2009; Smyth, Dillman, Christian, & O'Neill, 2009). There also is evidence that combining mail and Web to collect data from Extension clients results in data that are comparable to traditional mail surveys (Israel, 2010), but it is not clear what role e-mailed invitations should play in implementing Web-hosted surveys.

The study reported here builds on previous research and explores whether invitations to participate via e-mail can generate responses that are similar to the postal mail-only standard and be more cost effective. The study divided Extension clients into two groups: those who provided an e-mail address and those who did not. Then it explored how pairings of the survey invitation and the mode of response affected the willingness of clients to respond to the survey. Specifically, the research questions addressed were:

  1. To what extent will Extension clients respond to a Web-hosted survey following an e-mail invitation?

  2. Does the e-mailed invitation result in quicker responses and reduced costs?

  3. Do clients who use the Web have different characteristics and answers from those responding by mail?

Background

Web-hosted and mixed-mode (e.g., a combination of Web and paper) surveys offer alternatives to mail-only surveys, but the consequences of these alternatives are not fully understood. The variety of browsers and hardware configurations affects access to surveys, while the complexity and nature of applications (both benign and harmful) affect people's willingness to respond via the Web. These factors have complicated the process of contacting clients and their decision-making about when and how to respond to survey requests (Dillman, Smyth, & Christian, 2009).

As noted by Couper, Kapteyn, Schonlau, and Winter (2005), access to the Web by the intended recipients is one of the key factors in determining whether clients will respond to a Web-hosted survey. As of September 2009, 77% of American adults used the Internet (Pew Internet & American Life Project, 2009). Data from the Pew Internet & American Life Project (2009) also show the following:

  • Persons with a college education were more likely to have Internet access (95%) than those with only a high school diploma (72%) or less than a high school diploma (37%).

  • Older Americans are less likely to use the Internet (43% of those 65 or older), while the percentage increases to 77% for persons 50-64 years of age, 83% for those 30-49, and 93% for those 18-29 years old.

  • Hispanics (61%) and Black, non-Hispanics (72%) are less likely to use the Internet than White, non-Hispanics (80%).

  • Rural residents, including many farm families, have slightly lower rates of using the Internet (71%) than do urban (73%) and suburban residents (75%).

The pattern for the adoption of a broadband (i.e., high-speed) connection in the home is similar but lower. Horrigan (2009) reported that 63% of American adults had a broadband connection in the home in April 2009, up from 55% in May 2008. As with Internet access, people with lower educational attainment, African Americans, elders, and rural residents were less likely to have a broadband connection (Horrigan, 2009). Although the 1-year increase in the percent having a broadband connection is impressive, these data suggest that some segments of Extension's clientele continue to have much better access to the Internet than do others.

In addition to access, other factors that might affect the propensity to respond to a Web-hosted survey (when a request is sent by e-mail or postal mail) include ready access to a computer and available time to complete the survey "now"; having experience with using Internet forms from shopping, on-line banking, or previous surveys; and deriving psychological benefits from participating in and responding quickly to the survey. Having experience using the Internet might increase preference for this mode (Messer & Dillman, 2009). Given this set of factors, it is not surprising that people who responded via the Web were different in several ways from those who responded by mail (Israel, 2010; Smyth et al., 2009).

Finally, whether the Web is an appropriate mode rests on the nature of Extension's relationship with the intended recipients of the survey. When there is no prior relationship, it is inappropriate to initiate a request to complete a Web-hosted survey via e-mail (Council of American Survey Research Organizations, n.d., p. 8). This is because the Internet is not considered a public utility (Dillman et al., 2009). Thus, needs assessment surveys or evaluation surveys that include nonparticipants with no prior relationship should send an invitation by postal mail in order to conform to ethical standards for conducting surveys.

With this background, an e-mail invitation to a Web-hosted survey was expected to result in a higher response rate than for a postal invitation with a paper survey. Conversely, a postal invitation to a Web-hosted survey was expected to result in a lower response rate than for a postal invitation with a paper survey. Respondents who received either an e-mail or a postal mail invitation and then answered via the Web were also expected to differ from those sending a completed survey via the mail.

Methods

The study used data collected for the annual survey of Florida Cooperative Extension Service's (FCES) clients in 2009. FCES provides an array of educational programs, and the survey was sent to a sample of clients who had attended a workshop or seminar, called the Extension office, or visited the office. The 2009 customer satisfaction survey followed the same procedures as described in previous studies (Israel, 2010; Israel & Galindo-Gonzalez, 2009).

For the study, a random sample (n=1,530) was selected from lists of Extension clients in 14 of Florida's 67 counties. (Note: randomization was used to assign counties to one year in the 5-year rotation system.) The selected clients were sorted into two strata: those providing an e-mail address and those not providing an e-mail address. Those providing an e-mail address were randomly assigned to treatment groups 1, 2, and 3:

  1. Mail only: The request for a response included only the mail mode (n = 155).

  2. Web preference: The initial request for a response included only the Web mode and the follow-up provided a choice of Web and mail (n = 146).

  3. E-mail preference: The initial request was sent via e-mail and the follow-up provided a choice of Web and mail (n = 133).

Clients in the second stratum, those not providing an e-mail address, were randomly assigned to groups 4 and 5 (a brief sketch of the assignment procedure follows this list):

  4. Mail only: The request for a response included only the mail mode (n = 554).

  5. Web preference: The initial request included only the Web mode and the follow-up provided a choice of Web and mail (n = 542).
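
For readers who want to reproduce this kind of design, the following sketch illustrates the stratified random assignment described above, under the assumption that clients are stored as simple records with an optional e-mail field. The field names and example records are hypothetical; only the stratum and group structure follows the study.

    # Illustrative sketch of the stratified random assignment described above.
    # Client records and field names are hypothetical.
    import random

    def assign_groups(clients, seed=2009):
        """Split clients into e-mail and no-e-mail strata, then randomly assign
        treatment groups 1-3 (e-mail stratum) and 4-5 (no-e-mail stratum)."""
        rng = random.Random(seed)
        with_email = [c for c in clients if c.get("email")]
        without_email = [c for c in clients if not c.get("email")]
        rng.shuffle(with_email)
        rng.shuffle(without_email)
        assignments = {}
        for i, client in enumerate(with_email):       # groups 1, 2, 3
            assignments[client["id"]] = 1 + (i % 3)
        for i, client in enumerate(without_email):    # groups 4, 5
            assignments[client["id"]] = 4 + (i % 2)
        return assignments

    # Hypothetical example records:
    sample = [{"id": 1, "email": "a@example.com"},
              {"id": 2, "email": None},
              {"id": 3, "email": "b@example.com"}]
    print(assign_groups(sample))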

A unimode design was used in constructing the mail and Internet surveys (Israel, 2010). This included using the same questions and question order and, more important, minimizing visual design differences (see pages of the mail questionnaire and selected screens of the Internet survey in Figure 1). The 2-page mail questionnaire used gray shading to distinguish blocks of related questions. Similarly, the Internet survey presented questions in groups, such as items 1 - 4, or singly on a separate screen (Figure 1).

Figure 1.
Design of the Mail and Web Questionnaires


The Web survey was hosted on a university server. Clients who had received the invitation via postal mail and responded using the Web had to type the URL into their browser's address bar and then enter a 6-digit personal identification number (PIN) to access the survey (see the topmost screen capture for the web survey in Figure 1). Those in the e-mail group could click on the link in the message to access the URL and then enter the PIN. Upon entry, the informed consent information was presented. When the "Agree to participate" button was selected, the screen containing the initial questions was presented.

The correspondence was constructed to provide the same verbal and visual presentation to clients. Additional information, including the URL for the survey and a PIN, was included in the postal invitation letter and the follow-up letter to clients who were in the Web preference treatments (groups 2 and 5). A series of contacts was used to implement the survey during the summer and fall of 2009, as shown in Table 1. As expected, some clients in group 5 who received the Web preference invitation did not have access to the Internet; these clients received a paper copy in the mail. A few clients who called or e-mailed the author because of difficulty accessing the survey on the Web were sent an e-mail message containing a link to the survey and their PIN.

Table 1.
Survey Procedures by Experimental Treatment

Mailing schedule (in days) | Mail only (groups 1 and 4) | Web preference (groups 2 and 5) | E-mail preference (group 3)
-3 | Standard pre-notice letter | Standard pre-notice letter | Pre-notice letter alerting to an e-mail invitation
0 | Invitation letter; questionnaire; postage-paid return envelope | Invitation letter including URL and PIN | E-mail letter including URL link and PIN
7 | Standard reminder postcard | Standard reminder postcard | E-mail reminder postcard
21 | Reminder letter; replacement questionnaire; postage-paid return envelope | Reminder letter including URL and PIN; replacement questionnaire; postage-paid return envelope | Reminder letter including URL and PIN; replacement questionnaire; postage-paid return envelope

Findings

To What Extent Will Extension Clients Respond to a Web-Hosted Survey Following an E-Mail Invitation?

To answer this question, the response rate to the mail-only treatment (groups 1 and 4) was used as the standard for comparison because this mode has been used since 2003. As shown in Table 2, a total of 80 clients (53.0%) responded to the mail-only invitation in the group providing e-mail addresses (group 1), and 303 clients (56.3%) responded from the group without an e-mail address (group 4). Note that one of the mail-only respondents requested the Web version of the survey and responded using that mode.

Table 2.
Response Rates by Experimental Treatment

Treatment Group | Sample size | Reachable number(a) | Completes: Mail | Completes: Web | Completes: Total | Percent responding by mail | Percent responding by Web | Total response rate(b)
1. Mail only | 155 | 151 | 79 | 1 | 80 | 52.3 | .7 | 53.0
2. Web preference | 146 | 137 | 17 | 49 | 66 | 12.4 | 35.8 | 48.2
3. E-mail preference | 133 | 104 | 8 | 58 | 66 | 7.7 | 55.8 | 63.5
4. Mail only | 554 | 538 | 303 | 0 | 303 | 56.3 | .0 | 56.3
5. Web preference | 542 | 524 | 150 | 112 | 262 | 28.6 | 21.4 | 50.0
Total | 1,530 | 1,454 | 557 | 220 | 777 | | |
(a) Undeliverable and ineligible cases were subtracted from the reported sample size.
(b) Response rates were calculated as (total completes/reachable number)*100.
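
As a check on footnote b, the short sketch below recomputes the response rates from the completes and reachable numbers reported in Table 2; it assumes nothing beyond the figures in the table.

    # Recompute the Table 2 response rates: (total completes / reachable number) * 100.
    table2 = {
        "1. Mail only":         {"reachable": 151, "mail": 79,  "web": 1},
        "2. Web preference":    {"reachable": 137, "mail": 17,  "web": 49},
        "3. E-mail preference": {"reachable": 104, "mail": 8,   "web": 58},
        "4. Mail only":         {"reachable": 538, "mail": 303, "web": 0},
        "5. Web preference":    {"reachable": 524, "mail": 150, "web": 112},
    }
    for group, d in table2.items():
        total = d["mail"] + d["web"]
        rate = 100 * total / d["reachable"]
        print(f"{group}: {total} completes, response rate {rate:.1f}%")
    # Prints 53.0, 48.2, 63.5, 56.3, and 50.0 percent, matching Table 2.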

When clients were sent the invitation via e-mail (group 3), 66 surveys were completed (63.5%), reflecting response rates of 7.7% by mail and 55.8% by Web. The small percentage responding by mail shows that the postal follow-up did not elicit many responses.

Sending the link to the Internet survey via postal mail, with a later choice of mail or Web (groups 2 and 5), resulted in the lowest response rates (48.2% for group 2, which provided e-mail addresses, and 50.0% for group 5, which did not). As expected, proportionately more surveys were completed via the Web by group 2, which had provided e-mail addresses, than by group 5, which had not (35.8% and 21.4%, respectively).

Does the E-Mailed Invitation Result in Quicker Responses and Reduced Costs?

Clients receiving the e-mail invitation tended to respond more quickly (some responded within minutes of receiving the e-mail) than those receiving the invitation via postal mail (Figure 2). Over 83% of the e-mail group responded within 20 days (and, consequently, did not need to be sent a follow-up letter and replacement questionnaire). In contrast, 44% of the Web preference group (that did not provide an e-mail address) had responded within this time period (group 5, Table 3). Of those responding via the Web-hosted survey, a large majority did so before the replacement questionnaire needed to be sent, irrespective of the treatment group (greater than 80% in the e-mail and two Web preference treatments).

Figure 2.
Cumulative Percent of Completed Surveys by Treatment



Table 3.
Percent Completed Surveys by Response Mode and Time

Sample strata | Treatment Group | By mail: <21 days | By mail: 21+ days | By Web: <21 days | By Web: 21+ days
E-mail address available | 1. Mail only | 75.0 | 23.3 | 1.8(a) | .0
 | 2. Web preference | 6.1(b) | 19.7 | 68.2 | 6.1
 | 3. E-mail preference | .0 | 12.1 | 83.3 | 4.6
No e-mail available | 4. Mail only | 70.0 | 30.0 | .0 | .0
 | 5. Web preference | 8.8(b) | 48.5 | 35.5 | 7.3
(a) Respondent asked to complete the survey on the Web rather than fill in the paper form.
(b) Respondent requested a paper copy of the survey upon receipt of the Web invitation.

In terms of the cost per completed survey, the e-mail treatment was much less expensive ($1.08 in postage per completed survey) than the mail-only and Web preference treatments with e-mail addresses ($3.18 and $3.12, groups 1 and 2, respectively) and the mail-only and Web preference treatments without e-mail addresses ($3.03 and $3.29, groups 4 and 5, respectively). Thus, using e-mail invitations can reduce postage costs as compared to the other treatments.
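
The cost comparison is straightforward arithmetic: postage spent on a treatment divided by the number of completed surveys it produced. The sketch below illustrates the calculation; the postage totals are hypothetical placeholders chosen only to reproduce two of the reported per-complete figures, since the article does not report the underlying totals.

    # Cost per completed survey = total postage for a treatment / completed surveys.
    # The postage totals below are hypothetical placeholders, not figures from the study.
    def cost_per_complete(total_postage, completes):
        return total_postage / completes

    print(round(cost_per_complete(71.28, 66), 2))   # ~1.08, as reported for the e-mail group
    print(round(cost_per_complete(254.40, 80), 2))  # ~3.18, as reported for mail only (group 1)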

Do Clients Who Use the Web Have Different Characteristics and Answers from Those Responding by Mail?

Given that some clients responded via the Internet, an analysis of how Web and mail respondents might differ is important. First, characteristics and substantive answers of "early" respondents (who completed the survey before the replacement questionnaire was sent) were compared. If there are only small differences between the treatment groups, then one can be confident that a survey using an e-mail or Web preference invitation would provide representative results (Smyth et al., 2009). The results in Table 4 show that clients who did not provide an e-mail address and responded by mail (group 4) tended to be older (61.7 years) and were less likely to be female (48.1%) than respondents in the other groups (p=.002 and .030, respectively). Among respondents who provided an e-mail address, there was no significant difference in age or sex among the treatment groups (p=.411 and .918, respectively). One other demographic difference was based on residence: those responding by mail among clients who provided an e-mail address (group 1) were more likely to live in a subdivision, while those in the e-mail and both Web preference groups were more likely to live in a downtown residence of a city or town.

Table 4.
Comparison of Respondents by Treatment Group for Responses Prior to the Last Contact

Item | 1. Mail only | 2. Web pref. | 3. E-mail | 4. Mail only | 5. Web pref. | p-value(b)
(Groups 1-3 comprise the stratum that provided an e-mail address; groups 4-5 the stratum that did not.)
Demographic items
Age (mean years) | 57.3 | 57.6 | 54.3 | 61.7 | 57.1 | .002
   p within strata: .411 | .006
Sex (% Female) | 66.1 | 62.2 | 64.8 | 48.1 | 60.0 | .030
   p within strata: .918 | .059
Race
White, non-Hispanic | 86.2 | 97.8 | 96.3 | 91.4 | 92.0 | .230
Black, non-Hispanic | 1.7 | 2.2 | .0 | 3.4 | 3.5 |
Hispanic | 8.6 | .0 | .0 | 3.4 | 4.6 |
Other | 3.5 | .0 | 3.7 | 1.9 | .0 |
   p within strata: .069 | .588
Educational attainment
Some high school or less | 3.4 | .0 | 1.9 | 2.4 | 2.2 | .217
High school graduate or GED | 6.8 | 9.1 | 13.0 | 19.4 | 20.0 |
Some college | 39.0 | 38.6 | 44.4 | 40.3 | 37.8 |
College bachelor's degree | 25.4 | 40.9 | 25.9 | 19.4 | 25.6 |
Post graduate degree | 25.4 | 11.4 | 14.8 | 18.5 | 14.4 |
   p within strata: .376 | .768
Place of residence
Farm | 10.5 | 21.4 | 5.5 | 26.1 | 22.2 | .012
Rural, non-farm | 42.1 | 26.2 | 38.2 | 27.1 | 28.9 |
Subdivision in a town or city | 10.5 | 2.4 | 1.8 | 4.4 | 3.3 |
Downtown residence in city/town | 36.8 | 50.0 | 54.6 | 42.5 | 45.6 |
   p within strata: .029 | .862
Use of Extension's services items
Number of years (mean) | 9.0 | 8.3 | 8.9 | 13.4 | 12.0 | .012
   p within strata: .929 | .402
Number of contacts last year (mean) | 6.9 | 7.2 | 6.0 | 6.6 | 7.6 | .924
   p within strata: .792 | .507
Visited Solutions for your life website
Yes | 20.7 | 22.7 | 20.8 | 19.2 | 23.6 | .058
No | 70.7 | 75.0 | 71.7 | 80.3 | 71.9 |
Don't know | 8.6 | 2.3 | 7.6 | .5 | 4.5 |
   p within strata: .765 | .029
Satisfaction and outcome items
Information accuracy
Very dis./Dissatisfied/No opinion(a) | 1.7 | 9.8 | 3.9 | 2.4 | 5.6 | .524
Satisfied | 22.4 | 24.4 | 23.1 | 25.7 | 25.6 |
Very satisfied | 75.9 | 65.8 | 73.1 | 71.9 | 68.9 |
   p within strata: .429 | .370
Timely delivery
Very dis./Dissatisfied/No opinion(a) | 1.7 | 12.2 | 6.1 | 2.8 | 7.8 | .163
Satisfied | 22.0 | 19.5 | 24.5 | 29.4 | 25.6 |
Very satisfied | 76.3 | 68.3 | 69.4 | 67.8 | 66.7 |
   p within strata: .293 | .143
Information relevance
Very dis./Dissatisfied/No opinion(a) | .0 | 17.1 | 6.1 | 3.3 | 7.8 | .018
Satisfied | 25.4 | 19.5 | 24.5 | 29.1 | 27.8 |
Very satisfied | 74.6 | 63.4 | 69.4 | 67.6 | 64.4 |
   p within strata: .023 | .247
Ease of understanding
Very dis./Dissatisfied/No opinion(a) | .0 | 14.6 | 7.8 | 2.9 | 6.7 | .034
Satisfied | 22.4 | 24.4 | 19.6 | 25.2 | 28.9 |
Very satisfied | 77.6 | 61.0 | 72.6 | 71.8 | 64.4 |
   p within strata: .060 | .225
Opportunity to use information
Yes | 86.4 | 85.4 | 86.5 | 85.9 | 84.3 | .891
No | 10.2 | 12.2 | 7.7 | 12.1 | 13.5 |
Don't know | 3.4 | 2.4 | 5.8 | 1.9 | 2.3 |
   p within strata: .877 | .933
Shared information with another
Yes | 84.5 | 59.1 | 75.9 | 79.3 | 78.7 | .015
No | 10.3 | 40.9 | 22.2 | 17.8 | 20.2 |
Don't know | 5.2 | .0 | 1.9 | 2.9 | 1.1 |
   p within strata: .004 | .599
Overall satisfaction
Very dis./Dissatisfied/No opinion(a) | 6.9 | 9.1 | 7.3 | 4.3 | 2.2 | .522
Satisfied | 22.4 | 20.5 | 14.6 | 23.0 | 27.8 |
Very satisfied | 70.7 | 70.5 | 78.2 | 72.7 | 70.0 |
   p within strata: .840 | .496
(a) The response categories were combined in calculating the Chi-square statistic.
(b) Compares across the 5 treatment groups. P-values within the experimental treatment groups (1-3 and 4-5, respectively) are shown on the "p within strata" lines below the percentages or means for the groups. The Chi-square test for independence was used in calculating p-values for categorical and ordinal variables, and analysis of variance was used for interval-level variables.
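
Footnote b identifies the two tests used for Tables 4 and 5. The sketch below shows how such tests are commonly run with SciPy; the cross-tabulated counts and age values are invented for illustration and are not the study's data.

    # Chi-square test of independence and one-way ANOVA, the tests named in footnote b.
    # The counts and ages below are invented for illustration; they are not the study's data.
    from scipy import stats

    # Chi-square: rows are response categories, columns are treatment groups.
    observed = [
        [40, 35, 38],   # e.g., "Very satisfied"
        [15, 20, 17],   # e.g., "Satisfied"
        [ 5, 10,  6],   # e.g., remaining categories combined
    ]
    chi2, p_chi, dof, expected = stats.chi2_contingency(observed)
    print(f"Chi-square = {chi2:.2f}, df = {dof}, p = {p_chi:.3f}")

    # One-way ANOVA on an interval-level variable such as age.
    ages_g1 = [57, 61, 55, 60, 58]
    ages_g2 = [56, 59, 54, 62, 57]
    ages_g3 = [52, 55, 53, 58, 54]
    f_stat, p_anova = stats.f_oneway(ages_g1, ages_g2, ages_g3)
    print(f"F = {f_stat:.2f}, p = {p_anova:.3f}")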

Differences among the treatment groups with regard to using Extension's services were substantively minor (see the middle of Table 4). Although the mean number of years using Extension's services was lower for respondents providing an e-mail address than for those who did not, the number of years did not significantly differ between treatments within the two strata. On the other hand, among respondents who did not provide an e-mail address, those returning the questionnaire by mail were more likely to report that they did not visit FCES' Web portal, Solutions for your life website, than did those responding via the Web (80.3% and 71.9%, groups 4 and 5, respectively). Overall, all of the groups were similar in the percentage that affirmed using the website.

Finally, there were differences among the treatment groups for the satisfaction and outcome items, indicating that respondents in the Web preference treatment among those providing an e-mail address (group 2) differed from respondents in the other treatment groups (see the bottom of Table 4). These respondents were more negative about the relevance and ease of understanding of the information (p=.023 and .060, respectively) and were less likely to have shared information (p=.004) than respondents in the mail-only and e-mail groups. Among those who did not provide an e-mail address, there were no significant differences between the treatments for any of these items.

Given that there were differences among the treatment groups for "early" respondents, the next phase of the analysis examined whether the differences were reduced by adding responses from the final contact (which sent a replacement questionnaire to nonrespondents in all the treatment groups via postal mail). The results in Table 5 show that the difference in age was no longer significant between the mail and Web preference groups for those who did not provide an e-mail address (p=.227 for groups 4 and 5). Overall, the average age of respondents who provided an e-mail address remained lower than that of those who did not provide an e-mail address (p=.006). The difference for sex also remained: clients who did not provide an e-mail address and responded by mail (group 4) were less likely to be female (46.1%) than respondents in the other treatment groups (p=.001). Finally, the differences between the treatments for place of residence were still present after sending the replacement questionnaire.

Table 5.
Comparison of Respondents by Treatment Group for All Contacts

Item | 1. Mail only | 2. Web pref. | 3. E-mail | 4. Mail only | 5. Web pref. | p-value(b)
(Groups 1-3 comprise the stratum that provided an e-mail address; groups 4-5 the stratum that did not.)
Demographic items
Age (mean years) | 57.8 | 56.3 | 54.5 | 61.0 | 59.5 | .006
   p within strata: .379 | .227
Sex (% Female) | 65.8 | 68.2 | 63.5 | 46.1 | 53.8 | .001
   p within strata: .854 | .075
Race
White, non-Hispanic | 87.2 | 97.0 | 95.2 | 87.8 | 90.0 | .067
Black, non-Hispanic | 1.3 | 1.5 | 1.6 | 6.1 | 4.4 |
Hispanic | 7.7 | .0 | .0 | 3.4 | 4.8 |
Other | 3.9 | 1.5 | 3.2 | 2.7 | .8 |
   p within strata: .086 | .248
Educational attainment
Some high school or less | 2.5 | .0 | 1.6 | 6.4 | 3.2 | .007
High school graduate or GED | 11.4 | 12.3 | 14.3 | 22.2 | 25.8 |
Some college | 36.7 | 40.0 | 42.9 | 36.9 | 37.7 |
College bachelor's degree | 26.6 | 35.4 | 25.4 | 17.8 | 20.2 |
Post graduate degree | 22.8 | 12.3 | 15.9 | 16.8 | 13.1 |
   p within strata: .659 | .258
Place of residence
Farm | 13.2 | 23.8 | 7.9 | 26.5 | 22.1 | .013
Rural, non-farm | 32.9 | 30.2 | 36.5 | 37.9 | 31.9 |
Subdivision in a town or city | 10.5 | 1.6 | 1.6 | 3.4 | 5.5 |
Downtown residence in city/town | 43.4 | 44.4 | 54.0 | 42.2 | 40.2 |
   p within strata: .024 | .325
Use of Extension's services items
Number of years (mean) | 9.3 | 9.0 | 8.3 | 12.7 | 12.4 | .014
   p within strata: .845 | .766
Number of contacts last year (mean) | 6.5 | 8.5 | 5.9 | 6.0 | 5.9 | .414
   p within strata: .364 | .894
Visited Solutions for your life website
Yes | 20.8 | 24.6 | 20.6 | 16.2 | 12.7 | .014
No | 72.7 | 72.3 | 73.0 | 82.8 | 84.9 |
Don't know | 6.5 | 3.1 | 6.4 | 1.0 | 2.5 |
   p within strata: .873 | .235
Satisfaction and outcome items
Information accuracy
Very dis./Dissatisfied/No opinion(a) | 1.3 | 6.5 | 3.2 | 3.7 | 5.1 | .796
Satisfied | 26.0 | 25.8 | 25.8 | 27.2 | 29.9 |
Very satisfied | 72.7 | 67.7 | 71.0 | 69.1 | 65.0 |
   p within strata: .599 | .507
Timely delivery
Very dis./Dissatisfied/No opinion(a) | 2.6 | 8.1 | 5.1 | 4.0 | 7.5 | .503
Satisfied | 25.6 | 24.2 | 25.4 | 31.1 | 27.7 |
Very satisfied | 71.8 | 67.7 | 69.5 | 64.9 | 64.8 |
   p within strata: .700 | .171
Information relevance
Very dis./Dissatisfied/No opinion(a) | 1.3 | 11.3 | 5.1 | 5.4 | 8.7 | .314
Satisfied | 26.9 | 25.8 | 27.1 | 29.2 | 29.1 |
Very satisfied | 71.8 | 62.9 | 67.8 | 65.4 | 62.2 |
   p within strata: .152 | .304
Ease of understanding
Very dis./Dissatisfied/No opinion(a) | .0 | 11.5 | 6.6 | 3.7 | 5.9 | .135
Satisfied | 27.3 | 26.2 | 24.6 | 29.6 | 30.0 |
Very satisfied | 72.7 | 62.3 | 68.9 | 66.7 | 64.0 |
   p within strata: .063 | .467
Opportunity to use information
Yes | 84.2 | 83.6 | 87.1 | 79.9 | 75.5 | .247
No | 10.5 | 14.8 | 6.5 | 15.7 | 19.7 |
Don't know | 5.3 | 1.6 | 6.5 | 4.4 | 4.8 |
   p within strata: .436 | .452
Shared information with another
Yes | 84.0 | 67.7 | 76.6 | 71.9 | 68.1 | .080
No | 12.0 | 32.3 | 18.8 | 24.4 | 28.7 |
Don't know | 4.0 | .0 | 4.7 | 3.7 | 3.2 |
   p within strata: .024 | .515
Overall satisfaction
Very dis./Dissatisfied/No opinion(a) | 5.2 | 6.2 | 6.2 | 5.7 | 4.7 | .541
Satisfied | 26.0 | 26.2 | 16.9 | 24.5 | 31.4 |
Very satisfied | 68.8 | 67.7 | 76.9 | 69.8 | 63.9 |
   p within strata: .708 | .190
(a) The response categories were combined in calculating the Chi-square statistic.
(b) Compares across the 5 treatment groups. P-values within the experimental treatment groups (1-3 and 4-5, respectively) are shown on the "p within strata" lines below the percentages or means for the groups. The Chi-square test for independence was used in calculating p-values for categorical and ordinal variables, and analysis of variance was used for interval-level variables.

Regarding the use of Extension's services, differences existed (middle of Table 5) between respondents who provided an e-mail address and those who did not. Respondents in the three treatment groups (1-3) that provided an e-mail address had been using Extension for fewer years than those in the two groups (4 and 5) that did not provide an e-mail address (p=.014). Similarly, respondents in the treatment groups that provided an e-mail address were more likely to have visited the Solutions for your life website than were those in the groups that did not provide an e-mail address (p=.014). There was no significant difference among the treatment groups within either stratum (that is, whether an e-mail address was provided or not).

Finally, most of the differences between treatment groups were reduced and not significant for the satisfaction and outcome items, as shown in the bottom of Table 5. Respondents who provided an e-mail address and were in the Web preference group showed, however, lower satisfaction on the ease of understanding item and were less likely to share information than were respondents in the mail and e-mail groups (p=.063 and .024, group 2 versus groups 1 and 3, respectively). Overall, none of the satisfaction and outcome items were significantly different when all five treatment groups were tested simultaneously.

Conclusions and Discussion

The study reported here assessed the utility of using an e-mail invitation for collecting data from Extension clients. The results from FCES' customer satisfaction survey suggested that many clients would be excluded from participating if the survey invitation were sent via e-mail only, because they lack access to the Web or will not provide an e-mail address. When an e-mail address was available, the results for clients who responded to a postal mail invitation were substantively identical to those for clients who responded to an e-mail invitation, and significant savings were achieved over the standard postal administration.

Given that Extension professionals have client groups without universal access to the Web, relying solely on e-mail invitations will likely bias the results to underrepresent respondents who are older, male, and living in a downtown residence of a city or town, as well as longer-term clients and those unlikely to use Extension websites for getting information. Thus, it appears the best strategy for minimizing the cost of collecting data and maximizing the representativeness of the data is to use postal invitations and paper surveys to complement an e-mail invitation to a Web-hosted survey. Care also should be exercised in the design of the paper and Web versions to avoid mode-based errors (Dillman et al., 2009).
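
In practice, this recommendation reduces to a simple rule applied to each client record: invite by e-mail when an address is on file and by postal mail otherwise. The sketch below is one way to encode that rule; the record fields are hypothetical and not drawn from the FCES client lists.

    # A minimal sketch of the recommended mixed-mode strategy: e-mail invitation when an
    # address is on file, postal invitation with a paper questionnaire otherwise.
    # The record structure is hypothetical.
    def choose_invitation_mode(client):
        if client.get("email"):
            return "e-mail invitation to the Web-hosted survey (postal follow-up if needed)"
        return "postal invitation with a paper questionnaire"

    for c in [{"id": 101, "email": "client@example.com"}, {"id": 102, "email": None}]:
        print(c["id"], "->", choose_invitation_mode(c))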

It is clear that the Web preference approach, which uses the mail to send the invitation to complete a Web-hosted survey, results in fewer completed surveys and a lower response rate. This was true in studies by Kongsved, Basnov, Holm-Christensen, and Hjollund (2007) and Smyth et al. (2009). At the same time, this method had the highest cost per completed survey. These findings are consistent with those obtained by Israel (2010), Messer and Dillman (2009), and Smyth et al. (2009). In short, pushing clients to a Web-hosted survey by sending a URL in a letter is a poor strategy for collecting data.

Acknowledgement

This research is part of the Florida Agricultural Experiment Station project FLA-AEC-004832. A version of this paper was presented at the annual meeting of the Southern Rural Sociological Association, Orlando, FL, February 2010.

References

Council of American Survey Research Organizations. (n.d.). Code of standards and ethics for survey research. Retrieved from: http://www.casro.org/pdfs/CodeVertical-FINAL.pdf

Couper, M. P., Kapteyn, A., Schonlau, M., & Winter, J. (2005). Noncoverage and nonresponse in an Internet survey. Social Science Research, 36, 131-148.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method. (3rd ed.) Hoboken, NJ: John Wiley & Sons.

Horrigan, J. B. (2009). Home broadband adoption 2009: Broadband adoption increases, but monthly prices do too. Retrieved from: http://www.pewinternet.org/Reports/2009/10-Home-Broadband-Adoption-2009.aspx

Israel, G. D. (2010). Using Web-hosted surveys to obtain responses from Extension clients: A cautionary tale. Journal of Extension [On-line], 48(4), Article 4FEA8. Available at: http://www.joe.org/joe/2010august/a8.php

Israel, G. D., & Galindo-Gonzalez, S. (2009). Diverse market segments and customer satisfaction: Does Extension serve all clients well? Journal of International Agricultural and Extension Education, 16(1), 89-103.

Kongsved, S. M., Basnov, M., Holm-Christensen, K., & Hjollund, N. H. (2007). Response rate and completeness of questionnaires: A randomized study of Internet versus paper-and-pencil versions. Journal of Medical Internet Research, 9(3), e25. Retrieved from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2047288/?tool=pmcentrez

Malone, S., Herbert, D. A., Jr., & Kuhar, T. P. (2005). An on-line survey process for assessing impact of an email-delivered pest advisory. Journal of Extension [On-line], 43(5), Article 5RIB2. Available at: http://www.joe.org/joe/2005october/rb2.php

Messer, B. L., & Dillman, D. A. (2009). Using address-based sampling to survey the general public by mail vs. Web plus mail. Paper presented at the annual meeting of the American Association for Public Opinion Research, Hollywood, FL, May.

Pew Internet & American Life Project. (2009). Demographics of Internet users: August 18-September 14, 2009 tracking survey. Retrieved from: http://www.pewinternet.org/Static-Pages/Trend-Data/Whos-Online.aspx

Smyth, J. D., Dillman, D. A., Christian, L. M., & O'Neill, A. (2009). Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century. American Behavioral Scientist, 53(9), 1423-1448.

West, B. C. (2007). Conducting program evaluations using the Internet. Journal of Extension [On-line], 45(1), Article 1TOT3. Available at: http://www.joe.org/joe/2007february/tt3.php