The Journal of Extension - www.joe.org

August 2010 // Volume 48 // Number 4 // Feature // v48-4a8

Using Web-Hosted Surveys to Obtain Responses from Extension Clients: A Cautionary Tale

Abstract
Surveys are important tools for Extension professionals. Given the development of Web-hosted surveys, two important questions are "When can they be used?" and "How do the data differ from those collected by other methods?" The study reported here compares three modes of delivery: mail only, mail/Web choice, and Web preference with a mail option. Data showed the response rate for the mail-only mode was highest (64.5%), followed by the mail/Web choice mode (59.2%) and the Web preference mode (52.6%). The evidence indicates a need to consider how the results might be affected by methodological decisions to use the Internet.


Glenn D. Israel
Professor
University of Florida
Gainesville, Florida
gdisrael@ufl.edu

Introduction

These are turbulent times for Extension professionals who wish to use surveys to identify clients' needs or evaluate outcomes of their educational programs. The Internet has sparked considerable interest as a means for conducting surveys to collect data. Continued development of new technologies, especially with regard to the Internet, has complicated the process of contacting clients and increased opportunities for individuals to decide when and how to respond to survey requests (Dillman, Smyth, & Christian, 2009). The variety of Web browsers and hardware configurations, as well as the evolving Internet environment (including threats from viruses, worms, and other malware), affects people's access to and willingness to respond to Web-hosted surveys.

Web-hosted and mixed-mode surveys offer alternatives to telephone and mail surveys, but the consequences of employing these alternatives are not fully understood. Recent research on using the Internet has focused on the use of the Postal Service's Delivery Sequence File (DSF) (Dillman et al., 2009; Smyth, Dillman, Christian, & O'Neill, 2009), which contains addresses for nearly every household in the United States, as a sampling frame for surveys of the general public. While these studies suggest that the general public can be surveyed using a mixture of mail and Web modes, the utility of Web-hosted surveys for Extension clients, for whom there is usually a list of names and postal addresses (but not necessarily e-mail addresses), needs to be demonstrated.

Many Extension professionals hope that Web-hosted surveys can be an easy, low-cost method for collecting data. A number of evaluation surveys have been developed in conjunction with Web-based educational information (see, for example, O'Neill, 2004; Wiersma, 2007), but such "opt-in" evaluation surveys often amount to a convenience sample, and, consequently, the credibility of evaluation findings is significantly weakened. Other evaluations have used e-mail addresses obtained from specialized audiences with nearly universal Internet access to send an e-mail invitation with a link to the survey (Malone, Herbert, & Kuhar, 2005; West, 2007). Despite the enthusiasm for Web-hosted surveys, many have design weaknesses that lead to coverage, sampling, nonresponse, and measurement error (see Dillman et al., 2009). Thus, a thorough analysis of when and how Web-hosted surveys should be implemented is needed.

The study reported here explored the willingness of a broad spectrum of clients who have obtained information from Extension, and for whom no e-mail address is available, to respond to a customer satisfaction survey via the Web. Specifically, the research questions are:

  1. To what extent will Extension clients respond to a Web-hosted survey?

  2. Does using the Web reduce the time needed to collect data and reduce postage costs?

  3. Do clients who use the Web differ from those responding by mail?

  4. Are there substantively important differences in customer satisfaction?

Background

Conducting surveys to collect data on needs for program development or for evaluating program participation, satisfaction, and outcomes entails several important considerations. Selecting a mode—drop-off/pick-up, group administration, telephone, mail, Web-hosted, or mixed mode—for a survey is one of these. Because of resource constraints, group administration (typically during a seminar or workshop) or mail administration is often chosen.

In the case of needs assessment surveys for program development, mailing lists of Extension clients often have incomplete coverage, both from potential clients who are missing from the list and from collaborators and other Extension staff who are inadvertently included. Although evaluation studies usually focus on program participants and use registration lists, there is the problem of contacting nonparticipants to constitute a comparison group for more rigorous evaluation designs. Studies by Dillman et al. (2009) and Link et al. (2008) suggest that the DSF can be a useful sampling frame for surveys of broad clientele groups to minimize coverage problems. Coverage of specialized groups, such as small farm operators, remains problematic.

Whether clients will use the Internet to respond to surveys is influenced by a number of factors, not the least of which is access to the Internet. As of December 2008, 75% of American adults used the Internet (Pew Internet & American Life Project, 2009). Data from the Project also show:

  • Persons with a college education were more likely to have Internet access (95%) than those with only a high school diploma (67%) or less than a high school diploma (35%).

  • Older Americans were less likely to use the Internet (41% of those 65 or older), compared with 82% of persons 30-49 years old and 87% of those 18-29.

  • Hispanics (58%) and Black, non-Hispanics (64%) were less likely to use the Internet than White, non-Hispanics (77%).

  • Rural residents had lower rates of using the Internet (63%) than do urban (71%) and suburban residents (74%).

In addition, just over half (55%) of American adults had a broadband connection at home (Horrigan, 2008). This is significant because Web pages take longer to display with a slow connection, and this increases the psychological cost of responding. Again, people with lower educational attainment, minorities, elders, and rural residents were less likely to have a broadband connection (Horrigan, 2008). These data suggest that some segments of Extension's clientele have better access to a Web-hosted survey than do others.

In addition to access, other factors might affect whether clients respond to a survey via the Web when the request is sent by mail: having ready access to a computer when the mail is opened (that is, a system that is booted up and ready for use), having time to complete the survey "now," being given an easy-to-type Uniform Resource Locator (URL) for accessing the survey, having experience with Internet forms, and deriving psychological benefits from participating in the survey. Having experience using the Internet might increase preference for this mode and reduce psychological costs because experience creates cognitive fluency (Schwarz, Bless, Wänke, & Winkielman, 2003). Don Dillman (personal communication, January 17, 2009) suggested that providing a choice to respond by mail or the Web increases the complexity of the response decision, and this might reduce response rates. Given these factors, it is not surprising that people who responded via the Web were found to differ in a number of ways from those who responded by mail (Smyth et al., 2009).

Finally, whether Web-hosted surveys are appropriate rests on Extension's relationship with the intended recipients. In the case of an evaluation survey targeting clients who participated in an Extension program and who have provided their e-mail address, there is a clear prior relationship. However, it is considered inappropriate to initiate a request to complete a survey using the Internet when there is no prior relationship (Council of American Survey Research Organizations, n.d., p. 8) because the Internet is not considered a public utility (Dillman et al., 2009). In the case of needs assessment surveys or an evaluation survey that includes nonparticipants, people who receive the survey might not have had prior contact with Extension. An invitation to participate in the survey should be conveyed through the mail or by telephone in order to conform to ethical standards for conducting surveys.

With this background, an invitation to respond to a survey using the Internet is expected to have a lower response rate than one by mail, primarily because some clients do not have Internet access. In addition, clients who are offered a choice of responding by mail or the Web will opt for the mail version more often because it is readily accessible. Because of the complexity argument, I expect that fewer clients who are presented a choice will respond as compared to the traditional mail survey.

Methods

The study reported here used data from the Florida Cooperative Extension Service's (FCES) customer satisfaction survey. The survey was sent to clients who had attended a workshop or seminar, or who had called or visited the Extension office. In 2008, a random sample of 1,402 clients was selected from names on registration lists of scheduled educational programs, as well as sign-in sheets at county offices and phone logs from the professional staff. The survey included questions on overall satisfaction with the services provided by Extension, clients' satisfaction with four dimensions of service quality, outcomes of using Extension services, and respondents' demographic attributes. The survey has been conducted annually since 1997 (see Israel & Galindo-Gonzalez, 2009; Terry & Israel, 2004).

Selected clients were randomly assigned to a treatment group:

  1. Mail only: The request for a response includes only the mail mode.

  2. Mail/Web choice: The choice to respond by mail or Web is offered in both the initial mailing and follow-up mailing of the questionnaire.

  3. Web preference: The initial request for a response includes only the Web mode and the follow-up provides a choice of Web and mail.

The mail and Web-hosted surveys were constructed using Dillman et al.'s (2009) unimode design principles. This included using the same questions and question order and, more important, working to minimize differences in visual design (Figure 1). The mail questionnaire was printed on a standard sheet of paper using black text and gray shading to create the figure-ground contrast to distinguish answer spaces and blocks of related questions. Similarly, the Internet survey presented questions in groups, such as items 1 - 4, or singly on separate screens (Figure 1).

Figure 1.
Design of the Mail and Web Questionnaires


The Internet survey was hosted on a University of Florida Web site. Respondents using the Web version typed the URL into their browser's address bar and then entered a 6-digit PIN to access the survey. (See the topmost screen capture for the survey in Figure 1.) Upon entry, the informed consent information was presented, along with "Agree to participate" and "Do not agree to participate" buttons. When "agree" was selected, the screen containing the initial questions was presented.
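To make the entry flow concrete, the short sketch below captures the gatekeeping it implies: a valid 6-digit PIN and an explicit "Agree" are both required before the first question screen is shown. The function name and sample PINs are hypothetical; the actual University of Florida survey application is not reproduced here.

    # Hypothetical sketch of the entry-screen logic described above; the
    # real survey application's code is not part of the article.
    ISSUED_PINS = {"104273", "582911"}  # example PINs; real ones were mailed

    def can_start_survey(pin: str, agreed_to_consent: bool) -> bool:
        """Allow entry only with a valid issued PIN and an explicit 'Agree'."""
        is_valid = len(pin) == 6 and pin.isdigit() and pin in ISSUED_PINS
        return is_valid and agreed_to_consent

    # A valid PIN alone is not enough; consent must be affirmed first.
    assert can_start_survey("104273", agreed_to_consent=True)
    assert not can_start_survey("104273", agreed_to_consent=False)
    assert not can_start_survey("999999", agreed_to_consent=True)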

Letters also were constructed to provide a similar verbal and visual presentation to clients. The URL for the survey and a PIN were included in the invitation letter and the follow-up letter to clients who were in the mail/Web choice or the Web preference treatments. A series of contacts was used to implement the survey, as shown in Table 1. As expected, some clients who received the Web preference invitation did not have access to the Internet; they later received a copy in the mail. A few clients who called or e-mailed the author were sent an e-mail message containing a link to the survey and their PIN.

Table 1.
Survey Procedures by Experimental Treatment

Mailing Schedule (in days) | Mail Only | Mail/Web Choice | Web Preference
-3 | Standard pre-notice letter | Standard pre-notice letter | Standard pre-notice letter
0 | Invitation letter; questionnaire; postage-paid return envelope | Invitation letter including URL and PIN; questionnaire; postage-paid return envelope | Invitation letter including URL and PIN
7 | Standard reminder postcard | Standard reminder postcard | Standard reminder postcard
21 | Reminder letter; replacement questionnaire; postage-paid return envelope | Reminder letter including URL and PIN; replacement questionnaire; postage-paid return envelope | Reminder letter including URL and PIN; replacement questionnaire; postage-paid return envelope

The data were analyzed using SAS for Windows, version 9.2 (SAS Institute Inc.). The Chi-square test for independence was used to test for differences in demographic attributes, use of Extension, and satisfaction/outcomes by mode of response (Web or mail) for categorical and ordinal variables. Analysis of variance was used to test for differences for interval variables by mode of response.
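For readers who want to reproduce this style of analysis outside SAS, the sketch below shows the two tests in Python with scipy. The contingency counts and the standard deviation for age are illustrative placeholders, not the study's raw data; only the group means and sizes echo figures reported later in Tables 2 and 3.

    # Illustrative analogue of the analyses described above (the study
    # itself used SAS 9.2). Counts and spreads below are placeholders.
    import numpy as np
    from scipy import stats

    # Chi-square test of independence: response mode by place of residence.
    # Rows: mail, Web respondents; columns: farm, rural non-farm, urban.
    observed = np.array([[26, 30, 45],
                         [14, 46, 70]])
    chi2, p, dof, _ = stats.chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

    # One-way ANOVA for an interval variable (age) by response mode.
    rng = np.random.default_rng(0)
    age_mail = rng.normal(59.8, 14.0, 104)  # means from Table 3; SD assumed
    age_web = rng.normal(53.9, 14.0, 130)
    f_stat, p_anova = stats.f_oneway(age_mail, age_web)
    print(f"F = {f_stat:.2f}, p = {p_anova:.3f}")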

Findings

To What Extent Will Extension Clients Respond to a Web-Hosted Survey?

The response rate to the mail-only treatment served as the standard for comparison, with 282 clients (64.5%) responding to the mail-only invitation (Table 2). When clients were presented with a choice of responding by mail or Web, 258 surveys were completed (59.2%), reflecting response rates of 51.4% by mail and 7.8% by Web. The Web preference treatment, where clients were first given the link to the Internet survey and later the choice of mail or Web, resulted in the lowest response rate (52.6%). The response rate for this treatment was higher by Web (29.2%) than by mail (23.4%). The Web preference treatment results suggest that a substantial proportion of Extension clients can be enticed to respond via the Internet, but the mail/Web choice treatment indicates that more would prefer the mail survey.

Table 2.
Response Rates by Experimental Treatment

Treatment | Sample Size^a | Mail Completes | Web Completes | Total Completes | Percent Responding by Mail | Percent Responding by Web | Total Response Rate^b
Mail only | 437 | 282 | 0 | 282 | 64.5% | 0% | 64.5%
Mail/Web choice | 436 | 224 | 34 | 258 | 51.4% | 7.8% | 59.2%
Web preference | 445 | 104 | 130 | 234 | 23.4% | 29.2% | 52.6%
Total | 1,318 | 610 | 164 | 774 | | |

^a Undeliverable and ineligible cases were subtracted from the reported sample size.

^b Response rates were calculated as (total completes/sample size)*100.
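As a quick check, the rates in Table 2 can be recomputed directly from footnote b:

    # Recompute the Table 2 response rates from the completes and the
    # adjusted sample sizes, per footnote b.
    treatments = {
        "Mail only":       (437, 282, 0),
        "Mail/Web choice": (436, 224, 34),
        "Web preference":  (445, 104, 130),
    }
    for name, (n, mail, web) in treatments.items():
        print(f"{name}: mail {100 * mail / n:.1f}%, Web {100 * web / n:.1f}%, "
              f"total {100 * (mail + web) / n:.1f}%")
    # Prints 64.5%, 59.2%, and 52.6% totals, matching the table.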


Does Using the Web Reduce the Time Needed to Collect Data and Reduce Postage Costs?

Clients who responded via the Web tended to do so sooner than those using the mail. Over 80% of those who responded via the Internet did so within 20 days (Figure 2), compared with 69% of clients who used the mail questionnaire. Although a higher percentage of clients using the Web responded early, this advantage was offset by the overall lower response rate. Just 105 clients in the Web preference treatment responded via the Web by day 20, so the remaining 340 clients received the follow-up letter and replacement questionnaire. Only 244 clients in the mail-only treatment were sent the replacement questionnaire because 193 responded within 20 days. A total of 257 clients in the mail/Web choice treatment were sent the replacement questionnaire after 179 clients responded to the initial invitation (151 by mail and 28 via the Web).

The extra mail-out cost for the Web preference treatment was offset by savings from avoiding charges for surveys returned in postage-paid envelopes. But in terms of the cost per completed survey, the Web preference treatment was more expensive ($3.07 in postage per complete) than the mail-only and mail/Web choice treatments ($2.68 and $2.83, respectively). Thus, using a Web-hosted survey did not reduce postage costs when clients must receive the invitation via the mail.
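Although only per-complete costs are reported, the total postage implied by each treatment follows directly from those figures and the completes in Table 2 (subject to rounding in the reported costs):

    # Implied total postage per treatment, derived from the reported cost
    # per completed survey and the completes in Table 2.
    reported = {
        "Mail only":       (2.68, 282),
        "Mail/Web choice": (2.83, 258),
        "Web preference":  (3.07, 234),
    }
    for name, (per_complete, completes) in reported.items():
        print(f"{name}: about ${per_complete * completes:.2f} in postage")
    # Web preference spent the least in total (~$718 vs. ~$756 for mail
    # only), but its lower response rate made each complete costlier.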

Figure 2.
Cumulative Percent of Completed Surveys by Mode



Do Clients Who Use the Web Differ from Those Responding by Mail?

Given that some clients will respond via the Internet, an analysis of how Web and mail respondents might differ is important. First, characteristics of Web and mail respondents from the Web preference treatment group are compared. Smyth et al. (2009) suggest that if there are no differences, or only small ones, between Web and mail respondents, then one can be confident that a Web-only survey would provide representative results.

The results in Table 3 show that clients who responded via the Web differ from those who responded by mail on a number of characteristics. Clients who responded via the Web were significantly younger than those who responded by mail (53.9 and 59.8 years, respectively). Respondents using the Web also tended to have higher levels of educational attainment (especially some college or completed a college degree) than did those responding by mail. Clients using the Web to respond were more likely to be working for pay (60.6%) than those responding by mail (47.1%), which reflects some of the age-based differences noted above. Finally, clients responding by the Web were much less likely to live on a farm than those responding by mail (10.8% and 25.7%, respectively).

In addition to the demographic differences, clients using the Web differed from those using mail on two of the service utilization variables. Clients responding via the Web reported twice as many contacts during the last year as did those responding by mail (8.6 and 4.1, respectively; Table 3). Even more striking, the former group was more than four times as likely to have visited FCES' Solutions for your life Web site (a portal to information on a host of topics) as were clients responding by mail (36.2% and 7.8%, respectively).

Because response mode differences were found for a number of client demographics and service utilization measures, the Web and mail responses were combined into their respective treatment groups, mail/Web choice and Web preference, to examine whether the combined data produced results comparable to those of the mail-only standard. The results in Table 4 show that the demographic characteristics of the mail/Web choice and Web preference treatments did not differ significantly from the mail-only treatment on any measure. For example, mean age varied by only .2 years across the mail-only, mail/Web choice, and Web preference respondents. No significant differences between the treatment groups were evident for sex, race, educational attainment, place of residence, or employment status. The service utilization variables also showed no significant differences between the treatment groups for the number of years using Extension, the number of contacts with Extension during the last year, or whether the client had visited the Solutions for your life Web site.

Are There Substantively Important Differences in Customer Satisfaction?

The satisfaction and outcome items in Table 3 show that clients responding via the Web differed somewhat from those responding by mail. Specifically, the two groups differed on two items measuring service quality (information accuracy and ease of understanding), with a larger percentage of clients responding via the Web reporting "Very satisfied" (75.8% and 77.2%, respectively) than of those responding by mail (62.1% and 64.7%, respectively). The other two service quality items (timely delivery and information relevance) showed a similar but non-significant trend. Two outcome measures, having an opportunity to use the information and sharing the information with another person, did not differ by response mode. Finally, a larger percentage of clients responding via the Web (76.2%) than by mail (62.1%) reported that they were "Very satisfied" with the overall service of the Extension office.

Though there are a number of differences by response mode in the substantive items, comparisons among the three treatment groups show no significant differences (Table 4). In sum, combining Web and mail responses in the mail/Web choice and the Web preference treatments resulted in response distributions that are similar to those for the mail-only treatment for the substantive items.

Table 3.
Comparison of Responses by Mode for the Web Preference Treatment

(Entries are percentages unless the item is labeled as a mean; difference = Mail - Web.)

Demographic Items | Mail | Web | Difference | p-value
Age (mean) | 59.8 | 53.9 | 5.9 | .003
Sex (% Female) | 52.5 | 62.5 | -10.0 | .131
Race | | | | .901
  White, non-Hispanic | 92.9 | 94.5 | -1.6 |
  Black, non-Hispanic | 4.0 | 2.4 | 1.6 |
  Hispanic | 2.0 | 2.4 | -.4 |
  Other | 1.0 | .8 | .2 |
Educational attainment | | | | .010
  Some high school or less | 4.9 | 2.3 | 2.6 |
  High school graduate or GED | 27.5 | 12.3 | 15.2 |
  Some college | 31.4 | 43.0 | -11.6 |
  College bachelor's degree | 19.6 | 30.0 | -10.4 |
  Post graduate degree | 16.7 | 12.3 | 4.4 |
Place of residence | | | | .012
  Farm | 25.7 | 10.8 | 14.9 |
  Rural, non-farm | 29.7 | 35.4 | -5.7 |
  Urban | 44.6 | 53.9 | -9.3 |
Employment status (% work for pay) | 47.1 | 60.6 | -13.5 | .000

Use of CES Services Items | Mail | Web | Difference | p-value
Number of years (mean) | 9.1 | 9.0 | .1 | .947
Number of contacts last year (mean) | 4.1 | 8.6 | -4.5 | .000
Visited Solutions for your life Web site | | | | .000
  Yes | 7.8 | 36.2 | -28.4 |
  No | 90.3 | 60.8 | 29.5 |
  Don't know | 1.9 | 3.1 | -1.2 |

Satisfaction and Outcome Items | Mail | Web | Difference | p-value
Information accuracy^a | | | | .030
  Very dissatisfied/Dissatisfied/No opinion | 3.9 | 5.5 | -1.6 |
  Satisfied | 34.0 | 18.8 | 15.2 |
  Very satisfied | 62.1 | 75.8 | -13.7 |
Timely delivery^a | | | | .157
  Very dissatisfied/Dissatisfied/No opinion | 4.9 | 10.2 | -5.3 |
  Satisfied | 32.0 | 23.4 | 8.6 |
  Very satisfied | 63.1 | 66.4 | -3.3 |
Information relevance^a | | | | .318
  Very dissatisfied/Dissatisfied/No opinion | 6.8 | 7.8 | -1.0 |
  Satisfied | 35.0 | 25.8 | 9.2 |
  Very satisfied | 58.3 | 66.4 | -8.1 |
Ease of understanding^a | | | | .025
  Very dissatisfied/Dissatisfied/No opinion | 2.9 | 5.5 | -2.6 |
  Satisfied | 32.4 | 17.3 | 15.1 |
  Very satisfied | 64.7 | 77.2 | -12.5 |
Opportunity to use information | | | | .639
  Yes | 85.0 | 80.5 | 4.5 |
  No | 12.0 | 16.4 | -4.4 |
  Don't know | 3.0 | 3.1 | -.1 |
Shared information with another | | | | .114
  Yes | 71.7 | 81.3 | -9.6 |
  No | 27.3 | 16.4 | 10.9 |
  Don't know | 1.0 | 2.3 | -1.3 |
Overall satisfaction^a | | | | .038
  Very dissatisfied/Dissatisfied/No opinion | 3.9 | 4.6 | -.7 |
  Satisfied | 34.0 | 19.2 | 14.8 |
  Very satisfied | 62.1 | 76.2 | -14.1 |

^a The response categories were combined in calculating the Chi-square statistic.
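Footnote a can be made concrete: the sparse dissatisfaction categories are collapsed into a single category before the chi-square test is run. In the sketch below, the counts are approximate, reconstructed from the Table 3 percentages and the Web preference group sizes; they are not the study's raw data, but the test lands near the reported p = .038 for overall satisfaction.

    # Footnote a illustrated: combined categories for overall satisfaction
    # in the Web preference treatment. Counts are approximate
    # reconstructions from Table 3 percentages, not the raw data.
    import numpy as np
    from scipy import stats

    # Rows: mail, Web respondents.
    # Columns: VD/D/No opinion (combined), Satisfied, Very satisfied.
    combined = np.array([[4, 35, 64],
                         [6, 25, 99]])
    chi2, p, dof, _ = stats.chi2_contingency(combined)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")  # p near .038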

Table 4.
Comparison of Responses by Treatment

(Entries are percentages unless the item is labeled as a mean.)

Demographic Items | Mail Only | Mail/Web Choice | Web Preference | p-value
Age (mean) | 56.7 | 56.8 | 56.6 | .978
Sex (% Female) | 55.2 | 53.5 | 58.2 | .592
Race | | | | .542
  White, non-Hispanic | 91.8 | 96.0 | 93.8 |
  Black, non-Hispanic | 4.5 | 2.4 | 3.1 |
  Hispanic | 1.9 | .8 | 2.2 |
  Other | 1.9 | .8 | .9 |
Educational attainment | | | | .937
  Some high school or less | 1.8 | 2.0 | 3.5 |
  High school graduate or GED | 19.9 | 18.9 | 19.0 |
  Some college | 38.6 | 38.2 | 37.9 |
  College bachelor's degree | 25.4 | 23.6 | 25.4 |
  Post graduate degree | 14.3 | 17.3 | 14.2 |
Place of residence | | | | .368
  Farm | 12.5 | 11.8 | 17.3 |
  Rural, non-farm | 35.2 | 38.2 | 32.9 |
  Urban | 52.3 | 50.0 | 49.8 |
Employment status (% work for pay) | 51.5 | 55.9 | 54.6 | .155

Use of CES Services Items | Mail Only | Mail/Web Choice | Web Preference | p-value
Number of years (mean) | 10.1 | 10.2 | 9.0 | .460
Number of contacts last year (mean) | 5.8 | 5.6 | 6.6 | .457
Visited Solutions for your life Web site | | | | .284
  Yes | 17.3 | 18.4 | 23.6 |
  No | 80.5 | 80.4 | 73.8 |
  Don't know | 2.2 | 1.2 | 2.6 |

Satisfaction and Outcome Items | Mail Only | Mail/Web Choice | Web Preference | p-value
Information accuracy^a | | | | .357
  Very dissatisfied/Dissatisfied/No opinion | 4.3 | 7.9 | 4.8 |
  Satisfied | 22.7 | 23.4 | 25.5 |
  Very satisfied | 73.0 | 68.7 | 69.7 |
Timely delivery^a | | | | .165
  Very dissatisfied/Dissatisfied/No opinion | 4.7 | 9.2 | 7.8 |
  Satisfied | 22.4 | 22.5 | 27.3 |
  Very satisfied | 72.9 | 67.3 | 64.9 |
Information relevance^a | | | | .409
  Very dissatisfied/Dissatisfied/No opinion | 6.9 | 7.9 | 7.4 |
  Satisfied | 22.8 | 27.8 | 29.9 |
  Very satisfied | 70.3 | 64.3 | 62.8 |
Ease of understanding^a | | | | .872
  Very dissatisfied/Dissatisfied/No opinion | 4.0 | 5.6 | 4.4 |
  Satisfied | 25.8 | 26.3 | 24.0 |
  Very satisfied | 70.2 | 68.1 | 71.6 |
Opportunity to use information | | | | .415
  Yes | 79.3 | 75.9 | 82.5 |
  No | 18.5 | 20.9 | 14.5 |
  Don't know | 2.2 | 3.2 | 3.1 |
Shared information with another | | | | .446
  Yes | 77.0 | 79.2 | 77.1 |
  No | 18.6 | 17.6 | 21.2 |
  Don't know | 4.5 | 3.3 | 1.8 |
Overall satisfaction^a | | | | .462
  Very dissatisfied/Dissatisfied/No opinion | 4.7 | 7.1 | 4.3 |
  Satisfied | 21.5 | 22.2 | 25.8 |
  Very satisfied | 73.8 | 70.6 | 70.0 |

^a The response categories were combined in calculating the Chi-square statistic.

Conclusions and Discussion

The results suggest that Web-hosted surveys can be used effectively to collect information from Extension clients when the invitation is sent via postal mail. But differences between clients who responded via the Web and those responding by mail indicate that Extension professionals should avoid relying on the Web alone, except when nearly all of the targeted clients have access to the Internet. On the other hand, the similarity of the results across the mail-only, mail/Web choice, and Web preference treatments suggests that mixed-mode surveys can be considered as an option for collecting data from clients. These results are remarkably consistent with the recent study by Smyth et al. (2009).

On the down side, response rates were reduced for the two treatments involving the Internet, more so when clients were pushed to use the Web (as in the Web preference treatment). This can result in less data being available for the intended analysis, and in less precision than targeted, if the lower response rate is not factored into the calculation of the initial sample size. Also, the analysis of days to respond indicates that Extension professionals are unlikely to reap significant savings on postage costs during survey implementation. The study did not explore, however, the utility of conducting the entire survey electronically in situations where e-mail addresses are available for all of the clients.

It is not clear why the response rate was lower when the Web option was included. The data suggest that few respondents were motivated enough to go to the computer, type in the URL and a PIN, and then complete the Web-hosted version. This supports Don Dillman's (personal communication, January 17, 2009) hypothesis that the complexity and uncertainty of responding via the Web might act as a disincentive for potential respondents. Although the mail/Web choice treatment response rate was lower than that for the mail-only survey, the effect might have been reduced because the simplicity of the short, two-page printed survey clearly conveyed to recipients that the response task was quick and easy.

It is also noteworthy that respondents using the Web had twice as many contacts with Extension during the last year and, consequently, may have been more motivated to respond using this mode, as reflected by the higher percentage reporting "Very satisfied" responses to the service quality and overall satisfaction items. This is consistent with leverage-salience theory (Groves, Singer, & Corning, 2000) in that the salience of the invitation to respond via the Web was more important (that is, had greater leverage) for some clients than for others, resulting in the observed differences in the characteristics of Web and mail respondents. The observed mode differences, coupled with the arguments of leverage-salience theory, give weight to Dillman et al.'s (2009) recommendations that broad-based appeals should be designed for survey invitations and that mixed-mode approaches should be considered as an alternative to a Web-only approach.

Acknowledgement

This research is part of the Florida Agricultural Experiment Station project FLA-AEC-004832.

References

Council of American Survey Research Organizations. (n.d.). Code of standards and ethics for survey research. Retrieved January 21, 2009 from: http://www.casro.org/pdfs/CodeVertical-FINAL.pdf

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: John Wiley & Sons.

Groves, R. M., Singer, E., & Corning, A. (2000). Leverage-salience theory of survey participation. Public Opinion Quarterly, 64(3), 299-308.

Horrigan, J. B. (2008). Home broadband adoption 2008: Adoption stalls for low-income Americans even as many broadband users opt for premium services that give them more speed. Retrieved January 21, 2009 from: http://www.pewInternet.org/pdfs/PIP_Broadband_2008.pdf

Israel, G. D., & Galindo-Gonzalez, S. (2009). Diverse market segments and customer satisfaction: Does Extension serve all clients well? Journal of International Agricultural and Extension Education, 16(1), 89-103.

Link, M. W., Battaglia, M. P., Frankel, M., Osborn, L., & Mokdad, A. (2008). Comparison of address-based sampling (ABS) versus random-digit dialing (RDD) for general population surveys. Public Opinion Quarterly, 72(1), 6-27.

Malone, S., Herbert, D. A., Jr., & Kuhar, T. P. (2005). An on-line survey process for assessing impact of an email-delivered pest advisory. Journal of Extension [On-line], 43(5), Article 5RIB2. Available at: http://www.joe.org/joe/2005october/rb2.php

O'Neill, B. (2004). Collecting research data online: Implications for Extension professionals. Journal of Extension [On-line], 42(3). Available at: http://www.joe.org/joe/2004june/tt1.php

Pew Internet & American Life Project. (2009). Demographics of Internet users: November 19-December 20, 2008 tracking survey. Retrieved January 21, 2009 from: http://www.pewInternet.org/trends/User_Demo_10%2020%2008.htm

Schwarz, N., Bless, H., Wänke, M., & Winkielman, P. (2003). Accessibility revisited. In G. V. Bodenhausen & A. J. Lambert (Eds.), Foundations of social cognition: A festschrift in honor of Robert S. Wyer, Jr. (pp. 51-77). Mahwah, NJ: Lawrence Erlbaum.

Smyth, J. D., Dillman, D. A., Christian, L. M., & O'Neill, A. (2009). Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century. Unpublished working paper.

Terry, B. D., & Israel, G. D. (2004). Agent performance and consumer satisfaction. Journal of Extension [On-line], 42(6), Article 6FEA4. Available at: http://www.joe.org/joe/2004december/a4.php

West, B. C. (2007). Conducting program evaluations using the Internet. Journal of Extension [On-line], 45(1), Article 1TOT3. Available at: http://www.joe.org/joe/2007february/tt3.php

Wiersma, J. J. (2007). Development and impact of an Extension Web site. Journal of Extension [On-line], 45(5), Article 5RIB5. Available at: http://www.joe.org/joe/2007october/rb5.php