August 2012 // Volume 50 // Number 4 // Tools of the Trade // v50-4tt4
Increasing Your Productivity with Web-Based Surveys
Abstract
Web-based survey tools such as Survey Monkey can be used in many ways to increase the efficiency and effectiveness of Extension professionals. This article describes how Survey Monkey has been used at the state and county levels to collect community and internal staff information for program planning, administration, evaluation, and the planning of effective training.
Web-based surveys enable surveyors to create and publish their own questions for distribution by email or by embedding them in a website. Recipients can access and respond to the survey questions faster, at lower cost, and with less effort than with traditional paper surveys (Archer, 2003). Survey Monkey is one online tool that offers a free version, limited to 100 respondents and 10 survey questions, as well as paid versions with additional features. Templates are available on the website, or surveyors can create questions from scratch. Numerous reports can also be generated to summarize survey results. Similar online survey tools are available from Google and from providers such as SurveyGizmo, Instant Survey, and Zoomerang.
This article describes how Survey Monkey has been used at the state and local levels in the University of Missouri's Extension Service to collect community data, to support program planning and administration, and to gather data before and after training events.
Collecting Community Data
Using an online survey is a cost-effective way to gain input from the local community (Archer, 2003). Although not appropriate for all audiences, an online survey emailed to a listserv or posted on a website will reach a much larger audience than handing out or mailing hard copies. It is important to remember, however, that not everyone has access to or is comfortable using a computer; some respondents may prefer a hard copy (O'Neill, 2004). Using both methods to collect data from a large community would ensure a more heterogeneous respondent pool and therefore more accurate data.
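For surveyors who already maintain an email list, the survey link can be distributed with a short script rather than pasted into messages one at a time. The sketch below, written in Python with the standard smtplib and email libraries, mails a hypothetical survey link to a placeholder recipient list; the link, addresses, and mail server shown are illustrative only and would need to be replaced with real values.

    import smtplib
    from email.message import EmailMessage

    # Hypothetical survey link, sender, recipients, and mail server;
    # replace each with your own values before sending.
    SURVEY_URL = "https://www.surveymonkey.com/r/EXAMPLE"
    SENDER = "extension-office@example.edu"
    RECIPIENTS = ["resident1@example.com", "resident2@example.com"]

    with smtplib.SMTP("smtp.example.edu") as server:
        for address in RECIPIENTS:
            # Build a fresh message for each recipient.
            msg = EmailMessage()
            msg["Subject"] = "Neighborhood health survey: your input requested"
            msg["From"] = SENDER
            msg["To"] = address
            msg.set_content(
                "Please take a few minutes to complete our short survey:\n"
                + SURVEY_URL
            )
            server.send_message(msg)

Sending individual messages, rather than one message with every address visible, keeps recipients' contact information private.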
In Missouri, Survey Monkey was utilized to develop a community health survey for a specific neighborhood in St. Louis City. The goal was to determine the neighborhood residents' ability to access healthy and affordable food and safe physical activity. Results from the online survey helped inform a health coalition on the needs, strengths, and weaknesses of the neighborhood. The hope is to influence policy change that will specifically meet the needs of this neighborhood, leading to better acceptance, lasting change, and better health outcomes.
Using Online Survey Tools for Program Planning and Administration
Web-based surveys have been an essential tool for Family Nutrition Education Program (FNEP) administrators, enabling them to efficiently collect information from a staff of 19 nutrition educators. Online surveys were used prior to a staff training to determine interest in professional development topics, preferred scheduling of future trainings, and requests for materials. Staff could respond anonymously to multiple-choice questions and provide comments in text boxes. In addition, Survey Monkey was used to collect information about program diversity, with the goal of determining whether all of the FNEP target audiences were being served by regional staff.
Online surveys helped determine whether FNEP target audiences were currently receiving programming, requesting programming, or included in plans for future programming. To organize this information while keeping the survey under the free version's 10-question limit, the questions were combined into one large table. From these data, a strategic plan was developed for programming in the region. From a means of receiving suggestions regarding training and materials to a platform for categorizing staff input, Survey Monkey has been an invaluable tool for FNEP administrative staff.
Survey Monkey is also an extremely useful tool for compiling and storing evaluation data collected at Extension programs (West, 2007). Purchasing an unlimited account enables educators to enter survey data electronically, categorize surveys into folders, and easily view analyzed results. Results can also be exported and used in more sophisticated data analysis software. Showing the impact of Extension programs is vital to continued funding and support. Survey Monkey is a simple and effective tool for compiling evaluation outcomes, leading to better communication of Extension's impacts to stakeholders.
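As one illustration of that export step, the short Python sketch below reads a hypothetical exported results file with the pandas library and tallies responses to two questions; the file name and column headings are placeholders, since actual exports vary with each survey's design.

    import pandas as pd

    # Hypothetical export file and column names; a real export will differ,
    # so adjust these to match the downloaded file.
    responses = pd.read_csv("survey_export.csv")

    # Count how many respondents chose each answer to a multiple-choice item.
    print(responses["preferred_training_topic"].value_counts())

    # Show the share of respondents selecting each scheduling option.
    print(responses["preferred_schedule"].value_counts(normalize=True).round(2))

Summaries like these can be produced in minutes once the exported file is in hand, which makes it easier to report program outcomes soon after data collection ends.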
Collecting Data for Planning and Evaluating Training Events
Online surveys have been used at the state level to collect training topic ideas from educators in the field. Because the survey link remains open online, educators can share ideas for upcoming statewide trainings at any time. Educators who submit topics and identify themselves receive immediate feedback or support.
Surveys have also been used at the state level to collect information before and after statewide training events. Before a training event, a survey can gather the questions participants have about the training topic, helping the presenter tailor the presentation content to the audience's needs. Surveys can also be used to collect post-training evaluations. Quickly collecting comments and questions from training participants enables a rapid response to the questions received.
Online survey tools can help Extension professionals become more efficient and effective at gathering, analyzing, and using internal and external feedback.
References
Archer, T. M. (2003). Web-based surveys. Journal of Extension [On-line], 41(4), Article 4TOT6. Available at: https://www.joe.org/joe/2003august/tt6.php
O'Neill, B. (2004). Collecting research data online: Implications for Extension professionals. Journal of Extension [On-line], 42(3), Article 3TOT1. Available at: https://www.joe.org/joe/2004june/tt1.php
West, B. C. (2007). Conducting program evaluations using the Internet. Journal of Extension [On-line], 45(1), Article 1TOT3. Available at: https://www.joe.org/joe/2007february/tt3.php