October 1999 // Volume 37 // Number 5 // Tools of the Trade // 5TOT1


Dot Posters: A Practical Alternative to Written Questionnaires and Oral Interviews

Abstract
Dot posters provide a quick, inexpensive, and reliable method for collecting information in public settings such as farmers' markets. Instead of filling out questionnaires or being interviewed, respondents answer close-ended questions on large flip charts using stick-on "dots". Consumer response to the approach was overwhelmingly positive: 90% of those approached agreed to participate, and 94% preferred this data collection method to written questionnaires. Overall, the dot posters add to rather than detract from the atmosphere in the markets. Dot posters represent an accessible and useful tool that should be considered for many research situations.


Larry Lev
Extension Economist
Oregon State University
Corvallis, Oregon
Internet address: Larry.s.lev@orst.edu

Garry Stephenson
Small Farms Extension Agent
Benton County/Oregon State University
Corvallis, Oregon


Have you hesitated to collect information to avoid placing a burden on respondents? Have you been concerned about the representativeness of your data because so many people refused to cooperate? Did you ever wonder whether you could make your research more participatory and more fun? This paper discusses a method developed to address these three challenges. The authors believe that dot posters offer a quick, inexpensive, and reliable method for collecting information.

Over a three-month period in 1998, dot posters were used as the primary research tool for conducting consumer research at three farmers' markets in Oregon. The research effort was initiated because little was known about the economic and social functions of the markets. Three classic data collection techniques were considered and rejected: face-to-face interviews (high personnel requirements), mail-back surveys distributed in the market (insufficient response rate), and random mail surveys (not adequately targeted at farmers' market shoppers). Beyond these practical problems, none of these three approaches provides a fun and participatory research experience.

A new method was developed that achieved excellent results for response rate, data reliability, and participant satisfaction. The nuts and bolts are straightforward. The authors first carefully crafted the questions (usually four per market session) that they wanted answered and wrote them as close-ended questions on large flip charts, with the answer choices arrayed on a scale across the bottom. Hundreds of "dots" (round, colorful sticky labels, also known as color coding labels) were prepared in advance in strips of four. On market day, four flip charts were set up in a row at the market.

During the market, consumers were approached and asked if they had a moment to answer the questions. If they agreed, they received a strip of dots with instructions to place one on each poster "...where it makes the most sense." While in most instances the questions were answered "self-service" (that is, the consumers placed their own dots), assistance was offered to those who didn't have a spare hand because they were loaded down with purchases or kids. Data tabulation at the end of the market is quite simple: few questions have been asked, and the responses to each question are all on a single sheet of paper. In 20 or 30 minutes, the market data set is tabulated.
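For anyone who wants an electronic record, the end-of-market tally amounts to counting the dots in each answer category and converting the counts to percentages. The short Python sketch below illustrates this; the category labels and counts are hypothetical placeholders, not data from the Oregon markets.

    # Tabulate one dot poster: count the dots in each answer
    # category and convert the counts to percentage shares.
    # All category labels and counts are hypothetical placeholders.
    counts = {
        "Under $5": 12,
        "$5 to $10": 47,
        "$10 to $20": 63,
        "Over $20": 28,
    }

    total = sum(counts.values())
    print(f"Respondents: {total}")
    for answer, n in counts.items():
        print(f"  {answer}: {n} dots ({100 * n / total:.0f}%)")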

The open display of the set of answers (the dots) during the research process raises the possibility that individuals will be swayed by the responses already posted. Given the type of questions asked, however, the authors did not feel this occurred. The reliability of responses was checked in two ways. First, the same questions were asked on multiple market days in order to verify the consistency of answers. Second, for one key question (how much consumers spent in the market), the authors were able to crosscheck the reported purchases against sales information collected from the vendors.
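One way to formalize the day-to-day consistency check, not described in the article but sketched here as an illustration, is a chi-square test of homogeneity on the dot counts for the same question from two market days (hypothetical counts below):

    # Sketch: test whether the answer distribution for the same
    # question is consistent across two market days.
    # The counts are hypothetical; the article reports only that
    # answers were consistent, not how consistency was assessed.
    from scipy.stats import chi2_contingency

    observed = [
        [12, 47, 63, 28],  # day 1 dot counts per answer category
        [10, 52, 58, 30],  # day 2 dot counts per answer category
    ]
    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
    # A large p-value is consistent with the same underlying
    # answer distribution on both days.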

This approach restricts the number of questions that can be asked. This constraint was addressed by conducting research on multiple days in each market, which allowed certain key questions to be asked more than once (and thereby answered with greater confidence) and a variety of other questions to be rotated in. Only close-ended questions with answers that respondents can quickly understand can be used. Some examples of questions that were used successfully are:

  • How much have you (or will you) spend in the Farmers' Market this morning?
  • Was the Farmers' Market your primary reason for coming downtown this morning?
  • Do you plan on doing additional shopping or eating downtown this morning?
  • Do you come to the farmers' market for the products, the atmosphere, or some combination?
  • On average, if a specific item costs $1.00 in the grocery store, how much would you be willing to pay in the farmers' market for a similar product produced locally?

Consumer response to this approach was overwhelmingly positive. On two occasions the authors kept track of the percentage of people who agreed to participate: 90% of those approached (and everyone was approached) agreed. This compares very favorably with response rates for other research methods. On another occasion, a dot poster was used to ask people how this data collection approach compared to a written questionnaire. The overwhelming majority (94%) preferred the dot poster approach.

Respondents particularly valued two key attributes of the process. First, it is very fast: most people answered all four questions in a minute or two. Second, because respondents could see how others were answering, the whole process felt less extractive than other types of survey research (and the posters, with all of their brightly colored dots, also look cool in the market). Overall, the research adds to rather than detracts from the atmosphere in the market.

Using the dot poster approach, the authors were able to tally responses from as many as 318 respondents in a 3.5-hour period. This broad participation can dramatically alter the assessment of consumer attitudes. As an example, at one market each participant was also interviewed and asked whether he or she had any suggestions "for improving or changing the market". While the vast majority indicated that they liked the market as is, five people were adamant that high prices were "ruining the market". Taken alone, those five responses would have heavily colored the assessment of consumer attitudes; the poster question asking consumers to rate their satisfaction with the market, however, showed that 78% answered "great" and an additional 21% answered "good". At the next market, a question about price levels was asked, and only 7% indicated that high prices had caused them to restrict their purchases. The dot posters thus gave a more complete and accurate assessment of consumer sentiment than the interviews did.

Overall, this is an accessible and useful tool that should be considered for many research situations. The authors have found that, beyond the research results themselves, this approach promotes excellent communication with both the general public and the media.

Results and further discussion of the dot poster approach are provided at:
http://smallfarms.orst.edu/analyzing_three_farmers.htm
and
http://eesc.orst.edu/agcomwebfile/Magazine/98Fall/OAP98%20text/OAPFall9802.html

An earlier use of the approach in workshop settings is described at:
http://www.joe.org/joe/1995october/tt1.html