The Journal of Extension - www.joe.org

February 2012 // Volume 50 // Number 1 // Tools of the Trade // v50-1tt2

The "Ballot Approach" to End-of-Event Evaluation in Indian Country and Beyond

Abstract
This article presents the "ballot approach" to end-of-event evaluation, developed in response to frustration with existing surveys and the inadequate quantity and quality of feedback on my programming in Indian country. The ballot approach is grounded in the participatory development theory and methods popularized by Robert Chambers and others. Adoption of the approach coincided with a year-to-year trebling of the response rate and a dramatic increase in total responses: two open-ended questions generated 157 written comments, compared to three the year prior. Although causality cannot be established, the approach appeared effective and appropriate, supporting creativity and adaptation in evaluation methods.


David S. Wilsey
Assistant Extension Professor and Educator
Extension Center for Food, Agriculture, and Natural Resources
University of Minnesota Extension
Cloquet, Minnesota
dwilsey@umn.edu

Introduction

Evaluation is an essential part of programming but presents a multi-faceted challenge, two facets being information capture and quality. In a 1996 article on drawing in evaluation, Evans and Reilly cited Cronbach's (1982) assertion that evaluation plans and instruments inappropriate for their audience can yield results that are tenuous and unusable. This says nothing of the challenge of simply obtaining feedback, one potentially compounded when working in diverse or underserved communities.

In 2008, I began working with the Fond du Lac Band of Lake Superior Chippewa (FDL) in northern Minnesota. I was new to the region and am not of Indian descent. Information about FDL and my early work there on participatory needs assessment was recently featured in this journal (Wilsey & Beaulieu, 2010). As my program transitioned from planning to implementation, I became increasingly uncomfortable with established evaluation practices.

Specifically, I was uncomfortable with existing end-of-event (EOE) forms. Our surveys were, in my opinion, overly formal, overloaded with personal information requests, and generally inappropriate for someone striving to develop trust in a community. At first, I simply did not use them, or any other evaluation tools. Eventually, I developed a "stripped-down" EOE survey, but found few individuals willing to complete it. Recently, I have struggled to develop novel approaches to capture feedback and to ensure that the feedback I receive is of high quality. I worked with evaluation specialists to refine my EOE survey and to learn better practices for the timing and manner of delivery, and this helped. Still, I continued to consider not just what I was asking for and when, but also how, and whether the latter might be affecting response quantity and quality.

Response

Much of my academic training falls under the rubric of Farming Systems Research and Extension, an approach emphasizing participatory methods in marginalized regions of the world and epitomized by the scholarship of Robert Chambers (1997, for example). So it was frustration paired with my experiences working internationally—across cultural, language, and literacy barriers—that catalyzed development of the "ballot approach," an alternative approach to EOE survey forms.

Process

Early in 2011, the second occurrence of a large, annual program event was approaching, an event with tremendous potential for overall program evaluation. Beginning with my refined EOE survey, I drafted statements I wanted to be able to make after the event. The ballot approach emerged from a brainstormed list of "physical" feedback approaches. For each statement, I considered the mechanism that would permit a ballot-type response. Three types emerged:

  1. Single question, multiple boxes
    "This event provided an opportunity to interact with members of my community," with one box for each of four Likert-scale responses (four boxes).

  2. Multiple statements, multiple boxes
    "This is my first Storytelling" and "I attended the 2010 Storytelling" (two boxes).

  3. Single statement, single box
    "Please share something positive about this event" (one box).

Next, I determined the number of ballot boxes needed (15) and the number of ballots (index cards) per person (seven) for the evaluation questions and a drawing. I also decided to test a penny ballot, in which individuals took a penny from a bowl and placed it in a jar labeled with their Band or community affiliation.

Third, I developed an instruction sheet to be stapled around the ballots. The instruction sheet and all cards were numbered for identification, which served two purposes. First, any card could be entered in the drawing for a prize and later matched with the instruction sheet retained by the individual. Second, numbered cards permitted verification that each person responded only once to each question. For example, I was surprised when nearly everyone agreed or strongly agreed with one statement, yet 14 responses strongly disagreed. Card numbering revealed that those 14 were two full sets of seven cards, each set placed in a single box.
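The duplicate-detection role of the card numbering can be illustrated with a short sketch. The data and the assumption that all seven cards in a set share their instruction sheet's number are hypothetical, introduced only to show the logic of the check described above.

```python
# Hypothetical sketch of the card-numbering check.
# Assumption (not from the article): each participant's seven cards carry
# the same sheet ID, so repeats of one ID within a box reveal that a
# single person deposited multiple cards there.

from collections import Counter

# (sheet_id, card_no) pairs recovered from one ballot box; illustrative data
box_contents = [(101, 1), (102, 1), (103, 1), (104, 1), (104, 2), (104, 3)]

sheet_counts = Counter(sheet_id for sheet_id, _ in box_contents)
duplicates = {sid: n for sid, n in sheet_counts.items() if n > 1}
print(duplicates)  # {104: 3} -> one person placed three cards in this box
```

The same tally, run over a whole set of boxes, is what flagged the 14 anomalous "strongly disagree" responses as two complete card sets rather than 14 individuals.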

Finally, I determined the layout and location for the evaluation route: along the corridor where participants would line up for the end-of-event feast (Figure 1).

Figure 1.
"Ballot Boxes" Line the Hallway Where Participants Lined Up for the Feast


Outcome

To put the outcome of the ballot approach in context, I will first share results from the prior year's EOE form. At the (first) 2010 Storytelling event, I developed a simplified print questionnaire to evaluate our program's newspaper feature. Participants had multiple means to access the form, including:

  • Registration

  • Our program display

  • Roving solicitors

Evaluation forms had tear-off numbers and respondents were entered in a drawing for a gift certificate. Twenty-nine out of approximately 250 people completed the form—about 12%.

I piloted the ballot approach at the 2011 Storytelling event. For each of four primary questions, an average of 124 of the approximately 350 people in attendance responded—about 35%. The year-to-year response rate trebled, and the total number of responses increased by about 100. Moreover, 283 individuals (about 80%) placed a penny in one of the jars indicating Band/community affiliation. Finally, two open-ended questions generated 157 written comments, compared to three in 2010!
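The year-to-year comparison reduces to simple arithmetic on the figures reported above. As a check, a minimal sketch using only numbers from the article:

```python
# Response-rate arithmetic from the article; rounding follows the text.

def response_rate(responses, attendance):
    """Return the response rate as a percentage."""
    return 100 * responses / attendance

rate_2010 = response_rate(29, 250)    # 2010 print questionnaire, ~11.6%
rate_2011 = response_rate(124, 350)   # 2011 ballot approach, ~35.4%

print(round(rate_2010))               # 12
print(round(rate_2011))               # 35
print(round(rate_2011 / rate_2010, 1))  # 3.1, a year-to-year trebling
```

The response-count increase (124 − 29 = 95) is what the text rounds to "about 100," and 283 of 350 pennies is the reported 80% participation in the penny ballot.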

Conclusions

No claim can be made that the ballot approach caused a year-to-year trebling in response rate—too many other factors changed as well. Nevertheless, the increased rate and number of responses and the overall number and quality of written comments strongly suggest that in the FDL context the EOE ballot approach was superior to an EOE form. Rather than a prescription, this article's message is to consider creative, adaptive approaches to program evaluation in Indian Country—and beyond.

References

Chambers, R. (1997). Whose reality counts? Putting the first last. London, UK: ITDG.

Cronbach, L. (1982). Designing evaluations of educational and social programs. San Francisco: Jossey-Bass.

Evans, W., & Reilly, J. (1996). Drawings as a method of program evaluation and communication with school-age children. Journal of Extension [On-line], 34(6), Article 6FEA2. Available at: http://www.joe.org/joe/1996december/a2.php

Wilsey, D. S., & Beaulieu, S. (2010). We listen to them: Assessing natural resource perspectives and priorities in a Tribal community. Journal of Extension [On-line], 48(5), Article 5FEA9. Available at: http://www.joe.org/joe/2010october/a9.php