June 1995 // Volume 33 // Number 3 // Tools of the Trade // 3TOT2


A Gauge of Success in Public Issues Education

Abstract
A technique was developed and tested to estimate the extent to which Extension education helped citizens progress through the issue cycle on public ballot measures. The clock-like gauge proved to be useful, but the field test suggested improvements.


James S. Long
Retired Leadership Development Specialist
895 Sable Drive
Roseburg, Oregon

Jo Mark
Cooperative Extension
Washington State University
Pullman, Washington
Internet address: mark@mail.wsu.edu


As citizens contribute to public decisions, they may cycle through a sequence of steps or "issue cycle." We often wonder how much Extension education programs help citizens proceed through this issue cycle. To answer this question, we adapted and tested an assessment technique. Here we introduce the technique and report how it worked in a field trial.

The Technique

First, we developed a clock-like gauge of the issue cycle:

                           1    Becoming concerned

      Evaluating    8                 2    Talking with others

  Taking action    7                     3    Defining the issue
  (like voting)

  Making a choice    6                4    Searching for
                                           alternatives

                           5    Anticipating consequences
                                of alternatives

Second, we placed the gauge, with instructions to participants, on a single sheet, front and back, for a community issues forum that, in one evening, considered three policy questions on the ballot for an upcoming election.

Third, at the start of the community forum, we asked participants to read the instructions, indicate where they were in the issue cycle for each ballot issue, and then hold the sheet until the end of the forum.

Finally, at the conclusion of the evening's program, we asked participants to turn over the sheet and again indicate where they were in each of the three issue cycles.
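Each participant thus produces, for each ballot issue, a pair of gauge positions (before and after the forum), and the difference between them is the movement through the cycle. A minimal sketch of that calculation, with names invented for illustration rather than taken from the study's instrument:

```python
def step_change(pre: int, post: int) -> int:
    """Steps moved through the 8-step issue cycle.

    Positive = forward movement, zero = no change,
    negative = a reversal. Illustrative only.
    """
    if not (1 <= pre <= 8 and 1 <= post <= 8):
        raise ValueError("gauge positions run from 1 to 8")
    return post - pre

# A participant who marked "defining the issue" (3) before the
# forum and "making a choice" (6) afterward moved three steps:
print(step_change(3, 6))  # prints 3
```

A signed difference like this preserves both the size and the direction of each participant's movement, which the findings below rely on.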

What We Found

From the 27 to 29 responses for which we received both a pre- and a post-assessment, we discovered that:

  1. For 35% to 45% of the audience, the public issues forum helped participants move one to six steps onward in the cycle for the three ballot issues.

  2. In contrast, most others showed no change from their initial positions near step seven--ready to vote! Apparently, they came to the forum with their minds already made up.

  3. A few indicated a reversal of one to five steps; perhaps the forum stimulated them to reconsider their initial position.

  4. The forum was most helpful for persons who started at Step 1 (becoming concerned) and at Step 3 (defining the issue). Participants at these two steps shifted forward an average of 3.5 to 5 steps across the three ballot measures.

  5. Even though the ballot issues were very different--one was only advisory; another reflected community-wide consensus; and a third was volatile--the pattern of change across the ballot measures was surprisingly similar:

    • participants, initially, were positioned throughout the cycle;

    • about half did not change; the other half indicated varying degrees of change, forward or backward.

  6. Most participants responded to the survey quickly, accurately, and completely; but a few filled out just the pre- or the post-assessment.

In conclusion, the clock-like gauge:

  1. readily showed us the initial position of each participant for each issue and where that individual was in the issue cycle at the conclusion of the program;

  2. readily differentiated "changers" and "no-changers";

  3. detected the direction of change; and

  4. offered information to assess the effectiveness of the forum for persons at different starting points.

In short, the gauge rendered data useful in estimating the contribution of an educational program in helping participants progress through the issue cycle across quite different kinds of public issues.

Recommendations

  1. The wording of the instrument can be refined, for example, to focus the "Evaluating" step (Step 8) more clearly on evaluating a policy decision as implemented--not on evaluating an alternative in Step 5.

  2. Also, we believe the clock-like gauge could be made clearer by placing the numbers outside the circle rather than inside.

  3. Next time, we'd want to introduce the instructions orally, as well as ask participants to read them.

  4. The program format could be adapted to invite participants, after they've completed the pre-assessment, to portray a profile of starting points for the benefit of the moderator and panelists. A quick show of hands, for instance, could represent the range of starting points.

We believe this "gauge" merits further consideration as a tool to help evaluate Extension's contributions to public issues education. We'd welcome your thoughts and experiences.