Fall 1991 // Volume 29 // Number 3 // Feature Articles // 3FEA7


Concept Mapping as a Program Planning Tool

Concepts are patterns or regularities we see in events or objects. Concept mapping is a way to pictorially represent concepts and relationships held by an individual or a group. Extension staff and volunteers in 18 New York counties have been experimenting with concept mapping as an aid to setting program direction. In almost every case, participants have found the approach a useful addition to their program development and evaluation techniques. We'd be glad to share our experiences should you wish to explore the use of concept mapping.

Michael W. Duttweiler
Program Specialist
Program Development and Evaluation
Cornell Cooperative Extension
Cornell University, Ithaca, New York

Extension staff nationwide are sharpening their skills in collecting diverse information to help determine program direction. A significant challenge is interpreting information once gathered. Agents often ask: "What do I do with this information now?" or comment that resulting summaries "seem only to reinforce present priorities and program direction." A process is needed that draws on individual and group thought and presents information in ways that prompt further analysis. This article describes one technique, computer-based concept mapping, that can help in the interpretive process.

What Is Concept Mapping?

Concepts are patterns or regularities we see in events or objects.1 The knowledge that we have about any subject consists of concepts relative to the subject and the relationships among those concepts. Concept mapping is a way to pictorially represent concepts and relationships held by an individual or a group.

Concept mapping isn't a new idea. Educators have used various forms of it extensively.2 A student-drawn food chain showing relationships between producers, consumers, and decomposers is an example of a simple concept map. Applications of concept mapping to Extension program planning are more recent and have emphasized individual learner needs.3 The typical application in program planning is in summarizing knowledge and perceptions about educational needs. Until recently, no simple, mechanized approach existed for using concept mapping in a group setting.

What Does Concept Mapping Entail?

The Concept System computer software4 provides an efficient and effective means for concept mapping in a group setting. The mapping process consists of three phases: (1) group brainstorming to list community needs or concerns, (2) individual rating of the resulting list of needs, and (3) individual grouping of the needs into themes. Rating and grouping information is entered into the computer. Average ratings for each need, statistical summaries of the groupings, and maps pictorially showing relationships between and among individual items and groups are produced.

Statistical processes used in summarizing the group data are sophisticated, including multidimensional scaling and hierarchical cluster analysis.5 The products, however, are understandable summaries that prompt further analysis. Various options exist for plotting and examining cluster and rating information.
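The exact algorithms are described in Trochim's paper,5 but the core computation can be sketched independently of the Concept System software: each participant's sort contributes to an item-by-item co-occurrence matrix, the summed matrix is converted to distances, and those distances feed both a hierarchical cluster analysis and a multidimensional scaling that positions items on the map. The Python sketch below is an illustration under those assumptions, not the software's actual implementation; the six items and the three participants' sort piles are invented for the example.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Invented example: 6 brainstormed items; each participant's sort is a
# list of piles, each pile a set of item indices grouped by similarity.
sorts = [
    [{0, 1}, {2, 3}, {4, 5}],      # participant A
    [{0, 1, 2}, {3}, {4, 5}],      # participant B
    [{0, 1}, {2, 3, 4}, {5}],      # participant C
]

n_items = 6
co = np.zeros((n_items, n_items))
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                co[i, j] += 1      # items sorted together accumulate counts

# More co-occurrence means more similar; convert counts to distances.
dist = co.max() - co
np.fill_diagonal(dist, 0)

# Hierarchical cluster analysis (Ward's method) on the group distances.
condensed = dist[np.triu_indices(n_items, k=1)]
tree = linkage(condensed, method="ward")
clusters = fcluster(tree, t=3, criterion="maxclust")

# Classical (Torgerson) multidimensional scaling for 2-D map coordinates.
J = np.eye(n_items) - np.ones((n_items, n_items)) / n_items
B = -0.5 * J @ (dist ** 2) @ J
vals, vecs = np.linalg.eigh(B)                 # eigenvalues ascending
coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0))

print(clusters)        # cluster label for each item
print(coords.shape)    # one (x, y) map position per item
```

With these invented sorts, items 0 and 1 (grouped together by all three participants) end up in one cluster, with 2-3 and 4-5 forming the others, and the coordinates give each item a position for plotting the concept map.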

How Does Concept Mapping Work?

The best way to show the process is to "walk through" a fictitious example. Demo County decided to begin its plan of work process by conducting a community forum to gain diverse, external viewpoints of county needs. About 25 leaders were selected to provide diverse views of the community through a four-hour forum. A facilitator led the group through a brainstorming session around the question: "What are the needs of Demo County over the next four to five years that might be addressed through educational means?" The brainstormed items were entered into the mapping software and projected simultaneously onto a screen. When the hour-long brainstorming session ended, everyone relaxed at dinner and reflected on the topics they'd identified.

While the group was eating, the software was used to print out sorting cards and rating sheets. On returning, each participant received a set of sorting cards and instructions to group the items as follows:

    "Group the items in any way that makes sense to YOU by placing the cards in piles. Group by similarity, NOT by priority, that is, group things that you see as related to each other. You might have many groups or only a few depending on how you see the collection of items."

After the sorting process, participants recorded the items included in each of the groups they had formed.

The next step was to have participants rate the importance of each item using the following instructions:

    "You are rating the importance of each item relative to all others on the list. You are rating each item independently, not as part of a group."

A five-point rating scale was used (the software can accommodate up to a nine-point scale). Participants then handed in their rating and sorting information and were promised a complete summary of the information generated. Following the forum, the individual rating and sorting information was summarized and processed to prepare the findings for Extension personnel.

What Products Are Generated?

Three initial products resulted from the forum. The most basic one was a list of the items resulting from the brainstorming exercise in the order in which they occurred. Also provided was a summary of the groupings or clusters developed by participants. This summary can be thought of as the "most probable" groupings among participants.

Table 1 is a partial list of cluster data for the Demo County example. Each group or cluster was named (economic development and environmental issues) to reflect the nature of the topics it contained, a significant step in interpretation. The cluster listing included the average rating of each item listed and an overall cluster rating. Comparing ratings provided the relative importance assigned by the participants to each item and cluster. In the example, environmental issues (cluster rating of 3.6) were seen as relatively more important than economic development issues (cluster rating of 2.96).

Table 1. Partial list of clustered items and average importance ratings.

Cluster 2-Economic Development Issues
    Fishing industry, tourism impact (3.06)
    Development of waterways (2.89)
    Tourism-skiing and other activities (2.83)
    Rapid changes in technology (2.78)
    Business involvement in education (3.44)
    Government regulations (3.17)
    Government role in leisure opportunities (2.00)
    Small scale economic development (3.17)
    Future of the port (2.39)
    Industrial development (3.22)
    Industry/government/educational collaboration (3.61)
    Cluster rating = 2.96
Cluster 3-Environmental Issues
    Acid rain problems (3.72)
    Solid waste management (4.00)
    Ground water protection/management (4.11)
    Recycling organic wastes (3.50)
    General recycling education (3.67)
    Land and water use (3.89)
    Nuclear power plants (3.11)
    Toxic waste site clean up (3.44)
    Effects of local economy on landscape (3.00)
    Cluster rating = 3.60
(1 = lowest importance, 5 = highest importance). Items appear in the order in which they were clustered statistically, not by rating.
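The cluster ratings in Table 1 are simply the mean of the average item ratings within each cluster, which is easy to verify from the table's own numbers:

```python
# Average item ratings copied from Table 1.
economic = [3.06, 2.89, 2.83, 2.78, 3.44, 3.17,
            2.00, 3.17, 2.39, 3.22, 3.61]
environmental = [3.72, 4.00, 4.11, 3.50, 3.67,
                 3.89, 3.11, 3.44, 3.00]

def cluster_rating(item_ratings):
    """Cluster rating = mean of the item ratings, rounded to 2 places."""
    return round(sum(item_ratings) / len(item_ratings), 2)

print(cluster_rating(economic))       # → 2.96
print(cluster_rating(environmental))  # → 3.6 (reported as 3.60)
```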

A third product (Figure 1) was a plot or map of the eight clusters. The map, drawn by the software, first plotted individual items relative to all other items based on a statistical summary of the grouping data. Cluster borders were drawn around all items within each cluster. Clusters mapped near each other (agriculture and environment, for example) can be interpreted as being closely related. The height of the walls around each cluster represents the relative priority of each cluster (based on the average ratings of all items within each cluster).

Figure 1. Concept map of clustered items.

Clusters seen as closely related are plotted near each other.
Height of "walls" indicates relative importance of each cluster.
Cluster shape is based on location of plots of individual items
within each cluster.

What Are the Benefits?

Opportunities for rich analysis abound. Continuing with Figure 1, you could ask: Are the clusters labeled appropriately (community resources/services)? Why are some clusters clearly strong themes (family issues), while others appear as loose aggregations (educational issues)? Should some clusters be split into subtopics (individual/family well-being)? Should some clusters be merged (community resources/services and economic development)? What relationships among themes underlie how they were perceived by participants (between economic development and environment)? Did participants miss important relationships (between family issues and economic development)?

Because participants (rather than Extension staff) organize the information into themes, the process reduces the tendency to view findings in terms of existing priorities, audiences, and subject areas and encourages a fresh look at program potential. The graphical depiction of concerns invites exploration of underlying issues.

This approach has limitations. Basic computer literacy and equipment are needed. The software, although user friendly with an effective tutorial, does require several hours of self-instruction and experimentation before most people would be comfortable using it. Data entry and computation require time when working with groups of more than a few people. Some participants react negatively to the mechanical nature of the sorting and rating procedures. The software isn't practical to use with groups of fewer than about six people. And, as with any other group data-gathering or decision-making process, the outcome depends on group composition, dynamics, and facilitation.

A similar process would be possible without the software, at least for small groups. The software, however, increases the potential for input from large numbers of participants and assures that the summary of the grouped data reflects the independent thinking of participants rather than predetermined areas of need. Most importantly, information is presented in forms that invite further analysis and exploration. This analysis is key in translating diverse information on educational needs into realistic educational responses.

Extension staff and volunteers in 18 New York counties have been experimenting with concept mapping as an aid to setting program direction. In almost every case, participants have found the approach a useful addition to their program development and evaluation techniques. We'd be glad to share our experiences should you wish to explore the use of concept mapping.


1. J. D. Novak and D. B. Gowin, Learning How to Learn (London: Cambridge University Press, 1986).

2. M. A. Moreira, "Concept Maps as Tools for Teaching," Journal of College Science Teaching, VIII (No. 5, 1979), 283-86.

3. J. D. Novak, "Introduction to Concept Mapping: A Handbook for Educators" (Ithaca, New York: Cornell University, Department of Education, 1986).

4. The software was developed by William Trochim of the Cornell University Human Service Studies Department. The software, including program documentation and tutorial, is available through Concept Systems, P. O. Box 4721, Ithaca, New York 14852.

5. W. M. K. Trochim, "An Introduction to Concept Mapping for Program Planning and Evaluation," Evaluation and Program Planning, XII (No. 1, 1989), 1-16.