August 2001 // Volume 39 // Number 4 // Research in Brief // 4RIB2


Learning How to Connect the Dots: An Assessment of a Community Development Program

Abstract
Program development models often stress the science of developing programs, such as identifying needs, establishing program goals, and measuring outcomes. Although these components are essential for successful programming, educators can easily overlook important connections that require the art of program development. Understanding the art of program development is critical when designing community development programs. This article offers a conceptual approach for connecting the science and art of program development. An actual community development program is used to illustrate that learning how to connect the dots is a critical component of successful community program development.


Marlene K. Rebori
Community and Organizational Development
University of Nevada Cooperative Extension
Reno, Nevada
Internet Address: mreborit@unr.edu


Introduction

Community-based educators are increasingly required to develop programs based on assessed needs and to evaluate impact. However, when designing and implementing community development programs aimed at building community capacity, important connections necessary for successful program development can easily be overlooked when applying traditional evaluation methods.

"Learning how to connect the dots" is offered as a conceptual framework for program development that applies a holistic approach to reveal the larger programmatic picture. An actual assessment from a community development program is used to illustrate that learning how to connect the dots is a critical component to program development and that without these connections, a program is more susceptible to failure.

The Community Development Program: Citizens Changing Communities (C3)

Many community development programs focus on building community "capacity." Community capacity is typically described as citizens learning the skills needed to work together cooperatively for problem solving and shared decision making. The realized benefits of building community capacity include collective efficacy, a sense of community through shared connections, improved neighborhood social interactions, and a healthy democracy (Lochner, Kawachi, & Kennedy, 1999; Gates, 1999).

Like many local governments, those of Washoe County and the City of Reno, Nevada, use citizen advisory boards as the venue for public participation in local decision making. The advisory boards in the Greater Reno area provide a forum where citizens can learn about the activities of their county government and provide information back to the commissioners regarding the issues and concerns of their neighborhood community. An assessment with board members (n=114) indicated that 70% of citizens join local advisory boards to be involved in their community and to help create local change. However, no training program emphasizing capacity building skills existed for these boards, despite board members' desire for one.

To fill this gap, the program Citizens Changing Communities (C3) was developed to provide capacity training for local community advisory boards in the Greater Reno area. The board training program focused on five training components:

  1. Time and meeting management;
  2. Conflict management;
  3. Problem-solving;
  4. Goal-setting and action planning; and
  5. Decision making styles and techniques.

Through a collaborative effort among the governmental liaisons in Reno and Washoe County and the University of Nevada Cooperative Extension, C3 became a voluntary training program for local advisory boards in 1998.

Initial Program Design

In addition to serving as a board-training program, C3's pilot year also examined four different program delivery techniques. Board members were randomized into one of four treatment groups to determine whether the method of program delivery increased or decreased participation in the training program or affected comprehension of program materials.

Two parameters of program delivery were measured: 1) technology (both high and low levels) and 2) touch (both high and low levels). Touch referred to the level of human attention participants received. A 2x2 factorial design (Nachmias & Nachmias, 1996) was used to organize the two parameters of measurement (technology and touch) into four types of program delivery techniques, outlined in Table 1. The content of program materials was the same across all four treatments. The only difference was how participants received the program materials (i.e., printed or Web accessed) and the training (i.e., a hands-on workshop or no workshop).

Table 1
Four Program Delivery Techniques Used in Citizens Changing Communities (C3)

                 Technology: High                      Technology: Low
Touch: High      A hands-on workshop with              A hands-on workshop with
                 Web-accessed curriculum materials.    printed curriculum materials.
Touch: Low       Web-accessed curriculum               Printed curriculum materials,
                 materials only.                       mailed directly to the
                                                       participant's home address.
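
As a minimal sketch of how such a randomized assignment might be implemented, the following Python snippet shuffles a member list and deals it round-robin into the four cells of the 2x2 design. The function and member names are hypothetical; the article does not describe the actual assignment procedure used.

    import random

    # The four cells of the 2x2 (technology x touch) factorial design.
    TREATMENTS = [
        ("hightech", "hightouch"),  # hands-on workshop with Web-accessed materials
        ("lowtech", "hightouch"),   # hands-on workshop with printed materials
        ("hightech", "lowtouch"),   # Web-accessed materials only
        ("lowtech", "lowtouch"),    # printed materials mailed to the home address
    ]

    def assign_treatments(members, seed=None):
        """Shuffle the member list, then deal members round-robin into
        the four cells so group sizes differ by at most one."""
        rng = random.Random(seed)
        shuffled = list(members)
        rng.shuffle(shuffled)
        return {member: TREATMENTS[i % len(TREATMENTS)]
                for i, member in enumerate(shuffled)}

    # Example: 114 board members, matching the n reported in the assessment.
    board = ["member_%d" % n for n in range(1, 115)]
    groups = assign_treatments(board, seed=1998)
    print(groups["member_1"])  # e.g., ('lowtech', 'hightouch')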

A pre-test was administered by telephone to gather a base measurement of current board capacity building skills and behaviors and to notify board members of their delivery technique. Program participation was measured as either: 1) accessing the Web page and reading the posted material; 2) attending a workshop; or 3) reading the printed materials. Reminder postcards were also mailed to all board members' home addresses describing how to access the program via their treatment method. Post-test telephone surveys measured program participation, effectiveness and application of program materials, and program recommendations.

Program Participation Results

Twenty-six percent of all board members participated in C3 during the pilot year (1998). Although board members indicated on the assessment that the "lowtech/lowtouch" technique (i.e., mailed curriculum materials) was among the least preferred techniques of program delivery (16%), program results indicated this technique had the highest rate of participation (54% within that randomized group). All other groups had participation of 19% or less (Table 2). The "preferred technique" noted in Table 2 refers to the preference given by members on the assessment; "technique used" reflects the percentage of actual participation within each randomized treatment group (see the sketch following Table 2).

Table 2
Percent of Preferred and Used Techniques Among Advisory Board Members

Technique                                             Preferred    Used
hightech/hightouch (Web with workshop)                    8%        19%
lowtech/hightouch (printed material with workshop)       70%        19%
hightech/lowtouch (Web only)                               5%         7%
lowtech/lowtouch (printed material only)                  16%        54%
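
To make the distinction concrete, the sketch below computes an overall participation rate across all board members alongside the within-group rates. The group sizes and participant counts are illustrative assumptions chosen only to approximate the reported figures; they are not the study's actual data.

    # Illustrative counts only: 114 members split across the four cells.
    group_sizes = {
        "hightech/hightouch": 28,
        "lowtech/hightouch": 29,
        "hightech/lowtouch": 28,
        "lowtech/lowtouch": 29,
    }
    participants = {
        "hightech/hightouch": 5,
        "lowtech/hightouch": 6,
        "hightech/lowtouch": 2,
        "lowtech/lowtouch": 16,
    }

    # The overall rate uses all members as the denominator
    # (compare the 26% reported above).
    overall = sum(participants.values()) / sum(group_sizes.values())
    print("overall participation: %.0f%%" % (100 * overall))

    # The "technique used" rates use each cell's own denominator (Table 2).
    for group, size in group_sizes.items():
        print("%s: %.0f%% within group" % (group, 100 * participants[group] / size))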

The assessment with board members indicated that 66% access the Internet. In both "hightech" randomized groupings (i.e., Web with workshops and Web only), 67% of respondents access the Internet. Of all board members who do access the Internet, 86% do so on a daily or weekly basis. Within the randomized "hightech" groupings, 44% of board members access the Internet on a daily or weekly basis. The "hightech/lowtouch" treatment (i.e., Web material only) was the least used (7%) program technique (Table 2). The top three reasons cited for not accessing the Web materials were:

  1. No Internet access (34%);
  2. Did not receive the reminder postcard (25%); and
  3. No time to access (23%).

Although the board members strongly indicated in their assessment that a hands-on workshop was the most preferred method of program dissemination (70%, per Table 2), very few board members actually attended workshops. Board members also indicated in the assessment that September was the most preferred time of year for training (76%) and March the second most preferred (53%). Therefore, two workshops were offered: one in September 1998 and the other in March 1999. Total participation in both "hightouch" components (i.e., workshops with Web materials and workshops with printed materials) was 19% (Table 2).

Learning How to Connect the Dots

Program participation results may appear surprising when viewed through traditional program development and evaluation models. Although an assessment was conducted and results indicated high interest in the program content, the delivery methods, and audience readiness, actual program participation did not match assessment responses. While program development models provide a systematic method for measuring learner needs, program effectiveness, and accountability, they cannot offer prescriptive measures. Program development and evaluation models are not predictors of actual participation, nor can they predict program success (Maehl, 2000).

As practitioners working in the realm of life-long learning, our responsibility does not end with evaluation. As Extension educators, our task is to reflect on and gain meaning from participation results. Reflection is a retrospective process and requires the educator to appreciate programming as a dynamic living process. Educators working in life-long learning must recognize that teaching and program development form a living system. It is this living system that we as educators must continually examine and learn from, making adjustments and adaptations as necessary (Foley, 2000).

It has now been more than two years since C3 was first developed. Since that time, the program has grown beyond its initial evaluation results. C3 training materials have expanded to other community boards throughout the state of Nevada; requests for materials by the advisory boards continue to grow; and the program has recently evolved into mandatory training for advisory boards in partnership with Washoe County and Cooperative Extension. Oddly enough, no changes were made to program content, delivery methods, or audience readiness.

The changes that did occur over the last few years involved a holistic approach to programming that emphasized learning how to connect the dots. Program development is as much an art as it is a science (Newman, 2000). Learning how to connect the dots became a conceptual framework for understanding how to incorporate the art of program development in addition to the science. The processes of identifying missing components and learning how to make those connections for the C3 program are outlined below.

Identifying the Disconnection

Connect to the Learner

Although the assessment probed learner needs and readiness, it missed the personal connection to potential participants. This disconnection was evident in the low participation for the hands-on workshop. Board members rated their preference for a hands-on workshop high (70%), but actual participation was low, as previously reported. The missed connection is further supported by the high rate of participants who read mailed program materials (54%). Reading program materials mailed directly to one's home requires no personal connection between the educator and learner, or among other participants. Mailed program materials simply offered learner convenience and autonomy.

Connecting with the learners in your community is the art of program development, not the science. Although many program development and evaluation models stress the importance of gathering stakeholder input and identifying target audience needs (Patton, 1997; Mayeske, 1999), most of these models miss the art of how to make connections with the learner. These vital connections cannot be made solely through assessments or input-gathering sessions.

Most Extension program development models are geared to the evaluation or research perspective, not to the learners. When striving to build community development programs that focus on capacity building, there is more to teaching and delivering a program than merely assessing the need, developing program goals, and evaluating effectiveness.

The social processes of community development include building relationships and sharing experiences through interaction, especially when striving to build community capacity. The vital connection between educator and learner begins by modeling the relationship you are trying to facilitate. This includes building a relationship with learners that involves establishing your identity, your character, and your commitment to working with them to overcome the hard challenges they face in their community.

Connect to the Dynamics

Another disconnection concerned board operating dynamics. Even educators who are enmeshed in their community can easily become myopic about the community's social dynamics and processes. During the development and implementation of C3, the advisory board process was not operating as intended. This dysfunction was not conveyed during any meetings, nor was it conveyed in the assessments with board members.

Specifically, a majority of board members were frustrated with local government officials, frustrated with their roles in the community, and frustrated with the organizational structure of the boards. For example, many board members were feeling overwhelmed by the required work and felt ignored by county officials in their pleas for help. Although the organizational needs of the boards may have benefited from the C3 training program, board members were struggling with process issues and unable to concentrate on skill building.

Once an accurate picture of the dynamics was revealed (i.e., the dots became connected), all programming efforts re-focused on improving the operating process for the advisory boards and building the relationships between county officials and the boards themselves. The C3 program became more than just a training program on capacity building skills. It evolved into a community development process that modeled community capacity, improving the government participation process and engaging in civic dialogue. This was carried out in a variety of ways that included amendments to the development code, making the application review process more user friendly, and streamlining the citizen input process, among a host of other activities.

Conclusion

There are many variables an educator has to connect with when developing and evaluating programs. Most program development models highlight the obvious components; the difficult task is uncovering the hidden and often-invisible factors. Learning how to connect the dots equates to practicing the art of program development by recognizing the multitude of factors at play in designing a successful program.

Developing and evaluating programs requires more than systematically assessing the inputs and outputs and documenting the impacts, as the popular abbreviated logic model suggests. Educators who want to develop quality programs in their communities must also realize the importance of applying a holistic approach to learning how to connect the dots, one that calls for both the art and the science of program development. As we work with members of our communities, we need to remember that learning how to connect the dots will reveal the hidden picture of what is important and truly needed.

References

Foley, G. (2000). Teaching adults. In G. Foley (Ed.), Understanding adult education and training (2nd ed.). Australia: Allen and Unwin.

Gates, C. T. (1999). Creating a healthy democracy. In The civic index: Measuring your community's civic health (2nd ed.). National Civic League.

Lochner, K., Kawachi, I., & Kennedy, B. P. (1999). Social capital: A guide to its measurement. Health and Place, 5, 259-270.

Maehl, W. H. (2000). Lifelong learning at its best: Innovative practices in adult education programs. San Francisco: Jossey-Bass.

Mayeske, G. W. (1999). Life cycle program management and evaluation: An organic and heuristic approach (4th ed.). Washington, DC: Cooperative State Research, Education, and Extension Service, U.S. Department of Agriculture.

Nachmias, C. F., & Nachmias, D. (1996). Research methods in the social sciences (5th ed.). New York: St. Martin's Press.

Newman, M. (2000). Program development in adult education and training. In G. Foley (Ed.), Understanding adult education and training (2nd ed.). Australia: Allen and Unwin.

Patton, M. Q. (1997). Utilization-focused evaluation (3rd ed.). Thousand Oaks, CA: Sage Publications.