June 2008 // Volume 46 // Number 3 // Tools of the Trade // 3TOT2


An Online Reporting System for Evaluating Multi-Site Parenting Education Programs

Abstract
Experience from developing and implementing an online reporting system to evaluate parenting education programs in rural communities is described. The reporting process has fostered project coordinators' ownership of program data and promoted accountability for evaluation outcomes across multiple sites. Each quarter, coordinators report on parent experiences from surveys and write narratives characterizing their organizations' community partnerships. Community collaborations and capacity-building activities of 18 unique parenting education programs are captured in a standardized way. Five tips are shared for others interested in developing and implementing online reporting systems for evaluation purposes.


Cheryl L. Peters
Extension Educator & County Extension Director
Michigan State University Extension
Presque Isle County, Michigan
cpeters@msu.edu

Denise Rennekamp
Parenting Education Program Coordinator
Extension Family and Community Development
Oregon State University
Corvallis, Oregon
Denise.Rennekamp@oregonstate.edu

Sally Bowman
Assistant Program Leader, Associate Professor, and Family Development Specialist
Extension Family and Community Development
Oregon State University
Corvallis, Oregon
bowmans@oregonstate.edu


Rationale for an Online Reporting System

An Extension Service team received a grant from a private foundation (The Ford Family Foundation) to provide evaluation services, technical assistance, and networking opportunities for parenting education programs in 18 rural communities throughout Oregon and Northern California's Siskiyou County. The purpose of the Enhancing the Skills of Parents Program II (ESPP II) is to assist communities in strengthening family-focused programs that improve the skills of parents with children from birth to 8 years of age in their targeted communities. The challenge for this project was to find common measures that would be relevant to many programs delivering a variety of educational curricula and to design an online evaluation reporting system that was affordable, efficient, and perceived positively by local community program coordinators.

The advantages of using Web-based technologies for distance education, professional development, online communities, and the collection of research survey data for Extension educators have been described (O'Neill, 2004; Kallioranta, Vlosky, & Leavengood, 2006). Some Extension Services have moved to Web-based reporting for plans of work and reports of accomplishment (Richardson, Gamble, & Mustian, 1998), with varying degrees of acceptance and success.

This article describes the need for, and the development and implementation of, an online reporting system to collect cluster evaluation data on varied parenting education programs. A cluster evaluation method was selected for collecting and analyzing information from the 18 project sites (Patton, 1997), using multiple methods and a combination of quantitative and qualitative data collection and analysis.

The ESPP II projects reach families through a variety of programming opportunities, including formal educational series using evidence-based curricula, one-time educational workshops on special topics, home visitation programming, and family activity events that emphasize social gatherings in schools and common spaces (e.g., libraries, parks). In addition, the 18 projects are developing leadership for parenting education, strengthening their organizations, and collaborating with other organizations in order to build community capacity for parenting education (Chaskin, Brown, Venkatesh, & Vidal, 2001). As communities implement parenting education programs that meet local needs, it is critical to identify and measure indicators that can help or hinder community capacity and sustainability.

It was essential to design a reporting system that would simplify and reduce the evaluation workload by streamlining reporting processes and eliminating multiple paper forms and Excel files. Like Extension educators, local project coordinators balance many programs and funding streams, making it difficult to prioritize evaluation.

Data Collection

The project coordinators are responsible for collecting data about parent experiences and community partner relationships each quarter. Information is gathered from parenting education program participants: the type of program in which they participated, their experiences with the program, and any changes they made as a result of their participation. Coordinators also report on activities with community collaborators, including Advisory Board meetings. Coordinators can enter family contact information for their own use and produce a mailing list or class roster.
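As one illustration of how such quarterly records might be structured, the sketch below defines a minimal data model in Python. All class and field names are hypothetical; the article does not describe the actual ESPP II database design.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Hypothetical records mirroring the quarterly data described above;
# the actual ESPP II schema is not published.

@dataclass
class ParticipantRecord:
    program_type: str        # e.g., "series", "workshop", "home visit", "family event"
    experiences: str         # participant's experiences with the program
    reported_changes: str    # changes made as a result of participation

@dataclass
class FamilyContact:
    # Entered for the coordinator's own use (mailing lists, class rosters).
    name: str
    address: str
    phone: Optional[str] = None

@dataclass
class QuarterlyReport:
    site_id: str
    quarter_ending: date
    participants: List[ParticipantRecord] = field(default_factory=list)
    collaboration_activities: List[str] = field(default_factory=list)  # incl. Advisory Board meetings
    contacts: List[FamilyContact] = field(default_factory=list)
```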

The evaluation administrators do not have access to this confidential information about participants; however, all other data, such as participant demographics, are accessible. The Parenting Skills Ladder (PSL) Survey (Katzev, MacTavish, Pratt, & Weatherspoon, 2002) is used to assess change in skills, regardless of the specific parenting education curriculum that is taught. A Parent Workshop Evaluation (PWE) Survey collects participant satisfaction information on one-time events.
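One simple way to enforce such an access rule is to strip confidential fields from a site's record before it reaches the evaluation administrators. The sketch below is a minimal illustration of that idea in Python, not the system's actual implementation; the record keys are hypothetical.

```python
# Names are illustrative; the real system's fields are not published.
CONFIDENTIAL_FIELDS = {"family_contacts"}

def administrator_view(record: dict) -> dict:
    """Return a copy of a quarterly record with confidential participant
    contact information removed; demographics and survey responses
    (which administrators may see) pass through unchanged."""
    return {k: v for k, v in record.items() if k not in CONFIDENTIAL_FIELDS}

site_record = {
    "site_id": "site-07",
    "demographics": {"n_parents": 24, "n_children": 31},
    "psl_responses": [],                                # Parenting Skills Ladder data
    "family_contacts": [{"name": "…", "phone": "…"}],   # coordinator-only
}
print(administrator_view(site_record))  # no 'family_contacts' key
```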

Community collaborations and capacity-building efforts are captured from program coordinators through narratives that detail efforts in community capacity building, community awareness, and school collaborations. Also included are success stories and challenges from community capacity and parenting education activities, as well as progress toward short- and long-term programming goals as defined by each site's uniquely constructed project logic model. A report function in the online system compiles coordinators' reports of funds leveraged from new sources as part of the outcome evaluation data.

Purpose

The goal of the online reporting system was to provide an accessible, secure, user-friendly portal for entering and storing information in one location. It was important both to reduce the burden of data compilation for coordinators with multiple funding streams and to make the creation of reports convenient and useful. A standardized reporting format was created that helps document program outcomes for each project, every quarter, regardless of varying curricula and programming locations.

Ownership of the data was fostered by allowing coordinators to share accounts within their office, promoting accountability for evaluation collection among program staff. The system generates reports that are stored securely, creating a record for the entire 4-year initiative. Program coordinators can download reports as PDFs or export them to Excel, and the system includes built-in statistical tests for retrospective pretest surveys. Both individual and compiled reports can be generated.
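The article does not specify which statistical tests are built in. As an illustration, a paired test is a common choice for retrospective pretest data such as the Parenting Skills Ladder, where each parent rates a skill "before" and "now" on the same form. The sketch below shows one way such a test might be computed; the function and argument names are hypothetical.

```python
from scipy import stats

def retrospective_pretest_change(before: list, now: list) -> dict:
    """Paired t-test on matched 'before' vs. 'now' ratings from a
    retrospective pretest survey (one pair per parent per skill item).
    A paired test fits because both ratings come from the same respondent
    on the same form."""
    t_stat, p_value = stats.ttest_rel(now, before)
    return {
        "mean_before": sum(before) / len(before),
        "mean_now": sum(now) / len(now),
        "t": t_stat,
        "p": p_value,
    }

# Example: five parents' self-rated skill levels on a 1-5 scale.
print(retrospective_pretest_change(before=[2, 3, 2, 1, 3], now=[4, 4, 3, 3, 5]))
```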

Development and Implementation of the Online System

The steps in developing the online reporting system included:

  • Designed new reporting forms for data collection

  • Worked with Web programmers to design the system, review security requirements, and determine data storage

  • Sought input from coordinators on "What would you want in a reporting system?"

  • Entered first three quarters of data for each site to pilot test the system

  • Incorporated feedback from three coordinators who piloted the system

  • Continued testing phase, uploaded existing data and printed and saved test reports

  • Launched the system and a user-friendly training manual simultaneously

  • Continued training for users and reporting of system bugs

The system is monitored and modified as needed to be responsive to coordinators.

Tips to Develop an Online Reporting System for Evaluation

Here are five tips gleaned from the development and implementation of an online reporting system.

  1. Before creating the reporting system, have the evaluation design well conceptualized. Know what information you want to capture and exactly how you want it reported.

  2. Develop clear reporting forms and know your fields of data entry before meeting with Web programmers. Likewise, outline desired reporting formats, including any statistical analysis you want built into the system (see the field-specification sketch after this list).

  3. Think through user management issues carefully; consider the audience and gather feedback from prospective users on their needs.

  4. Pilot test for several weeks, and check all functions of the system thoroughly. Volunteers from outside of your area or office can test the system interface and give feedback.

  5. Provide training for users. For example, a manual with tips and examples for using the system effectively is helpful for getting started. Share tips for working within the system with all users as lessons are learned.
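To make tip 2 concrete, one way to pin down data-entry fields before meeting with Web programmers is to write them out as a simple machine-readable specification. The sketch below is a hypothetical example for a workshop evaluation form; the actual ESPP II forms and field names are not published.

```python
# Hypothetical field specification drafted before meeting with programmers
# (tip 2). Field names and choices are illustrative only.
PWE_FORM_FIELDS = [
    {"name": "workshop_date",   "type": "date",    "required": True},
    {"name": "workshop_topic",  "type": "text",    "required": True},
    {"name": "satisfaction",    "type": "choice",  "required": True,
     "choices": ["very dissatisfied", "dissatisfied", "neutral",
                 "satisfied", "very satisfied"]},
    {"name": "would_recommend", "type": "boolean", "required": False},
    {"name": "comments",        "type": "text",    "required": False},
]

def validate(entry: dict, spec: list) -> list:
    """Return a list of validation errors for one submitted form entry."""
    errors = []
    for f in spec:
        value = entry.get(f["name"])
        if f["required"] and value in (None, ""):
            errors.append(f"missing required field: {f['name']}")
        if f["type"] == "choice" and value is not None and value not in f["choices"]:
            errors.append(f"invalid choice for {f['name']}: {value}")
    return errors
```

Writing the specification down this way also gives the Web programmers an unambiguous starting point for building entry forms and the reports that draw on them.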

References

Chaskin, R. J., Brown, P., Venkatesh, S., & Vidal, A. (2001). Building community capacity. New York: Aldine De Gruyter.

Kallioranta, S. M., Vlosky, R. P., & Leavengood, S. (2006). Web-based communities as a tool for Extension and outreach. Journal of Extension [On-line], 44(2). Available at: http://www.joe.org/joe/2006april/a4.shtml

Katzev, A., MacTavish, K., Pratt, C., & Weatherspoon, J. (2002). Enhancing the skills of parents program: First year progress report. Oregon State University, Family Policy Program.

O'Neill, B. (2004). Collecting research data online: Implications for Extension professionals. Journal of Extension [On-line], 42(3). Available at: http://www.joe.org/joe/2004june/tt1.shtml

Patton, M. Q. (1997). Utilization-focused evaluation. Thousand Oaks, CA: Sage.

Richardson, J. G., Gamble, K. J., & Mustian, R. D. (1998). Creation of a Web-based accomplishment reporting system. Journal of Extension [On-line], 36(2). Available at: http://www.joe.org/joe/1998april/a1.html