April 1998 // Volume 36 // Number 2 // Feature Articles // 2FEA1
Creation of a Web Based Accomplishment Reporting System
Abstract
In recognition of increasing public demands for accountability, the North Carolina Cooperative Extension Service embarked on a mission to develop a new reporting/accountability system to meet current and anticipated future accountability needs. These efforts involved a large number of people in developing a conceptual design for the new system. The focus was on capturing measures of progress and impacts for planned programs, as well as contacts and program successes. A computerized, World Wide Web graphics-based system was developed for entry and accumulation of the reports. The system is now implemented, and its user friendliness was demonstrated when 97 of 102 units met a reporting deadline that came only one month after final release of the program.
Demands for accountability seem to be ever-increasing for practically all societal entities. With this expectation so prevalent, it is no surprise that policy makers are focusing increasingly close attention on the relevance of organizations and their value to their constituents, as well as to society as a whole. Such emphasis on accountability led to the passage of the Government Performance and Results Act (GPRA), which is now being implemented. Reports from across the country indicate that states, counties, and others have similar ideas about making sure their tax dollars are being expended as intended.
The North Carolina Cooperative Extension Service (NCCES) has not been isolated in this age of accountability and has been focusing considerable effort on satisfying current and anticipated organizational and programmatic accountability needs and demands. One major component of this increased focus is the development of a completely new reporting system that attempts to capture all accomplishment results emanating from programming efforts in each county. Results include contacts, measures of program progress, impacts, volunteer time, cost-benefits, success stories, and delivery strategies for all planned programs. The system also includes civil rights reporting as well as successes from program efforts not included in regularly planned programs, such as special educational efforts dealing with disasters.
It was clear that all required accountability reporting needed to be included in a single system. With this general concept in mind, state program leadership appointed a special Reporting and Accountability Task Force to develop the criteria needed to address all future accountability needs of the organization. The task force was charged with identifying all reporting needs; developing goals and objectives for a new reporting system; identifying the parameters of a new system; and ultimately developing a diagrammatic model that could be studied, revised, and used as the design for the new system.
The Reporting and Accountability Task Force held meetings with Extension personnel and conducted interviews with key county Extension directors. Those interviewed were asked to canvass their associates by various means to secure as much input as possible about what was and was not needed for local accountability, preferences for required reporting time-lines, and special wishes, such as user friendliness of a new system. Throughout the entire process, the committee received direct input from agents in more than 60 of the state's 100 counties. Altogether, more than one-half of all agents in the NCCES provided input into conceptualizing and designing the new reporting system. From the initial development of goals and objectives through the final roll-out of a World Wide Web based system, agents across the state were included in review and decision making on a continuous basis.
Building on the open dialogue established throughout the state, as well as with colleagues across the country, the task force developed a goal and a list of objectives for the new reporting system. The goal was:
To establish an effective and efficient reporting system that is user-friendly, easily accessible, and satisfies needed organizational accountability requirements.
With the goal as the guiding principle, objectives were then developed that included:
- Provide cumulative program progress;
- Provide a mechanism for reporting program success;
- Capture State Major Programs (SMP), other programs, and special projects;
- Be accessible at all organizational levels;
- Capture creative uses of program delivery;
- Meet the reporting requirements of the Cooperative State Research, Education, and Extension Service (CSREES), the federal partner;
- Be user friendly;
- Be continuously updated, accessible, and monitored, and include the data and information necessary for reports at all levels of the organization;
- Provide continuous, comprehensive instruction and training for proficient use of the system, including inputs and outputs;
- Provide a continuous allocation of resources, including personnel, hardware, and software;
- Reduce information processing.
Throughout this conceptual process, Extension administration and program leadership were continuously involved to assure that everyone was on the same conceptual plane. After creation of the objectives, a list of parameters was developed. It included such specifics as what was required and when, time-lines, items needed for adequate accountability, and items that would be useful to have but not seen as vital for organizational accountability purposes.
The next step was actual development of a diagrammatic model design. The design had to include all necessary components as well as those thought to be important for future reporting needs. Also, one objective was to design a system that was all-inclusive; that is, only one reporting system rather than several different ones, all functioning slightly differently. As the model was designed, all interested persons were given the opportunity to review it and make suggestions.
The State Major Plan Task Force spent hours during its monthly meetings intensely analyzing each component and recommending changes. Actual time-lines for required reporting probably created the greatest discussion; ultimately, the Extension Administrative Council would have the final say in what was or was not included and when reporting would occur. Similarly long discussions were common among the administrative group as well. Often, seemingly tiny adjustments would precipitate lengthy and intense discussion and analysis. Ultimately, a final diagrammatic model was accepted as the blueprint for the system.
The next steps included development of input and output specifications to guide the Extension Technology Services information systems group in designing the required computer programs. In tandem with this ongoing process, parameters were being established by the State Major Plan chairs to identify specific program measures for each plan objective. Parameters indicating program progress and results would become the major component of the entire reporting system.
With key sections and components of each section identified, it was clear that the major part of the system would be the measures of progress (MOPS) and Impacts associated with each of the objectives within each of the twenty State Major Plans. Since real program outcomes rather than inputs were now the focus, providing the guidance and training required to develop realistic MOPS and Impacts was a daunting task. Altogether, seventy-six objectives were given MOPS and Impacts on which reports would be completed. During this process, many individual SMP task forces recognized the significance of their expected outcomes, and many decided they had perhaps been too creative in developing a large number of measures. Ultimately, due to the press of software development requirements, final decisions had to be made on the MOPS and Impacts that would be used for reporting 1996 accomplishments. An example of MOPS and Impacts is shown in Exhibit 1 for one objective in one of the twenty state major plans.
Exhibit 1. Example of Program Measures of Progress and Impacts for One of Seventy-Six State Major Program Objectives
State Major Plan
AGING WITH GUSTO!
OBJECTIVE 1. Participants in aging issues programs will increase awareness, gain knowledge, change attitudes, develop skills, and adopt practices and behaviors to help make their later years more financially secure.
Measures of Progress:
- Increased awareness and knowledge of financial management techniques and consumer issues. NUMBER _______ (5 CELLS)
- Adoption of financial management and consumer practices. NUMBER _______ (5 CELLS)
- Increased knowledge of estate planning. NUMBER _______ (5 CELLS)
- Adoption of estate planning practices. NUMBER _______ (5 CELLS)
- Increased awareness and knowledge of retirement planning and savings. NUMBER _______ (5 CELLS)
- Adoption of retirement and savings practices. NUMBER _______ (5 CELLS)
Impacts:
- Improved financial status through adoption of consumer and financial management practices. NUMBER ADOPTING __________ (4 CELLS)
- Increased savings and/or increased retirement contributions for future financial stability. DOLLARS $ __________ (7 CELLS)
- Developed and implemented an estate plan. NUMBER ___________ (4 CELLS)
- Developed and implemented a plan for possible future incompetency and dependency. NUMBER ___________ (4 CELLS)
(NOTE: CELLS indicates the number of spaces the computer allows for entering numbers)
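As a purely illustrative sketch (the article does not describe the NCCES data structures, so the class, field names, and validation rule below are hypothetical), the exhibit's entry fields could be modeled as numeric fields whose size is capped by their cell count:

```python
# Hypothetical sketch only: field labels and cell widths are taken from
# Exhibit 1, but the class and validation logic are illustrative, not the
# actual NCCES implementation.

class ReportField:
    """A numeric entry field limited to a fixed number of cells (digit spaces)."""

    def __init__(self, label, cells):
        self.label = label
        self.cells = cells   # number of spaces the computer allows for the entry
        self.value = 0

    def enter(self, value):
        value = int(value)
        if len(str(value)) > self.cells:
            raise ValueError(f"{self.label}: {value} exceeds {self.cells} cells")
        self.value = value


# Measures of Progress for Objective 1 of AGING WITH GUSTO! (5 cells each)
measures_of_progress = [
    ReportField("Increased awareness and knowledge of financial management", 5),
    ReportField("Adoption of financial management and consumer practices", 5),
    ReportField("Increased knowledge of estate planning", 5),
]

# Impacts use 4 cells for counts and 7 cells for dollar amounts
impacts = [
    ReportField("Improved financial status (number adopting)", 4),
    ReportField("Increased savings/retirement contributions (dollars)", 7),
]

measures_of_progress[0].enter(250)   # fits within 5 cells
impacts[1].enter(1250000)            # 7 digits fit the 7-cell dollar field
```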
Other components of the SMP reports include volunteers, volunteer hours and their calculated value, and cost-benefit analyses, plus a narrative description of program progress and results. The value of volunteer time is automatically calculated at a rate of $10 per hour. The literature gives wide ranges for valuing volunteer time, so a value was selected that would be reasonably conservative, yet high enough to reflect a reasonable value of one's time. A simple illustration of that calculation is sketched below (the constant and function names are ours, not the NCCES code).
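```python
VOLUNTEER_RATE = 10.00  # dollars per hour; the conservative rate selected by NCCES

def volunteer_value(hours):
    """Return the automatically calculated dollar value of reported volunteer hours."""
    return hours * VOLUNTEER_RATE

print(volunteer_value(312))   # 312 reported hours are valued at $3,120.00
```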
The cost-benefit analyses may be most difficult to make in some circumstances and relatively easy in others. Considerable discussion was focused on whether to include a requirement for the cost-benefit information. The final decision was to include such information. A fact sheet was developed to assist agents in understanding cost-benefits. While an assessment of this component may be difficult following initial use by agents, it is clear that a new paradigm is emerging in which agents are making assessments of the value of their time and its most worthy allocation for greatest impact.
Separate sections were included for success stories for planned SMP programs and for those with a non-SMP focus. Often, valuable work is performed that has not been included in a plan of work. An example was the 1996 experience with two major hurricanes and the need to provide all types of disaster relief information and assistance on short notice. With the new reporting system, the successes of these emergency educational efforts were reported and resulted in the Secretary's Honor Award for Emergency Response being presented to the North Carolina Cooperative Extension Service by USDA. To provide guidance for writing success stories, a special training fact sheet has been developed. Success stories are limited to no more than 150 words and will be truncated by the computer if they exceed the prescribed word limit.
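A minimal sketch of that word-limit rule, assuming words are delimited by whitespace (the article does not show the actual truncation routine):

```python
SUCCESS_STORY_LIMIT = 150  # maximum number of words allowed per success story

def truncate_success_story(text, limit=SUCCESS_STORY_LIMIT):
    """Keep only the first `limit` whitespace-separated words of a success story."""
    words = text.split()
    if len(words) <= limit:
        return text
    return " ".join(words[:limit])
```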
The civil rights section includes all components of previously required information. It has only been adjusted to fit the new system, with reports now required twice a year rather than once a year.
The final section allows those reporting to indicate their program delivery strategies. This section is optional, but entries will be accumulated at the state level so that usage trends can be observed. N. C. State University is currently developing a university outreach reporting system, and activity reporting is expected to be a key part of that program. Therefore, while NCCES is more interested in the MOPS and Impacts of programs, the types and numbers of program delivery activities remain important to a large number of people. As a result, this optional component of the NCCES system may later become a required entity as well.
Following the design and specifications phases of the system development, emphasis was placed on developing a computer system that could accommodate all of the intricate components of the reporting system. Initial plans rested on development of a text-based system linked to all county units.
Fortunately, an innovative idea for using a graphical user interface (GUI) in conjunction with a client-server computing model began to emerge. This idea was made possible by a major statewide investment in the NCCES information technology infrastructure. These improvements provided continuous Internet connectivity to all of the NCCES county centers and allowed for the development of a modern software solution for the new reporting system.
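For illustration only, the client-server pattern described here amounts to a browser form posting a unit's report data to a central server that accumulates it. The sketch below uses present-day Python standard-library modules and hypothetical field names, ports, and storage; it is not the original NCCES implementation, which was built with the web technology of the mid-1990s.

```python
# Illustrative client-server sketch: a county unit's browser submits form data,
# and the server accumulates the reports centrally.

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

REPORTS = []  # in-memory stand-in for central accumulation of unit reports

class ReportHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        form = parse_qs(self.rfile.read(length).decode())
        REPORTS.append({name: values[0] for name, values in form.items()})
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Report received")

if __name__ == "__main__":
    # Each county center's browser would POST its accomplishment report here.
    HTTPServer(("", 8080), ReportHandler).serve_forever()
```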
Developing a graphics-based system required members of the Extension Technology Services group to acquire new skills. This challenge was met, and a new World Wide Web graphics-based system was completed. All components of the new system were developed, tested, and released within an eight-month period during 1996.
The decision to use the World Wide Web as the delivery mechanism for the new reporting system contributed greatly to the speed with which the new system could be deployed. Most states have moved to make the World Wide Web available to all of their county units. The system developed in North Carolina would be easy to adapt and modify for use by other states.
Testing of each computer program component involved initial release of an Alpha version of the program to six volunteer counties. Comments received from the testing counties were compiled into a punch list for the program development team to analyze and use in adjusting the system as indicated. Following this step, a second, Beta version was released to the same six counties for testing and review. Only then was the system released to all reporting units. This process was time consuming, but it proved extremely valuable in that the systems released were essentially bug-free.
Training for personnel in all 100 counties was conducted to introduce them to the new system and to provide practice with Section A. Comments on the system's user friendliness have been universally positive. Also, many agents now find the task of completing their reports so easy that they enter their own information rather than giving it to secretaries for entry. Naturally, this is an evolving process, and some will continue to depend on others to make their entries. User friendliness of the system was further indicated when 97 of 102 reporting units met the accomplishment reporting deadline, which came only one month after final release of the system.
The resources devoted to development of this new NCCES reporting system have been enormous. Yet, in order to meet the organizational accountability needs that are required and expected now and in the future, we believe the NCCES has developed a system that will accomplish the objectives initially set. Thanks to rapidly emerging computer technologies, a system has been implemented that could only have been imagined a short time ago. With the similarity in information technology infrastructure deployed throughout the Cooperative Extension System, much of the work accomplished in North Carolina could be readily modified and used in other states.
Based on initial feedback, NCCES has embarked on a new system that will provide the needed focus on achieving actual program accomplishments. It is clear, however, that understanding varies widely as to what measures of progress really are and what really constitutes program impact. While obvious refinements will need to be made in the quantity and quality of reported program results, the steps being taken have led far along the path of improved focus on programs and their accomplishments.
Quality training, coupled with increased levels of knowledge and awareness of program impacts, should provide a solid basis for helping the NCCES meet all accountability needs in the future. Much was learned in constructing the NCCES system that other states could benefit from as they develop or refine their own systems.