August 1998 // Volume 36 // Number 4 // Feature Articles // 4FEA5


Florida Cooperative Extension's County Program Review Process

Abstract
County program reviews can be an important part of the program development and evaluation process. In Florida, reviews are conducted through a case study protocol that triangulates information to ensure validity and reliability. Sources of data for the review include planning and reporting documents, original educational material, observation of instruction, and personal interviews with agents, program assistants, volunteers, advisory members, local government officials, collaborators, and clientele.


Steve Jacob
Assistant Professor
Internet address: sgj@gnv.ifas.ufl.edu

Glenn D. Israel
Professor

William R. Summerhill
Professor Emeritus

Program Development and Evaluation
University of Florida
Gainesville, Florida


County program reviews provide a comprehensive assessment of the program delivery and educational services offered by the faculty and staff of a local Extension office. In Florida, the county program review process is designed to assess program quality, facilitate program improvement, foster cooperation among Extension's various units, and assist in achieving the best use of institutional resources. The information gathered also assists faculty and administration in future planning efforts and guides the evaluation of new program proposals, budget requests, and capital project requests.

Florida's county program review process is primarily a formative evaluation, intended to improve program delivery at the county level and to assess state-level program and administrative support of county faculty. To serve this end, the process includes an administrative review that assesses the support and guidance provided by county and district Extension directors. The review also assesses compliance with Affirmative Action and Americans with Disabilities Act guidelines.

The process achieves validity and reliability by incorporating multiple sources of qualitative and quantitative data, analyzed within a case study framework. Quantitative data collection begins approximately two months before the county office visit. The county director forwards all mailing lists, two years of advisory committee meeting minutes, and a list of all county faculty and staff by appointment type and/or assignment. This information is used to identify clientele, advisory members, and collaborators to interview during the on-site phase of the review. Further, it indicates the extent of the advisory committees' involvement in program planning and delivery.
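
The protocol does not prescribe how individual interviewees are drawn from these materials; purely as an illustrative sketch, assuming a hypothetical mailing-list file and a simple random draw (neither of which the article specifies), the selection step might resemble the following Python routine:

    # Minimal sketch: drawing interview candidates from a county mailing list.
    # Assumes a hypothetical CSV ("county_mail_list.csv") with "name" and "role"
    # columns (clientele, advisory member, or collaborator); the review protocol
    # does not prescribe this format or a random draw.
    import csv
    import random

    def sample_interviewees(csv_path, per_role=5, seed=1998):
        """Return up to per_role randomly chosen contacts for each role."""
        by_role = {}
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                by_role.setdefault(row["role"], []).append(row["name"])
        rng = random.Random(seed)
        return {role: rng.sample(names, min(per_role, len(names)))
                for role, names in by_role.items()}

    if __name__ == "__main__":
        for role, names in sample_interviewees("county_mail_list.csv").items():
            print(f"{role}: {', '.join(names)}")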

Other quantitative data are gathered and reviewed before the county visit, especially demographic and economic information from the Decennial Census and Census of Agriculture. Further, clientele contacts, current plans of work, and the most recent reports of accomplishment are gathered for all programs delivered in the county. Clientele contacts are compared with the demographic and economic information to assess the adequacy of program coverage and productivity, and to help identify potential new audiences. Plans of work are evaluated against criteria of logic and evidence of quality planning. Reports of accomplishment are reviewed for program outcomes, evaluation results, and indicators of effectiveness and productivity. The quantitative data begin to establish a view of county program delivery that is either confirmed or rejected by the qualitative data gathered in the county.
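
As a rough illustration of the coverage comparison described above, the sketch below contrasts contact counts with census-based group sizes and flags groups with low reach as potential new audiences. The group names, counts, and the 10% threshold are hypothetical assumptions added for illustration, not figures or rules from the Florida review process:

    # Illustrative sketch only: comparing clientele contacts with census-based
    # group sizes to flag possible coverage gaps. All names, counts, and the
    # 10% threshold are hypothetical, not data from any Florida county review.
    census_population = {
        "farm operators": 1200,
        "4-H age youth": 8500,
        "homeowners": 42000,
    }
    clientele_contacts = {
        "farm operators": 900,
        "4-H age youth": 2100,
        "homeowners": 3000,
    }

    for group, population in census_population.items():
        contacts = clientele_contacts.get(group, 0)
        reach = contacts / population  # crude ratio; contacts can include repeats
        note = "possible new audience" if reach < 0.10 else "coverage appears adequate"
        print(f"{group:15} reach = {reach:6.1%}  ({note})")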

Qualitative data are gathered through personal interviews with three major groups within the county: (a) agents, program assistants, staff, and volunteers; (b) clientele and advisory members; and (c) collaborators, including local government and agencies. The interviewers ask questions about the program development process, delivery, quality, and productivity. Generally, the review team spends 2-4 days in the county, depending on the number of faculty and staff in that particular office. As a rough rule of thumb, the team conducts 10-15 interviews for each county major program (approximately 40-80 planned days). Each faculty member typically has two or three county major programs. Agent interviews are scheduled for two hours, while most other interviews are scheduled to last 30 minutes.
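
These rules of thumb can be combined into a back-of-the-envelope workload estimate. In the sketch below, the programs per faculty member and interviews per program come from the text, while the team size and the number of interviews a reviewer can complete per day are assumptions added for illustration:

    # Back-of-the-envelope sketch of review workload. Programs per faculty
    # member (2-3) and interviews per program (10-15) come from the text;
    # team size and interviews per reviewer per day are assumed values.
    def estimate_review_load(n_faculty, team_size=5,
                             programs_per_faculty=2.5,
                             interviews_per_program=12.5,
                             interviews_per_reviewer_per_day=8):
        programs = n_faculty * programs_per_faculty
        interviews = programs * interviews_per_program
        days_on_site = interviews / (team_size * interviews_per_reviewer_per_day)
        return round(programs), round(interviews), round(days_on_site, 1)

    for n in (2, 4, 6):  # hypothetical small, medium, and large county offices
        programs, interviews, days = estimate_review_load(n)
        print(f"{n} faculty -> ~{programs} programs, ~{interviews} interviews, "
              f"~{days} days on site")

Under these assumptions, the estimate lands in roughly the 2-4 day range the text reports for most county offices.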

The review teams have consisted primarily of state Extension specialists, but also have included program area leaders and county faculty from other Extension districts in Florida. Review team members and the county faculty being evaluated have almost unanimously embraced the process, and in no case has the inclusion of a specific specialist, program area leader, or county faculty member caused any unresolved problems. This is undoubtedly due to the consideration given to the composition of the review team, which is chosen with input from the county faculty.

Initially, review sites were chosen because of personnel or productivity issues in a particular county. Since then, evaluation specialists and key administrators have communicated the purpose of the reviews as a program development and evaluation function, and most faculty in the state now have a clear understanding of that mission. District directors now nominate counties for review on an annual basis, using various rationales, which have included: (a) letting counties volunteer for review; (b) rotating counties at random; and (c) reviewing offices with new county directors.

Program reviews in Florida have become an important monitoring and evaluation tool. The process is based on case study methodology, which triangulates quantitative and qualitative data to give a reliable and valid picture of program delivery in the county. Impacts of the reviews on county faculty have been numerous. Many faculty have reinvigorated the advisory process and placed new emphasis on program planning, which has identified new audiences and programs and led to improvements in existing programs. In other cases, opportunities for county faculty to collaborate with one another across program areas have been developed. In some cases, faculty have pursued professional development plans to improve the quality of their outreach. In all cases, county program reviews have led to organizational renewal and increased attention to program quality.