June 2018 // Volume 56 // Number 3 // Feature // v56-3a2
Capturing Outcomes Often Overlooked: A Pilot Evaluation of Florida Individual Contact Teaching
Abstract
Extension provides personalized educational services collectively known as individual contact teaching (ICT), yet these services often are not evaluated. This article presents results from pilot testing of an approach for evaluating ICT. Extension representatives and clientele provided data on ICT events, primarily office consultations and landscape site visits. Most client respondents were very satisfied with the services received; had increased their knowledge, skills, and future preparedness to address the applicable issue; and had changed a behavior following the education. These positive findings reveal the importance of evaluating such services. Extension professionals are encouraged to use a personalized evaluation approach such as the one described to ensure that the collective value of ICT is captured.
Introduction
Extension professionals nationwide provide personalized services in the form of on-site visits, phone calls, email consultations, and office visits. These individual contact teaching (ICT) methods (Seevers & Graham, 2012) can offer a more integrative service experience whereby a client may have increased opportunity to clarify an issue, integrate new information, and gain a more in-depth understanding of a topic (Guion, 2006). Some have referred to individual education as the most essential element of extension education (Oakley & Garforth, 1985). Research also has indicated that although ICT methods may be costly in terms of time and other resources, they have greater benefits than other educational methods (Galindo-Gonzalez & Israel, 2010; Oakley & Garforth, 1985). For example, use of ICT can help establish and maintain credibility between agents and their clientele, stimulate beneficial behavior adoptions, and result in economic value impacts, as adoption of ICT recommendations has been shown to produce operational savings for clients (Petrzelka, Padgitt, & Wintersteen, 1999).
Despite these and other findings, there remains a gap between the ICT activities conducted by Extension and those ultimately evaluated for broader impact outcomes (Warner, 2015). It has been recommended that Extension professionals shift the evaluative focus beyond participation-rate objectives, a focus attributable in part to state and federal reporting requirements and general accountability pressures (Lamm, Israel, Diehl, & Harder, 2011). Additionally, recent needs assessments have identified several barriers to capturing a broader range of ICT outcomes, including the variability of ICT consultation formats, perceived lack of time for evaluation, and the absence of a standardized evaluation tool for recording and meaningfully interpreting client feedback (Warner, 2015).
The University of Florida ICT Evaluation Survey Tool is an evaluation instrument designed to improve documentation of agent–client interactions and create a database for evaluating these interactions (Ali & Warner, 2017). The expectation is that implementation of the tool will mitigate the perceived barriers of time, diversity of recommendations, and lack of a standardized framework for effectively notating and measuring participation, reaction, instructional outcomes, practice-level behavior changes, and longer term social, economic, and environmental condition changes. Instructional outcomes include clientele's changes in knowledge, attitudes, skills, and aspirations (KASA) that are directly related to the practice-level behaviors.
Evaluation is "assessing what was intended (goals and objectives), what happened that was unintended, what was actually implemented, and what outcomes and results were achieved" (Patton, 2008, p. 5). With supportive tools and resources, evaluations of ICT can meet this definition and more. Evaluation of personalized teaching activities can provide tangible benefits not only for clientele, in the form of more adaptive and targeted service, but also for Extension professionals, who gain more immediate, thorough feedback that can be critical for enhanced program planning and stronger relationships with the community at large (Warner, 2015). The existing emphasis on participation-level ICT evaluation therefore underutilizes the full range of outcomes that ICT methods can produce. Through a pilot test of the University of Florida ICT Evaluation Survey Tool, we sought to measure a more diverse range of outcomes and to identify relevant implications for Extension professionals and the community members they serve.
Purpose and Objectives
The purpose of our study was to measure participation, reactions, changes in KASA, and practice-level outcomes related to Florida ICT services in order to identify implications for U.S. Extension professionals who educate Extension clients individually and to inform best uses for the University of Florida ICT Evaluation Survey Tool. The specific objectives that guided the study were as follows:
- Describe common types of ICT educational events.
- Describe common topics of ICT educational events.
- Measure the level of satisfaction, perceived knowledge gain, and behavioral intent resulting from ICT education.
- Assess behavior change 6–12 months following ICT education.
Evaluation Approach
Instrumentation
We designed the evaluation tool in three parts in Qualtrics (Qualtrics Research Suite, 2009). We constructed the tool so that it could be customized to each educational interaction, given the wide range of possible topics. We designed the first part to be filled out by Extension representatives who could be involved with evaluating ICT. There were fields for the Extension representative to input his or her role (Extension Agent, Master Gardener, Program Assistant, County Extension Director, or other), the client name and email address, type of interaction (email consultation, office visit, site visit, telephone consultation, workshop), and, optionally, the specific recommendation. After engaging with clients, Extension representatives submitted the consultation form online, thereby documenting the interaction provided to the client.
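For readers who wish to adapt this approach outside Qualtrics, the record captured by this first part of the tool can be expressed as a simple data structure. The following Python sketch is only our illustration; the `ConsultationRecord` class and its field names are hypothetical and are not part of the actual instrument.

```python
from dataclasses import dataclass
from typing import Optional

# Controlled vocabularies taken from the form fields described above.
ROLES = {"Extension Agent", "Master Gardener", "Program Assistant",
         "County Extension Director", "Other"}
INTERACTION_TYPES = {"email consultation", "office visit", "site visit",
                     "telephone consultation", "workshop"}

@dataclass
class ConsultationRecord:
    """One submission of the first (Extension representative) part of the tool.

    Class and field names are hypothetical illustrations, not the actual
    Qualtrics implementation.
    """
    representative_role: str               # one of ROLES
    client_name: str
    client_email: str                      # later used to trigger the client survey
    interaction_type: str                  # one of INTERACTION_TYPES
    recommendation: Optional[str] = None   # optional specific recommendation

    def __post_init__(self):
        if self.representative_role not in ROLES:
            raise ValueError(f"Unknown role: {self.representative_role}")
        if self.interaction_type not in INTERACTION_TYPES:
            raise ValueError(f"Unknown interaction type: {self.interaction_type}")
```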
The second part was a follow-up client survey sent directly to the client via the email trigger function in Qualtrics upon the Extension representative's submission of the first part of the instrument. The Extension client was asked to indicate level of satisfaction with the service, whether the question posed was answered, and whether his or her knowledge, skill, and preparedness to address the issue had increased (Table 1). We also asked whether the client planned to follow the specific recommendation if one was provided by the Extension representative. If the Extension representative did not provide a specific recommendation, the client was instead asked whether he or she intended to adopt a practice; if the client indicated yes, an open text box was displayed so that the client could describe the intended change. (A sketch of this branching logic follows Table 1.)
Table 1. Client Survey Questions and Response Options

| Customer satisfaction survey question | Response options |
| --- | --- |
| How satisfied are you with the service you received from your Extension office? | Very dissatisfied; Dissatisfied; Satisfied; Very satisfied; Unsure |
| Was your question answered? | Yes; No; Unsure |
| As a result of contacting the Extension office, I: have increased my knowledge about this question; am better prepared to address the issue in the future; have increased my skill level associated with managing this question | Strongly disagree; Disagree; Agree; Strongly agree; Unsure |
| Do you plan to do the specific recommendation provided by your Extension office? | Yes; No; Unsure |
| (When no recommendation was provided) Do you plan to change any practices as a result of the consultation you received? | Yes; No; Unsure |
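The branching described above can be summarized in a few lines of ordinary code. This Python sketch is our illustration of the flow, not Qualtrics survey logic; the function name is hypothetical, and the question wording is taken from Table 1.

```python
from typing import Optional

def select_intent_question(recommendation: Optional[str]) -> str:
    """Return the behavioral-intent question the client should see."""
    if recommendation:
        # A specific recommendation was documented, so ask about it directly.
        return ("Do you plan to do the specific recommendation "
                "provided by your Extension office?")
    # No recommendation was documented, so ask about practice change generally.
    # (A "Yes" answer reveals an open text box for describing the change.)
    return ("Do you plan to change any practices as a result of "
            "the consultation you received?")
```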
By embedding data from the first part of the survey in the second part, we were able to create a combined data set containing information from both the Extension representative and the client, meaning that diverse educational interactions could be collectively evaluated. This approach also allowed us to personalize the second part of the tool for each client, which helped improve the response rate (Dillman, Smyth, & Christian, 2009). Upon client completion of the customer satisfaction survey, the feedback was recorded in Qualtrics and reflected in the relevant county report.
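Outside Qualtrics' embedded-data feature, a similar combined data set could be assembled by joining exports of the two parts on a shared identifier. Below is a minimal pandas sketch, assuming hypothetical CSV exports and a shared `response_id` column; all file and column names are our own, not those of the actual tool.

```python
import pandas as pd

# Hypothetical CSV exports of the two survey parts.
consultations = pd.read_csv("part1_consultations.csv")     # representative data
client_responses = pd.read_csv("part2_client_survey.csv")  # client feedback

# Embed the representative's data in each client response via a shared
# identifier, yielding one combined record per evaluated interaction.
combined = client_responses.merge(consultations, on="response_id", how="left")
combined.to_csv("combined_ict_data.csv", index=False)
```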
The third part was administered 6 to 12 months after the ICT event to confirm behavior change among those respondents who had indicated they would adopt a new practice following their consultations. The follow-up included a reminder that the client had indicated planning to make a change and one question to assess this change (Table 2).
Table 2. Follow-Up Survey Question and Response Options

| Question | Response options |
| --- | --- |
| Did you change a practice as recommended by your Extension office? | Yes, I am already doing this; No, but I intend to do this soon; No, I don't intend to do this in the future |
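Determining who receives this third part amounts to a simple filter on the combined data set: only clients who answered yes to the behavioral-intent item are eligible. A sketch continuing the hypothetical file and column names from the previous example:

```python
import pandas as pd

# `combined` is the merged data set produced in the earlier sketch.
combined = pd.read_csv("combined_ict_data.csv")

# Clients who answered "Yes" to the behavioral-intent item are eligible
# for the 6- to 12-month follow-up survey.
eligible = combined[combined["plans_to_change"] == "Yes"]
followup_contacts = eligible[["client_name", "client_email"]]
```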
Recruitment
We publicized the availability of the tool for pilot testing during two professional development sessions and an internal conference presentation on ICT evaluation. Twenty individuals expressed interest in participating in the pilot test, and by the end of the pilot test period we had received data from six counties. The second part of the survey was sent to 113 recipients of ICT, and 48 answered it, for a response rate of 42.5%. The follow-up survey was sent to 40 eligible respondents, and 24 provided responses, for a response rate of 60.0%.
Data Analysis
We analyzed our data using SPSS (Version 24.0; IBM Corp., Armonk, NY). Frequencies and percentages were used to meet the first, second, and third objectives of the study. We categorized the topics covered during the ICT events using the open-ended responses provided by the Extension representatives or, when the Extension representative did not provide this information, by the client. Prior to the study, our research protocol was approved by the University of Florida Institutional Review Board.
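Although we used SPSS, the frequencies and percentages reported below are straightforward to reproduce in other tools. The pandas sketch that follows uses hypothetical column names; note that topics are a multi-response item, which is why the percentages in Table 4 sum to more than 100%.

```python
import pandas as pd

combined = pd.read_csv("combined_ict_data.csv")  # merged data from earlier sketch

# Single-response item: frequency and percentage of each interaction type.
type_counts = combined["interaction_type"].value_counts()
type_pcts = (type_counts / type_counts.sum() * 100).round(1)

# Multi-response item: one event can cover several topics, so percentages
# are computed against the number of events, not the number of topic
# mentions. This is why the Table 4 percentages sum to more than 100%.
topics = combined["topics"].str.split(";").explode().str.strip()
topic_counts = topics.value_counts()
topic_pcts = (topic_counts / len(combined) * 100).round(1)
```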
Results
Objective 1: Describe the Most Common Types of ICT Educational Events
The most common type of ICT event reported by the Extension representative respondents was the office visit (f = 21, 44.7%), followed by the site visit (f = 13, 27.7%) (Table 3).
Table 3. Types of ICT Educational Events

| Type of interaction | Frequency | Percentage |
| --- | --- | --- |
| Office visit | 21 | 44.7 |
| Site visit | 13 | 27.7 |
| Email consultation | 7 | 14.9 |
| Telephone consultation | 5 | 10.6 |
| Workshop | 1 | 2.1 |

Note. N is less than 48 because one Extension representative did not provide this information.
Objective 2: Describe the Most Common Topics of ICT Educational Events
The most common topic addressed by individual contact educational events was plant nutrition (f = 23, 48.9%), followed by vegetable garden/fruit tree management (f = 21, 44.7%) and irrigation management (f = 21, 44.7%) (Table 4).
Table 4. Topics Addressed by ICT Educational Events

| Topic | Frequency | Percentage |
| --- | --- | --- |
| Plant nutrition | 23 | 48.9 |
| Vegetable garden/fruit tree management | 21 | 44.7 |
| Irrigation management | 21 | 44.7 |
| Plant selection | 17 | 36.2 |
| Pest management | 14 | 29.8 |
| Problem diagnosis | 13 | 27.7 |
| Management of ornamental plants and trees | 12 | 25.5 |
| Weed management | 5 | 10.6 |
| Lawn management | 2 | 4.3 |
| Other | 3 | 6.4 |

Note. N is less than 48 because one Extension representative did not provide this information. Frequencies and percentages sum to greater than 47 and 100%, respectively, because individual contact teaching events often covered more than one topic.
Objective 3: Measure the Level of Satisfaction, Perceived Knowledge Gain, and Behavioral Intent Resulting from ICT Education
Most of the client respondents (f = 42, 93.3%) were very satisfied or satisfied with the service received during the ICT education (Table 5). Although three individuals indicated that they were very dissatisfied, we believe they intended to indicate very satisfied, as their other responses and open-ended feedback were very positive. All respondents (f = 45) indicated that they had increased their knowledge, skills, and preparedness to address their issues as a result of the ICT. Most of the respondents (f = 41, 91.1%) intended to change a behavior as a result of the ICT; the remaining respondents (f = 4, 8.9%) were unsure whether they would make any changes, and no one indicated an intention not to change.
Table 5. Client Satisfaction with ICT Services

| Satisfaction level | Frequency | Percentage |
| --- | --- | --- |
| Very satisfied | 40 | 88.9 |
| Satisfied | 2 | 4.4 |
| Very dissatisfied | 3 | 6.7 |

Note. N is less than 48 because three respondents did not provide this information. Although three individuals indicated they were very dissatisfied, we believe they intended to indicate very satisfied, as their other responses and open-ended feedback were very positive.
Objective 4: Assess Behavior Change 6–12 Months Following ICT Education
The follow-up survey indicated that most respondents (f = 21, 87.5%) had already changed their landscape practices as a result of participating in the ICT. The remaining respondents (f = 3, 12.5%) indicated that they had not changed their behaviors but intended to do so soon.
Conclusions and Discussion
ICT is an important part of Extension, allowing for personalized service for clientele (Oakley & Garforth, 1985; Petrzelka et al., 1999; Warner, 2015). Summary data indicate that Florida Extension provides ICT services most often in the form of office consultations and landscape site visits. Extension professionals in Florida diagnose problems, answer questions, and provide recommendations in a wide array of topic areas. Our results indicate that in Florida ICT services are predominantly related to plant nutrition, management of vegetable gardens and fruit trees, and landscape irrigation management. They also demonstrate that Extension clients who receive these personalized services are generally very satisfied and are highly likely to change their behavior as a result, findings consistent with other research on this teaching method (Oakley & Garforth, 1985; Petrzelka et al., 1999).
The very high satisfaction rating and occurrence of behavior change emphasize the importance of evaluating ICT. Warner (2015) stated that a personalized evaluation tool could be a valuable feedback mechanism for providing useful information that would help Extension personnel evaluate their ICT interactions. The approach described here can help build rapport with clientele (Oakley & Garforth, 1985) and can be used to overcome challenges associated with diversity of ICT consultations (Warner, 2015). Accordingly, Extension professionals should examine their current ICT evaluation strategies on the basis of the results reported here.
We remind the reader that our findings are based on a small sample drawn from Extension professionals who volunteered for this pilot study, and we do not intend to generalize our results to all Extension clients who engage in ICT. However, we believe the positive findings illustrate outcomes that are often overlooked when these educational offerings are documented only at the participation level. Rather than stopping at participation counts, Extension professionals should consider also measuring knowledge and skill increases along with behavior change and resulting impacts (Lamm et al., 2011).
To further ensure that Extension agents have easy access to client feedback and the impacts of their ICT education, we have developed links to individual county reports that allow agents to access their local data at any time. As clients respond to the client satisfaction surveys, the reports are automatically updated in Qualtrics, so agents who view them regularly stay up-to-date on the impact, and thus the quality, of their services. Importantly, they also are made aware of ICT educational topics that should be revised or maintained. An added benefit is that agents have easy access to ICT records, which can be incorporated into their annual reports and plans of work.
Our next steps include further promoting the tool for statewide adoption and collecting user data to ensure that the tool remains functional and easy to use. An experimental design and statistical comparisons with other evaluation methods are needed to support stronger conclusions about this pilot-tested evaluation tool. We believe that future research is needed to understand how to structure ICT evaluation effectively. Research that examines how outcomes connect across the stages of evaluation, and whether further personalizing the follow-up behavior change survey improves responses, may provide valuable lessons for the overall process. We also suggest interviewing the Extension professionals who engaged in this new evaluation method to capture their perceptions of the approach. Petrzelka et al. (1999) asserted that ICT methods were an important educational tool for the practice of extension in the 21st century. We enthusiastically agree and add that Extension professionals should ensure appropriate and sound evaluation to capture the real value of these important educational offerings.
Acknowledgment
Our work was supported by the University of Florida Early Career Scientist Seed Fund.
References
Ali, A. D., & Warner, L. A. (2017, March). Client satisfaction and Extension services: Measuring the effectiveness of the individual contact teaching tool. Poster presented at the University of Florida Center for Landscape Conservation and Ecology Urban Landscape Summit, Gainesville, Florida.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: John Wiley & Sons.
Galindo-Gonzalez, S., & Israel, G. D. (2010). The influence of type of contact with Extension on client satisfaction. Journal of Extension, 48(1), Article 1FEA4. Available at: http://www.joe.org/joe/2010february/a4.php
Guion, L. A. (2006). Educational methods for Extension programs. Retrieved from University of Florida Digital Collections, http://ufdc.ufl.edu/IR00002176/00001
Lamm, A. J., Israel, G. D., Diehl, D., & Harder, A. (2011). Evaluating Extension programs. Retrieved from Florida Cooperative Extension Electronic Data Information Source, http://edis.ifas.ufl.edu/wc109
Oakley, P., & Garforth, C. (1985). Guide to extension training. Rome, Italy: Food and Agriculture Organization of the United Nations. Retrieved from http://www.fao.org/docrep/t0060e/T0060E00.htm
Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage.
Petrzelka, P., Padgitt, S., & Wintersteen, W. (1999). Extension's portfolio for the 21st century: A place for one-on-one consultations. Journal of Extension, 37(6), Article 6COM1. Available at: http://www.joe.org/joe/1999december/comm1.php
Qualtrics Research Suite. (2009). Provo, UT: Qualtrics Labs Inc.
Seevers, B., & Graham, D. (2012). Education through Cooperative Extension (3rd ed.). Fayetteville, AR: University of Arkansas.
Warner, L. A. (2015). Evaluating horticultural site visits and individual teaching activities in Extension. Journal of Extension, 53(4), Article 4COM2. Available at: http://www.joe.org/joe/2015august/comm2.php