June 2004 // Volume 42 // Number 3 // Research in Brief // 3RIB2
Use of Information Technology by County Extension Agents of the Florida Cooperative Extension Service
Abstract
Mixed-mode data collection using a dedicated Web site and traditional paper instrumentation was used to investigate information technology use by county agents of the Florida Cooperative Extension Service. The response to this census was 90.33% (n = 299). Patterns of hardware and software use on the job and self-rated overall IT skills were examined. Agents also self-assessed their ability to perform specific tasks using selected software. Future IT training needs were assessed via a "felt-needs" methodology. Results include significant differences in self-rated IT skills between age groups, high weekly hours of computer use, and a ranked list of training needs.
Introduction
During the early days of the personal computer's diffusion, Cantrell (1982) reported that Extension educators, lacking computer competencies, were in jeopardy of becoming less computer literate than their clientele. Ten years later, Ruppert (1992) stated "Extension educators cannot escape the computer revolution and will be challenged in their roles with the responsibility of helping people understand and make the best use of such technology" (p. 4). Eight years later, after monumental technological progress in personal computing, Albright (2000) stated that knowledge had become the central focus of the global economy and that a transition to "incorporate the technology required for the dissemination of knowledge" (p. 11) is nowhere more important than within organizations that have always produced knowledge (i.e., Extension).
It can thus be asserted that the ability of Extension agents to use computers, software, and associated peripheral devices to serve clientele, conduct research, and support Extension's administrative infrastructure has become an essential job-related skill.
Purpose
The purpose of the study reported here was to investigate the use of information technology (IT) by county Extension agents of the Florida Cooperative Extension Service (FLCES). This population was chosen due to its access to the Internet, convenience, and the longitudinal perspective provided by the Ruppert study. The objectives were to:
- Identify FLCES county Extension agents' demographic characteristics and describe their self-rated level of overall IT skills vis-à-vis those characteristics.
- Determine how FLCES county Extension agents are using IT on the job, and assess specific software skills.
- Determine future IT training needs for FLCES county Extension agents.
Procedure
Population
The population for the study consisted of the 331 county Extension agents in the employ of the Florida Cooperative Extension Service at the time the data were collected (population statistics courtesy of FLCES District Extension Director's Office, 05/22/2002).
Instrument
Data for this study were collected by way of an instrument that was adapted from Albright's 2000 survey of Texas county Extension agents' computer skills. Albright's instrument, the "Survey of Computer Technology Skills and Training Needs" (SCTS), used computer competencies from two documents: The Texas Education Agency's "Texas Essential Knowledge and Skills (for Texas teachers, and Texas students in grades K-12)" and the "Texas Technology Essential Knowledge and Skills (for Texas teachers, and Texas students as of the 8th grade)." The latter is a document Albright reports as having "quickly become a national standard among educational institutions" (Albright, 2000, p. 38). The adaptation, which included 99 individual items, was subjected to expert review and a pilot test.
To determine future training needs, the SCTS employed the methodological framework of the Borich (1980) Needs Assessment Model, as verified by Barrick, Ladewig, and Hedges (1983). Accordingly, county agents were asked "to report self-perceived technology skills, their ability to apply the skills to their work, and their perception of the importance of the technology skills" (Albright, 2000, p. 59).
This information is measured by three questions that ask an agent to assess the importance of, ability to apply, and knowledge about the technology skill (e.g., word processing) being considered. Each question is answered on a 5-point Likert-type scale (1 being low). Three constructs, "Importance," "Knowledge," and "Application," are thus measured by the questions and are operationalized as follows.
- Importance: Importance of this skill to your job function.
- Knowledge: Your knowledge of this topic (your ability to accurately recall or summarize the subject matter).
- Application: Your ability to use this skill in your job. (Albright, 2000, p. 62)
To analyze the data, a "score," equal to the overall mean level of response, is calculated for each of the three constructs. Albright (2000) posits that a weighting of this score, which combines agents' knowledge of a skill with the level of importance they ascribe to it, "is a stronger relationship to consider for training" (p. 84) than the knowledge score alone. The weighting is computed from the scores of the Knowledge and Importance constructs: (Importance Mean − Knowledge Mean) × Importance Mean = Training Need (Albright, 2000, p. 87). The effect of the weighting is to accentuate the importance of a skill when knowledge of that skill is low. Performed for all skills considered (e.g., word processing), the weighted knowledge scores are then ranked, with the most pressing training needs garnering the highest values.
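To make the weighting concrete, here is a minimal Python sketch that applies the formula to the construct means reported later in Table 9 and ranks the results. The function name and data layout are illustrative assumptions; only the formula and the means come from the article.

```python
# Borich-style weighted training-need score, as described above:
# (Importance Mean - Knowledge Mean) x Importance Mean

def training_need(importance_mean: float, knowledge_mean: float) -> float:
    """Weight a skill's knowledge gap by its perceived importance."""
    return (importance_mean - knowledge_mean) * importance_mean

# (importance mean, knowledge mean) per skill area, from Table 9.
construct_means = {
    "E-mail":          (4.42, 3.44),
    "Presentation":    (4.10, 3.29),
    "Word Processing": (4.36, 3.61),
    "WWW":             (3.95, 3.49),
    "Web Development": (3.45, 3.00),
    "Spreadsheet":     (3.00, 2.67),
}

# Rank skills so the most pressing training needs carry the highest scores.
ranked = sorted(construct_means.items(),
                key=lambda item: training_need(*item[1]),
                reverse=True)
for rank, (skill, (imp, kno)) in enumerate(ranked, start=1):
    print(f"{rank}. {skill}: {training_need(imp, kno):.2f}")
# Prints "1. E-mail: 4.33" first, matching Table 9.
```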
Data Collection
Data collection followed a "mixed-mode" method (Ladner, Wingenbach, & Raven, 2002), in this case allowing individuals 3 weeks to complete a Web-based survey instrument before sending non-respondents a paper version of the instrument. It is believed that this method accommodates individuals who do not wish to respond via the mode (e.g., the Web) in which the survey was first offered (Dillman, 2000).
The Web-based survey was introduced by a message e-mailed to all county agents from the Dean for the Florida Cooperative Extension Service. Two days later the study commenced with an individualized e-mail sent by the researcher to each agent containing specific information on the survey's rationale, a hyperlink to the World Wide Web site hosting the survey instrument, and the agent's unique (identifying) access code. Reminder messages containing the hyperlink to the survey and the agent's unique access code were sent to non-respondents once a week over the next 3 weeks.
Three weeks after the beginning of the study, the population of agents who had not completed the Web-based instrument was sent a packet via conventional mail that included information from the researcher on the survey's rationale, a paper version of the survey instrument, and a self-addressed stamped return envelope. The information plainly stated that the survey could still be completed on-line and provided the URL to the site and the individual's unique access code. Two weeks later, a reminder message (also providing the URL to the site and the individual's unique access code) was sent by conventional mail to those agents who had not yet completed the survey in either mode. The survey concluded 2 weeks after this reminder.
Non-response error was addressed by differentiating the respondents into four groups (early on-line, late on-line, all on-line, and paper) and examining the groups for differences. No differences were found on variables of interest.
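The wave comparison can be illustrated with a brief sketch. The article does not say which statistical test the researchers used, so the independent-samples t-test below (via scipy.stats) is an assumed stand-in, and the ratings shown are placeholders rather than study data.

```python
# Hypothetical non-response check: compare early Web respondents with paper
# respondents on a variable of interest (here, self-rated IT skill, 1-5).
from scipy import stats

early_online = [4, 3, 5, 4, 3, 4, 4]  # placeholder ratings
paper        = [3, 4, 3, 4, 4, 3, 4]  # placeholder ratings

t_stat, p_value = stats.ttest_ind(early_online, paper, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A non-significant p suggests late/paper respondents resemble early
# respondents, supporting generalization to non-respondents.
```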
Findings
Demographic Characteristics and Self-Rated IT Skills of the Respondents
Two hundred ninety-nine agents, or 90.33% of the population, completed the survey either on-line or on paper. By gender, the respondents were 57.86% female and 42.14% male, figures close to those of the general population of FLCES county Extension agents (58.01% female and 41.99% male) at the beginning of the study. The majority of respondents (63.54%) indicated that their age fell between 41 and 60 years. Most respondents (69.90%) reported work experience, including both inside and outside of Extension, of 16 or more years. Table 1 presents this information.
Table 1. Demographic Characteristics of the Respondents

| Characteristic | N | %N |
|---|---|---|
| Gender | | |
| Male | 126 | 42.14 |
| Female | 173 | 57.86 |
| Age Group | | |
| 20-30 | 35 | 11.71 |
| 31-40 | 51 | 17.06 |
| 41-50 | 97 | 32.44 |
| 51-60 | 93 | 31.10 |
| 61-70 | 19 | 6.35 |
| No response | 4 | 1.34 |
| Years Work Experience | | |
| Less than 5 years | 22 | 7.36 |
| 5-10 years | 31 | 10.37 |
| 11-15 years | 34 | 11.37 |
| 16+ years | 209 | 69.90 |
| No response | 3 | 1.00 |
Agents were asked to self-rate their overall IT skills on a scale from "very poor" to "excellent." As shown in Table 2, 84.95% of the respondents reported their skills to be either "average" or "above average." By gender, 85.37% of the males and 84.97% of the females rated their skills as either "average" or "above average."
Table 2. Self-Rated Overall IT Skills of the Respondents

| Overall IT Skills Rating | N | %N |
|---|---|---|
| Very Poor | 3 | 1.00 |
| Poor | 18 | 6.02 |
| Average | 129 | 43.14 |
| Above Average | 125 | 41.81 |
| Excellent | 22 | 7.36 |
| No Response | 2 | 0.67 |
Analysis of variance determined that statistically significant differences in mean self-rated overall IT skills existed between the age groups: F(4, 292) = 3.59, p < .007. A Duncan's test (Table 3) was then performed, which showed that the differences lay between the 61-70 age group and the younger age groups, excluding the 51-60 age group, which did not differ significantly from either grouping.
Table 3. Duncan's Grouping of Mean Self-Rated IT Skills by Age Group

| Levels of the Independent Variable | N | Mean | Duncan Grouping |
|---|---|---|---|
| Age Group 20-30 | 35 | 3.714 | A |
| Age Group 31-40 | 49 | 3.714 | A |
| Age Group 41-50 | 97 | 3.474 | A |
| Age Group 51-60 | 93 | 3.387 | A B |
| Age Group 61-70 | 19 | 3.105 | B |
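As a rough sketch of this analysis, the snippet below runs a one-way ANOVA on self-rated IT skill across age groups with scipy.stats.f_oneway. Duncan's multiple range test is not available in SciPy or statsmodels, so the post-hoc step substitutes Tukey's HSD, a different but commonly available procedure; the rating lists are placeholders, not the study's raw data.

```python
# One-way ANOVA of self-rated overall IT skill (1-5) across age groups,
# followed by a pairwise post-hoc comparison. Placeholder data only.
from scipy import stats

ratings_by_age = {
    "20-30": [4, 4, 3, 5, 4],
    "31-40": [4, 3, 4, 4, 4],
    "41-50": [3, 4, 3, 4, 3],
    "51-60": [3, 3, 4, 3, 4],
    "61-70": [3, 3, 3, 2, 3],
}

f_stat, p_value = stats.f_oneway(*ratings_by_age.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Post-hoc step (Tukey's HSD in place of Duncan's test, which SciPy lacks):
posthoc = stats.tukey_hsd(*ratings_by_age.values())
print(posthoc)
```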
How FLCES County Extension Agents Are Using Information Technology on the Job
Average Weekly Computer Use
As shown in Table 4, 113 agents (37.79%) responded that they use their computers, both at home and at work, over 20 hours a week. Another 78 agents (26.09%) reported computer use of 16 to 20 hours per week.
Table 4. Average Weekly Computer Use (Home and Work)

| Level of Use | N | %N |
|---|---|---|
| 1-5 hours/week | 18 | 6.02 |
| 6-10 hours/week | 44 | 14.72 |
| 11-15 hours/week | 46 | 15.38 |
| 16-20 hours/week | 78 | 26.09 |
| 20+ hours/week | 113 | 37.79 |
Use of Electronic Mail
Asked if they use e-mail, 100% (n = 299) of the respondents answered "yes." Agents were then asked to give their average daily use of e-mail. As shown in Table 5, 26.42% of the agents responded "31-45 minutes a day," and 25.08% responded "46-60 minutes a day." Asked if they use e-mail to communicate with clientele, a large majority, 91.97%, of the agents said "yes."
Table 5. Average Daily Use of E-mail

| Average Daily Use | N | %N |
|---|---|---|
| 0-15 minutes a day | 16 | 5.35 |
| 16-30 minutes a day | 73 | 24.41 |
| 31-45 minutes a day | 79 | 26.42 |
| 46-60 minutes a day | 75 | 25.08 |
| Over 60 minutes a day | 56 | 18.73 |
A follow-up question asked agents to estimate the number of clientele they reached each month via e-mail. Table 6 details the results.
Table 6. Estimated Number of Clientele Reached Each Month via E-mail

| Est. No. of Clientele Reached | N | %N |
|---|---|---|
| 1-25 clientele per month | 170 | 56.86 |
| 26-50 clientele per month | 47 | 15.72 |
| 51-75 clientele per month | 14 | 4.68 |
| 76-100 clientele per month | 19 | 6.35 |
| 100+ clientele per month | 29 | 9.70 |
| Not applicable | 16 | 5.35 |
| No response | 4 | 1.34 |
Use of Presentation Software
Asked if they use presentation software such as Microsoft PowerPoint or Corel Presentations, 245 (81.94%) of the respondents answered "yes." The majority of agents, 74.92% (n = 224), used Microsoft PowerPoint, while 6.02% (n = 18) used Corel Presentations, and 1.34% (n = 4) indicated that they used another product. Table 7 shows how often agents reported using presentation software.
Table 7. Average Yearly Use of Presentation Software

| Average Yearly Use | N | %N |
|---|---|---|
| 0-5 times a year | 59 | 19.73 |
| 6-10 times a year | 51 | 17.06 |
| 11-15 times a year | 33 | 11.04 |
| 16-20 times a year | 30 | 10.03 |
| More than 20 times a year | 76 | 25.42 |
| No response | 50 | 16.72 |
Use of the World Wide Web
Agents were asked if they "could 'surf' or browse the Internet." Two hundred ninety-four, or 98.33% of the respondents, answered "yes." When asked the open-ended question, "in general, what is your opinion of the World Wide Web and its use in Extension work," 226 agents voluntarily responded in writing. Of these agents, almost all, 97.34% (n = 220), used very strong positive statements such as "indispensable," "wonderful resource," "vital for quick information & ideas," "absolutely essential," "invaluable," or "very important."
Assessing Specific Software Skills
The study endeavored to generate an objective assessment of agents' computer skills by asking if they could perform specific tasks. To this end, a series of yes/no questions asked about skills associated with the six types of software considered by the study. As indicated in Table 8, questions about skills associated with e-mail and surfing the World Wide Web received the highest percentages of "yes" responses.
Table 8. Specific Software Skills ("Yes" Responses)

| Specific Software Skill | N | %N |
|---|---|---|
| E-mail | | |
| Can you attach and send files (attachments) through e-mail? | 285 | 95.32 |
| Are you a member of an e-mail listserv? | 281 | 93.98 |
| Can you find addresses in your e-mail program's address book? | 281 | 93.98 |
| Can you create and use e-mail distribution lists using your e-mail program? | 214 | 71.57 |
| Can you access your e-mail away from the office? | 197 | 65.89 |
| Do you use e-mail folders to organize sent or received e-mail messages? | 179 | 59.87 |
| Word Processing | | |
| Can you use edit features such as cut and paste? | 282 | 94.31 |
| Can you set page margins? | 263 | 87.96 |
| Can you create tables? | 236 | 78.93 |
| Can you set tabs? | 228 | 76.25 |
| Can you perform "mail merge" using a dataset of names, etc.? | 95 | 31.77 |
| Spreadsheet | | |
| Can you format cells in a spreadsheet to number, currency, etc.? | 153 | 51.17 |
| Can you sort data in a spreadsheet? | 125 | 41.81 |
| Can you write formulas in a spreadsheet? | 121 | 40.47 |
| Can you create a graph or chart using the spreadsheet software? | 117 | 39.13 |
| Can you use nested functions in a spreadsheet? | 64 | 21.40 |
| Presentation Software | | |
| Can you use different views in the presentation software package? | 228 | 76.25 |
| Can you insert graphics and pictures from a variety of resources? | 205 | 68.56 |
| Can you create a master slide? | 200 | 66.89 |
| Can you create a slide show that runs automatically? | 173 | 57.66 |
| Can you create automatic builds and transitions? | 147 | 49.16 |
| World Wide Web | | |
| Can you use a search engine such as Yahoo or Google to find Web pages? | 293 | 97.99 |
| Can you download files from the Internet? | 284 | 94.98 |
| Can you bookmark frequently used Web pages? | 275 | 91.97 |
| Web Page Editing/Development | | |
| Can you edit Web pages? | 62 | 20.74 |
| Can you create hyperlinks? | 57 | 19.06 |
| Can you create a Web page using MS FrontPage or another HTML editor? | 56 | 18.73 |
| Can you incorporate graphics into Web pages? | 56 | 18.73 |
| Can you convert existing files into HTML? | 48 | 16.05 |
| Can you create a Web page using native HTML? | 21 | 7.02 |
Future IT Training Needs for FLCES County Extension Agents
This aspect of the study followed the Borich methodology employed by Albright. Using the weighting method discussed earlier, training needs were calculated and ordered for each of the six computer skill areas the study considered. The results are listed in Table 9 and show that e-mail skills appeared at the top of the list of training needs.
Table 9. Ranked Training Needs by Skill Category

| Skill Category | Importance Mean | Knowledge Mean | Weighted Knowledge Score | Rank of Training Need |
|---|---|---|---|---|
| E-mail | 4.42 | 3.44 | 4.33 | 1 |
| Presentation | 4.10 | 3.29 | 3.32 | 2 |
| Word Processing | 4.36 | 3.61 | 3.27 | 3 |
| WWW | 3.95 | 3.49 | 1.81 | 4 |
| Web Development | 3.45 | 3.00 | 1.55 | 5 |
| Spreadsheet | 3.00 | 2.67 | 0.99 | 6 |
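As a quick check on the weighting, the top-ranked row follows directly from the formula given earlier: for e-mail, (4.42 − 3.44) × 4.42 = 0.98 × 4.42 ≈ 4.33, matching the tabled weighted knowledge score.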
Limitations and Implications for Practice and Research
Due to the nature of this study, the specific IT infrastructure in place within the FLCES, and specific IT knowledge and skills that might be possessed by FLCES county agents, the findings of the study cannot be generalized to Extension organizations elsewhere. However, they are likely to offer insight to those organizations.
The findings do, however, provide some interesting implications and suggest recommendations for training and professional development programs to enhance the program effectiveness of FLCES agents. E-mail, presentation software, and word processing ranked highest in terms of perceived importance and level of knowledge among the respondents. The construct scores for these items also show these skill areas ranking highest for training needs. An implication of the rankings may be that agents are indicating that, even though they feel knowledgeable about this software, they are aware that there is more to know.
From a training and professional development standpoint, a key implication of this study is that agents have a desire to learn more about the staples of their everyday IT experience: e-mail, presentation software, word processing, and the Web. But the findings related to the knowledge construct and the weighted knowledge score also suggest that such training needs to be delivered at a "beyond the basics" level, in order to ensure that agents who find such skills important, and who already have a base of knowledge, are given the support to learn more in-depth skills.
Providing more in-depth training in the skill areas agents use most, guided by what they say is important and already have some knowledge base in, has potential implications for both the success of training efforts and the effectiveness of Extension programs overall. Results of the study show that a majority of respondents had not taken a computer course since 2000. Given how rapidly software changes and is updated, this finding suggests a need for multiple levels of training, including training that updates existing skill sets.
A focus on targeted training needs can also serve to enhance program effectiveness by upgrading agents' skills in software they use frequently to deliver programs. For example, more than 81% of respondents said they used presentation software, and a majority of those users indicated they used it 11 or more times a year. Yet slightly less than half, 49.16%, could create automatic builds and transitions, a skill that can enhance the visual interest and attention level of electronic slide presentations.
Based on the results of this study, it can be concluded that agents of the Florida Cooperative Extension Service have embraced information technology and are using it on the job more than ever before. Nearly two-thirds of the agents (63.88%) reported using their computers 16 or more hours a week (including use at home). Additionally, the vast majority of agents in this study used e-mail to communicate with clientele, over three-quarters used presentation software, and just over 20% responded that they could edit or create Web pages.
These findings suggest a shift in the way Extension agents conduct their jobs and a potential change in the way Extension outreach is delivered. Agents still spend time in face-to-face interaction with clientele, but it is apparent from this study that agents may also be using information technology to facilitate routine communication and information dissemination to their clientele. Further research is needed to assess the influence of computer technologies on Extension educators' impact on clientele.
References
Albright, B. B. (2000). Cooperative Extension and the information technology era: An assessment of current competencies and future training needs of county Extension agents. (Doctoral dissertation, Texas A&M University, 2000). Dissertation Abstracts International, 61, 2668.
Cantrell, M. J. (1982). An assessment of attitudes, needs and delivery system for microcomputer applications by agricultural and extension educators in Mississippi. (Doctoral dissertation, Mississippi State University, 1982). Dissertation Abstracts International, 43, 3488.
Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York, NY: John Wiley & Sons, Inc.
Ladner, M. D., Wingenbach, G. J., & Raven, M. R. (2002). Internet and paper based data collection methods in agricultural education research. Journal of Southern Agricultural Education Research, 52(1), 49-60.
Ruppert, K. C. (1992). Factors affecting the utilization of computers by county Extension agents in Florida. (Doctoral dissertation, University of Florida, 1992). Dissertation Abstracts International, 54, 2915.