April 2000 // Volume 38 // Number 2 // Feature Articles // 2FEA2


Beyond Perception: A Pretest and Posttest Evaluation of a Regional Internet Extension Inservice Training

Abstract
An Internet inservice training titled "Soil Acidity and Liming" was offered to county Extension agents representing six states (Alabama, Georgia, Florida, South Carolina, North Carolina, and Virginia). A Web page was constructed with materials provided by several specialists, and Listserv questions received responses from nine specialists representing the six states. A pretest and posttest were submitted by the agents on-line, along with their responses to a brief questionnaire. The questionnaire responses show that most of the agents were very receptive to this method of inservice training. The pretest and posttest scores show that the training resulted in a significant increase in knowledge of the subject matter presented.


Robert M. Lippert
Extension Soil Fertility
Department of Crop and Soil Environmental Science
Clemson University
Internet Address: blpprt@clemson.edu

Owen Plank
Extension Agronomist
Crop & Soil Science Department
University of Georgia

Rama Radhakrishna
Program Evaluation and Accountability Specialist
Extension Staff Development
Clemson University


Introduction and Background

In the past two years, two Internet inservice trainings have been offered to county Extension agents in various states of the Southeast. For each training, an on-line questionnaire was used, and responses were tallied and reported in Lippert, Plank, Camberato, and Chastain (1998) and Lippert and Plank (1999). The questionnaires included questions that focused on previous computer and Internet experience, assessment of the material presented, and acceptance of using the Internet to learn the material. Overall, the agents responded positively and were very receptive to this form of inservice training.

Subsequently, a 3-week regional Internet training titled "Soil Acidity and Liming" was offered to over 150 county Extension agents from six states (Alabama, Georgia, South Carolina, North Carolina, Florida, and Virginia). Nine specialists representing these states participated in the Web development and Internet discussions. In addition to questions selected from the previously used questionnaires, a pretest and posttest were given on-line. The intent was to move beyond personal perceptions regarding the effectiveness of this form of training and to use a more empirical tool for assessing the utility of Internet instruction for knowledge acquisition.

Training Objectives

Objective 1: To determine if the Internet could be successfully used for distance instruction of Extension agents with a topic covering significant theoretical concepts in addition to many practical applications.

Objective 2: To use an appropriate instrument to assess the amount of actual knowledge gain as a result of the Internet training.

Training Content and Delivery

Prior to the Internet training, several agents were randomly surveyed via e-mail and asked to suggest topics of interest as well as the preferred time of year for the training. The title "Soil Acidity and Liming" was selected in response to this informal survey. The 3-week training was held from March 22 to April 16, 1999 (with a 1-week break because many specialists were traveling that week). Although the training was spread over several weeks, the actual "hands-on" learning time was intended to be about 5 hours, equivalent to a day of traditional classroom-style training.

Training material was obtained from lecture notes and Extension information available from the participating states. The material was organized into a comprehensive text for instruction. The first week's topics were "Origin and Forms of Acidity," "The Effect of Soil Acidity and Liming on Crop Growth," and "The Effect of Lime Materials on the Neutralization of Aluminum." The second week's topics were "Conventional Lime Sources and Lime Quality" and "Alternative Liming Materials." The menu page for each week's training contained learning guidelines that listed the information the agents should know by the end of that section.

The Web page was created by a graduate student programmer who incorporated text, photos, and graphics under the direction of the senior training coordinator. Labor costs for the programmer were the only appreciable expenses incurred for the course. When the Web page was near completion, the senior training coordinator subscribed registered agents to the Listserv using their e-mail usernames.

The training was approved for four Certified Crop Advisor (CCA) credits in soil fertility. The CCA program was established to give agricultural professionals a standardized certification of competency in various areas; members must take CCA-approved training each year to retain certification. Instructions on how to access the Web site and use the Listserv were sent to the agents by e-mail. The Listserv is a means of electronic communication similar to an e-mail distribution list. All specialists and county agents were subscribed to the Listserv by the senior training coordinator. An e-mail message sent to the Listserv address (in our case acidity-l@clemson.edu) went to all participants subscribed to that address, and a reply to the Listserv likewise went back to all subscribers. The Listserv thus serves as a "slow motion" conversation or as an electronic "bulletin board." Access to the Listserv software was provided by the university computer center.
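The reflector behavior described above can be pictured with a short sketch. This is only an illustration, not the Listserv software provided by the university computer center; the subscriber addresses are hypothetical, and a local SMTP relay ("localhost") is assumed.

    # Minimal sketch of a list reflector: one message addressed to the list
    # is re-sent to every subscriber (hypothetical addresses; assumes a
    # local SMTP relay is running on "localhost").
    import smtplib
    from email.message import EmailMessage

    SUBSCRIBERS = ["agent1@example.edu", "agent2@example.edu"]  # hypothetical

    def reflect(original: EmailMessage) -> None:
        """Re-send an incoming list message to every subscriber."""
        with smtplib.SMTP("localhost") as smtp:
            for address in SUBSCRIBERS:
                copy = EmailMessage()
                copy["From"] = "acidity-l@clemson.edu"  # replies return to the list
                copy["To"] = address
                copy["Subject"] = original["Subject"]
                copy.set_content(original.get_content())
                smtp.send_message(copy)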

The URL (Web address) for the training is http://hubcap.clemson.edu/~blpprt/acidity.html.

During a 2-week period prior to the training, the agents were urged at four different times to take a 25-question multiple-choice pretest (Figure 1), which was developed so that answers could be submitted on-line. The questions were created to cover the key points presented on the Web and were reviewed by the two specialists involved with the Web page development and two specialists not involved with the training to ensure their quality. We received 121 pretest responses.

At the end of training, two Listserv appeals were made for posttest and questionnaire responses. A week later, county agents were e-mailed individually and asked to take the posttest. One final Listserv appeal was made 10 days after the training was completed. We received 93 posttest and questionnaire responses. The questionnaire asked about previous training completed through the Internet, the extent of Web material read, the extent of Listserv correspondence read, and the number of questions the respondent asked on the Listserv. The questionnaire ended with four open-ended questions.

Figure 1. Questions Used for the Pretest and Posttest

  1. Rainfall in excess of evaporation removes primarily ____ from the soil.
  2. In general, when comparing dicots to monocots, the pounds per ton of crop removal of soil calcium is:
  3. Sources of acidic hydrogen in the soil do not include:
  4. Which of the following creates the most soil acidity per pound of N?
  5. Active acidity refers to:
  6. The two main effects of acidity on plant growth are:
  7. The single most important factor affecting Ca and Mg availability in acid soils is:
  8. Mechanisms of phosphorus deficiency in acid soils do not include:
  9. All the following micronutrients become less plant available as the soil pH increases except:
  10. In acid soils, legumes often show N deficiency symptoms because:
  11. Nitrification is optimal in the pH range of:
  12. The main benefit of lime on crop growth is:
  13. The buffer solution that was developed to determine the lime requirement for soils containing primarily kaolinitic clays having a low CEC is:
  14. The material used as the standard by which the acid neutralizing capability of all other liming materials is measured is:
  15. The two principal factors which influence aglime quality are its acid neutralizing capacity and:
  16. The acid neutralizing capacity of lime is usually measured as the:
  17. Among the following materials, which has the highest CCE?
  18. A liming material has a CCE greater than 120. It probably has an appreciable amount of:
  19. The particle size of ground agricultural limestone is measured by:
  20. Hydrated lime is all of the following except:
  21. Boiler ash ...(various properties given as possible responses to complete the sentence):
  22. Flue dust ...(various properties given as possible responses to complete the sentence):
  23. Paper mill lime is not commonly used for agricultural purposes because of its:
  24. The most abundant element in wood ash is:
  25. The major constraints to land application of wood ash do not include:

Results

Pretest and Posttest

The pretest and posttest scores are shown in Table 1. The table also indicates the number and percent of correct and incorrect responses for each of the 25 questions (grouped by subject categories), the percent gain in knowledge scores from pretest to posttest, and the significance levels for differences between pretest and posttest knowledge scores as determined by the chi-square test. Data from paired and unpaired participants were compared, and no differences were found between the two; therefore, unpaired data were used because they represented a larger number of participants.

Table 1
Pretest and Posttest Scores for Internet Inservice Training

                        Pretest          Posttest        Difference
Items                   f      %         f      %             %
Sources and Forms of Acidity
Q1 Correct 107 88.4 92 98.9  
  Incorrect 14 11.6 1 1.1 +10.5*
Q2 Correct 60 49.6 78 83.9  
  Incorrect 61 50.4 15 16.1 +34.3**
Q3 Correct 29 24.0 38 40.9  
  Incorrect 92 76.0 55 59.1 +16.9**
Q4 Correct 82 67.8 81 87.1  
  Incorrect 39 32.2 12 12.9 +19.3**
Q5 Correct 60 49.6 53 57.0  
  Incorrect 61 50.4 40 43.0 +7.4 NS
Effects on Plant Growth
Q6 Correct 102 84.3 90 96.8  
  Incorrect 19 15.7 3 3.2 +12.5*
Q7 Correct 42 34.7 60 64.5  
  Incorrect 79 65.3 33 35.5 +29.8**
Q8 Correct 30 24.8 43 46.2  
  Incorrect 91 75.2 50 53.8 +21.4**
Q9 Correct 76 62.8 80 86.0  
  Incorrect 45 37.2 13 14.0 +23.2**
Q10 Correct 94 77.7 81 87.1  
  Incorrect 27 22.3 12 12.9 +10.0 NS
Q11 Correct 36 29.8 48 51.6  
  Incorrect 85 70.2 45 48.4 +21.8**
Lime and Assessing Lime Requirement
Q12 Correct 50 41.3 77 82.8  
  Incorrect 71 58.7 16 17.2 +41.5**
Q13 Correct 19 15.7 52 55.9  
  Incorrect 102 84.3 41 44.1 +40.2**
Q14 Correct 60 49.6 77 82.8  
  Incorrect 61 50.4 16 17.2 +33.2**
Q15 Correct 100 82.6 90 96.8  
  Incorrect 21 17.4 3 3.2 +14.2*
Q16 Correct 103 85.1 89 95.7  
  Incorrect 18 14.9 4 4.3 +10.6*
Q17 Correct 90 74.4 80 86.0  
  Incorrect 31 25.6 13 14.0 +11.6*
Q18 Correct 52 43.0 68 73.1  
  Incorrect 69 57.0 25 26.9 +30.1**
Q19 Correct 116 95.9 91 97.8  
  Incorrect 5 4.1 2 2.2 +1.9 NS
Alternate Lime Sources
Q20 Correct 78 64.5 76 81.7  
  Incorrect 43 35.5 17 18.3 +17.2*
Q21 Correct 84 69.4 79 84.9  
  Incorrect 37 30.6 14 15.1 +15.5*
Q22 Correct 20 16.5 55 59.1  
  Incorrect 101 83.5 38 40.9 +42.6**
Q23 Correct 39 32.2 69 74.2  
  Incorrect 82 67.8 24 25.8 +42.0**
Q24 Correct 31 25.6 31 33.3  
  Incorrect 90 74.4 62 66.7 +7.7 NS
Q25 Correct 73 60.3 57 61.3  
  Incorrect 48 39.7 36 38.7 +1.0 NS
Significant at * p < .05; ** p < .001; NS = not significant.
Scale: Substantial gain (30% and above); Moderate gain (20-29%); Little gain (10-19%); Negligible or no gain (0-9%)
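As an illustration of the chi-square comparison summarized in Table 1, the sketch below tests the pretest versus posttest counts for Question 1 (107 correct and 14 incorrect on the pretest; 92 correct and 1 incorrect on the posttest). It assumes scipy is available; scipy applies Yates' continuity correction to 2x2 tables by default, so the statistic may differ slightly from the original analysis.

    # Chi-square test of independence on the Question 1 counts from Table 1.
    from scipy.stats import chi2_contingency

    q1_counts = [[107, 14],   # pretest:  correct, incorrect
                 [92,   1]]   # posttest: correct, incorrect

    chi2, p, dof, _expected = chi2_contingency(q1_counts)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
    # A p value below .05 but above .001 corresponds to the single asterisk
    # reported for Question 1 in Table 1.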

For ease of reporting, the knowledge gain percentages between pretest and posttest were categorized into: 1) Substantial gain (30% and above); 2) Moderate gain (20-29%); 3) Little gain (10-19%); and 4) Negligible or no gain (0-9%). As shown in Table 1, knowledge scores for all 25 questions increased from pretest to posttest. Of the 25 questions, seven showed substantial gains (30% and above) in knowledge scores from pretest to posttest, four showed moderate gain (20-29%), nine showed little gain (10-19%), and five showed negligible or no gain (0-9%). The five questions that showed negligible or no gain were not statistically significant at the .05 level. Overall, the knowledge score gain from pretest to posttest ranged from a low of +1% (question 25) to a high of +43% (question 22).
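For reference, the reporting scale described above can be restated as a small helper function; the thresholds are exactly those used here, and the example values are taken from Table 1.

    # Map a pretest-to-posttest gain (in percentage points) to the reporting scale.
    def gain_category(gain_pct: float) -> str:
        if gain_pct >= 30:
            return "Substantial gain"
        if gain_pct >= 20:
            return "Moderate gain"
        if gain_pct >= 10:
            return "Little gain"
        return "Negligible or no gain"

    print(gain_category(42.6))  # Question 22 -> Substantial gain
    print(gain_category(1.0))   # Question 25 -> Negligible or no gain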

Findings from this training indicate that participants had some previous knowledge of the subject matter topics. The percent gain in knowledge scores between the pretest and posttest shows, however, that the Internet training increased participant knowledge in all four topic areas.

Nature of the Listserv Discussions

There is much discussion in the literature regarding the use of the Listserv in classroom situations and how students adapt to it. Velayo (1994) and Collins (1998) present excellent reviews of this aspect of the Listserv. They discuss various strengths of Listserv use, such as the ability to collect data, the ease of reaching a large number of diverse people, conversations that are not influenced by the physical responses of others, and the ability to reflect and compose a comment at the student's own pace and convenience. Both authors support this form of electronic communication as a viable learning tool.

Typically, the literature refers to classroom situations that cannot be transferred to the inservice training approach we are presenting. For example, Williams and Merideth (1996) document student use of a Listserv to supplement class discussion, but the Master's-level students initially met for 32 hours the first week. The second week, they met for 19 hours, and during the last four weeks, all discussion was restricted to the Internet. Thompson et al. (1997) pointed out that for a graduate-level class, where all discussion was confined to the Internet, about 15 weeks was required for the students to overcome Listserv phobia. An inservice training for professional adults that lasts only 2 or 3 weeks has considerations that are not addressed by studies of a semester-long Internet class for university students, where there may or may not be face-to-face interaction.

Thirty-one agents participated in the Listserv discussions, some of them sending more than one e-mail. A previous training covering cotton fertility had a total of 59 e-mails sent through the Listserv. During this training, there were 168 e-mails posted on the Listserv, reflecting nearly a three-fold increase. The increase was likely due to the inclusion of agents from two additional states in the training (Virginia and Florida) and the use of a topic with wider appeal.

The e-mails from the agents mostly consisted of questions addressed to the specialists. A few agents initially sent e-mails addressed directly to one of the lead coordinators, who subsequently forwarded them through the Listserv. Perhaps this was due to some slight initial Listserv phobia. Only towards the end of the training did a few agents express personal views that went beyond simply asking questions. These e-mails in particular were quite lengthy.

Piburn and Middleton (1997) used a Listserv as a way of allowing students to share their thoughts in a course preparing them for a career in middle school teaching. They noted that "Just as in spoken conversation, some people are quiet and others loquacious. The most talkative person posted 51 messages with [a total of] 790 lines. Another posted only one message consisting of one line of text. Some of the differences in verbosity were due to familiarity with the computer medium." This likely applies to the county agents as well, as we observed later in the training. A comment by one agent, though, sums up the reluctance to communicate on the Listserv: "We are hesitant to ask dumb questions after hearing Ph.D.'s talk to one another." Perhaps students are more likely to ask questions on a Listserv than county agents, who are already expected to be knowledgeable in many aspects of crop production. Romiszowski and de Haas (1989) also point out that "There are people who don't trust their thoughts in print. There will be an amount of people only reading messages and never responding."

Questionnaire Response Summary

The questionnaire provided space for open-ended written responses to four specific questions. When asked "What advantages do you see with Internet inservice training?" 59 agents replied that they could do the training at their own pace and when convenient. Other cited advantages were not having to travel (14) and low expense (16). Twenty-six agents reported that the regional approach to the training was a benefit because, through the Listserv discussions, they could learn about agent experiences in nearby states and have access to information from many knowledgeable specialists. Seventeen agents pointed out that they were very glad the material would remain accessible on the Web indefinitely for future reference.

Responses to the question "What disadvantages do you see with Internet inservice training?" included:

  • the ease of procrastination (18),
  • problems with office distractions (6),
  • lack of personal contact (12),
  • problems with Internet access due to computer shortages or very slow modem connections (6),
  • lack of immediate feedback to Listserv questions (4),
  • the possibility that questions would receive more detailed discussion in a face-to-face setting (5),
  • excessive e-mails (8), and
  • the need for other means of agent and specialist interaction so the training won't be so passive (3).

To the question, "Regarding information delivery, what changes would you like to see when the next inservice training is offered on the Internet?" nine agents pointed out that the training would have been more convenient if scheduled in January or February, when they are not so busy. Three agents suggested some way of organizing the Listserv questions and answers by topic so the discussion wouldn't seem so disjointed. Thirty-six agents volunteered such comments as "Excellent, well thought out, great format, good text and visuals, and material organization was outstanding."

The responses to the question "What was the most important thing you learned as a result of this training?" were very consistent. Eleven agents responded "good review," eight agents responded "the economic advantages of using lime," 16 agents responded "The causes of soil acidity," and 20 agents indicated that they most benefited from the section on types and properties of alternative liming materials.

The question, "The use of the Internet can provide a learning experience as effective as a face-to-face class" received the following responses:

  • strongly disagree (2%),
  • disagree (17%),
  • neither agree nor disagree (26%),
  • agree (44%), and
  • strongly agree (11%).

Conclusions

The pretest and posttest results clearly show the effectiveness of the Internet for actual knowledge acquisition of theoretical and applied agricultural topics. As was also found with previous Internet trainings, there is a general acceptance of this style of learning. A majority of the agents (55%) thought that a training offered through the Internet can be as effective as a face-to-face learning environment.

A future training will incorporate a learning style test, in addition to a pretest and posttest, to see if there is a correlation between the agents' personal style of learning and their ability to learn with an Internet training. We will also study the relationship between their test performance and various demographics, such as age, level of education, sex, etc. Future training sessions will also utilize more interactive tools, such as video clips and intermittent self-grading mini-tests so the agents can monitor their own progress as they read through the material.

References

Collins, M. (1998). The use of e-mail and electronic bulletin boards in college-level biology. Journal of Computers in Mathematics and Science Teaching, 17(1), 75-94.

Lippert, R.M., & Plank, C.O. (1999). Responses to a first time use of Internet inservice training by agricultural Extension agents. Journal of Natural Resources and Life Sciences Education (in press).

Lippert, R.M., Plank, C., Camberato, J., & Chastain, J. (1998). Regional Extension in-service training via the Internet. Journal of Extension [On-line], 36(1). Available: http://www.joe.org/joe/1998february/a3.html.

Piburn, M.D., & Middleton, J.A. (1997). Listserv as journal: computer-based reflection in a program for pre-service mathematics and science teachers. Paper presented at the International Conference on Science, Mathematics and Technology Education, Hanoi, Vietnam.

Romiszowski, A.J., & de Haas, J.A. (1989). Computer mediated communication for instruction: using e-mail as a seminar. Educational Technology, 7-14.

Thompson, J.C., Malm, L.D., Malone, B.G., Nay, F.W., Oliver, B.E., & Saunders, N.G. (1997). Enhancing classroom interaction in distance education utilizing the World Wide Web. Paper presented at the annual meeting of the Mid-Western Educational Research Association, Chicago, Illinois.

Velayo, R.S. (1994). Supplementary classroom instruction via computer conferencing. Educational Technology, May-June, 20-26.

Williams, H.L., & Merideth, E.M. (1996). On-line communication patterns of novice Internet users. Computers in the Schools, 12(3), 21-31.