The Journal of Extension - www.joe.org

October 2015 // Volume 53 // Number 5 // Feature // v53-5a8

Modernizing Training Options for Natural Areas Managers

Abstract
Working professionals have increasingly shifted from traditional learning environments toward distance education as travel and training budgets have been reduced. To accommodate this shift, the Natural Areas Training Academy replaced traditionally formatted workshops with a hybrid approach. Surveys of participants before and after this change indicate that a traditional in-person format was preferred in the past, but a hybrid format is preferred now. Respondents indicated the new format is more effective at providing highly desired benefits than the traditional face-to-face approach. These findings have implications for many Extension programs targeting working professionals across large geographic areas.


Sarah E. Friedl
Biological Scientist and Natural Areas Training Academy Workshop Coordinator
Quincy, Florida
sefriedl@ufl.edu

Holly K. Ober
Extension Wildlife Specialist and Associate Professor
Quincy, Florida
holly.ober@ufl.edu

Taylor V. Stein
Extension Forestry Specialist and Professor
Gainesville, Florida
tstein@ufl.edu

Michael G. Andreu
Extension Forestry Specialist and Associate Professor
Gainesville, Florida
mandreu@ufl.edu

University of Florida

Introduction

Over the past decade or so, the popularity of distance education programs has increased substantially (Moore & Kearsley, 2011). Through distance learning, educators can offer low-cost courses and programs to individuals spread across vast geographic regions, often allowing users to learn at their own pace. The flexibility and convenience of such online courses are especially important for working professionals who are required to pursue continuing education but at the same time face decreasing travel budgets and increasing workloads that afford them less time away from their daily work duties (Norman, 2013). Traditionally, continuing education and other similar professional trainings have been offered through face-to-face instruction by Extension agents/specialists and other subject matter experts. However, because online trainings have the potential to increase the geographic scope of audiences as well as reduce costs for both instructors and participants (Allred & Smallidge, 2010), there is growing momentum in Extension to shift from the traditional face-to-face format to more online education.

Effective distance education programs are not simply face-to-face programs put online (Davis, 2014; DuCharme-Hansen & Dupin-Bryant, 2004); the curriculum for each course must be modified and adapted for presentation to remote students in a virtual environment. This should include interactive media and appropriate facilitation and communication from instructors. Distance education is particularly effective when it involves careful sequencing or layering of content (Boling, Hough, Krinsky, Saleem, & Stevens, 2012), so students have opportunities to apply the concepts they learn to real-world situations. Adult learners in particular prefer learning environments that incorporate problem solving, with opportunities to apply both new knowledge and past experiences while engaging in higher-order thinking (Huang, 2002; Ruiz, Mintzer, & Leipzig, 2006). For this reason, a hybrid approach that blends face-to-face and online formats can be very well suited for adult Extension audiences (Norman, 2013).

When implemented correctly, distance learning can yield positive results despite offering less of the participant-participant and participant-instructor interaction that characterizes traditional face-to-face environments. In fact, even though face-to-face courses may be preferred by some participants (Davis, 2014), a multimedia-rich and highly interactive online learning environment can lead to achievement levels comparable to or above those of traditional instruction (Boling, Hough, Krinsky, Saleem, & Stevens, 2012; Choi & Johnson, 2005; McCann, 2007). Hybrid courses can provide benefits over strictly face-to-face learning because participants are able to work through foundational content ahead of time at their own pace and then revisit that content as often as they choose. This in turn leads to more meaningful engagement in face-to-face discussions and activities (Al-Busaidi, 2013; Allen, Bourhis, Burrell, & Mabry, 2002; McLaughlin et al., 2013). Flipped classrooms, in which lectures are presented online and followed by in-person sessions built around active learning to promote student engagement, can lead to improved understanding of course material (Missildine, Fountain, Summers, & Gosselin, 2013).

The Natural Areas Training Academy (NATA) is an Extension program offered by the University of Florida that provides training for conservation land managers. Since its inception in 2000, NATA has delivered over 90 workshops. Participants who complete a series of four 24-hour core workshops offered exclusively by NATA (plus one additional training offered by the Florida Forest Service) earn a Certificate in Natural Areas Management (CNAM). The CNAM, conferred by University of Florida Extension, has been adopted as training required for promotion among entry-level land managers by several counties in Florida (Colverson & Demetropoulos, 2010).

Until 2011, all NATA workshops were offered as 3-day face-to-face trainings at remote locations that provided settings relevant to workshop material. During 2009-2011, participant enrollment dropped significantly due to numerous factors, such as increased government travel restrictions, budget cuts, and participants' inability to leave regular work duties for extended periods of time.

Because a large proportion of potential participants were facing similar challenges, we decided to adopt a hybrid format (a combination of online and in-person instruction) for our trainings beginning in 2012. Our goal was to create workshops that offered the flexibility and convenience of an online program while also facilitating the networking and hands-on learning that come with in-person trainings (Norman, 2013). By 2014, all four core workshops had been converted to a hybrid format, at which time we surveyed participants to gain an understanding of how they perceived this change in workshop structure, as well as the quality of their learning experience.

Methods

Traditional Workshops

From 2000 to 2011, the four core workshops offered by NATA occurred as 3-day in-person trainings and were generally offered once per year. The entirety of each workshop was held at a remote site with rustic accommodations, such as a state park. The material for each face-to-face workshop was taught through a variety of methods. Workshop participants engaged in group activities, collected data on field trips around the training site, listened to presentations from guest instructors who were experts in their field, and presented their results from case-study exercises to each other. Participants were also provided with a take-home binder that contained worksheets, copies of all lecture presentations, and additional resources that we felt might be beneficial for post-workshop learning.

Workshop Conversion

We decided that all four core workshops would be converted to a hybrid format, but recognized that it would not be appropriate for each training to be structured exactly the same way. To maintain the 24 instructional hours of the traditional workshop format, we adopted a 2-day online/1-day in-person approach for two workshops and a 1-day online/2-day in-person approach for the other two. Each workshop's format was determined by the amount of lecture material presented and the amount of field material that could not (or should not) be replicated online (e.g., lecture presentations versus field activities and group work).

The online material for each workshop was presented asynchronously through CourseSites by Blackboard, a free online learning platform. This allowed users to access and work through the material at times convenient for them and at their own pace. The length of the online portion of each workshop depended on how much material was presented, generally 4 to 8 weeks. Workshop content was divided into modules, each of which contained interactive video presentations (with PDF copies of the slides available for download), ungraded self-assessment quizzes, and optional readings and resources. Some modules also contained assignments that were not graded but had to be submitted to earn workshop credit. These assignments were designed to substitute for activities that would have been done in groups at the face-to-face workshops, giving participants an opportunity to apply the concepts and information learned.

Surveys

In 2010, before conversion to the hybrid delivery format, an electronic survey was sent to individuals who had completed a traditional NATA workshop within the previous 5 years (approximately 250 people). The survey included 18 questions on topics such as which factors respondents considered when deciding to enroll in a workshop, what benefits they most desired in a workshop, what benefits they actually gained by participating in a workshop, how relevant the workshop material was to their job, whether they had learned new skills at the workshop, how useful the workshop information was, and how often they use the techniques or skills learned at the workshop in their job. Survey participants were also asked which format of workshop they preferred (i.e., entirely online, entirely face-to-face, or a blended approach), and which factors (such as time away from work and registration costs) would prevent them from attending future NATA workshops.

In 2014, a similar electronic survey was sent to workshop participants who had completed at least one of the converted hybrid workshops during 2012-2014 (n=70). This survey asked the same 18 questions, with the same wordings and answer choices, as the 2010 survey. To understand how the incorporation of the hybrid teaching model (i.e., use of technology) was associated with participant satisfaction, additional questions were included that addressed participants' perceived experience with the hybrid workshops. These were 5-category Likert-type items ranging from 1 (strongly disagree) to 5 (strongly agree) that corresponded to the Technology Acceptance Model (Davis, 1989). A three-item scale measured participants' perceived ease of use of the online system, a six-item scale assessed their perceived usefulness of the hybrid workshop format, a three-item scale evaluated the flexibility provided by the hybrid format, and a four-item scale evaluated participants' overall satisfaction (adapted from Al-Busaidi, 2013; Arbaugh, 2000; Venkatesh & Davis, 2000).
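
As an illustration of how multi-item scales like these are commonly scored, the brief sketch below averages the items in each scale to produce one composite score per respondent. The column names, response values, and the use of Python with pandas are illustrative assumptions only; they are not drawn from the actual survey instrument or analysis.

```python
# Minimal sketch: composite scores for Technology Acceptance Model scales.
# Hypothetical item names and responses; the real instrument and data
# handling may differ.
import pandas as pd

# Each respondent's answers to the Likert-type items
# (1 = strongly disagree, 5 = strongly agree).
responses = pd.DataFrame({
    "ease1": [4, 5, 3], "ease2": [4, 4, 4], "ease3": [5, 4, 3],
    "useful1": [4, 3, 4], "useful2": [5, 4, 3], "useful3": [4, 4, 4],
    "useful4": [3, 4, 5], "useful5": [4, 5, 4], "useful6": [4, 3, 4],
    "flex1": [5, 4, 4], "flex2": [4, 5, 3], "flex3": [5, 4, 4],
    "sat1": [4, 4, 3], "sat2": [5, 4, 4], "sat3": [4, 3, 4], "sat4": [4, 4, 4],
})

# Average the items in each scale to get one composite score per respondent.
scales = {
    "ease_of_use": ["ease1", "ease2", "ease3"],
    "usefulness": [f"useful{i}" for i in range(1, 7)],
    "flexibility": ["flex1", "flex2", "flex3"],
    "satisfaction": ["sat1", "sat2", "sat3", "sat4"],
}
scores = pd.DataFrame({name: responses[items].mean(axis=1)
                       for name, items in scales.items()})
print(scores.describe().loc[["mean", "std"]])
```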

Results and Discussion

The 2010 survey had a response rate of 25.6% (n=64). Although fairly low, this response rate is considered adequate for deriving reliable results from online surveys (Nulty, 2008). The majority of respondents identified the primary focus of their job at the time of the survey as land management (23%), biology (20%), outreach/education (9%), park ranger (7%), or ecology research (7%). The number of years respondents had been employed in a natural resource management field ranged from 1 – 32 (mean = 9.6), and most respondents identified their current employer as a state agency (55%), county government (27%), or a private company (13%).

The 2014 survey had a response rate of 44.3% (n=31). The majority of respondents identified the primary focus of their job at the time of the survey as land management (43%), volunteer coordination (14%), biology (11%), and outreach/education (11%). The number of years respondents had been employed in a natural resource management field ranged from 1 – 22 (mean = 9.6), and predominant employers shifted to municipalities (35%), state agencies (28%), and county governments (17%).

Participant Perceptions of the Program: 2010 versus 2014

Decision to Enroll

We asked one ranking question with six answer choices regarding the most important factors considered when deciding whether to enroll in a workshop, and found little change in responses between 2010 and 2014. In both years the top three factors were "how closely the topic relates to my job responsibilities," "my personal interest in the topic," and "whether or not my boss thinks I should attend." In both years, respondents found three other factors to be of lesser importance: "the number of days the workshop will keep me from my daily responsibilities," "the location of the workshop," and "the reviews I've heard from previous workshop participants." The consistency in responses suggests that during both periods investigated our workshop topics aligned well with managers' job responsibilities, managers' interests, and priorities of their supervisors.

We asked one ranking question with six answer choices regarding the importance of various factors in preventing enrollment in additional NATA workshops. The top three hurdles were the same in both 2010 and 2014, but the rank order changed, with "high registration costs" ranking highest in 2010 and "time away from the job" ranking highest in 2014 (Table 1). In both years, all other factors were deemed less important ("my supervisor does not support me attending trainings," "NATA does not offer trainings on topics I want to learn about," "I've taken all the workshops NATA offers on topics I am interested in," "trainings offered by other groups better meet my needs," and "I was dissatisfied with previous NATA workshops"). The consistency of high-ranking challenges to enrollment between survey periods indicates that our conversion from a fully in-person approach to the hybrid format (which reduced registration costs, travel costs, and time away from the job) was apt.

Table 1.
Order of Average Ranks of Factors Preventing Enrollment in Additional Workshops, as Reported Through Surveys Conducted at Two Different Points in Time
Rank Order | 2010 Survey | 2014 Survey
1 | high registration costs | time away from the job
2 | travel restrictions | high registration costs
3 | time away from the job | travel restrictions

Benefits Desired and Benefits Gained

We asked one question with five answer choices regarding what benefit participants most hoped to gain, and found no change in responses between 2010 and 2014. The three benefits most desired were ranked the same during both years (Table 2). We also asked one question with five answer choices regarding what benefits participants actually gained, and this time found a shift in response between 2010 and 2014. Specifically, "understanding of a particular topic" was rated the highest desired benefit both years. It was the third highest benefit gained in 2010, but was the highest benefit gained in 2014. In 2010, there was a mismatch between the benefits desired and the benefits gained, whereas this discrepancy was less apparent in 2014 (Table 2). This suggests that NATA was doing a better job in 2014 of meeting participants' expectations in comparison to 2010.

Table 2.
Order of Average Ranks of Benefits Desired and Benefits Actually Gained Through Workshop Participation, as Reported Through Surveys Conducted at Two Different Points in Time
Rank Order | 2010 Benefits Desired | 2010 Benefits Gained | 2014 Benefits Desired | 2014 Benefits Gained
1 | understanding of a particular topic | awareness of what other managers are doing | understanding of a particular topic | understanding of a particular topic
2 | awareness of what other managers are doing | awareness of resources I could use to get my job done | awareness of what other managers are doing | awareness of what other managers are doing
3 | familiarity with new techniques | understanding of a particular topic | familiarity with new techniques | expand my network of professional contacts

Relevance

We asked one question to assess how useful the information covered during the workshop(s) attended was for meeting participants' current job responsibilities, and received similar feedback both years. The majority of respondents indicated the information was "extremely useful" (49% in 2010; 55% in 2014), with a lesser percentage indicating the information was "somewhat useful" (44% in 2010; 39% in 2014), a small minority indicating the information was "marginally useful" (7% in 2010 and 7% in 2014), and no one indicating the information was "not at all useful" either year. These responses suggest NATA has been consistently helping participants learn information useful to their jobs, with a slight increase in usefulness between 2010 and 2014.

We also asked one question about how often new knowledge gained during workshops was put to use on the job and found a slight change between years. In 2010, 12% of respondents reported using information very often, 37% often, 43% occasionally, 6% rarely, and 2% never. In 2014, 7% of respondents reported using information very often, 45% often, 39% occasionally, 10% rarely, and 0% never. This indicates that although there were some changes in the primary focus of respondents' jobs between the two survey periods, the information covered by NATA was, and continues to be, relevant to resource managers' jobs.

Satisfaction

We asked one question regarding overall satisfaction with workshops and had similar responses both years. In 2010, 71% of respondents indicated they were completely satisfied, 25% somewhat satisfied, 1% neither satisfied nor dissatisfied, 3% dissatisfied, and 0% completely dissatisfied. In 2014, a lower percentage of respondents were completely satisfied (39%), but 61% were somewhat satisfied, and none reported being neither satisfied nor dissatisfied, dissatisfied, or completely dissatisfied. In other words, although there was a drop in the proportion of participants who were completely satisfied with the course, there were no neutral or negative ratings of NATA in 2014.

We asked one question with four answer options regarding preferences among workshop formats (number of days at a remote location versus online). There was a clear shift in preference toward incorporation of an online learning format between surveys (Table 3). It is noteworthy that the format used in 2010 was preferred in 2010 and that the format used in 2014 was preferred in 2014. We cannot determine whether this reflects a preference for the format participants are most familiar with or a true change in acceptance of online learning platforms over time. Regardless, it is clear that the workshop format currently in place is the one participants now prefer.

Table 3.
Average Rank Ordering of Workshop Format Preferences, as Reported Through Surveys Conducted at Two Different Points in Time
Rank Order | 2010 Survey | 2014 Survey
1 | 3 days at a remote site | 2 days at a remote site with 1 day online
2 | 2 days at a remote site with 1 day online | 1 day at a remote site with 2 days online
3 | 1 day at a remote site with 2 days online | 3 days at a remote site

Participant Perceptions of the Hybrid Approach: 2014

Respondents reported high perceived ease of use of the online system, high usefulness of teaching strategies employed, high flexibility, and high satisfaction (Table 4). Satisfaction with the workshop was most closely correlated with perceived flexibility and perceived usefulness, corroborating previous research (Al-Busaidi, 2013; Arbaugh, 2000).

Table 4.
Descriptive Statistics and Spearman Correlations Among Variables Measured to Assess Acceptance of Technology in the 2014 Survey (n=31)
Variable | Mean | SD | 1 | 2 | 3
1. Perceived Ease of Use | 4.14 | 0.71 | | |
2. Perceived Usefulness | 3.82 | 0.89 | 0.33 | |
3. Perceived Flexibility | 4.06 | 0.98 | 0.41* | 0.77* |
4. Perceived Satisfaction | 3.91 | 1.21 | 0.30 | 0.55* | 0.62*
* Indicates p < 0.05
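
For readers interested in producing descriptive statistics and correlations like those in Table 4, the sketch below shows one way to compute means, standard deviations, and pairwise Spearman correlations from composite scale scores. The scipy-based workflow and the made-up scores are illustrative assumptions; the article does not specify the software used for the original analysis.

```python
# Minimal sketch: means, standard deviations, and pairwise Spearman
# correlations (with p-values) for composite scale scores.
from itertools import combinations

import pandas as pd
from scipy.stats import spearmanr

def summarize(scores: pd.DataFrame) -> None:
    """Print means, SDs, and pairwise Spearman correlations (flag p < 0.05)."""
    print(scores.agg(["mean", "std"]).round(2))
    for a, b in combinations(scores.columns, 2):
        rho, p = spearmanr(scores[a], scores[b])
        flag = "*" if p < 0.05 else ""
        print(f"{a} vs {b}: rho = {rho:.2f}{flag} (p = {p:.3f})")

# Example call with made-up composite scores for a handful of respondents.
summarize(pd.DataFrame({
    "ease_of_use":  [4.3, 3.7, 4.0, 4.7, 3.3],
    "usefulness":   [4.0, 3.2, 3.8, 4.5, 3.0],
    "flexibility":  [4.3, 3.3, 4.0, 4.7, 3.7],
    "satisfaction": [4.0, 3.0, 3.8, 4.8, 3.3],
}))
```

Rank-based (Spearman) rather than Pearson correlations are a natural fit here because the composite scores are built from ordinal Likert-type items.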

Conclusions and Recommendations

Due to increased flexibility and accessibility, distance learning is becoming more common in Extension and outreach education programs. Our experience indicated that workshop participants were satisfied with the hybrid workshop format because it provided flexibility while enabling them to learn information relevant to their job responsibilities. Our survey results indicated the hybrid workshop format is doing an even better job of providing the most highly desired benefits than did the traditional face-to-face learning environment we used for 11 years. Because time spent away from daily job duties was cited as the biggest hurdle to enrolling in additional training opportunities in 2014, the hybrid workshop format seems to be a viable alternative to our traditional face-to-face trainings.

It should be acknowledged that there are costs to transitioning away from a face-to-face workshop. Although survey results showed mostly positive impressions of the hybrid format, they do indicate a drop in the proportion of participants who were completely satisfied with the workshop. Some participants likely do not feel as strongly about a workshop that offers less connection with instructors and other participants and less time in the field. The hybrid workshop is more efficiently doing its job of providing a valuable learning opportunity for professionals, but it may not evoke as strong a feeling of satisfaction as the face-to-face workshop format did. However, as an increasing number of Extension programs move toward offering at least a portion of their trainings online, we expect that with time participant satisfaction will shift to better align with this hybrid format.

The market for online training is large and expected to continue expanding due to technological advances, growing acceptance among end-users, and improved understanding among educators of how to make the best use of new technology. Hybrid trainings may well be the best approach to reach adult Extension audiences across large regions at reduced costs to both participants and instructors.

References

Al-Busaidi, K. A. (2013). An empirical investigation linking learners' adoption of blended learning to their intention of full e-learning. Behaviour & Information Technology, 32(11), 1168-1176.

Allen, M., Bourhis, J., Burrell, N., & Mabry, E. (2002). Comparing student satisfaction with distance education to traditional classrooms in higher education: a meta-analysis. The American Journal of Distance Education, 16(2), 83-97.

Allred, S. B., & Smallidge, P. J. (2010). An educational evaluation of Web-based forestry education. Journal of Extension [On-line], 48(6) Article 6FEA2. Available at: http://www.joe.org/joe/2010december/a2.php

Arbaugh, J. B. (2000). Virtual classroom characteristics and student satisfaction with internet-based MBA courses. Journal of Management Education, 24(1), 32-54.

Boling, E. C., Hough, M., Krinsky, H., Saleem, H., & Stevens, M. (2012). Cutting the distance in distance education: perspectives on what promotes positive, online learning experiences. Internet and Higher Education, 15(2), 118-126.

Choi, H., & Johnson, S. D. (2005). The effect of context-based video instruction on learning and motivation in online courses. The American Journal of Distance Education, 19(4), 215-227.

Colverson, P., & Demetropoulos, L. (2010). The Natural Areas Training Academy: preparing Florida's land managers for the modern challenges of land management. Natural Areas Journal, 30(2), 232-237.

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340.

Davis, J. M. (2014). Extension clientele preferences: accessing research-based information online. Journal of Extension [On-line], 52(5) Article 5RIB2. Available at: http://www.joe.org/joe/2014october/rb2.php

DuCharme-Hansen, B. A., & Dupin-Bryant, P. A. (2004). Distance education plans: course planning for online adult learners. Tech Trends, 49(2), 31-39.

Huang, H. (2002). Toward constructivism for adult learners in online learning environments. British Journal of Educational Technology, 33(1), 27-37.

McCann, B. M. (2007). The effectiveness of extension in-service training by distance: perception versus reality. Journal of Extension [On-line], 45(1) Article 1FEA4. Available at: http://www.joe.org/joe/2007february/a4.php

McLaughlin, J. E., Griffin, L. M., Esserman, D. A., Davidson, C. A., Glatt, D. M., Roth, M. T., Gharkholonarehe, N., & Mumper, R. J. (2013). Pharmacy student engagement, performance, and perception in a flipped satellite classroom. American Journal of Pharmaceutical Education, 77(9), 196. doi: 10.5688/ajpe779196

Missildine, K., Fountain, R., Summers, L., & Gosselin, K. (2013). Flipping the classroom to improve student performance and satisfaction. Journal of Nursing Education, 52(10), 597-599.

Moore, M. G., & Kearsley, G. (2011). Distance education: A systems view of online learning (3rd ed.). Belmont, CA: Wadsworth Publishing Company.

Norman, M. A. (2013). Using a hybrid approach for a leadership cohort program. Journal of Extension [On-line], 51(5) Article 5IAW2. Available at: http://www.joe.org/joe/2013october/iw2.php

Nulty, D. D. (2008). The adequacy of response rates to online and paper surveys: what can be done? Assessment & Evaluation in Higher Education, 33(3), 301–314.

Ruiz, J. G., Mintzer, M. J., & Leipzig, R. M. (2006). The impact of e-learning in medical education. Academic Medicine, 81(3), 207-212.

Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: four longitudinal field studies. Management Science, 46(2), 186-204.