October 2012 // Volume 50 // Number 5 // Feature // v50-5a5
Converting Face-to-Face Curricula for Online Delivery: Lessons Learned from a Biomass Harvesting Guidelines Curriculum
Abstract
With shrinking budgets, staff reductions, and increased availability and access to digital technologies, Extension educators will be seeking ways to convert face-to-face programs to alternate formats. When converting Minnesota's biomass harvesting guidelines for online delivery, we learned many lessons while planning, developing, and testing our curriculum that can help others through a similar process.
Introduction
Outreach and Extension programs are increasingly using digital technologies to deliver information and resources to the public. Examples include videoconferencing (Pankow, Porter, & Schuchardt, 2006), webcasts (Local Government Environmental Assistance Network [LGEAN], 2004), electronic newsletters (Westa, Broderick, & Tyson, 2005), and curriculum and training materials on web sites and DVDs (Penuel, Bienkowski, & Korbak, 2005; Dunn, Thomas, Green, & Mick, 2006; Mayfield, Wingenbach, & Chalmers, 2006; Zimmer, Shriner, & Scheer, 2006). Compared to face-to-face training, these technologies facilitate reaching more clientele cost-effectively while providing participants with quality learning experiences (Mayfield, Wingenbach, & Chalmers, 2006).
Face-to-face workshops are among Extension's traditional approaches to delivering information to the public. However, recognizing the usefulness and effectiveness of delivering information through the Web (Mayfield, Wingenbach, & Chalmers, 2006), we converted the face-to-face curriculum for Minnesota's forestry biomass harvesting guidelines (BHG) (Minnesota Forest Resources Council [MFRC], 2007) into an online format. Those guidelines were created to minimize environmental impacts associated with removing woody biomass during timber harvesting operations. This article reports on the approach used and lessons learned from developing an online version of that curriculum.
Minnesota's Biomass Harvesting Guidelines
Target Audience
Loggers, landowners, and natural resource managers are the target audience for the BHG training. They tend to be widely dispersed in rural areas throughout the forested region of Minnesota. While many natural resource managers use computers on a daily basis, some members of the target audience use computers very little. While access to high-speed Internet service is improving for the target audience, it is not universally available.
Face-to-Face Curriculum Content and Delivery
Shortly after the BHG's publication, a curriculum was created to present the content to loggers and natural resource managers during 4-hour face-to-face workshops. That curriculum included PowerPoint presentations and small-group breakouts. BHG curriculum content addressed the rationale for the development of the guidelines; an overview of the guidelines for wildlife and biodiversity, soil productivity, and water quality and riparian management zones; and approaches for incorporating the guidelines into planning, design, and operational activities. Because of the diversity of topics, five presenters were needed to deliver the material.
More than 425 loggers and natural resource managers were introduced to the BHG through a series of face-to-face workshops. Two Minnesota Logger Education Program (MLEP)-sponsored conferences drew over 300 of those individuals. In addition, elements of the BHG curriculum were presented during University of Minnesota Extension-sponsored workshops for the target audience that averaged 100 participants each. The large audiences resulted from the need to get the information out quickly so that loggers could operate on state timber sales and so that foresters could design timber sales that incorporated the guidelines (Minnesota Department of Natural Resources [MN DNR], 2008).
To attend, many participants had to travel an hour or more each way, taking time away from their normal work routine. In addition, the face-to-face workshops were difficult to schedule and expensive to offer because of the presenters' conflicting schedules. While evaluations suggested that the workshops largely met existing needs, other individuals, including new loggers and natural resource managers, will need this training in the future. That future demand is anticipated to come from smaller audiences who will want access to the training throughout the year.
Approach
The development of our online curriculum followed a rigorous process of planning, development, and testing, described below.
Curriculum Planning
A project team was assembled to develop the online curriculum. It included three Extension educators, an Extension technology specialist, and the Executive Director of MLEP. The initial project budget was $25,000, with funding from the University of Minnesota Extension and MLEP and with additional in-kind contributions provided by project team members. The largest portion of the budget was allocated to contracting with a third-party professional to build the online curriculum using a specific technology we identified. We met face-to-face several times and used e-mail frequently to discuss hardware and software capabilities and our knowledge of the target audience, potential strategies to create and deploy the curriculum (e.g., who would do the development work, how to break the content into manageable units for the learner, which software to use), how to maintain learner interest and engagement, and how to test and evaluate the new curriculum using learning technology features that support various training methods (Brown, 2001).
During the initial project planning, we identified key features that were considered necessary to incorporate into the project. With that information, we focused on building the following features into the product:
- Make the learning materials easy to use. Some members of the target audience have limited computing experience, so simple operation and clear instructions are important; this included providing introductory slides on how to move from one screen to the next.
- Familiarize users with the guidebook provided. After the training, we wanted them to know where to look for the guidelines on their own.
- Support multimedia integration, possibly including audio, video, slides, and animation (Beaudin & Quick, 1996; Brown, 2001).
- Use software that would make it easy to modify the curriculum when the guidelines are revised and allow access without creating an account or entering a password.
- Allow the users to work at their own pace so that they can complete individual modules within the curriculum, exit, and return later without the need to go back through the entire curriculum again.
- Keep the content focused and concise.
- Design the curriculum for access over low-bandwidth connections.
- Make a DVD version of the curriculum so that target audience members who don't have Internet capability could still have access to the curriculum.
- Incorporate true/false or multiple choice quizzes to reinforce key content and to provide some assurance that the learner was retaining it. Require respondents to correctly answer each question before proceeding to the next module.
- Create a unique code that could be reported elsewhere to demonstrate completion of the entire curriculum. The unique code helps ensure that one person in an office or business cannot complete the curriculum and then share the code so that others falsely report completion.
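To make the quiz-gating and completion-code features concrete, the sketch below shows one hypothetical way they could be implemented. It is illustrative only: the published curriculum relied on Adobe Presenter's built-in quiz functions rather than custom code, and the question text, secret string, and learner name here are placeholders.

```python
# Illustrative sketch only -- not the implementation used in the BHG curriculum.
import hashlib
from datetime import date

# Placeholder quiz items: (question text, correct answer)
QUIZ = [
    ("Sample true/false question 1", "T"),
    ("Sample true/false question 2", "F"),
]

def passed_module(answers):
    """Return True only if every quiz question was answered correctly."""
    return len(answers) == len(QUIZ) and all(
        given.strip().upper() == correct
        for (_, correct), given in zip(QUIZ, answers)
    )

def completion_code(learner_name, secret="bhg-demo-secret"):
    """Derive a short code tied to the learner and date, so one learner's code
    is not simply reusable by someone who never took the training."""
    raw = f"{learner_name}|{date.today().isoformat()}|{secret}"
    return hashlib.sha256(raw.encode()).hexdigest()[:8].upper()

if __name__ == "__main__":
    answers = ["T", "F"]  # placeholder learner responses
    if passed_module(answers):
        print("Module complete. Completion code:", completion_code("Sample Learner"))
    else:
        print("Please review the module and retake the quiz.")
```

Tying the code to the learner and the date is one simple way to discourage the code sharing described above; any real system would also need to record which codes it has issued.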
After determining the content and scope for the online curriculum, we developed a request for proposals from third-party contractors. We reviewed bids from seven contractors, ranging from $45,000 to $70,000, but our budget was insufficient to hire a professional content developer. This led us to look to in-house staff. A contract was developed with a research and outreach professional who had a high level of familiarity with the content, a variety of instructional technologies, and our target audience. That individual brought unique subject-matter skills (e.g., the ability to add content, find relevant visuals, create transitions, and propose draft questions), which ended up being an important asset during the development of the curriculum.
Online Curriculum Development
The developer modified the face-to-face curriculum into six modules. Draft versions of the entire written narration, curriculum, and presentation slides were sent to members of the project team and two individuals outside Extension to evaluate the accuracy of information. The developer used the comments to revise the curriculum.
Prior to creating all of the online modules, the developer drafted one module for our review. Our review focused on whether the key learning features were achieved. The first module was revised based on our feedback, and development proceeded on the remaining modules. Questions were added at the end of four of the six modules, and specific locations in the curriculum were identified where the user was directed to look up information in the printed guidelines to help ensure a linkage between the training and the printed materials. Learners without a printed copy of the guidelines were directed to an electronic version of the guidebook through a link provided during the training session.
Adobe Presenter® version 6 (Adobe Systems Incorporated, 2006) was used to develop and deploy the audio slideshows with embedded interactive elements. Adobe Presenter® was selected because it has several user interface options (e.g., video, sound, pictures, text), the modules can be designed in PowerPoint, there is a testing/quiz function, and there are administrative tracking features that would allow us to evaluate learner progress during training sessions once the system was operational.
Near the end of the curriculum development process, we chose to hire a professional narrator. We listened to samples produced by the narrator before contracting with that individual. We opted for a female narrator because we felt that our target audience, mostly males, might be more receptive to a female than a male voice.
Usability Testing
To assess whether the key learning features were achieved, we invited 12 selected members of the target audience to participate in on-site usability testing (Bennett, Johnson, & Parker, 2009). Prior to the testing, we created a structured process for conducting the assessment at a computer lab, using the observation criteria presented in Table 1, and prepared an evaluation sheet asking participants to rate achievement of the key learning features. The lab we selected was centrally located near several logging businesses and natural resource managers involved with setting up timber sales. To encourage participation, lunch and continuing education credits were provided to testers.
During the usability testing, the 12 loggers and natural resource professionals completed the two modules each person was assigned to review. Half of the testers conducted their reviews before lunch and the other half afterward. We observed their interaction with the curriculum, noting areas of confusion and other opportunities to improve the learner's experience, using the observation criteria listed in Table 1. After reviewing their two assigned modules, participants completed a written evaluation of the curriculum based on their review (Table 2). The testers then gathered for a focus-group-style discussion (Morgan, 1998; McCoy, 2007) of the curriculum content and format, and we noted their concerns. We then considered this feedback, and the developer made further modifications to the online curriculum.
Publication
After final revisions, the Web-based training was published online and is hosted on a University of Minnesota computer server. The curriculum is available at <http://www.mlep.org/onlinebiomassintro.htm>.
The same content is also available on DVD for those with a slow Internet connection or none at all. The DVD includes a unique code at the end of the last module that learners need to provide to demonstrate completion of the curriculum.
Table 1. Observation criteria used during the usability testing

| Elements/tasks to observe | Observation criteria |
| --- | --- |
| Graphic | |
| Navigation | Is the learner able to find his/her way through the assigned modules? Are the "Next" and "Back" buttons positioned in the right place on the screen? Is the navigation intuitive? |
| Interface | Are the elements and icons used on the interface intuitive? Does the learner use all the elements provided? Is the learner searching for a particular feature, such as Pause or Mute? Does the learner find the interface confusing or easy to use? How often does the learner use features such as Mute, Pause, and References on the interface? |
| Instructional Design | |
| Content | Is the learner showing interest in completing the modules? Is the learner motivated to read the information presented on the screen, or is he/she clicking the "Next" button without reading it? Is the learner able to comprehend the content provided on the screen? Is the learner spending more time on a particular screen? How does the learner react when he/she answers a question incorrectly? Is the learner comfortable with the feedback received? Is the learner comfortable with the wording used in the modules? Does the learner apply the concepts learned from the curriculum, or does he/she answer questions by trial and error? |
| Audio | Is the audio sufficient on each screen? Is the learner distracted by the audio? Does the learner avoid reading text because of the audio? What is the learner's reaction to the pace of the audio? Does the learner point out any discrepancies in wording? |
Table 2. Usability testing ratings of the online BHG curriculum (n = 12)

| Parameter | Rating |
| --- | --- |
| The module provides the ability for learners to exit and return easily | 4.0 |
| The learning technology is easy to use | 3.8 |
| The module can be navigated easily | 3.8 |
| The information presented in the module is easy to understand | 3.8 |
| The format of the evaluation/test questions at the end of the module is clear | 3.8 |
| The purpose of the online training is clear | 3.7 |
| The content contributes to the achievement of the unit learning objectives | 3.7 |
| The module clearly outlines the complete training program | 3.5 |
| The appearance of the module is attractive and effective | 3.3 |
| The training module downloaded quickly | 3.0 |
| The learning objectives are clear | 3.0 |
| The module provides a high level of interactivity/engagement | 2.7 |
| The module's graphics, animations, sound clips, etc., are clear | 2.5 |
| The module accommodates participants' learning styles | 2.3 |

Note: Each participant reviewed two of the six modules before completing this assessment (n = 12); the results were used in developing the final BHG online curriculum. Testers rated each parameter within their two assigned modules using the following scale: 1 = strongly disagree, 2 = somewhat disagree, 3 = somewhat agree, 4 = strongly agree.
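For readers wondering how the one-decimal ratings in Table 2 relate to the 1-4 scale described in the note, the snippet below shows how a per-parameter summary could be computed from individual responses. The response values are invented placeholders, not the study's raw data, and the assumption that the reported ratings are averages is ours.

```python
# Illustrative only: averaging hypothetical 1-4 ratings per parameter.
from statistics import mean

responses = {
    "The learning technology is easy to use": [4, 4, 4, 3],
    "The module provides a high level of interactivity/engagement": [3, 2, 3, 3],
}

for parameter, ratings in responses.items():
    # Report the mean to one decimal place, as in Table 2.
    print(f"{parameter}: {mean(ratings):.1f}")
```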
Lessons Learned
During this project, we learned a number of lessons that we have translated into recommendations for converting a face-to-face training curriculum into a successful online training program. These recommendations are noted below.
Engage Key Partners in the Process
Given the importance of reaching and engaging loggers as a key target audience, MLEP involvement was essential. MLEP's contributions included funding support, an in-depth knowledge of loggers' learning styles, creative ideas for solving problems, and identification of loggers for usability testing. Also, leveraging Extension's resources with MLEP helped form a successful partnership for a common goal.
Include a Technology Specialist on the Project Team
Our technology specialist brought invaluable expertise to our team through experience with previous educational design projects, an initial systematic assessment that helped us clarify the project's goals and approach, and experience designing for a specific target audience. That individual also guided us through a process to solicit bids from developers and narrators, evaluated learning software options, and helped design and conduct the usability testing.
Identify Key Learning Features
Recognizing a high level of variation in our target audience's familiarity with computers and access to Internet-based communication technology, we required a platform that was very simple for learners to use in their home and office environments. Identifying and prioritizing key learning features early in the process made it easier for us to evaluate software options and select a developer with the necessary skills to complete the project.
Assess Whether or Not You Can Afford a Professional Developer
Wanting to create as professional a product as possible, we planned to use the majority of our $25,000 budget to contract with a professional developer. While the seven proposals from outside developers offered excellent approaches for creating our curriculum, the bids exceeded our budget. This led us to modify our development process, and we evaluated existing software applications and other resources available within the university. This approach greatly reduced our cost, without sacrificing quality.
Break Up the Content into Modules
The guidelines encompass a wide range of topics, which led us to break the content into six concise, focused modules. This allowed us to build relatively short presentation segments (each 10-20 minutes in length) that focus on a single topic (i.e., introduction and instructions on how to use the training; soil productivity; water quality and riparian management zones; wildlife habitat and biodiversity; planning, design, and operational activities; and guidelines specific to brushland and open land biomass harvesting). Shorter modules are easier to access over a low-bandwidth connection. The modular format also gives learners the flexibility to move through the content in separate sessions without having to repeat content (Mayer, 2009).
Create a Draft Module for Review
The developer created a single, full-featured draft module for us to review prior to additional development work. This allowed us to provide feedback based on learning features we identified early in the development process. It also allowed us to see how the final system might look to help us better understand the strengths and limitations of our approach.
Implement Best Practices for Adult Online Learning
Some examples include providing detailed instructions at the onset of the training session in the first module and using concise text and narration to develop and maintain learner engagement. Inviting learners to consult reference materials that are in their possession can also be beneficial both to build familiarity with the reference materials and to provide active and diverse learning activities.
Test the New Curriculum with Target Audience Members
Usability testing allowed us to assess whether the key learning features were addressed to the satisfaction of our target audience (Bernard, Abrami, Borokhovski, Wade, Tamin, Michael, & Bethel, 2009; Kern Learning Solutions, 2012). It also enabled us to observe the behavior of the learners during their use of the online curriculum (Table 1). Further, the testing allowed us to identify and fix issues based on actual target audience feedback (Table 2) before the final development and public launch.
Consider Using a Professional Narrator
A learner interfaces with the training by seeing things on a computer monitor, using a mouse to go forward/backward, and listening to the narrator. We determined that a professional narrator working in a specialized recording studio would improve the learner's experience.
Make the Online-Curriculum Available Through DVD
Many in our target audience lack access to a computer with a high-bandwidth connection. Creating the DVD version eliminates this potential barrier by making the curriculum available to individuals who have access to a private or public computer but no (or slow) Internet connection.
Select Software That Can Provide Evaluation Data
If it is important to collect formative information about use of an online system (e.g., how many users complete all modules in one sitting vs. multiple sittings, which modules are taking longer to complete, which quiz questions are answered incorrectly most often), the hosting site has to be capable of collecting and retaining that information. While collection of this information was important to us, the University of Minnesota changed hosting servers and did not retain any of the data we needed to thoroughly evaluate the use of the curriculum. If we had known that was likely to occur, we might have selected a different host.
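As a hedged illustration of the kind of analysis such evaluation data would support, the sketch below tallies the most frequently missed quiz questions from a hypothetical event log. The log format, module names, and question labels are invented for the example and do not reflect Adobe Presenter's actual reporting output.

```python
# Hypothetical usage-log analysis; the record format below is assumed, not real.
from collections import Counter

# Hypothetical records: (user_id, module, event, detail)
events = [
    ("u1", "soil productivity", "quiz_answer", "Q2:incorrect"),
    ("u2", "soil productivity", "quiz_answer", "Q2:incorrect"),
    ("u1", "soil productivity", "quiz_answer", "Q2:correct"),
    ("u2", "water quality", "quiz_answer", "Q1:correct"),
]

# Count incorrect responses per (module, question).
misses = Counter(
    (module, detail.split(":")[0])
    for _, module, event, detail in events
    if event == "quiz_answer" and detail.split(":")[1] == "incorrect"
)

for (module, question), count in misses.most_common():
    print(f"{module} {question}: missed {count} time(s)")
```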
Implications and Conclusions
Meaningfully engaging adult learners is a challenge for Extension educators. Depending on the program being offered, Extension audiences tend to be geographically distributed across relatively wide areas, where distance and travel time may limit their attendance at Extension programs. While a target audience may require training on a regular basis to continue operations, it may not be financially efficient to offer programs frequently to small audiences. Some audience members would prefer not to miss time on the job to attend training because doing so may reduce their productivity and/or profit margin. Bringing multiple trainers to a face-to-face program is becoming increasingly difficult because of scheduling conflicts and budget cuts (e.g., travel costs, and added responsibilities as staff retire and remaining individuals absorb their duties). These constraints are common to many Extension audiences. The lessons learned in converting the face-to-face curriculum into an online training platform are useful for educators wishing to expand and diversify their information delivery approaches to their target clients.
An Internet-based approach to delivering the BHG curriculum, with a DVD option, accomplished our goal of providing a quality on-demand educational product that minimizes the cost and time associated with face-to-face delivery of information. It has eliminated the need for future face-to-face training on the biomass harvesting guidelines. Over 120 natural resource professionals and loggers took the online training within the first year of the curriculum's availability. Further, at least 30 copies of the DVD have been requested by loggers with slow (or no) Internet access. While user feedback is limited because the training is not conducted face-to-face, MLEP reports having received unsolicited positive feedback on the training from several loggers. Some users have asked whether more online training will become available in the future.
Given the success of the project, we are currently undertaking a second project to create an online training curriculum for all of Minnesota's forest management guidelines.
References
Adobe Systems Incorporated. (2006). Adobe Presenter 6 software. San Jose, CA. 76 p. Retrieved from: http://www.adobe.com/support/documentation/en/presenter/6/releasenotes.htm
Beaudin, B. P., & Quick, D. (1996). Instructional video evaluation instrument. Journal of Extension [On-line], 34(3), Article 3FEA1. Available at: http://www.joe.org/joe/1996june/a1.php
Bennett, B. K., Johnson, J. L., & Parker, R. (2009). Educating limited acreage producers using Web-based technology. Journal of Extension [On-line], 47(6), Article 6IAW3. Available at: http://www.joe.org/joe/2009december/iw3.php
Bernard, R., Abrami, P., Borokhovski, E., Wade, A., Tamin, R., Michael, S., & Bethel, E. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243-1289.
Brown, R. (2001). Thinking in multimedia: Research-based tips on designing and using interactive multimedia curricula. Journal of Extension [On-line], 39(3), Article 3TOT1. Available at: http://www.joe.org/joe/2001june/tt1.php
Dunn, C., Thomas, C., Green, C., & Mick, J. (2006). The impact of interactive multimedia on nutrition and physical activity knowledge of high school students. Journal of Extension [On-line], 44(2), Article 2FEA6. Available at: http://www.joe.org/joe/2006april/a6.php
Kern Learning Solutions. (2012). Observation criteria for the usability testing process. Kern Learning Solution Blog. Retrieved from: http://elearning.kern-comm.com/
Local Government Environmental Assistance Network (LGEAN). (2004). Seeing green with trees: the economic and environmental benefits of urban forests. Retrieved from: http://www.lgean.org/html/whatsnew.cfm?id=853
Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge University Press.
Mayfield, C., Wingenbach, J., & Chalmers, D. (2006). Using CD-based materials to teach turfgrass management. Journal of Extension [On-line], 44(2), Article 2FEA5. Available at: http://www.joe.org/joe/2006april/a5.php
McCoy, R. (2007). Using focus groups to learn about landowner knowledge/willingness to establish chestnut orchards and enhance technology transfer efforts. Retrieved from: http://www.centerforagroforestry.org/pubs/focus.pdf
Minnesota Department of Natural Resources (MNDNR). (2008). Orientation Manual for Minnesota Forest Stewardship Plan Preparers. Minnesota Department of Natural Resources, St. Paul, MN. 44 p. Retrieved from: http://files.dnr.state.mn.us/assistance/grants/forestmgmt/stewardship/orientationManual.pdf
Minnesota Forest Resources Council. (2007). Biomass harvesting guidelines for forestlands, brushlands and open lands. Minnesota Forest Resources Council, St. Paul, MN. 42 p. Retrieved from: http://www.frc.state.mn.us/documents/council/site-level/MFRC_forest_BHG_2001-12-01.pdf
Morgan, D. (1998). The focus group guidebook. Thousand Oaks, CA: Sage Publications.
Pankow, D., Porter, N., & Schuchardt, J. (2006). Training educator and community collaborators using a satellite videoconference format. Journal of Extension [On-line], 44(1), Article 1TOT6. Available at: http://www.joe.org/joe/2006february/tt6.php
Penuel, W., Bienkowski, M., & Korbak, C. (2005). GLOBE Year 9 evaluation: Implementation supports and student outcomes. Menlo Park, CA: SRI International.
Westa, S., Broderick, S., & Tyson, B. (2005). Getting the word out in the Last Green Valley: Integrating digital video, direct mail, and web-based information for specific target audiences. Journal of Extension [On-line], 43(1), Article 1FEA7. Available at: http://www.joe.org/joe/2005february/a7.php
Zimmer, B., Shriner, J., & Scheer, S. (2006). Use and evaluation of a statewide 4-H volunteer newsletter. Journal of Extension [On-line], 44(1), Article 1RIB8. Available at: http://www.joe.org/joe/2006february/rb8.shtml