June 2008 // Volume 46 // Number 3 // Feature Articles // 3FEA4
Who's That Knocking at Our Door? Characterizing Extension's Online Clientele
Abstract
Using an online question-and-answer (Q&A) service, communicators at Oregon State University found an unexpected way to characterize Extension's online clients and to measure the impact of Extension's online resources. Over time, the Q&A process provided a database that could be used to query an online audience, which in turn suggested possible directions for managing Extension's expert knowledge and further developing online e-agents.
Introduction
Online users of Extension Service information pose a challenge for Extension communicators. Sitting at their computers, online users may not know, or even care about, the source of information they find on the Internet. Their numbers are counted by hits or visits to Web sites, but little can be learned from such statistics about who those online clients are or what they have gained from visiting a Web site.
The Internet has become an enormous stockpile of information accessed by a huge cross-section of people around the world. In the United States alone, more than 113 million homes have Internet access (ClickZ, 2007). The national strategy for eXtension is focused on this huge population, asserting that the growth of the Internet opens the door for Extension to serve a more diverse audience (Mississippi State University, 2002). Who is that audience at our door? And what are they seeking?
King (2003) challenges us to imagine e-Extension as a giant building prior to groundbreaking, in need of architects to ensure that people can find high-quality information that fits their needs. Who are these people who will be seeking information at this giant online structure? Who is knocking at that door?
Whether we are thinking about a national blueprint for eXtension, or simply considering how to respond to online clients, we will succeed in providing new knowledge in a useful and accessible fashion only if we understand whom we are serving and what they seek. In this article, we attempt to characterize a segment of Extension's online audience that has sought information online from the Oregon State University Extension Service, and we consider how knowledge might be managed to serve this audience.
How Knowledge Is Managed
Extension provides expert information, and providing information means managing knowledge. In a study that explored strategies for managing knowledge, Harvard University researchers reported two very different knowledge management strategies used by successful companies (Hansen, Nohria, & Tierney, 1999). They found that companies that provide highly customized solutions to unique problems managed knowledge on a person-to-person basis. Those that provided relatively standardized services with repeated information managed knowledge in highly codified databases.
A key point was that a company's choice is not arbitrary; the benefits to both company and clients are greatest when only one strategy is emphasized and the other strategy is developed to support the first. To choose the right strategy, the researchers assert, companies must understand their customers and why they seek the company's products or services.
More can be learned from studying online businesses. According to Rigby, Reichheld, and Schefter (2002), most customer relationship management (CRM) failures stem from assuming that customer relationships can be managed automatically by software. They cannot. Customer relationships are built from loyalty to a known and trusted product and require open, two-way communication and learning between customer and business (Reichheld, 2001). Success in Internet communications has less to do with mastery of technology than with using traditional communication skills: understanding your audience, developing clear and useful content, and connecting your audience with content effectively and efficiently (Emery, 1999).
So who is Extension's online audience and what do they seek?
At Oregon State University, Extension Service (OSU ES) communicators have grappled with the need to characterize their online clients more clearly, both to measure the impact of online educational materials and to identify gaps in managing knowledge. Counters captured the number of Web site visits, downloads, and forwards, but more detailed information about online audiences remained elusive. Responsible for serving this unknown audience, the OSU Extension Web writer and Webmaster designed a survey that would stand as a gateway into the OSU ES Web site, requesting information from visitors in exchange for information from Extension experts. Before such an "Ask the Experts" gateway could be constructed, however, OSU ES found itself facing a sudden flood of online queries from people requesting information from Extension.
The questions that came to our Extension Web site were unintentionally solicited through a link at the bottom of each page that had been created for reporting technical problems. People came to Extension's Web site looking for answers, but sometimes they needed help finding them. Just as one might ask a reference librarian for help finding something in a large library, people asked the Webmaster for help finding answers on the Extension Web site.
Objective
The purpose of the study described here was to begin to characterize OSU Extension's online audience. The incoming queries provided an unexpected entry point to learn more about a sector that has eluded characterization because many online users access Extension information in a nearly anonymous, one-way fashion. By contacting individuals who had sought and received information online, it was possible to measure the impact of Extension Web resources on an audience that knows and uses the Web. By following up clients' questions and Extension's responses with a survey, we began to characterize our online audience's demographics, the topics of information they sought, and their prior knowledge of Extension. We also attempted to gauge their level of satisfaction with the information they received, the extent to which that information may have changed their behavior, and whether they would recommend Extension to a friend.
From these data, it was possible to ask:
- Who were these people knocking at Extension's Web site door?
- Were they familiar with Extension before they contacted OSU ES online?
- What kind of information did they seek from Extension?
- Were they satisfied with the information they received?
- Would they recommend Extension's online resources to their friends?
Approach and Methods
The Q&A service started by accident, soon after OSU ES's Web site was redesigned by staff in the Department of Extension and Experiment Station Communications (EESC). The redesigned site provided links to a library of information in several categories as well as access to county and program Web sites. It also included a small feedback button at the bottom of each Web page meant to help users report technical problems to the Webmaster. Instead, that little button opened the door to a serendipitous, yearlong conversation with people seeking answers from Extension online, and it yielded a growing list of email addresses from those who had sought Extension's help.
Because the feedback link was designed only for reporting technical problems, all online questions came to the Webmaster at EESC, even when a query originated from a visitor to a county Web site with a question about a county program. The Webmaster responded to broken links but routed all content questions to the Web writer, assuming that person would know how to find the answers. Within weeks, content questions far outnumbered technical ones. Soon, three communications specialists at EESC were taking turns staffing the online question-and-answer (Q&A) service that had been unintentionally created, responding to as many as 12 queries daily.
The EESC communication specialists responded to all queries within 1 or 2 working days. Their first priority was to connect the questioner with existing Extension publications, county programs, Web sites, and articles from Extension's online archives. They employed some boilerplate responses to direct people to existing help in their county offices or other state agencies. Occasionally, they forwarded questions to experts on campus or in the counties.
They archived all the incoming questions and their responses in order to build a database of frequently asked questions (FAQs). However, because they rarely received the same question twice, or any question that could be answered from a stock supply of responses, the FAQ database became little more than a long list of individual questions.
A more useful database grew from the expanding list of email correspondence. Within a year, the three staffers had answered nearly 1,000 queries and had built an email database of everyone who had sought information from OSU Extension online. From that database, the EESC researchers surveyed a sample of these clients to learn more about this online audience and the impact of the information they had received from Extension online.
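A minimal sketch of how such a query-and-response archive could be structured is shown below. This is not the system EESC used; the table and field names (for example, `sender_email` and `category`) are assumptions for illustration only.

```python
# Illustrative only: one way to archive Q&A email exchanges so that addresses
# and topics can later be reused for a follow-up survey. Table and field names
# are assumptions, not the fields EESC actually recorded.
import sqlite3

conn = sqlite3.connect("qa_archive.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS queries (
        id            INTEGER PRIMARY KEY AUTOINCREMENT,
        received_date TEXT,   -- ISO date the question arrived
        sender_email  TEXT,   -- used later to deliver the follow-up survey
        category      TEXT,   -- e.g., 'gardening', 'business'
        question      TEXT,   -- the client's original question
        response      TEXT    -- the specialist's emailed answer
    )
""")

def archive_query(received_date, sender_email, category, question, response):
    """Store one question-and-answer exchange in the archive."""
    conn.execute(
        "INSERT INTO queries (received_date, sender_email, category, question, response) "
        "VALUES (?, ?, ?, ?, ?)",
        (received_date, sender_email, category, question, response),
    )
    conn.commit()
```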
With this database of online questions received by OSU Extension, the researchers were able to determine:
- The category of information each questioner was seeking,
- The information they each received, and
- The email addresses of all those who had queried OSU ES.
They created a follow-up questionnaire to learn from each client:
- Their age group,
- The size and location of their community,
- How familiar they were with Extension before they contacted us online,
- How satisfied they were with the information they received online, and
- If they would recommend Extension's online resources to others.
This last question was prompted by research reported in the Harvard Business Review suggesting that this one question trumped more complex measures of customer satisfaction: "If growth is what you're after, you won't learn much from complex measurements of customer satisfaction or retention. You simply need to know what your customers tell their friends about you" (Reichheld, 2003).
The queries were sorted to exclude questions from outside the state and questions clearly outside the realm of Extension, such as those more appropriately addressed to a different agency, a personal physician, or "Dear Abby." All queries that had come from Extension personnel were also screened out. After screening, a subset of 461 queries was randomly selected for the survey and grouped into seven categories based on the type of information sought.
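As a rough illustration of this screening-and-sampling step (not the researchers' actual procedure), the sketch below filters a list of query records and draws a random sample. The flags `in_state` and `within_extension_scope` and the staff email domain are hypothetical placeholders.

```python
# Rough sketch of the screening and random-selection step described above.
# The filter flags and the staff email domain are hypothetical placeholders.
import random

def screen_and_sample(queries, sample_size, staff_domain="@oregonstate.edu"):
    """Drop out-of-scope and staff queries, then draw a random sample."""
    eligible = [
        q for q in queries
        if q.get("in_state")                               # exclude out-of-state questions
        and q.get("within_extension_scope")                # exclude "Dear Abby"-type questions
        and not q["sender_email"].endswith(staff_domain)   # exclude Extension personnel
    ]
    return random.sample(eligible, min(sample_size, len(eligible)))

# Example: select 461 queries from the screened pool for the email survey.
# surveyed = screen_and_sample(all_queries, sample_size=461)
```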
With help from the OSU Survey Research Center (SRC), EESC researchers designed a survey that could be delivered by email, the same way OSU Extension had received the initial queries and sent its responses. The survey opened with an introduction asking for the recipient's help and included the topic of the original query and the date it was sent to OSU Extension. The SRC did not include Extension's responses in the survey email.
The SRC embedded the survey questions into the body of the email message. Respondents could reply by email, or they could print the survey and mail or fax their responses. Responses came back to the SRC staff, who compiled the data using the survey research program Statpac™. The survey included 12 questions, plus one opportunity at the end to write in anything respondents wished to say about OSU Extension, their experience with the Q&A service, or the survey. All responses were coded for anonymity.
Findings
The survey received a 40% response rate.
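(The respondent counts in Table 1 sum to 185 of the 461 clients surveyed, or roughly 40%: 185/461 ≈ 0.40.)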
Some key findings included:
- Overall, 75% of respondents were new to OSU Extension.
- Respondents were closely divided between urban/suburban and rural/small-town areas.
- Nearly half were in the state's five most populous, largely metropolitan counties.
- The oldest respondents had gardening questions.
- The youngest respondents had business questions.
- Overall, 86.1% said they would probably or definitely recommend Extension resources.
Who are these people knocking at our door? Overall, 51.6% of respondents lived in urban/suburban areas and 45.6% in rural/small-town areas. They also skewed older: 30% were 56 or older, 55.6% were 36-55, and 10% were 26-35. Only 1.1% of respondents were 19-25, and all of those had business-related questions. Table 1 breaks down these data by the type of information sought, familiarity with OSU Extension, and whether respondents would recommend Extension resources to others.
Table 1.

| Categories of Queries Emailed to OSU Extension | # of Queries / # of Respondents | Age Distribution of Respondents | Urban or Suburban Residents | First Encounter with Extension | Would Recommend Extension Resources to Others |
| --- | --- | --- | --- | --- | --- |
| Soil, water, turf, landscapes | 98 queries / 40 respondents | 26-35: 7.5%; 36-55: 67.5%; 56+: 25% | 62.5% | 75% | 83% |
| Programs and publications | 93 queries / 32 respondents | 26-35: 9.4%; 36-55: 53.1%; 56+: 34.4% | 50.1% | 72% | 82% |
| Fruit, vegetable, and flower gardening | 80 queries / 36 respondents | 26-35: 9.1%; 36-55: 39.4%; 56+: 39.4% | 51.5% | 76% | 82% |
| Business | 78 queries / 25 respondents | 19-25: 8%; 36-55: 56%; 56+: 36% | 28% | 76% | 96% |
| Natural resources and sustainable practices | 47 queries / 25 respondents | 26-35: 20%; 36-55: 60%; 56+: 20% | 52% | 76% | 100% |
| Foods and households | 39 queries / 19 respondents | 26-35: 10.5%; 36-55: 52.6%; 56+: 31.6% | 63.2% | 68% | 79% |
| 4-H and youth programs | 26 queries / 8 respondents | 26-35: 33.3%; 36-55: 66.7% | 50% | 100% | 83% |
Our survey responses revealed that Extension's online Q&A service is:
- Bringing in new clients
- Reaching people across many regions and interests
- Raising Extension's profile with metro audiences
- Capable of expanding its reach through recommendations from these new clients
Discussion
Online queries from clients are sources of data. Simply by archiving each query and response, it is possible to amass a database of hundreds of email addresses and topics of interest to use in follow-up surveys. In the study reported here, we developed a profile of Web-savvy Extension clients who are actively seeking information, who are new to Extension, and who are favorably impressed with the educational information they received.
Our experience also revealed that people have questions and will use online access to Extension to get answers. Yet inviting an online audience into a vast library of information takes more than building a Web site. We rarely received the same question twice and very few questions that could be answered by a stock of FAQs. This suggests that FAQs should not replace the relationship between client and Extension experts. Client relationships cannot be managed automatically by software.
Managing knowledge to provide online educational delivery is more than tech support, as our experience shows. A common mistake that businesses make is to confuse IT with knowledge management, according to the Harvard study authors. They stress that even in a database-driven system, part of the effort must be person-to-person communication "to make sure that documents are not blindly applied to situations for which they are ill-suited" (Hansen, Nohria, & Tierney, 1999).
For Extension, that effort may mean creating a new kind of professional to help people navigate Extension-brand information on the Web. Just as businesses are developing e-commerce and universities are developing e-campuses, it is time for Extension to develop e-agents.
Our findings suggest that there is a role for the traditional one-to-one relationship with Extension clients, even with online clients, to help people find answers and navigate in-depth information. Just as library users often require help from librarians, online Extension clients may require help from Extension e-agents trained to connect people online with Extension programs, publications, and expertise.
So how should Extension manage its storehouse of expert knowledge? Should it facilitate the delivery of standardized educational material, modular enough to be useful in many situations? Or should it facilitate the customized delivery of expert knowledge applicable to particular uses and individuals?
The answer is not always clear, or the same each time. Some people sought out OSU Extension with questions that could be served by a well-designed database ("what is this bug on my tomato plant?"). Others came to Extension with a unique problem that could not be answered by a digitized response ("what should I do with my five acres?"). Is Extension's delivery of research-based knowledge a unique expert service or a commodity that can be widely applied?
If we are to invest in a database-driven knowledge management system, Extension agents need to be recognized and rewarded for their contributions in building that database with written documents that can be used and reused through technologically enabled delivery. If we are to invest in a personalized approach, Extension agents should be rewarded for sharing information directly with clients and for partnering and mentoring colleagues. Both systems require a well-stocked library of information and innovative knowledge brokers accessible to online clients.
Such a full-service online eXtension office would be staffed with e-specialists who serve an online community, develop online programs, and market online learning opportunities, in close coordination with county and statewide programs. Email queries, responses, and client email addresses would be collected in a database and used to identify gaps in existing educational services and to track client trends.
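Continuing the illustrative archive sketch from the Approach and Methods section (again, an assumption rather than a description of an existing system), a summary query like the one below could count incoming questions by category and month to help spot gaps in educational services and track client trends.

```python
# Illustrative trend summary over the hypothetical 'queries' archive sketched
# earlier: count questions by category and month. Assumes received_date is
# stored as an ISO date string (YYYY-MM-DD).
import sqlite3

conn = sqlite3.connect("qa_archive.db")
rows = conn.execute("""
    SELECT category,
           strftime('%Y-%m', received_date) AS month,
           COUNT(*) AS num_queries
    FROM queries
    GROUP BY category, month
    ORDER BY month, num_queries DESC
""").fetchall()

for category, month, num_queries in rows:
    print(f"{month}  {category:<40} {num_queries}")
```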
Conclusion
Many people who seek answers online still want human contact. All of the answers that the EESC researchers provided through the Web Q&A could have been found on the Web site, as links to publications, program Web sites, or Extension experts. The people who contacted the Webmaster with their questions needed a guide through the Web site and reassurance that they had found the right information from the vast online library. Examination of online client requests can help Extension identify gaps in available information and guide the planning of a knowledge management system as we move toward eXtension.
Acknowledgements
The original study was produced with the help and collaboration of Andrea Dailey, an editorial manager in the Department of Extension and Experiment Station Communications, Oregon State University, and was presented at the ACE 2004 conference.
References
ClickZ. (2007). U.S. households with computers and Internet access in 2003. Retrieved September 2007 from: http://www.clickz.com/157011

Emery, M. (1999). Who's out there? Strengthening Internet communication for agriculture through consideration of audience dimensions and user needs. Journal of Applied Communications, 83(1), 21-35.

Hansen, M. T., Nohria, N., & Tierney, T. (1999). What's your strategy for managing knowledge? Harvard Business Review, 77(2), 106-116.

Hjeltnes, T. A., & Hansson, B. (2005). Cost effectiveness and cost efficiency in e-learning. Quality, Interoperability and Standards in e-learning. TISIP Research Foundation, Trondheim, Norway. Retrieved June 11, 2008 from: http://www2.tisip.no/quis/public_files/wp7-cost-effectiveness-efficiency.pdf

King, D. A. (2003). Communicators as architects of change. Journal of Applied Communications, 87(1). Retrieved June 11, 2008 from: http://www.aceweb.org/JAC/v87n1/871-3.htm

King, D. A., & Boehlje, M. D. (2000). Extension: On the brink of extinction or distinction? Journal of Extension [On-line], 38(5). Available at: http://www.joe.org/joe/2000october/comm1.html

Mississippi State University. (2002). e-Extension--A national strategy. Executive summary. Retrieved June 11, 2008 from: http://asred.msstate.edu/e-extension/e-proposal82002.doc

Reichheld, F. F. (2001). Lead for loyalty. Harvard Business Review, 79(7), 76-84.

Reichheld, F. F. (2003). The one number you need to grow. Harvard Business Review, 81(12), 46-54.

Rigby, D. K., Reichheld, F. F., & Schefter, P. (2002). Avoid the four perils of CRM. Harvard Business Review, 80(2), 101-109.