The Journal of Extension - www.joe.org

June 2012 // Volume 50 // Number 3 // Tools of the Trade // v50-3tt1

Ensuring Data Quality in Extension Research and Evaluation Studies

Abstract
This article presents a checklist as a guide for Extension professionals to use in the research and evaluation studies they carry out. A total of 40 statements grouped under eight data quality components—relevance, objectivity, validity, reliability, integrity, generalizability, completeness, and utility—are identified to ensure that research carried out by Extension professionals is credible, follows research protocols, is conducted in an ethical manner, and can withstand the test of scrutiny by reviewers. Researchers and Extension professionals can use the checklist to identify the areas that are methodologically sound and the areas that need improvement.


Rama Radhakrishna
Professor and Interim Department Head
Department of Agricultural and Extension Education
brr100@psu.edu

Daniel Tobin
Doctoral Candidate
Department of Agricultural and Extension Education
dbt127@psu.edu

Mark Brennan
Associate Professor
Department of Agricultural and Extension Education
mab187@psu.edu

Joan Thomson
Professor Emerita
Department of Agricultural and Extension Education
jst3@psu.edu

The Pennsylvania State University
University Park, Pennsylvania

Introduction

The main purpose of ensuring data quality in Extension research and evaluation studies is to present information that is credible. Such research and evaluation studies follow research protocols, are conducted in an ethical manner, and withstand the test of scrutiny by reviewers. Data quality is generally understood to be the degree to which data, including research processes such as data collection and statistical accuracy, meet the needs of users (Vale, 2010). Among the critical aspects to consider when assessing data for quality are relevance, validity, reliability, objectivity, integrity, completeness, generalizability, and utility. Ensuring these critical aspects of data quality in Extension research and evaluation studies is of paramount importance if Extension is to implement and improve programming based on sound methods.

Theoretical and methodological rigor needs to be continually enhanced in order to ensure that Extension is delivering relevant and useful programs to important stakeholders (Braverman & Engle, 2009; Dunifon, Duttweiler, Pillemer, Tobias, & Trochim, 2004). Sound research and evaluation methods based in data quality help Extension provide evidence that outcomes are attributable to Extension programs and help Extension to improve its program offerings (Radhakrishna & Relado, 2009).

This article presents a checklist as a guide for Extension professionals to use to ensure quality of data for the research and evaluation studies they conduct. Definitions of data quality vary from discipline to discipline based on relevance, importance, and user needs. Synthesizing important components across various definitions of data quality and keeping in mind the broad philosophical base of agricultural and Extension education, we propose that data quality is composed of eight distinct aspects: relevance, objectivity, validity, reliability, integrity, completeness, generalizability, and utility.

In the following paragraphs, a definition of each of the eight components is discussed to provide background on data quality (Figure 1). Based on these definitions, we then present a checklist that operationalizes data quality so that Extension professionals can ensure that their research and evaluation studies are rooted in sound methods.

Figure 1.
Eight Components of Data Quality



Validity

Validity refers to the "closeness between the values provided and the true values" (Organization for Economic Cooperation and Development [OECD], 2003, p. 7). Careful development of the questionnaire provides a basis for validity. A thorough examination of previous studies, an ongoing review by a panel of experts, and a field test make the case for construct, content, and face validity (Guba & Lincoln, 1981).

Reliability

Reliability is determined by the degree to which repeated measurements yield similar (consistent) results (Centers for Disease Control and Prevention [CDC], 2009). Careful wording of the questionnaire and pilot testing it with subjects not included in the sample, as well as a high response rate, provide evidence for reliability.
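The article does not prescribe a particular statistic, but for a multi-item scale one widely used consistency estimate computed from pilot-test data is Cronbach's alpha. A minimal sketch in Python, where the function name and the pilot responses are illustrative only:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot-test data: 5 respondents x 4 Likert-type items (1-5).
pilot = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 3, 2, 3],
    [4, 5, 4, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")
```

Values near or above the conventional 0.70 threshold are typically read as evidence of internal consistency; lower values suggest items need rewording before full data collection.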

Objectivity

Objectivity of data means that conclusions are based on statistically sound methods (Guba, 1981; Guba & Lincoln, 1981). Careful analysis of assumptions, hypotheses, objectives, and research questions, along with the use of appropriate statistical procedures, provides evidence of objectivity.

Integrity

Integrity is concerned with minimizing errors throughout the process of collecting, recording, and analyzing data (CDC, 2009). Integrity can be enhanced by properly training those involved with data collection and by verifying that the data have been properly recorded.
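As one illustration of such verification, simple automated checks can flag duplicate identifiers and out-of-range responses before analysis. A minimal sketch, assuming hypothetical pandas data; the column names and valid range are ours, not from the article:

```python
import pandas as pd

# Hypothetical recorded responses; names and values are illustrative only.
df = pd.DataFrame({
    "respondent_id": [101, 102, 102, 104],
    "q1_satisfaction": [4, 7, 3, 2],   # assumed valid range: 1-5
})

problems = []

# Flag respondent IDs entered more than once.
dupes = df[df["respondent_id"].duplicated(keep=False)]
if not dupes.empty:
    problems.append(f"duplicate IDs: {sorted(dupes['respondent_id'].unique())}")

# Flag values outside the instrument's response scale.
out_of_range = df[~df["q1_satisfaction"].between(1, 5)]
if not out_of_range.empty:
    problems.append(f"{len(out_of_range)} out-of-range value(s) in q1_satisfaction")

for p in problems:
    print("INTEGRITY CHECK:", p)
```

Checks like these catch recording errors early, when the original instruments are still available for correction.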

Generalizability

Generalizability is concerned with sound sampling procedures that yield a sample representative of the population on key variables (Guba, 1981; Guba & Lincoln, 1981), as well as follow-up with non-respondents (Miller & Smith, 1983; Radhakrishna & Doamekpor, 2008).
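Miller and Smith (1983) suggest, among other strategies, comparing early and late respondents, with late respondents serving as a proxy for non-respondents. A minimal sketch of such a comparison, assuming hypothetical outcome scores grouped by response wave:

```python
import numpy as np
from scipy import stats

# Hypothetical knowledge scores split by response wave; late respondents
# serve as a proxy for non-respondents (Miller & Smith, 1983).
early = np.array([78, 82, 75, 90, 85, 71, 88])
late = np.array([80, 76, 84, 79, 73])

# Welch's t-test (does not assume equal variances across the two groups).
t_stat, p_value = stats.ttest_ind(early, late, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A non-significant difference between the waves supports generalizing the findings to the full sample frame; a significant difference should be reported as a limitation.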

Completeness

Completeness refers to the ways in which missing values in a given dataset are handled (CDC, 2009). When data are missing at random, their incompleteness is due to external events that cannot be controlled, whereas data not missing at random cannot be collected due to known and expected external events (Howell, 2009). Data not missing at random must be accounted for during data analysis to better understand the limitations and generalizability of the study.
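A practical first step is to quantify and document missingness per variable before judging whether it is plausibly random. A minimal sketch with hypothetical pandas data; the variable names are illustrative:

```python
import numpy as np
import pandas as pd

# Hypothetical survey data with missing values coded as np.nan.
df = pd.DataFrame({
    "age": [34, np.nan, 51, 42, np.nan],
    "years_in_program": [3, 5, np.nan, 2, 4],
    "satisfaction": [4, 3, 5, np.nan, 4],
})

# Percent missing per variable, to be reported alongside the analysis so the
# likely missingness mechanism (random or not) can be judged and documented.
missing_report = df.isna().mean().mul(100).round(1)
print(missing_report.to_string())
```

Reporting these percentages lets reviewers see at a glance whether incompleteness threatens the study's conclusions.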

Relevance

Relevance refers to the degree to which data are important to users and their needs (OECD, 2003; Vale, 2010). Among the strategies to ensure a high degree of relevance are thorough literature reviews and needs assessments.

Utility

Utility includes aspects of timeliness (data are collected promptly so that they maintain their relevance to users), punctuality (data are released on schedule), and accessibility (the ways in which data are made available to the intended users).

Ensuring these critical aspects of data quality for quantitative data in Extension research and evaluation studies is of paramount importance if Extension programs are to be based on sound research. Careful attention to these data quality components helps reduce errors and ensures that the research is deemed acceptable after the critical scrutiny of reviewers, Extension professionals, and faculty. Striving for data quality will help Extension maintain excellence in its pursuit of applying research to programs in an accessible way.

Using the Checklist

Based on the information gathered, a review of Extension studies, and our experiences, a data quality checklist was developed to guide researchers and Extension professionals through the process of ensuring data quality (Figure 2). By using the checklist, researchers and Extension professionals can identify the areas that are methodologically sound and the areas that need improvement. To use the checklist, indicate the extent to which each data quality component is addressed in a research or evaluation study by recording a score of 4 (addressed), 3 (addressed, but needs improvement), 2 (partly addressed, requires major revisions), 1 (not addressed at all), or 0 (does not apply).
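As a minimal sketch of how checklist scores might be tallied, the snippet below averages the applicable item ratings within each component. The components and ratings shown are hypothetical, and the averaging rule is our assumption rather than part of the published checklist:

```python
# Rating scale from the checklist: 4 = addressed, 3 = needs improvement,
# 2 = requires major revisions, 1 = not addressed, 0 = not applicable.
# Hypothetical ratings for a study; the full checklist has 40 statements
# across the eight components.
scores = {
    "relevance": [4, 3],
    "validity": [4, 4, 2],
    "reliability": [3, 1],
}

for component, ratings in scores.items():
    applicable = [r for r in ratings if r > 0]  # 0 = not applicable, excluded
    mean = sum(applicable) / len(applicable) if applicable else float("nan")
    print(f"{component:12s} mean = {mean:.2f}")
```

Components with low means point to the areas of a study that most need methodological improvement before the work is submitted for review.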

Ensuring data quality in all Extension research and evaluation studies is critical in order to design, deliver, and evaluate programs in a manner that is methodologically sound and rigorous. Doing so ensures that Extension will continue to provide programs that are based on sound research and are relevant to stakeholder needs.

Figure 2.
Data Quality Checklist for Research and Evaluation Studies in Extension


References

Braverman, M. T., & Engle, M. (2009). Theory and rigor in Extension program evaluation planning. Journal of Extension [On-line], 47(3), Article 3FEA1. Available at: https://www.joe.org/joe/2009june/a1.php

Centers for Disease Control and Prevention (2009). Quality assurance standards for HIV counseling, testing, and referral data. Atlanta: Department of Health and Human Services, Centers for Disease Control and Prevention. Retrieved from: http://www.cdc.gov/hiv/testing/resources/guidelines/quas/overview.htm#link4

Dunifon, R., Duttweiler, M., Pillemer, K., Tobias, D., & Trochim, W. M. (2004). Evidence-based Extension. Journal of Extension [On-line], 42(2), Article 2FEA2. Available at: https://www.joe.org/joe/2004april/a2.php

Guba, E. G. (1981). Criteria for assessing the trustworthiness of naturalistic inquiries. Educational Communication and Technology Journal, 29(2), 75-91.

Guba, E. G., & Lincoln, Y. S. (1981). Effective evaluation: Improving the usefulness of evaluation results through responsive and naturalistic approaches. San Francisco, CA: Jossey-Bass.

Miller, L. E., & Smith, K. L. (1983). Handling nonresponse issues. Journal of Extension [On-line], 21(5). Available at: https://www.joe.org/joe/1983september/83-5-a7.pdf

Organization for Economic Cooperation and Development (2003). Quality framework and guidelines for OECD statistical activities, version 2003/1. Retrieved from: http://www.oecd.org/dataoecd/26/42/21688835.pdf

Radhakrishna, R. B., & Doamekpor, P. (2008). Strategies for generalizing findings in survey research. Journal of Extension [On-line], 46(2), Article 2TOT1. Available at: https://www.joe.org/joe/2008april/tt1.php

Radhakrishna, R. B., & Relado, R. Z. (2009). A framework to link evaluation questions to program outcomes. Journal of Extension [On-line], 47(3), Article 3TOT2. Available at: https://www.joe.org/joe/2009june/tt2.php

Vale, S. (2010). Statistical data quality in the UNECE, 2010 version. Statistical Division, United Nations. Retrieved from: http://unstats.un.org/unsd/dnss/docs-nquaf/UNECE-quality%20Improvement%20Programme%202010.pdf