The Journal of Extension - www.joe.org

June 2009 // Volume 47 // Number 3 // Feature // v47-3a7

Use of the PRKC Tool in Assessment of Staff Development Needs: Experiences from California

Abstract
This article reviews the authors' experiences using the Professional Research, Knowledge, and Competencies (PRKC) tool to assess staff development needs among 4-H staff members in California. The PRKC is useful for evaluating youth development staff members' self-perceptions of their knowledge and competency levels. We encountered a number of challenges in working with the PRKC tool, identified here, such as determining the best way to analyze and interpret the responses. The conclusions described here may assist researchers in other states in using the PRKC for similar work.


Katherine Heck
Specialist in the Agricultural Experiment Station
keheck@ucdavis.edu

Aarti Subramaniam
Assistant Project Scientist
asubramaniam@ucdavis.edu

Ramona Carlos
Academic Coordinator
rmcarlos@ucdavis.edu

4-H Center for Youth Development
Department of Human & Community Development
University of California-Davis
Davis, California

Background

In 2004, the National Professional Development Task Force in 4-H published a model aimed at establishing professional development standards to provide "a road map for the 4-H youth development workforce of the future" (Stone & Rennekamp, 2004). The study examined dimensions of knowledge and experience that youth development professionals should be expected to have. Task Force members wished to identify an up-to-date knowledge and research base for 4-H youth development that could serve as a foundation for professional development work. Results of the study identified six primary domains for youth development professional competencies. These included youth development; youth program development; volunteerism; equity, access, and opportunity; partnerships; and organizational systems. This new framework was adopted in June 2004 for use by the 4-H system.

Several of these domain areas have also been identified by the National 4-H Learning Priorities Steering Committee as priority areas in professional development for youth development staff for 2007-2012. These priority areas include volunteer development, evaluating for impact (relevant to the youth program development domain), expanding outreach to underserved audiences (i.e., equity, access, and opportunity), and building effective organizational systems (National 4-H Headquarters, 2007). The steering committee recommends that professional development targeting these priority areas be available to all 4-H youth development professionals.

The 4-H Professional Research, Knowledge and Competencies (PRKC) document includes a self-assessment tool intended to familiarize youth development professionals with the knowledge and competency areas and to help them identify areas on which to focus their own professional development. The tool includes 10 to 11 questions in each of the six domain areas, on which respondents are asked to rate their own proficiency on a scale from 1 to 5. The tool is accompanied by a follow-up guide that helps youth development workers plan their own professional development, including identifying competency areas in which they would like to improve and creating an action plan for doing so (National 4-H Headquarters, 2005a).

The PRKC tool was not intended as a research instrument. However, the guidelines for use state that the PRKC can be used "when assessing the staff development training needs for 4-H Extension audiences" (National 4-H Headquarters, 2005b). Some states, such as Virginia, have used the PRKC to identify specific components of staff trainings (Garst, Hunnings, Jamison, & Hairston, 2007). As a self-rating of competency in youth development, the PRKC tool may be limited in accuracy; self-ratings can be less accurate than objective assessments by others. Overall, self-ratings tend to overestimate ability, and less competent individuals may be even less accurate in their self-reports than more competent individuals (Kruger & Dunning, 1999).

In 2006, California implemented a new staffing model in the 4-H Youth Development Program in response to declining numbers of county-based youth development staff members. Traditionally, California has had a staffing system for 4-H that varies somewhat from that of other states. Rather than having 4-H agents in counties, California employs 4-H youth development advisors and 4-H program representatives.

The advisors are responsible for the academic components of the program and, in particular, have research responsibilities. They are required to have at least a master's degree, and some hold doctoral degrees. The program representatives are responsible for the day-to-day operation of the 4-H program, including program delivery and volunteer management; the position is not intended to include academic or research work per se. Program representatives are not required to have as high a level of education, and some have less than a bachelor's degree. They can therefore be expected to have less training and knowledge in the domain areas examined in the PRKC tool.

Many California counties have both an advisor and a program representative, but the number of advisor FTE positions has declined over time; over half of the 58 counties in California currently have no advisor. The new staffing plan created a part-time position for a current advisor, called an "Academic Coordinator," whose role would be to coordinate various program components on a regional level. Three Academic Coordinators began work in the summer of 2006 in three clusters of California counties and took on varying responsibilities, including programmatic coordination across county clusters and volunteer and staff development.

Because staff development was a key goal in two of the three county clusters, the project and evaluation team needed to assess staff competency and knowledge across the domain areas and to determine where new programming would be most useful. After some discussion of potential methods of assessing staff training needs, the project team decided to use the 4-H PRKC tool for this purpose.

This article describes the project team's experience with using the PRKC tool for an assessment of staff competency and knowledge areas as related to the PRKC domains and offers guidance to others who may wish to use the PRKC for a similar function.

Methods

In the summer of 2006, there were 91 non-secretarial youth development staff members in the 58 California counties. Of these 91 employees, 26 were advisors (some of whom were also county directors), 50 were program representatives, and 16 had titles such as "program assistant," "program coordinator," "volunteer coordinator," or "program manager." These nonstandard titles in some cases arose because staff members were paid through counties rather than through the university system. Staff members with these titles were counted as program representatives in data analyses.

The research team created an online survey based on the PRKC tool. The questions covered the respondent's job title; time spent working in the program; education level; preferences for how they would like to receive professional development training; and all PRKC questionnaire items, followed by a box for comments on the survey.

In the online survey tool, the PRKC items were listed exactly as they appeared in the PRKC documentation. The PRKC tool asks respondents to rate their own proficiency according to a 1-to-5 Likert-type scale. The proficiency scale includes explicit definitions for the values 1 ("Good: Knows the competency is important but has not yet addressed it or does not do it consistently"), 3 ("Better: Understands and applies knowledge and skills effectively"), and 5 ("Best: Not only understands and applies the competencies, but coaches others using the same skills and behaviors for the particular area"). There are no definitions in the PRKC tool for scale responses 2 and 4.

For our survey, we added two additional response choices, "not applicable" and "don't know." These were added in case respondents felt that a particular item was not relevant to their job duties (not applicable), or did not understand the item or did not know to what degree they were successful at it (don't know).

In order to protect respondent confidentiality, the research team randomly generated a three-digit numeric identifier for each potential respondent to track his or her results. In August of 2007, each youth development staff member was emailed a notice describing the survey and asking him or her to participate. This email included the respondent's unique identifier as well as a link to the online survey instrument. Respondents were given approximately 3 weeks to complete the survey, and nonrespondents were sent up to two reminder emails to encourage participation.
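A minimal sketch of how such unique identifiers might be generated is shown below (in Python); the function and variable names are illustrative, not a record of our actual procedure.

```python
import random

def assign_identifiers(staff_members, seed=None):
    """Assign each staff member a unique, randomly chosen three-digit
    numeric identifier (100-999) so that survey responses can be tracked
    without names. The resulting mapping would be kept separate from the
    survey data itself."""
    rng = random.Random(seed)
    members = list(staff_members)
    if len(members) > 900:
        raise ValueError("only 900 distinct three-digit IDs exist")
    # Sampling without replacement guarantees that no ID repeats.
    ids = rng.sample(range(100, 1000), len(members))
    return dict(zip(members, ids))

# Example with hypothetical staff members:
id_map = assign_identifiers(["advisor_01", "prog_rep_02", "prog_rep_03"], seed=1)
```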

Results

A total of 77 staff members responded to the online PRKC survey, for a response rate of approximately 85%. Respondents included 25 youth development advisors and 52 program representatives. Respondents tended to rate themselves relatively highly on youth development and organizational systems; moderately on youth program development and equity, access, and opportunity; and lowest on volunteerism and partnerships.

We examined overall self-ratings for all the items (4,480 individual item responses); this distribution is shown in Figure 1. Slightly over half of all item self-ratings were either a 3 or a 4, and about one-quarter fell below a 3. Mean scores for individual items mostly fell between 2.5 and 4.0, while medians were typically either a 3 or a 4. Advisor self-ratings tended to be higher than program representative self-ratings.

Figure 1.
Percentage Distribution of Responses for PRKC Items

Definitions for each value:
1: Good: Knows the competency is important but has not yet addressed it or does not do it consistently.
2: (No definition provided in the PRKC.)
3: Better: Understands and applies knowledge and skills effectively.
4: (No definition provided in the PRKC.)
5: Best: Not only understands and applies the competencies, but coaches others using the same skills and behaviors for the particular area.
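The percentages shown in Figure 1 can be reproduced from the raw responses with a few lines of analysis code. The sketch below (Python) assumes the valid 1-to-5 ratings have already been pooled into a single list; it is illustrative, not the script we actually used.

```python
from collections import Counter

def overall_distribution(ratings):
    """Percentage distribution of pooled item self-ratings (1-5), with
    missing, "don't know," and "not applicable" answers already excluded."""
    counts = Counter(ratings)
    total = sum(counts.values())
    return {value: round(100.0 * counts[value] / total, 1)
            for value in sorted(counts)}

# Example with a handful of hypothetical ratings:
print(overall_distribution([3, 4, 4, 2, 5, 3, 1, 4]))
```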

Table 1 presents the individual items on which respondents tended to rate themselves most highly and those on which they rated themselves lowest, in descending order of the combined percentages for the values 4 and 5 (representing a relatively high self-rated level of competency). These 10 items were selected because their response distributions fell most heavily at either the top or the bottom of the 1-to-5 scale. Missing and "don't know" responses were excluded from the analysis. Most of the highest self-ratings fell into the organizational systems domain, on items such as advocating for positive youth development, applying ethical standards of the profession, and planning inclusive program environments. The lowest self-ratings predominantly fell into the volunteerism and partnerships domains and mostly related to impact assessments and community needs assessments; these are research skills that are not required of program representatives in California.

Table 1.
The Five Individual Items with the Overall Highest and Lowest Self-Ratings

Highest Self-Ratings (percentage who rated themselves at each value)
Domain | Item | 1 | 2 | 3 | 4 | 5
Organizational systems | Advocate for positive youth development in all aspects and levels of work. | 2.7 | 6.8 | 10.8 | 33.8 | 45.9
Organizational systems | Apply ethical standards of the profession. | 6.7 | 4.0 | 12.0 | 17.3 | 60.0
Equity, access, and opportunity | Aware of and open to youth and volunteers who are diverse. | 3.9 | 5.3 | 14.5 | 32.9 | 43.4
Organizational systems | Develop and manage budgets in accordance with organization/university policy and procedures. | 4.6 | 13.8 | 7.7 | 40.0 | 33.8
Organizational systems | Plan for and manage safe, inclusive program environments for all persons. | 10.1 | 0.0 | 18.8 | 43.5 | 27.5

Lowest Self-Ratings (percentage who rated themselves at each value)
Domain | Item | 1 | 2 | 3 | 4 | 5
Volunteerism | Develop and conduct impact assessment of volunteer efforts and communicate to stakeholders. | 28.8 | 27.1 | 22.0 | 15.3 | 6.8
Volunteerism | Develop and conduct organizational and community needs assessments relative to volunteer engagement. | 28.8 | 27.1 | 22.0 | 15.3 | 6.8
Equity, access, and opportunity | Develop and conduct community needs assessments to gain meaningful input from diverse audiences. | 28.1 | 21.9 | 28.1 | 14.1 | 7.8
Partnerships | Develop and conduct a community analysis. | 34.6 | 15.4 | 32.7 | 13.5 | 3.8
Partnerships | Apply community development tools and processes. | 30.4 | 17.9 | 23.2 | 19.6 | 8.9

Note: Highest self-ratings are the five items on which the highest percentage of individuals scored themselves a 4 or 5; lowest self-ratings are the five items on which the highest percentage scored themselves a 1 or 2 ("don't know" and "not applicable" responses excluded).
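For readers reproducing this ranking, the selection of Table 1's items reduces to sorting on "top-two-box" (percentage rating 4 or 5) and "bottom-two-box" (percentage rating 1 or 2) shares. The following is a sketch in Python under the assumption that responses are stored as (domain, item, rating) records with invalid answers already removed; the data layout is hypothetical.

```python
from collections import Counter, defaultdict

def rank_items(responses, top_n=5):
    """Return the items with the highest percentage of 4-or-5 ratings and
    the highest percentage of 1-or-2 ratings. `responses` is an iterable
    of (domain, item, rating) tuples from which missing, "don't know,"
    and "not applicable" answers have already been dropped."""
    counts = defaultdict(Counter)
    for domain, item, rating in responses:
        counts[(domain, item)][rating] += 1

    def pct(counter, values):
        # Share of this item's valid responses falling in `values`.
        return 100.0 * sum(counter[v] for v in values) / sum(counter.values())

    highest = sorted(counts, key=lambda k: pct(counts[k], (4, 5)), reverse=True)
    lowest = sorted(counts, key=lambda k: pct(counts[k], (1, 2)), reverse=True)
    return highest[:top_n], lowest[:top_n]
```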

As part of testing the relevance of the PRKC items for this group of youth development professionals, we examined the "don't know" and "not applicable" responses. A high number of such responses can help identify questions that were either confusing or not appropriate for staff members in the California 4-H youth development program. Items with at least 10 combined "don't know" or "not applicable" responses are shown in Table 2; a sketch of this flagging step follows the table.

Many of these items relate to skills that program representatives in California are not required to have, such as expertise in community assessment or management of budgets. "Don't know" or, particularly, "not applicable" might be an appropriate response if the skill is not required for one's job title. However, there were items with relatively high "don't know" or "not applicable" responses that are relevant to the 4-H program's mission of research-driven and community-based youth programming, such as "Apply strategies to enhance the profession through the integration of research practice" and "Create and manage appropriate community alliances."

In addition, some items may appear in this list because of ambiguous wording. For example, in volunteerism, one item states, "Apply societal changes to volunteer administration strategies." Respondents may have been uncertain what was meant by "apply societal changes." This item may have been better understood with wording such as "Apply knowledge about societal or demographic changes to volunteer administration strategies."

Table 2.
"Don't know" and "Not applicable" Responses in the California PRKC Survey

Domain Item "Don't Know" Responses "Not Applicable" responses
Youth development Understand and apply a model that demonstrates how multiple contexts have influence over the growth and development of youth. 4 7
Youth program development Use a logic model to represent how a program operates. 9 8
Utilize quantitative and qualitative evaluation methodology. 5 10
Analyze and interpret evaluation data. 4 9
Volunteerism Apply societal changes to volunteer administration strategies. 6 5
Develop and conduct organizational and community needs assessments relative to volunteer engagement. 5 7
Implement appropriate selection strategies to engage potential volunteers for available position(s). 3 7
Develop and conduct impact assessment of volunteer efforts and communicate to stakeholders. 6 11
Equity, access, and opportunity Develop and conduct community needs assessments to gain meaningful input from diverse audiences. 5 7
Recruits, supports and retains diverse volunteers and advisory committee members. 4 6
Partnerships Assess readiness for community alliances. 5 7
Create and manage appropriate community alliances. 6 8
Develop and conduct a community analysis. 8 17
Apply community development tools and processes. 8 13
Facilitate workforce development through 4-H youth development. 5 10
Organizational systems Develop and manage budgets in accordance with organization/university policy and procedures. 2 10
Apply strategies to enhance the profession through the integration of research practice. 4 10
Note: Not all items are shown. The items shown are those with at least 10 combined "don't know" or "not applicable" responses.
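The flagging rule behind Table 2 reduces to a simple count per item. Below is a sketch in Python, assuming raw answers are stored as (item, answer) pairs in which non-numeric answers appear as the literal strings shown; this layout is hypothetical.

```python
from collections import Counter

def flag_items(raw_responses, threshold=10):
    """Return items whose combined "don't know" and "not applicable"
    counts reach the Table 2 threshold. `raw_responses` is an iterable of
    (item, answer) pairs where each answer is an integer 1-5 or one of
    the two strings below."""
    dk_na = Counter()
    for item, answer in raw_responses:
        if answer in ("don't know", "not applicable"):
            dk_na[item] += 1
    return sorted(item for item, n in dk_na.items() if n >= threshold)
```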

Challenges in Using the PRKC Survey Tool

The PRKC tool was useful in identifying areas in which staff members felt they were proficient or, conversely, felt their skills were not up to the "Better" or "Best" levels. However, our ability to identify which areas needed staff training was limited for a variety of reasons. Below we discuss some of the challenges we had in analyzing and interpreting the PRKC data for California.

Analyzing Responses

The PRKC tool includes no specific guidelines for how to analyze or measure scores on a group basis. Its intent is for individuals to get a general picture of the areas in which they feel they are strong and other areas in which they may wish to improve their competency level. In the study reported here we needed to generate specific results by analyzing the data for the entire group, as well as examining results for subgroups, such as comparing advisor and program representative responses.

We considered a variety of methods of summarizing the data. Possibilities for analyzing items or domains include calculating mean scores, median scores, or the proportion of respondents who reach a particular target value (for example, the percentage of respondents who score themselves at least a 3 on a particular item). In addition, we looked at the overall distribution of responses for each item. There are advantages and disadvantages to each analytical option. Mean scores can give a useful overall picture of score levels for individual items or of differences across groups. However, means can misrepresent the results if the responses are skewed rather than symmetrically distributed.

In addition, the scale on which respondents judged themselves, while appearing to be a continuous scale from 1 to 5, was not truly continuous: the values 1, 3, and 5 were defined as separate and distinct concepts and may not represent equal steps along a knowledge or proficiency path, and the values 2 and 4 had no descriptions at all. Medians may represent the midpoint of respondents' self-perceptions, but with only five values to choose from, the medians in these data would almost always be either a 3 or a 4, giving relatively little information for distinguishing among items.

For our final analyses, we chose to examine individual items according to the percentage of respondents who scored themselves at least a 3. The selection of the value 3 is somewhat arbitrary, and another target value could have been chosen; we used 3 because it indicates effective knowledge and application of the item. One advantage of looking at the proportion reaching a particular value is that it provides greater variability across individual items and thus more information. In addition, this method treats the values as categorical and therefore does not assume that the subjective differences between adjacent values are equal; for example, it does not assume that the difference between a 2 and a 3 is the same as the difference between a 3 and a 4, which is the assumption underlying mean values.
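As an illustration, this summary statistic can be computed per item as in the sketch below (Python, with a hypothetical (item, rating) data layout).

```python
from collections import defaultdict

def pct_at_least(responses, cutoff=3):
    """For each item, the percentage of valid 1-5 ratings at or above
    `cutoff`. `responses` is an iterable of (item, rating) pairs with
    missing, "don't know," and "not applicable" answers removed."""
    totals = defaultdict(int)
    at_or_above = defaultdict(int)
    for item, rating in responses:
        totals[item] += 1
        if rating >= cutoff:
            at_or_above[item] += 1
    return {item: 100.0 * at_or_above[item] / totals[item] for item in totals}
```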

Interpreting Scores

We encountered difficulty in interpreting the score values 2 and 4 as presented in the PRKC tool. While 1, 3, and 5 are each described with a level of proficiency ("Good," "Better," and "Best," with descriptions of what those mean), 2 and 4 have no definitions. We assume that individuals who scored themselves a 2 or a 4 on a given item felt that their proficiency lay somewhere between the described levels of the two adjacent values (between "Good" and "Better," or between "Better" and "Best"); however, because there was no definition for 2 or 4, it is not possible to say exactly what a respondent meant by such a score.

Subjectivity of Self-Assessment

Any survey instrument is limited in that it captures the respondent's subjective view of his or her beliefs, experiences, and level of skill or knowledge. In addition to the possibility that respondents may not answer honestly, survey data are limited by their very nature as self-reports: self-assessments provide no objective, external measure of the quantity of interest.

This limitation applies to any survey data but may be particularly relevant for a self-rating of one's own competency, which is not as objective a variable as some survey topics, such as the respondent's age or education level. Whether due to modesty or fear of reprisal, some respondents may not be comfortable rating their job-related skills at either end of the spectrum: some may be unwilling to reveal self-perceived weaknesses, while others may not want to appear boastful, in a survey to be seen by others, even with the confidentiality measures taken for this purpose, such as entering an ID number rather than the respondent's name and county.

Given these limitations, it is not possible to determine how closely these data reflect the true levels of competency among youth development staff members. In general, the results matched what might be expected; for example, staff members with longer tenure and those with advisor titles tended to rate themselves more highly than newer staff members and program representatives, respectively. However, some respondents' self-ratings were outliers, and it is possible that objective assessments would have rated them (or other respondents) differently.

Discussion

While there were a number of challenges to using the PRKC survey as a tool for determining staff development needs, on the whole we found it useful in identifying areas in which staff members had expertise as well as areas in which they reported lower levels of confidence. Having a ready-made assessment survey that matched nationally identified priorities for youth development was valuable to us in a number of respects.

A key strength of using this assessment tool was our ability to rapidly develop and implement the online survey once we decided that an assessment of staff members' strengths and needs was necessary as part of the evaluation for the new staffing plan. Developing and piloting a new survey of staff training needs and competencies would have required a substantial and lengthy effort, which we avoided by using the pre-established PRKC survey; the savings in staff time were therefore a significant benefit.

The specific skills identified in the PRKC as important for 4-H youth development staff to have were not in all cases relevant for youth development staff in California, as indicated by the relatively high number of "don't know" or "not applicable" responses to items as well as by a comparison of the listed skills to the California program representative job description. In particular, research and evaluation skills, featured in the PRKC, are not expected of program representatives in California, although they remain relevant for advisors. Other states or counties using the PRKC to assess staff development needs may wish to evaluate the relevance of the individual items for their staff members when interpreting findings.

The results from the 77 respondents have been, and continue to be, used to develop trainings for staff development. For example, during the fall of 2007, a workshop on volunteer management, which emerged in the PRKC data as a significant need, was delivered to program representatives. Additionally, trainings have been held on organizational management topics such as using technology for communication.

Conclusions

The Professional Research, Knowledge and Competencies tool was intended to be used by individuals working in the youth development field as a way to identify their own areas of strength and weakness, but its documentation also suggests using it to identify professional development needs on a broader level. The project reported here assessed staff competency levels and training needs following that latter suggestion. The methodology described here, in which the PRKC was administered anonymously and the results were analyzed statewide, posed certain challenges but on the whole was a useful way to assess current staff members' competency levels and to identify areas for staff trainings. We hope that other states wishing to use the PRKC for a similar purpose may find our experience and suggestions for analysis useful.

References

Garst, B. A., Hunnings, J. R., Jamison, K., & Hairston, J. (2007). Development of a comprehensive new 4-H Extension agents training program using a multi-module approach and the 4-H Professional Research, Knowledge, and Competencies (4HPRKC) taxonomy. Journal of Extension [On-line], 45(1), Article 1FEA3. Available at: http://www.joe.org/joe/2007february/a3.php

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.

National 4-H Headquarters. (2005a). Planning guide for creating a 4-H professional development plan. National 4-H Headquarters, CSREES, USDA.

National 4-H Headquarters. (2005b). 4-H PRKC self-assessment: Guidelines for use. National 4-H Headquarters, CSREES, USDA.

National 4-H Headquarters. (2007). National 4-H learning priorities 2007-2012. National 4-H Headquarters, CSREES, USDA.

Stone, B., & Rennekamp, R. (2004). New foundations for the 4-H youth development profession: 4-H professional research, knowledge, and competencies study, 2004. Conducted in cooperation with the National 4-H Professional Development Task Force. National 4-H Headquarters, CSREES, USDA.