August 2000 // Volume 38 // Number 4 // Research in Brief // 4RIB2


Abstract
Extension professionals involved in applied agronomic research projects commonly use analytical laboratories to analyze soil samples and determine the nutrient status of agricultural soils. Many Cooperative Extension clientele and program collaborators use soil testing laboratories for the same purpose. Determining soil nutrient status is important for maximizing production and returns as well as for minimizing negative impacts on the environment. There is, however, a great deal of variability in the analytical results received from many soil testing laboratories, and that variability can negatively affect applied research projects and the recommendations made to Cooperative Extension clientele. In 1995 and 1996, a study was conducted to quantify and illustrate the variability problem and to develop recommendations to assist Cooperative Extension professionals and program clientele in selecting a quality soil testing laboratory. Our results indicate that results received from soil testing laboratories vary enough to make it worthwhile to research analytical soil testing laboratories before selecting one for use.


Jerry Neufeld
Extension Educator-Crops
University of Idaho Cooperative Extension System
Caldwell, Idaho
Internet address: jerryn@uidaho.edu

Jay Davison
Area Specialist
Nevada Cooperative Extension
Fallon, Nevada
University of Nevada, Reno


Introduction

Cooperative Extension professionals and program collaborators commonly use laboratory analysis of soil samples in educational programs to help quantify the nutrient status of soils. When done properly, soil testing is a highly effective tool for producing high crop yields at the lowest possible cost. Inaccurate analysis, however, can result in additional costs, lowered production, or environmental damage from excessive fertilizer applications. The accuracy of soil testing depends on both proper field sampling techniques and the laboratory analysis itself.

This article discusses variability of results associated with soil testing laboratories and suggests practical actions that Extension professionals and program collaborators can take to select a laboratory that provides accurate and precise soil testing information.

Reasons for Soil Testing

In areas under intense cultivation for many years, current crop production practices remove nutrients from the soil faster than they can be replaced by natural soil formation processes. Periodic soil testing is therefore necessary; it is the only tool available to quantitatively determine current soil nutrient levels. It is widely accepted in production agriculture that soil testing helps producers obtain high yields while enabling them to use best management practices that benefit the environment (Hawkes et al., 1985). If, however, a producer is using inaccurate soil fertility data, he or she may apply fertilizer when there is no likelihood that the application will increase yield or profits. Conversely, if the soil fertility data do not indicate a need for fertilizer when one exists, maximum economic yields may be foregone and income lost.

Agricultural producers in Idaho, Nevada, and many other areas of the U.S. are experiencing pressure from various environmental groups and government agencies to reduce non-point sources of pollution. Nitrogen and phosphorus fertilizers applied to agricultural lands are sources of environmental degradation, and their detrimental effects have been well documented (Ongley, 1996).

The principal problems related to agricultural runoff are contamination of surface and ground water, loss of ecosystem diversity, ecosystem dysfunction, and increases in water-borne diseases (Ongley, 1996). Agricultural crops require large amounts of macronutrients, so frequent fertilizer applications are necessary for optimum crop production. Accurate and precise soil analysis enables producers to apply only the amount of fertilizer the crop needs, thereby reducing the potential for offsite movement.

Definition of the Problem

Hundreds of laboratories in North America analyze agricultural soil samples. Consequently, the quality of the analytical results provided to customers varies. Also contributing to soil testing variability are the different extraction methods used to quantify the same soil constituents. Research has shown certain extraction methods to be more accurate and precise than others (Miller & Kotuby-Amacher, 1996), while some extraction methods are applicable only to certain soil types and climates (Ankerman & Large). In addition, there are no certified reference standards across the soil testing industry for quantitatively evaluating the fertility status of agricultural soil samples. All of these factors can contribute to a high degree of reporting variability.

Analytical results from soil testing laboratories can be highly variable within individual laboratories as well as between laboratories. There are methods that can be used to deal with variability when submitting large quantities of samples. For example, soil samples with known properties (reference samples) can easily be submitted as a set of blind samples within a larger set of soil samples. The results on the reference sample can then be evaluated for deviations from its known properties.
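A minimal sketch of this kind of check is shown below, written in Python with hypothetical constituent values and a hypothetical 10% tolerance; it simply compares a laboratory's reported values for a blind reference sample against the sample's known properties.

```python
# Minimal sketch: screening a blind reference sample's reported results
# against its known properties. All values and the 10% tolerance are
# hypothetical, not from this study.
known = {"P (ppm)": 18.0, "K (ppm)": 210.0, "Zn (ppm)": 1.4}
reported = {"P (ppm)": 23.5, "K (ppm)": 205.0, "Zn (ppm)": 1.9}

TOLERANCE = 0.10  # flag results that deviate more than 10% from the known value

for constituent, true_value in known.items():
    deviation = (reported[constituent] - true_value) / true_value
    flag = "FLAG" if abs(deviation) > TOLERANCE else "ok"
    print(f"{constituent}: reported {reported[constituent]}, known {true_value}, "
          f"deviation {deviation:+.1%} [{flag}]")
```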

However, the authors and most producers are more likely to submit small numbers of samples, which makes it more difficult to check the accuracy and precision of a laboratory's results. The authors' experience indicates that it is often difficult to determine when analytical results obtained from soil testing laboratories are accurate and that excessive variability between and within laboratories is the norm rather than the exception. Therefore, a project was undertaken to evaluate the variability between and within several laboratories likely to conduct soil analysis work for northern Nevada producers. A set of recommendations to assist Cooperative Extension professionals and clientele in selecting an accurate and precise soil testing laboratory was also developed.

Materials and Methods

In 1995 and 1996, the authors conducted a project to document and evaluate the soil testing variability problems encountered when using commercial soil testing laboratories. In the fall of 1995, two 5-gallon samples of Creemon silt loam soil were collected from the upper 12 inches of soil at two locations in an alfalfa field south of Battle Mountain, Nevada. Each sample was air dried, crushed, and passed through an 18-mesh screen to remove large particles and debris. Each sample was then thoroughly mixed to make a uniform composite sample, and each composite sample was used to fill 20 soil bags, for a total of 40 samples.

Two samples from each composite sample were sent to five different soil testing laboratories (each laboratory received four samples). Two weeks later, the remaining two samples from each composite sample were sent to the same five laboratories. This procedure was repeated in the fall of 1996, except a Sonoma silt loam from Lovelock, Nevada, was used. In summary, each laboratory evaluated four replications of four different soil samples over a period of 2 years (80 samples total).

Soil laboratories commonly use different extraction methods to analyze for the same soil constituents. Nine different soil constituents were analyzed for precision in this project. They were selected because the five laboratories use the same extraction method for these constituents, thus making direct comparisons possible. Table 1 lists the constituents, the extraction methods, and units used for this project.

Table 1.

Constituents Analyzed, Extraction Methods, and Units Used

Constituent        Extraction Method     Units
Phosphorus (P)     Sodium Bicarbonate    ppm
Potassium (K)      Ammonium Acetate      ppm
Calcium (Ca)       Ammonium Acetate      milliequivalents/100 grams
Magnesium (Mg)     Ammonium Acetate      milliequivalents/100 grams
Sodium (Na)        Ammonium Acetate      milliequivalents/100 grams
Zinc (Zn)          DTPA                  ppm
Iron (Fe)          DTPA                  ppm
Copper (Cu)        DTPA                  ppm
Manganese (Mn)     DTPA                  ppm

The analytical results received from the laboratories were summarized and then compared to the North American Proficiency Testing Program (NAPT, formerly called the Western States Proficiency Testing Program) values for the same years. The NAPT objectives are: 1) to provide an external measure of individual laboratory accuracy, 2) to develop a framework for improving the long-term quality of agricultural analyses, and 3) to identify levels of accuracy and precision for specific analytical methods (Miller & Kotuby-Amacher, 1996).

The NAPT's objectives are met through an intensive program in which soil samples with known properties are submitted quarterly to voluntarily participating laboratories. Each laboratory analyzes the soil samples for nutrient status using established analytical procedures (Miller & Kotuby-Amacher, 1998). Laboratories provide their results to the NAPT, where they are compiled and analyzed statistically. The statistical results provided by the NAPT show each laboratory how it performed on the quarterly sample analysis compared to all other participating laboratories.

A statistic called the relative standard deviation (RSD), also known as the coefficient of variation (CV), is the main measure used to evaluate laboratory results for precision. The RSD is a measure of the relative dispersion of the values in a data set (Little & Hills, 1978). It is calculated by dividing the standard deviation of a data set by its mean and then multiplying the quotient by 100. The lower the RSD value, the higher the level of precision. In 1995, the NAPT calculated RSD values for 35 soil constituents submitted from 102 laboratories. In 1996, the NAPT calculated RSD values for 35 soil constituents submitted from 104 participating laboratories.
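To illustrate the calculation, the short Python sketch below computes the RSD for a set of hypothetical replicate results; the replicate values are invented for illustration only.

```python
import statistics

# Hypothetical replicate phosphorus results (ppm) reported by one laboratory
# for the same composite sample; values are for illustration only.
replicates = [12.0, 14.5, 11.8, 13.7]

mean = statistics.mean(replicates)
std_dev = statistics.stdev(replicates)   # sample standard deviation
rsd = (std_dev / mean) * 100             # relative standard deviation, in percent

# The lower the RSD, the more precise the laboratory's replicate results.
print(f"mean = {mean:.2f} ppm, standard deviation = {std_dev:.2f} ppm, RSD = {rsd:.1f}%")
```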

Laboratories participating in the NAPT can use the statistical data to compare their analytical results to industry-wide values and ultimately improve their analytical procedures. The NAPT does not provide data to the public about specific laboratories; interested parties must ask individual laboratories whether they participate in this or any other proficiency program and whether they will share their proficiency testing data. However, the NAPT program does provide the public with an annual report summarizing the data collected.

Following is an example of how proficiency testing program data can be used in a Cooperative Extension crops program. Anyone can request the annual report summarizing soil testing accuracy and precision results from the NAPT. You can also ask your laboratory to provide the results from its participation in the NAPT. A review of the data will show how your laboratory compares to all other participating laboratories. As a rule of thumb, the NAPT suggests that a laboratory's accuracy values should differ from industry-wide values by no more than 10%, and that its precision values (RSD) should exceed industry-wide values by no more than 15%; both thresholds are analysis dependent (R. O. Miller, personal communication, April 1, 1998).
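The Python sketch below shows how such a precision check might be carried out, under the assumption that the 15% rule of thumb means a laboratory's RSD should not exceed the industry-wide median RSD by more than 15% (the same comparison used in Tables 2 and 3). The median values are the 1995 NAPT medians reported in Table 2; the laboratory RSD values are hypothetical.

```python
# Sketch of the precision check used in Tables 2 and 3: a laboratory's RSD is
# compared against the NAPT median RSD plus 15%. Laboratory RSD values below
# are hypothetical; the medians are the 1995 NAPT medians from Table 2.
napt_median_rsd = {"Phosphorus": 12.8, "Potassium": 8.4, "Zinc": 9.6}
lab_rsd = {"Phosphorus": 22.5, "Potassium": 7.5, "Zinc": 14.2}

for constituent, median in napt_median_rsd.items():
    threshold = median * 1.15  # median NAPT RSD + 15%
    status = "lacks precision" if lab_rsd[constituent] > threshold else "acceptable"
    print(f"{constituent}: lab RSD {lab_rsd[constituent]} vs threshold {threshold:.1f} -> {status}")
```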

Results and Discussion

There is a wide range of variability between and within the results received from the five laboratories conducting the soil analyses for this project. Table 2 shows the median RSD values obtained from the NAPT for 1995, along with the RSD values from the laboratories participating in this study in that year. Table 3 shows the same data for 1996. Any RSD value exceeding the median NAPT value plus 15% indicates a lack of precision.

Table 2.

1995 NAPT RSD Values and Sampled Laboratory RSD Values

Constituent    Median     Median NAPT  RSD Within    Lab #1  Lab #2  Lab #3  Lab #4  Lab #5
& Sample ID    NAPT RSD   RSD + 15%    Sampled Labs  RSD     RSD     RSD     RSD     RSD

Phosphorus     12.8       14.7
  Field A                              74.9          4.8     10.2    8.2     67.4    12.5
  Field B                              44.9          4.9     22.5    16.2    44.2    30.5
Potassium      8.4        9.7
  Field A                              19.3          1.1     15.6    2.6     2.5     20.5
  Field B                              13.2          7.5     14.9    4.9     2.9     17.4
Calcium        15.5       17.8
  Field A                              46.9          1.3     8.9     1.1     2.9     12.5
  Field B                              45.6          6.4     61.5    7.1     5.7     4.6
Magnesium      7.0        8.1
  Field A                              19.1          1.5     11.0    1.8     2.2     8.5
  Field B                              11.1          6.7     4.9     9.2     4.7     15.5
Sodium         17.5       20.1
  Field A                              35.6          1.3     11.0    2.6     2.0     0.0
  Field B                              51.8          5.5     15.4    35.4    5.7     35.3
Zinc           9.6        11.0
  Field A                              27.0          13.2    26.7    13.3    8.7     6.5
  Field B                              24.9          14.2    13.2    16.5    7.4     8.0
Iron           13.5       15.5
  Field A                              77.0          66.9    62.8    0.0     11.8    10.0
  Field B                              66.2          14.5    45.2    23.1    18.2    1.9
Copper         10.1       11.6
  Field A                              42.1          15.3    34.0    0.0     6.9     6.8
  Field B                              30.3          6.1     5.4     8.7     14.0    7.4
Manganese      17.2       19.8
  Field A                              65.2          19.3    71.4    0.0     10.5    12.0
  Field B                              45.9          23.8    18.4    34.8    14.2    8.0

Table 3.

1996 NAPT RSD Values and Sampled Laboratory RSD Values

Constituent    Median     Median NAPT  RSD Within    Lab #1  Lab #2  Lab #3  Lab #4  Lab #5
& Sample ID    NAPT RSD   RSD + 15%    Sampled Labs  RSD     RSD     RSD     RSD     RSD

Phosphorus     12.3       14.1
  Field A                              121.0         2.3     13.8    6.4     85.9    8.1
  Field B                              104.6         1.4     9.5     4.1     69.9    3.6
Potassium      6.1        7.0
  Field A                              38.1          1.1     27.4    5.9     9.9     4.0
  Field B                              36.0          9.1     3.4     5.3     5.2     4.4
Calcium        11.0       12.7
  Field A                              68.1          1.5     2.3     1.1     2.5     9.7
  Field B                              72.2          8.4     7.2     58.2    3.1     7.4
Magnesium      7.3        8.4
  Field A                              51.0          1.5     40.7    8.9     2.4     1.4
  Field B                              54.3          8.2     3.9     8.2     2.5     1.2
Sodium         35.3       40.6
  Field A                              47.5          0.9     57.6    10.4    6.4     2.3
  Field B                              21.7          10.2    7.1     3.7     5.9     14.0
Zinc           9.6        11.0
  Field A                              35.6          4.6     58.0    6.8     22.2    10.9
  Field B                              21.3          4.4     15.7    7.7     31.9    4.4
Iron           13.1       15.1
  Field A                              72.0          0.0     21.5    38.5    38.5    18.7
  Field B                              62.9          4.9     38.2    23.1    58.3    15.9
Copper         10.9       12.5
  Field A                              20.9          13.0    28.6    6.1     19.2    10.9
  Field B                              19.4          13.9    10.9    6.8     20.7    5.0
Manganese      21.3       24.5
  Field A                              39.3          13.4    8.9     23.1    28.7    20.4
  Field B                              40.8          12.3    9.9     0.0     22.2    10.7

In nearly all cases, the RSDs within the group of sampled laboratories are much greater than the NAPT median RSD plus 15% for each constituent. This can be seen in Tables 2 and 3 by comparing the columns labeled Median NAPT RSD + 15% and RSD Within Sampled Labs. To a certain extent, this is to be expected because the number of laboratories we sampled is quite small compared to the NAPT program. Variability between sampled laboratories is not as great as the variability within laboratories.

However, for most constituents there is a wide range of variability within each laboratory's raw data. No laboratory was able to report results for all constituents within the recommended range of the median NAPT RSD + 15%, although the data show that some laboratories were consistently more precise than others.

In several cases, the raw laboratory data show enough variability in reported results to shift the laboratory's fertilizer recommendation from one category to another based on factors other than the fertility of the soil sample submitted. This can have drastic implications for Cooperative Extension programs and collaborators and detrimental impacts on the environment, and it underscores the importance of selecting a quality laboratory.

Recent work by Jacobsen et al. (1998) also found significant variation between soil laboratories' analytical results for EC, total N, N at the 24-48 in. depth, P, and K. Smaller differences were reported for pH, organic matter, and N at the 0-6 in. and 6-24 in. depths. As in this study, the differences were sometimes striking.

At the conclusion of this project, the authors wrote letters of inquiry to 96 laboratories participating in the NAPT, requesting their NAPT identification numbers and their latest NAPT results. This was done to determine the laboratories' willingness to share their proficiency testing results with the public. The authors received 18 usable replies.

Conclusions and Recommendations

Our study results and experience indicate that reported soil testing results can be highly variable both between and within laboratories. The differences are large enough to potentially affect crop production and/or environmental quality. Without information from a proficiency-testing program, it is difficult to select a laboratory that has implemented a rigorous quality control program producing consistently precise and accurate results. The quality assurance provided by participation in proficiency testing programs demonstrates to Cooperative Extension professionals and clientele that the laboratory they are using has a long-term record of accuracy and precision in soil analysis.

The authors recommend the following steps to select a high-quality laboratory.

1. Ask the laboratory manager if the laboratory uses a quality-control program (both internal and external). If it does, ask the manager to explain it to you.

2. Ask the laboratory manager if the laboratory participates in a proficiency-testing program. If it does, ask which program and whether or not they will share the results with you. Ask the manager to explain the results to you. The authors' experience has been that few laboratories will openly share proficiency-testing-program information. If your laboratory won't share proficiency testing data with you, consider choosing one that will.

3. Make sure your laboratory uses the best-established analytical methods for the soil constituents you are most concerned about. Information on the most accurate and precise analytical methods in the industry can be obtained from a proficiency-testing program.

4. Make sure your laboratory is using analytical methods appropriate for the soils in your geographic area. Some procedures are not valid in the arid climates of the West, while other procedures are not valid in the eastern U.S.

5. If your concerns and budget are great enough, consider periodically submitting soil samples with known properties as checks along with your own samples. Check samples can be purchased from proficiency-testing programs. However, make sure to thoroughly discuss this practice and how to interpret the results with knowledgeable individuals before starting.

6. Consider using a tissue analysis program in conjunction with your soil testing program as a way to determine if the nutrients in your soils are reaching the plants. However, the issues brought up in this paper also apply to laboratory analyses of plant samples.

References

Ankerman, D., & Large, R. (Eds.). Soil and plant analysis agronomy handbook. Modesto, CA: A & L Agricultural Laboratories.

Gavlak, R. G., Hornbeck, D. A., & Miller, R. O. (1994). Plant, soil and water reference methods for the Western Region. WREP125.

Hawkes, G. R., Campbell, K.B., Ludwick, A. E., Millaway, R.M., & Thorup, R. M. (Eds.). (1985). Western fertilizer handbook. (7th ed.) The Interstate Printers and Publishers, Inc.

Jacobsen, J. S., Schaff, B. E., & Lorbeer, S. H. (1998). Soil test laboratory results and recommendation studies. Proceedings of the Great Plains Soil Fertility Conference. Denver, CO.

Little, T. M., & Hills, F. J. (1978). Agricultural experimentation--Design and analysis. New York, NY: John Wiley and Sons.

Miller, R. O., & Kotuby-Amacher, J. (1998). Western states proficiency testing program: Soil and plant analytical methods, Version 4.10. Colorado State University and Utah State University.

Miller, R. O., & Kotuby-Amacher, J. (1996). Western states proficiency testing program. Year-end report. Colorado State University and Utah State University.

Miller, R. O., & Kotuby-Amacher, J. (1996). Western states proficiency testing program. 1996 annual report supplement. Colorado State University and Utah State University.

Miller, R. O., & Kotuby-Amacher, J. (1995). Western states proficiency testing program. Year-end report. Colorado State University and Utah State University.

Ongley, E. D. (1996). Control of water pollution from agriculture. FAO Irrigation and Drainage Paper 55. Food and Agriculture Organization of the United Nations.

Mention of trade or program names within this paper does not constitute endorsement of any kind by the authors, the University of Idaho, or the University of Nevada, Reno.