Five external quality-assurance programs were operated by the U.S. Geological Survey for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) from 2000 through 2001 (study period): the intersite-comparison program, the blind-audit program, the field-audit program, the interlaboratory-comparison program, and the collocated-sampler program. Each program is designed to measure specific components of the total error inherent in NADP/NTN wet-deposition measurements.
The intersite-comparison program assesses the variability and bias of pH and specific-conductance determinations made by NADP/NTN site operators with respect to accuracy goals. The accuracy goals are statistically derived from the median of all measurements obtained in each of four intersite-comparison studies. The percentage of site operators responding on time who met the pH accuracy goals ranged from 84.2 to 90.5 percent. In the same four studies, 88.9 to 99.0 percent of the site operators met the accuracy goals for specific conductance.
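The median-based accuracy check described above can be sketched as follows. This is a minimal illustration, not the program's actual procedure: the tolerance value, site identifiers, and pH values are all hypothetical, and the report does not state the exact acceptance window used.

```python
import statistics

def meets_accuracy_goal(values, tolerance):
    """Flag which operators fall within +/- tolerance of the study median.

    values: dict mapping operator/site id -> measured pH (or conductance)
    tolerance: half-width of the acceptance window (hypothetical value below)
    """
    target = statistics.median(values.values())
    return {op: abs(v - target) <= tolerance for op, v in values.items()}

# Hypothetical pH determinations from one intersite-comparison study
ph = {"site_a": 4.31, "site_b": 4.35, "site_c": 4.52, "site_d": 4.33}
flags = meets_accuracy_goal(ph, 0.1)
# flags -> {'site_a': True, 'site_b': True, 'site_c': False, 'site_d': True}
```

The fraction of `True` entries corresponds to the "percentage of site operators that met the accuracy goals" reported for each study.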
The blind-audit program evaluates the effects of routine sample handling, processing, and shipping on the chemistry of weekly precipitation samples. The blind-audit data for the study period indicate that sample handling introduced a small amount of sulfate contamination and slight changes in the hydrogen-ion content of the precipitation samples. The magnitudes of the paired differences are not environmentally significant to NADP/NTN data users.
The field-audit program (also known as the 'field-blank program') was designed to measure the effects of field exposure, handling, and processing on the chemistry of NADP/NTN precipitation samples. The results indicate potential low-level contamination of NADP/NTN samples with calcium, ammonium, chloride, and nitrate. The field-audit data detected less sodium contamination than in previous years. Statistical analysis of the paired differences shows that contaminant ions are entrained into the solutions from the field-exposed buckets, but the positive bias that results from the minor amount of contamination appears to affect the analytical results by less than 6 percent.
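The paired-difference bias estimate described above can be sketched roughly as follows. This is an illustrative calculation only: the function name, the choice of the median as the summary statistic, and the calcium concentrations are assumptions, not the report's actual data or formula.

```python
import statistics

def percent_bias(exposed, unexposed):
    """Median paired difference as a percent of the median unexposed value.

    exposed, unexposed: paired concentrations (mg/L) for the same solution
    with and without contact with a field-exposed bucket (hypothetical data).
    A positive result suggests contamination from the bucket.
    """
    diffs = [e - u for e, u in zip(exposed, unexposed)]
    return 100.0 * statistics.median(diffs) / statistics.median(unexposed)

# Hypothetical calcium concentrations, mg/L
exposed   = [0.106, 0.102, 0.108, 0.104]
unexposed = [0.100, 0.100, 0.103, 0.101]
bias = percent_bias(exposed, unexposed)  # ~4 percent positive bias
```

A small positive value like this, under the 6-percent figure cited above, would be consistent with minor contamination that does not materially affect the analytical results.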
An interlaboratory-comparison program is used to estimate the analytical variability and bias of participating laboratories, especially the NADP Central Analytical Laboratory (CAL). Statistical comparison of the analytical results of participating laboratories implies that analytical data from the various monitoring networks can be compared. Bias was identified in the CAL data for ammonium, chloride, nitrate, sulfate, hydrogen-ion, and specific-conductance measurements, but the absolute value of the bias was less than the analytical minimum reporting limits for all constituents except ammonium and sulfate. Control charts show brief time periods when the CAL's analytical precision for sodium, ammonium, and chloride was not within the control limits. Data for the analysis of ultrapure deionized-water samples indicated that the laboratories are maintaining good control of laboratory contamination. Estimated analytical precision among the laboratories indicates that the magnitudes of chemical-analysis errors are not environmentally significant to NADP data users.
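The control-chart screening mentioned above can be sketched with a simple Shewhart-style check: limits set at the baseline mean plus or minus three standard deviations, with measurements outside the limits flagged. The limit rule, the sodium values, and the function names are assumptions for illustration; the report does not specify the CAL's exact control-chart construction.

```python
import statistics

def control_limits(baseline):
    """Shewhart-style limits: mean +/- 3 standard deviations of a baseline run."""
    mean = statistics.mean(baseline)
    s = statistics.stdev(baseline)
    return mean - 3 * s, mean + 3 * s

def out_of_control(series, lo, hi):
    """Indices of measurements falling outside the control limits."""
    return [i for i, x in enumerate(series) if not lo <= x <= hi]

# Hypothetical weekly sodium determinations (mg/L) of a reference solution
baseline = [0.50, 0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51]
lo, hi = control_limits(baseline)
flagged = out_of_control([0.50, 0.49, 0.57, 0.51], lo, hi)  # third point flagged
```

Brief runs of flagged points, like those noted for sodium, ammonium, and chloride, indicate temporary losses of analytical precision rather than sustained problems.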
Overall precision of the precipitation-monitoring system used by the NADP/NTN was estimated by evaluation of samples from collocated monitoring sites at CA99, CO08, and NH02. Precision, defined as the median of the absolute percent differences (MAE), was estimated to be approximately 10 percent or less for calcium, magnesium, sodium, chloride, nitrate, sulfate, specific conductance, and sample volume. The MAE values for ammonium and hydrogen-ion concentrations were less than 10 percent at CA99 and NH02, but were nearly 20 percent for ammonium concentration and about 17 percent for hydrogen-ion concentration at CO08.
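The collocated-sampler precision statistic described above can be sketched as follows. The normalization convention (percent difference relative to the pair mean) and the sulfate concentrations are assumptions for illustration; the report's exact formula may differ.

```python
import statistics

def median_abs_pct_diff(a, b):
    """Median of the absolute percent differences between paired samples.

    a, b: paired weekly concentrations from the original and collocated
    samplers. Each percent difference is taken relative to the pair mean,
    i.e. 100 * |x - y| / ((x + y) / 2).
    """
    pct = [200.0 * abs(x - y) / (x + y) for x, y in zip(a, b)]
    return statistics.median(pct)

# Hypothetical sulfate concentrations (mg/L) from a collocated pair
original   = [1.20, 0.85, 2.10, 1.45]
collocated = [1.10, 0.90, 2.00, 1.50]
mapd = median_abs_pct_diff(original, collocated)  # roughly 5 percent
```

A value near 5 percent would fall within the "approximately 10 percent or less" precision reported for sulfate and most other analytes.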
As in past years, the variability in the collocated-site data for sam
- Digital Object Identifier: 10.3133/sir20045034
- Source: USGS Publications Warehouse (indexId: sir20045034)