A comparison of laboratory performances


How to check the reliability of analytical environmental services

Julie I. Einerson and Phyllis C. Pei
Sandia National Laboratories, Albuquerque, NM 87185

Analytical laboratory results often are used to determine whether a facility or process is in compliance with permit or regulatory conditions; therefore, credible data are an essential component of the decision-making process for an environmental manager.

At Sandia National Laboratories in Albuquerque, NM, accurate and precise data are essential for managing our environmental programs. Because we seek high credibility of laboratory data, a study was conducted to determine the management of measurement accuracy and precision by commercial laboratories. This article describes some procedures that will help identify those laboratories that provide precise and accurate analytical services on a consistent basis. Ten laboratories (A-J) were selected to participate in the competency tests. Laboratory qualifications were evaluated in six areas that had preassigned weights (Table 1): precision, accuracy, quality assurance documents, ability to customize reports, turnaround time, and price. Laboratories were scored and ranked by each category. Categories were then weighted and summed to produce a list of laboratories ranked in order of competence.

A set of written instructions was given to the laboratories, and written price quotes and quality assurance (QA) plans were obtained from each laboratory. The laboratories were given check samples that were spiked with six inorganic contaminants commonly found in wastewater discharges. These samples were sent along with routine wastewater samples.

Preparation of standards. Standard solutions containing 1000 ppm of a metal were purchased. The standards were certified for use in atomic absorption spectroscopy and, except for arsenic, were secondary standards traceable to the National Bureau of Standards. The standards purchased were cadmium (Cd) as cadmium nitrate, trivalent chromium (Cr(III)) as potassium dichromate, mercury (Hg) as mercuric chloride, arsenic (As) as arsenic trioxide in nitric acid, copper (Cu) as copper oxide in nitric acid, and lead (Pb) as lead nitrate. These stock solutions were used to prepare the check samples.

Ten batches of 2-L acidified check standards were prepared. All of the check standard solutions were poured into a 20-L container and thoroughly mixed. The final pH of the solution was tested to be < 2. The final concentrations of the metals (true values) after dilution were As, 1.0 ppm; Cd, 0.05 ppm; Cr(III), 0.05 ppm; Cu, 5.0 ppm; Pb, 5.0 ppm; and Hg, 0.5 ppm. Deionized distilled water blanks containing 5 mL of nitric acid per 1000 mL were also included as samples to detect any possible contaminant in the diluent. The water blanks submitted to the laboratories all contained less than detectable amounts of contaminants.

© 1988 American Chemical Society. Environ. Sci. Technol., Vol. 22, No. 10, 1988.
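As a rough illustration (not part of the article), the C1V1 = C2V2 dilution arithmetic behind these true values can be sketched as follows; the stock concentration and target concentrations come from the text, while the batch-by-batch calculation and function name are assumptions of mine:

```python
# Hypothetical sketch of the dilution arithmetic implied above:
# a 1000-ppm stock diluted into a 2-L batch, using C1*V1 = C2*V2.

STOCK_PPM = 1000.0   # certified stock concentration (from the text)
BATCH_ML = 2000.0    # one 2-L batch of check standard

def stock_volume_ml(target_ppm):
    """Volume of stock (mL) such that STOCK_PPM * v = target_ppm * BATCH_ML."""
    return target_ppm * BATCH_ML / STOCK_PPM

# True values after dilution, in ppm (from the text)
targets = {"As": 1.0, "Cd": 0.05, "Cr(III)": 0.05,
           "Cu": 5.0, "Pb": 5.0, "Hg": 0.5}

for metal, ppm in targets.items():
    print(f"{metal}: {stock_volume_ml(ppm):.2f} mL of stock per 2-L batch")
```

For example, reaching 1.0 ppm As in a 2-L batch takes 2 mL of 1000-ppm stock; the actual spiking volumes used in the study are not reported.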

Analytical results

One container was given to each of the 10 laboratories to analyze six check sample metal parameters: As, Cd, Cr(III), Cu, Pb, and Hg. The labs also were given another duplicate set of the samples to analyze for Cr(III) and Pb. Table 2 outlines the raw data reported by each laboratory.

Precision. The precision error was calculated as follows:

% error = [(X1 - X2)/2]/TV × 100%    (1)

where X1 and X2 are replicate values and TV is the true value.

The smaller the difference between X1 and X2, the smaller the % error and the better the precision. Laboratories G and J demonstrated 100% precision in both the chromium and lead analyses. In general, the laboratories performed better in the analyses of chromium than lead. Fifty percent of the labs were 100% precise in the analysis of chromium, whereas only two labs were 100% precise in the analyses of lead. The % error in precision for chromium and the % error in precision for lead were then combined by using the root mean square formula to provide an overall precision rate (Table 3). The smallest error was assigned the highest rank of 1. Except for lab A, most had good precision.
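Equation 1 and the root-mean-square combination can be sketched as follows. The replicate values are hypothetical, and the exact RMS normalization the authors used is an assumption (sqrt of the mean of the two squared errors):

```python
import math

def precision_error_pct(x1, x2, true_value):
    """Equation 1: % error = |(x1 - x2)/2| / TV * 100."""
    return abs((x1 - x2) / 2.0) / true_value * 100.0

def overall_precision_pct(err_cr, err_pb):
    """Combine the Cr and Pb precision errors by root mean square
    (assumed normalization, not stated explicitly in the article)."""
    return math.sqrt((err_cr ** 2 + err_pb ** 2) / 2.0)

# Hypothetical replicate results against the study's true values
err_cr = precision_error_pct(0.05, 0.05, 0.05)  # identical replicates: 0% error
err_pb = precision_error_pct(4.8, 5.2, 5.0)     # |(-0.4)/2| / 5.0 * 100 = 4% error

print(overall_precision_pct(err_cr, err_pb))
```

A 0% error corresponds to what the article calls "100% precision": the two replicates agree exactly.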

Accuracy. In order to assess the degree of agreement between the measured value and the true value, the accuracy for each lab was measured. The difference between the reported value and the true value, divided by the true value and multiplied by 100%, provides the accuracy error rate:

% error = [(X - TV)/TV] × 100%    (2)

where X is the measured value and TV is the true value.

Lab A showed an error of more than 1000-fold for arsenic. It is interesting that one lab, lab B, consistently erred on the negative side of the true value in all six parameters, whereas labs G and H consistently erred on the positive side of the true value (except in the case of arsenic). Table 3 illustrates the average % error from the six parameters for each of the laboratories. Labs D, G, and H all demonstrated an average accuracy error rate of less than 10%.

Precision and accuracy. These categories were weighted 40%, as accurate and precise numbers are essential for environmental data. All the elements of the sampling process, from the sampling techniques to sample preservation, chains of custody, sample blanks, laboratory analyses, and documentation, contribute to the reliability of data. It must be kept in mind that if an analytical laboratory does not produce reliable results, the entire sampling effort may be rendered useless. The ultimate cost will be much greater if resampling has to be conducted. More laboratory performances are measured in this manner now, as is evident in the EPA Contractor Laboratory Program Statement of Work. Lab A ranked tenth in precision and ninth in accuracy; lab G ranked first in precision and second in accuracy. One may conclude from this correlation that if a laboratory has poor precision in its measurement of replicate samples, then its accuracy should also be suspect.
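Equation 2 and the sign-bias observation above can be sketched as follows. Only the true values come from the study; the reported values are hypothetical stand-ins for a lab that, like lab B, errs low on every parameter:

```python
def accuracy_error_pct(measured, true_value):
    """Equation 2: signed % error = (X - TV) / TV * 100."""
    return (measured - true_value) / true_value * 100.0

true_vals = {"As": 1.0, "Cd": 0.05, "Cu": 5.0, "Pb": 5.0, "Hg": 0.5}   # ppm, from the study
reported  = {"As": 0.9, "Cd": 0.04, "Cu": 4.6, "Pb": 4.5, "Hg": 0.45}  # hypothetical lab

errors = {p: accuracy_error_pct(reported[p], true_vals[p]) for p in true_vals}

# Keeping the sign exposes systematic bias: a lab erring on the
# negative side everywhere yields all-negative errors.
biased_low = all(e < 0 for e in errors.values())

# Average error magnitude across parameters, as summarized in Table 3
avg_error = sum(abs(e) for e in errors.values()) / len(errors)
print(biased_low, round(avg_error, 1))
```

The signed form of equation 2 is what lets a reviewer distinguish random scatter from a consistent high or low bias.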

Quality assurance. The laboratories were requested to provide a copy of their quality assurance manual for review. The 11 criteria under evaluation were chosen for their importance in assuring the quality of analytical results. This list is by no means comprehensive but is meant to highlight critical aspects of a QA program. The total points available per criterion ranged from 1 to 4. The highest score earned the highest rank. Table 3 summarizes the total scores earned by each laboratory. The evaluation criteria included the description and frequency of blind check samples, method detection limits, method blanks, equipment calibration, matrix spike samples, check samples, replicate samples, internal audits, EPA CLP (Contractor Laboratory Program) participation, chain-of-custody use, and corrective action program.

A good quality assurance program is essential to the accuracy and precision of analytical results. If employees are expected to understand and follow specifications, it is important that a plan be documented and implemented. If a good QA plan is followed properly, the degree of laboratory errors should be minimized. For example, lab G ranked second in QA and accuracy and tied for first in the precision category. The expected correlation is that excellent QA plans, when executed, will result in good accuracy and precision in analyses, as seen with lab G. Lab H ranked first in QA and accuracy but fifth in precision, and lab E ranked third in QA but eighth in precision and sixth in accuracy.

The results here suggest that not all QA plans are practiced at the lab bench. Often, for the sake of standardization, the corporate division of a laboratory would produce the QA plan without input or feedback from the analysts. The QA procedures could be inappropriate or not practicable in the field laboratories. If no one verifies that the QA plan is being implemented, then the plan becomes a mere marketing tool; there will be no discipline or consistency in the quality of the lab's output.

There were instances in which a lab ranked low in QA and performed well in analysis. For example, lab C ranked ninth in QA but third in precision; lab J ranked eighth in QA but tied for first in precision. Possibly these laboratories did not explain their full QA programs well, the evaluation criteria were not adequate in scoring these labs, or the analysts have good techniques in spite of the QA plans. In this latter case, some additional time spent on refining a QA plan should correct the problem and result in a better score in this evaluation.

Turnaround time. The samples were shipped to the laboratories in ice chests on the same day via Federal Express. Although the laboratories agreed to report data within 14 days, few were able to meet the deadline. Those labs that met the turnaround time received a rank of 1. The lab providing data latest was ranked tenth (Tables 3 and 4). We found that the two laboratories (A and B) with the best turnaround time rank ninth and tenth, respectively, in the accuracy category. Rushing does not pay off.

Turnaround time is important because regulatory agencies may require sampling and reporting within a specified (often short) amount of time. Delay in reporting the results is usually frowned upon by regulatory agencies. The monitoring of process wastewater, for example, requires a short turnaround time so that parameters that are out of compliance may be quickly corrected; violation penalties for out-of-compliance parameters are based on per-day calculations. Also, it is difficult for the environmental manager to approach the production manager about a high metal value that was discharged three months ago and expect effective corrective action today. Therefore, the sooner the problem is identified, the easier it is to determine the source of the problem.

Ability to customize reports. The laboratories were requested in writing to provide the analytical data in the following manner. (These points were specific in the written instructions given to the 10 laboratories.)

• EPA-approved methods are to be used for sample analysis. The references to the methods are to accompany each parameter's analytical results.
• The analytical results will be reported with confidence limits for each parameter based on the techniques and equipment used by the laboratory.
• The dates the analyses were performed are to be reported for each parameter.
• All chain-of-custody forms are to be returned to Sandia National Laboratories after the samples reach the laboratory destination. All sample damages and abnormalities are to be noted on the forms and signed by the laboratory recipient.

Regulatory agencies often require different formats for the submission of environmental monitoring reports, and this facilitates the use of computers by the agencies to track permit compliance information from the laboratories. Therefore, the ability of the laboratories to tailor analytical reports to suit customers' requirements is essential. The instructions used in this evaluation were some of the requirements imposed by the city of Albuquerque. Those labs that followed all the instructions received a score of 10 and a rank of 1, and so forth (Tables 3 and 4). Labs G and H both followed all of the instructions.

Price. All the laboratories were given a specific set of analysis parameters in writing and asked to quote a price. They were asked to analyze for 33 parameters from 11 samples; routine wastewater samples and the check samples were mixed in for analysis. There was a wide range of prices charged by laboratories for the same amount of work. The price vs. performance relationship is further complicated by some labs charging 150% and 200% of the base price to compensate for the required turnaround time in the study.

However, there is not a definite correlation between the quality of the work and the price of the work. The laboratory that ranked lowest in the precision category, lab A, was also the second-highest priced vendor. The highest priced vendor, lab C, ranked ninth in the QA category. It is interesting to note that lab J ranked first in price but eighth in QA, and lab F ranked second in price but tenth in QA. This indicates that the cost of implementing a QA program is not reflected in the prices charged. Theoretically, a well-documented, well-executed QA plan should lower the long-term costs to the laboratory and, subsequently, the costs to the customer.

Overall ranking. The ranks for each laboratory were multiplied by the weight assigned for each category. The rank × weight values were totaled for each lab and divided by 6, the number of categories. The lowest weighted average ranked first, and so forth (Table 4). Labs H, G, and D ranked first, second, and third, respectively. Lab H ranked first in three categories, and lab G ranked first in two categories. Labs C and F, which ranked ninth and tenth in QA, respectively, tied for last in the overall ranking. No lab ranked last in more than one category. Our study confirms that there is a wide spectrum of differences in laboratory performances and that the customer should be aware of the differences before evaluating his or her data.

Problems

One interesting problem arose during this study. When a laboratory is aware that check samples are sent to examine its performance, more care may be taken to ensure the confidence of reported data. Laboratory A was given the same samples one week apart. In the first sample, the laboratory was aware of the purpose of the samples. In the second week the same sample was given to the laboratory along with other routine wastewater samples. The results are listed in Table 5.

Analysis of the first sample resulted in parameters that were very close to the true values. Without including Cu, the average accuracy error was 12%, whereas the average accuracy error for the blind check data was 270%. It appears that less care or attention was given to the blind samples. The blind As value of 1200 ppm would have been a major cause of concern if a regulatory agency submitted the sample. Based on this datum, the regulatory agency could have assessed a fine or issued a compliance order to the regulated industry. (In one of the authors' experience, this situation has occurred.)
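The comparison above amounts to averaging |% error| over the reported parameters for each submission. A sketch with hypothetical numbers (the 12% and 270% figures come from Table 5, which is not reproduced here):

```python
def accuracy_error_pct(measured, true_value):
    """Signed % error, as in equation 2."""
    return (measured - true_value) / true_value * 100.0

def mean_abs_error_pct(reported, true_vals, exclude=()):
    """Average |% error| over parameters, optionally excluding some (e.g., Cu)."""
    params = [p for p in true_vals if p not in exclude]
    return sum(abs(accuracy_error_pct(reported[p], true_vals[p]))
               for p in params) / len(params)

true_vals = {"As": 1.0, "Cd": 0.05, "Pb": 5.0, "Cu": 5.0}  # ppm, from the study
known = {"As": 1.1, "Cd": 0.05, "Pb": 4.8, "Cu": 5.5}  # lab knew it was a check sample
blind = {"As": 3.0, "Cd": 0.02, "Pb": 9.0, "Cu": 6.0}  # same sample, submitted blind

known_err = mean_abs_error_pct(known, true_vals, exclude=("Cu",))
blind_err = mean_abs_error_pct(blind, true_vals, exclude=("Cu",))
print(round(known_err, 1), round(blind_err, 1))
```

A large gap between the two averages, as in the study, is the signal that the lab treats identified check samples with more care than routine work.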

Conclusions

Ours was a small-scale study of commercial laboratories that provide services in the Albuquerque area. We were able to select at least three laboratories with standards acceptable to Sandia National Laboratories. Among our other conclusions and suggestions are the following.

Do not take reliability for granted.

One cannot blindly rely on commercial laboratories for the values of contaminants; data must be critically evaluated. Check samples must be submitted to verify the precision and accuracy of the laboratory data. Sample blanks and equipment blanks are necessary to verify the validity of the analyses. A good laboratory invites its customers to submit check samples.

Establish good communications. It is important to establish specific contacts and communications with the commercial laboratory. Good communications will ensure that the requirements of each party are understood. Sample collection and preservation techniques, sample media, labeling, and custodies must be understood by the laboratory, just as internal procedures of the laboratory should be understood by the customer. It is a good idea to physically audit the laboratory and review its adherence to the written QA program, the qualifications of its staff, and recordkeeping procedures.

Continually monitor laboratory performance. A laboratory's performance should be periodically checked and charted to ensure data reliability. Blind check samples and replicates should be submitted periodically, and the customer should inform the laboratory if it produces an unacceptable error in its accuracy or precision. Performance charts also may help to reveal trends or problems. The customer's responsibility is to demand quality work so that he or she can intelligently manage his or her programs, but the laboratory has an obligation to investigate questionable data and to take the appropriate corrective actions in response to customers' needs. The laboratory must provide quality work to bring the customers back.

Do not expect to pay more for better performance. As the demand for laboratory analytical work increases in response to environmental legislation, laboratory capacities and turnaround times will become critical. One may tend to choose whatever laboratory may be available at the time of need. It is important, however, that laboratory products be closely scrutinized to assure that accurate data are provided and meaningful decisions are made. Furthermore, our study demonstrates that there is no relationship between the price of analyses and the reliability of the data; therefore, price is not a good indicator of performance. The quality of the results will be the final cost to the customer.

Beware of comparing data from different laboratories. Many companies are instituting long-term (e.g., 30 years) groundwater-monitoring programs. If laboratory precision and accuracy are poor, one may be feeding meaningless analytical data into complex computer programs to perform groundwater modeling. The output will be disappointing because we have found so much variability in data from different laboratories. It is also unwise to try to compare data from different laboratories that were analyzed at different times. Laboratory personnel, equipment, and management often change, and these variables contribute to errors. Unless there is constant vigilance in the oversight of laboratory performance, the data may not make sense.

Maintain credibility with the laboratory. The customer must maintain credibility with the laboratory. He or she should not provide misleading information or try to trick the laboratory. For example, it should be stated up front that the customer plans to submit 5% or 10% of quality control samples with the environmental samples. The customer must continually provide feedback to the laboratory on its performance. Check samples must be carefully prepared and verified, where possible, by another laboratory. The pursuit of true sample values should be a partnership between the laboratory and the customer.

Acknowledgment

This article has been reviewed for suitability as an ES&T feature by Richard G. Zepp, EPA Environmental Research Laboratory, Athens, GA 30613, and by Lawrence H. Keith, Radian Corporation, Austin, TX 78720-1088.

Additional readings

Gautier, M. A., et al. Quality Assurance for Health and Environmental Chemistry; Los Alamos National Laboratory: Los Alamos, NM, 1986.

Meyerhein, R. Analytical and Sampling Techniques; New Mexico Scientific Laboratory Division.

Rice, C.; Brinkman, J.; Muller, D. Reliability of Chemical Analysis of W
