Quality Assurance/Quality Control and Quality Assurance Project Plans Greg Thoma University of Arkansas IPEC Quality Assurance Officer



  • Quality Assurance/Quality Control
    QA is management of the data collection system to assure the validity of the data: organization & responsibilities.
    QC refers to technical activities that provide quantitative data quality information: data quality indicators, calibration procedures.
    Quality Assurance Project Plan: the document that provides the details of QA & QC for a particular project.

  • Quality?
    How good is good enough? 99.9% of the time? That would still mean:
    1 hour of unsafe drinking water a month
    22,000 checks deducted from the wrong account an hour
    16,000 pieces of lost mail an hour
    What does data quality mean? A universal standard? A relative measure?
    The goal of generators of environmental data should be to produce data of known quality to support environmental decisions:
    Is the site clean? Does the technology work?

  • Scientific Method
    Observe something interesting
    Invent a tentative theory or hypothesis consistent with the observations
    Use the hypothesis to make predictions
    Test the predictions with planned experiments
    Discrepancies between observation and theory? If yes, modify the hypothesis in light of the results and test again; if no, conclude the theory is true.
    How do you know if there are discrepancies? Uncertainty in the observed values reduces the ability to discriminate differences.

  • Data Life Cycle

  • Performance and Acceptance Criteria
    Performance criteria address the adequacy of information that is to be collected for the project (primary data).
    Acceptance criteria address the adequacy of existing information proposed for inclusion in the project (secondary, or literature, data).

  • Performance and Acceptance Criteria
    Effective data collection is rarely achieved in a haphazard fashion. The hallmark of all good projects, studies, and decisions is planned data collection.
    A systematic process leads to the development of acceptance or performance criteria that are based on the ultimate use of the data to be collected and that define the quality of data required to meet the final project objectives. (EPA QA/G-4A)

  • Performance and Acceptance Criteria
    The PAC development process helps to focus studies by encouraging experimenters to clarify vague objectives and explicitly frame their study questions.
    The development of PAC is a planning tool that can save resources by making data collection operations more resource-effective.

  • PAC Process at Project Level
    State the problem: oil-contaminated soil needs to be remediated.
    Identify the study questions: testable hypotheses rather than general objectives. We hypothesize that the contaminated soil, under nutrient-rich conditions, will exhibit the highest rates of degradation due to the history of hydrocarbon exposure these microbial communities have experienced.
    Establish study design constraints: budget, timeline, spatial extent, technical issues, etc. (7 factors, 2 levels, 4 reps, 8 sample times!)

  • PAC Process at Project Level
    Identify data requirements: what needs to be measured? Soil properties, nutrient status, contaminant level, etc.
    Specify information quality: may be qualitative (representativeness, comparability) or quantitative (DQIs: precision, bias, accuracy, and sensitivity).
    Strategy for information synthesis: how will the data be analyzed? ANOVA? Regression?
    Optimize the experimental design: get good-enough data at the lowest cost.

  • QA in Your Future?
    Intergovernmental Data Quality Task Force: Uniform Federal Policy for Implementing Environmental Quality Systems.
    A joint initiative between the EPA, DoD, and DOE to resolve data quality inconsistencies and/or deficiencies to ensure that:
    environmental data are of known and documented quality and suitable for their intended uses, and
    environmental data collection and technology programs meet stated requirements.
    And don't forget TQM, ISO 9000, & Six Sigma!

  • A Graded Approach
    The level of planning detail and documentation may:
    correspond to the importance of the project to its stakeholders (e.g., significant health risks associated),
    reflect the overall scope and budget of the effort (Superfund cleanup vs. proof-of-concept research), or
    be driven by the inherent technical complexity or the political profile of the project; complex or politically sensitive projects generally require more documentation.

  • Quality Assurance Project Plan
    Documentation of routine laboratory practice. Elements:
    A. Project Management
    B. Data Generation and Acquisition
    C. Assessment and Oversight
    D. Data Validation and Verification

  • Group A. Project Management
    Title Page
    Signature Approval Sheet
    Table of Contents
    Distribution List
    Project/Task Organization
    Problem Definition/Background
    Project/Task Description and Schedule
    Quality Objectives (linked to PAC)
    Special Training Requirements/Certification
    Documentation and Records

  • Performance Criteria for Phytoremediation Project

    Critical measurement           | Method      | Reference                 | Precision     | Bias    | Completeness | MDL
    TPH (in soil)                  | GC/FID      | EPA 3540C, EPA 8015       | 25%           | 70-130% | 90%          | 10 mg/kg
    PAH and biomarker              | GC/MS-SIM   | EPA 8270                  | 25%           | 70-130% | 90%          | 150 µg/kg
    Oil-degrader numbers (in soil) | MPN         | Haines et al. (1996)      | 0.3 log units | NA      | 90%          | 2 MPN/g
    Plant biomass (shoots, roots)  | Gravimetric | Salisbury and Ross (1985) | NA            | NA      | 90%          | 0.1 g
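
    Criteria like those in the table above can be applied mechanically during data review. A minimal sketch (the helper name and the QC numbers are mine, not part of the deck) that flags whether one QC batch meets the stated TPH limits:

    ```python
    # Screen one QC batch against the TPH (in soil) performance criteria:
    # precision <= 25% RPD, spike recovery within 70-130%, completeness >= 90%.
    # The criteria come from the table above; the batch numbers are invented.

    TPH_CRITERIA = {
        "max_rpd": 25.0,              # precision limit, % relative difference
        "recovery_range": (70.0, 130.0),  # acceptable bias window, % recovery
        "min_completeness": 90.0,     # % of planned samples that must be valid
    }

    def meets_criteria(rpd, recovery, n_valid, n_planned, criteria=TPH_CRITERIA):
        """Return pass/fail flags for one QC batch against the criteria."""
        lo, hi = criteria["recovery_range"]
        return {
            "precision_ok": rpd <= criteria["max_rpd"],
            "bias_ok": lo <= recovery <= hi,
            "completeness_ok": 100.0 * n_valid / n_planned >= criteria["min_completeness"],
        }

    flags = meets_criteria(rpd=12.0, recovery=85.0, n_valid=19, n_planned=20)
    print(flags)  # all three flags True: the batch meets the TPH criteria
    ```

    A batch failing any flag would trigger the corrective actions described later (stop analysis, correct the problem, reanalyze).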

  • Performance Criteria for Phytoremediation Project
    Acceptance criteria will be developed for published meteorological data and for data generated in other studies used in the modeling for this project.

    Non-critical measurement                         | Method        | Reference      | Precision | Bias    | Completeness | MDL
    Microbial community structure                    | PLFA by GC/MS | Kennedy (1994) | N/A       | N/A     | 90%          | N/A
    Plant-available Ca, Mg, Cu, Zn, and Na (in soil) | Mehlich 3 ICP | Donohue (1992) | 20%       | 90-110% | 90%          | 1 mg/kg
    Salinity                                         | —             | Rhoades (1996) | 10%       | N/A     | 90%          | 1 dS/m

  • Data Quality Indicators
    Bias: systematic factor causing error in one direction
    Precision: agreement of repeated measures of the same quantity
    Accuracy: combination of precision and bias
    Representativeness: how well the sample represents the population
    Comparability: how well two or more datasets may be combined
    Completeness: measure of the amount of valid data relative to the total planned collection of data
    Sensitivity: separating the signal from the noise
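
    Several of these indicators reduce to simple statistics on replicate measurements. A sketch (function names and data are mine, for illustration) computing precision as percent relative standard deviation and bias as the signed deviation of mean recovery from the true value:

    ```python
    import statistics

    def percent_rsd(values):
        """Precision: relative standard deviation of replicates, in %."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    def percent_bias(measured, true_value):
        """Bias: signed deviation of the mean from the true value, in %."""
        return 100.0 * (statistics.mean(measured) - true_value) / true_value

    # Five invented replicate spike results (mg/kg); true value is 10.
    replicates = [9.8, 10.1, 10.4, 9.9, 10.2]
    print(round(percent_rsd(replicates), 1))        # precision ~ 2.4%
    print(round(percent_bias(replicates, 10.0), 1)) # bias of +0.8%
    ```

    Accuracy, as the slide notes, combines the two: a method can be precise (low RSD) yet biased, or unbiased on average yet imprecise.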

  • Accuracy

  • Components of Variability

  • Representativeness
    Extremely important: NAAQS sampling next to a bus stop?? Stack gas monitoring requires isokinetic sampling.
    Sampling plan design: number and locations of samples; sample size, sampling method, and handling (grab vs. composite, preservation methods, etc.)

  • Group B. Measurement/Data Acquisition
    Experimental Design
    Sampling Methods Requirements
    Sample Handling and Custody Requirements
    Analytical Methods Requirements
    Quality Control Requirements
    Instrument/Equipment Testing, Inspection, and Maintenance Requirements
    Instrument Calibration and Frequency
    Inspection/Acceptance Requirements for Supplies
    Data Acquisition Requirements (Non-direct Measurements)
    Data Management

  • Sample Handling and Preservation

  • Quality Control Checks

  • Impact of Detection Limit and Contaminant Concentration on Reporting

  • MDL and False Positive Errors
    For 7 injections (6 degrees of freedom), t = 3.71 at α = 0.01.
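
    The single-laboratory MDL calculation multiplies the standard deviation of replicate low-level spike measurements by a Student's t value. A sketch using the deck's t = 3.71 for 7 injections (the spike results are invented):

    ```python
    import statistics

    def method_detection_limit(replicate_results, t_value=3.71):
        """MDL = t * s, where s is the standard deviation of at least 7
        replicate low-level spike measurements and t is the Student's t
        value used in the slides (3.71 for 7 injections, 6 df)."""
        if len(replicate_results) < 7:
            raise ValueError("need at least 7 replicates")
        return t_value * statistics.stdev(replicate_results)

    # Seven invented low-level spike results, mg/kg.
    spikes = [1.9, 2.1, 2.0, 2.2, 1.8, 2.1, 2.0]
    print(round(method_detection_limit(spikes), 2))  # ~ 0.5 mg/kg
    ```

    Note the check on the replicate count: using fewer than 7 replicates, or the wrong degrees of freedom for t, is one of the common MDL mistakes listed at the end of the deck.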

  • MDL and False Negative Errors

  • Group C. Assessment and Oversight
    Assessments and Response Actions: procedures for monitoring data quality as it is collected; actions to be taken in the event of failure to meet performance criteria (stop analysis, correct the problem, reanalyze).
    Reports to Management

  • Group D. Data Validation and Usability
    Data review, verification, and validation:
    Review: check for transcription or data reduction errors and completeness of QC information.
    Verification: were the procedures in the QAPP accurately followed?
    Validation: does the data meet the PAC specified in the QAPP?
    Reconciliation with user requirements: is the data suitable for use by decision makers?

  • Data Quality Assessment (DQA)
    The DQA process is a quantitative process based on statistical methods: does a set of data support a particular decision with an acceptable level of confidence?
    5 steps:
    1. Review the PAC and sampling design;
    2. Conduct a preliminary data review;
    3. Select the statistical test;
    4. Verify the assumptions of the statistical test; and
    5. Draw conclusions from the data.

  • Example Quality Control Charts
    RPD = 100 · |x₁ − x₂| / [(x₁ + x₂)/2]
    %R = 100 · (measured value) / (true value)
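
    RPD (relative percent difference, plotted for duplicate pairs) and %R (percent recovery, plotted for spikes and surrogates) are the quantities tracked on these control charts. Their standard definitions in code (the example values are invented):

    ```python
    def rpd(x1, x2):
        """Relative percent difference between duplicate measurements:
        the absolute difference as a percentage of the pair's mean."""
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    def percent_recovery(measured, true_value):
        """Percent recovery of a surrogate or spike. (For matrix spikes,
        subtract the unspiked background from `measured` first.)"""
        return 100.0 * measured / true_value

    print(round(rpd(10.0, 12.0), 1))        # duplicates 10 and 12 -> 18.2
    print(percent_recovery(8.5, 10.0))      # 8.5 found of a 10-unit spike -> 85.0
    ```

    Points falling outside the chart's control limits (e.g., the 70-130% recovery window used earlier for TPH) signal that the batch needs corrective action.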

  • Surrogate Recovery Example
    Decane recovery (%) plotted against QC batch number.
    A. Apblett, Novel materials for facile separation of petroleum products from aqueous mixtures via magnetic filtration.

  • Benefits of Up-front Systematic Planning
    Focused data requirements and an optimized design for data collection;
    Use of clearly developed work plans for collecting data in the field;
    A well-documented basis for data collection, evaluation, and use;
    Clearer statistical analysis of the final data;
    Sound, comprehensive QA Project Plans.

  • Benefits of QA
    Clear lines of responsibility
    Documented training and analytical competence
    Standard procedures to assure data comparability
    Catch and correct subtle mistakes/errors

  • Conclusions
    Why go through the hassle & headache?
    QA/QC is just good science.
    Documented, defensible data.
    It is cheaper to do it right the first time.
    Your next proposal will be better too!

  • Website
    Virtually all roads lead to: www.epa.gov/quality

  • Data Acquisition
    Experimental design: will the results allow assessment of the hypothesis?
    Sampling methods: is the sample representative? How is it preserved? Transported? Cross-contamination?

  • Data Acquisition (cont.)
    Analytical measurement methods
    Quality control: calibration; bias & precision; blanks, duplicates, spikes; instrument control

  • Project Management
    Organization & responsibilities
    Quality objectives & criteria: What do you want to know? (Hypothesis.) What are you measuring, and how good does the data need to be?
    Record keeping: lab, field, and instrument notebooks

  • QA Plan for Development of Models
    Project Description
    Model Description - Conceptual Model
    Computational Aspects
    Data Source/Quality/Input
    Output
    Model Validation
    Model Application

  • Common Mistakes in MDL Determination
    Miscalculation: incorrect standard deviation; incorrect degrees of freedom
    Insufficient replicates (need at least 7)
    Spike out of range
    Lowest standard too far from the MDL
    Using a method-based MDL without verification of its validity for the current matrix

    Speaker notes:
    Mention heavy reliance on EPA documents for this talk.
    Soil type, pH, N, P, contaminant level, moisture condition, temperature.
    These 3 steps are often iterative in nature. For example, if you decide regression is more appropriate than ANOVA, you will not require as many experimental replicates in the design.
    IPEC research projects have the express goal of delivering technology to industry; stakeholder confidence in the results needs to be high. But proof-of-concept projects need much less.
    Thinking through, and writing all this down, results in fewer mistakes when going to the field 5 hours away!
    In the document, the frequency and corrective action need to be addressed. If the instrument history is long, then I have approved as few as 1 in 50; but normal QC practice is ~10%; of course this adds cost to the project.
    This is a sensitivity issue; how are IDL and MDL defined?
    Briefly talk about calibration issues: range, regression vs. response factor.
    Even if you are just interpreting data from a contract laboratory, this is good to know because it can affect how you use the data.
    For df = 6, t = 3.7 for alpha = 0.01.
    The PQL is often taken as 3 to 5x the MDL.
    The difference between these three items is the level of rigor and independence of the check.
    Data review: the in-house examination to ensure that data have been recorded and transmitted correctly, e.g., by checking for transcription, calculation, reduction, and transformation errors, and by ensuring that there is a complete list of sample information available, such as sample matrix, blanks, duplicates, shipping dates, preservatives, holding times, etc.

    Data verification: is the confirmation by examination and provision of objective evidence that specified requirements have been fulfilled. This process can be performed internally by those generating the data or by an organization external to the analytical group or fixed laboratory. For example, how will you make sure the data are complete, correct, consistent and in compliance with technical requirements, established standards, and contractual requirements? Did the facility follow instructions, use the right SOPs, the right methods? Were activities accurately flagged?

    Data validation: is the confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use are fulfilled. This process is generally performed by someone external to the data generator. Data validation is an analyte- and sample-specific process to determine the analytical quality of a specific data set. Data validation criteria are based upon the measurement quality objectives developed in the project QA Project Plan. Data validation is the first step in data usability assessment.

    These activities are much more helpful if performed as data are collected; after the fact, no corrective action other than rejecting the data is completely defensible.

    The horizontal lines represent the acceptable limits of percent recovery from the laboratory matrix samples. In March, there was a large positive excursion in the concentration determined for the stock solution. The cause was found to be contamination of sample vials with oil mist from a nearby vacuum pump, a result of graduate students placing the vials conveniently close to the balance. New pre-washed vials solved this problem. The precision of the measurements was determined using triplicate extraction experiments.