Quality Assurance and Quality Assurance Project Plans



 Quality Assurance Project Plans
           Greg Thoma
       University of Arkansas
   IPEC Quality Assurance Officer
 Quality Assurance/Quality Control
QA is management of the data collection system
to assure validity of the data.
  Organization & responsibilities
QC refers to technical activities which provide
quantitative data quality information.
  Data quality indicators, Calibration procedures.
Quality Assurance Project Plan
  Document that provides the details of QA & QC for a
  particular project

 How good is "good enough"? Is 99.9% of the time good enough?
   1 hour of unsafe drinking water per month
   22,000 checks deducted from the wrong account every hour
   16,000 pieces of mail lost every hour
 What does data quality mean?
   Universal standard? Relative measure?
 The goal of generators of environmental data should
 be to produce data of known quality to support
 environmental decisions
   Is the site clean?
   Does the technology work?
Scientific Method

 1. Observe something.
 2. Invent a tentative theory or hypothesis consistent with the
    observations.
 3. Use the hypothesis to make predictions.
 4. Test the predictions with planned experiments.
 5. Are there discrepancies between experiment and theory?
     • Yes: modify the hypothesis in light of the results and repeat.
     • No: conclude the theory is true.

 How do you know if there are discrepancies? Uncertainty in observed
 values reduces the ability to discriminate differences.
Data Life Cycle
Performance and Acceptance Criteria

  Performance criteria address the
  adequacy of information that is to be
  collected for the project.
    “Primary” data.
  Acceptance criteria address the
  adequacy of existing information
  proposed for inclusion in the project.
    “Secondary” (literature) data.
Performance and Acceptance Criteria

  Effective data collection is rarely achieved in a
  haphazard fashion.
    The hallmark of all good projects, studies, and
    decisions is a planned data collection.
  A systematic process leads to the development of
  acceptance or performance criteria that:
     are based on the ultimate use of the data to be collected, and
     define the quality of data required to meet the final
     project objectives.
Performance and Acceptance Criteria

  The PAC development process helps to
  focus studies by encouraging experimenters
  to clarify vague objectives and explicitly
  frame their study questions.
  The development of PAC is a planning tool
  that can save resources by making data
  collection operations more resource-effective.
PAC Process at Project Level

 State the problem
   Oil contaminated soil needs to be remediated
 Identify the study questions
   Testable hypotheses rather than general objectives
     • We hypothesize that the contaminated soil, under nutrient rich
       conditions, will exhibit the highest rates of degradation due to
       the history of hydrocarbon exposure these microbial
       communities have experienced.
 Establish study design constraints
   Budget, timeline, spatial extent, technical issues, etc.
     • 7 factors, 2 levels, 4 reps, 8 sample times!!!!
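The factorial explosion behind those exclamation points is easy to quantify; a quick sketch using the slide's numbers (the design itself is hypothetical):

```python
# Full-factorial design size for the slide's example:
# 7 factors at 2 levels, 4 replicates, 8 sampling times.
factors, levels, reps, times = 7, 2, 4, 8

treatment_combinations = levels ** factors           # 2^7 = 128
total_samples = treatment_combinations * reps * times

print(treatment_combinations)  # 128
print(total_samples)           # 4096
```

Over four thousand analyses before a single budget constraint is applied, which is why the optimization step exists.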
PAC Process at Project Level
 Identify data requirements
   What needs to be measured?
   Soil properties, nutrient status, contaminant level, etc.
 Specify information quality
   May be qualitative
     • Representativeness, comparability
   or quantitative
     • DQI: precision, bias, accuracy, and sensitivity
 Strategy for information synthesis
    How will it be analyzed? ANOVA? Regression?
 Optimize experimental design
    Get "good enough" data at the lowest cost
QA in Your Future?
 Intergovernmental Data Quality Task Force:
    Uniform Federal Policy for Implementing
    Environmental Quality Systems
    Joint initiative between the EPA, DoD, and DOE to
    resolve data quality inconsistencies and/or deficiencies
    to ensure that:
     • Environmental data are of known and documented quality and
       suitable for their intended uses, and
     • Environmental data collection and technology programs meet
       stated requirements.
 And don't forget TQM, ISO9000, & Six Sigma!
A Graded Approach
 The level of planning detail and documentation may:
    correspond to the importance of the project and its intended use
      • e.g. significant health risks associated with the decision
    reflect the overall scope and budget of the effort
      • Superfund cleanup vs. proof-of-concept research
    be driven by the inherent technical complexity or the
    political profile of the project
      • complex or politically sensitive projects generally require
        more detailed planning and documentation
Quality Assurance Project Plan

 Documentation of routine laboratory operations, organized
 into four groups:
   A. Project Management
   B. Data Generation and Acquisition
   C. Assessment and Oversight
   D. Data Validation and Verification
            Group A.
      Project Management

Title Page
Signature Approval Sheet
Table of Contents
Distribution List
Project/Task Organization
Problem Definition/Background
Project/Task Description and Schedule
Quality Objectives (linked to PAC)
Special Training Requirements/Certification
Documentation and Records
  Performance Criteria for
  Phytoremediation Project

Critical measurement          | Method    | Reference               | Precision     | Bias    | Completeness | MDL
TPH (in soil)                 | GC/FID    | EPA 3540c, EPA 8015     | 25%           | 70-130% | 90%          | 10 mg/kg
PAH and Biomarker             | GC/MS-SIM | EPA 8270                | 25%           | 70-130% | 90%          | 150 mcg/kg
Numbers (in soil)             | MPN       | Haines et al. (1996)    | 0.3 log units | NA      | 90%          | 2 MPN/g
Plant Biomass (Shoots, Roots) |           | Salisbury & Ross (1985) | NA            | NA      | 90%          | 0.1 g
 Performance Criteria for
 Phytoremediation Project
Non-critical measurement                        | Method        | Reference      | Precision | Bias    | Completeness | MDL
Community                                       | PLFA by GC/MS | Kennedy (1994) | N/A       | N/A     | 90%          | N/A
Plant available Ca, Mg, Cu, Zn and Na (in soil) | Mehlich 3 ICP | Donohue (1992) | 20%       | 90-110% | 90%          | 1 mg/kg
Salinity                                        | Salinity      |                | 10%       | N/A     | 90%          | 1 dS/m

      Acceptance criteria will be developed for published meteorological data and data
              generated in other studies used in the modeling for this project.
Data Quality Indicators

 Bias: systematic factor causing error in one direction
 Precision: agreement of repeated measures of the
 same quantity
 Accuracy: combination of precision and bias
 Representativeness: how well the sample
 represents the population
 Comparability: how well two or more datasets
 may be combined
 Completeness: ratio of valid data obtained to the
 total planned collection of data.
 Sensitivity: separating the signal from the noise
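Several of these quantitative DQIs can be computed directly from replicate data; a minimal sketch with made-up measurements of a check standard of known concentration:

```python
import statistics

# Hypothetical replicate results for a check standard whose
# certified ("true") concentration is 100 mg/kg.
true_value = 100.0
replicates = [98.2, 101.5, 99.8, 102.1, 97.9, 100.4, 99.1]

mean = statistics.mean(replicates)
# Precision: relative standard deviation of the replicates.
rsd = 100 * statistics.stdev(replicates) / mean
# Bias: systematic deviation of the mean from the true value.
bias = 100 * (mean - true_value) / true_value
# Completeness: valid results obtained vs. results planned
# (one planned replicate assumed lost here).
planned, valid = 8, len(replicates)
completeness = 100 * valid / planned

print(f"RSD = {rsd:.1f}%, bias = {bias:+.1f}%, completeness = {completeness:.0f}%")
```

Representativeness and comparability, by contrast, are judged from the sampling design rather than computed from the numbers.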
Components of Variability

 Extremely important
   NAAQS sampling next to a bus stop??
   Stack gas monitoring – isokinetic sampling
 Sampling plan design
   Number and locations
   Size and sampling method and handling
    • Grab vs. composite, preservation methods, etc.
         Group B.
 Measurement/Data Acquisition

Experimental Design
Sampling Methods Requirements
Sample Handling and Custody Requirements
Analytical Methods Requirements
Quality Control Requirements
Instrument/Equipment Testing, Inspection, and
  Maintenance Requirements
Instrument Calibration and Frequency
Inspection/Acceptance Requirements for Supplies
Data Acquisition Requirements (Non-direct Measurements)
Data Management
Sample Handling and Preservation
Quality Control Checks
Impact of Detection Limit and
Contaminant Concentration on Reporting
MDL and False Positive Errors

                  MDL = t × s, where s is the standard deviation of
                  replicate low-level measurements.
                  For 7 injections, t = 3.71.
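The single-laboratory MDL arithmetic can be sketched in a few lines, using the t value quoted on this slide and hypothetical replicate results:

```python
import statistics

# MDL = t * s: s is the standard deviation of n low-level spike
# replicates, t is the Student's t value for n - 1 degrees of
# freedom (the slide's value for 7 injections is used here).
replicate_results = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.49]  # hypothetical, mg/L
t_7 = 3.71

s = statistics.stdev(replicate_results)
mdl = t_7 * s
print(f"s = {s:.3f} mg/L, MDL = {mdl:.3f} mg/L")
```

Note that the spike level itself must be chosen close to the expected MDL, which is exactly where the "common mistakes" slide later picks up.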
MDL and False Negative Errors
         Group C.
  Assessment and Oversight

Assessments and Response Actions
  Procedures for monitoring data quality as it is generated
  Actions to be taken in the event of failure to
  meet performance criteria
   • Stop analysis, correct problem, reanalyze
Reports to Management
         Group D.
Data Validation and Usability
Data review, verification, and validation
    • Check for transcription or data reduction errors and
      completeness of QC information.
    • Were the procedures in the QAPP accurately followed?
    • Does the data meet the PAC specified in the QAPP?
Reconciliation with user requirements
  Is the data suitable for use by decision makers?
Data Quality Assessment (DQA):

 The DQA process is a quantitative process
      Based on statistical methods
      Does set of data support a particular decision with an acceptable
      level of confidence?
 5 Steps:
     Review the PAC and sampling design;
     Conduct a preliminary data review;
     Select the statistical test;
     Verify the assumptions of the statistical test; and
     Draw conclusions from the data.
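Steps 3-5 can be illustrated with a one-sample t test against a hypothetical cleanup action level (the samples and the 500 mg/kg goal below are invented for illustration):

```python
import statistics
from math import sqrt

# Does the site mean TPH concentration fall below the action level
# with acceptable confidence?
action_level = 500.0                          # mg/kg, assumed cleanup goal
samples = [412.0, 455.0, 390.0, 478.0, 430.0, 401.0, 466.0, 440.0]

n = len(samples)
mean = statistics.mean(samples)
s = statistics.stdev(samples)
t_stat = (mean - action_level) / (s / sqrt(n))

# One-sided critical value t(0.05, df=7) is about -1.895; a t statistic
# below it supports "mean is below the action level" at 95% confidence.
print(f"mean = {mean:.1f} mg/kg, t = {t_stat:.2f}")
print("site clean" if t_stat < -1.895 else "cannot conclude clean")
```

Verifying the test's assumptions (step 4) would mean checking the data for approximate normality and independence before trusting this conclusion.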
Example Quality Control Charts

 RPD = |x1 − x2| / [(x1 + x2)/2] × 100%    (duplicate results x1, x2)

 %R = (spiked result − unspiked result) / amount spiked × 100%
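Both QC statistics are simple to compute and plot batch by batch; a sketch with invented duplicate and matrix-spike results:

```python
# Standard duplicate and matrix-spike checks for QC charting.
def rpd(x1: float, x2: float) -> float:
    """Relative percent difference between duplicate results."""
    return abs(x1 - x2) / ((x1 + x2) / 2) * 100

def percent_recovery(spiked: float, unspiked: float, added: float) -> float:
    """Percent recovery of a matrix spike."""
    return (spiked - unspiked) / added * 100

# Hypothetical results: duplicates of 105 and 95 mg/kg; a sample at
# 52 mg/kg spiked with 100 mg/kg that measured 148 mg/kg.
print(round(rpd(105.0, 95.0), 1))                      # 10.0
print(round(percent_recovery(148.0, 52.0, 100.0), 1))  # 96.0
```

Against the phytoremediation criteria earlier (25% precision, 70-130% bias), both of these hypothetical checks would pass.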
Surrogate Recovery Example

 [Control chart: decane surrogate recovery (%) plotted against
 QC batch number]
 A.Apblett , “Novel materials for facile separation of petroleum products from aqueous
                           mixtures via magnetic filtration”
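Limits for a surrogate-recovery chart like this one are typically derived from historical data; a sketch with hypothetical recoveries, assuming Shewhart-style warning limits at mean ± 2s and control limits at mean ± 3s:

```python
import statistics

# Hypothetical decane surrogate recoveries from past QC batches.
recoveries = [92.0, 88.0, 95.0, 90.0, 85.0, 93.0, 89.0, 91.0, 87.0, 94.0]

mean = statistics.mean(recoveries)
s = statistics.stdev(recoveries)
warning = (mean - 2 * s, mean + 2 * s)   # ~95% of in-control points
control = (mean - 3 * s, mean + 3 * s)   # points outside trigger action

print(f"mean = {mean:.1f}%")
print(f"warning limits: {warning[0]:.1f}% to {warning[1]:.1f}%")
print(f"control limits: {control[0]:.1f}% to {control[1]:.1f}%")

# Batches whose recovery falls outside the control limits would be
# stopped, investigated, and reanalyzed (Group C response actions).
out_of_control = [r for r in recoveries if not control[0] <= r <= control[1]]
```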
Benefits of Up-front Systematic Planning
Focused data requirements and optimized
design for data collection;
Use of clearly developed work plans for
collecting data in the field;
A well documented basis for data
collection, evaluation, and use;
Clearer statistical analysis of the final data;
Sound, comprehensive QA Project Plans.
Benefits of QA

 Clear lines of responsibility
 Documented training and analytical procedures
 Standard procedures to assure data quality
 Catch and correct subtle mistakes/errors

 Why go through the hassle & headache?
   QA/QC is just good science.
   Documented, defensible data.
   It is cheaper to do it right the first time.
   Your next proposal will be better too!

 Virtually all roads lead to:
Data Acquisition

 Experimental Design
   Will the results allow assessment of the hypothesis?
 Sampling Methods
   Is it representative?
   How is it preserved? Transported?
   Cross contamination
Data Acquisition (cont)

 Analytical Measurement Methods
   Quality Control
   Bias & Precision
    • Blanks, Duplicates, Spikes
   Instrument Control
Project Management

 Organization & Responsibilities
 Quality Objectives & Criteria
   What do you want to know? (Hypothesis)
   What are you measuring and how "good" the
   data needs to be.
 Record Keeping
   Lab, Field, Instrument notebooks
QA Plan for Development of Models

   Project Description
   Model Description - Conceptual Model
   Computational Aspects
   Data Source/Quality/Input-Output
   Model Validation
   Model Application
     Common Mistakes in MDL

  Incorrect standard deviation
  Incorrect degrees of freedom
  Insufficient replicates (need 7)
  Spike out of range
  Lowest standard too far from MDL
  Using method-based MDL w/o verification
  of validity for current matrix
