New Jersey Department of Environmental Protection
Site Remediation Program
DATA QUALITY ASSESSMENT AND
DATA USABILITY EVALUATION
TECHNICAL GUIDANCE
Version 1.0
April 2014
Preamble
The results of analyses performed on environmental matrices are used to determine if
remediation is needed. Because of the nature of environmental matrices, limitations of analytical
methods, characteristics of analytes, and inherent error associated with any sampling and
analysis procedure, the results of environmental analysis may contain an element of uncertainty
and in some cases may be significantly biased, and therefore may not be representative of the
actual concentrations of the analytes in the environmental matrices. Thus, an evaluation of the
quality of the analytical data in relation to the intended use is important in order for the
investigator to make decisions which are supported by data of known and sufficient quality.
There are many ways to evaluate the quality of analytical data in terms of precision, accuracy,
representativeness, comparability, completeness and sensitivity in relation to the intended use
of the data. Precision, accuracy, representativeness, comparability, completeness and
sensitivity are collectively referred to as the PARCCS parameters. This guidance document
describes a NJDEP-accepted, two-step process for data evaluation. The first step in the process
consists of an assessment of data quality. The second step is an evaluation to determine
whether the data can be used to support the decisions that will be made using that data. Use of
this guidance provides consistency in evaluation and presentation of data quality information
that will facilitate review. If an alternative process is used, it should be documented in order to
explain the rationale behind it, and it may require a commitment of significant resources to
demonstrate that the data are of known and sufficient quality and are usable relative to the
intended purpose.
To assist the investigator in obtaining analytical data of known quality, the Work Group
developed the Data of Known Quality Protocols (DKQPs). The DKQPs include specific
laboratory Quality Assurance and Quality Control (QA/QC) criteria that produce analytical data
of known and documented quality for analytical methods. When Data of Known Quality are
achieved for a particular data set, the investigator will have confidence that the laboratory has
followed the DKQPs, has described nonconformances, if any, and the investigator has adequate
information to make judgments regarding data quality.
The Data of Known Quality performance standards are given in Appendix B of the NJDEP Site
Remediation Program, Data of Known Quality Protocols Technical Guidance, April 2014. These
protocols will enhance the ability of the investigator to readily obtain from the laboratory the
necessary information to identify and document the precision, accuracy and sensitivity of data.
Preamble .......................................................................................................................... i
1. Intended Use of Guidance Document ........................................................................ 1
2. Purpose ...................................................................................................................... 2
3. Document Overview ................................................................................................... 5
4. Procedures ................................................................................................................. 6
Figure 1: DQA and DUE Flow Chart............................................................................... 7
4.1 Data Quality Objectives ...................................................................................... 8
4.2 Uncertainty in Analytical Data ............................................................................ 9
4.3 Types of Analytical Data .................................................................................... 9
4.4 PARCCs Parameters ....................................................................................... 10
4.4.1 Precision .................................................................................................... 10
4.4.2 Accuracy .................................................................................................... 11
4.4.3 Representativeness ................................................................................... 12
4.4.4 Comparability ............................................................................................. 12
4.4.5 Completeness ............................................................................................ 13
4.4.6 Sensitivity .................................................................................................. 13
4.5 Data Quality Assessment ................................................................................. 14
4.5.1 Batch Quality Control versus Site Specific Quality Control ........................ 15
4.5.2 Evaluating Significant Quality Control Variances ....................................... 16
4.5.3 Poorly Performing Compounds .................................................................. 16
4.5.4 Common Laboratory Contaminants ........................................................... 17
4.5.5 Bias ........................................................................................................... 17
4.6 Data Usability Evaluation ................................................................................. 17
4.6.1 Evaluation of Bias ...................................................................................... 20
4.6.2 General Quality Control Information .......................................................... 23
4.6.2.1 Chain of Custody Forms .......................................................................... 23
4.6.2.2 Sample Preservation Holding Times and Handling Time ........................ 23
4.6.2.3 Equipment, Trip and Field Blanks ............................................................ 26
4.6.2.4 Field Duplicates ....................................................................................... 28
4.6.3 Laboratory Quality Control Information ...................................................... 30
4.6.3.1 Data of Known Quality Conformance/Nonconformance
Summary Questionnaire .......................................................................... 31
4.6.3.2 Reporting Limits....................................................................................... 31
4.6.3.3 Method Blanks ......................................................................................... 33
4.6.3.4 Laboratory Duplicates .............................................................................. 34
4.6.3.5 Surrogates ............................................................................................... 34
4.6.3.6 Laboratory Control Samples (LCS) .......................................................... 37
4.6.3.7 Matrix Spike/Matrix Spike Duplicates and Matrix Spike/Matrix Duplicate 38
4.6.3.8 Internal Standards ................................................................................... 41
4.6.3.9 Serial Dilutions (ICP and ICP/MS) ........................................................... 42
4.6.3.10 Interference Check Solution .................................................................. 42
4.6.3.11 Matrix Spikes and Duplicates ................................................................ 44
4.6.3.12 Internal Standards for ICP/MS (for Metals) ............................................ 45
4.6.4 Using Multiple Lines of Evidence to Evaluate Laboratory QC Information . 45
4.6.5 Data Usability Evaluations for Non-DKQ Analytical Data........................... 47
4.6.6 Data Usability Evaluations Using Multiple Lines of Evidence
from DQOs and the CSM .......................................................................... 49
4.6.7 Factors to be Considered During Data Usability Evaluations .................... 50
4.6.8 Documentation of Data Quality Assessments and Data Usability
Evaluations ............................................................................................... 52
REFERENCES .............................................................................................................. 54
Appendix A Supplemental Information on Data Quality Objectives and Quality
Assurance Project Plans .......................................................................... 56
Appendix B QC Information Summary and Measurement Performance Criteria .......... 58
Appendix C QC Information to be Reviewed During Data Quality Assessments ........ 66
Appendix D Data Quality Assessment Worksheets and Summary of DKQ
Acceptance Criteria .................................................................................. 71
Appendix E Evaluating Significant QA/QC Variances .................................................. 85
Appendix F Poorly Performing Compounds ................................................................. 92
Appendix G Common Laboratory Contaminants ......................................................... 95
Appendix H Range of Data Usability Evaluation Outcomes ......................................... 96
Appendix I Data Usability Evaluation Worksheet ....................................................... 105
Appendix J Surrogates and Internal Standards .......................................................... 109
Appendix K Supplemental Examples Using Multiple Lines of Evidence ..................... 113
Appendix L: Glossary .................................................................................................. 119
Appendix M: List of Acronyms ..................................................................................... 130
1. Intended Use of Guidance Document
This guidance is designed to help the person responsible for conducting remediation to comply
with the Department's requirements established by the Technical Requirements for Site
Remediation (Technical Rules), N.J.A.C. 7:26E. Because this guidance will be used by many
different people who are involved in the remediation of a contaminated site, such as Licensed
Site Remediation Professionals (LSRPs), Non-LSRP environmental consultants and other
environmental professionals, the generic term “investigator” will be used to refer to any person
that uses this guidance to remediate a contaminated site on behalf of a remediating party,
including the remediating party itself.
The procedures for a person to vary from the technical requirements in regulation are outlined in
the Technical Rules at N.J.A.C. 7:26E-1.7. Variances from a technical requirement or guidance
must be documented and be adequately supported with data or other information. In applying
technical guidance, the Department recognizes that professional judgment may result in a range
of interpretations on the application of the guidance to site conditions.
This guidance supersedes previous Department guidance issued on this topic. Technical
guidance may be used immediately upon issuance. However, the NJDEP recognizes the
challenge of using newly issued technical guidance when a remediation affected by the
guidance may have already been conducted or is currently in progress. To provide for the
reasonable implementation of new technical guidance, the NJDEP will allow a 6-month “phase-
in” period between the date the technical guidance is issued final (or the revision date) and the
time it should be used.
This guidance was prepared with stakeholder input. The following people were on the
committee who prepared this document:
Greg Toffoli, Chair (Department), Office of Data Quality
Nancy Rothman, Ph.D., New Environmental Horizons, Inc.
Rodger Ferguson, CHMM LSRP, Pennjersey Environmental Consulting
Stuart Nagourney (Department), Office of Quality Assurance
David Robinson, LSRP, Synergy Environmental, Inc.
Joseph Sanguiliano (Department), Office of Data Quality
Phillip Worby, Accutest Laboratories, Inc.
2. Purpose
The purpose of this document is to provide guidance on how to review and subsequently use
analytical data generated pursuant to the remediation of a discharge of a contaminant(s).
Laboratory Quality Assurance and Quality Control (QA/QC) is a comprehensive program used
to enhance and document the quality of analytical data. QA involves planning, implementation,
assessment, reporting, and quality improvement to establish the reliability of laboratory data.
QC procedures are the specific tools that are used to achieve this reliability.
Evaluating the quality of analytical data to determine whether the data are of sufficient quality for
the intended purpose is a two-step process. The first step of the process is a data quality
assessment (DQA) to identify and summarize any quality control problems that occurred during
laboratory analysis (QC nonconformances). The results of the DQA are used to perform the
second step, which is a data usability evaluation (DUE) to determine whether or not the quality
of the analytical data is sufficient for the intended purpose.
To assist the investigator in obtaining usable, “good” analytical data, the NJDEP Analytical
Technical Guidance Work Group developed the Data of Known Quality Protocols (DKQPs). The
DKQPs are a collection of analytical methods that contain specific performance criteria and are
based on the conventional analytical methods published by the U.S. Environmental Protection
Agency (EPA). DKQPs have been developed for the most commonly used analytical methods.
DKQPs may be developed for other methods in the future. Analytical data generated from the
DKQPs are termed Data of Known Quality (DKQ).
When the DKQPs are followed, the investigator can have confidence that the data are of known
and documented quality. This will enable the investigator to evaluate whether the data are
usable. (When the performance criteria in the DKQPs are met, it is likely that the data
will be usable for project decisions.) Information regarding the DKQPs and laboratory QA/QC is
presented in the NJDEP guidance document titled NJDEP Site Remediation Program, Data of
Known Quality Protocols Technical Guidance, April 2014 (DKQ Guidance). The DKQ Guidance
and DKQPs are published on the NJDEP web site at:
http://www.nj.gov/dep/srp/guidance/index.html#analytic_methods
.
The DKQP Guidance includes the Data of Known Quality Conformance/Nonconformance
Summary Questionnaire that the investigator may request the laboratory to use to indicate
whether the data meet the guidelines for DKQ. The guidance also describes the narrative (that
must be included as a laboratory deliverable pursuant to N.J.A.C. 7:26E Appendix A) that
describes QA/QC nonconformances. When DKQ criteria are achieved for a particular data set,
the investigator will have confidence that the laboratory has followed the DKQPs, has described
nonconformances, if any, and has adequate information to make judgments regarding data
quality.
A basic premise of the DKQPs is that good communication and the exchange of information
between the investigator and the laboratory will increase the likelihood that the quality of the
analytical data will meet project-specific Data Quality Objectives (DQOs), and therefore, will be
suitable for the intended purpose. To this end, the “Example: Project Communication Form”
has been included with the DKQP Guidance (Appendix A) to provide an outline of the
information that a laboratory should have prior to analyzing the associated samples.
The process of obtaining analytical data that are of sufficient quality for the intended purpose
and evaluating the quality of analytical data in relation to project-specific DQOs occurs
throughout the course of a project. It is the investigator’s responsibility to perform the DQA/DUE
process; therefore, the investigator’s contact with the laboratory should be limited to explaining
any issues that were not adequately addressed in the narrative (nonconformance summary)
and, if provided, a Data of Known Quality Conformance/Nonconformance Summary
Questionnaire (DKQP Guidance). It should be noted that the investigator, not the laboratory, is
responsible for the usability of data.
It is not unusual for laboratory reports to contain QC nonconformances, especially for those
analyses that have extensive analyte lists such as Method 8260B (Volatile Organics) and 8270C
(Semivolatile Organics). It is neither likely nor expected that every analyte will pass all of the
QC criteria. In many cases, the DQA and DUE will reveal QC nonconformances that do
not affect the usability of the analytical data for the intended purpose. In these cases, the
investigator and others who will be relying on the data may have confidence that the quality of
the data is appropriate for the intended purpose.
In other cases, the DQA and DUE will reveal QC nonconformances that will affect the usability
of the analytical data for the intended purpose. In these cases, the investigator has developed
an understanding of the limitations of the analytical data (e.g., through a conceptual site model
(CSM)) and can avoid making decisions that are not technically supported and may not be fully
protective of human health and the environment.
It is important to note that uncertainty introduced through the collection of non-representative
samples or an inadequate number of samples will, in many cases, exceed the uncertainty
caused by laboratory analysis of the samples. It is imperative that the investigator follow the
appropriate regulations and guidance documents to ensure that the number and location of
samples collected and analyzed are sufficient to provide adequate characterization of site
conditions.
This guidance does not suggest formal data validation (such as that outlined in the NJDEP Site
Remediation Program Standard Operating Procedure (SOP) for Analytical Data Validation of
Target Analyte List (TAL) Inorganics, Revision No. 5, SOP No. 5.A.2) is to be performed in all
instances. Specifically, such documents describe formal, systematic processes for reviewing
analytical data. These processes involve, for example, verifying derived results, inspecting raw
data, and reviewing chromatograms, mass spectra, and inter-element correction factors to ascertain
that the data set meets the data validation criteria and the DQOs specified in the quality
assurance project plan (QAPP). In most cases, use of the DKQPs will allow the investigator to
perform a DQA without conducting formal data validation. In cases where formal data validation
will be necessary, the investigator will have to evaluate the data in accordance with applicable
NJDEP and/or EPA Guidance/SOPs. Please note that if data validation is necessary, then a full
data deliverable package is required. (An example where full validation may be required could
be where site conditions have made it difficult for the laboratory to meet the quality control
requirements of a DKQP and the issuance of a RAO is in the balance.)
3. Document Overview
The DQA and DUE constitute a two-step process that is designed to evaluate the quality of
analytical data to determine if the data are of sufficient quality for the intended purpose. The
DQA is an assessment of the laboratory quality control data, the laboratory report, and
laboratory narrative by the investigator to identify and summarize QC nonconformances. The
DUE is an evaluation by the investigator to determine if the analytical data (that may include
nonconformances) are of sufficient quality for the intended purpose. The DUE uses the results
of the DQA and evaluates the quality of the analytical data in relation to the project-specific
DQOs and the intended use of the data. The DQA should be performed in real-time when the
data are received throughout the course of a project. If issues with the data are found, an
adjustment to the project may be made in real-time, so that enough data with sufficient quality
may be gathered prior to beginning the DUE. The DUE is performed whenever the data are
used to make decisions.
4. Procedures
The process of obtaining analytical data of sufficient quality for the intended purpose and
evaluating the quality of analytical data in relation to project-specific DQOs and the CSM occurs
throughout the course of a project. This process includes the following:
Development of project-specific DQOs in accordance with professional judgment, taking
into account applicable published rules and guidance documents;
Communication with the laboratory regarding project-specific DQOs and the selection of
appropriate analytical methods with the appropriate analytical sensitivity;
Performance of QA and QC activities during the analysis of the samples and reporting of QC
results by the laboratory;
Performance of a DQA by the investigator when analytical results are received from the
laboratory to identify QC nonconformances; and,
Performance of a DUE by the investigator to determine if the analytical data are of sufficient
quality for the intended purpose. The DUE uses the results of the DQA and evaluates the
quality of the analytical data in relation to the project-specific DQOs and the CSM.
This process is described in Figure 1: DQA and DUE Flow Chart.
Figure 1: DQA and DUE Flow Chart**

[Flow chart summary: Starting from the CSM, the investigator develops the sampling plan, field
QA/QC, and method selection, then gathers analytical data, field observations, and
hydrogeological and physical data. The DQA identifies nonconformances. A representativeness
evaluation asks whether the information/data represent the site and support the CSM; if not,
additional laboratory or field data are collected and the investigation is modified/expanded (or
remediation is performed). The DUE then asks whether the analytical data are adequate for the
intended purpose based on a review of QC nonconformances and information; if not, additional
laboratory or field data are collected and the investigation is modified/expanded. If both
evaluations are satisfied, the data are representative and of adequate quality to support the
environmental professional's opinion.]
** State of Connecticut, Department of Environmental Protection, Laboratory Quality Assurance and Quality
Control, Data Quality Assessment And Data Usability Evaluation Guidance Document, May 2009, Revised
December 2010.
4.1 Data Quality Objectives
DQOs are developed by the investigator to ensure that a sufficient quantity and quality of
analytical data are generated to meet the goals of the project and support defensible
conclusions that protect human health and the environment. DQOs should be developed at
the beginning of a project and revisited and modified as needed as the project progresses.
Similarly, the quality of analytical data is evaluated in relation to the DQOs throughout the
course of a project.
It is important to document the DQOs for a project in the context of the CSM so there is a
roadmap to follow during the project and so there is documentation that the DQOs were met
after the project is finished. The DQOs for a project can be documented in a project work
plan, a QAPP, environmental investigation report, or other document. DQOs are a required
QAPP element per N.J.A.C. 7:26E-2.2. Sources of detailed information regarding the
development of DQOs and QAPPs are listed in Appendix A of this document.
Typical analytical DQOs include, but are not limited to the following:
The QA/QC criteria specified in the DKQPs or in other analytical methods with an
equivalent degree of QA/QC as in the DKQPs;
The applicable regulatory criteria, for example, the Appendix Table 1 - Specific
Ground Water Quality Criteria noted in the Ground Water Quality Standards, N.J.A.C.
7:9C; and
The target reporting limit (RL) for a specific substance when determining the extent
and degree of contamination.
The DQOs, which are based on the intended use of the analytical data, define how reliable
the analytical data must be to make sound, rational decisions regarding data usability. For
example, analytical data can be used by an investigator to determine if a discharge took
place, evaluate the nature and extent of a discharge, confirm that remediation is complete, or
determine compliance with an applicable standard/screening level as described in the
“Definition of Terms” above.
4.2 Uncertainty in Analytical Data
Uncertainty exists in every aspect of sample collection and analysis. For example:
Sample collection and homogeneity;
Sample aliquoting;
Sample preservation;
Sample preparation; and
Sample analysis.
The overall measurement error is the sum of all the errors associated with
all aspects of sample collection and analysis. The investigator needs to understand the
impact of these uncertainties in order to establish data of known quality.
It is important to understand this uncertainty because analytical data with an unknown
amount of uncertainty may be difficult to use. However, it may still be possible to use the
analytical data if the investigator understands the degree of uncertainty, which is assessed
using the DQA/DUE process. The intended use of the analytical data determines how much
uncertainty is acceptable and how dependable the analytical data must be.
For example, when analytical data will be used for determining if a site meets the Residential
Direct Contact Soil Remediation Standards with a goal of obtaining an unrestricted Remedial
Action Outcome (RAO), the investigator must have a greater degree of confidence in that
data and must understand whether or not the degree of uncertainty will affect the usability of
the data for its intended purpose. Conversely, in cases where contaminants are known to be
present at concentrations significantly greater than Non-Residential Direct Contact Soil
Remediation Standards and further investigation and remediation will be conducted, the
amount of uncertainty associated with that analytical data can be greater.
4.3 Types of Analytical Data
There are two types of data: data that are generated from DKQPs and data that are not. For
the data generated from DKQPs, a lesser degree of scrutiny needs to be applied since the
uncertainty of these data is better understood. For data not generated from DKQPs, a higher
degree of scrutiny may be required since these data may have greater uncertainty. The type
of data will usually determine the level of effort that is required for the DQA and DUEs. For
data generated from DKQPs, an example of the information that should be submitted in a
conformance/nonconformance summary is included in the DKQPs Guidance (“Data of
Known Quality Conformance/Nonconformance Summary Questionnaire”). Because
many environmental investigation and remediation projects have been on-going for a period
of time before the DKQPs were developed and because DKQPs are not published for all
methods of analysis, it is likely that many investigators will need to integrate the data
generated by methods other than the DKQPs with data generated in accordance with the
DKQPs. This evaluation should be performed on a site-specific basis relative to the CSM and
DQOs, but the basic principles should be similar for each situation. Section 4 of the DKQP
Guidance presents information on the types of laboratory QC information that are needed to
demonstrate equivalency with the DKQs.
4.4 PARCCs Parameters
The PARCCs parameters are used to describe the quality of analytical data in quantitative
and qualitative terms using the information provided by the laboratory quality control
information. The PARCCS parameters (precision, accuracy, representativeness,
comparability, completeness, and sensitivity) are described below. The types of QC
information that can be used to evaluate the quality of analytical data using the PARCCS
parameters are provided in Appendix B of this document. Also found in Appendix B is a table
that summarizes DKQ performance parameters and the recommended frequency for the
various types of QC elements. Acceptance criteria associated with PARCCs Parameters are
included in any site-specific QAPP and are also discussed in the SRP “Technical Guidance
for Quality Assurance Project Plans” at
http://www.nj.gov/dep/srp/guidance/index.html#analytic_methods
4.4.1 Precision
Precision expresses the closeness of agreement, or degree of dispersion, between a
series of measurements. Precision is a measure of the reproducibility of sample results.
The goal is to maintain a level of analytical precision consistent with the DQOs. As a
conservative approach, it would be appropriate to compare the greatest numeric results
from a series of measurements to the applicable regulatory criteria.
Precision is measured through the calculation of the relative percent difference (RPD) of
two data sets generated from a similar source or percent relative standard deviation
(%RSD) from multiple sets of data. The formula for RPD is presented in the definition for
precision in the Definition of Terms section of this document. For example, the analytical
results for two field duplicates are 50 milligrams per kilogram (mg/kg) and 350 mg/kg for a
specific analyte. The RPD for the analytical results for these samples was calculated to be
150%, which, although it doesn’t actually represent a numerical measure of heterogeneity,
suggests a high degree of heterogeneity in the sample matrix and a low degree of
precision in the analytical results. Duplicate results varying by this amount may require
additional scrutiny, including qualification and/or resampling. When using duplicate results
that have met DKQP acceptance criteria, the QAPP should discuss whether the average
or the higher of the two values would be used for making data usability decisions.
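As an illustration only (not a DKQP requirement), the RPD calculation for the field duplicate
example above can be sketched as follows; the function name is hypothetical:

    # Illustrative sketch: relative percent difference (RPD) between two
    # duplicate results, using the field duplicate example above.
    def relative_percent_difference(result_1, result_2):
        """Return the RPD (%) of two duplicate measurements."""
        mean = (result_1 + result_2) / 2.0
        if mean == 0:
            return 0.0  # both results are zero; no difference to express
        return abs(result_1 - result_2) / mean * 100.0

    # Field duplicate example from this section: 50 mg/kg and 350 mg/kg
    print(relative_percent_difference(50, 350))  # prints 150.0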
4.4.2 Accuracy
Accuracy is used to describe the agreement between an observed value and an accepted
reference or true value. The goal is to maintain a level of accuracy consistent with the
DQOs. Accuracy is usually reported through the calculation of percent recovery using the
formula in the definition for accuracy included in the Definition of Terms section of this
document. For example, the analytical result for a Laboratory Control Sample (LCS) is 5
mg/kg. The LCS was known to contain 50 mg/kg of the analyte. The percent recovery for
the analytical results for this analyte was calculated to be 10%, which indicates a low
degree of accuracy of the analytical results for the analyte and would indicate a low bias
of that analyte to any associated field sample in that analytical batch. Therefore, the actual
concentration of the analyte in samples is likely to be higher than reported. All of the
possible field sample collection and analytical issues which may affect accuracy should be
evaluated to determine overall accuracy of a specific reported result. These data may
require additional scrutiny with the possibility of qualification or rejection based upon the
DQO. A list of common qualifiers has been included in Appendix D of this Guidance
document.
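For illustration only (the governing formula is the one given in the Definition of Terms
section), the LCS percent recovery in the example above can be computed as shown below; the
function name is hypothetical:

    # Illustrative sketch: percent recovery for a spiked QC sample such as an LCS.
    def percent_recovery(measured, true_value):
        """Return the recovery (%) of a spiked analyte relative to its known amount."""
        return measured / true_value * 100.0

    # LCS example from this section: 5 mg/kg measured versus 50 mg/kg spiked
    print(percent_recovery(5, 50))  # prints 10.0, indicating low recovery and low bias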
4.4.3 Representativeness
Representativeness is a qualitative measurement that describes how well the analytical
data characterizes an area of concern. Many factors can influence how representative the
analytical results are for an area sampled. These factors include the selection of
appropriate analytical procedures, the sampling plan, matrix heterogeneity and the
procedures and protocols used to collect, preserve, and transport samples. Information to
be considered when evaluating how well the analytical data characterizes an area of
concern is presented in various SRP technical guidance documents and manuals.
For example, as part of a sampling plan, an investigator collected soil samples at
locations of stained soil near the base of several above-ground petroleum storage tanks
known to be more than seventy years old and observed to be in deteriorated condition.
The samples were analyzed for extractable petroleum hydrocarbons (EPH). The
concentrations of all EPH results were below the method RL or not detected (ND). The
investigator evaluated these results in relation to visual field observations that indicated
that petroleum-stained soil was present. The investigator questioned how well the
analytical results characterized the locations where stained soil was observed and
collected several additional samples for EPH analysis to confirm the results. The results of
the second set of samples collected from locations of stained soil indicated the presence
of EPH at concentrations of approximately 5,000 mg/kg. Therefore, the investigator
concluded that the original samples for which the analytical results were reported as ND
for EPH were not representative of the stained soil and that the second set of samples
were representative of the stained soil.
4.4.4 Comparability
Comparability refers to the equivalency of sets of data. This goal is achieved through the
use of standard or similar techniques to collect and analyze representative samples.
Comparable data sets must contain the same variables of interest and must possess
values that can be converted to a common unit of measurement. Comparability is
primarily a qualitative parameter that is dependent upon the other data quality elements.
For example, if the RLs for a target analyte were significantly different for two different
methods, the two methods may not be comparable and more importantly, it may be
difficult to use those data to draw inferences and/or make comparisons. Use caution in
combining data sets especially if the quality of the data is uncertain.
4.4.5 Completeness
Completeness is a quantitative measure that is used to evaluate how many valid
analytical data were obtained in comparison to the amount that was planned.
Completeness is usually expressed as a percentage of usable analytical data.
Completeness goals are specified for the various types of samples that will be collected
during the course of an investigation. Completeness goals are used to estimate the
minimum amount of analytical data required to support the conclusions of the investigator.
If the completeness goal is 100% for samples that will be used to determine compliance
with the applicable regulations, all of the samples must be collected, analyzed and yield
analytical data that are usable for the intended purpose. Critical samples include those
samples that are relied upon to determine the presence, nature, and extent of a release or
determine compliance with applicable regulations. The completeness goal for critical
samples is generally 100%. Overall project completeness goals are generally below 100%
(e.g., QAPP DQO for overall project completeness may be 90%) to account for losses due
to unintended issues with sample collection (e.g., well will not purge properly or possible
breakage of sample in-transit to the laboratory) or to account for quality issues which
affect usability of sample data.
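As an illustration only, the completeness calculation can be sketched as below; the sample
counts shown and the 90% goal are hypothetical example values consistent with the discussion
above:

    # Illustrative sketch: completeness as the percentage of planned samples that
    # yielded usable analytical data. Counts and goals below are hypothetical.
    def completeness_percent(usable_results, planned_results):
        """Return completeness (%) of a data set."""
        return usable_results / planned_results * 100.0

    value = completeness_percent(18, 20)    # 90.0 %
    meets_overall_goal = value >= 90.0      # example overall project goal of 90%
    meets_critical_goal = value >= 100.0    # critical samples generally require 100%
    print(value, meets_overall_goal, meets_critical_goal)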
4.4.6 Sensitivity
Sensitivity is related to the RL. In this context, sensitivity refers to the capability of a
method or instrument to detect a given analyte at a given concentration and reliably
quantitate the analyte at that concentration. The investigator should confirm that the
instrument or method can detect and accurately quantify the analyte at concentrations that are
not greater than the applicable standard and/or screening level. In general, RLs should be less
than the applicable standard and/or screening level. Analytical results for samples that are
non-detect for a particular analyte that have RLs greater than the applicable standards
and/or screening levels cannot be used to demonstrate compliance with the applicable
standards and/or screening levels.
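For illustration only, a short sketch of the reporting limit check described above; the analyte
names, RLs, and standards are hypothetical:

    # Illustrative sketch: flag non-detect (ND) results whose reporting limit (RL)
    # exceeds the applicable standard or screening level. Values are hypothetical.
    results = [
        {"analyte": "Analyte A", "detected": False, "rl": 0.5, "standard": 1.0},
        {"analyte": "Analyte B", "detected": False, "rl": 5.0, "standard": 1.0},
    ]
    for r in results:
        if not r["detected"] and r["rl"] > r["standard"]:
            print(r["analyte"], "- ND with an RL above the standard;"
                  " cannot be used to demonstrate compliance")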
The issue of analytical sensitivity may be one of the most difficult to address as it pertains
to data usability evaluations. Samples that are contaminated with sufficient quantity of
material, such that dilutions are performed, are a leading cause of RLs exceeding
applicable criteria. However, there may be instances where such exceedances are
insignificant relative to the site specific DQOs. As an example, the project may be on-
going and/or other compounds are “driving” the cleanup such that not meeting applicable
criteria for all compounds at that particular juncture is not an issue.
4.5 Data Quality Assessment
A DQA is the process of identifying and summarizing QC nonconformances. The DQA
process should occur throughout the course of a project. The DKQP Guidance Data of
Known Quality Conformance/Nonconformance Summary Questionnaire, laboratory
narrative, and analytical data package should be reviewed by the investigator soon after they are
received, so the laboratory can be contacted regarding any questions, and issues may be
resolved in a timely manner. The DQA is to be performed prior to the DUE. The level of effort
necessary to complete this task depends on the type of analytical data described above in
Section 4.3 of this guidance document. The types of QC information that are to be reviewed
as part of the DQA are described in Appendix C of this document. Results from the DQA are
used during the DUE to evaluate whether the analytical data for the samples associated with
the specific QA/QC information are usable for the intended purpose.
Appendix B of this Guidance document includes a table that summarizes the DKQ
parameters and the recommended frequencies for various types of QC information. The
actual QC checks, target acceptance criteria and information required to be reported under
the DKQPs are provided in Appendix B of the DKQ Guidance.
The DQA is usually most efficiently completed by summarizing QC nonconformances on a
DQA worksheet or in another manner that documents the thought process and findings of the
DQA (e.g., NJDEP Full Laboratory Data Deliverable Form available at:
http://www.nj.gov/dep/srp/srra/forms/
).
Sample DQA worksheets are included in Appendix D of this document. These worksheets
may be modified by the user. Appendix D also presents a summary of selected DKQ
acceptance criteria which may be useful during the completion of DQA worksheets.
4.5.1 Batch Quality Control versus Site Specific Quality Control
Laboratory QC is performed on a group or “batch” of samples. Laboratory QC procedures
require a certain number of samples be spiked and/or analyzed in duplicate. Since a
laboratory batch may include samples from several different sites, the accuracy and
precision assessment for organic samples will not be germane to any site in the batch
except for the site from which the QC samples originated. QC samples from a specific site
are referred to as site specific QC. Since batch QC for organic samples may include
samples from different sites, it may be of limited value when evaluating precision and
accuracy for a site. For inorganic samples, the sample chosen for the QC sample pertains
to all inorganic samples in the batch because the inorganic methods themselves include
little sample-specific quality control. Typically, organic analyses require an MS/MSD pair
for every twenty samples of similar matrix (e.g., soil, water, etc.). Inorganic analyses
usually have a matrix spike and a matrix duplicate (MD) for every twenty samples;
however, an MS/MSD pair for inorganic analyses is acceptable. Information regarding
MS/MSDs is presented in Section 4.6.3.7 of this document. The results of the MS
can be used to evaluate accuracy, while the results of the MS and MSD analyses (or
sample and MD) can be used to assess precision. Similarly, LCSs and LCS/LCSDs are
used by laboratories as a substitute for, or in addition to, the MS/MSD; the LCS is used to
evaluate method accuracy, while an LCS/LCSD pair can be used to evaluate both precision
and accuracy. Information regarding LCS/LCSD is presented in Section 4.6.3.6 of this
document.
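For illustration only (not part of the DKQPs), a sketch of how an MS recovery and MS/MSD
precision might be computed; the spike amounts and results shown are hypothetical:

    # Illustrative sketch: matrix spike (MS) recovery and MS/MSD precision.
    # All spike amounts and results below are hypothetical.
    def ms_percent_recovery(spiked_result, unspiked_result, spike_added):
        """Return the recovery (%) of the amount spiked into the sample."""
        return (spiked_result - unspiked_result) / spike_added * 100.0

    def rpd(value_1, value_2):
        """Return the relative percent difference (%) between two results."""
        return abs(value_1 - value_2) / ((value_1 + value_2) / 2.0) * 100.0

    ms_recovery = ms_percent_recovery(spiked_result=45.0, unspiked_result=5.0, spike_added=50.0)   # 80.0 %
    msd_recovery = ms_percent_recovery(spiked_result=55.0, unspiked_result=5.0, spike_added=50.0)  # 100.0 %
    precision = rpd(ms_recovery, msd_recovery)                                                     # about 22 %
    print(ms_recovery, msd_recovery, precision)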
There may be instances where the investigator incorporates site or project specific QC
samples as part of the DQO. Examples of where this may be appropriate are:
Complex or unique matrix;
Contract specific requirements;
High profile cases;
Sites containing contaminants such as dioxins or hexavalent chromium.
If project specific QC samples are required to meet the DQO, then the investigator should
supply sufficient sample volume for the analyses.
4.5.2 Evaluating Significant Quality Control Variances
Some QC nonconformances are so significant that they must be thoroughly evaluated.
Some examples are the absence of QC analyses, gross exceedance of holding times, and
exceedingly low recoveries of spikes and/or surrogates. Appendix E of this document
presents a summary of significant QC variances or gross QC failures.
If the DQA is performed when the laboratory deliverable is received it may be possible for
the investigator to request that the laboratory perform reanalysis of the sample or sample
extract within the holding time. During the DUE, data with gross QC failures in most cases
will be deemed unusable, unless the investigator provides adequate justification for its
use. However, samples with significant QC variances could be used if the results are
significantly above remedial standards/screening levels.
4.5.3 Poorly Performing Compounds
Not all compounds of interest perform equally well for a given analytical method or
instrument. Typically, this is due to the chemical properties of these compounds and/or
the limitations of the methods and instrumentation, as opposed to laboratory error. These
compounds are commonly referred to as "poor performers," and the majority of QC
nonconformances are usually attributed to these compounds. Appendix F of this
document presents a summary of compounds that are typically poorly performing
compounds. Each method specific DKQ acceptance criteria table (QAPP Worksheet)
notes the method-specific poor performers. A laboratory’s list of poorly performing
compounds should not be substantially different from this list. The investigator should,
through the QAPP, have the laboratory confirm which compounds are poor performers for
the methods used prior to the analysis of samples. This information should be used during
the DUE. The investigator may decide not to use the entire data set should “too many”
compounds fail to meet acceptance criteria as this may be an indication of general and
significant instrumental difficulties. For example, the investigator may decide that if QC
results for more than 10% of the compounds fail to meet acceptance criteria for DKQ
Method 8260 or more than 20% fail to meet criteria for DKQ Method 8270, the data may
not be usable to demonstrate that concentrations are less than applicable standards
without additional lines of evidence to support such a decision.
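A minimal sketch of such a screening decision, assuming the 10% and 20% example thresholds
discussed above (the thresholds and failure counts are illustrative, not requirements):

    # Illustrative sketch: decide whether "too many" compounds failed QC acceptance
    # criteria, using the example thresholds discussed above (10% for DKQ Method 8260,
    # 20% for DKQ Method 8270). The failure counts shown are hypothetical.
    example_thresholds = {"8260": 0.10, "8270": 0.20}

    def too_many_failures(method, failed_compounds, total_compounds):
        """Return True if the fraction of failing compounds exceeds the example threshold."""
        return failed_compounds / total_compounds > example_thresholds[method]

    print(too_many_failures("8260", 8, 60))   # True: about 13% failed, above the 10% example threshold
    print(too_many_failures("8270", 10, 70))  # False: about 14% failed, below the 20% example threshold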
4.5.4 Common Laboratory Contaminants
During the course of the analysis of samples, substances at the laboratory may
contaminate the samples. The contamination in the sample may come from contaminated
reagents, gases, and/or glassware; ambient contamination; poor laboratory technique; et
cetera. A list of common laboratory contaminants can be found in Appendix G of this
document. However, not all sample contamination can be attributed to the compounds on
the laboratory contaminant list. During the DUE, the investigator must take the CSM and
site-specific information into account to support a hypothesis that the detection of common
laboratory contaminants in environmental samples is actually due to laboratory
contamination and not due to releases at the site or due to sampling efforts.
4.5.5 Bias
When QC data for analytical results indicate that low or high bias is present, this means
that the true values of the target analytes are likely higher or lower than the reported
concentrations, respectively. Bias can also be indeterminate, which means that the
analytical results have poor analytical precision or have conflicting bias in the data.
Additionally, as bias ultimately can affect the actual concentration reported, all bias has
the potential to affect accuracy. Bias is evaluated by the investigator as part of the DUE.
Bias can be caused by many factors, including improper sample collection and
preservation, exceedances of the holding times, the nature of sample matrix, and method
performance. The sample matrix can cause matrix interferences. Typically, matrices such
as peat, coal, coal ash, clay, and silt can exhibit significant matrix interferences by binding
contaminants or reacting with analytes of concern. The investigator should contact the
laboratory to determine the appropriate laboratory methods to address these difficult
matrices. The evaluation of bias is further discussed in Section 4.6.1 of this document.
4.6 Data Usability Evaluation
The DUE is an evaluation by the investigator to determine if the analytical data are of
sufficient quality for the intended purpose and can be relied upon by the investigator with the
appropriate degree of confidence to support the conclusions that will be made using the data.
The investigator uses the results of the DQA to evaluate the usability of the analytical data
during the DUE in the context of project-specific DQOs and the CSM.
One of the primary purposes of the DUE is to determine if any bias that might be present in
the analytical results, as identified during the DQA, affects the usability of the data for the
intended purpose. The DUE can use multiple lines of evidence from different types of
laboratory QC information or from site-specific conditions described in the CSM to evaluate
the usability of the analytical data.
The initial DUE should evaluate precision, accuracy, and sensitivity of the analytical data
compared to DQOs. Representativeness, completeness, and comparability should be
evaluated as part of a DUE and should be considered when incorporating analytical data into
the CSM.
More scrutiny regarding the quality of analytical data may be necessary when the
investigator intends to use the data to demonstrate compliance with an applicable
standard/screening level than when the data are used to design additional data collection
activities or when remediation will be conducted. Data that may not be deemed to be of
sufficient quality to demonstrate compliance with applicable standard/screening level may be
useful for determining that a discharge has occurred in cases when remediation will be
conducted or to guide further data collection activities.
Typically, the most challenging DUE decisions are for situations when the analytical results
are close to, or at, the applicable standard/screening level and there are QC
nonconformances that might affect the usability of the data. In situations such as this, the
NJDEP expects that the investigator will use an approach that is protective of human health
and the environment. Coordination with the laboratory to understand QC information,
additional investigation, and re-analysis of samples may be necessary in some cases. If the
DQA is performed when the laboratory deliverable is received and issues are raised, it may
be possible to perform re-analysis of the sample extract within the holding time and still use
the sample data.
To help expedite the DUE, it may be useful to determine if the QC nonconformances
identified in the DQA are significant for a particular project. The list of questions below is
not exhaustive. The questions are intended to give examples to the investigator to help
evaluate QC nonconformance for a particular project. See the DUE Worksheet provided in
Appendix I of this document for additional examples.
Will remediation be conducted at the area of concern? If remediation will be conducted,
the investigator should use the QC information supplied by the laboratory (or request
additional assistance from the laboratory when necessary) to minimize QC issues for
the samples to be collected to evaluate the effectiveness of remediation. Alternately,
if remediation will not be conducted, the analytical data should be of sufficient quality
to demonstrate compliance with an appropriate and applicable standard/screening
level.
Were significant QC variances reported? Analytical data with gross QC failures are
usually deemed unusable (rejected) unless the investigator provides adequate
justification for its use. Significant QC variances are discussed in Appendix E of this
document.
Were QC nonconformances noted for substances that are not constituents of concern
at the site as supported by the CSM? QC nonconformance assessments for
contaminants that are not of concern may not be critical to meeting project DQOs.
However, limiting the list of contaminants of concern without appropriate investigation
and analytical testing (i.e., an incomplete CSM) can result in substances that should be
identified as contaminants of concern being inadvertently overlooked.
Were QC nonconformances reported for compounds that are poorly performing
compounds? If the nonconformances are noted for poorly performing compounds that
are not contaminants of concern for the site, then they have little or no impact on data
usability. However, if the nonconformances are noted for poorly performing
compounds that are compounds of concern for the site, then the investigator may
have to address these issues, including but not limited to re-sampling and/or re-
analysis. Poorly performing compounds are discussed in Section 3.3 and Appendix F
of this document.
The DUE process is discussed in detail using examples in the sections that follow. The
examples presented below are for illustrative purposes only and are not meant to be a strict
or comprehensive evaluation of all types of laboratory QC information or all the possible
outcomes of data quality evaluations. The discussion begins with examples of less complex
QC information and concludes with the use of multiple lines of evidence to evaluate more
complicated DUE issues using more than one type of laboratory QC information and
information from the CSM for a hypothetical site. The standards/screening levels identified in
the examples are for illustrative purposes and may not be consistent with actual levels.
Appendix H of this document illustrates many common QC issues and a range of potential
DUE outcomes for each issue. The DUE is usually most efficiently completed by using a
worksheet or another manner that documents the thought process and findings of the DUE.
Appendix I of this document presents a DUE Worksheet that can be used and modified as
needed to summarize the types of issues that should be discussed in the investigator’s written
opinion regarding data usability.
4.6.1 Evaluation of Bias
The types of bias are discussed in Section 4.5.5 of this document. Bias can be low, high
or indeterminate.
High or low bias can be caused by many factors. Investigators are cautioned that it
is never acceptable to “adjust” laboratory-reported compound concentrations or RLs
based on percent recovery.
Indeterminate or non-directional bias means that the analytical results exhibit a poor
degree of precision (e.g., as demonstrated by high RPD in sample/MD measurements) or
there are cumulative conflicting biases in the data set (e.g., surrogate recoveries for a
sample are low but LCS recoveries are high). Duplicate sample results are used to
evaluate the degree of precision between the measurements. Indeterminate bias may
occur when heterogeneous matrices, such as contaminated soil or soil containing wastes
such as slag, are sampled. The heterogeneity of the matrix causes the analytical results to
vary and may cause a large RPD between the sample results. The degree to which the
analytical results represent the environmental conditions is related to the number of
samples taken to characterize the heterogeneous matrix and how those samples are
selected and collected. For example, as a greater number of samples are analyzed, the
analytical results will better represent the concentrations of the analytes present in the
environment.
Bias for a particular result should not be assigned until all sources of possible bias in a
sample analysis have been considered. Evaluating the impact of bias on one’s data set is
not always straightforward. For example, judging bias only on surrogate recovery and
ignoring LCS recovery results may lead to erroneous conclusions. Therefore, overall bias
for a result must be judged by the cumulative effects of the QC results.
Examples of the actions suggested based on the type of bias observed (L = low;
I = indeterminate; H = high; None = within limits) for non-detect (ND) data are shown below.
For the purposes of the table, bias refers to agreement with method-defined QA/QC limits.

Table 1: Summary of Actions Due to Bias

Conc.           Bias: L                               Bias: H                               Bias: within limits
ND < Reg Lev    Further                               None                                  None
ND = Reg Lev    Further                               None                                  None
ND > Reg Lev    Not usable to determine clean areas   Not usable to determine clean areas   Not usable to determine clean areas

Further = Look at the site; evaluate the complete data set; reanalyze; speak to the
laboratory; resample if necessary.
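For illustration only, the logic of Table 1 can be expressed as a simple lookup; the reporting
limits and regulatory levels passed in would be site specific, and indeterminate bias (discussed
above) still calls for case-by-case judgment:

    # Illustrative sketch of the Table 1 logic for non-detect (ND) results.
    # bias is "low", "high", or "within_limits"; rl is the reporting limit.
    def nd_action(bias, rl, reg_level):
        """Return the suggested Table 1 action for a non-detect result."""
        if rl > reg_level:
            return "Not usable to determine clean areas"
        if bias == "low":
            return "Further: look at the site, evaluate the complete data set, reanalyze, resample if necessary"
        return "None"

    print(nd_action("low", 0.5, 1.0))            # further evaluation is suggested
    print(nd_action("within_limits", 2.0, 1.0))  # RL above the regulatory level; not usable to determine clean areas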
Ultimately it is the investigators’ responsibility to use professional judgment when
determining the use of any data.
If the detected concentrations of analytes are below the applicable
standard/screening level, the bias may have limited impact on the usability of the
data. If the concentration is just below the regulatory limit, evaluation of bias can be
critical, especially when data are being used to demonstrate compliance (i.e.,
issuance of a RAO).
If the detected concentrations of analytes are above the applicable standard/screening
level, the bias may have limited impact on the usability of the data unless these data
are being used to demonstrate compliance (i.e., issuance of a RAO).
4.6.2 General Quality Control Information
The following subsections discuss issues associated with QC information related to
sample management, preservation, holding times, and field QC samples.
4.6.2.1 Chain of Custody Forms
Chain of Custody (COC) forms are used to document the history of sample possession
from the time the sample containers leave their point of origin (usually the laboratory
performing the analyses) to the time the samples are received by the laboratory. COCs
are considered legal documents. Sometimes incorrect information is on the COC form,
such as incorrect dates, sample identification numbers, and analysis requested.
Usually these errors are found through the course of the project. However, simply
correcting this information without documentation of the problem and the resolution
may amount to falsification of the chain of custody or cause confusion. The error may
be corrected by the investigator with a single-line cross-out of the error,
initialing/signing, dating of the correction, and an explanation for the correction. If the
laboratory notices an error on the COC, this should be noted in their sample receiving
documentation and in the laboratory narrative and the laboratory should contact the
investigator. Any changes to the COC should be approved by the investigator and
documented by the laboratory.
4.6.2.2 Sample Preservation Holding Times and Handling Time
Once a sample is collected, changes in the concentrations of analytes in the sample
can occur. To minimize these changes, the sample must be collected, stored, and
preserved as specified in the analytical method and for non-aqueous volatile organic
compounds as specified in the NJDEP's N.J.A.C. 7:26E-2.1(a)8. The sample must also
be analyzed within the specified holding and handling times. The holding time for a
sample has two components. The first component is the time from when a sample is
collected to when it is prepared for analysis or, if no preparation step is required, the
time from when the sample is collected to when it is analyzed. (For environmental
samples, handling time is included in this first component.) If a test requires a
preparation step, such as solvent extraction for determination of polychlorinated
biphenyls (PCBs) or acid digestion for determination of metals, there is a second
holding-time component referred to as the extract holding time. This is the time
between when the sample is prepared and when the resultant extract or digestate is
analyzed. Failure to analyze a sample within the prescribed holding time could render
the data unusable. The laboratory should be made aware (usually in the QAPP) that if
holding times are not going to be met, then the laboratory should contact the
investigator and check to see if the samples should still be analyzed. The use of
laboratory data from a sample with a failed holding time must be evaluated for usability.
The determination made by the investigator to use data with failed holding times is
based on the critical nature of the sample, the type of sample and the analytical results.
The conventional conclusion with organic samples that exceeded holding times is that
there is a loss of compound and the concentration may be biased low.
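As an illustration of the two holding-time components described above, the check might be
sketched as follows; the dates and the 14-day and 40-day limits are hypothetical, since actual
limits are method specific:

    # Illustrative sketch: checking the two holding-time components described above.
    # All dates and limits are hypothetical; actual limits are method specific.
    from datetime import datetime

    collected = datetime(2014, 4, 1, 10, 30)   # sample collection
    prepared = datetime(2014, 4, 8, 9, 0)      # extraction or digestion
    analyzed = datetime(2014, 4, 20, 14, 0)    # instrument analysis

    prep_holding_days = (prepared - collected).days    # first component (includes handling time)
    extract_holding_days = (analyzed - prepared).days  # second component (extract holding time)

    prep_ok = prep_holding_days <= 14        # hypothetical 14-day preparation limit
    extract_ok = extract_holding_days <= 40  # hypothetical 40-day analysis limit
    print(prep_holding_days, prep_ok, extract_holding_days, extract_ok)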
It should be noted that certain constituents are not necessarily adversely affected by
holding time exceedances, provided the samples are preserved and stored properly. If
the contaminants of concern are PCBs, PCDDs/PCDFs or metals, a holding time
exceedance may not adversely affect usability. In these situations, the data should be
qualified and discussed by the investigator. However, attempts should always be made to
meet the method-required holding times.
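Although holding time limits are method-specific, the basic check has the same structure for every analysis: compare the elapsed time from collection to preparation (or to analysis when no preparation step is required), and from preparation to analysis of the extract or digestate, against the applicable limits. The following sketch illustrates that check only; the 14-day and 40-day limits shown are hypothetical placeholders, and the actual limits must be taken from the analytical method or the DKQPs.

from datetime import datetime

def holding_times_met(collected, prepared, analyzed,
                      prep_limit_days, analysis_limit_days):
    """Return True if both holding time components are met.

    collected, prepared, analyzed: datetimes from the COC and laboratory records.
    prep_limit_days: allowed days from collection to preparation (or to analysis
        when no preparation step is required).
    analysis_limit_days: allowed days from preparation to analysis of the
        extract or digestate.
    """
    prep_ok = (prepared - collected).days <= prep_limit_days
    analysis_ok = (analyzed - prepared).days <= analysis_limit_days
    return prep_ok and analysis_ok

# Hypothetical illustration only; the 14-day and 40-day limits are placeholders.
print(holding_times_met(datetime(2014, 4, 1), datetime(2014, 4, 10),
                        datetime(2014, 5, 30), 14, 40))  # False: extract analyzed late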
Example 1: Meeting standards - exceeded holding times
Benzene and 1,2-dichloroethane were found in a water sample at concentrations of 0.9
ug/L and 1 ug/L, respectively. This sample was collected during what was intended to be
the last round of sampling prior to issuance of a RAO. The applicable ground water
quality criteria for benzene and 1,2-dichloroethane are 1 and 2 ug/L, respectively.
However, the data were obtained from samples that exceeded the holding time to analysis
by 4 days. Because the overriding consensus is that a holding time exceedance biases
data low, and because of the proximity of the concentrations to the applicable criteria,
the investigator should probably not use these data.
Example 2: Holding time exceedance - ground water monitoring
Trichloroethene, tetrachloroethene and 1,1,1-trichloroethane are present in a water
sample at concentrations of 80 ug/L, 140 ug/L and 125 ug/L, respectively. Compound-
specific ground water criteria apply in this situation for the trichloroethene,
tetrachloroethene and 1,1,1-trichloroethane at concentrations of 1 ug/L, 1 ug/L and 30
ug/L, respectively. The sample is part of a routine, quarterly monitoring program of a
contaminated ground water aquifer and the sample results are similar to those
determined from previous rounds of sampling and analyses. It is expected that
quarterly monitoring will continue for a minimum of three additional years. However, the
data were from a sample that exceeded the holding time to analysis by 3 days. Based
on this information, the data would most likely be used: there will be additional
rounds of sampling prior to terminating the remedial activities, the data are consistent
with previous results, and the concentrations reported were so far above the applicable
standards that the effect of a holding time exceedance on the accuracy of the reported
values would probably be negligible.
Sample preservation can be either physical or chemical. Physical preservation might
include cooling, freezing, or storage in a hermetically sealed container. Chemical
preservation refers to the addition of a chemical, usually a solvent, acid, or base, to
prevent loss of analytes from the sample. An example of physical preservation is the freezing of soil
samples for determination of volatile organic compounds (VOCs). This procedure and
other procedures for preserving soil samples for the determination of VOCs can be
found in the NJDEP Field Sampling Procedures Manual.
NJDEP expects that all non-aqueous samples collected for the purpose of laboratory
analysis for VOCs be collected and preserved in accordance with the procedures
described in N.J.A.C. 7:26E-2.1(a)8 and all appropriate analytical methods and
technical guidance. If proper preservation of soils sampled for volatiles is not
performed, VOC results may be biased low and the data may be unusable. Based on this evaluation,
additional investigation and/or remediation may be warranted. Improperly preserved
samples should not be used to determine compliance with regulatory standards and/or
criteria.
4.6.2.3 Equipment, Trip and Field Blanks
Equipment-rinsate, trip, and field blank samples can be used to evaluate contamination
in a sample as a result of improperly decontaminated field equipment or contamination
introduced during transportation or collection of the sample. Trip and field blanks
(including laboratory analyte-free water which may be used to produce an equipment-
rinsate blank) must be transported to the site with sample containers and must be
received at the site within one day of preparation in the laboratory. Blanks may be held
on-site for no more than two calendar days and must arrive back at the lab within one
day of shipment from the field. If the handling time is not met, the field blanks may
not represent the site conditions. Handling times are established more for logistical
reasons than for scientific ones; sample containers kept on site or in construction
trailers have a greater chance of picking up contamination the longer they are stored.
This may present a challenge to
the investigator for scheduling sample collection activities especially following
weekends and holidays.
The investigator should be cognizant that laboratories have a limited amount of time to
prepare/extract and analyze samples, some of which may require additional effort such
as reanalysis; the sooner samples reach the laboratory, the better it is for all parties
concerned.
Low concentrations of contaminants may be detected in samples as a result of non-
site-related contamination. Organic compounds typically found include, but are not
limited to, methyl ethyl ketone (MEK), acetone, and methylene chloride which are
commonly used as laboratory solvents. Bis(2-ethylhexyl) phthalate is also a common
laboratory contaminant; however, it is also observed from field sample collection
activities such as the use of plastics. Additional scrutiny is warranted if these
compounds are contaminants of concern at the site.
The presence of any analytes in any blanks is noted in the DQA review of the data.
The concentrations of the analytes in the blanks are compared to any detected analyte
concentrations in the associated samples, taking into account any dilution factors.
Analytes that are detected in the blanks, but ND in the sample, can be ignored.
Analytes detected in the laboratory method blank (not the field and/or trip blank) and
detected in any associated sample should be flagged by the laboratory with a "B" suffix
to draw the attention of the data user.
SRP has been using a 3 times to 10 times policy to evaluate the potential presence of
compounds in an environmental sample when the same compounds are also found in
a blank sample. The specific policy is as follows.
If the concentration of a given compound in a sample is less than or equal to three (3)
times the concentration of that compound in the associated equipment, trip or field
blank, then it is unlikely that the compound is actually present in the sample. If the
sample concentration is between 3 and 10 times the blank concentration, the presence of
the compound in the site sample is considered real even though the compound is also
present in the corresponding blank. If the sample concentration is greater than 10 times
the blank concentration, the impact of the blank on the sample result is considered
negligible.
All compounds that are present in a sample at a concentration of less than or equal to
10 times the concentration in the corresponding blank should be qualified
(conventionally, a “B” qualifier is added next to the concentration of the affected
compound) to indicate possible blank contamination. (Note: strict validation protocols
may have more robust procedures for blank qualification; for example, in addition to the
“B” qualifier, concentrations between 3 and 10 times the associated blank should also be
reported with the “J” qualifier. However, for the purpose of the DUE, addition of the
“B” qualifier, or other user-defined qualifier, will suffice to denote corresponding
blank contamination.)
Example 3: Application of 3x Rule
Benzene was found in a ground water sample collected at the site at a concentration of 2
μg/L. Benzene is also present at a concentration of 1.0 μg/L in the associated
equipment blank. The concentration in the sample is greater than the concentration in
the blank but less than 3 times that concentration. Therefore, the result may not be
real; however, the result should be qualified B and discussed by the investigator.
Example 4: Application of 10x Rule
Benzene was found in a ground water sample collected at the site at a concentration of 4
μg/L. Benzene is also present at a concentration of 0.5 μg/L in the associated
equipment blank. The concentration in the sample is greater than 3 times but less than
10 times the concentration of the blank. Therefore, the result is real and should be
qualified B which may indicate quantitative uncertainty. Additional site investigation
may be warranted including an evaluation of the sampling protocol.
Example 5: Application of 10x Rule
Benzene was found in a ground water sample collected at the site at a concentration of
20 μg/L. Benzene is also present at a concentration of 0.5 μg/L in the associated
equipment blank. The concentration in the sample is greater than 10 times the
concentration of the blank. Therefore, the sample result is considered real and may be
used.
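The 3x/10x comparison applied in Examples 3 through 5 above can be expressed as a simple decision rule, as sketched below. The sketch is provided for illustration only; the function name and the returned text are not part of this guidance, and the outcome must still be reviewed against the CSM as discussed in the following paragraph.

def blank_comparison(sample_conc, blank_conc):
    """Classify a detected result against the associated equipment, trip or
    field blank using the SRP 3x/10x policy. Assumes the compound was detected
    in both the sample and the blank; concentrations must be in the same units
    and corrected for any dilution factors."""
    ratio = sample_conc / blank_conc
    if ratio <= 3:
        return "likely blank contamination; qualify B and discuss"
    if ratio <= 10:
        return "presence considered real; qualify B (quantitative uncertainty)"
    return "blank impact negligible; result may be used"

# Examples 3, 4 and 5 above:
print(blank_comparison(2, 1.0))   # ratio 2  -> likely blank contamination
print(blank_comparison(4, 0.5))   # ratio 8  -> presence considered real
print(blank_comparison(20, 0.5))  # ratio 40 -> blank impact negligible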
The investigator should review all blank related results in relation to the CSM for the
site, including results for other samples in the vicinity, in order to determine if this
evaluation is reasonable before concluding that a compound is or is not site related.
This policy cannot be used to eliminate detections of analytes that can be attributed to
a release or a potential release. Special attention should be paid to concentrations that
may be blank related at or near regulatory/screening levels.
4.6.2.4 Field Duplicates
Field duplicates are replicate or split samples collected in the field and submitted to the
laboratory as two different samples. Field duplicates measure both field and laboratory
precision. Blind duplicates are field duplicate samples submitted to the laboratory
without being identified as duplicates. Duplicate samples are used to evaluate the
sampling technique and homogeneity/heterogeneity of the sample matrix. The results
of field duplicates are reported as the RPD between the sample and duplicate results.
As a conservative approach, the higher of the two results for field duplicate samples
would be compared to the applicable regulatory criteria.
In general, solid matrices have a greater amount of heterogeneity than liquid matrices.
When the RPD for detected constituents (concentrations greater than the RL) is
greater than or equal to 50 percent for nonaqueous matrices or greater than or equal to
30 percent for aqueous matrices, the investigator is advised to consider the
representativeness of the sample results in relation to the CSM. If field duplicates
are not collected and analyzed for the site, then field duplicate precision cannot be
evaluated as part of the DUE.
Field duplicate results should be evaluated along with any laboratory duplicate results
that are available in an attempt to identify whether the issue is related to the sample
matrix, collection techniques, or the laboratory analysis of the sample. (Laboratory
duplicates are obtained from one environmental sample in one sample container that is
extracted and analyzed twice. Refer to Section 4.6.3.4 of this guidance document for
additional information.) If the laboratory duplicates are acceptable, but the field
duplicates are not, the likely source of this lack of reproducibility is heterogeneity of the
matrix or the sampling or compositing technique. If the laboratory duplicates are not
acceptable, laboratory method performance may be the source for the lack of
reproducibility. The RL for the analyte in question must be considered in this evaluation
because, typically, analytical precision decreases as the results get closer to the RL.
One could also evaluate precision by comparing a sample result to a sample duplicate
result (no spiking is performed), although representativeness of the samples could be a
factor when evaluating the results of duplicate analyses. Furthermore, if the results for
a specific analyte are ND in both samples, the evaluation of precision, through
calculation of RPD, cannot be performed.
Example 6: Duplicate Sample Results - Heterogeneity
Analytical results for lead in a field duplicate pair of soil samples were 500 mg/kg
and 1,050 mg/kg. The RL was 1 mg/kg. The RPD for these samples is approximately
71 percent, which is greater than the guideline of 50 percent. The lack of precision for
these sample results indicates that the samples are heterogeneous and may not be
representative of the site location for lead. The investigator is advised to consider the
representativeness of the sample results in relation to the CSM. Additional
investigation and analysis are needed to evaluate the actual concentrations and
distribution of lead at the site.
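The RPD used in Example 6 is the absolute difference between the two results divided by their mean, expressed as a percentage. A minimal sketch of that calculation follows; the function name is illustrative only.

def relative_percent_difference(result_1, result_2):
    """RPD = |x1 - x2| / mean(x1, x2) * 100 for two detected results
    reported in the same units."""
    mean = (result_1 + result_2) / 2.0
    return abs(result_1 - result_2) / mean * 100.0

# Example 6 above: field duplicate lead results of 500 mg/kg and 1,050 mg/kg
print(round(relative_percent_difference(500, 1050), 1))  # 71.0, above the 50 percent guideline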
4.6.3 Laboratory Quality Control Information
The DKQPs and commonly used analytical methods for environmental samples have
been verified to produce reliable data for most matrices encountered. The reliability of
the results to represent environmental conditions is predicated on many factors
including:
• The sample must be representative of field conditions;
• The sample must be properly preserved and analyzed within handling and holding times;
• The preparation steps used to isolate the analytes from the sample matrix must be such that no significant amounts of the analytes are lost;
• The analytical system should not have contamination above the RL;
• The analytical system must be calibrated and the calibration verified prior to sample analysis; and
• No significant sample matrix interferences are present which would affect the analysis.
With the exception of the first bullet, the laboratory can provide the data user with
laboratory QC information that provides insight into these key indicators. The
determination that a sample is representative of the field conditions is based on
reviewing the CSM, the sampling plan, the field team’s SOPs and field logs, and the
results for other samples including field and laboratory duplicates.
The primary laboratory QC data quality information that the investigator considers
during the DQA includes the DKQP “Data of Known Quality
Conformance/Nonconformance Summary Questionnaire,” the chain of custody
form, sample preservation, handling and holding times, RLs, laboratory and field
duplicates, surrogates, MSs and MSDs (when requested by the investigator), method
blanks, and laboratory control samples. However, there are other non-standard types of
QC information (e.g., regulator pressure from a canister) that are required to be
reported by the DKQPs that are described in Appendix B of the DKQP Guidance.
4.6.3.1 Data of Known Quality Conformance/Nonconformance Summary Questionnaire
The DKQP “Data of Known Quality Conformance/Nonconformance Summary
Questionnaire” is used by the laboratory to certify whether the data meet the
requirements for “Data of Known Quality.” The DKQP Data of Known Quality
Conformance/Nonconformance Summary Questionnaire is presented in
Appendix A of the DKQ Guidance and can be found on the NJDEP website at
http://www.nj.gov/dep/srp/guidance/index.html#analytic_methods. All of the
questions on the “Data of Known Quality Conformance/Nonconformance Summary
Questionnaire” should be answered, the questionnaire should be signed, and a
narrative of nonconformances included with the analytical data package. If all of the
questions are not answered, or the questionnaire is not signed, or if a narrative of
nonconformances is not included with the data package, then the investigator should
contact the laboratory to obtain a properly completed questionnaire and/or the
missing narrative. If the laboratory cannot supply the requested information, the
investigator should demonstrate equivalency with the DKQPs for the data set by
following the guidance presented in Sections 5 and 6 of the DKQP Guidance.
4.6.3.2 Reporting Limits
The RL is the lowest concentration that a method can achieve for a target analyte
with the necessary degree of accuracy and precision. As defined in N.J.A.C. 7:26E-2.1(a)3,
the RL for an organic compound is derived from the lowest concentration
standard for that compound used in the calibration of the method as adjusted by
sample-specific preparation and analysis factors (for example, sample dilutions and
percent solids). The RL for an inorganic compound is derived from the concentration
of that analyte in the lowest level check standard (which could be the lowest
calibration standard in a multi-point calibration curve). RLs are method and
laboratory-specific. Laboratories are required to report the RLs for all compounds for
all samples per Appendix A of N.J.A.C. 7:26E.
RLs and their association with meeting standards and/or screening levels present
one of the most significant challenges to laboratories and investigators. A commonly
occurring scenario arises with volatile analyses and the default impact to ground
water standards: when multiple volatile compounds are present, the sample is diluted
because of the presence of one analyte with a very high standard, resulting in an
inability to “see down to the standard” for another compound. This frequently occurs
where samples from petroleum discharge areas must be diluted due to the presence of
compounds such as xylenes and/or ethylbenzene, and the 0.005 mg/kg default impact to
ground water soil screening level for benzene cannot be attained. Dilutions are
performed not only to obtain an accurate concentration but also to prevent temporary
damage to the instrumentation. Where a laboratory has difficulty reporting down to a
low value, the laboratory should perform and report sample results derived from the
lowest level of dilution.
Multiple dilutions and alternative methods of analysis (e.g., gas chromatography with
a photoionization detector, Method 8010/8020) should be considered to obtain the
desired levels of quantitation.
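The effect of a dilution on the RL can be approximated as sketched below. The sketch assumes the common convention that the sample-specific RL is the base RL (derived from the lowest calibration standard) multiplied by the dilution factor and, for soil or sediment samples, divided by the decimal fraction of solids; the exact adjustment is defined by the analytical method and the laboratory's procedures, not by this guidance.

def adjusted_reporting_limit(base_rl, dilution_factor, percent_solids=100.0):
    """Approximate a sample-specific RL from the base RL at a dilution
    factor of 1. percent_solids applies to soil/sediment results reported
    on a dry-weight basis; use 100 for aqueous samples. This is a common
    convention, not a formula prescribed by this guidance."""
    return base_rl * dilution_factor / (percent_solids / 100.0)

# Illustration consistent with Example 7 below: an assumed base RL of 50 ug/kg at a
# dilution factor of 20 yields a reported RL of 1,000 ug/kg.
print(adjusted_reporting_limit(50, 20))  # 1000.0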
Example 7: Reporting Limits and Dilution Factor
Results for soil samples tested for PCE (a primary driver at the site) are ND with a
RL of 1,000 μg/kg at a dilution factor of 20. Dilutions of the samples were
performed when the laboratory determined by pre-screening the samples that
undiluted analyses may cause contamination of the instrument that is difficult and
time consuming to remove and because the analyte concentrations would be above
the calibration curve. However, based on other analyses, it was determined that
there are other drivers that would result in the site undergoing remediation. In this
instance, as the remediation will also remove the PCE (even if it is present above a
regulatory criterion), it would be acceptable for the laboratory to report PCE as ND
with a RL greater than the regulatory level.
Example 8: Reporting Limits and Dilutions
Results for a soil sample tested for BTEX are ND for benzene at a RL of 10 mg/kg, and
xylenes were detected at 800 mg/kg. The sample required a dilution of 100 due to the
concentration of xylenes. (The Residential Direct Contact Soil Remediation Standard is
2 mg/kg for benzene and 12,000 mg/kg for xylenes.) While the concentration of xylenes
is below its regulatory level, the RL for the ND benzene result is above the regulatory
level for benzene. If further delineation is to occur at the site, then the elevated
benzene RL should be noted but should not prevent the investigator from proceeding with
the remediation. If, however, this analysis is to be used for purposes of issuing a RAO,
reanalysis and/or resampling may be required, and/or further remediation may be required
prior to resampling and reanalysis. If it is absolutely necessary for benzene to be
evaluated at or below the regulatory level with high levels of xylenes in the sample,
the laboratory should be contacted to discuss analytical options, which may include
alternative methodologies, sample preparation and/or methods of detection.
Example 9: Reporting Limits
The ND result for PCE for a groundwater sample has a RL of 12 μg/L. The GWQS
for PCE is 1 μg/L. This sample was to be used to demonstrate compliance with the GWQS
for PCE. Because the data cannot be used to show that PCE is not present at a
concentration less than the RL of 12 μg/L, the data are not usable for this project
decision.
4.6.3.3 Method Blanks
Most analytical methods require method blanks. The purpose of the method blank is
to determine the presence and concentration of any contamination associated with
the processing or analysis of the samples at the laboratory. Laboratories are required
to summarize method blank results for all samples per Appendix A of N.J.A.C. 7:26E.
Ideally, method blanks should not contain any detected analytes above the RL, but
for certain tests, low levels of common contaminants are not unusual because of the
nature of the typical commercial analytical laboratory. Common laboratory
contaminants or artifacts include methylene chloride, acetone, and MEK for VOCs, and
any phthalate for SVOCs. A summary of common laboratory contaminants is
presented in Appendix G of this document.
The presence of any detected analytes in method blanks should be noted during the
review of the data. The concentrations of contaminants in method blanks are compared
to any detected analyte concentrations in the associated samples, including field and
trip blanks, taking into account any dilution factors. Analytes present in the blanks,
but ND in the sample, can be ignored. Analytes detected in the laboratory method blank
and detected in any associated sample should be flagged by the laboratory with a "B"
suffix to draw the attention of the data user.
Refer to Section 4.6.2.3 of this document for further information on blank action.
4.6.3.4 Laboratory Duplicates
Laboratory duplicates measure laboratory precision. The analytical results for
laboratory duplicates are reported as the RPD between the sample and duplicate
results. Laboratory duplicates are replicate samples and are prepared by taking two
aliquots from one sample container. Duplicate results are only used to determine
precision and not compliance with a standard and/or criteria.
Laboratory duplicate results should be evaluated along with any field duplicate
results to identify whether any precision issues are related to the sample matrix and
collection techniques or to the laboratory analysis of the sample. Information
regarding the interpretation of duplicate sample results can be found in Section 4.6.2.4
of this document.
4.6.3.5 Surrogates
A surrogate is an organic compound that is similar to the target analyte(s) in
chemical composition and behavior in the analytical process but is not normally
found in environmental samples. Laboratories are required to summarize surrogate
recoveries for all samples per Appendix A of N.J.A.C. 7:26E. Spiking the samples
(including any batch QC such as method blanks and LCSs) with surrogate
compounds prior to extraction and/or analysis and determining the percent recovery
of the spiked surrogate compound evaluates sample matrix effects, accuracy, and
laboratory performance on individual samples. The surrogate concentration is
measured using the same procedures used to measure other analytes in the sample.
Certain analyses that have extensive target compound lists require several
surrogates.
If the reported recovery for a surrogate is outside acceptance criteria for VOCs, then
all VOC results should be considered to be biased high or low depending on whether
the surrogate was higher or lower than the acceptance criteria. For SVOCs, if two or
more surrogates in the same fraction (acid SVOC surrogates or base neutral SVOC
surrogates) are outside acceptance criteria, all results in that fraction should be
considered to be biased high or low depending on whether the surrogate was higher
or lower than the acceptance criteria. For SVOCs, by understanding which
surrogates are related to which target compounds, the percent recovery of a
surrogate can be related to constituents of concern, which may be useful in
evaluating whether or not the data are usable. If a surrogate is not within the DKQ
criteria, the associated quantitative data may be suspect and may require further
scrutiny. Information regarding the surrogates for volatiles, SVOCs, chlorinated
pesticides and aroclors is presented in the tables in Appendix J of this document.
The evaluation of interfering matrix effects or high concentrations of target
compounds that may mask the detection of surrogate recoveries is a complex issue
and not straightforward in some cases. Common problems include the presence of
non-target compounds. The review and evaluation of surrogate compound results
involves the evaluation of multiple lines of evidence and is described in Section 4.6.4
of this document. Data from surrogate results should be used in conjunction with
other QC data, such as LCS and MS. The performance standards for surrogates are
presented in the DKQ protocols (Appendix B of the DKQ Guidance) and in Appendix
D of this document.
Surrogate recoveries may be affected when the sample or sample extract undergoes
dilution. In severe instances, the surrogates may be “diluted out” and no
surrogate recovery is reported. When surrogate recoveries are affected due to
dilutions, the investigator may have to increase his/her reliance on other QC
information such as internal standard response, LCS and MS.
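The screening of reported recoveries against acceptance limits described above can be summarized as sketched below, assuming the recovery limits are supplied from the applicable DKQ protocol or method; the function name and the limits used in the usage lines are placeholders.

def recovery_bias(percent_recovery, lower_limit, upper_limit):
    """Screen a surrogate (or other spike) recovery against acceptance
    limits and indicate the direction of any potential bias."""
    if percent_recovery < lower_limit:
        return "outside limits: potential low bias"
    if percent_recovery > upper_limit:
        return "outside limits: potential high bias"
    return "within limits: accept as reported"

# Consistent with Example 11 below: toluene-d8 recovery of 20 percent, 70-130 limits
print(recovery_bias(20, 70, 130))   # potential low bias
# Consistent with Example 10 below: pyrene-d10 recovery of 159 percent, 30-130 limits
print(recovery_bias(159, 30, 130))  # potential high bias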
Example 10: Surrogates - High Recovery
A soil sample analyzed by Method 8270 was collected to determine if further
remediation was needed.
The percent recovery for the surrogate pyrene-d10 was reported to be 159% and
for the surrogate benzo(a)pyrene-d12 was reported to be 145%. The method
specifies that the recovery limits for SVOC surrogates must be within 30 to 130
percent.
Benzo(a)pyrene was reported at a concentration of 10 mg/kg, which is greater
than the Residential Direct Contact Soil Remediation Standard of 0.2 mg/Kg
applicable in this example.
Since the reported concentration of benzo(a)pyrene is well above the regulatory level
for benzo(a)pyrene, the potential high bias indicated by the surrogate recoveries does
not affect the usability of the results for this purpose, and further remediation is
needed.
Example 11: Surrogates - Low Recovery
A soil sample was analyzed by Method 8260. The intended use of the analytical data
was to determine if contaminants were present at concentrations that exceed the
applicable regulatory level (Impact to Ground Water Screening Level in this
example).
The percent recovery for the surrogate Toluene-d8 was reported to be 20
percent. The DKQ protocol specifies that the recovery limits for surrogates
should be within 70 to 130 percent for this method. Because the reported
recovery for this surrogate is below the acceptance criteria, all VOC results may be
biased low.
1,1,1-Trichloroethane was reported at a concentration of 0.1 mg/Kg, which is
just below the regulatory level (of 0.2 mg/Kg).
The reported percent recovery for the surrogate toluene-d8 indicates a potential low
bias for 1,1,1-trichloroethane. Because the reported concentration of 1,1,1-
trichloroethane is just below the regulatory level, the reported potential low bias
means the results should not be used to determine that 1,1,1-trichloroethane is
present at a concentration less than the regulatory level. Before drawing any
conclusions regarding the effect of the low bias reported by the surrogate, the
investigator should consider using multiple lines of evidence, as described in Section
4.6.4 of this document. This example is evaluated further in Appendix J of this
document, with Example J-1 using multiple lines of evidence.
4.6.3.6 Laboratory Control Samples (LCS)
Laboratory control samples (sometimes referred to as blank spikes) are used to monitor
the accuracy of the analyst(s) performing the laboratory method. The LCS should
contain all target analytes. By evaluating the accuracy of the LCS analysis (percent
recovery of the target analytes), one can evaluate the laboratory performance of the
entire analytical process. The evaluation of results of LCS involves the evaluation of
multiple lines of evidence, as described in Section 4.6.4 of this document. Data from
LCS should be used in conjunction with other QC data. The performance standards
for LCS are presented in the DKQ protocols (Appendix B of the DKQ Guidance) and
in Appendix D of this document. When required by the method, laboratories are
required to summarize LCS recoveries associated with the samples from your site
per Appendix A of N.J.A.C. 7:26E.
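An LCS recovery is the measured concentration of a spiked analyte divided by the spiked (true) concentration, expressed as a percentage, and is compared against the applicable acceptance limits. A minimal sketch follows; the spiked concentration and limits in the usage lines are placeholders only.

def lcs_percent_recovery(measured_conc, spiked_conc):
    """Percent recovery of a target analyte in a laboratory control sample."""
    return measured_conc / spiked_conc * 100.0

# Placeholder values: benzene spiked at 50 ug/L and measured at 27 ug/L
recovery = lcs_percent_recovery(27, 50)
print(round(recovery))        # 54, as in Example 12 below
print(70 <= recovery <= 130)  # False: outside the 70-130 percent limits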
Example 12: Laboratory Control Samples - Low Recovery
Groundwater samples were analyzed by DKQ Method 8260. The purpose of
sampling was to determine compliance with Regulatory criteria. The GWQS for
benzene is 1 μg/L.
The results for the LCS indicate a 54 percent recovery for benzene. The DKQ
protocol specifies that the recovery limits for the LCS should be within 70 to 130
percent.
The analytical results were ND for benzene at a RL of 0.5 μg/L.
The results of the laboratory control sample indicate a possible low bias in the
accuracy of the method. The reported result could have been affected by the low bias of
the method, and therefore it is possible that benzene is present at a concentration that
is not below the GWQS even though it was reported as ND. Before drawing any conclusions
regarding the effect of the low
bias reported associated with the LCS, the investigator should consider using
multiple lines of evidence, as described in Section 4.6.4 of this document.
Resampling and reanalysis may be appropriate. This example is further evaluated in
Appendix J of this document, with Example J-2 using multiple lines of evidence.
Example 13: Laboratory Control Samples - High Recovery
Groundwater samples were analyzed using DKQ Method 8260. The purpose of
sampling was to evaluate groundwater contamination prior to the start of
remediation. The GWQS for trichloroethene (TCE) is 1 μg/L.
The LCS indicates a 190 percent recovery for TCE, which was detected in the
sample at a concentration of 10 μg/L. DKQ Method 8260 specifies that the
recovery limits for the LCS should be within 70 to 130 percent.
The results for the LCS sample indicate a potential high bias. However, the reported
concentration of TCE is over the GWQS. Therefore, this high bias does not affect the
usability of the data for the intended purpose.
4.6.3.7 Matrix Spike/Matrix Spike Duplicates and Matrix Spike/Matrix Duplicate
The purpose of a MS sample is to determine whether the sample matrix contributes
bias to the analytical results. A MS is an environmental sample to which known
quantities of target analytes are added or spiked by the laboratory prior to sample
analysis. A matrix spike/matrix spike duplicate (MS/MSD) pair is prepared by spiking
two aliquots of an environmental sample with all target analytes. (Please keep in
mind that at such time in an investigation where site-specific concerns have reduced
the number of target analytes/compounds from a “full” list to a subset thereof, then
the MS/MSD fortifications may contain only the site-specific compounds of concern.)
Certain protocols do not require spiking with all analytes. However, DKQPs, with the
exception of air methods, do require the spiking of all target analytes. The two
aliquots are analyzed separately, and the results are compared. A MS can be used
to evaluate method accuracy, while a MS/MSD pair can be used to evaluate both
precision and accuracy. MS should not be performed on trip, equipment, or field
blanks. For analysis of samples for organic analytes, a MS/MSD pair is typically
performed. For inorganic analysis, a matrix spike/matrix duplicate (MS/MD) is
typically performed, although a MS/MSD pair is acceptable. Samples chosen for
MS/MSD and MS/MD analysis should be similar in geological/chemical characteristics to
the actual site samples. It should be noted that the samples chosen are frequently from
“other sites.” Although this practice is not prohibited (and the use of site-specific
QA/QC is generally not required), MS/MSD and MS/MD results from other sites need to be
used with discretion; they may or may not add value to the data assessment process.
When required by the method, laboratories are
required to summarize MS/MSD results per Appendix A of N.J.A.C. 7:26E.
To evaluate accuracy, the result of the spiked sample is compared against the result of
the unspiked sample, and the percent recoveries of the spiked compounds are calculated
(taking into consideration any concentration of the compounds already present in the
unspiked sample). To evaluate precision, the results of the matrix spike are compared
to those of the matrix spike duplicate. Poor recoveries may be the result of matrix
interference and indicate that the sample results have a significant bias. The RPD
between a set of duplicate results (either a sample and duplicate pair or a MS/MSD
pair) is used to evaluate precision. High RPDs may indicate a lack of sample
homogeneity. Poor recoveries or high RPDs can also be caused by laboratory error, which
would affect the interpretation of results.
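The accuracy and precision calculations described above can be summarized as follows: the matrix spike percent recovery is commonly calculated from the difference between the spiked and unspiked results relative to the amount spiked, and the RPD is calculated between the MS and MSD recoveries (or results). The function names and the usage values below are illustrative assumptions, chosen to be consistent with the recoveries and RPD shown in Example 14 below.

def matrix_spike_recovery(spiked_result, unspiked_result, spike_added):
    """Percent recovery of a spiked compound, corrected for the native
    concentration already present in the unspiked sample."""
    return (spiked_result - unspiked_result) / spike_added * 100.0

def relative_percent_difference(x1, x2):
    """RPD between the MS and MSD recoveries (or a sample/duplicate pair)."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

# Assumed illustration: lead spiked at 50 ug/L into a sample containing 4 ug/L
ms_recovery = matrix_spike_recovery(18, 4, 50)   # 28 percent
msd_recovery = matrix_spike_recovery(20, 4, 50)  # 32 percent
print(round(relative_percent_difference(ms_recovery, msd_recovery), 1))  # 13.3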
The sample submitted for MS/MSD evaluation should be representative of the
potentially contaminated matrix. Ideally, the sample selected for MS/MSD should be
spiked at a concentration which will allow for measurement of the spiked sample
matrix. (If the concentrations of compounds of concern in the unspiked sample are
very high, then it may be necessary to spike the sample at a high level
concentration.) The laboratory will need additional sample quantity when MS/MSDs
are requested and the need for these QC samples must be addressed prior to
sample collection.
The evaluation of precision and accuracy using MS/MSDs or sample/duplicate
results is a complex issue and not straightforward in some cases. For organics, the
results of the MS/MSD only impact the sample used for the spike while for metals,
the MS/MSD or MS/MD affect the entire associated batch. Common problems
include interfering matrix effects or high concentrations of target compounds or non-
target compounds that mask the detection or quantitation of spiked compounds. This
review and evaluation involves the evaluation of multiple lines of evidence, as
described in Section 4.6.4 of this document. Data from MS results should be used in
conjunction with other QC data, such as LCS, duplicate samples, and surrogates.
The performance standards for MS/MSDs are presented in the DKQ protocols
(Appendix B of the DKQ Guidance) and in Appendix D of this document.
Example 14: Matrix Spike/Matrix Spike Duplicates - Low Recovery
A water sample was evaluated for metals by DKQ Method 6010. The intended
purpose of the analysis was to confirm that remediation was needed.
Lead was detected at 4 ug/L. The effective GWQS is 5 ug/L.
The MS/MSD percent recoveries for lead were 28 percent and 32 percent. The
DKQ protocol specifies that MS/MSD spike recovery limits should be from 75
percent to 125 percent.
The RPD for the MS/MSD pair is 13.3 percent. The DKQ protocol specifies that
RPD should be less than 30 percent for the MS/MSD pair.
All other QC criteria were within the DKQ protocol acceptance criteria.
The RPD for the MS/MSD was well within the acceptance criteria specified in DKQ
protocol, indicating acceptable laboratory precision for the site matrix for the method
of analysis. The MS/MSD percent recoveries indicated a potential low bias for lead.
Therefore, these results should not be used to indicate lead was below the GWQS
for lead.
Care must be taken in evaluating the MS/MSD recoveries if the unspiked sample
contains high concentrations of compounds used in the spike.
Example 15: Matrix Spike/Matrix Spike Duplicates - High Recovery
A residential soil sample was analyzed by DKQ Method 8260 for VOCs. The
intended use of the data is to determine compliance with the residential direct
contact soil remediation standard.
TCE was reported at a concentration of 8 mg/kg, which is just above the
residential direct contact soil standard of 7 mg/kg.
The percent recoveries for TCE generated by a MS/MSD pair are 180 and 185
percent respectively. According to the DKQ protocol, the recovery limits for the
MS/MSD should be within 70 to 130 percent.
The RPD for the MS/MSD pair is 2.7 percent. The RPD should be less than 30
percent for the MS/MSD pair.
The spike recoveries indicate a potential high bias for trichloroethene. Because of
the reported high bias and the sample result just above the soil standard, the actual
concentration of TCE in the sample may be lower and may be less than the soil
standard. However, the investigator cannot adjust the concentrations of the reported
values lower. The RPD for the MS/MSD pair was within the acceptance criteria
specified in the DKQ protocol; therefore, the MS/MSD results show an acceptable degree
of precision. Further evaluation of these results in conjunction with multiple lines of
evidence, as described in Section 4.6.4 of this document, is needed to assess this
potential high bias. This example is evaluated further in Appendix J of this document,
with Example J-3 using multiple lines of evidence.
4.6.3.8 Internal Standards
The purpose of an internal standard is to determine the existence and magnitude of
instrument drift and physical interferences. Internal standard performance criteria
ensure that the instrument’s sensitivity and response are stable (i.e., the analytical
behavior of compounds is uniform in each analytical run) during each analysis.
Laboratories are required to submit internal standard summaries for all samples per
Appendix A of N.J.A.C. 7:26E.
Per the analytical methods, target compounds are associated with and quantitated against
specific internal standards. Refer to Appendix J in this guidance document for
specific compound-to-internal standard associations. When internal standard results
deviate from acceptance criteria, the associated analytical results are considered
unreliable and thus qualified
as estimated values. When internal standard acceptance criteria are not met, all
quantitative data associated with the non-compliant internal standard may be
suspect. When the internal standard result is below the lower limit of the acceptance
range, RLs may be suspect. Information regarding the internal standards for volatiles
and SVOCs and their corresponding target compounds and surrogates is presented in
Appendix J of this document.
4.6.3.9 Serial Dilutions (ICP and ICP/MS)
The purpose of a serial dilution is to determine whether or not physical or chemical
interferences exist (on an analyte-specific basis) with the analysis of samples for
metals due to the sample matrix. If an analyte concentration is sufficiently high (i.e.,
minimally, a factor of 10 above the RL), an analysis of a 1:5 dilution should agree
within +/- 10% of the original sample result. Serial dilutions are required for analyses
by ICP and less frequently by ICP/MS. Analytes whose concentrations are outside the
10% difference in sample concentration (i.e., 90 - 110% of the original result) are
quantitatively qualified.
Laboratories are required to submit serial dilution summaries for all samples per
Appendix A of N.J.A.C. 7:26E.
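The serial dilution check described above can be expressed as the percent difference between the original result and the dilution-corrected result of the 1:5 dilution, applied only when the original result is sufficiently above the RL. A minimal sketch with placeholder values follows.

def serial_dilution_acceptable(original_result, diluted_result,
                               reporting_limit, dilution_factor=5):
    """Apply the serial dilution check: when the original result is at least
    10 times the RL, the dilution-corrected result should agree with the
    original within +/- 10 percent."""
    if original_result < 10 * reporting_limit:
        return True  # check not applicable at lower concentrations
    corrected = diluted_result * dilution_factor
    percent_difference = abs(original_result - corrected) / original_result * 100.0
    return percent_difference <= 10.0

# Placeholder values: iron at 2,000 ug/L (RL of 100 ug/L); the 1:5 dilution reads 350 ug/L
print(serial_dilution_acceptable(2000, 350, 100))  # False: 12.5 percent difference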
4.6.3.10 Interference Check Solution
The commonly occurring analytes aluminum, iron, magnesium and calcium may
cause interferences with the detection and/or quantitation of other analytes. The
instrument can correct for these interferences. The purpose of the Interference
Check Solution (ICS) is to demonstrate the instrument’s ability to overcome
interferences and report data for analytes of concern within an acceptable accuracy
of 80 - 120% of the actual spiked amount. The effects of the ICS results are applied
to all samples within the associated analytical batch. Laboratories are required to
submit ICS summaries for all samples per Appendix A of N.J.A.C. 7:26E.
In general, the ICP sample data can be accepted if the concentrations of aluminum,
iron, magnesium and calcium in the field sample are found to be less than
or equal to their respective concentrations in the ICS. If analytes aluminum, iron,
magnesium and calcium are present in a field sample at levels greater than the ICS,
then the following should occur:
Example 16: ICS - Low Recovery
Groundwater samples were analyzed by DKQ Method 6010. The purpose of
sampling was to determine compliance with Regulatory criteria. The GWQS for
Arsenic and Cadmium are 3 ug/L and 4 ug/L, respectively.
The results for the ICS indicate a 54 percent recovery for arsenic and 60 percent
recovery for cadmium. The DKQ protocol specifies that the recovery limits for the
ICS should be within 80 to 120 percent.
The analytical results were at the GWQS: 3 ug/L for arsenic and 4 ug/L for cadmium.
The results of the ICS indicate a possible low bias in the accuracy of the method.
The results reported could have been affected by the low bias of the method, and
therefore it is possible that arsenic and cadmium may be above the GWQS. Before
drawing any conclusions regarding the effect of the low bias reported associated with
the ICS, the investigator should consider using multiple lines of evidence, as
described in Section 4.6.4 of this document. Resampling and reanalysis may be
appropriate. This example is further evaluated in Appendix J of this document, with
Example J-4 using multiple lines of evidence.
Example 17: ICS - High Recovery
Soil samples were analyzed using DKQ Method 6010. The purpose of sampling was
to evaluate if the soil samples exceeded the residential direct contact soil
remediation standard for lead. The residential direct contact soil remediation
standard for lead is 400 mg/kg.
The ICS indicates a 150 percent recovery for lead, which was detected in the
sample at a concentration of 1000 mg/kg. The DKQ protocol for method 6010
specifies that the recovery limits for the ICS should be within 80 - 120 percent.
The results for the ICS sample indicate a potential high bias. However, the reported
concentration of lead is much greater than the applicable standard. Therefore, this
high bias does not affect the usability of the data for the intended purpose. Further
remediation would be required.
4.6.3.11 Matrix Spikes and Duplicates
The purpose of a matrix spike and duplicate is to determine whether the sample
matrix contributes bias to the analytical results. The sample that is spiked should be
representative of the soil type from the site under investigation/remediation.
Documenting the effect of the matrix for a given preparation batch consisting of
similar sample characteristics should include the analysis of at least one matrix spike
and one duplicate unspiked sample or one matrix spike/matrix spike duplicate pair.
The decision of whether to prepare and analyze duplicate samples or MS/MSD
should be based on knowledge of the samples in the sample batch or as noted in the
QAPP. If samples are expected to contain target analytes, then the laboratory may
use one MS and a duplicate analysis of an unspiked field sample. If samples are not
expected to contain target analytes, then the laboratory should use a MS/MSD.
Unknown source investigations should employ the use of a MS/MSD.
Sample requirements are specified in the DKQ methods attached to the NJDEP Site
Remediation Program, Data of Known Quality Protocols Technical Guidance, April 2014.
Actions to be taken on affected samples are the same as those noted in
Section 4.6.3.7 above. However, qualifications of data affected by MS, MS/MSD and
duplicate outliers affect all samples associated with the corresponding digestion
batch.
4.6.3.12 Internal Standards for ICP/MS (for Metals)
The purpose of internal standards is to determine the existence and magnitude of
instrument drift and physical interferences. Internal standards are added to every
sample, calibration standard and QC sample. Laboratories are required to submit
internal standard summaries for all samples per Appendix A of N.J.A.C. 7:26E. If the
QC criteria are not met, then the sample must be diluted five-fold and reanalyzed
with the appropriate amounts of internal standard. If the first dilution does not correct
the deficiency, then the procedure should be repeated until the internal standard
intensities fall within the method-defined acceptance criteria.
4.6.4 Using Multiple Lines of Evidence to Evaluate Laboratory QC Information
The use of several different types of laboratory QC information as multiple lines of
evidence to understand complex QC issues is an important component of DUEs. A
conclusion about possible bias in data should not be drawn until the results of all QC
samples are assessed since cumulative quality control effects may confound results.
The following examples illustrate the evaluation of commonly reported QC information
using a “multiple lines of evidence” approach. The investigator should seek
experienced assistance, as needed, when evaluating QC data involving multiple lines
of evidence. These examples are intended to build on the information presented earlier
in this document. Additional examples using multiple lines of evidence are also
presented in Appendix J of this document.
Example 18: Multiple Lines of Evidence - Low Recovery for LCS and MS/MSD
A soil sample was analyzed by DKQ Method 8260 for VOCs. The intended purpose of
the analysis was to evaluate the concentrations of VOCs that were present at a release
area.
The reported concentrations of the constituents of concern are just below (e.g.,
the concentrations are 9 ug/Kg and the regulatory levels are 10 ug/Kg) the
applicable regulatory criteria.
The percent recoveries for TCE generated by a MS/MSD pair are low and are
less than 45 percent. According to the DKQ protocol, the recovery limits for the
MS/MSD should be within 70 to 130 percent.
LCS percent recoveries are low and are less than 35 percent. The DKQ protocol
specifies that the recovery limits for the LCS should be within 70 to 130 percent.
About 25% of the DKQ Method 8260 target compounds, including TCE, are
outside of the acceptance criteria specified in the DKQ protocol.
In this example, the most important QC component is the LCS data as it is indicative of
the overall performance of the laboratory. MS/MSDs evaluate method precision and
accuracy in relation to the sample matrix. LCSs evaluate the laboratory's performance.
The QC sample results indicate consistent low bias associated with both the sample
analysis and the laboratory's performance for the analysis of TCE; however, the LCS
results indicate laboratory performance issues. The LCS is a measure of how well the
laboratory can perform a given method in a clean sample matrix. Failure to get
adequate LCS recoveries can indicate a problem with the compound-specific results
for the samples associated with the LCS. Therefore, the actual concentrations of the
constituents of concern may be higher than reported and actually above the regulatory
level.
The investigator may need to contact the laboratory for guidance on how to best
resolve issues associated with the failure of an LCS to meet acceptance criteria.
Reanalysis of the samples (if within holding time), use of alternative analytical
methods, or collection of additional samples may be necessary to obtain data that
could be used to demonstrate that the reported concentrations are less than the
applicable regulatory criteria.
Example 19: Multiple Lines of Evidence - Low MS/MSD Recovery
A soil sample was analyzed by DKQ Method 8260 for VOCs. The intended purpose of
the analysis was to evaluate the concentrations of VOCs that were present due to a
discharge.
The reported concentrations of the constituents of concern are ND, and the RLs
are well below (e.g., a factor of 100 times lower) the applicable regulatory criteria.
The MS/MSDs are from the site being investigated.
The MS/MSD recoveries were outside acceptance limits. Recoveries were in the
40-50% range. According to the DKQ protocol, the recovery limits for the
MS/MSD should be within 70 to 130 percent.
The results for the surrogates and the LCS were within acceptance limits.
The results for the surrogates and the laboratory control sample indicate that
laboratory and method performance are acceptable and that the data are not biased based
on these QC indicators. The results for the MS/MSD indicate a potential low bias, but
as no compounds were detected and the RLs were far below the regulatory criteria,
there is no significant impact on the usability of the data.
4.6.5 Data Usability Evaluations for Non-DKQ Analytical Data
In order to evaluate if Non-DKQ data can be used to support environmental decision-
making, the investigator should go through a multi-step evaluation process. One
objective of that evaluation would be to make a decision as to whether additional data
collection is necessary to corroborate the Non-DKQ data or whether the quality of the
Non-DKQ data is such that it could be used for its intended purpose without the
collection of additional data. Such an evaluation process includes the following steps:
The QAPP should identify acceptance criteria for the non-DKQP methods and
the associated DQOs.
Perform a DQA and DUE to evaluate precision, accuracy and sensitivity. The
investigator must evaluate the RLs, method detection limits (if available),
handling and holding times, sample preservation, and results of QC measures
(surrogates, LCS, MS/MSD or MS/MD, method blank results). Review any data
narratives which may explain issues with sample receipt and analysis.
Consider such factors as the age of previously generated data, limitations and
benefits of analytical method(s), laboratory QA/QC results, and how any of those
factors might affect the quality of the data or the usability of the data with respect
to its intended purpose.
Determine whether any newer data corroborate the older results and whether all
sets of data are consistent with the CSM.
Review available field collection information, preservation techniques, filtering, et
cetera for the older samples to evaluate how those techniques compare to
current knowledge and how any differences from more recent scientific
perspectives might affect the quality of the data.
Consider decisions that have already been made based on the old data.
Consider future decisions that will be made based on the old data.
Consider any other site-specific factors.
NJDEP expects that more scrutiny regarding the quality of previously generated data
will be necessary when the investigator intends to use that data to demonstrate
compliance with applicable regulations than when that data are used to design
additional data collection activities.
If the investigator does not fully understand all of the issues associated with the data
quality assessment of non-DKQP data, then it is highly recommended that they consult with
experts more knowledgeable in this field. The investigator may seek additional
guidance from the Department’s or USEPA Region 2 SOPs. Region 2 data validation
guidance documents and SOPs may be found at
http://www.epa.gov/region2/qa/documents.htm
4.6.6 Data Usability Evaluations Using Multiple Lines of Evidence from DQOs and
the CSM
Using multiple lines of evidence during a DUE is not limited to the use of analytical QC
data. Multiple lines of evidence using DQOs and CSM can also be used to determine if
the quality of the analytical data is adequate for the intended purpose. The DQOs are
used to determine if a sufficient quantity and quality of analytical data was generated to
meet the goals of the project and support defensible conclusions that are protective of
human health and the environment. Information regarding the DQOs is presented in
Section 2.1 of this guidance document. The investigator will also evaluate the analytical
data in relation to the CSM to determine if any significant data gaps result from the
quality of the data. For these evaluations, the SRP expects that the investigator will
use an approach that is fully protective of human health and the environment. This
evaluation includes, but is not limited to, the following actions:
Evaluate the analytical data to determine if the DQOs for precision, accuracy,
representativeness, comparability, completeness and sensitivity are met.
Evaluate the entire body of information (type, amount, and quality of data) available
for the specific area/discharge for which the data are presumed to be
representative.
Determine whether the data are consistent with the CSM and if any significant
data gaps are present.
Consider the effects of having insufficient and/or inaccurate information relative
to the risk to potential receptors and the risk to human health and the
environment.
Consider the source of data (e.g., whether the data were generated by the
investigator’s own firm or some other firm, the investigator’s own involvement
with the project, the method of collection for the samples, and the reporting
methods by other firms/laboratories generating the data). Perform a critical
review of these data to evaluate their reliability.
Consider any other site-specific factors.
In addition to the items listed above, the reader should also refer to the Data Usability
Evaluation Worksheet presented in Appendix H-2 for further information to consider
during this evaluation.
4.6.7 Factors to be Considered During Data Usability Evaluations
Factors that must be considered during DUEs are presented below:
Adjusting analytical results reported by the laboratory based on laboratory QC
information is not appropriate. For example, if the results for a matrix spike
indicate a percent recovery of 150%, it is not scientifically valid to adjust the
results downward by 50%. If a contaminant is reported in a blank, it is never
appropriate to subtract the concentration found in the blank from the sample results.
False positives can occur due to contamination from common laboratory contaminants,
interferences in the laboratory methods themselves, or sample preservation procedures.
For example, methyl ethyl ketone can be formed when
sodium bisulfate is used to preserve a soil sample for volatile organic compound
analysis. The investigator should contact the laboratory for assistance when the
results do not make sense in relation to the CSM.
In addition to evaluating high or low bias, it is also necessary to consider
indeterminate or non-directional bias caused by high RPDs or conflicting biases
in the data. High RPDs may indicate a lack of sample homogeneity and raise
questions regarding the representativeness of the sample.
The investigator is responsible for evaluating overall data quality and usability
and should not ask the laboratory to perform the DQA or the DUE of the data
(e.g., it is not appropriate to have the laboratory complete the NJDEP Full
Laboratory Data Deliverable Form). If the laboratory is required by the
investigator to complete the NJDEP Full Data Deliverable Form and/or the
NJDEP Reduced Deliverable Form, the investigator is forewarned that they and
not the laboratory are responsible for the content of that information.
It is important that the meaning of laboratory acceptance criteria be understood
when evaluating QC results. The purpose of acceptance criteria is to define a
range where data are acceptable as reported. Any data within an acceptable
recovery window is appropriate for use. When QC results and information are
within acceptance criteria, the reported value is “accepted” as the concentration
that should be used for decision-making purposes.
Results from surrogate analytes do not automatically indicate that a QC issue
exists for a specific compound. Matrix spikes are used to evaluate the
performance of a specific compound in the spiked sample.
Soil and sediment results should be reported on a dry-weight basis. Tissues are
reported on a wet-weight basis. If sample results are reported incorrectly the
laboratory should be contacted for assistance.
Sample heterogeneity issues or RL issues are to be considered when evaluating
total results and results following SPLP or Toxicity Characteristic Leaching
Procedure (TCLP) extraction. For example, the total sample results of analysis
for VOCs may be "ND," while the results for the SPLP or TCLP leachate indicate
the presence of VOCs at substantial concentrations.
It is inappropriate to conclude in all instances that because the matrix spike and
matrix spike duplicate results are biased low, the contaminants are bound up in a
sample matrix that has not undergone some form of treatment, and therefore the
low bias is irrelevant. (There may be instances where the compounds of concern
do exhibit low MS/MSD recoveries due to a treatment of the matrix designed for
exactly that purpose.) The investigator should contact the laboratory to
determine, if possible, how to overcome such matrix interference issues. An
evaluation to determine if a compound is bound up in the sample matrix is
outside of the scope of this document and may involve a significant study.
It is important to work with the laboratory to minimize analytical difficulties or bias.
There are several options for sample clean-up and analysis. Typically, sediment
samples for pesticides or PCBs need extensive sample clean-up because
naturally occurring interferences can cause analytical problems. Should the
resultant effect of cleanup be an increase in the RL, the laboratory should contact
the investigator and inquire as to how the laboratory is to proceed.
4.6.8 Documentation of Data Quality Assessments and Data Usability Evaluations
Documentation of the thought process used, as well as the outcomes of the DQA and
DUE, is essential to support the investigator's decisions regarding the usability of the
analytical data for the intended purpose. This
documentation is a thoughtful and succinct evaluation and presentation of the findings
and conclusions of the DQA and DUE process. NJDEP expects that this
documentation will be presented in the documents submitted to the Department where
the analytical data are used to support the investigator’s opinion that the quality of
analytical data is appropriate, or not appropriate, for the intended purpose(s).
As stated previously, there are various ways to document this information, including the
DQA Worksheets in Appendix D, the NJDEP Full Laboratory Data Deliverable Form
(available at http://www.nj.gov/dep/srp/srra/forms/), the DUE Worksheet in Appendix I of
this document, and the text of the document that uses the analytical data. The DQA and DUE
worksheets may be modified by the user as deemed appropriate, provided the end
result meets the objectives expressed in this guidance document.
Typical documentation of a DQA and DUE includes a written summary regarding data
usability and DQA and DUE Worksheets. The report that presents the analytical data
should also include:
The laboratory reports, laboratory narratives, Data of Known Quality
Conformance/Nonconformance Summary Questionnaire, and chain of custody
form;
Project communication forms (if used); and
Any other pertinent information.
The investigator should work with the laboratory to receive the analytical data in a
convenient format, particularly if the laboratory report is provided electronically. The use
of electronic deliverables from the laboratory can streamline the transfer of data into
computer spreadsheets and databases, which in turn will make the DQA and DUE more
efficient to perform.
REFERENCES
Connecticut Department of Environmental Protection, Guidance for Collecting and Preserving
Soil and Sediment Samples for Laboratory Determination of Volatile Organic Compounds,
Version 2.0, effective March 1, 2006.
Connecticut Department of Environmental Protection, Site Characterization Guidance
Document, effective September 2007.
Connecticut Department of Environmental Protection, Laboratory Quality Assurance and Quality
Control Guidance, Reasonable Confidence Protocols, Guidance Document, effective November
19, 2007.
Connecticut Department of Environmental Protection, Reasonable Confidence Protocols, for
various analytical methods.
Florida Department of Environmental Protection, Process for Assessing Data Usability, DEP-EA
001/07, MWB Draft 9/6/07 (v2).
Massachusetts Department of Environmental Protection, Bureau of Waste Site Cleanup, The
Compendium of Quality Assurance and Quality Control Requirements and Performance
Standards for Selected Analytical Methods Used in Support of Response Actions for the
Massachusetts Contingency Plan (MCP).
Massachusetts Department of Environmental Protection, Bureau of Waste Site Cleanup, MCP
Representativeness Evaluations and Data Usability Assessments, Policy #WSC-07-350,
September 19, 2007.
U.S. Army Corps of Engineers, Environmental Quality Assurance for HTRW Projects, Engineer
Manual EM 200-1-6, October 10, 1997.
United States Environmental Protection Agency (EPA) Quality Assurance guidance document:
Guidance on Systematic Planning Using the Data Quality Objective Process (QA/G-4), February
2006, EPA/240/B-06/001.
U.S. Environmental Protection Agency, Office of Superfund Remediation and Technology
Innovation (5201G), Introduction to the Contract Laboratory Program, April 2004, EPA 540-R-
03-005, OSWER 9240.1-41.
U.S. Environmental Protection Agency, Guidance on Environmental Data Verification and Data
Validation, EPA QA/G-8, November 2002, EPA/240/R-02/002.
U.S. Environmental Protection Agency, EPA New England, Region 1, Quality Assurance Unit
Staff, Office of Environmental Measurement and Evaluation, "Data Validation Functional
Guidelines for Evaluating Environmental Analyses," July 1996, Revised December 1996.
U.S. Environmental Protection Agency, EPA New England Region I, Compendium of Quality
Assurance Project Plan, Requirements and Guidance, Final October 1999, Attachment A.
U.S. Environmental Protection Agency, EPA-NE, Region I, Quality Assurance Project Plan
Manual, Draft, September 1998, Table 4, pg. 83-87.
U.S. Environmental Protection Agency, Office of Solid Waste and Emergency Response,
Guidance for Data Usability in Risk Assessment (Part A), Final, April 1992, EPA 9285.7, PB92-
963356.
U.S. Environmental Protection Agency, Guidance for Quality Assurance Project Plans, February
1998, EPA QA/G-5 EPA 600-R-98-018.
U.S. Environmental Protection Agency, Office of Emergency Response, Quality Assurance
Guidance for Conducting Brownfields Site Assessments, September 1998, EPA 540-R-98-038,
OSWER 9230.0-83P, PB98-963307.
Appendix A
Supplemental Information on Data Quality Objectives
and Quality Assurance Project Plans

APPENDIX A
SUPPLEMENTAL INFORMATION ON DATA QUALITY OBJECTIVES
AND QUALITY ASSURANCE PROJECT PLANS
Data Quality Objectives (DQOs) are project-specific goals for an environmental investigation that address the
generation, assessment, and intended use of the data associated with that investigation. DQOs express the
qualitative and quantitative measures that will be used to determine whether the amount and quality of data
associated with the investigation are sufficient and sufficiently accurate to draw the conclusions that will be
necessary. Information on developing Data Quality Objectives can be found in the United States Environmental
Protection Agency (EPA) Quality Assurance guidance document: Guidance on Systematic Planning Using the
Data Quality Objective Process (QA/G-4), February 2006, EPA/240/B-06/001.
A Quality Assurance Project Plan (QAPP) documents the planning, implementation, and assessment
procedures for a particular project, as well as any specific quality assurance and quality control activities. It
integrates all the technical and quality aspects of the project in order to provide a "blueprint" for obtaining the
type and quality of environmental data and information needed for a specific decision or use. All work
performed or funded by EPA that involves the acquisition of environmental data must have an approved
QAPP. In these instances, the State of New Jersey Department of Environmental Protection and EPA must
review all QAPPs prior to the commencement of any monitoring component of the project. All QAPPs shall be
written in conformance with N.J.A.C. 7:26E-2.2 and the Site Remediation Program's "Technical Guidance for
Quality Assurance Project Plans." These and other quality assurance documents can be accessed at the
following websites:
www.epa.gov/region1/lab/qa/qualsys.html and
http://www.nj.gov/dep/srp/guidance/index.html
Appendix B
QC Information Summary and
Measurement Performance Criteria

APPENDIX B-1
SUMMARY OF QUALITY CONTROL CHECKS AND SAMPLES
QC Sample or Activity Used to Assess Measurement Performance: Frequency*
(Measurement Performance Criteria for all entries: see Appendix D-4)
Field Duplicate: One in 20 samples per matrix for each parameter
Site Specific Matrix Spike, Matrix Spike Duplicate (MS/MSD) Pair: One in 20 samples, one MS/MSD per matrix for each parameter
Laboratory Control Sample, Laboratory Control Sample Duplicate (LCS/LCSD) Pair: One per batch of up to 20 samples per matrix
Field Blank: Project specific
Equipment Blank: One in 20 samples with non-dedicated equipment
Trip Blank: One per cooler per event (VOCs only)
Performance Evaluation Sample: Project specific
Inter-Lab Split Samples: Project specific
Methanol Trip Blank: Project specific
*Frequency determined by method and/or project-specific requirements
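For illustration, the one-in-20 frequencies above can be translated into planned QC sample counts during project planning. The sketch below is an example of that arithmetic only; method- or project-specific requirements may supersede it.

# Illustrative sketch: planned QC sample counts from the one-in-20 rule.
import math

def qc_sample_counts(field_samples_per_matrix):
    per_20 = math.ceil(field_samples_per_matrix / 20)
    return {"field_duplicates": per_20, "ms_msd_pairs": per_20}

print(qc_sample_counts(45))  # {'field_duplicates': 3, 'ms_msd_pairs': 3}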
APPENDIX B-2
TYPES OF INFORMATION USED TO EVALUATE
PRECISION, ACCURACY, REPRESENTATIVENESS, COMPARABILITY,
COMPLETENESS AND SENSITIVITY
QC Element Laboratory Measures Field Measures
Precision
Laboratory Control Sample/
Laboratory Control Sample Duplicate
Pair
Field Duplicates
Matrix Spike/Matrix Spike Duplicates pairs
(collect samples for)
Matrix Duplicate (collect samples for)
Matrix Spike Duplicates
Historical Data Trends Appropriate Sampling Procedure
Accuracy
Laboratory Control Samples
Matrix Spikes/Matrix Spike Duplicates
(collect samples for)
Matrix Spikes and Matrix Spike
Duplicates
Inclusion of “Blind” Samples
Internal Standards Appropriate Sampling Procedures
Surrogate Recovery Appropriate Sample Containers
Initial Calibration Appropriate Sample Preservation
Continuing Calibration Handling & Holding Times
Standard Reference Material Equipment Blank/Field Blank
Representativeness
Laboratory Homogenization
Appropriate Sampling Procedures
Appropriate Sample Containers
Appropriate Sub-sampling Appropriate Sample Preservation
Appropriate Dilutions Incorporation of Field Screening Data
“As Received” Sample Preservation
Meeting Hold Times
Appropriate Number of Samples
Comparability
Gas Chromatography/Mass
Spectrometry Tuning
Comparison to Previous Data Points
Calibration Comparison to Similar Data Points
Analytical Method Followed Similar Methods of Analysis used
Completeness
Percent Sample Per Batch Analyzed
and Reported
Percent Planned Samples Collected
All Critical Samples Reported and
Unqualified
All Critical Samples Collected
Sensitivity
Method Blanks Equipment Blank/Field Blanks
Instrument Blanks Appropriate Sample Volume or Weight
Reporting Limit
(Lowest Calibration Standard)
Appropriate Analytical Method
Adapted from Massachusetts Department of Environmental Protection, Bureau of Waste Site Cleanup, MCP
Representativeness Evaluations and Data Usability Assessments, Policy #WSC-07-350, September 19, 2007.
APPENDIX B-3
INFORMATION DERIVED FROM QUALITY CONTROL CHECKS AND SAMPLES
Data Quality Indicator (Type of Information Provided)
QC Checks and Samples
Sources of Measurement Error:
Sample Collection: Sampling Equipment; Sample Container; Preservation Technique; Sample Matrix
Sample Transport: Shipment Process
Laboratory: Sample Storage at Laboratory; Sample Preparation Reagents; Sample Preparation Equipment; Analytical Method Reagents; Analytical Equipment
Purpose
Accuracy/Bias
(Contamination)
Equipment
Blank
(Rinsate
Blank)
X X X X X X X X X
To evaluate carryover
contamination resulting
from successive use of
sampling equipment.
Bottle Blank
(per Lot #)
X X X X X
To evaluate
contamination
introduced from the
sample container.
VOA Trip
Blank
X X X X X X X X
To evaluate
contamination
introduced during
shipment.
Storage
Blank
X X X X X
To evaluate cross
contamination
introduced during
sample storage.
Method
Blank
X X X X
To evaluate
contamination
introduced during
sample preparation
and/or analysis by
laboratory, including
reagents, equipment,
sample handling and
ambient laboratory
conditions.
Reagent
Blank
(per Lot #)
X X X X
To evaluate
contamination
introduced by specific
method reagents.
Instrument
(System)
Blank
X X
To evaluate
contamination
originating from the
analytical reagents and
instrumentation.
Accuracy/Bias
Matrix
Spike
X X X X X
To determine laboratory
preparatory and
analytical bias for
specific compounds in
specific sample
matrices.
Surrogate
Spike
X X X X X
To evaluate laboratory
preparatory and
analytical bias for
specific sample
matrices.
Accuracy/Bias
Laboratory
Control
Sample
(LCS)
X X X X
To evaluate the
laboratory’s ability to
accurately identify and
quantitate target
compounds in a
reference matrix at a
known concentration,
usually mid-range of the
calibration curve.
Performance Evaluation Samples-Ampulated Single Blind
X X X X
To evaluate sample
handling procedures
from field to laboratory.
To evaluate the
laboratory’s ability to
accurately identify and
quantitate target
compounds in a
reference matrix.
Frequently used for data
quality assessments
and for laboratory self-
assessments and
external assessments.
Performance Evaluation Sample-Full Volume Single Blind
X X X X X X X X
Accuracy/Bias
Performance Evaluation Sample Double Blind
X X X X X X X X
To evaluate sample
handling procedures
from field to laboratory.
To evaluate the
laboratory’s ability to
accurately identify and
quantitate target
compounds in a
reference matrix.
Laboratory
Fortified
Blank (LFB)
or
Laboratory
Control
Sample
(LCS)
X X X X
A type of LCS used to
evaluate laboratory
(preparatory and
analytical) sensitivity
and bias for specific
compounds in a
reference matrix at the
quantitation limit
concentrations.
Accuracy/Bias
Initial
Calibration
X X
To ensure that the
instrument is capable of
producing acceptable
qualitative and
quantitative data.
Continuing
Calibration/
Continuing
Calibration
Verification
X X
To ensure the accuracy
and stability of the
instrument response.
Instrument Performance Check Sample
X X
To verify that an
instrument can
accurately identify and
quantitate target
analytes at specific
concentration levels.
Accuracy/Bias
(Preservation)
Cooler
Temp.
Blank
(VOC only)
X
To evaluate whether or
not samples were
adequately cooled
during shipment.
Sensitivity
Low-level
calibration
standard
X X X X
A standard used to
evaluate accuracy and
sensitivity at a specific
concentration. Used to
evaluate laboratory
sensitivity and bias for
specific compounds in a
reference matrix at the
quantitation limit
concentrations.
Method
Detection
Limit
Studies
X (if
performed
using
same
reference
matrix)
X X X X
A statistical
determination that
defines the minimum
concentration of a
substance that can be
measured and reported
with 99% confidence
that the analyte
concentration is greater
than zero.
Sensitivity
Low Point
of Initial
Calibration
Curve
(Reporting
Limit)
X X
To ensure that the
instrument is capable of
producing acceptable
qualitative and
quantitative data at the
lowest concentration
that sample results will
be reported; the
Reporting Limit.
Precision
Field
Duplicates
X X X X X X X X X X
To measure overall
precision by evaluating
cumulative effects of
both field and laboratory
precision.
Laboratory
Duplicates
X X X X X
To evaluate laboratory
preparatory and
analytical precision.
Matrix
Spike
Duplicates
X X X X
To determine laboratory
preparatory and
analytical bias and
precision for specific
compounds in specific
sample matrices.
Analytical
Replicates
(e.g.,
duplicate
injections)
X
To evaluate analytical
precision for
determinative
instrumentation.
Internal
Standards
X
To evaluate instrument
precision and stability.
Inter-laboratory
Comparability
Field Splits X X X X X X
To evaluate sample
handling procedures
from field to laboratory
and to evaluate inter-
laboratory comparability
and precision.
Notes:
Not all of the types of QC checks and samples listed in this table are standard deliverables that are reported or required by the RCPs.
Table adapted from Region I, EPA New England Compendium of Quality Assurance Project Plan Requirements and Guidance, Final October 1999, Attachment A: Region I, EPA-NE Quality Assurance Project
Plan Manual, Draft, September 1998, Table 4, pages 83-87.
Appendix C
QC Information to be Reviewed During
Data Quality Assessments

APPENDIX C
QUALITY CONTROL INFORMATION TO BE EVALUATED
DURING DQAs AND DUEs
NJDEP expects that the investigator will evaluate all laboratory reported QC information and
nonconformances in accordance with this guidance. Nonconformances that are found may be
noted on the DQA Worksheets found in Appendix D of this document, the SRP Full Laboratory
Data Deliverables form and the SRP Reduced Laboratory Data Deliverables section appearing
in key documents.
The information below summarizes standard, required deliverables to obtain Data of Known
Quality. The QC information that must be reviewed during the DQA by the investigator includes,
but is not limited to, the following:
STANDARD DKQ DELIVERABLES
Laboratory Report Inspection
Goal: Determine if all laboratory deliverables are provided and complete:
Tasks:
Review the laboratory report to determine that the following items are present for all
sample batches:
o DKQ Conformance/Nonconformance Summary Questionnaire (C/NCSQ);
o Narrative identifying QC nonconformances;
o Analytical results;
o Chain of Custody Form; and,
o Quality control results, including but not limited to:
Method Blanks;
Laboratory Control Samples (LCS);
MS/MSD (when requested);
Surrogates (as appropriate for method); and,
Other QC results and information provided in the laboratory report.
Review information on the C/NCSQ to determine that:
o All the questions in the C/NCSQ are answered;
o The C/NCSQ is dated and signed; and,
o The narrative includes an explanation for the questions which were answered “NO.”
Review the laboratory narrative to identify QC nonconformances:
o Review the narrative for significant findings (i.e., QC nonconformances that could affect
usability of the reported results) and request additional information from the
laboratory, if applicable.
Review the Chain of Custody Form for completeness and correctness:
o Review Chain of Custody Form to ensure form is complete and correct;
o Verify sample identification numbers and collection information;
o Verify that there is an acceptance signature for each relinquished signature documenting
the delivery of the samples to the laboratory facility. Check for errors in noted dates and
times;
o Correct any errors with a single line cross-out, initial/date and note reason for correction;
and,
o Contact the laboratory for help or clarification if needed.
Data of Known Quality Evaluation
Goal: Determine if Data of Known Quality was achieved.
Tasks: Review the C/NCSQ to determine if Data of Known Quality was achieved.
Chain of Custody (COC) Evaluation
Goal: Evaluate the information presented on the Chain of Custody Form to determine if any
QC issues or nonconformances are present.
Tasks:
o Determine whether Handling Time was met;
o Determine if samples were appropriately preserved/refrigerated/iced; and,
o Determine if samples were received by the laboratory at an appropriate temperature.
Sample Result Evaluation
Goal: Determine if sample results have been properly reported.
Tasks: Evaluate the sample results:
Determine that reporting limits (RLs) were noted;
Verify that concentrations greater than the RL were reported;
Verify that concentrations reported below the RL are qualified "J";
Verify that the results for soils and sediments were reported in mg/kg on a dry weight basis;
Verify that results for aqueous samples are reported in ug/L;
Verify that air vapor samples are reported in ug/m3;
Check the dilution factor to see if a dilution was performed and, if so, that the RL was adjusted
accordingly;
Determine that RLs are less than or equal to the applicable regulatory criteria; and,
Determine if sample results are provided for each requested analysis (a simple check of this
kind is sketched below).
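For illustration only, an automated version of these checks applied to a single reported result might resemble the following sketch; the field names and result structure are assumptions, not a required data format.

# Illustrative sketch of the result-reporting checks listed above.
EXPECTED_UNITS = {"soil": "mg/kg dry wt", "aqueous": "ug/L", "air": "ug/m3"}

def check_result(result, regulatory_criterion):
    findings = []
    adjusted_rl = result["rl"] * result.get("dilution_factor", 1)
    if result["units"] != EXPECTED_UNITS[result["matrix"]]:
        findings.append("unexpected reporting units")
    if (result["value"] is not None and result["value"] < adjusted_rl
            and "J" not in result.get("qualifiers", "")):
        findings.append("detection below the adjusted RL not qualified J")
    if adjusted_rl > regulatory_criterion:
        findings.append("adjusted RL exceeds the applicable regulatory criterion")
    return findings

example = {"matrix": "aqueous", "units": "ug/L", "value": 0.8,
           "rl": 1.0, "dilution_factor": 2, "qualifiers": ""}
print(check_result(example, regulatory_criterion=1.0))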
Sample Preservation and Holding Times Evaluation
Goal: Determine if samples were preserved properly and analyzed within holding times.
Tasks:
Review the chain of custody and/or narrative to determine if the samples were preserved in
accordance with the requirements of the DKQ Method reported.
Review the narrative to determine if the holding time specified in the DKQ Method was met.
Review the chain of custody for other sample/method-specific QA (e.g. vacuum
readings on vapor canisters).
Method, Field or Trip Blank Evaluation
Goal: Determine the existence and magnitude of contamination resulting from laboratory or
field activities.
Task: Review all blank data and narratives for possible contamination.
Field Duplicates and Laboratory Duplicates
Goal: Evaluate Precision
Task: Review all duplicate sample information.
Laboratory Control Samples Evaluation
Goal: Evaluate accuracy of laboratory method.
Task: Review the narrative to determine if nonconformances were noted in the laboratory
narrative.
Surrogate Results Evaluation
Goal: Evaluate accuracy in the sample matrix.
Task: Review the narrative to determine if nonconformances were noted in the laboratory
narrative.
Matrix Spike/Matrix Spike Duplicate Results Evaluation
Goal: Evaluate accuracy (Matrix Spike) and precision (Matrix Spike Duplicate) in the sample
matrix.
Task: Review the narrative to determine if nonconformances were noted in the laboratory
narrative.
Other Information and QC Information:
Other Laboratory Information:
Evaluate precision, accuracy, representativeness, comparability, completeness, and sensitivity,
as appropriate.
Review the information provided.
Appendix D
Data Quality Assessment Worksheets and
Summary of DKQ Acceptance Criteria

APPENDIX D-1
INSTRUCTIONS FOR THE USE OF THE
DATA QUALITY ASSESSMENT WORKSHEETS
The worksheets presented in Appendices D-2 and D-3 are two examples of Data Quality
Assessment Worksheets (DQA Worksheets) that may be used to summarize the QC
nonconformances that are reported for a laboratory deliverable for each sample in one place.
The “NJDEP Site Remediation Program Full Laboratory Data Deliverable Form” must be
submitted to the Department pursuant to N.J.A.C. 7:26E-2.1(a)15 when submitting analytical
results for samples of Immediate Environmental Concern (IEC), potable well samples, and
vapor intrusion cases pursuant to N.J.A.C 7:26E-1.14, 1.17, and 1.18 and for polychlorinated
dibenzo-p-dioxins/polychlorinated dibenzofurans and all hexavalent chromium soil samples
pursuant to N.J.A.C. 7:26E-2.1. This form and instructions are available on the NJDEP website
at http://www.nj.gov/dep/srp/srra/forms/
. These worksheets are intended to be a starting point
and can be modified by the user. A summary of the QC information to be reviewed as part of a
Data Quality Assessment is presented in Appendix C of this document. It is the investigator’s
responsibility to complete these worksheets (i.e., they should not be completed by the
laboratory).
If needed, the NJDEP DKQ acceptance criteria for each of the common analytical methods can
also be found in Appendix D-4 of this document and Appendix B of the DKQ Guidance.
Appendix D-2, DQA Worksheet 1
QC for DKQ deliverables and other information is shown on the left hand side of the form. QC
nonconformances, if any, are circled and described on the right hand side of the form. A space
for notes is also provided on the right hand side of this form.
Appendix D-3, DQA Worksheet 2
This one-page worksheet can be used to list all of the nonconformances for a sample in one
place. To help streamline data entry, this form can be filled out electronically using a
spreadsheet program. For smaller projects, it may be useful to add columns to list applicable
regulatory criteria and preliminary DUE findings.
APPENDIX D-2
DATA QUALITY ASSESSMENT WORKSHEET 1
PAGE __ OF __
PROJECT:
FILE NUMBER:
LABORATORY WORK ORDER
REVIEWER: DATE:
BLANKS
Compound
Compound
Compound
Compound
Notes
Method Blank, VOCs
>RL?
Method Blank, SVOCs
>RL?
Method Blank, VPH
>RL?
Method Blank, EPH
>RL?
Method Blank, PCBS
>RL?
Method Blank, Pest
>RL?
Method Blank, Metals
>RL?
Method Blank, Total Cyanide
>RL?
Method Blank, ETPH
>RL?
Method Blank Hex Chrome
>RL?
Field Blank
>RL?
Trip Blank
>RL?
VPH Blank (methanol)
>RL?
Blank Soil VOCs (methanol)
>RL?
Blank Soil VOCs
(water/bisulfate; circle one)
>RL?
LCS
SV
Low Bias
High Bias
Compound
Compound
Notes
VOCs
<10%
> 10% & < LCL
>UCL
SVOCs
<10%
> 10% & < LCL
>UCL
VPH
<10%
> 10% & < LCL
>UCL
EPH
<10%
> 10% & < LCL
>UCL
PCB
<10%
> 10% & < LCL
>UCL
PEST
<10%
> 10% & < LCL
>UCL
Hex Chrome
<70%
>70 % & <LCL
> UCL
Metals
<10%
> 10% & < LCL
>UCL
Total Cyanide
<10%
> 10% & < LCL
>UCL
ETPH
<10%
> 10% & < LCL
>UCL
SURROGATES
SV
Low Bias
High Bias
Compound
Compound
Notes
VOCs
<10%
> 10% & < LCL
>UCL
SVOCs
<10%
> 10% & < LCL
>UCL
VPH
<10%
> 10% & < LCL
>UCL
EPH
<10%
> 10% & < LCL
>UCL
PCB
<10%
> 10% & < LCL
>UCL
PEST
<10%
> 10% & < LCL
>UCL
MS/MSDS
SV
Low Bias
High Bias
QC Source
RPDS
Notes
VOCs
<10%
> 10% & < LCL
>UCL
Batch? Site?
SVOCs
<10%
> 10% & < LCL
>UCL
Batch? Site?
EPH
<10%
> 10% & < LCL
>UCL
Batch? Site?
PCB
<10%
> 10% & < LCL
>UCL
Batch? Site?
PEST
<10%
> 10% & < LCL
>UCL
Batch? Site?
Hex Chrome
<50%
> 50% & < LCL
>UCL
Batch?
Metals
<10%
> 10% & < LCL
>UCL
Batch? Site?
Total Cyanide
<10%
> 10% & < LCL
>UCL
Batch? Site?
ETPH
<10%
> 10% & < LCL
>UCL
Batch? Site?
FIELD DUPLICATES RPDS
Soil
Water
Compound
Compound
Notes
VOCs
RPD > 50%
RPD > 30%
SVOCs
RPD > 50%
RPD > 30%
VPH
RPD > 50%
RPD > 30%
EPH
RPD > 50%
RPD > 30%
PCB
RPD > 50%
RPD > 30%
PEST
RPD > 50%
RPD > 30%
Metals
RPD > 50%
RPD > 30%
Total Cyanide
RPD > 50%
RPD > 30%
EPH
RPD > 50%
RPD > 30%
LAB DUPLICATES RPDS
Soil
Water
Compound
Compound
Notes
RPD > 50%
RPD > 30%
Batch? Site?
Hex Chrome
RPD >20%
RPD >20%
Batch
Reasonable Confidence Achieved? Y/N Significant QC Variances Noted? Y/N Requested Reporting Limits Achieved? Y/N
Preservation Requirements Met? Y/N Holding Time Requirements Met? Y/N
Abbreviations: RL = Reporting Limit; LCS = Laboratory Control Sample; SV = Significant QC Variance; LCL= RCP Lower Control Limit; UCL= RCP Upper Control Limit; RPD =
Relative Percent Difference; VOCs = Volatile Organic Compounds; SVOCs = Semivolatile Organic Compounds; VPH = Volatile Petroleum Hydrocarbons; EPH = Extractable
Petroleum Hydrocarbons; PCBs = Polychlorinated Biphenyls; Pest = Pesticides; ETPH = Extractable Total Petroleum Hydrocarbons
APPENDIX D-3
DATA QUALITY ASSESSMENT WORKSHEET 2
Project:
File Number:
Reviewer:
Date:
Notes:
Sample
Number(s)
Compound(s)
Quality Control
Nonconformance
Percent
Recovery
Relative
Percent
Difference
High/Low or
Indeterminate
Bias
Comments
Note other QC nonconformances below (data package inspection, reasonable confidence, chain of custody, sample result, sample
preservation and holding time evaluations).
Notes:
Bias High: The actual concentration may be lower than the reported result. The Reporting Limit (RL) is acceptable as reported.
Bias Low: The actual concentration may be higher than the reported result. The Reporting Limit (RL) may be higher than reported.
Bias Indeterminate: The reported result may be biased; however, it is unclear whether the result is biased low or high.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
QC Parameter Holding Time (1)
Method
Blank
Site Specific Matrix Spike/Matrix Spike
Duplicate
Laboratory Control Sample
Method 6010
Trace Metals
Inductively Coupled
Plasma-Atomic
Emission
Spectrometry
Aqueous, soil,
sediment, and high
concentration
waste samples,
180 days. Mercury
28 days.
Target
analytes
must be <
RL.
Percent recovery limits must be between 75-
125%.
If MS/MSD run,
for aqueous samples, if concentration > 5x the
RL, RPD < 20%. If concentration < 5x RL,
difference ± RL;
for solids, if concentration > 5x RL, RPD <
35%. If concentration < 5x RL, difference ± 2x
RL.
LCS recoveries ± 20% for
aqueous samples and within
vendor control (95%
confidence limits) for solids.
Method 6020
Trace Metals
Inductively Coupled
Plasma-Mass
Spectrometry
Aqueous, soil,
sediment, and high
concentration
waste samples,
180 days. Mercury
28 days.
Target
analytes
must be <
RL.
Percent recovery limits must be between 75-
125%.
If MS/MSD run,
for aqueous samples, if concentration > 5x the
RL, RPD <20%. If concentration < 5x RL,
difference ± RL;
for solids, if concentration > 5x RL, RPD <
35%. If concentration < 5x RL, difference ± 2x
RL.
LCS recoveries ± 20% for
aqueous samples and within
vendor control (95%
confidence limits) for solids.
Method 7000 Series
Metals
(Flame and Graphite
Furnace Atomic
Absorption
Spectroscopy)
Aqueous, soil,
sediment, and high
concentration
waste samples,
180 days.
Target
analytes
must be <
RL.
Percent recovery limits must be between 75-
125%.
If MS/MSD run,
for aqueous samples, if concentration > 5x the
RL,
RPD ± 20%, if concentration < 5x RL,
difference ± RL;
for solids, if concentration > 5x RL, RPD ±35%.
If concentration < 5x RL, difference ± 2x RL.
LCS recoveries ± 20% for
aqueous samples and within
vendor control (95%
confidence limits) for solids.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
QC
Parameter
Holding Time (1)
Method
Blank
Site-Specific
Matrix
Spike/Matrix
Spike
Duplicate
Site-Specific
Matrix Spike/
Matrix Spike
Duplicate
(Aqueous
Only)
Site-Specific
Sample Matrix
Duplicate
Site-Specific
Soluble and
Insoluble Cr6+
Matrix Spike
(Solid Only)
Laboratory
Control
Sample
Method 7196
Hexavalent
Chromium
Aqueous 24 hours;
Soil/sediment samples,
digest within 30 days.
Analyze digestate within
7 days of preparation.
High concentration waste
samples Digest within 30
days. Analyze digestate
within 7 days of
preparation.
Soil/sediment pH and
ORP
24 hours of sample
preparation.
Soil/sediment, ferrous
iron and sulfide 7 days
Cr6+
must be
< RL
Not Applicable
(Matrix spike
only for
Hexavalent
Chromium, not
MS/MSD pair)
Percent
recovery limits
must be
between 75-
125%.
Must be
performed on a
Site field
sample.
Aqueous/
Soil/Sediment:
RPD ≤ 20%; a
control limit of ±
RL if original or
duplicate is < 4
times the RL.
Percent
recovery limits
must be
between 75-
125%.
LCS recoveries
±20% for
aqueous
samples and
within vendor
control (95%
confidence
limits) for solids
or the NIST
2701 control
limits.
Method
7470/7471
Mercury Cold
Vapor Atomic
Absorption
Spectroscopy
Aqueous, soil, sediment,
and high concentration
waste samples, 28 days.
Mercury
must be
<RL
Percent
recovery limits
must be
between 75-
125%.
Not applicable
For aqueous
samples RPD ±
20% if conc.
>5x the RL. If
conc. < 5x RL,
the limit is ±
RL.
For solids RPD
±35% if conc.
>5x the RL. If
conc. < 5x the
RL, limit is ± the
RL.
Not applicable
LCS recoveries
±20% for
aqueous
samples and
within vendor
control (95%
confidence
limits) for
solids.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
QC
Parameter
Holding Time (1) Method Blank Surrogates
Site-Specific
Matrix
Spike/Matrix Spike
Duplicate
Laboratory
Control Sample
Endrin and
DDT
Breakdown
Standard
Method
8021
Volatile
Organic
Com-
pounds
Aqueous 14 days (2)
Soil/sediment, 14 days if
preserved. 48 hours if
unpreserved (Note 3).
High concentration waste
samples, 14 days.
Target analytes
must be < RL
except for
common lab
contaminants
which must be <
3x the RL
(contaminants
are acetone,
methylene
chloride, and 2-
butanone).
Laboratory
determined percent
recoveries must be
between 70-130%
for individual
surrogate
compounds.
Laboratory
determined recovery
limits may be
outside 70-130 %
limits for difficult
matrices (e.g.
waste, sludges,
etc.).
Laboratory
determined percent
recoveries should
be between 70-130
% for target
compounds.
RPDs should be ≤ 30%.
Laboratory
determined
percent recoveries
must be between
70-130% for target
compounds.
Not
applicable
Method
8081
Pesticides
Aqueous, 7 days to
extraction. 40 days from
extraction to analysis.
Soil/sediment samples, 14
days to extraction. 40
days from extraction to
analysis. Up to one year
for samples frozen within
48 hours of collection
(Note 1).
High concentration waste
samples 14 days to
extraction. 40 days from
extraction to analysis.
Target analytes
must be < RL.
Recovery limits lab
generated and
within maximum
range of 30-150%
for both compounds
on both columns.
Labs must develop
own in-house limits,
which fall within 30-
150% limits.
Laboratory
determined percent
recovery limits must
be between 30-
150%
RPDs ≤ 20% for
water and ≤ 30%
for solids
Laboratory
determined
percent recovery
limits must be
between 40-140%
except for difficult
analytes, which
must be between
30-140%
recovery.
Breakdown
must be ≤
15% for each
compound.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
QC
Parameter
Holding Time (1) Method Blank Surrogates
Site Specific Matrix
Spike/Matrix Spike
Duplicate
Laboratory Control
Sample
Method 8151
Chlorinated
Herbicides
Aqueous 7 days to
extraction, 40 days from
extraction to analysis
Soil/Sediment, 14 days to
extraction. 40 days from
extraction to analysis. Up to
one year for samples frozen
within 48 hours of collection.
(Note 4)
High concentration waste
samples, 14 days to
extraction. 40 days from
extraction to analysis
Target analytes must
be <RL.
Recovery limits lab
generated and within
30-150% for both
compounds on both
columns.
Labs must develop own
in-house limits that fall
within 30-150% limits.
If surrogate exceeds
limits on one column
and herbicide
concentrations reported
at > RL but dual column
precision not acceptable
(RPD > 40%), re-extract
and reanalyze samples.
Laboratory determined
percent recovery limits
must be between 30-
150%, RPDs ≤ 20%
waters and ≤ 30%
solids.
Laboratory determined
percent recovery limits
must be between 40-
140% except in-house
limits for Dinoseb.
Labs expected to
develop own in-house
control limits that meet
or exceed limits listed
above.
Method 8082
Polychlori-
nated
Biphenyls
Aqueous 7 days to
extraction, 40 days from
extraction to analysis.
Soil/Sediment 14 days to
extraction. 40 days from
extraction to analysis. Up to
one year for samples frozen
within 48 hours of collection.
(Note 4)
High concentration waste
samples, excluding
transformer oils, 14 days to
extraction. 40 days from
extraction to analysis.
Transformer/Waste Oils, 1
yr
Target analytes must
be <RL.
Recovery limits lab
generated and within
maximum range of 30-
150% for both
compounds on both
columns.
Labs must develop own
in-house limits that fall
within 30-150% limits.
Laboratory determined
percent recovery limits
for AR-1016/1260 must
be between 40-140%.
Recoveries for all
Aroclors or Congeners
40-140%
Congeners must
contain all target
congeners.
RPDs ≤ 20% for waters
and ≤ 30% for solids.
Laboratory determined
percent recovery limits
must be between 40-
140%.
Labs are required to
develop own in-house
limits that meet or
exceed limits listed
above.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
QC
Parameter
Holding Time (1) Method Blank Surrogates
Site Specific Matrix
Spike/Matrix Spike
Duplicate
Laboratory Control
Sample
Method
8260
Volatile
Organic
Com-
pounds
Aqueous, 14 days, 7 days
if unpreserved (2)
Soil/Sediment, 14 days if
preserved. 48 hours if
unpreserved.
(Note 3).
High concentration waste
samples, 14 days.
Target analytes must
be <RL except for
common lab
contaminants which
must be <3x the RL
(Contaminants are
acetone, methylene
chloride, and 2-
butanone).
Laboratory determined
percent recoveries must
be between 70-130%
for individual surrogate
compounds.
Laboratory determined
recovery limits may be
outside 70-130% limits
for difficult matrices
(e.g. waste, sludges,
etc.).
Laboratory determined
percent recoveries
should be between 70-
130% for target
compounds.
RPDs should be ≤ 30%
Laboratory determined
percent recoveries must
be between 70-130%
for target compounds.
Can also be used as
CCAL.
Lab may have difficult
compounds out of
criteria as long as within
40-160% recovery.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
QC Parameter Holding Time (1) Method Blank Surrogates
Site Specific Matrix
Spike/Matrix Spike
Duplicate
Laboratory Control
Sample
Method 8270
Semivolatile
Organic
Compounds
Aqueous, 7 days to
extraction. 40 days from
extraction to analysis
Soil/sediment, 14 days to
extraction. 40 days from
extraction to analysis.
Up to one year for
samples frozen within 48
hours of collection.
(Note 4)
High concentration
waste samples 14 days
to extraction. 40 days
from extraction to
analysis.
Target analytes
must be < RL
except for
common lab
contaminants
which must be
< 5x the RL
(Contaminants
are phthalates).
Soil recovery limits
lab generated and
within 30-130%.
Water recovery limits
lab generated and
within 30-130% for
base-neutrals, 15-
110% for acid
compounds.
Laboratory
determined percent
recovery limits must
be between 70-130%
except 20-160% for
difficult compounds.
RPD’s ≤ 20% for
waters and ≤ 30% for
soils.
Laboratory determined
percent recovery limits must
be between 70-130% except
20-160% for difficult
compounds.
Method
9010/9012/9014
Total Cyanide
Aqueous, soil, sediment
and high concentration
waste samples: Cyanide
14 days from collection
to analysis, (from date
when thawed if solid
samples frozen). Can
maintain samples up to 1
year if frozen
Cyanide must
be < RL.
Not applicable
Percent recovery
limits must be
between 75-125%.
For aqueous samples
RPD ≤ 20%
For solids RPD ≤
35%
LCS recoveries ±20% for
aqueous samples and within
vendor control (95%
confidence limits) for solids.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
QC Parameter Holding Time (1) Method Blank Surrogates
Site Specific Matrix
Spike/Matrix Spike
Duplicate
Laboratory Control
Sample
Fractionation
Check
Standard
NJDEP
Extractable
Petroleum
Hydrocarbons
(EPH)
Aqueous, soil,
and sediments,
samples must be
extracted within
14 days of
collection.
Extracts must be
analyzed within
40 days of
extraction.
All components
should be < 5
times their
respective
MDLs.
Lab develops own
in-house limits,
which must be
within 40-140%
for each surrogate.
Sample recoveries
must also be within
40-140%.
Conc. of
fractionating
surrogates
naphthalene and
2-
methylnaphthalene
in aliphatic fraction
< 5% total conc. of
those 2
compounds (in the
batch-related LCS
or LCSD)
Lab develops own in-
house recovery range
but percent
recoveries should be:
Fractionated =
between 40 and
140% for each
carbon range.
Non-Fractionated =
between 40 and
140% for each
compound
RPDs should be
≤ 50% for waters and
soils/sediments if an
MSD is performed.
Percent recoveries
between 40 and
140% for all
compounds in the
LCS; n-nonane
must be between
25-140%.
If #2 fuel used as
the LCS, percent
recoveries must be
between 40 and
140% for the #2-fuel
Retention times of
surrogates in LCS
must be within
retention time
windows
Every lot of
silica gel/SPE
cartridges
checked.
Percent
recoveries
between 40 and
140% for each
compound,
except for n-
nonane which
must be
between 25-
140%.
APPENDIX D-4
SUMMARY OF DKQ ACCEPTANCE CRITERIA
Notes:
Not all method QA/QC deliverables are listed here.
(1) See the Method for specific preservation requirements for each method.
(2) If aqueous samples effervesce upon addition of hydrochloric acid, samples must be collected unpreserved and stored at 4 ± 2º
Celsius. The holding time is 7 days from collection.
(3) Samples should be collected and stored according to N.J.A.C. 7:26E-2.1(a)8.
(4) If the freezing option is selected, the sample must be frozen within 48 hours of collection. The holding time recommences
when thawing begins. The total holding time is calculated as the time from collection to freezing plus the time elapsed after
thawing begins, and this total must be less than 14 days. Although the USEPA removed the holding time requirements for
PCBs, NJDEP still requires the method-specified holding times to be followed.
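For illustration, the Note 4 arithmetic can be written as a small routine. The dates below are hypothetical, and the routine is a sketch of the calculation, not a compliance determination.

# Illustrative sketch of the freezing-option holding-time arithmetic in Note 4:
# the clock runs from collection to freezing, pauses while frozen, and resumes
# when thawing begins.
from datetime import datetime

def frozen_option_holding_days(collected, frozen, thawed, analyzed_or_extracted):
    pre_freeze = (frozen - collected).total_seconds() / 86400.0
    post_thaw = (analyzed_or_extracted - thawed).total_seconds() / 86400.0
    return pre_freeze + post_thaw, pre_freeze <= 2.0  # frozen within 48 hours?

days, frozen_in_time = frozen_option_holding_days(
    collected=datetime(2014, 4, 1, 10, 0),
    frozen=datetime(2014, 4, 2, 8, 0),
    thawed=datetime(2014, 6, 1, 8, 0),
    analyzed_or_extracted=datetime(2014, 6, 10, 8, 0),
)
print(round(days, 1), frozen_in_time, days < 14.0)  # 9.9 True True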
Abbreviations:
CCAL Continuing Calibration
Cr Chromium
EPA United States Environmental Protection Agency
EPH Extractable Petroleum Hydrocarbons
LCS Laboratory Control Sample
LCSD Laboratory Control Sample Duplicate
ORP Oxidation Reduction Potential
RPD Relative Percent Difference
RL Reporting Limit
YR Year
Appendix D-5
Common Laboratory Data Qualifiers
Organics:
U This flag indicates the compound was analyzed for but not detected at a listed and
appropriately adjusted reporting level.
J This flag indicates an estimated value. This flag may be used when:
(1) estimating a concentration for TICs where a 1:1 response is assumed;
(2) the mass spectral and Retention Time (RT) data indicate the presence of a
compound that meets the volatile and semivolatile GC/MS identification criteria,
and the result is less than the adjusted Reporting Limit; and
(3) the RT data indicate the presence of a compound that meets the pesticide
and/or Aroclor identification criteria, and the result is less than the adjusted
Reporting Limit but greater than zero. For example, if the sample's adjusted
Reporting Limit is 5.0 μg/L, but a concentration of 3.0 μg/L is calculated, report it
as 3.0J.
N This flag indicates presumptive evidence of a compound. This flag is only used for
TICs, where the identification is based on a mass spectral library search and must
be used in combination with the J flag. It is applied to all TIC results. For generic
characterization of a TIC, such as chlorinated hydrocarbon, or for an "unknown" (no
matches ≥ 85%), the "N" flag is not used.
P This flag is used for pesticide and Aroclor target compounds when there is greater
than 40% relative percent difference (RPD) for detected concentrations between the
two GC columns (see Form X). The "P" flag is not used unless a compound is
identified on both columns.
C This flag applies to pesticide and Aroclor results when the identification has been
confirmed by GC/MS. If GC/MS confirmation was attempted but was unsuccessful,
do not apply this flag; use a laboratory-defined flag instead (such as the X-qualifier).
B This flag is used when the analyte is found in the associated method blank as well
as in the sample. It indicates probable blank contamination and warns the data user
to take appropriate action. This flag shall be used for a TIC as well as for a positively
identified target compound. Blank contaminants are flagged "B" only when they are
detected in the sample.
E This flag identifies compounds whose response exceeds the response of the
highest standard in the initial calibration range of the instrument for that specific
analysis. (If one or more compounds of concern have a response greater than the
response of the highest standard in the initial calibration, the sample or extract
should be diluted and reanalyzed according to the specifications of the method and
a new result reported.)
D If a sample or extract is reanalyzed at a dilution factor greater than 1 (e.g., when the
response of an analyte exceeds the response of the highest standard in the initial
calibration), the D qualifier is attached to the sample result.
Organics (continued):
A This flag indicates that a TIC is a suspected Aldol-condensation product.
S This flag is used to indicate an estimated value for Aroclor target compounds where
a valid 5-point initial calibration was not performed prior to the analyte's detection in
a sample. If an "S" flag is used for a specific Aroclor, then a reanalysis of the sample
is required after a valid 5-point calibration is performed for the detected Aroclor.
(Obtained from the USEPA Contract Laboratory Program Statement of Work for Organics
Analysis, Multi-Media, Multi-Concentration, SOM01.1, May 2005, revised in SOM01.2.)
Inorganics:
X The reported value is estimated due to interferences.
* QC analyses are outside control limits.
D The reported value is from a dilution.
J The reported value was less than the CRQL, but greater than or equal to the MDL.
U The result was less than the MDL. For Hardness, if the results for both Ca and Mg
were less than their respective MDLs.
N Spiked sample recovery not within control limits.
E The reported value is estimated due to the presence of interference. An explanatory
note should be included in a comments section.
Obtained From USEPA Contract Laboratory Program Statement of Work for Inorganic
Superfund Methods (Multi-Media, Multi-Concentration) ISM01.2 January 2010.
Appendix E
Evaluating Significant QA/QC Variances
APPENDIX E
EVALUATING SIGNIFICANT QA/QC VARIANCES
On occasion, the investigator may encounter Quality Control (QC) nonconformances that are so
excessive that they must be considered as significant or gross violations of QC criteria. Causes
may range from problems associated with the sampled medium, such as severe matrix
interference, or may be the result of improper sample handling and management. Whatever the
cause, the investigator must determine whether or not the data associated with such significant
QC violations can be used in making the environmental decisions for which the associated
samples were collected.
In general, data associated with significant QC violations will be of limited use in decision-
making, and it is the responsibility of the investigator to demonstrate that such data are, in fact,
usable for a particular purpose. It should be understood that the same data set with the same
QC issues may be usable for one purpose but not for another. It is certainly possible that data
associated with significant violations of QC might be used for qualitative or screening purposes,
but it is highly unlikely that such data would be suitable for demonstrating compliance with
applicable regulations. However, samples with significant QC variances can be used to
determine that remediation is needed. The extent to which such data may be relied upon clearly
depends on the intended use of that data.
It is possible to review a data set with significant QC violations and, depending on the intended
purpose, the investigator may choose to use or qualify the data in one case and reject it in
another. For example, if significant QC failures occur, but an analyte is detected and the
purpose of the sample analysis is to characterize environmental matrices to determine if a
release has occurred, the investigator can reasonably justify using that data to determine that
there was, in fact, a release of the specific compounds that were detected. The data may not be
usable to determine all of the contaminants that may have been released (i.e., determine the full
nature of the release), and it should be clearly understood that additional measures should be
taken to ensure that QC results for sampling during follow-up portions of the investigation are
within acceptable limits.
If significant QC failures occur and the purpose of the sampling was to conclusively demonstrate
compliance with regulations, then it is unlikely that the data will be usable for that purpose.
If there are years of previous data or many other samples from a particular release area that are
consistent with the results of the data associated with significant QC failures and site conditions
have not changed as demonstrated through subsequent data, then it is possible that the data
with poor QC could be used with qualification. If the data with poor QC appear anomalous
relative to previous results, then it is unlikely that they can be relied on to draw final conclusions.
QC results for laboratory data associated with investigation and remediation projects should
always be evaluated with respect to the intended use of that data and the project-specific or
task-specific data quality objectives that were established for types of decisions that will be
made using that data. NJDEP expects that data with significant QC failures will be deemed
unusable, unless the investigator provides adequate justification for the use of such data and
qualifies the data accordingly, such as indicating that such data is used as qualitative, rather
than quantitative, information. Once the investigator comes to the conclusion that data are
unusable, NJDEP expects that any data deemed unusable will not be used to demonstrate
compliance with regulatory criteria.
The following paragraphs identify typical types and causes of significant QC violations and
provide a discussion of the factors that an investigator should consider when evaluating whether
or not the associated data is usable.
General QC Infractions
Sample Receipt Issues
Field and trip blanks were not received at the site within 1 day of their preparation at the
laboratory;
Blanks and associated samples were held longer than 2 days on-site and/or did not arrive
back at the laboratory within 1 day of shipment;
Samples to be analyzed are received outside a temperature of 4 +/-2º Celsius (C);
Samples received above a maximum temperature of 12ºC more than 24 hours from
collection; and
Lack of evidence of cooling with ice, or use of artificial ice substitutes such as "blue ice,"
which are not acceptable as evidence of cooling if the sample temperature is outside the
acceptance limits specified in the DKQ protocols or in the three preceding bullets.
Sample Containers
Any improper sample container, as described in the applicable analytical method, or a sample
container that is not properly sealed or has been otherwise compromised, should be considered
to be a significant QC infraction.
Sample Preservation
Analytical results from samples that are not preserved in accordance with the requirements of
the analytical method should be considered to be a significant QC infraction.
Analysis Holding-time Excursions (total holding time from collection)
Analytical results that are greater than the applicable regulatory criteria can be considered
usable, regardless of the holding time, as long as the intended use of the data is to identify
locations where concentrations of contaminants exceed those criteria. However, analytical
results less than regulatory criteria that were analyzed and/or extracted after more than two
times the holding time has passed should not be considered usable unless the investigator can
provide the rationale for the use of the data. Similarly, if samples for which analytical results are
greater than regulatory criteria were subject to holding-time issues and such results are
intended for use in demonstrating compliance in any way, such as using an alternative criterion,
those results must be considered in a manner similar to results that are less than regulatory
criteria.
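For illustration only, the decision logic described above can be summarized in a short sketch. The parameter names and return strings are illustrative, and the investigator's documented rationale remains the controlling element.

# Illustrative sketch of the holding-time excursion logic described above.
def holding_time_usability(result, criterion, elapsed_days, holding_time_days,
                           used_to_demonstrate_compliance=False):
    exceeds_criterion = result is not None and result > criterion
    gross_excursion = elapsed_days > 2 * holding_time_days
    if exceeds_criterion and not used_to_demonstrate_compliance:
        return "usable to identify an exceedance"
    if gross_excursion:
        return "presumed not usable without documented rationale"
    return "evaluate with qualification"

print(holding_time_usability(result=12.0, criterion=5.0,
                             elapsed_days=20, holding_time_days=7))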
Calibration Issues
If calibration issues are reported the investigator should contact the laboratory, as needed, for
guidance. Although reporting of calibration QC is not required under the DKQs on a routine
basis, the DKQ protocols require that the laboratory narrate nonconformance of calibration
issues, as described in the DKQ protocols for various analytical methods. The following
calibration issues are among those that would be considered significant QC infractions:
Instrument not calibrated by an initial calibration (ICAL);
No continuing calibration standard analyzed within 24 hrs of ICAL;
Gas Chromatography/Mass Spectrometry tune criteria significantly out of criteria (greater
than 20 percent for any one atomic mass unit); and
Relative Response Factor (RRF) less than 0.05 (with no technical justification for the low
RRF) for DKQ Methods 8260B and 8270C, which should result in rejection of all results
reported as below the reporting limit for associated samples.
Reporting Issues
Issues of suspected data fraud should be forwarded to the appropriate authorities, e.g. the
NJDEP Office of Quality Assurance.
Professional Judgment
In some cases, it is appropriate to reject data based on professional judgment. These cases
include, but are not limited to the following:
Severely poor overall instrument performance;
Low percent solids (less than 10 percent); and
Multiple QC nonconformances and gross failures.
Significant QC Violations for Specific Analytes
The following situations are considered to be significant QC violations. If any of the following
issues are reported, the investigator is encouraged to contact the laboratory for guidance.
Inorganic Compounds
LCS recovery is less than 50 percent of the control limit - An LCS less than 50 percent of
control limit may be off-set by matrix spike data within acceptance criteria to reasonably
determine that the problem is only associated with the LCS.
MS recovery is less than 30 percent for all affected analytes in a batch, with the exception of
hexavalent chromium if supported by Oxidation Reduction Potential (ORP) and pH data which
indicate reducing conditions. Hexavalent chromium readily reduces to trivalent chromium in a
reducing environment.
Organic Compounds
LCS recovery is less than 10 percent - Usability of results reported as below the reporting limit
for analytes with LCS recovery less than 10 percent is severely limited and would require
substantial justification by the investigator.
Surrogate recoveries for organics less than 10 percent - Usability of results reported as below
the reporting limit for analytes associated with surrogates with LCS recovery less than 10
percent is severely limited and would require substantial justification by the investigator.
MS/MSD recoveries for organics less than 10 percent - Usability of results reported as below
the reporting limit for affected compound in the unspiked sample (i.e., field sample used for
MS/MSD only) is severely limited and would require substantial justification by the investigator.
The investigator should also evaluate how these results may affect the usability of other sample
results in the batch.
Internal standard area counts in a sample are less than 20 percent of the associated calibration
check standard area counts - Generally, non-detects for analytes which are quantitated using
the affected internal standard are rejected and would not be usable for project decisions.
Fractionation Check Standard (FCS) recovery for EPH for any analyte included in the FCS that
is not between 40% and 140% (with lower recoveries permissible for n-Nonane but recovery
must be >25%) - results are generally rejected and are not usable for project decisions.
Endrin/DDT Breakdown Check Standard, breakdown should be less than 15 percent - Non-
detected results for endrin or DDT, whichever compound is affected, should be rejected and
detected results for the breakdown products should be considered biased high. This indicates
the equipment was in need of maintenance at the time of analysis.
Dual column precision percent difference is greater than 100 percent for single response
pesticides and herbicides - Reject all results for affected pesticides and herbicides. Dual
columns are used to confirm the presence of analytes.
Dual column precision percent difference is greater than 500 percent for multi-response
pesticides and polychlorinated biphenyls - Reject all results for affected data.
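For illustration, dual-column agreement (computed here as a relative percent difference) can be screened with the sketch below. The 40% "P" qualification threshold is taken from Appendix D-5 and the 100%/500% rejection thresholds from the bullets above; the function itself is only an example.

# Illustrative sketch: dual-column agreement check for GC results.
def dual_column_check(conc_col_1, conc_col_2, multi_response=False):
    mean = (conc_col_1 + conc_col_2) / 2.0
    rpd = abs(conc_col_1 - conc_col_2) / mean * 100.0 if mean else 0.0
    reject_limit = 500.0 if multi_response else 100.0
    if rpd > reject_limit:
        return rpd, "reject affected results"
    if rpd > 40.0:
        return rpd, "qualify detected results with P"
    return rpd, "acceptable agreement"

print(dual_column_check(10.0, 22.0))  # (75.0, 'qualify detected results with P')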
Appendix F
Poorly Performing Compounds

APPENDIX F
POORLY PERFORMING COMPOUNDS (1)
Method 8260
The following compounds are poorly performing compounds: acetone, bromoform,
bromomethane, 1,2-dibromo-3-chloropropane, dichlorodifluoromethane, cis-1,3-
dichloropropene, 1,4-dioxane, 2-hexanone, 2-butanone (MEK), 4-methyl-2-pentanone (MIBK),
naphthalene, styrene, and 1,1,2,2-tetrachloroethane. (See EPA Methods 8000 and 8260 for
more detail.) Acetone, 2-hexanone, MEK and MIBK are water soluble and are therefore poor
purgers; they are not easily purged from the water sample onto the trap. 1,4-Dioxane has poor
purging efficiency and is subject to poor recovery if chlorinated solvents are present in the
sample. 1,4-dioxane should not be analyzed by Method 8260; a modified version of Method
8270 is to be used. Naphthalene is a relatively high boiling compound for volatiles, and is also
poorly purged from the sample. The remaining compounds, bromoform, bromomethane, 1,2-
dibromo-3-chloropropane, dichlorodifluoromethane, cis-1,3-dichloropropene, styrene, and
1,1,2,2-tetrachloroethane, are easily degraded by heat as found in the injection port of the gas
chromatograph or can react in certain sample matrices resulting in poor recovery. Additionally
bromomethane and dichlorodifluoromethane are gases and are sometimes lost from the trap
during analysis.
(1) Poorly Performing Compounds are those compounds whose characteristics are such that routine analytical method
criteria are difficult to achieve. In the data assessment and data usability evaluation, added scrutiny should be given
to the "analytical behavior" of these compounds.
Method 8270
The following compounds are poorly performing compounds: 4-chloroaniline, 4-chloro-3-
methylphenol, 4,6-dinitro-2-methylphenol, 2,4-dinitrophenol, 1,4-dioxane,
hexachlorocyclopentadiene, 2-nitroaniline, 3-nitroaniline, 4-nitroaniline, 4-nitrophenol,
pentachlorophenol, phenol, pyridine, 2,4,5-trichlorophenol and 2,4,6-trichlorophenol. (See EPA
Methods 8000 and 8270 for more detail.) Most of these compounds are thermally reactive and
are potentially lost in the injection port of the gas chromatograph. All of the phenolics are
reactive with base and relatively water soluble. They are sometimes poorly extracted from
aqueous samples and if a soil sample has a basic pH, may not be extracted at all. 1,4-Dioxane
has poor extraction efficiency; however, Method 8270 has been modified by the department to
include an option. The isotopically labeled compound 1,4-dioxane-d8 is added to the sample
prior to extraction and is used both as an internal standard (to quantitate 1,4-dioxane) and a
surrogate (1,4-Dioxane-d4 is used to quantitate 1,4-dioxane-d8 as a surrogate). This option is
available for certification by the NJDEP Office of Quality Assurance.
Appendix G
Range of Data Usability Evaluation Outcomes
Appendix G
Range of Data Usability Evaluation Outcomes
The table that follows provides the data reviewer and the investigator with options for how to
use data. It describes what to look for and how to use data that may be qualified due to a
variety of issues. The user is cautioned that each element of quality needs to be addressed
before deciding that data are not usable, and each element should be reviewed by the
investigator to determine whether anything is correctable prior to the usability evaluation. Data
with certain quality assurance deficiencies may be usable in certain circumstances; in the worst
case, depending on the severity of the deficiency, the data may be unusable.
APPENDIX G
RANGE OF DATA USABILITY OUTCOMES (Note 1)

| Quality Control Element (Sample Type, Analysis, Condition or Characteristic) | Type of Nonconformance | Possible Causes | Major PARCCS Parameters Affected (Note 2) | Possible Effects on Data Usability (Note 3) |
| --- | --- | --- | --- | --- |
| Chain of Custody | Chain broken, incomplete, or not kept | Missing signatures, missing seals, missing dates or times, type of analysis requested not listed | Completeness | If confirmed that sample set is complete and samples not compromised, data are usable. |
| Sample labeling | Sample labels unreadable, missing, or not attached to containers | Failure to protect label from moisture, failure to use appropriate marker or labels, improper standard operating procedure (SOP) | Representativeness, Completeness | If the sample can be unambiguously identified, then samples are usable. |
| Sample labeling | Samples mislabeled or labeled incompletely | Sampler error, improper SOP | Representativeness | If the sample can be unambiguously identified, then samples are usable. |
| Sample containers | Plastic containers for organic analytes | Samplers unaware of container requirements, improper SOP, failure to read SOP, SOP incorrect, insufficient quantity of correct containers (samplers used containers on hand) | Representativeness, Accuracy, Completeness | Possible phthalate interference and/or volatile loss may be present. |
| Sample containers | Glass containers for metals | Samplers unaware of container requirements, improper SOP, failure to read SOP, SOP incorrect, insufficient containers | Representativeness, Accuracy, Completeness | Possible inorganic contamination may be present. |
| Headspace | Bubbles in water inside volatile organic chemical (VOC) vial | Poor sampling technique, caps not sealed tightly, septum caps not used, water vials not completely filled, improper SOP | Representativeness, Accuracy, Completeness | Loss of volatiles may occur. |
| Preservation, soil and sediment samples | VOC soil or sediment samples not properly preserved | Varies | Accuracy, Representativeness, Completeness, Comparability | Loss of volatiles may occur. |
| Preservation, aqueous samples | No preservative or wrong pH | No preservative added or improper amount of preservative added | Representativeness, Accuracy, Completeness | This is an analyte- and method-dependent issue. Loss of analytes may occur. |
| Preservation, aqueous samples | Wrong preservative | Improper SOP, failure to read SOP, SOP incorrect, correct preservative unavailable | Representativeness, Accuracy, Completeness | This is an analyte- and method-dependent issue. Loss of analytes may occur. |
| Preservation | Improper temperature (temperature outside 4 ± 2 °C) (Note 4) | Insufficient ice, samples too cold, shipping container inadequately insulated, samples not adequately cooled at time of sampling and during shipping, transit time too long or too short for samples to reach temperature | Representativeness, Accuracy, Completeness | Loss of analytes may occur if temperature is too high. If temperature is too low, check container for integrity. |
| NJDEP certification status | Laboratory not certified or approved for specific analytes by NJDEP | Varies | All may be affected | Except in limited circumstances, data should not be used. |
| Handling or Holding times | Handling and/or Holding times exceeded | Excessive analysis time; tardy ship date; inappropriate shipping method; slow laboratory turn-around time | Representativeness, Accuracy, Completeness | Loss of analytes may occur. (Note 5) |
| Analysis method | Wrong method used to analyze samples | Incorrect laboratory method specified on chain of custody form; laboratory/analyst unaware of requirement; failure to read SOP; SOP incorrect | Representativeness, Comparability, Completeness, Accuracy, Sensitivity | Except in limited circumstances, data should not be used. |
| Reporting Limit (RL) | RL too high | Insufficient measures to combat interferences (e.g., cleanup, background correction); insufficient sample; high dilution factor; wrong or inappropriate method | Comparability, Completeness, Sensitivity | If the RL for site-specific compounds of concern > the standards/screening levels, then NDs cannot be used to determine compliance. If a compound is detected and the RL is elevated, the data are usable. |
| Method blank (MB) | Method blank absent (Note 6) | Improper SOP | Representativeness, Accuracy, Completeness | Data may contain false positives and, in some circumstances, data should not be used. |
| Method blank (MB) | Contamination | Contaminated reagents, gases, glassware; ambient contamination; poor laboratory technique | Representativeness, Accuracy, Completeness | Data may contain false positives and/or high bias. |
| Equipment blank (EB) or Rinsate blank | Contamination | Improper decontamination of field sampling equipment; contaminated rinsate water, containers, or preservatives | Representativeness, Accuracy, Completeness | Data may contain false positives and/or high bias. |
| Trip blank (TB) for analysis of VOCs | Trip blank absent | TB not included; improper SOP; TB broken during shipment; TB lost during shipment | Representativeness, Accuracy, Completeness | Data may contain false positives and/or high bias. |
| Trip blank for analysis of VOCs | Contamination | Cross-contamination during shipment or storage; contaminated reagent water, glassware, or preservatives | Representativeness, Accuracy, Completeness | Data may contain false positives and/or high bias. |
| Laboratory Control Sample (LCS) | LCS absent (Note 7) | Improper laboratory SOP | Accuracy, Completeness, Comparability | Complete evaluation of the data may not be possible. |
| LCS, Laboratory Control Sample Duplicate (LCSD), blank spike (BS), blank spike duplicate (BSD) | Low recoveries | Method failure; improper spiking; degraded spiking solution; failed spiking device | Accuracy, Completeness, Comparability | Data may contain false negatives and/or low bias. |
| LCS, LCSD, BS, BSD | High recoveries | Method failure; improper spiking; degraded spiking solution; failed spiking device; contaminated reagents, gases, glassware, etc. | Accuracy, Completeness, Comparability | Data may contain false positives and/or high bias. |
| LCS, LCSDs | High RPDs | Method failure; improper spiking; failed spiking device; contaminated reagents, gases, glassware, etc. | Representativeness, Precision, Completeness, Comparability | Poor precision exists in the analytical procedure. |
| Surrogates in MB, LCS, LCSD, BS, BSD | Low recoveries | Method failure; improper spiking; degraded spiking solution; failed spiking device | Accuracy, Completeness | Laboratory performance should be questioned. |
| Surrogates in MB, LCS, LCSD, BS, BSD | High recoveries | Method failure; improper spiking; degraded spiking solution; failed spiking device; contaminated reagents, gases, glassware, etc. | Accuracy, Completeness | Laboratory performance should be questioned. |
| Surrogates in samples | Low recoveries | Matrix effects; inappropriate method; method failure; improper spiking; degraded spiking solution; failed spiking device | Accuracy, Completeness | Data may contain false negatives and/or low bias. |
| Surrogates in samples | High recoveries | Matrix effects; inappropriate method; method failure; improper spiking; degraded spiking solution; failed spiking device; contaminated reagents, gases, glassware, etc. | Accuracy, Completeness | Data may contain false positives and/or high bias. |
| MS, MSD (Note 8) | Low recoveries (Note 9) | Matrix effects; inappropriate method; method failure; inadequate cleanup; inadequate background correction; failure to use method of standard additions; improper spiking; degraded spiking solution; failed spiking device | Accuracy | Data may contain false negatives and/or low bias. |
| MS, MSD (Note 8) | High recoveries (Note 9) | Matrix effects; inappropriate method; method failure; inadequate cleanup; inadequate background correction; failure to use method of standard additions; improper spiking; degraded spiking solution; failed spiking device; contaminated reagents, gases, glassware, etc. | Accuracy | Data may contain false positives and/or high bias. Qualify sample results greater than the RL (i.e., possible matrix effects). |
| MS, MSD (Note 8) | High Relative Percent Difference | Sample heterogeneity; inadequate sample mixing for non-VOC samples in the laboratory or the field; samples misidentified; method failure; improper spiking; failed spiking device; duplicate spiking of a sample; contaminated reagents, gases, glassware, etc. | Representativeness, Precision | The sample itself may be heterogeneous, leading to poor precision (high variability). |
| Dilution factors | Extremely high dilution factors | High concentrations of interferences or analytes; inappropriate analytical method used or selected | Accuracy, Comparability, Completeness | Samples with high RLs may not meet DQOs, and RLs may become greater than regulatory criteria. |
| Field Duplicates | Field duplicates are not comparable within DQOs | Sample inhomogeneity; insufficient mixing in field; samples not split but collocated (Note 10); insufficient mixing in laboratory | Representativeness, Precision | The sample itself may be heterogeneous, leading to poor precision (high variability). The sample may not be representative of site conditions. |

This table was adapted from US Army Corps of Engineers, Environmental Quality Assurance for HTRW Projects, Engineer Manual EM 200-1-6, October 10, 1997, Table 3-1.
Notes:
(1) Entries in the Possible Causes, PARCCS Parameters Affected, Effect on Data, and Possible Data Evaluation columns assume only one type of failure
occurring at any one time. The cumulative or synergistic effects of more than one failure type occurring simultaneously make data usability evaluation more
complex. Data usability evaluations involving multiple failure types are beyond the scope of this table. Not all possible QC failures and outcomes are illustrated on
this table.
(2) The PARCCS parameters most affected are listed. All of the PARCCS parameters may be affected in some cases. Any failure that results in invalid data affects
Completeness.
(3) All data usability evaluations are subject to discretion of the investigator taking into account project DQOs, and the intended use of the analytical data. The
DQA and DUE thought process must be documented in the report using the data.
(4) Refrigeration not required for trace metals (excluding mercury).
(5) Exceeding holding times on some analyses can produce false positives (e.g., carbonates, dissolved oxygen, etc.) and high bias (e.g., pH, carbonates, dissolved
oxygen, etc.). High bias and false positives can also occur when degradation products of contaminants are themselves analytes; for example, when 4,4'-DDT is
present and holding times are exceeded, high bias and false positives for the degradation products 4,4'-DDD, 4,4'-DDE, 2,4'-DDD, and 2,4'-DDE can occur.
(6) Method blanks are not appropriate for all analyses (e.g., pH, conductivity, % solids, etc.).
(7) Laboratory control samples are not appropriate for all analyses (e.g., pH, conductivity, % solids, etc.).
(8) Matrix spike and matrix spike duplicates are performed at the request of the investigator and may not be present.
(9) Note that when the native sample concentration is significantly greater than the effective spike concentration, the conclusion of a matrix effect is only
tentative. As a general rule of thumb, the native sample concentration should be no more than four times higher than the effective matrix spike concentration for
the matrix effect to be considered probably present.
(10) Conventional sampling protocols for some analyte classes (e.g., VOCs) prohibit sample mixing and splitting because mixing results in the loss of analytes. Field and
QC samples for these analytes are more appropriately collected as sample pairs.
Appendix H
Data Usability Evaluation Worksheet
APPENDIX H-1
INSTRUCTIONS FOR USE OF THE DATA USABILITY EVALUATION WORKSHEET
The Data Usability Evaluation Worksheet (DUE Worksheet) can be used to document the
investigator’s thought process during a DUE of the QC nonconformances that were cataloged
as part of the DQA. A description of the “Nonconformance DQA Review Elements” listed in the
left hand column can be found in Appendix C of this document. The DUE worksheet is available
below in Appendix H-2 and can be modified by the user.
APPENDIX H-2
DATA USABILITY EVALUATION WORKSHEET
Project Name: ___________________________________________________
Laboratory: _____________________________________________________
Sample Delivery Group: ___________________________________________
Sample Delivery Group Number: ___________________________________
Date Samples Collected: ___________________________________________
Reviewer: _______________________________________________________
Describe the intended use of the data:
Nonconformance
DQA Review
Elements
Briefly Summarize DQA Nonconformances
Laboratory Report
Inspection
Reasonable
Confidence
Evaluation
Chain of Custody
Evaluation
Sample Result
Evaluation
Sample
Preservation and
Holding Time
Evaluation
Blank Evaluation
Laboratory Control
Samples
Surrogates
Site Specific Matrix
Spikes and Matrix
Spike Duplicates
Tentatively
Identified
Compounds
Other QC data
APPENDIX H-2 (CONTINUED)
DATA USABILITY EVALUATION WORKSHEET
Provide a summary statement describing how the analytical data set relied upon is of adequate quality
and of sufficient accuracy, precision, and sensitivity for the intended purpose. Questions for the
investigator to consider during the DUE include, but are not limited to, the following (see the text of
this guidance for additional information):
How will the analytical data be used:
Is this the initial site investigation to determine if and what contamination exists?
Will the analytical results be used to determine compliance with Regulatory criteria (e.g. post
excavation samples)?
Will remedial action be conducted?
Has remedial action been conducted?
Are the results going to be used to guide further remedial investigation?
Are the results going to be used to guide further remedial action (including monitored natural
attenuation of ground water)?
Will the results be used to evaluate seasonal variability or homogeneity in an environmental sample?
Laboratory QC Information
If the results are close to a regulatory limit, does any QC bias affect the interpretation of the data?
Are significant QC variances reported?
Are the biases high or low?
Are the identified QC nonconformances related to results for substances that are reported as
“ND,” and are the reporting limits less than the regulatory criteria?
Are the nonconformances related to poorly performing compounds that are not constituents of
concern?
Are the nonconformances related to substances that are not constituents of concern?
How do the nonconformances affect “NDs” and reported concentrations?
DQOs
Were the DQOs for precision, accuracy, representativeness, comparability, completeness, and
sensitivity met?
Are all critical samples usable for the intended purpose(s)?
Does sample homogeneity or heterogeneity affect the representativeness of the samples?
CSM
Do any analytical QC nonconformances create significant data gaps in the conceptual site
model?
Evaluate the entire body of information (type, amount, and quality of data) available for the specific
area/release for which the data are presumed to be representative. Determine whether any
newer data corroborate the older results and whether both sets of data are consistent with the
CSM.
Consider the risk of being wrong based on risk to potential receptors and the risk to human health
and the environment.
Consider the source of data (e.g., whether the data were generated by the investigator’s own firm
or some other firm, the investigator’s own involvement with the project, method of collection for
the samples, and reporting methods by other firms/laboratories generating the data). Perform a
critical review of these data to evaluate their reliability.
Consider any other site-specific factors.
Appendix I
Surrogates and Internal Standards
APPENDIX I-1
SEMI-VOLATILE INTERNAL STANDARDS AND THEIR CORRESPONDING TARGET COMPOUNDS AND SURROGATES
This table lists the commonly used (e.g., DKQ Methods 8260 and 8270) internal standards and their associated target compounds and surrogates
for semi-volatiles. If the laboratory data indicate a problem with the internal standard(s) and/or surrogate(s), this table can be used to evaluate
which target compounds are affected. For instance, if the surrogate 1,2-Dichlorobenzene-d4 had a low recovery, the compounds listed in the same
column would potentially be affected as well, and low bias should be suspected unless otherwise indicated by additional QC data.
1,4-Dichlorobenzene-d4
Naphthalene-d8
Acenaphthene-d10
Phenanthrene-d10
Chrysene-d12
Perylene-d12
Aniline
Nitrobenzene
Hexachlorocyclopentadiene
4,6-Dinitro-2-methylphenol
Pyrene
Di-n-octyl phthalate
Phenol
Isophorone
2,4,6-Trichlorophenol
4-Bromophenyl-phenylether
Butylbenzylphthalate
Benzo(b) fluoranthene
bis-(2-Chloroisopropyl ether)
2-Nitrophenol
2,4,5-Trichlorophenol
N-Nitroso-diphenylamine
3,3'-Dichlorobenzidine
Benzo(k) fluoranthene
2-Chlorophenol
2,4-Dimethylphenol
2-Chloronaphthalene
Hexachlorobenzene
Benzo(a)anthracene
Benzo(a)pyrene
2-Methylphenol
bis-(2-Chloroethoxy)methane
2-Nitroaniline
Pentachlorophenol
Chrysene
Indeno(1,2,3-cd)-pyrene
Pyridine
2,4-Dichlorophenol
Dimethylphthalate
Phenanthrene
bis-(2-Ethylhexyl)phthalate
Dibenzo(a,h)-anthracene
2,2'-oxybis-(1-Chloropropane)
Naphthalene
2,6-Dinitrotoluene
Anthracene
Terphenyl-d14 (surr)
Benzo(g,h,i)perylene
4-Methylphenol
4-Chloroaniline
Acenaphthylene
Carbazole
N-Nitroso-di-n- propylamine
Hexachlorobutadiene
3-Nitroaniline
Di-n-butylphthalate
Hexachloroethane
4-Chloro-3-methylphenol
Acenaphthene
Fluoranthene
2-Fluorophenol (surr)
2-Methylnaphthalene
2,4-Dinitrophenol
Pentachloronitro-benzene
Phenol-d5 (surr)
1,2,4-Trichlorobenzene
4-Nitrophenol
2,4,6-Tribromophenol (surr)
2-Chlorophenol-d4 (surr)
Nitrobenzene-d5 (surr)
Dibenzofuran
1,2-Dichlorobenzene-d4 (surr)
2,4-Dinitrotoluene
Diethylphthalate
Fluorene
4-Chlorophenylphenylether
4-Nitroaniline
1,2,4,5-Tetrachlorobenzene
2-Fluorobiphenyl (surr)
APPENDIX I-2
VOLATILE INTERNAL STANDARDS AND THEIR CORRESPONDING TARGET COMPOUNDS AND SURROGATES
1,4-Difluorobenzene (I.S.)
Chlorobenzene-d5 (IS)
1,4-Dichlorobenzene-d4 (IS)
Dichlorodifluoromethane
1,1,1-Trichloroethane
Bromoform
Chloromethane
Cyclohexane
1,3-Dichlorobenzene
Vinyl chloride
Carbon tetrachloride
1,4-Dichlorobenzene
Bromomethane
Benzene
1,2-Dichlorobenzene
Chloroethane
Trichloroethene
1,2-Dibromo-3-chloropropane
Trichlorofluoromethane
Methylcyclohexane
1,2,4-Trichlorobenzene
1,1-Dichloroethene
1,2-Dichloropropane
1,2,3-Trichlorobenzene
1,1,2-Trichloro-1,2,2-trifluoroethane
Bromodichloromethane
1,2-Dichlorobenzene-d4 (DMC)
Acetone
cis-1,3-Dichloropropene
Carbon disulfide
4-Methyl-2-pentanone
Methyl acetate
Toluene
Bromochloromethane
trans-1,3-Dichloropropene
Methylene chloride
1,1,2-Trichloroethane
trans-1,2-Dichloroethene
Tetrachloroethene
Methyl tert-butyl ether
2-Hexanone
1,1-Dichloroethane
Dibromochloromethane
cis-1,2-Dichloroethene
1,2-Dibromoethane
2-Butanone
Chlorobenzene
Chloroform
Ethylbenzene
1,2-Dichloroethane
m,p-Xylene
1,4-Dioxane
o-Xylene
Vinyl chloride-d3 (DMC)
Styrene
Chloroethane-d5 (DMC)
Isopropylbenzene
1,1-Dichloroethene-d2 (DMC)
1,1,2,2-Tetrachloroethane
2-Butanone-d5 (DMC)
Benzene-d6 (DMC)
Chloroform-d (DMC)
1,2-Dichloropropane-d6 (DMC)
1,2-Dichloroethane-d4 (DMC)
trans-1,3-Dichloropropene-d4 (DMC)
1,4-Dioxane-d8 (DMC)
Toluene-d8 (DMC)
2-Hexanone-d5 (DMC)
1,1,2,2-Tetrachloroethane-d2 (DMC)
APPENDIX I-3
Surrogates for Chlorinated Pesticides and Aroclors
Decachlorobiphenyl
Tetrachloro-m-xylene
Appendix J
Supplemental Examples Using Multiple Lines of Evidence
APPENDIX J
SUPPLEMENTAL EXAMPLES USING MULTIPLE LINES OF EVIDENCE
These examples illustrate how multiple lines of evidence may be used to address QC
nonconformances.
Example J-1: Surrogates Low Recovery, Expanded Version of Example 11
A soil sample was analyzed by DKQ Method 8260. The intended use of the analytical data was
to determine if contaminants were present at concentrations that exceed the Impact to Ground
Water Screening Level (IGWSL).
The percent recovery for the surrogate Toluene-d8 was reported to be 20 percent. The method
specifies that the recovery limits for surrogates must be within 70 to 130 percent. Because the
reported recovery for this surrogate is outside the acceptance criteria for Volatile Organic
Compounds (VOCs), all VOC results are potentially biased low.
1,1,1-Trichloroethane was reported at a concentration of 0.1 mg/kg, which is just below
the applicable criteria of 0.2 mg/kg.
MS/MSD percent recoveries from a soil sample collected at the site, from substantially the
same type of unconsolidated material as the sample, were within the DKQ acceptance
criteria for all compounds reported by DKQ Method 8260. 1,1,1-Trichloroethane was not
detected (ND) as a target compound in the MS/MSD sample.
The RPD for the MS/MSD pair for 1,1,1-trichloroethane is 13.3 percent. The method
specifies that relative percent difference must be less than 30 percent for the MS/MSD
pair.
All other quality control criteria were within the DKQ acceptance criteria.
The reported percent recovery for the surrogate toluene-d8 indicates a potential low bias for all
volatile organic compounds. Because the reported concentration of 1,1,1-trichloroethane is just
below the IGWSL, the reported potential low bias associated with the surrogate recovery means
the results should not be used to solely determine that 1,1,1-trichloroethane is present at a
concentration less than the regulatory criteria. Multiple lines of evidence such as matrix spikes
and matrix spike duplicates were used to evaluate this data set further. However, the MS/MSD
percent recoveries for soil samples collected at the site, from substantially the same type of
unconsolidated material as the sample, were reported within DKQ acceptance criteria.
Conclusion: The evaluation of these results using multiple lines of evidence would not prevent
the investigator from concluding that 1,1,1-trichloroethane is not present at a concentration
greater than the regulatory criteria.
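As a purely illustrative companion to Example J-1, the short sketch below restates the checks described above in code. The numeric values and acceptance limits are taken from the example text; the variable names are hypothetical and do not come from the guidance.

```python
# Values from Example J-1; variable names are hypothetical.
surrogate_recovery = 20.0            # percent, Toluene-d8
surrogate_limits = (70.0, 130.0)     # percent, method acceptance window
reported_111_tca = 0.1               # mg/kg, 1,1,1-trichloroethane
igwsl_111_tca = 0.2                  # mg/kg, applicable screening level
msmsd_rpd = 13.3                     # percent, MS/MSD pair
rpd_limit = 30.0                     # percent

potential_low_bias = surrogate_recovery < surrogate_limits[0]
reported_below_criterion = reported_111_tca < igwsl_111_tca
msmsd_precision_acceptable = msmsd_rpd < rpd_limit

# A potentially low-biased result that is only just below the screening level
# cannot, by itself, demonstrate compliance; the acceptable MS/MSD recoveries
# and precision are the additional lines of evidence relied on in the example.
print(potential_low_bias, reported_below_criterion, msmsd_precision_acceptable)
```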
Example J-2: Laboratory Control Samples Low Recovery, Expanded Version of Example 12
Ground water samples were analyzed by DKQ Method 8260. The purpose of sampling was to
determine compliance with regulatory criteria. The GWQS for benzene is 1 μg/l.
The results for the LCS indicate a 54 percent recovery for benzene. The method specifies
that the recovery limits for the LCS must be within 70 to 130 percent.
The analytical results were ND for benzene at a reporting limit of 0.5 μg/l.
The surrogate recoveries are within the DKQ method acceptance criteria.
The MS/MSD percent recoveries from a water sample collected at the site, from
substantially the same aquifer as the sample, were within the DKQ method acceptance
criteria for all compounds reported by DKQ Method 8260. Benzene was ND as a target
compound in the MS/MSD sample.
The RPD for the MS/MSD pair for benzene is 23.3 percent. The method specifies that
the RPD must be less than 30 percent for the MS/MSD pair.
All other QC criteria are within the DKQ acceptance criteria.
The results of the laboratory control sample indicate a potential low bias in the accuracy of the
method. Therefore, the results reported could have been affected by the low bias associated
with the method, and the results should not solely be used to determine if benzene is
present at a concentration greater than the GWQS. Multiple lines of evidence such as
surrogates, and matrix spikes and matrix spike duplicates were used to evaluate this data set
further. However, the surrogate recoveries were within the DKQ method acceptance criteria,
indicating an acceptable degree of accuracy for the analytical method. In addition, the MS/MSD
percent recoveries from a water sample collected at the site, from substantially the same aquifer,
were reported within DKQ acceptance criteria.
Conclusion: The evaluation of these results using multiple lines of evidence would indicate to
the investigator that benzene is below the applicable GWQS.
Example J-3: MS/MSD High Recoveries, Expanded Version of Example 15
A residential soil sample was analyzed by DKQ Method 8260 for VOCs. The intended use of
the data is to determine compliance with the residential direct contact soil remediation standard.
Trichloroethene (TCE) was reported at a concentration of 8 mg/kg, which is just above
the residential direct contact soil standard of 7 mg/kg.
The percent recoveries for TCE generated by an MS/MSD pair are 180 and 185 percent,
respectively. According to the DKQ method, the recovery limits for the MS/MSD should
be within 70 to 130 percent.
The RPD for the MS/MSD pair is 2.7 percent. The relative percent difference should be
less than 30 percent for the MS/MSD pair.
The surrogates are within DKQ acceptance criteria.
In a duplicate sample, TCE was reported at a concentration of 10 mg/kg, which is just
above the residential direct contact soil standard of 7 mg/kg. The relative percent
difference between the original and duplicate sample is 18.2 percent, which indicates an
acceptable degree of precision between the two samples.
All other QC criteria were within the DKQ acceptance criteria.
The spike recoveries indicate a potential high bias for TCE. Because of the reported high bias
and the sample result just above the residential direct contact soil standard of 7 mg/kg, the
actual concentration of TCE in the sample may be lower and may be less than the residential
direct contact soil standard of 7 mg/kg. However, the investigator cannot adjust the
concentrations of the reported values lower. The RPD for the MS/MSD pair was within the
acceptance criteria specified in the DKQ method, and therefore, the MS/MSD results show an
acceptable degree of precision. Because of the reported high bias associated with the
MS/MSD pair, the MS/MSD results should not be used solely to determine if TCE is present at a
concentration greater than the residential direct contact soil remediation standard.
Multiple lines of evidence, including surrogate recoveries and duplicate sample results, were used to
further evaluate this data set. The surrogate recoveries are within the range specified in the
DKQ method. The duplicate sample results indicate that the concentration of TCE is above the
residential direct contact soil standard of 7 mg/kg.
Conclusion: The evaluation of these results using multiple lines of evidence would indicate that
TCE is above the applicable residential direct contact soil standard of 7 mg/kg.
(Note: Using the same example as above for a non-residential site, the conclusion is that the
concentration of TCE is below the non-residential direct contact soil remediation standard (20
mg/kg); however, the concentration of TCE would exceed the default impact to ground water
criteria (0.007 mg/kg) necessitating the evaluation of that pathway.)
Example J-4: ICS Low Recoveries, Expanded Version of Example 16
Ground water samples were analyzed by DKQ Method 6010. The purpose of sampling was to
determine compliance with Regulatory criteria. The GWQS for Arsenic and Cadmium are 3
ug/L and 4 ug/L, respectively.
The results for the ICS indicate a 54 percent recovery for arsenic and 60 percent
recovery for cadmium. The DKQ protocol specifies that the recovery limits for the ICS
should be within 80 to 120 percent.
The analytical results were both at the GWQS of 3 ug/L and 4 ug/L for Arsenic and
Cadmium, respectively.
The matrix spike recoveries for arsenic and cadmium were below the QC limit of 75 percent,
at 65 percent and 60 percent, respectively.
The duplicate results were both acceptable for arsenic and cadmium.
Due to the proximity of the sample results to the GWQS, multiple lines of evidence should be
evaluated. The results of the ICS indicate a possible low bias in the accuracy of the method.
The result of the MS provides additional evidence that results are biased low. The duplicate
result demonstrates acceptable precision.
Conclusion: Based on the data reviewed, one would conclude that the sample results are likely
above the applicable GWQS, and additional sampling and analyses would be recommended.
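As with the preceding examples, the sketch below simply restates the Example J-4 observations in code to make the logic explicit. The values are taken from the example text; the variable names are hypothetical.

```python
# Values from Example J-4; variable names are hypothetical.
ics_recovery = {"arsenic": 54.0, "cadmium": 60.0}    # percent
ics_lower_limit = 80.0                               # percent
ms_recovery = {"arsenic": 65.0, "cadmium": 60.0}     # percent
ms_lower_limit = 75.0                                # percent
reported_ug_per_l = {"arsenic": 3.0, "cadmium": 4.0}
gwqs_ug_per_l = {"arsenic": 3.0, "cadmium": 4.0}

for analyte in reported_ug_per_l:
    low_bias_indicated = (ics_recovery[analyte] < ics_lower_limit
                          and ms_recovery[analyte] < ms_lower_limit)
    at_the_standard = reported_ug_per_l[analyte] >= gwqs_ug_per_l[analyte]
    # Two independent recovery checks pointing low, plus a result already at
    # the GWQS, support the conclusion that the true concentration is likely
    # above the standard and that additional sampling is warranted.
    print(analyte, low_bias_indicated, at_the_standard)
```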
Appendix K:
Glossary
Term Definition
Accuracy
Accuracy describes the closeness of agreement between an observed
value and an accepted reference value that is taken to be the true
value. Accuracy is typically evaluated using spikes (laboratory control
samples, surrogate spikes, and matrix spikes) and blanks (trip, field,
and method), or any other standard subjected to the entire analytical
process. Accuracy is usually reported as a percentage of the observed
value divided by the reference value (percent recovery) using the
following equation:
%R = (observed value / reference value) x 100
where %R = percent recovery.
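For readers who prefer a worked illustration of the percent recovery equation above, a minimal sketch follows; the function name and the spike concentrations are hypothetical examples, not values from this guidance.

```python
def percent_recovery(observed_value, reference_value):
    """Percent recovery (%R) as defined above: observed / reference x 100."""
    return observed_value / reference_value * 100.0

# Hypothetical example: a laboratory control sample spiked at 50 ug/L
# (reference value) is measured at 42 ug/L (observed value).
print(round(percent_recovery(42.0, 50.0), 1))   # 84.0 percent recovery
```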
Acid Semivolatile
Organic Compound
Surrogates
Acid surrogates are compounds routinely used with semi-volatile
methods
that exhibit similar chemical behavior to acidic organic
compounds such as phenols. Common acid surrogates include: 2-
Fluorophenol, phenol-d5 (a deuterated phenol), and 2,4,6-
Tribromophenol. (See also surrogate).
Analyte
Analyte means the substance being measured by an analytical
procedure.
Analytical Batch
An analytical batch is a group of samples that are processed and
analyzed as a unit. For quality control purposes, the maximum number
of samples in a batch is 20 per matrix.
Applicable Standard/Screening Level
Residential Direct Contact Health Based Criteria and Soil Remediation Standards (RDC SRS) (2), http://www.nj.gov/dep/srp/regs/rs/rs_rule.pdf
Nonresidential Direct Contact Health Based Criteria and Soil Remediation Standards (NRDC SRS) (3), http://www.nj.gov/dep/srp/regs/rs/rs_rule.pdf
Default Impact to Ground Water Soil Screening Levels for Contaminants (4), http://www.nj.gov/dep/srp/guidance/rs/partition_equation.pdf
Default Leachate Criteria for Class II Ground Water (Synthetic Precipitation Leachate Procedure) (5), http://www.nj.gov/dep/srp/guidance/rs/splp_guidance.pdf
Specific Ground Water Quality Criteria (Groundwater Quality Standards) (6), http://www.nj.gov/dep/rules/rules/njac7_9c.pdf
Surface Water Quality Criteria for Toxic Substances (SWQC) (7), http://www.nj.gov/dep/rules/rules/njac7_9b.pdf
Maximum Contaminant Levels (MCL) for State Regulated VOCs (8), http://www.state.nj.us/dep/rules/rules/njac7_10.pdf
NJDEP Master Table Generic Vapor Intrusion Screening Levels, including Vapor Intrusion Groundwater Screening Levels (GWSL) (9), Vapor Intrusion Residential Indoor Air Screening Level (RIASL) (10), and Vapor Intrusion Nonresidential Indoor Air Screening Level (NRIASL) (11), all at http://www.nj.gov/dep/srp/guidance/vaporintrusion/vig_tables.pdf
NJDEP Action Levels for Indoor Air (12), http://www.nj.gov/dep/srp/guidance/vaporintrusion/vig_tables.pdf
Vapor Intrusion Health Department Notification Levels (HDNL) (13), http://www.nj.gov/dep/srp/guidance/vaporintrusion/vig_tables.pdf
Extractable Petroleum Hydrocarbons (EPH) (14), http://www.nj.gov/dep/srp/guidance/srra/eph_method.pdf
Hexavalent Chromium Cleanup Criterion (15), http://www.state.nj.us/dep/srp/guidance/rs/chrome_criteria.pdf
Ecological Screening Criteria (16), http://www.nj.gov/dep/srp/guidance/ecoscreening/esc_table.pdf
Site-specific criteria developed for the investigation and remediation according to the applicable NJDEP guidance.
Area of Concern
"Area of concern" means any existing or former distinct location or
environmental medium where any hazardous substance, hazardous
waste, or pollutant is known or suspected to have been discharged,
generated, man
ufactured, refined, transported, stored, handled,
treated, or disposed, or where any hazardous substance, hazardous
waste, or pollutant has or may have migrated, including, but not limited
to, each current and former objects and/or areas defined in N.J.A.C.
7:26E-1.8.
Base Neutral
Semivolatile Organic
Surrogates
Base neutral semivolatile organic surrogates exhibit similar chemical
behavior to the base-neutral semivolatile organic compounds. Common
examples include: Nitrobenzene-d5, 2-Fluorobiphenyl, and terphenyl-
d14. (See also surrogate).
Bias
Bias is the deviation of the measured value from the true value. This
can be analytical bias within the analytical procedure, or it can be due
to matrix effects. There is inherent bias within all analytical procedures.
Quality control measurement tools that can be used to evaluate bias
include laboratory control samples, check standards, matrix spikes, or
any other standards used for analysis.
(2) NJDEP, Remediation Standards, N.J.A.C. 7:26D.
(3) NJDEP, Remediation Standards, N.J.A.C. 7:26D.
(4) NJDEP, Development of Site-Specific Impact to Ground Water Soil Remediation Standards Using the Soil-Water Partition Equation, December 2008, http://www.nj.gov/dep/srp/guidance/rs/.
(5) NJDEP, Guidance for the use of the Synthetic Precipitation Leaching Procedure to Develop Site-Specific Impact to Ground Water Remediation Standards, June 2, 2008, http://www.nj.gov/dep/srp/guidance/rs/.
(6) NJDEP, Groundwater Quality Standards, N.J.A.C. 7:9C.
(7) NJDEP, Surface Water Quality Standards, N.J.A.C. 7:9B.
(8) NJDEP, Safe Drinking Water Act Regulations, N.J.A.C. 7:10.
(9) NJDEP, Vapor Intrusion Technical Guidance, criteria dated March 2013, http://www.nj.gov/dep/srp/guidance/vaporintrusion/.
(10) Ibid.
(11) Ibid.
(12) Ibid.
(13) Ibid.
(14) NJDEP, Protocol for Addressing Extractable Petroleum Hydrocarbons, Version 5.0, August 9, 2010, http://www.nj.gov/dep/srp/guidance/srra/eph_protocol.pdf.
(15) NJDEP, Chromium Soil Cleanup Criteria, April 2010.
(16) NJDEP, Ecological Screening Criteria, March 10, 2009, http://www.nj.gov/dep/srp/guidance/ecoscreening.
Calibration Curve/Initial
Calibration
A calibration curve/initial calibration curve is generated by analyzing a
series of standards and plotting instrument response versus
concentration. A calibration curve is used to calibrate an analytical
system. Calibration criteria are specified in each analytical method.
Check Standard
A check standard is a solution of one or more analytes that is used to
document laboratory performance. This check standard can go by
many different names including laboratory control samples and
laboratory fortified blank. Consult with the laboratory to understand the
naming scheme used to identify such standards. This standard can
also be used to check the validity of a purchased stock or calibration
standard.
Comparability
Comparability refers to the equivalency of two sets of data.
Comparability may be achieved through the use of standard or similar
techniques to collect and analyze representative samples. Comparable
data sets must contain the same variables of interest and must
possess values that can be converted to a common unit of
measurement. Comparability is normally a qualitative parameter that is
dependent upon other data quality elements.
Completeness
Completeness is a measure of the amount of valid data obtained from
a measurement system compared to the amount that was expected to
be obtained under correct, normal conditions.
Conceptual Site Model Defined in NJDEP Conceptual Site Model Technical Guidance, 2012.
Contaminant or
Contamination
Contamination or contaminant means any discharged hazardous
substance as defined pursuant to N.J.S.A. 58:10-
23.11b, hazardous
waste as defined pursuant to N.J.S.A. 13:1E-38, or pollutant as defined
pursuant to N.J.S.A. 58:10A-3.
Contaminant of
Potential Ecological
Concern (COPEC)
COPEC means a substance detected at a contaminated site that has
the potential to adversely affect ecological receptors because of its
concentration, distribution, and mode of toxicity; contaminants with
concentrations above their respective New Jersey Surface Water
Quality Standards or ecological screening criteria are identified as
contaminants of potential ecological concern.
Control Sample
Control sample means a quality control sample introduced into a
process to monitor the performance of a system.
Critical Sample
Critical samples are user-defined samples for which the completeness
goal is usually 100 percent.
Data of Known Quality
When “Data of Known Quality” is achieved for a particular data set, the
investigator will have confidence that the laboratory has followed the
Data of Known Quality Protocols, has described nonconformances, if
any, and that the investigator has adequate information to make
judgments regarding data quality.
Data of Known Quality
Protocols (DKQPs)
DKQPs include specific laboratory quality assurance and quality control
(QA/QC) criteria that produce analytical data of known and
documented quality. The DKQ protocols are shown in Appendix B of
the NJDEP Site Remediation Program, DATA OF KNOWN QUALITY
PROTOCOLS TECHNICAL GUIDANCE, April 2014. (DKQ Guidance)
Data Quality
Objectives (DQOs)
DQOs, developed by the investigator, are qualitative and quantitative
statements derived from the DQO Planning Process that clarify the
purpose of the study, define the most appropriate type of information to
collect, determine the most appropriate conditions from which to collect
that
information, and specify tolerable levels of potential decision
errors.
Environmental Sample
An environmental sample is a sample of soil, groundwater, surface
water, soil vapor, sediment, air, or any other environmental matrix
collected for analysis.
Equipment-Rinsate
Blank
An equipment-rinsate blank is a sample of analyte-free water that is
used to rinse the sampling equipment. An equipment-rinsate blank is
collected after decontamination to assess potential contamination from
inadequate decontamination of field equipment. An equipment-rinsate
blank can also be used to evaluate the potential for field sampling
equipment to leach contaminants into a sample and cause cross
contamination.
Field Blank
A field blank is analyte-free matrix, usually water, prepared in the
laboratory and transported to the sampling location along with the
empty sample containers. At the sampling location the matrix is used to
fill randomly selected sample containers and then returned to the
laboratory for analysis. The field blank is treated as a sample in all
respects, including exposure to sampling location conditions, storage,
preservation, and all analytical procedures. Field blanks are used to
assess any contamination contributed from sampling location
conditions and the transport, handling, and storage of the samples.
Field Duplicates
Field duplicates are replicates collected from the same location in the
field and submitted to the laboratory as two distinct samples.
Duplicates are used to evaluate precision, sample homogeneity, and
field sample collection activities.
Field Reagent Blank See “Field Blank.”
Gas Chromatography/
Mass Spectrometry
Gas Chromatography/Mass Spectrometry is an analytical procedure in
which a gas chromatograph is connected to a mass spectrometer. The
technique allows for both accurate identification and quantitation of
analytes.
Handling Time
The maximum amount of time for a QC sample (e.g., field or trip
blanks) to be transported to a site and/or the maximum amount of time
for transport of site field samples and field QC samples back to the
laboratory. Samples held beyond the allowed handling time may be
considered biased low or invalid, depending on the intended use of the
data (see NJDEP Field Sampling Procedures manual, August 2005).
Holding Time
The holding time is the maximum time that a sample may be held, after
the sample is taken prior to preparation and/or analysis and still be
considered valid or not compromised. Holding times can include time to
extraction and time allowed after extraction before analysis and time
allowed prior to digestion and after digestion prior to analysis based on
method specific requirements. Samples analyzed past the holding time
are determined to be compromised and
may be considered invalid,
depending on the intended use of the data.
Instrument Blank
An instrument blank is analyte-free matrix (e.g., distilled water)
processed through the instrumental steps of the measurement process;
used to determine instrument contamination. Typically gas
chromatography methods (excluding volatile organic compounds) use
pure solvent as an instrument blank while metals and wet chemistry
techniques use water or acidified water. Gas chromatography methods
for volatile organic compounds use either acidified water or methanol.
Internal Standards
For certain analytical methods, internal standards are compounds that
are added, immediately prior to analysis, at a known concentration to
every standard, blank, sample, and quality control sample. Internal
standards are used to calibrate the analytical system by plotting the
response of the internal standards versus the compound(s) of interest.
Internal standards should closely match the chemical behavior of the
compound(s) of interest and be known not to be present in the sample.
Laboratory Control
Sample (LCS)
A LCS is a sample matrix, free from the analytes of interest, spiked with
verified known amounts of analytes or a material containing known and
verified amounts of analytes from the same source as the calibration
standards. It is generally used to establish intra-laboratory or analyst
specific precision and bias or to assess the performance of all or a
portion of the measurement system. The LCS is carried through the
analysis along with the samples. LCSs are also known as laboratory
fortified blanks or blank spikes.
Laboratory Fortified
Blank
See “Laboratory Control Sample.”
Licensed Site
Remediation
Professional
(investigator)
An individual who is licensed by the board pursuant to section 7 of P.L.
2009, c.60 (C.58:10C-7) or the department pursuant to section 12 of
P.L. 2009, c.60 (C.58:10C-12).
Matrix Duplicates
Matrix duplicates refer to the replicate analyses of samples taken from
the same sample container and prepared in the laboratory. Matrix
duplicates are used to evaluate precision and sample homogeneity.
Matrix Interference
Matrix interferences are manifestations of non-target analytes or
physical/chemical characteristics of a sample that prevent the
quantification of the target analyte (i.e., the compound or element of
interest being effectively quantified by the test method) as it is routinely
performed, typically adversely impacting the reliability of the
determination. For example, some matrices including silt, clay, coal,
ash, and peat effectively bind analytes which may lead to low biased
results for certain extraction/analysis procedures. Co-eluting peaks in a
GC chromatogram may result in a high bias for an analyte of concern.
Matrix
The matrix is the material of which the sample is composed or the
substrate (e.g., surface water, ground water, drinking water, soil,
sediment, air) that may or may not contain an analyte of interest.
Matrix Spike
A matrix spike is an aliquot of an environmental sample to which known
quantities of target analytes are added in the laboratory. The matrix
spike is analyzed in an identical manner as a sample. The purpose of a
matrix spike sample is to determine the quantitative accuracy of the
overall analytical procedure for determining the analytes of concern in
the sample.
Matrix Spike Duplicate
A matrix spike duplicate is an intra-laboratory split sample, with both
aliquots spiked with identical concentrations of method analytes. The
spiking occurs prior to sample preparation and analysis. The results are
used to document the precision and accuracy of a method in a given
sample matrix. See also “Matrix Spike.”
Method Blank
A method blank is an “analyte-free” matrix that is treated exactly as a
sample including exposure to all glassware, equipment, solvents,
reagents, labeled compounds, internal standards, and surrogates that
are used with samples. The method blank is used to determine if
analytes or interferences are present in the laboratory environment, the
reagents, or the apparatus. A method blank may also be referred to as
a laboratory reagent blank.
Nonconformance
A nonconformance is an occurrence during the processing or analysis
of a sample that deviates from the quality control performance criteria
of the analytical method. Examples of nonconformances include, but
are not limited to, missed holding times, temperature excursions,
recoveries of surrogates or matrix spikes outside of performance
criteria, initial or continuing calibration failures.
Non-target compounds
Non-targeted compound means a compound detected in a sample
using a specific analytical method that is not a targeted analyte (see
below), a surrogate compound, a system monitoring compound, a
deuterated monitoring compound or an internal standard compound.
PARCCS Parameters
The PARCCS parameters are precision, accuracy, representativeness,
comparability, completeness, and sensitivity.
Performance
Evaluation Sample
See “Proficiency Test Sample.”
Petroleum (or
Petroleum Product)
Petroleum" or "petroleum products" means oil or petroleum of any kind
and in any form, including, but not limited to, oil, petroleum, gasoline,
kerosene, fuel oil, oil sludge, oil refuse, oil mixed with other wastes,
crude oils, and substances or additives to be utilized in the refining or
blending of crude petroleum or petroleum stock in this State. However,
any compound designated by specific chemical name on the list of
hazardous substances adopted by the department pursuant to this
section shall not be considered petroleum or a petroleum product for
the purposes of P.L.1976, c.141, unless such compound is to be
utilized in the refining or blending of crude petroleum or petroleum
stock in this State.
Precision
Precision is the consistency of measurement values quantified by
measures of dispersion such as the sample standard deviation.
Precision must be defined in context, e.g., for a certain analyte,
matrix, method, perhaps concentration, lab or group of labs. Precision
for laboratory and field measurements can be expressed as the relative
percent difference (RPD) between two duplicate
determinations or
percent relative standard deviation (%RSD) between multiple
determinations.
Proficiency Test
Sample
Proficiency test sample is a sample provided to a laboratory for the
purpose of demonstrating that the laboratory and the individual analyst
performing the test
can successfully analyze the sample within
acceptable limits. The true value of the sample is unknown by the
analyst.
Quality Assurance
Project Plan (QAPP)
A QAPP is a document that describes an orderly assemblage of
detailed procedures designed to produce data of sufficient quantity and
quality to meet the data quality objectives for a specific data collection
activity.
Quality
Assurance/Quality
Control (QA/QC)
QA is an integrated system of management activities involving
planning, implementation, assessment, reporting, and quality
improvement to establish the reliability of laboratory data to ensure that
a process, item, or service is of the type and quality needed and
expected by the client. QC procedures are the specific tools that are
used to achieve this reliability. QC is the overall system of technical
activities whose purpose is to measure and control the quality of a
product or service so that it meets the needs of users. QC procedures
measure the performance of an analytical method in relation to the QC
criteria specified in the analytical method. QC information documents
the quality of the analytical data.
Qualified Data
Qualified data are analytical results that have an affixed code placed
there by laboratories, and/or individuals conducting independent data
review, to denote that quality control requirements or other evaluation
criteria are not met. Data reviewers assess these and other criteria to
determine the usability of data.
Reagent water
Reagent water is water, generated by any purification method, that has
been demonstrated to be free from the analytes of interest and
potentially interfering substances at the method detection limit for the
analyte.
Reasonable
Confidence
When “Reasonable Confidence” is achieved for a particular data set,
the investigator will have “Reasonable Confidence” that the laboratory
has followed the Reasonable Confidence Protocols, has described
nonconformances, if any, and has adequate information to make
judgments regarding data quality.
Rejected Data
Rejected data are data that have failed to meet QC requirements
and/or method specific/contractual requirements to such an extent that
the data are determined to be unusable.
Relative Percent
Difference (RPD)
The RPD is defined by the following equation:
RPD = |A - B| / ((A + B) / 2) x 100
where A = the analytical result from the first measurement and
B = the analytical result from the second measurement.
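A minimal worked illustration of the RPD equation above follows; the function name and the duplicate results are hypothetical examples, not values from this guidance.

```python
def relative_percent_difference(a, b):
    """RPD as defined above: |A - B| / ((A + B) / 2) x 100."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

# Hypothetical example: duplicate determinations of 90 and 110 (same units).
print(round(relative_percent_difference(90.0, 110.0), 1))   # 20.0 percent
```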
Reporting Limit
As per N.J.A.C. 7:26E-1.8, Reporting limit" means, for a compound
analyzed by a particular method, the sample equivalent concentration
(i.e., based on sample specific preparation and analysis factors), for
organics, associated with the lowest concentration standard used in the
calibration of the method and for inorganics, derived from the
concentration of that analyte in the lowest level check standard (which
could be the lowest calibration standard in a multi-point calibration
curve).
Representativeness
Representativeness is a qualitative measurement that describes how
well the analytical data characterizes a discharge or area of concern
under investigation as part of an environmental site assessment. Many
factors can influence how representative the analytical results are for a
discharge. These factors include the selection of appropriate
analytical procedures, the sampling plan, and the procedures and
protocols used to collect, preserve, and transport samples.
Sensitivity
Sensitivity refers to the ability of an analytical procedure to detect and
quantify an analyte at a given concentration.
Spike
A known quantity of an analyte added to a sample for the purpose of
determining recovery or efficiency (analyst spikes), or for quality control
(blind spikes).
Split Sample
A split sample is prepared when aliquots of a sample are taken from the
same container and then analyzed independently. Split samples are
usually taken after mixing or compositing and are used to document
intra- or inter-laboratory precision.
Standards
Standards are solutions that contain known concentrations of target
analytes. Examples include stock standards and calibration standards.
Surrogate
A surrogate is an organic non-target analyte that has similar chemical
properties to the analyte of interest. The surrogate standard is added to
the sample in a known amount and used to evaluate the response of
the analyte to preparation and analysis procedures. The surrogate
concentration is measured using the same procedures used to
measure other analytes in the sample. Surrogate recoveries are used
to evaluate the performance of the analysis.
Target Analytes
Target analytes are the compounds included on the list of analytes for
an analytical method. Site-specific target analytes are defined in the
QAPP.
Tentatively Identified
Compound (TIC)
As per N.J.A.C. 7:26E-1.8, TIC means a non-targeted compound
detected in a sample using a GC/MS analytical method which has been
tentatively identified using a mass spectral library search. An estimated
concentration of the TIC is also determined.
Trip Blank
Trip blanks originate within the laboratory. Trip blanks are sample
containers that have been filled with analyte-free reagent water and carried
with other sample containers out to the field and back to the lab without
being exposed to sampling procedures. Trip blanks are used to
ascertain if sample containers may have been contaminated during
transportation and storage.
Turn-Around Time
The turn-around time is the amount of time it takes for the laboratory to
report the analytical results to the customer following the submittal of
the samples to the laboratory.
Uncertainty
A measure of the total variability associated with sampling and
measuring that includes the two major error components: systematic
error (bias) and random error.
Appendix L:
List of Acronyms
List of Acronyms
% R Percent Recovery
BEHP bis(2-ethylhexyl) phthalate
ºC Degrees Celsius
CCAL Continuing Calibration
CFR Code of Federal Regulations
Cr Chromium
CSM Conceptual Site Model
DDT Dichloro-diphenyl-trichloroethane
DKQ Data of Known Quality
DQA Data Quality Assessment
DQO Data Quality Objective
DUE Data Usability Evaluation
EPA United States Environmental Protection Agency
EPH Extractable Petroleum Hydrocarbons
Hg Mercury
ICAL Initial Calibration
LSRP Licensed Site Remediation Professional
LCL Lower Control Limit
LCS Laboratory Control Sample
LFB Laboratory Fortified Blank
MEK Methyl Ethyl Ketone
MIBK 4-Methyl-2-pentanone
μg/kg Micrograms per Kilogram
μg/l Micrograms per Liter
mg/kg Milligrams per Kilogram
MS/MSD Matrix Spike/Matrix Spike Duplicate
ND Not Detected (i.e., below the Reporting Limit)
PAHs Polycyclic Aromatic Hydrocarbons, also known as Polynuclear Aromatic Hydrocarbons
PARCCS Precision, accuracy, representativeness, comparability, completeness, and
sensitivity
PCBs Polychlorinated Biphenyls
PCE Tetrachloroethene, also known as Tetrachloroethylene or Perchloroethylene
Pest Pesticides
QA/QC Quality Assurance/Quality Control
QAPP Quality Assurance Project Plan
RL Reporting Limit
RPD Relative Percent Difference
RRF Relative Response Factor
SOP Standard Operating Procedure
SPLP Synthetic Precipitation Leaching Procedure
SVOCs Semi-volatile Organic Compounds
TCLP Toxicity Characteristic Leaching Procedure
TICs Tentatively Identified Compounds
TCE Trichloroethene
UCL Upper Control Limit
VOCs Volatile Organic Compounds
Work Group NJDEP Analytical Methods Technical Guidance Work Group