chromatograms, absorbance, matrix spikes, and matrix spike duplicates should be included;
• Level V — Method blank data and the control chart from the blank/spike;
• Date the analysis was performed;
• Signature of the laboratory manager;
• Method precision, accuracy, and completeness attainable;
• QA checks to be run as part of the method; and
• Limitations of the method.

4.0 Data Validation and Usability

4.1 Data Review, Validation, and Verification

Data reduction refers to computations and calculations performed on data. This includes, but is not limited to, summary statistics, standard errors, confidence limits, tests of hypotheses relative to the parameters, and model validation.

Paper and electronic transfer of data will be conducted with multiple levels of redundancy. Data will be verified manually, and at times electronically, at each stage of transfer to ensure accuracy. The flow of data from collection through storage is described in the CGRS data management system.

Data validation is the process by which a sample measurement, method, or piece of data is deemed useful for a specified purpose. Data validation methods will depend on the type of study that generated the data, the type of sampling, the test method, and the end use (DQOs) of the data.

The project manager is responsible for specifying the systematic process to be used to review the body of data against a set of criteria to ensure that the data are adequate for their intended use. The process shall consist of data editing, screening, checking, auditing, verification, certification, and review.

4.2 Validation and Verification Methods

The project manager will perform the final review and approval of the data prior to its being entered into the final system as valid. The project manager will review field duplicates, matrix spike/matrix spike duplicates, laboratory blanks, and laboratory duplicates to ensure they are acceptable. The project manager will also compare the sample descriptions to the field sheets for consistency and will ensure that any anomalies in the data are appropriately documented.

4.3 Reconciliation with DQOs

Once the data results are compiled, the project manager will review the field duplicates to determine if they fall within the acceptance limits defined in the QAPP. Completeness will also be evaluated to determine if the completeness goal for this project has been met. If the data do not meet the project's requirements as outlined in the QAPP (including the accuracy criteria for lab spikes), the data may be discarded and re-sampling may occur. The project manager will be responsible for determining the cause of the failure and making the decision to discard the data and re-sample. If the failure is tied to the analysis, calibration and maintenance techniques will be reassessed as identified by the appropriate laboratory personnel. If the failure is associated with sample collection and re-sampling is needed, the samplers will be retrained.
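For illustration only, the sketch below shows how a data reviewer might compute the two quantitative checks described above: the relative percent difference (RPD) for a field duplicate pair and the project completeness percentage. The acceptance limit, completeness goal, and sample values shown are placeholders assumed for the example; they are not values taken from this QAPP.

```python
# Illustrative sketch: the RPD limit, completeness goal, and results below are
# placeholder values, not criteria or data taken from this QAPP.

def relative_percent_difference(sample: float, duplicate: float) -> float:
    """RPD between a primary sample result and its field duplicate."""
    mean = (sample + duplicate) / 2.0
    if mean == 0:
        return 0.0
    return abs(sample - duplicate) / mean * 100.0

def completeness(valid_results: int, planned_results: int) -> float:
    """Percent of planned results that were judged valid and usable."""
    return 100.0 * valid_results / planned_results

RPD_LIMIT = 30.0          # placeholder acceptance limit (%)
COMPLETENESS_GOAL = 90.0  # placeholder completeness goal (%)

rpd = relative_percent_difference(sample=12.4, duplicate=11.1)      # hypothetical results
pct_complete = completeness(valid_results=18, planned_results=20)   # hypothetical counts

print(f"Field duplicate RPD: {rpd:.1f}% (limit {RPD_LIMIT}%): "
      f"{'acceptable' if rpd <= RPD_LIMIT else 'flag for review'}")
print(f"Completeness: {pct_complete:.1f}% (goal {COMPLETENESS_GOAL}%): "
      f"{'goal met' if pct_complete >= COMPLETENESS_GOAL else 'goal not met'}")
```

In practice, the actual acceptance limits and completeness goal would be taken from the QAPP tables referenced above, and any result failing either check would be documented and routed to the project manager for the discard/re-sample decision described in Section 4.3.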