www.webMET.com - your meteorological resource center
8.6.3 Validation Procedures

All necessary supporting material, such as audit reports and any site logs, should be readily available for the level 1 validation. Access to a daily weather archive should be provided for use in relating suspect data to local and regional meteorological conditions. Any problem data, such as data flagged in an audit, should be corrected prior to the level 1 data validation. The validation procedures described below include screening, manual review, and comparison.

Table 8-3
Suggested quality control (QC) codes for meteorological data

0  Valid - Observations that were judged accurate within the performance limits of the instrument.

1  Estimated - Observations that required additional processing because the original values were suspect, invalid, or missing. Estimated data may be computed from patterns or trends in the data (e.g., via interpolation), or they may be based on the meteorological judgment of the reviewer.

2  Calibration applied - Observations that were corrected using a known, measured quantity (e.g., instrument offsets measured during audits).

3  Unassigned - Reserved for future use.

4  Unassigned - Reserved for future use.

5  Unassigned - Reserved for future use.

6  Failed automatic QC check - Observations flagged with this QC code did not pass screening criteria set in automatic QC software.

7  Suspect - Observations that, in the judgment of the reviewer, were in error because their values violated reasonable physical criteria or did not exhibit reasonable consistency, but for which a specific cause of the problem was not identified (e.g., excessive wind shear in an adiabatic boundary layer). Additional review using other, independent data sets (Level 2 validation) should be performed to determine the final validity of suspect observations.

8  Invalid - Observations that were judged inaccurate or in error, where the cause of the inaccuracy or error was known (e.g., wind contaminated by ground clutter or a temperature lapse rate that exceeded the autoconvective lapse rate). Besides the QC flag signifying invalid data, the data values themselves should be assigned invalid indicators.

9  Missing - Observations were not collected.
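In automated processing, QC codes such as those in Table 8-3 are typically stored alongside each observation. A minimal sketch in Python (the enum and field names here are illustrative, not from the guidance):

```python
from enum import IntEnum

class QCCode(IntEnum):
    """QC codes per Table 8-3 (codes 3-5 are reserved for future use)."""
    VALID = 0
    ESTIMATED = 1
    CALIBRATION_APPLIED = 2
    FAILED_AUTO_QC = 6
    SUSPECT = 7
    INVALID = 8
    MISSING = 9

# An observation pairs a measured value with its QC code; per the
# guidance, invalid data should also carry an invalid indicator in
# the value field itself.
obs = {"wind_speed": 3.2, "qc": QCCode.VALID}
print(QCCode(7).name)  # SUSPECT
```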

Data Screening

Screening procedures generally include comparisons of measured values to upper and lower limits; these may be physical limits, such as an instrument threshold, or may be established based on experience or historical data. Other types of procedures employed in screening include assessments based on the rate of change of a variable (in which data that change too rapidly, or not at all, are flagged as suspect) and assessments based on known physical principles relating two or more variables (e.g., the dew point should never exceed the dry-bulb temperature).
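The three kinds of screening check described above (range limits, rate-of-change, and cross-variable consistency) can be expressed as simple predicates. A sketch, with illustrative thresholds (the function names are not from the guidance):

```python
def range_check(value, lower, upper):
    """Range check: flag values outside physical or historical limits."""
    return lower <= value <= upper

def rate_check(values, max_step, min_span=0.0):
    """Rate-of-change check: fail series that jump too fast between
    consecutive hours, or that do not vary at all (a flatlined sensor)."""
    steps = [abs(b - a) for a, b in zip(values, values[1:])]
    too_fast = any(s > max_step for s in steps)
    flatlined = (max(values) - min(values)) <= min_span
    return not (too_fast or flatlined)

def dewpoint_check(dewpoint, dry_bulb):
    """Cross-variable check: dew point should never exceed dry-bulb."""
    return dewpoint <= dry_bulb

print(range_check(3.2, 0.0, 25.0))  # True
print(dewpoint_check(12.0, 10.0))   # False: dew point above dry-bulb
```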

Screening may be regarded as an iterative process in which range checks and other screening criteria are revised as necessary based on experience. For example, an initial QA pass of a data set using default criteria may flag values that, upon further investigation, are determined to be valid for the particular site. In such cases, one or more follow-up QA passes using revised criteria may be necessary to clearly segregate valid and invalid data. Suggested screening criteria are listed in Table 8-4. Data that fail the screening test should be flagged for further investigation.

Manual Review

The manual review should result in a decision to accept or reject data flagged by the screening process. In addition, manual review may help to identify outliers that were missed by screening. This review should be performed by someone with the necessary training in meteorological monitoring.

In the typical manual review, data should be scanned to determine if the reported values are reasonable and in the proper format. Periods of missing data should be noted and investigated. Data should also be evaluated for temporal consistency. This is particularly useful for identifying outliers in hourly data. Outliers should be reviewed with reference to local meteorological conditions. Data are considered to be at Level 1 validation following the manual review and can be used for modeling and analysis.
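The temporal-consistency review of hourly data can be approximated by flagging hours that change sharply from the hour before. The 5 C step for temperature comes from Table 8-4; the function itself is an illustrative sketch:

```python
def flag_temporal_outliers(hourly, max_step=5.0):
    """Return indices of hours whose change from the previous hour
    exceeds max_step (e.g., 5 C for temperature per Table 8-4).
    Both the jump up and the jump back down are flagged for review."""
    return [i for i in range(1, len(hourly))
            if abs(hourly[i] - hourly[i - 1]) > max_step]

temps = [14.2, 14.8, 15.1, 22.9, 15.3]  # hour 3 jumps by roughly 8 C
print(flag_temporal_outliers(temps))    # [3, 4]
```

Flagged hours would then be reviewed manually against local meteorological conditions, as described above, rather than rejected automatically.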

Comparison Program

After the data have passed through the screening program, they should be evaluated in a comparison program. Randomly selected values should be manually compared with other available, reliable data (such as data obtained from the nearest National Weather Service observing station). At least one hour out of every 10 days should be randomly selected. To account for hour-to-hour variability and the spatial displacement of the NWS station, a block of several hours may be more desirable. All data selected should be checked against corresponding measurements at the nearby station(s). In addition, monthly average values should be compared with climatological normals, as determined by the National Weather Service from records over a 30-year period. If discrepancies are found which cannot be explained by the geographic difference in the measurement locations or by regional climatic variations, the data should be flagged as questionable.
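The sampling step above (one randomly chosen block of hours in each 10-day window) might be sketched as follows; the function name and defaults are hypothetical:

```python
import random

def sample_comparison_hours(n_days, block_hours=3, days_per_sample=10, seed=0):
    """Pick one random block of hours in each 10-day window.
    Returns (day, start_hour) pairs; a block of several hours helps
    absorb hour-to-hour variability and the spatial displacement of
    the comparison (e.g., NWS) station."""
    rng = random.Random(seed)  # fixed seed for a reproducible audit trail
    picks = []
    for window_start in range(0, n_days, days_per_sample):
        day = rng.randrange(window_start,
                            min(window_start + days_per_sample, n_days))
        hour = rng.randrange(0, 24 - block_hours + 1)
        picks.append((day, hour))
    return picks

print(sample_comparison_hours(30))  # three (day, start_hour) picks for a 30-day month
```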

Table 8-4
Suggested Data Screening Criteria

Variable: screening criteria (flag data if the value...)

Wind Speed
- is less than zero or greater than 25 m/s
- does not vary by more than 0.1 m/s for 3 consecutive hours
- does not vary by more than 0.5 m/s for 12 consecutive hours

Wind Direction
- is less than zero or greater than 360 degrees
- does not vary by more than 1 degree for more than 3 consecutive hours
- does not vary by more than 10 degrees for 18 consecutive hours

Temperature
- is greater than the local record high
- is less than the local record low
  (the above limits could be applied on a monthly basis)
- is greater than a 5 C change from the previous hour
- does not vary by more than 0.5 C for 12 consecutive hours

Temperature Difference
- is greater than 0.1 C/m during the daytime
- is less than -0.1 C/m during the nighttime
- is greater than 5.0 C or less than -3.0 C

Dew Point Temperature
- is greater than the ambient temperature for the given time period
- is greater than a 5 C change from the previous hour
- does not vary by more than 0.5 C for 12 consecutive hours
- equals the ambient temperature for 12 consecutive hours

Precipitation
- is greater than 25 mm in one hour
- is greater than 100 mm in 24 hours
- is less than 50 mm in three months
  (the above values can be adjusted based on local climate)

Pressure
- is greater than 1060 mb (sea level)
- is less than 940 mb (sea level)
  (the above values should be adjusted for elevations other than sea level)
- changes by more than 6 mb in three hours

Radiation
- is greater than zero at night
- is greater than the maximum possible for the date and latitude
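A few of the Table 8-4 criteria can be expressed directly as checks over an hourly series. This partial sketch implements the wind-speed range check and the 3-hour persistence check (it is not a complete screening program):

```python
def screen_wind_speed(hourly):
    """Flag hour indices per two Table 8-4 wind-speed criteria:
    out of range, or varying by no more than 0.1 m/s over 3 hours."""
    flags = set()
    for i, v in enumerate(hourly):
        if v < 0 or v > 25:            # range check, m/s
            flags.add(i)
    for i in range(len(hourly) - 2):   # 3-hour persistence check
        window = hourly[i:i + 3]
        if max(window) - min(window) <= 0.1:
            flags.update(range(i, i + 3))
    return sorted(flags)

# Hours 0-2 barely vary (possible stuck sensor); hour 3 exceeds 25 m/s.
print(screen_wind_speed([3.0, 3.05, 3.02, 26.0]))  # [0, 1, 2, 3]
```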

Further Evaluations

Any data flagged by the screening program or the comparison program should be evaluated by personnel with meteorological expertise. Decisions must be made either to accept the flagged data, or to discard them and substitute back-up or interpolated data, or data from a nearby representative monitoring station (see Section 1). Any changes made to the data during validation should be documented, along with the reasons for the change. If problems in the monitoring system are identified, corrective actions should also be documented. Any edited data should remain flagged so that their reliability can be considered in the interpretation of the results of any modeling analysis that employs the data.



Copyright © 2002 WebMET.com - Disclaimer