The purpose of this section is to outline the procedures for calibrating artifacts and instruments while assuring the "goodness" of the calibration results.

Calibration is a measurement process that assigns values to the property of an artifact, or to the response of an instrument, relative to reference standards or to a designated measurement process. Its purpose is to eliminate or reduce bias in the user's measurement system relative to the reference base. The calibration procedure compares an "unknown" (a test item or instrument) with reference standards according to a specific algorithm.
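The comparison described above can be sketched in a few lines of code. This is an illustrative example only, not a procedure from this handbook: the function name and all readings are hypothetical, and a real calibration would follow one of the designs cataloged below and carry an uncertainty statement.

```python
# Sketch of a comparison calibration: a test item is measured against a
# reference standard of known value, and the observed instrument bias is
# removed from the value assigned to the test item. All data are hypothetical.

def calibrate_single_point(ref_value, ref_readings, test_readings):
    """Assign a value to the test item from readings on the reference
    standard and the test item (single-point artifact calibration)."""
    # Instrument bias estimated from repeated readings on the reference standard
    ref_mean = sum(ref_readings) / len(ref_readings)
    bias = ref_mean - ref_value
    # Value assigned to the test item, corrected for the estimated bias
    test_mean = sum(test_readings) / len(test_readings)
    return test_mean - bias

# Hypothetical readings, in the same units as the reference value
assigned = calibrate_single_point(
    ref_value=10.000,
    ref_readings=[10.012, 10.008, 10.010],   # mean 10.010 -> bias 0.010
    test_readings=[10.105, 10.101, 10.103],  # mean 10.103
)
print(round(assigned, 3))  # prints 10.093
```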
What are the issues for calibration?
- Artifact or instrument calibration
- Reference base
- Reference standard(s)
What is artifact (single-point) calibration?
- Calibration model
What are calibration designs?
- Properties of designs
- Check standard in a design
- Special types of bias (left-right effect & linear drift)
- Solutions to calibration designs
- Uncertainty of calibrated values
Catalog of calibration designs
- Mass weights
- Gage blocks
- Electrical standards - saturated standard cells, zeners, resistors
- Roundness standards
- Angle blocks
- Indexing tables
- Humidity cylinders
Control of artifact calibration
- Control of the precision of the calibrating instrument
- Control of bias and long-term variability
What is instrument calibration over a regime?
- Models for instrument calibration
- Data collection
- What can go wrong with the calibration procedure?
- Data analysis and model validation
- Calibration of future measurements
- Uncertainties of calibrated values
  - From propagation of error for a quadratic calibration
  - From check standard measurements for a linear calibration
  - Comparison of check standard technique and propagation of error
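The workflow outlined above (fit a calibration model to readings on reference standards, then use the fitted model to calibrate future measurements) can be sketched for the linear case. This is a minimal illustration with hypothetical data, not the handbook's procedure; it omits the model validation and uncertainty steps listed above.

```python
# Sketch of linear instrument calibration over a regime: fit instrument
# responses against reference-standard values, then invert the fitted line
# to assign values to future readings. All data are hypothetical.

def fit_line(x, y):
    """Ordinary least-squares fit of the calibration line y = a + b*x."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx          # slope
    a = ybar - b * xbar    # intercept
    return a, b

def calibrate(reading, a, b):
    """Convert a future instrument reading into a value on the
    reference scale by inverting the calibration line."""
    return (reading - a) / b

# Reference-standard values and corresponding instrument responses (hypothetical)
ref = [0.0, 1.0, 2.0, 3.0, 4.0]
resp = [0.10, 1.08, 2.12, 3.09, 4.11]

a, b = fit_line(ref, resp)
value = calibrate(2.60, a, b)  # calibrated value for a future reading of 2.60
```

In practice the fit would be checked against the data (residual analysis, model validation) before any future readings are calibrated, as the outline above indicates.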
Control of instrument calibration
- Control chart for linear calibration
  - Critical values of t* statistic