2. Measurement Process Characterization
2.4. Gauge R & R studies
2.4.5. Analysis of bias


Definition: Drift is defined by the VIM as a slow change in the response of a gauge.
Instruments used as comparators for calibration: Short-term drift can be a problem for comparator measurements. The cause is frequently heat build-up in the instrument during the time of measurement. It would be difficult, and probably unproductive, to try to pinpoint the extent of such drift with a gauge study. The simplest solution is to use drift-free designs for collecting calibration data. These designs mitigate the effect of linear drift on the results.

Long-term drift should not be a problem for comparator measurements because such drift would be constant during a calibration design and would cancel in the difference measurements.
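The cancellation of linear drift in difference measurements can be illustrated with a short sketch. The values and drift rate below are hypothetical, and the A-B-B-A reading sequence is one common drift-free comparator design: because the two readings of each artifact are symmetric about the midpoint of the sequence, a linear drift term contributes equally to both sides of the difference and drops out.

```python
# ABBA comparator sequence: linear drift cancels in the difference.
def abba_difference(a1, b1, b2, a2):
    """Drift-free estimate of (A - B) from an A-B-B-A reading sequence."""
    return (a1 - b1 - b2 + a2) / 2.0

# Hypothetical true values and a linear drift of d per reading interval
A, B, d = 10.00, 10.30, 0.01
r = [A + 0 * d, B + 1 * d, B + 2 * d, A + 3 * d]  # readings in A-B-B-A order

naive = r[0] - r[1]            # single difference: biased by one drift step
est = abba_difference(*r)      # symmetric difference: drift cancels exactly
print(naive, est)
```

Here the naive single difference is off by the drift accumulated between the two readings, while the symmetric estimate recovers the true difference A - B exactly, whatever the (linear) drift rate.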

Instruments corrected by linear calibration: For instruments whose readings are corrected by a linear calibration line, drift can be detected using a control chart technique and measurements on three or more check standards.
Drift in direct reading instruments and uncertainty analysis: For other instruments, measurements can be made on a daily basis on two or more check standards over a preset time period, say, one month. These measurements are plotted on a time scale to determine the extent and nature of any drift. Drift rarely continues unabated at the same rate and in the same direction for a long time period.

Thus, the goal of such an experiment is to document the maximum change that is likely to occur during a set time period and to plan instrument adjustments accordingly. A further consequence of the findings is that uncorrected drift is treated as a Type A component in the uncertainty analysis.
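One simple way to summarize such a plot numerically is to fit a least-squares line to the daily readings and report the change it implies over the measurement period. The sketch below assumes equally spaced daily readings on a single check standard; the data are hypothetical, and a straight-line fit is only a rough summary since, as noted above, drift rarely stays linear for long.

```python
# Least-squares trend of daily check-standard readings (pure Python).
def linear_fit(readings):
    """Fit y = a + b*t (t = 0, 1, 2, ...); return (intercept a, slope b per day)."""
    n = len(readings)
    t = list(range(n))
    tbar = sum(t) / n
    ybar = sum(readings) / n
    b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, readings)) / \
        sum((ti - tbar) ** 2 for ti in t)
    return ybar - b * tbar, b

# Hypothetical month of daily readings with a slight upward drift
readings = [10.000 + 0.0002 * day for day in range(30)]
a, b = linear_fit(readings)
max_change = b * (len(readings) - 1)   # change implied over the whole month
print(b, max_change)
```

The fitted change over the month is the quantity to compare against the instrument's tolerance when planning adjustments; if the drift is left uncorrected, its contribution enters the uncertainty budget as a Type A component evaluated from these repeated measurements.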
