Defining Our Terms
As defined in the ANSI/ISA–51.1–1979 (R1993) standard, Process Instrumentation Terminology:
“Accuracy” is the degree of conformity of an indicated value to a recognized accepted standard value, or ideal value.
“Accuracy rating” is a number or quantity that defines a limit that errors will not exceed when a device is used under specified operating conditions. Accuracy rating includes the combined effects of conformity, hysteresis, dead band, and repeatability errors.
Expressing Accuracy Rating
Two common methods of expressing accuracy rating are:
- Percent of scale length (percent of full scale)
- Percent of actual output reading
Traditionally, percent of full scale has been used to express the accuracy rating of analog instruments. On legacy Interscan Vikane® monitors, the accuracy rating is ±2% of full scale. Since the full-scale range is 0-50 ppm, any reading is accurate to within ±1 ppm.
With digital instruments, it is more common to express accuracy rating as a percent of reading, often adding the inherent error of the least significant digit. Thus, one might encounter a specification such as ±2% of reading ±1 least significant digit.
In such a case, if the digital range were 0-50 ppm (1 ppm resolution), then at a reading of 1 ppm the accuracy would be 1 ppm ± 0.02 ppm ± 1 ppm, meaning that the true value could lie anywhere between 0 and 2.02 ppm (the nominal lower bound of −0.02 ppm is physically meaningless, so it is taken as 0).
If the digital range were 0-50.0 ppm (0.1 ppm resolution), then at a reading of 1.0 ppm the accuracy would be 1 ppm ± 0.02 ppm ± 0.1 ppm, meaning that the true value could lie between 0.88 and 1.12 ppm.
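The two conventions can be compared side by side in a short sketch (the function names are mine; the ranges, display resolutions, and ±2% figures are taken from the examples above):

```python
def full_scale_band(reading, full_scale, pct=0.02):
    """Accuracy band for a percent-of-full-scale spec: the error is a
    fixed pct * full_scale, regardless of the reading."""
    err = pct * full_scale
    return max(0.0, reading - err), reading + err

def reading_band(reading, lsd, pct=0.02):
    """Accuracy band for a percent-of-reading spec plus one least
    significant digit (lsd) of display resolution."""
    err = pct * reading + lsd
    return max(0.0, reading - err), reading + err

# Analog spec: +/-2% of a 0-50 ppm scale is +/-1 ppm at any reading.
print(full_scale_band(1.0, 50.0))                          # (0.0, 2.0)

# Digital 0-50 ppm display (1 ppm resolution), reading of 1 ppm.
print(tuple(round(x, 2) for x in reading_band(1.0, 1.0)))  # (0.0, 2.02)

# Digital 0-50.0 ppm display (0.1 ppm resolution), reading of 1.0 ppm.
print(tuple(round(x, 2) for x in reading_band(1.0, 0.1)))  # (0.88, 1.12)
```

Note that the percent-of-reading band tightens dramatically when the display gains a decimal place, which is exactly the difference between the two digital examples above.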
The Influence of the Calibration Standard
In gas detection, most instruments, including all instruments used to detect Vikane®, must be calibrated against a known standard. Thus, these instruments are reference methods, rather than absolute methods.
By all rights, the accuracy of the calibration standard should be taken into account when discussing the accuracy of a gas detection instrument, but in practice, this is done more by implication than directly. An instrument manufacturer may reveal in some footnote that measurement accuracy is limited to the accuracy of the calibration standard, but then proceed as if this does not really matter.
In other words, although a disclaimer may be presented, all accuracy specs will deal with inherent matters of the instrument only. In fact, the error in calibration standard accuracy would be additive, and it is likely that this would add another ±2 percent to the mix. Fortunately, with the mandated 5 ppm calibration standard, this yields an additional error of only 0.1 ppm.
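The additive contribution of the standard's own tolerance is easy to verify (a sketch; the ±2% tolerance on the standard is the assumption discussed above):

```python
def calibration_standard_error(standard_ppm, tolerance=0.02):
    """Absolute error (in ppm) contributed by the calibration standard
    itself; this adds on top of the instrument's own accuracy spec."""
    return standard_ppm * tolerance

print(calibration_standard_error(5.0))   # 0.1 ppm with the 5 ppm standard
print(calibration_standard_error(40.0))  # 0.8 ppm with a 40 ppm standard
```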
But, the 5 ppm calibration standard improves accuracy in one other way:
All other things being equal (but we will find that they are not), it is considered best practice to calibrate a gas analyzer at somewhere between 50-85% of the full-scale value. That is why a 40 ppm standard was long used for the Vikane® monitor.
However, it is also considered best practice to calibrate at a value reasonably close to the levels at which you will be measuring. Hence, the introduction of the 5 ppm calibration standard. Note that even if a 1 ppm standard were available, it would be unwise to calibrate an instrument so close to the bottom of its range.
Consider that the ±1 ppm accuracy spec was based on a full scale range of 0-50 ppm. Arguably, since the majority of clearance measurements will be made in the range of 0-5 ppm, and the unit will be calibrated with a 5 ppm standard, some allowance should be made for this compression of scale.
By conventional reasoning, a true 0-5 ppm range instrument would have an accuracy of ±0.1 ppm. And while one cannot hold that our 0-50 ppm unit, pressed into service as a quasi 0-5 ppm unit—by virtue of the new calibration standard—is a true 0-5 ppm instrument, some accuracy benefit should still ensue in this very special case.
It All Comes Down To This
A very conservative approach would be to average the two full-scale values (50 ppm and 5 ppm), giving a “virtual” measuring range of 0-27.5 ppm. At ±2% of this virtual full scale, the accuracy would be ±0.55 ppm—a significant improvement.
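The arithmetic of that conservative approach can be sketched as follows (the averaging scheme is the article's own argument; only the 50 ppm and 5 ppm figures are inputs):

```python
def virtual_range_accuracy(full_scale, calibration_scale, pct=0.02):
    """Average the instrument's full-scale range with the effective range
    implied by the calibration standard, then apply the percent-of-full-scale
    spec to the averaged ("virtual") range."""
    virtual_fs = (full_scale + calibration_scale) / 2
    return virtual_fs, pct * virtual_fs

vfs, acc = virtual_range_accuracy(50.0, 5.0)
print(vfs, acc)  # 27.5 0.55
```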
It is stipulated that some analytical purists may take issue with our “virtual” measuring range argument, but then analytical purists—happily ensconced in their laboratories—do not have to clear structures under extremely demanding environmental regulations.