090227-1656 EST
zog:
Why is it shocking? What is the meaning of calibration?
From dictionary.com:
http://dictionary.reference.com/browse/calibration
1. to determine, check, or rectify the graduation of (any instrument giving quantitative measurements).
I believe Chris did his own calibration check when he found that his two meters were in substantial disagreement.
For my needs and general use of different meters, my calibration check is their relationship to each other and a confidence level associated with each. Some instruments and references that I have had for a long time are quite stable relative to my needs. Some are not. One that has not maintained accurate calibration is a Simpson 880 true RMS meter. This uses no ferromagnetic material and no permanent magnets, so it should be very stable. Yet it reads 134 V with 121 V applied, roughly an 11% error. By comparison, a 1974 Simpson 270 reads 9.90 V DC with 9.92 V applied, about 0.2% low.
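Purely as an illustration of that cross-check arithmetic, here is a minimal sketch in Python; the percent_error helper and the printout are mine, not anything from either meter's documentation:

```python
def percent_error(reading, reference):
    """Percent error of a meter reading against a trusted reference value."""
    return 100.0 * (reading - reference) / reference

# Simpson 880 true RMS meter reading 134 V on a 121 V input: far out of calibration.
print(f"Simpson 880: {percent_error(134.0, 121.0):+.1f}%")  # about +10.7%

# 1974 Simpson 270 reading 9.90 V DC with 9.92 V applied: well within expectations.
print(f"Simpson 270: {percent_error(9.90, 9.92):+.2f}%")    # about -0.20%
```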
Trying to find the original accuracy spec on the Simpson 880, I found the following site, but no information on an 880 voltmeter:
http://www.slack.com/temodel.html
Very slow to load.
I have a Ballantine 420 calibrator, circa 1962. It is labeled as having an accuracy of 0.5%. Maximum DC output is 10 V, and its reference is an OA2, a gas discharge voltage regulator tube. A number of other vacuum tubes are used to generate the 1 kHz RMS signal. I do not know whether the AC signal was referenced back to the OA2 or used something else. In the life of this instrument it has had negligible use; really, I have had little need for it. In 1964 I built a small standard DC reference based on a temperature compensated Zener. This was calibrated to the Ballantine. Today, 45 years later, my homemade reference is 6 mV higher than the Ballantine at 10 V. That is a 0.06% difference. My Fluke 27 reads 9.99 V on the homemade reference's 10 V output, a 0.1% difference.
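Those two percentages follow directly from the readings; a quick sketch of the arithmetic, using only the values quoted above:

```python
# Homemade Zener reference vs. Ballantine 420 at the 10 V point:
# a 6 mV (0.006 V) offset after 45 years.
print(f"Homemade vs. Ballantine: {100.0 * 0.006 / 10.0:.2f}%")         # 0.06%

# Fluke 27 reading 9.99 V on the homemade reference's nominal 10 V output.
print(f"Fluke 27 vs. homemade:   {100.0 * (10.0 - 9.99) / 10.0:.2f}%")  # 0.10%
```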
That an OA2 reference has changed this little over so many years is quite unexpected.
Do either of these two references get used? No. They are more of a curiosity now.
How much absolute accuracy does one need? Obviously it is a function of needs. In many cases resolution and noise level are more important than absolute accuracy. Being monotonic and having good linearity also may be more important than absolute accuracy.
Most electricians do not normally need 0.1% accuracy. I suspect that most electricians have more than one meter and a correlation between their meters gives them a reasonable estimate of meter condition. The good Fluke meters generally have a stated DC accuracy of 0.1%. This is likely to be quite stable over time.
I think there are more unknowns where the user of a meter does not know its limitations. For example, suppose a so-called true RMS meter with an input coupling capacitor is used to measure a chopped DC signal, and the user expects to get the RMS value of that signal. For a 50% duty cycle square wave the true RMS happens to be 0.707 of the peak-to-peak amplitude, whereas the RMS of the same waveform with the DC component removed is 0.5 of the peak-to-peak amplitude.
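A minimal sketch of that difference, assuming a 50% duty cycle chopped DC waveform swinging between 0 V and a 1 V peak (so peak-to-peak is 1 V); the sample construction and rms helper are my own illustration:

```python
import math

# 50% duty cycle chopped DC: half the samples at the peak, half at zero.
vpp = 1.0
samples = [vpp] * 500 + [0.0] * 500

def rms(xs):
    """Root mean square of a list of samples."""
    return math.sqrt(sum(x * x for x in xs) / len(xs))

dc = sum(samples) / len(samples)     # DC component: 0.5 V here
ac_only = [x - dc for x in samples]  # what an AC-coupled meter actually sees

print(f"True RMS (DC included): {rms(samples) / vpp:.3f} of Vpp")  # ~0.707
print(f"RMS after coupling cap: {rms(ac_only) / vpp:.3f} of Vpp")  # 0.500
```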
For many applications I suspect that a formal calibration check by a certified lab is probably unnecessary when compared with checking against other meters of known quality and condition. I am not surprised that most meters are not sent out for formal calibration checks. Much of the time it is probably an unnecessary cost.
A new meter from a quality source is probably accurate to its advertised specification and may be a more cost-effective solution than having a meter repaired.
One type of instrument that needs constant calibration checks is a 50 or 100 lb-in mechanical dial indicator type of hand torque wrench. These are delicate, get abused on a production line, and often have large errors. Beyond this, they are improperly applied much of the time when measuring pinion preload drag torque.