Eddy Current
Senior Member
Which is more accurate Digital or Analog Meters?
That depends on what is being measured. I've used high-quality, in-spec DVMs to measure a specifically generated (as an exhibit) AC voltage and seen errors of over 30% relative to the __real RMS value__.
Which is more accurate Digital or Analog Meters?
Most of the time with servicing you don't care if you have 120 volts or if you have 118.34 volts, you are just concerned that it is close to 120.
We did a show-n-tell on how peak, average, and true-RMS meters displayed various signals; this one happened to be a 1% pulse train. AC cal for some meters is done with sine waves. We also used square and triangle waves from function generators. If it had a 30% error, how would it pass calibration?
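For anyone curious where those big errors come from: an average-responding meter rectifies the signal, averages it, then scales by the sine-wave form factor (about 1.111) so that a pure sine reads its true RMS. Feed it anything else and the scaling is wrong. A minimal sketch (my own illustration, not from the thread) comparing a true-RMS computation against a sine-calibrated average-responding reading:

```python
import math

def true_rms(samples):
    """Root-mean-square of a list of samples."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

def avg_responding_reading(samples):
    """What a sine-calibrated average-responding meter would display.

    It averages the rectified signal, then multiplies by the sine form
    factor pi / (2*sqrt(2)) ~= 1.111 so a pure sine wave reads correctly.
    """
    form_factor = math.pi / (2 * math.sqrt(2))
    return form_factor * sum(abs(v) for v in samples) / len(samples)

N = 10000
sine = [math.sin(2 * math.pi * k / N) for k in range(N)]
square = [1.0 if k < N // 2 else -1.0 for k in range(N)]
pulse = [1.0 if k < N // 100 else 0.0 for k in range(N)]  # 1% duty-cycle pulse train

for name, wave in [("sine", sine), ("square", square), ("1% pulse", pulse)]:
    rms = true_rms(wave)
    reading = avg_responding_reading(wave)
    err = 100 * (reading - rms) / rms
    print(f"{name:9s} true RMS {rms:.4f}  avg-responding reads {reading:.4f}  error {err:+.1f}%")
```

On the sine wave the two agree almost exactly; a square wave reads about 11% high, and the 1% pulse train reads drastically low, which is consistent with the large discrepancies described above.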
And you can see trends. As an engineering co-op student in the 60's working with analog radios in our final test, some of the guys thought they wanted one of them newfangled digital meters ... when you are adjusting to a peak or null, analog is (OPINION MODE) much better. It probably took 3 or 4 times as long with a digital meter. Analog meters have one big advantage over digital, and that is response time. You will be able to see rapid fluctuations on an analog meter that a digital can't show.
Plus you have the factor of misreading an analog meter: you have to look straight at it to get the correct value (the term for that is on the tip of my tongue but I can't recall it), so I would have to say digital is more accurate.
Edit: Parallax error is the term I was looking for.
Analog panel meters have an inherent advantage for an operator to quickly know the status. If the digital ammeter display is reading 150 amps, the operator has to read the number and relate that to the motor full load amps of 75A. With a properly ranged analog meter the operator knows the motor is in trouble with a quick glance at the pegged needle. As he is walking up to the panel, he can see the needle pegged long before he can interpret the numbers on the digital display.
Which is more accurate Digital or Analog Meters?
With respect, I disagree. Accuracy is determined by certification.
There are a couple of other points you miss here. Given the same level of accuracy, and certified to be accurate within those limits, it would not matter.
The main difference I see is that analog meters need to be rechecked periodically, whereas digital is digital.
That's why some meters had a mirror strip across the dial. If you were looking straight at the meter, the reflection of the needle would be hidden behind the needle.
Probably faster, but not usually seamlessly. I do not care for digital meters without a bar graph. The bar graph can respond as fast as the analog needle.