Re: Analog Meters
First, a bit of basic theory for any readers who may not be familiar with voltmeter history.
A voltmeter connected to a circuit is in parallel with the load, and acts like an additional load. The more current a voltmeter draws from the circuit under test, the more the measured voltage will "sag" under the loading effect of the meter.
The first high-impedance voltmeter, the VTVM (vacuum-tube voltmeter), was developed for use on electronic circuits. The old basic analog meter movement required up to a milliamp to drive the pointer to full scale. If that movement were used to measure voltage in an electronic circuit whose normal operating current was only 1 mA, it would double the current the source had to deliver, pulling down the source voltage.
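To put rough numbers on that loading effect, here is a small Python sketch that treats the meter as one leg of a simple voltage divider against the source's internal resistance. The source and meter values are assumptions chosen only for illustration, not measurements from any real circuit.

```python
# Loading effect of a voltmeter, modelled as a simple voltage divider.
# All component values below are illustrative assumptions.

def indicated_voltage(v_source, r_source, r_meter):
    """Voltage a meter with input resistance r_meter (ohms) actually sees
    across a source with open-circuit voltage v_source and internal
    (Thevenin) resistance r_source."""
    return v_source * r_meter / (r_source + r_meter)

v_source = 10.0       # volts, open-circuit
r_source = 10_000.0   # ohms, a fairly high-impedance electronic node

# 1000 ohms-per-volt analog meter on a 10 V range -> about 10 kohm input
print(indicated_voltage(v_source, r_source, 10_000.0))      # ~5.0 V: reading sags badly
# Typical digital multimeter, ~10 Mohm input
print(indicated_voltage(v_source, r_source, 10_000_000.0))  # ~9.99 V: almost no sag
```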
The modern digital voltmeter has a very high input impedance, meaning that it draws almost zero current from the circuit under test. This way, there is little, if any, "impact" on the circuit while the voltage is being measured.
In other words, only a very tiny input current is needed to operate the meter's display.
There is no need for such a high-impedance meter on building wiring circuits. The old 1000 ohms-per-volt analog meter actually does a better job there.
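For anyone not used to the ohms-per-volt rating, here is a quick sketch of how it translates into the resistance the analog meter actually places across the circuit. The rating and ranges shown are typical assumed values, not taken from any particular meter.

```python
# Input resistance of a classic analog multimeter from its ohms-per-volt rating.
# The 1000 ohms/volt rating and the ranges below are assumed, typical values.

def analog_input_resistance(ohms_per_volt, range_volts):
    """Input resistance the meter presents on a given voltage range."""
    return ohms_per_volt * range_volts

for rng in (10, 150, 300, 600):
    r = analog_input_resistance(1000, rng)
    print(f"{rng:>4} V range: {r / 1000:.0f} kohm")

# Compare: a typical digital multimeter presents ~10 Mohm on every range.
```

Even on its highest range, that analog meter presents well under a megohm, while the digital meter presents around 10 megohms everywhere.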
Quote:
Please explain how high input impedance results in "false" voltage readings. That needs explaining.
This is how I view it.
A radio amplifies the tiny energy induced into its antenna to a level high enough to drive a speaker or headphones.
Think of a building wiring system as an antenna, and your high-impedance digital meter as an amplifier (which it is) that can amplify the induced energy to a value high enough to produce a reading, even if the circuit supply is shut off.
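Here is a rough sketch of that idea with numbers. A dead conductor run alongside a live one picks up a capacitively coupled "phantom" voltage; the coupling capacitance and other values below are assumptions chosen only to show how much the meter's input resistance matters.

```python
# "Phantom voltage": a de-energized conductor capacitively coupled to a live one.
# The coupling capacitance and circuit values are assumptions for illustration only.

import math

def phantom_reading(v_live, c_coupling, r_meter, f=60.0):
    """Magnitude of the voltage a meter with input resistance r_meter indicates
    when its only path to the live conductor is a small coupling capacitance
    in series (a simple R-C voltage divider at line frequency f)."""
    x_c = 1.0 / (2 * math.pi * f * c_coupling)       # reactance of the coupling path
    return v_live * r_meter / math.hypot(r_meter, x_c)

v_live = 120.0         # adjacent energized conductor
c_coupling = 100e-12   # ~100 pF of cable-to-cable capacitance (assumed)

print(phantom_reading(v_live, c_coupling, 10e6))    # ~42 V on a 10 Mohm digital meter
print(phantom_reading(v_live, c_coupling, 150e3))   # ~0.7 V on a 1000 ohm/V meter, 150 V range
```

With the assumed 100 pF of coupling, the 10-megohm digital meter shows tens of volts on a circuit that is switched off, while the low-impedance analog meter loads that coupling down to a fraction of a volt, which is why I say it does a better job on building wiring.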
Ed