120208-2055 EST
ArchieMedes:
With reference to the above highlighted text, how come the RMS meter reads higher than the average-responding meter? What is the theory behind it?
An AC meter, such as a Simpson 260 or Fluke 27, is based upon full-wave rectification of the AC input, resulting in a DC output with some average value. For a sine wave with no DC component, the average of the rectified signal is 0.636 times the peak voltage. If a half-wave rectifier were used instead, the average value would be 0.318 times the peak. In essence this average is the area under the rectified curve taken over one period. The easy way to get an exact value for these constants is with calculus: integrating |sin| over a full period gives 2/π ≈ 0.637, and a half-wave rectified sine gives 1/π ≈ 0.318.
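As a quick numerical check (a Python sketch of my own; the sampling approach and unit amplitude are assumptions, not from the original post), averaging the rectified waveform over one period recovers both constants:

    import math

    N = 100_000  # samples across one full period of a unit-amplitude sine
    sine = [math.sin(2 * math.pi * k / N) for k in range(N)]

    # Full-wave rectification: average the absolute value over the period
    full_wave_avg = sum(abs(v) for v in sine) / N

    # Half-wave rectification: the negative half-cycle is blocked entirely
    half_wave_avg = sum(max(v, 0.0) for v in sine) / N

    print(full_wave_avg, 2 / math.pi)  # both ~0.6366 (the 0.636 figure)
    print(half_wave_avg, 1 / math.pi)  # both ~0.3183 (the 0.318 figure)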
If you look at the definition of an RMS measurement from a mathematical point of view and relate it to power in an electrical circuit, you find that the RMS value lets you calculate power in an AC circuit exactly as you would in a DC circuit.
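For a concrete example with round numbers: 120 V RMS applied across a 10 Ω resistor dissipates P = V²/R = 120²/10 = 1440 W, which is exactly the power a 120 V DC source would deliver into the same resistor. That equivalence is what makes the RMS value the "effective" value of an AC voltage.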
Fundamentally, an RMS measurement squares the variable, takes the mean (average) of that square, and then takes the square root of that average: hence the name, root-mean-square.
The average-reading meter takes the average of the absolute value (full-wave rectified) of the signal, multiplies it by 1.11, and displays the result; the 1.11 factor is the sine-wave form factor (0.707/0.636), chosen so the display equals the true RMS value for a pure sine wave.
The RMS-reading meter instead multiplies the input signal by itself, producing a non-negative result. This result is averaged, and the square root of the average is obtained and displayed.
In one case you simply average the rectified measurement, and in the other the output is the square root of the average of the squared signal.
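Here is a minimal sketch of both processing chains (plain Python; the function names are mine, for illustration only) applied to the same unit-amplitude sine, where, by design of the 1.11 calibration factor, they agree:

    import math

    def average_responding_reading(samples):
        # Full-wave rectify, average, then scale by the 1.11 sine form factor
        return 1.11 * sum(abs(v) for v in samples) / len(samples)

    def true_rms_reading(samples):
        # Square, take the mean, then take the square root
        return math.sqrt(sum(v * v for v in samples) / len(samples))

    N = 100_000
    sine = [math.sin(2 * math.pi * k / N) for k in range(N)]

    print(average_responding_reading(sine))  # ~0.707, correct only because of the 1.11 calibration
    print(true_rms_reading(sine))            # ~0.707 for any waveform, by definition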
When you look at different input signal waveforms and process them with these two methods, you will find cases where the results differ considerably; yours is one illustration, and the sketch below shows another.
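Applying the same two chains to a 10% duty-cycle pulse train (an assumed high-crest-factor waveform; AC coupling is ignored for simplicity), the readings disagree by nearly a factor of three, with the true RMS value the higher of the two:

    import math

    N = 1000
    # 10% duty-cycle pulse train, 0 V to 1 V: a high-crest-factor waveform
    pulse = [1.0 if k < N // 10 else 0.0 for k in range(N)]

    avg_responding = 1.11 * sum(abs(v) for v in pulse) / N   # rectify-and-average chain
    true_rms = math.sqrt(sum(v * v for v in pulse) / N)      # square-mean-root chain

    print(avg_responding)  # ~0.111: the average-responding meter reads far too low
    print(true_rms)        # ~0.316: the true RMS value, almost 3x higher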