090611-2050 EST

The correct unit for torque is #-ft, #-in, N-m, etc., and NOT foot-#. The difference in names was to distinguish between two different animals: torque and work.

The torque wrench manufacturers that label their wrenches in foot-# are mislabeling their product. The difference in names for torque and work was emphasized to me way back in high school physics. Even in various references that predate my high school physics by many years I can show the correct usage of #-ft for torque. To a large extent the torque wrench manufacturers can be blamed for this common labeling and usage problem.

I have some additional comments on VA. The voltage and the current should each be RMS values. Really the important part is that the current be the RMS value, but it makes sense to also measure the RMS voltage. That the V and A of VA are RMS measurements does not mean that VA itself is RMS, so one should not say "RMS VA." One could say VA based on RMS values of voltage and current. One could use non-RMS values, but this would not be very useful. The normal assumption should be that VA is derived from the RMS measurements of V and of A.
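
As a small sketch (the 120 V and 5 A figures are made up for illustration, not from anything above), apparent power computed from RMS voltage and RMS current looks like this in Python:

```python
import math

def rms(samples):
    """Root-mean-square of a list of instantaneous samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One full cycle of a 120 V RMS sine voltage and a 5 A RMS sine
# current, sampled at N points (example values only).
N = 1000
v = [120 * math.sqrt(2) * math.sin(2 * math.pi * k / N) for k in range(N)]
i = [5 * math.sqrt(2) * math.sin(2 * math.pi * k / N) for k in range(N)]

va = rms(v) * rms(i)   # VA derived from the RMS measurements
print(round(va))       # -> 600
```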

Why is the RMS value of importance? Because the RMS value of the current determines the I^2*R heating of a resistance (wire) and therefore the temperature rise.

If there is little distortion in the voltage waveform, then a voltage measurement with a Simpson 260 or a non-RMS Fluke meter is still a very good estimate of the RMS voltage. These instruments perform a full-wave average measurement of the AC signal and use a meter scale calibrated to read the RMS value of a sine wave. Put a distorted waveform into these instruments and the reading may be quite different from the true RMS value. For a sine wave the ratio between RMS and full-wave average is 0.707/0.637 = 1.111. So if you used a full-wave bridge rectifier and a DC meter, the DC reading would need to be multiplied by 1.111 to obtain the RMS value.
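
A quick numeric check of that point, assuming an idealized average-responding meter (full-wave average scaled by the sine form factor pi/(2*sqrt(2)), about 1.111):

```python
import math

def true_rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def average_responding_reading(samples):
    # Full-wave rectified average, scaled by the sine-wave form
    # factor, as a Simpson 260-style meter movement effectively does.
    avg = sum(abs(s) for s in samples) / len(samples)
    return avg * math.pi / (2 * math.sqrt(2))

N = 1000
sine = [math.sin(2 * math.pi * k / N) for k in range(N)]
square = [1.0 if k < N // 2 else -1.0 for k in range(N)]

# Undistorted sine: both methods agree, about 0.707 for unit amplitude.
print(round(true_rms(sine), 3), round(average_responding_reading(sine), 3))
# Square wave: true RMS is 1.0, but the average-responding
# meter reads about 11% high.
print(round(true_rms(square), 3), round(average_responding_reading(square), 3))
```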

Why are devices like transformers rated in VA? This is because a large percentage of the power dissipated in the transformer comes from I^2*R power loss. So if a transformer can tolerate 100 A of input current at its full rated load, then it does not matter what the phase relationship between the input voltage and the current is; the internal power dissipation in the transformer is the same. Thus, a purely capacitive load that draws 100 A at a power factor of approximately 0 produces as much transformer heating as a purely resistive load drawing the same current at a power factor of 1.
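
A rough sketch of that claim (the winding resistance and the 240 V / 100 A figures are hypothetical): the copper loss depends only on the RMS current, while the real power delivered depends on the phase.

```python
import math

N = 1000
R_winding = 0.05          # ohms -- hypothetical winding resistance
V_rms, I_rms = 240.0, 100.0

def rms(x):
    return math.sqrt(sum(s * s for s in x) / len(x))

results = {}
for label, phase in [("resistive, PF ~ 1", 0.0),
                     ("capacitive, PF ~ 0", math.pi / 2)]:
    v = [V_rms * math.sqrt(2) * math.sin(2 * math.pi * k / N)
         for k in range(N)]
    i = [I_rms * math.sqrt(2) * math.sin(2 * math.pi * k / N + phase)
         for k in range(N)]
    real_power = sum(a * b for a, b in zip(v, i)) / N  # W to the load
    copper_loss = rms(i) ** 2 * R_winding              # W heating the winding
    results[label] = (round(real_power), round(copper_loss))
    print(label, "->", results[label])
# Load power differs (24000 W vs ~0 W) but the winding
# heating is 500 W in both cases.
```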

To clarify: what does RMS mean? The RMS (root-mean-square) value of a variable is obtained by squaring the instantaneous value of the variable, averaging the squares over a period of time, and taking the square root of that average. In general this is done on a variable that is periodic and statistically stationary, and the calculation is performed over an integral number of cycles, or over a very large number of cycles, so that the starting and ending points of the averaging window do not contribute significant error.
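
The three steps, square, average, square-root, can be written out directly. The little function below (illustrative only) also shows why the window should span a whole number of cycles:

```python
import math

def rms_of_sine(cycles, samples_per_cycle=1000):
    # Step 1: square the instantaneous values over the chosen window.
    n = int(cycles * samples_per_cycle)
    squares = [math.sin(2 * math.pi * k / samples_per_cycle) ** 2
               for k in range(n)]
    mean = sum(squares) / n        # Step 2: average the squares
    return math.sqrt(mean)         # Step 3: square root of the average

print(round(rms_of_sine(1), 4))      # whole cycle -> 0.7071
print(round(rms_of_sine(0.125), 4))  # 1/8 cycle -> about 0.42, a poor estimate
```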

Power factor is always defined as the ratio of POWER to VA, no matter what the waveform distortions are. With distorted waveforms, the cosine of the angle between the voltage and current zero crossings may not equal the power factor.
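
A small worked example (idealized waveforms, my own numbers): a sine voltage with a square-wave current whose zero crossings coincide with the voltage's. The displacement angle is zero, so its cosine is 1, yet POWER/VA comes out near 0.9.

```python
import math

N = 1000
v = [math.sin(2 * math.pi * k / N) for k in range(N)]
i = [1.0 if s >= 0 else -1.0 for s in v]   # distorted (square) current

def rms(x):
    return math.sqrt(sum(s * s for s in x) / len(x))

p = sum(a * b for a, b in zip(v, i)) / N   # real power, average of v*i
pf = p / (rms(v) * rms(i))                 # power factor = POWER / VA
print(round(pf, 3))                        # -> 0.9, not 1.0
```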

An unloaded transformer will have a low power factor, and this is a case where the cosine won't work well to define power factor because of the severe distortion of the excitation current. This same transformer at full load will have an input power factor very close to the power factor of the load on it, because the magnetizing current's contribution to the input current may be in the 1% range.

A good discussion on power factor can be found at

http://en.wikipedia.org/wiki/Power_factor