We are getting different kW readings depending on which device we use to measure. The scenario involves measurements taken with several power quality analyzers and with the utility's metering equipment, on a system using power factor correction devices that incorporate inductors in series with capacitors (allowing the devices to be placed near inductive loads). Each correction device connects in parallel with the load through a disconnect.
When a unit is connected, we consistently see the following per unit: a drop in line current of 14 to 16 A, an increase in voltage of 0.1 to 0.4%, and a reduction in reactive power of 22 kVAR. We also see a reduction of 1.7 to 2.2 kW, but only when measured with certain equipment. An Amprobe DM-II Pro, for example, will show the drop in kW. Also, the utility's metering equipment sometimes shows kW demand in real time and will show the drop in kW. When switching units on and off, we don't see any significant change in kW using certain other devices, including a Fluke 435 and an Amprobe DM-III, as well as analyzers from Dranetz-BMI and AEMC (each of which measures harmonics).
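For context, the current drop we observe is roughly what you would expect from removing 22 kVAR while real power stays constant. The following sketch checks that arithmetic; the 480 V three-phase service and the 60 kW / 50 kVAR starting load are assumed values for illustration only, not measured figures.

```python
import math

V_LL = 480.0           # assumed line-to-line voltage (V), three-phase
P_KW = 60.0            # hypothetical real power of the load
Q_BEFORE_KVAR = 50.0   # hypothetical reactive power before correction
Q_REMOVED_KVAR = 22.0  # kVAR removed by the correction unit (observed)

def line_current(p_kw, q_kvar, v_ll):
    """Line current for a balanced three-phase load: I = S / (sqrt(3) * V_LL)."""
    s_kva = math.hypot(p_kw, q_kvar)   # apparent power in kVA
    return s_kva * 1000.0 / (math.sqrt(3) * v_ll)

i_before = line_current(P_KW, Q_BEFORE_KVAR, V_LL)
i_after = line_current(P_KW, Q_BEFORE_KVAR - Q_REMOVED_KVAR, V_LL)

print(f"Current before correction: {i_before:.1f} A")
print(f"Current after correction:  {i_after:.1f} A")
print(f"Drop in line current:      {i_before - i_after:.1f} A (kW held constant)")
```

With these assumed numbers the drop works out to roughly 14 A, consistent with what we measure, and notably with no change in kW at all, which is what makes the instrument-to-instrument disagreement on kW puzzling.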
Without sparking a debate about power factor correction or real versus apparent power, what could be the reason for the differences in the measured kW?