GeorgeB
ElectroHydraulics engineer (retired)
Location: Greenville, SC
Occupation: Retired
A customer and good friend of mine makes equipment for the steel industry. One of his systems uses high-power resistive heaters in a 2400 °F furnace.
His element manufacturer had some quality-control issues, so to perform incoming inspection he uses an AC welder, a Lincoln AC-225-S, which has a 100% duty cycle at one setting (often used, though not by him, to thaw pipes).
A year or so back he decided to quantify his inspection. The data are puzzling.
The heater has 30.3 volts (+/- 0.1 V) across it, measured with four different "true RMS" meters: three Fluke, one Oriental something. Note that the operating supply is 277 V, so this monster just gets quite warm in free air during receiving inspection. The element is designed for a negligible temperature coefficient of resistance; it's not changing NOTICEABLY during the measurement.
The leads have 60 amps (+/- 0.2 A) through them, measured with four different clamp meters.
The heater has 0.450 ohms resistance; his specification to the element manufacturer is 0.44 to 0.46 ohms. This is measured approximately by two of the Fluke meters and, for more resolution, by a 4-wire low-resistance meter. He has a 0.2%, 0.565 ohm precision resistor he uses to check his meters.
So what's the question? R = E/I, right (DC, or a resistive AC circuit)? Run that calculation and we get 30.3 V / 60 A = 0.505 ohms, far different (over 10%) from the 0.450 ohms measured with DC instrumentation. Obviously the transformer welder is quite inductive. Is that causing my readings issue?
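For what it's worth, here's a quick Python sanity check of that arithmetic. The clean 60 Hz sinusoid assumption and the what-if inductance calculation are mine, not measured; it just asks whether the gap is bigger than the stated meter spreads, and what series reactance would have to be hiding in the loop to account for it:

```python
import math

# Quick numbers check, assuming clean 60 Hz sine waves (my assumption).
V, dV = 30.3, 0.1   # volts RMS across the element, +/- meter spread
I, dI = 60.0, 0.2   # amps RMS through the leads, +/- meter spread
R_DC = 0.450        # ohms, from the 4-wire measurement

Z = V / I                                  # apparent impedance magnitude
dZ = Z * math.hypot(dV / V, dI / I)        # RSS propagation of the meter spreads
print(f"V/I = {Z:.3f} +/- {dZ:.3f} ohms")  # 0.505 +/- 0.002 ohms

# What-if: IF the 30.3 V reading actually included an inductive drop in
# series with the 0.450 ohm element, the reactance and 60 Hz inductance
# needed to stretch 0.450 ohms into 0.505 ohms would be:
X = math.sqrt(Z**2 - R_DC**2)   # series R-L: |Z|^2 = R^2 + X^2
L = X / (2 * math.pi * 60)      # henries at 60 Hz
print(f"implied X = {X:.3f} ohms, L = {L * 1e3:.2f} mH")
```

The first line says the 0.055 ohm gap is far outside the stated meter spreads, so it isn't instrument scatter. The second part is only a what-if, since an ideal series inductor shouldn't show up at all in a voltage measured directly across the resistor.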
It's been 50 years since I studied this in EE school, and I never used it afterwards ... all I remember is that voltage and current are in phase in a resistor. But there is just one loop containing a simplified ideal source, an ideal inductor, and an ideal resistor. I know that if I put a 1 "ohm" capacitor in series with a 1 ohm resistor across an AC source, I get something different from two resistors (a lossy voltage divider) or two capacitors (a lossless voltage divider).
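To make that series-divider intuition concrete, here's a minimal phasor sketch (hypothetical 1 V RMS source and ideal 1 ohm elements, my numbers): with two resistors the magnitudes split 0.5/0.5, but with a resistor and a capacitor the drops are 90 degrees apart and their RMS magnitudes no longer add arithmetically:

```python
# Phasor sketch of the 1 ohm + 1 "ohm" series dividers (hypothetical 1 V RMS source).
V_SRC = 1.0  # volts RMS

def divider(z1, z2):
    """RMS voltage magnitudes across two series impedances (complex ohms)."""
    i = V_SRC / (z1 + z2)          # loop current phasor
    return abs(i * z1), abs(i * z2)

print(divider(1.0, 1.0))           # two resistors: (0.5, 0.5), sums to 1.0
v_r, v_c = divider(1.0, -1.0j)     # resistor + capacitor (X_C = -j1 ohm)
print(v_r, v_c)                    # 0.707... each, yet the source is only 1 V
print((v_r**2 + v_c**2) ** 0.5)    # 1.0 -- only the quadrature (phasor) sum matches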
@gar or anyone else, shed your light on my ignorance please.