I need to calculate the AC power required to serve telecommunications equipment. The data I have on the equipment is given with its DC value (i.e. 40A at -48VDC). How can I convert this DC load to AC (say 240VAC single phase)?
If you are asking what amount of AC power is the same as the DC you have described, then that is simple. Power is power: 40 amps times 48 volts (we can safely ignore the negative sign) equals 1920 watts. Now divide that by 240 VAC and you get 8 amps. But I suspect what you really need to know is what type of power converter you need to accomplish the conversion. I can't help you with that, as electronics and I do not get along.
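The power-equivalence arithmetic above can be sketched in a few lines (values taken from the thread; the variable names are just for illustration):

```python
# Power is power: equate DC power to AC power and solve for current.
dc_volts = 48.0   # magnitude of -48 VDC
dc_amps = 40.0
p_watts = dc_volts * dc_amps      # 1920 W

ac_volts = 240.0
ac_amps = p_watts / ac_volts      # 8 A
print(p_watts, ac_amps)           # prints 1920.0 8.0
```

Note this ignores converter losses, which a later post in the thread accounts for.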
Actually, your first response was the one I was looking for. I've seen the formula AC = DC/0.636 tossed around. Is this perhaps used for equating AC and DC voltages (not power)?
I tried to play around with the "0.636" number, without success. I can't figure out its origin. It does not appear to help in the conversion of DC power to AC power, for any of the commonly used voltage levels (12 VDC, 24 VDC, 48 VDC, 120 VAC, 208 VAC, 240 VAC, etc.).
My mistake: 0.636 is the average value of a "rectified" sine wave. The average of an unrectified sine wave would be zero, since the negative half cancels the positive half.
I did notice earlier that 0.636 was approximately equal to 2 divided by pi. I didn't think it was worth mentioning. But based on steve66's statement, I just did the integration, and confirmed that the average value of a rectified sine wave is 2/pi. But I still don't see how anyone would say that to convert DC power to AC power, you divide by that number. I think the "rule of thumb" that msteiner had been told is nonsense.
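The integration mentioned above is easy to confirm numerically. This sketch averages |sin(x)| over one full period and compares it against 2/pi ≈ 0.6366:

```python
import math

# Numerically average a full-wave rectified sine |sin(x)| over one period.
n = 100_000
avg = sum(abs(math.sin(2 * math.pi * k / n)) for k in range(n)) / n

print(avg, 2 / math.pi)   # both ≈ 0.6366
```

So 0.636 is a waveform-shape factor (average of a rectified sine relative to its peak), not a DC-to-AC power conversion factor.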
Reading between the lines, the question seems to be, "How much alternating current at 240V is required to operate a 40A, 48Vdc telephone power supply?"
If that is the case and assuming 75% efficiency,
Irms = (40 Adc x 48 Vdc) / (240 Vrms x 0.75) ≈ 11 A
Sounds like a 20A, 240V circuit to me.
One would have to see the specs on the equipment to be absolutely sure though.
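The sizing estimate above, with the assumed 75% efficiency, works out like this (the efficiency figure is an assumption from the post, not a spec):

```python
# Size the AC input current for a DC load, derated by converter efficiency.
dc_power = 40.0 * 48.0                  # 1920 W DC load
efficiency = 0.75                       # assumed converter efficiency
ac_input_power = dc_power / efficiency  # 2560 W drawn from the mains
i_rms = ac_input_power / 240.0          # ≈ 10.7 A, call it 11 A
```

An 11 A continuous draw on a 20 A, 240 V branch circuit leaves comfortable headroom.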
Let's assume the voltage is full-wave rectified. Next we consider the input to the DC filter. If it is a parallel capacitor, the capacitor will charge to the peak input voltage = 240 V / 0.707 = 340 V. That will be our DC output voltage.
If the input is a series inductor, then the DC output voltage will be the average of the input voltage = 240 V x 0.636 = 152 V.
Does this make any difference? Probably not, all else being equal, and assuming they aren't using any series voltage regulators. That's probably where the 0.636 popped up, but with Rattus' calculation we don't even need to know the input voltage.
Steve, I believe the average factor should be applied to Vpeak, not Vrms. None of these factors is germane to the problem though. The supply most likely contains a transformer, rectifier, regulator, and backup batteries. It is more than a simple rectifier circuit.
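To make the correction concrete: applying the 0.636 average factor to the peak voltage (rather than to the RMS value) gives a different choke-input figure. A quick sketch, assuming an ideal full-wave rectifier:

```python
import math

v_rms = 240.0
v_peak = v_rms * math.sqrt(2)     # ≈ 339.4 V, what a capacitor-input filter charges toward
v_avg = v_peak * (2 / math.pi)    # ≈ 216 V, full-wave rectified average (0.636 x Vpeak)
```

But as noted, none of these factors matters for sizing the AC branch circuit.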
240 V sounds like the ideal input voltage. Assuming the unit could be wired for 120 V, the input current would double to about 22 Arms.
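A quick check of the doubling claim, using the same assumed 75% efficiency as before:

```python
# Same 1920 W DC load, same assumed 75% efficiency, two input voltages.
dc_power = 1920.0
i_240 = dc_power / (240.0 * 0.75)   # ≈ 10.7 A
i_120 = dc_power / (120.0 * 0.75)   # ≈ 21.3 A, exactly double the 240 V figure
```

Halving the input voltage exactly doubles the input current for the same power drawn.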
I ain't built a lot of big current-regulating stuff. Paralleling transistors is a real challenge. You have to put resistances in series, otherwise one will turn on first and just short across the other, eliminating it from the circuit.
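The role of those series (ballast) resistors can be sketched with a back-of-the-envelope calculation. Assuming a hypothetical 30 mV Vbe mismatch between two paralleled transistors, the ballast resistance caps the resulting current imbalance:

```python
# Hypothetical sketch: emitter ballast resistors limit current imbalance
# between paralleled transistors. Values here are illustrative assumptions.
delta_vbe = 0.030                  # V, assumed Vbe mismatch between the pair
r_ballast = 0.1                    # ohm, series ballast resistor per transistor
delta_i = delta_vbe / r_ballast    # 0.3 A imbalance, instead of one device hogging everything
```

Without the resistors, a few tens of millivolts of mismatch is enough for one transistor to take nearly all the current.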