Saving Power with Lower Amperage?

richinsc

I have a question in terms of amperage. I know that when dealing with batteries you calculate run time in amp-hours. So, in theory, take an 800 W computer. I'm not saying it will actually run at 800 watts all the time! Anyway, for an 800 W computer you get the following amperage calculations:

800 W / 110 V = 7.27 A (800 watts on a 110 volt circuit)
800 W / 230 V = 3.48 A (800 watts on a 230 volt circuit)

So in theory, if you run a computer system at a higher voltage while on battery power, you would be able to run the computer longer at 230 V than you would at 110 V.

My question is: if you save on battery power that way, and you apply the same principle to power consumption and the electric bill, wouldn't you save money running all your consumer electronics at 230 V instead of 110 V?

But of course power bills are calculated in kWh, which means that regardless of the voltage you are still drawing the same wattage. Even so, I am thinking it should still save power to run at the higher voltage. The reason I ask is that I have several computer servers currently running on 110 V and have thought about switching them to 230 V. Please correct me if my thought process is wrong.
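
A minimal Python sketch of the arithmetic above, assuming an ideal 800 W load and ignoring power factor; the figures are just the ones already quoted in this post.

```python
def current_amps(power_w, voltage_v):
    """Current drawn at a given voltage: I = P / V (power factor ignored)."""
    return power_w / voltage_v

def energy_kwh(power_w, hours):
    """Energy the utility bills for: kWh = W * h / 1000, independent of voltage."""
    return power_w * hours / 1000.0

print(round(current_amps(800, 110), 2))  # 7.27 A on a 110 V circuit
print(round(current_amps(800, 230), 2))  # 3.48 A on a 230 V circuit
print(energy_kwh(800, 24))               # 19.2 kWh per day at either voltage
```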
 
Your servers require a certain amount of power (watts) to run; that does not change regardless of voltage.

This applies whether you are supplying from a battery or from the utility.
 
If you double the voltage of the system, keeping the same load power consumption (thus halving the load current) and the same amp-hour rating in your battery pack, then you will double your run time... but the battery pack will have twice as many cells and thus be twice the size. No energy is saved, aside from losses in the internal resistance of the battery pack.

You see the same issue when supplying the same loads at 120 V versus 240 V. The total power consumed remains the same, aside from small losses in the wiring itself.
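
A minimal sketch of that point, ignoring conversion losses and using a hypothetical 100 Ah pack at each voltage:

```python
def runtime_hours(pack_voltage_v, pack_amp_hours, load_w):
    """Ideal run time = stored energy (V * Ah, in Wh) / load power (W)."""
    return (pack_voltage_v * pack_amp_hours) / load_w

# The same 100 Ah rating at double the voltage doubles the run time...
print(runtime_hours(12, 100, 800))  # 1.5 h
print(runtime_hours(24, 100, 800))  # 3.0 h
# ...but the 24 V / 100 Ah pack stores twice the energy (twice the cells),
# so the energy the load draws is unchanged -- nothing is saved.
```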

-Jon
 
OK, this one is right up my area of expertise. First, you cannot compare DC battery power to AC power with respect to efficiency. Allow me to explain.

Running on battery power typically means you are running low voltages like 12, 24, 48, or 96 volts, and up to about 550. So take a small UPS running at 12 volts DC with an 800 watt load, converting to 120 VAC. At the DC side that translates to more than 800 watts because of conversion efficiency; a good number to use is plus 10%, or 880 watts, which at 12 volts is about 74 amps. How big a cable do you need to safely pass 74 amps? The NEC ampacity tables are not much help here, because per the code you can get away with #4 AWG. But if there is any real distance involved you will develop a lot of voltage drop and power loss, so you would have to upsize the cable to keep the voltage drop at 2% or less when operating at 12 volts. Move up to, say, a 170 volt UPS bus converting to 120 VAC, and now you are at about 5 amps and can use #12 AWG over a very long distance with very little voltage drop and power loss.
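
To put rough numbers on that, here is a small Python sketch. The resistance values are approximate figures from a standard copper wire table and the 50 ft run length is my own assumption; real sizing should follow the NEC tables.

```python
# Approximate DC resistance of copper conductors, ohms per 1000 ft (~20 C).
RESISTANCE_PER_1000FT = {"4 AWG": 0.25, "12 AWG": 1.59}

def voltage_drop_percent(load_w, supply_v, awg, one_way_ft):
    """Round-trip (out and back) voltage drop as a percent of supply voltage."""
    current = load_w / supply_v
    loop_resistance = RESISTANCE_PER_1000FT[awg] * (2 * one_way_ft) / 1000.0
    return 100.0 * current * loop_resistance / supply_v

# The 880 W load over an assumed 50 ft run:
print(voltage_drop_percent(880, 12, "4 AWG", 50))    # ~15% drop at 12 V -- unusable
print(voltage_drop_percent(880, 170, "12 AWG", 50))  # ~0.5% drop at 170 V -- fine
```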

So the answer to your question is efficiency: higher voltages are more efficient at transporting power from point A to point B, and installation costs less because you can use smaller conductors. We have not quite learned that here in the USA, unlike our UK cousins who operate at 220 to 240 volts. But we are stuck with our decision.

FWIW, a well designed data center would use a 3-phase dual-conversion UPS with a high-voltage DC bus around 500 volts, inverting to 480 volt delta and feeding a PDU that steps down to 208/120 V for branch circuit distribution.
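
As a rough illustration of why the higher distribution voltages matter, here is a Python sketch of feeder current for the same load at each level; the 100 kW load and 0.95 power factor are assumptions for the example, not figures from this post.

```python
import math

def three_phase_current(load_kw, line_to_line_v, power_factor=0.95):
    """Balanced three-phase line current: I = P / (sqrt(3) * V_LL * pf)."""
    return load_kw * 1000.0 / (math.sqrt(3) * line_to_line_v * power_factor)

for volts in (480, 415, 208):
    print(volts, "V ->", round(three_phase_current(100, volts), 1), "A")
# Roughly 127 A at 480 V, 146 A at 415 V, and 292 A at 208 V for a 100 kW load:
# the higher the distribution voltage, the smaller the conductors and gear.
```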
 
I completely agree with the previous posts.
In addition, there are many white papers out on the web for data center energy efficiency.
One that I like includes eliminating one of the transformation steps (the step down to 208 V at the PDU level) and distributing at 415/240 V three-phase: http://www.emerson.com/edc/docs/Energy_Logic_Reducing_Data_Center_Energy_Consumption.pdf
It brings an idea from European voltage levels here to the US, since the server power supplies can handle the higher voltage.
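
A small sanity check on those figures (my own arithmetic, not from the paper): in a wye system the line-to-line voltage is sqrt(3) times the line-to-neutral voltage, so 415 V three-phase distribution puts 240 V phase-to-neutral at each server.

```python
import math

line_to_neutral = 240
line_to_line = line_to_neutral * math.sqrt(3)
print(round(line_to_line, 1))  # ~415.7 V line-to-line, i.e. the 415/240 V system,
                               # with no extra PDU transformer ahead of the servers
```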
 
One of the chief merits of using a higher voltage is, as you say, that smaller conductors can be used. But that isn't all of it.
Any other hardware that is rated by current (switchgear, contactors, fusegear, VSDs) is generally smaller and costs less.
 
Ron, then I assume the client would need to spec their equipment to operate at 240 V? I like that idea.
 