What formula to use to determine battery life


tonyou812

Senior Member
Location
North New Jersey
I want to determine how long a 120 volt battery backup will last if it is using a 12 volt battery and a power inverter. The load will be a computer CPU and screen, maybe 80-100 watts max. What is the formula for battery life? It is a small UPS; the battery says 10 Ah/20 hr.
 
Tony:

It depends on the FLA of the load, the type and model of the battery, and the efficiency of the inverter.


Is this a 120 V battery bank comprised of ten 12 VDC sealed lead acid batteries, or a small UPS with a single 12 VDC battery supplying a 120 VAC load?
 
Efficiency:

tonyou812 said:
What if the load on the inverter is 150 watts at 120 volts? Do I divide the wattage by 12 volts, which gives me about 12.5 amps, and then what?

You must know the efficiency of the inverter, say 80%. Then,

Pin = Pout/eff = 150W/0.8 = 188W

Iin = Pin/12V = 188W/12V = 15.6Adc

That is about 3 hours for a 50AH battery.
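A minimal sketch of that arithmetic in Python, assuming the 80% inverter efficiency and 12 V nominal battery voltage used above, and assuming (optimistically) that the full nameplate amp-hour capacity is available at this discharge rate:

# Rough run-time estimate for an inverter load on a 12 V battery.
# Assumptions: constant 80% inverter efficiency, and the full nameplate
# amp-hour capacity is usable at this rate (in practice it will be less;
# see the posts below).

def runtime_hours(load_watts, battery_ah, battery_volts=12.0, inverter_eff=0.80):
    p_in = load_watts / inverter_eff   # power drawn from the battery, W
    i_in = p_in / battery_volts        # DC current drawn from the battery, A
    return battery_ah / i_in           # hours, ignoring capacity derating

print(runtime_hours(150, 50))   # ~3.2 h for a 150 W load on a 50 Ah battery
print(runtime_hours(100, 10))   # ~1.0 h for a 100 W load on a 10 Ah battery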
 
Rattus' approximation is probably as good as you will be able to get given the quality of information available on the hardware that you are using.

The capacity of a battery is generally rated in terms of volts and amp-hours. In theory, a 1 amp hour battery can supply a current of 1 amp for 1 hour. So if you have a load of 15 amps and a capacity of 45 amp hours, then in theory you have 3 hours of run time.

Unfortunately it gets more complex than this. As a battery discharges, its output voltage declines. Similarly, under increasing load the output voltage will drop. The amp-hour measurement is made at a specified rate of discharge until the output voltage reaches a specified threshold. The discharge rate and the voltage threshold differ between battery types, and also between manufacturers who want to inflate their numbers.

In particular, simple lead acid batteries are often rated at the 'C/20' rate. This means that they are measured using a discharge that will deplete them in _20_ hours. By definition of amp-hours, running 2.5 amps for 20 hours means '50 amp hours'. But in reality, if a particular battery could supply 2.5A for 20 hours, it can probably supply 20A for perhaps 1 or 1.5 hours. This apparent reduction in capacity is the energy lost to the internal resistance of the battery, combined with energy still in the battery at the end of discharge, but not used because the discharge voltage has fallen too much.
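One common way to approximate this shrinkage (not mentioned in the thread, so treat it purely as an illustration) is Peukert's law. A sketch, assuming a 50 Ah battery rated at the 20-hour rate and a typical lead-acid Peukert exponent of about 1.2:

# Peukert's law approximation of how run time shrinks at discharge rates
# faster than the rating rate. Assumed values: 50 Ah rated at the 20-hour
# (C/20) rate, Peukert exponent k = 1.2 (typical for flooded lead acid).

def peukert_runtime_hours(rated_ah, load_amps, rated_hours=20.0, k=1.2):
    return rated_hours * (rated_ah / (load_amps * rated_hours)) ** k

print(peukert_runtime_hours(50, 2.5))   # ~20 h at the 2.5 A rating current
print(peukert_runtime_hours(50, 20.0))  # ~1.6 h at 20 A, not the "ideal" 2.5 h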

If your inverter is well regulated, then as the battery voltage drops, the current drawn from the battery will _increase_, so that the same power is delivered to the load.

So your first step is to figure out the current that the inverter will draw, as described by rattus.

Then you need to figure out how the battery will tolerate that loading, by comparing the battery amp-hour rating to the calculated load.

Use manufacturer information to figure out the expected capacity at that load.

Then, approximately, dividing the battery's amp-hour rating at that load by the load current will give you the expected run time.

-Jon
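Putting those steps together as a sketch, with a made-up capacity-vs-current table standing in for the manufacturer's discharge data:

# Sketch of the procedure above: compute the battery current the inverter
# will draw, look up the effective capacity at that current from a
# (hypothetical) manufacturer table, then divide to get run time.

# Hypothetical capacity data for a "50 Ah" battery; real values come from
# the manufacturer's discharge curves.
capacity_table = [   # (discharge current in A, usable capacity in Ah)
    (2.5, 50.0),
    (10.0, 42.0),
    (20.0, 35.0),
    (50.0, 27.0),
]

def effective_capacity_ah(load_amps):
    # Linear interpolation of the table, clamped at the ends.
    pts = sorted(capacity_table)
    if load_amps <= pts[0][0]:
        return pts[0][1]
    for (i1, c1), (i2, c2) in zip(pts, pts[1:]):
        if load_amps <= i2:
            return c1 + (load_amps - i1) / (i2 - i1) * (c2 - c1)
    return pts[-1][1]

def runtime_hours(load_watts, battery_volts=12.0, inverter_eff=0.80):
    i_batt = load_watts / inverter_eff / battery_volts   # battery current, A
    return effective_capacity_ah(i_batt) / i_batt        # hours

print(runtime_hours(150))   # ~15.6 A draw -> roughly 2.4 h with this table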
 
Jon,

I was going to write a nice long-winded response, then thought the better of it. Your post pretty much covers everything I can think of.

By and large, for off-the-shelf UPSes, the run time should be thought of in terms of shutting everything down. Most UPSes have run times at rated capacity that are measured in single-digit minutes. This is made worse because they are often rated in "volt-amps" in advertising literature, with the watts given in the fine print. To get a UPS that would support a 600 W server, I went with a Belkin 1100 VA UPS. It will run about 20 minutes at a typical load, but if I go to shut that machine down more than 5 or 10 minutes into being on batteries, the load becomes so great that it will lose power before it finishes shutting down. That's very typical of off-the-shelf UPSes, and a UPS with a 10 Ah battery falls into that category.

The reason for this, as you explain, is that capacity is rated using a discharge rate that is measured in hours, often somewhere between 5 and 20 hours. Assuming the best, a 300 VA load supported by a 12 V, 10 Ah battery has a run time of 24 minutes, or 0.4 hours. That's more than ten times faster than anyone's rated discharge rate.
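The arithmetic behind those numbers, as a quick best-case sketch (no inverter losses, full nameplate capacity):

# Best-case run time for a 300 VA load on a 12 V, 10 Ah battery,
# ignoring inverter losses and capacity derating.
energy_wh = 12 * 10            # 120 Wh of nameplate energy
runtime_h = energy_wh / 300.0  # 0.4 h
print(runtime_h * 60)          # 24.0 minutes
# Even the fastest common rating rate (5 hours) is much slower than this
# discharge, so the real run time will be shorter still.
print(5 / runtime_h)           # 12.5x faster than a 5-hour rating rate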

In the post I didn't submit, my opening comment was that the OP should determine the run time experimentally, then determine whether that is sufficient and resize the UPS accordingly.
 
Here is a sample calculation for a 30 kVA inverter:

UPS rating: 30 kVA (specified)
Load power factor: 0.8 (specified)
Min. DC voltage: 357 V (specified)
Max. DC voltage: 500 V (specified)
Inverter efficiency: 0.93 (considered)
End cell voltage (ECV): 1.75 V/cell (specified)
Back-up time: 15 min. (specified)

No. of cells = Min. DC voltage / ECV = 357 / 1.75 = 204 cells

Design margin (DM): 1 (considered)
Aging factor (AF): 1 (considered)

Max. DC current = (UPS rating x 1000 x load power factor) / (min. DC voltage x inverter efficiency)
               = (30 x 1000 x 0.8) / (357 x 0.93)
               = 24000 / 332.01
               = 72.29 A

Hence the AH capacity is selected for a 15 minute back-up time at that current, as per the manufacturer's discharge graph.

Hope the above helps.
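The same calculation as a Python sketch, using the figures given above:

# Battery sizing inputs for a 30 kVA UPS, taken from the figures above.
ups_kva = 30.0
load_pf = 0.8                # load power factor
min_dc_voltage = 357.0       # V
inverter_eff = 0.93
end_cell_voltage = 1.75      # V per cell
backup_time_min = 15         # capacity is then read from the discharge graph

cells = min_dc_voltage / end_cell_voltage                               # number of cells
max_dc_current = (ups_kva * 1000 * load_pf) / (min_dc_voltage * inverter_eff)

print(round(cells))               # 204
print(round(max_dc_current, 2))   # 72.29 A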
 
All you can get is an approximation, depending on discharge rate and the age of the batteries. You can zero in a little more using temperature, but here is the industry formula without considering temperature and age.

T = (AH / L) x C

Where:

T = Time in hours
AH = Amp Hours the battery is rated for.
L = Load in amps
C = Discharge rate correction factor (from the manufacturer's curves).

For “C” most batteries use 8 hours as the standard discharge rate, so C = 1. However, let's say you have a battery rated at 10 AH and you discharge it at a rate of 5 amps. Going to the manufacturer's curves, you would use 0.7 as “C”. On the other hand, the same battery at a discharge rate of, say, 0.5 amps would have C = 1.1. For practical applications just use 0.8 to 1 as “C”. If using 1 for C, the simplified formula is T = AH / L.
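That formula as a sketch, using the example “C” values quoted above (real values come from the manufacturer's curves):

# T = (AH / L) x C, with the illustrative correction factors from the post.
# "C" really comes from the manufacturer's discharge curves.

def battery_runtime_hours(amp_hours, load_amps, c_factor=1.0):
    return (amp_hours / load_amps) * c_factor

print(battery_runtime_hours(10, 5.0, 0.7))   # ~1.4 h at a 5 A discharge
print(battery_runtime_hours(10, 0.5, 1.1))   # ~22 h at a 0.5 A discharge
print(battery_runtime_hours(10, 5.0))        # 2.0 h with the simplified T = AH / L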

The only accurate method is by measurement, because all batteries lose capacity with age, charge/discharge cycles, and temperature variations.
 