Battery charger location and conductor sizing

I am in the military and newly posted to a shop that performs generator and UPS maintenance on our base. I've noticed that quite a few of our installations were performed by personnel posted in our shop rather than by the dedicated installation unit. This has resulted in some minor mistakes in wiring devices such as battery chargers. I want to effectively explain to our new apprentices what the overall outcome can be in different charging situations with the current setup.

A little background on the charger setup. It is mounted inside a Cummins transfer switch, and a harness connects the output terminals of the charger to a terminal block within the switch. The terminal connectors' maximum conductor size is 10 AWG. The connection is made at the starter for the generator.

So in normal operation our 15 amp, 12 volt charger will float and output 0.5 - 1.5 amps of current. During this time the temperature rise on the conductor is minimal and the voltage drop at the end of the conductor stays within a specification that allows the unit to function correctly.

The generator fails to start after 3 cranking cycles and the batteries are now drained. The charger kicks on and outputs between 6 and 12 amps. This heats the conductors, creating a larger voltage drop.

With the conductors undersized I can see this going two ways.

1. The voltage at the terminal is lower, changing the flow of current, and you get an output that still charges your battery but not fast enough to meet some NFPA 110 applications; and
2. The voltage drop is so large that the supply voltage at the battery is lower than the battery voltage, causing the unit to fail by either:
a. tripping the battery charger's built-in DC breaker; or
b. overheating and failing completely.

Is there anything I am missing in my failure explanation, or can anyone explain in more detail what will happen in 2? Maybe for 2a it is the heat of the conductor; for 2b I'm not quite sure how to explain it, i.e. backwards current draw from the battery, in which case a diode within the unit would solve it.

Possible solutions I will give them are:
1. Run 10 AWG conductor to a newly mounted terminal block and transition to a larger conductor there. Upsize the conduit and connect the larger positive and negative conductors to the starter;
2. Install a smaller battery charger with a max output low enough to satisfy wiring with 10 AWG, as long as it meets the generator's level of application; and
3. Supply the generator unit with a new AC run and mount a charger closer to the unit, shortening the DC run and allowing for a 10 A charge.

Any input on this would be great; the more angles I can see, the more ways I can lead a discussion on this topic.
 
One very quick comment.
You state that the 6 to 12 amps of charging current drawn by the batteries might cause so great a voltage drop that the voltage becomes too low to charge the battery.

You cannot have it both ways (Voltage drop and no current or reverse current.)

What will actually happen is that just enough current will flow to cause just enough voltage drop that the remaining voltage will cause exactly that current to flow into the battery. One half the current, one half the voltage drop.
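To put a sketch behind that, here is a minimal model (Python, with assumed numbers for the charger setpoint, current limit, and wire resistance, since the real run length only comes up later in the thread): the charger as a voltage source behind the loop resistance of the wire, feeding the battery. The current settles wherever the wire drop exactly equals the difference between charger voltage and battery voltage, capped at the charger's current limit.

```python
# Minimal series model: charger (constant voltage with a current limit) -> wire -> battery.
# All numbers are illustrative assumptions, not measurements from the actual installation.

def charge_current(v_charger, v_battery, r_wire_loop, i_limit):
    """Steady-state current, ignoring battery internal resistance."""
    i = (v_charger - v_battery) / r_wire_loop   # Ohm's law across the wire
    return max(0.0, min(i, i_limit))            # no reverse current, no exceeding the limit

V_CHARGER = 14.2   # assumed charger setpoint, volts
I_LIMIT = 15.0     # charger nameplate output, amps
R_LOOP = 0.25      # assumed out-and-back resistance of the undersized run, ohms

for v_batt in (10.0, 11.0, 12.0, 13.0, 14.0):
    i = charge_current(V_CHARGER, v_batt, R_LOOP, I_LIMIT)
    print(f"battery at {v_batt:4.1f} V -> {i:5.2f} A into the battery, wire drop {i * R_LOOP:4.2f} V")
```

As the battery voltage climbs, the current and the wire drop shrink together; on this model the drop never gets so large that charging stops, which is the point above.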

Now if there are loads on the DC system other than just the charging current into the battery (12 volt lights, for example, or the ignition circuit of a gasoline engine), then those other loads could drop the voltage to the point where the battery would not charge.

A bigger concern is that the engine might not be left running long enough to fully charge the battery.
 
Charging current is typically set at 10% of battery capacity. As you state the battery charger current is 10 A, your battery capacity is probably 100 Ah. If the charging voltage is lower than the battery terminal voltage, no battery charging takes place. So choose the size of the battery charger output wire in such a way that the full 10 A of current flows.
 
So am I understanding correctly that the voltage drop and current draw will find a point where either some current, or no current, flows to the battery to charge it?

If there are other loads, such as DC lights, they will run off of the battery or the battery charger. If battery charging is also needed, the resistance of each path will decide which way the current goes.

All of our engines, or I am willing to say every engine we have, has a charging alternator attached to the drive belt. So as long as that alternator is performing in optimal condition, the batteries will be charged at a sufficient rate while the engine runs, and the battery charger will assume a float voltage rather than an equalize charge once the generator switches from carrying load back to waiting for a signal to start.
 
Shaib

So let's assume that the charger puts out 14.2 V and the battery is at 10 V.

Using VD = 2pI(L/A) with 125 ft of 10 AWG and 10 amps flowing, what would be the outcome?

I use the below formula for sizing battery chargers.

1.2*battery Amp Hours divided by required charging hours.
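As a quick worked example of that rule (the 100 Ah capacity is only an assumption, borrowed from the 10% discussion above):

```python
# Charger sizing rule of thumb quoted above: 1.2 * battery amp-hours / required recharge hours.
# The 100 Ah capacity is an assumed figure for illustration.

def charger_amps(battery_ah, recharge_hours):
    return 1.2 * battery_ah / recharge_hours

for hours in (8, 10, 12):
    print(f"recharge a 100 Ah battery in {hours:2d} h -> {charger_amps(100, hours):.1f} A charger")
```

With those assumed numbers an 8 hour recharge calls for a 15 A charger, 10 hours for 12 A, and 12 hours for 10 A.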
 
That formula assumes that the battery must be recharged from complete discharge (0% SOC). That is rarely the case for batteries which are intended to last a large number of cycles, but it is a worst-case, infrequent occurrence for a UPS.

 
So let's assume that the charger puts out 14.2 V and the battery is at 10 V.

Using VD = 2pI(L/A) with 125 ft of 10 AWG and 10 amps flowing, what would be the outcome?
It is not clear what p and L mean in your formula. Let us take the voltage of the discharged battery as 9 V (please verify this); when the battery is fully charged it becomes, say, 10 V. So the difference 14.2 - 9 = 5.2 V is the voltage drop in the cable. As the battery receives more and more charge from the charger, its voltage improves towards 10 V, the charging current drops, and so does the voltage drop in the wires.
So choose the wire size with a resistance such that it causes a voltage drop of 5.2 V at the maximum charging current.
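For the specific case asked about (125 ft one way, 10 AWG, 10 A), here is a rough check, assuming about 1 ohm per 1000 ft for 10 AWG copper at room temperature (it is somewhat higher when the conductor is hot):

```python
# Rough voltage-drop check for the 125 ft / 10 AWG / 10 A case asked about above.
# 1.0 ohm per 1000 ft for 10 AWG copper at room temperature is an assumed round figure.

OHMS_PER_1000FT_10AWG = 1.0
one_way_ft = 125
amps = 10.0

loop_ohms = 2 * one_way_ft * OHMS_PER_1000FT_10AWG / 1000   # out and back
drop_volts = amps * loop_ohms

print(f"loop resistance ~{loop_ohms:.2f} ohm, drop at {amps:.0f} A ~{drop_volts:.1f} V")
```

That works out to roughly 0.25 ohm and about 2.5 V of drop, so with these assumed numbers the run still passes current; the drop just eats into the headroom the charger has above the battery voltage.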
I use the below formula for sizing battery chargers.

1.2*battery Amp Hours divided by required charging hours.
If the charging time is the same, i.e. 10 hours as in my 10% example, you end up with a larger charger than is required.
 
OK, I can help you out here; I have worked on battery plants all my career. For low voltage we do not use the minimum code conductor sizes; we design by voltage drop, and for 12 volt systems you are looking at 1 to 2%, which at 12 volts is only 0.12 to 0.24 volts, so there is not much to work with. After years of data collection, the formula to find the conductor size is [22.2 x Imax x 1-way wire distance in feet] / Vd = Cm

Where:
Cm = circular mils of copper conductor needed
Imax = maximum charge current in amps
Vd = allowable voltage drop in volts

Next point: you do not have to worry about the charge current. The charge current is controlled by the battery charger, which will go into current limit. Current limit is the charger's maximum charge current rating. The only thing you have to worry about is keeping the voltage drop to a minimum.

So for example, let's say the charger is rated 15 amps on a 12 volt system, the 1-way conductor distance is 30 feet, and we want to limit the voltage drop to 2%. [22.2 x 15 amps x 30 feet] / 0.24 = 41,625 cm. Cross-reference that and you need a #4 AWG copper conductor to get the job done. The NEC says all you need is 14 AWG.

As for charge rates, that depends on the battery type and how fast you want to completely recharge a dead battery. There is no point in discussing it in detail here, but as a general rule of thumb you want your charger sized to provide a minimum of C/12 and a maximum of C/8, where C = the battery amp-hour rating. C/10 is the sweet spot, so for a 100 AH battery that is 10 amps. FWIW, these are SLI (aka cranking) batteries and they can take very high charge rates of up to 1C.
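Here is a small sketch of that voltage-drop sizing calculation (the 22.2 constant, the 2% figure, and the 15 A / 30 ft example come straight from the post above; the circular-mil values per AWG size are standard published figures):

```python
# Conductor sizing by voltage drop, per the formula above:
#   Cm = 22.2 * Imax * (one-way distance in feet) / Vd

AWG_CMIL = {"14": 4110, "12": 6530, "10": 10380, "8": 16510,
            "6": 26240, "4": 41740, "2": 66360, "1/0": 105600}

def size_by_voltage_drop(i_max, one_way_ft, v_drop_allowed):
    cmil_needed = 22.2 * i_max * one_way_ft / v_drop_allowed
    for awg, cmil in AWG_CMIL.items():            # table is ordered smallest to largest
        if cmil >= cmil_needed:
            return cmil_needed, awg
    return cmil_needed, "larger than 1/0"

# The example from the post: 15 A charger, 30 ft one way, 2% of 12 V = 0.24 V allowed.
needed, awg = size_by_voltage_drop(15, 30, 0.24)
print(f"need {needed:.0f} cmil -> #{awg} AWG")     # about 41,625 cmil -> #4 AWG
```

Plugging in the 15 A / 30 ft / 0.24 V numbers reproduces the #4 AWG answer given above.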
 