Transformer overload capacity vs voltage

There are lots of good papers out there on overloading transformers. I found two that had all sorts of graphs of temperature vs. overload vs. time vs. lifespan, etc. What I couldn't find is anything discussing these factors at different applied voltages. I am talking about voltage still within nominal, say +10% vs. -10%. Would voltage at the high end of nominal buy me a little more capacity? It seems simple enough that the same current at a higher voltage = more kVA with the same heat, but maybe there are other factors at play I am glossing over.
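To make that concrete, here is a rough back-of-the-envelope sketch of the idea, using hypothetical single-phase numbers (240 V nominal, 15 kVA, unity PF), not any particular transformer:

    # Same winding current at different applied voltages.
    # Hypothetical single-phase 240 V, 15 kVA transformer, unity PF.
    S_rated = 15_000           # VA at nominal voltage
    V_nom = 240.0              # nominal secondary voltage, volts
    I_rated = S_rated / V_nom  # rated current, about 62.5 A

    for v in (0.9 * V_nom, V_nom, 1.1 * V_nom):
        s = v * I_rated        # apparent power at the same (rated) current
        print(f"{v:5.1f} V x {I_rated:.1f} A = {s / 1000:.2f} kVA")

    # Copper (I^2*R) loss is identical in all three cases because the
    # current is unchanged; only the core loss differs with voltage.

At -10%, nominal, and +10% that works out to 13.5, 15.0, and 16.5 kVA for the same winding current, which is the "free" capacity I am wondering about.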
 

charlie b

Moderator
Staff member
Location
Lockport, IL
Occupation
Semi-Retired Electrical Engineer
If you raise the voltage on the primary side, it will cause voltage on the secondary to go up. What that will do to the secondary current depends on the type of load. If it’s motors, current will drop. If it’s resistive heaters, current will go up.

But it sounds like you are asking what happens if you raise voltage and then adjust the load as necessary to achieve the initial value of current. My answer is that you will be putting out more kVA, but (as you suggest) the heat generated in the transformer will be the same. Thus, it will not impact the transformer’s lifespan.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
191003-1129 EDT

If load current is maintained constant, then the I^2*R losses from load current remain constant, but as the excitation voltage changes, the core losses change.

On a small 175 VA transformer with no load I read inputs of 1.8 W at 90 V, 4.2 W at 120 V, and 7.8 W at 140 V. Core losses and magnetizing-current losses increase rapidly with the voltage applied to the primary, much more rapidly than a linear relationship would suggest, and I expect faster than the square of voltage. For example, the voltage ratio squared is (140/120)^2 = 1.36, but the no-load input power ratio from 120 V to 140 V was 7.8/4.2 = 1.86.
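One way to put a number on "faster than the square of voltage" is to fit an exponent n in P_core ~ k*V^n between each pair of those readings. A quick sketch using only the figures quoted above:

    # Fit an exponent n in P_core ~ k * V**n to the no-load readings
    # above (1.8 W @ 90 V, 4.2 W @ 120 V, 7.8 W @ 140 V).
    from math import log

    readings = [(90, 1.8), (120, 4.2), (140, 7.8)]
    for (v1, p1), (v2, p2) in zip(readings, readings[1:]):
        n = log(p2 / p1) / log(v2 / v1)
        print(f"{v1} V -> {v2} V: n ~ {n:.1f}")

This gives roughly n = 2.9 from 90 to 120 V and n = 4.0 from 120 to 140 V, so the no-load loss on this small transformer rises well above a square law with voltage.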

.
 

Hv&Lv

Senior Member
Location
-
Occupation
Engineer/Technician
Nominal ±10%, so 108 V and 132 V.
Not to get too deep into it, but with those swings the transformer isn’t going to be the problem.
The transformer will outlast the components it is supplying, thus decreasing its load as time passes.
As stated above, the core losses will increase, but a transformer is rated in kVA at nominal voltage.
Raise the E, and the I will increase or decrease depending on the load type.

To a point.
Voltage is pressure; it can only go so high before it “busts out at the seams”.
 
If you raise the voltage on the primary side, it will cause voltage on the secondary to go up. What that will do to the secondary current depends on the type of load. If it’s motors, current will drop. If it’s resistive heaters, current will go up.

But it sounds like you are asking what happens if you raise voltage and then adjust the load as necessary to achieve the initial value of current. My answer is that you will be putting out more kVA, but (as you suggest) the heat generated in the transformer will be the same. Thus, it will not impact the transformer’s lifespan.


OK, well, I will un-secret all the details. The "load" that initiated this thought is a PV inverter. It will output the same power (actually slightly more power the higher the voltage is, but ignore that) at any voltage within its range. It has basically unity PF. The transformer is 15 kVA; the current PV system is about 14 kVA. I would like to expand the system to 19 kVA, and I think I can do this without upgrading the transformer. Max output doesn't happen all the time, and not at all in hot weather. The transformer will definitely be working into overload at times, no doubt. I can adjust the voltage by changing taps, so this was just part of evaluating whether higher voltage buys me a little more kVA.


191003-1129 EDT

If load current is maintained constant, then the I^2*R losses from load current remain constant, but as the excitation voltage changes, the core losses change.

On a small 175 VA transformer with no load I read inputs of 1.8 W at 90 V, 4.2 W at 120 V, and 7.8 W at 140 V. Core losses and magnetizing-current losses increase rapidly with the voltage applied to the primary, much more rapidly than a linear relationship would suggest, and I expect faster than the square of voltage. For example, the voltage ratio squared is (140/120)^2 = 1.36, but the no-load input power ratio from 120 V to 140 V was 7.8/4.2 = 1.86.

Interesting. I didn't think about that. I have measured the no-load losses of the above transformer but didn't think to try it at different voltages; I will do that. The good and interesting thing about my setup is that it is kind of "self-regulating": with little or no PV output the voltage stays lower, but it is driven up by "voltage rise" as current output increases, keeping core losses lower and raising kVA during high output (if that theory holds).


Nominal ±10%, so 108 V and 132 V.
Not to get too deep into it, but with those swings the transformer isn’t going to be the problem.
The transformer will outlast the components it is supplying, thus decreasing its load as time passes.
As stated above, the core losses will increase, but a transformer is rated in kVA at nominal voltage.
Raise the E, and the I will increase or decrease depending on the load type.

To a point.
Voltage is pressure; it can only go so high before it “busts out at the seams”.

I am not clear on what exactly you are saying here, Hv. I think you are speaking to reduced life of the load components due to possibly out-of-spec voltage? Now that I have clarified things a bit, would you like to re-comment?
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
191003-2220 EDT

electrofelon:

Insulation voltage withstand is unlikely to be the limit; I can't imagine it is not very much better than any voltage you will apply, something like 2500 to 4000 V for a transformer rated 120 V to several hundred volts.

So maximum internal hot-spot temperature is probably your major concern, with mechanical vibration less important.

Internal temperature rise over ambient, and therefore the absolute temperature, which is what fails insulation, is roughly Trise = Power * Thermal resistance. See https://en.wikipedia.org/wiki/Thermal_resistance .

It is hard to measure the maximum hot spot because it requires putting a sensor where you think that maximum is. So it is easier to guess that the maximum hot spot is 10 deg C greater than the average temperature rise of the copper wire, and then measure the wire temperature change by its resistance change after the resistance stabilizes.
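For reference, the usual copper-resistance calculation looks something like this; the 234.5 constant is the standard figure for copper, and the resistance values below are made up for illustration:

    # Estimate average winding temperature from the change in copper
    # resistance (234.5 is the inferred-zero-resistance constant for copper).
    def winding_temp_c(r_cold, t_cold_c, r_hot):
        # Average copper winding temperature implied by the hot resistance.
        return (r_hot / r_cold) * (234.5 + t_cold_c) - 234.5

    # Hypothetical example: 1.00 ohm measured at 25 C rises to 1.20 ohm
    # once the winding resistance stabilizes under load.
    t_avg = winding_temp_c(1.00, 25.0, 1.20)
    print(f"average winding temperature ~ {t_avg:.0f} C")    # about 77 C
    print(f"hot-spot guess (avg + 10 C) ~ {t_avg + 10:.0f} C")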

If you are grid-tied, then your greater problem could be overvoltage while trying to force power into the grid.

A transformer operated below its rated maximum ambient temperature will handle a greater load than it would at that maximum ambient rating.

.
 

Hv&Lv

Senior Member
Location
-
Occupation
Engineer/Technician
OK, well, I will un-secret all the details. The "load" that initiated this thought is a PV inverter. It will output the same power (actually slightly more power the higher the voltage is, but ignore that) at any voltage within its range. It has basically unity PF. The transformer is 15 kVA; the current PV system is about 14 kVA. I would like to expand the system to 19 kVA, and I think I can do this without upgrading the transformer. Max output doesn't happen all the time, and not at all in hot weather. The transformer will definitely be working into overload at times, no doubt. I can adjust the voltage by changing taps, so this was just part of evaluating whether higher voltage buys me a little more kVA.




Interesting. I didn't think about that. I have measured the no-load losses of the above transformer but didn't think to try it at different voltages; I will do that. The good and interesting thing about my setup is that it is kind of "self-regulating": with little or no PV output the voltage stays lower, but it is driven up by "voltage rise" as current output increases, keeping core losses lower and raising kVA during high output (if that theory holds).




I am not clear on what exactly you are saying here, Hv. I think you are speaking to reduced life of the load components due to possibly out-of-spec voltage? Now that I have clarified things a bit, would you like to re-comment?

Two kinds of losses to consider: no-load and full-load.
Also, is it oil-filled or dry-type?
I believe an oil-filled unit will definitely last longer due to its ability to dissipate heat faster.
I'm not sure why you want to go so high on your overvoltage. Your panels are only rated for so much output; once they reach their output, that's it. What is adjusting your inverter voltage going to get you?
We have them here voltage matching. They max out every day the sun shines.
These are 2 and 5 MW systems...
 
Two kinds of losses to consider: no-load and full-load.
Also, is it oil-filled or dry-type?

Oil.

I'm not sure why you want to go so high on your overvoltage. Your panels are only rated for so much output; once they reach their output, that's it. What is adjusting your inverter voltage going to get you?
We have them here voltage matching. They max out every day the sun shines.
These are 2 and 5 MW systems...

Nothing would be changed on the inverter; the voltage would be adjusted by the transformer tap settings. That is the question: does higher voltage result in higher kVA capability?
 

Hv&Lv

Senior Member
Location
-
Occupation
Engineer/Technician
Short answer: no. All the taps will do is change the TTR. The voltage will go up; the kVA will stay the same.
 

Hv&Lv

Senior Member
Location
-
Occupation
Engineer/Technician
But wouldn't current be less (on the inverter side), so less heat and more kVA?

A transformer isn’t a motor.
Also, there is voltage drop in a transformer, so the windings are compensated to account for this voltage drop. Backfeeding it with taps will operate opposite of what you think.
A transformer isn’t a ballast either. There aren’t any limits on the current output except the impedance; it will try to deliver whatever you ask for.
Overload it, and the voltage will drop due to saturation; it will heat up and blow.
If you increase the PV output, the transformer will try to keep up.
Whether or not you increase the voltage, it will still try to output whatever is being drawn or exported.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
191005-1155 EDT

The issue of overloading a transformer, meaning overheating the transformer internally, is largely determined by the current flow through the transformer. If the voltage is not excessive, then core losses are not a major factor.

Internal absolute temperature is, as I said before, T internal absolute = T ambient + (internal power dissipation * thermal resistance).

If you are at a low ambient temperature compared to the transformer's rated ambient, then you can load the transformer more heavily than its rated value.

Transformers are designed to some criterion for peak flux level so they do not go very far into saturation. How far into saturation you go is a function of the volt-time integral applied to the primary. Load current has only a small effect on core flux density; increasing primary current tends to slightly reduce the flux density because of voltage drop in the primary winding.
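In sinusoidal terms that volt-time integral shows up as the familiar EMF equation, B_max = V / (4.44 * f * N * A), so at fixed frequency the peak flux density scales directly with applied voltage. A small sketch; the turns count and core area are invented values, and only the proportionality to V matters here:

    # Peak core flux density from B_max = V / (4.44 * f * N * A).
    # Turns and core area are made up for illustration only.
    def b_max(v_rms, f=60.0, turns=200, core_area_m2=0.003):
        return v_rms / (4.44 * f * turns * core_area_m2)

    for v in (216, 240, 264):   # -10%, nominal, +10% on a 240 V winding
        print(f"{v} V: B_max ~ {b_max(v):.2f} T")

    # +10% voltage means +10% peak flux. If the core is designed near the
    # knee of its B-H curve, that extra 10% is what drives the steep rise
    # in no-load loss measured earlier in the thread.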

I can short the secondary of a transformer and raise the primary voltage to a level that does not exceed the primary current rating and do no damage to the transformer.

I can also run the transformer unloaded at an overvoltage that does not cause excessive heating, and not damage it. This needs a little qualification, because we now get more heat from the core itself, which means I cannot go to full rated primary current, and there may also be mechanical vibration problems. I would suggest that, from a steady-state point of view, this limit is less than double the nominal voltage rating of the transformer. I have never experimented to determine an approximate level; I think I would quit because of noise before that.

.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
191005-1307 EDT

Hv&Lv:

I see electrofelon as having the goal of moving more power from his solar system to the grid.

This means the output voltage from his inverter system has to be high enough to drive the amount of current he wants to force into the grid. In turn, that is a function of the grid voltage at the moment of interest and of all the impedances from his inverter to the grid.

His PV system inverter has to stay below some trip-out voltage.

So the transformer has to be picked to accomplish this, meaning turns ratio and internal impedance, and obviously with an adequate voltage rating.

The big problem I have read about with PV systems is that, when the grid is being backfed from the inverter, the power company transformer is too small for the peak power from the PV system, meaning high internal impedance, and thus the voltage at the inverter rises too high. A transformer can be interposed between the inverter and the service entrance to lower the inverter voltage while raising the voltage going to the grid, but does this create too high a voltage in the home? If you can tolerate the higher voltage in the home, then I believe a better approach would be an inverter that works to a higher output voltage.

.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
191005-1345 EDT

A further point is that a PV inverter driving the grid is designed roughly as a constant current source up to a limit defined by the power available from the PV array. Possibly it is a constant power device with certain limiting factors. I don't have one to play with.

.
 

Hv&Lv

Senior Member
Location
-
Occupation
Engineer/Technician
191005-1307 EDT

Hv&Lv:

I see electrofelon as having the goal of moving more power from his solar system to the grid.

This means the output voltage from his inverter system has to be high enough to drive the amount of current he wants to force into the grid. In turn, that is a function of the grid voltage at the moment of interest and of all the impedances from his inverter to the grid.

His PV system inverter has to stay below some trip-out voltage.

So the transformer has to be picked to accomplish this, meaning turns ratio and internal impedance, and obviously with an adequate voltage rating.

The big problem I have read about with PV systems is that, when the grid is being backfed from the inverter, the power company transformer is too small for the peak power from the PV system, meaning high internal impedance, and thus the voltage at the inverter rises too high. A transformer can be interposed between the inverter and the service entrance to lower the inverter voltage while raising the voltage going to the grid, but does this create too high a voltage in the home? If you can tolerate the higher voltage in the home, then I believe a better approach would be an inverter that works to a higher output voltage.

.

Our large sites supply their own transformers, usually 25 kV to 347/600. Not sure why they like those secondary voltages here in the lower 48, but that seems to be the voltage of choice for the secondary.
For our houses, we don't allow net metering. They can export, but it is basically a feed-in tariff paid from a separate revenue meter, and the price paid is the wholesale cost. Net metering is retail.
With that being said, the houses generally don't produce more than electrofelon is wanting to do. He has a 15 kVA transformer but wants to push 19 kVA through it at a higher voltage.
I don't know the situation there exactly. Usually the inverters should trip somewhere around 258 (129). He isn't picking out a transformer; it's already there.
My thinking here is that with the increase in voltage to, say, 128 V, and an assumed 120:1 TTR transformer, his output voltage will be around 1000 volts above the nominal primary (see the sketch below). This increased voltage and current will cause a transformer overload that will continue until it reaches its breaking point. In the dead of winter he could possibly do this for days or weeks; in summer the grid output may fall low enough to keep the transformer right at its full rating.
Increased voltages, increased currents, eventually core saturation, loss of voltage regulation, and something goes. Winding shorts, and the transformer goes kaput...
Or, best case, the inverter trips offline or ratchets production down.
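To spell out the "around 1000 volts above nominal primary" arithmetic above (the 120:1 ratio is stated, but the 14.4 kV nominal primary is only an assumption):

    # Secondary voltage reflected to the primary side through a 120:1 TTR.
    # The 14.4 kV nominal primary is an assumption, not a known value.
    ttr = 120
    v_secondary = 128            # volts, raised secondary
    v_primary_nom = 14_400       # volts, assumed nominal primary

    v_primary = v_secondary * ttr
    print(f"reflected primary ~ {v_primary} V, "
          f"{v_primary - v_primary_nom} V above nominal")   # about 960 V over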

I have no idea what type of system he has.

Of course, all that being said, his 19 kVA system will rarely, if ever, see all that. House loads soaking some up, array inefficiencies, and losses in the PV system will probably keep his max around the 15 kVA point anyway... :roll:
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
191005-1637 EDT

There are a number of things said in different posts that are confusing.

The PV system is a power source and only that. It is a one-way device; it is not a load. Further, it is pretty much a current source as designed for a grid-tied application.

Core saturation results from the volt-time integral applied to the primary. Obviously scaling factors exist.

Increasing load current does not increase saturation unless that increasing current in some way increases the primary voltage, lowers the input frequency, or changes the primary voltage waveform in certain ways.

We do not know what transformer is in question.

If grid-tied, and the grid voltage goes down, then the PV inverter can push more power to the grid if it is available; but if it was already pushing all available power, then there is no change. We would generally expect the PV system to output all available power to something, grid or home or a combination, until it reaches its maximum voltage.

I don't know if inverters trip out on overvoltage, or just hold at that voltage output and reduce current output.

.
 
Lots of different things are being discussed here. I will try to clarify a few points generally, then quote some specific statements.

I think the original question sort of got lost in the weeds. I may have confused things with excessive details of my somewhat odd electrical system. Basically the question is: if I raise the voltage of a transformer, say, 5% above nominal, does that theoretically give me 5% more kVA capacity? We should probably assume unity PF to not get too confused. Perhaps we should clarify what is meant by "raise the voltage." We could raise the voltage of the secondary side with taps; of course, in that case the primary would remain at the same voltage. This is the case I was thinking of. If we had a source that was under our control, we could then raise the voltage of both sides. Either way, one or both windings will have a higher voltage than nominal.
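Here is a rough sketch of how I am framing it, treating the transformer limit as a current (heating) limit, single-phase and unity PF:

    # Does +5% voltage buy ~5% more kVA if the real limit is winding current?
    # Illustrative numbers only.
    S_rated = 15_000                   # VA nameplate
    V_nom = 240.0
    I_limit = S_rated / V_nom          # about 62.5 A current/heating limit

    P_pv = 19_000                      # planned inverter output, watts
    for v in (V_nom, 1.05 * V_nom):
        i = P_pv / v                   # inverter current at that voltage
        print(f"{v:5.1f} V: {i:.1f} A vs {I_limit:.1f} A limit "
              f"({i / I_limit:.0%} of rated current)")

At 240 V the 19 kW works out to about 127% of rated current; at 252 V it is about 121%. So higher voltage trims the current overload a bit, but it does not make 19 kVA fit inside the 15 kVA current rating.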

He has a 15 kVA transformer but wants to push 19 kVA through it at a higher voltage.

That is odd wording, but yes, that is how it works: you need a certain voltage to push the amount of current you want. Put another way, I have 14 kW of inverters right now and want to add 5 kW more. Yes, that will cause more voltage rise/drop to make that happen, but that is not the voltage change I was talking about; that would only be a volt or so. I would change the taps on the transformer to kick it up 12 volts or so (if that will make the transformer happy). The inverters will trip out at 264 V, and of course I don't want to cut it too close because of varying grid voltage.

I don't know if inverters trip out on overvoltage, or just hold at that voltage output and reduce current output.

My experience is they will trip out. They won't try to reduce output to hover below that trip setting.

The big problem I have read about with PV systems is that, when the grid is being backfed from the inverter, the power company transformer is too small for the peak power from the PV system, meaning high internal impedance, and thus the voltage at the inverter rises too high. A transformer can be interposed between the inverter and the service entrance to lower the inverter voltage while raising the voltage going to the grid, but does this create too high a voltage in the home? If you can tolerate the higher voltage in the home, then I believe a better approach would be an inverter that works to a higher output voltage.

Yes, that is possible. In my case, the POCO transformer is 25 kVA and 2.1 %Z, so it shouldn't contribute much voltage drop. PV systems are typically designed to keep conductor losses low, not so much for efficiency, but to keep the voltage firmly constrained within the inverter's operating window.
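As a rough check, treating the 2.1% as a simple series impedance and ignoring the X/R split and any line impedance beyond the transformer:

    # Approximate voltage rise at the service due to backfeeding the
    # 25 kVA, 2.1 %Z POCO transformer (ignores X/R and line impedance).
    S_xfmr = 25_000      # VA
    z_pct = 0.021
    V_nom = 240.0
    P_pv = 19_000        # watts backfed at roughly unity PF

    dv = (P_pv / S_xfmr) * z_pct * V_nom
    print(f"approximate rise across the POCO transformer: {dv:.1f} V")

That is only about 3.8 V at the full 19 kW, which supports the point that the POCO transformer is not the big contributor here.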

This means the output voltage from his inverter system has to be high enough to drive the amount of current he wants to force into the grid. In turn, that is a function of the grid voltage at the moment of interest and of all the impedances from his inverter to the grid.

His PV system inverter has to stay below some trip-out voltage.

Yes. But just to be clear, I am not concerned about this voltage "rise"; I have taps and can change them as necessary. I run at about the low 250s under full load; it will climb a bit with another 5 kW but should stay well under 264.

A further point is that a PV inverter driving the grid is designed roughly as a constant current source up to a limit defined by the power available from the PV array. Possibly it is a constant power device with certain limiting factors. I don't have one to play with.

There has been some discussion about the power output of inverters, and no one, except the inverter engineers and programmers, seems to know exactly how this works. The current of a GTI is a "hard stop" known value; of course we need that to size conductors and equipment. My inverters are 7 kW, and the specs say "rated power at 240 V," but mine typically float around 7160 watts under prime conditions. They will not go above that even with the highest AC voltage, excess DC available, and cold ambient temperature. It is nice of them to give me a little extra. Of course they can reduce power to protect themselves under high-heat conditions.

Of course, all that being said, his 19 kVA system will rarely, if ever, see all that. House loads soaking some up, array inefficiencies, and losses in the PV system will probably keep his max around the 15 kVA point anyway...

It will see the full 19 kVA with the second inverter; I see 14 kVA frequently now. At least this won't happen in hot weather. I am a hobbyist welder; maybe I will add some fins onto the transformer tank. :thumbsup:
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
191005-2351 EDT

electrofelon:

I have a 50 kVA pole transformer, some wire to my meter, a small additional amount of wire, main fuses, and then one circuit breaker. With about a 10 A, 120 V load, the loaded 120 V bus-to-neutral voltage changes by about 0.9 V. This does not include the breaker.

A 240 V, 10 A load would probably produce about a 1.0 V change (I can explain why if need be). The 50 kVA transformer itself will probably look like about a 0.25 V change for a 10 A load change, but a good amount of this is at 90 degrees.

If you pump back 80 A at 240 V, then for my parameters the hot-to-hot bus voltage will possibly increase by 8 V. My nominal at present is 123 V, which is quite normal, but it sometimes rises to over 125 V. With either of these nominal values, an 8 V increase would take me to 246 + 8 or 250 + 8, with half of that rise on each 120 V leg. Can equipment in my home tolerate voltage that high? Probably, but it should not have to.
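Spelling that out with the measured numbers above (the ~0.1 ohm loop figure is implied by the 1.0 V change at 10 A, not measured directly):

    # Reconstructing the 8 V figure from the measurements above.
    # 10 A at 240 V changed the hot-to-hot voltage by about 1.0 V, implying
    # roughly 0.1 ohm of effective loop impedance back to the source.
    r_loop = 1.0 / 10        # ohms, implied hot-to-hot loop impedance
    i_backfeed = 80          # amps pushed back by the PV system

    dv = i_backfeed * r_loop
    print(f"hot-to-hot rise ~ {dv:.0f} V, about {dv / 2:.0f} V per 120 V leg")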

If I put an interposing transformer between the PV system and my main panel, then I can reduce the voltage the inverter sees, but it does not correct the voltage in my home. I could also put the whole home load and the inverter on the output of the interposing transformer, but this increases the source impedance.

.
 