No load losses - normal vs reverse fed

Status
Not open for further replies.

Besoeker

Senior Member
Location
UK
Here's a test for you.

Find one post of yours anywhere on this forum where someone was able to change your opinion.

In your view you are always correct; that cannot be true.
Might I gently suggest that you stick to the topic rather than personal judgements?
 

iceworm

Curmudgeon still using printed IEEE Color Books
Location
North of the 65 parallel
Occupation
EE (Field - as little design as possible)
... The comment about lower than expected voltage doesn't just apply to control transformers. It is just basic transformer calcs. The no-load voltage is higher than the on-load voltage. Thus, the turns ratio cannot be the same as the designed operational voltage ratio. It really is that simple ...

Nor was I referring to just those.
My point is valid for all transformers.

Bes -
I can't tell what your point is. I read all the posts and you are jumping around a lot.

Compensated windings have nothing to do with the No-Load voltage being higher than the Full-Load voltage. That difference is due to transformer impedance. This voltage drop has nothing to do with compensated windings, forward or reverse fed, or even the turns ratio.

For non-compensated transformers (non-CPT; in the US we would call this a power transformer), the turns ratio is the same as the voltage ratio. And that would be true forward fed or reverse fed. Unless you have a valid reference, I'm sticking with my story.

And yes, I am discounting "Some manufacturers also build larger transformers with compensated windings," as referenced by the GE paper, because those are sufficiently weird that I've never seen one. And I can't tell you about IEC equipment, so that doesn't count either. However, even if those were included, that doesn't change:

The voltage drop from No-Load to Full-Load is from the transformer impedance.

And I think you are wrong with the compensated windings just being a change in the turns ratio. I'm thinking magic excrement in the flux loop has at least as much to do with it as turns ratio.

ice
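The no-load vs. full-load point can be made concrete with a quick regulation calculation. A minimal Python sketch, with purely illustrative nameplate figures (the %R, %X, and power factor below are assumptions, not numbers from any post in this thread):

```python
# Approximate the full-load secondary voltage of a transformer from its
# nameplate impedance. Illustrative numbers only.
import math

def full_load_voltage(v_no_load, pct_r, pct_x, power_factor):
    """Approximate full-load secondary voltage.

    Uses the common approximation: regulation (%) ~= %R*cos(phi) + %X*sin(phi),
    valid for lagging power factors and modest impedances.
    """
    phi = math.acos(power_factor)
    regulation = (pct_r * power_factor + pct_x * math.sin(phi)) / 100.0
    return v_no_load * (1.0 - regulation)

# Hypothetical 480:120 unit with 1.5% R and 3% X, loaded at 0.8 PF lagging:
v_fl = full_load_voltage(120.0, 1.5, 3.0, 0.8)
print(round(v_fl, 1))  # 116.4
```

With those assumed figures, a 120 V no-load secondary lands around 116.4 V at full load, a roughly 3% drop that comes entirely from impedance, with no compensated winding involved.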
 

Besoeker

Senior Member
Location
UK
Bes -
I can't tell what your point is. I read all the posts and you are jumping around a lot.

Compensated windings have nothing to do with the No-Load voltage being higher than the Full-Load voltage. That difference is due to transformer impedance. This voltage drop has nothing to do with compensated windings, forward or reverse fed, or even the turns ratio.

For non-compensated transformers (non-CPT; in the US we would call this a power transformer), the turns ratio is the same as the voltage ratio. And that would be true forward fed or reverse fed. Unless you have a valid reference, I'm sticking with my story.

And yes, I am discounting "Some manufacturers also build larger transformers with compensated windings," as referenced by the GE paper, because those are sufficiently weird that I've never seen one. And I can't tell you about IEC equipment, so that doesn't count either. However, even if those were included, that doesn't change:

The voltage drop from No-Load to Full-Load is from the transformer impedance.

And I think you are wrong with the compensated windings just being a change in the turns ratio. I'm thinking magic excrement in the flux loop has at least as much to do with it as turns ratio.

ice

So what mechanism is used to compensate?
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
The way I see this B is stating that some or maybe all European power transformers have compensated windings.
US electricians and EEs who are familiar with US power transformers state that few if any have compensated windings.
Neither side has produced any evidence to refute the other position, only to support their own.
The two positions are *not contradictory*, so I do not see the point in just repeating the same assertions!

Sent from my XT1585 using Tapatalk
 

iwire

Moderator
Staff member
Location
Massachusetts
The way I see this B is stating that some or maybe all European power transformers have compensated windings.
US electricians and EEs who are familiar with US power transformers state that few if any have compensated windings.
Neither side has produced any evidence to refute the other position, only to support their own.
The two positions are *not contradictory*, so I do not see the point in just repeating the same assertions!

Sent from my XT1585 using Tapatalk

I agree with this entirely, and both electrofelon and I suggested that was the case, but Bes would have none of it; he told us that was not possible.
 

Besoeker

Senior Member
Location
UK
I agree with this entirely, and both electrofelon and I suggested that was the case, but Bes would have none of it; he told us that was not possible.
Do you agree/accept that no-load and on-load voltages differ?
 

iceworm

Curmudgeon still using printed IEEE Color Books
Location
North of the 65 parallel
Occupation
EE (Field - as little design as possible)
Do you agree/accept that no-load and on-load voltages differ?
There you go again. Stop it.

The difference between No-Load voltage and Full-Load voltage has nothing to do with compensated windings. It is caused by transformer impedance.

ice
 

Besoeker

Senior Member
Location
UK
There you go again. Stop it.

The difference between No-Load voltage and Full-Load voltage has nothing to do with compensated windings. It is caused by transformer impedance.

ice
But it has to do with turns ratio. Isn't that why, when used in reverse, the expected voltage may not be achieved, as Square D have stated?
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
But it has to do with turns ratio. Isn't that why, when used in reverse, the expected voltage may not be achieved, as Square D have stated?
Ding! Everyone to their corners.

Transformer impedance causes full load voltage to be lower than no load voltage.

The only question is whether a transformer is designed to produce its nominal voltage at full load or at zero load.
That is a question of normal practice rather than a mandatory design feature.

Sent from my XT1585 using Tapatalk
 

iwire

Moderator
Staff member
Location
Massachusetts
Ding! Everyone to their corners.

Transformer impedance causes full load voltage to be lower than no load voltage.

The only question is whether a transformer is designed to produce its nominal voltage at full load or at zero load.
That is a question of normal practice rather than a mandatory design feature.

Again I agree, I think you have summarized this nicely. At least that has been my understanding of this thread.
 

iwire

Moderator
Staff member
Location
Massachusetts
But it has to do with turns ratio. Isn't that why, when used in reverse, the expected voltage may not be achieved as SquareD have stated?

Again, Sq D is talking about control transformers when it mentions compensated windings. It does not say all transformers have compensated windings.

The practice of backfeeding a general purpose transformer is NOT recommended, especially in transformers smaller than 3Kva. Backfeeding is not allowed for any Industrial Control Transformers of any size, because windings are compensated and backfeeding will result in lower than expected output voltage.
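The behavior Sq D describes can be sketched with hypothetical numbers. A compensated control transformer has extra secondary turns so that the loaded output hits nominal; fed in reverse, that same compensation pushes the output low even before any impedance drop is counted. The 5% compensation figure below is an assumption for illustration only:

```python
# Why a compensated control transformer gives lower-than-expected output
# when reverse fed. A "compensated" winding simply has extra secondary
# turns so that the *loaded* output reaches nominal. Hypothetical figures.

NOMINAL_RATIO = 480 / 120          # 4.0, the nameplate voltage ratio
COMPENSATION = 1.05                # 5% extra secondary turns (assumed)
turns_ratio = NOMINAL_RATIO / COMPENSATION   # actual N1/N2, about 3.81

# Forward fed, no load: output is slightly HIGH; impedance drop under
# load brings it back down toward nominal.
v_forward_nl = 480 / turns_ratio
print(round(v_forward_nl, 1))      # 126.0

# Reverse fed (120 V applied to the old secondary): output is LOW even
# at no load, and load current drops it further still.
v_reverse_nl = 120 * turns_ratio
print(round(v_reverse_nl, 1))      # 457.1
```

With an assumed 5% compensation, the reverse-fed unit starts out about 5% below 480 V at no load, and impedance drop then subtracts from that, which is the "lower than expected output voltage" in the quoted text.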
 

big john

Senior Member
Location
Portland, ME
It has to be to get the right on-load voltage. It isn't a fudge.
It's called a compensated winding. As far as I know it does not exist outside of control transformers.

The turns ratios on power transformers are often accurate to within several decimal places of nominal nameplate voltages.

As far as the OP's question, reverse feeding transformers can absolutely result in higher excitation current, making it harder to select primary protection due to nuisance tripping, but I don't know what actually causes that.
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
As far as OPs question, reverse feeding transformers can absolutely result in higher excitation current, making it harder to select primary protection due to nuisance tripping, but I don't know what actually causes that.

I won't say this is the right answer, but it seems to me that the inner windings couple very tightly to the iron core and are very subject to transient effects of magnetization, while the outer windings are less tightly coupled (some of the inductance is air core) and so the surge current is more controlled.
That is in line with the primary of a one-way transformer being on the outside so that the taps are easier to run to terminals.
Do you have a reference for us Phil? :)

Sent from my XT1585 using Tapatalk
 
Interesting aside:
Has anyone ever seen anything on minimum inrush (normal install forward feed)?

Had an occasion to advise on a temp power install: 225 kVA, 480D/208Y, fed with a 100 A CB. Normal loading (primary side) was 50 A.

I figured it would be more likely than not to trip. IEEE says 1 in 6 times the inrush will be at its maximum. All we had for instrumentation was a peak-reading Fluke clamp-on. That thing just oozed on line. As I recall, the inrush was less than 100 A.

Considering 1 in 6 is max inrush, is 1 in 6 min inrush, with the rest scattered in between? Has anybody seen any actual test results, or papers on tests?

Plenty of papers on max inrush. Haven't ever seen one on min inrush.

ice
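On why inrush is statistical at all: the peak core flux (and hence the inrush current) depends on where on the voltage wave the breaker happens to close, plus any residual flux. A minimal single-phase sketch, in per-unit of the normal peak flux (the 0.5 p.u. residual below is an assumed value):

```python
# Peak transient core flux vs. breaker closing angle. Flux offset, not
# impedance, is what drives inrush, so the outcome varies shot to shot.
import math

def peak_flux(alpha_deg, phi_r=0.0):
    """Peak transient flux (per-unit of normal peak) for a closing angle
    alpha in degrees on the voltage wave (0 = voltage zero crossing),
    with residual flux phi_r in per-unit."""
    alpha = math.radians(alpha_deg)
    # flux(t) = phi_r + cos(alpha) - cos(wt + alpha); peak magnitude is:
    return abs(phi_r + math.cos(alpha)) + 1.0

print(round(peak_flux(0, 0.5), 2))   # 2.5 -> deep saturation, big inrush
print(round(peak_flux(90), 2))       # 1.0 -> no offset, minimal inrush
```

Closing at a voltage zero with adverse residual flux drives the flux peak far into saturation (maximum inrush); closing at a voltage peak with no residual gives no offset at all, which would be the minimum-inrush case asked about above. Everything in between is set by where the breaker contacts happen to touch.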

I have seen plenty of motors/transformers/ACs that have been wired by people who are not electricians. This means that usually they did not utilize the larger OCPD sizing allowed by the NEC as compared to "normal" loads. I can't recall ever having trouble getting any of these to start.

At my house, I have a 2400-to-240 15 kVA used in reverse, which then feeds another 2400-to-240. It's on a 90 amp breaker. I have been turning it on and off very frequently for various reasons, and it may have tripped out a few times upon energizing, but maybe not - it's a Homeline 2-pole, so sometimes that handle tie makes the operation funky and it's hard to tell.

P.S. good paper, nice summary.
 