**220915-2355 EDT**

spraymax6:

I think you lack an understanding of electrical circuit theory and its relation to practical circuits.

Is the source voltage at the input to your transformer effectively constant? In other words, is the source impedance at the transformer input very much lower than the transformer's internal impedance? In many cases you can make this assumption, or at least you may need to assume it for analysis purposes.

Given this assumption, if you are interested in the output voltage from the transformer, then you need to look at how your load current will vary with time.

If you need very good output voltage regulation from zero to maximum secondary current, then you will need a transformer with a lower internal impedance than if you only load the transformer with a single constant load.

In any case, the transformer will be required to run continuously at whatever average RMS load current you require. You adjust the turns ratio to give you whatever output voltage you need. If you need a lower internal impedance, you may need to use a transformer that is physically larger than the power rating alone would require, or use more expensive materials or other design approaches.
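The relationship between internal impedance and regulation described above can be sketched numerically. This is a minimal illustration with assumed values (240 V no-load, 100 A full load); the voltage, current, and impedance numbers are not from this thread, and the model treats the transformer as an ideal source behind a lumped impedance (magnitudes only, ignoring phase angle):

```python
# Sketch: how a transformer's internal impedance affects output voltage
# regulation. All numeric values below are illustrative assumptions.

def output_voltage(v_no_load, z_internal, i_load):
    """Secondary voltage under load, modeling the transformer as an
    ideal source behind its internal impedance (magnitudes only)."""
    return v_no_load - i_load * z_internal

def percent_regulation(v_no_load, v_full_load):
    """Classic regulation figure: voltage rise from full load to no
    load, as a percentage of the full-load voltage."""
    return 100.0 * (v_no_load - v_full_load) / v_full_load

v_nl = 240.0   # assumed no-load secondary voltage, volts
i_fl = 100.0   # assumed full-load secondary current, amps

# Halving the internal impedance roughly halves the regulation figure.
for z in (0.10, 0.05):   # internal impedance, ohms referred to secondary
    v_fl = output_voltage(v_nl, z, i_fl)
    print(f"Z={z:.2f} ohm -> Vfl={v_fl:.1f} V, "
          f"regulation={percent_regulation(v_nl, v_fl):.2f}%")
```

This makes the tradeoff concrete: a tighter regulation requirement forces a lower internal impedance, which is what drives the larger core or better materials mentioned above.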

.

Wow, I'm not sure you can judge my level of understanding of electrical circuit theory from one question. I asked a question to get clarification on a topic that has apparently been asked on this forum before, and because of that I lack an understanding of electrical theory?

No sample voltage drop calculation I have seen includes translating the voltage drop through a transformer. I think asking how people typically deal with this is a solid question, as I would venture to guess not many people take it into consideration. Take a look at the original thread to support that notion.

Sample voltage drop calculations always assume a voltage source and give you the formulas depending on whether the system is three-phase or single-phase.
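For reference, the textbook formulas those samples use can be sketched as below. This is a hedged illustration of the standard approximate forms (VD ≈ 2·I·L·(R·cosθ + X·sinθ) for single-phase, with √3 in place of 2 for three-phase); the function names and the R/X values are my own assumptions, not from the thread:

```python
import math

# Sketch of the standard approximate feeder voltage-drop formulas.
# r and x are conductor resistance/reactance in ohms per unit length;
# pf is the load power factor. Names and numbers are assumptions.

def vd_single_phase(i, r, x, length, pf):
    """Approximate single-phase drop: 2 * I * L * (R*cos(t) + X*sin(t))."""
    theta = math.acos(pf)
    return 2.0 * i * length * (r * math.cos(theta) + x * math.sin(theta))

def vd_three_phase(i, r, x, length, pf):
    """Approximate three-phase drop: sqrt(3) * I * L * (R*cos(t) + X*sin(t))."""
    theta = math.acos(pf)
    return math.sqrt(3) * i * length * (r * math.cos(theta) + x * math.sin(theta))

# Example: 100 A over 0.5 length units of 0.1 ohm/unit, 0.05 ohm/unit
# conductor at 0.85 power factor (all assumed values).
print(vd_three_phase(100.0, 0.1, 0.05, 0.5, 0.85))
```

Note that neither formula contains a transformer term; the drop through the transformer's internal impedance has to be handled separately, which is exactly the gap being asked about.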

In addition, the question is geared toward the practical application of all this. Typically you choose a transformer based on your anticipated load, but you don't know what that load will be all the time. Maybe the transformer is 25% loaded, maybe 50%, maybe 75%. Playing with taps to combat voltage drop may lead to overvoltage at times of light loading.

This is where the question stems from: if people don't typically take transformer voltage drop into consideration, and load profiles can change significantly enough that using taps isn't practical, why aren't more voltage drop issues occurring?

I have typically done VD calcs by hand and just translated the VD from the primary to the secondary; SKM is showing me this is not the best route.