080405-1630 EST USA
mjrose27:
The fundamental reason a transformer is rated on current (KVA or VA) is that a transformer is limited by two factors --- power dissipation in the transformer and maximum flux density. Power dissipation in a transformer is from two sources --- core losses and I^2*R losses. The resistive losses have no correlation with the phase of the input voltage to the transformer and therefore you are only concerned with the input current to the transformer for these losses.
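As a rough illustration (the numbers below are made up, not taken from any particular transformer): at a fixed secondary voltage the VA rating fixes the current, and the winding loss follows from that current alone, with no power factor appearing anywhere.

# Python sketch (assumed example values): winding loss depends only on |I|
V_secondary = 240.0      # V, assumed secondary voltage
S_rating = 5000.0        # VA, assumed transformer rating
R_winding = 0.05         # ohm, assumed total winding resistance referred to the secondary

I_rated = S_rating / V_secondary          # rated current, about 20.8 A
P_copper = I_rated**2 * R_winding         # I^2*R loss, about 21.7 W
print(I_rated, P_copper)                  # no power factor enters this calculation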
The transformer designer will in some fashion balance core and I^2*R losses. With no secondary load the transformer loss is all core loss plus the small I^2*R loss due to the magnetizing current. The closer the core runs to saturation, the greater these losses become. But for a given transformer design at a given input voltage, the no-load losses remain roughly constant as the secondary load varies.
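A minimal loss model along those lines, using assumed constants: the core loss is treated as fixed at a given input voltage, and the copper loss grows with the square of the load current.

# Python sketch of total loss vs. load current (assumed values, fixed input voltage)
P_core = 30.0        # W, assumed no-load (core + magnetizing) loss
R_winding = 0.05     # ohm, assumed winding resistance referred to the secondary

def total_loss(i_load):
    # core loss stays roughly constant; I^2*R loss rises with the square of the current
    return P_core + i_load**2 * R_winding

for i in (0.0, 5.0, 10.0, 20.8):
    print(f"{i:5.1f} A -> {total_loss(i):6.1f} W")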
However, as load current of any phase angle or waveform increases, the I^2*R losses increase. Whether the load current is at a 0 deg phase angle with respect to the voltage (a resistive load) or 90 deg out of phase (a capacitive load), the I^2*R losses are the same for the same load current. In the case of the resistive load a lot of real power is transferred to the load. With an ideal capacitive load no power is consumed in the load, but substantial I^2*R loss still occurs in the transformer.
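To make that concrete, here is a small numeric comparison (assumed values again): a resistive load and an ideal capacitive load drawing the same current produce the same I^2*R loss in the transformer, while the real power delivered to the load differs completely.

import math

# Python sketch: same load current magnitude, different phase angle (assumed values)
V = 240.0            # V rms, assumed secondary voltage
I = 20.0             # A rms, same magnitude for both loads
R_winding = 0.05     # ohm, assumed winding resistance

for name, phase_deg in (("resistive", 0.0), ("ideal capacitive", 90.0)):
    p_load = V * I * math.cos(math.radians(phase_deg))    # real power delivered to the load
    p_loss = I**2 * R_winding                             # I^2*R loss in the transformer windings
    print(f"{name:>16}: load power {p_load:7.1f} W, winding loss {p_loss:5.1f} W")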
Power dissipation in the transformer, relative to the size of the transformer, determines its rating because of the temperature rise within the transformer.
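A crude lumped-thermal estimate (the thermal resistance figure is assumed, not from any datasheet) shows how the dissipation, rather than the delivered power, sets that temperature limit.

# Python sketch: lumped thermal model, temperature rise ~ loss * thermal resistance
P_loss = 52.0        # W, assumed total transformer loss at full load
R_thermal = 1.2      # K/W, assumed winding-to-ambient thermal resistance
T_ambient = 40.0     # deg C, assumed ambient temperature

T_rise = P_loss * R_thermal           # about 62 K rise
print(T_ambient + T_rise)             # roughly 102 deg C winding temperature estimate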