191005-1307 EDT
Hv&Lv:
I see electrofelon's goal as moving more power from his solar system to the grid.
This means the output voltage of his inverter system has to be high enough to drive the current he wants to force into the grid. That required voltage is in turn a function of the grid voltage at the moment of interest and of all the impedances between his inverter and the grid.
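To put rough numbers on that, here is a minimal sketch in Python, assuming a purely resistive series impedance and made-up values throughout (none of these are electrofelon's actual numbers):

S_export = 15_000    # VA the inverter tries to export (assumed)
V_grid   = 240.0     # grid voltage at the service point, volts (assumed)
Z_series = 0.05      # total impedance from inverter to grid, ohms (assumed)

I = S_export / V_grid                # current forced into the grid, ~62.5 A
V_inverter = V_grid + I * Z_series   # voltage the inverter must hold, ~243 V

print(f"Current into grid: {I:.1f} A")
print(f"Required inverter voltage: {V_inverter:.1f} V")

Any extra impedance in that path, long runs or a small transformer, raises the voltage the inverter has to hold.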
His PV system inverter has to stay below some trip-out voltage.
So the transformer has to be picked to accomplish this, meaning turns ratio and internal impedance, and obviously with an adequate voltage rating.
The big problem I have read about with PV systems is that when the grid is being backfed from the inverter, the power company transformer is too small for the peak power of the PV system, meaning high internal impedance, and thus the voltage at the inverter rises too high. A transformer can be interposed between the inverter and the service entrance to lower the voltage at the inverter while raising the voltage going to the grid. But does this create too high a voltage in the home? If you can tolerate the higher voltage in the home, then I believe a better approach would be an inverter that works to a higher output voltage.
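A rough way to see that rise, again with assumed numbers and a simple percent-impedance model that ignores the X/R split:

V_nominal  = 240.0     # nominal secondary voltage, volts (assumed)
S_rated    = 10_000    # undersized POCO transformer rating, VA (assumed)
Z_pct      = 2.0       # transformer impedance, percent (assumed)
S_backfeed = 15_000    # PV export, VA (assumed)

# Per-unit loading times per-unit impedance approximates the voltage rise.
rise_pu = (Z_pct / 100.0) * (S_backfeed / S_rated)
V_at_inverter = V_nominal * (1.0 + rise_pu)

print(f"Voltage at inverter: {V_at_inverter:.1f} V")   # ~247 V, near typical trip levels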
.
Our large sites supply their own transformers, usually 25 kV to 347/600. Not sure why they like that secondary voltage here in the lower 48, but it seems to be the voltage of choice.
For our houses, we don’t allow net metering. They can export, but it is basically a feed-in tariff paid through a separate revenue meter, and the price paid is the wholesale cost. Net metering is retail.
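As a toy illustration of the difference, with placeholder prices that are not our actual rates:

exported_kwh   = 500     # kWh exported in a billing period (assumed)
retail_rate    = 0.12    # $/kWh retail (assumed)
wholesale_rate = 0.04    # $/kWh wholesale (assumed)

print(f"Net-metering credit (retail):      ${exported_kwh * retail_rate:.2f}")
print(f"Feed-in tariff payout (wholesale): ${exported_kwh * wholesale_rate:.2f}")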
With that being said, the houses generally don’t produce more than electrofelon is wanting to do. He has a 15 kVA transformer but wants to produce 19 kVA by pushing a higher voltage through the 15 kVA unit.
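Sketching what that looks like in current terms, using the 128 V on a 120 V base that comes up below (the rest of the numbers assumed):

S_rated  = 15_000                   # transformer rating, VA
S_target = 19_000                   # desired export, VA
V_nom    = 240.0                    # nominal secondary, volts (assumed)
V_run    = V_nom * 128.0 / 120.0    # elevated secondary, ~256 V

I_rated = S_rated / V_nom           # full-load current at nominal voltage
I_run   = S_target / V_run          # actual current at the elevated voltage

print(f"Rated current:  {I_rated:.1f} A")
print(f"Actual current: {I_run:.1f} A ({100 * I_run / I_rated:.0f}% of rating)")

So even at the elevated voltage, the windings would be carrying roughly 119% of rated current continuously.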
I don’t know the situation there exactly. Usually the inverters should trip somewhere around 258 V (129 V line-to-neutral). He isn’t picking out a transformer; it’s already there.
My thinking here: with the increase in voltage to, say, 128, and an assumed 120:1 TTR on the XF, his output voltage will be around 1,000 volts above nominal on the primary. The increased voltage and current will overload the transformer, and the overload will continue until the XF reaches its breaking point. In the dead of winter, he could possibly keep this up for days or weeks. In summer, the output may fall low enough to keep the XF right at its full rating.
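Checking that figure with the assumed 120:1 ratio, where each extra secondary volt reflects as 120 V on the primary:

turns_ratio = 120.0   # assumed TTR from the post
V_sec_nom   = 120.0   # nominal secondary, volts
V_sec_run   = 128.0   # elevated secondary from the post, volts

rise_primary = (V_sec_run - V_sec_nom) * turns_ratio
print(f"Primary voltage above nominal: {rise_primary:.0f} V")   # 960 V, i.e. "around 1000"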
Increased voltages, increased currents, eventually core saturation, loss of voltage regulation, and something goes. Winding shorts, XF goes kaput...
Or, best case, the inverter trips offline or ratchets production down.
I have no idea what type of system he has.
Of course, all that being said, his 19 kVA system will rarely, if ever, see all that. House loads soaking some up, array inefficiencies, and losses in the PV system will probably keep his max around the 15 kVA point anyway... :roll: