W or VA for transformer capacity?

Status
Not open for further replies.

ggunn

PE (Electrical), NABCEP certified
Location
Austin, TX, USA
Occupation
Consulting Electrical Engineer - Photovoltaic Systems
When I am considering PV inverters and transformer capacity, should I be using, for example with an SMA STP 62US-41, 62500W or 66000VA? Since transformer capacity is denoted as VA, 66000VA seems like apples to apples, but I can also see that the maximum output current used for wire sizing and OCPD relate to the 62500W number.
 
I believe the theory is that the wattage represents the real power delivered to the load, but the VA represents the apparent power that the conductors and system must be able to carry in order to do that.
 
I would always use VA even when it may happen to be W in some circumstances.
 
I believe the theory is that the wattage represents the real power delivered to the load, but the VA represents the apparent power that the conductors and system must be able to carry in order to do that.
Well, code says we use maximum output current for conductor size and OCPD, and when I back calculate from Imax using the service voltage I see that Imax relates to the W number, not the VA number.
 
Well, code says we use maximum output current for conductor size and OCPD, and when I back calculate from Imax using the service voltage I see that it relates to the W number, not the VA number.
What if the power factor is not unity? Then the maximum current could be much more for the VA than the W.
 
If the inverter is 62500 watts at a unity PF then watts and VA will be the same. If it has, say, a rating of 62500 watts @ 0.8 PF then the VA will be 25% higher.
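The W-to-VA relationship above can be checked with a quick sketch (a minimal Python illustration; the `apparent_power` name is mine, but VA = W / PF is the standard relationship, so 0.8 PF gives 25% more VA, not 20%):

```python
# Relationship between real power (W), power factor, and apparent power (VA),
# using the 62,500 W example from this thread.
watts = 62_500

def apparent_power(watts, pf):
    """VA = W / PF, where PF is the power factor (cos of the V-I angle)."""
    return watts / pf

print(apparent_power(watts, 1.0))  # 62500.0 -- at unity PF, VA equals W
print(apparent_power(watts, 0.8))  # 78125.0 -- 25% more VA than W at 0.8 PF
```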
 
If the inverter is 62500 watts at a unity PF then watts and VA will be the same. If it has, say, a rating of 62500 watts @ 0.8 PF then the VA will be 25% higher.
The question I asked is if the PV system can actually export anything other than power. I don't see how it could export VA since I would think the current exported would be in phase with the voltage.
 
Could the power factor from a PV system ever be anything but unity (or very close to it)?
PV inverters these days generally (always?) support exporting at non-unity power factor, and that capability is often a utility requirement, but I don't think there is as yet a good method to tell the PV inverters when to do that.

Cheers, Wayne
 
Well, code says we use maximum output current for conductor size and OCPD, and when I back calculate from Imax using the service voltage I see that Imax relates to the W number, not the VA number.
How's that? The spec sheet for the SMA STP 62-US says "62,500W ; 66,000 VA ; 480V/277V WYE ; 80A maximum output current". And 80A * 277V * 3 = 66,480 VA.

Cheers, Wayne
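The arithmetic in that post is easy to reproduce (a trivial Python check, using only the nameplate figures quoted above):

```python
# Back-calculating apparent power from the SMA STP 62-US nameplate figures
# quoted above: 80 A maximum output current on a 480Y/277 V three-phase system.
amps = 80     # maximum output current per phase
volts = 277   # line-to-neutral voltage
phases = 3

va = amps * volts * phases
print(va)  # 66480 -- tracks the 66,000 VA rating, not the 62,500 W rating
```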
 
How's that? The spec sheet for the SMA STP 62-US says "62,500W ; 66,000 VA ; 480V/277V WYE ; 80A maximum output current". And 80A * 277V * 3 = 66,480 VA.

Cheers, Wayne
<slaps forehead> So I see. I hadn't run the numbers on this particular inverter, mostly smaller ones.
 
Could the power factor from a PV system ever be anything but unity (or very close to it)?
Yes, it can be something besides unity. It certainly can be for a stand-alone inverter. And interactive inverters can be set for a certain power factor value or can respond to conditions. See UL1741SA volt-var feature.

When multiple generators are operating in parallel, such as on the grid, how do any of them decide what power factor to operate at? I ask with genuine curiosity.
 
Thanks to wwhitney for pointing out my misconception: if the maximum VA for the inverter is not furnished, the capacity of a transformer to handle the output of an inverter needs to be determined by an S = I × V calculation using the inverter's maximum output current, rather than by simply using the published maximum power in watts.
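That corrected approach can be sketched as follows (a minimal Python illustration; the function name is mine, and it assumes a three-phase system where S = √3 × V_LL × I):

```python
# Sizing from maximum output current rather than the W rating:
# minimum transformer apparent power for a three-phase inverter output.
import math

def required_kva_three_phase(i_max, v_ll):
    """Minimum kVA from max output current (A) and line-to-line voltage (V)."""
    return math.sqrt(3) * v_ll * i_max / 1000

# STP 62-US example: 80 A max output current on a 480 V system.
print(round(required_kva_three_phase(80, 480), 1))  # 66.5 -- vs. 62.5 kW nameplate
```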
 
New smart inverter requirements allow Volt-VAR control where the inverter will export VARs if the grid voltage goes down. It's all part of using DER to provide grid support functions and not just blindly export power. The inverters do this on their own based on internal settings, no outside control signal is needed other than the voltage of the utility. It's not always enabled but some utilities do want it enabled. To make sure that the customer can always export full power, which they get paid for, modern inverters will have higher VA ratings, to allow the export of the full real power rating ($$$) and some VARs on top of that (no $$$).
That being what it is, any BOS on the AC side needs to be able to carry the 80A maximum output current of this inverter. The PF does not matter. The transformer VA rating should be at least the inverter VA rating because that's the maximum apparent power the inverter can output. I've seen some inverter data sheets that list a max current for the real power and the apparent power rating and sometimes a designer will use the lower real current, and that is the incorrect value to use.
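The "full real power plus some VARs on top" headroom described above follows from the power triangle, Q = √(S² − P²) (a quick Python sketch using the STP 62-US ratings from this thread; the variable names are mine):

```python
# Reactive headroom available while exporting full real power:
# Q = sqrt(S^2 - P^2), from the apparent (VA) and real (W) power ratings.
import math

s_va = 66_000   # apparent power rating
p_w = 62_500    # real power rating

q_var = math.sqrt(s_va**2 - p_w**2)
print(round(q_var))  # 21207 -- VARs available on top of full real power export
```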
 
New smart inverter requirements allow Volt-VAR control where the inverter will export VARs if the grid voltage goes down. It's all part of using DER to provide grid support functions and not just blindly export power. The inverters do this on their own based on internal settings, no outside control signal is needed other than the voltage of the utility. It's not always enabled but some utilities do want it enabled. To make sure that the customer can always export full power, which they get paid for, modern inverters will have higher VA ratings, to allow the export of the full real power rating ($$$) and some VARs on top of that (no $$$).
That being what it is, any BOS on the AC side needs to be able to carry the 80A maximum output current of this inverter. The PF does not matter. The transformer VA rating should be at least the inverter VA rating because that's the maximum apparent power the inverter can output. I've seen some inverter data sheets that list a max current for the real power and the apparent power rating and sometimes a designer will use the lower real current, and that is the incorrect value to use.
That's what I said, innit? :D
 
Thanks to wwhitney for pointing out my misconception; the capacity of a transformer to handle the output of an inverter needs to be determined by a P=IV calculation using the maximum output inverter current rather than simply using the published maximum power in Watts if the maximum VA for the inverter is not furnished.
Ready for my take? 😏 I would say the NEC doesn't even address transformer sizing... But if we want a theoretical answer, I will allow the discussion to continue 😆. My inverters will put out over their nameplate wattage if the AC voltage is higher, so maybe we would want to use nameplate AC amperage × max AC voltage? However, then would we also need to take into account the transformer capacity at a higher than nominal voltage?
 
I'm going to go with maximum VA and leave it at that. I'm working on a system now where the inverters sum to 225 kW; I had originally specified a 225 kVA 480 V to 208 V step-down transformer, but I have changed it to 300 kVA (the next size up).
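The "next size up" step can be sketched like this (a minimal Python illustration; the list of standard kVA sizes and the ~5.6% VA-over-W ratio, taken from the 66,000/62,500 example in this thread, are assumptions):

```python
# Picking the next standard three-phase transformer size at or above the
# summed inverter VA. Standard size list is an assumption (typical catalog).
STANDARD_KVA = [75, 112.5, 150, 225, 300, 500, 750, 1000]

def next_size_up(load_kva):
    """Smallest standard size that covers the load."""
    return next(size for size in STANDARD_KVA if size >= load_kva)

# 225 kW of inverters with a VA rating ~5.6% above W (as on the STP 62-US)
# exceeds 225 kVA, pushing the selection to the next size:
print(next_size_up(225 * 1.056))  # 300
```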
 
I'm going to go with maximum VA and leave it at that. I'm working on a system now where the inverters sum to 225 kW; I had originally specified a 225 kVA 480 V to 208 V step-down transformer, but I have changed it to 300 kVA (the next size up).
Also, I believe "oversizing" PV transformers may be advisable due to better voltage regulation. In other words, I'm pretty sure a 100% loaded X kVA transformer will have more voltage drop than a 50% loaded 2X kVA transformer.
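That regulation point can be roughed out numerically (a very simplified Python sketch; the 5% impedance value is an assumption, and this first-order model ignores the X/R ratio and load power factor):

```python
# First-order voltage regulation: drop across the transformer's series
# impedance scales with per-unit loading, so a 2X-kVA unit at 50% load
# sees roughly half the drop of an X-kVA unit at 100% load (same %Z).
def pct_voltage_drop(load_kva, xfmr_kva, pct_z=5.0):
    """Rough drop estimate: %Z times per-unit loading."""
    return pct_z * (load_kva / xfmr_kva)

print(pct_voltage_drop(225, 225))  # 5.0 -- fully loaded X kVA unit
print(pct_voltage_drop(225, 450))  # 2.5 -- half loaded 2X kVA unit
```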
 