PV System Clarification

Status
Not open for further replies.

wwhitney

Senior Member
Location
Berkeley, CA
Occupation
Retired
Yes, utilities in general have a responsibility to keep the voltage within a window near nominal so that people's appliances and devices function and don't fry. My understanding is that, broadly speaking, under IEEE standards this window is +10% / -12% of nominal. Inverters generally follow the same standards.
So in theory you could have a 240V service where the utility provides 264V consistently, and you'd be unable to install any interactive inverters, as their voltage window is also limited to 264V?

I guess that problem is not easy to get rid of in theory, as even if the utility standard were revised to +8% / -12%, enough DG exporting simultaneously could push the grid voltage up by more than 2%. Although perhaps in practice that would usually be enough margin, and it would be hard to get enough DG to exceed 2% grid voltage rise?

Any pointers to a good case study/explanation of Hawaii's situation and rules? They have this problem and stricter rules in place to deal with it, right?

Thanks, Wayne
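The window arithmetic being discussed works out as below: a minimal sketch assuming a 240 V nominal service and the +10% / -12% band cited in the post (check IEEE 1547 and ANSI C84.1 for the authoritative limits).

```python
# Voltage window arithmetic for the scenario above: a 240 V nominal
# service with a +10% / -12% tolerance band (the figures cited in the
# post, not taken from the standards themselves).

NOMINAL_V = 240.0

def window(nominal, plus_pct, minus_pct):
    """Return (low, high) voltage limits for a tolerance band."""
    return (nominal * (1 - minus_pct / 100), nominal * (1 + plus_pct / 100))

low, high = window(NOMINAL_V, 10, 12)
print(f"+10% / -12% window: {low:.1f} V to {high:.1f} V")  # 211.2 V to 264.0 V

# If the utility itself sits at the +10% ceiling (264 V), an inverter
# limited to the same ceiling has zero headroom for voltage rise:
utility_v = high
headroom = high - utility_v
print(f"Headroom for inverter voltage rise: {headroom:.1f} V")

# With the hypothetical revised +8% utility ceiling from the post,
# DG-driven rise of up to 2% of nominal would still fit:
util_high_revised = NOMINAL_V * 1.08  # 259.2 V
margin_pct = (high - util_high_revised) / NOMINAL_V * 100
print(f"Margin above a +8% utility ceiling: {margin_pct:.1f}% of nominal")
```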
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
I would say this is out of date, and you indeed haven't been keeping track. UL 1741 SA is an example of expanding the standard to address these issues. California's smart inverter requirements, which rolled out between 2017 and 2020, mean that currently produced inverters being sold in the US are 'smart', not 'dumb'. (Technically they are 'grid support' inverters as opposed to mere 'interactive' inverters.) I haven't nerded out on all the nitty-gritty, but they are supposed to adjust output and power factor to respond to voltage and frequency conditions on the grid.

These issues are essentially a matter of programming. So while there is a large fleet of 'dumb' inverters still out there, the idea that the grid can't handle continued addition of new smart inverters doesn't hold much water in my opinion.
I'm absolutely not at all out of date. I've not seen anything from IEEE (or NREL or NERC or anyone) which addresses the issues I've raised.

The key problem with DER is the "D" -- Distributed -- and the way that even "Smart 2.0" inverters are primarily interested in maximizing production, rather than maximizing interaction with the rest of the generation fleet. I wasn't around as much when I was working on inverter firmware or utility-scale farms, but I have worked, first hand, on the software and control systems for PV systems.

DER is, by its very nature, not dispatchable, nor is it all that amenable to things like day-ahead forecasting. It trashes distribution systems, which either aren't designed for a positive voltage gradient, or aren't intelligent enough to manage the loss of reactive power or to communicate the need for additional reactive power.
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
So in theory you could have a 240V service where the utility provides 264V consistently, and you'd be unable to install any interactive inverters, as their voltage window is also limited to 264V?

I guess that problem is not easy to get rid of in theory, as even if the utility standard were revised to +8% / -12%, enough DG exporting simultaneously could push the grid voltage up by more than 2%. Although perhaps in practice that would usually be enough margin, and it would be hard to get enough DG to exceed 2% grid voltage rise?

Any pointers to a good case study/explanation of Hawaii's situation and rules? They have this problem and stricter rules in place to deal with it, right?

Thanks, Wayne
More or less. There are times when grids run "high" because they are expecting additional load at different times of the day, and they are adding generation and not switching taps ahead of the additional load.

When I was still in Austin, I often saw the voltage at the pad mount across the street over 260V -- almost no net current flow either way, voltage measured at the service -- particularly early in the day on cool days. All monitoring systems should track grid voltage, especially at those near-zero points.

The problem in Hawaii, and I worked with people on Hilo, is the amount of solar because of the high cost of energy. In their case, the issue was so much solar that distribution lines and circuits were potentially overloaded in the wrong direction. When the grid is treated as an "infinite battery", the amount of generation on circuits can exceed what might have been predicted as the sort of "load diversity calculation" maximum load. In the "infinite battery" scenario, people are sizing systems to produce an entire day's energy over the span of 4-6 hours. I saw similar, only different, issues in rural areas, including some of the more remote parts of Australia (and Texas).
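The sizing arithmetic behind the "infinite battery" point can be sketched quickly; the household figures below are assumptions for illustration, not data from Hawaii.

```python
# Rough arithmetic behind the "infinite battery" sizing problem
# described above. All figures are illustrative assumptions.

daily_load_kwh = 30.0   # assumed household daily consumption
production_hours = 5.0  # midpoint of the 4-6 hour window mentioned

# A net-zero system must push a whole day's energy out in those hours:
avg_export_kw = daily_load_kwh / production_hours
print(f"Average midday output: {avg_export_kw:.1f} kW")  # 6.0 kW

# Compare with an after-diversity demand figure a circuit planner
# might have sized around (again, an assumed value):
diversified_demand_kw = 2.0
ratio = avg_export_kw / diversified_demand_kw
print(f"Ratio to assumed diversified load: {ratio:.1f}x")
```

Even with generous assumptions, the midday export of a net-zero system lands well above the per-customer load a circuit was likely planned around, which is the reverse-overload concern.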
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
A couple points:

1. Regarding #1 in the OP, I think there is a difference between "the DG being the problem" and "the POCO having a problem". In the cases I was discussing, it is definitely the latter. The fact is, the voltage is all over the place even without the solar connected. I was seeing the voltage go from 240 to 255 in under a minute. The line needs to be upgraded from 4.8 kV and/or have more regulators. The POCO guy made some comment about the DG being the problem. No. Your voltage is all over the place and I don't see a single regulator on the line; it's just that no one noticed it before.
"Voltage is all over the place" because from the utility / ISO perspective, that's what is acceptable and it helps make the grid work. There was a study out of Texas which said that the mere act of trying to make the grid "work better" made the grid work worse. I don't know if Texas stopped trying to use the line frequency as a time base, but getting rid of that lunacy would help grid stability.

2. Regarding the "residential solar should be banned" comment, that just seems like a broad overgeneralization. I am sure there are some lines that have a ton of DG on them and perhaps it's causing issues (probably fixable by adjusting regulator parameters; maybe @Hv&Lv can comment). Other lines probably have relatively little DG on them and a long way to go before there are any "issues".

I've been following (or doing, or writing about) residential solar for about 15 years and I stand by that statement. Any region with more DER than the largest single generator, which is what operating procedures are based on, needs to require that all DER be aggregated, subjected to day-ahead forecasting, and appropriately dispatched.

What I saw when I was leaving the residential PV world to work with utility-scale PV, was wealthy homeowners maximizing their production in order to treat the grid as an infinite battery, with no ability to stabilize their own output. Even with the changes from the IEEE (UL is nice, but IEEE is who knows better what needs doing), DER is destabilizing in the current Wild West environment.
 

jaggedben

Senior Member
Location
Northern California
Occupation
Solar and Energy Storage Installer
... Any region with more DER than the largest single generator, which is what operating procedures are based on, needs to require that all DER be aggregated, subjected to day-ahead forecasting, and appropriately dispatched.
...

This seems way off the mark to me. I can see how such a stance might make sense for grids the size of a Hawaiian island or smaller. But it seems to make zero sense for typical mainland balancing authorities, let alone the grids that they interconnect to. I mean, Cal ISO has nearly five times as much DER as its largest generator (ten times if you consider Diablo Canyon's units individually), and that doesn't lead to alerts every summer day. I think you are vastly overstating the difficulty of predicting DER output over large areas. It is basically a component of load, which as far as I can tell is about as difficult to predict.
 
"Voltage is all over the place" because from the utility / ISO perspective, that's what is acceptable and it helps make the grid work. There was a study out of Texas which said that the mere act of trying to make the grid "work better" made the grid work worse. I don't know if Texas stopped trying to use the line frequency as a time base, but getting rid of that lunacy would help grid stability.

I really don't think so in this case. Clearly I am not a distribution system EE or grid operator, but I am an electrical distribution nut and quite familiar with the lines and methods around here. Some of the lines are due for upgrading, plain and simple. They did it on my line a few years ago when they updated the 13.2->4.8 kV platform that serves me and added a regulator bank after it. Another 4.8 kV line had regulators added last year. The line that serves the PV system I am discussing has a 13.2->4.8 kV pole bank that has got to be 40 years old, with probably high impedance and poor regulation by today's standards, and not a single regulator on the 4.8 kV section. The pad mount serving the place is 22 years old; I assume they will just replace that, and that will provide a stiff enough supply to fix things. I have a mini MV distribution system at my place and I have seen first hand how much better regulation and losses have gotten in the last 10 years.




What I saw when I was leaving the residential PV world to work with utility-scale PV, was wealthy homeowners maximizing their production in order to treat the grid as an infinite battery, with no ability to stabilize their own output. Even with the changes from the IEEE (UL is nice, but IEEE is who knows better what needs doing), DER is destabilizing in the current Wild West environment.

Again, this just seems like a broad generalization: "DER is destabilizing". Seems that logically it would depend on the amount of DER, characteristics of the line, spacing of the DER, and many other factors. Yes of course the grid is not an infinite battery and yes inverters treat it as such, but I have a hard time seeing that that matters when you have a small residential PV system every two to three miles of line.
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
I really don't think so in this case. Clearly I am not a distribution system EE or grid operator, but I am an electrical distribution nut and quite familiar with the lines and methods around here. Some of the lines are due for upgrading, plain and simple. They did it on my line a few years ago when they updated the 13.2->4.8 kV platform that serves me and added a regulator bank after it. Another 4.8 kV line had regulators added last year. The line that serves the PV system I am discussing has a 13.2->4.8 kV pole bank that has got to be 40 years old, with probably high impedance and poor regulation by today's standards, and not a single regulator on the 4.8 kV section. The pad mount serving the place is 22 years old; I assume they will just replace that, and that will provide a stiff enough supply to fix things. I have a mini MV distribution system at my place and I have seen first hand how much better regulation and losses have gotten in the last 10 years.

Local regulation, in terms of what's happening with you, can be improved. If your line is switched to provide lower voltage so the PV system stays connected, someone somewhere else on that same circuit may see too little voltage. If your PV system shuts down because a giant cloud just passed overhead, suddenly a lot of supply just disappeared, and then what? I've watched the voltage swings from cloud passage, and they are dramatic.

Again, this just seems like a broad generalization: "DER is destabilizing". Seems that logically it would depend on the amount of DER, characteristics of the line, spacing of the DER, and many other factors. Yes of course the grid is not an infinite battery and yes inverters treat it as such, but I have a hard time seeing that that matters when you have a small residential PV system every two to three miles of line.

It isn't the small systems. If you look at the change in both installations per year - about 3GW residential last year, compared to about 300MW residential in 2011 - and total generation, "10 years" is practically forever. That's the problem.

Systems are being installed to produce close to net-zero generation, and they have to do that during the few hours when there's good production. I've spoken with countless people who refuse to install anything other than south-facing arrays because that's the most cost effective. I've modeled split east-west arrays and they'd be far nicer to the grid than the fixation on south-facing-only. Relative to a few years ago they are still cheap, and with MLPE, they are just as easy to install.
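The orientation point can be sketched numerically. This is a toy model with assumed Gaussian output curves (peak hours and widths picked for illustration), not an irradiance model, but it shows the shape argument for split east/west arrays versus south-only.

```python
import math

# Toy comparison of a south-facing array vs. a split east/west array
# of the same total capacity. Each orientation's output is modeled as
# a Gaussian bump peaking at an assumed hour -- purely illustrative.

def bump(t, peak_hour, width=2.5):
    """Relative output at hour t for an orientation peaking at peak_hour."""
    return math.exp(-((t - peak_hour) / width) ** 2)

hours = [h / 2 for h in range(14, 37)]  # 7:00 to 18:00 in half-hour steps

south = [bump(t, 12.5) for t in hours]  # all capacity facing south
east_west = [0.5 * bump(t, 10.0) + 0.5 * bump(t, 15.0) for t in hours]

print(f"south peak:     {max(south):.2f}")
print(f"east/west peak: {max(east_west):.2f}")
# The split array's peak comes out roughly half the south array's,
# with the energy spread over more hours -- less midday voltage rise
# on the feeder for the same installed capacity.
```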
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
This seems way off the mark to me. I can see how such a stance might make sense for grids the size of a Hawaiian island or smaller. But it seems to make zero sense for typical mainland balancing authorities, let alone the grids that they interconnect to. I mean, Cal ISO has nearly five times as much DER as its largest generator (ten times if you consider Diablo Canyon's units individually), and that doesn't lead to alerts every summer day. I think you are vastly overstating the difficulty of predicting DER output over large areas. It is basically a component of load, which as far as I can tell is about as difficult to predict.

You were saying?

 
Local regulation, in terms of what's happening with you, can be improved. If your line is switched to provide lower voltage so the PV system stays connected, someone somewhere else on that same circuit may see too little voltage. If your PV system shuts down because a giant cloud just passed overhead, suddenly a lot of supply just disappeared, and then what? I've watched the voltage swings from cloud passage, and they are dramatic.



It isn't the small systems. If you look at the change in both installations per year - about 3GW residential last year, compared to about 300MW residential in 2011 - and total generation, "10 years" is practically forever. That's the problem.

Systems are being installed to produce close to net-zero generation, and they have to do that during the few hours when there's good production. I've spoken with countless people who refuse to install anything other than south-facing arrays because that's the most cost effective. I've modeled split east-west arrays and they'd be far nicer to the grid than the fixation on south-facing-only. Relative to a few years ago they are still cheap, and with MLPE, they are just as easy to install.
I guess I am not really sure what we are discussing/debating. I guess we have moved on from voltage gradients in long rural distribution lines to macro scale DER policy and grid operations?

The article you posted seems to be partly at odds with your thesis that DER causes problems with grid stability. They say they can curtail to keep the grid within parameters; it's just more at odds with policy and goals to have to curtail renewable sources. Are you saying that the ISO doesn't have the ability to curtail residential systems, only larger systems, and that is an issue? I didn't see any mention of that in the article.
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
If that's your best response then I feel like my point is made. That doc isn't even talking about behind-the-meter generation. And behind-the-meter generating capacity in CAISO territory has roughly doubled since it was published.
Maybe we have different perspectives on this?

@electrofelon conceded they can curtail output to stabilize the grid, but the paper also stated they were having to curtail more frequently. More frequently turning off PV isn't "getting better", it's getting worse. That's my point - more solar isn't making it "better", because I view "better" as not having to curtail, ever.

I'm not saying anything that's controversial, even for people in CAISO who are saying the same thing. I even remarked about changing orientation, which is one of the mitigations CAISO has said should be done.

 
Maybe we have different perspectives on this?

@electrofelon conceded they can curtail output to stabilize the grid, but the paper also stated they were having to curtail more frequently. More frequently turning off PV isn't "getting better", it's getting worse. That's my point - more solar isn't making it "better", because I view "better" as not having to curtail, ever.

I'm not saying anything that's controversial, even for people in CAISO who are saying the same thing. I even remarked about changing orientation, which is one of the mitigations CAISO has said should be done.

I am not going to deny that once you get to a certain point of portfolio penetration, wind and solar get challenging for a grid operator. My main issue is that many people seem to jump on the anti-renewable bandwagon and apply that philosophy everywhere, even in areas that have a much, much smaller renewable portfolio than other areas that are doing it successfully. For example, look at Texas, which has an enormous amount of wind capacity and is integrating it successfully. So if Texas is doing it with 15% renewables*, doesn't NY, with under 1% wind and solar, have a long "relatively non-problematic" way to go? Get to the point where things start to get "difficult" and go from there.

*is wind easier or harder for a grid operator to manage and predict than solar?
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
I'm far from anti-renewable. I think that done properly, which would involve a lot of the mitigations in that Wikipedia article, solar can be a very nice thing.

Texas has issues, some of which are because they aren't part of either interconnect. I made it through two rounds of phone interviews with Tesla to be their Stationary Storage firmware architect. They didn't know where they were going to do that. I refuse to move to Silicon Valley. I strongly suggested they move to Texas because ERCOT is a great laboratory. That went nowhere. Now I'm controlling your robot overlords. Would you like some 24VAC with your PLC?
 

jaggedben

Senior Member
Location
Northern California
Occupation
Solar and Energy Storage Installer
Maybe we have different perspectives on this?

@electrofelon conceded they can curtail output to stabilize the grid, but the paper also stated they were having to curtail more frequently. More frequently turning off PV isn't "getting better", it's getting worse. That's my point - more solar isn't making it "better", because I view "better" as not having to curtail, ever.

I'm not saying anything that's controversial, even for people in CAISO who are saying the same thing. I even remarked about changing orientation, which is one of the mitigations CAISO has said should be done.

I don't disagree with your points at their most general. I was responding to your particular assertion I quoted above regarding DER needing to be aggregated, dispatched, and limited to a size that struck me as arbitrary. I don't see how curtailment of utility scale renewables directly bears on the question of at what point residential DER would need to be dispatchable or limited in size. And whatever limits and controls there need to be on behind-the-meter DER, I don't think they are what you said.

Curtailment of renewables is an economic and policy-goal problem if you want renewables to keep growing, yes. But it doesn't seem to be evidence that operators are struggling from a technical standpoint to maintain grid stability in large grids. I mean, obviously the work required of grid operators has changed, but I haven't seen any evidence that it's been terribly difficult for them to handle. As far as I can tell they've been able to build up the necessary experience as more renewable generation has come online. In fact, due to the large amount of DER on CAISO's grid we have a lower annual peak load from the grid operator's perspective, which means there are fewer public alerts or requests for the public to curtail usage than there were a couple decades ago.
 

tallgirl

Senior Member
Location
Great White North
Occupation
Controls Systems firmware engineer
Sooner or later DER has to act like a part of the traditional generation fleet. DER provided half of all PV energy in California the last few years, and CAISO has some very ambitious goals for more renewable energy.

It's not technically hard to do what I described because data reporting from DER systems is increasingly common. It would also help push some of the costs associated with PV variability back onto the system owners who are the ones benefiting from their production.
 

ggunn

PE (Electrical), NABCEP certified
Location
Austin, TX, USA
Occupation
Consulting Electrical Engineer - Photovoltaic Systems
#3, due to Vd (voltage drop) in the middle of the PV system's temporal power curve.

If it were #1 or #2 the time of day would not matter and #4 is a throwaway answer.

Yes, I know #1 or #2 could be the explanation, but the question asks which is more likely.
 
My understanding is that, broadly speaking, under IEEE standards this is within +10% and -12% of nominal.


So in theory you could have a 240V service where the utility provides 264V consistently, and you'd be unable to install any interactive inverters, as their voltage window is also limited to 264V?

I guess that problem is not easy to get rid of in theory, as even if the utility standard were revised to +8% / -12%, enough DG exporting simultaneously could push the grid voltage up by more than 2%. Although perhaps in practice that would usually be enough margin, and it would be hard to get enough DG to exceed 2% grid voltage rise?

Any pointers to a good case study/explanation of Hawaii's situation and rules? They have this problem and stricter rules in place to deal with it, right?

Thanks, Wayne
"Voltage is all over the place" because from the utility / ISO perspective, that's what is acceptable and it helps make the grid work. There was a study out of Texas which said that the mere act of trying to make the grid "work better" made the grid work worse. I don't know if Texas stopped trying to use the line frequency as a time base, but getting rid of that lunacy would help grid stability.



I've been following (or doing, or writing about) residential solar for about 15 years and I stand by that statement. Any region with more DER than the largest single generator, which is what operating procedures are based on, needs to require that all DER be aggregated, subjected to day-ahead forecasting, and appropriately dispatched.

What I saw when I was leaving the residential PV world to work with utility-scale PV, was wealthy homeowners maximizing their production in order to treat the grid as an infinite battery, with no ability to stabilize their own output. Even with the changes from the IEEE (UL is nice, but IEEE is who knows better what needs doing), DER is destabilizing in the current Wild West environment.
So I have an update from the utility regarding the high/variable voltage for the PV system I have mentioned in this thread. But first, there seems to be some confusion about the voltage the utility is supposed to provide. It seems +/- 5% is what is typically required; I believe this is in ANSI C84.1. This is more in line with what the lineman who showed up to look said, which was that it should be 114-123. I know that is a bit off from +/- 5%; either he was a little off or it's a company policy? Anyway this seems to prove (not that I was ever in doubt) that this is a "POCO problem", not a "PV problem".

So anyway, we had put in a request for a high voltage situation several weeks ago, and the POCO said they would send someone out that day as they consider that "an emergency". No one ever showed up, then turkey day happened and what not, so a few days ago we started making phone calls again. After my partner on this and I each spent a half hour on the phone, we finally got transferred to the right person and they sent someone right over. I guess this person wasn't a lineman, just a "van jockey", and he ended up pulling the meter and driving away! An hour later the lineman showed up and blasted the van jockey for leaving the people without power over a 252 volt reading. But he did say something is definitely amiss and the voltage shouldn't be that high, and perhaps it's a bad or maladjusted regulator. He also said (as I suspected) that they are toward the end of a long 4800 volt line (I thought it was 4800 delta, but the lineman did say 4800 wye; if that's the case it would be unigrounded). He said (understandably) he has to put it through engineering and can't go tweaking regulators, but that they will assess the line and take care of it. So that is where it stands now.
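For reference, the ANSI C84.1 arithmetic works out as below: a minimal sketch assuming the simple +/-5% reading of Range A from the post (the standard's actual tables have finer distinctions between service and utilization voltage).

```python
# ANSI C84.1 Range A service-voltage limits, taken here as a flat
# +/-5% of nominal per the discussion above. Consult the standard
# itself for the exact table values.

def ansi_range_a(nominal):
    """Return (low, high) limits for a +/-5% band around nominal."""
    return (nominal * 0.95, nominal * 1.05)

low120, high120 = ansi_range_a(120)
low240, high240 = ansi_range_a(240)
print(f"120 V leg:     {low120:.0f}-{high120:.0f} V")  # 114-126 V
print(f"240 V service: {low240:.0f}-{high240:.0f} V")  # 228-252 V

# The 252 V reading sits right at the Range A ceiling, so anything
# above it is a legitimate utility-side complaint. (The lineman's
# 114-123 figure has a tighter top end -- possibly company practice.)
measured = 252.0
print("Within Range A:", low240 <= measured <= high240)
```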


 

wwhitney

Senior Member
Location
Berkeley, CA
Occupation
Retired
Getting back to the OP, let's say you have an issue with high voltage at the PV inverter, be it from high voltage at the point of utility connection, or voltage rise on the premises wiring system. Is using a buck autotransformer a viable workaround?

Let's say that the voltage range at the PV inverter (if it ignored its voltage window) is only half the width of the allowable voltage window, just shifted too high so that at the high end it's outside the PV inverter's voltage window.

Cheers, Wayne
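A rough sketch of the buck numbers, assuming a standard buck-boost transformer with a 24 V secondary wired as a buck autotransformer on the 240 V circuit (hypothetical figures; an actual selection would need to check the unit's listing and the inverter's sensing point).

```python
# Buck autotransformer sketch for the scenario above: a 24 V buck
# winding on a nominal 240 V circuit scales the voltage by 240/264.

BUCK_RATIO = 240 / 264  # ~0.909 for a 24 V buck on 240 V

def bucked(v_in):
    """Voltage at the inverter after the buck autotransformer."""
    return v_in * BUCK_RATIO

# Suppose the service swings between 240 and 264 V:
for v in (240.0, 252.0, 264.0):
    print(f"service {v:.0f} V -> inverter sees {bucked(v):.1f} V")

# The 264 V top end bucks to 240 V, comfortably inside a typical
# 211-264 V inverter window; the 240 V low end bucks to about 218 V,
# which still clears a -12% floor (211.2 V) -- matching the premise
# that the service range is narrower than the window, just shifted high.
```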
 
Getting back to the OP, let's say you have an issue with high voltage at the PV inverter, be it from high voltage at the point of utility connection, or voltage rise on the premises wiring system. Is using a buck autotransformer a viable workaround?
I don't see why that wouldn't work. However, first I might try to change the inverter's voltage window. I have not had to do that yet, but from what I hear, you need the assistance of the inverter manufacturer to make such changes, and IIRC you/they need POCO approval too.
 