Skelufteay
Member
- Location: Denver, Colorado
I have been encountering many facilities lately that use a demand-charge-based method to determine the cost of electricity. I am trying to find the best way to lower our customers' utility bills by installing solar and/or additional technologies to trim the demand. This makes for an interesting ROI calculation, since you cannot just take the production (kWh) times their rate and get a rough idea of monthly savings. So I am looking for some assistance and ideas... let's work through an example for those who are example-minded:
Company A is billed on a flat rate (yearly average numbers):
Billed $0.11/kWh. Uses 50,000 kWh per month. Their utility bill therefore costs them $5,500/mo (not considering all those other fees they add in there).
Say solar generates 20,000 kWh per month. They would effectively save $2,200/mo, for a new monthly bill of $3,300.
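The flat-rate case is a straight multiplication. A quick sketch in Python, using the numbers from the example above (the variable names are mine):

```python
# Flat-rate case (Company A): every kWh offset by solar saves the full rate.
RATE_PER_KWH = 0.11        # $/kWh
MONTHLY_USAGE_KWH = 50_000
SOLAR_KWH = 20_000

bill_before = MONTHLY_USAGE_KWH * RATE_PER_KWH
savings = SOLAR_KWH * RATE_PER_KWH
bill_after = bill_before - savings

print(f"Bill before solar: ${bill_before:,.2f}")  # $5,500.00
print(f"Monthly savings:   ${savings:,.2f}")      # $2,200.00
print(f"Bill after solar:  ${bill_after:,.2f}")   # $3,300.00
```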
Company B is billed on a demand-based rate (yearly average numbers):
Billed $0.024/kWh and $20.58/kW. They too use 50,000 kWh, and their peak demand for the month is 151 kW (billed demand is the highest 15-minute interval of the month). That would mean their utility bill is $1,200 + $3,107.58 = $4,307.58.
Solar generates the same 20,000 kWh but only saves Company B $480/mo in usage (kWh) charges.
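The demand-based case splits the bill into an energy charge and a demand charge, and solar only touches the first. A sketch with the same numbers (variable names are mine):

```python
# Demand-based case (Company B): bill = energy charge + demand charge.
ENERGY_RATE = 0.024   # $/kWh
DEMAND_RATE = 20.58   # $/kW
USAGE_KWH = 50_000
PEAK_KW = 151
SOLAR_KWH = 20_000

energy_charge = USAGE_KWH * ENERGY_RATE   # $1,200.00
demand_charge = PEAK_KW * DEMAND_RATE     # $3,107.58
bill = energy_charge + demand_charge      # $4,307.58

# Solar offsets kWh only; the demand charge is untouched if the peak survives.
solar_savings = SOLAR_KWH * ENERGY_RATE   # $480.00
print(f"Bill: ${bill:,.2f}; solar saves only ${solar_savings:,.2f}/mo")
```

Note that the demand charge is nearly three-quarters of this bill, which is why trimming the peak matters more here than offsetting kWh.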
So how can Company B benefit more from solar? Or is solar not an attractive option for companies on this billing structure?
So far I have gathered that if Company B hits that 151 kW during a single interval on a single day of the month, they are still charged at that peak even if their normal usage is down around 50 kW. I also know that PV can reduce the draw from the utility and trim this peak, but if that peak happens to fall on a cloudy day, then they don't 'save' anything on the demand rate, because that one interval sets it for the rest of the month. It has been suggested that I look into batteries charged from the grid at night (when demand is low), or soft-starts on every motor we can find, but I am sure there are more ... perhaps better ... ideas out there.
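To put a rough number on the battery idea, here is a toy peak-shaving sketch. The load profile, the 60 kW setpoint, and the assumption that the battery is always charged in time are all mine, not from any real facility; it only illustrates how a small battery covering one short spike can knock down the billed peak:

```python
# Toy peak-shaving sketch (assumed numbers): one day of 15-minute interval
# loads, flat at 50 kW except a single interval spiking to 151 kW.
DEMAND_RATE = 20.58  # $/kW, from the example above

intervals_kw = [50] * 47 + [151] + [50] * 48  # 96 intervals = 24 hours

target_kw = 60  # assumed peak-shaving setpoint
# Energy the battery must discharge to hold every interval at the target
# (each 15-minute interval is 0.25 h, so kW * 0.25 = kWh).
battery_kwh = sum(max(kw - target_kw, 0) * 0.25 for kw in intervals_kw)

shaved_peak = max(min(kw, target_kw) for kw in intervals_kw)
demand_savings = (max(intervals_kw) - shaved_peak) * DEMAND_RATE

print(f"Battery discharge needed: {battery_kwh:.2f} kWh")  # 22.75 kWh
print(f"Demand-charge savings:    ${demand_savings:,.2f}/mo")
```

Under these assumptions, a battery covering roughly 23 kWh of discharge would cut the billed peak from 151 kW to 60 kW, worth about $1,873/mo at $20.58/kW, far more than the $480/mo the same customer saves on kWh. The catch is exactly the cloudy-day problem above: the battery must actually be charged and dispatched during every would-be peak interval, or one missed spike sets the demand charge for the whole month.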