"Wire Losses"

Status
Not open for further replies.

sculler

Member
Several years ago a wire manufacturer released a paper to justify using #10 wire vs. #12 based on energy savings over the life of a building due to less energy lost to wire resistance. Does anyone know where I might find such a paper, or have a ready formula to show losses just from wire resistance (not voltage drop) when running long feeders?
 
Since the price of copper has risen more quickly than the price of energy, my gut makes me think that wire would have to be in service for a hundred years.
 
I second the pointer to copper.org.

Their goal is to sell copper, and they have lots of articles saying why you should use more, and in particular saying that you should spend more on copper up front to save money long term.

Take their articles with a _large_ grain of salt (they have an agenda and are not afraid to use it!), but for applications that run a large fraction of the time at near full load, paying a premium for more efficient energy use can have a pretty rapid (1-2 year) payback.

Take a 10 hp motor that runs 75% of the time. It will burn about 70,000 kWh in a year. A 2% difference in efficiency would mean about 1,500 kWh of electricity, or about $75-$150 per year depending upon rates. If a higher-efficiency motor costs $200 more, then it could quickly be worth it to pay the premium.

-Jon
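Jon's back-of-envelope payback can be sketched numerically. Everything here is an assumption for illustration: 1 hp = 746 W, a 75% duty cycle, efficiency figures of 88% vs. 90%, and a $0.05-$0.10/kWh rate range; none of these come from the thread except the duty cycle and the $200 premium.

```python
# Rough payback sketch for the 10 hp motor example. All figures are
# illustrative assumptions, not from any standard or catalog.
HP_TO_KW = 0.746
HOURS_PER_YEAR = 8760

def annual_kwh(hp, duty_cycle, efficiency):
    """Electrical input energy per year for a motor at a given duty cycle."""
    shaft_kw = hp * HP_TO_KW
    input_kw = shaft_kw / efficiency   # input power exceeds shaft power
    return input_kw * HOURS_PER_YEAR * duty_cycle

base = annual_kwh(10, 0.75, 0.88)      # assumed standard-efficiency motor
better = annual_kwh(10, 0.75, 0.90)    # ~2 points higher efficiency
saved_kwh = base - better

for rate in (0.05, 0.10):              # assumed $/kWh range
    savings = saved_kwh * rate
    print(f"At ${rate:.2f}/kWh: ${savings:.0f}/yr saved, "
          f"payback on a $200 premium ~ {200 / savings:.1f} yr")
```

With these assumed numbers the payback lands in the 1.5-3 year range; actual results hinge entirely on the real duty cycle and utility rate.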
 
I have an old book that describes the savings of using #12 conductors over #14 in a dwelling installation. Maybe the writer worked for the copper people.:rolleyes:
 
It was either HUD or the USDA that examined this issue as part of an affordable housing study. They determined that in a dwelling, the upsizing to #12 when #14 was otherwise permitted was not cost effective. Wish I could find that paper. The government sites are notoriously hard to search.
 
It is a simple voltage drop calculation. Calculate the voltage drop for both wire sizes, find the difference, and multiply this difference by the current to find the watts. Use that to calculate the savings in energy. With loads that run almost all the time, you can get paybacks of less than a year. It would be unlikely to find any loads in a dwelling unit, or even in most commercial occupancies, that will have much of a payback. It is very common to get a quick payback for industrial equipment that runs 24/7.
You can also skip the voltage drop calculation and do an I^2R calc. Just square the current and multiply by the resistance of the wire to get the watts.
Don
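Don's I^2R shortcut is easy to put into numbers. The example figures below are assumptions chosen for illustration: a 20 A continuous load on a 150 ft one-way run, comparing #12 and #10 copper using approximate stranded-conductor DC resistances per 1000 ft.

```python
# I^2 * R loss comparison for two wire sizes. Resistances are
# approximate DC ohms per 1000 ft for stranded uncoated copper.
OHMS_PER_KFT = {"12": 1.98, "10": 1.24}

def wire_watts(awg, amps, one_way_ft):
    """Heat dissipated in both conductors of a 2-wire circuit."""
    r = OHMS_PER_KFT[awg] * (2 * one_way_ft) / 1000.0  # out and back
    return amps ** 2 * r

loss12 = wire_watts("12", 20, 150)   # assumed 20 A, 150 ft one way
loss10 = wire_watts("10", 20, 150)
diff_w = loss12 - loss10

kwh_saved = diff_w * 8760 / 1000     # running 24/7, Don's best case
print(f"#12 loses {loss12:.0f} W, #10 loses {loss10:.0f} W")
print(f"Running 24/7, #10 saves about {kwh_saved:.0f} kWh/yr")
```

At roughly 90 W of difference, a 24/7 load saves on the order of 800 kWh/yr; a circuit loaded a few hours a day saves a small fraction of that, which is Don's point about dwellings.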
 
It's also worth considering what has happened, and continues to happen, to copper pricing; what might have made sense a decade ago might now be considered farcical. Without doing the math, I suspect that these days optimizing copper use is the best way to a payback on anything other than a continuous load.
 
It's not really that simple. Using a larger wire size will only save money if it means that the load will need to run less to achieve the desired outcome. For example, if voltage drop gives you a dimmer bulb, you won't leave it on longer to compensate.

However, for electric heat, you would have to leave the heat on longer to obtain a certain amount of heat gain. But, if the heat emitted by the wire contributes to the heat gain, say by being within the heated space, it's not really waste.

The extra resistance of a circuit with more voltage drop will actually cause less current to flow. If the desired results can be obtained in the same amount of time, with a lower overall current, there's no money to be saved by increasing wire size.

Remember, except for motors, electrical equipment is constant impedance, not constant power, so current does not increase when voltage decreases. When the voltage drops, so does the current, decreasing the power delivered, not increasing it.

So, the result of increasing conductor size really depends on the result of decreasing the voltage drop.
 
so current does not increase when voltage decreases.

You are confusing me, Larry. P/E = I, right? For some reason, when I put in a lower E, I goes up! Does this not apply across the board?


Edit -- what about inductive lighting and such? I'm apparently not up to speed on something.
 
brantmacga said:
P/E=I right? for some reason when i put in a lower E, I goes up! this does not apply across the board?

That formula is correct, but you are assuming power stays constant, and it is not constant when talking about resistive loads. You have to use some of the other 11 formulas in the Ohm's law wheel to figure things out, like E/R = I. Add more resistance to a circuit and you get less current (I). If you have less current, what happens to power in the equation E*I = P?
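The constant-impedance point is quick to verify numerically. The figures below are illustrative assumptions: a 120 V source feeding a 10-ohm resistive load (roughly a 1440 W heater at full voltage), with increasing amounts of wire resistance added in series.

```python
# Model a fixed resistive load plus wire resistance, and watch
# current and load power *fall* as wire resistance is added.
V_SOURCE = 120.0
R_LOAD = 10.0   # assumed: ~1440 W resistive heater at full voltage

for r_wire in (0.0, 0.5, 1.0):
    i = V_SOURCE / (R_LOAD + r_wire)   # E/R = I, using total impedance
    p_load = i ** 2 * R_LOAD           # power actually in the load
    p_wire = i ** 2 * r_wire           # power lost in the wire
    print(f"R_wire={r_wire:.1f} ohm: I={i:.2f} A, "
          f"load {p_load:.0f} W, wire {p_wire:.0f} W")
```

Adding resistance drops the current from 12 A toward 10.9 A, so both the load power and the total power fall; nothing in the circuit draws more current to compensate.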
 
when talking about resistive loads.

But that's only for resistive loads, right? I thought about that, which is why I made the edit asking about inductive lighting. Larry said "except for motors." I didn't mean to sound like I was questioning Larry, either; I was just confused for a moment.
 
Not a problem, and it points out exactly how theory and the real world differ. As Dereck and I mentioned, you must remember which electrical parameter(s) remain constant and which ones vary with conditions.

The supply voltage at the service is considered constant (well, almost), as is the equipment impedance. What varies is total circuit impedance; as a result of that, circuit current varies; as a result of that, power varies.
 
Larry makes some excellent points (as usual). Nothing is as simple as it first appears. Attempts to minimize voltage drop to save energy are good, but as Larry points out, no real energy savings will result in many applications.

Also remember that MOST residential circuits are very lightly loaded most of the time. With no load, there is no voltage drop. For example, my bathroom receptacle is heavily loaded for the five minutes or so that my wife is using her industrial-strength hair dryer (closely related to the track-drying equipment used by NASCAR). The rest of the time, the only load on the circuit is a night light. Arbitrarily increasing the size of the conductor to reduce the voltage drop during the short time that it is heavily loaded would be false economy.

There are also other ways to reduce voltage drop that would be easier than trying to install #10 wire on the typical terminals of a receptacle. Installing a couple of additional circuits so that each circuit is more lightly loaded would be one example: less load = less voltage drop. Installing a sub-panel with a properly sized feeder to shorten the branch circuits would probably reduce voltage drop as well: shorter conductor length = less voltage drop. Arbitrarily increasing the size of the branch-circuit conductor would be the least desirable way to chase possible savings from reduced voltage drop.
 
Thanks for all the replies. The www.copper.org site was helpful!! I was actually trying to extrapolate the principle of power wasted in the wire to argue for a new service on a distribution center expansion vs. running (3) 400 amp feeders 450 ft. The owner isn't concerned about initial feeder costs; he's concerned about the footprint he'll give up for the transformer, switchboard, etc., so I am trying to show what he will continue to pay in line losses for the life of the building.
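A rough sketch of that line-loss argument, with heavily assumed inputs: 500 kcmil copper at about 0.0258 ohm per 1000 ft, three-phase feeders, an average loading of 250 A per feeder, and $0.08/kWh. The real conductor sizes, load profile, and rate would have to come from the actual design, so treat this only as a template for the calculation.

```python
# Continuous I^2 * R loss for three 400 A, 450 ft three-phase feeders,
# using assumed average loading and conductor resistance.
R_PER_KFT = 0.0258       # assumed: 500 kcmil copper, DC ohms/1000 ft
LENGTH_FT = 450
I_AVG = 250.0            # assumed average line current per feeder
FEEDERS = 3
RATE = 0.08              # assumed $/kWh

r = R_PER_KFT * LENGTH_FT / 1000             # ohms per conductor
watts = FEEDERS * 3 * I_AVG ** 2 * r         # 3 line conductors per feeder
kwh_yr = watts * 8760 / 1000                 # if loaded around the clock
print(f"Loss ~ {watts:.0f} W continuous, {kwh_yr:.0f} kWh/yr, "
      f"${kwh_yr * RATE:.0f}/yr at ${RATE}/kWh")
```

Under these assumptions the feeders dissipate several kilowatts continuously, i.e. thousands of dollars a year, which is the kind of life-of-building number the owner could weigh against the footprint of a new service.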
 
sculler said:
Owner isn't concerned about initial feeder costs; he's concerned about the footprint he'll give up for the transformer, switchboard, etc., so I am trying to show what he will continue to pay in line losses for the life of the building.

Remember that the transformer will have losses, which the customer may or may not pay for (customer owned transformer versus POCO owned, and POCO billing for additional service equipment).

Transformers have losses whenever they are energized, even if feeding zero load; conductors only have losses when supplying a load.

-Jon
 
Larry,
You need to look at power supplied not delivered. The delivered power is less as a result of heat loss.
 
zazmat said:
Larry,
You need to look at power supplied not delivered. The delivered power is less as a result of heat loss.
No argument. However, in order to assign the amount of power lost to heat, as well as that consumed by, or delivered to, the load, you first have to calculate the total kVA supplied. That requires knowing the source voltage and the entire circuit impedance.

The load's rated kVA consumption assumes that the design voltage is actually being delivered to its terminals. The best we can do is calculate the load's impedance from the nameplate data and add it to the entire service-, feeder-, and branch-circuit conductor impedances.

That total impedance, along with the known or measured source voltage, will give us actual current, with which we calculate the voltage across each segment of the circuit, and thus power lost to heat, as well as the power eventually delivered to the load.

What's the point? Yeah, what was the point? Oh, yeah, I remember. The whole point is that voltage drop due to circuit-conductor impedance causes an overall reduction in power, supplied as well as delivered.

Only if a load were upsized to compensate for voltage drop, in an attempt to maintain load power (which would actually increase voltage drop unless the conductors were also upsized), would it be possible to maintain the supplied power.

Remember, most loads are not constant power, which is what would cause current to rise as voltage falls. A buck-boost transformer used in boost mode would be an example of this done by design. You can't get more power out of a system than you put in.
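Larry's procedure (derive load impedance from nameplate, add conductor impedance, then split supplied power into heat and delivered) can be walked through with assumed numbers: a 1500 W resistive load rated at 120 V, and 0.8 ohm of round-trip wire resistance.

```python
# Derive load impedance from assumed nameplate data, add wire
# impedance, and compare power supplied vs. power delivered.
V_SOURCE = 120.0
P_NAMEPLATE = 1500.0                 # assumed rating at design voltage
R_LOAD = V_SOURCE ** 2 / P_NAMEPLATE # 9.6 ohm, from P = E^2 / R

for r_wire in (0.0, 0.8):            # assumed round-trip wire resistance
    i = V_SOURCE / (R_LOAD + r_wire)
    p_supplied = V_SOURCE * i        # what the source puts in
    p_wire = i ** 2 * r_wire         # lost to heat in the conductors
    p_delivered = i ** 2 * R_LOAD    # what reaches the load
    print(f"R_wire={r_wire}: supplied {p_supplied:.0f} W = "
          f"wire {p_wire:.0f} W + load {p_delivered:.0f} W")
```

With the wire in circuit, the supplied power drops from 1500 W to about 1385 W; the conductor heat comes out of a smaller total, it is not added on top of the nameplate power, which is exactly Larry's conclusion.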
 