mbrooke
OK, I apologize if this is not the right place, as the question is purely theoretical and not job related. However, I was asked this question by an EE student and I don't know how to answer it. Apologies ahead of time if it's not well worded.
Picture a DC circuit first. 12 gauge THHN, 100 volts, 15 amperes.
At 50 feet the voltage drop is 2.38% resulting in 97.62 volts at the load.
Then consider 2000 feet, where the voltage drop is 95.3%, leaving 4.7 volts at the load.
Same wire, same 15 amps, same 2000 feet, now at 10,000 volts:
The voltage drop is 0.95%, leaving 9,904.7 volts at the load.
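For what it's worth, here's a quick sketch of my own that reproduces those numbers, assuming roughly 1.588 ohms per 1,000 feet for 12 AWG copper and counting both conductors of the circuit (those are my assumptions, not figures from any particular calculator):

```python
# Quick check of the DC figures above.
# Assumed: ~1.588 ohms per 1,000 ft for 12 AWG copper, and a two-wire
# circuit, so the current sees twice the one-way run length.

R_PER_KFT = 1.588  # ohms per 1,000 ft (assumed value for 12 AWG)

def dc_drop(source_volts, amps, one_way_feet):
    r_wire = R_PER_KFT * (2 * one_way_feet) / 1000.0   # out-and-back resistance
    drop_volts = amps * r_wire
    return drop_volts, 100.0 * drop_volts / source_volts, source_volts - drop_volts

for volts, feet in [(100, 50), (100, 2000), (10_000, 2000)]:
    drop, pct, load = dc_drop(volts, 15, feet)
    print(f"{volts:>6} V, {feet:>4} ft: drop {drop:5.1f} V ({pct:5.2f} %), load {load:8.1f} V")
```

Running that gives about 2.38%, 95.3%, and 0.95%, matching the numbers above.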
So the percentage voltage drop depends heavily on the source voltage. The source is at 100% of its voltage in both cases, but the percentage of that voltage reaching the load varies tremendously with the source magnitude: 4.7% at 100 volts versus 99.05% at 10,000 volts. The drop in volts (95.3 V) is the same either way, since the current and wire resistance haven't changed. I can understand this and it makes sense for a DC circuit. I can also conclude that for any given voltage, the longer the run, the greater the percentage drop.
Now picture an AC circuit. At any point in time, the instantaneous voltage (a snapshot frozen in time) varies in magnitude relative to other points on the sine wave. In essence, a 2000 foot circuit should behave like the 100 volt DC version near the bottom of the sine wave, giving a 95.3% drop, yet perform like a 10,000 volt circuit at the 10,000 volt peak, giving a 0.95% drop. The RMS voltage at the source would be roughly 7,071 volts.
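Here's the same kind of sketch for that instantaneous view, keeping the question's simplifying assumption that the current is a fixed 15 amps at every instant (again, my own rough numbers for the wire resistance):

```python
import math

# Instantaneous view of the 2,000 ft AC case, assuming (as above) that the
# current is a fixed 15 A at every instant and the wire is ~6.35 ohms total.

R_WIRE = 1.588 * (2 * 2000) / 1000.0   # ~6.35 ohms, 2,000 ft out and back (assumed)
V_PEAK = 10_000.0                      # 10,000 V peak, roughly 7,071 V RMS
AMPS = 15.0                            # fixed current, per the question's framing

for deg in (1, 10, 45, 90):
    v_source = V_PEAK * math.sin(math.radians(deg))
    drop = AMPS * R_WIRE               # same ~95.3 V at every instant if the current is fixed
    pct = 100.0 * drop / v_source
    print(f"{deg:>2} deg: source {v_source:8.1f} V, drop {drop:5.1f} V "
          f"= {pct:6.2f} % of the instantaneous voltage")
```

Near the zero crossing the fixed-current drop is a huge fraction of the instantaneous voltage, while at the peak it is under 1%, which is the discrepancy I'm asking about.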
Now, wouldn't the difference in % drop cause a different sine wave to appear at the load when compared to the source?