I always use the connected load amperage to calculate voltage drop for panel feeders. For example, say a panel is rated 100 A and the upstream circuit breaker is also 100 A, but the connected load is only 50 A. I would use 50 A to calculate the voltage drop and confirm it stays under 3% to the panel. One EC asked me what happens if the client adds new load after this project; that would very likely increase the connected load, and the voltage drop could end up greater than 3%.
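As a rough illustration of what I mean, here is a small sketch using the common K-factor voltage drop formula. The conductor size, run length, and system voltage are just assumed example numbers, not actual project values:

```python
# Minimal sketch: feeder voltage drop via the common K-factor formula
# VD = 2 * K * I * L / CM (single-phase) or sqrt(3) * K * I * L / CM (three-phase)
# K ~ 12.9 ohm-cmil/ft for copper. All circuit parameters below are assumptions.
import math

K_COPPER = 12.9  # approximate, copper at 75 C

def voltage_drop(amps, length_ft, cmil, phases=3, k=K_COPPER):
    """Return feeder voltage drop in volts for a given load current."""
    multiplier = math.sqrt(3) if phases == 3 else 2
    return multiplier * k * amps * length_ft / cmil

# Assumed example: 3 AWG Cu (52,620 cmil, 100 A at 75 C), 350 ft run, 480 V 3-phase
system_volts = 480
length_ft = 350
cmil_3awg = 52620

for load in (50, 100):  # connected load vs. breaker/panel rating
    vd = voltage_drop(load, length_ft, cmil_3awg)
    print(f"{load} A: {vd:.1f} V drop = {100 * vd / system_volts:.2f} %")
```

With those assumed numbers, the 50 A connected load comes out around 1.5%, but the same feeder at the full 100 A rating lands just over 3%, which is exactly the situation the EC was pointing at.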
How do you guys calculate voltage drop? Thanks.