I'm trying to learn how voltage drop is calculated for a panel. There is a lot of information on how it is calculated from a panel to an end device, and that is relatively trivial. However, I haven't found much on how it is calculated, say, from switchgear to a remote panel.
So, for example, let's say I have a 200A 480/277V panel roughly 750' from the switchgear that feeds it. I would like to keep the voltage drop across this feeder under 2% so that the remaining 3% allowance is available for the branch circuits to my end devices.
Hypothetical Loads:
A Phase Load: 19,088 VA
B Phase Load: 19,088 VA
C Phase Load: 19,088 VA
How would I go about calculating the size of my feeder in order to keep the drop under 2%? I've sketched out my own attempt below.
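For context, here is a rough sketch (in Python) of the K-factor approach I think applies to these numbers, so you can see where I'm at. The K ≈ 12.9 Ω·cmil/ft value for copper and the 2/0 AWG candidate check are my own assumptions, not something I've verified against a code table.

```python
import math

# Assumed approximation: balanced three-phase voltage drop (line-to-line)
#   VD ≈ sqrt(3) * K * I * L / CM
# with K ≈ 12.9 ohm·cmil/ft for copper (a common rule-of-thumb value).

V_LL = 480.0          # line-to-line voltage (V)
V_LN = 277.0          # line-to-neutral voltage (V)
length_ft = 750.0     # one-way feeder length (ft)
max_drop_pct = 2.0    # target feeder drop (%)
K_COPPER = 12.9       # ohm·cmil/ft (assumed rule-of-thumb for Cu at 75 °C)

# Balanced load: 19,088 VA per phase
va_per_phase = 19088.0
current_a = va_per_phase / V_LN                 # ≈ 68.9 A per phase

allowed_drop_v = V_LL * max_drop_pct / 100.0    # 9.6 V at 2%

# Minimum circular-mil area needed to stay under the target drop
min_cmil = math.sqrt(3) * K_COPPER * current_a * length_ft / allowed_drop_v
print(f"Load current:      {current_a:.1f} A")
print(f"Allowed drop:      {allowed_drop_v:.1f} V")
print(f"Minimum conductor: {min_cmil:,.0f} cmil")

# Check a candidate size (2/0 AWG copper = 133,100 cmil -- my assumption,
# just to illustrate the check; ampacity for the 200A panel is a separate step).
candidate_cmil = 133_100.0
vd = math.sqrt(3) * K_COPPER * current_a * length_ft / candidate_cmil
print(f"Drop with 2/0 Cu:  {vd:.1f} V ({vd / V_LL * 100:.2f} %)")
```

Running that gives roughly 120,000 cmil as the minimum, which is why I'm looking at 2/0 copper for the drop alone, but I'd like confirmation that this is the right way to size the feeder (and that ampacity for the 200A panel then gets handled separately).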
Thank you in advance. Any/All help is appreciated.