Utility voltage drop
What would be the maximum percent voltage drop that would be acceptable on a 208 V 3-phase motor? Would a drop of 10% from the utility be too great? Thanks
Your question needs clarification.
Suppose we consider the utility transformer that supplies your building with 208 V.
Assume the primary supply to this transformer is very stiff. This means virtually no voltage change occurs on the primary side from load changes you make on the secondary side, even when you fully load the transformer. As an example, less than 0.1% primary change from a 100% load change.
Assume that at the secondary terminals of a typical transformer the voltage changes 3% for the 100% load change. Now progress to the main panel, and assume the change there is 5% for the same 100% load change.
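To make those regulation figures concrete, here is a minimal Python sketch; the 3% and 5% figures are the illustrative assumptions above, not measured values:

```python
# Loaded secondary voltage from an assumed full-load regulation figure.
# Assumes the drop scales linearly with load fraction.

NOMINAL_V = 208.0   # nominal secondary voltage
XFMR_DROP = 0.03    # assumed 3% change at the transformer terminals, full load
PANEL_DROP = 0.05   # assumed 5% change at the main panel, full load

def loaded_voltage(nominal, drop_fraction, load_fraction=1.0):
    """Voltage under load for a given full-load drop fraction."""
    return nominal * (1.0 - drop_fraction * load_fraction)

print(loaded_voltage(NOMINAL_V, XFMR_DROP))    # ~201.8 V at the transformer
print(loaded_voltage(NOMINAL_V, PANEL_DROP))   # ~197.6 V at the main panel
```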
Internal to the transformer there is an impedance, as viewed looking into the secondary terminals, that consists primarily of the winding resistance (the source of the copper losses) and the equivalent-circuit series inductance resulting from leakage flux.
The wires from the transformer to the main panel can be considered mostly resistive. But if you put a specific load current on this source and measure the voltage change at the main panel, then all significant resistive and inductive components of the source together determine that change.
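To illustrate why both the resistive and inductive parts matter, here is a hedged Python sketch with an assumed R + jX source impedance; the specific ohm and voltage values are made up for illustration:

```python
import cmath
import math

V_SOURCE = 120.0                  # assumed open-circuit source voltage
Z_SOURCE = complex(0.05, 0.03)    # assumed R + jX source impedance, ohms

def terminal_voltage(i_mag, pf):
    """Terminal voltage magnitude for a load current of i_mag amps at
    lagging power factor pf, fed through Z_SOURCE."""
    phi = -math.acos(pf)                 # lagging current angle, radians
    i = cmath.rect(i_mag, phi)           # load current as a phasor
    return abs(V_SOURCE - i * Z_SOURCE)  # phasor subtraction of the drop

print(terminal_voltage(12.0, 1.0))  # resistive load
print(terminal_voltage(12.0, 0.8))  # same 12 A at 0.8 PF lagging
```

Note that the same 12 A causes a larger voltage change at a lagging power factor, which is why the phase of the load current matters.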
From the main panel there usually are a number of different branch circuits. The impedance of a branch circuit from the main panel to a particular load point is probably larger than the source impedance as viewed from the main panel. The reason is that most branch circuits have some reasonable length of wire and are not designed for the full current capability of the source.
Suppose your motor load produces a load current that is 10% of the 100% load rating of the source. Since the source impedance at the main panel results in a 5% voltage drop at 100% load, you can expect about a 0.5% change for a 10% load (your motor) at the main panel. The phase of the load current does affect the change in voltage, but for approximation purposes it is easiest to ignore this.
Likely the biggest part of the voltage change comes from your branch-circuit impedance from the main panel to the load. Here is an example: my home service is a nominal 200 A 120-0-120 source. A 12 A change (1500 W heater) produces about a 0.6 V change close to the main panel, without getting inside the panel for measurements. A very approximate source impedance is 0.6/12 = 0.05 ohms. At a bench about 60 ft of wiring away, with several plugs in series and five circuit breakers, the voltage change is 5.2 V, for a total impedance of approximately 5.2/12 = 0.43 ohms. If the wire were #12 (which it is not), the calculated resistance would be about 60*2*1.6/1000 = 0.192 ohms. Approximately 0.43 - 0.05 = 0.38 ohms of the drop is from the impedance of the branch circuit to the point of measurement. If I move back along the branch circuit a ways, eliminating two breakers and one plug, the voltage change is 2.8 V, or 0.23 ohms to this point. The branch wire from the main panel to the bench area is larger than #12, so much of the drop is from plugs and circuit breakers.
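The measurement arithmetic above can be sketched in a few lines of Python; the inputs are the home-service measurements from the example:

```python
def impedance_from_step(delta_v, delta_i):
    """Approximate impedance magnitude as delta-V / delta-I, in ohms."""
    return delta_v / delta_i

LOAD_STEP_A = 12.0   # 1500 W heater on 120 V

z_panel = impedance_from_step(0.6, LOAD_STEP_A)   # close to the main panel
z_bench = impedance_from_step(5.2, LOAD_STEP_A)   # at the bench outlet
z_branch = z_bench - z_panel                      # branch circuit alone

print(round(z_panel, 2), round(z_bench, 2), round(z_branch, 2))
```

Most of the roughly 0.43 ohms seen at the bench belongs to the branch circuit rather than the source.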
The above is by way of illustrating some measurements you could make. These measurements are more important to your question than voltage drop is. Note: I did not describe the above measurements as voltage drop, but rather as voltage change. The two could be the same in a purely resistive or purely inductive circuit, but in an actual circuit they probably are not, though perhaps with only a small difference between them.
Suppose your motor works well over a range of +/-10% of its nominal voltage rating. Next assume your supply, without the motor load, is 10% below the nominal motor rating. Then you connect the motor and the supply voltage drops another 10%. You are likely in trouble. Consider the opposite extreme, where the source voltage is 10% higher than your motor nominal. Now you might tolerate a 20% voltage drop with the motor connected and be OK.
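Here is a small sketch of that voltage-budget argument, with the drops treated additively as fractions of nominal (a simplifying assumption; an exact calculation would compound them):

```python
NOMINAL_V = 208.0
TOLERANCE = 0.10   # assumed +/-10% motor voltage tolerance

def running_voltage(source_offset, running_drop):
    """Motor voltage, given the unloaded supply offset from nominal and the
    additional drop once the motor is connected, both as fractions of nominal."""
    return NOMINAL_V * (1.0 + source_offset - running_drop)

low_limit = NOMINAL_V * (1.0 - TOLERANCE)        # 187.2 V floor

print(running_voltage(-0.10, 0.10), low_limit)   # 166.4 V: well below, trouble
print(running_voltage(+0.10, 0.20), low_limit)   # 187.2 V: right at the edge
```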
Truthfully, the 20% drop would cause real problems at motor start-up, where you might actually hit a 60% or greater drop; the motor would never reach speed and would quite likely just stall.
A line drop of 10% at steady-state load is probably excessive. I think your real problem is at motor start-up.
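The start-up worry can be sketched numerically. The 6x locked-rotor multiplier below is a typical assumed value for induction motors, not something from the question:

```python
RUNNING_DROP_PCT = 10.0    # the steady-state line drop being discussed
LOCKED_ROTOR_MULT = 6.0    # assumed inrush multiple of running current

# Linear extrapolation; a real circuit saturates well before 100%, but it
# shows why a 10% running drop spells start-up trouble.
startup_drop_pct = RUNNING_DROP_PCT * LOCKED_ROTOR_MULT
print(startup_drop_pct)    # 60.0
```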
An example from rough memory: I have a DeWalt radial arm saw. On 120 V at the end of about 100 ft of #12 branch-circuit wire, this motor draws about 80 A for maybe 5 seconds to reach speed. This causes a very large drop in voltage at the motor. However, once up to speed there is no major problem using it as a saw. If this were close to the main panel, start-up would be very quick. Running the saw under these conditions is very marginal. Under normal load the drop is about 3 to 4%.
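Roughly checking those saw numbers: the sketch below assumes the 1.6 ohms per 1000 ft figure for #12 copper used earlier, along with the 100 ft and 80 A values recalled above:

```python
OHMS_PER_1000FT = 1.6   # approximate #12 copper resistance, from the text
RUN_FT = 100.0          # one-way branch length
START_A = 80.0          # start-up current

r_loop = 2 * RUN_FT * OHMS_PER_1000FT / 1000.0   # out-and-back resistance, ohms
drop_v = START_A * r_loop                        # drop at start-up, volts
print(r_loop, drop_v, 100.0 * drop_v / 120.0)    # ~0.32 ohm, ~25.6 V, ~21%
```

On a 120 V circuit that is roughly a 21% drop at start-up, consistent with the very large dip described.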