I was recently given a project to do the lighting design for a three-floor building. It is a small building, and I determined that 20 fixtures per floor would be sufficient, for a total of 60 fixtures. Each fixture is rated 120 V, 100 W, and there is an existing 120/240 V lighting panel in the building. My electrical calculations are below:
Total wattage for the first floor: 100 W x 20 = 2000 W
Total current = 2000 W / 120 V = 16.67 A
Wire size based on the NEC rule (125% for continuous load): 16.67 A x 1.25 = 20.8 A, which means a 20 A breaker and #12 wire.
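For anyone checking my numbers, here is a small Python sketch of the same per-floor calculation. The 125% factor is the NEC continuous-load multiplier I used above; the variable names are just mine for illustration.

```python
# Per-floor branch-circuit check: 20 fixtures, 100 W each, 120 V
FIXTURES_PER_FLOOR = 20
WATTS_PER_FIXTURE = 100
VOLTAGE = 120

total_watts = FIXTURES_PER_FLOOR * WATTS_PER_FIXTURE   # 2000 W
load_current = total_watts / VOLTAGE                   # ~16.67 A
sized_current = load_current * 1.25                    # ~20.8 A (125% continuous-load factor)

print(f"Total load:          {total_watts} W")
print(f"Load current:        {load_current:.2f} A")
print(f"125% sized current:  {sized_current:.2f} A")
```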
My question is about the voltage drop: how should I calculate it? Should I use the distance to the farthest fixture as the circuit length, or the distance to the middle fixture? The fixtures are evenly spaced, and I am allowed to use only three circuits from the existing panel (one per floor).
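To make the question concrete, here is a minimal sketch that computes the standard single-phase, two-wire voltage drop both ways: once using the full run to the farthest fixture, and once using the distance to the load center (half the run, since the fixtures are evenly spaced). The K value of about 12.9 ohm-cmil/ft for copper and the 6530 cmil area for #12 AWG are standard figures; the 100 ft run length is only a placeholder, not from my actual plans.

```python
# Two-wire voltage drop: VD = 2 * K * I * L / CM
K_COPPER = 12.9          # ohm-circular-mil per foot, copper (standard approximation)
CM_12AWG = 6530          # circular mils, #12 AWG
VOLTAGE = 120
LOAD_CURRENT = 2000 / VOLTAGE   # ~16.67 A, whole-floor load

run_length_ft = 100      # placeholder: one-way distance from panel to farthest fixture

def voltage_drop(current_a, one_way_length_ft):
    """Voltage drop for the given current over the given one-way circuit length."""
    return 2 * K_COPPER * current_a * one_way_length_ft / CM_12AWG

# Option 1: full load current over the full run (farthest fixture, worst case)
vd_far = voltage_drop(LOAD_CURRENT, run_length_ft)

# Option 2: full load current over the distance to the load center
# (half the run, since the 20 fixtures are evenly spaced)
vd_center = voltage_drop(LOAD_CURRENT, run_length_ft / 2)

print(f"VD to farthest fixture: {vd_far:.2f} V ({100 * vd_far / VOLTAGE:.1f} %)")
print(f"VD to load center:      {vd_center:.2f} V ({100 * vd_center / VOLTAGE:.1f} %)")
```

Which of the two lengths is the right one to use for an evenly loaded run like this is exactly what I am asking.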
Comments and suggested modifications are welcome.
Thanks