110 or 120

Status
Not open for further replies.

tom baker

First Chief Moderator
Staff member
I just finished up reading AC/DC, The War of the Currents, a history of the AC vs DC or Westinghouse vs Edison. Edison had selected 110V as it resulted in the best combination between voltage drop and lamp life.

When did we change from 110 to 120?
Old timers always refer to 110 or 220. I had some WWII motors that were 220/440....
 

templdl

Senior Member
Location
Wisconsin
tom baker said:
"I just finished up reading AC/DC, The War of the Currents, a history of the AC vs DC or Westinghouse vs Edison. Edison had selected 110V as it resulted in the best combination between voltage drop and lamp life.

When did we change from 110 to 120? Old timers always refer to 110 or 220. I had some WWII motors that were 220/440...."

Might you be comparing Edison's DC voltage with Westinghouse's AC voltage?

I think a common single-phase distribution voltage was 2400V, which later evolved into 4160Y/2400. That made 4160 a somewhat odd voltage, the result of adding three-phase to an existing 2400V single-phase system.
I'm not sure where three-phase 480V came from, but if 240V is a nominal single-phase voltage, then 480V is twice that and 120V is half of 240.
The 208Y voltage probably evolved the same way 4160Y did: 120V was an existing voltage and a three-phase distribution system was needed for it, so 208Y/120 parallels 4160Y/2400, where both 120V and 2400V were existing single-phase voltages.
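For reference, the wye voltages above follow from the square-root-of-3 factor between line-to-neutral and line-to-line voltage in a three-phase Y system. A quick sketch (the function name is my own):

```python
import math

def wye_line_to_line(line_to_neutral: float) -> float:
    """In a three-phase wye (Y) system, the line-to-line voltage is
    sqrt(3) times the line-to-neutral voltage."""
    return math.sqrt(3) * line_to_neutral

# 208Y/120: 120 V line-to-neutral gives ~208 V line-to-line
print(round(wye_line_to_line(120)))   # 208

# 4160Y/2400: 2400 V line-to-neutral gives ~4157 V line-to-line,
# which is carried as the 4160 nominal
print(round(wye_line_to_line(2400)))  # 4157
```

So 208 and 4160 both fall out of grafting a three-phase wye onto a pre-existing single-phase voltage, as described above.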

As such, 120V is considered the nominal voltage, which is often referred to as 110V, 115V, etc. On most existing distribution systems today, if you actually had 110V it would probably be considered low. Likewise, 240V as a nominal voltage is often referred to as 220V or sometimes 230V. It is my understanding that the voltages equipment is designed for are based on the anticipated actual voltage, which is very often less than the nominal voltage, for optimal performance.

And where you pointed out that "Edison had selected 110V as it resulted in the best combination between voltage drop and lamp life" seems a bit strange to me, as I wonder what led him to that conclusion. One would have to wonder about the consistency of manufacturing the light bulb and filament, along with the wire and its related voltage drop. It would seem to me that each was so variable that it would have been difficult to come to any conclusion with any consistency.

I have a feeling that the thread that you started may go on for some time as everyone adds their thoughts.
 

hurk27

Senior Member
Found this:

History of voltage and frequency
Voltage & frequency around the world


The system of three-phase alternating current electrical generation, transmission, and distribution was developed in the 19th century by Nikola Tesla, George Westinghouse and others. Thomas Edison developed direct-current (DC) systems at 110 V and this was claimed to be safer in the battles between proponents of AC and DC supply systems (the War of Currents). Edison chose 110 volts to make high-resistance carbon filament lamps both practical and economically competitive with gas lighting. While higher voltages would reduce the current required for a given quantity of lamps, the filaments would become increasingly fragile and short-lived. Edison selected 100 volts for the lamp as a compromise between distribution costs and lamp costs. Generation was maintained at 110 volts to allow for a voltage drop between generator and lamp.

In the 1880s only carbon-filament incandescent lamps were available, designed for a voltage of around 100 volts. Later metal filament lamps became feasible. In 1899, the Berliner Elektrizitäts-Werke (BEW), a Berlin electrical utility, decided to greatly increase its distribution capacity by switching to 220 volt nominal distribution, taking advantage of the higher voltage capability of metal filament lamps. The company was able to offset the cost of converting the customer's equipment by the resulting saving in distribution conductors cost. This became the model for electrical distribution in Germany and the rest of Europe and the 220-volt system became common. North American practice remained with voltages near 110 volts for lamps.[7]

In 1883 Edison patented a three wire distribution system to allow DC generation plants to serve a wider radius of customers. This saved on copper costs since lamps were connected in series on a 220 volt system, with a neutral conductor connected between to carry any unbalance between the two sub-circuits. This was later adapted to AC circuits. Most lighting and small appliances ran on 120 V, while big appliances could be connected to 240 V. This system saved copper and was backward-compatible with existing appliances. Also, the original plugs could be used with the revised system.

Main article: Utility frequency
At the end of the 19th century, Westinghouse in the US decided on 60 Hz and AEG in Germany decided on 50 Hz, eventually leading to the world being mostly divided into two frequency camps. Most 60 Hz systems are nominally 120 volts and most 50 Hz systems nominally 230 volts.

From here:
Mains electricity

The above web site (wikipedia.org) has an open sharing license:
You are free to:
Read and Print our articles and other media free of charge.
Share and Reuse our articles and other media under free and open licenses.
Contribute To and Edit our various sites or Projects.
 