Does anyone have a history of the "wrong" Voltages


megloff11x

Senior Member
We have the old 110V and 115V instead of 120V. I have 3-phase machines that want 3-phase 220V, or even 230V or 240V, instead of 208V. You call the manufacturer and can't always get a straight answer on whether to run them at 120 or 208, or whether to get a transformer just for them. Some are old, but I still see appliances and machines much younger than me with these voltages stamped on the nameplate. If you send equipment to Europe or Japan, they too have variance from their standard values. And if you buy something made overseas for the US market, anything goes on the nameplate.

Is there a good reference that has a history of where these oddball voltages started and why?

Matt
 
Is there a good reference that has a history of where these oddball voltages started and why?

I do not think they are oddball voltages, just different ways of referring to the same nominal voltages.

Voltage, Nominal. A nominal value assigned to a circuit or system for the purpose of conveniently designating its voltage class (e.g., 120/240 volts, 480Y/277 volts, 600 volts). The actual voltage at which a circuit operates can vary from the nominal within a range that permits satisfactory operation of equipment.

As a practical matter, the NEC has now defined what the "standard" nominal voltages are.

220.5 Calculations.
(A) Voltages. Unless other voltages are specified, for purposes of calculating branch-circuit and feeder loads, nominal system voltages of 120, 120/240, 208Y/120, 240, 347, 480Y/277, 480, 600Y/347, and 600 volts shall be used.
 
Is there a good reference that has a history of where these oddball voltages started and why?

These are not oddball voltages; they are just old.

First you need to remember that we have nominal supply voltages; today they are 120V, 208V, 240V, 480V, and 600V.
And we have nominal utilization voltages (what equipment wants); today these are 115V, 200V, 230V, 460V, and 575V.

Note that the utilization voltages are a fixed percentage (roughly 96%) of the corresponding nominal supply voltage; see the sketch below.

There are also tolerances, the most commonly quoted being +/-10% of rated.
Then we have maximum voltage ratings; the most common seem to be 125V, 250V, and 600V.
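
A quick illustrative sketch (mine, not from any standard) of how the two sets of numbers relate, using the values listed above:

```python
# Illustrative only: relate each nominal supply voltage to its nominal
# utilization voltage and show the commonly quoted +/-10% tolerance band.
# The voltage pairs are the ones listed in the post above.

SUPPLY_TO_UTILIZATION = {120: 115, 208: 200, 240: 230, 480: 460, 600: 575}
TOLERANCE = 0.10  # the commonly quoted +/-10% of rated

for supply, util in SUPPLY_TO_UTILIZATION.items():
    ratio = util / supply                      # roughly 96% in every case
    lo, hi = util * (1 - TOLERANCE), util * (1 + TOLERANCE)
    print(f"supply {supply}V -> utilization {util}V "
          f"({ratio:.1%} of supply), +/-10% band {lo:.0f}-{hi:.0f}V")
```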

In the early days of commercial electricity, 110V was very common. But line losses were very problematic, so it became common to connect two 110V sources in series, creating 110/220V systems. This concept of connecting sources in series led to the voltages of 110V, 220V, and 440V. These voltages were prevalent through the early days of electrification, so they became part of our general conversational language. As the US industrial complex grew, line losses continued to play havoc, so the nominal supply voltages were increased to 115V, 230V, and 460V. By the '60s the use of electricity had grown so much that the standard US voltages were again raised to the values we have today.

Because there is no single set of utility regulations for the entire US, many smaller companies were slow in upgrading their 'delivery' equipment, so it is not uncommon to see nominal voltages that went 'obsolete' more than 50 years ago.


So, to the OP's question: without knowing the actual tolerances of your equipment, it would be difficult to say whether it can be used at today's nominal voltages. However, most non-motor equipment is not very fussy about the voltage it gets, as long as its maximum rating is not exceeded. Inductive loads (e.g., motors) are not so forgiving, and it is somewhat load dependent, so it is best to supply them at no more than 10% above their nameplate rating. But I have seen 440V motors purring merrily along on nominal 480V systems.
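
A rough back-of-the-envelope check along those lines (assuming the commonly quoted 10% figure, not any particular piece of equipment's actual spec):

```python
# Illustrative only: does a nominal system voltage fall within an assumed
# +/-10% of a motor's nameplate rating? Always check the actual equipment
# documentation; the 10% here is just the commonly quoted figure.

def within_tolerance(nameplate_v: float, system_v: float, tol: float = 0.10):
    deviation = (system_v - nameplate_v) / nameplate_v
    return abs(deviation) <= tol, deviation

ok, dev = within_tolerance(440, 480)
print(f"440V motor on a 480V system: {dev:+.1%} -> {'OK' if ok else 'out of range'}")
# prints +9.1% -> OK, which matches seeing 440V motors purr along on 480V
```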
 
The story goes that 110V came about with Edison, but it was 110V DC. He needed to match the lumen output of gas lamps to make his light bulbs viable, so through experimentation he settled on 110VDC as a compromise between the lumen output of his lamp and the size of wire he needed to run for his first demonstration. Then, once his light bulbs caught on, Westinghouse and Tesla started distributing 110VAC to houses that were using Edison's lamps, so that users didn't have to change anything (incandescent lamps don't know AC from DC). Remember, at that time household appliances were NOT yet using electricity; it was mainly for lighting only. So after a lot of houses got wired for electric lighting, appliance manufacturers caught on and had to match the lighting voltage.
 
I've used buck-boost transformers before.

I'm mainly interested in the "academic" history, specific names and dates and such. I recall the Edison story on 110VDC. Last I dealt with them, the Japanese use 100VAC (50Hz in the east, 60Hz in the west) and Europe runs 220-230ish at 50Hz. At one previous company we had to get a 50Hz power supply because just stepping down the voltage wasn't enough; there were frequency issues, including cooking the transformers.
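
In hindsight the transformer trouble makes sense: from the transformer EMF equation E = 4.44 f N Φ, at a fixed applied voltage the peak core flux scales as 1/f, so a transformer run below its design frequency at rated voltage gets pushed toward saturation. A rough sketch of the numbers:

```python
# Rough illustration of why running a transformer below its design
# frequency at rated voltage can cook it: peak core flux ~ V / f
# (from E = 4.44 * f * N * Phi_max), so flux rises as frequency drops.

design_hz, actual_hz = 60, 50
flux_increase = design_hz / actual_hz - 1  # 1/f scaling at fixed voltage

print(f"A {design_hz}Hz transformer on {actual_hz}Hz at rated voltage sees "
      f"about {flux_increase:.0%} more peak flux,")
print("driving the core toward saturation: more magnetizing current, more heat.")
```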

I was hoping to find an old book with the lore in it.

Matt
 
I was hoping to find an old book with the lore in it.

It is not lore.

Look up the history of ANSI C84.1. The IEEE Red Book contains the 1989 version of this standard.

My oldest reference book, first published in 1957, says equipment "rated 115V are supplied from 120V line and neutral". Later on, it says "appliances rated 220V or 230V are supplied at 240V".
 
Even today, utility transformers are rated at 480V and 240V, and the utility puts 460V and 230V in your service contract. Part of the reason is the expected voltage drop from the transformer to the appliance.
 
From the IEEE Grey Book

The reasons for these differences go back to the original development of electric power distribution systems. The first utilization voltage was 100 V. However, the supply voltage had to be raised to 110 V in order to compensate for the voltage drop in the distribution system. This led to overvoltage on equipment connected close to the supply, and the utilization equipment rating was also raised to 110 V. As generator sizes increased and distribution and transmission systems developed, an effort to keep transformer ratios in round numbers led to a series of utilization voltages of 110 V, 220 V, 440 V, and 550 V, and a series of distribution voltages of 2200 V, 4400 V, 6600 V, and 13 200 V.

As a result of the effort to maintain the supply voltage slightly above the utilization voltage, the supply voltages were raised again to multiples of 115 V, which resulted in a new series of utilization voltages of 115 V, 230 V, 460 V, and 575 V, and a new series of distribution voltages of 2300 V, 4600 V, 6900 V, and 13 800 V.

As a result of the development of the 208Y/120 V network system, the supply voltages were raised again to multiples of 120 V. This resulted in a new series of utilization voltages of 120 V, 208Y/120 V, 240 V, 480 V, and 600 V, and a new series of primary distribution voltages of 2400 V, 4160Y/2400 V, 4800 V, 12 000 V, and 12 470Y/7200 V. However, most of the existing primary distribution voltages continued to be used, and no 120 V multiple voltages developed at the transmission level.
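
A tiny sketch (illustrative only) of the three series the Grey Book describes; each is just multiples of its base utilization voltage, and the 208 falls out of the wye relationship:

```python
import math

# Each historical "series" is multiples (x1, x2, x4, x5) of a base voltage.
for base in (110, 115, 120):
    series = [base * m for m in (1, 2, 4, 5)]
    print(f"{base}V series: {series}")

# The 208Y/120V entry comes from the wye connection: 120V line-to-neutral
# gives 120 * sqrt(3) line-to-line.
print(f"120 * sqrt(3) = {120 * math.sqrt(3):.0f}V line-to-line")
```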
 
Thanks,
I did not know that was in there.:dunce:

Funnily enough, I thought that was where you based your response, since you mentioned some key terms such as nominal system voltage and nominal utilization voltage. I think the IEEE color series is probably the only place I have noticed these terms used.

This is also mentioned in the IEEE Red Book.

What is impressive to me is that you were able to pick up that information from another source.
 
What is impressive to me is that you were able to pick up that information from another source.

I missed a 100% on a test once when I answered that 110V was a common single-phase voltage.
It was particularly galling, as I was the one who had created the test questions several months earlier.:ashamed:

I learned basic electricity from someone who actually wired using the 'old' voltages.
In the late '70s there was a local paper mill that required us to tap transformers as low as we could get; they were still upgrading their motor control center starter coils from 440V.
 
I missed a 100% on a test once when I answered that 110V was a common single-phase voltage.

I learned basic electricity from someone who actually wired using the 'old' voltages.

Funny story about the exam question.

Nothing like picking up information from field experience.
 
We have the old 110V and 115V instead of 120V. Is there a good reference that has a history of where these oddball voltages started and why?

Refer to the American Electricians' Handbook, Division 3, "Electrical Systems," beginning at page 7 of the 9th edition.

See footnote a, "Electric Power Club standard voltage ratings."
 