Why do breaker interrupting ratings decrease with increasing voltage ratings?

Status
Not open for further replies.

Pitt123

Senior Member
In case you haven't guessed, I've been doing some breaker research lately. :)

I noticed that as breaker voltage ratings increase, the interrupting capacity of the breaker decreases. Why is this? Is this because for a given system there will be less available fault current at higher voltages?

Also, does anyone know why the breaker interrupting ratings change when applied to frequencies other than 60 Hz?

Lastly, I saw a statement saying that the voltage rating is determined by the maximum voltage that can be applied across its terminals. Is this referring to L-G for each of the terminals?
 

kingpb

Senior Member
Location
SE USA as far as you can go
Occupation
Engineer, Registered
Assuming a whole lot of generalities, because there are a lot of factors, ratings, etc. involved: it is not that there is less fault current at a higher voltage, it is the capability of the breaker itself to withstand a fault. It pretty much follows Ohm's law;
so as the voltage rating goes up, the current capability goes down. Again, that statement is predicated on a lot of generalities.
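A rough back-of-the-envelope way to see that trade-off (my own illustration, assuming a roughly constant "interrupting MVA" for a given design, which is a big simplification):

```python
# Sketch: if a breaker design can handle roughly a fixed interrupting "MVA",
# the current it can interrupt falls as the applied voltage rises.
# The 100 MVA figure is an arbitrary illustration, not a real rating.
from math import sqrt

INTERRUPTING_MVA = 100  # assumed constant capability for one design

for volts in (240, 480, 4160, 13800):
    amps = INTERRUPTING_MVA * 1e6 / (sqrt(3) * volts)  # three-phase: S = sqrt(3)*V*I
    print(f"{volts:>6} V -> about {amps:,.0f} A interrupting")
```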

This is a good topic though, and hopefully it will get a good response and lively dialog.
 

broadgage

Senior Member
Location
London, England
For a given design of breaker, the short circuit capacity decreases if it is used on a higher voltage system, because the higher voltage is more difficult to interrupt and causes more severe arcing.
The available fault current may also be greater on a higher voltage system, so the use of the smaller/cheaper types of breaker may not be permissible.
MCCBs or HBC fuses may be needed instead of MCBs.

Operation at very low or high frequencies may limit the breaking capacity.
There is in practice no difference at standard grid frequencies of 50 or 60 cycles.
If a breaker is only listed for 60 cycles, then use on 50 cycles would be a violation, but not in practice dangerous.
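One way to see why frequency matters at all: an AC breaker relies on the natural current zeros to clear the arc, and the spacing of those zeros changes with frequency (simple arithmetic; the frequencies below are just examples):

```python
# Time between natural current zeros (one half-cycle) at various frequencies.
# A longer half-cycle means the arc must be sustained longer before the
# breaker gets a current zero at which to extinguish it.
for hz in (16.7, 50, 60, 400):
    half_cycle_ms = 1000 / (2 * hz)
    print(f"{hz:>5} Hz -> current zero every {half_cycle_ms:.2f} ms")
```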
 

don_resqcapt19

Moderator
Staff member
Location
Illinois
Occupation
retired electrician
I always assumed it was an "energy" issue: for the same clearing time, 10,000 amps at 480 V is twice as much energy as 10,000 amps at 240 V.
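For the same fault current and the same clearing time, the energy does scale directly with voltage. A quick illustration (the half-cycle clearing time is only an assumption for comparison, not a real breaker spec):

```python
# Same 10,000 A fault, same assumed clearing time: energy scales with voltage.
fault_amps = 10_000
clear_time_s = 0.0083   # assumed half-cycle at 60 Hz, purely for comparison

for volts in (240, 480):
    energy_joules = volts * fault_amps * clear_time_s  # per phase, power factor ignored
    print(f"{volts} V -> roughly {energy_joules / 1000:.0f} kJ")
```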
 

Pitt123

Senior Member
O.k., it sounds like the interrupting rating is based on the total energy, like some folks are saying. For a given period of time, the energy (with time held constant) would depend on the kVA value that the breaker is interrupting. Therefore, with increasing voltage, the interrupting current would have to decrease in order to keep the same kVA rating, and therefore the same energy, that the breaker must interrupt.

Since energy is a function of this kVA value and time, what is the time that these breakers are usually intended or designed for?

I guess the frequency would also have something to do with how the energy is calculated?


Just to clarify, is my understanding of the following statement correct?

"Lastly I saw a statement stating that the voltage rating is determined by the maximum voltage that can be applied across its terminals. Is this referring to L-G for each of the terminals?"
 
In general terms, the available fault energy diminishes as you step down the voltages. For example, a 10 MVA transformer may feed several 1000 HP motors and a 2.5 MVA transformer, but normally not a single unit of the same size. The transformer itself and the transmission/distribution media - cables, busses, etc. - will limit the fault energy that can be delivered to various points in the system. The further down you are in the system, the less energy is available, so the less there is to be interrupted. Some of the user devices, primarily motors, will act like energy 'storage' and release that energy at the time of the fault, but that contribution is much smaller than the main source, e.g. utility or generator, and it decays quickly during the fault.
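As a rough illustration of how the transformer itself caps what is available at its secondary, here is the usual infinite-bus shortcut (it ignores primary-side impedance, cables and motor contribution; the 2.5 MVA / 5.75% numbers are just examples):

```python
# Infinite-bus estimate of available fault current at a transformer secondary:
# I_fault ~= full-load amps / per-unit impedance.
from math import sqrt

kva = 2500      # example transformer size
volts = 480     # secondary line-to-line voltage
z_pct = 5.75    # example nameplate impedance

fla = kva * 1000 / (sqrt(3) * volts)
i_fault = fla / (z_pct / 100)
print(f"FLA ~ {fla:,.0f} A, available fault current ~ {i_fault:,.0f} A")
```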
 

Jraef

Moderator, OTD
Staff member
Location
San Francisco Bay Area, CA, USA
Occupation
Electrical Engineer
In a nutshell, the interrupt capacity is the amount of fault energy that the breaker can interrupt without becoming shrapnel.

I think first you have to understand what "interrupting capacity" means. Electro-magnetic energy, when expressed in large amounts over a very short span of time such as during a fault, is potentially explosive. The current carrying components in a device will want to alternately attract and repel each other with a LOT of force very quickly (assuming AC here). So there is a lot of mechanical stress exerted on the breaker frame and components during a high current fault. Although that amount of force would be the same regardless of voltage (because it's based on current), the reason you have reduced capacity as voltage increases is the difficulty of interrupting an arc at higher voltage. This increases the amount of time it takes to stop this destructive force from continuing. So as voltage increases, the mechanical strength of the breaker, being fixed, results in a lower amount of fault current that the breaker can withstand.
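The "force is the same regardless of voltage" part can be seen from the textbook force between two parallel conductors, which depends only on current and spacing (the 10 kA and 10 cm figures are just an example):

```python
# Force per metre between two parallel conductors carrying the same current:
# F/l = mu0 * I^2 / (2 * pi * d). Voltage does not appear anywhere.
from math import pi

MU0 = 4 * pi * 1e-7     # permeability of free space, H/m
current = 10_000        # example fault current, A
spacing = 0.10          # example conductor spacing, m

force_per_m = MU0 * current**2 / (2 * pi * spacing)
print(f"About {force_per_m:,.0f} N per metre of conductor")
```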
 

Jraef

Moderator, OTD
Staff member
Location
San Francisco Bay Area, CA, USA
Occupation
Electrical Engineer
... Just to clarify, is my understanding of the following statement correct?

"Lastly I saw a statement stating that the voltage rating is determined by the maximum voltage that can be applied across its terminals. Is this referring to L-G for each of the terminals?"

Yes and No. Voltage is determined primarily by the L-L terminal distance. For example (IIRC) you must have at least 1" of air space and 2" of surface (creepage) distance between live components on a 600V rated device, less on a 300V device.
 