In case you haven't guessed, I've been doing some breaker research lately.
I noticed that as breaker voltage ratings increase, the interrupting capacity of the breaker decreases. Why is this? Is this because, for a given system, there will be less available fault current at higher voltages?
Also, does anyone know why breaker interrupting ratings change when applied at frequencies other than 60 Hz?
Lastly, I saw a statement saying that the voltage rating is determined by the maximum voltage that can be applied across the breaker's terminals. Is this referring to L-G voltage for each of the terminals?