We are installing a bunch of 15kV shielded cable, 3/c, 4/0 to 500kcmil. I've been looking over the test data and have a bunch of questions. DC hipot testing is not my area of expertise. So far I've read IEEE 400 (IEEE 400.1 is not available to me), IEEE 576, IEEE Std 4, the NETA MTS, the NETA acceptance criteria, and maybe some other stuff (NEMA 54? and 73?) - these are all real page turners (yawn).
IEEE 400 says 56kV max, and lists a maximum current that depends on the insulation type, insulation thickness, temperature, voltage, and length. The acceptance criterion is that the current at 56kV at 1 minute is greater than the current at 56kV at 15 minutes (i.e., the leakage current decays over the hold).
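For anyone who wants the criterion spelled out, here's a throwaway Python check of how I'm reading it. The readings and the limit below are made-up placeholders, not my data; the real max-current limit comes out of the IEEE 400 tables.

```python
# Throwaway check of the acceptance criteria as I read IEEE 400:
# the 56kV current at 1 minute must exceed the 56kV current at 15 minutes
# (leakage decaying over the hold), and stay under the table limit.
# All numbers here are placeholders, not my readings; the real limit
# depends on insulation type, thickness, temperature, and length.

def dc_hipot_pass(i_1min_uA: float, i_15min_uA: float, i_max_uA: float) -> bool:
    """Pass if current decays from 1 min to 15 min and stays under the limit."""
    return i_1min_uA > i_15min_uA and i_1min_uA <= i_max_uA

print(dc_hipot_pass(i_1min_uA=42.0, i_15min_uA=35.0, i_max_uA=100.0))  # True
```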
What I am seeing is that up to 30kV the current is linear with respect to the voltage, and the curve passes through the origin (0kV, 0uA). Around 35kV the slope increases significantly. The ends are terminated, clean, and bagged; the RH is 40% to 50% and the ambient is 58°F to 70°F. Between 50kV and 56kV, the current takes another slight jump above a linear approximation (least squares) of the 35kV-50kV data.
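Here's roughly how I'm quantifying that last jump (Python/numpy sketch; the step readings below are made up to show the method, not my actual data): least-squares fit of the 35kV-50kV points, then see how far the 50kV-56kV readings sit above the extrapolated line.

```python
import numpy as np

# Placeholder step-test readings (kV, uA) standing in for my actual data.
kv = np.array([35, 40, 45, 50, 53, 56], dtype=float)
uA = np.array([17, 22, 27, 32, 36, 40], dtype=float)

# Least-squares line through the 35kV-50kV points only.
mask = kv <= 50
slope, intercept = np.polyfit(kv[mask], uA[mask], 1)

# Excess of the 50kV-56kV readings above the extrapolated line.
for v, i in zip(kv[~mask], uA[~mask]):
    excess = i - (slope * v + intercept)
    print(f"{v:.0f} kV: measured {i:.1f} uA, excess {excess:+.1f} uA")
```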
The cables meet test spec.
Here is what I think I am seeing. Up to 30kV, I'm just seeing the cable insulation resistance. Above 35kV, I think I am seeing corona discharge. Above 50kV, I think the insulation is being degraded slightly.
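And here's how I'm backing numbers out of the low end (same caveat - made-up readings, just illustrating what I'm doing): the slope of the linear region gives the insulation resistance, and the first step that runs well above that line is where I'm guessing the extra (corona?) current starts.

```python
import numpy as np

# Placeholder step readings (kV, uA) again - not my data.
kv = np.array([5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 56], dtype=float)
uA = np.array([2, 4, 6, 8, 10, 12, 17, 22, 27, 32, 40], dtype=float)

# Slope of the linear region (<= 30kV): kV per uA is numerically gigaohms,
# so 1/slope is the insulation resistance in GOhm.
mask = kv <= 30
slope, intercept = np.polyfit(kv[mask], uA[mask], 1)
print(f"insulation resistance ~ {1.0 / slope:.1f} GOhm")

# First step where the reading runs more than 10% above the extrapolated
# line - my rough stand-in for where the extra (corona?) current starts.
predicted = slope * kv + intercept
knee = kv[uA > 1.10 * predicted]
print(f"deviation starts around {knee[0]:.0f} kV" if knee.size else "no knee found")
```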
The electricians are pretty good. They seem to have a lot of experience with both the testing and the terminations, and the terminations look good.
So, who has some experience to give me some clues?
Is it corona I'm seeing above 35kV?
If it is, what are some methods of raising the voltage at which it starts?
Are we gaining any additional information about the cable integrity by testing to 56kV versus, say, 35kV?
Like I said, the cables passed; I am just trying to gain insight into what I am seeing - interpretation of the test results.
I'm planning on sending my data to the cable mfg and calling him next week. Too many rats left to kill to do it this week.