Hipot versus Insulation Resistance Test

akrus

I have been reading a lot about the distinctions between Dielectric Withstand (Hipot) and Insulation Resistance testing. There is a lot of conflicting information out there about the merits of each, especially when the information comes from a manufacturer of a particular kind of test equipment.

I understand that, generally, hipot is a pass/fail test to determine whether the equipment can withstand the test voltage, while IR is a quantitative test that measures the resistance at the test voltage. I can also see that hipot is often run at a higher AC voltage and IR at a lower DC voltage.

Now for my question...

In my application, we are often requested to run both of these tests at the production level. The requirement is to run DWV at 500 VAC and IR at 500 VDC. The capacitance in the unit under test is minimal because there are no capacitive components and the wire length is short. I am struggling to understand how there is any difference between these tests. In both cases, ~500V is applied (I know AC is RMS) and the test equipment is measuring the leakage current. How would any defect (creepage, clearance, damaged insulation, etc.) be detected in one and not the other? My thought is that IR testing covers everything unless the hipot is run at a much higher test voltage. Am I missing something here? Any discussion is welcome. Thanks!
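To put rough numbers on the "~500 V either way" point above, here is a quick back-of-the-envelope sketch in Python. Only the two 500 V test levels come from the post; the 100 MΩ IR pass limit, 60 Hz hipot frequency, and 100 pF of stray capacitance are assumed values for illustration.

```python
import math

V_AC_RMS = 500.0     # DWV (hipot) test voltage, V RMS (from the post)
V_DC = 500.0         # IR test voltage, V DC (from the post)
R_IR_LIMIT = 100e6   # assumed IR pass limit, ohms (hypothetical)
F_TEST = 60.0        # assumed hipot test frequency, Hz (hypothetical)
C_STRAY = 100e-12    # assumed stray capacitance of the unit, farads (hypothetical)

# Peak voltage the insulation actually sees during the AC withstand test
v_peak = V_AC_RMS * math.sqrt(2)      # about 707 V

# Resistive leakage an IR meter would read right at the assumed pass limit
i_ir = V_DC / R_IR_LIMIT              # 5 uA

# Capacitive charging current drawn during the AC test; it flows even
# through perfect insulation and does not appear in the DC IR reading
i_cap = V_AC_RMS * 2 * math.pi * F_TEST * C_STRAY

print(f"AC peak stress:          {v_peak:.0f} V")
print(f"DC leakage at IR limit:  {i_ir * 1e6:.1f} uA")
print(f"AC charging current:     {i_cap * 1e6:.1f} uA")
```

With these assumed numbers, the AC test stresses the insulation to roughly 707 V peak rather than 500 V, and the hipot reading includes a capacitive charging component that the DC IR measurement never sees, which is one practical difference between the two even on a low-capacitance unit.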
 
What kind of product are you testing? Motors? Transformers?
 
IR is usually measured between conductors or from conductors to chassis.
Dielectric withstand is usually measured between a floating input voltage source and chassis or ground, not from input conductor to conductor.
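Purely for illustration, here is a minimal sketch of how those two sets of connection points might be written down in a production test script; the names and structure are hypothetical, not any particular tester's API.

```python
# Hypothetical connection list for the configurations described above:
# IR between conductors and from conductors to chassis, DWV from the
# floating input source to chassis/ground.
TEST_POINTS = {
    "IR": [
        ("conductor_A", "conductor_B"),
        ("conductor_A", "chassis"),
        ("conductor_B", "chassis"),
    ],
    "DWV": [
        ("floating_input_bus", "chassis"),
    ],
}

for test, pairs in TEST_POINTS.items():
    for hi, lo in pairs:
        print(f"{test}: apply test voltage from {hi} to {lo}")
```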
 
What kind of product are you testing? Motors? Transformers?
Aerospace 28 VDC equipment, typically. Sometimes we have flow-down requirements, and in those cases we obviously do what they say. Other times there are no requirements and we are looking for the procedure that gives the highest quality. Those are the cases where I question the practical benefit of running both tests versus the risk of prematurely damaging the insulation.
 