Readings on a Fluke 1587 Insulation Tester

Status
Not open for further replies.
I have a Fluke 1587 that I use to "meg" out circuits and wiring. On a good conductor that is open on both ends, when I hit the test button with the meter set on 1000 V, the output usually reads around 999 MΩ (about 1 GΩ), then quickly climbs to >2.2 GΩ, which is as high as the meter will read. Does anyone know why the readings change?
 

Tony S

Senior Member
Capacitive charging.

If you use a 5 kV Megger, it has a timed test function. I would specify X GΩ within Y minutes, or abort the test if the reading reaches 1 TΩ but record the time it took. The weather conditions were also added to the test results.

The weather can make a hell of a difference; you get different readings on a cold winter's day than on a warm summer's day.

Interpreting the results is where the black magic comes in.
 

meternerd

Senior Member
Location
Athol, ID
Occupation
retired water & electric utility electrician, meter/relay tech
You have to hold the button for as long as you're testing, basically until the reading stabilizes. 2.2 GΩ is just the full-scale reading, which it goes to if you let go of the button. Not sure about the model you have, but there's usually a red light on the button showing that the output is energized. Make sure the light stays on until you're done with the test.

The reading should start low, increase to the actual value, and stabilize. On large loads such as motors, that can take a bit of time because of the capacitance.

If it goes to full scale every test, then you have a problem with your test leads or the tester. Short the leads together and test; it should show close to zero.
 

MD84

Senior Member
Location
Stow, Ohio, USA
Capacitive charging as mentioned by Tony S.

Think of the meter as using Ohm's law: it applies a reference voltage (1000 V) and measures current. When the voltage is applied, the conductors charge toward 1000 V. Depending on the capacitance of the conductors, charging current will flow until the conductor voltage approaches the test voltage. Once the conductors are charged, only a leakage current remains.

The charging current starts off relatively high and tapers off as the conductors approach full charge. This is why the ohm reading increases: more current indicates lower resistance.

I am guessing that the meter's charging current is limited to maybe 1 microamp. This is why the resistance reading flatlines around 1 GΩ until the current drops toward the leakage level. At full scale, the leakage current is less than about 454 nanoamps (1000 V / 2.2 GΩ), so the meter is basically "pegged".
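The charging-versus-leakage picture can be sketched numerically. Here is a minimal model, assuming a 1 µA current-limited charging term with a 2 s time constant and a 10 GΩ true insulation resistance; all of these numbers are illustrative guesses, not Fluke specifications:

```python
import math

# Illustrative assumptions only -- not Fluke 1587 specifications.
V_TEST = 1000.0      # test voltage, volts
FULL_SCALE = 2.2e9   # meter's maximum displayable resistance, ohms
I_CHARGE = 1e-6      # assumed initial (current-limited) charging current, amps
TAU = 2.0            # assumed charging time constant, seconds
R_LEAK = 1e10        # assumed true insulation resistance, ohms

def apparent_resistance(t: float) -> float:
    """Resistance the meter would display t seconds into the test.

    Total current = steady leakage (V / R_LEAK) plus a decaying
    capacitive charging term; the display is V / I, clipped at
    the 2.2 GOhm full-scale reading.
    """
    i_total = V_TEST / R_LEAK + I_CHARGE * math.exp(-t / TAU)
    return min(V_TEST / i_total, FULL_SCALE)

for t in (0, 1, 5, 30):
    print(f"t={t:>2} s  reading = {apparent_resistance(t) / 1e9:.2f} GOhm")
```

With these numbers the display starts just under 1 GΩ (charging current dominates), climbs as the charging term decays, and pegs at the 2.2 GΩ full-scale reading once only leakage remains, matching what the original poster describes.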

You could test this theory by starting a test with only the meter leads connected. The meter should move very quickly to 2.2 GΩ. Then try the longest and largest cable you have access to; the meter should show around 1 GΩ for a longer time before finally moving to 2.2 GΩ or less.

You may be able to get a test result below 2.2 GΩ with a long multi-conductor cable by putting your test lead on one conductor, then bonding all the other conductors together and connecting them to the ground lead.
 