I guess with AC it could depend on where on the waveform the voltage is when you are shocked. If you touch it near a zero crossing it may not hurt as much as if you touched it at the 170V peak (the peak of 120V RMS mains). With DC the voltage is the same no matter when you touch it.
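For concreteness, here is a minimal sketch of that instantaneous voltage, assuming 120V RMS / 60 Hz mains (the figures the 170V peak implies):

```python
import math

V_RMS = 120.0                  # assumed US mains; source of the ~170V peak
V_PEAK = V_RMS * math.sqrt(2)  # ~169.7 V
FREQ = 60.0                    # Hz

def instantaneous_voltage(t):
    """Mains voltage at time t in seconds: v(t) = Vpeak * sin(2*pi*f*t)."""
    return V_PEAK * math.sin(2 * math.pi * FREQ * t)

# One half-cycle: the shock you get depends on where in this you land.
for t_ms in (0.0, 2.1, 4.2, 6.3, 8.3):
    v = instantaneous_voltage(t_ms / 1000.0)
    print(f"t = {t_ms:4.1f} ms -> {v:7.1f} V")
```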
A megohmmeter is a current-limited source. It might be 1000V open circuit, but the output voltage is designed to drop in order to maintain a certain maximum current. The jolt from a given megohmmeter may be current-limited to a relatively safe value.
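As a rough model (not any particular instrument's design), an idealized current-limited source delivers its full open-circuit voltage until the load would draw more than the limit, then folds the voltage back. A sketch with assumed figures of 1000V open circuit and a 1 mA limit:

```python
def terminal_voltage(v_open, i_limit, r_load):
    """Idealized current-limited source: full open-circuit voltage until
    the load would draw more than i_limit, then the voltage folds back
    so the current never exceeds the limit."""
    v = v_open
    if v / r_load > i_limit:
        v = i_limit * r_load
    return v

# Assumed figures; real testers vary by model.
for r_load in (1e3, 1e5, 1e6, 1e8):
    v = terminal_voltage(1000.0, 1e-3, r_load)
    print(f"R_load = {r_load:>9.0e} ohm -> {v:8.1f} V, {v / r_load * 1e3:6.3f} mA")
```

On these numbers, a high-resistance insulation sample sees the full 1000V at a tiny current, while a low-resistance load (like a body) pulls the voltage down hard and the current tops out at the limit.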
The maximum current can be quite high for some devices, and some high-voltage test sets have lethal output. Also, that limited current can still charge up cable and system capacitances to deliver a lethal jolt.
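To put numbers on the capacitance hazard, here is a sketch with assumed values: 0.5 µF of cable capacitance (order of magnitude for a few km of shielded cable) charged to the full 1000V through a 1 mA limit:

```python
C_CABLE = 0.5e-6   # F; assumed, rough figure for a long shielded cable
V_TEST  = 1000.0   # V; the megger's rated test voltage
I_LIMIT = 1e-3     # A; assumed current limit

charge   = C_CABLE * V_TEST           # Q = C*V
energy   = 0.5 * C_CABLE * V_TEST**2  # E = (1/2)*C*V^2
t_charge = charge / I_LIMIT           # time to full voltage at the limit

print(f"stored charge:  {charge * 1e3:.2f} mC")  # 0.50 mC
print(f"stored energy:  {energy:.2f} J")         # 0.25 J
print(f"charging time:  {t_charge:.2f} s")       # 0.50 s
```

The point: the limit slows the charging but does not cap the final voltage. After about half a second the cable sits at the full 1000V, ready to dump its stored charge into whoever touches it.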
How is the current limited? Is it done with some sort of output impedance in the device?
So even though it is 1000V open circuit, when you put the leads on a cable (or a body) the output voltage drops to some amount based on the current that starts to flow. For example, if you put this 1000V device on a cable, current will flow based on the output impedance of the device plus the impedance of the cable circuit, and that current will cause a voltage drop across the device's output impedance, pulling the output down to some lower value, let's say 250V? So the cable or body would only have maybe 250V across it with the leads connected?
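That picture is essentially a voltage divider. A sketch of it, with an assumed 1 MΩ series output impedance (chosen to give roughly 1 mA into a short; real instruments may regulate actively instead):

```python
def divider_voltage(v_open, r_out, r_load):
    """Fixed series output impedance model: the terminal voltage is
    v_open * r_load / (r_out + r_load)."""
    return v_open * r_load / (r_out + r_load)

V_OPEN = 1000.0
R_OUT  = 1e6   # ohms; assumed, giving ~1 mA into a short circuit

for r_load in (1e3, 1e5, 1e6, 1e8):
    v = divider_voltage(V_OPEN, R_OUT, r_load)
    print(f"R_load = {r_load:>9.0e} ohm -> {v:7.1f} V, {v / r_load * 1e3:6.3f} mA")
```

On these assumed numbers a low-resistance load does pull the terminal voltage far below 1000V, just as described, while a healthy high-resistance insulation sample sees nearly the full rated voltage, which is presumably why the open-circuit figure is the one on the nameplate. Note the difference from the regulated-clamp sketch above: a plain series resistor sags at mid-range loads where an active limit would still hold the full voltage.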
If this is the case, why call it a 1000V megger as opposed to whatever value it drops to?
Why is DC voltage chosen for meggers? Is it to deal with the capacitive charging current from cables, which would flow continuously with AC but dies away with DC once the cable is charged?
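Rough numbers support that intuition. With AC, the cable capacitance draws a continuous reactive current I = V·2πfC that would swamp the leakage current the instrument is trying to measure; with DC, the capacitance charges once and then only the insulation leakage flows. A sketch with assumed values (same hypothetical cable as above, plus an assumed healthy insulation resistance):

```python
import math

V_TEST  = 1000.0
FREQ    = 60.0
C_CABLE = 0.5e-6   # F; assumed cable capacitance
R_INSUL = 100e6    # ohms; assumed healthy insulation resistance

i_leak_dc = V_TEST / R_INSUL                       # what a DC test settles to
i_cap_ac  = V_TEST * 2 * math.pi * FREQ * C_CABLE  # continuous AC capacitive current

print(f"DC leakage current:    {i_leak_dc * 1e6:8.1f} uA")  # ~10 uA
print(f"AC capacitive current: {i_cap_ac * 1e3:8.1f} mA")   # ~188 mA
```

With these assumed figures the AC capacitive current would be nearly 20,000 times the leakage the meter is trying to resolve; with DC it shows up only as a charging transient that dies away, leaving the leakage reading alone.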