I am perplexed by the data my meters are reading. I am using a variac that is adjustable from 0 to 280 VAC, and two digital multimeters in the circuit: one to read voltage (in parallel; this meter is a Fluke 21, which has a 300 mA internal fuse) and the other to read current (in series; this meter is a Fluke 177 True RMS meter, which has a 10 A internal fuse). Using the variac I apply three voltages: 108 VAC, 120 VAC, and 132 VAC. My current readings and calculated power are as follows:

- 108 VAC - 1.15 A - 124.2 W
- 120 VAC - 1.22 A - 146.4 W
- 132 VAC - 1.28 A - 168.9 W

The only load in the circuit is a 150 W halogen lamp. What perplexes me is my power calculations in light of Ohm's law. When voltage is increased, shouldn't current decrease? Please help; what am I doing wrong?
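
To double-check my arithmetic, here is a quick Python sketch that runs my readings through P = V * I and R = V / I (the voltage and current pairs are straight from my meters; nothing else is assumed):

```python
# Sanity-check my meter readings: power P = V * I, implied resistance R = V / I.
readings = [
    (108.0, 1.15),  # (volts AC, amps) from the Fluke 21 / Fluke 177
    (120.0, 1.22),
    (132.0, 1.28),
]

for volts, amps in readings:
    power = volts * amps        # watts (resistive load, so P = V * I)
    resistance = volts / amps   # ohms, implied by Ohm's law
    print(f"{volts:5.1f} VAC  {amps:.2f} A  {power:6.1f} W  {resistance:5.1f} ohm")
```

Interestingly, the implied resistance is not constant: it climbs from roughly 94 ohms at 108 VAC to about 103 ohms at 132 VAC, which only adds to my confusion.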