zcanyonboltz
Senior Member
Location: Denver
So I've searched this online for a while, and I keep getting taken to physics websites and forums with long, drawn-out answers that don't make much sense to a non-physicist. My question is: how is it that when voltage increases, amperage decreases, but the watts stay the same?

To illustrate, let's say you have 10,000 watts being drawn at 120 volts; the amperage is 83.3 amps. Now take that same 10,000 watts at 240 volts and you have 41.7 amps. People often think that just because equipment operates at 240 volts, the power consumption is less. But when I tell them this isn't the case, that you're billed on wattage consumption (watt-hours) and that stays the same at either 120 V or 240 V, they are in awe. Any good ways to explain this? Thanks
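For anyone who wants the arithmetic behind those numbers, here is a minimal worked version of the example above, assuming a purely resistive load (unity power factor) so that power is simply volts times amps:

```latex
% Power is volts times amps, so for a fixed power the current
% scales inversely with the voltage (resistive load assumed).
\[
  P = V I \qquad\Longrightarrow\qquad I = \frac{P}{V}
\]
\[
  I_{120\,\mathrm{V}} = \frac{10{,}000\ \mathrm{W}}{120\ \mathrm{V}} \approx 83.3\ \mathrm{A},
  \qquad
  I_{240\,\mathrm{V}} = \frac{10{,}000\ \mathrm{W}}{240\ \mathrm{V}} \approx 41.7\ \mathrm{A}
\]
```

Doubling the voltage halves the current, but the product of the two, the wattage you are billed for over time, is unchanged.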