I hope I've been telling customers correctly when they call for me to install dimmer switches to save on their electric bill. I advise them that dimmers don't save anything. If my electrical theory serves me right, a dimmer is nothing more than a variable resistor in series with the light load. The dimmer causes a voltage drop across it the same as a light bulb does, and it gets hot the same way. We used to make homemade testers by wiring two light bulb sockets in series; each bulb received 60 volts on a 120-volt circuit and would burn dim, yet the total watts was the same as a single bulb. If the bulbs burned at normal brightness, the circuit was 240 volts. I still use those testers sometimes because they put a load on the circuit and won't read phantom voltage the way a high-impedance tester will. The point I'm making is that the dimmer is the same as the bulb tester: it uses its share of the voltage drop, and the rest goes to the light circuit.
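The voltage division behind that two-bulb tester can be sketched in a few lines. This is a simplified model, assuming each bulb is a 60 W / 120 V lamp behaving as a fixed resistor of about 240 ohms (real filament resistance varies with temperature):

```python
# Voltage division in a two-bulb series tester.
# Assumption: each bulb is a 60 W / 120 V lamp modeled as a fixed
# resistor (120 V ** 2 / 60 W = 240 ohms hot).

def series_tester(supply_volts, r_bulb=240.0):
    """Return (volts across each bulb, total watts) for two identical bulbs in series."""
    total_r = 2 * r_bulb
    current = supply_volts / total_r
    volts_per_bulb = current * r_bulb
    total_watts = supply_volts * current
    return volts_per_bulb, total_watts

# On a 120 V circuit each bulb sees 60 V and burns dim:
print(series_tester(120))   # (60.0, 30.0)
# On a 240 V circuit each bulb sees its full 120 V rating and burns at normal brightness:
print(series_tester(240))   # (120.0, 120.0)
```

Under this constant-resistance model the pair only sees half the supply voltage each, which is why the brightness tells you whether the circuit is 120 or 240 volts.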
Jet, imagine controlling eight 75-watt recessed lights with a 600-watt dimmer. If you dim the lights so they are only operating at 25% of full output, do you really think the dimmer will be dissipating the other 450 watts as heat? You would have some pretty bad burns if you came in contact with the dimmer.
I think you are correct, but if you are dimming 600 watts they will get very hot; pull the cover off and check. I just hate to see power wasted that way in the form of heat in the outlet box. That has to take a toll on the insulation of the conductors in the box and also derate them.
The semiconductor dimmers do lose some heat, but in the example Curt used they would not produce anywhere near 450 watts of it; that would be very hot compared to what the dimmer switch actually gives up as heat.
Putting two loads in series does not result in the same total power in the circuit either. In the OP's example, if the two test bulbs were identical 100-watt bulbs in series across a 120-volt supply, they would each have 60 volts across them, but the total power used would not be 100 watts; it would be 50 watts (this assumes resistance is constant at all temperatures, which it is not in an incandescent lamp).
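That 50-watt figure is easy to check with a few lines of arithmetic, under the same constant-resistance simplification stated above:

```python
# Two identical 100 W / 120 V bulbs in series across a 120 V supply,
# treating each filament as a fixed resistor (stated simplification).

R_BULB = 120**2 / 100          # 144 ohms for a 100 W bulb at 120 V
current = 120 / (2 * R_BULB)   # series current through both bulbs
v_each = current * R_BULB      # voltage across each bulb
p_total = 120 * current        # total power drawn from the supply

print(round(v_each, 6))    # 60.0 volts across each bulb
print(round(p_total, 6))   # 50.0 watts total, not 100
```

Halving the voltage across each bulb quarters each bulb's power (25 W apiece), so the pair together draws half of one bulb's rated wattage.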
In Curt's example, if you were to use a true resistor to dim 600 watts of lamps to 25% of full current, your results would look like this:
600 watts @ 120 volts = 5 amps
120 volts / 5 amps = 24 ohms of resistance for the lamps (we again will assume that resistance is constant at all temperatures to simplify the problem)
25% of 5 amps = 1.25 amps; this will be our target current, 25% of the full-load 5 amps.
To push 1.25 amps through the 24-ohm lamp load from a 120-volt supply, the circuit needs 120 / 1.25 = 96 ohms total, so you must put 72 ohms of resistance in series with the lamps.
30 volts will drop across the 24 ohm resistance, and 90 volts will drop across the 72 ohm resistance.
The power consumed by the lamps will be 1.25 A × 30 V = 37.5 watts, and the power consumed by the resistor will be 1.25 A × 90 V = 112.5 watts. The dimming resistor consumes more power than the lamps being dimmed, but the total drawn from the 120-volt supply is only 150 watts, so there is still a reduction in energy used.
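The whole worked example above can be verified with a short script, again assuming lamp resistance stays constant:

```python
# Resistor dimming of a 600 W lamp load on 120 V, cut to 25% of full current.
# Assumption: lamp resistance is constant (the stated simplification).

SUPPLY = 120.0                              # volts
LAMP_WATTS = 600.0

full_current = LAMP_WATTS / SUPPLY          # 5 A at full brightness
r_lamps = SUPPLY / full_current             # 24 ohms of lamp resistance
target_current = 0.25 * full_current        # 1.25 A target
r_total = SUPPLY / target_current           # 96 ohms needed in the whole loop
r_series = r_total - r_lamps                # 72 ohm dimming resistor

v_lamps = target_current * r_lamps          # volts dropped across the lamps
v_resistor = target_current * r_series      # volts dropped across the resistor
p_lamps = target_current * v_lamps          # watts delivered to the lamps
p_resistor = target_current * v_resistor    # watts burned in the resistor
p_total = p_lamps + p_resistor              # watts drawn from the supply

print(r_series, v_lamps, v_resistor)    # 72.0 30.0 90.0
print(p_lamps, p_resistor, p_total)     # 37.5 112.5 150.0
```

So even with a pure series resistor wasting 112.5 watts as heat, the circuit draws 150 watts instead of 600, which is the point: a resistive dimmer is wasteful, but it is not consuming the full difference.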