110506-2051 EDT
The results of this poll really surprise me.
Suppose it really took a lot of power to initiate the arc in the tube. For example, 1,000,000 watts for 1 microsecond. The turn-on energy would then be 1 watt-second. Next make the on-off duty cycle 10 minutes on and 10 minutes off, and assume a single 40 W bulb. In that 20-minute period the tube will consume 40*10*60 = 24,000 watt-seconds of energy, or 6.67 watt-hours, or 0.00667 kWh. Over a 1-hour period the consumption is 3*6.67 = 20 watt-hours, or 0.020 kWh. A different way to get the same answer: the average power is 20 W because the duty cycle is 50%, so during an hour the consumption is 0.020 kWh.
What is 3 times 1 watt-second (one start per 20-minute cycle, so three starts per hour)? It is 3 watt-seconds. How much is this in watt-hours? It is 3/3600 watt-hours, or about 0.00083 watt-hours (0.00000083 kWh). This is totally insignificant in comparison to the 20 watt-hours of energy to run the tube for that hour.
I picked a totally ridiculous power level just to make a point. Power is not the issue. Energy is.
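The arithmetic above can be checked with a few lines of Python (the 1 MW, 1 microsecond pulse is the same deliberately ridiculous figure used above, not a real lamp parameter):

```python
# Compare the hypothetical 1 MW, 1 us starting pulse (1 W-second) with the
# running energy of a 40 W tube cycled 10 minutes on / 10 minutes off.

STRIKE_ENERGY_WS = 1.0e6 * 1.0e-6   # 1,000,000 W for 1 microsecond = 1 W-s
TUBE_POWER_W = 40.0
ON_TIME_S = 10 * 60                 # 10 minutes on per 20-minute cycle
WS_PER_KWH = 3.6e6                  # watt-seconds in one kilowatt-hour

run_per_cycle_ws = TUBE_POWER_W * ON_TIME_S        # 24,000 W-s per cycle
run_per_hour_kwh = 3 * run_per_cycle_ws / WS_PER_KWH    # 0.020 kWh
start_per_hour_kwh = 3 * STRIKE_ENERGY_WS / WS_PER_KWH  # ~0.00000083 kWh

print(run_per_hour_kwh)    # 0.02
print(start_per_hour_kwh)  # about 8.3e-07
```

Even with a wildly exaggerated starting pulse, the three starts per hour amount to well under a ten-thousandth of the running energy.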
I do not have any data on the power level needed to start a fluorescent tube, but it is not much different from the operating power and is present for at most a few cycles. Basically a magnetic ballast is a current-limiting device, so while initiating the arc the current won't be much different from the operating current. Also, in all probability, for a cold cathode tube the arc strikes within 1/2 to 1 cycle, roughly 8 to 17 milliseconds at 60 Hz.
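Taking that estimate at face value, a generous upper bound on the real striking energy is one full line cycle at roughly the operating power (the 40 W figure and 60 Hz line frequency are carried over from the example above):

```python
# Rough upper bound on striking energy, assuming a magnetic ballast holds
# the starting current near the operating current for at most one cycle.

TUBE_POWER_W = 40.0
CYCLE_S = 1.0 / 60.0                           # one 60 Hz cycle, ~16.7 ms

strike_energy_ws = TUBE_POWER_W * CYCLE_S      # ~0.67 W-s per start
run_per_hour_ws = 3 * TUBE_POWER_W * 10 * 60   # 72,000 W-s at 50% duty
fraction = 3 * strike_energy_ws / run_per_hour_ws

print(strike_energy_ws)
print(fraction)
```

So a realistic start is well under 1 watt-second, even smaller than the deliberately exaggerated figure used earlier.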
If you start an incandescent bulb from room temperature at the peak of the voltage waveform, and compare it in a relative sense to a fluorescent tube, then I expect the incandescent consumes more energy at start-up than the fluorescent does.
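A back-of-envelope sketch supports that expectation. The numbers below are assumptions for illustration, not measurements: a cold tungsten filament is commonly taken to have on the order of 1/10 of its hot resistance, and the warm-up time of perhaps 50 ms is a guess:

```python
# Crude upper bound on incandescent start-up energy (assumed numbers):
# a 40 W / 120 V bulb with cold resistance ~1/10 of hot resistance,
# treated as drawing full inrush power for an assumed 50 ms warm-up.

LINE_V = 120.0
BULB_W = 40.0
HOT_R = LINE_V**2 / BULB_W      # 360 ohms at operating temperature
COLD_FRACTION = 0.1             # assumed cold/hot resistance ratio
WARMUP_S = 0.05                 # assumed warm-up time, ~3 cycles

inrush_power_w = LINE_V**2 / (HOT_R * COLD_FRACTION)   # ~400 W
startup_energy_ws = inrush_power_w * WARMUP_S          # ~20 W-s

print(startup_energy_ws)
```

Even as an overestimate (the inrush actually decays as the filament heats), this is tens of watt-seconds, compared with the sub-watt-second striking energy estimated for the fluorescent tube, which is consistent with the expectation stated above.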