I know that the power dissipated in a given resistor is calculated as V^2/R, where V is the RMS value of the voltage across it. So, for example, with 120 V dropped across a 10 ohm resistor, the power dissipated in the resistor would be 1440 watts. This, of course, assumes a 60 Hz source.
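Written out, and assuming the 120 V figure is already the RMS value, that calculation is:

P = Vrms^2 / R = (120 V)^2 / (10 ohms) = 1440 W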
What if, however, we performed this same calculation with a 50 Hz source? Would we have to calculate a different RMS voltage?
Since the RMS voltage is sqrt((1/T) * integral of v(t)^2 dt) taken over one period, and since the period T in this equation is different for a 50 Hz signal (20 ms rather than about 16.7 ms), would the Vrms of a 50 Hz signal differ from that of a 60 Hz signal? And if so, would the power dissipated in the 10 ohm resistor be different for a 50 Hz signal than for a 60 Hz signal?
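Here is a quick numerical sketch of that integral, comparing the two frequencies (the 120*sqrt(2) ≈ 170 V peak amplitude and the sampling grid are assumptions made just for this check, not given values):

```python
import numpy as np

# Numerical check of Vrms = sqrt((1/T) * integral of v(t)^2 dt) for the
# same sinusoid at 50 Hz and 60 Hz.  Assumes the "120 V" source is a sine
# wave with peak amplitude 120 * sqrt(2) ~= 169.7 V (an assumption, since
# the 120 V figure is quoted as an RMS value).
V_peak = 120 * np.sqrt(2)
R = 10.0  # ohms

for f in (50.0, 60.0):
    T = 1.0 / f                                       # period of the waveform
    t = np.linspace(0.0, T, 100_000, endpoint=False)  # one full period, finely sampled
    v = V_peak * np.sin(2 * np.pi * f * t)
    v_rms = np.sqrt(np.mean(v**2))                    # mean of v^2 over one period, then sqrt
    p = v_rms**2 / R                                  # power in the 10 ohm resistor
    print(f"{f:.0f} Hz: Vrms = {v_rms:.2f} V, P = {p:.1f} W")
```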