xebo
Member
Location: United States
1. Do transformers provide a constant source of power, or a constant voltage?
1a. If power is the product of voltage and current, the voltage is relatively constant, and the total current changes depending on how many loads are on the circuit (in parallel), then how can the power output be constant? (I've sketched my arithmetic on this after the list.)
2. On a circuit that spreads the loads evenly between the different phases, is there any current on the neutral?
2a. (2 rephrased): If I want to minimize current on the neutral, should my aim be to make the current on each phase as equal as possible, by making the resistance (load) on each phase as equal as possible? (See the phasor sketch after the list.)
3. If an appliance is rated "x watts" (like a 30 watt bulb), does that mean it will fail if more than 30 watts is run through it? Does it mean it was designed to run at 30 watts only? Or does it mean "if you put 120 volts (for example) across this resistor, the resulting current flow will dissipate 30 watts of power"? Where do wattage ratings come from, and why do they exist?
3a. If I increase the voltage to an appliance (like going from 120 V to 240 V), what will happen? Will it overload, fail to operate, or operate faster/brighter/etc.?
3b. If an appliance is listed at "30 watts", is this just an indirect way of stating its resistance? Why aren't light bulbs sold as "480 ohm bulbs"? Why is the standard to rate them in terms of power? (The arithmetic behind my 480 ohm figure is sketched after the list.)
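
To make 1a concrete, here is the arithmetic behind my confusion as a small Python sketch. I'm assuming ideal resistive 60 W bulbs and a stiff 120 V source, so the numbers are an illustration of my reasoning, not a claim about real hardware:

```python
V = 120.0        # supply voltage, assumed held constant by the transformer
BULB_R = 240.0   # resistance of one ideal 60 W bulb at 120 V (120**2 / 60)

for n in range(1, 5):           # 1 to 4 identical bulbs in parallel
    i_total = n * (V / BULB_R)  # each parallel bulb draws its own 0.5 A
    p_total = V * i_total       # delivered power grows with every added load
    print(f"{n} bulb(s): {i_total:.2f} A total, {p_total:.0f} W total")
```

The voltage never moves, but the total power climbs with each added bulb, which is why "constant power" sounds wrong to me.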
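For 2a, here is a phasor sketch of what I think balancing the loads does to the neutral. I'm assuming a wye system with purely resistive loads and the phase currents spaced 120 degrees apart (if that model is wrong, that is part of my question):

```python
import cmath
import math

def neutral_current(i_a, i_b, i_c):
    """Amps returning on the neutral: magnitude of the phasor sum of the line currents."""
    a = cmath.rect(i_a, math.radians(0))     # phase A current at 0 degrees
    b = cmath.rect(i_b, math.radians(-120))  # phase B lags by 120 degrees
    c = cmath.rect(i_c, math.radians(120))   # phase C leads by 120 degrees
    return abs(a + b + c)

print(neutral_current(10, 10, 10))  # balanced: ~0 A on the neutral
print(neutral_current(10, 8, 6))    # unbalanced: ~3.46 A returns on the neutral
```

If that's right, equal currents cancel completely and only the imbalance shows up on the neutral, which would make the strategy in 2a the correct one.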
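Finally, here is where my "480 ohm" figure in 3b comes from, plus what the same arithmetic seems to predict for 3a. I'm assuming the bulb acts as a fixed resistor, though I realize a real filament's resistance changes with temperature:

```python
V_RATED = 120.0   # nameplate voltage
P_RATED = 30.0    # nameplate power of the bulb

R = V_RATED**2 / P_RATED  # 120**2 / 30 = 480 ohms
print(f"Implied resistance: {R:.0f} ohms")

V_NEW = 240.0             # question 3a: double the applied voltage
P_NEW = V_NEW**2 / R      # at fixed R, power scales with voltage squared
print(f"Power at {V_NEW:.0f} V: {P_NEW:.0f} W")  # 120 W, four times the rating
```

If that holds, doubling the voltage quadruples the power, which sounds more like "overload" than "operate brighter" to me.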