Sunday, July 17, 2011
Higher voltage = lower energy loss?
Power companies send electricity at extremely high voltages because lower current means less energy lost. But here's the thing I don't get about that. Let's say (100% hypothetical numbers here, don't fixate on that) you've got a 100 volt circuit at ten amps and ten ohms (V = I * R, 100 = 10 * 10). Now you step that up so it's 1000 volts, which decreases the amps to one (because 100 * 10 = 1000 * 1, same total power output) and increases the ohms to 1000. Why doesn't the tenfold increase in resistance cause just as much extra energy loss as the reduction in current saves? What's the math behind this?
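For what it's worth, the textbook version of this calculation treats the wire's resistance as a fixed property of the transmission line (it doesn't change when the voltage is stepped up), and computes the loss in the wire as P = I² * R. A minimal sketch running the post's hypothetical numbers through that framing:

```python
# Assumption (the standard textbook framing): R_line is the fixed
# resistance of the wire itself, unchanged by the step-up transformer.
R_line = 10.0        # ohms, hypothetical, matching the post's numbers
P_delivered = 1000.0 # watts to deliver (100 V * 10 A)

for V in (100.0, 1000.0):
    I = P_delivered / V      # current needed to carry the same power
    P_loss = I**2 * R_line   # power dissipated in the wire: P = I^2 * R
    print(f"{V:6.0f} V -> {I:4.1f} A -> {P_loss:6.1f} W lost in the line")
```

Under that assumption, stepping up from 100 V to 1000 V cuts the current tenfold and the I²R line loss a hundredfold (1000 W down to 10 W), since the loss depends on the square of the current while R stays the same.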