So there's been a voltage drop of 12,000 - 1,500 = 10,500 V, so the power dissipated is V^2/R = 3.6 x 10^7 W. In the case of 50,000 V being the potential, the potential drop is 48,500 V, so the power dissipated is 7.8 x 10^8 W. Now when I take the difference of these I get 7.4 x 10^8 W.
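As a sanity check on that assumed drop (a short Python sketch; the 750 kW, 12,000 V, and 3.0 ohm figures are from the problem statement, the variable names are my own): the drop across the wires is set by the line current, V_wire = I*R, which comes out far smaller than 10,500 V.

```python
# Actual voltage drop across the transmission wires at 12,000 V.
# The wires carry the line current I = P / V and drop V_wire = I * R.
P = 750e3    # W, power delivered to the factory (from the problem)
V = 12_000   # V, delivery voltage
R = 3.0      # ohms, total wire resistance

I = P / V        # line current: 62.5 A
V_wire = I * R   # drop across the wires: 187.5 V, not 10,500 V
print(I, V_wire)
```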
If electricity is delivered at 50,000 V instead of 12,000 V, then the power wasted will be 1.1 x 10^4 W less (found by subtracting the two wasted powers).
A power station delivers 750 kW of power at 12,000 V to a factory through wires with a total resistance of 3.0 ohms. How much less power is wasted if the electricity is delivered at 50,000 V rather than 12,000 V? I'm having some basic problems conceptualizing what's going on in this problem.
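The standard way through this problem can be sketched numerically (variable and function names here are my own): the wires carry whatever current delivers 750 kW at the chosen voltage, I = P/V, and the wires themselves dissipate I^2*R. Raising the voltage lowers the current, so the I^2*R loss falls sharply.

```python
# Power wasted in the transmission wires at each delivery voltage.
P_delivered = 750e3   # W, power delivered to the factory
R_wire = 3.0          # ohms, total resistance of the wires

def wasted_power(voltage):
    current = P_delivered / voltage   # line current I = P / V, in A
    return current**2 * R_wire        # I^2 R loss in the wires, in W

loss_12kV = wasted_power(12_000)   # 62.5 A -> about 1.2 x 10^4 W
loss_50kV = wasted_power(50_000)   # 15 A   -> 675 W
savings = loss_12kV - loss_50kV    # about 1.1 x 10^4 W

print(f"Loss at 12 kV: {loss_12kV:.0f} W")
print(f"Loss at 50 kV: {loss_50kV:.0f} W")
print(f"Power saved:   {savings:.0f} W")
```

Note the current is computed from the delivered power and the delivery voltage, not from the supply voltage across the wires; the wire drop itself is only I*R, a small fraction of 12,000 V.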