Why are transformers used in the National Grid?

The National Grid is the network that distributes electricity from power stations to consumers (commercial or private households). Power stations produce power in the region of megawatts (millions of watts), and we want as much of this power as possible to be transferred efficiently (with minimal losses) to consumers. There are two equations which are important to consider:

  1. Power loss in a line = current^2 x resistance of the wire (written as P = I^2 x R)

  2. Power = current x voltage (written as P = I x V)

Looking at equation 1) we see that in order to minimise power losses in the line we need to have as small a current as possible (the resistance of the wires is constant). Equation 2) tells us that for a constant power, in order to have low current we must increase the voltage.
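
As a rough numerical illustration of these two equations, the Python sketch below compares the current and the I^2 x R line loss when the same power is sent down the same cable at a low and a high voltage. The figures (100 MW transmitted, 1 ohm of cable resistance) are assumed purely for illustration and are not real grid data.

    # Illustrative comparison of line losses at two transmission voltages.
    # Assumed figures only: 100 MW sent down a cable of 1 ohm resistance.

    def line_loss(power_w, voltage_v, resistance_ohm):
        """Power lost in the cable as heat: P_loss = I^2 x R."""
        current = power_w / voltage_v   # rearranging P = I x V gives I = P / V
        return current ** 2 * resistance_ohm

    POWER = 100e6       # 100 MW transmitted (assumed)
    RESISTANCE = 1.0    # cable resistance in ohms (assumed)

    for volts in (20_000, 400_000):
        loss = line_loss(POWER, volts, RESISTANCE)
        print(f"At {volts:>7,} V: current = {POWER / volts:7,.0f} A, "
              f"power lost = {loss / 1e6:8.3f} MW")

Running this shows that at 20,000 V the cable carries 5,000 A and wastes about 25 MW as heat, whereas at 400,000 V it carries only 250 A and wastes well under 0.1 MW, which is exactly why transmission happens at high voltage.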

Step-up and step-down transformers are used to increase and decrease the voltage respectively. Once electricity is generated at the power station (~20,000 V) it is 'stepped up' to between 132,000 V and 400,000 V and transmitted via cables across the country. Mains electricity (what we use in our homes) runs at 230 V, so step-down transformers are needed to reduce the voltage to this safer level before it reaches consumers.
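
For an ideal transformer, the ratio of output to input voltage equals the ratio of turns on the secondary and primary coils (V_s / V_p = N_s / N_p, the standard transformer equation, not quoted in the answer above). A minimal sketch, using the voltages mentioned above, of the turns ratios the step-up and step-down transformers would need (in reality the step-down happens in several stages):

    # Ideal-transformer turns ratios implied by the voltages in the answer above.

    def turns_ratio(v_primary, v_secondary):
        """Secondary-to-primary turns ratio: N_s / N_p = V_s / V_p."""
        return v_secondary / v_primary

    # Step-up at the power station: ~20,000 V up to 400,000 V.
    ratio_up = turns_ratio(20_000, 400_000)
    print(f"Step-up:   secondary needs {ratio_up:.0f} times as many turns as the primary")

    # Single-step illustration of stepping down to 230 V mains.
    ratio_down = turns_ratio(400_000, 230)
    print(f"Step-down: secondary needs roughly 1 turn for every "
          f"{1 / ratio_down:.0f} turns on the primary")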

Answered by Mike R. Physics tutor
