To answer this we need to consider a circle of radius r centred at the origin, drawn in the x,y plane. Draw a right-angled triangle whose hypotenuse is a radius, and call the point where the hypotenuse meets the circle P. Let t be the angle between the hypotenuse and the positive x-axis. By definition, sin(t) is the length of the side opposite t divided by the length of the hypotenuse; in this case that's the y-coordinate of P divided by r. Multiplying through by r shows that the y-coordinate of P is r·sin(t). A similar argument for cos(t) shows that the x-coordinate of P is r·cos(t).

Next we use Pythagoras' theorem, which states that for any right-angled triangle the square of the length of the hypotenuse equals the sum of the squares of the lengths of the other two sides. Applied to our triangle this gives (r·cos(t))² + (r·sin(t))² = r². Expanding the brackets gives r²cos²(t) + r²sin²(t) = r². Dividing through by the common factor r² gives cos²(t) + sin²(t) = 1, which is the identity we were asked to prove.
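The argument above can be checked numerically; here is a minimal sketch in Python (the radius r = 2.5 and the sample angles are arbitrary choices, not anything from the proof itself):

```python
import math

def identity_value(t, r=2.5):
    # Coordinates of the point P on a circle of radius r at angle t
    x = r * math.cos(t)
    y = r * math.sin(t)
    # Pythagoras: x^2 + y^2 equals r^2 (up to floating-point error)
    assert math.isclose(x**2 + y**2, r**2)
    # Dividing through by r^2 leaves cos^2(t) + sin^2(t)
    return math.cos(t)**2 + math.sin(t)**2

# The value should be 1 for every angle, whatever the radius
for t in [0.0, 0.7, math.pi / 3, 2.0, 5.1]:
    print(math.isclose(identity_value(t), 1.0))
```

Every line printed is True, reflecting the fact that the identity holds for all angles, not just special ones.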