This is a common question, as we are often taught to differentiate simply by being told how to do the calculation, but not what it is we're really doing.
The concept
Differentiation is finding the gradient of a line. For simple problems this is easy to see using the normal trigonometric methods for finding a gradient: draw a graph of y = x and you can see that the gradient of the line is 1. Differentiate y = x and you find that dy/dx = 1, as you would hope!
However, the trigonometric methods will not work for a curve such as y = x², as the gradient is different at every point, so we have to use differentiation. In this case what differentiation is really doing is finding the gradient of the curve at any given point x. This can be understood by differentiating from first principles.
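To see that the gradient really does change from point to point, here is a minimal Python sketch (the helper name estimate_gradient and the step size h = 1e-6 are my own choices for illustration, not from the original), which estimates the chord slope of y = x² at a few values of x:

    # Estimate the gradient of y = x**2 at a point by taking
    # the slope of a chord over a tiny step h.
    def estimate_gradient(f, x, h=1e-6):
        return (f(x + h) - f(x)) / h

    f = lambda x: x**2

    for x in [1.0, 2.0, 3.0]:
        print(x, estimate_gradient(f, x))  # roughly 2.0, 4.0, 6.0

The slope comes out close to 2x at each point, which is exactly what the first-principles argument below derives.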
First Principles
To differentiate y = x² from first principles we begin by finding the gradient between the point (x, x²) and an arbitrary point (x+h, (x+h)²).
This gives us the following equation:

gradient = ((x+h)² - x²) / ((x+h) - x) = ((x+h)² - x²) / h
which simplifies to:

gradient = (x² + 2xh + h² - x²) / h = (2xh + h²) / h = 2x + h
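If you want to check that algebra by machine, a one-liner with sympy (my own suggestion, not part of the original; any computer algebra system would do) confirms the cancellation:

    # Verify that ((x+h)**2 - x**2)/h cancels to 2x + h.
    from sympy import symbols, cancel

    x, h = symbols('x h')
    print(cancel(((x + h)**2 - x**2) / h))  # h + 2*x, i.e. 2x + h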
Now for the important bit. We found the gradient between x and another point a distance h from x, but we want the gradient at x itself, so we take the limit of the expression above as h tends to 0. Picture the chord joining the two points on the curve: as h shrinks towards 0, the chord swings round to become the tangent at x, and its slope becomes the gradient at that single point.
In our case of y = x², this gives us the following answer:

dy/dx = lim(h→0) (2x + h) = 2x
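You can also watch the limit happen numerically. This short Python sketch (my own illustration, using the point x = 3 purely as an example) shrinks h and shows the chord slope closing in on 2x = 6:

    # Chord slope of y = x**2 at x = 3 as h shrinks towards 0.
    x = 3.0
    for h in [1.0, 0.1, 0.01, 0.001, 0.0001]:
        print(h, ((x + h)**2 - x**2) / h)  # 7.0, 6.1, 6.01, ... -> 6.0

Because the chord slope here is exactly 2x + h, each smaller h lands visibly closer to 6.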
This gives us exactly what we would get from the differentiation rules we already know! You can try this with almost any differentiable function and you'll find that it works, but don't try anything too complicated just yet, as sometimes that limit can be tricky to evaluate!
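If you do want to try something harder, a computer algebra system can evaluate the limit for you. A minimal sketch using sympy (again my own choice of tool, with sin(x) as an example, neither taken from the original):

    # First-principles derivative of f(x) = sin(x), evaluated symbolically.
    from sympy import symbols, sin, limit

    x, h = symbols('x h')
    print(limit((sin(x + h) - sin(x)) / h, h, 0))  # cos(x)

sympy evaluates the limit symbolically and returns cos(x), matching the standard derivative of sin(x).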