Suppose we have the well-known function sin(x) and we wish to find its values near x = 0. The Taylor series expresses the given function as an infinite series of terms of the form $a_n x^n$, where n runs from 0 to arbitrarily large integers. That is, we want to write sin(x) in the equivalent form
$$\sin(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \cdots$$
Here the coefficients $a_n$ must be determined so that the two sides are actually equal.
For our given function the following steps achieve this goal. First, if we take the first derivative of both sides of the above, we obtain
$$\cos(x) = a_1 + 2a_2 x + 3a_3 x^2 + \cdots$$
Next, if we evaluate both sides at x = 0, all terms except the first on the right-hand side vanish and we are left with
$$\cos(0) = a_1.$$
But we know that cos(0) = 1. Consequently, we have found that $a_1 = 1$. Incidentally, we also know that $a_0$ must vanish: since sin(0) = 0 and every term on the right-hand side vanishes at x = 0 except for $a_0$, the two sides can be equal only if $a_0 = 0$.
What about the other terms in the infinite series representation of sin(x)? These can be determined along the same lines. If we take two derivatives of both sides, we find
$$-\sin(x) = 2a_2 + 6a_3 x + 12a_4 x^2 + \cdots$$
Again, evaluating both sides at x = 0 shows that $a_2 = 0$. Then, taking three derivatives of both sides gives
$$-\cos(x) = 6a_3 + 24a_4 x + \cdots$$
and evaluating both sides at x = 0 yields $a_3 = -\tfrac{1}{6} = -\tfrac{1}{3!}$.
Thus far we have found the series representation of sin(x) to be
$$\sin(x) = x - \frac{x^3}{3!} + \cdots$$
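As a quick numerical sketch of the partial sums found so far (the function name `sin_taylor` and the choice of sample points are ours, assuming a plain Python environment with only the standard library):

```python
import math

def sin_taylor(x, terms=3):
    """Partial Taylor sum of sin about 0: x - x^3/3! + x^5/5! - ..."""
    total = 0.0
    for j in range(terms):
        # Only odd powers survive; signs alternate, matching the derivation above.
        total += (-1) ** j * x ** (2 * j + 1) / math.factorial(2 * j + 1)
    return total

# Near x = 0 even a few terms track math.sin closely.
for x in (0.1, 0.5, 1.0):
    print(x, sin_taylor(x), math.sin(x))
```

The further x gets from the base point 0, the more terms are needed for a given accuracy, which is why the expansion is most useful "near x = 0" as stated above.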
Now we can see the strategy. To find the kth coefficient in the series expansion of sin(x) near x = 0, we take k derivatives of both sides and evaluate at the base point x = 0. On the right-hand side this leaves only $k!\,a_k$, so the result is
$$a_k = \frac{f^{(k)}(0)}{k!},$$
where $f^{(k)}$ denotes the kth derivative of the function.
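This coefficient rule can be checked directly for sin(x), whose derivatives at 0 cycle through the values sin(0), cos(0), -sin(0), -cos(0) = 0, 1, 0, -1 (the helper names below are ours, not from the text):

```python
import math

def sin_deriv_at_0(k):
    """k-th derivative of sin at 0; the derivatives cycle sin, cos, -sin, -cos."""
    return [0.0, 1.0, 0.0, -1.0][k % 4]

def coefficient(k):
    """a_k = f^(k)(0) / k!, the rule derived in the text."""
    return sin_deriv_at_0(k) / math.factorial(k)

# The first coefficients: 0, 1, 0, -1/3!, 0, 1/5!, ...
print([coefficient(k) for k in range(6)])
```

This reproduces the values found one at a time above: a_0 = 0, a_1 = 1, a_2 = 0, a_3 = -1/3!.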
In general, we might not want the series representation of a function about the base point x = 0 but rather about a general point, say $x = x_0$. What is the form of the series representation then? The answer is
$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(x_0)}{n!}\,(x - x_0)^n.$$
This is the Taylor series for f(x) about $x = x_0$.
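The general formula can be illustrated numerically for f(x) = sin(x), using the fact that its successive derivatives cycle through sin, cos, -sin, -cos at any base point (the function name `taylor_sin_about` and the example base point pi/4 are our choices, not from the text):

```python
import math

def taylor_sin_about(x, x0, terms=10):
    """Taylor series of sin about a general base point x0."""
    total = 0.0
    for n in range(terms):
        # n-th derivative of sin evaluated at x0 (the cycle sin, cos, -sin, -cos).
        deriv = [math.sin(x0), math.cos(x0), -math.sin(x0), -math.cos(x0)][n % 4]
        total += deriv / math.factorial(n) * (x - x0) ** n
    return total

x0 = math.pi / 4  # an arbitrary base point for illustration
print(taylor_sin_about(1.0, x0), math.sin(1.0))
```

Expanding about a base point close to the x of interest makes (x - x0) small, so fewer terms are needed than when expanding about 0.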