\documentclass[12pt]{article}
\begin{document}
\section{Numerical Differentiation}
In this section we will discuss a few basic ideas in the construction of numerical
approximations for derivatives. We focus on using the Taylor series method, since
with it one can reduce the problem of generating approximations to derivatives to
algebraic manipulations. Further, with the Taylor series method one automatically gets
an indication of the order of accuracy of the particular approximation.
In no sense do we intend to present a complete treatment. What we hope the reader
comes away with is an appreciation of the origin of some common approximations, an
ability to generate such approximations for themselves in the future, and an ability
to invent other approximations as needed in some future context.
\subsection{Forward and Backward First Derivatives}
Suppose that we are given an array of values $f_i$, where $i=0,1,2,\ldots,N$. We assume that
there exists a continuous function underlying the $N+1$ values. Further, we assume that
the underlying function has as many derivatives defined as is necessary for our purposes
below. We also assume that there are $N+1$ values $x_i$ of a variable $x$. For simplicity
we will take the $x_i$ to be uniformly spaced, with $x_0 < x_1 < \ldots < x_N$
and the separation of $x_i$ and $x_{i+1}$ independent of $i$, so $dx = x_{i+1} - x_i$ for all
$i$. Having the $x_i$ non-uniformly distributed would complicate the considerations which
follow.
The problem is then to find approximate expressions for the first derivative using only
the provided data $(x_i,f_i)$.
Given our assumption that there is an underlying continuous function with as many derivatives
as needed, we make use of the familiar Taylor series expansion formula, which expresses the
function $f$ as an infinite series expansion about a given point of interest:
\begin{equation}
f\left(x\right) = f\left(x_0\right) + \left(x - x_0\right) f^{'}{\mid}_{x_0} + {1 \over 2} \left(x - x_0\right)^2 f^{''}{\mid}_{x_0}
+ {1 \over {3 !}}\left(x - x_0\right)^3 f^{'''}{\mid}_{x_0} + \ \ldots
\end{equation}
Let's evaluate $f$ at $x = x_0 \pm dx$ using the Taylor series. Simplifying slightly we find:
\begin{equation}
f\left(x_0 \pm dx \right) = f\left(x_0\right) \pm dx f^{'}{\mid}_{x_0} + {1 \over 2} dx^2 f^{''}{\mid}_{x_0}
\pm {1 \over {3 !}} dx^3 f^{'''}{\mid}_{x_0} + \ \ldots
\end{equation}
Now if we let $x_0$ be $x_i$, then $x_0 \pm dx$ corresponds to $x_{i \pm 1}$ and the Taylor series
expansion becomes that for $f_{i \pm 1}$. Note that if we subtract $f_i$ from Eq.(2) for the $i+1$ case, we have
\begin{equation}
f_{i+1} - f_{i} = dx f^{'}{\mid}_{x_i} + {1 \over 2} dx^2 f^{''}{\mid}_{x_i} + {1 \over {3 !}}dx^3 f^{'''}{\mid}_{x_i}
+ \ldots
\end{equation}
Dividing by $dx$ and solving for the first derivative at $x_i$ we obtain the
{\bf{forward difference approximation}} for the first derivative:
\begin{equation}
f^{'}{\mid}_{x_i} = {{f_{i+1} - f_{i}} \over {dx}} - {1 \over 2} dx f^{''}{\mid}_{x_i} + \ldots
\end{equation}
Note that the error in this approximation is of order $(dx)^1$ in the separation between points. Such a dependence
is said to indicate that the forward difference approximation is only ``first order accurate''. As we will see below,
one can do better than this.
Next, we give the result of performing the calculation as above except using $f_{i-1}$ and $f_i$. Then, taking
the difference between $f_i$ and $f_{i-1}$ from the Taylor series expression and simplifying a bit, we find
the {\bf{backward difference approximation}} for the first derivative:
\begin{equation}
f^{'}{\mid}_{x_i} = {{f_{i} - f_{i-1}} \over {dx}} + {1 \over 2} dx f^{''}{\mid}_{x_i} + \ldots
\end{equation}
As is clear from inspecting this result, the backward difference approximation is also only first order
accurate in $dx$.
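As a concrete check on these accuracy claims, here is a minimal numerical sketch; the test function $\sin x$, the evaluation point, and the step sizes are purely illustrative choices, not anything prescribed by the discussion above.

```python
import math

# First-order forward and backward differences, as in the two formulae
# above.  f is the underlying function, dx the uniform spacing.
def forward_diff(f, x, dx):
    return (f(x + dx) - f(x)) / dx

def backward_diff(f, x, dx):
    return (f(x) - f(x - dx)) / dx

# Illustrative test: f(x) = sin(x), whose derivative at x = 1 is cos(1).
x0, exact = 1.0, math.cos(1.0)
for dx in (0.1, 0.05, 0.025):
    err_f = abs(forward_diff(math.sin, x0, dx) - exact)
    err_b = abs(backward_diff(math.sin, x0, dx) - exact)
    print(f"dx={dx:.3f}  forward error={err_f:.2e}  backward error={err_b:.2e}")
# Halving dx roughly halves both errors, as first order accuracy predicts.
```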
Before we proceed to find more accurate approximations, let's pause to make a comment on these two formulae.
From the perspective of order of accuracy, the forward and backward difference formulae represent the simplest
and least accurate approximations that one can imagine. In practice these formulae are not used except in situations
where a quick estimate of the derivative is needed. In most applications in computational physics, the
low accuracy of these formulae makes them just not good enough for serious simulation. While the two
formulae are not widely used and thus will not get much attention from us in future discussions, here is a
good place to point out another aspect of them which will need attention when we get to higher order accurate
approximations: their one-sidedness. To illustrate the point, we can simply note that if we used the forward
difference formula we would run into trouble when we try to use it on the last point $i=N$. There is no point
and function value at $i=N+1$. So we can not compute the derivative at the rightmost point. However, we could
get a first order accurate estimate there if we used the backward difference approximation just for that last point,
since it uses values of the function at $i$ and $i-1$. A similar remark can be made about the leftmost point $x_0$.
At $i=0$ we can not use the backward formula but could use the forward formula. Hence, to the same order of
accuracy in $dx$ if we utilize both formulae we can find the derivative at all points $i=0, \ldots ,N$.
In addition, let's note that the forward and backward difference approximations would be exact if the underlying
function $f(x)$ happened to be a linear function $f(x) = a + b x$. However, in applications one does not know
the underlying function and thus can not determine, in advance, whether these forward and backward formulae
are the only ones needed. In most situations, they certainly do not suffice to capture enough about the rate
of change of the data to be very useful.
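The exactness for linear data is easy to verify directly; the following small sketch uses an arbitrary slope and intercept of our own choosing.

```python
def forward_diff(f, x, dx):
    return (f(x + dx) - f(x)) / dx

# For a linear function f(x) = a + b x, the error term involves f'',
# which vanishes identically, so the forward difference returns the
# slope b exactly (up to floating-point roundoff) for any spacing dx.
a, b = 3.0, -2.5
f = lambda x: a + b * x
for dx in (1.0, 0.1, 0.001):
    print(forward_diff(f, 2.0, dx))  # -2.5 each time, to roundoff
```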
\subsection{Centered First Derivatives}
If we take the difference between $f_{i+1}$ and $f_{i-1}$, we find that all of the even order derivative
terms in the Taylor expansion cancel out, leaving the following result:
\begin{equation}
f_{i+1} - f_{i-1} = 2 dx f^{'}{\mid}_{x_i} + { 2 \over {3!} } dx^3 f^{'''}{\mid}_{x_i} + \ldots
\end{equation}
Solving for the first derivative then yields the result:
\begin{equation}
f^{'}{\mid}_{x_i} = {{f_{i+1} - f_{i-1}} \over {2 dx}} - { 1 \over {3!} } dx^2 f^{'''}{\mid}_{x_i} + \ldots
\end{equation}
Note that this approximation is second order accurate in $dx$. It uses one point to the right and one point
to the left of the point where the derivative is evaluated. Only two values of $f$ are needed, as with the
forward and backward formulae. So, we can get better accuracy without using more data. Since the data used
are symmetrically placed relative to where the derivative is computed, this formula is called a centered
difference approximation. In practice, this formula is quite widely used. It is a workhorse of computational
physics.
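The gain in accuracy is easy to see numerically. Here is a minimal sketch in the same spirit as before; $\sin x$ and the step sizes are again just illustrative choices.

```python
import math

# Second-order centered difference: (f(x+dx) - f(x-dx)) / (2 dx).
def centered_diff(f, x, dx):
    return (f(x + dx) - f(x - dx)) / (2 * dx)

# f(x) = sin(x), f'(1) = cos(1).
x0, exact = 1.0, math.cos(1.0)
for dx in (0.1, 0.05, 0.025):
    err = abs(centered_diff(math.sin, x0, dx) - exact)
    print(f"dx={dx:.3f}  error={err:.2e}")
# Halving dx now cuts the error by a factor of about four: second order.
```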
As with the forward and backward formulae, we note that the centered difference formula would be exact if
the underlying function had a vanishing third derivative. This implies that the function would necessarily
have to be a quadratic function $f(x) = a + b x + c x^2$. While we do not know that the function has this
property, our result above for the centered difference approximation indicates that approximating an unknown
function with a quadratic is better by a whole order of accuracy. This is not very surprising.
Surely, the more terms in the Taylor series we include the better we should be able to capture the behavior of
the function.
Our remarks above about the problem with the forward and backward formulae at the points $i=0$ and $i=N$
also apply to this centered difference approximation: At $i=0$ one needs $f_{-1}$, while at $i=N$ one needs
$f_{N+1}$, neither of which exists as part of the original data. What to do? Well, we can develop one-sided
second order accurate formulae without too much trouble.
For example, let's construct a second order expression for the first derivative at $x_0$ using only data
with $i > 0$. For this purpose we record the Taylor expansion at $i=1,2$ with base point at $i=0$. We
have
\begin{equation}
f_1 = f_0 + dx f^{'}{\mid}_{x_0} + {1 \over 2} dx^2 f^{''}{\mid}_{x_0}
+ {1 \over 6} dx^3 f^{'''}{\mid}_{x_0} + \ \ldots
\end{equation}
\begin{equation}
f_2 = f_0 + 2 dx f^{'}{\mid}_{x_0} + 2 dx^2 f^{''}{\mid}_{x_0}
+ {8 \over 6} dx^3 f^{'''}{\mid}_{x_0} + \ \ldots
\end{equation}
Then, by inspecting these two expansions we easily see that the following combination
results in the second derivative terms cancelling out.
\begin{equation}
4 f_1 - f_2 = 3 f_0 + 2 dx f^{'}{\mid}_{x_0} - {2 \over 3} dx^3 f^{'''}{\mid}_{x_0} + \ldots
\end{equation}
Solving for the first derivative we find
\begin{equation}
f^{'}{\mid}_{x_0} = {{4 f_1 -3 f_0 - f_2} \over { 2 dx }} + {1 \over 3} dx^2 f^{'''}{\mid}_{x_0} + \ldots
\end{equation}
Thus we have constructed a second order accurate derivative approximation which is defined at the leftmost
end point of the data. The same method could be used to generate a second order accurate derivative
approximation at the rightmost end of the given data.
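A quick numerical check of this one-sided formula, in the same illustrative setting as the earlier sketches ($\sin x$, with the derivative evaluated at the left end of each three-point stencil):

```python
import math

# Second-order one-sided first derivative at the leftmost point, using
# the two neighbors to the right: (4 f_1 - 3 f_0 - f_2) / (2 dx).
def one_sided_diff(f, x, dx):
    return (4 * f(x + dx) - 3 * f(x) - f(x + 2 * dx)) / (2 * dx)

x0, exact = 1.0, math.cos(1.0)
for dx in (0.02, 0.01, 0.005):
    err = abs(one_sided_diff(math.sin, x0, dx) - exact)
    print(f"dx={dx:.3f}  error={err:.2e}")
# The error again falls by roughly a factor of four as dx is halved,
# confirming second order accuracy despite the one-sidedness.
```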
\subsection{A Centered Second Derivative}
In our algebraic manipulations thus far of the Taylor series we have endeavored to obtain the first
derivative and did what we had to do to eliminate the second derivative terms when we wanted a final
result of second order. Here we simply notice that if we take the sum rather than the difference between
$f_{i+1}$ and $f_{i-1}$ we can solve for the second derivative. We have
\begin{equation}
f_{i+1} + f_{i-1} = 2 f_i + dx^2 f^{''}{\mid}_{x_i} + {1 \over 12} dx^4 f^{''''}{\mid}_{x_i} + \ldots
\end{equation}
Finally, solving for the second derivative we find
\begin{equation}
f^{''}{\mid}_{x_i} = {{f_{i+1} -2 f_i + f_{i-1} } \over { dx^2 }} - {1 \over 12} dx^2 f^{''''}{\mid}_{x_i} + \ldots
\end{equation}
This result is another of those which are widely used in computational physics simulations. Note that one needs
three data values to obtain this second derivative. The centered character of the approximation has the
benefit that it does not build in any directional bias. Of course, should one need a second derivative at one
of the end points, one would have to find a combination of interior values which cancels the first derivative
terms (and, for second order accuracy, the third derivative terms as well). We leave such a construction as an
exercise for the reader.
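As with the first derivative formulae, the claimed second order accuracy of this three-point second derivative can be confirmed numerically; the sketch below again uses $\sin x$, for which $f''(1) = -\sin 1$.

```python
import math

# Centered second derivative: (f(x+dx) - 2 f(x) + f(x-dx)) / dx^2,
# second order accurate per the error term derived above.
def centered_second(f, x, dx):
    return (f(x + dx) - 2 * f(x) + f(x - dx)) / dx**2

x0, exact = 1.0, -math.sin(1.0)
for dx in (0.1, 0.05, 0.025):
    err = abs(centered_second(math.sin, x0, dx) - exact)
    print(f"dx={dx:.3f}  error={err:.2e}")
# Halving dx reduces the error by about a factor of four.
```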
\end{document}