The derivative $f'(x)$ describes the instantaneous rate of change of the function $f$ at the point $x$. Now the derivative function $f'$ can be differentiated again, provided that it is itself differentiable. The derivative of the derivative obtained in this way is called the second derivative or derivative of second order and is denoted $f''$ or $f^{(2)}$. This can be done arbitrarily often: if the second derivative is again differentiable, a third derivative $f'''$ can be formed, then a fourth derivative $f^{(4)}$, and so on.
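To see these repeated derivatives in action, here is a minimal sympy sketch; the function $f(x) = x^3 e^x$ is a made-up sample, not one from the text:

```python
import sympy as sp

x = sp.symbols("x")

# Made-up sample function; sp.diff(f, x, n) returns the n-th derivative f^(n).
f = x**3 * sp.exp(x)
for n in range(1, 5):
    print(n, sp.diff(f, x, n))
```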
These higher derivatives allow statements about the course of a function graph. The second derivative tells us whether a graph is curved upwards ("convex") or curved downwards ("concave"). If a function has a convex graph, its slope increases monotonically. A sufficient condition for this convexity is $f''(x) \geq 0$ for all $x$: if the second derivative is always non-negative, then the first derivative must grow monotonically. Analogously, it follows from $f''(x) \leq 0$ for all $x$ that the graph is concave and the derivative falls monotonically.
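As a small illustration of this criterion, the following sketch uses a made-up sample function, computes its second derivative, and checks that it is positive everywhere:

```python
import sympy as sp

x = sp.symbols("x", real=True)

# Made-up sample: f(x) = x^4 + x^2. Its second derivative is positive everywhere,
# so the graph of f is convex and f' increases monotonically.
f = x**4 + x**2
f2 = sp.diff(f, x, 2)
print(f2)                 # 12*x**2 + 2
print(f2.subs(x, 0))      # 2, the minimum value, so f'' > 0 on all of R
```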
Higher-order derivatives do not only tell us more about abstract functions; they can also have a physical meaning. Consider a function $s\colon \mathbb{R}_{\geq 0} \to \mathbb{R}$, $t \mapsto s(t)$, which shall describe the position $s(t)$ of a car at time $t$. We already know that we can calculate the speed of the car at time $t$ with the first derivative: $v(t) = s'(t)$. What does the derivative $v'(t) = s''(t)$ of $v$ tell us? This is the instantaneous rate of change of the speed and thus the acceleration of the car. It accelerates with $a(t) = v'(t) = s''(t)$. So second derivatives describe accelerations.

Now we can differentiate this second derivative again, whereby we get the rate of change of the acceleration $a'(t) = s'''(t)$. This is called jerk in vehicle dynamics and indicates how fast a car builds up acceleration or how fast it initiates braking. For example, a big jerk occurs during emergency braking. Since $a'(t) = s'''(t) < 0$ during an emergency stop, the graph of the speed $v$ is concave: the speed decreases more and more steeply. A vanishing fourth derivative $s^{(4)}(t) = 0$ would in turn tell us that the jerk has no instantaneous rate of change, i.e. that it stays constant.
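A minimal sketch of this chain position -> velocity -> acceleration -> jerk, using sympy and a hypothetical motion profile $s(t)$ (the concrete numbers are assumptions, not from the text):

```python
import sympy as sp

t = sp.symbols("t", nonnegative=True)

# Hypothetical motion profile s(t) (made-up numbers), to show the chain
# position -> velocity -> acceleration -> jerk via repeated differentiation.
s = 2*t**3 + 5*t

v = sp.diff(s, t)        # velocity      v(t) = s'(t)   = 6*t**2 + 5
a = sp.diff(s, t, 2)     # acceleration  a(t) = s''(t)  = 12*t
j = sp.diff(s, t, 3)     # jerk          j(t) = s'''(t) = 12
print(v, a, j)
print(sp.diff(s, t, 4))  # 0: for this profile the jerk is constant
```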
The set of all $n$ times continuously differentiable functions with domain of definition $D$ and range $W$ is denoted by $C^n(D, W)$. In particular, $C^0(D, W)$ consists of the continuous functions. If we can differentiate the function $f$ arbitrarily often, we write $f \in C^\infty(D, W)$. If $W = \mathbb{R}$, then we can write $C^n(D)$ or $C^\infty(D)$ for short. These sets of functions satisfy the inclusion chain:

$$C^\infty(D) \subseteq \dots \subseteq C^2(D) \subseteq C^1(D) \subseteq C^0(D)$$
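One way to see that these inclusions are strict is a function that is once but not twice continuously differentiable. The following sketch uses the standard example $f(x) = x\,|x|$, an illustration chosen here rather than one taken from the text:

```python
import sympy as sp

x, h = sp.symbols("x h", real=True)

# Illustration chosen here: f(x) = x*|x| lies in C^1(R) but not in C^2(R),
# so the inclusions in the chain above are strict.
f = x * sp.Abs(x)
f1 = sp.diff(f, x)                                  # equals 2*|x|, continuous on R, f'(0) = 0

# One-sided difference quotients of f' at 0 disagree, so f''(0) does not exist.
print(sp.limit(f1.subs(x, h) / h, h, 0, dir="+"))   # 2
print(sp.limit(f1.subs(x, h) / h, h, 0, dir="-"))   # -2
```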
Example (Derivatives of the power function)

We consider the function $f\colon \mathbb{R} \to \mathbb{R},\ f(x) = x^3$. This function is infinitely often differentiable, since for all $x \in \mathbb{R}$ and all $n \in \mathbb{N}$ the $n$-th derivative exists:

$$f'(x) = 3x^2, \quad f''(x) = 6x, \quad f'''(x) = 6, \quad f^{(n)}(x) = 0 \text{ for } n \geq 4$$

In general, for $f(x) = x^k$ with $k \in \mathbb{N}$ there is:

$$f^{(n)}(x) = \begin{cases} \dfrac{k!}{(k-n)!}\, x^{k-n} & \text{for } n \leq k \\ 0 & \text{for } n > k \end{cases}$$
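A quick sympy check of this general formula, for one made-up choice of exponent and derivative order:

```python
import sympy as sp

x = sp.symbols("x")
k, n = 7, 3   # made-up sample exponent and derivative order

lhs = sp.diff(x**k, x, n)
rhs = sp.factorial(k) / sp.factorial(k - n) * x**(k - n)
print(sp.simplify(lhs - rhs))   # 0: the formula k!/(k-n)! * x^(k-n) matches
print(sp.diff(x**k, x, k + 1))  # 0: derivatives of order greater than k vanish
```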
Example (Derivatives of the sine function)

The function $\sin\colon \mathbb{R} \to \mathbb{R}$ is infinitely often continuously differentiable. For all $x \in \mathbb{R}$ there is:

$$\sin'(x) = \cos(x), \quad \sin''(x) = -\sin(x), \quad \sin'''(x) = -\cos(x), \quad \sin^{(4)}(x) = \sin(x)$$

In general, for all $n \in \mathbb{N}_0$ there is:

$$\sin^{(4n)} = \sin, \quad \sin^{(4n+1)} = \cos, \quad \sin^{(4n+2)} = -\sin, \quad \sin^{(4n+3)} = -\cos$$
Question: What are the derivatives of $\cos$?

We use that $\cos = \sin'$. For $n \in \mathbb{N}_0$ there is

$$\cos^{(n)} = \left(\sin'\right)^{(n)} = \sin^{(n+1)}$$

In general, for all $n \in \mathbb{N}_0$ there is:

$$\cos^{(4n)} = \cos, \quad \cos^{(4n+1)} = -\sin, \quad \cos^{(4n+2)} = -\cos, \quad \cos^{(4n+3)} = \sin$$
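The four-step cycle for sine and cosine can be checked with sympy. The closed forms $\sin^{(n)}(x) = \sin(x + n\pi/2)$ and $\cos^{(n)}(x) = \cos(x + n\pi/2)$ used below are equivalent restatements of the pattern above:

```python
import sympy as sp

x = sp.symbols("x")

# The derivatives of sine repeat with period 4; equivalently sin^(n)(x) = sin(x + n*pi/2).
for n in range(8):
    d = sp.diff(sp.sin(x), x, n)
    print(n, d, sp.simplify(d - sp.sin(x + n*sp.pi/2)) == 0)

# The same pattern for cosine: cos^(n)(x) = cos(x + n*pi/2).
print(sp.simplify(sp.diff(sp.cos(x), x, 5) - sp.cos(x + 5*sp.pi/2)) == 0)
```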
Additional question: Is $f'$ continuous at $0$?

No. Take two sequences $(x_n)_{n \in \mathbb{N}}$ and $(y_n)_{n \in \mathbb{N}}$ with

$$\lim_{n \to \infty} x_n = 0 = \lim_{n \to \infty} y_n$$

for which the image sequences under $f'$ converge to different values:

$$\lim_{n \to \infty} f'(x_n) \neq \lim_{n \to \infty} f'(y_n)$$

So $\lim_{x \to 0} f'(x)$ doesn't exist. By means of the sequence criterion, $f'$ is hence not continuous at $0$.

Remark: Therefore, $f'$ is also not differentiable at $0$.
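For concreteness, a classic function with exactly this behaviour is $f(x) = x^2 \sin(1/x)$ for $x \neq 0$ and $f(0) = 0$; whether this matches the function from the exercise above is an assumption here, but the sketch shows how two such sequences can look numerically:

```python
import math

# Hypothetical concrete instance (the classic example): f(x) = x^2*sin(1/x) for x != 0
# and f(0) = 0, which gives f'(0) = 0 and f'(x) = 2*x*sin(1/x) - cos(1/x) for x != 0.
def f_prime(x):
    return 2*x*math.sin(1/x) - math.cos(1/x)

for n in range(1, 6):
    x_n = 1 / (2*math.pi*n)               # x_n -> 0 and f'(x_n) -> -1
    y_n = 1 / (2*math.pi*n + math.pi/2)   # y_n -> 0 and f'(y_n) -> 0
    print(round(f_prime(x_n), 4), round(f_prime(y_n), 4))
```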
The linearity of derivatives is also "inherited" by higher derivatives: If $f$ and $g$ are differentiable, then for $a, b \in \mathbb{R}$ the function $af + bg$ is also differentiable with

$$(af + bg)' = af' + bg'$$

If $f$ and $g$ are now even twice differentiable, then there is

$$(af + bg)'' = \left((af + bg)'\right)' = (af' + bg')' = af'' + bg''$$

If we continue like this, we get

$$(af + bg)^{(n)} = af^{(n)} + bg^{(n)}$$
Example (Linearity of higher derivatives)

Since $\sin^{(4)} = \sin$ and $\cos^{(4)} = \cos$, for $f = a\sin + b\cos$ with $a, b \in \mathbb{R}$ there is

$$f^{(4)} = a\sin^{(4)} + b\cos^{(4)} = a\sin + b\cos = f$$
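A quick sympy check of this linearity, for one made-up pair of functions and derivative order:

```python
import sympy as sp

x, a, b = sp.symbols("x a b")
f, g = sp.sin(x), sp.exp(2*x)   # made-up sample functions
n = 5                           # made-up derivative order

lhs = sp.diff(a*f + b*g, x, n)
rhs = a*sp.diff(f, x, n) + b*sp.diff(g, x, n)
print(sp.simplify(lhs - rhs))   # 0: (a*f + b*g)^(n) = a*f^(n) + b*g^(n)
```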
Proof (Linearity of higher derivatives)

Theorem whose validity shall be proven for all $n \in \mathbb{N}$: $(af + bg)^{(n)} = af^{(n)} + bg^{(n)}$

1. Base case: $n = 1$. This is exactly the linearity of the first derivative: $(af + bg)' = af' + bg'$.

2. Inductive step:

2a. Inductive hypothesis: $(af + bg)^{(n)} = af^{(n)} + bg^{(n)}$

2b. Induction claim: $(af + bg)^{(n+1)} = af^{(n+1)} + bg^{(n+1)}$

2c. Proof of induction step:

$$(af + bg)^{(n+1)} = \left((af + bg)^{(n)}\right)' \overset{\text{hypothesis}}{=} \left(af^{(n)} + bg^{(n)}\right)' = af^{(n+1)} + bg^{(n+1)}$$
We now try to determine a general formula for the $n$-th derivative of the product function $fg$ of two arbitrarily often differentiable functions $f$ and $g$. By applying the factor, sum and product rules several times we obtain for the first few derivatives

$$(fg)' = f'g + fg'$$
$$(fg)'' = f''g + 2f'g' + fg''$$
$$(fg)''' = f'''g + 3f''g' + 3f'g'' + fg'''$$

If we plug in $a$ and $b$ for $f$ and $g$, and write the corresponding powers of $a$ and $b$ instead of the derivatives of $f$ and $g$, we see a clear analogy to the binomial theorem:

$$(a + b)^1 = a + b$$
$$(a + b)^2 = a^2 + 2ab + b^2$$
$$(a + b)^3 = a^3 + 3a^2b + 3ab^2 + b^3$$
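These expansions, with their binomial-like coefficients 1, 2, 1 and 1, 3, 3, 1, can be reproduced with sympy using undefined functions $f$ and $g$ as a small sanity check:

```python
import sympy as sp

x = sp.symbols("x")
f = sp.Function("f")(x)
g = sp.Function("g")(x)

# Repeated product rule with undefined functions: the coefficients 1, 2, 1 and
# 1, 3, 3, 1 mirror the binomial expansions of (a + b)^2 and (a + b)^3.
print(sp.expand(sp.diff(f*g, x, 2)))
print(sp.expand(sp.diff(f*g, x, 3)))
```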
This analogy can be made clear as follows: We assign for every $k \in \mathbb{N}_0$ the derivative $f^{(k)}$ to the power $a^k$, and the derivative $g^{(k)}$ to the power $b^k$. The $n$-th derivative $(fg)^{(n)}$ then corresponds to the $n$-th power $(a + b)^n$. The derivative of the term $f^{(k)}g^{(l)}$ is by means of the product rule

$$\left(f^{(k)}g^{(l)}\right)' = f^{(k+1)}g^{(l)} + f^{(k)}g^{(l+1)}$$

The expression $f^{(k+1)}g^{(l)} + f^{(k)}g^{(l+1)}$ now corresponds in our analogy to the sum $a^{k+1}b^{l} + a^{k}b^{l+1}$. We get this term from $a^{k}b^{l}$ by multiplication with $(a + b)$. For our polynomials, the distributive law yields

$$a^{k}b^{l}\,(a + b) = a^{k+1}b^{l} + a^{k}b^{l+1}$$

Therefore, the application of the product rule corresponds to the multiplication with the sum $(a + b)$. Thus the $n$-th derivative $(fg)^{(n)}$ corresponds to the power $(a + b)^n$. From the binomial theorem

$$(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^{k}b^{n-k}$$

we hence get the Leibniz rule:

$$(fg)^{(n)} = \sum_{k=0}^{n} \binom{n}{k} f^{(k)}g^{(n-k)}$$
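A sketch that verifies the Leibniz rule symbolically for one sample order $n$ (the value $n = 4$ is an arbitrary choice):

```python
import sympy as sp

x = sp.symbols("x")
f = sp.Function("f")(x)
g = sp.Function("g")(x)
n = 4   # arbitrary sample order

lhs = sp.diff(f*g, x, n)
rhs = sum(sp.binomial(n, k) * sp.diff(f, x, k) * sp.diff(g, x, n - k) for k in range(n + 1))
print(sp.simplify(lhs - rhs))   # 0: the Leibniz rule holds for this n
```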
Proof (Leibniz rule for derivatives)

Theorem whose validity shall be proven for all $n \in \mathbb{N}$: $(fg)^{(n)} = \sum_{k=0}^{n} \binom{n}{k} f^{(k)}g^{(n-k)}$

1. Base case: $n = 1$. The product rule gives $(fg)' = f'g + fg' = \sum_{k=0}^{1} \binom{1}{k} f^{(k)}g^{(1-k)}$.

2. Inductive step:

2a. Inductive hypothesis: $(fg)^{(n)} = \sum_{k=0}^{n} \binom{n}{k} f^{(k)}g^{(n-k)}$

2b. Induction claim: $(fg)^{(n+1)} = \sum_{k=0}^{n+1} \binom{n+1}{k} f^{(k)}g^{(n+1-k)}$

2c. Proof of induction step:

$$\begin{aligned}
(fg)^{(n+1)} &= \left((fg)^{(n)}\right)' \overset{\text{hypothesis}}{=} \left(\sum_{k=0}^{n} \binom{n}{k} f^{(k)}g^{(n-k)}\right)' \\
&= \sum_{k=0}^{n} \binom{n}{k} \left(f^{(k+1)}g^{(n-k)} + f^{(k)}g^{(n-k+1)}\right) \\
&= \sum_{k=1}^{n+1} \binom{n}{k-1} f^{(k)}g^{(n+1-k)} + \sum_{k=0}^{n} \binom{n}{k} f^{(k)}g^{(n+1-k)} \\
&= f^{(n+1)}g + \sum_{k=1}^{n} \left(\binom{n}{k-1} + \binom{n}{k}\right) f^{(k)}g^{(n+1-k)} + fg^{(n+1)} \\
&= \sum_{k=0}^{n+1} \binom{n+1}{k} f^{(k)}g^{(n+1-k)}
\end{aligned}$$

In the last step we used the identity $\binom{n}{k-1} + \binom{n}{k} = \binom{n+1}{k}$.