Theorem 4.3:
Let
be a tempered distribution. Then the restriction of
to bump functions is a distribution.
Proof:
Let
be a tempered distribution, and let
be open.
1.
We show that
has a well-defined value for
.
Due to theorem 3.9, every bump function is a Schwartz function, which is why the expression

makes sense for every
.
2.
We show that the restriction is linear.
Let
and
. Since, by theorem 3.9,
and
are Schwartz functions as well, we have

due to the linearity of
for all Schwartz functions. Thus
is also linear for bump functions.
3.
We show that the restriction of
to
is sequentially continuous. Let
in the notion of convergence of bump functions. Due to theorem 3.11,
in the notion of convergence of Schwartz functions. Since
as a tempered distribution is sequentially continuous,
.
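In symbols, writing $\mathcal{T}$ for the tempered distribution, $\mathcal{S}(\mathbb{R}^d)$ for the Schwartz functions and $\mathcal{D}(O)$ for the bump functions on an open set $O \subseteq \mathbb{R}^d$ (this notation is an assumption made here for concreteness), the three steps amount to
\[
\mathcal{T}(\varphi) \text{ exists for } \varphi \in \mathcal{D}(O) \subseteq \mathcal{S}(\mathbb{R}^d), \qquad
\mathcal{T}(a\varphi + b\vartheta) = a\,\mathcal{T}(\varphi) + b\,\mathcal{T}(\vartheta), \qquad
\varphi_n \to \varphi \text{ in } \mathcal{D}(O) \;\Rightarrow\; \mathcal{T}(\varphi_n) \to \mathcal{T}(\varphi),
\]
which is exactly what it means for the restriction to be a distribution.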
The convolution of two functions may not always exist, but there are sufficient conditions for it to exist:
Theorem 4.5:
Let
such that
and let
and
. Then for all
, the integral

has a well-defined real value.
Proof:
Due to Hölder's inequality,
.
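In a sketch, assuming the usual setting of the theorem, namely $f \in L^p(\mathbb{R}^d)$ and $g \in L^q(\mathbb{R}^d)$ with conjugate exponents $\tfrac{1}{p} + \tfrac{1}{q} = 1$, the estimate reads
\[
\int_{\mathbb{R}^d} |f(y)\, g(x - y)|\, \mathrm{d}y \;\le\; \|f\|_{L^p}\, \|g(x - \cdot)\|_{L^q} \;=\; \|f\|_{L^p}\, \|g\|_{L^q} \;<\; \infty ,
\]
where the last equality uses that the $L^q$-norm is invariant under the reflection and translation $y \mapsto x - y$.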
We shall now prove that the convolution is commutative, i.e.
.
Proof:
We apply multi-dimensional integration by substitution using the diffeomorphism
to obtain
.
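Concretely, with the substitution $z = x - y$ (a sketch of the computation, assuming the convolution is written as an integral over $\mathbb{R}^d$):
\[
(f * g)(x) = \int_{\mathbb{R}^d} f(y)\, g(x - y)\, \mathrm{d}y
= \int_{\mathbb{R}^d} f(x - z)\, g(z)\, \mathrm{d}z
= (g * f)(x),
\]
where the absolute value of the Jacobian determinant of $y \mapsto x - y$ equals $1$.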
Lemma 4.7:
Let
be open and let
. Then
.
Proof:
Let
be arbitrary. Then, since for all

and further
,
Leibniz' integral rule (theorem 2.2) is applicable, and by repeated application of Leibniz' integral rule we obtain
.
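The repeated application of Leibniz' integral rule can be summarised as follows (a sketch under the assumption that the lemma concerns the convolution of a suitable function $f$ with a bump function $\varphi$; the names are placeholders):
\[
\partial_\alpha (f * \varphi)(x)
= \int_{\mathbb{R}^d} f(y)\, (\partial_\alpha \varphi)(x - y)\, \mathrm{d}y
= (f * \partial_\alpha \varphi)(x)
\quad \text{for every multiindex } \alpha ,
\]
so every derivative of the convolution exists and is again a convolution with a bump function.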
In this section, we briefly study a class of distributions which we call regular distributions. In particular, we will see that for certain kinds of functions there exist corresponding distributions.
Two questions related to this definition could be asked: Given a function
, is
for
open given by

well-defined and a distribution? Or is
given by

well-defined and a tempered distribution? In general, the answer to these two questions is no, but both questions can be answered with yes if the respective function
has the right properties, as the following two theorems show. But before we state the first theorem, we have to define what local integrability means, because in the case of bump functions, local integrability will be exactly the property which
needs in order to define a corresponding regular distribution:
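In its standard form (stated here for orientation; measurability of $f$ is tacitly assumed), local integrability means
\[
f \in L^1_{\mathrm{loc}}(O) \quad :\Longleftrightarrow \quad \int_K |f(x)|\, \mathrm{d}x < \infty \ \text{ for every compact } K \subseteq O .
\]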
Now we are ready to give some sufficient conditions on
to define a corresponding regular distribution or regular tempered distribution by way of

or
:
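In a sketch (the symbol $\mathcal{T}_f$ and the integral pairing are assumptions made here for concreteness), the two assignments in question are
\[
\mathcal{T}_f(\varphi) := \int_O f(x)\, \varphi(x)\, \mathrm{d}x \quad \text{for bump functions } \varphi \in \mathcal{D}(O),
\qquad
\mathcal{T}_f(\phi) := \int_{\mathbb{R}^d} f(x)\, \phi(x)\, \mathrm{d}x \quad \text{for Schwartz functions } \phi \in \mathcal{S}(\mathbb{R}^d).
\]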
Theorem 4.11:
Let
be open, and let
be a function. Then

is a regular distribution iff
.
Proof:
1.
We show that if
, then
is a distribution.
Well-definedness follows from the triangle inequality for integrals and the monotonicity of the integral:

In order to have an absolute value strictly less than infinity, the first integral must have a well-defined value in the first place. Therefore,
really maps to
and well-definedness is proven.
Continuity follows similarly due to

, where
is the compact set in which all the supports of
and
are contained (recall that the existence of a compact set such that all the supports of
are contained in it is part of the definition of convergence in
; see the last chapter. As in the proof of theorem 3.11, we conclude that the support of
is also contained in
).
Linearity follows due to the linearity of the integral.
2.
We show that if
is a distribution, then
(in fact, we even show that if
has a well-defined real value for every
, then
). Therefore, by part 1 of this proof, which showed that if
it follows that
is a distribution in
, we have that if
is a well-defined real number for every
, then
is a distribution in
.
Let
be an arbitrary compact set. We define

is continuous, even Lipschitz continuous with Lipschitz constant
: Let
. Due to the triangle inequality, both

and

, which can be seen by applying the triangle inequality twice.
We choose sequences
and
in
such that
and
and consider two cases. First, we consider what happens if
. Then we have
.
Second, we consider what happens if
:

Since always either
or
, we have proven Lipschitz continuity and thus continuity. By the extreme value theorem,
therefore has a minimum
. Since
would mean that
for a sequence
in
, which is a contradiction as
is closed and
, we have
.
Hence, if we define
, then
. Further, the function

has support contained in
, is equal to
within
and is contained in
due to lemma 4.7. Hence, it is also contained in
. Therefore, by the monotonicity of the integral

,
is indeed locally integrable.
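The forward direction of the proof can be compressed into a single estimate (a sketch in the notation of the preceding remarks, where $K$ denotes a compact set containing the support of the bump function $\varphi$):
\[
|\mathcal{T}_f(\varphi)| = \Big| \int_O f(x)\, \varphi(x)\, \mathrm{d}x \Big|
\le \sup_{x \in O} |\varphi(x)| \int_K |f(x)|\, \mathrm{d}x < \infty ,
\]
and the same bound, applied to the differences $\varphi_n - \varphi$, yields the sequential continuity.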
Theorem 4.12:
Let
, i.e.

Then

is a regular tempered distribution.
Proof:
From Hölder's inequality we obtain
.
Hence,
is well-defined.
Due to the triangle inequality for integrals and Hölder's inequality, we have

Furthermore,
.
If
in the notion of convergence of the Schwartz function space, then this expression goes to zero. Therefore, continuity is verified.
Linearity follows from the linearity of the integral.
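The well-definedness estimate can be sketched as follows, assuming $f \in L^p(\mathbb{R}^d)$, $\phi \in \mathcal{S}(\mathbb{R}^d)$ and $q$ the exponent conjugate to $p$ (Schwartz functions lie in every $L^q$ space):
\[
|\mathcal{T}_f(\phi)| = \Big| \int_{\mathbb{R}^d} f(x)\, \phi(x)\, \mathrm{d}x \Big|
\le \int_{\mathbb{R}^d} |f(x)\, \phi(x)|\, \mathrm{d}x
\le \|f\|_{L^p}\, \|\phi\|_{L^q} < \infty .
\]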
We now introduce the concept of equicontinuity.
So equicontinuity is in fact defined for sets of continuous functions mapping from
(a set in a metric space) to the real numbers
.
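Written out (in a standard formulation; the names $X$ for the metric space subset and $\mathcal{F}$ for the set of functions are placeholders), a set $\mathcal{F}$ of functions $f \colon X \to \mathbb{R}$ is equicontinuous if and only if
\[
\forall \varepsilon > 0 \ \exists \delta > 0 \ \forall f \in \mathcal{F} \ \forall x, y \in X : \quad d(x, y) < \delta \ \Longrightarrow \ |f(x) - f(y)| < \varepsilon .
\]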
Proof:
In order to prove uniform convergence, by definition we must prove that for all
, there exists an
such that for all
.
So let us assume the contrary, which, after negating the logical statement, means
.
We choose a sequence
in
. We take
in
such that
for an arbitrarily chosen
, and if we have already chosen
and
for all
, we choose
such that
, where
is greater than
.
As
is sequentially compact, there is a convergent subsequence
of
. Let us call the limit of that subsequence
.
As
is equicontinuous, we can choose
such that
.
Further, since
(if
of course), we may choose
such that
.
But then, for
and by the reverse triangle inequality, it follows that:

Since we had
, the reverse triangle inequality and the definition of the

, we obtain:

Thus we have a contradiction to
.
Proof: We have to prove equicontinuity, so we have to prove
.
Let
be arbitrary.
We choose
.
Let
such that
, and let
be arbitrary. By the mean-value theorem in multiple dimensions, we obtain that there exists a
such that:

The element
is inside
, because
is convex. The Cauchy–Schwarz inequality then gives:
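(A sketch of the chain of estimates, assuming as above a common bound $c > 0$ on the gradient norms of all functions in the family and writing $\xi$ for the intermediate point supplied by the mean value theorem.)
\[
|f(x) - f(y)| = |\nabla f(\xi) \cdot (x - y)| \le \|\nabla f(\xi)\|\, \|x - y\| \le c\, \|x - y\| < \varepsilon
\quad \text{whenever } \|x - y\| < \delta := \tfrac{\varepsilon}{c} .
\]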


Definition 4.16:
If
are two
-dimensional multiindices, we define the binomial coefficient of
over
as
.
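A common way of writing this definition (assumed here), with the coefficient taken componentwise over the $d$ entries, is
\[
\binom{\alpha}{\beta} := \prod_{i=1}^{d} \binom{\alpha_i}{\beta_i} .
\]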
We also define a less-than-or-equal relation on the set of multiindices.
Definition 4.17:
Let
be two
-dimensional multiindices. We define
to be less than or equal to
if and only if
.
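In the usual componentwise form (assumed here), this reads
\[
\beta \le \alpha \quad :\Longleftrightarrow \quad \beta_i \le \alpha_i \ \text{ for all } i \in \{1, \dots, d\} .
\]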
For
, there are vectors
such that neither
nor
. For
, the following two vectors are an example of this:

This example can be generalised to higher dimensions (see exercise 6).
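A concrete pair of this kind in dimension $d = 2$ (one possible choice) is
\[
\alpha = (1, 0), \qquad \beta = (0, 1),
\]
since $\alpha \not\le \beta$ because of the first components and $\beta \not\le \alpha$ because of the second components.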
With these multiindex definitions, we are able to write down a more general version of the product rule. But in order to prove it, we need another lemma.
Lemma 4.18:
If
and
, where the
is at the
-th place, we have

for arbitrary multiindices
.
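With the componentwise definition of the binomial coefficient, the identity in question can be sketched as
\[
\binom{\alpha + e_i}{\beta} = \binom{\alpha}{\beta} + \binom{\alpha}{\beta - e_i} ,
\]
where $e_i$ denotes the multiindex having a $1$ in the $i$-th place and $0$ everywhere else, and where a binomial coefficient with a negative entry is read as $0$ (these conventions are assumptions made here).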
Proof:
For the ordinary binomial coefficients for natural numbers, we had the formula
.
Therefore,


This is the general product rule:
Theorem 4.19:
Let
and let
. Then

Proof:
We prove the claim by induction on
.
1.
We start with the induction base
. Then the formula just reads

, and this is true. Therefore, we have completed the induction base.
2.
Next, we do the induction step. Let's assume the claim is true for all
such that
. Now let
such that
. Let's choose
such that
(we may do this because
). We define again
, where the
is at the
-th place. Due to Schwarz' theorem and the ordinary product rule, we have
.
By linearity of derivatives and induction hypothesis, we have
.
Since

and
,
we are allowed to shift indices in the first of the two above sums, and furthermore we have by definition
.
With this, we obtain

Due to lemma 4.18,
.
Further, we have
where
in
,
and

(these two rules may be checked from the definition of
). It follows
.
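As a concrete check of the rule just proven, assume it takes the usual Leibniz form $\partial_\alpha (fg) = \sum_{\beta \le \alpha} \binom{\alpha}{\beta}\, \partial_\beta f\; \partial_{\alpha - \beta} g$ and take $d = 2$, $\alpha = (1,1)$:
\[
\partial_{(1,1)} (fg) = f\, \partial_{(1,1)} g + \partial_{(1,0)} f\; \partial_{(0,1)} g + \partial_{(0,1)} f\; \partial_{(1,0)} g + \big(\partial_{(1,1)} f\big)\, g ,
\]
where all four binomial coefficients $\binom{(1,1)}{\beta}$ are equal to $1$.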
For
there are operations such as the differentiation of
, the convolution of
and
and the multiplication of
and
. In the following section, we want to define these three operations (differentiation, convolution with
and multiplication with
) for a distribution
instead of
.
Proof:
We have to prove two claims: First, that the function
is a distribution, and second that
as defined above has the property

1.
We show that the function
is a distribution.
has a well-defined value in
as
maps to
, which is exactly the preimage of
. The function
is continuous since it is the composition of two continuous functions, and it is linear for the same reason (see exercise 2).
2.
We show that
has the property

For every
, we have

Since two functions are equal if and only if they agree at every point, this shows the desired property.
We also have a similar lemma for tempered distributions:
The proof is word for word the same as the one for lemma 4.20.
Noting that multiplication, differentiation and convolution are linear, we will define these operations for distributions by taking
in the two lemmas above to be the respective operation.
Proof:
The product of two
functions is again
, and further, if
, then also
. Hence,
.
Also, if
in the sense of bump functions, then, if
is a compact set such that
for all
,
.
Hence,
in the sense of bump functions.
Further, also
. Let
be arbitrary. Then
.
Since all the derivatives of
are bounded by polynomials, by the definition of this property we obtain

, where
are polynomials. Hence,
.
Similarly, if
in the sense of Schwartz functions, then by exercise 3.6

and hence
in the sense of Schwartz functions.
If we define
, the other claims follow from lemmas 4.20 and 4.21.
Theorem and definitions 4.23:
Let
be open. We define

, where
such that only finitely many of the
are different from the zero function (such a function is also called a linear partial differential operator), and further we define
.
Let
be a distribution. If we define
,
then the expression on the right hand side is well-defined, for all
we have
,
and
is a distribution.
Assume that all
s and all their derivatives are bounded by polynomials. Let
be a tempered distribution. If we define
,
then the expression on the right hand side is well-defined, for all
we have
,
and
is a tempered distribution.
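Spelled out (with the coefficient functions named $a_\alpha$ here for concreteness), a linear partial differential operator of the kind described above acts on a smooth function $f$ by
\[
L f = \sum_{\alpha \in \mathbb{N}_0^d} a_\alpha\, \partial_\alpha f , \qquad \text{only finitely many } a_\alpha \not\equiv 0 ,
\]
and the theorem extends the action of such an operator from functions to distributions and, under the polynomial-bound assumption on the $a_\alpha$, to tempered distributions.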
Proof:
We want to apply lemmas 4.20 and 4.21. Hence, we prove that the requirements of these lemmas are met.
Since the derivatives of bump functions are again bump functions and the derivatives of Schwartz functions are again Schwartz functions (see exercise 3.3 for both), and because of theorem 4.22, we have that
and
map
to
, and if further all
and all their derivatives are bounded by polynomials, then
and
map
to
.
The sequential continuity of
follows from theorem 4.22.
Further, for all
,
.
Further, if we single out an
, by Fubini's theorem and integration by parts we obtain
.
Hence,

and the lemmas are applicable.
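One ingredient of the computation for a single multiindex is the basic integration-by-parts identity for test functions (a sketch; the formula used in the proof may in addition involve the coefficient functions, and the names $\varphi$, $\vartheta$ are placeholders):
\[
\int_{\mathbb{R}^d} (\partial_\alpha \varphi)(x)\, \vartheta(x)\, \mathrm{d}x
= (-1)^{|\alpha|} \int_{\mathbb{R}^d} \varphi(x)\, (\partial_\alpha \vartheta)(x)\, \mathrm{d}x ,
\]
where each single integration by parts contributes one factor $-1$ and the boundary terms vanish because the test functions decay.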
Proof:
1.
Let
be arbitrary, and let
be a sequence converging to
and let
such that
. Then

is compact. Hence, if
is arbitrary, then
uniformly. But outside
,
. Hence,
uniformly. Further, for all
. Hence,
in the sense of bump functions. Thus, by continuity of
,
.
2.
We proceed by induction on
.
The induction base
is obvious, since
for all functions
by definition.
Let the statement be true for all
such that
. Let
such that
. We choose
such that
(this is possible since otherwise
). Further, we define
.
Then
, and hence
.
Furthermore, for all
,
.
But due to Schwarz' theorem,
in the sense of bump functions, and thus
.
Hence,
, since
is a bump function (see exercise 3.3).
3.
This follows from 1. and 2., since
is a bump function for all
(see exercise 3.3).
- Let
be (tempered) distributions and let
. Prove that
is also a (tempered) distribution.
- Let
be essentially bounded. Prove that
is a tempered distribution.
- Prove that if
is a set of differentiable functions which go from
to
, such that there exists a
such that for all
we have
, and if
is a sequence in
for which the pointwise limit
exists for all
, then
converges to a function uniformly on
(hint:
is sequentially compact; this follows from the Bolzano–Weierstrass theorem).
- Let
such that
is a distribution. Prove that for all
.
- Prove that for
the function
is a tempered distribution (this function is called the Dirac delta distribution after Paul Dirac).
- For each
, find
such that neither
nor
.