Derive the one-, two-, and three-point Gaussian quadrature formulas such that

$$\int_a^b w(x)\,f(x)\,dx \approx \sum_{i=1}^{n} w_i\,f(x_i), \qquad n = 1, 2, 3.$$

Give bounds on the error of these formulas.
|
For this problem, we need the first three orthogonal polynomials
with respect to a weighted inner product

$$\langle f, g \rangle = \int_a^b w(x)\,f(x)\,g(x)\,dx,$$

where $w$ is the weight function given in our case. This can be done using Gram-Schmidt orthogonalization on the basis $\{1, x, x^2, x^3\}$.
Let $p_0, p_1, p_2, p_3$ denote the resulting monic orthogonal polynomials, with $\deg p_k = k$.
The roots of these polynomials will be the interpolation nodes used in Gaussian quadrature.
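As a sanity check, the Gram-Schmidt construction can be carried out numerically. The sketch below assumes the weight $w(x) = 1$ on $[-1, 1]$ (the problem's actual weight may differ), for which the monic orthogonal polynomials are the monic Legendre polynomials:

```python
import numpy as np
from numpy.polynomial import Polynomial as P

# Assumed setting: w(x) = 1 on [-1, 1]; the inner products of monomials are
# then exact: integral of x^m over [-1,1] is 0 for odd m, 2/(m+1) for even m.
def inner(f, g):
    """<f, g> = integral_{-1}^{1} f(x) g(x) dx, computed term by term."""
    c = (f * g).coef
    return sum(c[m] * (2.0 / (m + 1)) for m in range(0, len(c), 2))

basis = [P.basis(k) for k in range(4)]   # 1, x, x^2, x^3
p = []
for m in basis:
    q = m
    for pj in p:                          # subtract projections onto earlier p_j
        q = q - (inner(m, pj) / inner(pj, pj)) * pj
    p.append(q)

# For this weight: p_2 = x^2 - 1/3 and p_3 = x^3 - (3/5) x (monic Legendre).
print([pk.coef for pk in p])
```

The Gram-Schmidt loop is weight-agnostic: only `inner` would change for a different $w$.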
In Gaussian quadrature we have $\int_a^b w(x)\,f(x)\,dx \approx \sum_{i=1}^n w_i\,f(x_i)$, where

$$w_i = \int_a^b w(x)\,\ell_i(x)\,dx, \qquad \ell_i(x) = \prod_{j \ne i} \frac{x - x_j}{x_i - x_j}.$$

For $n = 1$: $\int_a^b w(x)\,f(x)\,dx \approx w_1 f(x_1)$, where $x_1$ is the root of $p_1$.

For $n = 2$: $\int_a^b w(x)\,f(x)\,dx \approx w_1 f(x_1) + w_2 f(x_2)$, where $x_1, x_2$ are the roots of $p_2$.

For $n = 3$: $\int_a^b w(x)\,f(x)\,dx \approx w_1 f(x_1) + w_2 f(x_2) + w_3 f(x_3)$, where $x_1, x_2, x_3$ are the roots of $p_3$.
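For the classical case $w(x) = 1$ on $[-1, 1]$ (assumed here; in general the nodes depend on the problem's weight), NumPy's `leggauss` provides exactly these nodes and weights, and the exactness degree can be verified directly:

```python
import numpy as np

# leggauss(n) returns the n Gauss-Legendre nodes (roots of the degree-n
# Legendre polynomial) and the corresponding weights for w(x)=1 on [-1,1].
for n in (1, 2, 3):
    x, w = np.polynomial.legendre.leggauss(n)
    # The n-point rule integrates x^k exactly for every k <= 2n - 1.
    for k in range(2 * n):
        exact = (1 - (-1) ** (k + 1)) / (k + 1)   # integral of x^k over [-1,1]
        assert abs(w @ x**k - exact) < 1e-12
print("each n-point rule is exact through degree 2n-1")
```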
We know that this quadrature is exact for all polynomials of degree at most $2n - 1$.
We choose a polynomial $h$ of degree at most $2n - 1$ that Hermite interpolates $f$, i.e.

$$h(x_i) = f(x_i), \quad h'(x_i) = f'(x_i), \qquad i = 1, \dots, n.$$

The error for this interpolation is

$$f(x) - h(x) = \frac{f^{(2n)}(\xi(x))}{(2n)!} \prod_{i=1}^{n}(x - x_i)^2.$$
Compute the error of the quadrature as follows (the second equality uses the fact that the quadrature is exact for $h$, so $\sum_i w_i f(x_i) = \sum_i w_i h(x_i) = \int_a^b w\,h$):

$$\begin{aligned}
E_n(f) &= \int_a^b w(x)\,f(x)\,dx - \sum_{i=1}^n w_i\,f(x_i) \\
&= \int_a^b w(x)\bigl(f(x) - h(x)\bigr)\,dx \\
&= \int_a^b w(x)\,\frac{f^{(2n)}(\xi(x))}{(2n)!}\prod_{i=1}^n (x - x_i)^2\,dx \\
&= \frac{f^{(2n)}(\eta)}{(2n)!}\int_a^b w(x)\prod_{i=1}^n (x - x_i)^2\,dx,
\end{aligned}$$

where the last line follows from the mean value theorem for integrals, applicable because the factor $w(x)\prod_{i=1}^n (x - x_i)^2$ does not change sign.
Notice that $\prod_{i=1}^n (x - x_i) = p_n(x)$ is simply the monic polynomial orthogonal with respect to the weight function, since $x_1, \dots, x_n$ are its roots.
Hence, the error for 1-point Gaussian quadrature is

$$E_1(f) = \frac{f''(\eta)}{2!}\int_a^b w(x)\,p_1(x)^2\,dx.$$

The error for 2-point quadrature:

$$E_2(f) = \frac{f^{(4)}(\eta)}{4!}\int_a^b w(x)\,p_2(x)^2\,dx.$$

The error for 3-point quadrature:

$$E_3(f) = \frac{f^{(6)}(\eta)}{6!}\int_a^b w(x)\,p_3(x)^2\,dx.$$

Bounding $|f^{(2n)}(\eta)|$ by $\max_{x \in [a,b]} |f^{(2n)}(x)|$ yields the desired error bounds.
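A quick numerical check of the one-point bound, again assuming $w(x) = 1$ on $[-1, 1]$ and taking $f(x) = e^x$ as a test function:

```python
import numpy as np

# One-point Gauss rule for w(x)=1 on [-1,1]: node x_1 = 0 (root of p_1 = x),
# weight w_1 = 2, so the rule is the midpoint rule 2*f(0).
approx = 2.0 * np.exp(0.0)
exact = np.e - 1.0 / np.e          # integral of e^x over [-1, 1]
err = abs(exact - approx)

# Bound: |E_1| <= max|f''|/2! * integral of p_1(x)^2 = (e/2)*(2/3) = e/3.
bound = np.e / 3.0
assert err <= bound
print(err, bound)
```

The observed error (about 0.35) sits comfortably inside the bound (about 0.91), as expected since the bound uses the worst case of $f''$ on the interval.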
We wish to solve $Ax = b$ iteratively, where $A$ is the given matrix.
Show that for this $A$ the Jacobi method and the Gauss-Seidel method both converge. Explain why for this $A$ one of these methods is better than the other.
|
Decompose $A$ into its diagonal, lower, and upper parts, i.e. $A = D - L - U$, where $D$ is the diagonal of $A$ and $-L$, $-U$ are its strictly lower and strictly upper triangular parts.
Derive the Jacobi iteration by solving for $x$ as follows:

$$Ax = b \iff Dx = (L + U)x + b \implies x^{(k+1)} = D^{-1}(L + U)\,x^{(k)} + D^{-1}b.$$
Using the same decomposition $A = D - L - U$, derive the Gauss-Seidel iteration by solving for $x$ as follows:

$$(D - L)x = Ux + b \implies x^{(k+1)} = (D - L)^{-1}U\,x^{(k)} + (D - L)^{-1}b.$$
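Both sweeps can be sketched in a few lines. The matrix below is a hypothetical diagonally dominant example, not the $A$ from the problem:

```python
import numpy as np

# Illustrative system (NOT the problem's A, which is not reproduced here).
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([1.0, 2.0, 3.0])

def jacobi(A, b, iters=50):
    x = np.zeros_like(b)
    D = np.diag(A)
    R = A - np.diag(D)                 # R = -(L + U)
    for _ in range(iters):
        x = (b - R @ x) / D            # x_{k+1} = D^{-1}((L+U) x_k + b)
    return x

def gauss_seidel(A, b, iters=50):
    x = np.zeros_like(b)
    n = len(b)
    for _ in range(iters):
        for i in range(n):             # uses already-updated entries of x
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] = (b[i] - s) / A[i, i]
    return x

print(jacobi(A, b), gauss_seidel(A, b), np.linalg.solve(A, b))
```

Both runs agree with the direct solve; Gauss-Seidel differs from Jacobi only in that each component update inside a sweep immediately sees the components already updated in that sweep.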
Convergence occurs for the Jacobi iteration if the spectral radius of the iteration matrix $B_J = D^{-1}(L + U)$ is less than 1, i.e. $\rho(B_J) < 1$.
The eigenvalues of the matrix $B_J$ are the roots of its characteristic polynomial; for this $A$ there is a single eigenvalue $\lambda$, i.e. the eigenvalue $\lambda$ has order 3.
Therefore, the spectral radius is $\rho(B_J) = |\lambda| < 1$, and the Jacobi iteration converges.
Convergence occurs for the Gauss-Seidel iteration if the spectral radius of $B_{GS} = (D - L)^{-1}U$ is less than 1, i.e. $\rho(B_{GS}) < 1$.
The eigenvalues of the matrix $B_{GS}$ are $\mu_1, \mu_2, \mu_3$.
Therefore, the spectral radius is $\rho(B_{GS}) = \max_i |\mu_i| < 1$, and the Gauss-Seidel iteration converges.
In general, the Gauss-Seidel iteration converges faster than the Jacobi iteration since Gauss-Seidel uses the new components of $x^{(k+1)}$ as they become available, but in this case $\rho(B_J) < \rho(B_{GS})$, so the Jacobi iteration is faster.
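The spectral radii that govern the two methods can be computed directly from the splitting. The matrix here is again an illustrative stand-in; for this particular tridiagonal example Gauss-Seidel happens to win, with $\rho(B_{GS}) = \rho(B_J)^2$, unlike the matrix of the problem:

```python
import numpy as np

# Sample matrix (not the problem's A): the classic tridiag(-1, 2, -1).
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

D = np.diag(np.diag(A))
L = -np.tril(A, -1)                      # splitting A = D - L - U
U = -np.triu(A, 1)

B_J  = np.linalg.solve(D, L + U)         # Jacobi matrix       D^{-1}(L+U)
B_GS = np.linalg.solve(D - L, U)         # Gauss-Seidel matrix (D-L)^{-1} U

rho = lambda B: np.abs(np.linalg.eigvals(B)).max()
print(rho(B_J), rho(B_GS))               # ~0.7071 and 0.5: both converge
```

Whichever iteration matrix has the smaller spectral radius gives the faster asymptotic error reduction per sweep, which is exactly the comparison made above for the problem's $A$.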
Consider the three-term polynomial recurrence

$$p_n(x) = (x - a_n)\,p_{n-1}(x) - b_{n-1}^2\,p_{n-2}(x), \qquad n \ge 1,$$

initialized by $p_{-1}(x) = 0$ and $p_0(x) = 1$, where each $a_n$ is real and each $b_n$ is nonzero.
|
Prove that each $p_n$ is a monic polynomial of degree $n$, and for every $n \ge 0$, one has

$$\operatorname{span}\{p_0, p_1, \dots, p_n\} = \operatorname{span}\{1, x, \dots, x^n\}.$$
|
We prove the claim by induction.

$p_0 = 1$ is monic with degree zero, and $\operatorname{span}\{p_0\} = \operatorname{span}\{1\}$.

$p_1(x) = (x - a_1)\,p_0(x) = x - a_1$ is monic with degree one, and $\operatorname{span}\{p_0, p_1\} = \operatorname{span}\{1, x\}$.

$p_2(x) = (x - a_2)\,p_1(x) - b_1^2\,p_0(x) = x^2 - (a_1 + a_2)x + (a_1 a_2 - b_1^2)$ is monic with degree 2, and $\operatorname{span}\{p_0, p_1, p_2\} = \operatorname{span}\{1, x, x^2\}$.

Assume $p_{n-1}$ is monic with degree $n - 1$, and $\operatorname{span}\{p_0, \dots, p_{n-1}\} = \operatorname{span}\{1, \dots, x^{n-1}\}$.
Also, assume $p_n$ is monic with degree $n$, and $\operatorname{span}\{p_0, \dots, p_n\} = \operatorname{span}\{1, \dots, x^n\}$.
Then from the hypothesis, $p_{n+1}(x) = (x - a_{n+1})\,p_n(x) - b_n^2\,p_{n-1}(x)$ is monic with degree $n + 1$.
Also, $\operatorname{span}\{p_0, \dots, p_{n+1}\} = \operatorname{span}\{1, \dots, x^{n+1}\}$, since $p_{n+1}$ is just $x^{n+1}$ plus a linear combination of lower-order terms already in $\operatorname{span}\{1, \dots, x^n\}$.
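The induction can be checked numerically for sample coefficients (the $a_n$, $b_n$ below are arbitrary choices satisfying the hypotheses, not values from the problem):

```python
import numpy as np
from numpy.polynomial import Polynomial as P

a = [0.5, -1.0, 2.0, 0.0]       # hypothetical a_1..a_4 (real)
b = [1.0, 0.5, 2.0]             # hypothetical b_1..b_3 (nonzero)

p = [P([1.0])]                  # p_0 = 1
p.append(P([-a[0], 1.0]))       # p_1 = x - a_1
for n in range(2, 5):           # p_n = (x - a_n) p_{n-1} - b_{n-1}^2 p_{n-2}
    p.append(P([-a[n-1], 1.0]) * p[n-1] - b[n-2]**2 * p[n-2])

for n, pn in enumerate(p):
    # monic of degree n: leading coefficient exactly 1
    assert pn.degree() == n and abs(pn.coef[-1] - 1.0) < 1e-12
print("all p_n are monic of degree n")
```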
We next show that each $p_n$ ($n \ge 2$) has $n$ simple real roots, strictly interlaced by the roots of $p_{n-1}$. We prove the claim by induction.

Let's consider the case $n = 2$. We know that

$$p_2(x) = (x - a_1)(x - a_2) - b_1^2,$$

whose discriminant $(a_1 - a_2)^2 + 4b_1^2$ is positive because $b_1 \ne 0$; the quadratic formula shows that $p_2$ has two simple real roots.
Let $y_1$ and $y_2$ be the zeros of $p_2$, and recall that $a_1$ is the only zero of $p_1$. Then, we have (because of sign changes) that

$p_2(a_1) = -b_1^2 < 0$ and $\lim_{x \to -\infty} p_2(x) = +\infty$, so there exists $y_1 \in (-\infty, a_1)$ such that $p_2(y_1) = 0$,

and $p_2(a_1) = -b_1^2 < 0$ and $\lim_{x \to +\infty} p_2(x) = +\infty$, so there exists $y_2 \in (a_1, +\infty)$ such that $p_2(y_2) = 0$.

In conclusion, $y_1 < a_1 < y_2$.
Let $y_1 < y_2 < \dots < y_n$ and $z_1 < z_2 < \dots < z_{n-1}$ be the simple real roots of $p_n$ and $p_{n-1}$, respectively, such that the roots of $p_{n-1}$ are interlaced with the roots of $p_n$, i.e.,

$$y_1 < z_1 < y_2 < z_2 < \dots < z_{n-1} < y_n.$$

Then, we have that

$$p_{n+1}(y_i) = (y_i - a_{n+1})\,p_n(y_i) - b_n^2\,p_{n-1}(y_i) = -b_n^2\,p_{n-1}(y_i).$$

From the induction hypothesis, $p_{n-1}(y_i)$ and $p_{n-1}(y_{i+1})$ have different signs, since $z_i$ is the only (simple) root of $p_{n-1}$ in $(y_i, y_{i+1})$. Hence $p_{n+1}(y_i)$ and $p_{n+1}(y_{i+1})$ have different signs as well, so by the intermediate value theorem there exists $w_i \in (y_i, y_{i+1})$ such that $p_{n+1}(w_i) = 0$.
Proceeding in the same manner for all the intervals $(y_i, y_{i+1})$, we obtain that there exists $w_i \in (y_i, y_{i+1})$ such that $p_{n+1}(w_i) = 0$ for $i = 1, \dots, n - 1$.
We now consider the smallest and largest roots of $p_{n+1}$, i.e. its roots in $(-\infty, y_1)$ and $(y_n, +\infty)$.
Since for every $k$, $p_k$ is a monic polynomial,

$$\lim_{x \to +\infty} p_k(x) = +\infty \quad \text{and} \quad \operatorname{sign} p_k(x) = (-1)^k \ \text{as } x \to -\infty.$$

Hence for any $x > z_{n-1}$ (any $x$ larger than the largest root of $p_{n-1}$), $p_{n-1}(x) > 0$; in particular $p_{n-1}(y_n) > 0$.
Therefore $p_{n+1}(y_n) = -b_n^2\,p_{n-1}(y_n) < 0$, and $\lim_{x \to +\infty} p_{n+1}(x) = +\infty$ implies there exists $w_n \in (y_n, +\infty)$ such that $p_{n+1}(w_n) = 0$.

If $n$ is even, then by similar reasoning $p_{n-1}(y_1) < 0$ (its sign at $y_1$, to the left of all its roots, is $(-1)^{n-1} = -1$), so $p_{n+1}(y_1) = -b_n^2\,p_{n-1}(y_1) > 0$; and $\lim_{x \to -\infty} p_{n+1}(x) = -\infty$, so there exists $w_0 \in (-\infty, y_1)$ such that $p_{n+1}(w_0) = 0$.

If $n$ is odd, $p_{n-1}(y_1) > 0$, so $p_{n+1}(y_1) < 0$; and $\lim_{x \to -\infty} p_{n+1}(x) = +\infty$, so there exists $w_0 \in (-\infty, y_1)$ such that $p_{n+1}(w_0) = 0$.

In all cases $p_{n+1}$ has $n + 1$ simple real roots satisfying $w_0 < y_1 < w_1 < \dots < w_{n-1} < y_n < w_n$, interlaced by the roots of $p_n$, which completes the induction.
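The interlacing conclusion can likewise be verified numerically for sample coefficients (again hypothetical $a_n$, $b_n$, not values from the problem):

```python
import numpy as np
from numpy.polynomial import Polynomial as P

a = [0.5, -1.0, 2.0, 0.0]   # hypothetical a_n (real)
b = [1.0, 0.5, 2.0]         # hypothetical b_n (nonzero)

p = [P([1.0]), P([-a[0], 1.0])]
for n in range(2, 5):       # p_n = (x - a_n) p_{n-1} - b_{n-1}^2 p_{n-2}
    p.append(P([-a[n-1], 1.0]) * p[n-1] - b[n-2]**2 * p[n-2])

# All roots should be real (tiny imaginary parts are numerical noise).
assert np.max(np.abs(p[4].roots().imag)) < 1e-8
r3 = np.sort(p[3].roots().real)
r4 = np.sort(p[4].roots().real)
# Strict interlacing: each root of p_3 lies between consecutive roots of p_4.
assert all(r4[i] < r3[i] < r4[i + 1] for i in range(3))
print("roots of p_3 interlace roots of p_4")
```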
Show that for every $n$ the roots of $p_n$ are the eigenvalues of the symmetric tridiagonal matrix

$$J_n = \begin{pmatrix} a_1 & b_1 & & \\ b_1 & a_2 & \ddots & \\ & \ddots & \ddots & b_{n-1} \\ & & b_{n-1} & a_n \end{pmatrix}.$$
|
Let $q_n(\lambda) = \det(\lambda I - J_n)$.
Then $q_n$ is a monic polynomial in $\lambda$ of degree $n$.
Expanding this determinant along the last row, we have

$$q_n(\lambda) = (\lambda - a_n)\,q_{n-1}(\lambda) - b_{n-1}^2\,q_{n-2}(\lambda), \tag{1}$$

where $q_{n-1}$ and $q_{n-2}$ are monic polynomials of degree $n - 1$ and $n - 2$, respectively.
Notice that $q_0 = 1 = p_0$ and $q_1(\lambda) = \lambda - a_1 = p_1(\lambda)$, and if we let $q_{-1} = 0$, then (1) is equivalent to the three-term recurrence stated in the problem.
Thus, $q_n = p_n$ for every $n$, which shows that the roots of $p_n$ are the eigenvalues of $J_n$.
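A small numerical confirmation with hypothetical $a_n$, $b_n$: the eigenvalues of $J_3$ coincide with the roots of $p_3$ built from the recurrence:

```python
import numpy as np
from numpy.polynomial import Polynomial as P

a = [0.5, -1.0, 2.0]   # hypothetical diagonal entries a_1..a_3
b = [1.0, 0.5]         # hypothetical off-diagonal entries b_1, b_2 (nonzero)

# Symmetric tridiagonal J_3 with diagonal a and off-diagonals b.
J = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)

# Build p_3 from the recurrence p_n = (x - a_n) p_{n-1} - b_{n-1}^2 p_{n-2}.
p = [P([1.0]), P([-a[0], 1.0])]
for n in range(2, 4):
    p.append(P([-a[n - 1], 1.0]) * p[n - 1] - b[n - 2]**2 * p[n - 2])

eig = np.sort(np.linalg.eigvalsh(J))       # eigenvalues of the symmetric J_3
roots = np.sort(p[3].roots().real)         # roots of p_3
assert np.allclose(eig, roots)
print(eig, roots)
```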