Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/August 2005

Problem 1

Derive the one-, two-, and three-point Gaussian quadrature formulas such that

\int_a^b f(x)\, w(x)\, dx \approx \sum_{i=1}^{n} a_i f(x_i), \qquad n = 1, 2, 3,

is exact for polynomials of degree as high as possible. Give bounds on the error of these formulas.

Solution 1

Find Orthogonal Polynomials w.r.t. Weighting Function

For this problem, we need the orthogonal polynomials of degree zero through three with respect to a weighted inner product

\langle f, g \rangle = \int_a^b f(x)\, g(x)\, w(x)\, dx,

where w(x) is the weight function of the quadrature rule. This can be done using Gram-Schmidt Orthogonalization on the monomial basis \{1, x, x^2, x^3\}.

Zero Order Polynomial

Let

p_0(x) = 1.

First Order Polynomial

Applying Gram-Schmidt to x gives

p_1(x) = x - \frac{\langle x,\, p_0 \rangle}{\langle p_0,\, p_0 \rangle}\, p_0(x).

Second Order Polynomial

Similarly,

p_2(x) = x^2 - \frac{\langle x^2,\, p_1 \rangle}{\langle p_1,\, p_1 \rangle}\, p_1(x) - \frac{\langle x^2,\, p_0 \rangle}{\langle p_0,\, p_0 \rangle}\, p_0(x).

Third Order Polynomial

Finally,

p_3(x) = x^3 - \frac{\langle x^3,\, p_2 \rangle}{\langle p_2,\, p_2 \rangle}\, p_2(x) - \frac{\langle x^3,\, p_1 \rangle}{\langle p_1,\, p_1 \rangle}\, p_1(x) - \frac{\langle x^3,\, p_0 \rangle}{\langle p_0,\, p_0 \rangle}\, p_0(x).
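
As a concrete illustration, the sketch below carries out this Gram-Schmidt process symbolically. The weight w(x) = 1 on [-1, 1] is an assumption made purely for illustration (the exam's actual weight and interval are not reproduced above); under it the process yields the monic Legendre polynomials.

```python
import sympy as sp

x = sp.symbols('x')
a, b = -1, 1  # assumed interval, for illustration only

def inner(f, g):
    # weighted inner product <f, g>; here the assumed weight is w(x) = 1
    return sp.integrate(f * g, (x, a, b))

# Gram-Schmidt on the monomial basis {1, x, x^2, x^3}
ps = []
for k in range(4):
    p = x**k
    for q in ps:
        # subtract the projection of x^k onto each previous polynomial
        p -= inner(x**k, q) / inner(q, q) * q
    ps.append(sp.expand(p))

print(ps)  # [1, x, x**2 - 1/3, x**3 - 3*x/5]
```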

Find Roots of Polynomials

The roots of these polynomials are the interpolation nodes used in Gaussian quadrature: the n-point rule uses the n roots of p_n(x).

Formula to Compute Coefficients

In Gaussian quadrature we have

\int_a^b f(x)\, w(x)\, dx \approx \sum_{i=1}^{n} a_i f(x_i),

where

a_i = \int_a^b \ell_i(x)\, w(x)\, dx, \qquad \ell_i(x) = \prod_{j \ne i} \frac{x - x_j}{x_i - x_j}.
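
A sketch of this weight computation, under the same illustrative assumption w(x) = 1 on [-1, 1] as above:

```python
import sympy as sp

x = sp.symbols('x')
a, b = -1, 1  # assumed interval, for illustration only

def quad_weights(nodes):
    # a_i is the integral of the i-th Lagrange basis polynomial times w (here w = 1)
    weights = []
    for i, xi in enumerate(nodes):
        ell = sp.Integer(1)
        for j, xj in enumerate(nodes):
            if j != i:
                ell *= (x - xj) / (xi - xj)
        weights.append(sp.integrate(ell, (x, a, b)))
    return weights

# roots of the monic Legendre polynomial p_2(x) = x^2 - 1/3
nodes = [-sp.sqrt(sp.Rational(1, 3)), sp.sqrt(sp.Rational(1, 3))]
print(quad_weights(nodes))  # [1, 1], the classical 2-point Gauss-Legendre weights
```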

1 point formula

\int_a^b f(x)\, w(x)\, dx \approx a_1 f(x_1),

where x_1 is the root of p_1(x).

2 point formula

\int_a^b f(x)\, w(x)\, dx \approx a_1 f(x_1) + a_2 f(x_2),

where x_1, x_2 are the roots of p_2(x).

3 point formula

\int_a^b f(x)\, w(x)\, dx \approx a_1 f(x_1) + a_2 f(x_2) + a_3 f(x_3),

where x_1, x_2, x_3 are the roots of p_3(x).
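
For the illustrative Legendre case (w(x) = 1 on [-1, 1]), NumPy produces these nodes and weights directly, which gives a convenient cross-check of the formulas above:

```python
import numpy as np

# Gauss-Legendre nodes and weights for the 1-, 2-, and 3-point rules
for n in (1, 2, 3):
    nodes, weights = np.polynomial.legendre.leggauss(n)
    print(n, nodes, weights)

# the n-point rule is exact for polynomials of degree <= 2n - 1
f = lambda t: t**5 - t**2        # degree 5, integrated exactly by the 3-point rule
nodes, weights = np.polynomial.legendre.leggauss(3)
print(np.dot(weights, f(nodes)))  # -2/3, the exact value of the integral over [-1, 1]
```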

Derive Error Bound

We know that the n-point Gaussian quadrature is exact for all polynomials of degree at most 2n - 1.


We choose a polynomial H(x) of degree at most 2n - 1 that Hermite interpolates f, i.e.

H(x_i) = f(x_i), \qquad H'(x_i) = f'(x_i), \qquad i = 1, \ldots, n.

The error for this interpolation is

f(x) - H(x) = \frac{f^{(2n)}(\xi(x))}{(2n)!} \prod_{i=1}^{n} (x - x_i)^2.

Compute the error of the quadrature as follows:

E = \int_a^b f(x)\, w(x)\, dx - \sum_{i=1}^{n} a_i f(x_i)
  = \int_a^b f(x)\, w(x)\, dx - \sum_{i=1}^{n} a_i H(x_i)
  = \int_a^b f(x)\, w(x)\, dx - \int_a^b H(x)\, w(x)\, dx
  = \int_a^b \left[ f(x) - H(x) \right] w(x)\, dx
  = \int_a^b \frac{f^{(2n)}(\xi(x))}{(2n)!} \prod_{i=1}^{n} (x - x_i)^2\, w(x)\, dx
  = \frac{f^{(2n)}(\eta)}{(2n)!} \int_a^b \prod_{i=1}^{n} (x - x_i)^2\, w(x)\, dx,

where the second equality uses H(x_i) = f(x_i), the third uses the exactness of the quadrature for the degree-(2n-1) polynomial H, and the last line follows from the mean value theorem for integrals, since \prod_{i=1}^{n} (x - x_i)^2\, w(x) does not change sign.


Notice that \prod_{i=1}^{n} (x - x_i) is simply the monic polynomial p_n(x) orthogonal with respect to the weight function w, since x_1, \ldots, x_n are the roots of that polynomial.


Hence, the error for 1-point Gaussian quadrature is

E_1 = \frac{f''(\eta)}{2!} \int_a^b p_1(x)^2\, w(x)\, dx.

The error for 2-point quadrature:

E_2 = \frac{f^{(4)}(\eta)}{4!} \int_a^b p_2(x)^2\, w(x)\, dx.

The error for 3-point quadrature:

E_3 = \frac{f^{(6)}(\eta)}{6!} \int_a^b p_3(x)^2\, w(x)\, dx.
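
For example, under the same illustrative Legendre assumption (w(x) = 1 on [-1, 1]), the 2-point bound can be evaluated explicitly:

\int_{-1}^{1} p_2(x)^2\, dx = \int_{-1}^{1} \left( x^2 - \tfrac{1}{3} \right)^2 dx = \frac{8}{45},
\qquad
|E_2| \le \frac{\max_{[-1,1]} |f^{(4)}|}{4!} \cdot \frac{8}{45} = \frac{\max_{[-1,1]} |f^{(4)}|}{135}.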


Problem 2

We wish to solve Ax = b iteratively, where A is the given 3 × 3 matrix.

Show that for this A the Jacobi method and the Gauss-Seidel method both converge. Explain why for this A one of these methods is better than the other.

Solution 2

Jacobi Method

Decompose A into its diagonal, lower triangular, and upper triangular parts, i.e.

A = D + L + U.

Derive the Jacobi iteration by solving for x as follows:

Ax = b
(D + L + U)x = b
Dx = b - (L + U)x
x^{(k+1)} = D^{-1}\left( b - (L + U)x^{(k)} \right).
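
A minimal NumPy sketch of this iteration (the matrix and right-hand side below are placeholders, since the exam's A and b are not reproduced above):

```python
import numpy as np

def jacobi(A, b, x0, iters=50):
    # Jacobi iteration: x_{k+1} = D^{-1} (b - (L + U) x_k)
    D = np.diag(np.diag(A))
    LU = A - D                       # strictly lower plus strictly upper part
    x = x0.astype(float)
    for _ in range(iters):
        x = np.linalg.solve(D, b - LU @ x)
    return x

# placeholder diagonally dominant system, for demonstration only
A = np.array([[4.0, 1.0, 1.0], [1.0, 4.0, 1.0], [1.0, 1.0, 4.0]])
b = np.array([6.0, 6.0, 6.0])
print(jacobi(A, b, np.zeros(3)))     # converges to the solution [1, 1, 1]
```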


Gauss-Seidel Method

Decompose A into its diagonal, lower triangular, and upper triangular parts as before, i.e.

A = D + L + U.

Derive the Gauss-Seidel iteration by solving for x as follows:

Ax = b
(D + L)x = b - Ux
x^{(k+1)} = (D + L)^{-1}\left( b - Ux^{(k)} \right).
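
The corresponding Gauss-Seidel sketch, with the same placeholder system as above:

```python
import numpy as np

def gauss_seidel(A, b, x0, iters=50):
    # Gauss-Seidel iteration: x_{k+1} = (D + L)^{-1} (b - U x_k)
    DL = np.tril(A)                  # diagonal plus strictly lower part
    U = A - DL                       # strictly upper part
    x = x0.astype(float)
    for _ in range(iters):
        x = np.linalg.solve(DL, b - U @ x)
    return x

A = np.array([[4.0, 1.0, 1.0], [1.0, 4.0, 1.0], [1.0, 1.0, 4.0]])
b = np.array([6.0, 6.0, 6.0])
print(gauss_seidel(A, b, np.zeros(3)))  # also converges to [1, 1, 1]
```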


Jacobi Convergence

Convergence occurs for the Jacobi iteration if the spectral radius of the iteration matrix -D^{-1}(L + U) is less than 1, i.e.

\rho\left( D^{-1}(L + U) \right) < 1.

The eigenvalues of the matrix

-D^{-1}(L + U)

consist of a single eigenvalue of algebraic multiplicity 3.

Therefore, the spectral radius is the modulus of that eigenvalue, which is less than 1, so the Jacobi iteration converges.

Gauss-Seidel Convergence

Convergence occurs for the Gauss-Seidel iteration if the spectral radius of the iteration matrix -(D + L)^{-1}U is less than 1, i.e.

\rho\left( (D + L)^{-1}U \right) < 1.

The eigenvalues of the matrix

-(D + L)^{-1}U

all have modulus less than 1.

Therefore, the spectral radius is less than 1, and the Gauss-Seidel iteration converges as well.

Comparison of Methods

In general, the Gauss-Seidel iteration converges faster than the Jacobi iteration, since Gauss-Seidel uses the new components of x^{(k+1)} as soon as they become available. In this case, however, the spectral radius of the Jacobi iteration matrix is smaller than that of the Gauss-Seidel iteration matrix, so the Jacobi iteration is faster.
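
A sketch of how one would check this numerically. The matrix below is a placeholder (for it, Gauss-Seidel happens to have the smaller radius, unlike for the exam's matrix):

```python
import numpy as np

def spectral_radii(A):
    # spectral radii of the Jacobi and Gauss-Seidel iteration matrices
    D = np.diag(np.diag(A))
    DL = np.tril(A)
    R_jacobi = np.linalg.solve(D, D - A)   # -D^{-1} (L + U)
    R_gs = np.linalg.solve(DL, DL - A)     # -(D + L)^{-1} U
    rho = lambda M: max(abs(np.linalg.eigvals(M)))
    return rho(R_jacobi), rho(R_gs)

A = np.array([[4.0, 1.0, 1.0], [1.0, 4.0, 1.0], [1.0, 1.0, 4.0]])
print(spectral_radii(A))  # the smaller radius gives the faster asymptotic convergence
```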

Problem 3

Consider the three-term polynomial recurrence

p_n(x) = (x - a_n)\, p_{n-1}(x) - b_{n-1}^2\, p_{n-2}(x), \qquad n = 2, 3, \ldots,

initialized by p_0(x) = 1 and p_1(x) = x - a_1, where each a_n is real and each b_n is nonzero.

Problem 3a

Prove that each p_n is a monic polynomial of degree n, and that for every n \ge 0 one has

span{p_0, p_1, \ldots, p_n} = span{1, x, \ldots, x^n}.


Solution 3a

We prove the claim by induction.

Base Case

p_0(x) = 1 is monic with degree zero, and span{p_0} = span{1}.

p_1(x) = x - a_1 is monic with degree one, and span{p_0, p_1} = span{1, x}.

p_2(x) = (x - a_2)p_1(x) - b_1^2 p_0(x) is monic with degree two, and span{p_0, p_1, p_2} = span{1, x, x^2}.

Induction Step

Assume p_{n-1} is monic with degree n - 1, and span{p_0, \ldots, p_{n-1}} = span{1, x, \ldots, x^{n-1}}.

Also, assume p_{n-2} is monic with degree n - 2, and span{p_0, \ldots, p_{n-2}} = span{1, x, \ldots, x^{n-2}}.

Then from the hypothesis, p_n(x) = (x - a_n)p_{n-1}(x) - b_{n-1}^2 p_{n-2}(x) is monic with degree n, since its leading term is the leading term of x \cdot p_{n-1}(x), namely x^n.

Also,

span{p_0, p_1, \ldots, p_n} = span{1, x, \ldots, x^n},

since p_n is just x^n plus a linear combination of lower-order terms already in span{1, x, \ldots, x^{n-1}}.
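
A quick symbolic check of these claims, with placeholder coefficients (a verification sketch, not part of the proof):

```python
import sympy as sp

x = sp.symbols('x')
a = [None, 2, -1, 3, 0]   # placeholder real values for a_1, ..., a_4
b = [None, 1, 2, 1]       # placeholder nonzero values for b_1, ..., b_3

# build p_0, ..., p_4 from the three-term recurrence
p = [sp.Integer(1), x - a[1]]
for n in range(2, 5):
    p.append(sp.expand((x - a[n]) * p[n - 1] - b[n - 1]**2 * p[n - 2]))

for n, pn in enumerate(p):
    poly = sp.Poly(pn, x)
    assert poly.degree() == n and poly.LC() == 1   # monic of degree n
print(p)
```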

Problem 3b

Show that for every n \ge 2 the polynomial p_n has n simple real roots that interlace with the roots of p_{n-1}.

Solution 3b

We prove the claim by induction.

Base Case

Let's consider the case n = 2. We know that

p_2(x) = (x - a_2)p_1(x) - b_1^2 p_0(x) = (x - a_2)(x - a_1) - b_1^2.

The quadratic formula shows that p_2 has two simple real roots, since its discriminant (a_1 - a_2)^2 + 4b_1^2 is strictly positive (recall b_1 \ne 0).

Let y_1 and y_2 be the zeros of p_2, and recall that a_1 is the zero of p_1. Then, we have (because of sign changes) that

p_2(a_1) = -b_1^2 < 0 \qquad \text{and} \qquad \lim_{x \to \pm\infty} p_2(x) = +\infty,

and there exists y_1 \in (-\infty, a_1) such that p_2(y_1) = 0,

and there exists y_2 \in (a_1, \infty) such that p_2(y_2) = 0.

In conclusion, y_1 < a_1 < y_2, so the root of p_1 interlaces the roots of p_2.

Induction Step

Let \{x_i\}_{i=1}^{n-1} and \{y_i\}_{i=1}^{n-2} be the simple real roots of p_{n-1} and p_{n-2}, respectively, such that the roots of p_{n-2} are interlaced with the roots of p_{n-1}, i.e.,

x_1 < y_1 < x_2 < y_2 < \cdots < y_{n-2} < x_{n-1}.

Then, we have that

p_n(x_i) = (x_i - a_n)p_{n-1}(x_i) - b_{n-1}^2 p_{n-2}(x_i) = -b_{n-1}^2 p_{n-2}(x_i).

From the induction hypothesis, p_{n-2}(x_i) and p_{n-2}(x_{i+1}) have different signs, since exactly one root y_i of p_{n-2} lies between x_i and x_{i+1}; hence p_n(x_i) and p_n(x_{i+1}) have different signs as well. Then, there exists z_i \in (x_i, x_{i+1}) such that p_n(z_i) = 0.

Proceeding in the same manner for all the intervals (x_i, x_{i+1}), we obtain that there exists z_i such that p_n(z_i) = 0 for i = 1, \ldots, n - 2.


We now consider the smallest and largest roots of p_{n-1}, i.e. x_1 and x_{n-1}.

Since, for k = n-2, n-1, n, each p_k is a monic polynomial,

\lim_{x \to \infty} p_k(x) = +\infty.

Hence p_{n-2}(x) > 0 for any x > y_{n-2} (any x larger than the largest root of p_{n-2}); in particular, p_{n-2}(x_{n-1}) > 0.

Therefore

p_n(x_{n-1}) = -b_{n-1}^2 p_{n-2}(x_{n-1}) < 0,

and

\lim_{x \to \infty} p_n(x) = +\infty

implies there exists z_{n-1} \in (x_{n-1}, \infty) such that p_n(z_{n-1}) = 0.

If n is even, then by similar reasoning

p_n(x_1) = -b_{n-1}^2 p_{n-2}(x_1) < 0 \qquad \text{and} \qquad \lim_{x \to -\infty} p_n(x) = +\infty

(here p_{n-2}(x_1) > 0 because x_1 lies to the left of every root of the even-degree monic polynomial p_{n-2}), and there exists z_0 \in (-\infty, x_1) such that p_n(z_0) = 0.

If n is odd,

p_n(x_1) = -b_{n-1}^2 p_{n-2}(x_1) > 0 \qquad \text{and} \qquad \lim_{x \to -\infty} p_n(x) = -\infty,

and there exists z_0 \in (-\infty, x_1) such that p_n(z_0) = 0.

In both cases p_n has the n distinct real roots z_0 < x_1 < z_1 < x_2 < \cdots < z_{n-2} < x_{n-1} < z_{n-1}; since \deg p_n = n, these roots are simple, and they interlace with the roots of p_{n-1}.
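
A numerical sanity check of the interlacing property, again with placeholder coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=8)              # placeholder real a_1, a_2, ... (0-indexed)
b = rng.uniform(0.5, 2.0, size=8)   # placeholder nonzero b_1, b_2, ... (0-indexed)

# coefficient arrays, highest degree first; start from p_0 = 1, p_1 = x - a_1
p_prev, p_cur = np.array([1.0]), np.array([1.0, -a[0]])
for n in range(1, 7):
    # p_{n+1} = (x - a_{n+1}) p_n - b_n^2 p_{n-1}
    p_next = np.polymul([1.0, -a[n]], p_cur) - b[n - 1]**2 * np.pad(p_prev, (2, 0))
    r_cur = np.sort(np.roots(p_cur).real)
    r_next = np.sort(np.roots(p_next).real)
    # each root of p_n lies strictly between consecutive roots of p_{n+1}
    assert np.all(r_next[:-1] < r_cur) and np.all(r_cur < r_next[1:])
    p_prev, p_cur = p_cur, p_next
print("interlacing verified up to degree", len(p_cur) - 1)
```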

Problem 3c

Show that for every n \ge 1 the roots of p_n are the eigenvalues of the symmetric tridiagonal matrix

T_n = \begin{pmatrix} a_1 & b_1 & & \\ b_1 & a_2 & \ddots & \\ & \ddots & \ddots & b_{n-1} \\ & & b_{n-1} & a_n \end{pmatrix}.

Solution 3c

Let

q_n(\lambda) = \det(\lambda I - T_n).

Then q_n is a monic polynomial in \lambda of degree n.

Expanding this determinant along the last row, we have

q_n(\lambda) = (\lambda - a_n)\, q_{n-1}(\lambda) - b_{n-1}^2\, q_{n-2}(\lambda), \qquad (1)

where q_{n-1} and q_{n-2} are monic polynomials of degree n - 1 and n - 2, respectively.

Notice that q_1(\lambda) = \lambda - a_1 = p_1(\lambda), and if we let q_0(\lambda) = 1 = p_0(\lambda), then (1) is equivalent to the three-term recurrence stated in the problem.

Thus, q_n = p_n for every n, which shows that the roots of p_n are the eigenvalues of T_n.
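
A numerical illustration of this equivalence, with placeholder coefficients:

```python
import numpy as np

a = np.array([2.0, -1.0, 3.0, 0.5])   # placeholder diagonal entries a_1, ..., a_4
b = np.array([1.0, 2.0, 1.5])         # placeholder off-diagonal entries b_1, ..., b_3

# the symmetric tridiagonal matrix T_4
T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)

# p_4 from the three-term recurrence, stored highest degree first
p_prev, p_cur = np.array([1.0]), np.array([1.0, -a[0]])
for n in range(1, 4):
    p_next = np.polymul([1.0, -a[n]], p_cur) - b[n - 1]**2 * np.pad(p_prev, (2, 0))
    p_prev, p_cur = p_cur, p_next

print(np.sort(np.linalg.eigvalsh(T)))  # eigenvalues of T_4
print(np.sort(np.roots(p_cur).real))   # roots of p_4: the same four numbers
```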