Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/January 2000

From Wikibooks, open books for an open world

Problem 1


Let $f$ be a function with 3 continuous derivatives. Let $p$ be a quadratic polynomial that interpolates $f$ at the points $x_0 < x_1 < x_2$. Let $h = x_2 - x_0$ and

$e(x) = f(x) - p(x)$.

Problem 1a


Show that

$\max_{x_0 \le x \le x_2} |e''(x)| \le C \max_{x_0 \le x \le x_2} |f'''(x)|$,

where $C$ depends only on $h$, and determine $C$. (Hint: the key to this is to prove that $e''$ vanishes at some point in $(x_0, x_2)$. The result can then be obtained by integration.)

Solution 1a


Proof of Hint

Claim: There exists $\xi \in (x_0, x_2)$ such that

$e''(\xi) = 0.$

Proof: The interpolation polynomial may be expressed using divided-difference coefficients, i.e.

$p(x) = f[x_0] + f[x_0, x_1](x - x_0) + f[x_0, x_1, x_2](x - x_0)(x - x_1),$

which implies

$p''(x) = 2 f[x_0, x_1, x_2].$

In general, a divided-difference coefficient may be expressed as a factorial-weighted value of a derivative of $f$ at an intermediate point, i.e. there exists $\xi \in (x_0, x_2)$ such that

$f[x_0, x_1, x_2] = \frac{f''(\xi)}{2!}.$

Hence,

$p''(\xi) = 2 \cdot \frac{f''(\xi)}{2} = f''(\xi),$

which implies

$e''(\xi) = f''(\xi) - p''(\xi) = 0. \qquad \blacksquare$
Application of Hint

From the hint we know that there exists $\xi \in (x_0, x_2)$ such that

$e''(\xi) = 0,$

which implies

$f''(\xi) = p''(\xi).$

Since $p$ is quadratic, $p''$ is constant, i.e. for all $x$,

$p''(x) = 2 f[x_0, x_1, x_2] = f''(\xi).$

Therefore,

$e''(x) = f''(x) - p''(x) = f''(x) - f''(\xi).$

By the fundamental theorem of calculus,

$e''(x) = \int_{\xi}^{x} f'''(t)\, dt.$

Therefore, for all $x \in [x_0, x_2]$,

$|e''(x)| \le |x - \xi| \max_{x_0 \le t \le x_2} |f'''(t)| \le h \max_{x_0 \le t \le x_2} |f'''(t)|,$

so the bound holds with $C = h$.
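The bound in part (a) can be checked numerically. The sketch below is not part of the exam solution; the choice $f = \sin$ and the nodes are arbitrary illustrative assumptions.

```python
import math

# Numerical check of part (a): for the quadratic interpolant p of f at
# x0 < x1 < x2, the solution gives |e''(x)| <= h * max|f'''| with
# h = x2 - x0.  f = sin and the nodes are illustrative choices.

def second_divided_difference(f, x0, x1, x2):
    """f[x0, x1, x2] from the recursive divided-difference definition."""
    d01 = (f(x1) - f(x0)) / (x1 - x0)
    d12 = (f(x2) - f(x1)) / (x2 - x1)
    return (d12 - d01) / (x2 - x0)

f = math.sin
x0, x1, x2 = 0.3, 0.5, 0.8
h = x2 - x0

p_dd = 2.0 * second_divided_difference(f, x0, x1, x2)  # p'' (a constant)

xs = [x0 + k * h / 1000 for k in range(1001)]
max_f3 = max(abs(-math.cos(x)) for x in xs)            # f''' = -cos
worst = max(abs(-math.sin(x) - p_dd) for x in xs)      # max |f'' - p''| = max |e''|
print(worst <= h * max_f3)  # True
```

Note that the sampled maximum of $|f'''|$ slightly underestimates the true maximum, so the check is conservative in the right direction.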
Problem 1b

Now suppose $x_1 = \frac{x_0 + x_2}{2}$ and $f$ has 4 continuous derivatives. In this case show

$|e''(x_1)| \le C h^2,$

where $h = x_1 - x_0 = x_2 - x_1$. What is $C$ in terms of the derivatives of $f$?


Solution 1b

Cancellation of the Third-Derivative Terms

We know that $p''$ is the constant $2 f[x_0, x_1, x_2]$, because $p$ is quadratic; with the equally spaced nodes this reads

$p''(x_1) = \frac{f(x_0) - 2 f(x_1) + f(x_2)}{h^2}.$

Now, because $x_1$ is the midpoint of $x_0$ and $x_2$, we can conclude that the odd-order terms (in particular the third-derivative terms) of the Taylor expansions of $f(x_0)$ and $f(x_2)$ about $x_1$ cancel when the two expansions are added.

Application of Fundamental Theorem of Calculus (Twice)

Applying the fundamental theorem of calculus repeatedly (Taylor's theorem with remainder) to expand about $x_1$,

$f(x_1 \pm h) = f(x_1) \pm h f'(x_1) + \frac{h^2}{2} f''(x_1) \pm \frac{h^3}{6} f'''(x_1) + \frac{h^4}{24} f^{(4)}(\eta_\pm),$

for some $\eta_- \in (x_0, x_1)$ and $\eta_+ \in (x_1, x_2)$. Adding the two expansions, the odd-order terms cancel:

$f(x_0) - 2 f(x_1) + f(x_2) = h^2 f''(x_1) + \frac{h^4}{24} \big( f^{(4)}(\eta_-) + f^{(4)}(\eta_+) \big).$

Therefore

$e''(x_1) = f''(x_1) - p''(x_1) = -\frac{h^2}{24} \big( f^{(4)}(\eta_-) + f^{(4)}(\eta_+) \big),$

and hence

$|e''(x_1)| \le C h^2 \quad \text{with} \quad C = \frac{1}{12} \max_{x_0 \le t \le x_2} |f^{(4)}(t)|.$
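The midpoint expansion can be checked numerically. The sketch below is not part of the exam solution; $f = \exp$ (every derivative of $\exp$ is $\exp$) and the step size are illustrative assumptions.

```python
import math

# Numerical check of part (b):
# f''(x1) - (f(x0) - 2 f(x1) + f(x2)) / h^2  ≈  -(h^2 / 12) f''''(x1).
# f = exp and the values of x1, h are illustrative choices.
f = math.exp
x1, h = 0.5, 0.01
x0, x2 = x1 - h, x1 + h

p_dd = (f(x0) - 2.0 * f(x1) + f(x2)) / h**2  # p''(x1) = 2 f[x0,x1,x2]
e_dd = f(x1) - p_dd                          # e''(x1), since f'' = exp
predicted = -(h**2 / 12.0) * f(x1)           # leading term -(h^2/12) f''''(x1)
print(abs(e_dd - predicted) < 1e-9)          # True
```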

Problem 2a


Find $\{p_0, p_1, p_2\}$ such that $p_k$ is a polynomial of degree $k$ and this set is orthogonal on $[a, b]$ with respect to the weight function $w(x)$. (Note: the moments $\mu_k = \int_a^b x^k\, w(x)\, dx$, $k = 0, \ldots, 3$, may be assumed known.)

Solution 2a


Apply Gram–Schmidt

To find orthogonal $\{p_0, p_1, p_2\}$, use the Gram–Schmidt method with respect to the weighted inner product

$\langle f, g \rangle = \int_a^b f(x)\, g(x)\, w(x)\, dx.$

Let $\{1, x, x^2\}$ be a basis of $\mathcal{P}_2$, the space of polynomials of degree at most 2.

Calculate p_0

Choose $p_0(x) = 1$.

Calculate p_1

From Gram–Schmidt, we have

$p_1(x) = x - \frac{\langle x, p_0 \rangle}{\langle p_0, p_0 \rangle}\, p_0(x)$, where

$\langle x, p_0 \rangle = \int_a^b x\, w(x)\, dx = \mu_1, \qquad \langle p_0, p_0 \rangle = \int_a^b w(x)\, dx = \mu_0.$

Therefore

$p_1(x) = x - \frac{\mu_1}{\mu_0}.$

Calculate p_2

Proceeding with Gram–Schmidt, we have

$p_2(x) = x^2 - \frac{\langle x^2, p_1 \rangle}{\langle p_1, p_1 \rangle}\, p_1(x) - \frac{\langle x^2, p_0 \rangle}{\langle p_0, p_0 \rangle}\, p_0(x),$

where each inner product is evaluated from the known moments:

$\langle x^2, p_0 \rangle = \mu_2, \qquad \langle x^2, p_1 \rangle = \mu_3 - \frac{\mu_1}{\mu_0}\mu_2, \qquad \langle p_1, p_1 \rangle = \mu_2 - \frac{\mu_1^2}{\mu_0}.$

Therefore

$p_2(x) = x^2 - \frac{\mu_3 - \mu_1 \mu_2 / \mu_0}{\mu_2 - \mu_1^2 / \mu_0}\, p_1(x) - \frac{\mu_2}{\mu_0}.$
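The specific weight function of the original problem is not preserved in this transcription, so as an illustration the sketch below runs the same moment-based Gram–Schmidt steps for the assumed weight $w(x) = 1$ on $[0, 1]$, whose moments are $\mu_k = 1/(k+1)$; the result is the shifted Legendre polynomials.

```python
from fractions import Fraction as F

# Gram-Schmidt via moments, as in the solution, for the assumed weight
# w(x) = 1 on [0, 1] (the problem's actual weight is not reproduced here).
mu = [F(1, k + 1) for k in range(4)]   # mu_k = ∫_0^1 x^k w(x) dx

a1 = mu[1] / mu[0]                     # p1(x) = x - mu1/mu0 = x - 1/2

# inner products for p2, expressed through the moments as in the solution
ip_x2_p0 = mu[2]
ip_x2_p1 = mu[3] - a1 * mu[2]
ip_p1_p1 = mu[2] - mu[1] ** 2 / mu[0]

b2 = ip_x2_p1 / ip_p1_p1               # coefficient of p1 in p2
c2 = ip_x2_p0 / mu[0]                  # coefficient of p0 in p2

# p2(x) = x^2 - b2*(x - a1) - c2, in monomial coefficients [c0, c1, c2]
p2 = [b2 * a1 - c2, -b2, F(1)]

def integrate_poly(coeffs):
    """∫_0^1 (sum_k c_k x^k) dx, exact since w(x) = 1 here."""
    return sum(c * F(1, k + 1) for k, c in enumerate(coeffs))

print(p2)                              # x^2 - x + 1/6
print(integrate_poly(p2))              # <p2, p0> = 0
print(integrate_poly([F(0)] + p2))     # <p2, x>  = 0
```

Exact rational arithmetic (`fractions.Fraction`) makes the orthogonality checks exact rather than approximate.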

Problem 2b

Derive the 2-point Gaussian formula

$\int_a^b f(x)\, w(x)\, dx \approx w_1 f(x_1) + w_2 f(x_2),$

i.e. find the weights $w_1, w_2$ and nodes $x_1, x_2$.

Solution 2b

Find the Nodes

The nodes $x_1$ and $x_2$ are the roots of the $2$nd orthogonal polynomial $p_2$, i.e.

$p_2(x_1) = p_2(x_2) = 0.$

Writing $p_2(x) = x^2 + \beta x + \gamma$, applying the quadratic formula yields the roots:

$x_{1,2} = \frac{-\beta \pm \sqrt{\beta^2 - 4\gamma}}{2}.$
Find the Weights

The approximation is exact for polynomials of degree at most $2 \cdot 2 - 1 = 3$. In particular, applying the formula to $f(x) = 1$ and $f(x) = x$ gives the following system of equations:

$w_1 + w_2 = \int_a^b w(x)\, dx = \mu_0,$

$w_1 x_1 + w_2 x_2 = \int_a^b x\, w(x)\, dx = \mu_1.$

Solving the system of equations by substitution yields the weights:

$w_2 = \frac{\mu_1 - \mu_0 x_1}{x_2 - x_1}, \qquad w_1 = \mu_0 - w_2.$
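The derivation can be carried out end to end for the same illustrative assumption $w(x) = 1$ on $[0, 1]$ (for which $p_2(x) = x^2 - x + 1/6$, $\mu_0 = 1$, $\mu_1 = 1/2$); the problem's actual weight is not preserved in this transcription.

```python
import math

# 2-point Gaussian rule derived as in the solution, under the assumed
# weight w(x) = 1 on [0, 1]: nodes from the quadratic formula, weights
# from the 2x2 exactness system.
beta, gamma = -1.0, 1.0 / 6.0           # p2(x) = x^2 + beta*x + gamma
disc = math.sqrt(beta * beta - 4.0 * gamma)
x1 = (-beta - disc) / 2.0               # smaller root of p2
x2 = (-beta + disc) / 2.0               # larger root of p2

mu0, mu1 = 1.0, 0.5                     # moments of w
w2 = (mu1 - mu0 * x1) / (x2 - x1)
w1 = mu0 - w2

def gauss2(f):
    return w1 * f(x1) + w2 * f(x2)

# exact for degree <= 3: ∫_0^1 x^3 dx = 1/4
print(abs(gauss2(lambda x: x ** 3) - 0.25) < 1e-12)  # True
```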

Problem 3


Let $A$ be an $n \times n$ nonsingular matrix, and consider the linear system

$Ax = b.$
Problem 3a


Write down the Jacobi iteration for solving $Ax = b$ in a way that it would be programmed on a computer.

Solution 3a


Choose $x^{(0)}$

for $k = 0, 1, 2, \ldots$ until <convergence condition>

 for $i = 1, \ldots, n$

  $x_i^{(k+1)} = \frac{1}{a_{ii}} \Big( b_i - \sum_{j \ne i} a_{ij}\, x_j^{(k)} \Big)$

 end

end

In matrix form this is $x^{(k+1)} = D^{-1}\big(b - (L + U)\, x^{(k)}\big)$, where $A = D + L + U$, $D$ is diagonal, and $L$, $U$ are the strictly lower and upper triangular parts of $A$, respectively.
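The pseudocode above can be sketched directly in code; the convergence test and the example system below are illustrative choices, not part of the exam solution.

```python
# A minimal Jacobi iteration matching the pseudocode above.
def jacobi(A, b, x0, tol=1e-10, max_iter=1000):
    n = len(b)
    x = list(x0)
    for _ in range(max_iter):
        x_new = [
            (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)
        ]
        # <convergence condition>: max-norm of the change between sweeps
        if max(abs(x_new[i] - x[i]) for i in range(n)) < tol:
            return x_new
        x = x_new
    return x

# strictly diagonally dominant example, so the iteration converges
# (cf. part c); the exact solution is x = (1, 1, 1)
A = [[4.0, 1.0, 0.0],
     [1.0, 5.0, 2.0],
     [0.0, 2.0, 6.0]]
b = [5.0, 8.0, 8.0]
x = jacobi(A, b, [0.0, 0.0, 0.0])
print(x)
```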

Problem 3b


Suppose $A$ has $m$ non-zero elements, with $m \ge n$. How many operations per iteration does the Jacobi iteration take?

Solution 3b


The diagonal entries of $A$ are non-zero, since otherwise $D^{-1}$, and hence the Jacobi iteration, would not exist.


Therefore $A$ contains $m - n$ off-diagonal non-zero entries.


The computation during each iteration is given by

$x_i^{(k+1)} = \frac{1}{a_{ii}} \Big( b_i - \sum_{j \ne i} a_{ij}\, x_j^{(k)} \Big), \qquad i = 1, \ldots, n.$

Each off-diagonal non-zero $a_{ij}$ contributes one multiplication, and each row costs one division. Therefore there are $m - n$ multiplications (plus $n$ divisions) in each iteration, i.e. $\mathcal{O}(m)$ operations.
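The count can be made concrete by performing one Jacobi sweep over a sparsely stored matrix; the $(i, j, \text{value})$ storage and the small example matrix are illustrative assumptions.

```python
# Counting work in one Jacobi sweep for a matrix stored as (i, j, value)
# triples: one multiplication per off-diagonal non-zero, one division per
# row, i.e. (m - n) multiplies plus n divides.
entries = [(0, 0, 4.0), (0, 1, 1.0),
           (1, 0, 1.0), (1, 1, 5.0), (1, 2, 2.0),
           (2, 1, 2.0), (2, 2, 6.0)]          # m = 7 non-zeros, n = 3
b = [5.0, 8.0, 8.0]
x = [0.0, 0.0, 0.0]
n = 3

diag = {i: v for i, j, v in entries if i == j}
acc = list(b)
multiplies = 0
for i, j, v in entries:
    if i != j:
        acc[i] -= v * x[j]                    # one multiply per off-diagonal
        multiplies += 1
x_new = [acc[i] / diag[i] for i in range(n)]  # n divisions

m = len(entries)
print(multiplies == m - n)  # True: 4 multiplies, 3 divisions
```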

Problem 3c


Assume that $A$ is strictly diagonally dominant: for $i = 1, \ldots, n$,

$|a_{ii}| > \sum_{j \ne i} |a_{ij}|.$

Show that the Jacobi iteration converges for any guess $x^{(0)}$. (Hint: You may use Gerschgorin's theorem without proving it.)

Solution 3c


Let $J = -D^{-1}(L + U)$ denote the Jacobi iteration matrix.


Theorem 8.2.1 [SB] states that $\rho(J) < 1$ if and only if the Jacobi iteration converges for every guess $x^{(0)}$.


Matrix multiplication and the definitions of $D$, $L$, $U$ give the explicit entrywise value of $J$:

$J_{ij} = -\frac{a_{ij}}{a_{ii}}$ for $j \ne i$, and $J_{ii} = 0$, for $i = 1, \ldots, n$.


Then, by Gerschgorin's theorem every eigenvalue $\lambda$ of $J$ lies in a disk centered at $J_{ii} = 0$ of radius

$r_i = \sum_{j \ne i} \frac{|a_{ij}|}{|a_{ii}|} < 1,$

where the strict inequality is exactly strict diagonal dominance. Hence $\rho(J) < 1$, and we have the result.
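The Gerschgorin bound is easy to verify numerically; the strictly diagonally dominant example matrix below is an illustrative assumption.

```python
# Check of part (c): for a strictly diagonally dominant A, every
# Gerschgorin disk of the Jacobi iteration matrix J = -D^{-1}(L+U)
# is centered at 0 with radius < 1, forcing rho(J) < 1.
A = [[4.0, 1.0, 2.0],
     [1.0, 5.0, 2.0],
     [0.0, 2.0, 6.0]]
n = len(A)

# Gerschgorin radius of row i of J: sum_{j != i} |a_ij| / |a_ii|
radii = [sum(abs(A[i][j]) for j in range(n) if j != i) / abs(A[i][i])
         for i in range(n)]
print(max(radii))  # 0.75, which is < 1
```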