
Linear Algebra with Differential Equations/Printable version

From Wikibooks, open books for an open world


Linear Algebra with Differential Equations

The current, editable version of this book is available in Wikibooks, the open-content textbooks collection, at
https://en.wikibooks.org/wiki/Linear_Algebra_with_Differential_Equations

Permission is granted to copy, distribute, and/or modify this document under the terms of the Creative Commons Attribution-ShareAlike 3.0 License.

Homogeneous Linear Differential Equations

Introduction



We call an expression in the form

X' = AX + G(t)

homogeneous if G(t) ≡ 0. In earlier methods of differential equations, the solution X turned out to contain an exponential of the transcendental number e, so once a uniqueness theorem is developed we can propose a candidate solution of that form, substitute it into the equation, and determine whether it works and, if so, how to obtain the solution and its corresponding exponentials.

Existence and Uniqueness Theorem

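A standard statement of the theorem for first-order linear systems: if the entries of A(t) and G(t) are continuous on an open interval containing t_0, then for any initial vector X_0 the initial value problem

X' = A(t)X + G(t),    X(t_0) = X_0

has exactly one solution, and that solution exists on the entire interval.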

Results


So, because the exponential function appeared many times in simpler differential equations, we will guess that the solution has the form X = u e^{λt}, where u is a constant vector of coefficients.

Thus, substituting X = u e^{λt} into X' = AX gives

λ u e^{λt} = A u e^{λt}

which, after dividing out the exponential, reduces to

(A - λI)u = 0

There is a small lie here: we are also making one more assumption, namely that A is a constant matrix. But the equation above is exactly the definition of an eigenvalue-eigenvector pair! Thus with a two-by-two matrix there are two linearly independent solutions, and by the principle of superposition, multiplying an augmented matrix of these two solutions by a constant vector gives the fundamental set of solutions we are looking for.

However, because of the properties of these eigenvalues (and because we want real solutions, to aid the analysis of physical models that use these techniques), the fundamental set of solutions is constructed differently in each of the three possible cases the pair of eigenvalues can fall under:


Homogeneous Linear Differential Equations/Real, Distinct Eigenvalues Method

If the eigenvalues of the characteristic equation are real and distinct, mathematically nothing is really wrong. Thus, by our guess and the existence and uniqueness theorem, for an n-by-n matrix the solution set is determined by:

X_k = u^{(k)} e^{λ_k t},    k = 1, 2, ..., n

where the λ_k are the eigenvalues and the u^{(k)} are the corresponding eigenvectors.

Then, since a linear combination of solutions is also a solution (which can be verified directly from the structure of the problem), we can form the general solution as such:

X = c_1 u^{(1)} e^{λ_1 t} + c_2 u^{(2)} e^{λ_2 t} + ... + c_n u^{(n)} e^{λ_n t}
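
As a quick illustration (the matrix here is chosen only as a convenient example), take the system X' = AX where A is the two-by-two matrix with rows (1, 1) and (4, 1). The characteristic equation det(A - λI) = (1 - λ)^2 - 4 = 0 gives λ_1 = 3 and λ_2 = -1, with eigenvectors u^{(1)} = (1, 2) and u^{(2)} = (1, -2), so the general solution is

X = c_1 (1, 2) e^{3t} + c_2 (1, -2) e^{-t}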

What's interesting is when the eigenvalues are not so simple.


Homogeneous Linear Differential Equations/Imaginary Eigenvalues Method

When eigenvalues become complex, mathematically there still isn't much wrong. However, in certain physical applications (like oscillations without damping) there is a problem of interpretation: what exactly does an imaginary answer mean? Thus there is a concerted effort to "mathematically hide" the complex variables in order to obtain a more approachable answer for physicists and engineers. Essentially, we have a solution that in part looks like this:

X = u e^{(α + iβ)t}

But by Euler's formula:

e^{(α + iβ)t} = e^{αt}(cos βt + i sin βt)

Now we distribute the terms, writing u = a + i b with a and b real vectors:

X = e^{αt}(a cos βt - b sin βt) + i e^{αt}(a sin βt + b cos βt)

Since this is a linear combination of two terms, and i is a constant (complex, but still a constant), each part, the real part and the imaginary part, is itself an element of the set of solutions, and the general solution can be constructed from them.
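
For a concrete illustration (again with a matrix chosen purely as an example), take A with rows (0, 1) and (-1, 0). Its eigenvalues are ±i, so α = 0 and β = 1, and the eigenvector for λ = i is u = (1, i) = (1, 0) + i(0, 1). The construction above then yields the two real solutions

X_1 = (cos t, -sin t)    and    X_2 = (sin t, cos t)

and the general solution is X = c_1 X_1 + c_2 X_2.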


Homogeneous Linear Differential Equations/Repeated Eigenvalue Method

When the eigenvalue is repeated we have a problem similar to the one in normal differential equations when a root is repeated: we get the same solution twice, which is not linearly independent, and which suggests there is a different second solution. Because the case is so similar to normal differential equations, let us try X = u t e^{λt}; we see that this does not work on its own. However, X = u t e^{λt} + w e^{λt} DOES work (for the observant reader, this gives a hint to the changes in the Method of Undetermined Coefficients as compared to differential equations without linear algebra).

In fact, if we substitute this form we see that (A - λI)u = 0, where u is a typical eigenvector; and we see that w must satisfy (A - λI)w = u, where w is a generalized eigenvector defined by that equation.

Thus our fundamental set of solutions is:

X_1 = u e^{λt},    X_2 = u t e^{λt} + w e^{λt}
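
A small illustration (the matrix is an arbitrary example): let A have rows (1, 1) and (0, 1). Then λ = 1 is a repeated eigenvalue with eigenvector u = (1, 0), and solving (A - λI)w = u gives w = (0, 1) (up to adding a multiple of u). The fundamental set is therefore

X_1 = (1, 0) e^{t},    X_2 = (1, 0) t e^{t} + (0, 1) e^{t}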

Using the same process of derivation, eigenvalues of higher multiplicity can be handled similarly.


Heterogeneous Linear Differential Equations

Introduction


We now tackle the problem of G(t) being nonzero, so that we have the following problem:

X' = AX + G(t)

There are four reasonable ways to solve this.


Heterogeneous Linear Differential Equations/Diagonalization

First of all (as the title rather obviously suggests), A must be diagonalizable. Second, the eigenvalues and eigenvectors of A are found, and we form the matrix T, an augmented matrix of the eigenvectors, and the matrix D, a diagonal matrix consisting of the corresponding eigenvalues on the main diagonal, each in the same column as its eigenvector. Then with our central problem:

X' = AX + G(t)

We substitute X = TY:

T Y' = A T Y + G(t)

Then left-multiply by T^{-1}:

Y' = T^{-1} A T Y + T^{-1} G(t)

As a consequence of linear algebra we have the following identity:

T^{-1} A T = D

Thus:

Y' = D Y + T^{-1} G(t)

And because D is diagonal, the problem is a series of one-dimensional normal differential equations, which can be solved for Y and then used to find X = TY.
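
Written out component by component (using h(t) = T^{-1} G(t) as a shorthand introduced here), each row of the diagonal system reads

y_k' = λ_k y_k + h_k(t)

which an integrating factor solves as

y_k(t) = c_k e^{λ_k t} + e^{λ_k t} ∫ e^{-λ_k s} h_k(s) ds

and then X = TY recovers the original unknowns.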


Heterogeneous Linear Differential Equations/Method of Undetermined Coefficients

This is very similar to the Method of Undetermined Coefficients encountered in normal differential equations, with some slight exceptions to the "rules" of guessing. Actually, there is only one extra rule. In the normal method of undetermined coefficients, when there was a conflict between the characteristic equation and the particular solution, the guess was multiplied by the independent variable. In this case we multiply by t while also keeping the un-multiplied term, to include more possible solutions. That, and the significance of getting a trivial solution when finding eigenvalues must be kept in mind while working through the problem. Other than that, it's pretty much as it was, and it is a very powerful method (although it can be quite tedious).
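
As a sketch of how the guessing goes (the forcing term here is only an assumed example): if G(t) = g e^{μt} for a constant vector g and μ is not an eigenvalue of A, try X_p = a e^{μt}; substitution gives (A - μI)a = -g, a linear system for a. If μ is an eigenvalue of A, try X_p = a t e^{μt} + b e^{μt} instead and solve for both a and b.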


Heterogeneous Linear Differential Equations/Variation of Parameters

As with the variation of parameters in normal differential equations (a lot of similarities here!), we take a fundamental solution and, by taking a product with a to-be-found vector, see if we can come upon a particular solution by these means. In other words, since the general solution of the homogeneous equation can be expressed as X = Ψ(t)C, where C is a constant vector and Ψ(t) is the augmented matrix of independent solutions to the homogeneous equation, we try out a form like so:

X = Ψ(t) v(t)

And determine v(t) to find a particular solution. The math is fairly straightforward and left as an exercise for the reader, and leaves us with:

X_p = Ψ(t) ∫ Ψ^{-1}(t) G(t) dt

... which is a fairly strong, straightforward, yet exceedingly complicated formula.
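
For the reader checking the exercise: substituting X = Ψ(t) v(t) into X' = AX + G(t) gives

Ψ' v + Ψ v' = A Ψ v + G(t)

and since every column of Ψ solves the homogeneous equation, Ψ' = AΨ, so the first terms on each side cancel, leaving Ψ v' = G(t). Hence v'(t) = Ψ^{-1}(t) G(t), which integrates to the formula above.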


Heterogeneous Linear Differential Equations/Laplacian Transforms

Yet AGAIN, this is very similar to the normal technique. The only nuance is how to take the Laplace transform of a matrix; however, since the Laplace transform is by definition basically an integral, we simply take the transform of each term inside the matrix. The transform then boils the problem down to an exercise in linear algebra, and the inverse Laplace transform works the same way: on each term in the matrix. It's nearly identical to how it worked in normal differential equations.
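
In symbols, writing L{X}(s) for the entry-by-entry transform and assuming A is constant, transforming X' = AX + G(t) gives

s L{X}(s) - X(0) = A L{X}(s) + L{G}(s)

so that

L{X}(s) = (sI - A)^{-1} (X(0) + L{G}(s))

and X(t) is recovered by taking the inverse transform of each entry.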


Non-Linear Differential Equations

Some Graphical Analysis


So far we've dealt with A being a constant matrix, and other niceties; but when the system does not have that form, in particular when it is non-linear, the best way to study the solutions is by graphical means. By putting the components of X on the axes of a graph (the phase plane), we can note several types of behavior that suggest the form of a solution.

So without further ado, here are the main types of behavior and their suggested causes:

A nodal source (trajectories tend away from a point): real, distinct, positive eigenvalues.

A nodal sink (trajectories approach in towards a point): real, distinct, negative eigenvalues.

A saddle point (trajectories approach from one direction and deviate away in another): real, distinct eigenvalues of opposite sign.

A spiral point (trajectories spiral in towards or away from a point): complex eigenvalues with nonzero real part.

A series of ellipses around a point: purely imaginary eigenvalues.

A star point (straight lines deviating from or coming towards a point): repeated eigenvalues.
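
As a quick cross-check with the earlier example matrix with rows (0, 1) and (-1, 0): its eigenvalues are the purely imaginary pair ±i, and its trajectories x_1 = c_1 cos t + c_2 sin t, x_2 = -c_1 sin t + c_2 cos t satisfy x_1^2 + x_2^2 = c_1^2 + c_2^2, so they trace circles (ellipses in general) around the origin, as the list predicts.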