Let $T\colon\mathbb{R}^n\to\mathbb{R}^m$ such that $T(x) = Ax$, where $A\in\mathbb{R}^{m\times n}$. Then for $u, v\in\mathbb{R}^n$ and $\alpha, \beta\in\mathbb{R}$ it holds that
\[
T(\alpha u + \beta v) = A(\alpha u + \beta v) = \alpha Au + \beta Av = \alpha T(u) + \beta T(v),
\]
so $T$ is a linear transformation.
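As a concrete illustration (the matrix and vectors below are a made-up example, not taken from the exercise), the linearity identity can be checked numerically:

```latex
% Hypothetical example: A is a fixed 2x2 matrix, u = e_1, v = e_2,
% and alpha = beta = 2, so that alpha*u + beta*v = (2, 2)^T.
\[
A(\alpha u + \beta v)
  = \begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix}
    \begin{pmatrix} 2 \\ 2 \end{pmatrix}
  = \begin{pmatrix} 6 \\ 6 \end{pmatrix}
  = 2\begin{pmatrix} 1 \\ 0 \end{pmatrix}
  + 2\begin{pmatrix} 2 \\ 3 \end{pmatrix}
  = \alpha A u + \beta A v .
\]
```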
The matrix $A\in\mathbb{R}^{m\times n}$ is a linear mapping $\mathbb{R}^n\to\mathbb{R}^m$. Let $\{v_1,\dots,v_k\}$ be a basis for $\ker A$. Then the space $\ker A$ has dimension $k$, which is at most $n$. Then, using the dimension formula we have
\[
\dim\ker A + \dim\operatorname{im} A = n,
\]
so rearranging we get $\operatorname{rank} A = \dim\operatorname{im} A = n - k$.
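For instance (a hypothetical $2\times 3$ matrix, not from the text), the rank can be read off from the kernel via the dimension formula $\operatorname{rank} A = n - \dim\ker A$:

```latex
% Hypothetical example: a 2x3 matrix with a one-dimensional kernel.
\[
A = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix},
\qquad
\ker A = \operatorname{span}\{(-1, -1, 1)^{\top}\},
\]
so $\dim\ker A = 1$ and therefore
\[
\operatorname{rank} A = n - \dim\ker A = 3 - 1 = 2 .
\]
```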
Let $A\in\mathbb{R}^{m\times n}$ be a matrix of rank 1. Then, the image of $A$ is a space spanned by a single vector, say $u\in\mathbb{R}^m$, and $Ax = c(x)\,u$ for some nonzero linear functional $c\colon\mathbb{R}^n\to\mathbb{R}$. We can assume that $\|u\| = 1$, since the spanning vector is unique up to scaling. Then, the kernel of $A$ is given by the vectors $x$ for which $c(x) = 0$. Next, consider the matrix $B = uv^{\top}$, where $v\in\mathbb{R}^n$ is the vector representing $c$, i.e.\ $c(x) = v^{\top}x$, so that $\operatorname{rank} B = 1$ as well, and $Bx = (v^{\top}x)\,u = c(x)\,u$. It is easy to see that $A$ and $B$ describe the same linear transformation, so they are equal as matrices. The representation $A = uv^{\top}$ is not unique, since we could scale one of the vectors $u$ and $v$ arbitrarily as long as we scale the other accordingly.
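As an illustration (the numbers are a made-up example), a rank-1 matrix factors as an outer product, and the factorization is only determined up to reciprocal scalings:

```latex
% Hypothetical example of a rank-1 matrix written as an outer product uv^T.
\[
A = \begin{pmatrix} 2 & 4 \\ 1 & 2 \end{pmatrix}
  = \begin{pmatrix} 2 \\ 1 \end{pmatrix}
    \begin{pmatrix} 1 & 2 \end{pmatrix}
  = u v^{\top},
\qquad
A = \left(\tfrac{1}{2}\, u\right)\left(2\, v\right)^{\top},
\]
% Both factorizations yield the same matrix: scaling u by 1/2 is
% compensated by scaling v by 2.
```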
a) Performing the vector space operations coordinate-wise preserves the vector space structure in the product space: each axiom holds in the product because it holds in each coordinate separately.
b) Let $T\colon V\times W\to U$, $T(v, w) = v - w$, where $V$ and $W$ are subspaces of a vector space $U$, and let $(v_1, w_1), (v_2, w_2)\in V\times W$ and $\alpha, \beta\in\mathbb{R}$. Then we have
\[
T\bigl(\alpha(v_1, w_1) + \beta(v_2, w_2)\bigr) = (\alpha v_1 + \beta v_2) - (\alpha w_1 + \beta w_2)
\]
and
\[
\alpha T(v_1, w_1) + \beta T(v_2, w_2) = \alpha(v_1 - w_1) + \beta(v_2 - w_2),
\]
which agree, so $T$ is a linear operator.
c) We have $\ker T = \{(v, v) : v\in V\cap W\}$, where $v\mapsto(v, v)$ is an isomorphism from $V\cap W$ onto $\ker T$, so $\dim\ker T = \dim(V\cap W)$. Furthermore, we have by definition that $\operatorname{im} T = V + W$, and $\dim(V\times W) = \dim V + \dim W$. Therefore, the dimension formula has the form
\[
\dim(V + W) = \dim V + \dim W - \dim(V\cap W).
\]
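Assuming the dimension formula in question is Grassmann's formula $\dim(V+W) = \dim V + \dim W - \dim(V\cap W)$, a small made-up example in $\mathbb{R}^3$ shows it in action:

```latex
% Hypothetical example: two coordinate planes in R^3 meeting in a line.
% V = span{e_1, e_2}, W = span{e_2, e_3}: then V + W = R^3 and
% V \cap W = span{e_2}.
\[
\dim(V + W) = \dim V + \dim W - \dim(V\cap W) = 2 + 2 - 1 = 3 .
\]
```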
We can write
,
and
. We have the following multiplication table
where each entry is the product of the matrix heading its column, multiplied from the right by the matrix heading its row. Then,
. For
we then have, in the given basis, the form
.
The matrix with the given property satisfies the equation
. Solving this yields that the matrix must have the form
for any