4.2. Matrices II
This section deals with the concepts of vector spaces associated with matrices, span, rank, invertible matrices, similar matrices, Gram matrices, pseudo inverses, traces, and determinants.
4.2.1. Spaces Associated with a Matrix
Definition 4.32 (Column space)
The column space of a matrix is defined as the vector space spanned by the columns of the matrix.
Let $A$ be an $m \times n$ matrix with columns $a_1, a_2, \dots, a_n \in \mathbb{F}^m$.
Then the column space is given by

$$
\mathcal{C}(A) = \text{span}\{a_1, a_2, \dots, a_n\} = \left\{ x \in \mathbb{F}^m : x = \sum_{i=1}^n \alpha_i a_i \text{ for some } \alpha_i \in \mathbb{F} \right\}.
$$
Definition 4.33 (Row space)
The row space of a matrix is defined as the vector space spanned by the rows of the matrix.
Let $A$ be an $m \times n$ matrix with rows $r_1, r_2, \dots, r_m$ (treated as row vectors in $\mathbb{F}^n$).
Then the row space is given by

$$
\mathcal{R}(A) = \text{span}\{r_1, r_2, \dots, r_m\}.
$$
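These definitions can be explored numerically. The sketch below uses NumPy (an assumption; this text does not otherwise use code) to build a matrix whose third column is the sum of the first two, so its column space is a two-dimensional subspace of $\mathbb{R}^3$. An orthonormal basis for the column space can be read off from the left singular vectors with nonzero singular values.

```python
import numpy as np

# A 3x3 matrix whose third column is the sum of the first two,
# so the column space is a 2-dimensional subspace of R^3.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

# The dimension of the column space equals the rank of A.
dim_col_space = np.linalg.matrix_rank(A)

# An orthonormal basis of the column space: the left singular
# vectors corresponding to nonzero singular values.
U, s, Vh = np.linalg.svd(A)
basis = U[:, s > 1e-10]
```

The same construction applied to $A^T$ would produce a basis for the row space.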
4.2.2. Rank
Definition 4.34 (Column rank)
The column rank of a matrix is defined as the maximum number of columns which are linearly independent. In other words, the column rank is the dimension of the column space of the matrix.
Definition 4.35 (Row rank)
The row rank of a matrix is defined as the maximum number of rows which are linearly independent. In other words, the row rank is the dimension of the row space of the matrix.
Theorem 4.21 (Equality of row and column rank)
The column rank and row rank of a matrix are equal.
Definition 4.36 (Rank)
The rank of a matrix is defined to be equal to its column rank which is equal to its row rank.
Lemma 4.1 (Rank bounds)
For an $m \times n$ matrix $A$,

$$
0 \leq \text{rank}(A) \leq \min(m, n).
$$
Lemma 4.2 (Zero rank matrix)
The rank of a matrix is 0 if and only if it is a zero matrix.
Definition 4.37 (Full rank matrix)
An $m \times n$ matrix $A$ is called a full rank matrix if $\text{rank}(A) = \min(m, n)$.
In other words, it is either a full column rank matrix ($\text{rank}(A) = n$) or a full row rank matrix ($\text{rank}(A) = m$) or both.
Lemma 4.3 (Rank of product of two matrices)
Let $A$ be an $m \times n$ matrix and $B$ be an $n \times p$ matrix. Then

$$
\text{rank}(AB) \leq \min\left(\text{rank}(A), \text{rank}(B)\right).
$$
Lemma 4.4 (Full rank post multiplication)
Let $A$ be an $m \times n$ matrix and let $C$ be an $n \times p$ matrix of rank $n$ (full row rank). Then

$$
\text{rank}(AC) = \text{rank}(A).
$$
Lemma 4.5 (Full rank pre multiplication)
Let $A$ be an $m \times n$ matrix and let $P$ be a $p \times m$ matrix of rank $m$ (full column rank). Then

$$
\text{rank}(PA) = \text{rank}(A).
$$
Lemma 4.6 (Rank of a diagonal matrix)
The rank of a diagonal matrix is equal to the number of nonzero elements on its main diagonal.
Proof. The columns corresponding to zero diagonal entries are zero columns; the remaining columns are linearly independent. The number of linearly independent rows is the same. Hence their count gives the rank of the matrix.
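Lemma 4.3 and Lemma 4.6 are easy to check numerically. The following is a NumPy sketch (NumPy is an assumption, not part of the text): a diagonal matrix with two nonzero entries has rank two, and the rank of a product never exceeds the smaller rank of its factors.

```python
import numpy as np

# Rank of a diagonal matrix = number of nonzero diagonal entries (Lemma 4.6).
D = np.diag([3.0, 0.0, 5.0, 0.0])
rank_D = np.linalg.matrix_rank(D)

# Rank of a product is bounded by the ranks of the factors (Lemma 4.3).
A = np.array([[1.0, 2.0], [2.0, 4.0], [0.0, 1.0]])  # rank 2
B = np.array([[1.0, 1.0, 0.0], [1.0, 1.0, 0.0]])    # rank 1
rank_AB = np.linalg.matrix_rank(A @ B)               # at most min(2, 1) = 1
```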
4.2.3. Invertible Matrices
We say that an $m \times n$ matrix $A$ has a left inverse if there exists an $n \times m$ matrix $L$ such that $L A = I_n$, and a right inverse if there exists an $n \times m$ matrix $R$ such that $A R = I_m$.
We say that a square matrix $A$ is invertible when it has both a left inverse and a right inverse; in that case the two coincide.
A special left or right inverse is the pseudo inverse, which is denoted by $A^{\dagger}$.
The column space of a matrix $A$ is denoted by $\mathcal{C}(A)$.
We say that a matrix is symmetric when $A = A^T$ and Hermitian when $A = A^H$.
When a square matrix is not invertible, we say that it is singular. A non-singular matrix is invertible.
Definition 4.38 (Invertible matrix)
A square matrix $A$ is called invertible if there exists another square matrix $B$ of the same size such that

$$
AB = BA = I.
$$

The matrix $B$ is called the inverse of $A$ and is denoted by $A^{-1}$.
Lemma 4.7 (Invertibility of the inverse)
If $A$ is invertible, then its inverse $A^{-1}$ is also invertible and $(A^{-1})^{-1} = A$.
Lemma 4.8 (Invertibility of identity matrix)
The identity matrix $I$ is invertible.
Proof. We can see that $I I = I$. Thus $I$ is its own inverse: $I^{-1} = I$.
Lemma 4.9 (Linear independence of columns of invertible matrices)
If an $n \times n$ matrix $A$ is invertible, then its columns are linearly independent.
Proof. Assume $A$ is invertible.
Assume that the columns of $A$ are linearly dependent. Then there exists a vector $u \neq 0$ such that $A u = 0$. Multiplying both sides by $A^{-1}$, we get

$$
u = A^{-1} A u = A^{-1} 0 = 0,
$$

a contradiction. Hence the columns of $A$ are linearly independent.
Lemma 4.10 (Span of columns of invertible matrix)
If an $n \times n$ matrix $A$ is invertible, then its columns span $\mathbb{F}^n$.
Proof. Assume $A$ is invertible.
Now let $y \in \mathbb{F}^n$ be any arbitrary vector. We need to show that there exists $x \in \mathbb{F}^n$ such that $y = A x$. But

$$
y = I y = (A A^{-1}) y = A (A^{-1} y).
$$

Thus if we choose $x = A^{-1} y$, then $A x = y$. Thus the columns of $A$ span $\mathbb{F}^n$.
Lemma 4.11 (Columns of invertible matrix as basis)
If an $n \times n$ matrix $A$ is invertible, then its columns form a basis of $\mathbb{F}^n$.
Proof. In an $n$-dimensional space, any set of $n$ linearly independent vectors which also spans the space is a basis. By Lemma 4.9 the columns of $A$ are linearly independent, and by Lemma 4.10 they span $\mathbb{F}^n$.
Lemma 4.12 (Invertibility of transpose)
If $A$ is invertible, then $A^T$ is invertible and $(A^T)^{-1} = (A^{-1})^T$.
Proof. Assume $A$ is invertible. Then $A A^{-1} = A^{-1} A = I$.
Applying transpose on both sides we get

$$
(A^{-1})^T A^T = A^T (A^{-1})^T = I.
$$

Thus $A^T$ is invertible and $(A^T)^{-1} = (A^{-1})^T$.
Lemma 4.13 (Invertibility of Hermitian transpose)
If $A$ is invertible, then $A^H$ is invertible and $(A^H)^{-1} = (A^{-1})^H$.
Proof. Assume $A$ is invertible. Then $A A^{-1} = A^{-1} A = I$.
Applying conjugate transpose on both sides we get

$$
(A^{-1})^H A^H = A^H (A^{-1})^H = I.
$$

Thus $A^H$ is invertible and $(A^H)^{-1} = (A^{-1})^H$.
Lemma 4.14 (Invertibility of matrix product)
If $A$ and $B$ are invertible matrices of the same size, then $AB$ is invertible and $(AB)^{-1} = B^{-1} A^{-1}$.
Proof. We note that

$$
(AB)(B^{-1} A^{-1}) = A (B B^{-1}) A^{-1} = A A^{-1} = I.
$$

Similarly,

$$
(B^{-1} A^{-1})(AB) = B^{-1} (A^{-1} A) B = B^{-1} B = I.
$$

Thus $AB$ is invertible and $(AB)^{-1} = B^{-1} A^{-1}$.
Lemma 4.15 (Group of invertible matrices)
The set of $n \times n$ invertible matrices forms a group under matrix multiplication.
Proof. We verify the properties of a group.
- [Closure] If $A$ and $B$ are invertible then $AB$ is invertible (Lemma 4.14). Hence the set is closed.
- [Associativity] Matrix multiplication is associative.
- [Identity element] $I$ is invertible and $A I = I A = A$ for all invertible matrices $A$.
- [Inverse element] If $A$ is invertible then $A^{-1}$ is also invertible (Lemma 4.7).
Thus the set of invertible matrices is indeed a group under matrix multiplication.
Lemma 4.16 (Rank of an invertible matrix)
An $n \times n$ invertible matrix $A$ has full rank; i.e., $\text{rank}(A) = n$.
Corollary 4.5 (Equality of rank of a matrix and its inverse)
The ranks of an invertible matrix and its inverse are the same.
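The main invertibility identities above can be verified numerically. This NumPy sketch (NumPy is an assumption, not part of the text) checks the product rule for inverses (Lemma 4.14), the transpose rule (Lemma 4.12), and the full rank property (Lemma 4.16) on small matrices.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])  # det = 1, invertible
B = np.array([[1.0, 2.0], [0.0, 1.0]])  # det = 1, invertible

# (AB)^{-1} = B^{-1} A^{-1}  (Lemma 4.14)
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
ok_product = np.allclose(lhs, rhs)

# (A^T)^{-1} = (A^{-1})^T  (Lemma 4.12)
ok_transpose = np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)

# An invertible n x n matrix has full rank n (Lemma 4.16).
rank_A = np.linalg.matrix_rank(A)
```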
4.2.4. Similar Matrices
Definition 4.39 (Similar matrix)
An $n \times n$ matrix $B$ is said to be similar to an $n \times n$ matrix $A$ if there exists an invertible $n \times n$ matrix $C$ such that

$$
B = C^{-1} A C.
$$
Lemma 4.17 (Symmetry of similarity)
If $B$ is similar to $A$, then $A$ is similar to $B$.
Proof. We have

$$
B = C^{-1} A C \implies A = C B C^{-1} = (C^{-1})^{-1} B (C^{-1}).
$$

Thus there exists an invertible matrix $D = C^{-1}$ such that $A = D^{-1} B D$.
Thus $A$ is similar to $B$.
Lemma 4.18 (Rank of similar matrices)
Similar matrices have the same rank.
Proof. Let $B$ be similar to $A$. Then $B = C^{-1} A C$ for some invertible matrix $C$.
Since $C$ is full rank, using Lemma 4.4 we have $\text{rank}(A C) = \text{rank}(A)$,
and using Lemma 4.5 we have $\text{rank}(C^{-1} (A C)) = \text{rank}(A C)$.
Thus

$$
\text{rank}(B) = \text{rank}(C^{-1} A C) = \text{rank}(A).
$$
Lemma 4.19 (Similarity as equivalence relation)
Similarity is an equivalence relation on the set of $n \times n$ matrices.
Proof. Let $A$, $B$, $C$ be $n \times n$ matrices.
- [Reflexivity] $A$ is similar to itself through the invertible matrix $I$, since $A = I^{-1} A I$.
- [Symmetry] If $A$ is similar to $B$, then $B$ is similar to $A$ by Lemma 4.17.
- [Transitivity] If $A$ is similar to $B$ via $P$ s.t. $A = P^{-1} B P$, and $B$ is similar to $C$ via $Q$ s.t. $B = Q^{-1} C Q$, then $A$ is similar to $C$ via $QP$, since $A = P^{-1} Q^{-1} C Q P = (QP)^{-1} C (QP)$.
Thus similarity is an equivalence relation on the set of square matrices, and if $A$ is any matrix then the set of matrices similar to $A$ forms an equivalence class.
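The behavior of similar matrices can be checked numerically. This NumPy sketch (NumPy is an assumption, not part of the text) constructs a matrix similar to $A$, verifies that the rank is preserved (Lemma 4.18), and recovers $A$ via the symmetry argument of Lemma 4.17.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
C = np.array([[1.0, 1.0], [0.0, 1.0]])  # invertible since det(C) = 1

# B = C^{-1} A C is similar to A (Definition 4.39).
B = np.linalg.inv(C) @ A @ C

# Similar matrices have the same rank (Lemma 4.18).
same_rank = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)

# Symmetry of similarity (Lemma 4.17): A = D^{-1} B D with D = C^{-1}.
D = np.linalg.inv(C)
recovered = np.allclose(np.linalg.inv(D) @ B @ D, A)
```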
4.2.5. Gram Matrices
Definition 4.40 (Gram matrix)
The Gram matrix of the columns of an $m \times n$ matrix $A$ is given by

$$
G = A^H A.
$$

Definition 4.41 (Frame operator)
The frame operator is the Gram matrix of the rows of $A$:

$$
F = A A^H.
$$
Usually, when we talk about the Gram matrix of a matrix, we mean the Gram matrix of its column vectors.
Remark 4.3 (Gram matrix and frame operators for real matrices)
For a real matrix $A$, the Gram matrix of its column vectors is $A^T A$ and the frame operator for its row vectors is $A A^T$.
The following results apply equally well to the real case.
Lemma 4.20 (Linear dependence of columns and Gram matrix)
The columns of a matrix are linearly dependent if and only if
the Gram matrix of its column vectors $A^H A$ is not invertible.
Proof. Let $A$ be an $m \times n$ matrix and let $G = A^H A$ be the Gram matrix of its columns.
If the columns of $A$ are linearly dependent, then there exists a vector $u \neq 0$ such that

$$
A u = 0.
$$

Thus

$$
G u = A^H A u = 0.
$$

Hence the columns of $G$ are also linearly dependent.
Hence $G$ is not invertible.
Conversely, let us assume that $G$ is not invertible.
Thus the columns of $G$ are dependent.
There exists a vector $v \neq 0$ such that

$$
G v = 0.
$$

Now

$$
v^H G v = v^H A^H A v = \| A v \|_2^2 = 0.
$$

From the previous equation, we have $A v = 0$.
Since $v \neq 0$, hence the columns of $A$ are also linearly dependent.
Corollary 4.6 (Linear independence of columns and Gram matrix)
The columns of a matrix are linearly independent if and only if
the Gram matrix of its column vectors $A^H A$ is invertible.
Proof. The columns of $A$ are linearly independent if and only if they are not linearly dependent.
By Lemma 4.20, the Gram matrix is not invertible if and only if the columns of $A$ are linearly dependent.
Corollary 4.7
Let $A$ be a matrix with full column rank. Then its Gram matrix $A^H A$ is invertible.
Lemma 4.21
The null spaces of a matrix $A$ and of its Gram matrix $A^H A$ coincide; i.e., $\mathcal{N}(A) = \mathcal{N}(A^H A)$.
Proof. Let $u \in \mathcal{N}(A)$, i.e. $A u = 0$. Then

$$
A^H A u = A^H (A u) = 0.
$$

Thus $u \in \mathcal{N}(A^H A)$, giving $\mathcal{N}(A) \subseteq \mathcal{N}(A^H A)$.
Now let $v \in \mathcal{N}(A^H A)$, i.e. $A^H A v = 0$.
Thus we have

$$
v^H A^H A v = \| A v \|_2^2 = 0 \implies A v = 0.
$$

Thus $v \in \mathcal{N}(A)$, giving $\mathcal{N}(A^H A) \subseteq \mathcal{N}(A)$.
Lemma 4.22
The rows of a matrix $A$ are linearly dependent if and only if the Gram matrix of its row vectors $A A^H$ is not invertible.
Proof. The rows of $A$ are linearly dependent if and only if the columns of $A^H$ are linearly dependent.
There exists a vector $u \neq 0$ such that

$$
A^H u = 0.
$$

Thus

$$
A A^H u = 0.
$$

Since $u \neq 0$, the columns of $A A^H$ are linearly dependent, hence $A A^H$ is not invertible.
Converse: assuming that $A A^H$ is not invertible, there exists a vector $v \neq 0$ such that

$$
A A^H v = 0.
$$

Now

$$
v^H A A^H v = \| A^H v \|_2^2 = 0 \implies A^H v = 0.
$$

Since $v \neq 0$, the columns of $A^H$, i.e. the rows of $A$, are linearly dependent.
Corollary 4.8
The rows of a matrix $A$ are linearly independent if and only if the Gram matrix of its row vectors $A A^H$ is invertible.
Corollary 4.9
Let $A$ be a matrix with full row rank. Then the Gram matrix of its row vectors $A A^H$ is invertible.
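The connection between column independence and Gram matrix invertibility (Lemma 4.20, Corollary 4.7) is easy to observe numerically. This NumPy sketch (NumPy is an assumption, not part of the text) uses real matrices, for which the Gram matrix is $A^T A$.

```python
import numpy as np

# Full column rank matrix: its Gram matrix A^T A is invertible (Corollary 4.7).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])  # 3x2, rank 2
G = A.T @ A               # Gram matrix of the columns (real case)
det_G = np.linalg.det(G)  # nonzero, so G is invertible

# Linearly dependent columns: the Gram matrix becomes singular (Lemma 4.20).
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])  # second column = 2 * first column
G_B = B.T @ B
rank_G_B = np.linalg.matrix_rank(G_B)  # rank-deficient, hence not invertible
```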
4.2.6. Pseudo Inverses
Definition 4.42 (Moore Penrose pseudo inverse)
Let $A$ be an $m \times n$ matrix. An $n \times m$ matrix $A^{\dagger}$ is called its Moore-Penrose pseudo inverse if it satisfies all of the following criteria:
1. $A A^{\dagger} A = A$.
2. $A^{\dagger} A A^{\dagger} = A^{\dagger}$.
3. $(A A^{\dagger})^H = A A^{\dagger}$; i.e., $A A^{\dagger}$ is Hermitian.
4. $(A^{\dagger} A)^H = A^{\dagger} A$; i.e., $A^{\dagger} A$ is Hermitian.
Theorem 4.22 (Existence and uniqueness of Moore Penrose pseudo inverse)
For any matrix $A$ there exists precisely one matrix $A^{\dagger}$ which satisfies the four criteria in Definition 4.42.
We omit the proof for this. The pseudo inverse can actually be obtained by the
singular value decomposition of $A$: if $A = U \Sigma V^H$, then $A^{\dagger} = V \Sigma^{\dagger} U^H$.
Lemma 4.23 (Moore Penrose pseudo inverse of a diagonal matrix)
Let $D = \text{diag}(d_1, d_2, \dots, d_n)$ be an $n \times n$ diagonal matrix. Then its Moore-Penrose pseudo inverse is $D^{\dagger} = \text{diag}(c_1, c_2, \dots, c_n)$ where

$$
c_i = \begin{cases} 1 / d_i & \text{if } d_i \neq 0; \\ 0 & \text{otherwise.} \end{cases}
$$

Proof. We note that $D D^{\dagger} = D^{\dagger} D = F = \text{diag}(f_1, f_2, \dots, f_n)$ where $f_i = 1$ if $d_i \neq 0$ and $f_i = 0$ otherwise.
We now verify the requirements in Definition 4.42: $D F = D$ and $D^{\dagger} F = D^{\dagger}$ give the first two criteria, and $F$ is real and diagonal, hence Hermitian, giving the last two.
Lemma 4.24 (Moore Penrose pseudo inverse of a rectangular diagonal matrix)
Let $D$ be an $m \times n$ rectangular diagonal matrix with diagonal entries $d_1, d_2, \dots, d_p$ where $p = \min(m, n)$. Then its Moore-Penrose pseudo inverse $D^{\dagger}$ is the $n \times m$ rectangular diagonal matrix whose diagonal entries are

$$
c_i = \begin{cases} 1 / d_i & \text{if } d_i \neq 0; \\ 0 & \text{otherwise.} \end{cases}
$$

Proof. We note that $D^{\dagger}$ is an $n \times m$ matrix. Thus $D D^{\dagger}$ is an $m \times m$ diagonal matrix and $D^{\dagger} D$ is an $n \times n$ diagonal matrix; in both, the $i$-th diagonal entry is $1$ if $d_i \neq 0$ and $0$ otherwise.
We now verify the requirements in Definition 4.42. As in Lemma 4.23, $(D D^{\dagger}) D = D$ and $(D^{\dagger} D) D^{\dagger} = D^{\dagger}$, and both $D D^{\dagger}$ and $D^{\dagger} D$ are real diagonal, hence Hermitian.
Lemma 4.25 (Moore Penrose pseudo inverse of full column rank matrices)
If $A$ is a full column rank matrix, then its Moore-Penrose pseudo inverse is given by

$$
A^{\dagger} = (A^H A)^{-1} A^H.
$$

It is a left inverse of $A$; i.e., $A^{\dagger} A = I$.
Proof. By Corollary 4.7, the Gram matrix $A^H A$ is invertible, so $A^{\dagger}$ is well defined. First,

$$
A^{\dagger} A = (A^H A)^{-1} A^H A = I.
$$

We now verify all the properties.
- $A A^{\dagger} A = A (A^{\dagger} A) = A I = A$.
- $A^{\dagger} A A^{\dagger} = (A^{\dagger} A) A^{\dagger} = I A^{\dagger} = A^{\dagger}$.
- Hermitian properties: $(A^{\dagger} A)^H = I^H = I = A^{\dagger} A$, and

$$
(A A^{\dagger})^H = \left( A (A^H A)^{-1} A^H \right)^H = A \left( (A^H A)^H \right)^{-1} A^H = A (A^H A)^{-1} A^H = A A^{\dagger}.
$$
Lemma 4.26 (Moore Penrose pseudo inverse of full row rank matrices)
If $A$ is a full row rank matrix, then its Moore-Penrose pseudo inverse is given by

$$
A^{\dagger} = A^H (A A^H)^{-1}.
$$

It is a right inverse of $A$; i.e., $A A^{\dagger} = I$.
Proof. By Corollary 4.9, the Gram matrix $A A^H$ is invertible, so $A^{\dagger}$ is well defined. First,

$$
A A^{\dagger} = A A^H (A A^H)^{-1} = I.
$$

We now verify all the properties.
- $A A^{\dagger} A = (A A^{\dagger}) A = I A = A$.
- $A^{\dagger} A A^{\dagger} = A^{\dagger} (A A^{\dagger}) = A^{\dagger} I = A^{\dagger}$.
- Hermitian properties: $(A A^{\dagger})^H = I^H = I = A A^{\dagger}$, and

$$
(A^{\dagger} A)^H = \left( A^H (A A^H)^{-1} A \right)^H = A^H \left( (A A^H)^H \right)^{-1} A = A^H (A A^H)^{-1} A = A^{\dagger} A.
$$
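Lemma 4.25 and the four criteria of Definition 4.42 can be confirmed numerically. This NumPy sketch (NumPy is an assumption, not part of the text) compares `np.linalg.pinv` against the closed-form left inverse for a real full column rank matrix.

```python
import numpy as np

# Full column rank matrix: pinv(A) = (A^T A)^{-1} A^T (Lemma 4.25, real case),
# which is a left inverse of A.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
A_pinv = np.linalg.pinv(A)
left_formula = np.linalg.inv(A.T @ A) @ A.T
matches_formula = np.allclose(A_pinv, left_formula)
is_left_inverse = np.allclose(A_pinv @ A, np.eye(2))

# The four Moore-Penrose conditions (Definition 4.42, real case: H -> T).
c1 = np.allclose(A @ A_pinv @ A, A)
c2 = np.allclose(A_pinv @ A @ A_pinv, A_pinv)
c3 = np.allclose((A @ A_pinv).T, A @ A_pinv)
c4 = np.allclose((A_pinv @ A).T, A_pinv @ A)
```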
4.2.7. Trace
Definition 4.43 (Trace of a square matrix)
The trace of a square matrix is defined as the sum of the entries on its main diagonal.
Let $A$ be an $n \times n$ matrix. Then

$$
\text{tr}(A) = \sum_{i=1}^n a_{ii}
$$

where $a_{ii}$ are the entries on the main diagonal of $A$.
Lemma 4.27
The traces of a square matrix and its transpose are equal:

$$
\text{tr}(A) = \text{tr}(A^T).
$$

Lemma 4.28
The trace of the sum of two square matrices is equal to the sum of their traces:

$$
\text{tr}(A + B) = \text{tr}(A) + \text{tr}(B).
$$
Lemma 4.29 (Trace product rule)
Let $A$ be an $m \times n$ matrix and $B$ be an $n \times m$ matrix. Then

$$
\text{tr}(AB) = \text{tr}(BA).
$$

Proof. Let $AB = C = [c_{ij}]$. Then

$$
c_{ij} = \sum_{k=1}^n a_{ik} b_{kj}.
$$

Thus

$$
c_{ii} = \sum_{k=1}^n a_{ik} b_{ki}.
$$

Now

$$
\text{tr}(C) = \sum_{i=1}^m c_{ii} = \sum_{i=1}^m \sum_{k=1}^n a_{ik} b_{ki}.
$$

Let $BA = D = [d_{ij}]$. Then

$$
d_{ij} = \sum_{k=1}^m b_{ik} a_{kj}.
$$

Thus

$$
d_{ii} = \sum_{k=1}^m b_{ik} a_{ki}.
$$

Hence

$$
\text{tr}(D) = \sum_{i=1}^n d_{ii} = \sum_{i=1}^n \sum_{k=1}^m b_{ik} a_{ki} = \sum_{k=1}^m \sum_{i=1}^n a_{ki} b_{ik} = \text{tr}(C).
$$

This completes the proof.
Lemma 4.30 (Trace triple product rule)
Let $A$, $B$, and $C$ be matrices such that the products $ABC$, $BCA$, and $CAB$ are well defined and square. Then

$$
\text{tr}(ABC) = \text{tr}(BCA) = \text{tr}(CAB).
$$

Proof. Let $D = BC$. Then by Lemma 4.29,

$$
\text{tr}(ABC) = \text{tr}(AD) = \text{tr}(DA) = \text{tr}(BCA).
$$

Similarly the other result can be proved.
Lemma 4.31
The traces of similar matrices are equal.
Indeed, if $B = C^{-1} A C$, then by Lemma 4.29, $\text{tr}(B) = \text{tr}(C^{-1} (A C)) = \text{tr}((A C) C^{-1}) = \text{tr}(A)$.
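The trace rules above can be illustrated numerically; note that Lemma 4.29 holds even when $AB$ and $BA$ have different sizes. A NumPy sketch (NumPy is an assumption, not part of the text):

```python
import numpy as np

# tr(AB) = tr(BA) even though AB is 2x2 and BA is 3x3 (Lemma 4.29).
A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])    # 2x3
B = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # 3x2
same = np.isclose(np.trace(A @ B), np.trace(B @ A))

# Cyclic rule for three square matrices (Lemma 4.30).
P = np.array([[1.0, 2.0], [3.0, 4.0]])
Q = np.array([[0.0, 1.0], [1.0, 0.0]])
R = np.array([[2.0, 0.0], [1.0, 1.0]])
t1 = np.trace(P @ Q @ R)
t2 = np.trace(Q @ R @ P)
t3 = np.trace(R @ P @ Q)
```

Note that only cyclic permutations are equal in general; $\text{tr}(PQR)$ need not equal $\text{tr}(QPR)$.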
4.2.8. Determinants
Following are some results on the determinant of a square matrix $A$.
Lemma 4.32 (Determinant and scalar multiplication)
Let $A$ be an $n \times n$ matrix and let $\alpha$ be a scalar. Then

$$
\det(\alpha A) = \alpha^n \det(A).
$$
Lemma 4.33
The determinants of a square matrix and its transpose are equal:

$$
\det(A) = \det(A^T).
$$
Lemma 4.34
Let $A$ be a square complex matrix. Then

$$
\det(A^H) = \overline{\det(A)}.
$$

Proof. We proceed as follows:

$$
\det(A^H) = \det\left( \overline{A}^T \right) = \det\left( \overline{A} \right) = \overline{\det(A)}.
$$
Lemma 4.35 (Determinant product rule)
Let $A$ and $B$ be two $n \times n$ matrices. Then

$$
\det(AB) = \det(A) \det(B).
$$
Lemma 4.36 (Determinant of inverse)
Let $A$ be an invertible matrix. Then

$$
\det(A^{-1}) = \frac{1}{\det(A)}.
$$
Lemma 4.37 (Determinant power rule)
For a square matrix $A$ and a positive integer $k$,

$$
\det(A^k) = \left( \det(A) \right)^k.
$$
Lemma 4.38 (Determinant of triangular matrices)
The determinant of a triangular matrix is the product of its diagonal entries;
i.e., if $A$ is an $n \times n$ upper or lower triangular matrix, then

$$
\det(A) = \prod_{i=1}^n a_{ii}.
$$
Lemma 4.39 (Determinant of diagonal matrices)
The determinant of a diagonal matrix is the product of its diagonal entries;
i.e., if $D = \text{diag}(d_1, d_2, \dots, d_n)$, then

$$
\det(D) = \prod_{i=1}^n d_i.
$$
Lemma 4.40 (Determinants of similar matrices)
The determinants of similar matrices are equal.
Proof. Let $B$ be similar to $A$. Then

$$
B = C^{-1} A C
$$

for some invertible matrix $C$.
Now

$$
\det(B) = \det(C^{-1} A C) = \det(C^{-1}) \det(A) \det(C) = \frac{1}{\det(C)} \det(A) \det(C) = \det(A).
$$

We used Lemma 4.35 and Lemma 4.36.
Lemma 4.41
Let $A$ be an $n \times n$ matrix. Then $A$ is invertible if and only if $\det(A) \neq 0$.
Lemma 4.42 (Determinant of perturbation of an identity matrix)
Let $A$ be an $n \times n$ matrix and let $\epsilon$ be a small scalar ($\epsilon \approx 0$). Then

$$
\det(I + \epsilon A) \approx 1 + \epsilon \, \text{tr}(A).
$$
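The determinant rules above can be confirmed numerically. This NumPy sketch (NumPy is an assumption, not part of the text) checks the product rule (Lemma 4.35), the inverse rule (Lemma 4.36), and the invariance of the determinant under similarity (Lemma 4.40).

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])  # det = 5
B = np.array([[1.0, 4.0], [0.0, 2.0]])  # det = 2

# det(AB) = det(A) det(B)  (Lemma 4.35)
ok_product = np.isclose(np.linalg.det(A @ B),
                        np.linalg.det(A) * np.linalg.det(B))

# det(A^{-1}) = 1 / det(A)  (Lemma 4.36)
ok_inverse = np.isclose(np.linalg.det(np.linalg.inv(A)),
                        1.0 / np.linalg.det(A))

# Similar matrices share the determinant (Lemma 4.40).
C = np.array([[1.0, 1.0], [1.0, 2.0]])  # invertible, det = 1
ok_similar = np.isclose(np.linalg.det(np.linalg.inv(C) @ A @ C),
                        np.linalg.det(A))
```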