# 4.2. Matrices II

This section deals with the vector spaces associated with a matrix, span, rank, invertible matrices, similar matrices, Gram matrices, pseudo-inverses, the trace, and the determinant.

## 4.2.1. Spaces Associated with a Matrix

Definition 4.32 (Column space)

The column space of a matrix is defined as the vector space spanned by the columns of the matrix. Let $$\bA$$ be an $$m \times n$$ matrix with

$\bA = \begin{bmatrix} \ba_1 & \ba_2 & \dots & \ba_n \end{bmatrix}$

Then the column space is given by

$\ColSpace(\bA) = \{\bx \in \FF^m \ST \bx = \sum_{i=1}^n \alpha_i \ba_i \; \text{for some } \alpha_i \in \FF \}.$

Definition 4.33 (Row space)

The row space of a matrix is defined as the vector space spanned by the rows of the matrix.

Let $$\bA$$ be an $$m \times n$$ matrix with

$\begin{split} \bA = \begin{bmatrix} \ba_1^T \\ \ba_2^T \\ \vdots \\ \ba_m^T \end{bmatrix} \end{split}$

Then the row space is given by

$\RowSpace(\bA) = \{\bx \in \FF^n \ST \bx = \sum_{i=1}^m \alpha_i \ba_i \; \text{for some } \alpha_i \in \FF \}.$
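Membership in the column space can be checked numerically: $$\bx \in \ColSpace(\bA)$$ exactly when the least-squares residual of $$\bA \ba \approx \bx$$ vanishes. A small sketch with NumPy (the matrix and vectors here are illustrative, and `in_col_space` is a helper introduced only for this example):

```python
import numpy as np

# A 3x2 matrix; its column space is a 2-dimensional subspace of R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# b1 is a combination of the columns (2 a1 + 3 a2); b2 is not in the span.
b1 = 2 * A[:, 0] + 3 * A[:, 1]
b2 = np.array([1.0, 0.0, 0.0])

def in_col_space(A, b, tol=1e-10):
    # b lies in ColSpace(A) iff the least-squares residual is zero.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.linalg.norm(A @ x - b) < tol

print(in_col_space(A, b1))  # True
print(in_col_space(A, b2))  # False
```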

## 4.2.2. Rank

Definition 4.34 (Column rank)

The column rank of a matrix is defined as the maximum number of columns which are linearly independent. In other words column rank is the dimension of the column space of a matrix.

Definition 4.35 (Row rank)

The row rank of a matrix is defined as the maximum number of rows which are linearly independent. In other words row rank is the dimension of the row space of a matrix.

Theorem 4.21 (Equality of row and column rank)

The column rank and row rank of a matrix are equal.

Definition 4.36 (Rank)

The rank of a matrix is defined to be equal to its column rank which is equal to its row rank.

Lemma 4.1 (Rank bounds)

For an $$m \times n$$ matrix $$\bA$$

$0 \leq \Rank(\bA) \leq \min(m, n).$

Lemma 4.2 (Zero rank matrix)

The rank of a matrix is 0 if and only if it is a zero matrix.

Definition 4.37 (Full rank matrix)

An $$m \times n$$ matrix $$\bA$$ is called full rank if

$\Rank (\bA) = \min(m, n).$

In other words it is either a full column rank matrix or a full row rank matrix or both.

Lemma 4.3 (Rank of product of two matrices)

Let $$\bA$$ be an $$m \times n$$ matrix and $$\bB$$ be an $$n \times p$$ matrix then

$\Rank(\bA \bB) \leq \min (\Rank(\bA), \Rank(\bB)).$

Lemma 4.4 (Full rank post multiplication)

Let $$\bA$$ be an $$m \times n$$ matrix and $$\bB$$ be an $$n \times p$$ matrix. If $$\bB$$ is of rank $$n$$ then

$\Rank(\bA \bB) = \Rank(\bA).$

Lemma 4.5 (Full rank pre multiplication)

Let $$\bA$$ be an $$m \times n$$ matrix and $$\bB$$ be an $$n \times p$$ matrix. If $$\bA$$ is of rank $$n$$ then

$\Rank(\bA \bB) = \Rank(\bB).$

Lemma 4.6 (Rank of a diagonal matrix)

The rank of a diagonal matrix is equal to the number of nonzero elements on its main diagonal.

Proof. The columns corresponding to zero diagonal entries are zero vectors, while the columns corresponding to nonzero diagonal entries are linearly independent. The number of linearly independent rows is the same. Hence the count of nonzero diagonal entries gives the rank of the matrix.
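These rank facts are easy to check numerically. A short sketch using NumPy's `matrix_rank` (the random matrices are illustrative; Gaussian matrices attain full rank with probability one):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 3))

rA = np.linalg.matrix_rank(A)      # full row rank: 4
rB = np.linalg.matrix_rank(B)      # full column rank: 3
rAB = np.linalg.matrix_rank(A @ B)

assert 0 <= rA <= min(4, 6)        # rank bounds (Lemma 4.1)
assert rAB <= min(rA, rB)          # product rule (Lemma 4.3)

# Rank of a diagonal matrix = number of nonzero diagonal entries (Lemma 4.6).
D = np.diag([3.0, 0.0, -1.0, 0.0])
assert np.linalg.matrix_rank(D) == 2
```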

## 4.2.3. Invertible Matrices

We say that an $$m \times n$$ matrix $$\bA$$ is left-invertible if there exists an $$n \times m$$ matrix $$\bB$$ such that $$\bB \bA = \bI$$. We say that an $$m \times n$$ matrix $$\bA$$ is right-invertible if there exists an $$n \times m$$ matrix $$\bB$$ such that $$\bA \bB = \bI$$.

We say that a square matrix $$\bA$$ is invertible when there exists another square matrix $$\bB$$ of the same size such that $$\bA \bB = \bB \bA = \bI$$. A square matrix is invertible if and only if it is both left and right invertible. The inverse of a square invertible matrix is denoted by $$\bA^{-1}$$.

A special left or right inverse is the pseudo-inverse, which is denoted by $$\bA^{\dag}$$.

The column space of a matrix is denoted by $$\ColSpace(\bA)$$, the null space by $$\NullSpace(\bA)$$, and the row space by $$\RowSpace(\bA)$$.

We say that a matrix is symmetric when $$\bA = \bA^T$$, and conjugate symmetric or Hermitian when $$\bA^H = \bA$$.

When a square matrix is not invertible, we say that it is singular. A non-singular matrix is invertible.

Definition 4.38 (Invertible matrix)

A square matrix $$\bA$$ is called invertible if there exists another square matrix $$\bB$$ of same size such that

$\bA \bB = \bB \bA = \bI.$

The matrix $$\bB$$ is called the inverse of $$\bA$$ and is denoted as $$\bA^{-1}$$.

Lemma 4.7 (Invertibility of the inverse)

If $$\bA$$ is invertible then its inverse $$\bA^{-1}$$ is also invertible and the inverse of $$\bA^{-1}$$ is nothing but $$\bA$$.

Lemma 4.8 (Invertibility of identity matrix)

Identity matrix $$\bI$$ is invertible.

Proof. We can see that

$\bI \bI = \bI \implies \bI^{-1} = \bI.$

Lemma 4.9 (Linear independence of columns of invertible matrices)

If $$\bA$$ is invertible then columns of $$\bA$$ are linearly independent.

Proof. Assume $$\bA$$ is invertible, then there exists a matrix $$\bB$$ such that

$\bA \bB = \bB \bA = \bI.$

Assume that columns of $$\bA$$ are linearly dependent. Then there exists $$\bu \neq \bzero$$ such that

$\bA \bu = \bzero \implies \bB \bA \bu = \bzero \implies \bI \bu = \bzero \implies \bu = \bzero$

a contradiction. Hence columns of $$\bA$$ are linearly independent.

Lemma 4.10 (Span of columns of invertible matrix)

If an $$n\times n$$ matrix $$\bA$$ is invertible then columns of $$\bA$$ span $$\FF^n$$.

Proof. Assume $$\bA$$ is invertible, then there exists a matrix $$\bB$$ such that

$\bA \bB = \bB \bA = \bI.$
1. Now let $$\bx \in \FF^n$$ be any arbitrary vector.

2. We need to show that there exists $$\ba \in \FF^n$$ such that

$\bx = \bA \ba.$
3. But

$\bx = \bI \bx = \bA \bB \bx = \bA ( \bB \bx).$
4. Thus if we choose $$\ba = \bB \bx$$, then

$\bx = \bA \ba.$
5. Thus columns of $$\bA$$ span $$\FF^n$$.

Lemma 4.11 (Columns of invertible matrix as basis)

If $$\bA$$ is invertible, then columns of $$\bA$$ form a basis for $$\FF^n$$.

Proof. In $$\FF^n$$ a basis is a set of vectors which is linearly independent and spans $$\FF^n$$. By Lemma 4.9 and Lemma 4.10, columns of an invertible matrix $$\bA$$ satisfy both conditions. Hence they form a basis.

Lemma 4.12 (Invertibility of transpose)

If $$\bA$$ is invertible then $$\bA^T$$ is invertible.

Proof. Assume $$\bA$$ is invertible, then there exists a matrix $$\bB$$ such that

$\bA \bB = \bB \bA = \bI.$

Applying transpose on both sides we get

$\bB^T \bA^T = \bA^T \bB^T = \bI.$

Thus $$\bB^T$$ is inverse of $$\bA^T$$ and $$\bA^T$$ is invertible.

Lemma 4.13 (Invertibility of Hermitian transpose)

If $$\bA$$ is invertible then $$\bA^H$$ is invertible.

Proof. Assume $$\bA$$ is invertible, then there exists a matrix $$\bB$$ such that

$\bA \bB = \bB \bA = \bI.$

Applying conjugate transpose on both sides we get

$\bB^H \bA^H = \bA^H \bB^H = \bI.$

Thus $$\bB^H$$ is inverse of $$\bA^H$$ and $$\bA^H$$ is invertible.

Lemma 4.14 (Invertibility of matrix product)

If $$\bA$$ and $$\bB$$ are invertible then $$\bA \bB$$ is invertible.

Proof. We note that

$( \bA \bB) (\bB^{-1}\bA^{-1}) = \bA ( \bB \bB^{-1}) \bA^{-1} = \bA \bI \bA^{-1} = \bI.$

Similarly

$(\bB^{-1}\bA^{-1}) (\bA\bB) = \bB^{-1} (\bA^{-1} \bA ) \bB = \bB^{-1} \bI \bB = \bI.$

Thus $$\bB^{-1}\bA^{-1}$$ is the inverse of $$\bA \bB$$.
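Lemmas 4.12 and 4.14 are quick to verify numerically. A sketch with NumPy (the random matrices are illustrative; a Gaussian square matrix is invertible with probability one):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# (AB)^{-1} = B^{-1} A^{-1}  (Lemma 4.14)
assert np.allclose(np.linalg.inv(A @ B),
                   np.linalg.inv(B) @ np.linalg.inv(A))

# (A^T)^{-1} = (A^{-1})^T  (Lemma 4.12)
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
```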

Lemma 4.15 (Group of invertible matrices)

The set of $$n \times n$$ invertible matrices under the matrix multiplication operation form a group.

Proof. We verify the properties of a group

• [Closure] If $$\bA$$ and $$\bB$$ are invertible then $$\bA \bB$$ is invertible. Hence the set is closed.

• [Associativity] Matrix multiplication is associative.

• [Identity element] $$\bI$$ is invertible and $$\bA \bI = \bI \bA = \bA$$ for all invertible matrices.

• [Inverse element] If $$\bA$$ is invertible then $$\bA^{-1}$$ is also invertible.

Thus the set of invertible matrices is indeed a group under matrix multiplication.

Lemma 4.16 (Rank of an invertible matrix)

An $$n \times n$$ matrix $$\bA$$ is invertible if and only if it is full rank; i.e.,

$\Rank(\bA) = n.$

Corollary 4.5 (Equality of rank of a matrix and its inverse)

The rank of an invertible matrix and its inverse are the same.

## 4.2.4. Similar Matrices

Definition 4.39 (Similar matrix)

An $$n \times n$$ matrix $$\bB$$ is similar to an $$n \times n$$ matrix $$\bA$$ if there exists an $$n \times n$$ non-singular matrix $$\bC$$ such that

$\bB = \bC^{-1} \bA \bC.$

Lemma 4.17 (Symmetry of similarity)

If $$\bB$$ is similar to $$\bA$$ then $$\bA$$ is similar to $$\bB$$. Thus similarity is a symmetric relation.

Proof. We have

$\bB = \bC^{-1} \bA \bC \implies \bA = \bC \bB \bC^{-1} \implies \bA = (\bC^{-1})^{-1} \bB \bC^{-1}$

Thus there exists a matrix $$\bD = \bC^{-1}$$ such that

$\bA = \bD^{-1} \bB \bD.$

Thus $$\bA$$ is similar to $$\bB$$.

Lemma 4.18 (Rank of similar matrices)

Similar matrices have the same rank.

Proof. Let $$\bB$$ be similar to $$\bA$$. Then there exists an invertible matrix $$\bC$$ such that

$\bB = \bC^{-1} \bA \bC.$

Since $$\bC$$ is invertible, we have $$\Rank (\bC) = \Rank(\bC^{-1}) = n$$. Now using Lemma 4.4,

$\Rank (\bA \bC) = \Rank (\bA)$

and using Lemma 4.5 we have

$\Rank(\bC^{-1} (\bA \bC) ) = \Rank (\bA \bC) = \Rank(\bA).$

Thus

$\Rank(\bB) = \Rank(\bA).$
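Lemma 4.18 can be illustrated numerically (the matrices below are illustrative; a Gaussian matrix is invertible with probability one):

```python
import numpy as np

rng = np.random.default_rng(2)
A = np.diag([1.0, 2.0, 0.0])       # rank 2
C = rng.standard_normal((3, 3))    # invertible with probability one
B = np.linalg.inv(C) @ A @ C       # B is similar to A

# Similar matrices have the same rank.
assert np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(B) == 2
```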

Lemma 4.19 (Similarity as equivalence relation)

Similarity is an equivalence relation on the set of $$n \times n$$ matrices.

Proof. Let $$\bA, \bB, \bC$$ be $$n \times n$$ matrices.

1. $$\bA$$ is similar to itself via the identity matrix: $$\bA = \bI^{-1} \bA \bI$$.

2. If $$\bA$$ is similar to $$\bB$$ then $$\bB$$ is similar to $$\bA$$ by Lemma 4.17.

3. If $$\bB$$ is similar to $$\bA$$ via $$\bP$$ s.t. $$\bB = \bP^{-1} \bA \bP$$ and $$\bC$$ is similar to $$\bB$$ via $$\bQ$$ s.t. $$\bC = \bQ^{-1} \bB \bQ$$ then $$\bC$$ is similar to $$\bA$$ via $$\bP \bQ$$ such that $$\bC = (\bP \bQ)^{-1} \bA (\bP \bQ)$$.

4. Thus similarity is an equivalence relation on the set of square matrices and if $$\bA$$ is any $$n \times n$$ matrix then the set of $$n \times n$$ matrices similar to $$\bA$$ forms an equivalence class.

## 4.2.5. Gram Matrices

Definition 4.40 (Gram matrix)

The Gram matrix of columns of $$\bA$$ is given by

$\bG = \bA^H \bA.$

Definition 4.41 (Frame operator)

The frame operator is the Gram matrix of the rows of $$\bA$$; it is given by

$\bF = \bA \bA^H.$

Usually, when we talk about the Gram matrix of a matrix, we mean the Gram matrix of its column vectors.

Remark 4.3 (Gram matrix and frame operators for real matrices)

For a real matrix $$\bA \in \RR^{m \times n}$$, the Gram matrix of its column vectors is given by $$\bA^T \bA$$ and the frame operator for its row vectors is given by $$\bA \bA^T$$.

The following results apply equally well to the real case.

Lemma 4.20 (Linear dependence of columns and Gram matrix)

The columns of a matrix are linearly dependent if and only if the Gram matrix of its column vectors $$\bA^H \bA$$ is not invertible.

Proof. Let $$\bA$$ be an $$m\times n$$ matrix and $$\bG = \bA^H \bA$$ be the Gram matrix of its columns.

1. If columns of $$\bA$$ are linearly dependent, then there exists a vector $$\bu \neq \bzero$$ such that

$\bA \bu = \bzero.$
2. Thus

$\bG \bu = \bA^H \bA \bu = \bzero.$
3. Hence the columns of $$\bG$$ are also linearly dependent.

4. Hence $$\bG$$ is not invertible.

Conversely let us assume that $$\bG$$ is not invertible.

1. Thus columns of $$\bG$$ are dependent.

2. There exists a vector $$\bv \neq \bzero$$ such that

$\bG \bv = \bzero.$
3. Now

$\bv^H \bG \bv = \bv^H \bA^H \bA \bv = (\bA \bv)^H (\bA \bv) = \| \bA \bv \|_2^2.$
4. From previous equation, we have

$\| \bA \bv \|_2^2 = 0 \implies \bA \bv = \bzero.$
5. Since $$\bv \neq \bzero$$, the columns of $$\bA$$ are linearly dependent.

Corollary 4.6 (Linear independence of columns and Gram matrix)

The columns of a matrix are linearly independent if and only if the Gram matrix of its column vectors $$\bA^H \bA$$ is invertible.

Proof. Columns of $$\bA$$ can be dependent only if its Gram matrix is not invertible. Thus if the Gram matrix is invertible, then the columns of $$\bA$$ are linearly independent.

The Gram matrix is not invertible only if columns of $$\bA$$ are linearly dependent. Thus if columns of $$\bA$$ are linearly independent then the Gram matrix is invertible.
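Corollary 4.6 can be checked numerically in the real case, where the Gram matrix is $$\bA^T \bA$$ (the matrices below are illustrative):

```python
import numpy as np

# Independent columns: the Gram matrix is invertible (full rank).
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])
G = A.T @ A                          # real case: Gram matrix A^T A
assert np.linalg.matrix_rank(G) == 2

# Dependent columns (second column = 2 x first): Gram matrix is singular.
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
assert np.linalg.matrix_rank(B.T @ B) == 1
```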

Corollary 4.7

Let $$\bA$$ be a full column rank matrix. Then $$\bA^H \bA$$ is invertible.

Lemma 4.21

The null space of $$\bA$$ and its Gram matrix $$\bA^H \bA$$ coincide; i.e.,

$\NullSpace(\bA) = \NullSpace(\bA^H \bA).$

Proof. Let $$\bu \in \NullSpace(\bA)$$. Then

$\bA \bu = \bzero \implies \bA^H \bA \bu = \bzero.$

Thus

$\bu \in \NullSpace(\bA^H \bA ) \implies \NullSpace(\bA) \subseteq \NullSpace(\bA^H \bA).$

Now let $$\bu \in \NullSpace(\bA^H \bA)$$. Then

$\bA^H \bA \bu = \bzero \implies \bu^H \bA^H \bA \bu = \bzero \implies \| \bA \bu \|_2^2 = 0 \implies \bA \bu = \bzero.$

Thus we have

$\NullSpace(\bA^H \bA) \subseteq \NullSpace(\bA).$
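Lemma 4.21 can be illustrated with an SVD-based null space computation (`null_space_basis` is a helper written only for this example; the matrix is illustrative):

```python
import numpy as np

def null_space_basis(M, tol=1e-10):
    # Orthonormal basis of NullSpace(M) from the SVD: the right singular
    # vectors whose singular values are numerically zero.
    _, s, Vh = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return Vh[rank:].conj().T

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank 1, so NullSpace(A) is 2D
N1 = null_space_basis(A)
N2 = null_space_basis(A.conj().T @ A)  # null space of the Gram matrix

assert N1.shape[1] == N2.shape[1] == 2
# Every null vector of A^H A is a null vector of A, and vice versa.
assert np.allclose(A @ N2, 0)
assert np.allclose(A.conj().T @ A @ N1, 0)
```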

Lemma 4.22

The rows of a matrix $$\bA$$ are linearly dependent if and only if the Gram matrix of its row vectors $$\bA \bA^H$$ is not invertible.

Proof. The rows of $$\bA$$ are linearly dependent if and only if the columns of $$\bA^H$$ are linearly dependent.

Assume the rows of $$\bA$$ are linearly dependent. Then there exists a vector $$\bv \neq \bzero$$ s.t.

$\bA^H \bv = \bzero.$

Thus

$\bG \bv = \bA \bA^H \bv = \bzero.$

Since $$\bG \bv = \bzero$$ for $$\bv \neq \bzero$$, the matrix $$\bG$$ is not invertible.

Conversely, assume that $$\bG$$ is not invertible. Then there exists a vector $$\bu \neq \bzero$$ s.t.

$\bG \bu = \bzero.$

Now

$\bu^H \bG \bu = \bu^H \bA \bA^H \bu = (\bA^H \bu)^H (\bA^H \bu) = \| \bA^H \bu \|_2^2 = 0 \implies \bA^H \bu = \bzero.$

Since $$\bu \neq \bzero$$, the columns of $$\bA^H$$, and consequently the rows of $$\bA$$, are linearly dependent.

Corollary 4.8

The rows of a matrix $$\bA$$ are linearly independent if and only if the Gram matrix of its row vectors $$\bA \bA^H$$ is invertible.

Corollary 4.9

Let $$\bA$$ be a full row rank matrix. Then $$\bA \bA^H$$ is invertible.

## 4.2.6. Pseudo Inverses

Definition 4.42 (Moore Penrose pseudo inverse)

Let $$\bA$$ be an $$m \times n$$ matrix. An $$n\times m$$ matrix $$\bA^{\dag}$$ is called its Moore-Penrose pseudo-inverse if it satisfies all of the following criteria:

• $$\bA \bA^{\dag} \bA = \bA$$.

• $$\bA^{\dag} \bA \bA^{\dag} = \bA^{\dag}$$.

• $$\left(\bA \bA^{\dag} \right)^H = \bA \bA^{\dag}$$ i.e. $$\bA \bA^{\dag}$$ is Hermitian.

• $$(\bA^{\dag} \bA)^H = \bA^{\dag} \bA$$ i.e. $$\bA^{\dag} \bA$$ is Hermitian.

Theorem 4.22 (Existence and uniqueness of Moore Penrose pseudo inverse)

For any matrix $$\bA$$ there exists precisely one matrix $$\bA^{\dag}$$ which satisfies all the requirements in Definition 4.42.

We omit the proof for this. The pseudo-inverse can actually be obtained by the singular value decomposition of $$\bA$$. This is shown in Lemma 4.79.
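NumPy's `np.linalg.pinv` computes the Moore-Penrose pseudo-inverse, so the four defining properties of Definition 4.42 can be verified directly (the complex test matrix is illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
P = np.linalg.pinv(A)            # Moore-Penrose pseudo-inverse

H = lambda M: M.conj().T         # Hermitian transpose
assert np.allclose(A @ P @ A, A)        # A A† A = A
assert np.allclose(P @ A @ P, P)        # A† A A† = A†
assert np.allclose(H(A @ P), A @ P)     # A A† is Hermitian
assert np.allclose(H(P @ A), P @ A)     # A† A is Hermitian
```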

Lemma 4.23 (Moore Penrose pseudo inverse of a diagonal matrix)

Let $$\bD = \Diag(d_1, d_2, \dots, d_n)$$ be an $$n \times n$$ diagonal matrix. Then its Moore-Penrose pseudo-inverse is $$\bD^{\dag} = \Diag(c_1, c_2, \dots, c_n)$$ where

$\begin{split} c_i = \left\{ \begin{array}{ll} \frac{1}{d_i} & \text{if } d_i \neq 0;\\ 0 & \text{if } d_i = 0. \end{array} \right. \end{split}$

Proof. We note that $$\bD^{\dag} \bD = \bD \bD^{\dag} = \bF = \Diag(f_1, f_2, \dots, f_n)$$ where

$\begin{split} f_i = \left\{ \begin{array}{ll} 1 & \text{if } d_i \neq 0;\\ 0 & \text{if } d_i = 0. \end{array} \right. \end{split}$

We now verify the requirements in Definition 4.42.

$\bD \bD^{\dag} \bD = \bF \bD = \bD.$
$\bD^{\dag} \bD \bD^{\dag} = \bF \bD^{\dag} = \bD^{\dag}.$

$$\bD^{\dag} \bD = \bD \bD^{\dag} = \bF$$ is a diagonal hence Hermitian matrix.
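The construction of Lemma 4.23 (reciprocate the nonzero diagonal entries, keep the zeros) matches NumPy's pseudo-inverse; a quick check with an illustrative diagonal matrix:

```python
import numpy as np

d = np.array([2.0, 0.0, -5.0])
D = np.diag(d)
# Reciprocate the nonzero diagonal entries; keep the zeros as zeros.
c = np.divide(1.0, d, out=np.zeros_like(d), where=(d != 0))
D_dag = np.diag(c)

assert np.allclose(D_dag, np.linalg.pinv(D))
```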

Lemma 4.24 (Moore Penrose pseudo inverse of a rectangular diagonal matrix)

Let $$\bD = \Diag(d_1, d_2, \dots, d_p)$$ be an $$m \times n$$ rectangular diagonal matrix where $$p = \min(m, n)$$. Then its Moore-Penrose pseudo-inverse is an $$n \times m$$ rectangular diagonal matrix $$\bD^{\dag} = \Diag(c_1, c_2, \dots, c_p)$$ where

$\begin{split} c_i = \left\{ \begin{array}{ll} \frac{1}{d_i} & \text{if } d_i \neq 0;\\ 0 & \text{if } d_i = 0. \end{array} \right. \end{split}$

Proof. We note that

$\bF = \bD^{\dag} \bD = \Diag(f_1, f_2, \dots, f_n)$

is an $$n \times n$$ matrix where

$\begin{split} f_i = \left\{ \begin{array}{ll} 1 & \text{if } i \leq p \text{ and } d_i \neq 0;\\ 0 & \text{otherwise}. \end{array} \right. \end{split}$

$$\bG = \bD \bD^{\dag} = \Diag(g_1, g_2, \dots, g_m)$$ is an $$m \times m$$ matrix where

$\begin{split} g_i = \left\{ \begin{array}{ll} 1 & \text{if } i \leq p \text{ and } d_i \neq 0;\\ 0 & \text{otherwise}. \end{array} \right. \end{split}$

We now verify the requirements in Definition 4.42.

$\bD \bD^{\dag} \bD = \bD \bF = \bD.$
$\bD^{\dag} \bD \bD^{\dag} = \bD^{\dag} \bG = \bD^{\dag}.$

$$\bF = \bD^{\dag} \bD$$ and $$\bG = \bD \bD^{\dag}$$ are both real diagonal hence Hermitian matrices.

Lemma 4.25 (Moore Penrose pseudo inverse of full column rank matrices)

If $$\bA$$ is full column rank then its Moore-Penrose pseudo-inverse is given by

$\bA^{\dag} = (\bA^H \bA)^{-1} \bA^H.$

It is a left inverse of $$\bA$$.

Proof. By Corollary 4.7 $$\bA^H \bA$$ is invertible. First of all we verify that it is a left inverse.

$\bA^{\dag} \bA = (\bA^H \bA)^{-1} \bA^H \bA = \bI.$

We now verify all the properties.

$\bA \bA^{\dag} \bA = \bA \bI = \bA.$
$\bA^{\dag} \bA \bA^{\dag} = \bI \bA^{\dag} = \bA^{\dag}.$

Hermitian properties:

$\left(\bA \bA^{\dag} \right)^H = \left(\bA (\bA^H \bA)^{-1} \bA^H \right)^H = \left(\bA (\bA^H \bA)^{-1} \bA^H \right) = \bA \bA^{\dag}.$
$(\bA^{\dag} \bA)^H = \bI^H = \bI = \bA^{\dag} \bA.$

Lemma 4.26 (Moore Penrose pseudo inverse of full row rank matrices)

If $$\bA$$ is full row rank then its Moore-Penrose pseudo-inverse is given by

$\bA^{\dag} = \bA^H (\bA \bA^H)^{-1} .$

It is a right inverse of $$\bA$$.

Proof. By Corollary 4.9 $$\bA \bA^H$$ is invertible. First of all we verify that it is a right inverse.

$\bA \bA^{\dag} = \bA \bA^H (\bA \bA^H)^{-1}= \bI.$

We now verify all the properties.

$\bA \bA^{\dag} \bA = \bI \bA = \bA.$
$\bA^{\dag} \bA \bA^{\dag} = \bA^{\dag} \bI = \bA^{\dag}.$

Hermitian properties:

$\left(\bA \bA^{\dag} \right)^H = \bI^H = \bI = \bA \bA^{\dag}.$
$(\bA^{\dag} \bA)^H = \left (\bA^H (\bA \bA^H)^{-1} \bA \right )^H = \bA^H (\bA \bA^H)^{-1} \bA = \bA^{\dag} \bA.$
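For real matrices, the closed forms of Lemmas 4.25 and 4.26 can be checked against `np.linalg.pinv` (the random matrices are illustrative; Gaussian matrices have full rank with probability one):

```python
import numpy as np

rng = np.random.default_rng(4)

# Tall matrix with full column rank: pseudo-inverse is a left inverse.
A = rng.standard_normal((5, 3))
left = np.linalg.inv(A.T @ A) @ A.T          # (A^T A)^{-1} A^T
assert np.allclose(left, np.linalg.pinv(A))
assert np.allclose(left @ A, np.eye(3))

# Wide matrix with full row rank: pseudo-inverse is a right inverse.
B = rng.standard_normal((3, 5))
right = B.T @ np.linalg.inv(B @ B.T)         # B^T (B B^T)^{-1}
assert np.allclose(right, np.linalg.pinv(B))
assert np.allclose(B @ right, np.eye(3))
```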

## 4.2.7. Trace

Definition 4.43 (Trace of a square matrix)

The trace of a square matrix is defined as the sum of the entries on its main diagonal. Let $$\bA$$ be an $$n\times n$$ matrix, then

$\Trace (\bA) = \sum_{i=1}^n a_{ii}$

where $$\Trace(\bA)$$ denotes the trace of $$\bA$$.

Lemma 4.27

The trace of a square matrix and its transpose are equal.

$\Trace(\bA) = \Trace(\bA^T).$

Lemma 4.28

The trace of the sum of two square matrices is equal to the sum of their traces.

$\Trace(\bA + \bB) = \Trace(\bA) + \Trace(\bB).$

Lemma 4.29 (Trace product rule)

Let $$\bA$$ be an $$m \times n$$ matrix and $$\bB$$ be an $$n \times m$$ matrix. Then

$\Trace(\bA \bB) = \Trace(\bB \bA).$

Proof. Let $$\bA \bB = \bC = [c_{ij}]$$. Then

$c_{ij} = \sum_{k=1}^n a_{i k} b_{k j}.$

Thus

$c_{ii} = \sum_{k=1}^n a_{i k} b_{k i}.$

Now

$\Trace(\bC) = \sum_{i=1}^m c_{ii} = \sum_{i=1}^m \sum_{k=1}^n a_{i k} b_{k i} = \sum_{k=1}^n \sum_{i=1}^m a_{i k} b_{k i} = \sum_{k=1}^n \sum_{i=1}^m b_{k i} a_{i k}.$

Let $$\bB \bA = \bD = [d_{ij}]$$. Then

$d_{ij} = \sum_{k=1}^m b_{i k} a_{k j}.$

Thus

$d_{ii} = \sum_{k=1}^m b_{i k} a_{k i}.$

Hence

$\Trace(\bD) = \sum_{i=1}^n d_{ii} = \sum_{i=1}^n \sum_{k=1}^m b_{i k} a_{k i} = \sum_{i=1}^m \sum_{k=1}^n b_{k i} a_{i k}.$

This completes the proof.

Lemma 4.30 (Trace triple product rule)

Let $$\bA \in \FF^{m \times n}$$, $$\bB \in \FF^{n \times p}$$, $$\bC \in \FF^{p \times m}$$ be three matrices. Then

$\Trace(\bA \bB \bC) = \Trace(\bB \bC \bA) = \Trace(\bC \bA \bB).$

Proof. Let $$\bA \bB = \bD$$. Then

$\Trace(\bA \bB \bC) = \Trace( \bD \bC) = \Trace( \bC \bD) = \Trace(\bC \bA \bB).$

The other equality follows similarly.
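The product rule and its cyclic extension are easy to confirm numerically (the random rectangular matrices are illustrative; note the factors need not be square, only their product):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 2))
# Trace product rule: Trace(AB) = Trace(BA) even for non-square factors.
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# Cyclic invariance for a triple product.
P = rng.standard_normal((2, 3))
Q = rng.standard_normal((3, 4))
R = rng.standard_normal((4, 2))
t = np.trace(P @ Q @ R)
assert np.isclose(t, np.trace(Q @ R @ P))
assert np.isclose(t, np.trace(R @ P @ Q))
```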

Lemma 4.31

The traces of similar matrices are equal.

Proof. Let $$\bB$$ be similar to $$\bA$$. Thus

$\bB = \bC^{-1} \bA \bC$

for some invertible matrix $$\bC$$. Then

$\Trace(\bB) = \Trace(\bC^{-1} \bA \bC ) = \Trace (\bC \bC^{-1} \bA) = \Trace(\bA).$

We used Lemma 4.29.

## 4.2.8. Determinants

Following are some results on the determinant of a square matrix $$\bA$$.

Lemma 4.32 (Determinant and scalar multiplication)

For an $$n \times n$$ matrix $$\bA$$ and a scalar $$\alpha$$,

$\det(\alpha \bA) = \alpha^n \det(\bA).$

Lemma 4.33

Determinant of a square matrix and its transpose are equal.

$\det(\bA) = \det(\bA^T).$

Lemma 4.34

Let $$\bA$$ be a complex square matrix. Then

$\det(\bA^H) = \overline{\det(\bA)}.$

Proof. We proceed as follows:

$\det(\bA^H) = \det(\overline{\bA}^T) = \det(\overline{\bA}) = \overline{\det(\bA)}.$

Lemma 4.35 (Determinant product rule)

Let $$\bA$$ and $$\bB$$ be two $$n\times n$$ matrices. Then

$\det (\bA \bB) = \det(\bA) \det(\bB).$

Lemma 4.36 (Determinant of inverse)

Let $$\bA$$ be an invertible matrix. Then

$\det(\bA^{-1}) = \frac{1}{\det(\bA)}.$

Lemma 4.37 (Determinant power rule)

For a nonnegative integer $$p$$,

$\det(\bA^{p}) = \left(\det(\bA) \right)^p.$

Lemma 4.38 (Determinant of triangular matrices)

The determinant of a triangular matrix is the product of its diagonal entries; i.e., if $$\bA$$ is an upper or lower triangular matrix then

$\det(\bA) = \prod_{i=1}^n a_{i i}.$

Lemma 4.39 (Determinant of diagonal matrices)

The determinant of a diagonal matrix is the product of its diagonal entries; i.e., if $$\bA$$ is a diagonal matrix then

$\det(\bA) = \prod_{i=1}^n a_{i i}.$

Lemma 4.40 (Determinants of similar matrices)

The determinants of similar matrices are equal.

Proof. Let $$\bB$$ be similar to $$\bA$$. Thus

$\bB = \bC^{-1} \bA \bC$

for some invertible matrix $$\bC$$. Hence

$\det(\bB) = \det(\bC^{-1} \bA \bC ) = \det (\bC^{-1}) \det (\bA) \det(\bC).$

Now

$\det (\bC^{-1}) \det (\bA) \det(\bC) = \frac{1}{\det(\bC)} \det (\bA) \det(\bC) = \det(\bA).$

We used Lemma 4.35 and Lemma 4.36.
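The determinant rules used in this proof, together with Lemma 4.40 itself (and the analogous trace result, Lemma 4.31), can be confirmed numerically (the random matrices are illustrative; a Gaussian matrix is invertible with probability one):

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))   # invertible with probability one
B = np.linalg.inv(C) @ A @ C      # B is similar to A

# Product rule (Lemma 4.35) and inverse rule (Lemma 4.36).
assert np.isclose(np.linalg.det(A @ C), np.linalg.det(A) * np.linalg.det(C))
assert np.isclose(np.linalg.det(np.linalg.inv(C)), 1 / np.linalg.det(C))

# Similar matrices share determinant (Lemma 4.40) and trace (Lemma 4.31).
assert np.isclose(np.linalg.det(B), np.linalg.det(A))
assert np.isclose(np.trace(B), np.trace(A))
```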

Lemma 4.41

Let $$\bu$$ and $$\bv$$ be vectors in $$\FF^n$$. Then

$\det(\bI + \bu \bv^T) = 1 + \bu^T \bv.$
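This rank-one update formula (the matrix determinant lemma) is easy to verify numerically (the random vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
u = rng.standard_normal(4)
v = rng.standard_normal(4)

# det(I + u v^T) = 1 + u^T v for vectors u, v.
lhs = np.linalg.det(np.eye(4) + np.outer(u, v))
rhs = 1 + u @ v
assert np.isclose(lhs, rhs)
```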

Lemma 4.42 (Determinant of perturbation of an identity matrix)

Let $$\bA$$ be a square matrix and let $$\epsilon \approx 0$$. Then

$\det(\bI + \epsilon \bA ) \approx 1 + \epsilon \Trace(\bA).$
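The first-order approximation of Lemma 4.42 can be checked numerically: for small $$\epsilon$$ the error is of order $$\epsilon^2$$ (the random matrix is illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((3, 3))
eps = 1e-6

exact = np.linalg.det(np.eye(3) + eps * A)
approx = 1 + eps * np.trace(A)
# The discrepancy is O(eps^2), far below the first-order terms.
assert abs(exact - approx) < 1e-10
```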