4.5. Inner Product Spaces#
Inner products are a generalization of the notion of dot product.
We restrict our attention to real vector spaces and complex vector spaces.
Thus, the field of scalars $\mathbb{F}$ is either $\mathbb{R}$ or $\mathbb{C}$.
4.5.1. Inner Product#
(Inner product)
An inner product over an $\mathbb{F}$-vector space $V$ is a map $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{F}$ satisfying the following properties for all $x, y, z \in V$ and $\alpha \in \mathbb{F}$:
[Positive definiteness] $\langle x, x \rangle \geq 0$ and $\langle x, x \rangle = 0 \iff x = 0$.
[Conjugate symmetry] $\langle x, y \rangle = \overline{\langle y, x \rangle}$.
[Linearity in the first argument] $\langle \alpha x + y, z \rangle = \alpha \langle x, z \rangle + \langle y, z \rangle$.
(Scaling in second argument)
Let $V$ be an inner product space. For any $x, y \in V$ and $\alpha \in \mathbb{F}$: $\langle x, \alpha y \rangle = \bar{\alpha} \langle x, y \rangle$.
Proof. We proceed as follows:
$$\langle x, \alpha y \rangle = \overline{\langle \alpha y, x \rangle} = \overline{\alpha \langle y, x \rangle} = \bar{\alpha} \overline{\langle y, x \rangle} = \bar{\alpha} \langle x, y \rangle.$$
(Distribution in second argument)
Let $V$ be an inner product space. For any $x, y, z \in V$: $\langle x, y + z \rangle = \langle x, y \rangle + \langle x, z \rangle$.
Proof. We proceed as follows:
$$\langle x, y + z \rangle = \overline{\langle y + z, x \rangle} = \overline{\langle y, x \rangle + \langle z, x \rangle} = \overline{\langle y, x \rangle} + \overline{\langle z, x \rangle} = \langle x, y \rangle + \langle x, z \rangle.$$
(Inner product with zero)
Let $V$ be an inner product space. For any $x \in V$: $\langle 0, x \rangle = \langle x, 0 \rangle = 0$.
Proof. We proceed as follows:
$$\langle 0, x \rangle = \langle 0 + 0, x \rangle = \langle 0, x \rangle + \langle 0, x \rangle.$$
By cancelling terms, we get: $\langle 0, x \rangle = 0$.
Using the conjugate symmetry, we get: $\langle x, 0 \rangle = \overline{\langle 0, x \rangle} = 0$.
Linearity in the first argument extends to any arbitrary linear combination:
$$\left\langle \sum_i \alpha_i x_i, y \right\rangle = \sum_i \alpha_i \langle x_i, y \rangle.$$
Similarly, we have conjugate linearity in the second argument for any arbitrary linear combination:
$$\left\langle x, \sum_i \alpha_i y_i \right\rangle = \sum_i \bar{\alpha}_i \langle x, y_i \rangle.$$
The standard inner product on $\mathbb{R}^n$ is given by
$$\langle x, y \rangle = \sum_{i=1}^n x_i y_i = y^T x.$$
This is often called the dot product or scalar product.
The standard inner product on $\mathbb{C}^n$ is given by
$$\langle x, y \rangle = \sum_{i=1}^n x_i \bar{y}_i = y^H x.$$
Let
Now:
Thus, it is positive definite. It is symmetric. We can also verify that it is linear in the first argument.
Thus, it satisfies all the properties of an inner product.
Note that, in the matrix notation, we can write this inner product as:
The matrix
is positive definite.
Its trace is
Let
It can be easily seen that:
where
Let
For any
We identify the
When
This is the standard inner product on the space of column vectors.
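As a quick numerical illustration, here is a small NumPy sketch (the array values are arbitrary) that evaluates the standard inner products on $\mathbb{R}^n$ and $\mathbb{C}^n$ described above.

```python
import numpy as np

# Standard inner product on R^n: <x, y> = sum_i x_i y_i = y^T x.
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 0.5])
print(np.dot(y, x))            # 3.5

# Standard inner product on C^n: <x, y> = sum_i x_i conj(y_i) = y^H x.
# np.vdot(a, b) computes conj(a)^T b, so <x, y> = np.vdot(y, x).
u = np.array([1 + 1j, 2 - 1j])
v = np.array([0 + 1j, 3 + 2j])
print(np.vdot(v, u))           # equals sum(u * conj(v))
print(np.sum(u * np.conj(v)))  # same value
```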
For complex inner products, the inner product is completely determined by its real part.
This statement may be confusing. Let us unpack what it means. Let $x, y \in V$ where $V$ is a complex inner product space.
Then, computing the inner product $\langle x, y \rangle$ involves computing its
real part as well as its imaginary part.
What the statement means is that, if we know how to
compute $\operatorname{Re} \langle x, y \rangle$ for every pair of vectors, then we can recover the full inner product $\langle x, y \rangle$ as well.
Proof. Let $x, y \in V$.
For any complex number $z = a + i b$, we have $\operatorname{Re}(-i z) = \operatorname{Re}(b - i a) = b = \operatorname{Im}(z)$.
Since $\langle x, i y \rangle = \bar{i} \langle x, y \rangle = -i \langle x, y \rangle$, we get $\operatorname{Re} \langle x, i y \rangle = \operatorname{Im} \langle x, y \rangle$.
Thus, $\langle x, y \rangle = \operatorname{Re} \langle x, y \rangle + i \operatorname{Re} \langle x, i y \rangle$, so the real part determines the whole inner product.
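The identity $\langle x, y \rangle = \operatorname{Re} \langle x, y \rangle + i \operatorname{Re} \langle x, i y \rangle$ can be checked numerically. Here is a minimal NumPy sketch, using the standard inner product on $\mathbb{C}^n$ and arbitrary vectors:

```python
import numpy as np

def ip(x, y):
    """Standard complex inner product <x, y> = y^H x (linear in the first argument)."""
    return np.vdot(y, x)

rng = np.random.default_rng(0)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# Recover the full inner product from real parts only.
recovered = np.real(ip(x, y)) + 1j * np.real(ip(x, 1j * y))
print(np.allclose(recovered, ip(x, y)))  # True
```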
4.5.2. Real Inner Product#
From the perspective of convex analysis, the general inner product is not very useful.
We prefer a special class of inner products whose value is always real.
This is applicable to vector spaces where the field of scalars is either $\mathbb{R}$ or $\mathbb{C}$.
(Real inner product)
A real inner product over an $\mathbb{F}$-vector space $V$ is a map $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{R}$ satisfying the following properties for all $x, y, z \in V$ and $\alpha \in \mathbb{R}$:
[Positive definiteness] $\langle x, x \rangle \geq 0$ and $\langle x, x \rangle = 0 \iff x = 0$.
[Symmetry] $\langle x, y \rangle = \langle y, x \rangle$.
[Linearity in the first argument] $\langle \alpha x + y, z \rangle = \alpha \langle x, z \rangle + \langle y, z \rangle$.
A real inner product is always real valued no matter whether the vectors are real or complex.
Since a real inner product is symmetric and linear in the first argument, it is linear in the second argument too.
In this example, we are dealing with
Let
Then
It is positive definite. It is symmetric. It is linear in the first argument.
Now, for any
Following the argument above, it is a real inner product on
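As a hedged illustration (assuming the example above uses the real inner product $\langle x, y \rangle = \operatorname{Re}(y^H x)$ on $\mathbb{C}^n$ viewed as a real vector space), the following NumPy sketch checks that such a map is real valued, symmetric, and linear over real scalars:

```python
import numpy as np

def real_ip(x, y):
    """Real inner product on C^n viewed as a real vector space: <x, y> = Re(y^H x)."""
    return np.real(np.vdot(y, x))

rng = np.random.default_rng(1)
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)
z = rng.standard_normal(3) + 1j * rng.standard_normal(3)
a = 2.5  # a *real* scalar

print(real_ip(x, x) > 0)                                # positive definiteness (x != 0)
print(np.isclose(real_ip(x, y), real_ip(y, x)))         # symmetry
print(np.isclose(real_ip(a * x + z, y),
                 a * real_ip(x, y) + real_ip(z, y)))    # linearity over R
```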
Interestingly, if
While the presentation in the rest of this section is based on the general conjugate symmetric inner product, it is easy to extrapolate the results to the special case of real inner products.
4.5.3. Inner Product Space#
(Inner product space / Pre-Hilbert space)
An $\mathbb{F}$-vector space $V$ equipped with an inner product $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{F}$ is known as an inner product space or a pre-Hilbert space.
4.5.4. Orthogonality#
Orthogonality is the generalization of the notion of perpendicularity from elementary geometry.
(Orthogonal vectors)
Any two vectors $x, y \in V$ are called orthogonal to each other if $\langle x, y \rangle = 0$.
We write $x \perp y$ if $x$ and $y$ are orthogonal to each other.
(Set of orthogonal vectors)
A set of non-zero vectors $\{x_1, \dots, x_p\}$ is called (mutually) orthogonal if $\langle x_i, x_j \rangle = 0$ whenever $i \neq j$.
(Orthogonality implies independence)
A set of orthogonal vectors is linearly independent.
Proof. Let $\{x_1, \dots, x_p\}$ be a set of orthogonal vectors and suppose $\alpha_1 x_1 + \dots + \alpha_p x_p = 0$.
Taking inner product on both sides with $x_j$, we get
$$0 = \left\langle \sum_{i=1}^p \alpha_i x_i, x_j \right\rangle = \sum_{i=1}^p \alpha_i \langle x_i, x_j \rangle = \alpha_j \langle x_j, x_j \rangle.$$
Since $x_j \neq 0$, we have $\langle x_j, x_j \rangle > 0$; hence $\alpha_j = 0$ for every $j$.
Thus, the only zero linear combination is the trivial combination. Thus, the vectors are linearly independent.
4.5.5. Norm Induced by Inner Product#
(Norm induced by inner product)
Every inner product $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{F}$ induces a norm given by
$$\| x \| = \sqrt{\langle x, x \rangle}.$$
We shall justify that this function satisfies all the properties of a norm later. But before that, let us examine some implications of this definition which are useful in their own right.
Note that it is easy to see that $\| x \| \geq 0$ and $\| x \| = 0$ if and only if $x = 0$, by the positive definiteness of the inner product.
Also, it is positively homogeneous, since:
$$\| \alpha x \| = \sqrt{\langle \alpha x, \alpha x \rangle} = \sqrt{\alpha \bar{\alpha} \langle x, x \rangle} = | \alpha | \sqrt{\langle x, x \rangle} = | \alpha | \| x \|.$$
(Pythagoras theorem)
If $x \perp y$, then
$$\| x + y \|^2 = \| x \|^2 + \| y \|^2.$$
Proof. Expanding:
$$\| x + y \|^2 = \langle x + y, x + y \rangle = \langle x, x \rangle + \langle x, y \rangle + \langle y, x \rangle + \langle y, y \rangle = \| x \|^2 + \| y \|^2,$$
where we used the fact that: $\langle x, y \rangle = \langle y, x \rangle = 0$ since $x \perp y$.
(Cauchy-Schwarz inequality)
For any $x, y \in V$:
$$| \langle x, y \rangle | \leq \| x \| \| y \|.$$
The equality holds if and only if $x$ and $y$ are linearly dependent.
Proof. If either $x = 0$ or $y = 0$, then both sides are zero and the inequality holds trivially. So assume $y \neq 0$.
Define
$$w = x - \frac{\langle x, y \rangle}{\| y \|^2} y.$$
Then, $\langle w, y \rangle = \langle x, y \rangle - \frac{\langle x, y \rangle}{\| y \|^2} \langle y, y \rangle = 0$; i.e., $w \perp y$.
Thus, by the Pythagoras theorem,
$$\| x \|^2 = \| w \|^2 + \frac{| \langle x, y \rangle |^2}{\| y \|^2} \geq \frac{| \langle x, y \rangle |^2}{\| y \|^2}.$$
Multiplying on both sides by $\| y \|^2$, we get
$$| \langle x, y \rangle |^2 \leq \| x \|^2 \| y \|^2.$$
Taking square roots on both sides,
$$| \langle x, y \rangle | \leq \| x \| \| y \|.$$
In the derivation above, the equality holds if and only if $\| w \| = 0$,
which means that $x = \frac{\langle x, y \rangle}{\| y \|^2} y$; i.e., $x$ and $y$ are linearly dependent.
Conversely, if $x = \alpha y$ for some $\alpha \in \mathbb{F}$, then $| \langle x, y \rangle | = | \alpha | \| y \|^2 = \| x \| \| y \|$,
giving us equality.
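A quick numerical sanity check of the Cauchy-Schwarz inequality with NumPy (arbitrary complex vectors); equality is attained when one vector is a scalar multiple of the other:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

lhs = abs(np.vdot(y, x))                       # |<x, y>|
rhs = np.linalg.norm(x) * np.linalg.norm(y)    # ||x|| ||y||
print(lhs <= rhs + 1e-12)                      # True

# Equality for linearly dependent vectors.
z = (2 - 3j) * y
print(np.isclose(abs(np.vdot(y, z)), np.linalg.norm(z) * np.linalg.norm(y)))  # True
```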
(Inner product induced norm justification)
The function $\| \cdot \| : V \to \mathbb{R}$ defined by $\| x \| = \sqrt{\langle x, x \rangle}$ is indeed a norm.
Proof. We need to verify that $\| \cdot \|$ satisfies the triangle inequality; positive definiteness and positive homogeneity were established above. We have
$$\| x + y \|^2 = \langle x + y, x + y \rangle = \| x \|^2 + 2 \operatorname{Re} \langle x, y \rangle + \| y \|^2 \leq \| x \|^2 + 2 | \langle x, y \rangle | + \| y \|^2 \leq \| x \|^2 + 2 \| x \| \| y \| + \| y \|^2 = \left( \| x \| + \| y \| \right)^2,$$
where the last step uses the Cauchy-Schwarz inequality.
Taking square root on both sides, we obtain:
$$\| x + y \| \leq \| x \| + \| y \|.$$
Thus, $\| \cdot \|$ satisfies all the properties of a norm.
We recap the sequence of results to emphasize the logical flow:
We started with just the definition of the induced norm $\| x \| = \sqrt{\langle x, x \rangle}$ in Definition 4.76.
We proved positive definiteness from the definition itself.
We proved positive homogeneity also from the definition itself.
We proved the Pythagoras theorem utilizing previously established results for inner products.
We proved the Cauchy-Schwarz inequality using positive definiteness, positive homogeneity and the Pythagoras theorem.
We proved the triangle inequality using the Cauchy-Schwarz inequality.
(Inner product space to metric space)
Every inner product space is a normed space. Hence it is also a metric space.
Proof. An inner product induces a norm which makes the vector space a normed space. A norm induces a metric which makes the vector space a metric space.
4.5.6. Hilbert Spaces#
(Hilbert space)
An inner product space $V$ that is complete with respect to the metric induced by the norm induced by its inner product is called a Hilbert space.
In other words, $V$ is a Hilbert space if every Cauchy sequence in $V$ converges in $V$.
4.5.7. Orthonormality#
(Set of orthonormal vectors)
A set of non-zero vectors $\{e_1, \dots, e_p\}$ is called orthonormal if the vectors are pairwise orthogonal and each has unit norm;
i.e., $\langle e_i, e_j \rangle = 0$ for $i \neq j$ and $\langle e_i, e_i \rangle = 1$.
In other words, the vectors are unit norm ($\| e_i \| = 1$) and mutually orthogonal.
Since orthonormal vectors are orthogonal, they are linearly independent.
(Orthonormal basis)
A set of orthonormal vectors form an orthonormal basis for their span.
(Expansion of a vector in an orthonormal basis)
Let $\{e_1, \dots, e_n\}$ be an orthonormal basis for $V$. Then, every $x \in V$ can be written as
$$x = \sum_{i=1}^n \langle x, e_i \rangle e_i.$$
Proof. Since $\{e_1, \dots, e_n\}$ is a basis, we can write
$$x = \sum_{i=1}^n \alpha_i e_i,$$
where $\alpha_1, \dots, \alpha_n$ are the coordinates of $x$ in this basis.
Taking inner product with $e_j$ on both sides:
$$\langle x, e_j \rangle = \sum_{i=1}^n \alpha_i \langle e_i, e_j \rangle.$$
Since $\langle e_i, e_j \rangle = 0$ for $i \neq j$ and $\langle e_j, e_j \rangle = 1$, we get $\alpha_j = \langle x, e_j \rangle$.
(Norm of a vector in an orthonormal basis)
Let $\{e_1, \dots, e_n\}$ be an orthonormal basis for $V$ and let $x = \sum_{i=1}^n \langle x, e_i \rangle e_i$.
Then,
$$\| x \|^2 = \sum_{i=1}^n | \langle x, e_i \rangle |^2.$$
Proof. Expanding the expression for norm squared:
$$\| x \|^2 = \left\langle \sum_{i=1}^n \langle x, e_i \rangle e_i, \sum_{j=1}^n \langle x, e_j \rangle e_j \right\rangle = \sum_{i=1}^n \sum_{j=1}^n \langle x, e_i \rangle \overline{\langle x, e_j \rangle} \langle e_i, e_j \rangle = \sum_{i=1}^n | \langle x, e_i \rangle |^2,$$
since $\langle e_i, e_j \rangle = 0$ for $i \neq j$ and $\langle e_i, e_i \rangle = 1$.
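These two results can be verified numerically. The sketch below builds an orthonormal basis of $\mathbb{R}^4$ (via QR factorization of a random matrix, purely for illustration), expands a vector in that basis, and checks the norm identity:

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # columns of Q form an orthonormal basis
x = rng.standard_normal(4)

coeffs = Q.T @ x                 # <x, e_i> for each basis vector e_i
x_rebuilt = Q @ coeffs           # sum_i <x, e_i> e_i
print(np.allclose(x_rebuilt, x))                                 # expansion in the basis
print(np.isclose(np.linalg.norm(x) ** 2, np.sum(coeffs ** 2)))   # ||x||^2 = sum |<x, e_i>|^2
```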
Here are some interesting questions:
Can a basis in an inner product space be converted into an orthonormal basis?
Does a finite dimensional inner product space have an orthonormal basis?
Does every finite dimensional subspace of an inner product space have an orthonormal basis?
The answer to these questions is yes. We provide a constructive answer by the Gram-Schmidt algorithm described in the next section.
4.5.8. The Gram-Schmidt Algorithm#
The Gram-Schmidt algorithm (described below) constructs, from an arbitrary basis, an orthonormal basis for the span of that basis.
(The Gram-Schmidt algorithm)
Inputs: a set of linearly independent vectors $x_1, x_2, \dots, x_n$.
Outputs: a set of orthonormal vectors $e_1, e_2, \dots, e_n$ with the same span.
Set $v_1 = x_1$ and $e_1 = \frac{v_1}{\| v_1 \|}$.
For $j = 2, \dots, n$:
$v_j = x_j - \sum_{i=1}^{j-1} \langle x_j, e_i \rangle e_i$.
$e_j = \frac{v_j}{\| v_j \|}$.
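Below is a minimal NumPy sketch of the algorithm as stated (the function name and interface are illustrative, not from the text). It takes linearly independent vectors as the columns of a matrix and returns orthonormal columns with the same span:

```python
import numpy as np

def gram_schmidt(X):
    """Classical Gram-Schmidt: columns of X are linearly independent vectors x_1..x_n.
    Returns a matrix E whose columns e_1..e_n are orthonormal with the same span."""
    n = X.shape[1]
    E = np.zeros_like(X, dtype=complex)
    for j in range(n):
        v = X[:, j].astype(complex)
        for i in range(j):
            # subtract the component of x_j along e_i: <x_j, e_i> e_i
            v = v - np.vdot(E[:, i], X[:, j]) * E[:, i]
        E[:, j] = v / np.linalg.norm(v)
    return E

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 3))
E = gram_schmidt(X)
print(np.allclose(E.conj().T @ E, np.eye(3)))  # orthonormal columns
```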
(Justification for Gram-Schmidt algorithm)
Let $x_1, x_2, \dots, x_n$ be linearly independent. Then the vectors $e_1, e_2, \dots, e_n$ produced by the Gram-Schmidt algorithm are orthonormal.
Moreover, for each $j = 1, \dots, n$:
$$\operatorname{span} \{e_1, \dots, e_j\} = \operatorname{span} \{x_1, \dots, x_j\}.$$
Proof. We prove this by mathematical induction.
Consider the base case for $j = 1$:
$v_1 = x_1$ and $e_1 = \frac{x_1}{\| x_1 \|}$ is well-defined since $x_1 \neq 0$.
Thus, $\| e_1 \| = 1$ and $\operatorname{span} \{e_1\} = \operatorname{span} \{x_1\}$ because $e_1$ is a nonzero scalar multiple of $x_1$.
Now, assume that the set $\{e_1, \dots, e_{j-1}\}$ is orthonormal and $\operatorname{span} \{e_1, \dots, e_{j-1}\} = \operatorname{span} \{x_1, \dots, x_{j-1}\}$.
Since $x_j$ is linearly independent from $x_1, \dots, x_{j-1}$, hence $x_j \notin \operatorname{span} \{x_1, \dots, x_{j-1}\} = \operatorname{span} \{e_1, \dots, e_{j-1}\}$.
Thus, $v_j = x_j - \sum_{i=1}^{j-1} \langle x_j, e_i \rangle e_i \neq 0$. If it was $0$, then $x_j$ would be linearly dependent on $e_1, \dots, e_{j-1}$, and hence on $x_1, \dots, x_{j-1}$.
Thus, $\| v_j \| > 0$ and $e_j = \frac{v_j}{\| v_j \|}$ is well-defined. Also, $\| e_j \| = 1$ by construction; thus, $e_j$ is unit-norm.
Note that $v_j$ is orthogonal to $e_1, \dots, e_{j-1}$. For any $k < j$, we have:
$$\langle v_j, e_k \rangle = \langle x_j, e_k \rangle - \sum_{i=1}^{j-1} \langle x_j, e_i \rangle \langle e_i, e_k \rangle = \langle x_j, e_k \rangle - \langle x_j, e_k \rangle = 0,$$
since $e_1, \dots, e_{j-1}$ are orthonormal.
Thus, for any $k < j$: $\langle e_j, e_k \rangle = \frac{1}{\| v_j \|} \langle v_j, e_k \rangle = 0$.
Thus, $e_j$ is orthogonal to $e_1, \dots, e_{j-1}$. Since all of them are unit norm, hence $e_1, \dots, e_j$ are indeed orthonormal.
We also need to show that $\operatorname{span} \{e_1, \dots, e_j\} = \operatorname{span} \{x_1, \dots, x_j\}$.
Note that $v_j \in \operatorname{span} \{x_1, \dots, x_j\}$ since $e_1, \dots, e_{j-1} \in \operatorname{span} \{x_1, \dots, x_{j-1}\}$ by the inductive hypothesis.
Thus, $e_j \in \operatorname{span} \{x_1, \dots, x_j\}$ since $e_j$ is just a scaled $v_j$.
Thus, $\operatorname{span} \{e_1, \dots, e_j\} \subseteq \operatorname{span} \{x_1, \dots, x_j\}$.
For the converse, by definition $x_j = v_j + \sum_{i=1}^{j-1} \langle x_j, e_i \rangle e_i = \| v_j \| e_j + \sum_{i=1}^{j-1} \langle x_j, e_i \rangle e_i$.
Hence, $x_j \in \operatorname{span} \{e_1, \dots, e_j\}$.
Thus, $\operatorname{span} \{x_1, \dots, x_j\} \subseteq \operatorname{span} \{e_1, \dots, e_j\}$.
Thus, $\operatorname{span} \{e_1, \dots, e_j\} = \operatorname{span} \{x_1, \dots, x_j\}$ must be true.
(Existence of orthonormal basis)
Every finite dimensional inner product space has an orthonormal basis.
Proof. This is a simple application of the Gram-Schmidt algorithm.
Every finite dimensional vector space has a finite basis.
Every finite basis can be turned into an orthonormal basis by the Gram-Schmidt algorithm.
Thus, we have an orthonormal basis.
Every finite dimensional subspace of an inner product space has an orthonormal basis.
4.5.9. Orthogonal Complements#
(Orthogonal complement)
Let $S$ be a subset of an inner product space $V$. The orthogonal complement of $S$ is the set of all vectors in $V$ which are orthogonal to every vector in $S$:
$$S^{\perp} = \{ v \in V \mid \langle v, s \rangle = 0 \ \forall \ s \in S \}.$$
(Orthogonal complement of a vector)
Let $a \in V$. The orthogonal complement of $a$ is the set of all vectors in $V$ which are orthogonal to $a$:
$$a^{\perp} = \{ v \in V \mid \langle v, a \rangle = 0 \}.$$
(Orthogonal complement is a linear subspace)
If $S$ is a subset of an inner product space $V$, then $S^{\perp}$ is a linear subspace of $V$.
Proof. To verify that $S^{\perp}$ is a linear subspace, we need to check that:
It contains the zero vector.
It is closed under vector addition.
It is closed under scalar multiplication.
We proceed as follows:
$\langle 0, s \rangle = 0$ holds for any $s \in S$. Thus, $0 \in S^{\perp}$.
Let $u, v \in S^{\perp}$. Then, $\langle u, s \rangle = 0$ and $\langle v, s \rangle = 0$ for every $s \in S$.
Thus, $\langle u + v, s \rangle = \langle u, s \rangle + \langle v, s \rangle = 0$ for every $s \in S$.
Thus, $u + v \in S^{\perp}$.
Similarly, if $v \in S^{\perp}$ and $\alpha \in \mathbb{F}$, then $\langle \alpha v, s \rangle = \alpha \langle v, s \rangle = 0$ for every $s \in S$.
Thus, $\alpha v \in S^{\perp}$. Hence, $S^{\perp}$ is a linear subspace of $V$.
The orthogonal complement of the inner product space $V$ itself is $V^{\perp} = \{ 0 \}$, since the only vector orthogonal to every vector in $V$ (including itself) is the zero vector.
(Orthogonal complement and basis)
If $S$ is a subspace of $V$ spanned by a set of vectors $\{x_1, \dots, x_p\}$, then a vector $u$ belongs to $S^{\perp}$ if and only if $u$ is orthogonal to each of the vectors $x_1, \dots, x_p$.
Specifically, if $\{x_1, \dots, x_p\}$ is a basis for $S$, then
$$S^{\perp} = \{ u \in V \mid \langle u, x_i \rangle = 0, \ i = 1, \dots, p \}.$$
Proof. Let $S = \operatorname{span} \{x_1, \dots, x_p\}$ and denote $T = \{ u \in V \mid \langle u, x_i \rangle = 0, \ i = 1, \dots, p \}$.
Then, for any $x \in S$, we can write
$$x = \sum_{i=1}^p \alpha_i x_i,$$
where $\alpha_1, \dots, \alpha_p$ are suitable scalars.
We first show that $S^{\perp} \subseteq T$:
Let $u \in S^{\perp}$. Then, $\langle u, x \rangle = 0$ for every $x \in S$.
In particular, $\langle u, x_i \rangle = 0$ for $i = 1, \dots, p$ since $x_i \in S$.
Thus, $u \in T$. Thus, $S^{\perp} \subseteq T$.
We next show that $T \subseteq S^{\perp}$:
Let $u \in T$. Then, $\langle u, x_i \rangle = 0$ for $i = 1, \dots, p$.
But then, for any $x \in S$,
$$\langle u, x \rangle = \left\langle u, \sum_{i=1}^p \alpha_i x_i \right\rangle = \sum_{i=1}^p \bar{\alpha}_i \langle u, x_i \rangle = 0,$$
since $x$ is a linear combination of $x_1, \dots, x_p$.
Thus, $\langle u, x \rangle = 0$ for every $x \in S$. Thus, $u \in S^{\perp}$. Thus, $T \subseteq S^{\perp}$.
Combining: $S^{\perp} = T$.
(Orthogonal decomposition)
Let $S$ be a finite dimensional subspace of $V$. Then, every $x \in V$ can be written uniquely as
$$x = u + v,$$
where $u \in S$ and $v \in S^{\perp}$.
Proof. Let $\{e_1, \dots, e_p\}$ be an orthonormal basis for $S$, which exists since $S$ is finite dimensional.
Define:
$$u = \sum_{i=1}^p \langle x, e_i \rangle e_i.$$
And
$$v = x - u.$$
By construction, $u \in S$ since it is a linear combination of $e_1, \dots, e_p$.
Now, for every $j = 1, \dots, p$:
$$\langle v, e_j \rangle = \langle x, e_j \rangle - \langle u, e_j \rangle = \langle x, e_j \rangle - \langle x, e_j \rangle = 0.$$
Thus, $v$ is orthogonal to every vector in a basis of $S$; hence $v \in S^{\perp}$.
We have shown the existence of the decomposition of a vector $x$ as $x = u + v$ with $u \in S$ and $v \in S^{\perp}$. We now show that the decomposition is unique.
For contradiction, assume there was another decomposition:
$$x = u' + v',$$
such that $u' \in S$ and $v' \in S^{\perp}$.
Then,
$$u + v = u' + v'$$
gives us:
$$u - u' = v' - v.$$
Thus, $u - u' \in S$ and $u - u' = v' - v \in S^{\perp}$; i.e., $u - u'$ is orthogonal to itself.
This is possible only if $u - u' = 0$, which also implies $v' - v = 0$.
Thus, $u = u'$ and $v = v'$, and $x = u + v$
is a unique decomposition.
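A small NumPy sketch of the orthogonal decomposition (illustrative names; the subspace $S$ is spanned by the columns of a matrix $B$): the component $u$ is computed from an orthonormal basis of $S$, and $v = x - u$ is checked to be orthogonal to $S$:

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((5, 2))        # columns span a 2-dimensional subspace S of R^5
Q, _ = np.linalg.qr(B)                 # orthonormal basis {e_1, e_2} of S
x = rng.standard_normal(5)

u = Q @ (Q.T @ x)                      # u = sum_i <x, e_i> e_i  (component in S)
v = x - u                              # component in S-perp
print(np.allclose(x, u + v))           # x = u + v
print(np.allclose(B.T @ v, 0))         # v is orthogonal to every spanning vector of S
```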
(Intersection between a subspace and its complement)
If $S$ is a subspace of $V$, then $S \cap S^{\perp} = \{ 0 \}$.
In other words, the only vector common between $S$ and $S^{\perp}$ is the zero vector.
(Vector space as direct sum)
If $S$ is a finite dimensional subspace of $V$, then $V = S \oplus S^{\perp}$.
In other words, every $x \in V$ can be written uniquely as $x = u + v$ with $u \in S$ and $v \in S^{\perp}$.
Proof. From Corollary 4.14, the
intersection between $S$ and $S^{\perp}$ is trivial; i.e., $S \cap S^{\perp} = \{ 0 \}$.
By Theorem 4.85,
every vector $x \in V$ can be written as $x = u + v$,
where $u \in S$ and $v \in S^{\perp}$.
Thus, $V = S + S^{\perp}$.
However, since both $S$ and $S^{\perp}$ are subspaces of $V$ with trivial intersection, the sum is a direct sum: $V = S \oplus S^{\perp}$.
(Dimension of vector space as direct sum)
Let $V$ be a finite dimensional inner product space and let $S$ be a subspace of $V$. Then
$$\dim V = \dim S + \dim S^{\perp}.$$
Proof. Since $V$ is finite dimensional, both $S$ and $S^{\perp}$ are finite dimensional subspaces of $V$.
By Theorem 4.86, $V = S \oplus S^{\perp}$.
Then, due to Theorem 4.20, $\dim V = \dim S + \dim S^{\perp}$.
(Orthogonal complement of orthogonal complement)
Let $V$ be a finite dimensional inner product space and let $S$ be a subspace of $V$. Then
$$\left( S^{\perp} \right)^{\perp} = S.$$
In other words, in a finite dimensional space, the orthogonal complement of the orthogonal complement is the original subspace itself.
Note that this result is valid only for finite dimensional
spaces, since in that case both $S$ and $S^{\perp}$ are finite dimensional.
Proof. Since $V$ is finite dimensional, both $S$ and $S^{\perp}$ are finite dimensional subspaces of $V$.
We shall first show that $S \subseteq \left( S^{\perp} \right)^{\perp}$:
Let $x \in S$.
Then, by definition, $\langle x, v \rangle = 0$ for every $v \in S^{\perp}$.
Thus, $x$ is orthogonal to every vector in $S^{\perp}$.
Thus, $x \in \left( S^{\perp} \right)^{\perp}$. Hence, $S \subseteq \left( S^{\perp} \right)^{\perp}$.
We now show that $\left( S^{\perp} \right)^{\perp} \subseteq S$:
Let $u \in \left( S^{\perp} \right)^{\perp}$.
By Theorem 4.86, $V = S \oplus S^{\perp}$ since $S$ is a finite dimensional subspace of $V$.
Thus, there exist $v \in S$ and $w \in S^{\perp}$ such that $u = v + w$.
Since $v \in S$, hence $v \in \left( S^{\perp} \right)^{\perp}$ (we have already shown above that $S \subseteq \left( S^{\perp} \right)^{\perp}$).
Thus, $w = u - v \in \left( S^{\perp} \right)^{\perp}$, since both $u$ and $v$ belong to the subspace $\left( S^{\perp} \right)^{\perp}$.
Thus, $w \in S^{\perp} \cap \left( S^{\perp} \right)^{\perp}$.
But, by Corollary 4.14, $S^{\perp} \cap \left( S^{\perp} \right)^{\perp} = \{ 0 \}$ since $\left( S^{\perp} \right)^{\perp}$ is the orthogonal complement of $S^{\perp}$ and $S^{\perp}$ is finite dimensional.
Thus, $w = 0$.
Thus, $u = v \in S$.
Since $u$ was an arbitrary element of $\left( S^{\perp} \right)^{\perp}$, hence $\left( S^{\perp} \right)^{\perp} \subseteq S$.
Combining the two: $\left( S^{\perp} \right)^{\perp} = S$.
(n-1 dimensional subspaces)
Let $V$ be a finite dimensional inner product space with $\dim V = n$ and let $S$ be an $(n-1)$-dimensional subspace of $V$. Then, there exists a nonzero vector $b \in V$ such that
$$S = \{ x \in V \mid \langle x, b \rangle = 0 \} = b^{\perp}.$$
In other words, the $(n-1)$-dimensional subspaces of $V$ are exactly the orthogonal complements of nonzero vectors.
Proof. Let $S$ be an $(n-1)$-dimensional subspace. By the dimension identity above, $\dim S + \dim S^{\perp} = n$.
This gives us $\dim S^{\perp} = 1$. Let $b$ be a nonzero vector spanning $S^{\perp}$.
Since $S$ is finite dimensional, $S = \left( S^{\perp} \right)^{\perp} = \left( \operatorname{span} \{ b \} \right)^{\perp} = b^{\perp}$.
Thus, $S = \{ x \in V \mid \langle x, b \rangle = 0 \}$.
4.5.10. Orthogonal Projection#
Recall that a projection
operator $P : V \to V$ is a linear operator satisfying $P^2 = P$; i.e., $P$ is idempotent.
The range of $P$ is $\mathcal{R}(P) = \{ P x \mid x \in V \}$.
The null space of $P$ is $\mathcal{N}(P) = \{ x \in V \mid P x = 0 \}$.
(Orthogonal projection operator)
A projection operator $P : V \to V$ is called an orthogonal projection operator if its range and null space are orthogonal to each other; i.e., $\langle r, n \rangle = 0$ for every $r \in \mathcal{R}(P)$ and every $n \in \mathcal{N}(P)$.
(Orthogonal projection operator for a subspace)
Let $S$ be a finite dimensional subspace of $V$. Define the operator $P_S : V \to V$ by $P_S (x) = u$,
where
$$x = u + v, \quad u \in S, \ v \in S^{\perp}$$
is the unique orthogonal decomposition of $x$ with respect to $S$. Then:
For any $x \in V$, $x - P_S (x) \in S^{\perp}$.
$P_S$ is a linear map.
$P_S$ is the identity map when restricted to $S$; i.e., $P_S (x) = x$ for every $x \in S$.
$\mathcal{R}(P_S) = S$.
$\mathcal{N}(P_S) = S^{\perp}$.
$I - P_S$ is the orthogonal projection operator onto $S^{\perp}$.
For any $x \in V$, $\| P_S (x) \| \leq \| x \|$.
For any $x \in V$ and $y \in S$:
$$\| x - P_S (x) \| \leq \| x - y \|$$
with equality if and only if $y = P_S (x)$.
Proof. For the sake of brevity, we abbreviate $P = P_S$.
Following (4.3), indeed: for every $x \in V$ there is a unique decomposition $x = u + v$ with $u \in S$ and $v \in S^{\perp}$, and $P (x) = u$.
For any $x \in V$, $x - P (x) = x - u = v$.
Since $v \in S^{\perp}$, hence $x - P (x) \in S^{\perp}$.
[Linear map]
Let $x, y \in V$ and $\alpha \in \mathbb{F}$.
Let $x = u_1 + v_1$ and $y = u_2 + v_2$ be the orthogonal decompositions with $u_1, u_2 \in S$ and $v_1, v_2 \in S^{\perp}$.
Consider $x + y = (u_1 + u_2) + (v_1 + v_2)$.
Then, $u_1 + u_2 \in S$ and $v_1 + v_2 \in S^{\perp}$.
Since the orthogonal decomposition is unique, hence $P (x + y) = u_1 + u_2 = P (x) + P (y)$.
Similarly, for $\alpha x = \alpha u_1 + \alpha v_1$, with $\alpha u_1 \in S$ and $\alpha v_1 \in S^{\perp}$, we get $P (\alpha x) = \alpha u_1 = \alpha P (x)$.
Thus, $P$ is a linear map.
For any $x \in S$, the unique orthogonal decomposition is $x = x + 0$; hence $P (x) = x$. Thus, $P$ is the identity map when restricted to $S$.
[Range]
Since $P$ maps $x$ to a component in $S$, hence $\mathcal{R}(P) \subseteq S$.
Since for every $x \in S$, there is $y \in V$ such that $P (y) = x$ (specifically $y = x$), hence $S \subseteq \mathcal{R}(P)$.
Combining, $\mathcal{R}(P) = S$.
[Null space]
Let $x \in S^{\perp}$. Write $x = 0 + x$ with $0 \in S$ and $x \in S^{\perp}$.
Then, $P (x) = 0$; i.e., $x$ is in the null space of $P$.
Hence, $S^{\perp} \subseteq \mathcal{N}(P)$.
Now, let $x \in \mathcal{N}(P)$.
We can write $x$ as $x = u + v$ where $u = P (x) = 0$ and $v \in S^{\perp}$.
Thus, $x = v \in S^{\perp}$.
Thus, $\mathcal{N}(P) \subseteq S^{\perp}$.
Combining, $\mathcal{N}(P) = S^{\perp}$.
[$I - P$ is the orthogonal projection onto $S^{\perp}$]
For any $x = u + v$ with $u \in S$ and $v \in S^{\perp}$, we have $(I - P)(x) = x - u = v$.
Since $S$ is finite dimensional, $S = \left( S^{\perp} \right)^{\perp}$; hence $I - P$ maps every $x$ to its component in $S^{\perp}$ along $\left( S^{\perp} \right)^{\perp}$.
Thus, $I - P$ is the orthogonal projection operator onto $S^{\perp}$.
[$\| P (x) \| \leq \| x \|$]
We have $x = P (x) + (x - P (x))$ with $P (x) \in S$ and $x - P (x) \in S^{\perp}$.
By Pythagoras theorem: $\| x \|^2 = \| P (x) \|^2 + \| x - P (x) \|^2$.
Thus, $\| P (x) \|^2 \leq \| x \|^2$.
Taking square root on both sides: $\| P (x) \| \leq \| x \|$.
[Best approximation: $\| x - P (x) \| \leq \| x - y \|$]
Let $x \in V$ and $y \in S$.
Note that $P (x) \in S$, hence $P (x) - y \in S$.
By definition, $x - P (x) \in S^{\perp}$.
Thus, $\langle x - P (x), P (x) - y \rangle = 0$.
We have: $x - y = (x - P (x)) + (P (x) - y)$.
Applying Pythagoras theorem:
$$\| x - y \|^2 = \| x - P (x) \|^2 + \| P (x) - y \|^2 \geq \| x - P (x) \|^2.$$
Taking square root on both sides:
$$\| x - y \| \geq \| x - P (x) \|.$$
Equality holds if and only if $\| P (x) - y \| = 0$, if and only if $y = P (x)$.
In order to show that $P$ is an orthogonal projection operator, we still need to verify that $P$ is a projection operator; i.e., $P^2 = P$.
We have shown that: $P (x) \in S$ for every $x \in V$, and $P (u) = u$ for every $u \in S$. Thus, $P (P (x)) = P (x)$ for every $x \in V$; i.e., $P^2 = P$. Hence $P$ is a projection operator. Moreover, $\mathcal{R}(P) = S$ and $\mathcal{N}(P) = S^{\perp}$. By definition, for any $r \in S$
and $n \in S^{\perp}$, $\langle r, n \rangle = 0$. Thus, the range and null space of $P$ are orthogonal to each other; hence $P$
is an orthogonal projection operator.
(Orthogonal projectors are self-adjoint)
A projection operator is orthogonal if and only if it is self-adjoint.
(Orthogonal projection on a line)
Consider a unit norm vector $u \in \mathbb{R}^n$;
thus, $u^T u = 1$.
Consider the operator $P = u u^T$.
Now, for any $x \in \mathbb{R}^n$, $P x = u u^T x = (u^T x) u = \langle x, u \rangle u$.
Thus, $P$ maps $x$ to the scalar multiple $\langle x, u \rangle u$ of $u$.
Now, $P^2 = u u^T u u^T = u (u^T u) u^T = u u^T = P$.
Thus, $P$ is a projection operator.
Also, $P^T = (u u^T)^T = u u^T = P$.
Thus, $P$ is self-adjoint; hence it is an orthogonal projection operator.
Let $x = \alpha u$ be any vector in $\operatorname{span} \{ u \}$.
Then, $P x = u u^T (\alpha u) = \alpha u = x$.
Thus, $P$ is the identity map on $\operatorname{span} \{ u \}$, and its range is $\operatorname{span} \{ u \}$.
Any vector $v \in \mathbb{R}^n$
such that $u^T v = 0$ (i.e., $v \perp u$) satisfies
$P v = u u^T v = 0$.
Thus, the null space of $P$ is $u^{\perp}$, and $P$ is the orthogonal projection onto the line $\operatorname{span} \{ u \}$.
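A brief NumPy sketch of projection onto a line (arbitrary example vectors): with a unit norm $u$, the matrix $P = u u^T$ is idempotent, symmetric, and maps $x$ to $\langle x, u \rangle u$:

```python
import numpy as np

u = np.array([3.0, 4.0])
u = u / np.linalg.norm(u)              # unit norm vector
P = np.outer(u, u)                     # P = u u^T

x = np.array([2.0, -1.0])
print(np.allclose(P @ P, P))           # idempotent: P^2 = P
print(np.allclose(P.T, P))             # symmetric (self-adjoint)
print(np.allclose(P @ x, np.dot(u, x) * u))  # P x = <x, u> u
```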
(Projections over the column space of a matrix)
Let $A \in \mathbb{R}^{m \times n}$ be a matrix with linearly independent columns $a_1, \dots, a_n$,
where $m \geq n$.
The column space of $A$ is $C(A) = \operatorname{span} \{ a_1, \dots, a_n \}$, the span of its columns.
It can be shown that $A^T A$ is invertible when the columns of $A$ are linearly independent.
Consider the operator
$$P_A = A (A^T A)^{-1} A^T.$$
Now,
$$P_A^2 = A (A^T A)^{-1} A^T A (A^T A)^{-1} A^T = A (A^T A)^{-1} A^T = P_A.$$
Thus, $P_A$ is a projection operator.
Also, $P_A^T = \left( A (A^T A)^{-1} A^T \right)^T = A (A^T A)^{-1} A^T = P_A$; thus, $P_A$ is self-adjoint.
Hence, $P_A$ is an orthogonal projection operator. Its range is the column space $C(A)$, so $P_A$ is the orthogonal projection onto the column space of $A$.
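A NumPy sketch of the column space projector (random full column rank matrix for illustration). In practice one would use a QR factorization or a least squares solver rather than forming the inverse explicitly; the explicit formula is used here only to mirror the derivation:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((6, 3))               # full column rank with probability 1
P = A @ np.linalg.inv(A.T @ A) @ A.T          # P_A = A (A^T A)^{-1} A^T

print(np.allclose(P @ P, P))      # projection: P^2 = P
print(np.allclose(P.T, P))        # self-adjoint, hence orthogonal projection
print(np.allclose(P @ A, A))      # columns of A are fixed: range of P contains C(A)
```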
4.5.11. Parallelogram Identity#
(Parallelogram identity)
$$2 \| x \|^2 + 2 \| y \|^2 = \| x + y \|^2 + \| x - y \|^2 \quad \forall \ x, y \in V.$$
Proof. Expanding:
$$\| x + y \|^2 = \langle x + y, x + y \rangle = \langle x, x \rangle + \langle x, y \rangle + \langle y, x \rangle + \langle y, y \rangle.$$
Also:
$$\| x - y \|^2 = \langle x - y, x - y \rangle = \langle x, x \rangle - \langle x, y \rangle - \langle y, x \rangle + \langle y, y \rangle.$$
Thus,
$$\| x + y \|^2 + \| x - y \|^2 = 2 \langle x, x \rangle + 2 \langle y, y \rangle = 2 \| x \|^2 + 2 \| y \|^2.$$
When the inner product is a real number, the following identity is quite useful.
(Parallelogram identity for real inner product)
$$\langle x, y \rangle = \frac{1}{4} \left( \| x + y \|^2 - \| x - y \|^2 \right) \quad \forall \ x, y \in V.$$
Proof. Expanding:
$$\| x + y \|^2 = \langle x + y, x + y \rangle = \langle x, x \rangle + \langle x, y \rangle + \langle y, x \rangle + \langle y, y \rangle.$$
Also,
$$\| x - y \|^2 = \langle x - y, x - y \rangle = \langle x, x \rangle - \langle x, y \rangle - \langle y, x \rangle + \langle y, y \rangle.$$
Thus,
$$\| x + y \|^2 - \| x - y \|^2 = 2 \left( \langle x, y \rangle + \langle y, x \rangle \right) = 4 \langle x, y \rangle,$$
since for real inner products $\langle x, y \rangle = \langle y, x \rangle$.
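Both identities are easy to confirm numerically. A short NumPy check with arbitrary real vectors:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.standard_normal(4)
y = rng.standard_normal(4)
nsq = lambda v: np.linalg.norm(v) ** 2

print(np.isclose(2 * nsq(x) + 2 * nsq(y), nsq(x + y) + nsq(x - y)))   # parallelogram identity
print(np.isclose(np.dot(x, y), 0.25 * (nsq(x + y) - nsq(x - y))))     # real inner product version
```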
4.5.12. Polarization identity#
When the inner product is a complex number, the polarization identity is quite useful.
(Polarization identity for complex inner product)
$$\langle x, y \rangle = \frac{1}{4} \left( \| x + y \|^2 - \| x - y \|^2 + i \| x + i y \|^2 - i \| x - i y \|^2 \right) \quad \forall \ x, y \in V.$$
Proof. Expanding
$$\| x + y \|^2 = \langle x, x \rangle + \langle x, y \rangle + \langle y, x \rangle + \langle y, y \rangle.$$
Also,
$$\| x - y \|^2 = \langle x, x \rangle - \langle x, y \rangle - \langle y, x \rangle + \langle y, y \rangle.$$
And,
$$\| x + i y \|^2 = \langle x, x \rangle + \langle x, i y \rangle + \langle i y, x \rangle + \langle y, y \rangle = \langle x, x \rangle - i \langle x, y \rangle + i \langle y, x \rangle + \langle y, y \rangle.$$
And,
$$\| x - i y \|^2 = \langle x, x \rangle + i \langle x, y \rangle - i \langle y, x \rangle + \langle y, y \rangle.$$
Thus,
$$\| x + y \|^2 - \| x - y \|^2 + i \| x + i y \|^2 - i \| x - i y \|^2 = 2 \left( \langle x, y \rangle + \langle y, x \rangle \right) + i \left( -2 i \langle x, y \rangle + 2 i \langle y, x \rangle \right) = 4 \langle x, y \rangle.$$
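A numerical check of the polarization identity with the standard complex inner product $\langle x, y \rangle = y^H x$ (arbitrary vectors):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)
nsq = lambda v: np.linalg.norm(v) ** 2

rhs = 0.25 * (nsq(x + y) - nsq(x - y) + 1j * nsq(x + 1j * y) - 1j * nsq(x - 1j * y))
print(np.allclose(np.vdot(y, x), rhs))  # <x, y> recovered from norms alone
```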