7.4. Basic Inequalities
Probability theory is full of inequalities, and many of its results are derived by applying them. This section collects some of the basic ones. A good reference is the Wikipedia list of inequalities; in particular, see the section on probability inequalities.
7.4.1. Markov's inequality
Theorem 7.18 (Markov's inequality)

Let $X$ be a nonnegative random variable and let $a > 0$. Then

$$\mathbb{P}(X \geq a) \leq \frac{\mathbb{E}[X]}{a}.$$
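The bound is easy to check numerically. Below is a minimal simulation sketch, assuming NumPy is available; the exponential distribution and its parameters are an illustrative choice of a nonnegative random variable, not something prescribed by the theorem.

```python
# Empirical check of Markov's inequality: P(X >= a) <= E[X] / a.
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=2.0, size=1_000_000)   # nonnegative, E[X] = 2

for a in [1.0, 2.0, 5.0, 10.0]:
    empirical = np.mean(X >= a)   # empirical tail probability
    bound = X.mean() / a          # Markov bound
    print(f"a = {a:4.1f}   P(X >= a) ~ {empirical:.4f}   bound = {bound:.4f}")
```

Note that for small $a$ the bound can exceed $1$ and is then vacuous; Markov's inequality is only informative for $a > \mathbb{E}[X]$.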
7.4.2. Chebyshev's inequality
Theorem 7.19 (Chebyshev's inequality)

Let $X$ be a random variable with finite mean $\mu$ and finite nonzero variance $\sigma^2$. Then for any $a > 0$,

$$\mathbb{P}(|X - \mu| \geq a) \leq \frac{\sigma^2}{a^2}.$$

Proof. Apply Markov's inequality to the nonnegative random variable $(X - \mu)^2$ with threshold $a^2$:

$$\mathbb{P}(|X - \mu| \geq a) = \mathbb{P}\left((X - \mu)^2 \geq a^2\right) \leq \frac{\mathbb{E}[(X - \mu)^2]}{a^2} = \frac{\sigma^2}{a^2}.$$

Choosing $a = k \sigma$ for $k > 0$ gives

$$\mathbb{P}(|X - \mu| \geq k \sigma) \leq \frac{1}{k^2};$$

i.e., the probability that $X$ deviates from its mean by more than $k$ standard deviations is at most $1/k^2$.
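As with Markov's inequality, the bound can be checked by simulation. The sketch below, assuming NumPy, uses a uniform distribution purely for illustration and compares the empirical $k$-standard-deviation tail with the $1/k^2$ bound.

```python
# Empirical check of Chebyshev's inequality:
# P(|X - mu| >= k * sigma) <= 1 / k**2.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(low=0.0, high=1.0, size=1_000_000)
mu, sigma = X.mean(), X.std()

for k in [1.5, 2.0, 3.0]:
    empirical = np.mean(np.abs(X - mu) >= k * sigma)
    print(f"k = {k:.1f}   P(|X - mu| >= k*sigma) ~ {empirical:.4f}   bound = {1 / k**2:.4f}")
```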
7.4.3. Boole's inequality
This is also known as the union bound. For any finite or countable collection of events $A_1, A_2, \dots$,

$$\mathbb{P}\left(\bigcup_i A_i\right) \leq \sum_i \mathbb{P}(A_i).$$
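The following sketch (assuming NumPy; the overlapping tail events of a single Gaussian sample are an arbitrary illustration) shows that the probability of the union never exceeds the sum of the individual probabilities, and that the bound can be loose when the events overlap heavily.

```python
# Empirical check of Boole's inequality (union bound):
# P(A_1 U A_2 U ...) <= P(A_1) + P(A_2) + ...
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=1_000_000)

# Overlapping events on the same sample: A_i = {Z >= t_i}.
thresholds = [1.0, 1.5, 2.0]
events = [Z >= t for t in thresholds]

p_union = np.mean(np.any(events, axis=0))
sum_p = sum(np.mean(A) for A in events)
print(f"P(union) ~ {p_union:.4f}   sum of P(A_i) ~ {sum_p:.4f}")
```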
7.4.4. Fano's inequality
7.4.5. Cramér–Rao inequality
7.4.6. Hoeffding's inequality
This inequality provides an upper bound on the probability that a sum of bounded independent random variables deviates from its expected value.
We start with a version of the inequality for i.i.d. Bernoulli random variables.
Theorem 7.20 (Hoeffding's inequality for Bernoulli variables)

Let $X_1, \dots, X_n$ be i.i.d. Bernoulli random variables with $\mathbb{P}(X_i = 1) = p$. Denote the number of successes by

$$S_n = \sum_{i=1}^n X_i.$$

Then for any $\epsilon > 0$,

$$\mathbb{P}(S_n \geq (p + \epsilon) n) \leq e^{-2 \epsilon^2 n}$$

and

$$\mathbb{P}(S_n \leq (p - \epsilon) n) \leq e^{-2 \epsilon^2 n}.$$

The two inequalities can be summarized as

$$\mathbb{P}(|S_n - p n| \geq \epsilon n) \leq 2 e^{-2 \epsilon^2 n}.$$

The inequality states that the number of successes we see is concentrated around its mean with an exponentially small tail.
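The exponential concentration is easy to observe empirically. The following sketch, assuming NumPy, draws the number of successes directly from a binomial distribution and compares the observed two-sided tail with the bound $2 e^{-2 \epsilon^2 n}$; the values of $p$, $n$ and $\epsilon$ are illustrative.

```python
# Empirical check of Hoeffding's inequality for Bernoulli variables:
# P(|S_n - p*n| >= eps*n) <= 2 * exp(-2 * eps**2 * n).
import numpy as np

rng = np.random.default_rng(0)
p, n, eps = 0.3, 200, 0.05
trials = 100_000

S = rng.binomial(n=n, p=p, size=trials)            # S_n for each trial
empirical = np.mean(np.abs(S - p * n) >= eps * n)  # observed tail
bound = 2 * np.exp(-2 * eps**2 * n)
print(f"empirical ~ {empirical:.4f}   Hoeffding bound = {bound:.4f}")
```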
We now state the inequality for the general case of (almost surely) bounded random variables.
Theorem 7.21 (Hoeffding's inequality)

Let $X_1, \dots, X_n$ be independent random variables such that $a_i \leq X_i \leq b_i$ almost surely.

Define the empirical mean of the variables as

$$\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i.$$

Then the probability that $\bar{X}$ deviates from its mean by more than $t > 0$ is bounded as

$$\mathbb{P}\left(\bar{X} - \mathbb{E}[\bar{X}] \geq t\right) \leq \exp\left(- \frac{2 n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right)$$

and

$$\mathbb{P}\left(\bar{X} - \mathbb{E}[\bar{X}] \leq -t\right) \leq \exp\left(- \frac{2 n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$

Together, we have

$$\mathbb{P}\left(|\bar{X} - \mathbb{E}[\bar{X}]| \geq t\right) \leq 2 \exp\left(- \frac{2 n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$

Note that we don't require the variables to be identically distributed; they only need to be independent and (almost surely) bounded.

Clearly, Theorem 7.20 is the special case obtained with $a_i = 0$, $b_i = 1$ and $t = \epsilon$, since then $\bar{X} = S_n / n$ and $\mathbb{E}[\bar{X}] = p$.
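The general bound can also be checked numerically. In the sketch below (assuming NumPy), the $X_i$ are independent uniform variables on randomly chosen intervals $[a_i, b_i]$, so they are bounded but not identically distributed; the interval construction and the value of $t$ are illustrative.

```python
# Empirical check of the general Hoeffding inequality:
# P(|Xbar - E[Xbar]| >= t) <= 2*exp(-2*n**2*t**2 / sum((b_i - a_i)**2)).
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 50, 100_000, 0.1
a = rng.uniform(-1.0, 0.0, size=n)       # lower endpoints a_i
b = a + rng.uniform(0.5, 1.5, size=n)    # upper endpoints b_i

# X_i ~ Uniform[a_i, b_i], independent across i and across trials.
X = rng.uniform(a, b, size=(trials, n))
Xbar = X.mean(axis=1)
mean_Xbar = (a + b).mean() / 2           # E[Xbar] for uniform variables

empirical = np.mean(np.abs(Xbar - mean_Xbar) >= t)
bound = 2 * np.exp(-2 * n**2 * t**2 / np.sum((b - a)**2))
print(f"empirical ~ {empirical:.4f}   Hoeffding bound = {bound:.4f}")
```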
The proof of this result depends on what is known as Hoeffding’s Lemma.
Lemma 7.6 (Hoeffding's lemma)

Let $X$ be a random variable with $\mathbb{E}[X] = 0$ such that $a \leq X \leq b$ almost surely. Then for any $s \in \mathbb{R}$,

$$\mathbb{E}\left[e^{s X}\right] \leq \exp\left(\frac{s^2 (b - a)^2}{8}\right).$$
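A quick numerical sanity check of the lemma, assuming NumPy: the two-point zero-mean distribution below (taking only the values $a$ and $b$) is an illustrative choice, and the moment generating function is estimated by Monte Carlo.

```python
# Numerical check of Hoeffding's lemma:
# E[exp(s*X)] <= exp(s**2 * (b - a)**2 / 8) for zero-mean X in [a, b].
import numpy as np

rng = np.random.default_rng(0)
a, b = -1.0, 3.0
p = -a / (b - a)                          # P(X = b) = p ensures E[X] = 0
X = np.where(rng.random(1_000_000) < p, b, a)

for s in [0.2, 0.5, 1.0]:
    mgf = np.mean(np.exp(s * X))          # Monte Carlo estimate of E[exp(sX)]
    bound = np.exp(s**2 * (b - a)**2 / 8)
    print(f"s = {s:.1f}   E[exp(sX)] ~ {mgf:.4f}   bound = {bound:.4f}")
```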
7.4.7. Jensen's inequality
Jensen's inequality relates the value of a convex function of an integral to the integral of the convex function. In the context of probability theory, the inequality takes the following form.
Theorem 7.22 (Jensen's inequality)

Let $X$ be an integrable random variable and let $\varphi$ be a convex function such that $\varphi(X)$ is also integrable. Then

$$\varphi(\mathbb{E}[X]) \leq \mathbb{E}[\varphi(X)].$$

The equality holds if and only if either $X$ is almost surely constant, or $\varphi$ coincides almost surely with an affine function on the support of the distribution of $X$.
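A simple numerical illustration, assuming NumPy: with the convex function $\varphi(x) = e^x$ and a standard Gaussian $X$ (both illustrative choices), $\varphi(\mathbb{E}[X]) = 1$ while $\mathbb{E}[\varphi(X)] = e^{1/2} \approx 1.65$, so the inequality is strict.

```python
# Empirical illustration of Jensen's inequality: phi(E[X]) <= E[phi(X)].
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(loc=0.0, scale=1.0, size=1_000_000)

phi = np.exp                      # a convex function
lhs = phi(X.mean())               # phi(E[X])  ~ 1.0
rhs = phi(X).mean()               # E[phi(X)]  ~ exp(0.5) ~ 1.65
print(f"phi(E[X]) ~ {lhs:.4f}   E[phi(X)] ~ {rhs:.4f}")
```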
7.4.8. Bernstein inequalities
7.4.9. Chernoff's inequality
This is also known as the Chernoff bound.