For any random variable X and any real number a, the probability that X is larger than a goes to 0 as a grows without limit. This is because a probability distribution has total integral (or sum) 1, so the area under the probability density curve to the right of a (continuous distribution), or the sum of the probabilities of the values of X larger than a (discrete distribution), must go to 0 as a goes to infinity.
Let's now consider the special case of a r.v. X that takes only nonnegative values. Then the fall-off of the probability that X is larger than a as a grows can be quantified (more precisely, bounded from above) by a simple expression involving the expectation of X. This bound is given by Markov's inequality.
Markov's inequality states that, for a r.v. X that takes only nonnegative values, and for any positive number a:

    P(X ≥ a) ≤ E[X] / a

where E[X] is the expectation of X.
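As a quick numerical illustration (a sketch, not part of the original demonstration), we can draw samples from an exponential distribution — a nonnegative r.v. — and compare the empirical probability P(X ≥ a) with the Markov bound E[X]/a. The distribution, its mean, and the values of a are illustrative choices.

```python
import random

random.seed(0)

# Exponential r.v. with mean 2 (nonnegative), sampled 100,000 times.
n = 100_000
mean = 2.0
samples = [random.expovariate(1 / mean) for _ in range(n)]

for a in (1.0, 2.0, 5.0, 10.0):
    # Empirical estimate of P(X >= a).
    empirical = sum(x >= a for x in samples) / n
    # Markov bound E[X] / a.
    bound = mean / a
    print(f"a={a:5.1f}  P(X >= a) ~ {empirical:.4f}  bound E[X]/a = {bound:.4f}")
```

The bound is loose (for the exponential, the true probability decays like e^(-a/E[X]), much faster than E[X]/a), but it holds for every nonnegative distribution, which is its whole point.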
The inequality is of course trivial for a ≤ E[X], since the right-hand side is then at least 1.
We demonstrate Markov's inequality here.
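For a continuous nonnegative r.v. with density f, the argument behind the demonstration can be sketched in one line (the discrete case replaces the integrals with sums):

```latex
E[X] = \int_0^{\infty} x\, f(x)\, dx
     \;\ge\; \int_a^{\infty} x\, f(x)\, dx
     \;\ge\; \int_a^{\infty} a\, f(x)\, dx
     \;=\; a \, P(X \ge a)
```

Dividing both ends by a > 0 gives P(X ≥ a) ≤ E[X]/a.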
The restriction that X take only nonnegative values is removed in Chebyshev's inequality. The price to pay is that X will be required to have a finite variance.
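To illustrate the contrast (again a sketch with illustrative parameters), Chebyshev's inequality P(|X − μ| ≥ kσ) ≤ 1/k² can be checked empirically on a normal r.v., which takes negative values and so lies outside the scope of Markov's inequality:

```python
import random

random.seed(1)

# Standard normal r.v.: takes negative values, but has a finite variance,
# so Chebyshev's inequality applies.
n = 100_000
mu, sigma = 0.0, 1.0
samples = [random.gauss(mu, sigma) for _ in range(n)]

for k in (1.5, 2.0, 3.0):
    # Empirical estimate of P(|X - mu| >= k * sigma).
    empirical = sum(abs(x - mu) >= k * sigma for x in samples) / n
    # Chebyshev bound 1 / k^2.
    bound = 1 / k**2
    print(f"k={k:.1f}  P(|X-mu| >= k*sigma) ~ {empirical:.4f}  bound 1/k^2 = {bound:.4f}")
```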