# Chebyshev Inequality


**(1)** A basic inequality for monotonic sequences or functions. In the case of the finite sequences *a*_{1} ≤ *a*_{2} ≤ . . . ≤ *a _{n}* and *b*_{1} ≤ *b*_{2} ≤ . . . ≤ *b _{n}*, it has the form

*n*(*a*_{1}*b*_{1} + *a*_{2}*b*_{2} + . . . + *a _{n}b_{n}*) ≥ (*a*_{1} + *a*_{2} + . . . + *a _{n}*)(*b*_{1} + *b*_{2} + . . . + *b _{n}*)

In integral form, for functions on the interval [0, 1], we have

∫_{0}^{1} *f*(*x*)*g*(*x*) *dx* ≥ ∫_{0}^{1} *f*(*x*) *dx* · ∫_{0}^{1} *g*(*x*) *dx*

Here, *f*(*x*) ≥ 0 and *g*(*x*) ≥ 0; in addition, the functions are either both increasing or both decreasing. The inequality was derived by P. L. Chebyshev in 1882.
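The finite-sequence form can be verified numerically: for similarly ordered sequences, *n* times the sum of termwise products never falls below the product of the two sums. The sketch below (the helper name `chebyshev_sum_holds` is ours, not standard) checks this on random sorted sequences:

```python
import random

def chebyshev_sum_holds(a, b):
    """Check n * sum(a_i * b_i) >= (sum a_i) * (sum b_i) for similarly ordered a, b."""
    n = len(a)
    # Small tolerance guards against floating-point rounding near equality.
    return n * sum(x * y for x, y in zip(a, b)) >= sum(a) * sum(b) - 1e-9

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    # Sorting both sequences the same way makes them "both increasing".
    a = sorted(random.uniform(-5, 5) for _ in range(n))
    b = sorted(random.uniform(-5, 5) for _ in range(n))
    assert chebyshev_sum_holds(a, b)
```

The inequality follows from the identity *n*Σ*a _{i}b_{i}* − Σ*a _{i}* Σ*b _{i}* = ½ ΣΣ(*a _{i}* − *a _{j}*)(*b _{i}* − *b _{j}*), in which every term is nonnegative when the sequences are ordered the same way.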

**(2)** An inequality that provides an estimate of the probability that the deviation of a random variable from its mathematical expectation exceeds some given limit. Let ξ be a random variable, Eξ = *a* its mathematical expectation, and Dξ = σ^{2} its variance. The Chebyshev inequality asserts that the probability of the inequality

|ξ – *a*| ≥ *k*σ

does not exceed the quantity 1/*k*^{2}. If ξ is the sum of independent random variables and some additional restrictions are made, then the estimate 1/*k*^{2} can be replaced by the estimate 2 exp (–*k*^{2}/4), which decreases with increasing *k* much more rapidly.
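The bound 1/*k*^{2} can be illustrated by simulation. Since the Chebyshev inequality holds for any distribution, it holds in particular for the empirical distribution of a sample (using the sample's own mean and population standard deviation), so the check in this sketch (the helper name `chebyshev_bound_check` is ours) always passes:

```python
import random
import statistics

def chebyshev_bound_check(samples, k):
    """Return (empirical P(|X - mean| >= k*sigma), Chebyshev bound 1/k^2)."""
    mu = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)  # population std. dev. of the sample
    tail = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
    return tail, 1.0 / k ** 2

random.seed(1)
# A skewed (exponential) sample: the bound holds regardless of distribution.
xs = [random.expovariate(1.0) for _ in range(100_000)]
for k in (1.5, 2.0, 3.0):
    tail, bound = chebyshev_bound_check(xs, k)
    assert tail <= bound
```

For well-behaved distributions the actual tail probability is usually far below 1/*k*^{2}; the value of the inequality is that it requires nothing beyond a finite variance.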

The inequality is named for P. L. Chebyshev, who used it in 1867 to establish extremely broad conditions for the application of the law of large numbers to sums of independent random variables. (*See* LARGE NUMBERS, LAW OF; LIMIT THEOREMS.)
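The step from the inequality to the law of large numbers can be sketched in one line: for independent identically distributed variables the variance of the average shrinks like 1/*n*, so the Chebyshev bound on any fixed deviation tends to zero.

```latex
% Sketch: weak law of large numbers via the Chebyshev inequality.
% If xi_1, ..., xi_n are i.i.d. with E xi_i = a and D xi_i = sigma^2,
% the average has mean a and variance sigma^2 / n, hence for any eps > 0:
\[
  P\Bigl( \Bigl| \tfrac{1}{n}\textstyle\sum_{i=1}^{n} \xi_i - a \Bigr| \ge \varepsilon \Bigr)
  \;\le\; \frac{\sigma^2}{n\,\varepsilon^2} \;\xrightarrow[n \to \infty]{}\; 0.
\]
```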