Chebyshev's inequality



[′cheb·ə·shəfs ‚in·i′kwäl·əd·ē]
(statistics)
Given a nonnegative random variable ƒ(x) and k > 0, the probability that ƒ(x) ≥ k is at most the expected value of ƒ divided by k.
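The stated bound can be checked numerically. The sketch below draws from a standard normal and applies ƒ(x) = x² (an arbitrary illustrative choice of nonnegative function, not taken from the entry), then compares the empirical tail frequency with the bound E[ƒ]/k:

```python
import random

# Empirical check of P(f(X) >= k) <= E[f(X)] / k for a nonnegative
# function of a random variable. Here f(x) = x**2 of a standard-normal
# draw; the seed, sample size, and k are illustrative choices.
random.seed(0)
samples = [random.gauss(0.0, 1.0) ** 2 for _ in range(100_000)]

k = 4.0
empirical = sum(s >= k for s in samples) / len(samples)  # tail frequency
bound = sum(samples) / len(samples) / k                  # sample mean of f, over k

print(empirical <= bound)  # the observed tail respects the bound
```

For this choice E[ƒ] = 1, so the bound is about 0.25, while the true tail probability P(X² ≥ 4) ≈ 0.046; the inequality is valid for any distribution but often far from tight.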
References in periodicals archive
ESTIMATION FOR UPPER BOUND OF CUMULATIVE PROBABILITY USING CHEBYSHEV'S INEQUALITY
To mitigate this limitation, a method for estimating the upper bound of the cumulative probability using Chebyshev's inequality is proposed.
According to Chebyshev's inequality, for a random variable X with finite mean μ and finite variance σ², P(|X − μ| ≥ kσ) ≤ 1/k² for every k > 0.
Chebyshev's inequality indicates that no matter what the distribution function is, as long as the mean and the variance are known, an upper bound for the cumulative probability can be given by Equation 9.
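The snippets do not reproduce Equation 9 itself; the helper below is a hypothetical sketch (function name, signature, and the numeric inputs are all illustrative) of how such a distribution-free tail bound is computed from a known mean and variance:

```python
def chebyshev_tail_bound(mean: float, variance: float, threshold: float) -> float:
    """Distribution-free upper bound on P(X >= threshold) via Chebyshev's
    inequality, using only the mean and variance. Hypothetical helper;
    not the paper's Equation 9, which is not quoted in the snippets."""
    if threshold <= mean:
        return 1.0  # the inequality gives no information at or below the mean
    k = (threshold - mean) / variance ** 0.5  # distance in standard deviations
    return min(1.0, 1.0 / k ** 2)

print(chebyshev_tail_bound(mean=10.0, variance=4.0, threshold=16.0))  # → 1/9 ≈ 0.111
```

Here the threshold sits k = 3 standard deviations above the mean, so the cumulative probability beyond it is at most 1/9, regardless of the underlying distribution.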
The upper bounds of cumulative probability estimated by Chebyshev's inequality are shown in Table 13.
The proof is straightforward using Chebyshev's inequality.
Under the assumption of unimodality, Chebyshev's inequality may be sharpened (though not uniformly).
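One well-known sharpening of this kind is the Vysochanskij–Petunin inequality: for a unimodal distribution, P(|X − μ| ≥ kσ) ≤ 4/(9k²) whenever k > √(8/3) ≈ 1.63 (which is why the improvement is not uniform in k). The sketch below (helper name and interface are illustrative) compares it with the plain Chebyshev bound:

```python
import math

def two_sided_bound(k: float, unimodal: bool = False) -> float:
    """Bound on P(|X - mu| >= k*sigma). Illustrative helper: with
    unimodal=True and k > sqrt(8/3), applies the Vysochanskij-Petunin
    refinement 4/(9 k^2); otherwise falls back to Chebyshev's 1/k^2."""
    if unimodal and k > math.sqrt(8.0 / 3.0):
        return 4.0 / (9.0 * k * k)
    return min(1.0, 1.0 / (k * k))

print(two_sided_bound(2.0))                 # Chebyshev: 0.25
print(two_sided_bound(2.0, unimodal=True))  # sharper: 4/36 ≈ 0.111
```

At two standard deviations the unimodality assumption cuts the bound from 25% to about 11%; below k ≈ 1.63 the refinement does not apply and the plain Chebyshev bound is used.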