Markov inequality

[′mar‚kȯf ‚in·i′kwäl·əd·ē]
(statistics)
If x is a random variable with probability distribution P and expectation E, then, for any positive numbers a and n, P(|x| ≥ a) ≤ E(|x|^n)/a^n.
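The bound is easy to verify numerically. The following sketch (an illustration added here, not part of the dictionary definition) draws samples from an exponential distribution using NumPy and compares the empirical tail probability with the Markov bound; the choice of distribution and of the values a = 3 and n = 2 is arbitrary and assumed only for demonstration.

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)  # samples of a nonnegative random variable

a, n = 3.0, 2  # any positive threshold a and exponent n
tail_prob = np.mean(np.abs(x) >= a)             # empirical P(|x| >= a)
markov_bound = np.mean(np.abs(x) ** n) / a**n   # E(|x|^n) / a^n

print(f"P(|x| >= {a})   ~ {tail_prob:.4f}")
print(f"E(|x|^{n})/{a}^{n} ~ {markov_bound:.4f}")
assert tail_prob <= markov_bound  # the inequality holds for these samples

For the exponential distribution with mean 1, the empirical tail probability is roughly 0.05 while the bound E(|x|^2)/3^2 is roughly 0.22, showing the inequality holds, though it is often far from tight.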