Chebyshev's inequality



[′cheb·ə·shəfs ‚in·i′kwäl·əd·ē]
(statistics)
Given a nonnegative random variable ƒ(x) and k > 0, the probability that ƒ(x) ≥ k is less than or equal to the expected value of ƒ divided by k.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
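
Stated symbolically, with E[ƒ] denoting the expected value of ƒ, the definition above reads:

P(ƒ(x) ≥ k) ≤ E[ƒ]/k for any k > 0.

For instance, if E[ƒ] = 2, the probability that ƒ(x) ≥ 10 is at most 2/10 = 0.2, regardless of the underlying distribution.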
References in periodicals archive
To mitigate this limitation to some extent, a method for estimating an upper bound on the cumulative probability is proposed using Chebyshev's inequality. With this bound, we can conclude that although the cumulative probability estimated by the extrapolation method carries some uncertainty, it will not exceed the estimated upper bound.
According to Chebyshev's inequality, for a random variable X with finite mean μ and finite variance σ², the following inequality can be established for any k > 0:

P(|X − μ| ≥ kσ) ≤ 1/k²
Chebyshev's inequality indicates that no matter what the distribution function is, as long as the mean and the variance are known, an upper bound for the cumulative probability can be given by Equation 9.
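
As a quick numerical check of this distribution-free bound, here is a minimal Python sketch (not from the cited article; the rate-1 exponential distribution is an arbitrary assumption for illustration). Since P(X ≥ μ + kσ) ≤ P(|X − μ| ≥ kσ), the Chebyshev bound 1/k² also bounds the one-sided upper tail:

import math

# Assumed example: exponential distribution with rate 1, so mu = 1,
# sigma = 1, and the exact upper tail is P(X >= x) = exp(-x).
mu, sigma = 1.0, 1.0

for k in (1.5, 2.0, 3.0):
    chebyshev_bound = 1.0 / k**2              # distribution-free bound on P(|X - mu| >= k*sigma)
    exact_tail = math.exp(-(mu + k * sigma))  # exact P(X >= mu + k*sigma) for Exp(1)
    print(f"k={k}: Chebyshev bound {chebyshev_bound:.4f}, exact tail {exact_tail:.4f}")

The bound is loose here (for example, 0.25 versus roughly 0.05 at k = 2), which is the price of assuming nothing about the distribution beyond its mean and variance.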
Berry's improvement on Peddada's sufficient condition was derived using Chebyshev's inequality. Our improvement on Berry's result has been obtained via a tighter probability inequality.
Under the assumption of unimodality, Chebyshev's inequality may be sharpened (though not uniformly).
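
A well-known sharpening of this kind (not stated in the excerpt; added for reference): if X is unimodal with mean μ and standard deviation σ, the Vysochanskij–Petunin inequality gives

P(|X − μ| ≥ kσ) ≤ 4/(9k²) for all k > √(8/3) ≈ 1.63,

a 4/9-factor improvement over the Chebyshev bound 1/k², but only on that range of k; no such improvement holds for every k > 0, which is the sense in which the sharpening is not uniform.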