What if you don't know the pdf? Often only limited information about an event is available, yet we still want to bound its probability. When the pdf or pmf of a random variable is unknown, concentration inequalities let us make such predictions anyway. The Markov inequality bounds how likely the actual value is to deviate far above the expected value.
Markov Inequality: Definition. Let X be a non-negative random variable with expected value E(X), and let a > 0. Then Pr(X ≥ a) ≤ E(X)/a (the most general form). If a = k·E(X), then Pr(X ≥ k·E(X)) ≤ 1/k.
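As a quick sanity check (a sketch, not part of the original notes): simulate a non-negative random variable and compare the empirical tail probability with the Markov bound E(X)/a. The exponential distribution used here is just an illustrative choice.

```python
import random

# Empirically check Markov's inequality Pr(X >= a) <= E(X)/a
# for a non-negative random variable (an Exponential(1), mean 1).
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

mean = sum(samples) / n
for a in (2.0, 3.0, 5.0):
    empirical = sum(1 for x in samples if x >= a) / n
    bound = mean / a  # Markov bound
    print(f"a={a}: empirical tail {empirical:.4f} <= Markov bound {bound:.4f}")
```

Markov is loose here (the true tail decays exponentially, the bound only like 1/a), which motivates the sharper inequalities below.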
Simple Proof
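The slide body here was not preserved; a standard one-line proof for a discrete non-negative X runs as follows:

```latex
\[
E(X) \;=\; \sum_{x} x \,\Pr(X = x)
\;\ge\; \sum_{x \ge a} x \,\Pr(X = x)
\;\ge\; a \sum_{x \ge a} \Pr(X = x)
\;=\; a \,\Pr(X \ge a),
\]
hence \(\Pr(X \ge a) \le E(X)/a\).
```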
Chebyshev’s Inequality
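The body of this slide is missing; the standard statement, obtained by applying Markov's inequality to the non-negative variable (X − E(X))² with threshold a², is:

```latex
\[
\Pr\bigl(|X - E(X)| \ge a\bigr) \;\le\; \frac{\mathrm{Var}(X)}{a^2},
\qquad a > 0.
\]
```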
Chernoff Bound statement
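The statement itself is missing from the notes; judging from the worked example later on (which uses the exponent δ²μ/(2+δ)), the form intended is presumably the following, for X = Σᵢ Xᵢ a sum of independent Bernoulli(pᵢ) variables with μ = E(X):

```latex
\[
\Pr\bigl(X \ge (1+\delta)\mu\bigr)
\;\le\; \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\!\mu}
\;\le\; \exp\!\left(-\frac{\delta^2 \mu}{2+\delta}\right),
\qquad \delta > 0.
\]
```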
Proof
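The proof body is missing; the standard derivation begins by applying Markov's inequality to e^{tX} for a parameter t > 0:

```latex
\[
\Pr(X \ge a) \;=\; \Pr\bigl(e^{tX} \ge e^{ta}\bigr)
\;\le\; \frac{E\bigl(e^{tX}\bigr)}{e^{ta}}
\qquad \text{for every } t > 0.
\]
```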
A small result
Also,
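The two intermediate results referenced above ("a small result" and "also") are, in the standard derivation, bounds on the moment generating function; the original slide content is missing, so this reconstruction is an assumption:

```latex
\[
E\bigl(e^{tX_i}\bigr) \;=\; 1 + p_i\,(e^t - 1) \;\le\; e^{\,p_i (e^t - 1)}
\quad\text{(using } 1 + x \le e^x\text{)},
\]
and, by independence of the \(X_i\),
\[
E\bigl(e^{tX}\bigr) \;=\; \prod_i E\bigl(e^{tX_i}\bigr) \;\le\; e^{\,(e^t - 1)\mu}.
\]
```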
Minimize the upper bound
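Substituting a = (1+δ)μ gives the upper bound exp((e^t − 1)μ − t(1+δ)μ); minimizing the exponent over t > 0 is a standard calculus step (sketched here, since the slide body is missing):

```latex
\[
\frac{d}{dt}\Bigl[(e^t - 1)\mu - t(1+\delta)\mu\Bigr] = 0
\;\;\Longrightarrow\;\;
e^t = 1 + \delta
\;\;\Longrightarrow\;\;
t = \ln(1+\delta).
\]
```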
Proving the Chernoff bound
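Plugging the minimizing value t = ln(1+δ) back into the bound yields the Chernoff bound in its general form:

```latex
\[
\Pr\bigl(X \ge (1+\delta)\mu\bigr)
\;\le\; \frac{e^{(e^t-1)\mu}}{e^{t(1+\delta)\mu}}
\;=\; \frac{e^{\delta\mu}}{(1+\delta)^{(1+\delta)\mu}}
\;=\; \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\!\mu}.
\]
```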
Other used forms
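The slide's list is missing; commonly used simplified forms of the bound include:

```latex
\[
\Pr\bigl(X \ge (1+\delta)\mu\bigr) \le e^{-\delta^2\mu/3}
\quad (0 < \delta \le 1),
\qquad
\Pr\bigl(X \le (1-\delta)\mu\bigr) \le e^{-\delta^2\mu/2}
\quad (0 < \delta < 1).
\]
```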
Example: A biased coin is flipped 200 times; each flip comes up heads with probability 1/10. Using the Markov inequality and the Chernoff bound, give upper bounds on the probability that heads come up more than 120 times.
E(X) = np = 20. Markov bound: Pr(X ≥ 120) ≤ E(X)/120 = 20/120 = 1/6. Chernoff bound: writing 120 = (1+δ)E(X) with δ = 5, Pr(X ≥ (1+5)E(X)) ≤ e^(−δ²E(X)/(2+δ)) = e^(−(25·20)/7) ≈ e^(−71.4). The Chernoff bound is a far tighter upper bound than the Markov inequality.
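The arithmetic in the example can be checked directly (a small sketch; variable names are my own):

```python
import math

# Worked example: n = 200 flips, p = 1/10, threshold 120 heads.
n, p, threshold = 200, 0.1, 120
mu = n * p                        # E(X) = 20

markov = mu / threshold           # Markov: Pr(X >= 120) <= 20/120 = 1/6
delta = threshold / mu - 1        # 120 = (1 + delta) * mu  =>  delta = 5
chernoff = math.exp(-delta**2 * mu / (2 + delta))  # exp(-500/7)

print(f"Markov bound:   {markov:.4f}")
print(f"Chernoff bound: {chernoff:.3e}")
```

The Chernoff bound evaluates to roughly 10⁻³¹, astronomically smaller than the Markov bound of about 0.167.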
Applications: the probabilistic method for analysing randomized rounding algorithms, e.g., an approximation algorithm for the minimum congestion problem and an approximation algorithm for the vertex cover problem.
Pr[X_e > kC*] < 1/n³. Choose k so that it satisfies the above condition (we obtain a k-approximation).
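A step the notes leave implicit: the per-edge bound of 1/n³ is chosen so that a union bound over all edges still leaves a good success probability. Assuming the graph has m ≤ n² edges (an assumption, not stated in the notes):

```latex
\[
\Pr\Bigl[\exists\, e:\; X_e > kC^*\Bigr]
\;\le\; \sum_{e} \Pr\bigl[X_e > kC^*\bigr]
\;<\; \frac{m}{n^3} \;\le\; \frac{1}{n},
\]
so with probability at least \(1 - 1/n\), every edge has congestion at most \(kC^*\).
```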