Information Theory and Coding - HW 3

V Balakrishnan
Department of ECE, Johns Hopkins University
October 8, 2006

1 Markov inequality and Chebyshev's inequality

1.1 Markov Inequality

Let $f_X(x)$ be the probability density function of a nonnegative random variable $X$, and let $\delta > 0$. Then

$$\Pr(X \ge \delta) = \int_{\delta}^{+\infty} f_X(x)\,dx \le \int_{\delta}^{+\infty} \frac{x}{\delta}\, f_X(x)\,dx \le \int_{-\infty}^{+\infty} \frac{x}{\delta}\, f_X(x)\,dx = \frac{E(X)}{\delta}.$$

The first step follows from the definition of probability. The second step follows from the fact that $x/\delta \ge 1$ for $x \ge \delta$. The third step follows from the fact that $f_X(x) \ge 0$ for all $x$, as it is a probability density function, and $f_X(x) = 0$ for $x < 0$ since $X$ is nonnegative, so enlarging the range of integration cannot decrease the integral.

1.2 Chebyshev's inequality

Define $X = (Y - \mu)^2$, where $\mu = E(Y)$. We see that

$$E(X) = E\left[(Y - \mu)^2\right] = \sigma^2,$$

and we are required to compute

$$\Pr(|Y - \mu| > \delta) = \Pr(X > \delta^2).$$

So we have

$$\Pr(|X| > \delta^2) \le \Pr(|X| \ge \delta^2) \le \frac{E(|X|)}{\delta^2} = \frac{\sigma^2}{\delta^2}.$$

The second step follows directly from the Markov inequality and from the fact that $|X| = X$, as $X$ is defined as $(Y - \mu)^2 \ge 0$.

1.3 The weak law of large numbers

Let $Z_n = \frac{1}{n} \sum_{i=1}^{n} X_i$, where the $X_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$. Let us compute the variance of $Z_n$. Its mean is clearly $\mu$. We have it to be

$$E\left[\left(Z_n - \mu\right)^2\right] = \frac{1}{n^2} \sum_{i=1}^{n} E\left[(X_i - \mu)^2\right] = \frac{\sigma^2}{n},$$

where the cross terms vanish because the $X_i$ are independent. Applying Chebyshev's inequality to $Z_n$ then gives, for any $\epsilon > 0$,

$$\Pr\left(|Z_n - \mu| > \epsilon\right) \le \frac{\sigma^2}{n \epsilon^2} \longrightarrow 0 \quad \text{as } n \to \infty,$$

so $Z_n$ converges to $\mu$ in probability.
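As a quick numerical sanity check of the Markov bound in Section 1.1, the Python sketch below estimates $\Pr(X \ge \delta)$ by simulation and compares it against $E(X)/\delta$. This is a minimal illustration, not part of the original note: the exponential distribution, the sample size, and all variable names are assumptions chosen for the demo.

import numpy as np

# Monte Carlo check of Markov's inequality: Pr(X >= delta) <= E(X)/delta.
# Exponential samples are an arbitrary illustrative choice of nonnegative X.
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)   # nonnegative X with E(X) = 2

for delta in (1.0, 2.0, 5.0, 10.0):
    empirical = np.mean(x >= delta)   # simulated Pr(X >= delta)
    bound = x.mean() / delta          # Markov bound E(X)/delta
    print(f"delta={delta:5.1f}  Pr={empirical:.4f}  bound={bound:.4f}")

The bound is loose for small $\delta$ (it can even exceed 1), which is expected: the Markov inequality uses only the first moment of $X$.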
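The same style of check applies to Chebyshev's inequality from Section 1.2. Again, the distribution of $Y$ and all parameters below are illustrative assumptions rather than anything fixed by the derivation.

import numpy as np

# Monte Carlo check of Chebyshev's inequality:
# Pr(|Y - mu| > delta) <= sigma^2 / delta^2.
rng = np.random.default_rng(1)
mu, sigma = 3.0, 2.0
y = rng.normal(mu, sigma, size=1_000_000)   # illustrative choice of Y

for delta in (2.0, 4.0, 6.0):
    empirical = np.mean(np.abs(y - mu) > delta)   # simulated tail probability
    bound = sigma**2 / delta**2                   # Chebyshev bound
    print(f"delta={delta:.1f}  Pr={empirical:.4f}  bound={bound:.4f}")

For a Gaussian the true tails decay much faster than $1/\delta^2$, so the bound is conservative, as one would expect from a derivation that uses only the variance.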

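Finally, a sketch of the weak law in action: it tracks how $\Pr(|Z_n - \mu| > \epsilon)$ shrinks with $n$ against the Chebyshev bound $\sigma^2/(n\epsilon^2)$ from Section 1.3. The definition of $Z_n$ follows the setup above; the distribution of the $X_i$ and every parameter here are illustrative assumptions.

import numpy as np

# Weak law of large numbers: the sample mean Z_n = (1/n) * sum(X_i)
# concentrates around mu; Chebyshev bounds the deviation by sigma^2/(n*eps^2).
rng = np.random.default_rng(2)
mu, sigma, eps = 1.0, 3.0, 0.5
trials = 1_000   # independent realizations of Z_n for each n

for n in (10, 100, 1_000, 10_000):
    samples = rng.normal(mu, sigma, size=(trials, n))
    z_n = samples.mean(axis=1)                    # one sample mean per trial
    empirical = np.mean(np.abs(z_n - mu) > eps)   # simulated deviation probability
    bound = min(sigma**2 / (n * eps**2), 1.0)     # Chebyshev bound, capped at 1
    print(f"n={n:6d}  Pr={empirical:.4f}  bound={bound:.4f}")

Both columns go to zero as $n$ grows, which is exactly the convergence in probability established above.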
