# ISYE 2027


## 2.9 Markov and Chebychev inequalities and confidence intervals

The mean and variance of a random variable are two key numbers that roughly summarize the distribution of the random variable. In some applications, the mean, and possibly the variance, of a random variable is known, but the distribution of the random variable is not. The following two inequalities can still be used to provide bounds on the probabilities of certain tail events. The second of these is used to provide a confidence interval on an estimator of the parameter $p$ of a binomial distribution.

First, the Markov inequality states that if $Y$ is a nonnegative random variable, then for $c > 0$,

$$P\{Y \geq c\} \leq \frac{E[Y]}{c}.$$

To prove Markov's inequality for a discrete-type nonnegative random variable $Y$ with possible values $u_1, u_2, \ldots$, note that for each $i$, $u_i$ is bounded below by zero if $u_i < c$, and $u_i$ is bounded below by $c$ if $u_i \geq c$. Thus,

$$E[Y] = \sum_i u_i\, p_Y(u_i) \geq \sum_{i:\, u_i < c} 0 \cdot p_Y(u_i) + \sum_{i:\, u_i \geq c} c\, p_Y(u_i) = c \sum_{i:\, u_i \geq c} p_Y(u_i) = c\, P\{Y \geq c\},$$

which implies the Markov inequality. Equality...
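The inequality is easy to check numerically. The sketch below is not from the notes; it draws samples from an arbitrarily chosen nonnegative distribution (exponential with mean 2, purely for illustration) and compares the empirical tail probability $P\{Y \geq c\}$ with the Markov bound $E[Y]/c$ for a few values of $c$.

```python
import random

# Numerical check of the Markov inequality
#     P{Y >= c} <= E[Y] / c
# for a nonnegative random variable Y.
# Assumption: Y is exponential with mean 2; any nonnegative
# distribution would work equally well here.

random.seed(0)
n = 100_000
mean = 2.0
samples = [random.expovariate(1.0 / mean) for _ in range(n)]

for c in [1.0, 2.0, 4.0, 8.0]:
    empirical = sum(1 for y in samples if y >= c) / n
    bound = mean / c
    print(f"c = {c}: empirical P(Y >= c) = {empirical:.4f}, Markov bound = {bound:.4f}")
```

Note that for $c \leq E[Y]$ the bound is at least 1 and therefore vacuous; the inequality only carries information for $c$ larger than the mean.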

