Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis (Fall 2010)

Problem Set 9
Due November 22, 2010

1. Random variable X is uniformly distributed between -1.0 and 1.0. Let X_1, X_2, ..., be independent identically distributed random variables with the same distribution as X. Determine which, if any, of the following sequences (all with i = 1, 2, ...) are convergent in probability. Fully justify your answers. Include the limits if they exist.

   (a) U_i = (X_1 + X_2 + ... + X_i) / i
   (b) W_i = max(X_1, ..., X_i)
   (c) V_i = X_1 * X_2 * ... * X_i

2. Demonstrate that the Chebyshev inequality is tight; that is, for every mu, sigma > 0, and c >= sigma, construct a random variable X with mean mu and standard deviation sigma such that

   P(|X - mu| >= c) = sigma^2 / c^2

   Hint: You should be able to do this with a discrete random variable that takes on only 3 distinct values with nonzero probability.

3. Assume that a fair coin is tossed repeatedly, with the tosses being independent. We want to determine the expected number of tosses necessary to first observe a head directly followed by a tail. To do so, we define a Markov chain with states S, H, T, HT, where S is a starting state, H indicates a head on the current toss, T indicates a tail on the current toss (without a head on the previous toss), and HT indicates a head followed by a tail over the last two tosses. This Markov chain ...
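Problems 2 and 3 both lend themselves to quick numerical sanity checks. The sketch below is illustrative only, not the requested proofs: the values of `mu`, `sigma`, and `c` are arbitrary choices satisfying c >= sigma, the 3-point distribution is one construction consistent with the hint in Problem 2, and the transition probabilities are my reading of the state description in Problem 3 (fair coin, so each branch has probability 1/2).

```python
# Problem 2 check: a 3-point random variable consistent with the hint,
# placing mass at mu - c, mu, and mu + c (an assumed construction).
mu, sigma, c = 5.0, 2.0, 3.0       # arbitrary values with c >= sigma
p = sigma**2 / (2 * c**2)          # mass at each of mu - c and mu + c
xs = [mu - c, mu, mu + c]
ps = [p, 1 - 2 * p, p]
mean = sum(x * q for x, q in zip(xs, ps))
var = sum((x - mean) ** 2 * q for x, q in zip(xs, ps))
tail = sum(q for x, q in zip(xs, ps) if abs(x - mu) >= c)
assert abs(mean - mu) < 1e-12
assert abs(var - sigma**2) < 1e-12
assert abs(tail - sigma**2 / c**2) < 1e-12   # Chebyshev holds with equality

# Problem 3 check: solve the first-passage equations
#   e(s) = 1 + sum_t P(s, t) * e(t),  with e(HT) = 0,
# by fixed-point iteration. Transitions are my reading of the states.
P = {
    "S": {"H": 0.5, "T": 0.5},
    "H": {"H": 0.5, "HT": 0.5},
    "T": {"H": 0.5, "T": 0.5},
    "HT": {},                      # target state; treated as absorbing here
}
e = {s: 0.0 for s in P}
for _ in range(200):               # geometric convergence; 200 steps is ample
    e = {s: 0.0 if s == "HT" else 1.0 + sum(q * e[t] for t, q in P[s].items())
         for s in P}
print(e["S"])                      # expected tosses from the start state
```

Running the iteration, e("S") settles at the fixed point of the hitting-time equations, which is what the problem asks to derive by hand.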