Discrete-time stochastic processes


CHAPTER 1. INTRODUCTION AND REVIEW OF PROBABILITY

a) Show that, for uncorrelated rv's, the expected value of the product is equal to the product of the expected values (by definition, $X$ and $Y$ are uncorrelated if $E[(X - \bar{X})(Y - \bar{Y})] = 0$).

b) Show that if $X$ and $Y$ are uncorrelated, then the variance of $X + Y$ is equal to the variance of $X$ plus the variance of $Y$.

c) Show that if $X_1, \ldots, X_n$ are uncorrelated, then the variance of the sum is equal to the sum of the variances.

d) Show that independent rv's are uncorrelated.

e) Let $X, Y$ be identically distributed ternary-valued random variables with the PMF $p_X(-1) = p_X(1) = 1/4$; $p_X(0) = 1/2$. Find a simple joint probability assignment such that $X$ and $Y$ are uncorrelated but dependent.

f) You have seen that the moment generating function of a sum of independent rv's is equal to the product of the individual moment generating functions. Give an example where this is false if the variables are uncorrelated but dependent.

Exercise 1.18. Suppose $X$ has the Poisson PMF, $p_X(n) = \lambda^n \exp(-\lambda)/n!$ for $n \geq 0$, and $Y$ has the Poisson PMF, $p_Y(m) = \mu^m \exp(-\mu)/m!$ for $m \geq 0$. Find the distribution of $Z = X + Y$ and find the conditional distribution of $Y$ conditional on $Z = n$.

Exercise 1.19. a) Suppose $X$, $Y$ and $Z$ are binary rv's, each taking on the value 0 with probability 1/2 and the value 1 with probability 1/2. Find a simple example in which $X$, $Y$, $Z$ are statistically dependent but are pairwise statistically independent (i.e., $X$, $Y$ are statistically independent, $X$, $Z$ are statistically independent, and $Y$, $Z$ are statistically independent). Give $p_{XYZ}(x, y, z)$ for your example. Hint: In the simplest example, only 4 of the joint values for $x, y, z$ have positive probabilities.

b) Is pairwise statistical independence enough to ensure that
$$E\left[\prod_{i=1}^{n} X_i\right] = \prod_{i=1}^{n} E[X_i]$$
for a set of rv's $X_1, \ldots, X_n$?

Exercise 1.20. Show that $E[X]$ is the value of $z$ that minimizes $E[(X - z)^2]$.

Exercise 1.21.
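Not part of the original exercises: a quick numerical sanity check for Exercise 1.18, written by us in Python. It assumes the standard facts the exercise asks you to derive (that $Z = X + Y$ is again Poisson with parameter $\lambda + \mu$, and that $Y$ given $Z = n$ is binomial with success probability $\mu/(\lambda + \mu)$), so treat it as a way to verify your answer rather than as a solution.

```python
import math

def poisson_pmf(lam, n):
    # p(n) = lam^n * exp(-lam) / n!
    return lam ** n * math.exp(-lam) / math.factorial(n)

lam, mu = 1.3, 2.1  # arbitrary illustrative parameters

def z_pmf(k):
    # PMF of Z = X + Y by discrete convolution of the two Poisson PMFs
    return sum(poisson_pmf(lam, j) * poisson_pmf(mu, k - j) for j in range(k + 1))

# Z = X + Y should again be Poisson, with parameter lam + mu.
for k in range(20):
    assert abs(z_pmf(k) - poisson_pmf(lam + mu, k)) < 1e-12

# Conditional on Z = n, Y should be binomial(n, mu / (lam + mu)).
n = 7
for m_ in range(n + 1):
    cond = poisson_pmf(lam, n - m_) * poisson_pmf(mu, m_) / poisson_pmf(lam + mu, n)
    binom = math.comb(n, m_) * (mu / (lam + mu)) ** m_ * (lam / (lam + mu)) ** (n - m_)
    assert abs(cond - binom) < 1e-12
```

The convolution check works because the event $\{Z = k\}$ partitions over the disjoint events $\{X = j, Y = k - j\}$, which factor by independence.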
A computer system has $n$ users, each with a unique name and password. Due to a software error, the $n$ passwords are randomly permuted internally (i.e., each of the $n!$ possible permutations is equally likely). Only those users lucky enough to have had their passwords unchanged in the permutation are able to continue using the system.

a) What is the probability that a particular user, say user 1, is able to continue using the system?

b) What is the expected number of users able to continue using the system? Hint: Let $X_i$ be a rv with the value 1 if user $i$ can use the system and 0 otherwise.

Exercise 1.22. Suppose the rv $X$ is continuous and has the distribution function $F_X(x)$. Consider another rv $Y = F_X(X)$. That is, for any sample point $\alpha$ such that $X(\alpha) = x$, we have $Y(\alpha) = F_X(x)$. Show that $Y$ is uniformly distributed in the interval 0 to 1.

1.8. EXERCISES

Exercise 1.23. Let $Z$ be an integer-valued rv with the PMF $p_Z(n) = 1/k$ for $0 \leq n \leq k - 1$. Find the mean, variance, and moment generating function of $Z$. Hint: An elegant way to do this is to let $U$ be a uniformly distributed continuous rv over $(0, 1]$ that is independent of $Z$. Then $U + Z$ is uniform over $(0, k]$. Use the known results about $U$ and $U + Z$ to find the mean, variance, and mgf for $Z$.

Exercise 1.24. Let $\{X_n;\ n \geq 1\}$ be a sequence of independent but not identically distributed rv's. We say that the weak law of large numbers holds for this sequence if for all $\epsilon > 0$
$$\lim_{n \to \infty} \Pr\left\{\left|\frac{S_n}{n} - \frac{E[S_n]}{n}\right| \geq \epsilon\right\} = 0 \qquad \text{(a)}$$
where $S_n = X_1 + X_2 + \cdots + X_n$.

a) Show that (a) holds if there is some constant $A$ such that $\text{VAR}(X_n) \leq A$ for all $n$.

b) Suppose that $\text{VAR}(X_n) \leq A\, n^{1-\alpha}$ for some $\alpha < 1$ and for all $n$. Show that (a) holds in this case.

Exercise 1.25. Let $\{X_i;\ i \geq 1\}$ be IID binary rv's. Let $\Pr\{X_i = 1\} = \delta$, $\Pr\{X_i = 0\} = 1 - \delta$. Let $S_n = X_1 + \cdots + X_n$. Let $m$ be an arbitrary but fixed positive integer. Think!
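Not in the original text: Exercise 1.21 can be checked by brute force for small $n$, since all $n!$ permutations can be enumerated directly. The sketch below (ours) counts fixed points of each permutation; the exercise itself asks for a closed-form argument via the indicator rv's $X_i$.

```python
from fractions import Fraction
from itertools import permutations

def fixed_point_stats(n):
    """Brute-force over all n! permutations of the password table."""
    perms = list(permutations(range(n)))
    # a) probability that a particular user (index 0) keeps their password
    p_user0 = Fraction(sum(p[0] == 0 for p in perms), len(perms))
    # b) expected number of users who keep their passwords (fixed points)
    e_fixed = Fraction(sum(sum(p[i] == i for i in range(n)) for p in perms),
                       len(perms))
    return p_user0, e_fixed

for n in (3, 4, 5):
    p_user0, e_fixed = fixed_point_stats(n)
    assert p_user0 == Fraction(1, n)  # each user survives with probability 1/n
    assert e_fixed == 1               # expected number of survivors is 1, for every n
```

Using `Fraction` keeps the answers exact, which makes the pattern across different $n$ unmistakable.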
then evaluate the following and explain your answers:

a) $\lim_{n \to \infty} \sum_{i:\, n\delta - m \leq i \leq n\delta + m} \Pr\{S_n = i\}$

b) $\lim_{n \to \infty} \sum_{i:\, 0 \leq i \leq n\delta + m} \Pr\{S_n = i\}$

c) $\lim_{n \to \infty} \sum_{i:\, n(\delta - 1/m) \leq i \leq n(\delta + 1/m)} \Pr\{S_n = i\}$

Exercise 1.26. (Details in the proof of Theorem 1.3)

a) Show that if a sequence $\{X_i;\ i \geq 0\}$ is IID, then the truncated versions $\breve{X}_i$ are also IID.

b) Show that each $\breve{X}_i$ has a finite mean $E[\breve{X}]$ and finite variance $\sigma_{\breve{X}}^2$. Show that the variance is upper bounded by the second moment around the original mean $\bar{X}$, i.e., show that $\sigma_{\breve{X}}^2 \leq E\big[|X - E[X]|^2\big]$.

c) Assume that $\breve{X}_i$ is $X_i$ truncated to $\bar{X} \pm b$. Show that $\sigma_{\breve{X}}^2 \leq 2b\, E[|X|]$. Hint: The fact that $|A - B| \leq |A| + |B|$ for any numbers or rv's might be helpful.

d) Let $\breve{S}_n = \breve{X}_1 + \cdots + \breve{X}_n$ and show that for any $\epsilon > 0$,
$$\Pr\left\{\left|\frac{\breve{S}_n}{n} - E[\breve{X}]\right| \geq \frac{\epsilon}{2}\right\} \leq \frac{8 b E[|X|]}{n \epsilon^2}.$$

e) Use (1.24) to show that, for sufficiently large $b$, $|E[\breve{X}] - E[X]| \leq \epsilon/2$, and use this to show that ...
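Not in the original text: a small Python experiment (ours) that suggests the limiting behavior in Exercise 1.25 before you prove it. The choices $\delta = 0.3$ and $m = 2$ are arbitrary; the binomial PMF is evaluated in log space so it does not underflow for large $n$.

```python
import math

def binom_pmf(n, i, delta):
    # Evaluate C(n, i) * delta^i * (1 - delta)^(n - i) in log space,
    # so the result does not underflow for large n.
    log_p = (math.lgamma(n + 1) - math.lgamma(i + 1) - math.lgamma(n - i + 1)
             + i * math.log(delta) + (n - i) * math.log(1 - delta))
    return math.exp(log_p)

def window_prob(n, delta, lo, hi):
    # Pr{lo <= S_n <= hi} where S_n is binomial(n, delta)
    lo_i = max(0, math.ceil(lo))
    hi_i = min(n, math.floor(hi))
    return sum(binom_pmf(n, i, delta) for i in range(lo_i, hi_i + 1))

delta, m = 0.3, 2  # arbitrary illustrative choices

part_a = [window_prob(n, delta, n * delta - m, n * delta + m)
          for n in (100, 1_000, 10_000)]
part_c = [window_prob(n, delta, n * (delta - 1 / m), n * (delta + 1 / m))
          for n in (100, 1_000, 10_000)]

# The fixed-width window around n*delta captures less and less probability
# as n grows (its width stays 2m + 1 while the spread of S_n grows like sqrt(n))...
assert part_a[0] > part_a[1] > part_a[2]

# ...while the window whose width grows linearly in n captures essentially all of it.
assert all(p > 0.999 for p in part_c)
```

This is the same contrast the weak law of large numbers formalizes: $S_n/n$ concentrates near $\delta$, but $S_n$ itself does not concentrate on any fixed set of integers.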