2. Two independent uniform random variables. Let X and Y be independently and uniformly drawn from the interval [0, 1].

(a) Find the pdf of U = max(X, Y).
(b) Find the pdf of V = min(X, Y).
(c) Find the pdf of W = UV. (Hint: in case you are stuck, you can start working on part (d) first.)
(d) Find the probability P{XY ≥ 1/2}.

3. One-bit quantization of Gaussian sources. Let X ~ N(0, 1) and let

Y = 1 if X ≥ 0, and Y = −1 otherwise.

Thus Y encodes the sign of X.

(a) Find the pmf of Y.
(b) What is the conditional pdf of X given the observation that X is nonnegative? In other words, find f_{X|Y}(x | 1).
(c) Find the minimum MSE (mean squared error) estimator of X given Y. That is, find the estimator g(y) that minimizes the MSE E[(X − g(Y))^2].
(d) What is the associated MSE?
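As a sanity check on these problems, the quantities in 2(d) and 3(c) are easy to estimate by simulation. The sketch below is a minimal Monte Carlo check (it assumes the standard closed-form answers P{XY ≥ 1/2} = 1/2 − (ln 2)/2 and E[X | X ≥ 0] = √(2/π) only inside comments, for comparison; it is not the analytic derivation the problems ask for):

```python
# Monte Carlo sanity checks for problems 2(d) and 3(c).
import math
import random

random.seed(0)
N = 200_000

# Problem 2(d): P{XY >= 1/2} for independent X, Y ~ Uniform[0, 1].
# Note that W = UV = max(X, Y) * min(X, Y) = XY, so this also relates to part (c).
hits = sum(1 for _ in range(N) if random.random() * random.random() >= 0.5)
p_hat = hits / N
# Analytic value: integral from 1/2 to 1 of (1 - 1/(2x)) dx = 1/2 - (ln 2)/2, about 0.1534.
print(f"P(XY >= 1/2) estimate: {p_hat:.4f}")

# Problem 3(c): the MMSE estimator given Y = 1 is g(1) = E[X | X >= 0] for X ~ N(0, 1).
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
pos = [x for x in xs if x >= 0.0]
g1_hat = sum(pos) / len(pos)
# The mean of a standard half-normal is sqrt(2/pi), about 0.7979; by symmetry g(-1) = -g(1).
print(f"E[X | X >= 0] estimate: {g1_hat:.4f}")
```

With 200,000 samples the standard error of each estimate is well under 0.01, so the printed values should sit close to the analytic constants noted in the comments.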
This note was uploaded on 07/07/2011 for the course EECS 153 taught by Professor Kim during the Spring '11 term at UCSD.