EE464 — Chugg, Spring 1994 — Final Exam Solution (K.M. Chugg, May 3, 1994)

1 Basic Problems (60 points)

(1a) (10 points) In general $P(A \cup B) = P(A) + P(B) - P(A \cap B)$, so that $P(A \cup B) = P(A) + P(B)$ if and only if $P(A \cap B) = 0$. A sufficient condition is that $A$ and $B$ are mutually exclusive events (i.e., $A \cap B = \emptyset$).

The cumulative distribution function (cdf) of $X(u)$ is defined as
$$F_{X(u)}(z) = \Pr\{X(u) \le z\}.$$
The probability density function (pdf) of $X(u)$ is related to the cdf by
$$f_{X(u)}(z) = \frac{d}{dz} F_{X(u)}(z).$$

(1b) (10 points) These are all basic definitions:
$$m_X = \int_{-\infty}^{\infty} x f_{X(u)}(x)\,dx$$
$$\sigma_X^2 = \int_{-\infty}^{\infty} (x - m_X)^2 f_{X(u)}(x)\,dx = \int_{-\infty}^{\infty} x^2 f_{X(u)}(x)\,dx - m_X^2$$
$$\Pr\{a < X(u) \le b\} = \int_a^b f_{X(u)}(x)\,dx$$

(1c) (10 points) $X(u)$ is uniform between $a = 1$ and $b = 4$:
$$m_X = \frac{a+b}{2} = 5/2$$
$$\sigma_X^2 = \frac{(b-a)^2}{12} = 3/4$$
$$\Pr\{X(u) \in (0, 2]\} = \int_0^2 f_{X(u)}(x)\,dx = \int_1^2 \frac{dx}{3} = 1/3.$$

(1d) (10 points) We have discussed many special properties of Gaussian random variables; here are a few:

- The complete statistical description (i.e., the pdf) depends only on the second-moment description (i.e., means, variances, and covariances).
- A linear combination of jointly Gaussian random variables produces jointly Gaussian random variables.
- Two jointly Gaussian random variables are independent if and only if they are uncorrelated.
- If $X(u)$ and $Y(u)$ are jointly Gaussian, then $Y(u)$ is Gaussian conditioned on the value of $X(u)$, and vice versa. The conditional mean is an affine function:
  $$m_{Y|X}(x) = m_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - m_X).$$
  The conditional variance of $Y(u)$ given $X(u) = x$ is not a function of $x$.
- The distribution of a normalized sum of independent, identically distributed random variables tends to Gaussian as the number of random variables in the sum tends to infinity.
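The closed-form answers in (1c) can be double-checked by simulation. The following Python sketch is not part of the original solution; the sample size and seed are arbitrary illustrative choices. It draws from Uniform(1, 4) and estimates the mean, variance, and $\Pr\{X \in (0,2]\}$, which should land near $5/2$, $3/4$, and $1/3$.

```python
import random

# Illustrative sketch (not from the exam): Monte Carlo check of part (1c).
a, b = 1.0, 4.0
n_samples = 200_000          # arbitrary sample size
random.seed(0)               # fixed seed for reproducibility
samples = [random.uniform(a, b) for _ in range(n_samples)]

# Sample mean, sample variance, and empirical probability of (0, 2].
mean_est = sum(samples) / n_samples
var_est = sum((x - mean_est) ** 2 for x in samples) / n_samples
prob_est = sum(1 for x in samples if 0 < x <= 2) / n_samples

print(mean_est)   # close to (a + b)/2 = 2.5
print(var_est)    # close to (b - a)^2 / 12 = 0.75
print(prob_est)   # close to 1/3
```

Since the density is $1/3$ on $[1,4]$ and zero below $1$, the interval $(0,2]$ only collects mass on $(1,2]$, which is why the empirical estimate matches $1/3$ rather than $2/3$.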
This is the Central Limit Theorem, or, as we studied it, the approximation to the Binomial distribution.

(1e) (10 points)

- FALSE: In general, if two random variables are uncorrelated, they are not necessarily independent.
- TRUE: If $X(u)$ and $Y(u)$ are independent, then $W(u) = [X(u)]^3$ and $Z(u) = \cos(2\pi Y(u))$ are also independent, and thus uncorrelated.
- FALSE: The second-moment description does not determine the complete statistical description.
- FALSE: The joint behavior of $X(u)$ and $Y(u)$ is not determined by the marginal densities, but rather by the joint density.
- FALSE: This is seen by Jensen's inequality, a special case of which ensures that the variance is nonnegative: $\sigma_X^2 = E\{[X(u)]^2\} - m_X^2 \ge 0$.
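The first FALSE statement (uncorrelated does not imply independent) has a standard counterexample, sketched below in Python. This example is not from the original exam: take $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$. Then $\mathrm{Cov}(X, Y) = E[X^3] - E[X]E[X^2] = 0$, yet $Y$ is a deterministic function of $X$, so the two are clearly dependent.

```python
# Illustrative sketch (not from the exam): uncorrelated but dependent.
# X is uniform on {-1, 0, 1}; Y = X^2.
xs = [-1, 0, 1]
p = 1 / 3                                 # each outcome equally likely

E_X = sum(p * x for x in xs)              # E[X]   = 0
E_Y = sum(p * x**2 for x in xs)           # E[X^2] = 2/3
E_XY = sum(p * x * x**2 for x in xs)      # E[X^3] = 0
cov = E_XY - E_X * E_Y

print(cov)  # 0.0 -> X and Y are uncorrelated

# Yet they are not independent: Pr{Y = 0 | X = 0} = 1, while Pr{Y = 0} = 1/3,
# so conditioning on X changes the distribution of Y.
```

This is exactly the asymmetry the exam statement tests: independence always implies zero correlation, but zero correlation recovers independence only in special cases such as the jointly Gaussian one listed in (1d).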