Purdue University: ECE438 - Digital Signal Processing with Applications

ECE438 - Laboratory 7: Discrete-Time Random Processes (Week 2)
October 6, 2010

1 Bivariate Distributions

In this section, we will study the concept of a bivariate distribution. We will see that bivariate distributions characterize how two random variables are related to each other. We will also see that correlation and covariance are two simple measures of the dependencies between random variables, which can be very useful for analyzing both random variables and random processes.

1.1 Background on Bivariate Distributions

Sometimes we need to account for not just one random variable, but several. In this section, we will examine the case of two random variables (the so-called bivariate case), but the theory is easily generalized to accommodate more than two.

The random variables X and Y have cumulative distribution functions (CDFs) F_X(x) and F_Y(y), also known as marginal CDFs. Since there may be an interaction between X and Y, the marginal statistics may not fully describe their behavior. Therefore we define a bivariate, or joint, CDF as

    F_{X,Y}(x,y) = P(X \le x, Y \le y) .    (1)

If the joint CDF is sufficiently smooth, we can define a joint probability density function,

    f_{X,Y}(x,y) = \frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x,y) .    (2)

Conversely, the joint probability density function may be used to calculate the joint CDF:

    F_{X,Y}(x,y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(s,t) \, ds \, dt .    (3)

Questions or comments concerning this laboratory should be directed to Prof. Charles A. Bouman, School of Electrical and Computer Engineering, Purdue University, West Lafayette IN 47907; (765) 494-0340; email@example.com

The random variables X and Y are said to be independent if and only if their joint CDF (or PDF) is a separable function, which means

    f_{X,Y}(x,y) = f_X(x) f_Y(y) .
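As an illustrative aside (not part of the original handout), the separability of the joint CDF under independence can be checked numerically. The sketch below, using NumPy and arbitrarily chosen evaluation points, estimates the joint CDF from samples of two independent uniform random variables and compares it to the product of the marginal CDFs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent Uniform(0,1) random variables.
x = rng.random(n)
y = rng.random(n)

def empirical_joint_cdf(x, y, a, b):
    """Estimate F_{X,Y}(a, b) = P(X <= a, Y <= b) from samples."""
    return np.mean((x <= a) & (y <= b))

# Evaluation point (a, b) chosen arbitrarily for illustration.
a, b = 0.3, 0.7
joint = empirical_joint_cdf(x, y, a, b)
marginal_product = np.mean(x <= a) * np.mean(y <= b)

# For independent X and Y the joint CDF factors:
# F_{X,Y}(a, b) = F_X(a) F_Y(b) = 0.3 * 0.7 = 0.21 here.
print(joint, marginal_product)
```

With enough samples, both estimates agree closely with the theoretical value 0.21, consistent with the factorization in equation (4) below.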
(4)

Informally, independence between random variables means that one random variable does not tell you anything about the other. As a consequence of the definition, if X and Y are independent, then the expectation of their product is the product of their expectations:

    E[XY] = E[X] E[Y] .    (5)

While the joint distribution contains all the information about X and Y, it can be very complex and is often difficult to calculate. In many applications, a simple measure of the dependencies of X and Y can be very useful. Three such measures are the correlation, the covariance, and the correlation coefficient.

Correlation

    E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy \, f_{X,Y}(x,y) \, dx \, dy    (6)

Covariance

    E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) f_{X,Y}(x,y) \, dx \, dy    (7)

Correlation coefficient

    \rho_{XY} = \frac{E[(X - \mu_X)(Y - \mu_Y)]}{\sigma_X \sigma_Y} = \frac{E[XY] - \mu_X \mu_Y}{\sigma_X \sigma_Y}    (8)

If the correlation coefficient is 0, then...
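The three measures above are easy to estimate from samples. As a minimal sketch (not part of the original handout; the construction of the correlated pair is an assumption chosen for illustration), the following builds two jointly Gaussian variables with a known correlation of 0.5 and computes sample versions of the correlation, covariance, and correlation coefficient:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# Construct correlated variables: X ~ N(0,1) and
# Y = 0.5*X + sqrt(1 - 0.25)*W with W ~ N(0,1) independent of X,
# so that Var(Y) = 1 and E[XY] = 0.5 by construction.
x = rng.standard_normal(n)
y = 0.5 * x + np.sqrt(1.0 - 0.25) * rng.standard_normal(n)

correlation = np.mean(x * y)                            # sample E[XY], cf. eq. (6)
covariance = np.mean((x - x.mean()) * (y - y.mean()))   # cf. eq. (7)
rho = covariance / (x.std() * y.std())                  # cf. eq. (8)

print(correlation, covariance, rho)
```

Because both variables here are zero-mean with unit variance, the correlation, covariance, and correlation coefficient all estimate the same value, approximately 0.5; for variables with nonzero means or unequal variances the three quantities differ, which is exactly the distinction equations (6)-(8) draw.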