Classnotes 8: The Joint Probability Distribution for a Pair of Random Variables

The Joint Probability Distribution for a Pair of Discrete Random Variables

A Joint Probability Distribution Example (Example 5-1)

[Joint pmf table from the slide not recovered; x = number of bars of signal strength, y = number of times the city name is stated.]

The Marginal Probability Distribution

A Marginal Probability Distribution Example (Example 5-2)

• The individual probability distribution of a random variable is referred to as its marginal probability distribution.
• In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables.
• For the case of two discrete random variables:

    fX(x) = Σ_y fXY(x, y)    and    fY(y) = Σ_x fXY(x, y)

The Conditional Probability Distribution

    fY|x(y) = fXY(x, y) / fX(x)    for fX(x) > 0

Properties of Conditional pmfs

• Given this definition, verify that the joint pmf in Example 5-1 leads to the following conditional pmfs (x = number of bars of signal strength, y = number of times the city name is stated):

  fY|x(y):
             x = 1     x = 2     x = 3
    y = 4    0.7500    0.4000    0.0909
    y = 3    0.1000    0.4000    0.0909
    y = 2    0.1000    0.1200    0.3636
    y = 1    0.0500    0.0800    0.4545

  fX|y(x):
             x = 1     x = 2     x = 3
    y = 4    0.5000    0.3333    0.1667
    y = 3    0.1176    0.5882    0.2941
    y = 2    0.0800    0.1200    0.8000
    y = 1    0.0357    0.0714    0.8929

An Example of Conditional Mean and Variance

• Given the formulae on the previous slide and the conditional pmfs above, verify that the conditional means and conditional variances below are correct:

  From fY|x(y):
             conditional mean    conditional variance
    x = 1        3.5500               0.7475
    x = 2        3.1200               0.8256
    x = 3        1.8182               0.8760

  From fX|y(x):
             conditional mean    conditional variance
    y = 4        1.6667               0.5556
    y = 3        2.1765               0.3806
    y = 2        2.7200               0.3616
    y = 1        2.8571               0.1939

Independence of Two Discrete Random Variables
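The conditional pmfs and conditional means/variances above can be checked numerically. The sketch below is a minimal Python verification; the joint pmf values are reconstructed from the conditional tables of Example 5-1 (they reproduce every conditional entry shown), and the function names are illustrative, not from the original slides.

```python
# Verify the Example 5-1 conditional pmfs and conditional mean/variance.
# x = number of bars of signal strength, y = number of times the city name
# is stated. Joint pmf reconstructed from the conditional tables above.
joint = {
    (1, 1): 0.01, (1, 2): 0.02, (1, 3): 0.02, (1, 4): 0.15,
    (2, 1): 0.02, (2, 2): 0.03, (2, 3): 0.10, (2, 4): 0.10,
    (3, 1): 0.25, (3, 2): 0.20, (3, 3): 0.05, (3, 4): 0.05,
}

def marginal_x(x):
    """fX(x) = sum of the joint pmf over all y."""
    return sum(p for (xi, yi), p in joint.items() if xi == x)

def cond_y_given_x(y, x):
    """fY|x(y) = fXY(x, y) / fX(x)."""
    return joint[(x, y)] / marginal_x(x)

def cond_mean_var_y_given_x(x):
    """Conditional mean and variance of Y given X = x."""
    mean = sum(y * cond_y_given_x(y, x) for y in range(1, 5))
    var = sum(y**2 * cond_y_given_x(y, x) for y in range(1, 5)) - mean**2
    return mean, var

print(round(cond_y_given_x(4, 1), 4))          # 0.75, matching the table
mean, var = cond_mean_var_y_given_x(1)
print(round(mean, 4), round(var, 4))           # 3.55 0.7475
```

The same pattern (swap the roles of x and y) reproduces the fX|y table and its conditional means and variances.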
An Example of Independence (Example 5-6)

• The example on the previous page could also be expressed in tabular form as follows (x = colour conforms, y = length conforms):

  fXY(x, y):
             x = 0     x = 1     fY(y)
    y = 1    0.0098    0.9702    0.98
    y = 0    0.0002    0.0198    0.02
    fX(x)    0.01      0.99

  fY|x(y):
                          x = 0     x = 1
    y = 1                 0.9800    0.9800
    y = 0                 0.0200    0.0200
    conditional mean      0.9800    0.9800
    conditional variance  0.0196    0.0196

  fX|y(x):
             x = 0     x = 1    conditional mean    conditional variance
    y = 1    0.0100    0.9900       0.9900               0.0099
    y = 0    0.0100    0.9900       0.9900               0.0099

• Every conditional pmf equals the corresponding marginal pmf, so X and Y are independent.

The Joint Probability Distribution for a Pair of Continuous Random Variables

The Multinomial Distribution

A Joint Probability Distribution Example (Example 5-12)

The Marginal Probability Distribution

• The individual probability distribution of a random variable is referred to as its marginal probability distribution.
• In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables.
• For the case of two continuous random variables:

    fX(x) = ∫ fXY(x, y) dy    and    fY(y) = ∫ fXY(x, y) dx

The Conditional Probability Distribution

An Example of Conditional Probability (Example 5-14)

Conditional Mean and Variance

Independence of Two Continuous Random Variables

An Example of Independence (Example 5-18)

The Covariance Between Two Random Variables

• Before we define covariance we need to determine the expected value of a function of two random variables:

    E[h(X, Y)] = Σ_x Σ_y h(x, y) fXY(x, y)    (discrete case)

• Now we are ready to define covariance:

    cov(X, Y) = σXY = E[(X − μX)(Y − μY)] = E[XY] − μX μY

An Example of Covariance for Discrete Random Variables

  fXY(x, y):
             x = 1     x = 3     fY(y)
    y = 3    0         0.30      0.30
    y = 2    0.20      0.20      0.40
    y = 1    0.10      0.20      0.30
    fX(x)    0.30      0.70

The Correlation Between Two Random Variables

What Does Covariance Signify?
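The independence check in the colour/length example amounts to confirming that every joint probability factors into the product of the marginals. A minimal sketch, using the table values above (variable names are illustrative):

```python
# Independence check for the colour/length example: X and Y are
# independent iff fXY(x, y) = fX(x) * fY(y) for every cell.
joint = {(0, 0): 0.0002, (0, 1): 0.0098,
         (1, 0): 0.0198, (1, 1): 0.9702}

# Marginals by summing the joint pmf over the other variable.
fx = {x: sum(p for (xi, yi), p in joint.items() if xi == x) for x in (0, 1)}
fy = {y: sum(p for (xi, yi), p in joint.items() if yi == y) for y in (0, 1)}

independent = all(abs(joint[(x, y)] - fx[x] * fy[y]) < 1e-9
                  for x in (0, 1) for y in (0, 1))
print(fx, fy, independent)   # fX = {0: 0.01, 1: 0.99}, fY = {0: 0.02, 1: 0.98}, True
```

The tolerance `1e-9` guards against floating-point round-off; with exact arithmetic the factorization holds exactly here.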
• Covariance is a measure of the strength of the linear relationship between two random variables. If the relationship is nonlinear, the covariance may not be useful. For example, in Fig. 5-13(d) there is definitely a relationship between the variables, yet it goes undetected by the covariance.

A Correlation (and Covariance) Example (Example 5-26)

  fXY(x, y):
             x = 0     x = 1     x = 2     x = 3     fY(y)
    y = 3    —         —         —         0.40      0.40
    y = 2    —         0.10      0.10      —         0.20
    y = 1    —         0.10      0.10      —         0.20
    y = 0    0.20      —         —         —         0.20
    fX(x)    0.20      0.20      0.20      0.40

The Covariance and Correlation Between Independent Variables

• If X and Y are independent, then σXY = 0 and ρXY = 0.
• If X and Y are exactly linearly related (i.e. Y = aX + b for constants a and b), then the correlation ρXY is either +1 or −1, with the same sign as the constant a:
  – If a is positive, ρXY = +1
  – If a is negative, ρXY = −1

An Example (Covariance Between Independent Variables) (Example 5-26)

The Bivariate Normal Distribution

Linear Combinations of Random Variables

A Linear Combination of Random Variables Example (Example 5-33)
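The covariance and correlation for the Example 5-26 table can be computed directly from the definitions above, cov(X, Y) = E[XY] − E[X]E[Y] and ρXY = cov(X, Y) / (σX σY). A minimal sketch (blank table cells are probability 0 and are simply omitted; `expect` is an illustrative helper, not from the slides):

```python
# Covariance and correlation for the Example 5-26 joint pmf.
from math import sqrt

joint = {(0, 0): 0.2, (1, 1): 0.1, (1, 2): 0.1,
         (2, 1): 0.1, (2, 2): 0.1, (3, 3): 0.4}

def expect(g):
    """E[g(X, Y)] = sum of g(x, y) * fXY(x, y) over all cells."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

ex, ey = expect(lambda x, y: x), expect(lambda x, y: y)
cov = expect(lambda x, y: x * y) - ex * ey          # E[XY] - E[X]E[Y]
sdx = sqrt(expect(lambda x, y: x * x) - ex**2)      # sigma_X
sdy = sqrt(expect(lambda x, y: y * y) - ey**2)      # sigma_Y
rho = cov / (sdx * sdy)
print(round(cov, 4), round(rho, 4))   # 1.26 0.9265
```

The large positive correlation (about 0.93) reflects the near-linear relationship visible in the table: probability mass sits close to the diagonal y = x.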