
# Classnotes_9: Joint Probability Distributions


## 2. The Joint Probability Distribution for a Pair of Discrete Random Variables

## 3. A Joint Probability Distribution Example (Example 5-1)

## 4. The Marginal Probability Distribution

- The individual probability distribution of a random variable is referred to as its marginal probability distribution.
- In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables.
- For the case of two discrete random variables: fX(x) = Σy fXY(x, y) and fY(y) = Σx fXY(x, y).

## 5. A Marginal Probability Distribution Example (Example 5-2)

## 6. The Conditional Probability Distribution

- Given this definition, verify that the joint pmf in Example 5-1 leads to the following conditional pmfs, where x is the number of bars of signal strength and y is the number of times the city name is stated.

fY|x(y):

| y \ x | 1      | 2      | 3      |
|-------|--------|--------|--------|
| 4     | 0.7500 | 0.4000 | 0.0909 |
| 3     | 0.1000 | 0.4000 | 0.0909 |
| 2     | 0.1000 | 0.1200 | 0.3636 |
| 1     | 0.0500 | 0.0800 | 0.4545 |

fX|y(x):

| y \ x | 1      | 2      | 3      |
|-------|--------|--------|--------|
| 4     | 0.5000 | 0.3333 | 0.1667 |
| 3     | 0.1176 | 0.5882 | 0.2941 |
| 2     | 0.0800 | 0.1200 | 0.8000 |
| 1     | 0.0357 | 0.0714 | 0.8929 |

(Note that each column of fY|x sums to 1, and each row of fX|y sums to 1, as required of conditional pmfs.)

## 7. Properties of Conditional pmfs

## 8. An Example of Conditional Mean and Variance

- Given the formulae on the previous slide and the conditional pmfs below, verify that the conditional means and conditional variances are correct.

fY|x(y):

| y \ x                | 1      | 2      | 3      |
|----------------------|--------|--------|--------|
| 4                    | 0.7500 | 0.4000 | 0.0909 |
| 3                    | 0.1000 | 0.4000 | 0.0909 |
| 2                    | 0.1000 | 0.1200 | 0.3636 |
| 1                    | 0.0500 | 0.0800 | 0.4545 |
| Conditional mean     | 3.5500 | 3.1200 | 1.8182 |
| Conditional variance | 0.7475 | 0.8256 | 0.8760 |

fX|y(x):

| y \ x | 1      | 2      | 3      | Conditional mean | Conditional variance |
|-------|--------|--------|--------|------------------|----------------------|
| 4     | 0.5000 | 0.3333 | 0.1667 | 1.6667           | 0.5556               |
| 3     | 0.1176 | 0.5882 | 0.2941 | 2.1765           | 0.3806               |
| 2     | 0.0800 | 0.1200 | 0.8000 | 2.7200           | 0.3616               |
| 1     | 0.0357 | 0.0714 | 0.8929 | 2.8571           | 0.1939               |

## 9. Independence of Two Discrete Random Variables

## 10. An Example of Independence (Example 5-6)
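Looking back at slide 8, the conditional means and variances can be re-derived directly from the conditional pmfs of Example 5-1. A minimal check in Python; the data structures and names are mine, the probabilities come from the fY|x table above:

```python
# Conditional pmfs f_{Y|x}(y) from Example 5-1: outer key is x (# bars of
# signal strength), inner key is y (# times the city name is stated).
f_y_given_x = {
    1: {4: 0.7500, 3: 0.1000, 2: 0.1000, 1: 0.0500},
    2: {4: 0.4000, 3: 0.4000, 2: 0.1200, 1: 0.0800},
    3: {4: 0.0909, 3: 0.0909, 2: 0.3636, 1: 0.4545},
}

def mean_var(pmf):
    """Mean and variance of a discrete pmf given as {value: probability}."""
    mean = sum(v * p for v, p in pmf.items())
    var = sum((v - mean) ** 2 * p for v, p in pmf.items())
    return mean, var

results = {x: mean_var(pmf) for x, pmf in f_y_given_x.items()}
for x, (m, v) in results.items():
    print(f"x={x}: conditional mean {m:.4f}, conditional variance {v:.4f}")
```

The x = 3 column reproduces the tabulated 1.8182 and 0.8760 only to three decimals, because the slide's pmf entries are themselves rounded (0.0909 ≈ 1/11, etc.).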
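The next slide tabulates Example 5-6; anticipating those numbers, the independence claim can be checked mechanically: recover the marginals from the joint pmf and confirm that fXY(x, y) = fX(x) fY(y) holds for every cell. A sketch (variable names are mine, probabilities from the Example 5-6 table):

```python
# Joint pmf from Example 5-6: x = colour conforms, y = length conforms.
f_xy = {
    (0, 1): 0.0098, (1, 1): 0.9702,
    (0, 0): 0.0002, (1, 0): 0.0198,
}

# Marginals obtained by summing the joint pmf over the other variable.
f_x = {x: sum(p for (xi, _), p in f_xy.items() if xi == x) for x in (0, 1)}
f_y = {y: sum(p for (_, yi), p in f_xy.items() if yi == y) for y in (0, 1)}

# X and Y are independent iff f_XY(x, y) = f_X(x) * f_Y(y) for every cell.
independent = all(
    abs(p - f_x[x] * f_y[y]) < 1e-12 for (x, y), p in f_xy.items()
)
print(independent)  # True
```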
## 11. An Example of Independence (Example 5-6)

- The example on the previous page can also be expressed in tabular form, where x indicates whether colour conforms and y whether length conforms.

fXY(x, y):

| y \ x | 0      | 1      | fY(y) |
|-------|--------|--------|-------|
| 1     | 0.0098 | 0.9702 | 0.98  |
| 0     | 0.0002 | 0.0198 | 0.02  |
| fX(x) | 0.01   | 0.99   |       |

fY|x(y):

| y \ x                | 0      | 1      |
|----------------------|--------|--------|
| 1                    | 0.9800 | 0.9800 |
| 0                    | 0.0200 | 0.0200 |
| Conditional mean     | 0.9800 | 0.9800 |
| Conditional variance | 0.0196 | 0.0196 |

fX|y(x):

| y \ x | 0      | 1      | Conditional mean | Conditional variance |
|-------|--------|--------|------------------|----------------------|
| 1     | 0.0100 | 0.9900 | 0.9900           | 0.0099               |
| 0     | 0.0100 | 0.9900 | 0.9900           | 0.0099               |

Neither conditional pmf depends on the value being conditioned on, consistent with X and Y being independent.

## 12. The Multinomial Distribution

## 13. The Joint Probability Distribution for a Pair of Continuous Random Variables

## 14. A Joint Probability Distribution Example (Example 5-12)

## 15. A Joint Probability Distribution Example (Example 5-12), continued

## 16. The Marginal Probability Distribution

- The individual probability distribution of a random variable is referred to as its marginal probability distribution.
- In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables.
- For the case of two continuous random variables: fX(x) = ∫ fXY(x, y) dy and fY(y) = ∫ fXY(x, y) dx.

## 17. The Conditional Probability Distribution

## 18. An Example of Conditional Probability (Example 5-14)

## 19. Conditional Mean and Variance

## 20. Independence of Two Continuous Random Variables

## 21. An Example of Independence (Example 5-18)

## 22. The Covariance Between Two Random Variables

- Before we define covariance, we need the expected value of a function of two random variables: E[h(X, Y)] = Σx Σy h(x, y) fXY(x, y) in the discrete case, with a double integral replacing the sums in the continuous case.
- Now we are ready to define covariance: cov(X, Y) = σXY = E[(X − μX)(Y − μY)] = E[XY] − μX μY.

## 23. An Example of Covariance for Discrete Random Variables

fXY(x, y) (blank cells are zero):

| y \ x | 1    | 3    | fY(y) |
|-------|------|------|-------|
| 3     |      | 0.30 | 0.30  |
| 2     | 0.20 | 0.20 | 0.40  |
| 1     | 0.10 | 0.20 | 0.30  |
| fX(x) | 0.30 | 0.70 |       |

## 24. What Does Covariance Signify?

Covariance is a measure of the strength of the linear relationship between two random variables.
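The covariance definition on slide 22 can be applied directly to the discrete example table on slide 23. The sketch below uses my reading of that table (the blank cell at x = 1, y = 3 taken as 0); the marginals it produces match the tabulated fX and fY:

```python
# Joint pmf read off the slide-23 covariance example table;
# the missing cell (x=1, y=3) is taken to be 0.
f_xy = {
    (1, 2): 0.20, (1, 1): 0.10,
    (3, 3): 0.30, (3, 2): 0.20, (3, 1): 0.20,
}

mu_x = sum(x * p for (x, _), p in f_xy.items())      # E[X]
mu_y = sum(y * p for (_, y), p in f_xy.items())      # E[Y]
e_xy = sum(x * y * p for (x, y), p in f_xy.items())  # E[XY]

# cov(X, Y) = E[XY] - E[X] E[Y]
cov = e_xy - mu_x * mu_y
print(f"E[X]={mu_x:.1f}, E[Y]={mu_y:.1f}, cov={cov:.2f}")
```

Under this reading, E[X] = 2.4 and E[Y] = 2.0, giving a small positive covariance of 0.2.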
If the relationship is nonlinear, the covariance may not be useful. For example, in Fig. 5-13(d) there is clearly a relationship between the variables, yet it goes undetected by the covariance.

## 25. The Correlation Between Two Random Variables

## 26. A Correlation Example (Example 5-26)

fXY(x, y) (blank cells are zero):

| y \ x | 0    | 1    | 2    | 3    | fY(y) |
|-------|------|------|------|------|-------|
| 3     |      |      |      | 0.40 | 0.40  |
| 2     |      | 0.10 | 0.10 |      | 0.20  |
| 1     |      | 0.10 | 0.10 |      | 0.20  |
| 0     | 0.20 |      |      |      | 0.20  |
| fX(x) | 0.20 | 0.20 | 0.20 | 0.40 |       |

## 27. A Correlation (and Covariance) Example (Example 5-26)

## 28. The Covariance and Correlation Between Independent Variables

- If X and Y are exactly linearly related (i.e. Y = aX + b for constants a and b), then the correlation ρXY is either +1 or −1, with the same sign as the constant a:
  - If a is positive, ρXY = +1.
  - If a is negative, ρXY = −1.

## 29. An Example (Covariance Between Independent Variables) (Example 5-26)

## 30. An Example (Covariance Between Independent Variables) (Example 5-26), continued

## 31. The Bivariate Normal Distribution

## 32. The Bivariate Normal Distribution, continued

## 33. Linear Combinations of Random Variables

## 34. A Linear Combination of Random Variables Example (Example 5-33)

## 35. (untitled)

## 36. (untitled)
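As a closing worked example, the covariance and correlation for the Example 5-26 table on slide 26 can be computed from the definitions, illustrating ρXY = cov(X, Y) / (σX σY). The joint pmf below is my reconstruction of that table (blank cells zero), so the numbers hold only under that reading:

```python
from math import sqrt

# Joint pmf reconstructed from the Example 5-26 table (slide 26);
# cells not listed in the table have probability 0.
f_xy = {
    (0, 0): 0.2,
    (1, 1): 0.1, (1, 2): 0.1,
    (2, 1): 0.1, (2, 2): 0.1,
    (3, 3): 0.4,
}

def expect(g):
    """E[g(X, Y)] for the discrete joint pmf above."""
    return sum(g(x, y) * p for (x, y), p in f_xy.items())

mu_x = expect(lambda x, y: x)
mu_y = expect(lambda x, y: y)
cov = expect(lambda x, y: (x - mu_x) * (y - mu_y))
var_x = expect(lambda x, y: (x - mu_x) ** 2)
var_y = expect(lambda x, y: (y - mu_y) ** 2)
rho = cov / sqrt(var_x * var_y)
print(f"cov={cov:.4f}, rho={rho:.4f}")  # cov=1.2600, rho=0.9265
```

The strong positive correlation (about 0.93) reflects the near-linear pattern of the nonzero cells along the diagonal of the table.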