Section 3.6 Joint Distributions

Earlier we discussed how to display and summarize observations x_1, ..., x_n on a single variable x. We extended these ideas to the population distribution with density f(x) (continuous case) or mass function p(x) (discrete case). We now discuss the joint distribution of two variables x and y.

1. Discrete Case. Let x and y be discrete variables. For example,

x = number of courses taken
y = number of hours spent (in a day)

Definition. The joint distribution p(x, y) (or f(x, y)) of x and y is defined by the requirements

(i) p(x, y) >= 0 for all x, y;
(ii) \sum_x \sum_y p(x, y) = 1.

Also,

p_x(x) = \sum_y p(x, y)  and  p_y(y) = \sum_x p(x, y)

are called the marginal distributions of x and y, respectively.

The mean (or the expected value) of h(x, y) is

\mu_{h(x,y)} = \sum_x \sum_y h(x, y) p(x, y).

(Also denoted E(h(x, y)).)

Example 1. The joint distribution p(x, y) of x (the number of cars) and y (the number of buses) per signal cycle at a particular left-turn lane is given by

                        x
p(x, y)     0     1     2     3     4     5
y    0    .025  .050  .125  .150  .100  .050
     1    .015  .030  .075  .090  .060  .030
     2    .010  .020  .050  .060  .040  .020

a) Find the proportion of cycles with the same number of cars and buses.
b) Find the marginal distributions of x and y.
c) Suppose a bus occupies three vehicle spaces and a car occupies just one. What is the mean number of vehicle spaces occupied during a signal cycle?
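The definitions above can be checked numerically. The sketch below (using NumPy; the variable names are illustrative, not from the text) stores the joint pmf of Example 1 as a 3x6 array with rows indexed by y = 0, 1, 2 and columns by x = 0, ..., 5, then computes parts (a)-(c) directly from the formulas for the marginals and for E(h(x, y)) with h(x, y) = x + 3y.

```python
import numpy as np

# Joint pmf p[y, x] from Example 1: rows y = 0, 1, 2; columns x = 0..5.
p = np.array([
    [.025, .050, .125, .150, .100, .050],  # y = 0
    [.015, .030, .075, .090, .060, .030],  # y = 1
    [.010, .020, .050, .060, .040, .020],  # y = 2
])
# Property (ii): the probabilities sum to 1.
assert abs(p.sum() - 1.0) < 1e-9

# (a) Proportion of cycles with the same number of cars and buses:
# sum p(k, k) over the values k both variables can share (0, 1, 2).
same = sum(p[k, k] for k in range(3))          # .025 + .030 + .050 = 0.105

# (b) Marginal distributions: sum out the other variable.
p_x = p.sum(axis=0)   # marginal of x (sum over y), one entry per x = 0..5
p_y = p.sum(axis=1)   # marginal of y (sum over x), one entry per y = 0, 1, 2

# (c) Mean vehicle spaces, h(x, y) = x + 3y (a car takes 1 space, a bus 3).
xs = np.arange(6)
ys = np.arange(3)
X, Y = np.meshgrid(xs, ys)                     # grids aligned with p[y, x]
mean_spaces = ((X + 3 * Y) * p).sum()          # E(x + 3y) = E(x) + 3 E(y)
```

Note that (c) also follows from linearity of expectation: E(x) = 2.8 and E(y) = 0.7 from the marginals, so E(x + 3y) = 2.8 + 3(0.7) = 4.9 spaces per cycle, and part (a) gives 0.105, i.e. 10.5% of cycles.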