Lecture 3: Review of Statistics (Part 2)
Two Random Variables
BUEC 333, Summer 2009
Simon Woodcock
Two Random Variables

Most interesting questions in economics involve two (or more) variables:
- What is the relationship between education and earnings?
- What is the relationship between stock prices and profits?

We describe the probabilistic relationship between two (or more) random variables using three kinds of probability distributions:
- the joint distribution
- marginal distributions
- conditional distributions
The Joint Distribution

The joint distribution of discrete RVs X and Y is the probability that the two RVs simultaneously take on certain values, say x and y. That is, Pr(X = x, Y = y).

Example: the relationship between weather and commuting time.
- Let C denote commuting time. Suppose commuting time can be long (C = 1) or short (C = 0).
- Let W denote weather. Suppose weather can be fair (W = 1) or foul (W = 0).

There are four possible outcomes: (C = 0, W = 0), (C = 0, W = 1), (C = 1, W = 0), (C = 1, W = 1). The probabilities of these outcomes define the joint distribution of C and W:

                         Foul Weather (W = 0)   Fair Weather (W = 1)   Total
Short Commute (C = 0)            0.15                   0.25           0.40
Long Commute (C = 1)             0.55                   0.05           0.60
Total                            0.70                   0.30           1.00
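As a quick check on reading the table (a worked example added here, not part of the original slide): Pr(C = 1, W = 0) = 0.55, so there is a 55% chance of a long commute in foul weather, and the four joint probabilities sum to 0.15 + 0.25 + 0.55 + 0.05 = 1, as they must for a valid probability distribution.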
Marginal Distributions

When X and Y have a joint distribution, we use the term marginal distribution to describe the probability distribution of X or Y alone.

We can compute the marginal distribution of X from the joint distribution of X and Y by adding up the probabilities of all possible outcomes where X takes a particular value. That is, if Y takes one of k possible values y_1, ..., y_k:

Pr(X = x) = Σ_{i=1}^{k} Pr(X = x, Y = y_i)

Example: commuting time and weather. The marginal distribution of weather is given by the column totals (the bottom row of the table); the marginal distribution of commuting time is given by the row totals (the rightmost column).

                         Foul Weather (W = 0)   Fair Weather (W = 1)   Total
Short Commute (C = 0)            0.15                   0.25           0.40
Long Commute (C = 1)             0.55                   0.05           0.60
Total                            0.70                   0.30           1.00
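Applying the formula to the table (a worked check added here, not part of the original slide):

Pr(C = 0) = Pr(C = 0, W = 0) + Pr(C = 0, W = 1) = 0.15 + 0.25 = 0.40
Pr(W = 0) = Pr(C = 0, W = 0) + Pr(C = 1, W = 0) = 0.15 + 0.55 = 0.70

These match the "Total" column and the "Total" row of the table, respectively.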
Conditional Distributions

The distribution of a random variable Y conditional on another random variable X taking a specific value is called the conditional distribution of Y given X. The conditional probability that Y takes value y when X takes value x is written Pr(Y = y | X = x).
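A standard way to compute conditional probabilities from the joint and marginal distributions (a supplementary formula, not shown in this preview) is Pr(Y = y | X = x) = Pr(X = x, Y = y) / Pr(X = x). For the commuting example: Pr(C = 1 | W = 0) = Pr(C = 1, W = 0) / Pr(W = 0) = 0.55 / 0.70 ≈ 0.79, so a long commute is much more likely when the weather is foul.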