∙ f_{Y|X}(y | x) = \int_{-\infty}^{\infty} f_{Y|X,Z}(y | x, z) \, f_{Z|X}(z | x) \, dz.

Intuitively, we average out z, but it must be a weighted average, where the weights depend on the conditional density of Z given X. (A numerical sketch of this averaging appears at the end of these notes.)

∙ Here is a useful way of thinking generally about conditional distributions. Suppose Y is a random variable, say individual earnings. Then it has some distribution in the population. When we study the conditional distribution of Y given a variable X, such as education, we are essentially partitioning the population based on the possible values of X. For each outcome x, we can study the distribution of Y. This gives us a set of conditional distributions, and we are interested in how these distributions depend on x.

1.2. Conditional Distributions and Independence

∙ It is easily seen from the definition of the conditional density that Y and X are independent if and only if

f_{Y|X}(y | x) = f_Y(y), for all x and y.

This follows because independence is the same as f_{X,Y}(x, y) = f_X(x) f_Y(y) for all x and y.

∙ The requirement f_{Y|X}(y | x) = f_Y(y) is actually more intuitive as a definition of independence: knowing the outcome on X has no effect on the probabilities we assign to different outcomes on Y.

∙ If Y and X are dependent, then (at least sometimes) knowing the value of X changes the probability of events involving Y.

EXAMPLE: Suppose Y is a Bernoulli(p) random variable indicating whether an adult in a large population is employed. Without knowing anything about individuals in the population, we assign p = P(Y = 1) as the employment probability. But suppose we can observe X, the highest grade completed (and, say, assume this is recorded as the values 0, 1, 2, ..., 20). If the probability of employment changes as X changes, then Y and X are dependent. (A small simulation of this example appears at the end of these notes.)

∙ A useful shorthand to denote that the conditional distribution of Y does not depend on X is D(Y | X) = D(Y), where D(Y) is shorthand for the unconditional distribution.

∙ An important extension is the notion of conditional independence. Sometimes two sets of random vectors are dependent, but they become independent when a third set of variables is conditioned on. In particular, suppose D(Y | X, Z) = D(Y | X). Then we say Y and Z are independent conditional on X. (This is also illustrated by a simulation at the end of these notes.)

∙ Spelled out in terms of conditional densities,

f_{Y|X,Z}(y | x, z) = f_{Y|X}(y | x), for all x, y, and z.

∙ This notion is very important in the literature on program evaluation; a related notion is (implicitly) used in multiple regression analysis.

∙ Once we have a conditional density, say f_{Y|X}(y | x), we can compute conditional probabilities of events that explicitly depend on X. So, if X is a scalar, we can compute, say, P(Y ≤ 2X | X = x) when X = x. This is just F_{Y|X}(2x | x). Notice how x appears in two places in general. (A toy calculation follows below.)
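The following is a minimal numerical sketch, not part of the original notes, of the first bullet's formula. It assumes a toy model in which Z | X = x ~ Normal(x, 1) and Y | X = x, Z = z ~ Normal(z, 1), so that analytically Y | X = x ~ Normal(x, 2); the sketch checks that integrating f_{Y|X,Z} f_{Z|X} over z reproduces this density. The model and the specific numbers are illustrative assumptions.

```python
# Sketch: check f_{Y|X}(y|x) = integral of f_{Y|X,Z}(y|x,z) * f_{Z|X}(z|x) dz
# under an assumed toy model: Z | X=x ~ N(x, 1) and Y | X=x, Z=z ~ N(z, 1),
# which implies Y | X=x ~ N(x, 2).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

x, y = 12.0, 13.5                      # condition on X = x; evaluate the density at y

def integrand(z):
    # f_{Y|X,Z}(y | x, z) * f_{Z|X}(z | x): averaging over z with weights f_{Z|X}
    return norm.pdf(y, loc=z, scale=1.0) * norm.pdf(z, loc=x, scale=1.0)

f_numeric, _ = quad(integrand, -np.inf, np.inf)
f_exact = norm.pdf(y, loc=x, scale=np.sqrt(2.0))
print(f_numeric, f_exact)              # the two values agree to numerical precision
```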
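For the employment EXAMPLE, here is a small hypothetical simulation; the data-generating process, in particular P(Y = 1 | X = x) = 0.5 + 0.02x, is an assumption chosen only for illustration. Comparing conditional employment frequencies across education levels is exactly the "partitioning the population by X" idea: if they vary with x, then Y and X are dependent.

```python
# Hypothetical simulation: Y is Bernoulli and its success probability is
# allowed to depend on education X (recorded as 0, 1, ..., 20).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
educ = rng.integers(0, 21, size=n)             # X: highest grade completed
p_employed = 0.5 + 0.02 * educ                 # assumed true P(Y=1 | X=x)
employed = rng.binomial(1, p_employed)         # Y: employment indicator

for x in (0, 8, 12, 16, 20):
    rate = employed[educ == x].mean()          # sample analog of P(Y=1 | X=x)
    print(f"P(Y=1 | X={x:2d}) ≈ {rate:.3f}")
print("Unconditional P(Y=1) ≈", round(employed.mean(), 3))
```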
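The next sketch illustrates conditional independence, D(Y | X, Z) = D(Y | X), using an assumed data-generating process in which Y and Z are both driven by X plus mutually independent noise. Unconditionally, Y and Z are strongly correlated; within each cell X = x they are independent by construction. The within-cell correlations printed below are a simple diagnostic of this, not a full test of conditional independence.

```python
# Assumed data-generating process: X drives both Y and Z, but given X the two
# are independent, so D(Y | X, Z) = D(Y | X).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.integers(0, 5, size=n)                 # conditioning variable X
y = x + rng.normal(size=n)                     # Y: depends on X plus independent noise
z = 2 * x + rng.normal(size=n)                 # Z: depends on X plus independent noise

print("corr(Y, Z) overall:", round(np.corrcoef(y, z)[0, 1], 3))   # clearly nonzero
for xv in range(5):
    m = x == xv
    print(f"corr(Y, Z | X={xv}):", round(np.corrcoef(y[m], z[m])[0, 1], 3))  # ≈ 0
```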
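Finally, a toy calculation for the last bullet, under the assumed model Y | X = x ~ Normal(x, 1). Then P(Y ≤ 2X | X = x) = F_{Y|X}(2x | x) = Φ(2x − x) = Φ(x), which makes explicit that x enters both the event and the conditioning.

```python
# Toy model (assumed): Y | X=x ~ N(x, 1), so F_{Y|X}(2x | x) = Phi(2x - x) = Phi(x).
from scipy.stats import norm

for x in (-1.0, 0.0, 0.5, 1.0, 2.0):
    lhs = norm.cdf(2 * x, loc=x, scale=1.0)    # F_{Y|X}(2x | x)
    rhs = norm.cdf(x)                          # Phi(x)
    print(f"x = {x:+.1f}:  F(2x|x) = {lhs:.4f}  Phi(x) = {rhs:.4f}")
```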