M4056 Bayes Estimators. October 20, 2010

Background on joint distributions. Recall that if $f_{X,Y}(x,y)$ is a joint density, then the marginal and conditional density functions are defined as follows:

marginals:
$$f_X(x) := \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy, \qquad f_Y(y) := \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx,$$

conditionals:
$$f_{X\mid Y}(x \mid Y=y) := \frac{f_{X,Y}(x,y)}{f_Y(y)}, \qquad f_{Y\mid X}(y \mid X=x) := \frac{f_{X,Y}(x,y)}{f_X(x)}.$$

The "$Y=y$" may be shortened: $f_{X\mid Y}(x \mid y) = f_{X\mid Y}(x \mid Y=y)$. Similarly for $X$.

We also have marginals and conditionals in case $X$ and $Y$ are discrete. Suppose $X$ has values in $\{1,\dots,m\}$ and $Y$ in $\{1,\dots,n\}$. Each joint outcome can be written as a pair $(i,j)$, and we can picture the set of all pairs as a rectangular array. Denote its probability $p_{X,Y}(i,j)$. The marginal pmfs in this case are the row and column sums:
$$p_X(i) = \sum_{j=1}^{n} p_{X,Y}(i,j) \qquad\text{and}\qquad p_Y(j) = \sum_{i=1}^{m} p_{X,Y}(i,j).$$
...
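The discrete picture above, with the joint pmf as a rectangular array whose row and column sums give the marginals, can be sketched numerically. This is a minimal illustration, not part of the original notes: the joint pmf values below are invented for the example, and NumPy is assumed for the array arithmetic.

```python
import numpy as np

# Hypothetical joint pmf p_{X,Y}(i,j) with X in {1,2,3} (rows) and
# Y in {1,2} (columns); the entries are illustrative and sum to 1.
p = np.array([[0.10, 0.20],
              [0.30, 0.15],
              [0.05, 0.20]])

# Marginal pmfs are the row and column sums of the array:
p_X = p.sum(axis=1)   # p_X(i) = sum over j of p(i, j)
p_Y = p.sum(axis=0)   # p_Y(j) = sum over i of p(i, j)

# Conditional pmf of X given Y = j: divide column j by its column sum,
# i.e. p_{X|Y}(i | j) = p(i, j) / p_Y(j). Broadcasting handles all j at once.
p_X_given_Y = p / p_Y

print(p_X)            # marginal of X (row sums)
print(p_Y)            # marginal of Y (column sums)
```

Each column of `p_X_given_Y` is itself a pmf (its entries sum to 1), which mirrors the continuous definition: dividing the joint density by the marginal renormalizes each "slice" into a proper distribution.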
This note was uploaded on 11/29/2011 for the course MATH 4056 taught by Professor Staff during the Fall '08 term at LSU.