Probabilistic Graphical Models, Spring 2007: Homework 1 Solutions
1 Representation

Solution due to Steve Gardiner.

Consider $N+1$ binary random variables $X_1, \ldots, X_N, Y$ that satisfy the following conditional independencies: $\forall i, j:\ X_i \perp X_j \mid Y$.

1. Consider the following conjecture:

   Conjecture 1. $\forall i, j, k:\ X_i, X_j \perp X_k \mid Y$.

   The conjecture is false; a counterexample follows. Consider a model with $N = 3$ in which $X_1$, $X_2$, and $Y$ are i.i.d. uniform Bernoulli random variables, and $X_3$ is generated deterministically as $X_3 = Y \oplus (X_1 \oplus X_2)$, where $\oplus$ denotes the logical XOR operator. The pairwise conditional independencies hold, namely $X_1 \perp X_2 \mid Y$, $X_1 \perp X_3 \mid Y$, and $X_2 \perp X_3 \mid Y$; however, it is not the case that $X_1, X_2 \perp X_3 \mid Y$, since given $Y$, the pair $(X_1, X_2)$ determines $X_3$ exactly.

2. Suppose you wish to store the joint probability distribution of these $N+1$ variables as a single table. How many parameters will you need? $2^{N+1} - 1$ (one entry per joint assignment, minus one because the entries must sum to 1).

3. An undirected graphical model that encodes the required conditional independencies is the star graph with $Y$ at the center and an edge from $Y$ to each of $X_1, X_2, X_3, \ldots, X_N$ (and no edges among the $X_i$).

2 Gaussians

Solution due to Steve Gardiner.

1. Show that for any joint density function $P(X, Y)$, if $X \perp Y$ then $\mathrm{Cov}(X, Y) = 0$.

   Lemma 2. If $X \perp Y$ then $E[XY] = E[X]\,E[Y]$.

   Proof. I show the result for continuous random variables $X$ and $Y$; the proof for discrete random variables replaces the integrals with summations but proceeds in the same manner.

$$
\begin{aligned}
E[XY] &= \int\!\!\int xy\, P(x, y)\, dx\, dy \\
      &= \int\!\!\int xy\, P(x)\, P(y)\, dx\, dy \qquad \text{(by independence)} \\
      &= \int y\, P(y) \int x\, P(x)\, dx\, dy \\
      &= \Big[ \int y\, P(y)\, dy \Big] \Big[ \int x\, P(x)\, dx \Big] \\
      &= E[X]\, E[Y]
\end{aligned}
$$

Corollary 3. If $X \perp Y$ then $\mathrm{Cov}(X, Y) = 0$.

Proof.
$$
\begin{aligned}
\mathrm{Cov}(X, Y) &= E\big[(X - E[X])(Y - E[Y])\big] \\
&= E\big[XY - Y\,E[X] - X\,E[Y] + E[X]\,E[Y]\big] \\
&= E[XY] - E\big[Y\,E[X]\big] - E\big[X\,E[Y]\big] + E\big[E[X]\,E[Y]\big] \\
&= E[XY] - E[Y]\,E[X] - E[X]\,E[Y] + E[X]\,E[Y] \\
&= E[XY] - E[X]\,E[Y] \\
&= E[XY] - E[XY] \qquad \text{(by Lemma 2)} \\
&= 0
\end{aligned}
$$

2. To show: if $P(X, Y)$ is a Gaussian distribution, then $\mathrm{Cov}(X, Y) = 0 \implies X \perp Y$.

   Proof. The joint density of $(X, Y)$ is

$$
P(x, y) = \frac{1}{2\pi\, \sigma_X \sigma_Y \sqrt{1 - \rho^2}}
\exp\Big[ \frac{1}{2(1 - \rho^2)} \Big( \frac{2\rho\,(x - E[X])(y - E[Y])}{\sigma_X \sigma_Y}
- \frac{(x - E[X])^2}{\sigma_X^2} - \frac{(y - E[Y])^2}{\sigma_Y^2} \Big) \Big] \ldots
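As a sanity check (not part of the original solution), the XOR counterexample from Section 1 can be verified by exhaustive enumeration, since the joint distribution has only eight equally likely outcomes. The variable names and the numeric tolerance below are illustrative choices:

```python
from itertools import product
from collections import defaultdict

# Joint distribution of the Section 1 counterexample: X1, X2, Y are
# i.i.d. uniform Bernoulli, and X3 = Y XOR X1 XOR X2 is determined.
joint = defaultdict(float)
for x1, x2, y in product((0, 1), repeat=3):
    joint[(x1, x2, y ^ x1 ^ x2, y)] += 1 / 8

NAMES = ("x1", "x2", "x3", "y")

def prob(**fixed):
    """Marginal probability that the named variables take the given values."""
    return sum(pr for vals, pr in joint.items()
               if all(vals[NAMES.index(k)] == v for k, v in fixed.items()))

def pairwise_ci(a, b):
    """True iff variable a is independent of variable b given Y."""
    for va, vb, vy in product((0, 1), repeat=3):
        py = prob(y=vy)
        lhs = prob(**{a: va, b: vb, "y": vy}) / py
        rhs = (prob(**{a: va, "y": vy}) / py) * (prob(**{b: vb, "y": vy}) / py)
        if abs(lhs - rhs) > 1e-12:
            return False
    return True

# All three pairwise conditional independencies hold ...
assert pairwise_ci("x1", "x2") and pairwise_ci("x1", "x3") and pairwise_ci("x2", "x3")

# ... but the joint statement X1, X2 _|_ X3 | Y fails: given Y, the pair
# (X1, X2) determines X3, so P(x1, x2, x3 | y) != P(x1, x2 | y) P(x3 | y).
lhs = prob(x1=0, x2=0, x3=0, y=0) / prob(y=0)
rhs = (prob(x1=0, x2=0, y=0) / prob(y=0)) * (prob(x3=0, y=0) / prob(y=0))
print(lhs, rhs)  # 0.25 vs 0.125: the factorization fails
```

Enumerating the full joint table is feasible here only because $N$ is tiny; the point of the graphical-model representation in Section 1 is precisely to avoid the $2^{N+1} - 1$ parameters this table would need in general.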