wk03 - CS195f Homework 2, Mark Johnson and Erik Sudderth

CS195f Homework 2
Mark Johnson and Erik Sudderth
Homework due at 2pm, 1st October 2009

The first question asks you to analyse the following naive Bayes model, which describes the weather in a mythical country.

Y = {night, day}    X1 = {cold, hot}    X2 = {rain, dry}

P(X1, X2, Y) = P(Y) P(X1 | Y) P(X2 | Y)

P(Y = day) = 0.5
P(X1 = hot | Y = day) = 0.9
P(X1 = hot | Y = night) = 0.2
P(X2 = dry | Y = day) = 0.75
P(X2 = dry | Y = night) = 0.4

Question 1: For each of the following formulae except the first, write an equation which defines it in terms of formulae that appear earlier in the list. (For example, you should give a formula for P(x1, x2) in terms of P(x1, x2, y).) Then, given the model above, calculate and write out the value of the formula for each possible combination of values of the variables that appear in it.

a) P(x1, x2, y)
b) P(x1, x2)
c) P(y | x1, x2)
d) P(x1)
e) P(x2)
f) P(x1 | x2)
g) P(x2 | x1)
h) P(x1 | x2, y)
i) P(x2 | x1, y)

Are X1 and X2 conditionally independent given Y? Are X1 and X2 marginally independent, integrating over Y? Provide a short proof for both answers.
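As a sanity check on the hand calculations, the quantities in (a)-(i) can be computed numerically by building the full joint table from the given conditional probabilities and then marginalizing or normalizing it. The short Python sketch below illustrates the pattern for the first three quantities; all variable and function names are ad hoc choices for illustration, not part of the assignment.

```python
# Sketch: numerically checking the Question 1 quantities for the
# naive Bayes weather model defined above.
from itertools import product

P_Y = {"day": 0.5, "night": 0.5}
P_X1_given_Y = {"day": {"hot": 0.9, "cold": 0.1},
                "night": {"hot": 0.2, "cold": 0.8}}
P_X2_given_Y = {"day": {"dry": 0.75, "rain": 0.25},
                "night": {"dry": 0.4, "rain": 0.6}}

# (a) joint: P(x1, x2, y) = P(y) P(x1 | y) P(x2 | y)
joint = {(x1, x2, y): P_Y[y] * P_X1_given_Y[y][x1] * P_X2_given_Y[y][x2]
         for x1, x2, y in product(["cold", "hot"], ["rain", "dry"], ["night", "day"])}

# (b) marginal: P(x1, x2) = sum_y P(x1, x2, y)
def p_x1_x2(x1, x2):
    return sum(joint[(x1, x2, y)] for y in P_Y)

# (c) conditional: P(y | x1, x2) = P(x1, x2, y) / P(x1, x2)
def p_y_given_x1_x2(y, x1, x2):
    return joint[(x1, x2, y)] / p_x1_x2(x1, x2)

# (d)-(i) follow the same two steps: marginalize, then normalize.
if __name__ == "__main__":
    print(p_x1_x2("hot", "dry"))                 # 0.3375 + 0.04 = 0.3775
    print(p_y_given_x1_x2("day", "hot", "dry"))  # 0.3375 / 0.3775
```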
Consider a binary categorization problem, and let p(y_i | x_i) denote the posterior distribution of the latent class label y_i in {0, 1} given observation x_i. Suppose that the classifier ŷ(x_i) is allowed to make one of three decisions: choose class 0, choose class 1, or "reject" this data (refuse to make a decision). We can use a Bayesian decision theoretic approach to trade off the losses incurred by incorrect decisions and rejections.

Question 2: Suppose that the classifier incurs a loss of 0 whenever it chooses the correct class, a loss of 1 whenever it chooses the wrong class, and a loss of λ whenever it selects the reject option. Express the optimal decision rule …
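Under this setup the comparison that matters is between the posterior expected losses of the three available actions. The Python sketch below makes that comparison directly from the stated loss values; the names decide, p1, and lam are illustrative assumptions, and this is not the graded derivation asked for in Question 2.

```python
# Sketch: Bayes decision with a reject option, chosen by comparing
# posterior expected losses under the stated 0 / 1 / lambda losses.
def decide(p1, lam):
    """Return 0, 1, or "reject", whichever minimizes expected loss.

    p1  : posterior probability p(y_i = 1 | x_i)
    lam : loss incurred by selecting the reject option
    """
    expected_loss = {
        0: p1,          # choosing class 0 is wrong with probability p(y=1|x)
        1: 1.0 - p1,    # choosing class 1 is wrong with probability p(y=0|x)
        "reject": lam,  # rejecting always costs lambda
    }
    return min(expected_loss, key=expected_loss.get)

# Example: with lam = 0.2, posteriors near 0.5 are rejected.
print(decide(0.55, 0.2))  # "reject", since min(0.55, 0.45) > 0.2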