Problem Set 3
MAS 622J/1.126J: Pattern Recognition and Analysis
Due Monday, 16 October 2006

[Note: All instructions to plot data or write a program should be carried out using either Python with the matplotlib package or Matlab. Feel free to use either or both, but in order to maintain a reasonable level of consistency and simplicity we ask that you do not use other software tools.]

Problem 1: (DHS 2.6) Optimal Decision Boundaries

Your friend has built a system to recognize into which of two categories, ω1 or ω2, her advisor's email can be classified. She has brilliantly identified two features such that her training data is well approximated by two Gaussians:

    p(x | ω1) ~ N(μ1, Σ1)
    p(x | ω2) ~ N(μ2, Σ2)

where μ1 = [8 9]^T, μ2 = [0 9]^T, Σ1 = I, and Σ2 = (1/16) I, where I is the 2x2 identity matrix.

a. Plot the one-sigma ellipses for these two classes in the plane x = [x1 x2]^T.

b. Your friend finds that choosing a threshold at x1 = 4 perfectly separates the training examples she has; thus, she proposes that this should be the best classifier. Show her an expression, in terms of x, which can improve her classifier with respect to minimizing the Bayes probability of error. Assume that email from class ω2 is twice as likely as email from class ω1.

c. The shape of this optimal decision boundary is:
   - a line
   - a parabola
   - a hyperbola
   - a circle
   - an ellipse
   - none of the above (explain)
   Be sure to justify your answer.
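As a numerical starting point for part (b), the sketch below evaluates the minimum-error (Bayes) discriminant g(x) = ln[p(x | ω1) P(ω1)] − ln[p(x | ω2) P(ω2)], whose sign decides between the two classes. It assumes the setup as reconstructed above (Σ2 = (1/16) I) and the priors implied by part (b), P(ω1) = 1/3 and P(ω2) = 2/3; the closed-form expression and the boundary's shape are left to you.

```python
import numpy as np

# Parameters as reconstructed from the problem statement
# (Sigma_2 = I/16; priors P(w2) = 2 * P(w1) per part b).
mu1 = np.array([8.0, 9.0])
mu2 = np.array([0.0, 9.0])
cov1 = np.eye(2)
cov2 = np.eye(2) / 16.0
p1, p2 = 1.0 / 3.0, 2.0 / 3.0

def log_gauss(x, mu, cov):
    """Log-density of a 2-D Gaussian N(mu, cov) at point x."""
    d = x - mu
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ inv @ d) - 0.5 * logdet - np.log(2.0 * np.pi)

def g(x):
    """Bayes discriminant: g(x) > 0 -> decide w1, g(x) < 0 -> decide w2."""
    return (log_gauss(x, mu1, cov1) + np.log(p1)
            - log_gauss(x, mu2, cov2) - np.log(p2))

print(g(mu1))  # positive: points near mu1 are assigned to w1
print(g(mu2))  # negative: points near mu2 are assigned to w2
```

Evaluating g on a grid (e.g. with `matplotlib.pyplot.contour` at level 0) will trace the optimal boundary, which you can compare against your friend's x1 = 4 threshold.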