Computational functional genomics (Spring 2005: Lecture 10)
David K. Gifford (adapted from a lecture by Tommi S. Jaakkola)
MIT CSAIL
Topics
- Basic classification approaches: decisions, estimation, variable selection
- Examples
- More advanced methods
Classification
We can divide the large variety of classification approaches into roughly two main types:
1. Generative: build a generative statistical model of each class, e.g., a mixture model.
2. Discriminative: directly estimate a decision rule/boundary, e.g., logistic regression.
Generative approach to classification
A mixture of two Gaussians, one Gaussian per class:

  with probability P(class = 1):  X ~ N(μ₁, Σ₁)
  with probability P(class = 0):  X ~ N(μ₀, Σ₀)

where X corresponds to, e.g., a tissue sample (expression levels across the genes). Three basic problems we need to address:
1. decisions
2. estimation
3. variable selection
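The estimation step above can be sketched in a few lines: fit each class-conditional Gaussian by maximum likelihood (sample mean, sample covariance) and estimate the priors from class frequencies. This is a minimal illustration assuming NumPy; the function name and the toy two-gene data are mine, not from the lecture.

```python
import numpy as np

def fit_gaussian_classes(X, y):
    """Maximum-likelihood estimates of the per-class Gaussian
    parameters (mean, covariance) and the class priors P(class)."""
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        params[c] = {
            "prior": len(Xc) / len(X),
            "mean": Xc.mean(axis=0),
            "cov": np.cov(Xc, rowvar=False),  # rows are samples
        }
    return params

# Toy data: two-gene "expression" samples from two classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(50, 2)),   # class 0 around 0
               rng.normal(3.0, 1.0, size=(50, 2))])  # class 1 around 3
y = np.array([0] * 50 + [1] * 50)
params = fit_gaussian_classes(X, y)
```

With equally sized classes the estimated priors come out at 0.5 each, and the fitted means sit near the true class centers.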
Mixture classifier, cont'd
Examples X (tissue samples) are classified on the basis of which Gaussian better explains the new sample (cf. a likelihood ratio test):

  log [ P(X | μ₁, Σ₁) P(class = 1) / ( P(X | μ₀, Σ₀) P(class = 0) ) ] > 0  ⇒  class = 1   (1)
  log [ P(X | μ₁, Σ₁) P(class = 1) / ( P(X | μ₀, Σ₀) P(class = 0) ) ] ≤ 0  ⇒  class = 0   (2)

where the prior class probabilities P(class) bias our decisions towards one class or the other.

Decision boundary:

  log [ P(X | μ₁, Σ₁) P(class = 1) / ( P(X | μ₀, Σ₀) P(class = 0) ) ] = 0   (3)
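The decision rule above is just the sign of the prior-weighted log-likelihood ratio. A minimal sketch, assuming SciPy's `multivariate_normal` for the Gaussian densities (the function name `classify` is mine, not from the lecture):

```python
import numpy as np
from scipy.stats import multivariate_normal

def classify(x, mu1, cov1, p1, mu0, cov0, p0):
    """Return 1 iff the log of the prior-weighted likelihood
    ratio P(x|class=1)P(class=1) / P(x|class=0)P(class=0) is positive."""
    log_ratio = (multivariate_normal.logpdf(x, mu1, cov1) + np.log(p1)
                 - multivariate_normal.logpdf(x, mu0, cov0) - np.log(p0))
    return 1 if log_ratio > 0 else 0

# 1-D example: N(3, 1) vs N(0, 1) with equal priors, so the
# boundary sits at the midpoint x = 1.5.
assert classify([0.0], [3.0], [[1.0]], 0.5, [0.0], [[1.0]], 0.5) == 0
assert classify([2.9], [3.0], [[1.0]], 0.5, [0.0], [[1.0]], 0.5) == 1
```

Raising P(class = 1) shifts `log_ratio` upward everywhere, moving the boundary toward the class-0 mean, which is exactly the prior bias the slide describes.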
Mixture classifier: decision boundary
Equal covariances:

  X ~ N(μ₁, Σ),  class = 1   (4)
  X ~ N(μ₀, Σ),  class = 0   (5)

The decision rule is linear.
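One step the slide leaves implicit: expanding the two Gaussian log-densities in the log-ratio of equation (3) shows why a shared Σ gives a linear rule. The quadratic term −½ Xᵀ Σ⁻¹ X appears in both densities and cancels, leaving

```latex
\log \frac{P(X \mid \mu_1, \Sigma)\, P(\mathrm{class}=1)}
          {P(X \mid \mu_0, \Sigma)\, P(\mathrm{class}=0)}
= (\mu_1 - \mu_0)^{\top} \Sigma^{-1} X
  \;-\; \tfrac{1}{2}\!\left( \mu_1^{\top} \Sigma^{-1} \mu_1
                           - \mu_0^{\top} \Sigma^{-1} \mu_0 \right)
  \;+\; \log \frac{P(\mathrm{class}=1)}{P(\mathrm{class}=0)}
```

which is linear in X: a hyperplane with normal Σ⁻¹(μ₁ − μ₀), shifted by the prior term.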
Mixture classifier: decision boundary
Unequal covariances:

  X ~ N(μ₁, Σ₁),  class = 1   (6)
  X ~ N(μ₀, Σ₀),  class = 0   (7)

The decision rule is quadratic.
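With unequal covariances the −½ Xᵀ Σ⁻¹ X terms no longer cancel, so a term in Xᵀ(Σ₀⁻¹ − Σ₁⁻¹)X survives in the log-ratio. A quick numerical check, assuming SciPy (the 1-D variances 4 and 1 are my toy choice): the log-ratio evaluated on an evenly spaced grid has constant, nonzero second differences, i.e., it is exactly quadratic in x.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_ratio(x, mu1=2.0, var1=4.0, mu0=0.0, var0=1.0):
    """Log-likelihood ratio of N(mu1, var1) vs N(mu0, var0) at scalar x."""
    return (multivariate_normal.logpdf([x], [mu1], [[var1]])
            - multivariate_normal.logpdf([x], [mu0], [[var0]]))

xs = np.linspace(-3.0, 5.0, 9)            # unit-spaced grid
vals = np.array([log_ratio(x) for x in xs])
second_diff = np.diff(vals, n=2)
# Constant nonzero second differences: the log-ratio is quadratic,
# so the decision boundary (log-ratio = 0) is a quadric, not a hyperplane.
```

Repeating the check with `var1 = var0` drives the second differences to zero, recovering the linear rule of the equal-covariance case.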