Classification Part 3
Data Mining, Spring 2011
Dr. Sanjay Ranka
Professor, Computer and Information Science and Engineering
University of Florida, Gainesville
Bayesian Classifiers
• A probabilistic framework for solving the classification problem
• Conditional probability: P(C | A) = P(A, C) / P(A) and P(A | C) = P(A, C) / P(C)
• Bayes theorem: P(C | A) = P(A | C) P(C) / P(A)
Bayes Theorem: Example
• Given:
  – A doctor knows that meningitis causes a stiff neck 50% of the time
  – The prior probability of any patient having meningitis is 1/50,000
  – The prior probability of any patient having a stiff neck is 1/20
• If a patient has a stiff neck, what is the probability that he/she has meningitis? (worked out below)
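Applying Bayes' theorem to the numbers above (the arithmetic is implied by the slide rather than shown on it), with M = meningitis and S = stiff neck:

$$P(M \mid S) = \frac{P(S \mid M)\,P(M)}{P(S)} = \frac{0.5 \times 1/50{,}000}{1/20} = 0.0002$$

So even given a stiff neck, the probability of meningitis is only 0.02%, because the prior P(M) is so small.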
Bayesian Classifiers
• Consider each attribute and the class label as random variables
• Given a set of attributes (A1, A2, …, An):
  – The goal is to predict the class C
  – Specifically, we want to find the value of C that maximizes P(C | A1, A2, …, An)
• Can we estimate P(C | A1, A2, …, An) directly from the data?
Bayesian Classifiers
• Approach:
  – Compute the posterior probability P(C | A1, A2, …, An) for all values of C using Bayes theorem
  – Choose the value of C that maximizes P(C | A1, A2, …, An)
  – Equivalent to choosing the value of C that maximizes P(A1, A2, …, An | C) P(C) (see the derivation below)
• How do we estimate P(A1, A2, …, An | C)?
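The equivalence in the last sub-bullet holds because, after applying Bayes theorem, the denominator P(A1, A2, …, An) is the same for every value of C and so does not change which C attains the maximum:

$$\arg\max_{C} P(C \mid A_1,\dots,A_n) \;=\; \arg\max_{C} \frac{P(A_1,\dots,A_n \mid C)\,P(C)}{P(A_1,\dots,A_n)} \;=\; \arg\max_{C} P(A_1,\dots,A_n \mid C)\,P(C)$$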
Naïve Bayes Classifier
• Assume independence among the attributes Ai when the class is given:
  – P(A1, A2, …, An | Cj) = P(A1 | Cj) P(A2 | Cj) … P(An | Cj)
  – We can estimate P(Ai | Cj) for all Ai and Cj
  – A new point is classified as Cj if P(Cj) ∏i P(Ai | Cj) is maximal (a small sketch of this rule follows below)
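A minimal Python sketch of this decision rule for categorical attributes. The toy training table, variable names, and the classify helper are illustrative assumptions, not part of the slides; the probabilities are plain count ratios with no smoothing.

```python
from collections import Counter, defaultdict

# Toy categorical training data: each row is (attribute values, class label).
# The data set itself is made up for illustration.
train = [
    (("sunny", "hot"), "no"),
    (("sunny", "mild"), "no"),
    (("rainy", "mild"), "yes"),
    (("rainy", "cool"), "yes"),
    (("sunny", "cool"), "yes"),
]

# Estimate P(Cj) from class frequencies and P(Ai | Cj) from co-occurrence counts.
class_counts = Counter(label for _, label in train)
# cond_counts[(i, value, label)] = number of rows of class `label`
# whose i-th attribute equals `value`.
cond_counts = defaultdict(int)
for attrs, label in train:
    for i, value in enumerate(attrs):
        cond_counts[(i, value, label)] += 1

def classify(attrs):
    """Return the class Cj that maximizes P(Cj) * prod_i P(Ai | Cj)."""
    best_label, best_score = None, -1.0
    for label, n_label in class_counts.items():
        score = n_label / len(train)                            # P(Cj)
        for i, value in enumerate(attrs):
            score *= cond_counts[(i, value, label)] / n_label   # P(Ai | Cj)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify(("rainy", "mild")))   # -> 'yes'
```

One caveat visible in the sketch: if an attribute value never occurs with some class, its count ratio is zero and the whole product for that class collapses to zero; smoothing the estimates is the usual remedy.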