class_11_19 - Statistical Data Mining, ORIE 474, Fall 2007

Statistical Data Mining, ORIE 474, Fall 2007
Tatiyana V. Apanasovich
11/19/07
Bayesian Networks
Why the Excitement?
- What are they? Bayesian nets are a network-based framework for representing and analyzing models involving uncertainty.
- Where did they come from? Cross-fertilization of ideas among the artificial intelligence, decision analysis, and statistics communities.
- Why the sudden interest? The development of propagation algorithms, followed by the availability of easy-to-use commercial software and a growing number of creative applications.
- How are they different from other knowledge representation and probabilistic analysis tools? They differ from other knowledge-based systems because uncertainty is handled in a mathematically rigorous yet efficient and simple way, and from other probabilistic analysis tools because of the network representation of problems, the use of Bayesian statistics, and the synergy between the two.
Definition of a Bayesian Network
A factored joint probability distribution represented as a directed graph:
- a structure for representing knowledge about uncertain variables
- a computational architecture for computing the impact of evidence on beliefs
Knowledge structure:
- variables are depicted as nodes
- arcs represent probabilistic dependence between variables
- conditional probabilities encode the strength of the dependencies
Computational architecture:
- computes posterior probabilities given evidence about selected nodes
- exploits probabilistic independence for efficient computation
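The factorization can be made concrete with a small sketch. The network, variable names, and all probability values below are invented for illustration and are not from the lecture: a hypothetical three-node network A → B, A → C over binary variables, whose joint distribution factors as p(A, B, C) = p(A) p(B | A) p(C | A), and whose posterior p(A = 1 | C = 1) is obtained by brute-force enumeration.

```python
# Minimal sketch of a factored joint distribution for a hypothetical
# Bayesian network A -> B, A -> C with binary variables (0/1).
# All probability values below are invented for illustration.
from itertools import product

p_A = {1: 0.01, 0: 0.99}                      # prior p(A)
p_B_given_A = {1: {1: 0.9, 0: 0.1},           # p(B | A): outer key is the value of A
               0: {1: 0.2, 0: 0.8}}
p_C_given_A = {1: {1: 0.7, 0: 0.3},           # p(C | A): outer key is the value of A
               0: {1: 0.05, 0: 0.95}}

def joint(a, b, c):
    """Factored joint: p(A=a, B=b, C=c) = p(a) p(b|a) p(c|a)."""
    return p_A[a] * p_B_given_A[a][b] * p_C_given_A[a][c]

# Sanity check: the joint sums to 1 over all 8 configurations.
assert abs(sum(joint(a, b, c) for a, b, c in product((0, 1), repeat=3)) - 1.0) < 1e-12

# Posterior of A given evidence C = 1, by enumeration:
# p(A=1 | C=1) = sum_b joint(1, b, 1) / sum_{a,b} joint(a, b, 1)
num = sum(joint(1, b, 1) for b in (0, 1))
den = sum(joint(a, b, 1) for a, b in product((0, 1), repeat=2))
print("p(A=1 | C=1) =", num / den)
```

Enumeration like this is exponential in the number of variables; the propagation algorithms mentioned above exploit the independencies encoded by the arcs to avoid it.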
Joint Probability Distribution (JPD)
- P(A, B): the joint probability that both A and B occur.
- P(A | B): the conditional probability of A given that B has already happened.
- Since P(A, B) = P(A | B) P(B) and P(B) ≤ 1, it follows that P(A, B) ≤ P(A | B).
[Venn diagram of the events A and B]
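A quick numeric check of the relation above, using invented probabilities (not from the slide):

```python
# Illustration (invented numbers) of the relation between the joint
# P(A, B) and the conditional P(A | B):
#   P(A, B) = P(A | B) * P(B)  <=  P(A | B), because P(B) <= 1.
p_B = 0.4
p_A_given_B = 0.75
p_A_and_B = p_A_given_B * p_B            # joint, from the definition of conditioning

print("P(A, B)  =", round(p_A_and_B, 3))  # 0.3
print("P(A | B) =", p_A_given_B)          # 0.75
assert p_A_and_B <= p_A_given_B
```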
Bayes Rule
Based on the definition of conditional probability:
p(A | B) = p(A, B) / p(B) = p(B | A) p(A) / p(B)
For a partition A_1, ..., A_n of the sample space and evidence E:
p(A_i | E) = p(E | A_i) p(A_i) / p(E) = p(E | A_i) p(A_i) / Σ_j p(E | A_j) p(A_j)
- p(A_i | E) is the posterior probability given evidence E
- p(A_i) is the prior probability
- p(E | A_i) is the likelihood of the evidence given A_i
- p(E) is the preposterior probability of the evidence
[Diagram: a partition A_1, ..., A_6 of the sample space, with the evidence event E overlapping the cells]
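A small worked sketch of the rule for a hypothetical partition A_1, ..., A_6; the priors and likelihoods below are invented for illustration, and the preposterior p(E) is the normalizing sum in the denominator.

```python
# Bayes' rule over a hypothetical partition A_1, ..., A_6 of the sample space.
# Priors p(A_i) and likelihoods p(E | A_i) are invented for illustration.
priors = [0.30, 0.25, 0.20, 0.15, 0.07, 0.03]          # p(A_i), sums to 1
likelihoods = [0.10, 0.40, 0.05, 0.20, 0.90, 0.60]     # p(E | A_i)

# Preposterior probability of the evidence: p(E) = sum_i p(E | A_i) p(A_i)
p_E = sum(l * p for l, p in zip(likelihoods, priors))

# Posterior: p(A_i | E) = p(E | A_i) p(A_i) / p(E)
posteriors = [l * p / p_E for l, p in zip(likelihoods, priors)]

print("p(E) =", round(p_E, 4))
for i, post in enumerate(posteriors, start=1):
    print(f"p(A_{i} | E) = {post:.4f}")
assert abs(sum(posteriors) - 1.0) < 1e-12   # posteriors form a distribution
```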
        A1                    A0
        B1         B0         B1         B0
C1      0.00175    0.00175    0.009975   0.009975
C0      0.00075    (remaining C0 entries are not visible in the preview)
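As a sketch of how such a table is used, the code below assumes the visible entries are joint probabilities p(A, B, C) over binary variables (the slide does not label them, and the C0 row is truncated), and uses only the visible C1 row to marginalize out A and B and to apply Bayes' rule.

```python
# Sketch only: assumes the table entries are joint probabilities p(A, B, C).
# Keys are (A, B); values are the visible C1-row entries from the slide.
joint_C1 = {
    ("A1", "B1"): 0.00175,
    ("A1", "B0"): 0.00175,
    ("A0", "B1"): 0.009975,
    ("A0", "B0"): 0.009975,
}

# Marginal of C1: sum out A and B over the C1 row.
p_C1 = sum(joint_C1.values())                                   # 0.02345

# Posterior of A1 given C1, by Bayes' rule on the joint entries.
p_A1_and_C1 = sum(v for (a, _), v in joint_C1.items() if a == "A1")
print("p(C1)      =", p_C1)
print("p(A1 | C1) =", p_A1_and_C1 / p_C1)
```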