Probability-Theory

Machine Learning: Probability Theory
Sargur N. Srihari
srihari@cedar.buffalo.edu
Probability Theory with Several Variables

• Key concept is dealing with uncertainty, which arises from noise and from finite data sets
• Probability theory provides a framework for the quantification and manipulation of uncertainty
• Running example: two boxes of fruit; the red box holds 2 apples and 6 oranges, the blue box holds 3 apples and 1 orange
• Box is a random variable B (values r or b); fruit is a random variable F (values o or a)
• Let p(B = r) = 4/10 and p(B = b) = 6/10
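The setup can be sketched as a quick simulation. The priors and box contents below follow the example (red: 2 apples, 6 oranges; blue: 3 apples, 1 orange); the function and variable names are illustrative.

```python
import random

# Box priors from the example: p(B=r) = 4/10, p(B=b) = 6/10
priors = {"r": 0.4, "b": 0.6}
# Box contents: red box has 2 apples and 6 oranges, blue has 3 apples and 1 orange
boxes = {"r": {"a": 2, "o": 6}, "b": {"a": 3, "o": 1}}

def sample_fruit(rng):
    """Pick a box according to the prior, then a fruit uniformly from that box."""
    box = "r" if rng.random() < priors["r"] else "b"
    fruits = boxes[box]
    total = sum(fruits.values())
    fruit = "a" if rng.random() < fruits["a"] / total else "o"
    return box, fruit

rng = random.Random(0)
draws = [sample_fruit(rng) for _ in range(100_000)]
p_orange = sum(f == "o" for _, f in draws) / len(draws)
print(round(p_orange, 2))  # close to the exact value 9/20 = 0.45
```

Sampling the box first and then the fruit is exactly the generative process the later sum- and product-rule slides formalize.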
Probabilities of Interest

• Marginal probability: what is the probability of an apple?
• Conditional probability: given that we have an orange, what is the probability that we chose the blue box?
• Joint probability: what is the probability of an orange AND the blue box?
Sum Rule of Probability Theory

• Consider two random variables: X takes values x_i, i = 1, …, M, and Y takes values y_j, j = 1, …, L
• Perform N trials sampling both X and Y
• Let n_{ij} be the number of trials with X = x_i and Y = y_j, and c_i = \sum_j n_{ij} the number with X = x_i irrespective of Y
• Marginal probability:
  p(X = x_i) = c_i / N = \sum_{j=1}^{L} p(X = x_i, Y = y_j)
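A minimal sketch of the sum rule over a table of joint counts n_ij; the counts themselves are made up for illustration.

```python
# Hypothetical joint counts n[i][j]: row i is X = x_i, column j is Y = y_j
n = [
    [10, 5],   # trials with X=x_1 and Y=y_1, Y=y_2
    [20, 15],  # trials with X=x_2
    [30, 20],  # trials with X=x_3
]
N = sum(sum(row) for row in n)  # total number of trials

# c_i = sum_j n_ij, and the marginal is p(X=x_i) = c_i / N
c = [sum(row) for row in n]
p_X = [ci / N for ci in c]

# Sum rule: p(X=x_i) also equals sum_j p(X=x_i, Y=y_j)
p_joint = [[nij / N for nij in row] for row in n]
for i in range(len(n)):
    assert abs(p_X[i] - sum(p_joint[i])) < 1e-12

print(p_X)  # [0.15, 0.35, 0.5]
```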
Product Rule of Probability Theory

• Consider only those instances for which X = x_i
• The fraction of those instances for which Y = y_j is written p(Y = y_j | X = x_i), called the conditional probability
• Relationship between joint and conditional probability:
  p(Y = y_j | X = x_i) = n_{ij} / c_i
  p(X = x_i, Y = y_j) = n_{ij} / N = (n_{ij} / c_i)(c_i / N) = p(Y = y_j | X = x_i) p(X = x_i)
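A hypothetical table of counts can be used to check the product rule cell by cell, mirroring the derivation above.

```python
# Hypothetical counts n_ij from N trials (values chosen for illustration)
n = [[10, 5], [20, 15], [30, 20]]
N = sum(sum(row) for row in n)

for row in n:
    c_i = sum(row)                # number of trials with X = x_i
    p_x = c_i / N                 # marginal p(X = x_i)
    for n_ij in row:
        p_y_given_x = n_ij / c_i  # conditional p(Y = y_j | X = x_i)
        p_joint = n_ij / N        # joint p(X = x_i, Y = y_j)
        # Product rule: joint = conditional * marginal
        assert abs(p_joint - p_y_given_x * p_x) < 1e-12

print("product rule holds for every cell")
```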
Bayes' Theorem

• From the product rule, together with the symmetry property p(X, Y) = p(Y, X), we get
  p(Y | X) = p(X | Y) p(Y) / p(X)
  which is called Bayes' theorem
• Using the sum rule, the denominator is expressed as
  p(X) = \sum_Y p(X | Y) p(Y)
• The denominator acts as a normalization constant, ensuring that the conditional probability on the LHS sums to 1 over all values of Y
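Bayes' theorem can be sketched as a small function; the likelihood and prior values in the usage example are hypothetical.

```python
def bayes_posterior(likelihood, prior):
    """Return the posterior p(Y | X=x) from likelihoods p(X=x | Y=y) and prior p(Y).

    `likelihood` and `prior` are dicts keyed by the values of Y.
    The evidence p(X=x) = sum_y p(X=x | Y=y) p(Y=y) normalizes the result.
    """
    evidence = sum(likelihood[y] * prior[y] for y in prior)
    return {y: likelihood[y] * prior[y] / evidence for y in prior}

# Hypothetical two-valued example; the posterior sums to 1 by construction
post = bayes_posterior({"y1": 0.9, "y2": 0.2}, {"y1": 0.3, "y2": 0.7})
assert abs(sum(post.values()) - 1.0) < 1e-12
```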
Rules of Probability

Given random variables X and Y:
• Sum rule gives the marginal probability:
  p(X) = \sum_Y p(X, Y)
• Product rule gives the joint probability in terms of conditional and marginal:
  p(X, Y) = p(Y | X) p(X)
• Combining the two gives Bayes' rule:
  p(Y | X) = p(X | Y) p(Y) / p(X), where p(X) = \sum_Y p(X | Y) p(Y)
• Viewed as: posterior ∝ likelihood × prior
Joint Distribution over Two Variables

• N = 60 data points; X takes nine possible values, Y takes two values
• [Figure: histograms of Y, of X, and of X given Y = 1, showing the fraction of data points having each value]
• These fractions approach the corresponding probabilities as N → ∞
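The convergence of fractions to probabilities can be illustrated with a quick simulation; the value p(Y = 1) = 0.4 below is an assumed, illustrative probability, not taken from the figure.

```python
import random

# As N grows, the fraction of samples with Y = 1 approaches p(Y = 1).
rng = random.Random(1)
p_y1 = 0.4  # assumed true probability, for illustration only

def fraction_y1(n_samples):
    """Fraction of n_samples draws for which Y = 1."""
    hits = sum(rng.random() < p_y1 for _ in range(n_samples))
    return hits / n_samples

small = fraction_y1(60)         # noisy, like the N = 60 histogram
large = fraction_y1(1_000_000)  # close to the true probability
assert abs(large - p_y1) < 0.01
```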
Bayes' Rule Applied to the Fruit Problem

• Probability that the fruit is orange, from the sum and product rules:
  p(F = o) = p(F = o, B = r) + p(F = o, B = b)
           = p(F = o | B = r) p(B = r) + p(F = o | B = b) p(B = b)
           = (6/8)(4/10) + (1/4)(6/10) = 9/20 = 0.45
• Probability that the box is red given that the fruit picked is orange:
  p(B = r | F = o) = p(F = o | B = r) p(B = r) / p(F = o)
                   = (6/8)(4/10) / (9/20) = 2/3 ≈ 0.67
• The posterior probability of 2/3 differs from the prior probability p(B = r) = 0.4
• The marginal probability of 0.45 is lower than the overall fraction of oranges, 7/12 ≈ 0.58
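The arithmetic above can be checked with exact fractions; the values are the ones from the example.

```python
from fractions import Fraction

p_B = {"r": Fraction(4, 10), "b": Fraction(6, 10)}        # priors p(B)
p_o_given_B = {"r": Fraction(6, 8), "b": Fraction(1, 4)}  # likelihoods p(F=o | B)

# Sum and product rules give the marginal p(F = o)
p_o = sum(p_o_given_B[b] * p_B[b] for b in p_B)
assert p_o == Fraction(9, 20)

# Bayes' rule gives the posterior p(B = r | F = o)
p_r_given_o = p_o_given_B["r"] * p_B["r"] / p_o
assert p_r_given_o == Fraction(2, 3)

print(float(p_o), float(p_r_given_o))  # 0.45 and about 0.667
```

Working in `Fraction` rather than floats reproduces the slide's exact values 9/20 and 2/3 with no rounding.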