Week 2
Probability theory
Bayes’ Nets
Last week
• Talked about how to formulate an AI problem
• Define your variables
• Determine desired actions
• Build an agent that produces the desired action given input
  • In this course, mostly through learning
• Let’s look at reasoning…
Logical reasoning
• In 630, we talked about logic

  Q: Does everyone love John?
  John loves John
  Betty loves John
  Mary loves John
Logical reasoning
• Here, the “variables” are logic sentences

  S1: loves(M,J)   Mary loves John
  S2: loves(B,J)   Betty loves John
  S3: loves(J,J)   John loves John

  Q: Does everyone love John?
  S1 ∧ S2 ∧ S3 ⇒ ∀x loves(x,J)
Logical reasoning
• Values are true, false

  S1: loves(M,J)   Mary loves John    true
  S2: loves(B,J)   Betty loves John   true
  S3: loves(J,J)   John loves John    true

  Q: Does everyone love John?
  S1 ∧ S2 ∧ S3 ⇒ ∀x loves(x,J)
  true (if the world consists only of M, B, J)
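In a world with only the three individuals M, B, and J, the universally quantified sentence is just a finite conjunction, so the entailment can be checked by brute force. A minimal Python sketch (the dictionary and variable names here are invented for illustration; the slide specifies no implementation):

```python
# The known facts S1, S2, S3: who loves John in this world
loves_john = {"M": True, "B": True, "J": True}

# If the world consists only of M, B, and J, then
# "forall x. loves(x, J)" reduces to loves(M,J) ^ loves(B,J) ^ loves(J,J).
everyone_loves_john = all(loves_john[x] for x in ("M", "B", "J"))

print(everyone_loves_john)  # True: S1 ^ S2 ^ S3 entails the universal claim
```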
Reasoning with uncertainty
• What if we don’t know the answers?

  P(loves(M,J))   It’s likely Mary loves John
  P(loves(B,J))   I’m not sure if Betty loves John
  P(loves(J,J))   John almost certainly loves John

  Q: Does everyone love John?
  P(∀x loves(x,J))
Reasoning with uncertainty
• Can give the probability of values

  P(loves(M,J)=true) = 0.8    It’s likely Mary loves John
  P(loves(B,J)=true) = 0.5    I’m not sure if Betty loves John
  P(loves(J,J)=true) = 0.95   John almost certainly loves John

  Q: Does everyone love John?
  P(∀x loves(x,J) = true) = 0.8 × 0.5 × 0.95 = 0.38
  (assuming the above are independent)
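Under the independence assumption, the probability of the conjunction is just the product of the three individual probabilities from the slide. A quick sketch:

```python
from math import prod

# Probabilities of the individual sentences (numbers from the slide)
p = {"loves(M,J)": 0.8, "loves(B,J)": 0.5, "loves(J,J)": 0.95}

# Independence lets us multiply: P(A ^ B ^ C) = P(A) * P(B) * P(C)
p_everyone = prod(p.values())
print(round(p_everyone, 2))  # 0.38
```

Note that this shortcut is only valid because of the independence assumption; if, say, Betty's feelings depended on Mary's, we would need the joint distribution instead.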
Where do probabilities come from?
• From life experience
• From guessing
• From controlled sample pools
• The quality of the judgments made using this data will depend on the sample that the probabilities came from
• How well does the source match the test conditions?
  • e.g., language statistics from newswire applied to children’s books
What are probabilities in terms of logic?
• Probabilities describe the degree of belief in a particular proposition
• No longer just true or false
• “The chance of rain today is 10%”
  P(rain) = 0.1
• “80% of the time, squealing indicates bad brakes”
  … means that we believe that 80% of the time, Squeal ⇒ BadBrakes
• It is not that the proposition is x% true
  • P(rain) = 0.1 does not mean it is “raining 10%”
Random variables
• In order to determine the probability of events, we have to know how many different possibilities there are
• A random variable takes on one of a set of values
  • 6-sided die roll: Roll=1, Roll=2, …, Roll=6
  • Squealing: Squeal=true, Squeal=false
• Random variables have three components:
  • The name of the variable
  • The range of its elements
  • A probability associated with each element
    • This is called a probability distribution
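The three components (name, range, probability per element) can be sketched as a plain dictionary mapping each value in the range to its probability. The fair-die numbers below are an assumption for the Roll example; the slide does not say the die is fair:

```python
# The random variable Roll: range {1,...,6}, assumed fair (1/6 each)
roll_dist = {face: 1 / 6 for face in range(1, 7)}

# A valid probability distribution assigns each element a probability
# in [0, 1], and the probabilities sum to 1.
assert all(0.0 <= pr <= 1.0 for pr in roll_dist.values())
assert abs(sum(roll_dist.values()) - 1.0) < 1e-9

print(roll_dist[3])  # P(Roll=3), i.e. 1/6
```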
Random variables
• Typically written with a capital letter (particularly …)
• 3 types, depending on domain:
  • boolean: <true, false>
    • Logical propositions
    • Can abbreviate P(Rain=true) = P(rain), P(Rain=false) = P(~rain)
  • discrete: <a, b, c, d>
  • continuous: [0, 1]
• Examples of each?
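As one possible set of answers, here are sketches of the three domain types in Python. The specific numbers are invented for illustration, except P(rain) = 0.1 from the earlier slide:

```python
# boolean: range <true, false>; P(rain) = 0.1 from the rain example above
rain = {True: 0.1, False: 0.9}
assert abs(rain[True] + rain[False] - 1.0) < 1e-12  # P(rain) + P(~rain) = 1

# discrete: a small finite range (weather categories are an assumption)
weather = {"sun": 0.6, "rain": 0.1, "cloud": 0.29, "snow": 0.01}
assert abs(sum(weather.values()) - 1.0) < 1e-9

# continuous: a variable over [0, 1] needs a density, not a table;
# e.g. the uniform density f(x) = 1 on [0, 1]. Probabilities then
# attach to intervals (areas under f), not to individual points.
def uniform_density(x: float) -> float:
    return 1.0 if 0.0 <= x <= 1.0 else 0.0
```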
Unconditional probabilities
Fall ’08
Eric Fosler-Lussier