CSci 5512: Artificial Intelligence

Instructor: Arindam Banerjee

January 18, 2012

Course Work: Homeworks
- There will be 5 homeworks
- Homework 0 will not count towards your grade
  - You should be able to solve all problems in HW0
  - If you have trouble in HW0, you will have trouble later
- Homework submissions:
  - Using the submit system, only pdf (encouraged)
  - Paper copy at the beginning of the class
- All programming in Matlab, Java, C, or Python
  - Follow the instructions for programming assignments
  - Other languages will not receive any credit
- OK to discuss with others, but you have to write up on your own; put down the name(s) of the people you discussed with
General Information
- Course Number: CSci 5512
- Class: Mon Wed 12:45-02:00 pm
- Location: Keller Hall 3-111
- Instructor: Arindam Banerjee, banerjee@cs.umn.edu
- TA: James Parker, jparker@cs.umn.edu
- Office Hours:
  - Arindam: EE/CS 6-213, Mon Wed 02:00-03:00 pm
  - James: EE/CS 2-216, Thu 10:00-11:00 am, Fri 01:00-02:00 pm
- Web page: http://www-users.itlabs.umn.edu/classes/Spring-2012/csci5512

Course Work: Homeworks (Contd.)
- Dates:
  - HW 0: Posted Jan 18 (Wed), due Jan 23 (Mon) at 12:45 pm
  - HW 1: Jan 30 (Mon), due Feb 13 (Mon) at 12:45 pm
  - HW 2: Feb 20 (Mon), due Mar 05 (Mon) at 12:45 pm
  - HW 3: Mar 26 (Mon), due Apr 09 (Mon) at 12:45 pm
  - HW 4: Apr 16 (Mon), due May 30 (Mon) at 12:45 pm
- Late submission policy:
  - Late by 0-24 hrs: 25% deducted from actual score
  - Late by 24-48 hrs: 50% deducted from actual score
  - Late by more than 48 hrs: will receive a zero
  - All late submissions must be submitted in pdf using Submit
Course Work: Exams, etc.
- Exams:
  - Mid-Term: Mar 26 (Mon), in class
  - Final: May ?? (??), ?:00-?:00
  - Closed book, closed notes exam
  - Allowed 1 sheet for the midterm, 2 sheets for the final
- Participation:
  - Ask questions
  - Participate in discussions
- Web Links:
  - Online submission for homeworks
  - Bulletin board for discussions

Topics
- Quantifying Uncertainty
- Probabilistic Reasoning
- Probabilistic Reasoning over Time
- Making Simple Decisions
- Making Complex Decisions
- Learning from Examples
- Learning Probabilistic Models
- Reinforcement Learning
- Natural Language Processing

Grading
- Homework: 50% = 4 x 12.5%
- Mid-Term: 20%
- Final: 25%
- Participation: 5%
- In order to pass the course:
  - Average on the exams (midterm and final) must be at least 50%
  - Overall score must be at least 50%

Uncertainty
- Uncertainty is inherent in decision problems:
  - Partial knowledge of the environment
  - Environment may be complex or stochastic
  - Existence of other agents
- First-order logic is inappropriate for such domains
- Several different events are possible; each event
  - has a different "probability" of happening
  - has a different "utility" or "payoff"
- Rational decisions maximize expected utility
- Decision Theory ≡ Utility Theory + Probability Theory

Examples of Decision Problems
- Game of Monopoly
- Pursuit with constraints
- Chasing in Manhattan
- Robotic teams for search/rescue
- The Stock Market

Random Variables
- Random variables are mappings of events (to real numbers)
  - Mapping X: Ω → R
  - Any event ω maps to X(ω)
- Example: tossing a coin has two possible outcomes
  - Denoted by {H, T} or {0, 1}
  - A fair coin has uniform probabilities:

    P(X = 0) = 1/2,  P(X = 1) = 1/2
- Random variables (r.v.s) can be
  - Discrete, e.g., Bernoulli
  - Continuous, e.g., Gaussian
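The fair-coin random variable above can be illustrated with a short sketch in Python (one of the course's accepted languages); the sample size, seed, and function name below are illustrative choices, not part of the slides:

```python
import random

def bernoulli_samples(p=0.5, n=100_000, seed=0):
    """Draw n samples of a Bernoulli(p) r.v., mapping {H, T} -> {1, 0}."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

samples = bernoulli_samples()
p_hat_1 = sum(samples) / len(samples)   # empirical estimate of P(X = 1)
p_hat_0 = 1 - p_hat_1                   # empirical estimate of P(X = 0)
print(p_hat_0, p_hat_1)                 # both should be close to 1/2
```

With a large n, both empirical frequencies concentrate near the true value 1/2, matching the uniform probabilities of a fair coin.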
Probability
- Sample space Ω of events
- Each "event" ω ∈ Ω has an associated "measure"
  - Probability of the event: P(ω)
- Axioms of probability:
  - ∀ω, P(ω) ∈ [0, 1]
  - P(Ω) = 1
  - P(ω1 ∪ ω2) = P(ω1) + P(ω2) − P(ω1 ∩ ω2)
- Note: we are being informal

Distribution Functions
- For a continuous r.v.:
  - Distribution function F(x) = P(X ≤ x)
  - Corresponding density function f(x) dx = dF(x)
  - Note that

    F(x) = ∫_{t=−∞}^{x} f(t) dt
- For a discrete r.v.:
  - Probability mass function f(x) = P(X = x) = P(x)
  - We will call this the probability of a discrete event
  - Distribution function F(x) = P(X ≤ x)
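The relation F(x) = ∫ f(t) dt can be checked numerically for the Gaussian case mentioned earlier. The sketch below is an assumption-free illustration using only the standard library: a midpoint Riemann sum of the standard normal density is compared against the closed-form CDF via `math.erf` (the integration bounds and step count are arbitrary choices):

```python
import math

def phi(t):
    """Standard normal density f(t)."""
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def F_numeric(x, lo=-10.0, steps=200_000):
    """Approximate F(x) = integral of f(t) dt from -inf to x by a midpoint sum."""
    h = (x - lo) / steps
    return sum(phi(lo + (i + 0.5) * h) for i in range(steps)) * h

def F_exact(x):
    """Closed-form standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

print(F_numeric(1.0), F_exact(1.0))  # the two values agree closely
```

The two computations agree to many decimal places, confirming that the distribution function is the integral of the density.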
Joint Distributions, Marginals
- For two continuous r.v.s X1, X2:
  - Joint distribution F(x1, x2) = P(X1 ≤ x1, X2 ≤ x2)
  - Joint density function f(x1, x2) can be defined as before
  - The marginal probability density:

    f(x1) = ∫_{x2=−∞}^{∞} f(x1, x2) dx2
- For two discrete r.v.s X1, X2:
  - Joint probability f(x1, x2) = P(X1 = x1, X2 = x2) = P(x1, x2)
  - The marginal probability:

    P(X1 = x1) = Σ_{x2} P(X1 = x1, X2 = x2)
- Many hard problems involve computing marginals

Independence
- Examples:
  - X1, X2 are two different dice
  - X1 denotes if the grass is wet, X2 denotes if the sprinkler was on
- Two r.v.s are independent if

  P(X1 = x1, X2 = x2) = P(X1 = x1) P(X2 = x2)
- Two different dice are independent
- If the sprinkler was on, then the grass will be wet ⇒ dependent
- Can be extended to joint distributions over several r.v.s
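The two-dice example can be worked through concretely. This sketch builds the joint table P(X1 = i, X2 = j) = 1/36 for two fair dice, computes the marginals by summing out the other variable, and checks the independence condition (exact arithmetic via `fractions.Fraction` is an implementation choice, not from the slides):

```python
from fractions import Fraction

# Joint distribution of two fair dice: P(X1 = i, X2 = j) = 1/36 for i, j in 1..6
joint = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

# Marginals: P(X1 = i) = sum over x2 of P(X1 = i, X2 = x2), and symmetrically
marg1 = {i: sum(p for (a, b), p in joint.items() if a == i) for i in range(1, 7)}
marg2 = {j: sum(p for (a, b), p in joint.items() if b == j) for j in range(1, 7)}

print(marg1[3])  # 1/6

# Independence: P(X1 = i, X2 = j) == P(X1 = i) P(X2 = j) for all i, j
assert all(joint[(i, j)] == marg1[i] * marg2[j]
           for i in range(1, 7) for j in range(1, 7))
```

Each marginal comes out to 1/6, and the joint equals the product of marginals everywhere, so the two dice are independent, as the slide states.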
Expectation
- The expected value of a r.v. X:
  - For continuous r.v.s: E[X] = ∫ x p(x) dx
  - For discrete r.v.s: E[X] = Σ_i x_i p_i
- Expectation is a linear operator:

  E[aX + bY + c] = a E[X] + b E[Y] + c

Conditional Probability, Bayes Rule

                   Grass Wet   Grass Dry
    Sprinkler On      0.4
    Sprinkler Off     0.2

- Inference problems:
  - Given 'grass wet', what is P('sprinkler on' | 'grass wet')?
  - Given 'symptom', what is P('disease' | 'symptom')?
- For any r.v.s X, Y, the conditional probability:

  P(x|y) = P(x, y) / P(y)
- Since P(x, y) = P(y|x) P(x), we have

  P(y|x) = P(x|y) P(y) / P(x)
- Expressing the 'posterior' in terms of the 'conditional': Bayes Rule
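The sprinkler inference can be sketched numerically. The joint table in the slide is incomplete, so the numbers below are purely hypothetical stand-ins chosen to sum to 1; the computation itself is just the definition P(x|y) = P(x, y) / P(y):

```python
# Hypothetical joint probabilities P(Sprinkler, Grass) -- illustrative
# numbers only, NOT the (incomplete) table from the slide.
joint = {
    ("on", "wet"): 0.40, ("on", "dry"): 0.10,
    ("off", "wet"): 0.10, ("off", "dry"): 0.40,
}

# Marginal: P(wet) = P(on, wet) + P(off, wet)
p_wet = joint[("on", "wet")] + joint[("off", "wet")]

# Conditional: P('sprinkler on' | 'grass wet') = P(on, wet) / P(wet)
p_on_given_wet = joint[("on", "wet")] / p_wet
print(p_on_given_wet)  # 0.8 with these numbers
```

With these made-up numbers, observing wet grass raises the probability that the sprinkler was on from the prior 0.5 to the posterior 0.8, which is exactly the 'posterior from conditional' reading of Bayes rule.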
Product Rule & Independence
- Product Rule:
  - For X1, X2: P(X1, X2) = P(X1) P(X2|X1)
  - For X1, X2, X3: P(X1, X2, X3) = P(X1) P(X2|X1) P(X3|X1, X2)
  - In general, the chain rule:

    P(X1, ..., Xn) = Π_{i=1}^{n} P(Xi | X1, ..., X_{i−1})
- Joint distribution of n Boolean variables:
  - Specification requires 2^n − 1 parameters
- Recall independence:
  - For X1, X2: P(X1, X2) = P(X1) P(X2)
  - In general:

    P(X1, ..., Xn) = Π_{i=1}^{n} P(Xi)
- Independence reduces the specification to n parameters

Conditional Independence
- X and Y are conditionally independent given Z if

  P(X, Y | Z) = P(X|Z) P(Y|Z)
- Example:

  P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
- Conditional independence simplifies joint distributions:

  P(X, Y, Z) = P(Z) P(X|Z) P(Y|Z)
- Often reduces from exponential to linear in n
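The chain rule holds for any joint distribution, which this sketch checks on an arbitrary joint over three binary variables (the specific weights are arbitrary illustrative numbers):

```python
from itertools import product

# An arbitrary normalized joint distribution over three binary variables
raw = {xs: w for xs, w in zip(product([0, 1], repeat=3), [3, 1, 4, 1, 5, 9, 2, 6])}
Z = sum(raw.values())
P = {xs: w / Z for xs, w in raw.items()}

def marg(P, idx, vals):
    """Marginal P(X_idx = vals) obtained by summing out the other variables."""
    return sum(p for xs, p in P.items()
               if all(xs[i] == v for i, v in zip(idx, vals)))

# Chain rule: P(x1, x2, x3) = P(x1) P(x2|x1) P(x3|x1, x2),
# with each conditional computed as a ratio of marginals.
for (x1, x2, x3), p in P.items():
    p1 = marg(P, [0], [x1])
    p12 = marg(P, [0, 1], [x1, x2])
    chain = p1 * (p12 / p1) * (p / p12)
    assert abs(chain - p) < 1e-12
print("chain rule verified on all 8 entries")
```

Note that this full joint needs 2^3 − 1 = 7 free parameters, whereas if the three variables were independent, 3 parameters would suffice, which is the exponential-to-linear saving the slide refers to.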
Independence
- Consider 4 variables: Toothache, Catch, Cavity, Weather
- Independence implies

  P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch, Cavity) P(Weather)
- In terms of the joint distribution: 32 parameters reduced to 12
- For boolean variables, 2^n − 1 reduces to n
- Absolute independence is powerful but rare

Naive Bayes Model
- If X1, ..., Xn are independent given Y:

  P(Y, X1, ..., Xn) = P(Y) Π_{i=1}^{n} P(Xi | Y)
- Example: Cavity decomposes into Toothache, Catch:

  P(Cavity, Toothache, Catch) = P(Cavity) P(Toothache | Cavity) P(Catch | Cavity)
- More generally:

  P(Cause, Effect1, ..., Effectn) = P(Cause) Π_{i=1}^{n} P(Effecti | Cause)
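The Cavity factorization above can be sketched in code. The conditional-probability numbers below are illustrative assumptions (the slides give none); the point is that 1 + 2 + 2 = 5 parameters suffice to define all 2^3 joint entries, which still sum to 1:

```python
# Naive Bayes factorization for the Cavity example.
# All CPT numbers are illustrative assumptions, not from the slides.
p_cavity = {True: 0.2, False: 0.8}            # P(Cavity)
p_toothache_given = {True: 0.6, False: 0.1}   # P(Toothache = true | Cavity)
p_catch_given = {True: 0.9, False: 0.2}       # P(Catch = true | Cavity)

def joint(cavity, toothache, catch):
    """P(Cavity, Toothache, Catch) = P(Cavity) P(Toothache|Cavity) P(Catch|Cavity)."""
    pt = p_toothache_given[cavity] if toothache else 1 - p_toothache_given[cavity]
    pc = p_catch_given[cavity] if catch else 1 - p_catch_given[cavity]
    return p_cavity[cavity] * pt * pc

# All 8 joint entries are induced by only 5 specified parameters,
# and they form a valid distribution (they sum to 1).
total = sum(joint(c, t, k)
            for c in (True, False) for t in (True, False) for k in (True, False))
print(total)
```

This is the linear-in-n saving: for n boolean effects, the naive Bayes model needs 1 + 2n parameters instead of the 2^(n+1) − 1 required by an unconstrained joint.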