Lecture28HO - CS440/ECE448 Intro to Artificial Intelligence...

Lecture 28: (the last one…) Review session
Prof. Julia Hockenmaier
[email protected]
http://cs.illinois.edu/fa11/cs440
CS440/ECE448: Intro to Artificial Intelligence

Announcements:
- Monday, May 2, 4 pm in 1404 Siebel Center:
  Natural Language Applications Across Genres: From News to Novels
  Prof. Kathleen McKeown, Columbia University
- Monday, May 2, 6 pm in 2405 Siebel Center:
  Attending Graduate School: A panel discussion
- Tuesday, May 3, 10 am in 2405 Siebel Center:
  Machine Learning - Modern Times
  Dr. Corinna Cortes (Head of Google Research, NY)

Class admin: Exams
- Final exam: Friday, May 13, 7 pm in 1404
- Conflict exam: Thursday, May 12, 10 am in 3401
- 4th credit hour presentation/demo: Thursday, May 12, 2 pm – 3 pm

Class admin: Office hours, review sessions
- Last set of extra credit problem set office hours:
  Monday, May 1, 1 pm – 3 pm, 3405
  Wednesday, May 3, 11 am – 1 pm, 3405
- Additional office hours:
  Monday, May 9, 1 pm – 3 pm (Parisa Haghani)
  Tuesday, May 10, 2 pm – 3 pm, 3324
  Wednesday, May 11, 2 pm – 3 pm, 3324 (Julia Hockenmaier)
  Thursday, May 13, 1 pm – 3 pm (Yonatan Bisk)
Review: probability questions
1. How many parameters do you need to specify the full joint P(A,B,C)?
2. How can you compute P(A,B,C) using the chain rule?
3. How many parameters do you need if you assume A, B, C are independent? If you assume A and B are conditionally independent given C?
4. How do you compute P(A) from the full joint P(A,B,C)?

Joint and conditional probabilities
Joint: P(A, B) = P(A | B) P(B)   (product rule)
Conditional: P(A | B) = P(A, B) / P(B)

Joint distributions P(X_1, …, X_i, …, X_n)
How many parameters does P(A, B) have?
Answer: K_A × K_B parameters (here, K is the number of possible outcomes).
In general: the joint distribution of n discrete random variables X_1, …, X_i, …, X_n with K_i possible outcomes each has K_1 × … × K_i × … × K_n parameters.
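
As a concrete check on these counts, here is a minimal sketch (assuming binary-valued A, B, and C, which the slides leave open, and counting free parameters, i.e. subtracting the one implied entry per distribution) that works out the answers to review questions 1 and 3:

```python
# Parameter counts for three binary random variables A, B, C.
# Assumption: each variable has K = 2 outcomes (not specified in the slides).

K_A, K_B, K_C = 2, 2, 2

# Q1: the full joint P(A,B,C) has K_A * K_B * K_C entries,
# one of which is implied because they must sum to 1.
full_joint = K_A * K_B * K_C - 1                              # 8 - 1 = 7

# Q3a: full independence, P(A,B,C) = P(A) P(B) P(C).
independent = (K_A - 1) + (K_B - 1) + (K_C - 1)               # 1 + 1 + 1 = 3

# Q3b: A and B conditionally independent given C,
# P(A,B,C) = P(C) P(A|C) P(B|C).
cond_indep = (K_C - 1) + K_C * (K_A - 1) + K_C * (K_B - 1)    # 1 + 2 + 2 = 5

print(full_joint, independent, cond_indep)                    # 7 3 5
```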
Chain rule
Extends the product rule to multiple variables:
P(X_1, …, X_n) = P(X_1) × P(X_2 | X_1) × P(X_3 | X_1, X_2) × … × P(X_i | X_1, …, X_{i-1}) × … × P(X_n | X_1, …, X_{n-1})

The parameters of a distribution
How many numbers do we need to specify a distribution?
Bernoulli distribution: 1 parameter (two, but one is implied)
Categorical distribution: N-1 parameters (N, but one is implied)
Joint distribution of N Bernoulli RVs: 2^N - 1 parameters (2^N, but one is implied)

Parameters of conditional distributions
How many distributions does P(A | B) stand for?
Answer: K_B; one for each possible value of B.
How many parameters does P(A | B = b) have?
Answer: K_A; one for each possible value of A (minus one implied parameter).
So P(A | B) has K_A × K_B parameters.

Marginal distributions
If we only know the full joint P(X, Y), we can still compute P(X):
P(X) = Σ_y P(X, Y = y) = Σ_y P(X | Y = y) P(Y = y)
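
To make these identities concrete, here is a small sketch over a made-up joint distribution of two binary variables (the numbers are illustrative, not from the slides); it checks the two-variable case of the chain rule (the product rule) and computes the marginal P(X) both directly and via Σ_y P(X | Y = y) P(Y = y):

```python
# A toy joint distribution P(X, Y) over binary X, Y (made-up numbers).

joint = {            # P(X=x, Y=y)
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.20, (1, 1): 0.40,
}

def p_y(y):
    """Marginal P(Y=y) = sum_x P(X=x, Y=y)."""
    return sum(p for (x, yy), p in joint.items() if yy == y)

def p_x_given_y(x, y):
    """Conditional P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y)."""
    return joint[(x, y)] / p_y(y)

# Marginal P(X=x) two ways: directly, and via sum_y P(X|Y=y) P(Y=y).
for x in (0, 1):
    direct = sum(p for (xx, y), p in joint.items() if xx == x)
    via_cond = sum(p_x_given_y(x, y) * p_y(y) for y in (0, 1))
    assert abs(direct - via_cond) < 1e-12
    print(f"P(X={x}) = {direct:.2f}")

# Product rule check: P(X, Y) = P(Y) * P(X | Y) for every entry.
for (x, y), p in joint.items():
    assert abs(p - p_y(y) * p_x_given_y(x, y)) < 1e-12
```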
Independence
Two random variables X and Y are independent if
P(X, Y) = P(X) P(Y), or equivalently P(X | Y) = P(X).
Two random variables X and Y are conditionally independent given Z if
P(X, Y | Z) = P(X | Z) P(Y | Z).
If we assume X, Y, Z are independent, we can factor the distribution:
P(X, Y, Z) = P(X) × P(Y) × P(Z)
How many parameters do we need to know to specify P(X, Y, Z) if we assume they are independent?
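
A minimal sketch of this assumption, with made-up marginals (not from the slides): the 8-entry joint over three binary variables is built from only three free parameters, and the independence property P(X | Y) = P(X) follows.

```python
from itertools import product

# Made-up marginals for three binary variables assumed independent.
# Each marginal has one free parameter (the other entry is implied).
p_x = {0: 0.7, 1: 0.3}
p_y = {0: 0.4, 1: 0.6}
p_z = {0: 0.9, 1: 0.1}

# Under the independence assumption the joint factors:
# P(X,Y,Z) = P(X) P(Y) P(Z)  -- 8 entries built from 3 free parameters.
joint = {(x, y, z): p_x[x] * p_y[y] * p_z[z]
         for x, y, z in product((0, 1), repeat=3)}

assert abs(sum(joint.values()) - 1.0) < 1e-12

# Independence also implies P(X=x | Y=y) = P(X=x) for all x, y.
for x, y in product((0, 1), repeat=2):
    p_xy = sum(p for (xx, yy, _), p in joint.items() if xx == x and yy == y)
    assert abs(p_xy / p_y[y] - p_x[x]) < 1e-12
```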
