CSE 150. Assignment 6
Out: Thu Mar 07
Due: Thu Mar 14
Reading: Sutton & Barto, Chapters 1-4.
6.1
CAPE Survey
You should have received an email from CAPE asking you to evaluate this course. Please complete the online survey if you have not already done so.
NAME:_
LOGIN:_
Signature:_
Computer Science and Engineering 150
Programming Languages for Artificial Intelligence
Thursday, May 9, 2007
MIDTERM EXAM
DO NOT TURN THIS PAGE UNTIL YOU ARE TOLD TO START!
Please DO NOT put your name at the top of each page.
CSE 150. Assignment 2
Out: Tue Jan 22
Due: Tue Jan 29
Reading: Russell & Norvig, Chapter 14; Korb & Nicholson, Chapter 2.
2.1
Probabilistic reasoning
A patient is known to have contracted a rare disease which comes in two forms, represented by the values
CSE 150 Homework 1
Out: Tue Aug 5
Due: Fri Aug 8 @ 5pm
1.1
Kullback-Leibler distance
Often it is useful to measure the difference between two probability distributions over the same random
variable. For example, as shorthand let
pi = P (X = xi |E),
qi = P (
CSE 20
Lecture 11: Proof Techniques
CSE 20: Lecture 11
Challenge Problem
Problem 6 of Assignment 2.
Please submit your solution by next Monday.
CSE 20: Lecture 11
Quiz 2 Problems
2. Convert the following statement into
CSE 20: Quiz 2b
Total marks: 50
Total Time: 30 minutes
25th April, 2014
Name:
PID:
1. (5 + 15 Marks)
(a) If a and b are two rational numbers then prove that a + b is also rational.
(b) If a is a rational number and b is not a rational number prove that a
CSE 20: Midterm - Section A
Maximum marks: 100
Total Time: 50 minutes
9th May, 2014
Name:
PID:
1. (20 Marks) If p and q are two primes such that p ≠ q and 6 divides (p + q), prove that 6 does
not divide (p^2 + q^2).
2. (20 + 20 Marks) A number is rational
CSE 150. Assignment 2
Winter 2015
Out: Tue Jan 20
Due: Tue Jan 27
Reading: RN, Chapter 14; KN, Chapter 2.
2.1
Probabilistic reasoning
A patient is known to have contracted a rare disease which comes in two forms, represented by the values
of a binary rand
CSE 150. Assignment 1
Winter 2015
Out: Tue Jan 13
Due: Tue Jan 20 (in class)
Supplementary reading: RN, Ch 13; KN, Ch 1.
1.1
Kullback-Leibler distance
Often it is useful to measure the difference between two probability distributions over the same random
CSE 150. Intro to AI!
Probabilistic Reasoning
and Decision-Making!
Welcome to CSE 150!
I've always considered the most boring 20 minutes of
the semester the time I spend reading the syllabus on
the first day of class.
Students come in, potentially excited
Mathematical Database
MATHEMATICAL INDUCTION
1. Introduction
Mathematics distinguishes itself from the other sciences in that it is built upon a set of axioms
and definitions, on which all subsequent theorems rely. All theorems can be derived, or proved,
CSE 20 - Spring 2014
Quiz 1
Quiz 1b Solutions
1
Convert [31021]4 to base 5.
Let us first convert the number from base 4 to base 10.
[31021]4 = 841 in base 10. Let us then convert 841 into base 5.
841 = [11331]5 .
2. What is the sum of [24665]7 and [63606]7 in
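The two-step conversion in the solution above (base 4 to base 10, then base 10 to base 5) can be sketched in Python; the function names are illustrative:

```python
def to_base10(digits, base):
    """Interpret a list of digits (most significant first) in the given base."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

def from_base10(value, base):
    """Convert a non-negative integer to a digit list in the given base."""
    if value == 0:
        return [0]
    digits = []
    while value > 0:
        digits.append(value % base)
        value //= base
    return digits[::-1]

# [31021]_4 -> 841 -> [11331]_5, matching the solution above.
n = to_base10([3, 1, 0, 2, 1], 4)
print(n)                  # 841
print(from_base10(n, 5))  # [1, 1, 3, 3, 1]
```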
A, C visible; B hidden
Simplest BN: nodes A, B, C
Inference problem: E-step
Log likelihood; learning CPTs: M-step
Not differentiable cleanly
Iteratively update the CPTs at A, B, C
This looks like a complete data problem that we know how to do, but the
difference is that the CPTs
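Assuming the chain structure A → B → C for the three-node example (the exact edges are not shown here), the E-step/M-step loop can be sketched as follows; all variables are binary, and the CPT layouts and toy data are illustrative:

```python
def em_chain(data, pB_A, pC_B, iters=20):
    """A minimal EM sketch for A -> B -> C with B hidden.

    data: observed (a, c) pairs; B is never observed.
    pB_A[a][b] = P(B=b | A=a)
    pC_B[b][c] = P(C=c | B=b)
    P(A) can be estimated directly since A is fully observed.
    """
    for _ in range(iters):
        # Expected counts (a small pseudocount avoids division by zero).
        nB_A = [[1e-6, 1e-6], [1e-6, 1e-6]]
        nC_B = [[1e-6, 1e-6], [1e-6, 1e-6]]
        for a, c in data:
            # E-step: posterior over the hidden B for this record.
            joint = [pB_A[a][b] * pC_B[b][c] for b in (0, 1)]
            z = joint[0] + joint[1]
            for b in (0, 1):
                w = joint[b] / z
                nB_A[a][b] += w
                nC_B[b][c] += w
        # M-step: renormalize expected counts into new CPTs.
        pB_A = [[n / sum(row) for n in row] for row in nB_A]
        pC_B = [[n / sum(row) for n in row] for row in nC_B]
    return pB_A, pC_B

# Toy data in which A and C are strongly correlated.
data = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
pB_A, pC_B = em_chain(data, [[0.6, 0.4], [0.3, 0.7]], [[0.7, 0.3], [0.2, 0.8]])
```

Each iteration fills in the hidden B with its posterior (fractional counts), then re-estimates the CPTs exactly as in the complete-data case.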
Notation: let w_l denote the word at the l-th position in a sentence;
the joint distribution P(w_1, . . . , w_L) is denoted P(Ws) below
Properties: finite context/memory; position invariance
(position does not play a role, only the previous words)
Since the CPTs for a Mar
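The shared-CPT idea can be sketched as a first-order Markov (bigram) model, where the same CPT P(w_l | w_{l-1}) is used at every position; the '<s>' start marker and function names are illustrative conventions:

```python
from collections import defaultdict

def bigram_counts(sentences):
    """ML estimate of the shared CPT P(w_l | w_{l-1}) from counts.

    The same CPT is used at every position, which is the
    position-invariance property noted above.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for words in sentences:
        prev = '<s>'
        for w in words:
            counts[prev][w] += 1
            prev = w
    return {p: {w: c / sum(nxt.values()) for w, c in nxt.items()}
            for p, nxt in counts.items()}

def sentence_prob(words, cpt):
    """P(w_1, ..., w_L) = product over l of P(w_l | w_{l-1})."""
    prob, prev = 1.0, '<s>'
    for w in words:
        prob *= cpt.get(prev, {}).get(w, 0.0)
        prev = w
    return prob
```

For example, training on the two sentences "a b" and "a c" gives P(b | a) = 0.5, so the sentence "a b" has probability 1.0 × 0.5 = 0.5.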
Forward Algorithm
Likelihood of observation: marginalize over all n^T hidden state combinations
Base case; recursion for t > 1
Algorithm analysis: O(Tn^2), i.e. linear (not exponential) in the length T
and quadratic in the number of hidden states n
In a naive impleme
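The base case and recursion above can be sketched directly; the variable names below are illustrative, with integer-coded states and observations:

```python
def forward(pi, A, B, obs):
    """Forward algorithm for an HMM with n hidden states.

    pi[i]   = P(s_1 = i)                (initial distribution)
    A[i][j] = P(s_{t+1} = j | s_t = i)  (transition matrix)
    B[i][k] = P(o_t = k | s_t = i)      (emission matrix)
    obs     = observation sequence o_1..o_T (integer symbols)

    Returns the likelihood P(o_1, ..., o_T) in O(T n^2) time,
    rather than summing over all n^T hidden state combinations.
    """
    n = len(pi)
    # Base case: alpha_1(i) = pi_i * B[i][o_1]
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Recursion for t > 1:
    # alpha_t(j) = B[j][o_t] * sum_i alpha_{t-1}(i) * A[i][j]
    for o in obs[1:]:
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)
```

Each of the T steps updates n values, each by a sum over n predecessors, which gives the O(Tn^2) cost noted above.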
Properties of ML estimation in BNs
Document Classification
RVs: Y takes on values 1 to m (the topic category);
X is a fixed-length binary vector representing the dictionary word list,
with each X_i taking on the values 1 and 0
Learning from complete data (ML)
Model: DAG / BN with CPTs P(Y=y) and P(X_i=1|Y=y)
This type of BN is called the naive Bayes model
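Given the CPTs P(Y=y) and P(X_i=1|Y=y), classification computes the posterior P(Y=y|X=x) by Bayes' rule under the naive independence assumption. A minimal sketch (function name and data layout are illustrative):

```python
import math

def naive_bayes_posterior(x, priors, cond):
    """Posterior P(Y=y | X=x) under a naive Bayes model.

    priors[y]  = P(Y=y), for topic categories y = 0..m-1
    cond[y][i] = P(X_i = 1 | Y=y)
    x          = binary feature vector over the dictionary

    Working in log space avoids underflow for long documents.
    """
    scores = []
    for y, prior in enumerate(priors):
        s = math.log(prior)
        for xi, p in zip(x, cond[y]):
            s += math.log(p if xi == 1 else 1.0 - p)
        scores.append(s)
    # Normalize with a max shift for numerical stability.
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    z = sum(weights)
    return [w / z for w in weights]
```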
Homework Problem: Movie Recommender System
With Z ∈ {1, 2, . . . , k}: k types of movie goers
T instances from Google Survey of R_1 to 50
R takes on one of the values: Recommend, Do not recommend, Have not seen
Use T to predict
Maximize P(observed data | model)
What to learn? BN components:
Graph structure: DAGs (hard to learn); algorithms to build a BN (local or global)
CPTs: known or unknown
Evidence: complete, or incomplete (some variables can't be observed)
Algorithms: iterative or non-iterative
Biased coin
X belongs to {
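The biased coin is the simplest case of learning a CPT from complete data: the ML estimate is just the empirical frequency. A minimal sketch, assuming outcomes are coded as 0 and 1:

```python
def ml_coin_bias(flips):
    """ML estimate of P(X=1) for a biased coin from complete data:
    the fraction of flips that came up 1."""
    return sum(flips) / len(flips)

print(ml_coin_bias([1, 1, 0, 1]))  # 0.75
```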
The Markov blanket B_X of an individual node X consists of the
parents of X, the children of X, and the parents of X's children (spouses).
A node X is conditionally independent of nodes outside B_X given the nodes in B_X;
that is, P(X | B_X, Y) = P(X | B_X) for any Y ∉ B_X.
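The three-part definition above translates directly into code. A minimal sketch, representing the DAG as a map from each node to its parent set (the representation is an illustrative choice):

```python
def markov_blanket(x, parents):
    """Markov blanket B_X of node x in a DAG.

    parents: dict mapping each node to the set of its parents.
    B_X = parents of x, children of x, and parents of x's
    children (the spouses), excluding x itself.
    """
    children = {c for c, ps in parents.items() if x in ps}
    spouses = {p for c in children for p in parents[c]} - {x}
    return parents.get(x, set()) | children | spouses

# Example DAG: A -> X -> C <- B. The blanket of X is {A, C, B}:
# A is a parent, C a child, and B a spouse (co-parent of C).
parents = {'A': set(), 'B': set(), 'X': {'A'}, 'C': {'X', 'B'}}
print(markov_blanket('X', parents))
```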