Lecture 26: A bit about learning theory
Prof. Julia Hockenmaier
juliahmr@illinois.edu
http://cs.illinois.edu/fa11/cs440
CS440/ECE448: Intro to Artificial Intelligence

Announcements
- Monday, May 2, 4 pm in 1404 Siebel Center: "Natural Language Applications Across Genres: From News to Novels", Prof. Kathleen McKeown, Columbia University
- Monday, May 2, 6 pm in 2405 Siebel Center: "Attending Graduate School: A panel discussion"
- Tuesday, May 3, 10 am in 2405 Siebel Center: "Machine Learning Modern Times", Dr. Corinna Cortes (Head of Google Research, NY)

Binary classification: training (slide 4)
Input: {(x_i, y_i)}, where each x_i = (x_1, ..., x_d) ∈ R^d and y_i ∈ {+1, -1}
Task: find weights w = (w_0, w_1, ..., w_d) ∈ R^(d+1) that define f(x) = w · x
[Figure: positive (+) and negative (x) points in the (x_1, x_2) plane, separated by the decision boundary f(x) = 0]

Boolean XOR (slide 5)
XOR is not linearly separable.
[Figure: the four XOR points on the (x_1, x_2) unit square, labeled 0, 1, 1, 0]

From perceptrons to neural networks (slide 6)
We can think of a single perceptron as one neuron.
[Figure: a single unit with input links carrying activations a_i, weights w_{i,j}, a bias weight on the fixed input a_0 = 1, input function in_j, activation function g, and output a_j = g(in_j)]

Artificial Neural Networks: Multilayer perceptrons (slide 7)
[Figure: a multilayer perceptron]

From perceptrons to neural nets (slide 8)
A neural net consists of nodes connected by directed links. Each node i has an activation a_i. A link from i to j propagates the activation a_i from i to j. Each link has a weight w_{i,j} that determines the strength and sign of the connection.
[Figure: two small example networks, (a) and (b), with weights w_{1,3}, w_{1,4}, w_{2,3}, w_{2,4} and, in (b), w_{3,5}, w_{3,6}, w_{4,5}, w_{4,6}]

From perceptrons to neural networks
Each unit computes a weighted sum of its inputs:
in_j = Σ_i w_{i,j} a_i
...
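The linear decision rule f(x) = w · x from the "Binary classification: training" slide can be sketched in a few lines of Python. This is a minimal illustration, not taken from the slides: the weight values used below are hypothetical, and w[0] plays the role of a bias weight on an implicit fixed input of 1.

```python
def f(w, x):
    """Score a point x in R^d with weights w in R^(d+1); w[0] is the bias."""
    return w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))

def classify(w, x):
    """Predict +1 or -1 from the sign of f(x); the boundary is f(x) = 0."""
    return 1 if f(w, x) >= 0 else -1

# Hypothetical weights: with w = (-1, 1, 1) the decision boundary
# f(x) = 0 is the line x_1 + x_2 = 1 in the plane.
w = (-1.0, 1.0, 1.0)
print(classify(w, (2.0, 0.0)))  # point above the line -> +1
print(classify(w, (0.0, 0.0)))  # point below the line -> -1
```

Any point on the positive side of the hyperplane f(x) = 0 gets label +1, matching the +/x picture on the slide.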
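The unit computation above (in_j = Σ_i w_{i,j} a_i followed by a_j = g(in_j)) and the XOR slide can be illustrated together: no single threshold unit can compute XOR, but a two-layer network can. A sketch, with a step function for g and hand-chosen weights that are illustrative assumptions, not values from the slides:

```python
def g(in_j):
    """Threshold activation function."""
    return 1 if in_j >= 0 else 0

def unit(weights, inputs):
    """One unit: in_j = sum_i w_ij * a_i, with a_0 = 1 as the bias input."""
    in_j = weights[0] + sum(w_ij * a_i for w_ij, a_i in zip(weights[1:], inputs))
    return g(in_j)

def xor_net(x1, x2):
    """Hand-wired two-layer network computing XOR = (x1 OR x2) AND NOT (x1 AND x2)."""
    h_or  = unit((-0.5, 1, 1), (x1, x2))   # hidden unit: x1 OR x2
    h_and = unit((-1.5, 1, 1), (x1, x2))   # hidden unit: x1 AND x2
    return unit((-0.5, 1, -1), (h_or, h_and))

print([xor_net(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

The hidden layer carves the plane with two lines where one line cannot suffice, which is exactly why XOR's non-separability pushes us from single perceptrons to multilayer networks.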