# Lesson 38 - Module 12: Machine Learning (Version 1, CSE IIT, Kharagpur)

## Lesson 38: Neural Networks - II
### 12.4.3 Perceptron

Definition: a perceptron computes a step function of a linear combination of real-valued inputs. If the combination is above a threshold, it outputs 1; otherwise it outputs -1.

[Figure: a perceptron unit with inputs x1, x2, …, xn, a fixed input x0 = 1, weights w0, w1, …, wn, a summation node Σ, and an output of 1 or -1.]

O(x1, x2, …, xn) = 1 if w0 + w1x1 + w2x2 + … + wnxn > 0, and -1 otherwise.

A perceptron draws a hyperplane (WX = 0) as the decision boundary over the n-dimensional input space. A perceptron can learn only examples that are "linearly separable", i.e. examples that can be perfectly separated by a hyperplane.

[Figure: positive (+) and negative (-) examples separated by the decision boundary WX = 0.]
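The output rule above can be sketched in a few lines of Python. This is a minimal illustration, not from the original notes; the example weights and input points are chosen here only to show a point on each side of the hyperplane.

```python
def perceptron_output(weights, inputs):
    """Return 1 if w0 + w1*x1 + ... + wn*xn > 0, else -1.

    weights: [w0, w1, ..., wn], where w0 is the weight on the fixed input x0 = 1
    inputs:  [x1, ..., xn]
    """
    total = weights[0] + sum(w * x for w, x in zip(weights[1:], inputs))
    return 1 if total > 0 else -1

# A point on the positive side of the hyperplane -1 + x1 + x2 = 0:
print(perceptron_output([-1.0, 1.0, 1.0], [2.0, 0.5]))   # 1
# A point on the negative side:
print(perceptron_output([-1.0, 1.0, 1.0], [0.2, 0.3]))   # -1
```

Note that the threshold is folded into the bias weight w0, so the decision rule reduces to a simple sign test on the weighted sum.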

[Figure: two sets of examples, one linearly separable and one non-linearly separable.]

Perceptrons can learn many boolean functions, such as AND, OR, NAND, and NOR, but not XOR, which is not linearly separable. However, every boolean function can be represented by a perceptron network with two or more levels of depth. The weights of a perceptron implementing the AND function are shown below.
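The weight figure for AND is cut off in this preview, so the values below (w0 = -0.8, w1 = w2 = 0.5) are one common textbook choice, not necessarily the ones in the original figure. The sketch verifies that these weights implement AND over {0, 1} inputs:

```python
def perceptron_output(weights, inputs):
    """Return 1 if w0 + w1*x1 + ... + wn*xn > 0, else -1."""
    total = weights[0] + sum(w * x for w, x in zip(weights[1:], inputs))
    return 1 if total > 0 else -1

# Illustrative AND weights: bias -0.8, both input weights 0.5.
AND_WEIGHTS = [-0.8, 0.5, 0.5]

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", perceptron_output(AND_WEIGHTS, [x1, x2]))
# Only the input (1, 1) gives 0.5 + 0.5 - 0.8 = 0.2 > 0, so only it outputs 1.
```

Any weights satisfying w1 + w2 + w0 > 0 while w1 + w0 <= 0 and w2 + w0 <= 0 would work equally well; the hyperplane just has to separate (1, 1) from the other three corners.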

This note was uploaded on 09/20/2010 for the course MCA DEPART 501, taught by Professor Hemant during the Fall '10 term at Institute of Computer Technology College.
