ln021
Learning
We have seen machine learning with different representations:
(1) Decision trees -- a symbolic representation of decision rules, a “disjunction of conjunctions”
(2) Perceptron -- learned weights that represent a linear decision surface classifying a set of objects into two groups
Different representations give rise to different hypothesis (model) spaces. Machine learning algorithms search these model spaces for the best-fitting model.
Chap 19 (Alex)
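The perceptron's search through its model space can be sketched as the classic mistake-driven update rule. This is a minimal illustration, not code from the lecture; the AND-style dataset and learning rate are my own illustrative choices.

```python
# Sketch of the perceptron learning rule: weights are nudged on each
# misclassified example until a linear decision surface separates the
# two classes (guaranteed to converge when the data are separable).
import numpy as np

def train_perceptron(X, y, epochs=100, lr=0.1):
    # Augment inputs with a bias term x0 = 1.
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w > 0 else -1
            if pred != target:          # update only on a mistake
                w += lr * target * xi
    return w

# A linearly separable toy problem (AND-like labels).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, -1, -1, 1])
w = train_perceptron(X, y)
preds = [1 if np.r_[1, x] @ w > 0 else -1 for x in X]
print(preds)  # → [-1, -1, -1, 1]
```

Note the contrast with what follows: this rule can only ever produce a linear decision surface.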
Perceptron Learning Revisited R Demo
What About Non-Linearity?
[Figure: points labeled +1 and -1 scattered over x1 and x2 such that no single line separates the classes; the required decision surface is non-linear.]
Can we learn this decision surface? ... Yes! Multi-layer perceptrons.
Multi-Layer Perceptrons
[Figure: inputs x0, x1, x2, ..., x(n-1), x(n) feed through an input layer and a hidden layer to an output layer producing y. Each unit applies a combination function followed by a transfer function; the output is a linear unit.]
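A single forward pass through such a network can be sketched as follows. This assumes the conventions named on the slide -- a weighted sum as the combination function and a sigmoid as the transfer function -- and the weights here are arbitrary illustrative values, not the lecture's.

```python
# One forward pass of a small multi-layer perceptron:
# combination function = weighted sum, transfer function = sigmoid.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W_hidden, W_out):
    xb = np.r_[1.0, x]                  # prepend bias input x0 = 1
    h = sigmoid(W_hidden @ xb)          # hidden-layer activations
    hb = np.r_[1.0, h]                  # bias for the output layer
    return sigmoid(W_out @ hb)          # output y

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(2, 3))      # 2 hidden units, 2 inputs + bias
W_out = rng.normal(size=(1, 3))         # 1 output, 2 hidden units + bias
y = forward(np.array([0.0, 1.0]), W_hidden, W_out)
print(y)                                # a value strictly between 0 and 1
```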
Artificial Neural Networks
Feed-forward with Backpropagation
[Figure: the same network, inputs x0 ... xn through input, hidden, and output layers to y; signals feed forward while errors backpropagate.]
Learning the XOR Function
A multi-layer perceptron
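The slides' XOR network (described on a later page) uses threshold units: a unit outputs 1 if its weighted input reaches the unit's threshold, else 0. The figure's exact weights and thresholds were not recovered from this preview, so the numbers below are one standard choice, not necessarily the lecture's.

```python
# A hand-built two-layer threshold network computing XOR:
# XOR(x, y) = OR(x, y) AND NOT AND(x, y).
def step(total, threshold):
    # Threshold-unit convention from the slides: 0 (not -1) below threshold.
    return 1 if total >= threshold else 0

def xor_net(x, y):
    h_or  = step(1 * x + 1 * y, 1)        # fires if at least one input is 1
    h_and = step(1 * x + 1 * y, 2)        # fires only if both inputs are 1
    return step(1 * h_or - 2 * h_and, 1)  # OR but not AND = XOR

for x in (0, 1):
    for y in (0, 1):
        print(x, y, xor_net(x, y))        # truth table of XOR
```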
[Figure: a two-layer network of perceptrons capable of calculating XOR. The numbers within the perceptrons represent each perceptron's explicit threshold; the numbers annotating the arrows represent the weights of the inputs. This net assumes that if the threshold is not reached, zero (not -1) is output.]

[Figure: the XOR cases plotted over x and y, with TRUE and FALSE points. Note: no linear decision surface exists for this dataset.]

Representational Power
- Every bounded continuous function can be approximated with arbitrarily small error by a network with one hidden layer.
- Any function can be approximated to arbitrary accuracy by a network with two hidden layers.

Hidden Layer Representations
Target function: [shown in a figure not recovered here]. Can this be learned?

Hidden Layer Representations (one learned 3-bit hidden code per input):
1 0 0
0 0 1
0 1 0
1 1 1
0 0 0
0 1 1
1 0 1
1 1 0

Hidden layers allow a network to invent appropriate internal representations.
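A table of eight distinct 3-bit codes like the one above arises in the classic 8-3-8 "identity" task: eight one-hot inputs must be reproduced at eight outputs through only three hidden units, forcing the network to invent a compact internal code. This sketch assumes that task (the target-function figure itself was not recovered); hyperparameters are illustrative, and the particular codes learned vary between runs.

```python
# Training an 8-3-8 network on the identity function over one-hot inputs.
import numpy as np

sig = lambda z: 1.0 / (1.0 + np.exp(-z))

X = np.eye(8)                            # inputs double as targets
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(8, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 8)); b2 = np.zeros(8)

lr = 1.0
losses = []
for _ in range(20000):
    H = sig(X @ W1 + b1)                 # 3 hidden activations per input
    Y = sig(H @ W2 + b2)                 # 8 reconstructed outputs
    losses.append(((Y - X) ** 2).mean())
    dY = (Y - X) * Y * (1 - Y)           # backpropagate reconstruction error
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY;  b2 -= lr * dY.sum(0)
    W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(0)

# Thresholding the hidden activations typically reveals 8 distinct
# 3-bit codes, the network's invented internal representation.
codes = (sig(X @ W1 + b1) > 0.5).astype(int)
print(losses[-1])
print(codes)
```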