weights: w1, w2, w3, …
Each input pattern is classified into one of
two classes depending on whether y > 0 or y < 0 .
Learning rule: Δwᵢ = k (y_d − y) xᵢ
where y_d is the desired output and k > 0 is the learning rate.

Example of supervised learning: Barn Owl
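The learning rule above can be sketched in Python with NumPy. This is a minimal illustration, not code from the notes: the function name, the ±1 output convention, and the toy input are all assumptions made for the example.

```python
import numpy as np

def perceptron_step(w, x, y_desired, k=0.1):
    """One online update of the rule dw_i = k * (y_d - y) * x_i."""
    # Classify the input by the sign of the output y = w . x
    y = 1.0 if np.dot(w, x) > 0 else -1.0
    # Move the weights toward the desired output
    return w + k * (y_desired - y) * x

# Toy example: learn to output +1 for a single input pattern.
w = np.zeros(2)
x = np.array([1.0, 1.0])
for _ in range(5):
    w = perceptron_step(w, x, y_desired=1.0)
```

Note that when the actual output already matches the desired output, (y_d − y) = 0 and the weights stop changing, so the rule converges once the pattern is classified correctly.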
Behavior: Visual input serves as a teacher for
learning to orient towards an auditory target.
Neurophysiology: Some neurons in the optic tectum
have both a visual receptive field (V) and an auditory
receptive field (A). Prisms shift the location of visual
space, making it misaligned with the auditory space.
After training, the auditory receptive field is shifted to
realign with the visual receptive field.

Supervised learning: Relation to optimal linear mapping
The optimal linear mapping y = Wx finds the weight matrix W that minimizes

E = Σ_{m=1}^{M} ‖ y^(m) − W x^(m) ‖²

where X = [x^(1), …, x^(M)] are the input vectors and Y = [y^(1), …, y^(M)] are the desired output vectors. The result is W = Y X†, where X† is the pseudoinverse of X. The perceptron learning rule for y = Wx is ΔW = k (y_d − y) xᵀ ("online learning").
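As a quick sketch of the batch solution, W = Y X† can be computed directly with NumPy's pseudoinverse. The matrix sizes and random data below are illustrative assumptions, chosen so that X has full row rank and the mapping is exactly recoverable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Columns of X are the M = 10 input vectors x^(m) (3-dimensional here).
X = rng.standard_normal((3, 10))
W_true = rng.standard_normal((2, 3))
# Desired outputs generated by a known linear map: y^(m) = W_true x^(m).
Y = W_true @ X

# Batch least-squares solution W = Y X† minimizes sum_m ||y^(m) - W x^(m)||^2.
W = Y @ np.linalg.pinv(X)
```

Because the desired outputs were generated by an exact linear map and X has full row rank, the recovered W matches W_true; with noisy data, W = Y X† is instead the least-squares fit, which the online perceptron rule approaches incrementally.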
This document was uploaded on 02/28/2014 for the course BME 580.402 at Johns Hopkins, Spring '14.