Ocular-dominance columns develop slowly after birth, as these autoradiographs from the developing kitten cortex show.
This model develops quasi-periodic columns, with variations due to random initial conditions.
Optimal Probabilistic Adaptation
When flying from outside a canyon into the corridors of the canyon, the statistics of the environment change.
Optimal Probabilistic Adaptation
Kalman Filter
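The Kalman filter is the standard model of optimal probabilistic adaptation: the estimate is updated in proportion to how uncertain it is relative to the measurements. A minimal one-dimensional sketch; the function name and the noise parameters are illustrative, not from the lecture:

```python
import numpy as np

def kalman_1d(measurements, q=0.1, r=1.0, x0=0.0, p0=1.0):
    """Scalar Kalman filter: track a drifting quantity from noisy measurements.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state may have drifted, so uncertainty grows.
        p = p + q
        # Update: weigh the new measurement by the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_value = 5.0
zs = true_value + rng.normal(0.0, 1.0, size=200)
est = kalman_1d(zs)
```

When the environment's statistics change, q and r change, and the gain k adapts the speed of updating accordingly.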
Outline of the Lecture
Bayesian Inference
Device Optimization
Bayesian Decision, ROC Analysis
Bayesian Decision Theory Helps Understanding Inference in the Brain
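ROC analysis summarizes Bayesian decisions between two response distributions by sweeping the decision criterion. A sketch with two assumed unit-variance Gaussians separated by d' = 1 (these numbers are illustrative):

```python
import numpy as np

def roc_curve(mu_signal=1.0, mu_noise=0.0, sigma=1.0, n=501):
    """Sweep a decision criterion; return false-alarm and hit rates."""
    from math import erf, sqrt
    def tail(mu, c):  # P(response > c) for a Gaussian N(mu, sigma)
        return 0.5 * (1.0 - erf((c - mu) / (sigma * sqrt(2.0))))
    criteria = np.linspace(-5.0, 6.0, n)
    fa = np.array([tail(mu_noise, c) for c in criteria])
    hits = np.array([tail(mu_signal, c) for c in criteria])
    return fa, hits

fa, hits = roc_curve()
# Area under the ROC curve = probability of a correct two-alternative choice.
auc = -np.trapz(hits, fa)  # fa decreases along the sweep, hence the minus sign
```

For d' = 1 the area works out to about 0.76, i.e., well above chance but far from perfect discrimination.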
Computations performed by neural networks can be expressed as energy minimization.
A link exists between energy minimization and Bayesian processes (and therefore, between these things and neural networks).
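A small Hopfield-style network makes the energy-minimization view concrete: each asynchronous update can only lower the network's energy, so the dynamics descend to a stored pattern. The weights and pattern below are illustrative:

```python
import numpy as np

def energy(W, s):
    """Hopfield energy E = -1/2 s^T W s (thresholds omitted for simplicity)."""
    return -0.5 * s @ W @ s

# Store one pattern with the outer-product (Hebbian) rule.
pattern = np.array([1, -1, 1, -1, 1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

# Start from a corrupted state and update units asynchronously.
s = pattern.copy()
s[0] = -s[0]
energies = [energy(W, s)]
for i in range(len(s)):
    s[i] = 1 if W[i] @ s >= 0 else -1   # each flip can only lower E
    energies.append(energy(W, s))
```

The energy trace is non-increasing, and the final state is the stored pattern, which is the minimum the network computes.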
Outline of the Lecture
Energy Minimization
Energy, Motion, Aperture Problem, Regularization
One Can Understand Some Network Computations as Energy Minimization
Swimming uses half-centers with cross-inhibition, ended mainly by local inhibition.
The bifurcation diagram shows fixed points and limit cycles of varying frequencies as one modulates the tonic excitation.
Outline of the Lecture
Information Theory
Entropy, Noise Entropy, Mutual Information
Mutual Information Measures How Much Responses Tell about Stimuli
In a black-box model, we try to describe a system well enough to predict its responses without knowing what is inside the system.
If the firing is different when one presents the same stimulus twice, the difference reflects noise.
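Entropy, noise entropy, and mutual information can be computed directly from a stimulus-response probability table. A toy sketch (the table below is made up, not the lecture's data):

```python
import numpy as np

def mutual_information(p_joint):
    """I(S;R) = H(R) - H(R|S), from a joint probability table p(s, r)."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_r = p_joint.sum(axis=0)   # marginal over responses
    p_s = p_joint.sum(axis=1)   # marginal over stimuli

    def H(p):                   # entropy in bits, ignoring zero entries
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    h_r = H(p_r)                # total response entropy
    # Noise entropy: average conditional entropy H(R|S).
    h_noise = sum(ps * H(row / ps) for ps, row in zip(p_s, p_joint) if ps > 0)
    return h_r - h_noise

# Two stimuli, two responses; responses vary even when a stimulus repeats.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
info = mutual_information(p)    # about 0.28 bits
```

With deterministic responses the noise entropy would be zero and the information would equal the full response entropy; here the trial-to-trial variability costs most of the bit.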
Outline of the Lecture
Population Decoding
Population Code, Population Vector, Bayesian Decoding
Different Decoding Schemes Lead to Different Accuracies of Measurement
In a black-box model, we try to describe a system well enough to predict its responses without knowing what is inside the system.
In this example, the stimulus was a motion of varying speed (A).
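The population-vector scheme is the simplest decoder: each neuron "votes" with its preferred direction, weighted by its firing rate. A sketch with assumed cosine tuning (all numbers illustrative):

```python
import numpy as np

def population_vector(rates, preferred_angles):
    """Decode direction as the rate-weighted sum of preferred-direction vectors."""
    vx = np.sum(rates * np.cos(preferred_angles))
    vy = np.sum(rates * np.sin(preferred_angles))
    return np.arctan2(vy, vx)

# 16 neurons with cosine tuning, preferred directions spaced around the circle.
prefs = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
true_dir = 0.9                        # stimulus direction, radians
baseline, r_max = 10.0, 10.0          # baseline chosen so rates stay nonnegative
rates = baseline + r_max * np.cos(prefs - true_dir)
decoded = population_vector(rates, prefs)
```

With noise-free cosine tuning and evenly spaced preferred directions the decoder is exact; Bayesian decoding improves on it when tuning curves are narrow or noise is correlated.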
Outline of the Lecture
Nonlinear System Identification
Volterra Series
Nonlinear Kernels, Wiener Series
Volterra and Wiener Kernels Characterize Linear and Nonlinear Systems
In a black-box model, we try to describe a system well enough to predict its responses without knowing what is inside the system.
If the black box is linear, then we can describe the system fully with its impulse response.
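For a linear black box, cross-correlating a white-noise input with the output recovers the impulse response. A simulated sketch; the hidden filter below is an assumed example:

```python
import numpy as np

rng = np.random.default_rng(1)
kernel = np.array([0.0, 0.5, 1.0, 0.5, 0.1])    # "hidden" impulse response

# Probe the system with white noise and record its (noisy) output.
stim = rng.normal(0.0, 1.0, size=20000)
resp = np.convolve(stim, kernel)[: len(stim)]
resp += rng.normal(0.0, 0.1, size=len(resp))    # measurement noise

# Reverse correlation: correlate the response with the stimulus at each lag.
lags = len(kernel)
est = np.array([np.mean(resp[t:] * stim[: len(stim) - t]) for t in range(lags)])
# For unit-variance white noise, est[t] approximates kernel[t].
```

The same correlation is the first Wiener kernel; higher-order kernels extend the idea to nonlinear systems.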
Outline of the Lecture
Linear System Identification
Black-box Models, Impulse Response, Reverse Correlation
Reverse Correlation Can Determine a Neural Linear System's Impulse Response
A recurrent network is a feedforward network with a recurrent synaptic weight matrix.
Some neuronal tissues are so massive and complex that network analysis is not too useful.
Perception is a constructive process.
Outline of the Lecture
Minimal-wiring Hypothesis
Elastic Nets
Dimensionality Reduction, Development
The Function of Cortical Maps May Be to Minimize Wiring
Ocular-dominance columns develop slowly after birth.
Unsupervised learning is self-organization to maximize extraction of information from input.
One way is by having a supervisor tell the network whether its performance is good.
Classical conditioning is an example of such learning.
Outline of the Lecture
Hebbian Supervised Learning
[Figure: conditioning to a stimulus ("Spider!"), with versus without error correction.]
Hebbian Supervised Learning Benefits from External Information about the Truth
This model develops quasi-periodic columns, with variations due to random initial conditions.
Unsupervised learning is self-organization to maximize extraction of information from input.
Outline of the Lecture
Hebbian Unsupervised Learning
Development, Orientation Selectivity, Ocular Dominance
Hebbian Unsupervised Learning Leads to Self Organization of Neural Circuits
Long-term potentiation and depression at the hippocampus are examples of the Hebb-Stent rule.
A full feedforward network has vector inputs and outputs connected by a weight matrix.
The simplest rule follows Hebb's postulate.
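The basic Hebb rule, Δw = η·v·u with v = w·u, can be sketched as follows; averaged over inputs it drives the weight vector toward the principal eigenvector of the input correlation matrix. The learning rate, input statistics, and normalization step are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Correlated two-dimensional inputs: the leading principal axis is ~[1, 1].
C = np.array([[1.0, 0.8],
              [0.8, 1.0]])
inputs = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = np.array([1.0, 0.0])
eta = 0.005
for u in inputs:
    v = w @ u                  # postsynaptic activity
    w += eta * v * u           # Hebb: strengthen co-active connections
    w /= np.linalg.norm(w)     # normalization keeps the weights bounded

# w should align with the leading eigenvector of C, i.e., ~[1, 1]/sqrt(2).
```

Without the normalization step the pure Hebb rule grows without bound, which is why constraints of this kind appear in every practical version of the rule.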
Outline of the Lecture
Models of Motor-pattern Generation
A Simple Model, Computer Simulations
Models of Spinal Motor-Pattern Generation Depend Strongly on Parameters
If the system is rectifying, and Re(λ) > 0 and Im(λ) ≠ 0, trajectories converge to limit cycles.
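Whether such a system oscillates can be read off the eigenvalues of its connection matrix. A sketch with an assumed excitatory-inhibitory matrix (the values are illustrative):

```python
import numpy as np

# Non-symmetric E-I interaction matrix: excitation drives inhibition,
# inhibition suppresses excitation (values assumed for illustration).
M = np.array([[1.1, -1.0],
              [1.0,  0.0]])
eigvals = np.linalg.eigvals(M)

# Complex eigenvalues mean rotation in the phase plane; with Re > 0 the
# linear system spirals outward, and rectification can trap it on a limit cycle.
oscillatory = bool(np.any(np.abs(eigvals.imag) > 1e-9))
growing = bool(np.any(eigvals.real > 0))
```

Here the eigenvalues are 0.55 ± i·0.835, so the linearized system spirals outward and the rectifying nonlinearity bounds the trajectory onto a limit cycle.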
Reduced preparations show that the motorpattern generation circuitry is in the spinal cord.
Outline of the Lecture
Excitatory-inhibitory Network Models
Non-symmetric Matrices, Olfactory Bulb, Phase Plane
Recurrent Networks with Non-symmetric Matrices May Exhibit Oscillations
A recurrent network is a feedforward network with a recurrent synaptic weight matrix.
For symmetric M, the eigenvectors are orthonormal, i.e., e_µ · e_ν = δ_µν, and general solutions have time constants τ_r/(1 − λ_µ).
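These properties are easy to check numerically. A sketch with an assumed symmetric recurrent matrix, computing the per-mode time constants τ_r/(1 − λ_µ):

```python
import numpy as np

tau_r = 10.0                         # network time constant in ms (assumed)
M = np.array([[0.0, 0.4, 0.1],
              [0.4, 0.0, 0.4],
              [0.1, 0.4, 0.0]])      # symmetric recurrent weights (assumed)

lams, E = np.linalg.eigh(M)          # eigenvalues, orthonormal eigenvectors
gram = E.T @ E                       # orthonormality check: should equal I
# Each mode relaxes with its own effective time constant tau_r / (1 - lambda):
taus = tau_r / (1.0 - lams)
```

Modes with λ_µ close to 1 have strongly lengthened time constants, which is how linear recurrence produces selective amplification and slow integration.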
Outline of the Lecture
Nonlinear Recurrent Network Models
Rectification, Gain Modulation, Winner-take-all
Rectification Induces Higher Amplification and Selection, and Tuning and Gain Controls
A recurrent network is a feedforward network with a recurrent synaptic weight matrix.
Assuming that the activation function F is linear, that is, F(x) = x, and denoting the input as h = W·u, the dynamics become τ_r dv/dt = −v + h + M·v.
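The linear recurrent dynamics τ_r dv/dt = −v + h + M·v can be integrated with simple Euler steps; a sketch (the matrix, input, and step size are assumed):

```python
import numpy as np

tau_r, dt = 10.0, 0.1
M = np.array([[0.0, 0.5],
              [0.5, 0.0]])           # symmetric recurrent weights, |lambda| < 1
h = np.array([1.0, 0.2])             # steady feedforward input h = W.u

v = np.zeros(2)
for _ in range(5000):
    v += (dt / tau_r) * (-v + h + M @ v)   # Euler step of the dynamics

# The fixed point of the dynamics is v* = (I - M)^{-1} h:
v_star = np.linalg.solve(np.eye(2) - M, h)
```

Because all eigenvalues of M lie below 1, the simulation relaxes to the fixed point; the recurrence amplifies the input by the factor (I − M)⁻¹.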
Outline of the Lecture
Linear Recurrent Network Models
Recurrent-Matrix Properties: Eigenvalues, Eigenvectors
Synaptic-Matrix Eigenvector Properties Determine Responses of Linear Recurrent Network
A full feedforward network has vector inputs and outputs connected by a weight matrix.
A recurrent network is a feedforward network with a recurrent synaptic weight matrix.
For a feedforward network, the steady-state output is v = W·u.
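A minimal feedforward sketch: the output rates are just the weight matrix applied to the input rates (the numbers below are illustrative):

```python
import numpy as np

W = np.array([[0.5, 0.2, 0.0],
              [0.0, 0.3, 0.7]])   # 2 output units, 3 input units
u = np.array([1.0, 2.0, 0.5])     # input firing rates

v = W @ u                         # steady-state output: v = W.u
```

Everything a linear feedforward network computes is contained in this single matrix-vector product, which is why it is the natural starting point before adding recurrence.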
Outline of the Lecture
Feedforward Network Models
Feedforward Networks, Example: Reaching Dynamics
Feedforward Networks Are the Simplest Kind of Brain Circuits
The simplest neural-network model for a brain circuit is the feedforward network.