In a black-box model, we try to describe a system well enough to predict its responses without knowing what is inside it. If the firing differs when the same stimulus is presented twice, how does the brain know what is in the stimulus? We can also ask how much a system can tell us about the ensemble of stimuli in the world; the answer comes from information theory.

Represent the probability of each input, P(s), and of each output, P(r), by a color's brightness, and the probability of an output given an input, P(r|s), by an arrow's thickness. Each input variable then has its own probabilities of leading to the different outputs. If the output distribution is sharp, the output is not informative about the input; in other words, for a system to be informative, the output must have variety.

An important measure of the variety in a probability distribution is entropy. Shannon looked for a quantity h(P(r)) that expressed surprise (i.e., fell as P rose) and was additive for independent responses:

h(P(r1, r2)) = h(P(r1)) + h(P(r2)) when r1 and r2 are independent.
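The input/output picture above (inputs s, outputs r, arrows P(r|s)) can be sketched numerically. This is a minimal illustration with made-up probabilities, not data from the lecture: it marginalizes the conditional distribution to get the output distribution P(r) = Σ_s P(s) P(r|s).

```python
# Hypothetical 2-stimulus, 2-response system; all numbers are illustrative.
P_s = {"s1": 0.5, "s2": 0.5}                  # input probabilities P(s) (brightness)
P_r_given_s = {                               # conditional probabilities P(r|s) (arrow thickness)
    "s1": {"r1": 0.9, "r2": 0.1},
    "s2": {"r1": 0.2, "r2": 0.8},
}

# Marginal output distribution: P(r) = sum over s of P(s) * P(r|s)
P_r = {}
for s, ps in P_s.items():
    for r, prs in P_r_given_s[s].items():
        P_r[r] = P_r.get(r, 0.0) + ps * prs

print(P_r)  # {'r1': 0.55, 'r2': 0.45}
```

Here the output distribution has variety (neither response dominates completely), so observing r can tell us something about which s occurred; if both rows of P(r|s) were identical, P(r) would carry no information about the input.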
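Shannon's two requirements pin down the surprise function h(P) = -log2(P) (in bits), and entropy is then the expected surprise over a distribution. A short sketch checking both properties:

```python
import math

def surprise(p):
    """Shannon surprise (self-information) in bits: falls as p rises."""
    return -math.log2(p)

# Additivity for independent responses: P(r1, r2) = P(r1) * P(r2),
# so surprise of the joint event is the sum of the individual surprises.
p1, p2 = 0.5, 0.25
assert math.isclose(surprise(p1 * p2), surprise(p1) + surprise(p2))

def entropy(dist):
    """Entropy = expected surprise of a probability distribution, in bits."""
    return sum(p * surprise(p) for p in dist if p > 0)

# A sharp distribution has no variety and zero entropy;
# a uniform one has maximal variety.
print(entropy([1.0]))       # 0.0 bits
print(entropy([0.5, 0.5]))  # 1.0 bit
```

The zero-entropy case matches the point in the text: a sharp output distribution is uninformative, while variety in the outputs is what makes information transmission possible.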
Spring '09, Grzywacz
