4_9_09_InformationTheory



In a black-box model, we try to describe a system well enough to predict its responses without knowing what is inside the system. If the firing is different when one presents the same stimulus twice, how does the brain know what is in the stimulus? We can also ask how much a system can tell about the ensemble of stimuli in the world; the answer comes from information theory.

[Slide diagram, repeated throughout: Input Variable (s) → Output Variable (r)]

Let's represent the probability of each input, P(s), and of each output, P(r), by color brightness, and the probability of an output given an input, P(r|s), by the thickness of the arrow from s to r. Each input variable has its own probabilities of leading to the different outputs.

If the output distribution is sharp, then the output is not informative about the input. In other words, for a system to be informative, the output must have variety.

An important measure of the variety in a probability distribution is its entropy. Shannon looked for a quantity h(P(r)) that expressed surprise (i.e., decreased as P increased) and was additive for independent responses:

    h(P(r1, r2)) = h(P(r1)) + h(P(r2))   when r1 and r2 are independent.
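The quantities above can be made concrete with a small numerical sketch. The distributions below are invented for illustration (they are not from the lecture), and the use of h(P) = -log2 P as the surprise measure is the standard resolution of Shannon's additivity requirement, not necessarily how these slides continue:

```python
import numpy as np

# Hypothetical example: 3 stimuli, 4 responses (numbers are illustrative).
P_s = np.array([0.5, 0.25, 0.25])            # P(s): input distribution

# P(r|s): each row is the response distribution for one stimulus
# (the "arrow thicknesses" from that input to each output).
P_r_given_s = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.1, 0.1, 0.4, 0.4],
])

def entropy(p):
    """Shannon entropy H(P) = -sum_i p_i log2 p_i, in bits."""
    p = p[p > 0]                              # convention: 0 log 0 = 0
    return -np.sum(p * np.log2(p))

# Marginal output distribution: P(r) = sum_s P(s) P(r|s)
P_r = P_s @ P_r_given_s

# Surprise h(P) = -log2 P is additive for independent responses:
# h(P1 * P2) = h(P1) + h(P2)
p1, p2 = 0.5, 0.25
assert np.isclose(-np.log2(p1 * p2), -np.log2(p1) - np.log2(p2))

# Mutual information I(S;R) = H(R) - H(R|S): how much the output
# tells us, on average, about the input. If every input led to the
# same sharp output distribution, H(R) would equal H(R|S) and
# I(S;R) would be zero -- no variety, no information.
H_R = entropy(P_r)
H_R_given_S = np.sum(P_s * np.array([entropy(row) for row in P_r_given_s]))
I_SR = H_R - H_R_given_S
print(f"H(R) = {H_R:.3f} bits, I(S;R) = {I_SR:.3f} bits")
```

With these made-up numbers the output entropy is close to its maximum of 2 bits (four outputs), and the mutual information is positive because different stimuli skew the responses differently.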


