An introduction to information theory and entropy

Tom Carter
http://astarte.csustan.edu/~tom/SFI-CSSS
Complex Systems Summer School, Santa Fe
June, 2007

Contents

  Measuring complexity
  Some probability ideas
  Basics of information theory
  Some entropy theory
  The Gibbs inequality
  A simple physical example (gases)
  Shannon's communication theory
  Application to Biology (genomes)
  Some other measures

  Some additional material:
    Examples using Bayes' Theorem
    Analog channels
    A Maximum Entropy Principle
    Application: Economics I
    Application: Economics II
    Application to Physics (lasers)
    Kullback-Leibler information measure
    References

The quotes

  • Science, wisdom, and counting
  • Being different – or random
  • Surprise, information, and miracles
  • Information (and hope)
  • H (or S) for Entropy
  • Thermodynamics
  • Language, and putting things together
  • Tools

Science, wisdom, and counting

"Science is organized knowledge. Wisdom is organized life."
    Immanuel Kant

"My own suspicion is that the universe is not only stranger than we suppose, but stranger than we can suppose."
    John Haldane

"Not everything that can be counted counts, and not everything that counts can be counted."
    Albert Einstein (1879-1955)

"The laws of probability, so true in general, so fallacious in particular."
    Edward Gibbon

Measuring complexity

• Workers in the field of complexity face a classic problem: how can we tell that the system we are looking at is actually a complex system? (i.e., should we even be studying this system? :) Of course, in practice, we will study the systems that interest us, for whatever reasons, so the problem identified above tends not to be a real problem.
On the other hand, having chosen a system to study, we might well ask "How complex is this system?" In this more general context, we probably want at least to be able to compare two systems, and be able to say that system A is more complex than system B. Eventually, we probably would like to have some sort of numerical rating scale.

• Various approaches to this task have been proposed, among them:

  1. Human observation and (subjective) rating
  2. Number of parts or distinct elements (what counts as a distinct part?)
  3. Dimension (measured how?)
  4. Number of parameters controlling the system
  5. Minimal description (in which language?)
  6. Information content (how do we define/measure information?)
  7. Minimal generator/constructor (what machines/methods can we use?)
  8. Minimum energy/time to construct (how would evolution count?)

• Most (if not all) of these measures will actually be measures associated with a model of a phenomenon. Two observers (of the same phenomenon?) may develop or use very different models, and thus disagree in their assessments of the complexity. For example, in a very simple case, counting the number of parts is likely to depend on the scale at which the phenomenon is viewed (counting atoms is...
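As a concrete illustration of approach 6 above (information content), one common operationalization is the Shannon entropy of the distribution of symbols observed in the system's output. This is a minimal sketch, not the notes' own code; the function name and the example strings are illustrative choices:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits per symbol,
    estimated from the empirical frequencies of an observed sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform source over 4 symbols carries 2 bits per symbol:
print(shannon_entropy("ABCD"))  # 2.0
# A constant source is perfectly predictable: zero entropy.
print(shannon_entropy("AAAA"))
```

Note that this measures a *model* of the phenomenon (the symbol frequencies), echoing the point above: two observers who discretize the system into different symbol alphabets will generally compute different entropies.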