# Entropy, Power Laws, and Economics

Tom Carter
Complex Systems Summer School, SFI
http://astarte.csustan.edu/~tom/
Santa Fe, June 2007

## Contents

- Mathematics of Information
- Some entropy theory
- A Maximum Entropy Principle
- Application: Economics I
- Fit to Real World (TM)
- A bit about Power Laws
- Application: Economics II
- References

## The quotes

- Science, wisdom, and counting
- Surprise, information, and miracles
- Information (and hope)
- H (or S) for Entropy

## Science, wisdom, and counting

> Science is organized knowledge. Wisdom is organized life.
> - Immanuel Kant

> My own suspicion is that the universe is not only stranger than we suppose, but stranger than we can suppose.
> - John Haldane

> Not everything that can be counted counts, and not everything that counts can be counted.
> - Albert Einstein (1879-1955)

> The laws of probability, so true in general, so fallacious in particular.
> - Edward Gibbon

## Surprise, information, and miracles

> The opposite of a correct statement is a false statement. The opposite of a profound truth may well be another profound truth.
> - Niels Bohr (1885-1962)

> I heard someone tried the monkeys-on-typewriters bit trying for the plays of W. Shakespeare, but all they got was the collected works of Francis Bacon.
> - Bill Hirst

> There are only two ways to live your life. One is as though nothing is a miracle. The other is as though everything is a miracle.
> - Albert Einstein (1879-1955)

## Mathematics of Information

We would like to develop a usable measure of the information we get from observing the occurrence of an event having probability p. Our first reduction will be to ignore any particular features of the event and observe only whether or not it happened. Thus we will think of an event as the observation of a symbol whose probability of occurring is p. We will thus define the information in terms of the probability p.
The approach we will take here is axiomatic: below is a list of the four fundamental axioms we will use. Note that we can apply this axiomatic system in any context in which we have available a set of non-negative real numbers. A specific special case of interest is probabilities (i.e., real numbers between 0 and 1), which motivated the selection of axioms . . .

We will want our information measure I(p) to have several properties:

1. Information is a non-negative quantity: I(p) >= 0.
2. If an event has probability 1, we get no information from the occurrence of the event: I(1) = 0.
3. If two independent events occur (so that their joint probability is the product of their individual probabilities), then the information we get from observing the events is the sum of the two informations: I(p1 * p2) = I(p1) + I(p2). ...
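The axioms above are satisfied by the logarithmic measure I(p) = log(1/p), which is the standard Shannon information (the base of the logarithm only fixes the unit; base 2 gives bits). As a quick sanity check, here is a minimal sketch, assuming base 2, that verifies each listed axiom numerically; the function name `information` is ours, not the author's:

```python
import math

def information(p: float) -> float:
    """Shannon information of an event with probability p, in bits.

    I(p) = log2(1/p): non-negative on (0, 1], zero at p = 1,
    and additive over independent events.
    """
    if not 0 < p <= 1:
        raise ValueError("p must be in (0, 1]")
    return math.log2(1 / p)

# Axiom 1: information is a non-negative quantity.
assert information(0.25) >= 0

# Axiom 2: a certain event (p = 1) carries no information.
assert information(1.0) == 0

# Axiom 3: independent events add their informations,
# since log2(1/(p1*p2)) = log2(1/p1) + log2(1/p2).
p1, p2 = 0.5, 0.25
assert math.isclose(information(p1 * p2), information(p1) + information(p2))
```

Rarer events (smaller p) yield larger I(p), matching the intuition that an unlikely observation is more surprising, and hence more informative.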
