Probabilities, Utilities and Decision Making
1 Probabilities
1.1 Example in this course
We have encountered probability theory, and have seen how it takes the idea of elementary events, counts them up, and produces measures called probabilities. We have applied this to figuring out whether a thief stole socks from one person or another, based on the color of the socks, and to figuring out which island a drunken bird has landed on, from the color of the grass. Most relevant to this course, we have used it to figure out which of several documents is most likely to be relevant for a particular query, based on the words in the query and the words in the document (Language Models).
1.2 Origins in gambling
But probability has many other applications, some of which are very important.
Probability originated with the efforts of gamblers to decide how to bet. For example,
betting on a “7” when two dice are rolled is a much safer bet than betting on a “3”
because there are more ways it can occur.
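The counting argument can be made concrete with a short Python sketch (not part of the original notes) that enumerates all 36 equally likely rolls of two fair dice and counts how many ways each total can occur:

```python
from itertools import product

# Enumerate all 36 equally likely outcomes of rolling two fair dice
# and count how many ways each total can come up.
counts = {}
for a, b in product(range(1, 7), repeat=2):
    counts[a + b] = counts.get(a + b, 0) + 1

# A "7" can occur 6 ways (1+6, 2+5, 3+4, 4+3, 5+2, 6+1);
# a "3" can occur only 2 ways (1+2, 2+1).
print(counts[7], counts[3])            # 6 2
print(counts[7] / 36, counts[3] / 36)  # probability 1/6 vs 1/18
```

Since a "7" has three times as many ways to occur as a "3", it is three times as likely, which is why it is the safer bet.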
1.3 Utility: Von Neumann and Morgenstern
About 70 years ago, the mathematician John von Neumann and the economist Oskar Morgenstern worked together to put the ideas that gamblers used into a form that would work for many other kinds of problems. To do this they had to introduce a new idea: “utility”.
In ordinary language utility means several things (electricity, gas, water) and also
“usefulness”.
[Utility programs on the computer are so named because they are useful.]
This sense of “usefulness” is closest to what the economists mean. Let’s see why they introduced it.
2 Expected value of uncertain alternatives
Consider two possible alternatives: (A) you definitely get one dollar; (B) with probability 50% you get two dollars, and with probability 50% you get nothing. On average, how much money is each of these worth, per play? To figure that out we consider how much money we get in each case, multiply it by the probability, and add them up. So the calculations look like this:
Case A
  Outcome:       1      2      3      4      5
  Probability:   100%   0%     0%     0%     0%
  Value:         1.00
  Product:       1.00   0.00   0.00   0.00   0.00
  Sum of products: 1.00
So, not surprisingly, this sure thing is worth $1.
But the second alternative, Case B, is, on the average, also worth one dollar.
Case B
  Outcome:       1      2      3      4      5
  Probability:   50%    50%    0%     0%     0%
  Value:         0.00   2.00
  Product:       0.00   1.00   0.00   0.00   0.00
  Sum of products: 1.00
We call the value computed by multiplying each value by its probability, and summing them up, the expected value of the alternative. A gambler must take risks, but he wants the expected value to be in his favor. And, according to this principle, he is just as happy with Case B as with Case A, because the expected value is the same.
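The expected-value calculation from the tables above can be sketched in Python (the function and variable names are my own):

```python
# Expected value: multiply each outcome's value by its probability
# and sum over all outcomes.
def expected_value(outcomes):
    """outcomes: a list of (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

case_a = [(1.00, 1.00)]                # Case A: a sure dollar
case_b = [(0.50, 0.00), (0.50, 2.00)]  # Case B: nothing or two dollars

print(expected_value(case_a))  # 1.0
print(expected_value(case_b))  # 1.0
```

Both cases come out to the same expected value of $1, which is exactly why a gambler who cares only about expected value is indifferent between them.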
3 Problems with expected monetary value
But this principle may not be an accurate or complete description of his behavior.
According to expect value, he should prefer another alternative (Case C) to either of
these, because it is worth (in terms of expected value) 40 cents more, and “every little
bit helps”.
Fall '09
Boros