Economics 405/505 Introduction to Game Theory
Prof. Rui Zhao

Mixed Strategies

Motivating Example: The following Matching Pennies game does not have a pure strategy Nash equilibrium.

    payoffs    Heads     Tails
    Heads      1, -1     -1, 1
    Tails      -1, 1     1, -1

1 Probability

1.1 Probability Distributions

If we toss a coin, one of two outcomes {H = Heads, T = Tails} could occur. If the coin is fair, then each outcome occurs with equal likelihood, or probability. The vector of probabilities (1/2, 1/2) is a probability distribution over the set of outcomes {H, T}, indicating that outcome H occurs with probability 1/2 and outcome T occurs with probability 1/2.

We can have different probability distributions over the same set of outcomes. For example, if a somewhat biased coin is being tossed so that outcome H occurs with probability 2/3 and outcome T occurs with probability 1/3, then we have a probability distribution (abbreviated as prob. dist.) (2/3, 1/3) over the same set of outcomes.

When we toss a fair die, the prob. dist. over the six outcomes {1 dot, 2 dots, ..., 6 dots} is (1/6, 1/6, 1/6, 1/6, 1/6, 1/6).

1.2 Random Variables

When we attach numbers to random outcomes, we obtain random variables. For example, consider the prob. dist. (1/2, 1/2) on the set of outcomes {H, T} and let X be as follows:

    X = 1 if H occurs
    X = 0 if T occurs

Then X is a random variable, which can take on the two values {1, 0} with respective probabilities (1/2, 1/2).

Let Y be as follows:

    Y = 1 if H occurs
    Y = -1 if T occurs

Then Y is a random variable, which can take on the two values {1, -1} with respective probabilities (1/2, 1/2).

1.3 Expected Values of Random Variables

The expected value of the random variable X defined above can be computed as follows:

    E(X) = 1 × 1/2 + 0 × 1/2 = 1/2.

Definition 1. Formally, if X is a random variable which can take on n values x_1, x_2, ..., x_n with probabilities p_1, p_2, ..., p_n respectively, then the expected value of X is

    E(X) = Σ_{i=1}^{n} p_i x_i = p_1 x_1 + ... + p_n x_n.

In the fair die example, we can define a random variable Z so that Z = 1 if the outcome is 1 dot, Z = 2 if the outcome is 2 dots, and so on. Thus Z takes on each of the six values 1, 2, ..., 6 with equal probability 1/6. We have

    E(Z) = 1 × 1/6 + 2 × 1/6 + ... + 6 × 1/6 = 7/2.

2 Expected Utility and Mixed Strategies

2.1 Expected Utility (or Payoff)

In the fair coin example, suppose further that a person receives a payoff (or utility) u(H) = $1 if H occurs and receives payoff u(T) = 0 if T occurs. Then the person's payoff is a random variable which takes on the values {1, 0} with probabilities (1/2, 1/2) respectively. The expected payoff of the person is the expected value of his random payoff:

    E(u) = u(H) × prob(H) + u(T) × prob(T) = 1 × 1/2 + 0 × 1/2 = 1/2.
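To make Definition 1 concrete, here is a minimal Python sketch that is not part of the original notes. The function name expected_value and the use of Python's fractions module are illustrative choices; the inputs simply reproduce the coin, die, and expected-utility calculations worked out above.

```python
from fractions import Fraction

def expected_value(values, probs):
    """E(X) = p_1*x_1 + ... + p_n*x_n, as in Definition 1 of the notes."""
    # Sanity check: the probabilities must form a probability distribution.
    assert sum(probs) == 1, "probabilities must sum to 1"
    return sum(p * x for x, p in zip(values, probs))

half = Fraction(1, 2)
sixth = Fraction(1, 6)

# X from Section 1.2: X = 1 if Heads, 0 if Tails, under a fair coin.
print(expected_value([1, 0], [half, half]))       # prints 1/2

# Z from Section 1.3: the face value of a fair die.
print(expected_value(range(1, 7), [sixth] * 6))   # prints 7/2

# Expected utility from Section 2.1: u(H) = 1 dollar, u(T) = 0, fair coin.
print(expected_value([1, 0], [half, half]))       # prints 1/2
```

Exact fractions are used so the printed results match the hand computations (1/2, 7/2, 1/2) without floating-point rounding; the last call also illustrates that an expected payoff is just the expected value of the payoff random variable.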