Game Theory Definitions

A player’s strategy is a complete contingent plan; that is, a strategy assigns a (probability of each) action in each possible state of the world. A pure strategy is a strategy in which the player non-randomly takes a specific action. A mixed strategy is a (non-degenerate) probability distribution over actions. A strategy profile is a vector of strategies, one for each player.

A game is a function that specifies each player’s payoff for every possible strategy profile. Payoffs are usually denominated in either dollars or Bernoulli utility amounts. The normal form (a.k.a. strategic form) of a game is a matrix in which the rows are the strategies of Player 1, the columns are the strategies of Player 2, and the entries are the players’ payoffs.

Some types of (simultaneous-move) two-player games:

- Zero-sum game: When one player “wins,” the other player “loses.”
- Coordination game: If the players can coordinate, they can both “win.”
- Prisoner’s dilemma: Both players prefer that they both “cooperate” rather than both “defect” (i.e., cheat), but each does best if he “defects” while the other player “cooperates.”

A solution concept is a rule for making a prediction about what will happen in any game.

A dominant strategy is a strategy that is optimal regardless of the other players’ strategies. Not all games have dominant strategies. Therefore, the theory that people always play dominant strategies is not a solution concept; there are games in which that theory makes no prediction.

Nash equilibrium (NE): A strategy profile such that each player’s strategy is optimal, taking the other players’ strategies as given. NE is the primary solution concept used in game theory. At least one NE exists in virtually every game we care about. If the players have dominant strategies in a game, then every NE requires them to be played. Multiple equilibria are said to occur when there is more than one NE.
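The Nash equilibrium definition above can be checked mechanically: a profile is a NE exactly when each player's strategy is a best response to the others'. The following Python sketch (not from the original notes; the function name and payoff numbers are illustrative assumptions) brute-forces the pure-strategy NE of a two-player normal-form game, using a standard prisoner's dilemma payoff matrix as the example:

```python
from itertools import product

def pure_nash_equilibria(payoffs):
    """Find all pure-strategy Nash equilibria of a two-player game.

    `payoffs` maps (row_strategy, col_strategy) -> (u1, u2).
    A profile is a NE if neither player gains by a unilateral deviation.
    """
    rows = sorted({r for r, _ in payoffs})
    cols = sorted({c for _, c in payoffs})
    equilibria = []
    for r, c in product(rows, cols):
        u1, u2 = payoffs[(r, c)]
        # Player 1 best-responds holding column c fixed;
        # Player 2 best-responds holding row r fixed.
        if (all(u1 >= payoffs[(r2, c)][0] for r2 in rows)
                and all(u2 >= payoffs[(r, c2)][1] for c2 in cols)):
            equilibria.append((r, c))
    return equilibria

# Illustrative prisoner's dilemma: "defect" is dominant for both players,
# so the unique NE is mutual defection, as the notes predict.
pd = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
print(pure_nash_equilibria(pd))  # [('defect', 'defect')]
```

Note that mutual cooperation gives both players more than the equilibrium payoff, which is precisely what makes the prisoner's dilemma a dilemma.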
An equilibrium refinement is a solution concept that predicts a subset of the (multiple) NE.

A simultaneous-move game is one in which no player’s action can be contingent on any other player’s action. A sequential-move game is one in which later players’ actions can be contingent on earlier players’ actions.

The extensive form of a game is a tree, where each node indicates a point at which a player makes a decision, and the branches extending from that node are the possible actions the player can take. Every game has a normal-form representation, but only sequential-move games have an extensive-form representation. A subgame is a new game that starts at some decision node of the original game and continues as in the original game from then on.

Subgame-perfect Nash equilibrium (SPNE): A strategy profile such that, for every subgame, the restriction of the strategy profile to that subgame is a NE. SPNE is the primary solution concept used for sequential-move games. At least one SPNE exists in virtually every sequential-move game we care about. Every SPNE is a NE, but not every NE is a SPNE. We find an SPNE by backwards induction: we start with the last mover, find his optimal action, and then work backwards to the second-to-last mover, and so on.
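The backwards-induction procedure can be sketched directly on an extensive-form tree. The Python below is an illustrative sketch, not part of the original notes: the tree encoding (dicts for decision nodes, tuples for terminal payoffs) and the entry-deterrence example game are my own assumptions. It starts at the terminal nodes and works back, recording each mover's optimal action, exactly as the last paragraph describes:

```python
def backward_induction(node):
    """Solve a perfect-information game tree by backwards induction.

    A node is either a terminal payoff tuple (u1, u2) or a dict
    {"player": i, "actions": {label: child_node}} (assumed encoding).
    Returns (payoffs, plan), where `plan` maps each decision node
    (identified by the action path reaching it) to the mover's
    optimal action -- together, the SPNE strategy profile.
    """
    plan = {}

    def solve(node, path):
        if isinstance(node, tuple):  # terminal node: just the payoffs
            return node
        i = node["player"]
        best_action, best_payoffs = None, None
        for action, child in node["actions"].items():
            payoffs = solve(child, path + (action,))
            # The mover at this node picks the action maximizing
            # his own payoff, given optimal play further down the tree.
            if best_payoffs is None or payoffs[i] > best_payoffs[i]:
                best_action, best_payoffs = action, payoffs
        plan[path] = best_action
        return best_payoffs

    return solve(node, ()), plan

# Illustrative entry-deterrence game: an entrant (player 0) stays out
# or enters; if it enters, the incumbent (player 1) fights or accommodates.
game = {"player": 0, "actions": {
    "out": (0, 2),
    "in": {"player": 1, "actions": {
        "fight":       (-1, -1),
        "accommodate": (1, 1),
    }},
}}
payoffs, plan = backward_induction(game)
print(payoffs, plan)  # (1, 1) {('in',): 'accommodate', (): 'in'}
```

In this example the incumbent's threat to fight is not credible (fighting is not optimal in the subgame after entry), so the SPNE has the entrant enter and the incumbent accommodate, illustrating why not every NE is subgame-perfect.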