
A Course in Game Theory - Martin J. Osborne

2 Nash Equilibrium

Nash equilibrium is one of the most basic concepts in game theory. In this chapter we describe it in the context of a strategic game and in the related context of a Bayesian game.

2.1 Strategic Games

2.1.1 Definition

A strategic game is a model of interactive decision-making in which each decision-maker chooses his plan of action once and for all, and these choices are made simultaneously. The model consists of a finite set N of players and, for each player i, a set A_i of actions and a preference relation on the set of action profiles. We refer to an action profile as an outcome, and denote the set of outcomes by A. The requirement that the preferences of each player i be defined over A, rather than A_i, is the feature that distinguishes a strategic game from a decision problem: each player may care not only about his own action but also about the actions taken by the other players. To summarize, our definition is the following.

Definition 11.1 A strategic game consists of
• a finite set N (the set of players)
• for each player i a nonempty set A_i (the set of actions available to player i)
• for each player i a preference relation ≿_i on A (the preference relation of player i).

If the set A_i of actions of every player i is finite then the game is finite.
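To make the definition concrete, here is a minimal sketch (not from the book) of a finite strategic game in Python, using the Prisoner's Dilemma as the example. Players' preferences are represented by numeric payoffs, and, as the definition requires, each payoff is a function of the whole action profile in A, not of the player's own action alone. The variable names (players, actions, payoff) are illustrative choices, not the book's notation.

```python
from itertools import product

# The finite set N of players and, for each player i, the action set A_i.
players = ["row", "col"]
actions = {"row": ["C", "D"], "col": ["C", "D"]}

# Preferences represented by payoffs: each outcome (action profile) in A
# maps to a pair (u_row, u_col). Prisoner's Dilemma values are used here.
payoff = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 4),
    ("D", "C"): (4, 0),
    ("D", "D"): (1, 1),
}

# The set of outcomes A is the Cartesian product of the action sets.
A = list(product(actions["row"], actions["col"]))

# The defining feature of a strategic game: the row player's payoff at
# ("C", "C") differs from that at ("C", "D"), even though the row player's
# own action is "C" in both profiles. Preferences are over A, not A_i.
assert payoff[("C", "C")][0] != payoff[("C", "D")][0]
```

Any ordinal representation of the preferences would do equally well here; only the ranking of outcomes matters, not the particular numbers.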