Signaling game

[Figure: An extensive form representation of a signaling game]

A signaling game is a dynamic Bayesian game with two players, the sender (S) and the receiver (R). The sender has a certain type, t, which is given by nature. The sender observes his own type, while the receiver does not know the sender's type. Based on his knowledge of his own type, the sender chooses to send a message from a set of possible messages M = {m1, m2, m3, ..., mj}. The receiver observes the message but not the sender's type. The receiver then chooses an action from a set of feasible actions A = {a1, a2, a3, ..., ak}. The two players receive payoffs that depend on the sender's type, the message chosen by the sender, and the action chosen by the receiver.[1][2] A related game in game theory is the screening game, in which, rather than choosing an action based on a signal, the receiver gives the sender proposals based on the sender's type, which the sender has some control over.
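The structure above can be sketched in code. This is a minimal illustration with assumed names and payoffs (the two types, two messages, two actions, and the payoff table are invented for the example, not taken from any particular model): nature draws a type, the sender maps type to message, the receiver maps message to action, and both payoffs depend on the triple (t, m, a).

```python
import itertools

# Illustrative primitives of a two-player signaling game (all names and
# numbers here are assumptions for the sketch).
TYPES = ["t1", "t2"]
MESSAGES = ["m1", "m2"]
ACTIONS = ["a1", "a2"]
PRIOR = {"t1": 0.5, "t2": 0.5}  # nature's distribution over types

# payoffs[(t, m, a)] = (sender_utility, receiver_utility).
# Here the sender always prefers action a1, while the receiver wants to
# match the action to the type (a1 for t1, a2 for t2).
payoffs = {
    (t, m, a): (1.0 if a == "a1" else 0.0,
                1.0 if (t, a) in {("t1", "a1"), ("t2", "a2")} else 0.0)
    for t, m, a in itertools.product(TYPES, MESSAGES, ACTIONS)
}

def expected_payoffs(sender_strategy, receiver_strategy):
    """Expected (sender, receiver) payoffs under pure strategies.

    sender_strategy: dict mapping type -> message
    receiver_strategy: dict mapping message -> action
    """
    eu_s = eu_r = 0.0
    for t, p in PRIOR.items():
        m = sender_strategy[t]      # sender moves knowing t
        a = receiver_strategy[m]    # receiver moves knowing only m
        us, ur = payoffs[(t, m, a)]
        eu_s += p * us
        eu_r += p * ur
    return eu_s, eu_r

# A fully revealing strategy profile: each type sends a distinct message,
# and the receiver matches actions to the revealed types.
separating = {"t1": "m1", "t2": "m2"}
matching = {"m1": "a1", "m2": "a2"}
expected_payoffs(separating, matching)  # (0.5, 1.0)
```

Under this profile the receiver learns the type exactly and always matches it, while the sender gets his preferred action only when his type is t1.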

Costly versus cost-free signaling

One of the major uses of signaling games both in economics and biology has been to determine under what conditions honest signaling can be an equilibrium of the game. That is, under what conditions can we expect rational people or animals subject to natural selection to reveal information about their types?

If both parties have coinciding interests, that is, if they both prefer the same outcomes in all situations, then honesty is an equilibrium. (Although in most of these cases non-communicative equilibria exist as well.) However, if the parties' interests do not perfectly overlap, then maintaining an informative signaling system becomes a substantial problem.

Consider a circumstance described by John Maynard Smith regarding transfer between related individuals. Suppose a signaler can be either starving or just hungry, and she can signal that fact to another individual which has food. Suppose that she would like more food regardless of her state, but that the individual with food only wants to give her the food if she is starving. While both players have identical interests when the signaler is starving, they have opposing interests when she is only hungry. When the signaler is hungry she has an incentive to lie about her need in order to obtain the food. And if the signaler regularly lies, then the receiver should ignore the signal and do whatever he thinks best.

Determining how signaling can remain stable in these situations has concerned both economists and biologists, and both have independently suggested that signal cost might play a role. If sending a signal is costly, it might be worth the cost only for the starving individual to signal. The analysis of when costs are necessary to sustain honesty has been a significant area of research in both fields.
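The logic of the food-transfer example can be made numerical. In this sketch the payoff values are illustrative assumptions: food is worth `benefit[state]` to the signaler, signaling costs `cost`, and the donor gives food only on seeing the signal. Honesty is incentive-compatible exactly when the cost separates the two types.

```python
# A numerical sketch of Maynard Smith's food-transfer example.
# The specific benefit and cost numbers are assumptions for illustration.

def honest_signaling_stable(benefit, cost):
    """Honest signaling is stable when signaling pays only for the needy
    type: the starving type's gain from food exceeds the signal cost,
    while the merely hungry type's gain does not."""
    return benefit["starving"] > cost and benefit["hungry"] < cost

# Food is worth far more to a starving signaler than to a hungry one.
benefit = {"starving": 10.0, "hungry": 2.0}

honest_signaling_stable(benefit, cost=5.0)  # True: only the starving type signals
honest_signaling_stable(benefit, cost=0.0)  # False: the hungry type would lie
```

With an intermediate cost, a hungry signaler who lied would pay more for the signal than the food is worth to her, so she has no incentive to misrepresent her state; with free signals, both types signal and the message carries no information.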

Perfect Bayesian equilibrium

The equilibrium concept relevant for signaling games is the perfect Bayesian equilibrium, the solution concept for dynamic games of incomplete information. It is a refinement of the Bayesian Nash equilibrium, which in turn extends the Nash equilibrium to games of incomplete information.

A sender of type t_j sends a message m^*(t_j) in the set of probability distributions over M. (m^*(t_j) represents the probabilities that type t_j will send each of the messages in M.) The receiver, observing the message m, takes an action a^*(m) in the space of probability distributions over A.

A game is in perfect Bayesian equilibrium if it meets all four of the following requirements:

  • The receiver must have a belief about which types can have sent message m. These beliefs can be described as a probability distribution \mu(t_i|m), the probability that the sender has type t_i if he chooses message m. The sum over all types t_i of these probabilities has to be 1 conditional on any message m.
  • The action the receiver chooses must maximize the expected utility of the receiver given his beliefs about which type could have sent message m, \mu(t | m). This means that the sum \sum_{t_i} \mu(t_i|m)U_R(t_i,m,a) is maximized. The action a that maximizes this sum is a^*(m).
  • For each type, t, the sender chooses to send the message m^* that maximizes the sender's utility U_S (t, m,a^*(m)) given the strategy chosen by the receiver, a^*.
  • For each message m the sender can send, if there exists a type t such that m^*(t) assigns strictly positive probability to m (i.e. for each message which is sent with positive probability), the belief the receiver has about the type of the sender if he observes message m, \mu(t|m), satisfies Bayes' rule: \mu(t|m) = p(t)\,m^*(t)(m) / \sum_{t_i} p(t_i)\,m^*(t_i)(m), where p(t) is the prior probability of type t and m^*(t)(m) is the probability with which type t sends message m.
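The fourth requirement can be computed directly. The sketch below applies Bayes' rule to a prior and a (possibly mixed) sender strategy; the particular prior and strategy are illustrative assumptions chosen to produce a semi-pooling example.

```python
# A sketch of requirement 4: on the equilibrium path, the receiver's
# belief mu(.|m) follows from the prior p(t) and the sender's strategy
# sigma[t][m] = Pr(send message m | type t) by Bayes' rule.

def posterior(prior, sigma, m):
    """Return the belief mu(.|m), or None if m is sent with probability 0
    (off the equilibrium path, where requirement 4 places no restriction)."""
    total = sum(prior[t] * sigma[t].get(m, 0.0) for t in prior)
    if total == 0.0:
        return None
    return {t: prior[t] * sigma[t].get(m, 0.0) / total for t in prior}

prior = {"t1": 0.5, "t2": 0.5}
# A semi-pooling strategy: t1 always sends m1; t2 mixes evenly over m1 and m2.
sigma = {"t1": {"m1": 1.0}, "t2": {"m1": 0.5, "m2": 0.5}}

posterior(prior, sigma, "m1")  # {'t1': 2/3, 't2': 1/3}
posterior(prior, sigma, "m2")  # {'t1': 0.0, 't2': 1.0}
```

Observing m1 leaves the receiver uncertain (both types send it with positive probability), while m2 fully reveals type t2, illustrating the partial information transmission of a semi-pooling strategy.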

The perfect Bayesian equilibria of such a game can be divided into three categories: pooling equilibria, semi-pooling (also called semi-separating) equilibria, and separating equilibria. In a pooling equilibrium, senders with different types all choose the same message. In a semi-pooling equilibrium, some types of senders choose the same message while other types choose different messages. In a separating equilibrium, senders with different types always choose different messages. Therefore, if there are more types of senders than there are messages, the equilibrium can never be separating (though it may be semi-separating).
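For pure sender strategies, the three categories reduce to a simple counting condition on the messages sent, which can be stated as code (the type and message names are placeholders):

```python
# Classify a pure sender strategy (dict mapping type -> message) into the
# three equilibrium categories discussed above.

def classify(sender_strategy):
    messages = list(sender_strategy.values())
    if len(set(messages)) == 1:
        return "pooling"            # every type sends the same message
    if len(set(messages)) == len(messages):
        return "separating"         # every type sends a distinct message
    return "semi-pooling"           # some types share a message, others don't

classify({"t1": "m1", "t2": "m1"})                 # 'pooling'
classify({"t1": "m1", "t2": "m2"})                 # 'separating'
classify({"t1": "m1", "t2": "m1", "t3": "m2"})     # 'semi-pooling'
```

The third call also illustrates the remark above: with three types and only two messages, at least two types must share a message, so a fully separating strategy is impossible.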

Applications of signaling games

Signaling games describe situations where one player has information the other player does not have. These situations of asymmetric information are very common in economics and behavioral biology.

Philosophy

The first known use of signaling games occurs in David K. Lewis' Ph.D. dissertation (and later book) Convention.[3] Replying to W.V.O. Quine,[4][5] Lewis attempts to develop a theory of convention and meaning using signaling games. In his most extreme comments, he suggests that understanding the equilibrium properties of the appropriate signaling game captures all there is to know about meaning:

I have now described the character of a case of signaling without mentioning the meaning of the signals: that two lanterns meant that the redcoats were coming by sea, or whatever. But nothing important seems to have been left unsaid, so what has been said must somehow imply that the signals have their meanings.[6]

The use of signaling games has been continued in the philosophical literature. Others have used evolutionary models of signaling games to describe the emergence of language. Work on the emergence of language in simple signaling games includes models by Huttegger,[7] Grim, et al.,[8] Skyrms,[9][10] and Zollman.[11] Harms,[12][13] and Huttegger,[14] have attempted to extend the study to include the distinction between normative and descriptive language.

Economics

The first application of signaling games to economic problems was Michael Spence's model of job-market signaling.[15] Spence describes a game in which workers have a certain ability (high or low) that the employer does not know. The workers send a signal through their choice of education. The cost of education is higher for a low-ability worker than for a high-ability worker. Employers observe the workers' education but not their ability, and offer the worker a high or low wage. The model assumes that education does not raise a worker's ability; rather, only workers of high ability can attain a given level of education without its cost exceeding their resulting increase in wage. In other words, the benefits of education exceed the costs only for high-ability workers, so only high-ability workers acquire education.
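Spence's separating condition can be checked numerically. In this sketch the wage and cost parameters are illustrative assumptions: education level e costs c_low·e for a low-ability worker and c_high·e for a high-ability one, with c_high < c_low, and employers pay w_high to educated workers and w_low to uneducated ones.

```python
# A numerical sketch of the separating condition in Spence's job-market
# signaling model. The parameter values are assumptions for illustration.

def separating_equilibrium(e, w_high, w_low, c_high, c_low):
    """A separating equilibrium at education level e requires that the
    wage premium covers the high type's education cost but not the
    low type's (the single-crossing property makes this possible)."""
    high_educates = w_high - c_high * e >= w_low   # high type prefers educating
    low_abstains = w_high - c_low * e <= w_low     # low type prefers not to
    return high_educates and low_abstains

# With a wage premium of 4 and per-unit costs 1 (high) and 2 (low),
# any education level e in [2, 4] supports separation.
separating_equilibrium(e=3, w_high=10, w_low=6, c_high=1, c_low=2)  # True
separating_equilibrium(e=1, w_high=10, w_low=6, c_high=1, c_low=2)  # False
```

At e = 1 the education requirement is too cheap: even the low-ability worker would profit from mimicking the high type, so the signal would not distinguish the types.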

Biology

Valuable advances have been made by applying signaling games to a number of biological questions. The most notable is Alan Grafen's (1990) handicap model of mate-attraction displays.[16] The antlers of stags, the elaborate plumage of peacocks and birds-of-paradise, and the song of the nightingale are all such signals. Grafen's analysis of biological signaling is formally similar to Michael Spence's classic monograph on economic market signaling.[17] More recently, a series of papers by Getty[18][19][20][21] shows that Grafen's analysis, like Spence's, rests on the critical simplifying assumption that signalers trade off costs against benefits additively, the way humans invest money to increase income in the same currency. This assumption may be valid for some biological signaling systems, but it does not hold for multiplicative tradeoffs, such as the survival-cost versus reproduction-benefit tradeoff that is assumed to mediate the evolution of sexually selected signals.
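The additive-versus-multiplicative distinction can be illustrated with a toy optimization. The functional forms and coefficients below are assumptions invented for the sketch (they are not taken from Grafen or Getty); the point is only that the same benefit and cost curves favor different signal intensities under the two accounting schemes.

```python
# A toy comparison of additive vs. multiplicative cost-benefit tradeoffs
# in signal evolution. All functional forms and numbers are illustrative.

def best_signal(fitness, grid):
    """Return the signal intensity on the grid that maximizes fitness."""
    return max(grid, key=fitness)

grid = [i / 1000 for i in range(1001)]   # candidate signal intensities in [0, 1]
repro = lambda s: 1.0 + 2.0 * s          # reproductive benefit rises with signaling
surv = lambda s: 1.0 - 0.8 * s           # survival falls with signaling

# Additive accounting (money-like, as in Spence): the net payoff
# repro + surv = 2.0 + 1.2*s keeps rising, so the optimum is maximal signaling.
s_add = best_signal(lambda s: repro(s) + surv(s), grid)   # 1.0

# Multiplicative accounting (fitness = survival * reproduction): the
# product (1 + 2s)(1 - 0.8s) peaks at an interior optimum, s = 0.375 here.
s_mult = best_signal(lambda s: repro(s) * surv(s), grid)  # 0.375
```

Under multiplicative fitness, pushing the signal to the additive optimum would drive survival, and hence total fitness, down, which is the substance of Getty's critique.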

Charles Godfray (1991) modeled the begging behavior of nestling birds as a signaling game.[22] The nestlings' begging not only informs the parents that the nestlings are hungry, but also attracts predators to the nest. The parents and nestlings are thus in conflict: the nestlings benefit if the parents work harder to feed them than the parents' own optimal level of investment would warrant, since the parents are trading off investment in the current nestlings against investment in future offspring.

Pursuit-deterrent signals have also been modeled as signaling games.[23] Thomson's gazelles are known sometimes to perform a 'stot', a jump several feet into the air with the white tail showing, when they detect a predator. Alcock and others have suggested that this action signals the gazelle's speed to the predator. The action successfully distinguishes types because it would be impossible, or too costly, for a sick or weak animal to perform; the predator is therefore deterred from chasing a stotting gazelle, which is evidently agile and would prove hard to catch.

The concept of information asymmetry in molecular biology has long been apparent.[24] Although molecules are not rational agents, simulations have shown that through replication, selection, and genetic drift, molecules can behave according to signaling game dynamics. Such models have been proposed to explain, for example, the emergence of the genetic code from an RNA and amino acid world.[25]

References

  1. [Citation details missing from source.]
  2. [Citation details missing from source.]
  3. [Citation details missing from source.]
  4. [Citation details missing from source.] (Reprinting)
  5. [Citation details missing from source.]
  6. Lewis (1969), p. 124.
  7. [Citation details missing from source.]
  8. [Citation details missing from source.]
  9. [Citation details missing from source.]
  10. [Citation details missing from source.]
  11. [Citation details missing from source.]
  12. [Citation details missing from source.]
  13. [Citation details missing from source.]
  14. [Citation details missing from source.]
  15. [Citation details missing from source.]
  16. [Citation details missing from source.]
  17. [Citation details missing from source.]
  18. [Citation details missing from source.]
  19. [Citation details missing from source.]
  20. [Citation details missing from source.]
  21. [Citation details missing from source.]
  22. [Citation details missing from source.]
  23. [Citation details missing from source.]
  24. Maynard Smith, John (2000). "The Concept of Information in Biology". Philosophy of Science 67(2): 177–194.
  25. [Citation details missing from source.]