Several years ago, the British social scientist Richard Titmuss suggested (see The Gift Relationship: From Human Blood to Social Policy) that paying people to donate their blood would reduce the quality of the blood available for transfusions and might decrease the quantity as well. While economists and health policy researchers were skeptical at first, subsequent studies have shown that his hypothesis was correct. Titmuss argued that paying blood donors would lead to a "crowding out effect" (also known as the "overjustification effect"). In other words, providing monetary rewards actually decreased the altruistic motivation to donate blood.
More recently, the British economist Julian Le Grand suggested that public policy generally conceives of humans as knights, knaves, or pawns (see Le Grand's excellent book, Motivation, Agency, and Public Policy: Of Knights and Knaves, Pawns and Queens). Building upon my previous discussions of intrinsic versus extrinsic motivation (for more, please see my post "Holes"), the idea is that individuals are generally motivated by virtue/altruism ("knights") or by self-interest ("knaves"), or are simply passive victims of circumstance ("pawns"). Sachin Jain and Christine Cassel used Le Grand's model in their discussion of physician "pay for performance" models.
With both Titmuss and Le Grand in mind, I wanted to go back to the most famous problem in game theory, the Prisoner's Dilemma. Recall that the "dilemma" in this game refers to the fact that the individually rational option ("Defect") produces a worse outcome than the collectively optimal option ("Cooperate"). As you can see in the pay-off matrix below, if both prisoners work together and cooperate, they are each sentenced to only one year in jail. However, each prisoner's individually best option is to defect (testify), and when both defect, each receives a three-year jail sentence!
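If it helps to see this spelled out, here is a minimal sketch of that pay-off matrix in Python. The one-year and three-year sentences come from the scenario above; the pay-offs for unilateral defection (the defector going free while the silent prisoner serves five years) are assumed here purely to match the classic formulation.

```python
# Pay-offs are years in jail (lower is better), keyed by (my move, their move).
PAYOFF = {
    ("Cooperate", "Cooperate"): 1,  # both stay silent: 1 year each
    ("Cooperate", "Defect"):    5,  # I stay silent, they testify: 5 years for me (assumed value)
    ("Defect",    "Cooperate"): 0,  # I testify, they stay silent: I go free (assumed value)
    ("Defect",    "Defect"):    3,  # both testify: 3 years each
}

def best_response(their_move: str) -> str:
    """Return the move that minimizes my jail time, given the other player's move."""
    return min(("Cooperate", "Defect"), key=lambda my_move: PAYOFF[(my_move, their_move)])

# Whatever the other prisoner does, defecting is individually better...
assert best_response("Cooperate") == "Defect"
assert best_response("Defect") == "Defect"
# ...yet mutual defection (3 years each) is worse than mutual cooperation (1 year each).
```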
Of course, in the example we previously discussed, players "played" the game only once. That doesn't seem like a very realistic scenario (perhaps with the one exception being the literal case of two prisoners held in separate jail cells). If we are going to use the Prisoner's Dilemma to model more complex problems, we need to know how the game works when (1) the game is iterated (repeated multiple times) and (2) there are more than just two players. Recall from an earlier discussion (see "Tit for Tat") that Robert Axelrod found that the best strategy is "Tit for Tat" (a relatively simple strategy in which the player cooperates on the first move and then, on every subsequent move, does whatever the other player did on the previous move). In other words, when the game is repeated, it is possible to identify a strategy that actually rewards cooperation!
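To see why Tit for Tat does so well, here is a small, purely illustrative simulation of the iterated game. The Tit for Tat rule itself is Axelrod's; the pay-off numbers repeat the sketch above (including the assumed values for unilateral defection), and the ten-round length is an arbitrary choice for illustration.

```python
# Same pay-off table as above (years in jail; the 0/5 unilateral values are assumed).
PAYOFF = {("Cooperate", "Cooperate"): 1, ("Cooperate", "Defect"): 5,
          ("Defect", "Cooperate"): 0, ("Defect", "Defect"): 3}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first move, then copy the other player's previous move."""
    return "Cooperate" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "Defect"

def play(strategy_a, strategy_b, rounds=10):
    """Play an iterated Prisoner's Dilemma and return total jail years for each player."""
    history_a, history_b = [], []
    years_a = years_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        years_a += PAYOFF[(move_a, move_b)]
        years_b += PAYOFF[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return years_a, years_b

# Two Tit for Tat players settle into mutual cooperation (10 years each over 10 rounds),
# far better than two defectors (30 years each): repetition rewards cooperation.
print(play(tit_for_tat, tit_for_tat))      # (10, 10)
print(play(always_defect, always_defect))  # (30, 30)
```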
But what happens when there are more than two players? First, let's change the scenario slightly and get away from the subject of crime and punishment. Rather than two prisoners, let's start the game with two farmers. The strategic choice is whether or not to limit water utilization during the late, dry summer months. If both Farmer A and Farmer B limit their water utilization (Cooperate/Cooperate in the pay-off matrix above), it's better for the environment and the cost is distributed evenly between them. However, if Farmer A chooses to limit water utilization ("Cooperate") and Farmer B does not ("Defect"), then Farmer A incurs the cost alone, while Farmer B pays nothing. The reverse is also true. As a result, neither farmer chooses to limit water utilization, and both end up worse off ("Defect"/"Defect" in the pay-off matrix above).
Now we can easily envision a scaled-up version of our Prisoner's Dilemma using farmers and water utilization, i.e. one in which there are more than just the two farmers. With more than two players, the situation gets even worse, and we have what is called a collective action problem: a situation in which everyone would be better off cooperating, but each individual chooses not to because it is in his or her individual interest not to do so. These problems typically result in either the so-called "Tragedy of the Commons" or the "Free Rider" problem. We'll talk about these two problems next time.
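In the meantime, here is a minimal sketch of why the multi-player version breaks down. The numbers (ten farmers, a cost of 1 unit for limiting water use, and 3 units of shared benefit produced per cooperating farmer) are assumptions chosen purely for illustration; they do not come from any of the sources above.

```python
# A toy "public goods" version of the farmers' game, under assumed, illustrative numbers.
N_FARMERS = 10
COST_OF_LIMITING = 1.0   # paid only by farmers who limit their water use (assumed)
SHARED_BENEFIT = 3.0     # produced per cooperating farmer, split among everyone (assumed)

def payoff(i_cooperate: bool, other_cooperators: int) -> float:
    """One farmer's net pay-off given their own choice and how many others cooperate."""
    total_cooperators = other_cooperators + (1 if i_cooperate else 0)
    my_share = SHARED_BENEFIT * total_cooperators / N_FARMERS
    return my_share - (COST_OF_LIMITING if i_cooperate else 0.0)

# No matter how many of the other farmers limit their use, each individual farmer
# does strictly better by defecting...
for others in range(N_FARMERS):
    assert payoff(False, others) > payoff(True, others)

# ...yet everyone cooperating beats everyone defecting: 2.0 vs 0.0 per farmer.
print(payoff(True, N_FARMERS - 1))   # 2.0
print(payoff(False, 0))              # 0.0
```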