Selfless behaviour and cooperation cannot be taken for granted. Mohammad Salahshour of the Max Planck Institute for Mathematics in the Sciences (now at the Max Planck Institute of Animal Behavior) has used a game theory-based approach to show why it can be worthwhile for individuals to set their self-interest aside.
One of the most fundamental questions facing humanity is: why do we behave morally? It is by no means self-evident that under certain circumstances we set our self-interest aside and put ourselves in the service of a group, sometimes to the point of self-sacrifice. Many theories have been developed to get to the bottom of this moral conundrum. Two proposed solutions are well known: individuals help their relatives so that shared genes survive (kin selection), and the principle of “you scratch my back and I’ll scratch yours” applies: if people help each other, everyone benefits in the end (reciprocity).
Prisoner’s dilemma combined with a coordination game
Mathematician Mohammad Salahshour of the Max Planck Institute for Mathematics in the Sciences in Leipzig, Germany, has used the tools of game theory, which studies how people make rational decisions in conflict situations, to explain the emergence of moral norms. For Salahshour, the questions at the outset were: why do moral norms exist in the first place, and why do we have different, even contrasting, moral norms? For example, while some norms, such as “help others,” promote self-sacrificing behaviour, others, such as dress codes, appear to have little to do with curbing selfishness. To answer these questions, Salahshour coupled two games. The first is the classic prisoner’s dilemma, in which two players must each decide whether to cooperate for a small reward or betray the other for a much larger one. This game is a typical example of a social dilemma, in which the success of the group as a whole requires individuals to behave selflessly: everybody loses out if too many members of a group behave selfishly, compared to a scenario in which everybody acts altruistically, yet if only a few individuals behave selfishly, they can do better than their altruistic team members. The second is a game that captures typical decisions within groups, such as a coordination task, the distribution of resources, the choice of a leader, or conflict resolution. Many of these problems can ultimately be categorized as coordination or anti-coordination problems.
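To make the structure of the two games concrete, here is a minimal Python sketch with illustrative payoff values. The numbers are assumptions chosen only to satisfy the defining inequalities of each game; they are not parameters from Salahshour's model.

```python
# Illustrative payoff matrices (row player's payoff only); the numbers are
# assumptions for exposition, not values taken from Salahshour's paper.

# Prisoner's dilemma: defecting against a cooperator pays most (T = 5),
# mutual cooperation (R = 3) beats mutual defection (P = 1), and the betrayed
# cooperator is left with the sucker's payoff (S = 0), so T > R > P > S.
PRISONERS_DILEMMA = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

# Pure coordination game: both players profit only if they choose the same
# option; a mismatch yields nothing for either of them.
COORDINATION = {
    ("A", "A"): 4, ("A", "B"): 0,
    ("B", "A"): 0, ("B", "B"): 4,
}

def payoff(game, my_move, partner_move):
    """Return the row player's payoff for one round of the given game."""
    return game[(my_move, partner_move)]

# In the isolated prisoner's dilemma, defection exploits cooperation ...
assert payoff(PRISONERS_DILEMMA, "D", "C") > payoff(PRISONERS_DILEMMA, "C", "C")
# ... while in the coordination game, failing to coordinate is costly.
assert payoff(COORDINATION, "A", "B") < payoff(COORDINATION, "A", "A")
```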
Without coupling the two games, the picture is clear: in the prisoner’s dilemma, cooperation does not pay off, and self-interested behaviour is the best choice from the individual’s perspective, provided there are enough people who act selflessly. But individuals who act selfishly cannot solve coordination problems efficiently and lose a lot of resources because they fail to coordinate their activity. The situation can be completely different when the results of the two games are considered as a whole and moral norms that favour cooperation are at work: cooperation in the prisoner’s dilemma can suddenly pay off, because the gain in the second game more than compensates for the loss in the first.
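The arithmetic behind this compensation can be illustrated with a toy coupling. Assume, purely for illustration, a norm under which partners coordinate only with those who cooperated in the prisoner's dilemma; this is one simple way to couple the games, not necessarily the mechanism of the published model.

```python
# Toy coupling (an assumption for illustration, not the published mechanism):
# partners follow a norm of coordinating only with those who cooperated in
# the prisoner's dilemma, so defectors forfeit the coordination gain.

T, R, P, S = 5, 3, 1, 0   # prisoner's dilemma: temptation, reward, punishment, sucker
COORD_GAIN = 4            # payoff for coordinating successfully; 0 on a mismatch

def combined_payoff(i_cooperate: bool) -> int:
    """Total payoff against a cooperating, norm-following partner:
    one prisoner's dilemma round plus one coordination round."""
    pd_payoff = R if i_cooperate else T              # defection exploits the cooperator ...
    coord_payoff = COORD_GAIN if i_cooperate else 0  # ... but loses the coordination gain
    return pd_payoff + coord_payoff

assert combined_payoff(True) == 7    # cooperate: 3 + 4
assert combined_payoff(False) == 5   # defect:    5 + 0
# In the dilemma alone, defection wins (5 > 3); in the coupled game,
# cooperation wins (7 > 5) because the coordination gain outweighs the loss.
```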
From self-interest to coordination and cooperation
As a result of this process, not only cooperative behaviour but also a social order emerges. All individuals benefit from it, and for this reason moral behaviour pays off for them. “In my evolutionary model, there were no selfless behaviours at the beginning, but more and more moral norms emerged as a result of the coupling of the two games,” Salahshour reports. “Then I observed a sudden transition to a system where there is a lot of cooperation.” In this “moral state,” a set of coordination norms evolves that helps individuals coordinate their activity better, and it is precisely through this that social norms and moral standards can emerge. Because the coordination norms favour cooperation, cooperation turns out to be a rewarding behaviour for the individual as well. Mohammad Salahshour: “A moral system behaves like a Trojan horse: once established out of the individuals’ self-interest to promote order and organization, it also brings self-sacrificing cooperation.”
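To see how such a transition could play out over time, here is a minimal replicator-dynamics toy in Python. It is a sketch under the same illustrative assumptions as above (cooperators earn the coordination gain only among themselves when the games are coupled), not the evolutionary model from the study itself.

```python
# Replicator-dynamics toy: a well-mixed population of cooperators and
# defectors; strategies that earn above-average payoffs grow in frequency.
# Payoff numbers and the coupling rule are illustrative assumptions.

T, R, P, S = 5, 3, 1, 0   # prisoner's dilemma payoffs
COORD_GAIN = 4            # extra payoff cooperators earn among themselves when coupled

def evolve(x, coupled, steps=300, dt=0.1):
    """Return the cooperator frequency after iterating the replicator
    equation, starting from frequency x."""
    for _ in range(steps):
        bonus = COORD_GAIN if coupled else 0
        fit_c = x * (R + bonus) + (1 - x) * S    # expected payoff of a cooperator
        fit_d = x * T + (1 - x) * P              # expected payoff of a defector
        x += dt * x * (1 - x) * (fit_c - fit_d)  # replicator update
    return x

print(evolve(0.6, coupled=False))  # approaches 0: cooperation dies out
print(evolve(0.6, coupled=True))   # approaches 1: cooperation takes over
```

In the uncoupled case defectors always earn more and cooperation vanishes; once the coordination gain is tied to cooperation, a population that starts with enough cooperators tips over into near-universal cooperation, mirroring the sudden transition Salahshour describes.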
Through his work, Salahshour hopes to better understand social systems. “This can help improve people’s lives in the future,” he explains. “But you can also use my game-theoretic approach to explain the emergence of social norms in social media. There, people exchange information and make strategic decisions at the same time — for example, who to support or what cause to support.” Again, he said, two dynamics are at work at once: the exchange of information and the emergence of cooperative strategies. Their interplay is not yet well understood — but perhaps game theory will soon shed new light on this topical issue as well.
Story Source:
Materials provided by Max-Planck-Gesellschaft.