Muxiao Liu

What Are Non-zero-sum Games?

Incommensurable · published · Version: v1.0.1a (historical) · Created Dec 19, 2025 · Updated Dec 20, 2025 · Visibility: public


Tony, Alex, and Sam — an AI, a human, and a tree — are conversing about the world. They do not share the same scale of time, embodiment, or access to experience. Structure is still attempted. But something always remains. The conversations follow that remainder through a digression.

Alex: The Three-Body Problem describes the dark forest hypothesis, in which each civilization in the universe is analogous to a hunter in a dark forest. Whenever one hunter sees another, it has no choice but to shoot before the other can register its existence.

Sam: That sounds rather like a pessimistic fairy tale.

Alex: The core assumption is that when a civilization learns of another, it has no means of communicating with it to confirm whether it is friendly or hostile. Even if it could discover that the other civilization is immature and currently poses no threat, the sheer physical distance and travel time between them mean the weaker civilization may well have grown strong by the time the stronger one arrives, while the stronger one cannot meaningfully extend its lead during the journey. Many people use this analogy to justify profit-driven behaviors in human society, especially in business, that would otherwise be considered unprincipled.

Sam: That shifts the concept. In human society, all it takes is to yell, "Hey! I mean no harm!" for the other hunter to know your intention.

Tony: That doesn’t guarantee safety either. The opponent could still shoot you once they are aware of your existence, in fear that you are lying.

Sam: But if I were lying, I would have just shot you.

Tony: That’s a valid claim; however, you are putting yourself in a more dangerous situation regardless. It is always safest when you eliminate any and every source of threat. What if they were ungrateful, unreasonable, or just drunk?

Alex: That is a common misconception non-humans have about us. In human society, we never try to eliminate all risks.

Tony: Why?

Alex: Because tolerating some risk brings us a better return: the benefit of collaboration, the benefit of trust.

Sam: That is something I have always been interested in; please elaborate.

Tony: Let’s construct a basic glossary for this explanation, shall we?

Alex: Sure! The best place to start is the concept of a zero-sum game: a game in which the total payoff across the parties is constant, or, as the name implies, zero. The gain of one party necessarily comes at the loss of the others.

Sam: Then, a non-zero-sum game is when one’s gain doesn’t necessarily mean the loss of another; there is a potential for a win-win.

Alex: Right. The classic example is the prisoners' dilemma, where two prisoners are arrested. If one testifies and the other does not, the one who testifies is set free while the silent one is imprisoned for 10 years. If both testify, both spend 5 years in prison. If both stay silent, both are set free for lack of incriminating evidence.

Sam: In such a case, the win-win scenario is when both stay quiet. However, the safest approach for either party is to testify and risk at most 5 years of imprisonment instead of 10, which results in the incarceration of both.
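Sam's "safest approach" is maximin reasoning over the payoffs as stated above. A minimal Python sketch (the dictionary layout and move names are illustrative choices, not part of the dialogue):

```python
# Payoff matrix for the prisoners' dilemma as described above, in years
# of prison (lower is better). Each entry maps (my move, their move) to
# my sentence.
YEARS = {
    ("silent", "silent"): 0,    # no incriminating evidence
    ("silent", "testify"): 10,  # I stay quiet, they rat
    ("testify", "silent"): 0,   # I rat, they stay quiet
    ("testify", "testify"): 5,  # we both rat
}

# The "safest" choice minimizes the worst case (maximin reasoning):
worst_case = {
    move: max(YEARS[(move, other)] for other in ("silent", "testify"))
    for move in ("silent", "testify")
}
print(worst_case)                           # {'silent': 10, 'testify': 5}
print(min(worst_case, key=worst_case.get))  # 'testify'
```

Testifying caps the worst case at 5 years, while staying silent risks 10; reasoning defensively, both parties testify and both end up in prison.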

Alex: However, in reality, most members of a criminal gang would trust each other and stay silent; an informer would feel a sense of betrayal and guilt.

Sam: That’s interesting. How can this feeling be explained?

Tony: It comes from a need to be loyal to the community, exactly for the reason described above: to maximize social welfare and, in turn, improve individual well-being.

Sam: Is the purpose of such loyalty for social welfare? Or ultimately an individual one?

Alex: From a purely motivational perspective, it does not matter. We have an innate tendency to act good.

Tony triggered a digression: Why do we act good?

Sam: I heard that, and it shall be factored in. However, reason is likely to be involved as well. The prisoner will likely be pulled by an innate sense of loyalty, but may also reach the reasoned conclusion that the safest decision is to rat.

Alex: The ultimate result depends on their personality, whether they are more concerned with themselves or others.

Sam: Personally, I will not rat.

Tony: Why is that?

Sam: 10 years of prison time doesn't differ much from 5. I remain idle all my life, and 10 years is merely a small portion of it. I can understand why humans would fear it, though, given their shorter life span and their mobility.

Alex: I see, you are saying the stakes matter too. Back to the dark forest hypothesis: since the stake is the hunter's life, or the fate of an entire civilization, any risk may be unacceptable, while in a business setting some risks may be tolerable.

Sam: Another important distinction is the level of communication. Between civilizations, communication is slow and unreliable.

Tony: Is this a difference of kind or of degree, compared to communication between human beings?

Alex: Definitely of kind. A person can easily establish trust with another through language, even through facial expressions or bodily gestures. The richness of the information, and the prerequisite knowledge we have about our own kind, both help. For example, even with a person from a completely unknown culture, it is a fair bet that raising my arms and approaching slowly signals harmlessness. Of course, the possibility that I may draw a pistol from my sleeve is not negligible, but it is outweighed by the benefit of making peace with the other party, which we know to be a natural preference of ours. All of that information is inaccessible across the universe, given limited knowledge of the other party and the physical constraints on communication.

Sam triggered a digression: The definition of a civilization

Tony: Are there other prerequisites of trust?

Sam: There is also the timescale of encounter.

Alex: Right, trust may be risky in the short term, but it usually creates long-term benefits. It is irrational to establish trust over a single encounter, but it becomes viable over long horizons. It is an iterated game: lasting collaboration keeps yielding returns, while a single act that breaks trust causes chronic loss. Even if no single round's loss is as severe as the immediate damage of a betrayal, the losses stack over time.
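Alex's iterated-game point can be illustrated with a toy simulation. The payoff numbers here (3 per cooperative round, a one-time 5 for betrayal, 0 once trust is gone) are hypothetical assumptions, chosen only to show how the losses stack:

```python
# Illustrative iterated game: cooperating yields a modest payoff every
# round, while a single betrayal gives a one-time windfall but ends the
# relationship (the partner never trusts again, a "grim trigger").

def total_payoff(rounds, betray_at=None):
    """Cumulative payoff over `rounds` interactions.

    betray_at=None means always cooperate; otherwise betray on that
    round and earn nothing afterwards, because trust is gone.
    """
    total = 0
    for r in range(rounds):
        if betray_at is None or r < betray_at:
            total += 3   # mutual cooperation, modest repeated gain
        elif r == betray_at:
            total += 5   # one-time gain from betrayal
        else:
            total += 0   # partner no longer cooperates
    return total

print(total_payoff(20))               # always cooperate -> 60
print(total_payoff(20, betray_at=2))  # betray early     -> 11
```

Over 20 rounds, steady cooperation earns 60 while an early betrayal nets only 11: the one-time windfall never catches up with the compounding value of trust.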

Tony triggered a digression: Is the dark forest hypothesis correct?

Tony: I think we gave an adequate explanation of why applying the dark forest hypothesis to human society is an equivocation. Although it has happened in history, as during the Cold War, it is arguably neither inevitable nor permanent. As Woodrow Wilson argued, secret treaties in diplomacy cause wars. Can we extend this explanation to all non-zero-sum games, so that a win-win can only be achieved with trust?

Alex: I believe that is a fair statement. A non-zero-sum game achieves a win-win only when collaboration occurs, and that requires "goodwill" from all parties. Each party performs a balancing test between the risk of being backstabbed and the payoff of a win-win. We have established that this is not a purely logical test; we tend to be biased toward collaboration. Understanding the existence of non-zero-sum games, and how collaboration can be the basis of a win-win, is among the most distinguishing achievements of mankind.
