We believe stuff because it benefits us to believe it, not necessarily because it is true. Phrased that way, it seems like an obvious point—of course evolution made us like that, what else could it have done? But this has surprising explanatory power.
-
Go talk with your roommate about who washes more of the dishes. Almost certainly, you’ll both think you do more of them than the other person thinks you do. Why? Probably because having a distorted view makes you a more convincing negotiator and gets you a better bargain in the future.
-
Say you and some friends would like to start a business in a line of work with lots of opportunities for theft. If you all happen to believe there’s an omnipotent power that rewards good behavior after you die, and you’re in a close-knit community where betrayals have huge social costs, then you can all trust each other. Without the need to spend resources policing each other, you will outcompete all the non-religious groups. (See: The diamond trade in Antwerp, long dominated by devout Orthodox Jews and now dominated by devout Indian Jains.)
-
Say that you look around and you sense that it would be beneficial to align yourself with a group of people who have certain political views. Now think about the game theory: Your vote has a tiny influence on what policies actually take place, but your stated views have a big influence on your social position and status. And the best way to look like you believe something is to genuinely believe it. You’ll soon find that it’s easy to convince yourself that the group’s positions are correct and anyone who disagrees is an evil idiot. You might feel like even listening to the other side is a betrayal. (See: Basically the entire modern world? See also: Why understanding can be traitorous.)
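If you want that game theory in numbers, here's a toy back-of-the-envelope calculation in Python. Every figure is invented purely for illustration; the point is the gap in orders of magnitude.

```python
# Toy comparison of what your vote buys vs. what your stated views buy.
# All numbers are made up for illustration.

p_decisive = 1e-7             # rough odds your single vote flips the outcome
value_if_policy_wins = 1e5    # how much you'd value your preferred policy winning

social_payoff_per_year = 500  # yearly value of status from voicing group-approved views
years = 20

ev_vote = p_decisive * value_if_policy_wins          # 0.01
ev_professed_views = social_payoff_per_year * years  # 10,000

print(f"expected value of the vote itself: {ev_vote}")
print(f"expected value of professed views: {ev_professed_views}")
# Under these (made-up) numbers, your professed views matter about a
# million times more than your vote. And the cheapest way to profess
# convincingly is to genuinely believe.
```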
-
Maybe right and wrong don’t “really” exist. But it’s tough out there and you don’t want people to hurt you. So you loudly advertise that if anyone defects against you, you’ll go out of your way to punish them—even “irrationally” hurting yourself if that’s necessary to get revenge. To make this threat maximally credible, you adopt as core beliefs that right and wrong do exist and that defectors are wrong.
Even better, you and your neighbors agree to collectively punish defectors. You even punish people for failing to punish, by calling them cowards and denying them respect and status. (See: We all believe in right and wrong. See also: How humans lived in groups for thousands of years before we invented the Leviathan, and a big part of how we live in groups today.)
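The payoff structure here is worth making concrete. Below is a minimal sketch in Python, with invented numbers, of why punishment plus meta-punishment sustains itself:

```python
# Norm-enforcement sketch. All payoffs invented; only the structure matters.

GAIN_FROM_DEFECTING = 5   # what a defector grabs
PUNISHMENT_RECEIVED = 2   # cost inflicted by each neighbor who punishes
COST_OF_PUNISHING = 1     # punisher's out-of-pocket cost ("irrational" revenge)
META_PUNISHMENT = 3       # cost of being called a coward for not punishing
NEIGHBORS = 4

# Defecting while the norm holds:
print("defect:", GAIN_FROM_DEFECTING - NEIGHBORS * PUNISHMENT_RECEIVED)  # -3

# Punishing is a pure loss on its own...
print("punish:", -COST_OF_PUNISHING)  # -1
# ...but once non-punishers get punished too, it beats the alternative:
print("shirk: ", -META_PUNISHMENT)    # -3

# Punishing (-1) beats shirking (-3), so enforcement sustains itself,
# and with enforcement in place, defection (-3) doesn't pay either.
```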
-
Are you in a social group where it’s beneficial for you to look down on someone? You’ll soon find lots of reasons why this person sucks. Or if it’s beneficial to admire someone, you’ll soon find many reasons that person is amazing. (See: Teenagers.)
-
For most of us, what we do with our lives isn’t cosmically significant. But if you can delude yourself a little bit that your Important Projects are going to Change the World, this will probably make you better at what you do, and ultimately help you collect more grubby resources and status and so on. (See: Everyone is working on Important Projects.)
-
Most of us overestimate how good-looking we are and how much people like us and so on. (Except for the clinically depressed?) But if these delusions make us more confident and charismatic, then they’re probably beneficial. (See: “Our results show proof for a strikingly simple observation: that individuals perceive their own beauty to be greater than that expressed in the opinions of others (p < 0.001)”.)
-
At least in recent history, people on both sides of wars seem to believe they are fighting for the side of good. Obviously, they can't both be right, and in a sense, the mere fact that they're fighting should be cause for them to sit down and work through Aumann's agreement dynamics. But ever since the French Revolution, we've known that ideologically committed armies are vastly more effective, so everyone finds a way to believe. (See: Hahahahaha, you think soldiers in wars will follow Aumann's agreement dynamics? No one has ever followed Aumann's agreement dynamics.)
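For the curious, here's what Aumann-style agreement would look like in the most charitable toy case: two people with a common prior, each holding one private signal, who share posteriors and converge in a single round. All numbers are invented.

```python
# Minimal Aumann-style agreement: two agents, common prior over a binary
# state, one private signal each. Announcing a posterior here reveals the
# signal exactly, so one round of sharing produces full agreement.

P_CORRECT = 0.8  # each private signal matches the true state with prob 0.8
PRIOR = 0.5      # common prior that the state is 1

def posterior(signals):
    """P(state=1 | signals), by Bayes over the two possible states."""
    like1, like0 = PRIOR, 1 - PRIOR
    for s in signals:
        like1 *= P_CORRECT if s == 1 else 1 - P_CORRECT
        like0 *= P_CORRECT if s == 0 else 1 - P_CORRECT
    return like1 / (like1 + like0)

alice_signal, bob_signal = 1, 0
print(posterior([alice_signal]))              # 0.8: Alice's initial view
print(posterior([bob_signal]))                # 0.2: Bob's initial view
# Each announces their posterior, which reveals their signal; both then
# condition on both signals and agree:
print(posterior([alice_signal, bob_signal]))  # 0.5: agreement, no fighting
```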
-
Successfully raising a baby takes a ton of resources. In principle, this might lead to a tragedy of the commons where each parent has an incentive to invest less than the other one so there’s low investment overall. But if each parent could pre-commit to high investment, that would be better for both. For this reason, perhaps, people in couples are wired to mutually fixate on each other as uniquely amazing and flawless, through a series of hard-to-fake gradual escalations.
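The underlying game has the structure of a prisoner's dilemma (the two-player version of the commons tragedy just described), which is easy to see in a toy payoff table. A minimal sketch, with invented payoffs:

```python
# Parental-investment game as a 2x2 payoff table (numbers invented).
# Each parent chooses "high" or "low" investment; payoffs[(me, partner)]
# is my payoff.

payoffs = {
    ("high", "high"): 3,  # kid thrives, costs shared
    ("high", "low"):  0,  # kid gets by, but I carry the whole load
    ("low",  "high"): 4,  # kid gets by, and I kept my resources
    ("low",  "low"):  1,  # kid struggles
}

def best_response(partner_choice):
    return max(["high", "low"], key=lambda me: payoffs[(me, partner_choice)])

print(best_response("high"))  # "low": if they invest, I'd rather coast
print(best_response("low"))   # "low": if they coast, I'd still rather coast
# The only equilibrium is (low, low) at 1 each, even though (high, high)
# pays 3 each. A hard-to-fake mutual fixation works as a commitment
# device that takes "low" off the table for both parents.
```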
The point is: Effectiveness often happens to align with truth, but that’s really sort of a coincidence. Any time there’s a conflict between the two, we evolved to throw truth out the window.
The term “cognitive biases” is arguably misleading, in that it suggests that believing true things is some kind of default. If anything, it’s impressive that we manage to get near the truth at all. If you want your beliefs to be accurate, you’re constantly swimming against your own biology and instincts.
Broadly speaking, I guess there are three reasons that people are wrong about stuff.
- Sometimes there isn’t enough information available to be right. No one can fault Socrates for not believing in wave-particle duality.
- Sometimes information is available, but it’s hard to sort through it and figure out what’s true. This is probably why many people believe aspartame is bad for you.
- Finally, there are cases like the ones we’ve examined here, where it’s straight-up beneficial to have a distorted view of the world.
If you want the world’s beliefs to be more accurate, you can deal with the first issue by gathering more information, and with the second by curating that information. But the last one is a real challenge.