Rules for weird ideas

Aug 2022

It’s frustrating to propose an idea and have people dismiss it just because it’s weird. You’ve surely seen people ridicule ideas like worrying about wild animal suffering or computers becoming sentient or comets crashing into the planet. I’ve encountered some of this for claiming aspartame is likely harmless but ultrasonic humidifiers might not be.

The thing is, dismissing weird ideas is not wrong.

I have a relative who got the J&J vaccine for Covid, so while some people were getting their third shots, she still had only one. I claimed it would be fine for her to go ahead and get a second shot of an mRNA vaccine, since this was sure to be approved soon and was already approved in some countries. She gently responded, “I will get another shot when my doctor tells me to.”

Was she wrong? In a narrow sense, maybe. Mixing and matching vaccines was approved soon after, and I maintain that this was knowable in advance. But more broadly, she was following a good strategy: for most people, “just do what your doctor says” gives better results than “take unsolicited medical advice from uppity relatives.”

From a Bayesian standpoint, it would arguably have been a mistake if she did listen to me. Skepticism of weird ideas is a kind of “immune system” to prevent us from believing in nonsense.
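To make the Bayesian point concrete, here’s a minimal sketch with made-up numbers (the prior and likelihood ratio below are invented for illustration, not taken from any real data): when only a small fraction of weird-sounding ideas turn out to be true, even a fairly persuasive argument leaves the idea probably false.

    # Illustrative Bayesian update. All numbers are invented for illustration.
    prior = 0.02  # assume only 2% of weird-sounding ideas turn out to be true

    # Suppose a persuasive argument is 10x more likely to exist when the
    # idea is true than when it's false (a likelihood ratio of 10).
    likelihood_ratio = 10

    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    posterior = posterior_odds / (1 + posterior_odds)

    print(f"posterior = {posterior:.2f}")  # ~0.17: still probably false

On those numbers, hearing a good argument moves a weird idea from 2% to about 17%: a big update, but still “probably false.” That’s the immune system working as designed.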

The problem, of course, is that weird ideas are sometimes right. For 200 years, most Western people thought that tomatoes were poisonous. Imagine you were one of the initial contrarians going around saying, “Well actually, tomatoes are fine!” and demonstrating that you could eat them. I bet you’d have had a rough time.

Especially because if you convinced someone and they went home and cooked some tomatoes, their cookware probably contained lead, which the acidity of the tomatoes would leach out, causing lead poisoning. Your follow-up campaign of “really, tomatoes are OK, we just need to switch to non-leaded cookware!” would bomb even harder.

I’m glad people persevered so we aren’t covering our pizzas with mayonnaise. But how are we supposed to resolve this tension in general? Here are eight proposed rules.

1. We need to work at the population level

If you think about it, almost everything you know comes from other people. Even when you “check the facts”, that usually means seeing what other people say. If you trace your knowledge back to actual observations of the world, it’s a huge graph of you trusting people who trust other people who trust other people.

Understanding the world is a social process. This is important because I don’t think the tension of weird ideas can be resolved at an individual level. You’ve got finite time to investigate crackpot theories. But fortunately, you don’t need to resolve all questions yourself. We just need to follow habits that lead to us collectively identifying good ideas and discarding bad ones.

2. Don’t expect most people to take your weird idea seriously

For one, this is just being realistic about how the world works. But beyond that, it would be unreasonable to expect people to follow a strategy that is bad for them.

We are all assaulted by bad ideas all the time. If every person who heard the claim that vaccines cause autism looked at the evidence with an open mind, well, we’d have a lot more people who think that vaccines cause autism.

There’s no time to investigate every random claim anyway. The complexity of the world greatly exceeds the capacity of individual people. We have to live within the social process where we get trusted information from other people.

3. Don’t feel bad about dismissing weird ideas

Remember, it’s the correct prior to be biased against weird ideas, and it’s correct game theory to be hesitant to look into them, given that we have short lifespans and tiny little error-prone brains.

Yet somehow, I think a lot of people feel like they aren’t supposed to do this? The problem isn’t that people don’t dismiss weird ideas; most of us do that instinctively. The problem is that we aren’t honest about why we are dismissing them, either to others or even to ourselves. Speaking of which…

4. Be honest about why you reject weird ideas

There are lots of reasons you might do this.

  1. Pure prior: The idea sounds stupid and you haven’t looked at the argument.
  2. You’ve looked at the argument, but you think it’s wrong.
  3. You looked at the argument, but then realized you don’t have the background to understand it, so you went back to your prior.
  4. You looked at the argument, you do understand it, and it looks pretty good. But your prior is so strong you still reject the idea anyway.
  5. You looked at the argument, you understand it, it seems strong, and on an intellectual level, it overcomes your prior. But somehow you just aren’t able to get emotionally invested in the conclusion. (Sometimes I feel this way about AI risk.)

These are all valid! But it’s very important to be clear about which one you’re using. Because here’s something that happens a lot:

  • There’s a weird idea.
  • Lots of people reject it just because it’s weird (#1) or because they don’t understand the argument (#3).
  • But they feel like they aren’t “supposed” to reject it for those reasons, so they give a misleading impression that they reject the argument in detail.
  • This creates the illusion of a consensus that everyone thinks the argument is wrong, screwing up the social process that’s supposed to eventually lead to truth.

5. Beware shifting goal posts

Here’s another pattern:

A: Here’s a weird idea.

B: That can’t be true because of X.

A: [Evidence that X is false.]

B: Oh, OK. But your idea is still wrong because of Y.

A: [Evidence that Y is false.]

B: Fine, but your idea is still wrong because of Z.

For example, with aspartame, people often claim it’s carcinogenic. When that’s shown to be false, they retreat to saying it’s genotoxic (it isn’t), that it causes an insulin spike (it doesn’t), that it’s metabolized into formaldehyde (that’s normal), that it causes obesity (only in correlational studies), and then something about the microbiome.

Now, it’s fine to oppose an idea because of reasons X, Y, and Z. And it’s good (admirable!) to abandon reasons when they are shown to be false. But still, this pattern is a warning sign.

Most obviously, in disagreements it’s always best to start with your central point. If I say I disagree with you because of X, then showing that X is false should change my mind—otherwise, I wasn’t fully candid about my reasons.

But this pattern has particular relevance for weird ideas. What’s happening in each person’s brain during the conversation?

A, of course, feels frustrated because it seems like there is no evidence that would convince B, so it feels like B is arguing in bad faith.

But B’s perspective is different. They decided that the idea is too weird to be considered (which is reasonable!). Then they applied basic logic: if you “know” that aspartame is harmful, and you’re shown that it isn’t carcinogenic, then it’s correct to infer that there must be some other mechanism of harm.

I think it’s human nature to play the role of B in this conversation. When we dismiss weird ideas, it often “feels” like we have reasons.

What’s the solution? I think B needs to be more self-reflective and more straightforward. It’s OK to just decide you aren’t going to consider an idea and you aren’t going to be convinced by any evidence to the contrary. We all frequently do this. But when doing it, it’s better to do it explicitly. A fear of looking closed-minded can cause you to throw up a series of Potemkin arguments that only present an illusion of engaging on the merits.

6. Consider a fraction of weird ideas

It’s probably good to look into a certain percentage of crazy ideas. This is mostly an act of altruism, something that we should do to make the social truth process work better.

You probably do this already. For topics that are particularly important to you, or that you particularly enjoy reading about, you probably have more patience to indulge in outlandish concepts.

Another criterion is expertise: we should probably leave rebutting perpetual motion machines to the physicists.

But I don’t think we want to be too single-minded in leaving truth to the experts. The issue is that expertise is often concentrated in tiny little bubbles of society. When we have high-trust channels from the experts to the public, that’s fine. For example, our current system for communicating when an earthquake has happened works very well.

But other times the experts are siloed, and most of the population is several low-trust links away from them. Or maybe the experts aren’t that reliable, or there just aren’t any experts on this particular topic. In these cases, we need more participants to give the true weird ideas a chance of spreading.

7. Or on second thought maybe don’t

Public health authorities are now seen as less reliable than they were a few years ago. To my mind, that was a “correct” update: They were always OK but not infallible, so the current view is closer to reality.

But what has the effect of that been?

It’s not clear it was positive. Some people have certainly found alternative sources of information and learned the limits of what public figures can say. But lots of other people also seem to be caught up in nonsense conspiracy theories.

This worries me and I’m not sure what to do about it. It’s tempting to say that you should only look into things if you can do so successfully. But perhaps your ability to evaluate the details is correlated with your ability to judge your own ability?

8. Accept weird ideas hesitantly

You don’t have to update all the way. Probably you should almost never do that! In most cases, the right conclusion is “important if true”, or maybe “I don’t see an obvious flaw.” That’s enough to make the social process work, and it avoids the personal risks of acting on crazy ideas.
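One way to picture a hesitant update, again with invented numbers: if you think there’s only some chance the argument is actually sound, you can blend its likelihood ratio with a neutral one. This is a crude sketch of the intuition, not rigorous Bayesian bookkeeping:

    # Illustrative partial update. All numbers are invented for illustration.
    prior = 0.02      # weird ideas rarely pan out
    p_sound = 0.30    # chance the argument is actually sound
    lr_if_sound = 10  # how persuasive the argument is, if sound

    # Blend the argument's likelihood ratio with a neutral ratio of 1,
    # weighted by how likely the argument is to be sound.
    effective_lr = p_sound * lr_if_sound + (1 - p_sound) * 1

    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * effective_lr
    posterior = posterior_odds / (1 + posterior_odds)

    print(f"posterior = {posterior:.2f}")  # ~0.07: "important if true" territory

Even on these fairly generous numbers, the hesitant update lands well short of belief, which is exactly the posture “important if true” describes.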
