Pop Bayesianism: cruder than I thought?

Pop Bayesianism is an odd spiritual movement I've been puzzling over recently. In the new video above, Julia Galef explains how it has changed the way she thinks. I found her explanation startling.

This book—Meaningness—has much in common with the Bayesian movement. Both aim to improve the way regular people think. We want to provide tools for noticing and escaping common, mistaken, emotionally-laden patterns of thought, which often cause big trouble.

Unfortunately, I think Bayesianism is also mistaken, and maybe harmful; yet many smart, well-meaning people get caught up in it. As a social phenomenon, it is fascinating, yet baffling.

Teaching probability

On a charitable interpretation of pop Bayesianism, its message is:

Everyone needs to understand basic probability theory!

That is a sentiment I agree with violently. I think most people could understand probability, and it should be taught in high school. It’s not really difficult, and it’s incredibly valuable. For instance, many public policy issues can’t properly be understood without probability theory.

Unfortunately, if this is the pop Bayesians’ agenda, they aren’t going at it right. They preach almost exclusively a formula called Bayes’ Rule. (The start of Julia Galef’s video features it in neon.) That is not a good way to teach probability.

Bayes' formula

Bayes’ Rule is hard to understand on its own. If you try to teach people just that, they are not going to get it. Also, the formula is almost never useful in everyday life.

On the other hand, once you understand the basic principles of probability, Bayes’ Rule is obvious, and not particularly important. Many similar calculations become obvious once you get the fundamental ideas clearly.1 It’s all just arithmetic.
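To make the "it's all just arithmetic" point concrete, here is a minimal sketch of Bayes' Rule applied to a standard screening-test question. The numbers (base rate, sensitivity, false-positive rate) are invented purely for illustration:

```python
# Bayes' Rule as plain arithmetic: P(H|E) = P(E|H) * P(H) / P(E).
# Hypothetical screening-test numbers, chosen only for illustration.
p_disease = 0.01            # prior: 1% base rate in the population
p_pos_given_disease = 0.95  # test sensitivity
p_pos_given_healthy = 0.05  # false-positive rate

# Total probability of a positive result (law of total probability)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior probability of disease given a positive result
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # → 0.161
```

With these made-up numbers, a positive test raises the probability of disease from 1% to only about 16% — the calculation is two multiplications, an addition, and a division, which is the sense in which the Rule is obvious once the basic principles are clear.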

Bayes’ Rule as religious icon

In pop Bayesianism, the Rule is evidently not arithmetic; it is the sacred symbol of Rationality.

In the video, Galef admits this almost immediately (0:35). Occasions in which you can actually apply the formula are rare. Instead, it’s a sort of holy metaphor, or religious talisman. You bow down to it to show your respect for Rationality and membership in the Bayesian religion.

The rest of the video goes on to say that Bayesianism boils down to “don’t be so sure of your beliefs; be less sure when you see contradictory evidence.”

Now that is just common sense. Why does anyone need to be told this? And how does the formula help?

I had thought that the message was something much more sophisticated. So I checked LessWrong, the main pop Bayesian web site. Its article “What is Bayesianism?” defines it as “a mindset that takes three core tenets fully into account”:

  1. Any given observation has many different possible causes.
  2. How we interpret any event, and the new information we get from anything, depends on information we already had.
  3. We can use the concept of probability to measure our subjective belief in something. Furthermore, we can apply the mathematical laws regarding probability to choosing between different beliefs. If we want our beliefs to be correct, we must do so.
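Tenets 1 and 3 together can be sketched in a few lines: assign probabilities to several possible causes of an observation, weight each by how well it predicts the observation, and renormalize. The hypotheses and numbers below are invented solely for illustration:

```python
# Degrees of belief as probabilities, updated by the laws of probability.
# Both causes and all numbers are hypothetical, for illustration only.
priors = {"cause_a": 0.7, "cause_b": 0.3}       # beliefs before observing
likelihoods = {"cause_a": 0.2, "cause_b": 0.9}  # P(observation | cause)

# Weight each prior by its likelihood, then normalize so beliefs sum to 1
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(unnormalized.values())
posteriors = {h: unnormalized[h] / total for h in unnormalized}
```

After the update, the initially less-favored cause_b dominates (roughly 0.66 vs. 0.34), because it predicted the observation much better — which is all that "be less sure when you see contradictory evidence" amounts to, arithmetically.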

Tenets 1 and 2 are obvious (I hope!), and pretty much what Galef said. Tenet 3 I agree with, except for the word “must.”2

The leaders of the movement presumably do understand probability. But I’m wondering whether they simply use Bayes’ formula to intimidate lesser minds into accepting “don’t be so sure of your beliefs.” (In which case, Bayesianism is not about Bayes’ Rule, after all.)

I don’t think I’d approve of that. “Don’t be so sure” is a valuable lesson, but I’d rather teach it in a way people can understand, rather than by invoking a Holy Mystery.

Debunking Bayesianism

I had been thinking about writing a fairly sophisticated, technical explanation of where Bayesianism goes wrong. I’d like to point in a more productive direction.

I haven’t done that yet because actual Bayesianism is a bit technical. It’s a metaphysical interpretation of probability theory. It’s not really difficult, but a clear explanation would take some work.3

Galef is a prominent proponent of pop Bayesianism. What startled me about her video was realizing that none of what I was going to say is relevant. Evidently, the pop version is cruder than I thought. Evidently, I had been taking its allusions to advanced probability theory too seriously.4

Evidently, quite a different response is needed.

Does Bayesianism matter?

Maybe Bayesianism is like acupuncture. It has little practical value, and its elaborate theoretical framework is nonsense; but it’s mostly harmless, and it makes people feel better about themselves, so it’s good on balance. Also, it’s odd enough that few people are going to waste time and money on it. Some quack medical systems ought to be suppressed, but it’s probably best to leave acupuncture alone.

My worry, though, is that Bayesianism may have exactly the wrong effect: it makes people over-confident about their beliefs, because they think Sacred Mathematics justifies them. Empirically, pop Bayesians do often seem confident of things that seem highly unlikely to me.

On my Memetic Threat Assessment Scale, Bayesianism scores only 2/10, vs. 8/10 for “All Is One,” a much more serious danger. So maybe I’m wasting your time by writing about it here.

My baffled fascination may be purely personal. I interact with many Bayesians on the net, and have come to love some of them. They tend to be smart, kind, unusual people. Bayesianism somehow appeals hugely to people who are like me in most ways. Yet it’s a near miss: my brain is not quite susceptible to infection with this meme.

Bayesianism as eternalism

Pop Bayesianism is a manifestation of non-theistic eternalism. The obvious eternalisms are explicitly religious, like Christianity. Non-theistic eternalisms are also common, but more insidious because they are less obvious. Bayesianism behaves like a religion in many ways, yet it is anti-supernatural.

Atheism and naturalism are a good start, but only the first steps in freeing yourself from eternalism. Beliefs about God are false, but factual wrongness is not the biggest problem with Christianity. It’s the eternalist emotional dynamics—and I see a lot of those in Bayesianism as well.

It’s important to understand that eternalism remains emotionally compelling even after you’ve rejected God. The hope that salvation is possible through Rationality is squarely eternalistic.

I may write more about that soon. What do you think? Would that be interesting?

  • 1. I suspect it is metaphysics that stops Bayesians from teaching probability in an understandable way. Gian-Carlo Rota, who taught probability at MIT when I was there, would frequently exclaim “It’s all just balls into boxes!” You can solve any probability problem easily if you look at it that way. This presentation is “frequentist,” though, which Bayesians reject on metaphysical grounds. I would suggest that, if you are committed to Bayesian metaphysics, it would still be best to teach “balls into boxes” first, because it makes the calculations easy. Then you can explain that frequentism is the root of all evil, and Bayesian metaphysics is the One True Way to salvation.
  • 2. The technical critique I mention below would explain why that is wrong.
  • 3. This explanation would not advocate frequentism as an alternative. I’m not interested in that metaphysical controversy. Rather, I’d argue that Bayes’ Rule is not usefully applicable in ways Bayesians want to believe it is.
  • 4. A chain of large belief-strength updates ensued!

Navigation

You are reading a metablog post, dated June 5, 2013.

The next metablog post is How To Think Real Good.

The previous metablog post was RSS users: please re-subscribe.

This page’s topics are Atheism, Eternalism, History of ideas, and Rationalism.

Copyright ©2010–2017 David Chapman.