Monday, May 18, 2020

My self-help guide to making sense of a confusing world

It has become painfully evident over the last decade or so that social media has a somewhat corrosive effect on "truth" and "discussion". There are a variety of reasons for this - many unidentified - but a few factors are:
  1. For every opinion, no matter how bizarre, it has become easy to find a community with similar beliefs.
  2. The discoverability of almost all information, coupled with shortening attention spans, allows people with strange beliefs to find information that - at least if only glanced at for 15 seconds - may be interpreted as confirming their belief.
  3. Algorithms that maximize engagement also maximize enragement -- if the algorithms show me content that draws me into time-sink discussions with no benefit, they are "winning" (in terms of the metrics against which they are evaluated).
  4. The social media platforms favor "immediacy of feedback" over taking time to think things through. Social media discussions often devolve into name-calling or rapid-fire quoting of dubious studies without any qualitative understanding - people quote papers and sources they have never critically evaluated.
Aside from that, creating false but engagement-producing content has become a veritable industry. Attention can be monetized, so one can earn good money by making up lies that appear credible to some subset of the population. The quality of mainstream reporting has been caught up in this downward spiral.

The result of this is "fake news", the "death of a consensus on reality", strange conspiracy theories, and, in general, many, many wasted hours. The problem cuts across class and educational lines; it is simply not true that "only the uneducated" fall prey to internet falsehoods.

Personally, I am terrified of believing things that are not true. I am not quite sure why; but to assuage my fears of misleading myself, I have adopted a number of habits that function as checks on my own beliefs.

By and large, I have found them very useful.

In this blog post, I intend to share my thoughts on how people mislead themselves, and the checks I have decided to apply to my own beliefs. I am not always successful, but this is the bar I try to measure myself against. My hope is that this is helpful for others; perhaps it can help someone, somewhere, reject a false belief.

So let's begin with an underlying assumption I make:

People tend to believe in things that help them.

As a young man I believed that people try to understand a situation and then form a belief based on that understanding. This is not what I have observed in my life. My observation is that people choose beliefs and systems of belief to fulfill a function for them.

My father was born in 1930s Germany, and as a pre-teen and early teen he had a front-row seat as all the adults around him performed an ideological 180-degree turn. The question of "how do people adjust their beliefs" has always been important in my discussions with him.

My conclusion is that people are very good at identifying what they want, and what is beneficial to them. They also like to feel good about themselves, and about what they do. Given these constraints, people tend to pick beliefs and systems of belief that ...
  • ... allow them to do what they want to do.
  • ... allow them to reap benefits.
  • ... allow them to feel good about themselves at the same time.
I alluded to this with the sentence "Everybody wants to be the hero of their own story" in my Rashomon of Disclosure post.

It is crucially important to be aware that belief systems play a functional role for those who hold them. This is why it can be so hard to "convince" anyone of the incorrectness of their belief system: you are asking the person to give up more than a false belief - often, you are asking them to accept a view of themselves as less benign than they would like, or to adjust their views in a way that casts doubt on their ability to obtain some other benefit.

When I write "people" above, this includes you and me.

Being aware of the functional role of beliefs is hence important when you investigate your own beliefs (more on that later). Trying to believe what makes us feel good is the most fundamental cognitive bias.

So what am I trying to do to counter that bias? Here's my list of 7 habits:
  1. Clarify your beliefs
  2. Ask about the benefits of your beliefs
  3. Seek out original sources
  4. Examine evidence for your beliefs and competing hypotheses
  5. Ask what new information would change your beliefs
  6. Provide betting odds
  7. Discuss for enlightenment, not "winning"

Habit 1: Clarify beliefs

It may sound odd, but it takes conscious effort to turn everyday diffuse "belief" into an actual clearly articulated statement. For me, nothing quite clarifies thoughts like writing them down - often things that appear clear and convincing in my head turn out to be quite muddled and unstructured when I try to put them to paper.

Asking oneself the question "what are my beliefs on a given topic", and trying to write them down coherently, is surprisingly powerful. It forces a deliberate effort to determine what one actually believes, and a commitment to that belief in writing (at least to oneself).

Habit 2: Ask about the benefits of your beliefs - "am I the baddie?"

Awareness of the functional role of beliefs is important when examining one's own beliefs. Feynman famously said about science that "the first principle is that you must not fool yourself and you are the easiest person to fool".

When examining my own beliefs, I try to ask myself: What benefits does this belief bestow on me? What does this belief permit me to do? How does this belief make me feel good about myself?

It is quite often helpful to actively try to examine alternative narratives in which one casts oneself in a bad light. Trying to maintain our positive image of ourselves is a strong bias; making a conscious effort at examining alternate, unflattering narratives can be helpful.

My wife and I sometimes jokingly play "mock custody battle" - a game in which we each try to portray the other as some sort of terrible person and reinterpret their entire biography as that of a villain - and it is quite enlightening.

Habit 3: Seek out original sources, distrust secondary reporting

Both for things like presidential debates and for scientific papers, the secondary reporting is often quite incorrect. In political debates, it is often much less relevant how the debate went and what happened -- only a small fraction of the population will have witnessed the event. What really counts is the narrative that gets spun around what happened.

You can observe this during election season in the US, where, as soon as a debate is over, all sides try to flood the talk shows and newscasts with their "narrative" (which has often been pre-determined before the debate even happened - "He is a flip-flopper. He flip-flops." or something along those lines).

Likewise, scientific papers often get grossly misrepresented, both in popular reporting and by people who have only superficially read the paper. If you are an expert on a topic, you will notice that reporting on your area of expertise is often wrong; at the same time, we somehow forget this and assume that reporting is more accurate on topics where we are not experts (the famous "Gell-Mann amnesia").

A friend of mine (a scientist himself) recently forwarded me a list of papers that estimated the COVID-19 IFR; one of them reported an IFR of 0%. Closer examination revealed that the authors had tested blood donors for antibodies; nobody who died of COVID-19 went on to donate blood two weeks later, so naturally there were no fatalities in their cohort.

Nonetheless, the paper was cited as "evidence that the IFR is lower than people think".

A different friend sent me a "scientific paper" that purported to show that tetanus vaccines had been laced with a medication to cause infertility. On examination, the paper was little more than an assembly of hearsay: no experimental setup, no controls, no alternative hypotheses, etc. The homepages of the authors revealed them to be cranks peddling various fringe beliefs, and the paper was "published" in a "we accept everything" journal.

Acquiring the habit of reading original sources is fascinating - there are bookshelves full of books that are widely quoted, mostly by people who have never read them (Sun Tzu, Clausewitz, and the Bible are probably the most common). It is also useful: reading original papers cuts out the middle-man and lets you judge the results directly; it is also a good motivator to learn a good bit of statistics.


Habit 4: Examine evidence for your beliefs, analyze competing hypotheses

Once one's own beliefs are clarified in writing, and one has looked at the primary sources, one can gather evidence for one's belief.

While one does so, one should also make a deliberate effort to gather and entertain competing hypotheses: What other explanations for the phenomenon under discussion exist? What are hypotheses advanced by others?

Given one's own beliefs, and alternate hypotheses, one can look at the evidence supporting each of them carefully.

A few principles help me at this stage to discount less-credible hypotheses (including my own):
  • Occam's razor: Often, the simpler explanation is the more likely one.
  • Structural malice: Malicious individuals are pretty rare, but if an incentive structure exists in which an individual can benefit from malice while explaining it away, that tends to happen.
  • Incompetence over malice: Incompetence is much more common than competent malice. The Peter principle and Parkinson's law apply to human organisations.
After this step, I end up forming an opinion - looking at the various competing hypotheses, I re-examine which I find most credible. Often, but not always, it is the beliefs that I declared initially, but with reasonable frequency I have to adjust my beliefs after this step.

Habit 5: What new information would change my opinion?

John Maynard Keynes is often quoted as saying "When the facts change, I change my mind; what do you do, Sir?". It is worth examining ahead of time what it would take to change one's beliefs.

Given my belief on a subject right now, what new information would need to be disclosed to me for me to change my mind?

This is very helpful to separate "quasi-religious" beliefs from changeable beliefs.

Habit 6: Provide betting odds

This is perhaps the strangest, but ultimately one of my more useful points. Over the last years, I have read a bit about the philosophy of probability; particularly De Finetti's "Philosophical Lectures on Probability".

When we speak about "probability", we actually mean two different things: the long-run frequency of a given outcome when an experiment can be repeated many times (a coin flip, etc.), and the strength of belief that a given thing is true or will happen, in situations where the experiment cannot be repeated.

These are very different things - the former has an objective truth, the latter is fundamentally subjective.

At the same time, if my subjective view of reality is accurate, I will assign good probability values (in the second sense) to different events. ("Good" here means that if a proper scoring rule were applied, I would do well.)
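To make "proper scoring rule" concrete, here is a minimal sketch of one common example, the Brier score; the forecasts and outcomes below are invented purely for illustration:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and what happened.

    forecasts: probabilities in [0, 1]; outcomes: 1 if the event occurred, else 0.
    Lower is better; hedging everything at 0.5 scores 0.25 no matter what happens,
    so the rule rewards honest, well-calibrated confidence.
    """
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A confident and accurate forecaster scores close to 0 ...
print(round(brier_score([0.9, 0.8, 0.1], [1, 1, 0]), 4))  # 0.02
# ... while an uncommitted "everything is 50:50" forecaster scores 0.25.
print(round(brier_score([0.5, 0.5, 0.5], [1, 1, 0]), 4))  # 0.25
```

The point is that under such a rule you cannot game the score: stating your true strength of belief is the optimal strategy.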

Beliefs carry little cost, and little accountability. Betting provides cost for being wrong, and accountability about being wrong.

This means that if I truly believe that my beliefs are correct, I should be willing to bet on them; and through the betting odds I provide, I can quantify the strength of my belief.

For me, going through the exercise of forcing myself to provide betting odds has been extremely enlightening: It forced me to answer the question "how strongly do I actually believe this?".

In a concrete example: Most currently available data about COVID-19 hints at an IFR of between 0.49% and 1% (source) with a p-value of < 0.001. My personal belief is that the IFR is almost certainly >= 0.5%. I am willing to provide betting odds of 3:1 (meaning if you bet against me, you get 3x the payout) for the statement "By the end of 2022, when the dust around COVID-19 has settled, the empirical IFR for COVID-19 will have been greater than 0.5%".

This expresses strong belief in the statement (much better than even odds), but some uncertainty around the estimate in the paper (the p-value would justify much more aggressive betting odds).
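On the conventional reading, odds of a:b in favor of a statement correspond to a break-even belief of a/(a+b) that the statement is true. A small sketch of that conversion (the helper name is mine, not anything standard):

```python
def implied_probability(odds_for, odds_against):
    """Convert betting odds of odds_for:odds_against on a statement into the
    probability at which the bet breaks even - i.e. the strength of belief
    the offered odds implicitly express."""
    return odds_for / (odds_for + odds_against)

# Odds of 3:1 in favor imply a 75% strength of belief ...
print(implied_probability(3, 1))  # 0.75
# ... while even odds (1:1) imply exactly 50%.
print(implied_probability(1, 1))  # 0.5
```

So offering 3:1 on the IFR statement above amounts to stating roughly 75% confidence in it.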

(Be aware that these betting odds are only valid for the next 4 days, as my opinion may change).

To sum up: Providing betting odds is a great way of forcing oneself to confront one's own strength of belief. If I believe something, but am unwilling to bet on it, why would that be the case? If I believe something, and am unwilling to provide strong odds in favor of that belief, why is that the case? Do I really believe these things if I am unwilling to bet?

Habit 7: Discuss for enlightenment, not "winning"

When I was young, my father taught me that the purpose of a discussion is never to win, or even to convince. The purpose of a discussion is to understand - the topic under discussion, or the position of the other side, or a combination thereof. This gets lost a lot in modern social media "debates".
 
Social media encourages participating in discussions and arguing a side without ever carefully thinking about one's view on a topic. The memoryless and repetitive nature of the medium allows one to spend countless hours re-hashing the same arguments over and over, without making any advance, and ignoring any thought-out arguments that may have been put in writing.

A few weeks after the "Rashomon of Disclosure" post, a Twitter discussion about disclosure erupted, and I upset a few participants by more or less saying: "Y'know, I spent the time writing down my thoughts and arguments around the topic, and I am very willing to engage with anybody that is willing to spend the time writing down their thoughts and arguments, but I am not willing to engage in Twitter yelling matches where we ignore all the nuance and just whip up tribal sentiment."

This was not universally well-received, but refusing to engage in social media yelling matches - and forgoing the dopamine kick that arises from the immediacy of the experience - is an important step if we want to understand either the topic or the other participants in the debate.

Summary

This was a very long-winded post - thank you for reading this far. I hope it will be helpful, and that some of the habits can be useful to others. If not, this post can at least explain why I will sometimes withdraw from social media discussions and insist on long-form write-ups as a means of communication. It should also reduce bewilderment if I offer betting odds in surprising situations.