There was a great bit of reportage by Maggie Koerth-Baker on critical thinking (or rather, its failure) in The New York Times recently. The article explored the psychology behind conspiracy theories, homing in on eight key reasons so many people believe in them:
- Some are predisposed to conspiracy belief. “‘The best predictor of belief in a conspiracy theory is belief in other conspiracy theories,’ says Viren Swami, a psychology professor who studies conspiracy belief at the University of Westminster in England. Psychologists say that’s because a conspiracy theory isn’t so much a response to a single event as it is an expression of an overarching worldview.”
- Cynicism, low self-worth, uncertainty, and powerlessness correlate with conspiracy belief. “[Psychologists] have, through surveys and laboratory studies, come up with a set of traits that correlate well with conspiracy belief. In 2010, Swami and a co-author summarized this research in The Psychologist, a scientific journal. They found, perhaps surprisingly, that believers are more likely to be cynical about the world in general and politics in particular. Conspiracy theories also seem to be more compelling to those with low self-worth, especially with regard to their sense of agency in the world at large. Conspiracy theories appear to be a way of reacting to uncertainty and powerlessness. […] Psychologists aren’t sure whether powerlessness causes conspiracy theories or vice versa. Either way, the current scientific thinking suggests these beliefs are nothing more than an extreme form of cynicism, a turning away from politics and traditional media — which only perpetuates the problem.”
- When shit happens, the amygdala activates. “In […] moments of powerlessness and uncertainty, a part of the brain called the amygdala kicks into action. Paul Whalen, a scientist at Dartmouth College who studies the amygdala, says it doesn’t exactly do anything on its own. Instead, the amygdala jump-starts the rest of the brain into analytical overdrive — prompting repeated reassessments of information in an attempt to create a coherent and understandable narrative, to understand what just happened, what threats still exist and what should be done now.”
- Being “in the know” is a form of superiority. “‘If you know the truth and others don’t, that’s one way you can reassert feelings of having agency,’ Swami says. It can be comforting to do your own research even if that research is flawed. It feels good to be the wise old goat in a flock of sheep.”
- Sometimes conspiracies really are at work, and this gives people an excuse to seek them out. “Kathryn Olmsted, a historian at the University of California, Davis, says that conspiracy theories wouldn’t exist in a world in which real conspiracies don’t exist. And those conspiracies — Watergate or the Iran-contra Affair — often involve manipulating and circumventing the democratic process.”
- Confirmation bias. “Confirmation bias — the tendency to pay more attention to evidence that supports what you already believe — is a well-documented and common human failing. People have been writing about it for centuries. In recent years, though, researchers have found that confirmation bias is not easy to overcome. You can’t just drown it in facts.”
- The backfire effect. “In 2006, the political scientists Brendan Nyhan and Jason Reifler identified a phenomenon called the ‘backfire effect.’ They showed that efforts to debunk inaccurate political information can leave people more convinced that false information is true than they would have been otherwise. Nyhan isn’t sure why this happens, but it appears to be more prevalent when the bad information helps bolster a favored worldview or ideology.”
- The Internet, by making facts and alternative narratives on any subject just a Google search away, fuels both the backfire effect and tribalism. “Not only does more exposure to these alternative narratives help engender belief in conspiracies, he says, but the Internet’s tendency toward tribalism helps reinforce misguided beliefs.”
Here’s the lesson I would draw from these eight observations in the NYT piece. When I’m engaging in critical thinking on any subject (not just a conspiracy theory), I should ask myself questions like these:
- What role might my own habits of thought, cynicism, low self-worth, uncertainty, or powerlessness be playing in the way that I’m trying to get at the truth of this matter?
- What role are my emotions playing in my conclusion? Am I obsessing? Why is it important to me that the conclusion land one way as opposed to another?
- Is there a comfy or straightforward narrative I’m telling myself that I’m not interrogating? What actual evidence do I have for thinking the narrative is true? Might the evidence also support a different narrative? What is that alternative narrative?
- Is my self-esteem being bolstered by the conclusion I’m drawing, and is that distorting my perception?
- Am I engaging in ad hominem reasoning (“to the man,” that is, evaluating an argument by its source rather than its substance)? In other words, am I dismissing an argument simply because it’s coming from someone or some movement that I don’t like? Setting the source aside, what is the substance of the argument? Is there evidence for the argument? Is the quality and quantity of that evidence good?
- On a scale of 1 to 10 (1 being low, 5 medium, 10 high), how much belief should I apportion to the claims, arguments, narratives, and quality and quantity of evidence available to me on this matter?
Such questions are not easy to ask of a subject (especially when we’re heavily invested in a particular conclusion, under time pressure, distracted, or simply impatient for heuristic shortcuts). But wouldn’t it be nice if we could find ways to consistently wed our beliefs to slow, calm, and methodical critical thinking?
I think the number one reason some people are so fond of conspiracy theories might be missing from this list: they’re bat-shit crazy!