The Myth of the Magic Bullet


Be careful. People like to be told what they already know. Remember that. They get uncomfortable when you tell them new things. New things… well, new things aren’t what they expect. They like to know that, say, a dog will bite a man. That is what dogs do. They don’t want to know that a man bites a dog, because the world is not supposed to happen like that. In short, what people think they want is news, but what they really crave is olds… Not news but olds, telling people that what they think they already know is true.

~Terry Pratchett

Track the Burden of Proof

Last time we began to look at some ways of engaging in productive, back-and-forth argumentation with someone else so as to arrive at what's (hopefully) likely to be true. We're calling this dynamic argumentation (or dialectic). The first golden rule of dynamic argumentation was to always respond to the argument of the person you're speaking with. Don't just give an argument for your position. You must first respond directly to their point. Otherwise, nothing will come of the exchange but frustration and hostility.

The second golden rule is this: track the burden of proof. Tracking the burden of proof means keeping track of who made which claims and making sure they defend those claims. Before moving forward, let's take a look at an example to make this clearer.

Ted: Decisive people typically have the attributes to be a great president. Zephyr Teachout is very decisive. Therefore, she would make a great president.

 

Ted's back. In the argument above, Ted is the claimant, the person claiming to know something. In particular, he is claiming that decisive people typically have the attributes to be a great president, that Zephyr Teachout is very decisive, and that Zephyr Teachout would make a great president. You, then, are the respondent. Per Lyons and Ward (2018: 334-337), your role as a respondent is not to prove that decisive people typically don't have the attributes to be a great president, or that Zephyr Teachout isn't very decisive, or that Zephyr Teachout wouldn't make a great president. In the words of Lyons and Ward, "Ted's the one making the big claims; he's assumed the burden of proof" (335). Put simply, Ted offered the argument. It's up to him to give evidence for the claims (i.e., premises) on which his conclusion rests. That's what it means for Ted to have the burden of proof.

Noting that the burden of proof is on the claimant is not a minor point. In fact, if you unnecessarily assume the burden of proof, and thus have to prove the claimant's premises are false, then you make yourself vulnerable to losing the argument (ibid.).

Having said that, the point here is not to win arguments but to find the truth. Tracking the burden of proof is the best way to do that. Of course, if Ted doesn't have good reasons for believing in his premises, then his argument goes nowhere. If that's the case, you first make that clear and then proceed to give your own argument for your conclusion—hopefully with premises for which you have evidence. That's dialectic done right.

Let's take a look at an example from Lyons and Ward now:

Ted: Scientific creationism is entirely consistent with the data provided by the fossil record. So, it deserves to be taken seriously as a legitimate scientific theory.


Lyons and Ward go over two possible responses: a bad one and a good one. The bad idea is to give an account of how science really works. For example, you might argue that real science has to do with only making falsifiable claims, a notion developed by Karl Popper that we learned about in ...for the Stronger. The good idea is to press the claimant on whether mere consistency with the data is enough for something to count as science. In other words, ask the claimant for evidence that his second premise is true. If you're anything like me, your intuition is to go with the bad one. (That's what I've actually done in real-life debates.) But let me show you why that's a bad idea.

Attempting to give your own account of how science works is a losing battle. Many have tried and failed, e.g., Popper (see the lesson titled ...for the Stronger). Science is a really complex institution with lots of moving parts, several overlapping and conflicting methodologies, and tons of specialization. Characterizing how it works is a highly non-trivial task. Don't pretend to know how it really works—please. No one does—or at least, as of this writing, there is no generally accepted theory of how science works (but see Collins 2009 for an interesting theory in the sociology of science).

That leaves us with putting the burden of proof on the person who deserves it. If the claimant (Ted) wants to argue that mere consistency with the data is enough for something to count as science, then it's up to him to defend that claim. By the way, this is a heavy burden. This is a good time to remind you of what logical consistency means: two statements are logically consistent if it is possible for both to be true at the same time. Put bluntly, it just means the statements don't contradict each other (or imply a contradiction). When you think about it, tons of things are consistent with scientific data and yet clearly don't count as science. For example, I have a theory about how to assemble the best peanut butter and jelly sandwiches. So far as I know, if you were to write out all the steps of my delicious recipe, at no point would you contradict any of the fundamental laws of physics, any of the top theories in the mind sciences or in evolutionary theory, or any of the findings in biochemistry, linguistics, and complexity science. In other words, my recipe is consistent with the data. Doesn't that mean it counts as a science? Of course not. That's nonsense. The claimant appears to have made a false claim. But still, let them try to defend it. Odds are they'll fail.

“Really? Mere consistency with the data is enough for something to count as science? So, the hypothesis that the universe was created by the Flying Spaghetti monster is a legitimate scientific theory, right up there with quantum mechanics and relativity theory? It is consistent with the data. What data do we have that entails that the universe was not created by such a thing? None. As we've seen, science is complex... [and] it is clear that merely telling tall stories that are compatible with the data doesn't cut it” (Lyons and Ward 2018: 336; emphasis in original).
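
Since mere consistency is doing all the work in Ted's argument, it may help to see just how weak a property consistency is. Below is a minimal sketch in Python (my own illustration, not from Lyons and Ward) that checks a set of propositional claims for consistency by brute force: it tries every assignment of truth values to the atomic claims and asks whether some assignment makes all the claims true at once.

    from itertools import product

    def consistent(claims, atoms):
        """A set of claims is logically consistent iff at least one
        assignment of truth values to the atoms makes every claim true."""
        for values in product([True, False], repeat=len(atoms)):
            assignment = dict(zip(atoms, values))
            if all(claim(assignment) for claim in claims):
                return True  # found a model: the claims don't contradict each other
        return False

    # "A dog bit a man" (p) and "a man bit a dog" (q): consistent,
    # since nothing prevents both from being true at once.
    print(consistent([lambda a: a["p"], lambda a: a["q"]], ["p", "q"]))  # True

    # "It's raining" (p) and "it's not raining" (not p): inconsistent.
    print(consistent([lambda a: a["p"], lambda a: not a["p"]], ["p"]))   # False

Notice that my sandwich recipe, suitably formalized, would sail through this test against the whole of physics. Passing a consistency check is cheap, which is exactly why it cannot be sufficient for counting as science.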


On What is Possible, Continued

What gives rise to conspiratorial thinking


In their chapter of Conspiracy Theories and the People Who Believe Them, Butter and Knight (2019) give a history of the research into conspiracy theories. The authors argue against an early trend in conspiracy theory research—the tendency to see all those who believe in conspiracy theories as necessarily paranoid. This pathologizing approach, intuitive as it may be to some, was not very helpful, according to the authors, because it treated conspiracies and conspiracy theorists as marginal. As the history of conspiracy theories shows, however, conspiratorial thinking is not a marginal phenomenon; it is widespread (see the lesson titled Possibilia). More to the point, belief in conspiracies is not limited to those who can easily be characterized as paranoid. The authors then report that this pathologizing paradigm has (thankfully) been challenged since the 1990s, and they review some more recent experimental studies that shed light on why someone might believe in a conspiracy theory. Here is what several of these studies have in common: people who have been experimentally induced into experiencing emotional uncertainty or a loss of control are more likely to draw on conspiratorial interpretations of events (see also Whitson et al. 2015). The authors conclude that this type of research, which is more sensitive to cultural context, more methodologically sound, and more reflective of who actually engages in conspiratorial thinking, shows great promise for conspiracy theory research.

So feelings of uncertainty and loss of control have something to do with it. Is that all? In Lecture 3 of Shermer (2019) we are given details about the experimental literature that Butter and Knight think shows so much promise. In general, Shermer makes the case that those who engage in conspiratorial thinking do so for various reasons; that is, there is no magic bullet, no single cause of belief in conspiracy theories (and hence no single solution for ending rampant conspiratorial thinking). Here are some of the contributing factors.

 

Feeling a lack of control makes us more conspiracy-minded


Citing Whitson and Galinsky's 2008 study Lacking Control Increases Illusory Pattern Perception (in which people were made to feel a lack of control via a doctored task), Shermer first makes the case that those who have suffered a recent setback (financial, personal, or otherwise), along with those who are naturally disposed to be distrusting and/or paranoid, are prone to conspiratorial thinking.

An example will help. Perhaps you were asked to perform some task at work that you feel is above and beyond what other people in your position are asked to do. In essence, you feel like you are working more than the others for the same pay and with no real justification. This, of course, is a type of setback. It is financial and personal. You feel that you are working more for the same pay as everyone else, and you feel personally slighted—like they have something against you in particular. And so, it is under these conditions that you might start thinking in conspiratorial ways. Perhaps the boss has a crush on the other person in your position and doesn't want to make him (or her) do any extra work, the result being that you have to do the extra work. Or maybe they are trying to fire you, and they're looking for a reason to do so. Maybe if they overwork you, you'll snap and go off on someone. Then they can fire you. Or maybe... You get the picture. You can start dreaming up all kinds of scenarios. Of course, all else being equal, you typically will have no good evidence for believing any of these theories is actually true. But(!), given your emotional disposition, your feeling of loss of control, you are vulnerable to entertaining all kinds of crazy ideas.

 

The way our brains incorporate new beliefs


Here's another factor that contributes to conspiratorial thinking: epistemic cognition. Epistemic cognition is a mental process that involves the assessment of what we know, how we know what we think we know, what we doubt and why, etc. There are various theories in this subfield of cognitive science, but Shermer cites one in particular. Shermer's preferred theory is called global coherence. It posits that our beliefs are like a web that coheres, or fits together. This web of beliefs is held together by the beliefs we hold most strongly. All other beliefs are "attached" to these more strongly-held beliefs. Importantly, all incoming beliefs must comport (or fit in) with the pre-existing beliefs. Moreover, if a belief is strong enough (or is somehow strengthened), all less firmly-held beliefs have to comport with that belief. So, with this context in place, Shermer explains how this relates to conspiracy theories. It might be the case that people already believe in nefarious actors. Perhaps they have this firmly-held belief in the existence of bad people because of personal experience (e.g., maybe they're the victim of a conspiracy) or because they're naturally disposed to being paranoid. Whatever the case may be, their less firmly-held beliefs are shifted and distorted to fit in with the strong beliefs. In other words, if belief in nefarious actors is strong enough, all incoming information will be forced into a web of beliefs where nefarious actors play a dominant role. It's not hard to see how this can lead to conspiracy theories. If you strongly believe that bad people exist and they're always plotting, then all incoming information will only reinforce that belief. Sooner or later, you'll "grow" a conspiracy theory web. In fact, one might even hold inconsistent weakly-held beliefs, such as the belief that Princess Diana both was assassinated and faked her death. This is possible and consistent with global coherence theory as long as these inconsistent weakly-held beliefs comport with the strongly-held belief (that nefarious actors exist).
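
Shermer's description of global coherence is informal, but the gist can be caricatured in a few lines of code. The sketch below is my own invention for illustration (the Belief type, the incorporate function, and the clash table are all hypothetical, not an implementation of any published coherence model): an incoming belief joins the web only if it doesn't clash with anything held at least as strongly, and weaker clashing beliefs get revised away.

    from dataclasses import dataclass

    @dataclass
    class Belief:
        claim: str
        strength: float  # 0.0 (barely held) .. 1.0 (core conviction)

    def incorporate(web, incoming, clashes):
        """Toy global-coherence update (hypothetical): reject the incoming
        belief if it clashes with an equally or more strongly-held belief;
        otherwise admit it and drop the weaker beliefs it clashes with."""
        if any(clashes(incoming, b) and b.strength >= incoming.strength for b in web):
            return web  # the newcomer can't comport with the core, so it's rejected
        return [b for b in web if not clashes(incoming, b)] + [incoming]

    # Hypothetical clash relation, just for this example.
    PAIRS = {("officials are generally honest", "nefarious actors run things")}
    clashes = lambda a, b: (a.claim, b.claim) in PAIRS or (b.claim, a.claim) in PAIRS

    web = [Belief("nefarious actors run things", 0.9)]
    web = incorporate(web, Belief("officials are generally honest", 0.3), clashes)
    print([b.claim for b in web])  # ['nefarious actors run things']: the core wins

On this caricature, the strongly-held belief never budges and everything else bends around it, which is just Shermer's point about how a conspiracy-theory web "grows".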

 

The alluring simplicity of conspiracy theories

Here's another factor that contributes to being susceptible to conspiratorial thinking: simplicity. The world, of course, is extremely complex, and we can only approximate an understanding of it, even with complex mathematical models. That's the business of the sciences. Conspiracy theories, on the other hand, are simpler and can reduce our cognitive effort. As you'll recall from the lesson titled The Distance of the Planets, one prominent theory in the mind sciences is the view that our brain is in the business of making predictions; that's its evolutionary function (see Clark 2015). These predictions will hopefully keep you alive and mobile long enough to find a mate, reproduce, and pass on your genes. Of course, making predictions is tough, and whenever possible the brain will attempt to make its job easier by limiting its analysis to only the most important factors. Conspiracy theories, superficially at least, appear to do this. They explain basically everything in terms of a few nefarious actors. This simplicity, Shermer argues, is what makes them so easy to believe in.

 

The pressure of social identity

Shermer also reminds us, by the way, that some conspiracy theories are believed because they cohere with our social identity. For example, it's no secret that liberals are more likely to believe in conspiracies involving banks, while conservatives are more likely to believe in plots to enact gun control. Shermer adds that this might be the worst contributor to conspiracist tendencies since it usually implies the demonizing of “the other” (see also Douglas et al. 2017).

 

Innate paranoia

Although we mentioned earlier that the pathologizing paradigm of conspiracy theory research has fallen out of favor, this doesn't mean that paranoia isn't used in explaining some instances of belief in conspiracy theories. In Something’s Going on Here: Psychological Predictors of Belief in Conspiracy Theories, Joshua Hart and Molly Graether (2018) found that “conspiracy believers are relatively untrusting, ideologically eccentric, concerned about personal safety, and prone to perceiving agency in actions.” They were also more likely to be religious, female, and younger in age. They also scored higher in “dangerous world beliefs” and schizotypy.1

 

Shermer’s 10 components of belief in conspiracy theories

Overall, then, Shermer (Lecture 5) gives his list of what gives rise to belief in conspiracy theories. As you read it, note that the first half of the list is about how these beliefs form; the second half is about what ensures that believers maintain them.

  1. The brain generates beliefs naturally, whereas skepticism is unnatural and even painful (see Harris et al. 2009). In general, there is no penalty for forming false beliefs. So, we tend to take the easy route and stick to our beliefs, rather than opting for the cognitively-demanding task of critical thinking.
  2. Authorities enable belief. There are numerous psychological studies, such as Milgram’s Obedience to Authority, that demonstrate that people are willing to take the word of authorities at face value. Even Plato knew this, since he believed that harmony in the ruling Guardian class would lead to harmony in the lower classes—the lower classes taking their lead from the elite. So, if we see an authority we trust spouting conspiracy theories, or perhaps just failing to denounce them, then we may end up believing them.
  3. Peers reinforce belief. Numerous social psychology experiments lend credence to this (e.g., Asch’s conformity studies). In short, if others (in your in-group) say they believe in something, you're more likely to believe in it too.
  4. We’re more likely to believe people we like or respect. This is why conspiracy theorists play up the similarities between themselves and the conspiracy's victims (us), as against the evildoers (the "other"). The conspiracy theorist, by putting you in the same camp with them (i.e., the victim camp), makes you feel closer to them. By default, this makes you more antagonistic toward the "other", the ones perpetrating the conspiracy. Thus, you are more likely to believe in the conspiracy.2
  5. Beliefs are reinforced by payoffs, success, and happiness. This is why many conspiracy theories promise believers that they will be rewarded, either financially or spiritually. Alternatively, believers might feel empowered through secret knowledge. So, either they think they'll get some payoff, or they can feel superior to others by knowing something they don't know. Either way, belief in conspiracy is reinforced.
  6. Beliefs are reinforced by the confirmation bias. You know all about this by now. You selectively process information such that your pre-existing beliefs are reinforced. Everything you see is more evidence of the conspiracy, and your brain refuses to see the rest!
  7. Beliefs are reinforced by the optimism bias, our tendency to believe that we are less likely than others to experience a negative event, particularly in complex domains. So, just because we haven't directly suffered any negative outcomes from a conspiracy that we believe in and that supposedly affects a lot of people, that doesn't mean we have to stop believing in how widespread the conspiracy is. It's just that we're lucky! And we're also lucky enough to know "the truth", while others are being duped by the conspirators.
  8. Beliefs are reinforced by the self-justification bias, the tendency to rationalize our decisions post-hoc (i.e., after the fact) in order to justify the belief that what we did was in fact the ideal course of action. Once we commit to a belief, we tend to defend it by arguing that our belief has positive consequences for us. You might even say, "You can go on believing the official story like a sucker." But we feel that we're in the know. We know what's really going on.
  9. Beliefs are reinforced by the sunk cost fallacy, our tendency to stick with a plan of action (even though it might not lead to any benefit) just because we don't want to see our past investments (in the form of time and money) go to waste. Basically, it's this: if we’ve already invested in some particular theory, it’s hard to let it go.
  10. Beliefs are reinforced by the endowment effect, our tendency to overvalue something just because we own it (regardless of its objective market value). We value a conspiracy theory more than truth merely because it’s already our view.

 

Other factors

Although Shermer does not mention them, I'd like to add two more factors that make one vulnerable to conspiratorial thinking. In chapter 2 of Minds Make Societies, cognitive scientist Pascal Boyer considers how “junk information”, i.e., the low-quality information that is heavily trafficked in conspiracy-minded groups, cults, etc., gets passed around so readily. Part of it, Boyer argues, is that junk information is typically moralized or has a high-threat element attached to it. In other words, junk information is either given a moral dimension so that it seems more important, or it is made to seem like you're putting yourself at risk if you don't pay attention to it. Both features make junk information likely to bypass our skeptical vigilance and turn into a belief rattling around in our head. Boyer explains:

“From this perspective, the moralization of other people’s behavior is an excellent instrument for social coordination, which is required for collective action. Roughly speaking, stating that someone’s behavior is morally repugnant creates consensus more easily than claiming that the behavior results from incompetence. The latter could invite discussions of evidence and performance, more likely to dilute consensus than to strengthen it. This would suggest that our commonsense story about moral panics may be misguided—or at least terribly incomplete. It is not, or not just, that people have beliefs about horrible misdeeds and deduce that they need to mobilize others to stop them. Another factor may be that many people intuitively—and, of course, unconsciously—select beliefs that will have that recruitment potential because of their moralizing content. So, the millennial cults with failed prophecies are only a limiting case of the more general phenomenon whereby the motivation to recruit is an important factor in people’s processing of their beliefs. That is to say, beliefs are pre-selected in an intuitive manner, and those that could not trigger recruitment are simply not considered intuitive and compelling” (Boyer 2018).

 


Put simply, we want others on our side. This will make it easier to coordinate collective action. To convince others to join us, we can either point to our competitors' incompetence or say they are moral monsters. Given the way our brains evolved, calling someone a moral monster is simply more persuasive. It bypasses any need for looking at evidence and goes right to the gut. It makes you feel like the person might be a threat. Thus, people side with you and not with the moral monster, whether or not they are actually a monster. Boyer goes further, though. He is saying that, unconsciously, you select the things you are going to say such that they have as much oomph as possible. In other words, without knowing it, you generally say the things with the most recruitment power; you say what's most likely to get others on your side. Moralized content and high-threat alerts fit the bill pretty well. And this is why conspiracy theories are so alluring. They are highly-moralized, high-threat content. It may be junk information, but junk information is difficult to ignore.

Last but not least... Interestingly, across the world, higher economic inequality, as measured by the Gini coefficient, is correlated with more belief in conspiracy theories. In other words, the more income and wealth inequality, the greater the belief in conspiracy theories (Drochon 2019).
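
For the curious, the Gini coefficient itself is straightforward to compute: it is the average absolute difference between every pair of incomes, divided by twice the average income, yielding 0 for perfect equality and approaching 1 when one person holds everything. Here's a small sketch (the income figures are made up for illustration):

    def gini(incomes):
        """Gini coefficient: 0 = perfect equality, ~1 = maximal inequality.
        Mean absolute difference over all pairs, divided by twice the mean."""
        n = len(incomes)
        mean = sum(incomes) / n
        total_diff = sum(abs(x - y) for x in incomes for y in incomes)
        return total_diff / (2 * n * n * mean)

    print(gini([25, 25, 25, 25]))  # 0.0: everyone earns the same
    print(gini([0, 0, 0, 100]))    # 0.75: one person takes everything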

What's to be done? Stay tuned.


Do Stuff

  • In a post of around 300 words, please address the following question: How serious a societal problem is belief in conspiracy theories: very serious, somewhat serious, or not serious at all? In your answer, be sure to give an argument for your position. For example, if you believe that conspiratorial thinking is a very serious problem, you probably believe a sizable percentage of the population is negatively affected by conspiratorial thinking. So, give an account of how conspiracy-mindedness is negatively affecting your society, as well as an estimate of how many people are being negatively affected. You may also detail a particular conspiracy theory that you believe is causing trouble. Alternatively, if you believe conspiratorial thinking is only a small problem, you might detail how only certain sectors of society are actually negatively affected by conspiracy theories, and give reasons for this position. And, of course, if you believe it's not a problem at all, you can argue that conspiratorial thinking is a harmless pastime that some people spend their time on—with very few people actually suffering any harm.
  • After you post your discussion response, be sure to provide a substantive reply (100-word minimum) to two of your classmates' posts. You may, for example, note similarities or differences in your viewpoints, express disagreement with how your classmate estimated the number of people negatively affected by conspiratorial thinking, add details to a particular conspiracy theory your classmate focused on, etc.


To be continued...

 

FYI

Suggested Reading: Internet Encyclopedia of Philosophy Entry on Conspiracy Theories (same as last time)

TL;DR: The School of Life, How to Resist Conspiracy Theories (same as last time)

Advanced Material—

  • Reading: Stanford Encyclopedia of Philosophy Entry on Evidence

 

Footnotes

1. The link between paranoia and conspiracy-mindedness is very complicated. I'm taking the following discussion from Wood and Douglas (2019). Here's how I understand it. In general, as is also mentioned by Butter and Knight (2019), there was a general tendency in academic circles to pathologize conspiracy theorists—a tendency that is still around among laypeople today. In psychological jargon, these researchers were guilty of the fundamental attribution error: immediately attributing paranoia to conspiracy theorists rather than considering the effect of their environment on them. Research in this field is now a little more nuanced and less pathologizing. Given this new and improved methodology, it does appear that there is a relationship between distrust and conspiracy-mindedness. However, Wood and Douglas argue that the evidence is merely correlational—and remember, correlation is not causation. Rather than framing conspiracy-mindedness research in terms of paranoia, the authors make the case that conspiracy-mindedness should be understood as a form of disbelief rather than as a belief. Consider, for example, that conspiracy theorists spend most of their time arguing for why the official story is implausible and comparatively little time explaining how the conspiracy came about. This is, by the way, sort of the opposite of die-hard conspiracy skeptics, those who reject all conspiracies outright. In fact, die-hard skeptics about conspiracy theories are actually less likely to know about or believe in actual conspiracies, like MKULTRA and instances of US interventionism, such as the overthrows in Hawaii, Cuba, Puerto Rico, the Philippines, Nicaragua, Honduras, Iran, Guatemala, South Vietnam, Chile, Grenada, Panama, Afghanistan, and Iraq (see Kinzer 2007). (Either way, both those prone to conspiratorial thinking and die-hard conspiracy skeptics are failing to be critical thinkers—neither actually assesses the evidence.) In any case, on the issue of paranoia, the authors argue that conspiracy theories are not confined to the fringes of society; so, it's not just those who are paranoid. Moreover, it is possible to believe in just one conspiracy, complicating the narrative that belief in one conspiracy leads to belief in many. For example, most Americans believe there was a conspiracy around the JFK assassination, but that doesn't necessarily lead to the belief that, say, 9/11 was an inside job. There are also partisan conspiracies, as in the case where Republicans are more likely to believe Obama was secretly born in Kenya and Democrats are more likely to believe the George W. Bush administration was involved with 9/11. So, obviously, you might believe just the conspiracy theory that implicates the other party without accepting every political conspiracy theory you hear. (Coincidentally, conspiracies about vote-rigging are more likely to be believed if one’s preferred party loses; see p. 249.) All in all, paranoia is a cause of conspiracy-mindedness for some people, but it's not the only cause. As we've seen, subjects who were made to feel a lack of control are more likely to believe in conspiracies, exhibit paranoid behaviors, and be mistrusting of others (p. 250n34). Moreover, people who feel powerless are more likely to believe in conspiracy theories, and believing in conspiracy theories is correlated with prejudice against high-power—but not low-power—groups (p. 252n41). In short, the story of the cause of conspiracy-mindedness is very complicated.

2. Shermer is definitely on to something here. It is a well-attested finding that we are more likely to be influenced by those we like than by those we dislike. In fact, this finding is so prevalent that Robert Cialdini makes it his third principle of influence in his best-selling book Influence: The Psychology of Persuasion. In a nutshell, according to Cialdini, you assent to those you like. One study sheds light on this. First, for some context: Cialdini discusses how few Americans believe that evolution alone gave rise to humans in their present form—depressingly, only about a third. He then discusses how science communicators have clearly missed the lesson: you can’t just throw evidence at someone to update their beliefs if those beliefs were not arrived at through evidence in the first place. And so, Cialdini moves to a discussion of an experiment that was(!) successful in changing minds. In the experiment, researchers tested whether evolutionary theory would be more readily accepted if subjects believed that a well-liked person advocated the view. So, some subjects were made to believe that George Clooney (who else?!) was an advocate of evolutionary theory. And indeed, they were subsequently more likely to accept evolutionary theory themselves—at least when contrasted with the control group (see chapter 3 of Cialdini 2021).