
Why Science Needs a Scout Mindset: Lessons from Julia Galef

Photo credit: BORNTHISWAYMEDIA, CC BY-SA 4.0, via Wikimedia Commons.

Have you ever wondered why some people are able to think about the world more clearly than others, forming more balanced and nuanced views about controversial topics? Have you pondered which thinking patterns are most conducive to good reasoning and well-supported conclusions, and how one might avoid the pitfalls of confirmation bias and self-deception? In her book The Scout Mindset: Why Some People See Things Clearly and Others Don’t, Julia Galef (host of the podcast “Rationally Speaking” and co-founder of the Center for Applied Rationality) attempts to answer these questions.1 Here, I shall summarize Galef’s insights and discuss what lessons we, as scholars engaged in the debate over ID and evolution, can glean from the book.

The Soldier and the Scout

Galef distinguishes between what she dubs “the soldier mindset” and “the scout mindset.” According to Galef, the soldier mindset, also known as motivated reasoning, leads us to loyally defend the stronghold of our belief commitments against intellectual threats, come what may. This involves actively seeking out data that tends to confirm our beliefs, while rationalizing or ignoring contrary data that tends to disconfirm them. On the other hand, the scout mindset attempts to honestly determine how the world really is — as Galef defines it, the scout mindset is “the motivation to see things as they are, not as you wish they were” (p. ix).

For someone in the soldier mindset, argues Galef, reasoning is like defensive combat — “it’s as if we’re soldiers, defending our beliefs against threatening evidence” (p. 7). For the soldier, to change one’s mind — to admit that one was wrong — is seen as surrender and failure, a sign of weakness. One’s allegiance is to one’s cherished beliefs rather than to the truth, even if those beliefs conflict with the balance of evidence. For the soldier, determining what to believe is done by asking oneself “Can I believe this?” or “Must I believe this?”, depending on one’s motives. For someone in the scout mindset, by contrast, reasoning may be likened to mapmaking, and discovering that you are wrong about one or more of your beliefs simply means revising your map. Thus, scouts are more likely to seek out and carefully consider data that tends to undermine their own beliefs (thereby making their map a more accurate reflection of reality), deeming it more fruitful to pay close attention to those who disagree with them than to those whose thinking aligns with their own.

The prevalence of the soldier mindset in society today is aptly demonstrated by a sobering study, cited by Galef, in which participants’ “scientific intelligence” was measured with a set of questions.2 The questions fell into four categories: basic facts, methods, quantitative reasoning, and cognitive reflection. Remarkably, when conservative Republican and liberal Democratic participants were also asked whether they affirmed the statement that there is “solid evidence” of recent global warming due “mostly” to “human activity such as burning fossil fuels,” there was a positive correlation between “scientific intelligence” and polarization of opinion. That is to say, the higher one’s scientific intelligence, the more likely a liberal Democrat was to affirm the statement and the more likely a conservative Republican was to disagree with it. This is not the only study to reveal the tendency of more educated people to diverge in opinion on controversial topics. Another study surveyed people’s views on ideologically charged topics, including stem cell research, the Big Bang, human evolution, and climate change.3 The finding was that “Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues,” though the study found “little evidence of political or religious polarization regarding nanotechnology and genetically modified foods.” Galef summarizes the implications of these studies: “This is a crucially important result, because being smart and being knowledgeable on a particular topic are two more things that give us a false sense of security in our own reasoning. A high IQ and an advanced degree might give you an advantage in ideologically neutral domains like solving math problems or figuring out where to invest your money. But they won’t protect you from bias on ideologically charged questions” (p. 48).

Though there is an element of scout and soldier in all of us, Galef argues, “some people, in some contexts, are better scouts than most,” being “more genuinely desirous of the truth, even if it’s not what they were hoping for, and less willing to accept bad arguments that happen to be convenient. They’re more motivated to go out, test their theories, and discover their mistakes. They’re more conscious of the possibility that their map of reality could be wrong, and more open to changing their mind” (pp. 14-15). On the flip side, often “We use motivated reasoning not because we don’t know any better, but because we’re trying to protect things that are vitally important to us — our ability to feel good about our lives and ourselves, our motivation to try hard things and stick with them, our ability to look good and persuade, and our acceptance in our communities” (p. 26). For example, if we are being honest, how often do we, when considering a claim, “implicitly ask ourselves, ‘What kind of person would believe a claim like this, and is that how I want other people to see me?’” (p. 23). Such thinking fuels the soldier mindset. In practice, we cannot eliminate the soldier mindset from our reasoning processes entirely. After all, it is our default mentality. By nature, we like having our beliefs confirmed. But we can take intentional steps towards cultivating more of a scout mindset.

Characteristics of the Scout Mindset

What are some of the key characteristics that distinguish the scout from the soldier mindset? Galef gives five features that define a scout. The first is the ability to tell other people when you realize that they were right. Galef qualifies this by noting that “Technically, scout mindset only requires you to be able to acknowledge to yourself that you were wrong, not to other people. Still, a willingness to say ‘I was wrong’ to someone else is a strong sign of a person who prizes the truth over their own ego.” The second quality is reacting well to criticism. Galef explains, “To gauge your comfort with criticism, it’s not enough just to ask yourself, ‘Am I open to criticism?’ Instead, examine your track record. Are there examples of criticism you’ve acted upon? Have you rewarded a critic (for example, by promoting him)? Do you go out of your way to make it easier for other people to criticize you?” (p. 52). Denying critics of evolution tenure, firing them from faculty positions, or preventing them from publishing are hardly marks of a scout mindset.

The third quality that marks a scout is the ability to prove oneself wrong. Galef asks, “Can you think of any examples in which you voluntarily proved yourself wrong? Perhaps you were about to voice an opinion online, but decided to search for counterarguments first, and ended up finding them compelling. Or perhaps at work you were advocating for a new strategy, but changed your mind after you ran the numbers more carefully and realized it wouldn’t be feasible” (p. 54). The fourth feature of the scout mindset is to avoid biasing one’s information. “For example,” writes Galef, “when you ask your friend to weigh in on a fight you had with your partner, do you describe the disagreement without revealing which side you were on, so as to avoid influencing your friend’s answer? When you launch a new project at work, do you decide ahead of time what will count as a success and what will count as a failure, so you’re not tempted to move the goalposts later?” (p. 56). 

The fifth feature that Galef lists is being able to recognize good critics. She comments, “It’s tempting to view your critics as mean-spirited, ill-informed, or unreasonable. And it’s likely that some of them are. But it’s unlikely that all of them are. Can you name people who are critical of your beliefs, profession, or even choices who you consider thoughtful, even if you believe they’re wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable (even if you don’t happen to know of specific people who hold those views)?” (p. 57). 

In summary, Galef notes, “Being able to name reasonable critics, being willing to say ‘The other side has a point this time,’ being willing to acknowledge when you were wrong – it’s things like these that distinguish people who actually care about truth from people who only think they do” (p. 57). How many times have you heard proponents of evolution, such as Eugenie Scott, assert that they “don’t know any evidence against evolution”? The inability to recognize evidence against your own position — i.e., the failure to acknowledge data that surprises you given the precepts of your preferred theory — is the surest sign that confirmation bias plays a significant role in your reasoning. Even if one thinks the preponderance of the evidence still favors the evolutionary view, to assert that there is no evidence that negatively affects its probability, even mildly, is pure denialism.

Too often I see people reason that, if a set of data can be made compatible with their beliefs, then they have neutralized the objection to those beliefs. However, this approach is simplistic. It is nearly always possible to make discordant data compatible with your beliefs. But the data may still be better predicted on the supposition that your beliefs are false than on the supposition that they are true, and in that case you should lower your confidence in those beliefs. When confronted with discordant data, the appropriate question is not “Can I believe I am still right?” Galef rightly points out that “Most of the time, the answer is ‘Yes, easily’” (p. 141). Rather, we should ask to what extent our confidence in our beliefs needs to be updated in response to the new data.
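To make this concrete, here is a minimal Python sketch of treating “how much should this data move me?” as a question about probabilities. The function name and the specific figures are illustrative assumptions of mine, not anything drawn from Galef’s book; the point is only that “compatible with my belief” and “expected given my belief” are two different things.

```python
def update_confidence(prior: float,
                      p_data_if_true: float,
                      p_data_if_false: float) -> float:
    """Posterior probability that a belief is true after seeing new data.

    prior           -- current confidence that the belief is true (0 to 1)
    p_data_if_true  -- how probable this data is if the belief is true
    p_data_if_false -- how probable this data is if the belief is false
    """
    numerator = prior * p_data_if_true
    return numerator / (numerator + (1.0 - prior) * p_data_if_false)


# Hypothetical discordant data: merely "compatible" with the belief
# (20 percent expected if the belief is true) but better predicted if the
# belief is false (60 percent expected). Starting from 80 percent confidence:
print(round(update_confidence(0.8, p_data_if_true=0.2, p_data_if_false=0.6), 2))
# Prints 0.57 -- confidence should drop, even though the data was "compatible".
```

Nothing in the arithmetic requires the belief to be abandoned; it only requires confidence to move in proportion to how lopsidedly the data favors one hypothesis over the other.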

Testing for Our Own Biases

Galef offers five tests for bias in our reasoning. The first test is the double standard test, which essentially asks whether we apply the same standards to ourselves that we would apply to others. The second test is the outsider test, which attempts to determine how you would assess the same situation or data if you had no vested interest in the outcome. The third is the conformity test, which attempts to discern the extent to which one’s opinion is in fact one’s own. Galef explains, “If I find myself agreeing with someone else’s viewpoint, I do a conformity test: Imagine this person told me that they no longer held this view. Would I still hold it? Would I feel comfortable defending it to them?” (p. 66). The fourth test is the selective skeptic test — “Imagine this evidence supported the other side. How credible would you find it then?” (p. 68). The final test is the status quo bias test — “Imagine your current situation was no longer the status quo. Would you then actively choose it? If not, that’s a sign that your preference for your situation is less about its particular merits and more about a preference for the status quo” (p. 69).

How to Be Wrong

Another thing that marks out a scout, according to Galef, is their attitude toward being wrong. Scouts, explains Galef, “revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs” (p. 144). Further, “they view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing ‘I was wrong’ feel valuable, rather than just painful” (p. 144). Galef even suggests that we should drop the whole “wrong confession” altogether and instead talk about “updating.” She explains, “An update is routine. Low-key. It’s the opposite of an overwrought confession of sin. An update makes something better or more current without implying that its previous form was a failure” (p. 147). Galef points out that we should not think about changing our minds as a binary thing — rather, we should think of the world in “shades of grey,” and think about changing our mind in terms of an “incremental shift” (p. 140). She notes that thinking about revising one’s beliefs in this way makes “the experience of encountering evidence against one of your beliefs very different” since “each adjustment is comparatively low stakes” (p. 140). For example, “If you’re 80 percent sure that immigration is good for the economy, and a study comes out showing that immigration lowers wages, you can adjust your confidence in your belief down to 70 percent” (p. 140).
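Galef presents that last example informally, but it is worth seeing what the arithmetic would look like on a Bayesian reading, which is my framing here rather than anything the book prescribes for this passage. Dropping from 80 percent to 70 percent confidence is equivalent to judging that the study is somewhat more expected if the belief is false than if it is true:

\[
\text{posterior odds} = \text{prior odds} \times \text{likelihood ratio},
\qquad
\frac{0.70}{0.30} = \frac{0.80}{0.20} \times \text{LR}
\;\;\Longrightarrow\;\;
\text{LR} = \frac{7/3}{4} = \frac{7}{12} \approx 0.58 .
\]

That is, the study is treated as roughly 1.7 times more likely on the assumption that the belief is false than on the assumption that it is true, which is exactly the kind of modest, low-stakes adjustment Galef has in mind rather than a wholesale surrender.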

Who Should We Listen To?

Galef also points out that, when it comes to intentionally exposing ourselves to content representing the “other side” of a debate in which we are interested, people tend to make the mistake of always ending up “listening to people who initiate disagreements with us, as well as the public figures and media outlets who are the most popular representatives of the other side” (p. 170). However, as Galef explains, “Those are not very promising selection criteria. First of all, what kind of person is most likely to initiate a disagreement? A disagreeable person. (‘This article you shared on Facebook is complete bulls**t — let me educate you…’) Second, what kind of people or media are likely to become popular representatives of an ideology? The ones who do things like cheering for their side and mocking or caricaturing the other side — i.e., you” (pp. 170-171). Instead, Galef suggests, “To give yourself the best chance of learning from disagreement, you should be listening to people who make it easier to be open to their arguments, not harder. People you like or respect, even if you don’t agree with them. People with whom you have some common ground — intellectual premises, or a core value that you share — even though you disagree with them on other issues. People whom you consider reasonable, who acknowledge nuance and areas of uncertainty, and who argue in good faith” (p. 171).

Striving for Clearer Thinking

In summary, cultivating a scout mindset, and minimizing the soldier mindset, can help us to think more clearly and with greater intellectual honesty about our beliefs and our reasons for holding them. I cannot recommend Julia Galef’s book highly enough. I would also recommend her TEDx Talk, “Why ‘scout mindset’ is crucial to good judgment.”

Notes

  1. Julia Galef, The Scout Mindset: Why Some People See Things Clearly and Others Don’t (New York: Portfolio, 2021).
  2. Dan M. Kahan, “‘Ordinary science intelligence’: a science-comprehension measure for study of risk and science communication, with notes on evolution and climate change,” Journal of Risk Research 20, no. 8 (2017), 995-1016.
  3. Caitlin Drummond and Baruch Fischhoff, “Individuals with greater science literacy and education have more polarized beliefs on controversial science topics,” Proceedings of the National Academy of Sciences 114, no. 36 (Sept. 2017), 9587-9592.