I’ve talked to creationists one-on-one about this before, and they can’t tell me what I’m thinking at all accurately — it’s usually some nonsense about hating God or loving Satan, and it’s not at all true. But at the same time, I’m able to explain to them why they’re promoting creationism in a way they can agree with. — PZ Myers
PZ’s quandary reminds me of my own attempts to discuss political topics (terrorism, Islam, Israel and Palestine) and “religious” ones (methods used by Christian origins scholars, mythicism) with both academics and lay folk. Yesterday I read a post in which Jerry Coyne completely failed to explain the meaning of Zionism. Coyne has very strong views about Israel, but he does not know what Zionism is or why some people oppose it. I have found the same ignorance about Islamist terrorism and Islam itself in a number of discussions here on this blog. Ironically, that ignorance sometimes expresses itself in response to posts where I have cited or directly quoted serious research into the very questions at issue. Some people appear to ignore the explanations of the ideas they are supposedly responding to.
I once spent many, many exchanges with a then associate professor at Butler University comparing the evidence for Socrates and Jesus. I could not understand why he kept repeating arguments I thought I had so clearly demonstrated were false, so I asked him to tell me what he understood my argument to be. It took quite a while, but eventually he did respond, and he stymied me with a nonsensical reply that completely missed my point. I can only assume he was sincere and really was not registering what I was writing in my exchanges with him. We have seen the same travesty in his inability to explain the most fundamental arguments of Earl Doherty and Richard Carrier even after he had supposedly read sections of their books. But that’s no surprise, because we saw the same distortions in Ehrman’s and Casey’s claims to have read and responded to mythicist arguments.
They — people like Coyne and McGrath — are not really engaging with the arguments of their opponents. They really do not know what their opponents are arguing.
But then I have to confess that I have sometimes rushed to conclusions about political and religious claims and other situations on some sort of instinct, or certainly with knee-jerk reactions. I do know that there was a time when I was like PZ Myers’s creationists. I was confident that I knew the fallacies at the heart of evolution and the thinking of the scientists who wrote about it. And I know I have a tendency to form instant judgments when I listen to certain politicians speak.
So it was with interest that I read Rick Shenkman’s discussion of two types of thinking in Political Animals: How Our Stone Age Brain Gets in the Way of Smart Politics.
[E]volution teaches us to think quickly. In the life-and-death setting common in the world of hunter-gatherers, speed was of the essence in sizing up both people and situations. We couldn’t let anything get in the way of our making up our minds, not even an absence of facts. In circumstances where we lacked facts— a common occurrence in the real world— we found other bases upon which to make a decision. The point was to act. Dillydallying could kill you.
The legacy of this evolutionary inheritance is that today we leap to make decisions even when we don’t need to. Instead of waiting for facts we rush to judgment. Though in the modern world we are seldom called on to render a lightning-fast, life-or-death judgment involving a politician, that’s what we do. We can’t help ourselves. We are hardwired to think fast rather than to reflect at length.
Fast thinking (also known as System 1), as the pioneering psychologist Daniel Kahneman points out, is easy. It doesn’t require us to dwell. It really doesn’t require us to think at all, at least as most people define thinking. That’s because it mostly happens in the unconscious, where most of our brain functioning actually takes place. As psychologist Michael Gazzaniga informs us, “98 percent of what the brain does is outside of conscious awareness.” When Michael Jordan dunks a ball he doesn’t think through all the steps he needs to take to gain lift, angle his arms, and provide thrust. He performs these tasks automatically. If he suddenly tried to think about what he’s doing when dunking a ball he’d probably stumble. Reflection gets in the way of the performance of tasks that are usually left to the unconscious. Why is that? Reflection takes time. It’s slow thinking (System 2). Literally slow. Operations in the brain involving the unconscious are five times faster than those involving consciousness.
How do we arrive at a quick decision? We use shortcuts, what social scientists refer to as heuristics. Quick— which of these capital cities in Africa has the most people?
1. Libreville
2. Asmara
3. Cape Town
The answer is Cape Town. How do you know this? Because you have heard of Cape Town (pop. 3.74 million), and you probably haven’t heard of Libreville (pop. 797,000) or Asmara (pop. 649,000). Your brain concluded that since you haven’t heard of either city, chances are they aren’t very big. This is an example of the recognition heuristic. If we recognize something, our brain automatically assumes it must be because it’s important. Why do we vote for people whose names we recognize on the ballot even if we know nothing about them? It’s because we recognize their names. Our mere recognition of them must mean that they are known for something, and in the absence of a strong negative cue, we naturally believe it must be something positive. In fact, social scientists have discovered that familiarity seems to have the same effect on us as happiness. We get a charge in the reward center of our brain when we experience the familiar. And when do we become less analytical? When we are engaged in System 1 thinking.
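The recognition heuristic Shenkman describes can be stated as a simple decision rule: when choosing among options, prefer the one you recognize, regardless of any other data. Here is a minimal sketch of that rule (my own illustration, not Shenkman’s; the population figures are the ones quoted in the passage):

```python
# A minimal sketch of the recognition heuristic: among the options,
# pick a recognized one; only fall back to anything else if
# recognition gives no signal at all.
populations = {  # figures quoted in the passage, used only to check the guess
    "Cape Town": 3_740_000,
    "Libreville": 797_000,
    "Asmara": 649_000,
}
recognized = {"Cape Town"}  # assume the reader has only heard of Cape Town

def recognition_heuristic(options, recognized):
    """Return a recognized option if there is one, else an arbitrary one."""
    familiar = [city for city in options if city in recognized]
    return familiar[0] if familiar else options[0]

choice = recognition_heuristic(list(populations), recognized)
print(choice)  # prints "Cape Town"
# In this case the shortcut happens to agree with the actual populations:
print(choice == max(populations, key=populations.get))  # prints "True"
```

The point of the sketch is that the rule never consults the population numbers at all; it succeeds here only because, as Shenkman notes, familiarity and importance tend to correlate.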
Shenkman, Rick (2016-01-05). Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (pp. 53-54). Basic Books. Kindle Edition.
My System 1 thinking is telling me that that is “so true”! But will that snap judgment stand up to the test of System 2 thought?
Viewers reacted to these stories with anger. That was the right reaction. In this case, it’s self-evident, we could trust our instincts. We didn’t actually have to witness a beheading in person to feel the horror of it. The act of chopping off someone’s head is so appalling we could almost feel it as if we were present. We could live the news even as we experienced it vicariously (especially since we could watch a video showing what happened, even if the networks cut away before the final fatal blow was struck). That was our Pleistocene brain working the way it should under modern circumstances.
But what should we do when it doesn’t? One solution is to switch from System 1 to System 2. System 1, as you’ll recall, is our automatic system. It’s the system we use when our thinking is guided primarily by our emotions and instincts out of conscious awareness. System 2 is higher-order cognitive thinking, and it happens in conscious awareness. When we can’t rely on System 1 we have to switch to System 2. In theory, this shouldn’t be a problem. System 2 is designed for just such situations. When System 1 isn’t giving us the results we need, System 2 is supposed to kick in automatically. The eminent psychologist Jeffrey Alan Gray proposed that the reason we as a species developed consciousness in the first place was to allow us to adapt when we encounter situations where our emotions and our instincts don’t work. Consciousness, he explained, is an adaptive mechanism that allows humans to detect errors and make corrections. When System 1 isn’t performing sufficiently well, our conscious system— System 2— is supposed to take over, giving us the ability to respond creatively to problems we encounter.
The trouble is that our signaling network often doesn’t switch us over to System 2 when it should. Our brain’s surveillance system should detect the fact that in politics the context is usually wrong and that we need to be switching to System 2 as a result. But it doesn’t. This is why our Pleistocene brain, in the modern world, often seems to misfire.
In retrospect, when I was defending Nixon as a seventeen-year-old, I should have switched to System 2 thinking. As the evidence against him piled up I needed to question my assumptions and reevaluate my commitment. But my surveillance system didn’t sound an alarm. So I stayed on autopilot, reacting to the news rather than thinking hard about it. If you had asked me, I would have told you I was thinking hard about Watergate. I was reading everything I could get my hands on about the scandal and was following the story’s complicated twists and turns. But I didn’t realize that System 1 instincts were guiding my reactions. That’s because System 1 operates behind the scenes, in hiding places we seldom think to examine. I’d heard of cognitive dissonance. It was on the agenda of my Psych 101 class. But I didn’t spot it in myself. It didn’t occur to me to think that the more effort I put into my defense of Nixon the more I was likely to keep on defending him. I was using System 1 thinking but didn’t realize it. I was confident I was thinking, not just reacting.
This suggests that one of our most urgent tasks is to study ourselves. This is hardly a revelation we needed science to tell us. Plato told us two thousand years ago to examine our lives. But it is not obvious that if we want to understand the modern world and make good political choices we have to start by understanding the 98 percent of our brain that is inaccessible to conscious awareness. Plato didn’t mention that. What science teaches us is that as we look outward, we need to look inward simultaneously. We have to question our intuitions, which is counterintuitive.
In effect, we have to serve as our own watchdogs, ever on the lookout for System 1 thinking. We have to be our own “gotcha” cops. In the game of gotcha the media play, politicians are singled out when they commit a faux pas. Our job is to try to call ourselves out whenever we can when we catch ourselves operating on automatic pilot, particularly when serious issues are at stake. That’s the only way we can be sure our Pleistocene brain is helping us react the way we should be. Sometimes, as we’ve seen in our reaction to the ISIS beheadings, our System 1 reaction is the right reaction. But the only way to know if we are reacting properly is to put our political reactions under a microscope. In effect, we have to keep ourselves under glass as if we were our own science experiment. Now it’s not realistic to believe that we are going to be able to do this consistently. It would be too tiring to subject ourselves to this kind of self-scrutiny constantly.
Shenkman, Rick (2016-01-05). Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (pp. 217-218). Basic Books. Kindle Edition.
I guess if I can be thankful for anything that came out of my own past cult experience, it is that it came to a traumatic end. It was traumatic to realize I had been so completely wrong about the most fundamental things of life and to be left hanging for a time with no idea what to believe anymore, adrift without sky above or ground beneath, totally lost. Tim has also learned the lesson of humility from a similar experience:
The one thing that all ex-fundamentalists live with is the sure knowledge of how wrong we can be. I was absolutely certain that the world was created in six days just a few thousand years ago. I was wrong. I stood my ground and argued with people about the craziest ideas and I was absolutely, unshakably sure that I had the Truth. I did not have the truth. I wasn’t just misinformed about some trivial matter. I was dead wrong on almost everything.
There’s nothing like humiliation to make you humble.
I had to rethink everything, beginning with life itself: what is it? Plants have life; so do lizards and fungus. I began to rethink everything in System 2 mode. My views on life, on social and political issues, on people, all sorts of people, began to undergo serious System 2 review. It’s one more classic Damascus Road story, but I try not to let the things I learned then become my new System 1 impulses. Paul and reformed smokers can be such jerks. So from that perspective, I guess those of us who have been through this sort of experience can learn to accept, and find some good in, that wasted fundamentalist life that left so much hurt in its wake.
It is common sense to think that emotion should be divorced from reason and can be. But that is not what science is finding. Common sense is wrong. The way the brain works, neuroscientists tell us, is that emotion and reason work together. You cannot separate them. The problem with those voters who waited eleven long months before finally deciding that Nixon had abused his office and violated their trust was not that they had been too emotional, but not emotional enough. They had left their reaction to Nixon on autopilot. Woodward and Bernstein had been breaking stories that should have made voters’ hair stand on end. Instead, they barely took notice.
What was missing? Anxiety— one of our key emotions. Voters weren’t anxious enough. Science has established that we digest information about the world in two ways, using System 1 or System 2.
System 1 Thinking
Because, as we learned earlier, most of what happens in the world is predictable, our brain mostly has to deal with challenges that are familiar. For this, System 1 is perfectly adequate. The way System 1 works is simple. It matches everything it encounters to a familiar pattern. In effect, our brain faces the same challenge as the contestants on the 1960s television game show Concentration, who had to guess what they were seeing by matching the fragments on the screen with the images already in their head. As with those contestants, our brain has to figure out if what it’s seeing is a dog, a cat, a tree, or something else, using as a reference point its databank of memories. Success comes when it makes a match. The match doesn’t have to be perfect. We don’t need to match up a phantom “parti poodle” that we happen upon with a memory of one in our brain to know it’s a poodle. Even if we have never before seen a poodle of that type (it’s rare) we can still figure out it’s a poodle. The brain performs this task so seamlessly, we aren’t even aware of what it’s doing. It just does it.
The brain uses the same system when performing motor functions. When you drink a cup of coffee you don’t have to think about it. You don’t have to consciously think about every step in the process as your fingers reach for the cup, grasp the handle, and bring it to your lips. Your unconscious brain, using a process known as the habit execution system, handles all of these actions seamlessly by matching what you want to do with a pattern of behavior you previously executed. It does this by using System 1, the system that Michael Jordan uses when he dunks a basketball. He does not consciously order his hands to reach for the ball while telling his legs to speed up as he makes his way down the court. He just automatically does these things. This is what athletes do. They practice like crazy so that when the ball is thrown to them they know what to do with it automatically. This is why practice makes perfect. The more we practice, the less we have to think consciously about what we’re doing. We can just let our unconscious brain (System 1) take over. And because our unconscious brain works at a phenomenally faster speed than our conscious brain, we can perform at a level that seems superhuman. This is the miracle of System 1 thinking.
System 2 Thinking
But the world isn’t always predictable. That’s why our brain is designed to pick out what’s novel. A surveillance system in our brain is constantly searching the environment for anything that seems strikingly different from what it has encountered before. It’s your surveillance system that goes into action when you see something you can’t believe and your eyes widen to take in more of the scene, or you smell something that’s a bit disturbing and your nostrils flare to give you a better sniff. Threats get particular attention. And when the surveillance system cannot find a match in its memory for what it has come across, it sends up a flare from the amygdala to our conscious brain to take notice, using System 2 to make sense of it. The emotion you feel when this happens is anxiety.
The reason those pro-Nixon voters took so long to come around to the obvious truth that Nixon was engaged in a cover-up of astonishing proportions was because for eleven months they did not feel anxious enough to revisit the assumptions they had made when deciding initially to vote for him. Flares went up. They were ignored. Not until the Watergate story snowballed with the resignations of Haldeman and Ehrlichman in the spring of 1973 did the avalanche of bad news finally trigger an amygdala reaction of sufficient force that it got voters to do what we humans hate to do: make the decision to change our minds. When exactly does this happen? George Marcus has discovered that it happens when the burden of hanging on to a belief becomes greater than the cost of changing it. That’s one of the key findings of what’s come to be known as the Theory of Affective Intelligence, the theory Marcus and his colleagues developed when they began studying emotion. Anxiety is particularly important because it’s like acid. It eats through the rusting metal of our preconceptions.
Shenkman, Rick (2016-01-05). Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (pp. 131-132). Basic Books. Kindle Edition. (My bolding, formatting and headings)
Living in large, well-to-do societies can make us lazy, so that we pay less attention to those anomalous signals. Group loyalty and identity, and the defence of our personal status, keep us operating on System 1 far too often.