2016-05-09

Once more on System 1 and System 2 thinking


by Neil Godfrey

When I wrote Do You Understand What You Argue Against I had only just finished reading Richard Shenkman’s Political Animals: How Our Stone Age Brain Gets in the Way of Smart Politics and was sharing some reflections arising out of that book. In particular, I had been thinking about how Shenkman’s overview of recent findings in psychology and related studies helps us understand why we so often find people with very strong opinions about certain things (evolution, mythicism, Zionism, refugees, Muslims, terrorists, politicians, national history, poverty . . . ) even though they are incapable of explaining the viewpoints they oppose. I opened with something written by PZ Myers:

I’ve talked to creationists one-on-one about this before, and they can’t tell me what I’m thinking at all accurately — it’s usually some nonsense about hating God or loving Satan, and it’s not at all true. But at the same time, I’m able to explain to them why they’re promoting creationism in a way they can agree with.

I discussed this problem in the context of System 1 and System 2 types of thinking. It didn’t take very many comments on that post to send me looking for the main source of the System 1 and System 2 model that Shenkman raised; Shenkman’s discussion was only secondhand. So I have since started reading the primary source: Thinking, Fast and Slow by Daniel Kahneman (2011). When I first heard of that book I impulsively dismissed it (a System 1 reaction) because I thought the title and someone’s comment about it meant it was just another pop psychology book. I have since learned I could not have been more wrong. Daniel Kahneman is not a pop psychologist. See The Guardian’s article Daniel Kahneman changed the way we think about thinking. But what do other thinkers think of him?

Excuse me if I copy and paste some paragraphs from the conclusion of Kahneman’s book. Work pressures and bouts of illness have kept me from posting anything more demanding at this stage. What follows involves a slight shift in my understanding of the nature of System 2 thinking. (Always check the primary sources before repeating what you think you understand from a secondary source!) Bolding is my own.

The attentive System 2 is who we think we are. System 2 articulates judgments and makes choices, but it often endorses or rationalizes ideas and feelings that were generated by System 1. You may not know that you are optimistic about a project because something about its leader reminds you of your beloved sister, or that you dislike a person who looks vaguely like your dentist. If asked for an explanation, however, you will search your memory for presentable reasons and will certainly find some. Moreover, you will believe the story you make up. But System 2 is not merely an apologist for System 1; it also prevents many foolish thoughts and inappropriate impulses from overt expression. The investment of attention improves performance in numerous activities—think of the risks of driving through a narrow space while your mind is wandering—and is essential to some tasks, including comparison, choice, and ordered reasoning. However, System 2 is not a paragon of rationality. Its abilities are limited and so is the knowledge to which it has access. We do not always think straight when we reason, and the errors are not always due to intrusive and incorrect intuitions. Often we make mistakes because we (our System 2) do not know any better. . . .

System 1 registers the cognitive ease with which it processes information, but it does not generate a warning signal when it becomes unreliable. Intuitive answers come to mind quickly and confidently, whether they originate from skills or from heuristics. There is no simple way for System 2 to distinguish between a skilled and a heuristic response. Its only recourse is to slow down and attempt to construct an answer on its own, which it is reluctant to do because it is indolent. Many suggestions of System 1 are casually endorsed with minimal checking, as in the bat-and-ball problem. This is how System 1 acquires its bad reputation as the source of errors and biases. Its operative features, which include WYSIATI [What you see is all there is], intensity matching, and associative coherence, among others, give rise to predictable biases and to cognitive illusions such as anchoring, nonregressive predictions, overconfidence, and numerous others.
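(For readers who have not met the bat-and-ball problem Kahneman mentions: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. How much does the ball cost? The answer System 1 offers almost instantly is 10 cents. A few seconds of System 2 arithmetic shows why that cannot be right: if the ball cost $0.10 the bat would cost $1.10 and the pair $1.20. Writing it out, ball + (ball + $1.00) = $1.10, so 2 × ball = $0.10 and the ball costs 5 cents. The intuitive answer is casually endorsed with minimal checking, which is exactly the point of the passage above.)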

What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor…,” “The decision could change if the problem is reframed…” And I have made much more progress in recognizing the errors of others than my own.

The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. This is how you will proceed when you next encounter the Müller-Lyer illusion. When you see lines with fins pointing in different directions, you will recognize the situation as one in which you should not trust your impressions of length. Unfortunately, this sensible procedure is least likely to be applied when it is needed most. We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally more difficult to recognize than perceptual illusions. The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision. More doubt is the last thing you want when you are in trouble. The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors. That was my reason for writing a book that is oriented to critics and gossipers rather than to decision makers. . . . .

There is a direct link from more precise gossip at the watercooler to better decisions. Decision makers are sometimes better able to imagine the voices of present gossipers and future critics than to hear the hesitant voice of their own doubts. They will make better choices when they trust their critics to be sophisticated and fair, and when they expect their decision to be judged by how it was made, not only by how it turned out.

A similar point has been made about decision making in successful corporations. That is where System 2 thinking is most likely to be implemented successfully: managers ensure that standards are adhered to, that all bases are covered, that checklists are completed. Laziness is costly and scarcely tolerated. As individuals we go easier on ourselves.



