2021-01-17

When, Why and How People Change Their Minds

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

by Neil Godfrey

I especially loved these words towards the end of the same podcast: I cannot pinpoint a single “straw that broke the camel’s back” of my religious beliefs; I look back instead at a series of moments that led me towards atheism. One can also understand why it is so easy to demonize those on “the other side” of a political or religious fence, and from there begin to appreciate what it takes for our minds to change.

People are not perfect Bayesian reasoners as much as we would like to aspire to be. People do not have a set of priors that are well delineated and then collect new data and update them according to Bayes’s formula, that’s not what people do. But that doesn’t mean that people don’t change their minds, people change their minds all the time.
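The contrast with an idealized Bayesian reasoner can be made concrete in a few lines. This is a minimal sketch of Bayes’s formula itself; the specific numbers (a 0.9 starting credence, each piece of evidence twice as likely if the belief is false) are invented purely for illustration:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E) from Bayes's formula."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start out quite confident in a proposition H...
credence = 0.9

# ...then take in ten small pieces of contrary "data" (in the broadest
# sense), each twice as likely if H is false as if H is true.
for _ in range(10):
    credence = bayes_update(credence, p_e_given_h=0.3, p_e_given_not_h=0.6)

print(round(credence, 3))
```

No single update flips the belief; each one only nudges the credence, and it is the accumulated series that carries it from 0.9 to under 0.01 — which is exactly the point about there being no one straw that does the work.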

Soft landing

1:29:48.7 SC: What often happens is something that can be very familiar to physicists who know about phase transitions, the thing that causes someone to change their mind might not be, and in fact, rarely is the straw that broke the camel’s back. There can be a little thing that they get, the little piece of information and experience, whatever it is, that is associated in time with the moment they change their mind. But the actual cause of them changing their mind is a set of many, many things stretching back in time, okay? You have a person with an opinion, with a belief, a credence in a certain proposition, and they get data that is against that proposition, and data in the very broadest sense, it’s not like they’re being physicists, but they get information, experiences, new stories, conversations with friends, that cause them to think about that particular proposition, and then they don’t change their mind immediately, ’cause that’s not how people work, but that has an effect on them. Even if the effect is invisible at the level of their actual beliefs in propositions, hearing that thing can nevertheless affect them at a deeper level.

1:30:56.8 SC: And if they hear something else, and something else, and something else over a period of time, they can eventually be led to change their mind without it ever being possible to associate the reason for that change with a particular piece of information that they got. Not to mention the fact that often, this data in a very, very broad sense is not data. In other words, the thing that is causing people to change their minds is not some piece of information or some rational argument, but something much more visceral, something much more emotional. Realizing that this person who is a member of a group that they have hated and denigrated for years, they meet a member of that group and become friends with them, suddenly maybe their minds change, right? You are against gay people getting married and then you have a child who turns out to be gay and wants to get married, maybe you change your mind, right? For no especially good reason epistemically, rationally, but you realize that, “I wasn’t really that devoted to that opinion in the first place.”

1:31:55.4 SC: There are many ways to change people’s minds, and it really does happen, and all of this is just to say it’s worth trying. It’s not worth trying reaching out to the extremists, to the crazies, but there are plenty of people who are not like that. There are plenty of people who are just not that devoted. And those people might not be wedded to the views that they very readily profess to believe in right now. This is part of the challenge of democracy, those people count, just as much as the most informed voters count. And of course, there are hyper-informed voters who are extremists on both sides, so it’s not just a matter of information levels, but there are people who are, in principle and in practice, reachable and people who are not, and we should try to reach the ones who are reachable. And again, I would give that advice to the other side as well, if the other side thinks that they wanna reach some people who are on the opposite side, they can try to reach me and I’m here to be reached, right?

1:32:54.6 SC: Change takes time. Often it is not a matter of marshaling better arguments, it’s just setting a good example, providing people with a soft landing. One of the hardest things about changing your mind politically is that it is associated with a million other things in your life, your friendship networks, your families, etcetera, your beliefs about many different things. The joke we had back in George W. Bush’s days, I think Michael Berube was the first person who made this joke, but the joke was, “Well, yeah, I was a life-long Democrat but then 9/11 happened, and now I’m outraged about Chappaquiddick.” The point is, for those of you young people out here, Chappaquiddick was this scandal where Ted Kennedy was in an automobile accident with Mary Jo Kopechne, a woman who was in the car with him; the car plunged into the water and she drowned, while he was able to swim to shore and survived, obviously, and continued in the Senate. And Republicans were outraged, this was like a terrible thing, and Democrats made excuses for it.

1:33:57.7 SC: And the joke being that once you change your tribal political affiliation, your opinion about this historical event changes along with it, because these are connected to each other. And so, I wanna mention this in the opposite way also, so not just that all of these other opinions will change along with you if you do change your mind about something, but that in order to get someone to change their mind, you have to make it seem reasonable for them to live in a whole other world, right? For them to live in a world where a whole set of beliefs are no longer taken for granted in a certain way. That’s what is meant by offering a soft landing.

Anthony Pinn

1:34:33.9 SC: One of the very first podcasts I did was with Tony Pinn, an atheist theologian who reaches out to black communities and tries to spread the good word of atheism to them. And one of the points he made over and over again is that black people are very religious in part because atheism does not provide them with a soft landing. You can make a rational argument that God doesn’t exist, but they need to figure out a way to live their lives, and in the lives of many black communities religion plays an important role, and if you simply say, “Well, we’re not gonna replace that role, you gotta learn to live with it,” then they’re not gonna be persuaded to go along with you. So part of persuading the other side and reaching out to it is making them feel welcome. And again, I get it if this seems hard to do, if you just want these people to be punished and feel they don’t deserve it, etcetera, etcetera, I get that, but that’s gonna make living in a democracy harder for all of us, if that’s the attitude we all take.

The key takeaway makes the third point here the one to think about the most:



2018-12-07

So this is why so many bosses are jerks, and other depressing thoughts for the day


by Neil Godfrey

From The bad news on human nature, in 10 findings from psychology by Christian Jarrett

A few excerpts:

We favour ineffective leaders with psychopathic traits. The American personality psychologist Dan McAdams recently concluded that the US President Donald Trump’s overt aggression and insults have a ‘primal appeal’, and that his ‘incendiary Tweets’ are like the ‘charging displays’ of an alpha male chimp, ‘designed to intimidate’. If McAdams’s assessment is true, it would fit into a wider pattern – the finding that psychopathic traits are more common than average among leaders. Take the survey of financial leaders in New York that found they scored highly on psychopathic traits but lower than average in emotional intelligence. A meta-analysis published this summer concluded that there is indeed a modest but significant link between higher trait psychopathy and gaining leadership positions, which is important since psychopathy also correlates with poorer leadership.

Another one of the ten says we are moral hypocrites. I know that’s true. I’m one myself. I like to think I’m a vegetarian for ethical reasons but I continue to eat fish.

This one is so depressing. I have spent most of my adult life believing in the power of education, only to learn it probably only has an effect on those who want to be better anyway.

We are blinkered and dogmatic. If people were rational and open-minded, then the straightforward way to correct someone’s false beliefs would be to present them with some relevant facts. However, a classic study from 1979 showed the futility of this approach – participants who believed strongly for or against the death penalty completely ignored facts that undermined their position, actually doubling down on their initial view. This seems to occur in part because we see opposing facts as undermining our sense of identity. It doesn’t help that many of us are overconfident about how much we understand things and that, when we believe our opinions are superior to others, this deters us from seeking out further relevant knowledge.

And do be careful not to tread on any ants from now on because they have feelings too, you know … Bee-brained (Are insects ‘philosophical zombies’ with no inner life? Close attention to their behaviours and moods suggests otherwise).

And if you thought things really are getting worse it’s not simply concept creep either. The world really is going the way of the tediously saintly young. So says Matt Ridley whose books I once found happily enlightening.

That’s enough wallowing in misery for one weekend.



2018-07-28

Concept Creep


by Neil Godfrey

When I hear of how Russia “attacked” the USA in the 2016 elections, and when I hear of verbal abuse being labeled a form of “violence”, and when I think a purple dot is blue because fewer blue dots have been appearing lately, then I think of “concept creep”. And then I recall how many of those of us leaving cults or other extreme fundamentalist churches were said to be experiencing the same disorder as returning soldiers with war experiences, “post traumatic stress disorder”.

From The Conversation: When Your Brain Never Runs Out of Problems….

Concepts that refer to the negative aspects of human experience and behavior have expanded their meanings so that they now encompass a much broader range of phenomena than before. This expansion takes “horizontal” and “vertical” forms: concepts extend outward to capture qualitatively new phenomena and downward to capture quantitatively less extreme phenomena. The concepts of abuse, bullying, trauma, mental disorder, addiction, and prejudice are examined to illustrate these historical changes. In each case, the concept’s boundary has stretched and its meaning has dilated. A variety of explanations for this pattern of “concept creep” are considered and its implications are explored. I contend that the expansion primarily reflects an ever-increasing sensitivity to harm, reflecting a liberal moral agenda. Its implications are ambivalent, however. Although conceptual change is inevitable and often well motivated, concept creep runs the risk of pathologizing everyday experience and encouraging a sense of virtuous but impotent victimhood.

Why do some social problems seem so intractable? In a series of experiments, we show that people often respond to decreases in the prevalence of a stimulus by expanding their concept of it. When blue dots became rare, participants began to see purple dots as blue; when threatening faces became rare, participants began to see neutral faces as threatening; and when unethical requests became rare, participants began to see innocuous requests as unethical. This “prevalence-induced concept change” occurred even when participants were forewarned about it and even when they were instructed and paid to resist it. Social problems may seem intractable in part because reductions in their prevalence lead people to see more of them.

  • Levari, David E., Daniel T. Gilbert, Timothy D. Wilson, Beau Sievers, David M. Amodio, and Thalia Wheatley. 2018. “Prevalence-Induced Concept Change in Human Judgment.” Science 360 (6396): 1465–67. https://doi.org/10.1126/science.aap8731.
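The “blue dots” finding above can be caricatured with a toy model. The sketch below is my own invented illustration, not the authors’ experimental design: a judge classifies a dot as blue whenever its hue exceeds the average of the hues it has seen recently, so the criterion drifts as the stimulus mix changes:

```python
from collections import deque

class RelativeJudge:
    """Toy judge: calls a dot 'blue' if its hue exceeds the mean
    of the last `window` hues seen (a relative criterion)."""
    def __init__(self, window=50):
        self.recent = deque(maxlen=window)

    def judge(self, hue):
        baseline = sum(self.recent) / len(self.recent) if self.recent else 0.5
        self.recent.append(hue)
        return hue > baseline

judge = RelativeJudge()

# Phase 1: blue dots (hue 0.8) are common, mixed with borderline purple.
for _ in range(100):
    judge.judge(0.8)
    judge.judge(0.45)
# Against a high baseline, hue 0.45 reads as purple.
early_call = judge.judge(0.45)

# Phase 2: blue dots become rare; the judge sees mostly purple.
for _ in range(100):
    judge.judge(0.45)
# The baseline has drifted down, so a barely-bluer purple now reads as blue.
late_call = judge.judge(0.46)

print(early_call, late_call)
```

Because the criterion is relative rather than fixed, the baseline sags when blue becomes rare and near-purple hues start passing the test; a fixed-threshold judge (say, hue > 0.5) would never show this drift. That is the flavour of “prevalence-induced concept change,” though the real experiments are of course far more careful.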



2011-12-11

A thinking cap that really works – & why the most educated can often be the most closed minded


by Neil Godfrey


It’s not what you don’t know that’s the problem — it’s what you do know! And the more you know the bigger the problem! Here’s the hypothesis that was tested — with a genuine “thinking cap” — and reported back in February this year:

The more we know, the more close-minded we are; in other words, the better informed we become, the less intuitive it is to “think outside the box”.

Check this science news archive, or, if you’d rather just listen, advance the sound recording to about the 4th minute of the “Listen Now” or “Download” audio and start listening.


2010-07-22

All those Arab “God” phrases (May Allah protect you) – What they really mean


by Neil Godfrey

There’s an interesting and amusing Guardian article by Marie Dhumières on how easy it is to make gaffes and to make Arabs look like religious geeks — even if they are atheistic communists.

It’s titled Bad Translation Makes Fundamentalists Of Us All. It begins:

Religious phrases are scattered liberally throughout Arabic languages. The secret to translating is not to take them literally.

Examples:

“Praise be to God” (Alhamdulilah), which can mean “I am fine”, “Cool, the electricity is back” or “Ah, you finally managed to pronounce this word”, and so many other things.

In Lebanon, they even use “May God dress you” when seeing a hot girl wearing a skirt or a top, meaning I guess, “Please God, quickly cover this great body before I jump on it.”

The same goes with insults: May God destroy your house, May God burn your religion, May God infect you with disease… It all sounds very scary, but be reassured, they don’t really mean it. And I am pretty sure that if God were actually to destroy your house at the moment they say it, they would feel kind of bad.

It’s all an enjoyable and informative read. And the comments at the Information Clearing House site are worth reading, too, for maybe a bit of balance to the article itself.

Check it out: http://www.informationclearinghouse.info/article25973


2010-07-15

How Facts Backfire (Why facts don’t change people’s opinions)


by Neil Godfrey


How Facts Backfire is an article by Joe Keohane in the Boston Globe discussing a major threat to democracy. I select only those portions of this article that have more general relevance, and that can just as well apply to scholarly debates, and the mythicist-historicist arguments in particular.

Here’s the first startling passage (with my emphasis etc):

Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

And then there is this:

[People] aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

And this:

Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information.

And this

A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. . . . the ones who were the most confident they were right were by and large the ones who knew the least about the topic. . . . Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

Is Self-Esteem a factor?

Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

What of the most highly educated?

A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.