I think most of us can relate to this point:
There are many reasons why we find pseudoscience persuasive, according to Dr Micah Goldwater, a cognitive scientist at the University of Sydney.
It is often simpler for us to add knowledge than subtract it, he said.
“It’s actually much easier to add things to your mental model of how things work than to take things away.”
That quote, and the ones that follow, are from:
- Bogle, Ariel. 2020. “Why We Fall for Online Pseudoscience – and How to Stop.” ABC News. March 8, 2020. https://www.abc.net.au/news/science/2020-03-08/the-goop-effect-why-online-pseudoscience-is-so-seductive/12008148.
This principle sounds like what happens when we have a gut rejection of a new idea on first hearing it, simply because it does not fit our current framework of understanding.
If we are already convinced, or suspect, that something is true, then we are likely to be partial to any thought that reinforces our leanings. It’s more satisfying to “find answers” or explanations for what we already understand about how things are than it is to keep finding reasons to knock what we think we know out of left field. Think confirmation bias. Do we not naturally prefer to find ways to fill our cup rather than look for excuses to keep spilling its contents?
There’s also the ‘illusory truth effect’, where the more familiar something sounds, the more likely you are to believe it’s true.
As a rule we tend to prefer our “own media” to that of a foreign country. Does not an American prefer to believe the New York Times or Fox News over the China Daily? If one spends a lot of time listening to conspiracy theories, then any subsequent suggestion that something behind a government statement is problematic or unclear will be interpreted in a way that reinforces the conspiracy theory. And if one is brought up in a fundamentalist Christian household and community, then one is surely more likely to believe anything that reinforces what one has been taught all one’s life about one’s faith, and to be suspicious of contradictory ideas.
As a journalist who reports on online misinformation, I’ve spent plenty of time in anti-vaccination Facebook groups or in internet forums that suggest herbal remedies protect against the coronavirus.
In those groups, it’s easy to observe the seductive nature of personal stories. A friend’s nephew whose case of the measles was cured by tea tree oil is more engaging than a dozen dry public health announcements.
The personal anecdote. The personal drama. It will always have an emotional appeal that mere dry statistics tend to lack.
One study conducted by Dr Goldwater, which has so far been presented at a conference, attempted to understand the power of positive and negative anecdotes.
Participants in the study were assessed on how stories about the impact of medical treatments on ‘Jamie’ (a fictional person) affected whether they would use the same treatment.
Even though they were told that the treatment worked for most people, knowing one negative personal story — about Jamie’s symptoms failing to improve — often made study participants report that they would not want to take that treatment.
“When you are affected by an anecdote, what you are potentially doing is generalising from a single case to your life,” he said. “But it’s possible that it just made you feel icky [about the treatment].”
The appeal of the personal anecdote is probably also why we like to indulge in, and be swayed by, the ad hominem attack on someone saying something we don’t favour. I guess the personal anecdote’s power is also why it features so prominently in evangelical efforts. And it’s not just in the world of religion, either.
Humans like explanations that help them predict how the world works.
We are constantly thinking about cause and effect. That can lead us to a bias called ‘illusory causation’, where we interpret a causal effect when there really isn’t one.
If you take herbal medication for a cold and then get better two days later, you might assume the medicine did the trick.
“The bias people have is they don’t think, ‘wait, what would have happened if I didn’t take that herbal medication?’,” Dr Goldwater said.
“Well, you probably would have gotten better in two days just the same.”
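Dr Goldwater’s point about the missing counterfactual can be made concrete with a toy simulation. The sketch below is mine, not his, and the numbers are purely illustrative: it builds a world in which the remedy does nothing by construction, yet anyone who takes it and feels better within two days sees a tempting cause and effect.

```python
import random

# Illustrative sketch of illusory causation. Assumption (not from the
# article): a cold resolves on its own in 1 to 4 days, remedy or not.
random.seed(42)

def days_to_recover():
    # Natural course of the cold; the remedy has no effect by construction.
    return random.randint(1, 4)

trials = 100_000
took_remedy = [days_to_recover() for _ in range(trials)]
no_remedy = [days_to_recover() for _ in range(trials)]

# Fraction of each group who feel better within two days.
better_with = sum(d <= 2 for d in took_remedy) / trials
better_without = sum(d <= 2 for d in no_remedy) / trials

print(f"Recovered within 2 days, with remedy:    {better_with:.1%}")
print(f"Recovered within 2 days, without remedy: {better_without:.1%}")
# Both figures come out at roughly 50%. Those who took the remedy and
# recovered quickly would credit it, but the untreated group did just
# as well -- which is exactly the "what would have happened if I didn't
# take it?" question the bias stops us from asking.
```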
The antidote:
“If you are constantly sceptical of your own thinking, that is potentially the best way to vet yourself,” he said.
But this is a struggle, even if it’s your life’s work.
I try. Or at least I like to think I try.
Neil Godfrey