Tag Archives: Psychology

So this is why so many bosses are jerks, and other depressing thoughts for the day

From The bad news on human nature, in 10 findings from psychology by Christian Jarrett

A few excerpts:

We favour ineffective leaders with psychopathic traits. The American personality psychologist Dan McAdams recently concluded that the US President Donald Trump’s overt aggression and insults have a ‘primal appeal’, and that his ‘incendiary Tweets’ are like the ‘charging displays’ of an alpha male chimp, ‘designed to intimidate’. If McAdams’s assessment is true, it would fit into a wider pattern – the finding that psychopathic traits are more common than average among leaders. Take the survey of financial leaders in New York that found they scored highly on psychopathic traits but lower than average in emotional intelligence. A meta-analysis published this summer concluded that there is indeed a modest but significant link between higher trait psychopathy and gaining leadership positions, which is important since psychopathy also correlates with poorer leadership.

Another one of the ten says we are moral hypocrites. I know that’s true. I’m one myself. I like to think I’m a vegetarian for ethical reasons but I continue to eat fish.

This one is so depressing. I have spent most of my adult life believing in the power of education, only to learn that it probably has an effect only on those who want to be better anyway.

We are blinkered and dogmatic. If people were rational and open-minded, then the straightforward way to correct someone’s false beliefs would be to present them with some relevant facts. However, a classic study from 1979 showed the futility of this approach – participants who believed strongly for or against the death penalty completely ignored facts that undermined their position, actually doubling down on their initial view. This seems to occur in part because we see opposing facts as undermining our sense of identity. It doesn’t help that many of us are overconfident about how much we understand things and that, when we believe our opinions are superior to others’, this deters us from seeking out further relevant knowledge.

And do be careful not to tread on any ants from now on, because they have feelings too, you know … Bee-brained (Are insects ‘philosophical zombies’ with no inner life? Close attention to their behaviours and moods suggests otherwise).

And if you thought things really are getting worse, it’s not simply concept creep either. The world really is going the way of the tediously saintly young. So says Matt Ridley, whose books I once found happily enlightening.

That’s enough wallowing in misery for one weekend.


Concept Creep

When I hear of how Russia “attacked” the USA in the 2016 elections, and when I hear of verbal abuse being labeled a form of “violence”, and when I think a purple dot is blue because fewer blue dots have been appearing lately, then I think of “concept creep”. And then I recall how many of those of us leaving cults or other extreme fundamentalist churches were said to be experiencing the same disorder as returning soldiers with war experiences, “post-traumatic stress disorder”.

From The Conversation: When Your Brain Never Runs Out of Problems …

Concepts that refer to the negative aspects of human experience and behavior have expanded their meanings so that they now encompass a much broader range of phenomena than before. This expansion takes “horizontal” and “vertical” forms: concepts extend outward to capture qualitatively new phenomena and downward to capture quantitatively less extreme phenomena. The concepts of abuse, bullying, trauma, mental disorder, addiction, and prejudice are examined to illustrate these historical changes. In each case, the concept’s boundary has stretched and its meaning has dilated. A variety of explanations for this pattern of “concept creep” are considered and its implications are explored. I contend that the expansion primarily reflects an ever-increasing sensitivity to harm, reflecting a liberal moral agenda. Its implications are ambivalent, however. Although conceptual change is inevitable and often well motivated, concept creep runs the risk of pathologizing everyday experience and encouraging a sense of virtuous but impotent victimhood.

Why do some social problems seem so intractable? In a series of experiments, we show that people often respond to decreases in the prevalence of a stimulus by expanding their concept of it. When blue dots became rare, participants began to see purple dots as blue; when threatening faces became rare, participants began to see neutral faces as threatening; and when unethical requests became rare, participants began to see innocuous requests as unethical. This “prevalence-induced concept change” occurred even when participants were forewarned about it and even when they were instructed and paid to resist it. Social problems may seem intractable in part because reductions in their prevalence lead people to see more of them.

  • Levari, David E., Daniel T. Gilbert, Timothy D. Wilson, Beau Sievers, David M. Amodio, and Thalia Wheatley. 2018. “Prevalence-Induced Concept Change in Human Judgment.” Science 360 (6396): 1465–67. https://doi.org/10.1126/science.aap8731.


A thinking cap that really works – & why the most educated can often be the most closed minded


It’s not what you don’t know that’s the problem — it’s what you do know! And the more you know, the bigger the problem! Here’s the hypothesis that was tested — with a genuine “thinking cap” — and reported back in February this year:

The more we know, the more close-minded we are; in other words, the better informed we become, the less intuitive it is to “think outside the box”.

Check this science news archive, or if you’d rather just listen, advance the sound recording here to about the fourth minute of the “Listen Now” or “Download” audio and start listening.

All those Arab “God” phrases (May Allah protect you) – What they really mean

There’s an interesting and amusing Guardian article by Marie Dhumières on how easy it is to make gaffes and to make Arabs look like religious geeks — even if they are atheistic communists.

It’s titled Bad Translation Makes Fundamentalists Of Us All. It begins:

Religious phrases are scattered liberally throughout Arabic languages. The secret to translating is not to take them literally.

Examples:

“Praise be to God” (Alhamdulilah), which can mean “I am fine”, “Cool, the electricity is back” or “Ah, you finally managed to pronounce this word”, and so many other things.

In Lebanon, they even use “May God dress you” when seeing a hot girl wearing a skirt or a top, meaning I guess, “Please God, quickly cover this great body before I jump on it.”

The same goes with insults: May God destroy your house, May God burn your religion, May God infect you with disease… It all sounds very scary, but be reassured, they don’t really mean it. And I am pretty sure that if God were actually to destroy your house at the moment they say it, they would feel kind of bad.

It’s all an enjoyable and informative read. And the comments at the informationclearinghouse.info site are worth reading, too, for maybe a bit of balance to the article itself.

Check it out http://www.informationclearinghouse.info/article25973

How Facts Backfire (Why facts don’t change people’s opinions)


How Facts Backfire is an article by Joe Keohane in the Boston Globe discussing a major threat to democracy. I select only those portions of this article that have more general relevance, and that can just as well apply to scholarly debates, and the mythicist-historicist arguments in particular.

Here’s the first startling passage (with my emphasis etc):

Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

And then there is this:

[People] aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

And this:

Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we chose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information.

And this:

A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. . . . the ones who were the most confident they were right were by and large the ones who knew the least about the topic. . . . Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

Is Self-Esteem a factor?

Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

What of the most highly educated?

A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.