Center for Strategic Communication

[ by Charles Cameron — on human obstinacy, a change of heart, and what seems to me a major piece from Res Militaris ]

There’s a pattern of backlash that occurs when you present people with facts that don’t fit their preconceptions — they don’t switch, they double down. Here’s the opening of io9‘s report, The Backfire Effect shows why you can’t use facts to win an argument:

“Never let the facts get in the way of a good story” isn’t just a maxim for shady politicians and journalists. It’s also the way people often live their lives. One study indicates that there may even be a “backfire effect,” which happens when you show people facts that contradict their opinions.

Then there’s a study — Brendan Nyhan and Jason Reifler, When Corrections Fail: The persistence of political misperceptions. I won’t go into the details — it’s the pattern it finds that’s of interest to me — but I will note that the title is a tip of the hat to Leon Festinger‘s When Prophecy Fails, a classic study of the same pattern of denial as it applied to a group whose belief in an end-time prophecy was not shattered when the day arrived and the world went on as usual…

Here’s how the pattern works:

Participants in the experiments were more likely to experience the Backfire Effect when they sensed that the contradictory information had come from a source that was hostile to their political views. But under a lot of conditions, the mere existence of contradictory facts made people more sure of themselves — or made them claim to be more sure.

Everyone has experienced the frustration of bringing up pertinent facts, in the middle of an argument, and having those facts disregarded. Perhaps the big mistake was not arguing, but bringing up facts in the first place.

Okay? That’s a veeery interesting pattern to think about any time you’re considering ways to persuade people to change their minds during, for instance, a CVE campaign.

I’d like to dig into it a great deal more, of course.

**

Maajid Nawaz, a former recruiter for Hizb ut-Tahrir who renounced his membership and is now Chairman of the counter-extremist Quilliam Foundation, seems to have persuaded Tommy Robinson, until recently a leader of the English Defence League, to renounce the EDL and join Quilliam — a move whose results and second-order effects have yet to be seen. Both men, however, offer us examples of people who have in fact changed their minds on matters of profound belief, religious and political, and the odd uncomfortable fact may have played some role in those changes.

The role of anomalies (cf. “outliers”) in Kuhn‘s The Structure of Scientific Revolutions comes to mind.

And if showing people the error of their ways (a very loose equivalent of telling them unwelcome facts, I’ll admit) doesn’t generally work, here’s another anomaly I ran across only yesterday, one that “proves the rule” by, well, partially disproving it.

Dutch ex-politician Arnoud van Doorn, previously a senior member of Geert Wilders‘ fiercely anti-Islamic party, has changed his mind — or his heart was changed for him, within him, depending on your perspective. He has made the Shahada and is henceforth Muslim himself. In this photo, van Doorn is performing the Hajj, the pilgrimage to circumambulate the Kaaba in Mecca:

Do I detect a hint of enantiodromia here?

**

In closing, I would like to offer this link to an article in Res Militaris by Jean Baechler, titled Outlines of a psychology of war. It’s a weighty piece, as befits its grand sweep, and I believe it throws some light on the obstinacies of the mind to which this post is addressed.

I tried excerpting it, but it appeared to me that each sentence in every paragraph in turn begged to be highlighted, approved, tweaked, questioned, or disagreed with, and I wound up feeling you should read it for yourselves. I’ll be very interested to see if it captures the attention of the ZP readership, and leads to a more extended discussion…
