David McRaney, in his 2022 book "How Minds Change", reviews our understanding of why people form tenacious beliefs that resist change, fueling political polarization, conspiracy theories, hate groups, cults, anti-vaccine movements, climate change denialism, and so on.
I have discussed a lot of this material in some of my previous posts. A big focus in McRaney's book is on what strategies are most effective to help with these problems. He shows that simply presenting facts to a person with entrenched beliefs is usually ineffective, and can even cause the person to become more entrenched in their beliefs. Instead, he discusses several techniques which have much better success. These techniques are to some degree common-sensical, and are foundations of what might be found in any compassionate interaction, or any psychotherapy scenario.
He discusses several such strategies, including deep canvassing, the elaboration likelihood model (ELM), street epistemology, and motivational interviewing. All of these are similar; I'll summarize the core features here:
1) Establish rapport and empathize. The communicator must seem trustworthy, credible, respectful, and reliable. Obtain consent to talk about the issues at hand.
2) Ask how strongly the person feels about a particular issue; repeat back and clarify; have them identify a confidence level, such as from 0 to 10; ask how they chose that number; ask how they have judged the quality of their reasons for that choice; then summarize, and check that your summary is accurate.
3) If core values are influencing the person's opinion, such as the importance of family, community, safety for children, freedom, or loyalty, be sure to empathize with, acknowledge, and affirm these. If you hold core values in common, emphasize the commonality.
4) If their confidence level was not at an extreme (0 or 10), ask why not.
5) Ask if there was a time in their life before they felt this way about the issue, and if so, what led to the change.
6) Share a story about someone affected by the issue.
7) Share a personal story about why and how you reached your own position, but do not argue.
8) Ask for their rating again, then wrap up and wish the person well, possibly with an invitation to talk again.
Notably, these techniques do not involve arguing about facts, such as about scientific data. A person holding strong entrenched beliefs may consider contrary facts or data to be false, biased, or irrelevant. They may feel they would be betraying their ingroup or their sacred values if they changed their position. Yet elsewhere in the book there is an emphasis on facts as well; the point is that information needs to reach a tipping-point frequency within the person's ingroup, beyond which the group opinion starts to change suddenly. Below that level, facts are easily dismissed, ignored, or even ironically used to consolidate previous beliefs, while the fact-provider is labeled a misguided or even evil outsider.
In some of McRaney's examples, he shows, as I have discussed before, that strong ingroups can be the main factor causing resistance to rational changes in belief, even when the ingroup's beliefs are causing great harm to its members and are contrary to their core values (the anti-vax movement is an example). He points out that sometimes people need to leave these ingroups for other reasons before they become amenable to changing their beliefs; exiting the ingroup sometimes has to happen first, though this is often unlikely. To make it easier for members to leave, a kind, respectful, compassionate approach is needed. If we show only anger and hostility toward these ingroups, the members are more likely to rally together, as if defending themselves from an enemy attack.
McRaney notes that many practitioners of techniques such as deep canvassing have recorded video examples of their conversations, to help others learn and to allow constructive feedback about the technique. These would be worth checking out online, to see people working in this area lead a successful conversation toward positive change. Otherwise, as with so many other techniques in health care or in life, we are stuck just reading about an idea, rather than practicing and learning "hands on" with the guidance and feedback of others.
The one critique I have of McRaney's book is that it leaves out many research leaders in the psychology of conspiracy theories, cults, and persuasion. Cialdini's work from decades ago is never mentioned, nor are psychologists such as Sander van der Linden. These other researchers have suggested additional techniques, including a "fake news inoculation" approach, in which you learn and practice ways to protect yourself from misleading information. See the website https://www.getbadnews.com/books/english/
Also, the book does not discuss individual variation as a factor affecting tenacity of belief, propensity toward conspiracy beliefs, resistance to fact-based arguments, etc. In my previous post (https://garthkroeker.blogspot.com/2021/09/conspiracy-theories-vaccine-hesitancy.html?m=1) I discuss factors such as past trauma and personality disorders, which could cause an individual to hold more rigid, harmful, or false beliefs. The conversational approach would need to vary to account for these individual differences. It may also be valuable to focus persuasive efforts on those within a strong ingroup who are most ambivalent or amenable to change.