
Friday, August 4, 2023

"The Power of Us" by Jay Van Bavel & Dominic Packer: a recommendation, review, and applications in psychiatry

 Jay Van Bavel and Dominic Packer are social psychologists whose recent book, The Power of Us, is a nice review of basic social psychology with a unique emphasis on the impact of identity and group affiliation on human behaviour and cognitive biases.  

This book would be an excellent accompaniment to The Righteous Mind, by Jonathan Haidt, and Blueprint, by Nicholas Christakis.   Haidt looks at individual differences in values as a factor affecting group behaviour.  For example, people who value loyalty and "purity" (as opposed to "compassion" or "fairness") as cardinal values may be more likely to have strong group adherence, and may be more accepting of hierarchical or paternalistic systems; such traits could lead in particular to involvement with conservative groups.   Haidt argues (and I strongly agree) that such values and traits have a strong hereditary basis (though are also partly influenced by environment & cultural milieu) and have evolved in humans due to selective advantages for those who have a strong inclination towards group affiliation.  But of course, too much loyalty can be a bad thing, if it causes people to adhere loyally to groups which are engaging in harmful behaviour--we see this problem in the news every day.    Christakis looks at group dynamics in an interesting mathematical way, with successful or unsuccessful group behaviour influenced by the structure of connectedness, which in turn is influenced by leadership styles, external factors,  and individual personality traits.  

Group affiliation and identity, with their associated biases, polarization, and conflict, are incredibly important subjects in the world today.  Group-based divisions are arguably a primary cause of political strife and war across the world, and lead to delays and inefficiencies in solving global problems such as poverty and environmental degradation.  On the positive side, strong group allegiance has led to most of humanity's great achievements through history.  Most great accomplishments in the sciences, the arts, politics, and law involve large-scale collaboration.

Group affiliation is a powerful source of identity for all of us.  If we have a strong attachment to a group, we are likely to favour ingroup members.  This is normal and ubiquitous, but in extreme cases it can lead to hating or persecuting outgroup members.  To prevent this, it can be helpful to have a culture of interacting respectfully, collaboratively, or recreationally with outgroup members (Jonathan Haidt made this point years ago, in The Righteous Mind).  It is especially effective if such recreational activity blends members from different groups.  The authors cite some very successful examples of these ideas, such as a soccer league in Iraq where each team was required to include players assigned equally from different conflicting religious groups.  The resulting games allowed each player, and each team, to like, respect, and enjoy outgroup members, since they became teammates, leading to reduced conflict in their communities afterwards.  A famous example from classic social psychology research is the "Robbers Cave" experiment from the 1950s, in which antagonistic groups of boys at a summer camp later worked together in friendship and harmony once they had to collaborate to solve a problem external to them both.

The chapter on "fostering dissent" is especially insightful.  The authors make the point that voicing a dissenting opinion within a group is socially costly.  Even if the dissent is about an important logical or moral issue, the risk of dissenting can be to make other group members angry, and therefore threaten one's position as a group member.  You risk being seen as disloyal or disrespectful.  They argue that you have to really care about your group to be willing to voice dissent.  I see this could often be true, but sometimes particular individuals are more oppositional or defiant, due to character traits, leading to frequent dissent even if they don't particularly care about their group status.  Another problem with dissent is that other group members may have quietly agreed with the dissenter's position, but it could be costly for them to endorse the dissent, since it could make them look bad or immoral for not having brought it up first.  So a default position in groups would be to maintain the status quo, and for dissent to be risky, even if the group is engaging in harmful behaviours or beliefs.   Unfortunately, this can cause harmful behaviour to be perpetuated in some groups, and for dissenters to be punished or ostracized.  Recent examples of this include U.S. politician Liz Cheney, who has spoken out against the deeply immoral behaviour in the leadership of her political party.  Unfortunately, she was defeated in the subsequent election.  While she should be seen as someone defending the honour, integrity, and values of her group, therefore protecting the group's long-term interests, she instead has been seen by her own ingroup members as disloyal, and punished for it.  I hope her own story is not over, and that her principled behaviour may prevail in the end.  

One approach to the dissent problem is to have a leadership structure or ethos in groups which encourages respectful disagreement, without fear of punishment or other consequences.  It is also vitally important, as a persuasive factor, to frame dissent or challenge with the group's long-term well-being in mind: to remind others of the group's core values and long-term interests, so that a dissenting view is intended as a service to the group rather than merely a criticism.

On a larger scale, I think it is always helpful to expand the circle of our groups.  Instead of focusing on local, national, religious, or political allegiances, why not focus on a shared humanity?  One of the guiding insights of many of the world's religions, such as Christianity, was to expand the circle of love, respect, and inclusion to outgroup members, rather than to shrink into insular, bitter enclaves judgmental of others outside their own ranks.

Psychiatric issues always exist in a social context.  Patients will always have group allegiances or identities.  These could involve religion, politics, gender, race, family, occupation, etc.  It is important to understand these group allegiances, empathize with them, and communicate therapeutic ideas with the group allegiances in mind.   Encouragement or advice for change carries a high risk of failing if it is expressed in such a way as to challenge a person's individual or group-based values.  A survey of group affiliation and identity factors should be an essential part of a psychiatric history, and an ongoing theme in a therapeutic dialogue.  



Friday, May 26, 2023

Foolproof, by Sander van der Linden: a recommendation, review, and analogy with psychotherapy

I strongly recommend a new book by Cambridge psychologist Sander van der Linden, entitled Foolproof: why misinformation infects our minds and how to build immunity.

I have followed van der Linden's research for several years, along with the work of other experts studying the psychology of persuasion, misinformation, and propaganda.  This area has been an interest of mine for many years, ever since discovering psychologists such as Cialdini and Kahneman.

This is a subject that everyone needs to learn about!  Persuasive techniques (for good and for bad) have always been with us through history; the power and influence of these techniques will only continue to escalate, thanks to the internet era, and now the era of artificial intelligence (AI).  

I have discussed these issues in other posts, such as:

Garth Kroeker: "GroupThink" (October 6, 2016)

Van der Linden reviews the history and scope of misinformation.  Among the many current examples are conspiracy theorists impacting public opinion and policy, political influencers attempting to sway elections, propagandists from other countries defending violent or oppressive policies or sowing discord among their opponents, and of course the anti-vaccine community.  

He introduces a couple of acronyms.  The first, CONSPIRE, can help us recognize some of the common features of conspiracy theories:

C = contradictory.  Most conspiracy theories feature contradictions.  For example, there could be a belief that some awful event is a hoax, but then also a belief that the awful event is real but was caused by evil conspirators.  

O = over-riding suspicion.    A sense of general distrust that goes beyond the topic of the conspiracy theory, particularly a distrust of official or mainstream explanations.  

N = nefarious plot.  A belief that there is a shadowy group of evildoers, such as government officials, corporations, or (at worst) a particular racial or ethnic group, who behind the scenes have caused some bad thing, perhaps with a motive to advance themselves.  

S = "something's wrong."  The belief that regardless of any acknowledged or corrected fact about an event, there's something going on that isn't right.  

P = persecuted individual.  The belief that someone is being deliberately harmed (most commonly, the believers in the conspiracy theory).  

I = immune to evidence.  Presentations of evidence often have little or no effect on the opinions of people holding conspiracy theory beliefs; in fact, evidence could even "backfire" and cause the conspiracy theorist to become even more entrenched, or to believe that you or your sources of evidence are all biased or part of the conspiracy.  Such immunity to evidence is common among people who have limited expertise or knowledge about science, but it can also be present in some highly educated people.  A conspiracy theorist who does have more scholarly expertise may understandably deploy statistical or psychological terminology to defend their beliefs; for example, by accusing other scholars of having psychological biases (such as confirmation bias).

Re = reinterpreting randomness.  This is creating a false causal story about random, unrelated events.   Humans in general are prone to doing this.   

It's interesting as a psychiatrist to reflect on the "CONSPIRE" factors above.  They are very often present in frank psychotic states, or in milder variants such as paranoid personality.  The tendency to have paranoid thoughts exists as a trait on a continuum in the population.  This trait has various environmental causes, but also has a high heritability.  It is a typical psychotic symptom to believe that there is a special, often ominous explanation behind random, unrelated events.

Of course, sometimes there are explanations for events which differ from the mainstream understanding.  Through history there have always been maverick scientists who demonstrated something new and important, despite the objections or condemnation of their peers.  One example that has always bothered me is Alfred Wegener, who in 1912 proposed the theory of continental drift; he was ridiculed and dismissed by his peers, who couldn't believe that entire continents could move across the face of the earth; Wegener tragically died before his theory was vindicated.  We have to be open to considering alternative theories.  However, maverick scientists, unlike conspiracy theorists, have clear evidence to support their claims; their reasoning does not contain contradictions; they are not immune to evidence, do not reinterpret randomness, and do not have ominous, over-riding suspicious beliefs about persecution.

Van der Linden's next acronym is "DEPICT", to help remember features of manipulative communication:
  
D - discrediting.  The manipulative communicator will portray experts who disagree with them (scientific leaders, or even entire institutions such as leading scientific journals) as biased, poorly qualified, incompetent, or as having some nefarious agenda.  It is frustrating to have a scientific debate with someone who is engaging in such discrediting, since any sound evidence you raise with them will be dismissed as invalid.

E - emotional.  Using strong emotional language to induce fear, anger, or disgust as a persuasive tool.  

P - polarization.  Framing issues, and the people who hold positions on these issues, in a "black or white" fashion rather than as shades of grey.  This creates a false dichotomy, and encourages the formation of teams of opponents holding increasingly extreme positions, with increasing disrespect for those who disagree.

I - impersonation.  Using fake experts to bolster a claim.   A variant of this is using an actual expert, but whose expertise has nothing to do with the issue at hand.  

C - conspiracy theories.  Encouraging conspiracy theory beliefs. 

T - trolling.  Attacking, insulting, or threatening opponents, usually in an online environment, such as on social media.  Such harassment has at times been so intense that scientists or policy experts (including in public health) have been afraid to speak out, fearing for their safety.  

Van der Linden's work focuses on how we can best deal with misinformation.  He concludes with an analogy:  misinformation must be dealt with by "immunizing" ourselves against it.  

To build immunity against an infectious disease, the immune system can be trained by exposure to a weakened version of the pathogen, so that future encounters with the real pathogen are dealt with quickly.

Infectious diseases are much easier to manage, with much less risk of harm or spread, by building immunity, rather than by only relying on treatment after infection.   

Similarly, it is much harder to "treat" misinformation after the fact.  Tactics to "treat" misinformation would include debate, education, and careful review of evidence.  But many people who have fallen into a misinformation "rabbit hole" are difficult to reach or persuade using reasoned debate.  Such debate may even cause the misinformed person to become angrier or more stubbornly attached to their ideas.

It is better to prevent people from falling into the rabbit hole in the first place--not by eliminating rabbit holes (which is impossible) but by teaching people how to identify and manage rabbit holes if they encounter them.  

The idea of "vaccination" is presented as an analogy throughout the book.  But beliefs and persuasion are not exactly like the body's immune system.  It's a very good analogy, but not perfect.  Much of the phenomenon van der Linden is talking about is explainable through learning theory:   we learn much better if we actually practice "hands on" with things, rather than just passively absorbing theory.  If you want to learn mathematics, you actually have to work through a lot of problems, not just read about how to do them.  If you want to learn how to ride a bike or drive a car, you have to practice cycling and driving, not just read about those things in a book!  As part of the practice, it is best to face challenging situations, and learn through experience how to overcome them.  

Similarly, to deal with emergencies, it is imperative to do behavioural practice many times as a preparation.  We have to do fire drills to prepare for a potential fire.  Pilots need to practice many times in a simulator how to manage engine failure.  If you only read about something, or learn about something, without practicing, you can't possibly become proficient, especially under pressure.  

To deal with misinformation, we have to practice, hands-on, dealing with misinformation, at first with "easy" examples, then more and more difficult ones.  

Applying these ideas to psychotherapy: CBT (cognitive-behavioural therapy) is very important and useful, but at worst it can be too passive.  Many people engaging in CBT do a lot of passive learning: they do written exercises in a workbook, but do not really practice deliberate exposure to uncomfortable stimuli.  The "vaccine" analogy could usefully be incorporated into CBT for treating depression or anxiety.  This is something I have advocated for many years, mainly as an emphasis on the "B" part of CBT.  To deal with panic attacks, it is most helpful to actually practice having panic attacks, in safe, controlled conditions!  To deal with depressive thoughts, it could be a useful exercise to invent simulated depressive thoughts, at first mild ones, then more challenging ones, to understand the mechanism by which they are created, and to practice facing them without being negatively affected.  This exposure therapy is like van der Linden's "vaccine."  But most therapists don't emphasize this enough; they only try to teach people to relax or cope with symptoms after they have occurred.  One of the purposes of talking about past emotional trauma is to recreate the painful events in the mind, but in a limited, controlled, "virtual" form, within the safe context of a therapy office.  In this way talking therapy has a vaccine-like effect.

Van der Linden's book is a must-read, not only for those interested in propaganda or misinformation, but for anyone wanting a better understanding of the mind itself, with ideas that touch upon managing almost any life adversity, including mental illnesses.

References: 


van der Linden, S. (2023). Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. W. W. Norton.


Wednesday, February 1, 2023

Why to get your COVID bivalent booster

The COVID vaccines have saved millions of lives, and spared millions more a frightening hospital or intensive care admission.  Many people may not realize that recovery from a COVID hospitalization is often incomplete: tissue damage from COVID pneumonia may not heal fully, and the psychological effects of respiratory failure should not be underestimated.  Severe respiratory failure (a terrifying, suffocating experience) can cause PTSD that affects people psychologically for years afterwards.  The vaccines have caused a huge reduction in such episodes of respiratory failure.

COVID vaccinations are not perfect, and their protective effect does diminish gradually with time, though it does not disappear entirely.  There are indeed rare cases of serious adverse effects, but these occur at a much lower rate than similar or worse adverse effects from COVID itself.  Also, vaccination reduces the probability of spreading the virus to other people, thereby multiplying the beneficial effects in the whole community.  Vaccination followed by a mild case of COVID a few months later likely adds robust protection compared to vaccination or infection alone.  But the most effective and safe protection is to have an updated bivalent COVID booster, particularly if your last dose of vaccine and any episode of COVID infection were more than 2-3 months ago.  Unfortunately, fewer people have had their boosters compared to previous vaccine doses, resulting in thousands of needless hospitalizations and deaths.

Anti-vaccine misinformation is widespread, with testimonial accounts from people claiming that the vaccines are harmful.  It is important to know that a bivalent booster will lead to a large reduction in risk of severe disease, hospitalization, ICU admission, and death.    Evidence to support this is very, very robust, and unfortunately has not been emphasized strongly enough in current public health information campaigns.  
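As a rough guide to what "effectiveness" means in studies like those cited below, here is a minimal sketch of the standard calculation, using purely hypothetical numbers of my own (not figures from any of the references): effectiveness against an outcome is the relative reduction in the rate of that outcome among vaccinated versus unvaccinated people.

```python
# Minimal sketch of how vaccine effectiveness (VE) is estimated from cohort rates.
# All numbers are hypothetical, chosen only to illustrate the formula;
# see the cited studies for actual estimates.

def vaccine_effectiveness(events_vax, person_time_vax, events_unvax, person_time_unvax):
    """VE = 1 - (outcome rate in vaccinated / outcome rate in unvaccinated)."""
    rate_vax = events_vax / person_time_vax
    rate_unvax = events_unvax / person_time_unvax
    return 1 - (rate_vax / rate_unvax)

# Hypothetical example: 5 hospitalizations per 100,000 person-weeks among boosted people
# versus 40 per 100,000 person-weeks among unvaccinated people.
ve = vaccine_effectiveness(5, 100_000, 40, 100_000)
print(f"Estimated effectiveness against hospitalization: {ve:.0%}")  # 88%
```

Real studies adjust for age, prior infection, and other confounders, which this sketch ignores.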

I encourage perusing the references below.  Aside from reading the studies and assessing the evidence for yourself, look up the authors and verify for yourself that these are incredibly experienced, well-educated researchers from major research centers, with no major biases or profit motives affecting their findings.  The research findings are corroborated by, and consistent with, the experience of ICU and infectious disease physicians, who in recent months have continued to see much more severe COVID disease and dangerously high hospital occupancy among those who are not up-to-date with their booster vaccinations.


The references below are a preliminary list; I encourage you to continue checking out other references I've included in my previous COVID-related posts.  


References

Watson, O. J., Barnsley, G., Toor, J., Hogan, A. B., Winskill, P., & Ghani, A. C. (2022). Global impact of the first year of COVID-19 vaccination: A mathematical modelling study. The Lancet Infectious Diseases, 22(9), 1293–1302. https://doi.org/10.1016/S1473-3099(22)00320-6

CDC. COVID Data Tracker.  Centers for Disease Control and Prevention. 
https://covid.cdc.gov/covid-data-tracker/#vaccine-effectiveness
https://covid.cdc.gov/covid-data-tracker/#rates-by-vaccine-status

Arbel, R., Peretz, A., Sergienko, R., Friger, M., Beckenstein, T., Yaron, S., Hammerman, A., Bilenko, N., & Netzer, D. (2023). Effectiveness of the Bivalent mRNA Vaccine in Preventing Severe COVID-19 Outcomes: An Observational Cohort Study (SSRN Scholarly Paper No. 4314067). https://doi.org/10.2139/ssrn.4314067

https://www.azdhs.gov/covid19/documents/data/rates-of-cov-19-by-vaccination.pdf?v=2023010

Lin, D.-Y., Xu, Y., Gu, Y., Zeng, D., Wheeler, B., Young, H., Moore, Z., & Sunny, S. K. (2023). Effectiveness of Vaccination and Previous Infection Against Omicron Infection and Severe Outcomes in Children Under 12 Years of Age (p. 2023.01.18.23284739). medRxiv. https://doi.org/10.1101/2023.01.18.23284739

Andersson, N. W., Thiesson, E. M., Baum, U., Pihlström, N., Starrfelt, J., Faksová, K., Poukka, E., Meijerink, H., Ljung, R., & Hviid, A. (2023). Comparative effectiveness of the bivalent BA.4-5 and BA.1 mRNA-booster vaccines in the Nordic countries (p. 2023.01.19.23284764). medRxiv. https://doi.org/10.1101/2023.01.19.23284764

Davydow, D. S., Gifford, J. M., Desai, S. V., Needham, D. M., & Bienvenu, O. J. (2008). Posttraumatic stress disorder in general intensive care unit survivors: A systematic review. General Hospital Psychiatry, 30(5), 421–434. https://doi.org/10.1016/j.genhosppsych.2008.05.006

Tenforde, M.W. et al. (2022). Early estimates of bivalent mRNA vaccine effectiveness in preventing COVID-19-associated emergency department or urgent care encounters and hospitalizations among immunocompetent adults. VISION Network, nine states, Sep-Nov 2022.  Morbidity and Mortality Weekly Report, 71(5152), 1616-1624. 




Thursday, August 18, 2022

How Minds Change by David McRaney: a book review and discussion

David McRaney, in his new book How Minds Change (2022), reviews our understanding of why people can form tenacious beliefs which are resistant to change, leading to political polarization, conspiracy theories, hate groups, cults, anti-vax movements, climate change denialism, etc.

I have discussed a lot of this material in some of my previous posts.  A big focus in McRaney's book is on which strategies are most effective to help with these problems.  He shows that simply presenting facts to a person with entrenched beliefs is usually ineffective, and can even cause the person to become more entrenched in their beliefs.  Instead, there are several techniques discussed which have much better success.  These techniques are to some degree common-sensical, and form the foundation of what might be found in any compassionate interaction or any psychotherapy scenario.

He discusses several such strategies, including deep canvassing, the elaboration likelihood model (ELM), street epistemology, and motivational interviewing.  All of these are similar.  I'll summarize the core features here:

1) Establish rapport.  Empathize.  The communicator must seem trustworthy, credible, respectful, and reliable.  Obtain consent to talk about the issues at hand.

2) Ask how strongly the person feels about a particular issue; repeat back and clarify; identify a confidence level, such as from 0 to 10; ask how they chose that number; ask how they've judged the quality of their reasons for their choice; summarize; make sure you've done a good job summarizing correctly.

3) If there are core values influencing the person's opinion, such as about the importance of family, community, safety for children, freedom, or loyalty, be sure to empathize with, acknowledge, and affirm these.  If there are core values in common, be sure to emphasize the commonality.

4) If their confidence level was not at an extreme (0 or 10), ask why not.

5) Ask if there was a time in their life before they felt this way about the issue, and if so, what led to the change.

6) Share a story about someone affected by the issue.

7) Share a personal story about why and how you reached your own position, but do not argue.

8) Ask for their rating again, then wrap up and wish the person well, possibly with an invitation to talk again.

Notably, these techniques do not involve arguing about facts, such as scientific data.  A person holding strongly entrenched beliefs may consider contrary facts or data to be false, biased, or irrelevant.  They may feel they are betraying their ingroup or their sacred values if they change their position.  Yet elsewhere in the book there is an emphasis on facts as well; it is just that there needs to be a tipping point of information frequency within the person's ingroup, beyond which the group opinion starts to change suddenly.  Below that level, facts are easily dismissed, ignored, or even ironically used to consolidate previous beliefs, while the fact-provider is labeled a misguided or even evil outsider.
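To make the tipping-point idea more concrete, here is a toy simulation of my own (not a model from McRaney's book), in the spirit of classic threshold models of collective behaviour: each group member changes their mind only once enough of their peers visibly have, so a group can absorb corrective information with no apparent effect until a critical fraction is reached, after which opinion shifts suddenly.

```python
import random

# Toy threshold model of opinion change within an ingroup (illustrative only).
# Each member adopts the new view only when the visible fraction of peers who
# have already adopted it exceeds that member's personal threshold.

def simulate(n_members=100, initial_seeds=5, seed=1):
    random.seed(seed)
    # a few committed dissenters (threshold 0) plus members with varying resistance
    thresholds = [0.0] * initial_seeds + [
        random.uniform(0.05, 0.6) for _ in range(n_members - initial_seeds)
    ]
    changed = 0.0
    history = []
    while True:
        new_changed = sum(1 for t in thresholds if t <= changed) / n_members
        if new_changed <= changed:
            break  # no further members are persuaded; the cascade has stopped
        changed = new_changed
        history.append(round(changed, 2))
    return history

# With too few initial dissenters the cascade stalls; past a tipping point it sweeps the group.
for seeds in (2, 5, 15):
    print(seeds, simulate(initial_seeds=seeds))
```

The exact numbers are arbitrary; the point is only that the same amount of dissent can have no visible effect below a threshold and a dramatic effect above it.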

In some of McRaney's examples, he shows, as I have discussed before, that strong ingroups can be the main factors causing resistance to rational changes in belief, even if the ingroup's beliefs are causing great harm to themselves and are contrary to their core values (the anti-vax movement is an example).  He points out that sometimes people need to leave these ingroups for other reasons, before they become amenable to changing their beliefs.  Exiting the ingroup sometimes needs to happen first.  But this can be unlikely to happen.  To facilitate ingroup members being able to leave, there would need to be a kind, respectful, compassionate approach.  If we only show anger and hostility to these ingroups, the members are more likely to rally together, as if protecting themselves from an enemy attack.  

McRaney notes that many practitioners of techniques such as deep canvassing have recorded video examples of their conversations, to help others learn and to allow constructive feedback about the technique.  I think these would be worth checking out online, to watch people working in this area lead a successful conversation toward positive change.  Otherwise, like with so many other techniques in health care or in life, we are stuck with just reading about an idea, rather than practicing and learning "hands on" with the guidance and feedback of others.

The one critique I have of McRaney's book is that he leaves out discussion of many research leaders in the psychology of conspiracy theories, cults, and persuasion.  Cialdini's work from decades ago is never mentioned, nor are psychologists such as Sander van der Linden.  These other researchers suggest some additional techniques, including a "fake news inoculation" technique, in which you can learn and practice ways to protect yourself from misleading information.  See the website https://www.getbadnews.com/books/english/

Also, the book does not discuss individual variation as a factor affecting tenacity of belief, propensity to conspiracy beliefs, resistance to fact-based arguments, etc.  In my previous post (https://garthkroeker.blogspot.com/2021/09/conspiracy-theories-vaccine-hesitancy.html?m=1) I discuss factors such as past trauma and personality disorders which could cause an individual to hold more rigid harmful or false beliefs.  There would need to be some variability in the approach to conversing with someone about these issues, given these individual differences.  It may be valuable to focus persuasive efforts on those within a strong ingroup who are most ambivalent or amenable to change.


 

Sunday, July 31, 2022

Medical School Admission Criteria: a discussion

 It is very difficult to get admitted to a medical school.   At UBC, only about 10% of applicants are accepted.   

This leads to extreme competition.  Students admitted to the program  have average university grades just under 90%, and average standardized test scores (from the MCAT) just under the 90th percentile.  

Even an average above 90% is no guarantee of admission: only 26% of applicants with such high grades are accepted.

Therefore, there are other factors which increase the likelihood of admission, aside from grades and standardized test performance.  These are so-called "extracurriculars" such as history of volunteering and "leadership activities," reference letters, and performance in a "multiple mini-interview," which involves responding in a desired fashion, within a time limit, to various hypothetical scenarios with a sequence of 10 different interviewers.  

I understand the need to have multiple criteria to judge applicants.  But I would like to make the case here that the current selection process is not particularly efficient or fair: it has a very strong bias against people with particular personality types, despite those people being very well-suited to become excellent physicians, and it also leads to years of unhealthy, expensive, and wasteful frenzied competition before medical school even starts.  There is also a bias in favour of people from wealthier families, since such people can more easily afford years of volunteering, cultural exploration, club leadership, MCAT prep courses, tutoring, etc., instead of having to work long hours for years near minimum wage to support educational expenses.

Imagine who the best future surgeons would be.  They would likely have excellent hand-eye coordination, tactile skills, mastery of fine details such as anatomy, immense patience with meticulous tasks, and the ability to remain calm and focused for long periods of time.  Some of them might not be "neurotypical."  Many of the most talented such people would not necessarily have great social skills, would not be inclined to volunteer at Big Brothers or at nursing homes, would not be on the executive of university clubs, would not have a history of musical or drama performances, would not seem impressive in rapid interviews, and may not have high grades in biochemistry or English.  I don't believe that any of the relevant surgical talents described above are assessed at all in the medical school admissions process.

The current medical school admissions process therefore excludes many of the best future surgeons.  Many other future surgeons who are accepted will have spent years of extra time padding their CVs with activities they were not really interested in, just to keep up with the pre-med competition game.  This is a waste not only for these individuals, but for society as a whole: budding surgeons lose several years of professional life to CV-padding activities.

Another result of the pre-med competition process is that candidates will be well-motivated to pad their academic transcript with easier courses, so-called "grade boosters," while avoiding difficult or challenging courses which tend to have a low class average.   The challenging courses would lead to improved scholarship and wisdom, but people have to avoid them because they could drag their grades down.  Most medical colleges do not take into account the difficulty of the courses that people take.  In any case, one of the advantages of a standardized exam such as the MCAT is that everyone in the world takes the exact same exam, so there is no selective avoidance of difficult material.  

The competition to show extracurricular volunteering and "leadership activities" also creates a bias against introverts.  Many of us are quiet, shy, with relatively solitary habits.  Such gentle, quiet people often would make excellent physicians: people who are calm, good listeners, patient, kind, intelligent, sensitive, and skilled.  But for a person with this personality style, group involvement, group leadership, and many types of volunteering, are just simply unpleasant or impossible.   I am an example of such a quiet, shy, relatively solitary person.  

We should have a selection process that chooses people who are likely to be competent, skilled, and stable.  We should have a process that makes it hard for a psychopathic person to get admitted.  The existing process does select for competence, stability, and skill indirectly through grades, even though most of the actual grades have little to do with skills that would be of clinical use during a medical career.  A psychopathic person is less likely to have consistently high grades and a good volunteer record.  But many psychopaths could present themselves very well in cross-sectional interviews, while many non-psychopaths who are simply shy or reserved would bomb the interviews.

What would be a reasonable way to change the process?  I don't think there's an easy answer.  I think that grades and the MCAT should continue to have a prominent impact on admissions, despite some of the biases involved; maybe this is unavoidable.  People should be rewarded, rather than penalized, for taking difficult courses that may have lower average grades.  It would seem very reasonable and practical to be rewarded in the admissions hierarchy for proven experience or skill in health care or related work; for example, people who have worked in nursing or other allied health fields, as a paramedic, in an anatomy lab, as a technician, in veterinary or psychotherapeutic work, or in other work requiring long hours of meticulous focus.  I think that much less weight should go to performance in a cross-sectional interview process, since this is extremely prone to biases which are not relevant to future medical performance (this reminds me of Kahneman's descriptions of biased and meaningless selection interviews in "Thinking, Fast and Slow").  I think that showing "leadership skills" should have minimal impact on admissions, and people should not be penalized for not showing "leadership skills."  Furthermore, those who are most ambitious to show such "leadership" are often the worst leaders.

Monday, May 23, 2022

The Elephant in the Brain & The Folly of Fools

 Two more books to recommend:  


The Folly of Fools (2011) by Robert Trivers and The Elephant in the Brain (2018) by Kevin Simler & Robin Hanson are both about the human tendency to engage in deception: not only the deliberate deception of others but the deception of self.  

Trivers approaches this issue from the point of view of genetics (he was the first to characterize the evolutionary biology of reciprocal altruism).  The capacity to deceive can be beneficial to survival, as we see in many species of animals, and in many human examples.  But such deception can only work up to a certain point, an equilibrium point in terms of frequency, otherwise the strategy fails.  If deception were too frequent, the evolved strategies to counter deception would render it ineffective.  Similarly, cheating can be an evolved strategy, but if cheating occurs too frequently in a population, it is no longer effective, due to widespread awareness and countermeasures.
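A toy way to see this frequency-dependent equilibrium (my own illustrative sketch, not a model from Trivers' book): suppose deceivers do well when they are rare, because little vigilance is directed at them, but poorly when they are common, because awareness and countermeasures rise with the frequency of cheating.  Under simple replicator-style dynamics, with arbitrary made-up payoffs, the population settles at an intermediate frequency of deceivers.

```python
# Toy replicator-dynamics sketch of frequency-dependent deception (illustrative only).
# Deceivers prosper when rare (little vigilance) and suffer when common (countermeasures).
# The payoff numbers are arbitrary assumptions chosen to make the point visible.

def deceiver_payoff(p):
    # payoff falls as the fraction of deceivers p rises, reflecting growing vigilance
    return 2.0 - 3.0 * p

def honest_payoff(p):
    # honest individuals get a steady baseline payoff
    return 1.0

def run(p=0.01, steps=200, rate=0.1):
    for _ in range(steps):
        avg = p * deceiver_payoff(p) + (1 - p) * honest_payoff(p)
        # replicator-style update: strategies doing better than average grow
        p += rate * p * (deceiver_payoff(p) - avg)
        p = min(max(p, 0.0), 1.0)
    return p

print(round(run(p=0.01), 3))  # rises toward ~1/3, where the two payoffs are equal
print(round(run(p=0.90), 3))  # from above, it falls back toward the same equilibrium
```

The equilibrium sits where deceiving and honesty pay equally; push the frequency of deceivers above it and deception stops paying, which is the point being made about why such strategies only persist up to a certain frequency.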

Trivers goes on to argue that self-deception is a type of advanced deceptive strategy.  The capacity to effectively deceive others is enhanced if we can deceive ourselves.  If you REALLY believe you can win a fight (despite poor objective evidence), you are more likely to convince your opponent that you can win, and therefore are more likely to actually win, even if you utterly lack fighting skills. 

Unfortunately, self-deception leads to many serious problems in society.  Trivers goes through many examples, showing that horrible accidents, wars, biased research, and religious phenomena, are often driven by self-deceptive factors which end up causing disastrous results.  

His chapter called "religion and self-deception" is particularly recommended.  

While I consider this book important and highly recommend it, I did find its reasoning often quite informal, punctuated by forays into humour.  This can be a bit problematic when he wanders into areas (for example, politics, wars, and religion) about which many people are quite sensitive or easily offended.  There are bound to be sections in this book that will cause some readers offense.


The Elephant in the Brain is quite a remarkable review of ideas from social psychology and behavioural economics.  There is influence from Kahneman, Trivers (The Folly of Fools is referenced), Haidt, and many other leaders in the research of this area over the past decades.  I think it's astounding that these two authors, who are not specialists in these areas, produced such a comprehensive and compelling summary of this research.  

The thesis of this book is that humans have a powerful motive to signal membership in groups.  The tendency to form ingroups is a powerful human trait, evolved over millions of years.  Group membership allows us to trust and collaborate with our group members, for safety, defence, maintaining a food supply, dealing with illness, finding a mate, and raising children.  But unfortunately, this tendency to form ingroups can become such a powerful motivation, often without our awareness, that it overwhelms reason, fosters needless and often terrible conflict with outgroup members, and can become very destructive or at least inefficient.    And the phenomenon tends to perpetuate itself, since members of ingroups (be it political or religious or cultural) tend to socialize, mate, and have children with fellow ingroup members.  

They refer to Bryan Caplan's argument about education, showing that a great deal of education serves only as an indirect signal of skill or competence.  Most people do not use the subject matter they learned in university very often, if at all, in the work they do afterwards.  Instead, the degree and grades serve mainly as a competitive signal to employers about the capacity to complete work, conform stably to demands, etc.  I have reviewed Caplan's book elsewhere (I do have some disagreements with it).

The authors show that political and religious membership have powerful ingroup effects.  The tendency to form strong beliefs about elements of religious doctrine can be understood as a badge of group membership; if one can engage in successful self-deception about these doctrinal elements, it is all the more effective as a group membership badge.    The beliefs become shibboleths which can allow some feeling of trust with co-believers, and a sense of distrust or frank dislike of outsiders.  Such belief systems can develop independently of rational moral reasoning.  While all religious systems contain positive insights about morality (e.g. "love your neighbour as yourself", "blessed are the meek", "blessed are the peacemakers," "judge not lest ye be judged", "do unto others as you would have them do unto you," etc.), the moral prominence of these beautiful insights is often lost in a cloud of doctrine that becomes more about maintaining an emblem of group involvement, an "us" vs. "them" mentality.  This mentality is a manifestation of an evolved trait pushing all humans towards group involvement, formation of local communities in which we can feel trust and belonging, but with the unfortunate consequence of having outgroups which we would not trust, and which we would treat with less positivity, warmth, and generosity.  

The same phenomena occur in political beliefs.  While there can be core rational beliefs about positions on a political spectrum, with regard to preferred economic strategy, international affairs, management of public works, etc., a great deal of political involvement consists of doctrinaire beliefs that are badges of group membership, and which have nothing to do with any understanding of policy.  Most people don't even know what the policy positions are, exactly, of the candidates they vote for.  Many others support their ingroup's politicians even though the associated policies would be harmful to themselves economically or socially.  We have tragically seen this happen during the pandemic.  Extreme beliefs about vaccines, masks, etc. became emblems of political group membership; many people made decisions about these issues not because of rational evidence (which strongly supported vaccine and mask use, for the protection of everyone's health, including the anti-vaxxers' own health and well-being), but because of the beliefs of fellow ingroup members in political or religious factions.  Masks and vaccines have almost nothing whatsoever to do with religion or politics -- they are simply common-sensical public health measures -- but once they became badges of group involvement, the situation spiralled into disaster, to the detriment of everyone.  This is an extreme example of the phenomenon shown in the famous children's study, in which kids randomly given shirts of a different colour end up forming hostile ingroups opposed to each other.  In the case of the pandemic, a great deal of anti-vax belief was driven simply by factors akin to having a different shirt colour, just to show difference from an opposing outgroup.

In both books, reference is made to psychiatric theory as an example of self-deception.  Psychoanalytic theory is basically a set of ideas akin to religious doctrine, with a strong ingroup community of "believers" who frame discussion of psychiatric issues through the lens of a theoretical system which is mostly fictional.  As with religions, there are core beliefs in psychoanalysis which reflect deep insight and wisdom.  For example, the idea of psychological defences came from psychoanalysis, and is ironically an insight into the human tendency to engage in self-deception, with the implication that we should try to become aware of our defences, and be able to set them aside.  Similar insights warning about self-deception can be found in religious texts.  But most of psychoanalytic theory is arbitrary, based on bizarre inferences made from case reports, coloured by the already biased opinions of the therapists.  Yet as with religious practices, much of the therapeutic value in psychoanalysis has nothing to do with the literal belief system; it has to do with the practice itself.  Visiting a trusted minister or priest, who would most likely be kind, gentle, understanding, supportive, and wise, could be a wonderfully healthy practice, as could a meditative practice of daily prayer, or visiting a congregation of loving friends.  These healthy and possibly healing effects would occur regardless of the belief system held by the group.  Similarly, the practice of psychoanalysis (or psychodynamic therapy more generally) involves frequent visits with a wise, compassionate, gentle, kind therapist who probably has some useful feedback about life problems, and there would be a healing effect of simply having a stable therapeutic relationship over a long period of time, irrespective of the fictional theoretical belief system held, such as strict Freudianism.

While we can empathize with and even endorse the benefits of ingroup membership, I believe it behooves us to strive for improved rationality, to guide our knowledge and decisions so as to benefit ourselves, our neighbours, and the world in the most effective way.  Societies across the world have improved in this way over the centuries, as Steven Pinker has shown us (see Enlightenment Now), but we have a lot of work to do to continue progress in building a just, peaceful, prosperous society.

In both books, we are wisely cautioned to look to ourselves for our own self-deceptions.  It is another human tendency to see self-deception or folly in others, while not noticing our own.  In my case, I recognize this will be a work in progress.   I surely have beliefs or practices that are products of my ingroup or other biases; I hope that I will be able to keep working on better awareness of these issues over time, in service to my patients and to myself.   



Friday, March 4, 2022

Belief Bubbles, Delusions, and Overvalued Ideas

 One of the most important posts that I've written on my blog, in my opinion, has been "Political polarization, propaganda, conspiracy theories, and vaccine hesitancy: a psychiatric approach to understanding and management," initially published on September 1, 2021 but edited and updated numerous times since then.   I check periodically how many people visit my blog, and I see that there are relatively few.  If I could recommend just one of my articles to be published widely, it would be that one, since I think it is so important regarding individual and public mental and physical health issues in the world today.  

The topics in that post focused on misinformation, propaganda, and deluded beliefs regarding the pandemic.    

I frequently see similar issues at play in my daily work as a psychiatrist.  

What causes fixed false beliefs?  When would we call these "delusions" as opposed to overvalued ideas, or simply examples of erroneous thinking?  

In psychotic states, the mind creates delusional beliefs without any reinforcement from a social community.  This is caused by genetic factors and abnormalities in dopamine circuitry in the brain, magnified by psychosocial stress.  Because of the individual nature of psychotic illness, fellow members of the community can easily recognize the problem, and hopefully attempt to help.  Such delusional beliefs are unlikely to spread in a social network.

There are examples of "shared psychotic disorders" in which an individual may have a primary psychotic illness, leading to close associates or family members adopting the same beliefs.  But this is a relatively rare phenomenon.  

A much more challenging problem occurs when false beliefs are spread in a social network.  In this case, the beliefs may or may not have anything directly to do with the other beliefs or values within the social network.  For example, extremist anti-vax beliefs are more common in particular religious or political groups, but vaccines have very little to do with theology or ideology.  The process of ideological spreading in these cases is analogous to what Dawkins calls a "meme," though driven not by a natural selection process, but by a process akin to "sexual selection."  In sexual selection, traits such as peacock feathers propagate together with traits for recognizing and desiring them.  Bird songs or feather colours are sexually selected due to the song or feather itself and the desire of other birds to recognize or value it; the song or feather comes to be an emblem of the species itself, rather than having other adaptive or communicative value (bright or decorative feathers do not lead to improved flight).  Many examples of "mass delusion," such as anti-vax beliefs, are likely similar; they have become emblems of membership in particular religious or political communities, found attractive by those within the communities, even though the beliefs are harmful to the group and contrary to the group's positive values.  In this way, they are ironically similar to a virus: anti-vax dialogue and behaviour has become much more prevalent or even dominant in these religious or political groups, such that the groups' core values or policies are utterly neglected or contradicted.  People from outside these groups are disgusted by this phenomenon, leading to the groups becoming more insular, decried as hypocritical and immoral by outsiders, and obviously less able to offer charismatic outreach.  In particular, values such as love, care, and freedom are profoundly contradicted by beliefs which decry life-saving public health protections.

Anti-vaccine and other "anti-public-health" propaganda is extremely harmful to society; it causes needless suffering, death, and economic hardship.  The propagation of such ideas is shockingly dissonant with the core values of many of the groups associated with it.  Disparate groups have endorsed such beliefs, leading to an unusual medley of fundamentalist religious groups, biker gangs, and racist groups joining in protests or defiance against vaccine and public health mandates. 

It is very difficult to address or improve problems of this sort.  When beliefs have been adopted as an emblem of a tight-knit social community, they are strengthened greatly by group association, and group members will defend these ideas from outsiders, almost like people might defend their home or family from invaders.   These ideas become adopted as almost sacred core values,  as though the beliefs (in this case about vaccination, wearing masks, etc.) were enshrined in a sacred religious text such as the Bible.  

As with psychotic illness, there are degrees of severity.  In mild cases of psychosis, affected people may be able to question their beliefs or request help.  In moderately severe cases, they have the insight to know that others would see their beliefs as paranoid, so they are able to refrain from discussing them, even though they still fully believe their delusions.  In the most severe cases, people will start expressing, or casually "slipping in," the paranoid ideas in casual conversation (even with a psychiatrist), almost as though to test or evaluate the conversational partner, perhaps to seek a kindred believer or to be warned about a "nonbeliever."  As with some examples of religious practice, "believers" may attempt to "convert" others, as though expressing the delusion has a sacred value.

I think it's important as a psychiatrist to gently inform people when delusional beliefs are present.  With entrenched delusions this may need to be done with the greatest care and empathy, but I do think it needs to be discussed at least a little bit, otherwise there is a risk of the person feeling their delusions have been endorsed.  In the case of socially-spread overvalued ideas, this is a more difficult process to address in a therapy setting.  In some cases the discussion risks drifting into a debate about religious or cultural beliefs, which is generally off-base.  But when new "contagious" beliefs spread in a social network, straddling the boundary between a "cultural belief" and a "delusion," unbridled and harmful spread is more likely.  This is similar to the epidemiological dynamics of COVID itself: COVID is deadly, but its death rate is low enough that it can insert itself into populations in a seemingly harmless way, until a few weeks or months later when hospitals and ICUs are overflowing with severe cases.  Ironically, if the mortality rate of COVID were much higher, it might be easier to control at a community level, because there would be more unity of action.

As I discussed in my "political polarization" post, there are many social actions that can help this situation.  It is most valuable for rational, persuasive pro-vaccine, pro-public-health members of affected social groups (such as religious leaders, truckers, political leaders on both sides of the political spectrum, police, military personnel, alternative health care providers, and people formerly part of the anti-vax movement but who have changed their position) to speak out as educators and leaders.   Scientists and public health officials, etc. should still do their best to offer effective public communication, with efforts to reach out to these groups, but they are less likely to have a substantial impact in these communities, since they will be dismissed or derided as threatening outsiders.   Some of the communicative efforts from scientific leaders could at least involve building a better rapport with disparate communities, so that scientists would not be seen as elitist or part of an "ivory tower," out of touch with the rest of the population.    Meanwhile, there is evidence that the rest of us should continue to do our best to combat the spread of misinformation, and to do our best to speak the truth, rationally, resisting the urge to give up in frustration.  



Wednesday, September 1, 2021

Political polarization, propaganda, conspiracy theories, misinformation, and vaccine hesitancy: a psychiatric approach to understanding and management

I initially published this post in September 2021, with some additions or editing every month or two since then. 

If you don't have time for a longer read, I encourage skipping ahead to the end, where I discuss ideas about what we can do about the problem of vaccine refusal. 

Introduction

Political polarization, propaganda, and conspiracy theories have caused the world great harm in the past few years. A related problem has come up recently, with a significant minority of people refusing COVID-19 vaccination, leading to the pandemic lasting much longer, claiming many more lives, depleting and exhausting workers in the health care system, and causing much more economic damage.

Another round of this came up in February 2022, with convoys of protesters rallying supporters to challenge vaccine mandates and other public health measures.  Protests of this sort are demoralizing and infuriating to those who are trying to help.  It is like a town threatened by a giant forest fire, with burning embers blowing into the neighbourhood, but with protesters tired of restrictions intimidating the firefighters, surrounding the fire stations with horns blaring (waking tired families and children trying to sleep nearby), and demanding to reclaim their right to have open fires.

In this post, I will explore the psychological and social factors contributing to these problems, with suggestions of things that individuals, community organizations, companies, church groups, and governments can do to help. 

For a brief video introduction to this topic, I recommend a recent short Netflix documentary series, The Mind, Explained, particularly the episode called "Brainwashing."

A highly recommended book about how to communicate effectively with people having extremely polarized or conspiratorial beliefs is How Minds Change by David McRaney (2022).  

Polarization

It has become more common for people to hold extreme political views. There are increasingly hateful and intolerant attitudes towards political opponents.  Many of us are familiar with  the 2014 study done by the Pew Research Center, showing this polarization gradually worsening in the U.S. since 1994.

Propaganda

Propaganda is false, exaggerated, or misleading information spread for political or manipulative purposes.  Many large news organizations in the U.S. support a particular political party, leading to unprecedented exposure to biased information consumed by nearly half the population.  Social media sites such as Twitter and Facebook often lead people to obtain information only from like-minded others.  Not only does this lead to extreme bias, it also builds a community of online friends or followers who "egg each other on," ideologically or personally, while denigrating opponents, often in a mocking or hostile way.

Beginning in March 2022, the world has once again seen the most horrific propaganda of all, that which persuades an entire nation to support a war against its neighbour, while preventing its citizens from seeing or understanding the atrocities being committed.     

Conspiracy Theories

Conspiracy theories have become more common and more bizarre, often associated with ideological positions or a particular political party. While most of us have had a sometimes amused tolerance for people holding these beliefs, conspiracy theorists are now more organized, can magnify and spread their ideas using social media, and have managed to influence public policy to some degree. I am aware of people in public positions who seriously believe that COVID-19 vaccines contain microchips used to track people, and that Bill Gates is somehow responsible for this. Others believe the moon landing was faked, or even that the earth is flat. 

Ideas of this sort are now very prevalent, with social media and other news sources contributing to their spread.  Such misinformation can often be presented in a professional manner, as though it is valid documentary reporting.  This attracts many followers who then continue to spread these ideas.  

Anti-Vaxxers

We are all weary of the COVID-19 pandemic. Millions have died or suffered severe disease, and many others have had severe financial losses. Many more are going to die. Most COVID patients will recover fully, but a significant minority of survivors will have long-term health consequences (so-called "long covid"), with symptoms such as chronic fatigue and respiratory problems.

We have vaccines that produce large reductions in pandemic-related severe disease and death. The rapid development and mass distribution of vaccines since 2020 is one of the most outstanding scientific achievements in history. We also have other knowledge about controlling viral respiratory disease, such as mask usage, ventilation improvement, and frequent home testing, which, together with vaccination, could have brought our countries out of the pandemic much more quickly, with much less economic hardship, with much less psychological or social hardship, and with much less loss of life.

No vaccine is perfect.  There are small risks of side effects, including some rare serious problems, but these risks are much lower than the risks posed by COVID itself.  Also, very few vaccines produce perfect "sterilizing" immunity.  The main impact of COVID vaccines has been to reduce the probability of severe disease, hospitalization, and death, with a much smaller effect on milder disease.  This protective effect also fades gradually over 6-12 months, though it probably does not disappear entirely, which is why booster vaccinations are recommended while new COVID waves continue.  There is accumulating evidence that the vaccines also reduce the probability of developing "long covid."  Because of high vaccination rates in Canada, especially in older cohorts, we narrowly averted the disastrous hospital and critical care overflow that would otherwise have occurred in the waves before 2023.  Vaccine development, like any other human process, is never perfect either; there may be aspects of the COVID vaccine story, including political or economic issues, that deserve criticism.  But this should not distract us from appreciating that these vaccines have saved more lives, and prevented more years of life lost, than almost any other health intervention in our history.   

But a significant minority of people refuse to be vaccinated, refuse to use masks, and even refuse to acknowledge that the pandemic is a serious problem. Those who refuse are more likely to belong to particular political or religious groups, are more likely to watch particular news channels, and are more likely to have less education.  

Unvaccinated people from 2020-2022 were much, much more likely to require hospitalization, including intensive care.  At this point, recipients of recent vaccine boosters continue to have a much lower hospitalization rate.  From 2020 to early 2022, hospitals all over the world were filled to capacity, and beyond, by unvaccinated COVID patients.  If everyone had been vaccinated, there would still have been hospitalized patients (since the vaccines are not perfect), but there would never have been overflow or a major strain on the health care system.   Because of inadequate vaccination rates, everyone else (vaccinated or not)  had much more limited access to medical services, including elective surgery and ICU.  Because of people who refused vaccination, it became much more dangerous for all of us to have a heart attack or a case of appendicitis.  Furthermore, the staff in the hospitals were overworked to the point of exhaustion, the horror of the situation magnified further by the needlessness of it.  Vaccine refusal caused extreme harm to health care workers all over the world, and  led the health care system itself to the brink of total breakdown.  
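To make the arithmetic concrete, here is a minimal sketch in Python using purely hypothetical, illustrative numbers (the real infection rates, risk reductions, and capacities varied by country, variant, age group, and time since vaccination).  It shows how a relatively small unvaccinated minority can contribute the majority of admissions and push a hospital system past capacity, while the same wave in a fully vaccinated population would stay well below it.

# A minimal sketch with hypothetical, illustrative numbers (not real surveillance data):
# how a small unvaccinated minority can overwhelm hospital capacity during one wave.

population = 1_000_000
coverage = 0.80            # hypothetical: 80% of people vaccinated
infection_rate = 0.10      # hypothetical: 10% of each group infected during the wave
hosp_risk_unvax = 0.05     # hypothetical: 5% of infected unvaccinated people need hospital care
hosp_risk_vax = 0.005      # hypothetical: 90% relative reduction with vaccination
hospital_capacity = 1_000  # hypothetical: beds available for COVID patients

vaccinated = population * coverage
unvaccinated = population - vaccinated

admissions_vax = vaccinated * infection_rate * hosp_risk_vax        # 400
admissions_unvax = unvaccinated * infection_rate * hosp_risk_unvax  # 1,000

print(f"Admissions from the vaccinated 80%:   {admissions_vax:,.0f}")
print(f"Admissions from the unvaccinated 20%: {admissions_unvax:,.0f}")
print(f"Total: {admissions_vax + admissions_unvax:,.0f} vs capacity {hospital_capacity:,}")

# If everyone had been vaccinated, the same wave would produce
# 1,000,000 * 0.10 * 0.005 = 500 admissions -- still many sick people,
# but well under capacity, with no overflow.
print(f"If 100% were vaccinated: {population * infection_rate * hosp_risk_vax:,.0f}")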

A new and horrific development following all of this is the increasing prevalence of refusal to get vaccines against diseases that had been largely eliminated, such as polio and measles.  If this continues, it is likely that we will once again see children needlessly paralyzed or killed by preventable infectious diseases, for the first time in decades.  

Anti-vax beliefs and other bizarre beliefs about COVID can be shockingly extreme and unchangeable: we have many examples of people remaining convinced that COVID is a hoax, right up to the moment of their death from respiratory failure in an ICU bed. Medical colleagues of mine, working in rural areas with low vaccine uptake, have described many stories like this during the pandemic. There are horrifying examples of hospital workers being threatened or attacked by people convinced that the medical care is somehow harmful.  There are examples of public health officials who are afraid to advocate for vaccines, due to having received intense harassment and even violent threats.  

Alternative Medicine 

In many cases, bizarre beliefs about COVID are an extension of unusual ideas about health care. The alternative medicine industry has a market size of about $100 billion per year. Parts of this industry harmlessly promote healthy lifestyle habits, nutrition, or evidence-based care, but there are a lot of exaggerated or false claims made in the sales of alternative medical services and products.   Alternative health care can involve bizarre or even delusional beliefs about illness. There can be distrust for evidence-based medical science, and loyal allegiance to the alternative practitioners despite harmful practices.  

I realize it is excessive and unfair to simply condemn alternative medicine.  Many people have had bad experiences of conventional health care.  And many people have experienced kindness, generosity, support, and helpful guidance from alternative practitioners.  Many alternative practitioners have wisdom in particular rehabilitative techniques, and some knowledge of conventional medicine.  Some supplements or alternative remedies do have a reasonable evidence base, sometimes on par with standard treatments.   Conversely,  conventional physicians sometimes recommend treatments including prescription drugs or surgery which in some cases have a questionable foundation in evidence.  

Problems in modern medicine, including expense, access problems, or brief, impersonal clinical encounters, can feed some frustrated people's pursuit of alternative health care providers who may have more time for empathic support or apparent understanding.  Unfortunately, this apparent understanding is often based on fictional beliefs couched in pseudoscientific language that can sound impressive or convincing.

There are frequent examples of misguided beliefs in alternative health care practice, particularly when the practitioners are marketing expensive products, pushing bizarre theories of causation using expensive, unnecessary pseudoscientific testing procedures,  fomenting distrust in conventional medicine, or discouraging patients or clients from seeking proven therapies, including medications or vaccines.  

Con Artistry & Fraud

Many people with strong opinions opposing vaccines, supporting quack treatments for COVID, or supporting particular political leaders since 2016, have been conned -- that is, they have been victims of fraud. They have been sold something that seemed very attractive to them at the time, but the goods they've obtained are worthless or harmful to themselves and others. But many people would feel an embarrassing or humiliating injury to their pride to admit that they were conned; so instead, they double down on their support for con artists (including particular politicians) or quack remedies. 

There is a fascinating research literature on this subject. I would start with Maria Konnikova's book, The Confidence Game: Why We Fall for It... Every Time. Her book is a series of case studies of various types of spectacular con artistry and fraud, with some discussion of the underlying psychology. The next scholar to become acquainted with is Brooke Harrington, a Dartmouth College sociologist; I'm in the midst of reading through her work. One of her questions has to do with justice: when should a person who has been conned into doing something harmful be considered an offender requiring management in the criminal justice system, rather than only a victim requiring compassionate care?

I see these issues as closely tied together, fed by the same underlying causes. Together they are driving people and nations apart; they have caused needless suffering, death, and economic hardship during the COVID pandemic, and have led to an unprecedented threat to democracy in some parts of the world.

These are not new problems: they have been with us throughout history. Many of us associate propaganda with World War II or the Soviet Union, not modern-day western democracies. Many of us associate bizarre or erroneous beliefs about health with previous centuries, in which people attributed disease to evil spirits, "excesses of bile," or an excess of blood in the body requiring treatment by bloodletting. Unfortunately, bizarre beliefs about health are alive and well in modern society.

It is crucial to understand and study these problems, to know why they happen and what can be done to improve the situation. A thorough analysis requires input from many fields, including from historians, political scientists, sociologists, public health experts, and psychologists. 

Myside Bias and Motivated Reasoning

The most powerful factor, in my opinion, driving extreme political difference, extreme contrarian views about the pandemic, and "anti-vax" ideas, is what Steven Pinker, in Rationality: What it is, why it seems scarce, why it matters (2021), calls "myside bias," which then gives rise to so-called "motivated reasoning."   This is the tendency to selectively attend to evidence which supports your pre-existing opinion, and to discount or ignore evidence which refutes your opinion.  We are all prone to this.   Your opinions, ideas, and beliefs about almost any issue can become associated with your identity, or your group, or your "side," with differences of opinion or contrary evidence representing some kind of a threat to your identity or values.  Contrary evidence could even bolster your initial opinion further, almost as though your ideas are like a family or village being attacked by outsiders, leading to the community rallying and strengthening itself in response.    Myside bias and motivated reasoning are greatly magnified in groups, and are caused or fueled by many of the factors which I will describe below.  

Haidt: The Righteous Mind

I recommend reading Jonathan Haidt's book, The Righteous Mind: Why Good People are Divided by Politics and Religion--an excellent introduction to the psychological factors which drive ideological differences. Haidt presents himself as a moderate, or even a right-leaning moderate, which I think at the very least should increase the readership and acceptance of this book across a wider swath of the political spectrum.

        Group Loyalty, Tribalism & Ingroup Bias

Haidt concludes that there is a human trait of feeling loyal to groups; those groups with stronger or more frequent loyalty traits among members will have advantages in survival and prosperity. These groups will be more cohesive and better able to defend themselves against outsiders. Some individuals value group loyalty above all other values; this is partially a heritable trait. While loyalty is a virtue, it can also lead to group members continuing or even fanatically increasing their loyal devotion when the group is engaging in destructive or corrupt behaviours, even when such behaviours are causing suffering to the group members themselves. The most extreme examples of fanatical group loyalty are seen in cults, but variations of this phenomenon are seen in daily life--in our families, our communities, our sports teams, our religions, our political groups, and our nations.

We have seen groups with extreme opposition to COVID vaccination harassing exhausted health care workers outside hospitals. Other groups participated in 2021 in an unprecedented mob attack on a world capital. Yet members of these groups previously may have valued ethical principles, such as fairness, hospitality, helpfulness, and the rule of law. Fanatical group allegiance can cause members to stray towards behaviour that is contrary to the group's previous fundamental values.

Groups containing devoutly loyal individuals are likely to have higher hostility to outsiders. Loyalty is a good thing, but in the setting of polarization, propaganda, conspiracy theories, and vaccine hesitancy, such unthinking, rigid loyalty is destructive to others and destructive to the group members themselves.

One of the suggestions Haidt has about improving the problem of polarization is to maintain open dialogue, value the principle of respectful debate, and foster friendships between people and groups with different views. This would involve cultivating friendships between those on the "left" and "right" of the political spectrum, rather than devolving into hostility and becoming "enemies." But this approach is not very helpful for dealing with fanatical or extremist groups; at that point, friendly debate and social warmth will not be possible.

Unfortunately, many people holding anti-vax beliefs and other strongly polarized positions have become too extreme to allow respectful social connection. Yet there are many others whose positions are moderate or ambivalent on these issues, including friends, relatives, and neighbours of extremists. These are the people most amenable to friendly engagement. 

The Psychology of Conspiracy Theories

        Lack of feeling in control, need for certainty

According to psychologists studying this area, such as Van Prooijen and Douglas, conspiracy theorists often feel a lack of agency or control, a need to make sense of complex or confusing situations going on in life or in the world, a desire for being respected (but not feeling that such respect is being given), and a need for certainty. Like other delusions or overvalued ideas, conspiratorial thinking can give rise to a feeling of relief, since there is an explanation about why a problem is happening, even though the beliefs are fictional. The explanation, and the excitement of being part of a select group of fellow believers, can give back some feeling of control or certainty, a new sense of purpose. Other people's skepticism could be perceived as a noble challenge to be faced. 

In helping this problem therapeutically, people trapped in conspiracy beliefs need to be shown personal respect and empathy, before any attempt to challenge or refute the false beliefs.  

        Past Psychological Adversity or Trauma 

Prior psychological hardship can sometimes drive people into a fearful, angry, hateful, distrustful, or even paranoid state, with relief of ongoing psychological stress found in narrow or rigid ideologies. Others, including refugees, may have understandable reasons not to trust authorities or the government.  Neuroscientist Nafees Hamid has shown that experiences of social exclusion or discrimination contribute to radicalization.  In some cases, people with a history of trauma or social rejection will find comfort, support, and belonging in groups, such as churches and other community organizations, or extremist fringe groups, even if these organizations are engaging in extreme polarization or conspiracy beliefs. Members of these groups will naturally feel protective and loyal towards the group and the group's beliefs, even if these beliefs are causing harm to others. Therefore, some people develop anti-vax beliefs as a result of their past trauma. 

The possibility of past trauma should always be kept in mind when dealing with someone who is trapped in a conspiracy theory mindset, with compassionate support offered, while always challenging the false beliefs.  

        Personality Disorders

Personality disorders are common, affecting several percent of the population, with milder symptoms affecting many more. They cause lifelong disruption in relationships, behaviour, and emotional stability; people with personality disorders often lack insight that they have a problem. They are caused by a combination of hereditary factors and long-term environmental adversity, such as childhood abuse.

Many conspiracy theorists have narcissistic personality traits: they believe they are better, more insightful, more informed, and more intelligent than other people, and that other people's skepticism or rational arguments are signs of stupidity or inferiority. They are unable to tolerate critical feedback. A softer type of narcissism stems from unmet psychological needs to feel unique. When extreme narcissism is present in a major world leader (as was the case starting in 2016), the entire group of followers can adopt a narcissistic attitude, even if these traits would normally be abhorrent, or entirely at odds with the group's previous religious or ethical standards.

Another factor is obsessive-compulsive personality. Here, there is a rigid understanding of moral issues, a tendency to be quickly and firmly judgmental, and a tendency to favour a polarized view of issues. Again, such character traits would generally be difficult to tolerate, but when present in a charismatic leader, they become endorsed by the group itself.

Schizotypal and paranoid personality disorders can also lead to conspiracy theory beliefs. With these personality variants, people have low-grade delusional beliefs, magical thinking, superstitions, and mild paranoia.

Finally, there is antisocial personality, which leads to criminal behaviour, a lack of empathy, callous disregard for others' suffering, manipulative behaviour towards others, and compulsive lying, despite showing superficial charm. We have seen this factor in a major political leader since 2016 and in many con artists profiting from the pandemic. 

It should be noted that many conspiracy theorists do not have personality disorders.  They have been swept into false beliefs due to misinformation and group allegiances, but are otherwise mentally well, sometimes well-educated and intelligent.  

In order to help people who have personality disorders, compassionate understanding is required.  There are various therapeutic systems that are useful, including CBT and DBT.  Motivational interviewing techniques are also likely valuable.   Sometimes medications could be of some modest help, at least to reduce specific symptoms such as anxiety, irritability, or low-grade paranoia.  But in order for there to be any possibility of therapeutic help, there would need to be a safe, stable therapeutic frame.  If a person is angry, volatile, or behaving dangerously, therapy is impossible unless there can be strict boundaries guaranteeing safety.   For some people, these boundaries can be difficult or impossible to negotiate.  Furthermore, many people suffering from personality disorders lack insight about their problems, and lack the desire to work on personal change in a therapy setting.  The first step, in these cases, is often to impose limits on negative behaviour.  This is why antisocial personality, for example, usually needs to be dealt with in the justice system.  

        Low Education, Innumeracy, & Lack of knowledge about the world

Many conspiracy theorists have lower levels of education, lower levels of intelligence, and a desire for accuracy or meaning without the cognitive tools to pursue it rationally.

Innumeracy, a lack of scientific knowledge, a lack of statistical knowledge, and a general lack of knowledge about the world (for example, about history, culture, geography, or economics) are significant factors contributing to poor personal and political decisions. Even relatively intelligent people who are not broadly educated and informed are more prone to ingroup biases and conspiracy theories. 

Ellen Peters' book Innumeracy in the Wild: Misunderstanding and Misusing Numbers is a detailed account of poor mathematical skill in the population. She shows that only a small minority of people have the skills needed to accurately interpret data, and to correctly guide decision-making. As a result, most people either make erroneous conclusions about data, or are dependent on others to interpret the data for them. This makes people vulnerable to political influence from people who misconstrue data. These influencers may have a deliberately manipulative goal, or may be inadvertently misleading because they are also innumerate.
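As one illustration of the kind of misinterpretation this invites, consider the claim, heard often during the pandemic, that "most hospitalized COVID patients are vaccinated, so the vaccines must not be working."  The minimal sketch below, using hypothetical rates rather than real surveillance data, shows why the vaccinated share of admissions is expected to grow as coverage grows, even when vaccination cuts each individual's risk ten-fold; without attention to base rates and denominators, the raw proportion is easily misconstrued, whether deliberately or inadvertently.

# A minimal sketch with hypothetical rates (not real surveillance data): why
# "most hospitalized patients are vaccinated" can be true even when vaccination
# cuts each person's risk ten-fold.

def admissions_breakdown(coverage, population=1_000_000,
                         weekly_risk_unvax=10e-5,  # hypothetical: 10 per 100,000 per week
                         weekly_risk_vax=1e-5):    # hypothetical: 90% lower risk if vaccinated
    vaccinated = population * coverage
    unvaccinated = population - vaccinated
    hosp_vax = vaccinated * weekly_risk_vax
    hosp_unvax = unvaccinated * weekly_risk_unvax
    vaccinated_share = hosp_vax / (hosp_vax + hosp_unvax)
    return hosp_vax, hosp_unvax, vaccinated_share

for coverage in (0.50, 0.80, 0.95):
    hv, hu, share = admissions_breakdown(coverage)
    print(f"coverage {coverage:.0%}: vaccinated admissions {hv:.0f}, "
          f"unvaccinated admissions {hu:.0f}, vaccinated share {share:.0%}")

# At 95% coverage, roughly two thirds of admissions are vaccinated people,
# yet each vaccinated person's risk remains 10 times lower than an
# unvaccinated person's -- the raw proportion, quoted without its
# denominators, is easy to misconstrue.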

Without aptitude in science, critical thinking, and reason, verbal ability alone does not protect against being drawn into ingroup biases or conspiratorial thinking. Brittany Shoots-Reinhard, Ellen Peters, and others have done a lot of work over the past decade looking at the relationship between intellectual ability and decision-making. They recently published an article showing that people with higher verbal ability are more likely to have polarized responses to COVID-19, and to consume more polarized media. Numerical skill did not predict higher polarization.  

This suggests that people often use verbal intelligence, not to improve their reasoning or judgment, but to more efficiently gather information that supports their pre-existing views, often ideological and determined by ingroup biases. This is especially problematic at a political level, since verbal intelligence is a more important skill than numerical or scientific intelligence for a politician to be successful, or for a celebrity to be influential. Therefore, we have people who are more likely to have polarized beliefs holding positions of influence in society.

When interpreting studies on the effects of education on various psychological phenomena, beliefs, or ideologies, we need to look at the type of education, with particular attention to numeracy, logic, and reasoning skills, as well as the degree to which the education covers global subject matter such as history, geography, environmental science, and economics. 

It should be noted that some individuals swept into conspiratorial thinking are intelligent and well-educated, and may have good analytical skills.  They are influenced in their misinformation journey by other factors such as group allegiances.   They may use their intellectual skills to spread misinformation more effectively, or even become seen as experts in their communities.  

        The internet and news-bubbles

The internet provides a medium in which people with extreme beliefs can easily form a community, which in conjunction with traits for group loyalty, leads to them forming a strong identity, an "us vs. them" mentality, and a resistance to rational evidence from outside the group.

It is not enough to address this problem on a one-on-one basis. There are political, economic, and educational factors that are likely to help, on an individual and societal level. I'll come back to this later. 

Polarized News Sources & Propaganda

Major news networks in various parts of the world are deliberately propagating conspiratorial thinking and fomenting polarization, catering to entrenched members of particular ingroups. These networks have a profit motive, but the owners of the networks are also driven by ideological beliefs to push this to further extremes. They are popular and tend to have high ratings, especially when they are denigrating ideological opponents in a dramatic way. These news networks lack any form of regulation that prevents or limits harm (particularly in the U.S., after the removal of the FCC fairness doctrine in 1987).

Unfortunately, this has led to a steep decline in the quality of news information that many people are consuming. Fans also form an ingroup loyalty to the news service itself, such that mainstream news may be deemed "fake" or biased. Many fans normally value kindness, civility, education, politeness, the rule of law, balanced debate, and religious beliefs rooted in love and compassion. But due to powerful ingroup loyalty effects, the fans of these news services can embrace leaders or pundits who are unstable, mean-spirited, and bullying.

It is important not to underestimate how powerful and destructive propaganda can be; we have to realize that the freedoms we have enjoyed in modern democracies can be quickly eroded under the influence of powerful and well-financed propaganda efforts.

 Cognitive Biases 

Cognitive biases are "shortcuts" of thinking which allow us to make decisions more quickly. These shortcuts can be useful, since we don't always have the time to analyze every issue in our life in detail. But they can cause massive errors in judgment, especially when we are not even aware of them. For an introduction to this area of psychology, I recommend reading Daniel Kahneman, a psychologist who won the 2002 Nobel Prize in economics for his work on judgment and decision-making.  His book Thinking, Fast and Slow is fun to read and summarizes his masterful research.  I'd like to review some of the more common cognitive biases which perpetuate conspiratorial thinking, political polarization, and ideological extremism:  

    Reactance

Reactance is the urge to do the opposite of what someone wants you to do, in order to resist a perceived constraint on your freedom. This has been one of the driving factors behind resistance to pandemic-related public health restrictions and vaccination, and it drives political polarization more generally. For many people, a component of the reason they refuse vaccination or defy pandemic restrictions is reactance or defiance: they don't like being told what to do, especially by people they may see as outside their ingroup.

    Reactive Devaluation

Reactive devaluation is the tendency to devalue an idea or a proposal, only because the idea comes from an opponent. So almost any idea coming from a political opponent is reflexively devalued and opposed, regardless of whether it is rational, correct, or helpful. If the same idea had come from an ingroup member, it would be approved enthusiastically. Reactive devaluation is profoundly self-destructive, not only to individuals, but to entire nations. Unfortunately we see this daily in U.S. politics. Once again, this is a reason many people oppose advice about vaccination or public health measures.

    Projection

Projection is attributing to other people the feelings or problems that you have yourself. For example, you may feel angry with someone, but in a conversation you may have a strong belief that it is the other person who is angry at you. While projection is not typically considered a cognitive bias, it is a common psychological mechanism among those with personality disorders, and among con artists. In the former group, projection is often "unconscious"-- that is, people project without even realizing they're doing it. It would be an issue to be addressed in psychotherapy. In the latter group, it is used deliberately and consciously as a manipulative technique. A well-known political leader after 2016 could be seen to engage in both forms of projection every week--accusing others of bad qualities or behaviours that were obviously his own.

In a conversation with someone holding fanatical anti-vax beliefs, you may encourage the person to be more informed of evidence. But that person will project: they will claim that it is you who are not aware of the evidence! They will deny being conned themselves, but will claim that it is you who have been conned! Many anti-vaxxers are calling people who follow public health guidelines "sheep," while it is the anti-vaxxers who are often passively swept up in mindless herd behaviour. 

    The Availability Cascade

The "availability cascade" and the "illusion of truth effect" refer to the tendency to believe a statement simply because it has been repeated frequently, or because it is easy to understand, even though the statement is false. Many beliefs about the pandemic, including those from conspiracy theorists or those from the "anti-vax" groups, seem more believable simply due to frequent repetition. The staggering daily abundance of frank lies emerging from a major world leader from 2016-2020 were often not perceived to be lies by many people, due to the frequency of exposure and the cognitive ease involved in processing such statements. Or sometimes people did not care that they were lies. Sometimes hateful speech is unfortunately too easy to process cognitively; it may appeal to some deep, primitive component of our brains that is excited by rage and deprecating others. 

    Confirmation Bias

Confirmation bias is the tendency to only look selectively at evidence which supports a previous position. This is driven partly by powerful ingroup loyalty. Even when there is overwhelming evidence to support a contrary position, people suffering from confirmation bias will often remain stubbornly insistent that their own narrow, outdated, or invalid research findings are correct.

    Anchoring

Anchoring is the tendency to stick with an initial position or estimate, or to be swayed by it strongly. If you have started having a particular belief, there is a tendency to maintain it. This is particularly true if there are personality traits valuing consistency, commitment, and loyalty more strongly than traits valuing rationality, compassion, or wisdom. One can become irrationally "loyal" to initially-held ideas (such as about perceived harms of vaccines, or about supporting a political leader who proves to be dangerously unstable) even if these ideas are self-destructively inaccurate and contrary to other values.

    The Dunning-Kruger Effect

The Dunning-Kruger effect and the "overconfidence effect" refer to a tendency for unskilled people to overestimate their ability. We see this with many people making strong claims about specialized areas (such as about epidemiology or virology during the pandemic) despite minimal expertise. Unfortunately, such people can be quite persuasive, not because of their expertise, but because they may be popular and have a loud or persistent voice. On the other hand, many experts may have a rather modest voice, and therefore their accurate messages are under-amplified.

    Present Moment Bias

"Hyperbolic discounting" or present moment bias, is the preference for immediate payoffs relative to later payoffs. On an individual level, this reflects a lack of self-control when faced by temptations. On a community level, it leads to neglect of long-term societal needs, such as health, environmental integrity, and education, in favour of immediate profits, even if such profits cause severe long-term pollution, economic damage, or health damage. We see this in the pandemic management as well--many are unwilling to make a short-term sacrifice (such as maintaining social distancing or mask use) even though such small sacrifices would lead to much larger longer-term gains in health, prosperity, and survival for themselves, their families, and their communities.

    The Sunk Cost Fallacy

The "irrational escalation" fallacy or sunk cost fallacy is the tendency to continue investment in a decision that was made previously, despite new evidence that the decision was wrong. Basically, it can be humiliating or injurious to pride to change one's mind, so it can feel easier to hold onto one's mistaken views or decisions rather than change them. 

    Normalcy Bias

The normalcy bias is the refusal to plan for or react to a disaster which has never happened before. If you live in an earthquake zone, but have never seen or experienced an earthquake, you are less likely to consider how to survive an earthquake or protect your home. It is much less likely that you would undertake expensive large-scale disaster preparations. This phenomenon has happened with COVID. Many experts were well-prepared; there were even organized national preparations for pandemics, but some leaders of major governments dispensed with all of this. The same problem is likely to happen on a much worse scale regarding the ongoing degradation of the earth's environment (disappearance of forests, mass extinctions, degradation of fisheries, loss of wildlife habitats, and climate change). Once a disaster is already underway (such as a house fire or earthquake or flood or pandemic or climate change) it is much, much harder to reverse the situation; it becomes much, much more expensive if not impossible to find a solution. Prevention is much, much more affordable and efficient than expensive disaster management.

     The Ostrich Effect

 The ostrich effect is the tendency to ignore an obvious negative situation. Once again, we saw this in a major country upon the outbreak of COVID, and we see this with the environmental & climate change problems. On a personal level, we see this in the tendency for people not to seek medical help when they notice a serious problem, just hoping that it will go away on its own. It is driven by some combination of fear (in this case fear of the truth and fear of how difficult the treatment might be), and magical thinking (i.e. somehow believing that if you don't look at a problem, then it will go away).

    What to Do About Cognitive Biases

Daniel Kahneman, the world's leading expert on cognitive biases, is doubtful that we can eliminate cognitive bias.  Even people who are very well-educated about this issue are still prone to bias, just like everybody else.  The best we can do is educate ourselves about this, be watchful for bias in ourselves and others, collaborate together to make better judgments, and be open to feedback from others on this issue.  Whole communities should be open to critique from other communities, and not try to shut down debate or discussion.  

  ------

Similarity to Addictions 

Ideological bias and conspiracy thinking have a lot in common with addictions, since they are harmful to individuals and communities, but hard to escape. People often dabble with polarized or conspiracy-based ideas a little bit at first, often influenced by psychological adversities, family or peer culture, and genetic risk factors, then become more and more drawn into problematic behaviour over time; in this way, it is like someone trying cocaine with their friends a few times per year at parties, then escalating gradually towards weekly, then daily use, all the while justifying the behaviour as harmless, enjoyable, or a cultural norm.  

Addictive behaviour can cause deep satisfaction or relief in the moment; moving away from addictions can be challenging and painful; people often cannot do it without external help. Furthermore, many people with moderate to severe addictions deny they have a problem, and do not see any reason to change. They may see their addictive behaviour as simply a lifestyle choice, enjoyed by many friends, with any problems lying with other people who criticize them.

Addictions are strongly entrenched by a peer group of fellow addicts. Moving away from addictions often requires that people let go of their current social network, leading to feelings of loss, loneliness, boredom, and a lack of meaning. This is one of the reasons that we have to offer social and community support to people if we would like to help them move away from entrenched polarization or ideological biases. 

Biases & Educational factors beginning in childhood

Many biases and educational factors causing people to be trapped in a narrow or hostile ingroup begin during childhood, with parents, family, and community members teaching and influencing the children. Many people believe things only because their parents, teachers, and peers believed them. After childhood, people will be more likely to associate with, befriend, marry, or have children with others who are similar; this further entrenches previous beliefs and makes differing belief seem strange or wrong.  

Heredity

There is a hereditary influence on the tendency to be dogmatic or stubbornly adherent to ideologies, and on general intelligence. But hereditary predispositions are never absolute, and are never purely good or bad. Hereditary factors, if channelled through a healthy environment, can lead to good individual lives and a healthy community. 

Refusal to admit mistakes

One last huge psychological factor is refusal to admit mistakes. Many people would rather carry on with a previous decision even if it is leading to disastrous results. They would be embarrassed, ashamed, or would not "save face" if they had to admit they made a terrible mistake, or if they had to reverse their position on an important issue.

This stubbornness can be an extremely powerful factor; it could be a psychological defense, a way of protecting a person against the need to feel intense shame and regret for past decisions which caused terrible harm. This phenomenon is fed by some of the biases listed above, such as the sunk cost fallacy, anchoring, and ingroup biases. Instead of owning up to a bad decision, people will go through a remarkable feat of denial, to persuade themselves that they didn't make any mistake at all. Many people hold onto strong anti-vax beliefs or conspiracy theories for this reason. They might be willing to change their mind, but the cost of admitting a big mistake is too high.  

 Well-funded corporate groups & "think tanks"

Wealthy corporate donors with strong ideological positions are funding marketing campaigns and employing the small cohort of contrarian scientists to push policies opposing vaccination, public health measures, environmental protections, and other public policy ideas they see as relevant to their ideologies or profits. These corporate groups or "think tanks" have members who are part of the political or religious ingroups described above. Their biases are not just individual, but organized, powerful, and very well-funded, often with billions of dollars of financial support. 

Oxford-trained Duke University public health scholar Gavin Yamey has warned us about the influence of such groups, and has compared their tactics to those used in past decades by the tobacco industry: denying or twisting health risk data, to plant seeds of doubt in the population, in order to maintain profits of a multi-billion dollar industry despite the terrible harms it caused.  

External Political Interference

Other nations with antagonistic relationships with our own are attempting to propagate conspiracy theories and extremist groups, mainly using social media, in an attempt to disrupt or weaken our nations. This is a national defense issue. 


What to do about it

There is a lot that can be done about this problem:

1) How Minds Change 

David McRaney, in his 2022 book How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion, describes techniques to communicate with people who have strongly entrenched beliefs.  I strongly recommend this book.  The ideas are based on showing respect, empathy, establishing rapport, sharing personal stories, searching for higher principles in common, gentle inquiry about the person's sources of evidence for their belief, and honest exchange of your own view, but without arguing or showing anger or contempt.  

2) Inoculation against Misinformation

Van der Linden et al. have shown that so-called "inoculation" techniques can protect people from misinformation (see references below).  This involves exposure to videos explaining how misinformation techniques work.  They have even devised an online game called "Bad News," which helps people see, in a gamified form, how powerful misinformation techniques can be.  This technique is similar to the best ideas from CBT (cognitive-behavioural therapy): the therapy requires exposure, following informed consent, to the heart of the problem, in order to overcome it and be protected from future adversity.  

 3)  Massive campaign to provide information & counter misinformation

According to MIT post-doc Ben Tappin, people on opposite sides of an ideological divide are still persuadable by reasoned argument. A major reason ideological divides produce such tenacious resistance to change is that ingroups are rarely exposed to such reasoned argument, while being heavily exposed to false arguments. Therefore, it makes sense to keep up our efforts to provide accurate information, and not give up or become resigned, believing that anti-vaxxers or other ingroup members are not persuadable.

We cannot only have health experts, such as government health officers, speaking to the public. Many anti-vax people will not be persuaded at all by a public leader. We need to have spokespeople in the information campaign representing ingroups associated with the anti-vax movement. We need to have right-wing political leaders, religious leaders, celebrities, sports stars, people with different levels of education, and people from different employment groups, all involved in this marketing and information campaign.

We specifically need to hear from people who were formerly part of the anti-vax movement, who have changed their mind. We need to hear directly from people who are severely ill in hospital, preferably with video.

Consideration should be given to prosecution of those spreading misinformation.

4) Addressing Myside Bias by emphasizing unity

Polarized differences and myside bias can be reduced by emphasizing that we are all on the "same team."  Such a principle of unity, of treating all fellow humans with equal respect, regardless of nationality or culture, is consistent with the laws or constitutions of most countries, and with the principles of most religions.   We all want to prosper, to live a healthy, happy life, to live in peace, and to have freedom while also acknowledging the need for cooperation.   We have a shared humanity despite the existence of national or cultural borders.   Unity between divided groups often improves if the groups need to work together to solve a larger external threat.  In this sense, it is unfortunate that the pandemic (a worldwide threat) has not led to a greater degree of unity between nations.  Environmental degradation is another massive looming threat of this kind.  Disagreement about the nature of such threats, and about the type of action needed, is part of the problem.  

5)  Friendship, Diplomacy, and Trade between opponents

We should strive to develop friendships and trade relationships between members of opposing groups.  Steven Pinker emphasizes this point in his book on the history of violence in society, The Better Angels of Our Nature: Why Violence Has Declined.

This principle could be objected to, using extreme examples: most of us would not consider it appropriate or helpful to have cultivated friendships with Nazis during World War II. But most members of opposing groups are not extremists; they are moderates. It is necessary to denounce extremism, but this does not mean denouncing almost half of the entire population on the other side of an ideological divide. If there is to be anyone influencing or learning from each other, there has to be ongoing friendship. 

6)  Experiential Education

 Direct experiential education is extremely important. People need to take tours through overflowing intensive care units, meet the burned out but highly compassionate and expert staff, and be aware of the suffering patients there. There will be many patients who are members of their very own ingroup. I think this will be very persuasive, but this has barely been done at all during the pandemic. Of course, there are technical, ethical, and privacy-related barriers to having such tours, but these barriers could be overcome with good planning. At the very least, there should be embedded journalists in these environments, just as embedded journalists have been allowed access to war zones.

7) Vaccine Education

Specific education about vaccine-preventable diseases (such as polio, measles, or smallpox) is important and helpful. Many people don't understand how severe these diseases were, and how remarkably effective vaccines have been to spare hundreds of millions of people (mostly children) terrible suffering and death.

Specific education about how vaccines work is important. Many people simply do not know these things. 

8) Ingroup leaders as educators and influencers

Members of ingroups (most likely moderates) will be much more influential as sources of education and information, than members of outgroups, who will most likely be dismissed if they are even heard at all.

In the case of the pandemic, encouragement of vaccination from religious leaders and right-wing moderates will be useful to persuade others in this community to be vaccinated. Leaders of these ingroups must denounce extremism and violent behaviour. 

9) Emphasis on underlying values

The importance of emphasizing underlying values is a point made by Haidt. People on the right-wing of the political spectrum tend to value loyalty, family, and purity. Issues such as environmental protection and vaccination are consistent with values of loyalty and purity. It is loyalty to country, loyalty to one's own children (looking after their present and future well-being, enjoyment, and prosperity), and loyalty to God (who would want to care for all people, to encourage peace on earth, to care for the place we live, and to help people help one another). The idea of purity is well-served by plans to protect the environment and to protect the body from a devastating infection. 

10) Stop funding propaganda outlets

Steps should be taken by individuals and corporations to stop financial support for propaganda outlets, and to support independent, unbiased journalism. In general, we would not want our news sources to be influenced by wealthy donors or political parties. 

11) Beware of partisan "think tanks"

Good investigative journalism is needed to show financial and political influences coming from partisan think tanks and corporate lobby groups. I hope that if people can become more aware of these issues, there could be organized efforts to oppose such groups, and legislation to limit their power.

National security efforts are critical, to prevent other nations from contributing to propaganda and extremist groups in our countries. Investigative journalism is essential, as is monitoring of "bots" and fake social media accounts, etc. Government action is likely to be necessary. 

12) Reduce social media polarization

Steps should be taken, on a personal and political level, to reduce the tendency for social media platforms such as Twitter and Facebook to produce "news bubbles" and to foment division or extremism. This could involve persuading social media companies (through individual and government intervention) to adjust the algorithms on their sites, to help reduce exposure to extremist positions or false information, and to help "fact check." On a personal level, one of the options is simply to reduce or stop using social media. 

13) Psychiatric techniques

As a psychiatrist, I find it is often impossible to challenge entrenched biases with a patient unless there is a very strong therapeutic alliance, rapport, and trust. Even then, the amount of change to expect is very limited and slow, especially in the short term.

It is possible to encourage education, to help patients expand their horizons a little bit.

If there are low-grade psychotic symptoms underlying belief in conspiracy theories, an antipsychotic medication could be useful, but most people with this issue would not be willing to try this.

If past trauma or adversity is driving involvement with conspiracy theories or destructive ingroup behaviour, then compassionate, empathic trauma-informed treatment could be helpful.

Cognitive-behavioural therapy (CBT), in principle, could help people to recognize and change cognitive distortions or biases, but the nature of longstanding ideological bias is less amenable to change, in part due to a lack of insight on the part of those having these problems, and in part due to powerful resistances to change that people have developed over a lifetime, maintained or magnified by like-minded family and peers. 

Motivational interviewing is another set of techniques that can be useful for engaging with someone having problems due to polarization, conspiracy theories, ideological propaganda, or anti-vax ideas. This is a style of therapy used to help people with addictions. Its foundation has to do with acknowledging a spectrum of insight and willingness or readiness to change in people with addictive problems, and with matching the treatment to the level of readiness. While motivational interviewing is a suitable therapeutic style, it is also to some degree a pretty obvious, common-sensical approach. In any case, I encourage checking out a workbook about motivational interviewing, or some YouTube videos teaching the basics. 

14) Empathy with honesty 

In a conversation or debate with a person espousing a conspiracy theory or following some type of propaganda, empathy is needed for the conversation to continue. In conversing with someone who has a delusional belief, it is important that the person you're talking to knows your honest position on the issue, and knows that you are prepared to back up your position with good evidence, but it is essential that you show understanding of their feelings about the matter, and that the discussion does not deteriorate into a shouting match or into personal attacks.

As stated in number 8) above, it could be useful in a debate or conversation with conspiracy theorists, anti-vaxxers, etc. to find examples of prominent people within their ingroups who have changed their mind and moderated their position, while still endorsing and supporting the ingroup. This could include examples of politicians, religious leaders, and celebrities your debate partner might support or admire, who are now endorsing vaccination or other relevant policy issues.  As of December 2021, I am aware of one notorious member of this ingroup, a major U.S. political leader, who is now supporting vaccinations. 

 15) Possible need to end the conversation or relationship

Open dialogue requires safety and fairness. It is impossible to have a productive discussion with someone shouting at you, threatening you, or monopolizing the conversation. If the person you are talking to cannot behave in a physically safe and respectful manner, then it is necessary to end the discussion.

It may be necessary to end some relationships altogether, because continued contact may prove to be too aggravating and stressful over time, distracting us from more positive and helpful engagements or relationships. But if the conversation or relationship does end, I encourage people to remain polite, gentle, and civil, maybe with the possibility of re-establishing the relationship in the future, if the situation improves.  

16) Social Pressure & celebrity influence

It can be helpful to make use of media to show that public health measures such as vaccination & mask usage--and environmental measures such as recycling, reducing carbon emissions, and ecological protection--are attractive, fashionable, and cool. Conversely, media can help show that being an anti-vaxxer or a polluter is very unattractive. This type of work could involve the help of celebrities, sports stars, "influencers" and models.

17) Justice

In order to deal with con artists or fraud, we usually need to involve the criminal justice system. For a person who willfully neglects safety behaviour and causes harm to others, there should be legal consequences.  For example, almost everyone, regardless of political orientation, would agree that we should prosecute drunk drivers, especially if they harm someone on the road. Rehabilitative treatment should be offered as well, for example to treat alcoholism.  There is a continuum in our society between debatable contrarian opinion on one side, and fraud and frank propaganda on the other.  Obviously we should not suppress contrarian opinion using the power of the legal system, but there should be consideration of prosecution in cases where frank lies are causing substantial harm to individual and public health, especially when the perpetrators are profiting financially.  

For con artists who are successfully prosecuted, it can often be the case that the victims who were conned, sometimes leading to severe financial or physical harm, will still insist that they were not victims at all. They may continue to support the con artist even after prosecution and conviction. Such is the tenacious power of people's need to "save face" -- admitting they were conned by someone they and their family and friends have admired for years as a hero can be embarrassing and humiliating. In order to make this process easier, it is necessary for fellow con victims to come forward and admit the truth. We see a few examples (though not enough) of this happening with previous supporters of a well-known political leader since 2016, which hopefully will lead the way to broader positive change. 

18) Be politically involved!  Vote! 

Some extremist or fanatical groups have been organizing protests, frightening and obstructing health care workers and patients at hospitals in recent days.  Others are threatening the foundations of democracy itself.  

It is necessary to become more politically aware and involved. In an age where democracy itself is under threat, it is essential to use your right to vote, and to help & encourage others to vote as well. If people become so discouraged or cynical about the present state of affairs that they don't even bother to vote, then our nation's and our world's problems will be dealt with by people who are very ill-equipped to solve them. 


Selected Readings & References

Armstrong, Karen. The Battle for God: A history of fundamentalism. (2001)

Brashier, N. M., Pennycook, G., Berinsky, A. J., & Rand, D. G. (2021). Timing matters when correcting fake news. Proceedings of the National Academy of Sciences, 118(5).

Bergstrom, C. and West, J. Calling Bullshit: The Art of Skepticism in a Data-Driven World (2020). 

Briant, Emma Louise (2015). Propaganda and Counter-terrorism. Manchester: Manchester University Press. p. 9.

Christakis, Nicholas. Apollo's Arrow: The Profound and Enduring Impact of Coronavirus on the Way We Live (2020).

Dawkins, R. The God Delusion (2006).

Dawkins, R. Outgrowing God (2019). 

  (note: Dawkins is a critic of religion, but I think it is good for any religious person to understand the reasons for this; I mention these books here because they address the subject of how people come to form extremely strong and often irrational ideological positions, and how people can move away from this, while gaining some education about basic science)

Douglas, K. M. (2021). COVID-19 conspiracy theories. Group Processes & Intergroup Relations, 24(2), 270-275.

Douglas, Karen et al, Understanding Conspiracy Theories. Political Psychology 40, Suppl. 1, 2019

Epstein, Z., Berinsky, A. J., Cole, R., Gully, A., Pennycook, G., & Rand, D. G. (2021). Developing an accuracy-prompt toolkit to reduce COVID-19 misinformation online. Harvard Kennedy School Misinformation Review.

Haidt, J. The Righteous Mind: Why Good People are Divided by Politics and Religion (2012).

Harrington, B. (2012). The sociology of financial fraud. In The Oxford handbook of the sociology of finance.

https://theconversation.com/why-people-believe-in-conspiracy-theories-and-how-to-change-their-minds-82514

Johnson DK et al. "Combating Vaccine Hesitancy with Vaccine-Preventable Disease Familiarization" Vaccines 2019, 7. 39

Kahneman, D. Thinking, Fast and Slow (2013).

Kelly, J. (2006). The Great Mortality: an intimate history of the Black Death.

Konnikova, M. (2016). The Confidence Game: Why We Fall for It... Every Time.

Lewandowsky, S., & Van Der Linden, S. (2021). Countering misinformation and fake news through inoculation and prebunking. European Review of Social Psychology, 1-38.

Marchlewska, M., Green, R., Cichocka, A., Molenda, Z., & Douglas, K. M. (2021). From bad to worse: Avoidance coping with stress increases conspiracy beliefs. British Journal of Social Psychology.

McRaney, David.  How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. 2022. 

Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in cognitive sciences.

Pennycook, G., McPhetres, J., Zhang, Y., Lu, J. G., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychological science, 31(7), 770-780.

Pennycook, G., McPhetres, J., Bago, B., & Rand, D. G. (2020). Predictors of attitudes and misperceptions about COVID-19 in Canada, the UK, and the USA. PsyArXiv, 10, 1-25.

Maertens, R., Roozenbeek, J., Basol, M., & van der Linden, S. (2021). Long-term effectiveness of inoculation against misinformation: Three longitudinal experiments. Journal of Experimental Psychology: Applied, 27(1), 1.

Peters, Ellen. Innumeracy in the Wild: Misunderstanding and Misusing Numbers. Oxford (2020). 

Pinker, S. The Better Angels of Our Nature: Why Violence Has Declined (2012). 

Pinker, S.  Rationality: What it is, why it seems scarce, why it matters (2021)

Pretus, C., Hamid, N., Sheikh, H., Ginges, J., Tobeña, A., Davis, R., ... & Atran, S. (2018). Neural and behavioral correlates of sacred values and vulnerability to violent extremism. Frontiers in psychology, 9, 2462.

Prum, Richard. The Evolution of Beauty: How Darwin's Forgotten Theory of Mate Choice Shapes the Animal World - and Us. Anchor (2018).

Rathje, S., Van Bavel, J. J., & van der Linden, S. (2021). Out-group animosity drives engagement on social media. Proceedings of the National Academy of Sciences, 118(26).

Rutjens et al, "Science skepticism across 24 countries."  Social Psychological and Personality Science 2021. 

Shoots-Reinhard et al. "Ability-related political polarization in the COVID-19 pandemic" Intelligence 88, 2021, 101580

Swire‐Thompson, B., Ecker, U. K., Lewandowsky, S., & Berinsky, A. J. (2020). They might be a liar but they’re my liar: Source evaluation and the prevalence of misinformation. Political Psychology, 41(1), 21-34.

Tappin, B. M. (2021, October 4). Exposure to Arguments and Evidence Changes Partisan Attitudes Even in the Face of Countervailing Leader Cues. https://doi.org/10.31234/osf.io/247bs

Taylor, S. (2019). The psychology of pandemics: Preparing for the next global outbreak of infectious disease. Cambridge Scholars Publishing

technologyreview.com/2020/07/15/1004950/how-to-talk-to-conspiracy-theorists-and-still-be-kind/

Van Bavel, J. J., Baicker, K., Boggio, P. S., Capraro, V., Cichocka, A., Cikara, M., ... & Willer, R. (2020). Using social and behavioural science to support COVID-19 pandemic response. Nature human behaviour, 4(5), 460-471.

Van der Linden, S., Panagopoulos, C., Azevedo, F., & Jost, J. T. (2021). The paranoid style in American politics revisited: An ideological asymmetry in conspiratorial thinking. Political Psychology, 42(1), 23-51.

Van Prooijen & Kuijper, "A comparison of extreme religious and political ideologies: Similar worldviews but different grievances", Personality and Individual Differences 159 (2020)

Van Prooijen & Krouwel, "Psychological features of extreme political ideologies."  Current Directions in Psychological Science 2019 28(2) 159-163. 

van Prooijen et al, "connecting the dots: Illusory pattern perception predicts belief in conspiracies and the supernatural."  Aug 21, 2017/ 

van Prooijen and Song, "The cultural dimension of intergroup conspiracy theories."  August 13, 2020. 

Zimbardo, P. (2011). The Lucifer effect: How good people turn evil. Random House.