Monday, May 23, 2022

The Elephant in the Brain & The Folly of Fools

Two more books to recommend:


The Folly of Fools (2011) by Robert Trivers and The Elephant in the Brain (2018) by Kevin Simler & Robin Hanson are both about the human tendency to engage in deception: not only the deliberate deception of others, but also the deception of self.  

Trivers approaches this issue from the point of view of genetics (he was the first to characterize the evolutionary biology of reciprocal altruism).  The capacity to deceive can be beneficial to survival, as we see in many species of animals, and in many human examples.  But such deception only works up to a certain point: an equilibrium frequency beyond which the strategy fails.  If deception were too frequent, the evolved strategies to counter deception would render it ineffective.  Similarly, cheating can be an evolved strategy, but if it occurs too frequently in a population, widespread awareness and countermeasures would make it stop paying off, as the sketch below illustrates.  
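
To make the equilibrium idea concrete, here is a minimal sketch in Python (my own illustration, not a model from either book), assuming cheating yields a fixed benefit b while its cost rises in proportion to the cheater frequency p, as detection and countermeasures spread through the population:

```python
# Frequency-dependent selection on a "cheater" strategy, in the spirit of
# Trivers' equilibrium argument.  Assumed payoffs (hypothetical, for
# illustration only): cheaters earn 1 + b - c*p, honest individuals earn 1.

def step(p, b=0.3, c=0.9):
    """One generation of replicator dynamics for cheater frequency p."""
    w_cheat = 1.0 + b - c * p    # cheating pays less as cheaters become common
    w_honest = 1.0               # honest payoff held constant for simplicity
    w_mean = p * w_cheat + (1 - p) * w_honest
    return p * w_cheat / w_mean  # cheaters reproduce in proportion to payoff

p = 0.01                         # start with cheating rare
for generation in range(200):
    p = step(p)

print(f"cheater frequency after 200 generations: {p:.3f}")
print(f"analytic equilibrium b/c: {0.3 / 0.9:.3f}")
```

With these assumed payoffs, the cheater frequency settles at b/c (about 0.33 here): cheating persists in the population, but only up to the frequency at which it pays no better than honesty.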

Trivers goes on to argue that self-deception is an advanced deceptive strategy: the capacity to deceive others effectively is enhanced if we can deceive ourselves.  If you REALLY believe you can win a fight (despite poor objective evidence), you are more likely to convince your opponent that you can win; a convinced opponent may back down, so you are more likely to actually win, even if you utterly lack fighting skills. 

Unfortunately, self-deception leads to many serious problems in society.  Trivers works through many examples, showing that horrible accidents, wars, biased research, and religious phenomena are often driven by self-deceptive factors that end up causing disastrous results.  

His chapter "Religion and Self-Deception" is particularly recommended.  

While I consider this book important and highly recommend it, I found its reasoning often quite informal, punctuated by forays into humour.  This can be a problem when he wanders into areas (politics, wars, and religion, for example) about which many people are sensitive or easily offended; some sections of this book are bound to cause offence.  


The Elephant in the Brain is a remarkable review of ideas from social psychology and behavioural economics.  There is influence from Kahneman, Trivers (The Folly of Fools is referenced), Haidt, and many other leaders in this area of research over the past decades.  I find it astounding that these two authors, who are not specialists in these fields, produced such a comprehensive and compelling summary of the research.  

The thesis of this book is that humans have a powerful motive to signal membership in groups.  The tendency to form ingroups is a powerful human trait, evolved over millions of years.  Group membership allows us to trust and collaborate with fellow members, for safety, defence, maintaining a food supply, dealing with illness, finding a mate, and raising children.  But unfortunately, this tendency can become such a powerful motivation, often operating without our awareness, that it overwhelms reason, fosters needless and often terrible conflict with outgroup members, and becomes destructive or at least inefficient.  And the phenomenon tends to perpetuate itself, since members of ingroups (be they political, religious, or cultural) tend to socialize, mate, and have children with fellow ingroup members.  

They refer to Bryan Caplan's argument about education: much of schooling provides only an indirect signal of skill or competence.  Most people rarely, if ever, use the subject matter they learned in university in the work they do afterwards.  Instead, the degree and grades serve mainly as a competitive signal to employers about the capacity to do the work, to conform reliably to demands, and so on.  I have reviewed Caplan's book elsewhere (I have some disagreements with it).  

The authors show that political and religious membership have powerful ingroup effects.  The tendency to form strong beliefs about elements of religious doctrine can be understood as a badge of group membership; if one can engage in successful self-deception about these doctrinal elements, the badge is all the more effective.  The beliefs become shibboleths that allow a feeling of trust with co-believers, and a sense of distrust or frank dislike of outsiders.  Such belief systems can develop independently of rational moral reasoning.  While all religious systems contain positive insights about morality (e.g. "love your neighbour as yourself", "blessed are the meek", "blessed are the peacemakers", "judge not lest ye be judged", "do unto others as you would have them do unto you", etc.), the moral prominence of these beautiful insights is often lost in a cloud of doctrine that becomes more about maintaining an emblem of group involvement, an "us" vs. "them" mentality.  This mentality is a manifestation of an evolved trait pushing all humans towards group involvement and the formation of local communities in which we can feel trust and belonging, but with the unfortunate consequence of creating outgroups whom we do not trust, and whom we treat with less positivity, warmth, and generosity.  

The same phenomena occur in political beliefs.  While there can be core rational beliefs about positions on a political spectrum -- preferred economic strategy, international affairs, management of public works, etc. -- a great deal of political engagement consists of doctrinaire beliefs that are badges of group membership, and which have nothing to do with any understanding of policy.  Most people do not even know exactly what policy positions the candidates they vote for hold.  Many others support their ingroup's politicians even though the associated policies would harm them economically or socially.  We have tragically seen this happen during the pandemic.  Extreme beliefs about vaccines, masks, etc. became emblems of political group membership; many people made decisions about these issues not because of rational evidence (which strongly supported vaccine and mask use, for the protection of everyone's health, including the anti-vaxxers' own), but because of the beliefs of fellow ingroup members in political or religious factions.  Masks and vaccines have almost nothing to do with religion or politics -- they are simply common-sense public health measures -- but once they became badges of group involvement, the issue spiralled into disaster, to the detriment of everyone.  This is an extreme version of the phenomenon shown in the famous children's studies, in which kids randomly assigned shirts of different colours end up forming hostile, opposed ingroups.  During the pandemic, a great deal of anti-vax belief was driven by factors akin to having a different shirt colour, simply to mark difference from an opposing outgroup.  

In both books, reference is made to psychiatric theory as an example of self-deception.  Psychoanalytic theory is basically a set of ideas akin to religious doctrine, with a strong ingroup community of "believers" who discuss psychiatric issues through the lens of a theoretical system that is mostly fictional.  As with religions, there are core beliefs in psychoanalysis which reflect deep insight and wisdom.  For example, the idea of psychological defences came from psychoanalysis, and is ironically an insight into the human tendency towards self-deception, with the implication that we should try to become aware of our defences and be able to set them aside.  Similar warnings about self-deception can be found in religious texts.  But most of psychoanalytic theory is arbitrary, based on bizarre inferences made from case reports, coloured by the already biased opinions of the therapists.  Yet as with religious practices, much of the therapeutic value in psychoanalysis has nothing to do with the literal belief system; it has to do with the practice itself.  Visiting a trusted minister or priest, who would most likely be kind, gentle, understanding, supportive, and wise, could be a wonderfully healthy practice, as could a meditative practice of daily prayer, or visiting a congregation of loving friends.  These healthy and possibly healing effects would occur regardless of the belief system held by the group.  Similarly, the practice of psychoanalysis (or psychodynamic therapy more generally) involves frequent visits with a wise, compassionate, gentle, kind therapist who probably has some useful feedback about life problems, and there would be a healing effect of simply having a stable therapeutic relationship over a long period of time, irrespective of the fictional theoretical belief system held, such as strict Freudianism.  

While we can appreciate, and even endorse, the benefits of ingroup membership, I believe it behooves us to strive for improved rationality, to guide our knowledge and decisions so as to benefit ourselves, our neighbours, and the world in the most effective way.  Societies across the world have improved in this way over the centuries, as Steven Pinker has shown us (see Enlightenment Now), but we have a lot of work to do to continue making progress in building a just, peaceful, prosperous society.  

In both books, we are wisely cautioned to look to ourselves for our own self-deceptions.  It is another human tendency to see self-deception or folly in others, while not noticing our own.  In my case, I recognize this will be a work in progress.   I surely have beliefs or practices that are products of my ingroup or other biases; I hope that I will be able to keep working on better awareness of these issues over time, in service to my patients and to myself.   


