Wednesday, February 25, 2026

The Psychology of Religion, Chapter 12: Benefits of Unfounded Belief

Another troubling angle on all of this is that people can sometimes latch onto a belief system whose claims are unfounded or causally misconceived, and yet experience real improvements they had not found otherwise. Those improvements can entrench the belief further, because the person now has what feels like lived proof: It worked for me. The mechanism is similar to the self-deception dynamic Trivers describes, and also similar to the way psychoanalysis could help some people even when much of its causal theory was exaggerated, overstated, or wrong.

For example, some people latch onto an extremely rigorous diet with a spurious rationale, and yet end up stabilizing a prior eating problem, bingeing pattern, or weight problem. Often what makes the diet “work” is not the theory, but the frame: the diet becomes a totalizing structure, a rule-set, a ritual, a commitment device, sometimes even a moral identity. Strong belief in the diet’s narrative can increase adherence—sometimes dramatically—especially when the belief is reinforced by “spiritual” practices, authoritative texts, a charismatic leader, and enthusiastic support from fellow adherents. The resulting improvement may have little to do with the supposed mechanism (“toxins,” “energy,” “impurity,” or whatever the myth is) and much to do with ordinary behavioral ingredients: reduced ultra-processed food, fewer calories, more routine, more attention to quantities and timing, and stronger social accountability. In other words, the distinguishing features of the theory may be fictional, while the behavior change is real.

This is an extension of the nonspecific-factors argument, but with an important twist: people may become more loyal to the false theory because the surrounding practices genuinely helped them. And expectancy itself matters. In psychotherapy, patients’ early outcome expectations are associated with better outcomes, and placebo research shows that ritual and a warm practitioner relationship can produce real symptom change. In some settings, even open-label placebos—interventions people are told are inert—can still help. That matters here because it shows that the human benefit does not necessarily require the theory to be true. In some cases it does not even require deception.

It is tempting to treat these forays into unfounded belief as harmless whenever they produce visible gains. But there is a dark side. Some dietary regimens are medically dangerous; some aggravate eating disorders; and some cultivate a loyalty to the framework that discourages critical thinking. Social media can intensify the problem, especially when health-focused communities reward purity, rigidity, and bodily control.

When a person’s identity becomes fused with a belief system, they may reject better treatments when those treatments are indicated—especially if a setback is interpreted as evidence of insufficient “faith,” insufficient purity, or insufficient devotion. The harms here are not merely theoretical. In patients with curable cancers, complementary medicine use has been associated with greater refusal of conventional treatment and worse survival, with the excess mortality appearing to be mediated by delay or refusal of effective care. 

These frameworks also often come packaged with community. People who join one cluster of unusual health beliefs can sometimes be pulled, by social gravity, into neighboring clusters: new spiritual doctrines, anti-vaccine attitudes, conspiratorial styles of explanation, and monetized ecosystems of coaching, supplements, retreats, and memberships. The pattern is not inevitable, but it is real enough to take seriously. 

And there is another distortion worth naming: we mostly hear from the success stories. The people whom the diet failed, harmed, or merely saddled with an expensive obsession rarely become public evangelists. The community’s narrative therefore skews toward “miracles,” while the quiet attrition and collateral damage remain largely invisible.

Finally, just as in religions, the next step is often proselytizing. People who believe they have found salvation—whether dietary, medical, or spiritual—tend to recruit. They may pressure friends and family to “convert,” and disparage outsiders as ignorant, impure, or closed-minded. In the context of fad diets and alternative medicine, that can do real harm to public health.

So the point is not that unfounded belief never “helps.” The point is that when it helps, it often does so through common human mechanisms—structure, community, meaning, identity, expectancy, and accountability—while smuggling in risks that are easy to deny and hard to reverse once the belief becomes an emblem of belonging. The deeper problem is not merely that the belief is false. It is that the felt benefit is then misread as proof that the belief was true all along.

The Psychology of Religion, Chapter 11: Evolution

Evolution deserves space here because it explains the origins and diversification of life in a way that directly contradicts literal religious or mythological accounts of creation. It is certainly possible to remain religious while accepting evolution—many people do—but for some believers, evolutionary biology feels like an unacceptable affront to faith, because it replaces a story of intentional design with a story of natural processes unfolding over vast time.

There are experts in evolutionary science who can explain this far better than I can. Still, I want to set aside space for it in my own voice, because the basic logic is not hard to understand, the evidence is overwhelming, and the emotional resistance to it often has very little to do with evidence and a great deal to do with identity, belonging, and sacred narrative—topics I have already been discussing.

What follows is a tour through the core mechanism of natural selection, a few common misunderstandings, the idea of speciation, and then several side corridors that matter for this essay: cultural evolution, sexual selection, and the uncomfortable fact that even religiosity itself is shaped not only by culture, but also by temperament, inheritance, and biology.

Selection

Natural selection is the central guiding principle of evolutionary theory. The logic is profoundly simple. It requires only that we accept three basic facts:

1) Organisms vary (physically, physiologically, behaviorally).

2) Some variation is heritable (traits are influenced by DNA, even though environment matters enormously too).

3) Some traits affect reproductive success—not in a morally loaded sense of “deserving,” but in the literal sense that some variants leave more surviving offspring than others.

If a heritable trait increases the probability of leaving more surviving offspring in a particular environment, then over generations the population will contain more of that trait. If a trait reduces reproductive success, it tends to diminish. That’s it: natural selection is differential survival and reproduction acting on inherited variation, repeated over time.

You can see the basic logic everywhere, from selective breeding in crops and animals ("artificial selection"), to antibiotic resistance in microbes, to the obvious family resemblance in both physical and psychological traits among human relatives. None of this requires the belief that genes are the only cause of traits. It requires only the admission that heredity is a major contributor. A point worth emphasizing, because it is often misunderstood, is that natural selection is not about “improvement” in any moral or progressive sense. It is simply a filter that favors whatever works well enough in a local environment at a given time.
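Because the whole argument is just differential reproduction iterated over generations, it can be sketched in a few lines of code. This is a toy deterministic model, not population-genetics software; the numbers (a 5% reproductive advantage, a 10% starting frequency, 200 generations) are invented purely for illustration.

```python
def simulate_selection(p0=0.1, fitness_a=1.05, fitness_b=1.0, generations=200):
    """Toy model: frequency of a heritable variant A whose carriers leave
    slightly more offspring. Deterministic update each generation; no
    drift, no mutation. All parameter values are illustrative."""
    p = p0
    for _ in range(generations):
        mean_fitness = p * fitness_a + (1 - p) * fitness_b
        p = p * fitness_a / mean_fitness  # A's share of the next generation
    return p
```

Even this crude sketch shows the qualitative point: a small but consistent advantage compounds until the variant dominates the population, while zero advantage leaves the frequency exactly where it started.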

Mutation

DNA replication is highly accurate, but not perfect. Across generations there are small changes—mutations—introduced into genetic material. “Mutation” here does not mean “bad.” It simply means “change.” Most mutations are neutral, many are harmful, and a few are beneficial in a given environment. Sexual reproduction adds still more variation by shuffling existing variants into new combinations, so evolution does not work only with brand-new mutations but also with new mixes of old genetic material. 

Mutations do not happen because the organism needs them. A bacterium under stress does not somehow produce the exact helpful mutation it would most like to have. At the deepest level, whether a particular copying error happens in a particular cell at a particular moment is still a fundamentally accidental event. But that does not mean every part of the genome is equally likely to change. Some kinds of DNA changes happen more often than others, and some parts of the genome have a higher probability of experiencing mutations than others. Mutation is fundamentally random in origin, but statistically biased in pattern, and those biases affect what raw material natural selection gets to work with. 

When small genetic changes happen to produce a trait that improves survival or reproduction in a particular environment—say, a slightly different beak shape that lets a bird get food more efficiently—those variants will, on average, become more common. They become more common for a very simple reason: the creatures carrying the useful variant survive better and leave more offspring. Over many generations, the accumulation of such changes can produce substantial transformations, including changes in complex organs and behaviors. Darwin’s finches in the Galápagos remain a famous entry point into this idea because long-term field work and later genetic work showed how ecological pressures can shape beak traits across populations. 

The Time Scale of Evolution

One reason evolution feels counterintuitive is that large organisms reproduce slowly relative to a human life. Big evolutionary changes can take thousands or millions of years, just as major geological or astronomical processes do. We do not watch a canyon form in a single afternoon, and we do not watch a star age in real time, but the evidence for those processes is still decisive. Evolution is similar: long processes are inferred from converging lines of evidence. Fossils are one line of evidence—imperfect, but immensely powerful. Fossilization is rare and biased toward certain environments and tissues, so the record will always be incomplete. Still, the overall pattern—order in time, branching diversification, and transitional forms—fits evolutionary predictions very well. Comparative anatomy, fossils, biogeography, embryology, and genetics all point to the same branching story. 

Common Ancestry and Cousins

A cliché misunderstanding goes like this: “Evolution says humans descended from chimpanzees.” That is not what evolutionary biology claims. Humans and chimpanzees are not ancestor and descendant; they are cousins. And “cousin” is not just a metaphor here. It is literally the right genealogical idea: two living lineages sharing an older common ancestor. In ordinary family language, we talk about first cousins or second cousins. Chimpanzees are obviously not that kind of close cousin; if one insisted on stretching the family language to absurd distances, they would be something like our three-hundred-thousandth cousins. Current genetic evidence places the human–chimpanzee split on the order of six million years ago.

The same logic generalizes outward. Every living thing on Earth is, in the broad genealogical sense, our cousin. For many people, that is not depressing at all. It is a source of awe—a kind of cosmic kinship. If a reader wants a visual sense of this, it is worth looking up one of the modern phylogenetic tree charts online, such as OneZoom. The exact dates of particular branching points are still being refined, but the branching structure itself is already known very well. And the scale is staggering. Humans and mice last shared a common ancestor around 90 million years ago. Mammals and birds last shared a common ancestor on the order of 310 million years ago. Humans and fish share a common ancestor on the order of 450 million years ago. The exact numbers continue to be refined, but we are talking about tens to hundreds of millions of years, not a few thousand.

Speciation

Over the short term, evolution often looks like shifting trait frequencies within a species. Over the long term, divergence can accumulate until populations become reproductively isolated—meaning they can no longer interbreed successfully under natural conditions. That is speciation.

Speciation is not always a sharp on/off switch. In nature it often behaves more like a continuum. The Ensatina salamander example is a real case, and it is useful because it makes that idea vivid. Around California’s Central Valley, populations of these salamanders spread in a rough loop. Neighboring populations along the loop can still interbreed with the populations beside them. But by the time the two far ends of the loop meet again in Southern California, they have changed enough that they no longer interbreed successfully with each other, or do so only rarely. A simple analogy would be a circle of people, each speaking a dialect just a little different from the dialect of the person next to them. Each person can understand the neighbors on either side, but the person across the circle sounds incomprehensible. Ring species work something like that, except the differences are genetic and reproductive rather than linguistic. That is the point: species boundaries can emerge gradually, not by magic and not all at once.

One thing I want to state explicitly, because people often get confused here: even if a behavior or trait has evolved, that does not mean it is morally right, or that we should accept it. Evolution describes how traits spread. It does not tell us what we ought to value.

Compromise, Not Perfect Design

Evolution also does not produce perfect design. It modifies what already exists. That is why living bodies so often look less like clean engineering and more like a history of workable compromise. Humans are full of such compromises. In adult humans, the passage for food and the passage for air share anatomy in the throat, which is one reason choking is even possible. Our spines and backs also show the costs of walking upright. The human spine has an S-shape that helps us balance over our hips and walk efficiently on two legs, but it also turns a structure inherited from four-legged ancestors into a vertically loaded column. The result is chronic stress on the lower back, high rates of disc degeneration, herniated discs, sciatica, and persistent back pain. Human childbirth is unusually difficult because pelvic form, fetal growth, and the evolution of large brains create a tight compromise. It is not as though the body is simply poorly designed. The point is that evolution works with inherited materials and trade-offs, not with fresh blueprints. 

The vertebrate eye is another striking example. The retina is effectively wired backward: light has to pass through layers of neural tissue before it reaches the photoreceptors. Where the optic nerve exits the eye there is a literal blind spot, because that patch contains no photoreceptors. And because light must travel through the retinal layers first, the vertebrate eye needs compensatory tricks just to reduce scattering and preserve image quality. Müller cells act like optical fibers, guiding light through this awkward arrangement. The design also forces a trade-off between a clear optical path and the blood supply the retina needs. The system works, but it is hardly what a tidy engineer would draw from scratch. It also comes with structural vulnerabilities: the retina can detach, and when it does, vision can be permanently damaged. The octopus, whose camera eye evolved independently, ended up with a more direct setup. Its retina is everted rather than inverted, so the nerve fibers are routed behind the photoreceptors and there is no comparable blind spot.

Our jaws and teeth tell a similar story. Over time, human faces and jaws have become smaller without tooth size shrinking in perfect proportion. The result is a mismatch between tooth size and available arch space. Softer and more processed diets appear to worsen the problem by reducing the chewing demands that help stimulate jaw growth. So many modern humans grow jaws that are simply too small for the full dental package they still carry, leading to orthodontic problems, crooked teeth, and impacted wisdom teeth. 

The broader point for me is simple: evolved traits are not always morally admirable or functionally ideal. Some have to be restrained, some have to be accommodated, and some have to be counterbalanced by culture. If we want to become more humane, we need rules, norms, education, medicine, and institutions that limit certain inherited tendencies, compensate for the problems they create, and cultivate better ones.

Cultural Evolution

A parallel kind of evolutionary logic shows up in culture. Dawkins coined the term “meme” for cultural units that replicate, but the broader point matters more than the label. Ideas, phrases, rituals, fashions, and institutions can spread, vary, compete, split, and disappear.

Language evolution is a very good example. Languages branch and drift. Over time, groups can become mutually unintelligible: they become different language “species.” Linguists can reconstruct family trees and infer common ancestors in ways that are strikingly analogous to biology. Proto-Indo-European—the distant ancestor of languages as varied as English, German, Greek, Russian, Persian, and Hindi—existed about six to eight thousand years ago. English, German, and Dutch are much closer cousins inside the Germanic family, with a common ancestor a little over two thousand years ago. French, Spanish, Italian, Portuguese, and Romanian are cousin languages that began diverging from Latin roughly fifteen hundred to two thousand years ago. And the same logic extends beyond Europe. Mandarin and Cantonese are also cousins. Their shared recognizable ancestor lies in Middle Chinese, roughly a thousand years in the past. What begins as a dialect can, given enough time and separation, harden into a clearly distinct language.

And language divergence does not always require millennia. Some changes become obvious within just a few centuries. Afrikaans, for example, diverged from colonial Dutch over roughly the last three to four hundred years, following Dutch settlement at the Cape in 1652. Even before a speech form is officially labeled a separate language, strong regional varieties can become difficult for outsiders to follow. There are accents within our own country that many of us struggle to understand. Mutual intelligibility can erode gradually long before anyone agrees on where to draw the boundary.

Religions also behave this way. Doctrines split. Schisms occur. New denominations form. We see this clearly in Christianity: the East–West Schism of 1054 formalized the great split between Roman Catholic and Eastern Orthodox Christianity; the Reformation is conventionally dated to 1517; and those Protestant branches splintered further into Lutherans, Calvinists, Anabaptists, Anglicans, Methodists, Baptists, Pentecostals, and countless smaller groups. This fragmentation is not just ancient history. In the United Methodist Church, more than 7,600 U.S. congregations left between 2019 and the end of 2023, roughly a quarter of the denomination’s earlier U.S. total. Buddhism diversified too. The older matrix was early Buddhism. Theravada traces itself to one early line of that tradition, while Mahayana arose later as another major descendant branch around the beginning of the Common Era.  Islam experienced its defining Sunni–Shia split in the struggles over succession after Muhammad’s death. The family-tree metaphor is not perfect, but it is illuminating: religions do not descend from heaven as finished products. They branch, drift, quarrel, and split inside history.

Psychologically, that has an important implication: people often treat their own local, historically contingent version of a faith as if it were timeless and universal, when in reality it carries the fingerprints of geography, conflict, institutions, inheritance, and politics.

Sexual Selection

Another important evolutionary idea is sexual selection: traits can spread not because they help survival directly, but because they affect mating success. The peacock’s tail is the classic case—beautiful, costly, cumbersome, and yet selected because it becomes desirable within the mating preferences of the species. Darwin understood that biology is not only about staying alive long enough to reproduce; it is also about courtship, display, preference, and attraction. Richard Prum has argued, persuasively in my view, that sexual selection can include a genuinely aesthetic component: preferences themselves can become evolutionary forces, and traits can spread because they are found attractive, not merely because they advertise some practical advantage. 

This is relevant to humans not because we are peacocks, but because it reminds us that biology is not only grim survival calculus. Human mate choice also attends to looks, voice, movement, confidence, style, humor, conversation, and perhaps aspects of intelligence itself. Some theorists—notably Geoffrey Miller—have argued that traits such as humor, creativity, artistry, music, and parts of intelligence may have been shaped at least partly by sexual selection because they function as displays: they can signal mental agility, creativity, or the ability to hold another person’s attention. The evidence here is mixed and the details are debated, but the underlying idea is serious and worth considering.

Temperament, Inheritance, and Religion

Religiosity itself is not only cultural. Which particular religious group a person belongs to is largely a cultural and family-transmitted matter. But the broader tendency to be religious at all—to find religion important, compelling, consoling, or identity-defining—shows a meaningful inherited component. Twin research repeatedly suggests that adult religiosity has a moderate hereditary component, though the size depends a great deal on what exactly is being measured. Broadly speaking, estimates often land anywhere from the high 20s to the low 60s, in percentage terms. More outward or socially enforced measures tend to sit lower, while more inward, identity-heavy, or conversion-like dimensions can sit higher. And when researchers say that some dimension of religiosity has a heritability of 60%, they do not mean that 60% of one person’s religion is “caused by genes.” They mean that, in the population being studied, about 60% of the variability between people on that trait is statistically associated with genetic variability.
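To make the variance-between-people reading concrete, here is a toy numerical sketch. It simply builds each simulated person’s trait score as the sum of an independent “genetic” and “environmental” component with known variances, so the answer is baked in by construction. Real twin studies estimate these shares indirectly, from identical- versus fraternal-twin correlations, not like this; the 0.6/0.4 split below is an invented illustration.

```python
import random

random.seed(0)
n = 100_000

# Illustration only: trait variance is constructed as 0.6 "genetic" plus
# 0.4 "environmental", so heritability should come out near 0.6.
genetic = [random.gauss(0, 0.6 ** 0.5) for _ in range(n)]
environmental = [random.gauss(0, 0.4 ** 0.5) for _ in range(n)]
phenotype = [g + e for g, e in zip(genetic, environmental)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Heritability: the share of between-person trait variance that tracks
# the genetic component -- close to 0.6 here, by construction.
h2 = variance(genetic) / variance(phenotype)
```

Note what the number is and is not: it says nothing about how much of any one individual’s religiosity is “genetic”; it describes where the spread between people comes from in this particular population.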

It is also worth noting that intelligence and religiosity show a modest negative association on average in meta-analytic work. This is not a claim about any individual believer—many brilliant people are religious. The more useful point is simply that cognitive style varies. Some people are more comfortable with analytical doubt and ambiguity; others are more drawn to certainty, authority, and communal reinforcement. 

A trait dimension that is relevant here is schizotypy: a spectrum involving unusual perceptions, magical ideas, and strong pattern-finding. Most religious people are not schizotypal. But a person who is more prone to unusual experiences and to seeing hidden connections may be more likely to interpret inner events as messages, revelations, or signs coming from outside the self. Schizotypy itself also appears to be substantially heritable, often in roughly the 30% to 50% range. And there is repeated evidence suggesting that higher levels of schizotypic traits—especially the unusual-experiences side—are associated with stronger paranormal beliefs.

Moral psychology belongs here too. Some people are more temperamentally drawn to moral themes like loyalty, authority, and purity; others prioritize harm reduction and fairness more strongly. These inclinations are also significantly heritable.  Religions, especially organized and more traditional ones, tend to be associated with stronger emphasis on loyalty, authority, and purity, with less emphasis on harm reduction and fairness.  That is one reason some people feel deeply at home in religious cultures while others experience them as alienating. 

Conclusion

Once a person really absorbs the logic and evidence for evolution, it becomes difficult to look at literal creation myths in the same way again. For many people, that shift does not drain the world of meaning. It opens the door to a deeper, steadier awe: reverence for reality as it actually is, not as we once wished it to be.

The Psychology of Religion, Chapter 10: Reading List & Discussion

Here is a list of recommended books, with discussion:


1. The God Delusion, by Richard Dawkins.

An absolutely devastating critique of religious belief from a scientific point of view, with a particular focus on the harmful aspects of religion. Dawkins is at his best when he is clear, fierce, and empirically grounded. But he sometimes falls short in affirming the psychological and sociocultural causes and benefits of religion—benefits that can exist even when the supernatural claims are false.


2. Other books by Richard Dawkins, including The Ancestor’s Tale and The Selfish Gene.

These are excellent introductions to genetics in general and evolutionary biology in particular, with wonderful case studies highlighting specific organisms and biological systems. For example, it is incredibly interesting to understand the genetics of bee reproduction and how this influences the behavior and social organization of bees. The classic “birds and bees” talk really should be updated to include this.


These books also showcase Dawkins as one of the great science communicators, and they demonstrate that evolutionary biology can be wonderfully interesting. It has always bothered me that university students can obtain science degrees—sometimes even in biology—without ever reading books like these. Another very good book about the history and texture of genetics is She Has Her Mother’s Laugh, by Carl Zimmer.


3. Religion Explained, by Pascal Boyer.

This is one of the most important books for the whole thesis of the present book. Boyer is not mainly asking whether religion is true; he is asking why certain kinds of religious ideas are so easy for human minds to generate, remember, and transmit. Gods, spirits, ancestors, invisible agents, taboos, and rituals do not have to be imported into human culture from some supernatural realm. They arise very naturally from ordinary cognitive tendencies—our readiness to detect agency, infer intentions, remember striking stories, monitor threats, and think about social obligations. Religion, in this view, is not evidence of revelation. It is evidence of the structure of the human mind.  


4. In Gods We Trust, by Scott Atran.

A very important book in the cognitive and evolutionary study of religion. Atran helps show that religion is not simply a “belief system” in the abstract, but something that grows out of ordinary human cognition and then becomes woven into kinship, coalition, morality, identity, and emotional life. This is helpful because many religious people imagine skepticism as though it were merely a refusal to accept a few doctrines. But religion is usually much more socially and psychologically embedded than that.


Atran is denser and more scholarly than some of the other authors on this list, but he is worth the effort. He helps explain why religion is persistent, why it is so often resistant to contradiction, and why sacred ideas can motivate extraordinary loyalty and sacrifice.  


5. Big Gods, by Ara Norenzayan.

A very useful book for thinking about the social success of religion. Norenzayan explores the idea that belief in morally interested, watchful supernatural beings may have helped large societies sustain cooperation among strangers. Even if one grants this in part, it does not rescue the truth of the supernatural claims. It simply shows that false beliefs can sometimes have functional social effects. That distinction is central to the argument of this book: a belief can be adaptive, comforting, socially useful, and still not be true.


One can debate the exact historical sequencing of all this, but the core psychological and sociological point is extremely useful. If people feel watched, judged, and morally monitored—even by an invisible being—they may behave differently. Religion may therefore function, in some settings, as a kind of supernatural social surveillance system.  


6. Modes of Religiosity, by Harvey Whitehouse.

Especially useful for thinking about ritual. Whitehouse argues that religions often cluster around two broad patterns. In one, there are rare, emotionally intense, highly memorable rituals that bind small groups very tightly. In the other, there are frequent, repetitive, lower-arousal practices that support larger, more orderly, more bureaucratic communities. Once one sees this, many religious differences stop looking mysterious. They look more like different cultural technologies for shaping memory, commitment, group identity, and social structure.


It is particularly relevant to the chapters of the present book about sacrifice, prayer, behavioral restrictions, shepherding, and common knowledge. It gives a very helpful vocabulary for thinking about why certain rituals are dramatic and painful while others are routine and repetitive. Both can serve group cohesion, but in very different ways.


7. When God Talks Back, by Tanya Luhrmann.

One of the best books for understanding how religious experience becomes subjectively real. Luhrmann is not trying to prove that God exists; she is trying to understand how people come to experience God as vivid, intimate, personal, and responsive. This is a very important distinction. If one wants to understand religion properly, it is not enough to say that believers are mistaken. One must also understand how prayer, inner speech, imagination, attention, social reinforcement, and expectation can combine so that an invisible relationship feels emotionally real.


This book is especially valuable because it helps one understand believers without condescension. It is possible to reject the supernatural claims while still taking the psychology seriously. In fact, if one is going to criticize religion, one should understand it well enough to see how normal and human many of its experiences actually are.  


8. The Better Angels of Our Nature, by Steven Pinker.

A brilliant, ambitious book about violence through history and why it has declined over long time scales. Pinker explores multiple drivers—state formation, commerce, literacy, cosmopolitanism, and what he calls an “escalator of reason,” strongly associated with Enlightenment-style thinking.


This is relevant to religion not only because religion is one factor among many in the history of violence, but because Pinker directly challenges the idea that religion has been a uniquely reliable force for peace or moral progress. He gives religion partial credit in some cases, but he also points to crusades, inquisitions, witch hunts, wars of religion, sanctified cruelty, slavery defended from scripture, and the recurrent use of sacred certainty to intensify conflict. The broader point is not that religion causes all violence. It is that religion is not a dependable moral technology that reliably makes societies kinder or safer. When religion does align itself with humane progress, that alignment is not always intrinsic to the supernatural doctrine itself; often it is belated, partial, and entangled with wider secular developments. Some historians dispute parts of Pinker’s data and causal interpretation, but the book is still extraordinarily important and stimulating.  


9. The Folly of Fools, by Robert Trivers—especially the chapter on religion.

Trivers, one of the great evolutionary biologists of the past generation, argues that deception can confer survival advantages in the natural world. He goes further: the most effective deception often requires some degree of self-deception, because sincere belief makes the performance more convincing.

Applied to religion, the idea is that collective self-deception can bring real psychological comfort and social cohesion, while also creating vulnerability to manipulation. Leaders can channel the group toward intergroup conflict, persecution of outsiders, or financial exploitation—bolstered by moral certainty and a belief that “God is on our side.”

A related point about religion and self-deception concerns the afterlife. A soldier who no longer fears death because of fervent belief in Heaven may be more willing to fight; and, at worst, may feel fewer qualms about killing or about the suffering of civilians, if the entire moral narrative has been reframed as divinely endorsed. The same logic can be extended far beyond the battlefield. Once earthly suffering is downgraded as temporary, and eternal reward is made the real currency of value, almost any cruelty can be redescribed as necessary, justified, or trivial in the larger cosmic accounting.

10. Books about cognitive biases, such as Thinking, Fast and Slow by Daniel Kahneman.

This is relevant to religion because many religious beliefs are stabilized by well-described biases. Ingroup loyalty can produce “belief bubbles,” in which people preferentially consume ideas that support their faith and avoid exposure to ideas that challenge it. Beliefs become intertwined with identity and group safety, so a logical challenge can feel like an attack rather than a discussion. Some other relevant biases and reasoning errors include the following:

Ad hominem attacks against people who challenge religion. A famous target was Charles Darwin—a gentle, humble, family-oriented man, with a patient style of meticulously gathering and weighing evidence—who was subjected to attacks on his character, integrity, and motives as a way of discrediting his ideas. Another variant is to label religious critics with a term meant to be derogatory, such as “liberal.” For some people, the term “liberal” carries an almost venomous loathing, as though this were one of the worst things a person could be.

Reactive devaluation. Following ad hominem attacks, there is the further step of dismissing evidence not because of its quality, but because one does not like the person expressing it. Conversely, evidence is embraced irrespective of its quality because one likes the messenger, knows the messenger well, or feels loyalty to that person because of past positive experiences.

The availability cascade. Various religious ideas can seem more believable or persuasive simply because of repeated exposure, sometimes over a lifetime. These ideas are easier to call to mind, more salient in memory, and this can fool people into thinking they are more accurate or more obviously true, irrespective of evidence.

Confirmation bias. This is when one looks for evidence in a biased way, collecting individual items that support a previous position, while not attending to—or even being aware of—evidence against that position.

Anchoring. This is the tendency for judgments to remain biased toward an initial reference point. One becomes “rooted” to a baseline stance—in this case, religious belief—and later judgments stay skewed toward that starting point even in the face of dissonant evidence.

The sunk cost fallacy. This is sticking to an original position even when there is strong evidence against it, with the reasoning that one has invested so much time, effort, identity, and devotion into the belief that it feels like too much of a loss to let it go.


11. Books about tribal psychology, such as The Power of Us, by Jay Van Bavel and Dominic Packer.

One of the core drivers of religion is human tribal psychology. Tribalism is an innate human tendency to form groups we value and protect—almost always at the expense of outsiders. The origin stories of many religions contain an implicit tribalism: one chosen group receives the “true” revelation, while the rest of humanity is left out unless it is successfully converted.


Even the best missionary efforts cannot reach everyone—and historically there have been delays of centuries or millennia—so the theological structure often implies that billions of people are excluded from salvation or relegated to punishment for reasons having nothing to do with character. This contradicts the spirit of justice and universal benevolence many religions claim to endorse. Many texts also describe divinely endorsed violence against outgroups—neighbouring tribes, rival cities, entire peoples. Oddly, the divine help rarely involves settling disputes peacefully.


12. Joseph Campbell, especially The Power of Myth and Myths to Live By.

These were favorites of mine in young adulthood, though they can feel a little dated now. Campbell was a great storyteller with a strong interest in comparative mythology. He saw myths as sources of poetic insight about history, humanity, and morality—insights that evaporate if you insist on literalism.


After reading Campbell, I came to see that “it’s just a myth” doesn’t have to be insulting. A myth is not a historical account, but it is a portrait of a culture and its evolving moral imagination. Of course, myths are also edited over generations and often carry ideological agendas—sometimes to justify the power structures of the day. Taking a myth literally is like watching a great movie and then treating it as a documentary and instruction manual, policing behaviour by quoting isolated lines of dialogue, while denouncing all other films as blasphemy. We easily read Greek mythology this way, as story rather than history, but many people refuse to extend the same reading to modern mythologies.


13. Determined, by Robert Sapolsky.

Sapolsky marshals a mountain of evidence that behaviour has many deterministic causes: genetic influences over long time scales, brain changes due to childhood experience, hormonal fluctuations, and immediate environmental conditions. “Free will” is at minimum far less free than most people assume, and for some individuals—given their biology and life history—following certain moral rules will be far harder than for others.


This connects to a classic problem in religious dogma: the tension between an all-knowing, all-powerful deity and meaningful human freedom. If a deity knows the entire future with certainty, it begins to resemble a fixed script—God watching a movie whose ending was known all along, including who ends up rewarded or punished. A religious apologist may hand-wave and say that human logic does not apply to divine matters, but once you do that consistently, you also weaken the very logic by which the religion argues for itself.


14. The Battle for God, by Karen Armstrong.

A history of fundamentalism that is very engaging and full of illuminating case examples. Armstrong’s key point is not simply that fundamentalists are ignorant or backward. Her point is that fundamentalism is, in important ways, a modern reaction to modernity—a kind of militant piety or panic in response to rapid, disorienting social change. She emphasizes the clash between older religious ways of making meaning and a new world dominated by scientific rationality, secular institutions, modern states, and aggressive cultural change. She is also very good on the way fundamentalist movements often turn religion into a quasi-rational system of certainties, as though sacred myth had to be defended in the language of science, politics, and ideology.  


This is particularly relevant to Darwin, modern science, and women’s rights. In the Protestant world especially, fundamentalism became deeply bound up with resistance to biblical criticism, Darwinian evolution, liberal theology, and secular education. Later, these same movements often fused with backlash against women’s rights, sexual liberalization, abortion rights, and other social changes seen as attacks on a divinely ordained hierarchy. When a belief system is predicated on absolute certainty and a rigid, divinely ordained order, a ballot box for a woman, a secular classroom, or an evolutionary biology textbook can feel not like progress but like an existential threat to the cosmic order. Armstrong is especially useful because she shows that fundamentalism is not a timeless, “pure” form of faith. It is a highly modern response to the painful transformation of modern life.  


15. The philosophical works of Bertrand Russell, particularly those in which he discusses religion.

Russell remains one of the clearest philosophical critics of religion. He is especially good at exposing bad arguments that have survived mainly because they are old, emotionally resonant, or repeated with confidence. There is also something refreshing about reading someone who is sharp, logical, unsentimental, and completely unintimidated by sacred tradition.


16. Works reviewing controlled studies on paranormal claims—psychic ability, ghosts, miracles, and related subjects.

This includes James Randi’s work over many years, and skeptically oriented psychologists such as Richard Wiseman, who has written and spoken extensively about how “paranormal” experiences can arise from priming, suggestion, environmental factors, memory distortion, coincidence, and cultural expectation.


The fairest claim here is not that every study has always been negative, but that after decades of investigation, paranormal claims have not produced a robust, reliably replicable body of evidence. Many “hits” are better explained by coincidence, selection effects (remembering the hits and forgetting the misses), motivated interpretation, and the cognitive biases that flourish in emotionally charged settings.


17. Books about evolutionary psychology, such as Spent, by Geoffrey Miller.

Useful for thinking about status, mating displays, consumption, and the ways our evolved social psychology can drive behaviour that we later dress up in rational language. Human beings are very good at post hoc moralizing and self-justification. Religion often provides an especially elegant vocabulary for motives that are much more ancient and much less noble than they appear on the surface.


18. Scholars in archaeology, Ancient Near East history, the history of religious texts, and philosophy.

Look for scholars with strong credentials and serious methods, as you would in any discipline. Bart Ehrman is an excellent place to start. One should be acquainted with scholarship on the origins and editorial histories of religious texts: multiple versions, translation issues, theological agendas, and the ways texts absorb and transform motifs across cultures. Richard Elliott Friedman, Israel Finkelstein, Neil Silberman, and Mark S. Smith are all very useful in this regard.


A caution: as with many polarized topics, there are also plenty of apologists—sometimes persuasive, sometimes contemptuous of contrary evidence—who can reproduce the very dynamics they claim to be rising above.


19. Astronomy Today and other good introductory astronomy texts.

An introductory astronomy textbook is genuinely thrilling—even just aesthetically. I think everyone should understand how planets, stars, and galaxies form, what they are made of, the time scales involved, and the astonishing reasoning that has helped us understand the universe.


A modern cosmic perspective does more than show that many old cosmologies were factually wrong. It shows how provincial they were. Many religious worldviews were formed in a pre-Copernican mental universe. Once one has learned even a little astronomy, it becomes much harder to take ancient sky-centered metaphors as literal descriptions of reality.


This list is obviously selective. But these are among the books and research traditions I would most want a reader to have in mind while reading the rest of this book.


Tuesday, February 24, 2026

The Psychology of Religion, Chapter 9: Video Recommendations

To explore the evidence behind my main thesis, I have to defer to people who are masters of their respective fields. I’m not a specialist in genetics, geology, astronomy, physics, or history—and I don’t want to pose as one. Still, with many complex topics it helps to have some working understanding across multiple domains, because reality doesn’t come neatly divided into academic departments.

So here is a starter video list—meant less as a syllabus than as an invitation to curiosity:

1. A very approachable place to start is simply to watch nature documentaries. David Attenborough is, in my view, among the greatest nature documentarians in history. One can also see that Attenborough is a great human being—gentle, wise, kind, caring—and almost everyone, regardless of religious or political leanings, would surely appreciate him. The BBC Planet Earth series is a good gateway:

Planet Earth
Planet Earth II
Planet Earth III


And separately (not BBC): David Attenborough: A Life on Our Planet (2020).

These films can begin with simple appreciation of the wonder of the natural world—then, if you’re willing, they also confront something darker: predation, starvation, disease, and the high baseline suffering in wild ecosystems. These documentaries also provide a good foundation for understanding evolution as the process that has shaped the history of life, rather than the work of a guiding divine hand. Increasingly, they point as well to the scale of human-caused ecological damage—habitat loss, pollution, and the accelerating loss of biodiversity.

For people raised in literalist traditions, geology is often the first immovable wall of evidence: the Earth is old—billions of years old. A clear and enjoyable entry point is the work of geologist Iain Stewart, who has presented excellent television introductions to Earth’s history and processes, for example his 2008 geology series Earth: The Biography (the title under which it was released in some markets). This matters here because many forms of dogmatic faith make specific claims about origins and timescales, for example that the Earth is only a few thousand years old, and such claims simply do not survive contact with the evidence.


2.  Cosmos—the original series with Carl Sagan (1980), and the modern reboot with Neil deGrasse Tyson (Cosmos: A Spacetime Odyssey)—is a beautiful introduction to astronomy and to the history of scientific discovery. The central lesson isn’t that “science has all the answers.” It’s that science has developed methods for correcting itself, revising its claims when new evidence arrives, and building an increasingly coherent picture of nature—methods that look very different from dogma.


3. Alice Roberts’ 2009 documentary The Incredible Human Journey is a vivid, evidence-focused account of human origins and migration. It tells the story of humans emerging in Africa hundreds of thousands of years ago, and then spreading across the world over long stretches of time. It’s “hands on” in the best way: bones, artifacts, genetics, geography—real evidence you can actually reason about. A similar, more recent documentary, produced in 2025, is Human, a five-part series hosted by paleoanthropologist Ella Al-Shamahi; it argues that a defining feature of humanity is the capacity for communicating abstract thought, and it invites reflection on how symbolic practices—including religion and sacrifice—became important elements in the development of human culture.


4. The Cambridge historian Christopher Clark has made serious history accessible in lecture and documentary form, especially in his series The Story of Europe, beginning with Origins and Identity. Some of this can be found on YouTube. I recommend watching serious history partly because it inoculates against simplistic religious apologetics. Every major religion has at times been entangled with education, social organization, and cultural development. But history also forces us to look directly at atrocities, wars, persecutions, and massacres carried out under religious banners—including conflicts between rival branches of the same religion.


5. PBS’s Evolution (2001), narrated by Liam Neeson, is a solid place to begin learning about evolutionary science. The documentary is dated now in production style, and much of evolutionary biology has advanced dramatically since 2001—especially with the explosion of genetic evidence. But it still introduces the central logic clearly, and it’s hard to overstate how overwhelmingly strong the evidence is. Understanding evolution need not “dampen morale” any more than understanding that the Earth revolves around the sun does. It’s simply a lucid way of seeing how biological systems actually work.

A very good follow-up here is PBS’s Your Inner Fish (2014), based on Neil Shubin’s work. It is especially good because it links fossils, embryology, genetics, and the odd quirks of human anatomy in a vivid, understandable way. The basic theme is that many parts of the human body—our limbs, necks, lungs, and even aspects of our hands—make much more sense when you see them as products of deep evolutionary history rather than as pristine design.

A small rhetorical critique, though: documentaries sometimes lapse into personification—phrases like “nature wants” or “evolution tinkers.” This is just figurative language, but it can confuse a literal-minded viewer into imagining a conscious agent. Evolution is not a being; it is a process. Nature doesn’t “decide.” Things happen because systems have certain constraints, causal mechanisms, and regularities—and those regularities can be studied.


6. Documentaries such as Into the Universe with Stephen Hawking (2010) are an accessible entry point into questions about the origins and fate of the universe. If you’re drawn in, it becomes worth learning at least the basic conceptual outlines of cosmology, relativity and quantum mechanics—not to become a physicist, but to appreciate what modern science has learned about time, matter, and causation.

This list is not meant to “replace” religion with documentaries. It’s meant to give readers a way to encounter the natural world and human history directly—through disciplines that are constrained by evidence, and that openly correct themselves when they’re wrong. If my broader argument is that dogma fails under scrutiny, then the honest next step is to offer people good places to do that scrutiny.


The Psychology of Religion, Chapter 8: Charlatanism

The word charlatanism sounds harsh, but I think it is sometimes the right word. By charlatanism, I mean the presentation of exaggerated or false claims of special spiritual, prophetic, healing, or paranormal authority—especially when these claims are used to gain trust, money, status, or obedience. I do not mean that everyone in these roles is a deliberate fraud. In many cases the line between fraud and self-deception is blurry: some people are consciously manipulative, while others sincerely believe in powers or insights that are not actually there. The effect on followers can be similar either way.

There are many examples of charlatanism in religious history and in the wider history of spiritual practice. Over the years, some highly visible church movements and leaders have been exposed for deceiving followers—manufacturing moral authority, staging spiritual “results,” and in some cases enriching themselves dramatically through offerings and donations. Outside formal religion, there are also psychics and fortune-tellers who make strong claims about paranormal abilities that they cannot substantiate. Yet even here, the picture is not uniform. Some may sincerely believe in what they are doing, and some—whatever their beliefs—can still offer genuinely helpful human wisdom, sometimes resembling a perceptive psychotherapist. Once again, this is often a frame issue: if there is a setting in which a perceptive person pays close attention to a needy and trusting client, many helpful interactions can occur, including occasional insights that feel “predictive,” even when no paranormal or spiritual mechanism is involved.

With regard to psychics and fortune-tellers, much of what feels uncanny in these settings is better explained by ordinary psychology. Some “predictions” rely on cold reading—careful observation of verbal and non-verbal cues, strategic ambiguity, and gentle probing—combined with the Forer (Barnum) effect, in which feedback is so broad that it could apply to almost anyone, yet is delivered in a way that feels intimate and precisely tailored. In a sense, the client supplies the specificity while the psychic supplies the theatre.

Ironically, a kind of “faith” in the mechanism can make the experience more powerful. If you believe in psychic powers, you will likely be more open, more trusting, more suggestible, and more motivated to find meaning in what is said. This can make the encounter feel transformative—while still having nothing to do with paranormal abilities.

On the evidence: it is tempting to say that careful research on parapsychological phenomena has always been negative. A more precise—and still unsparing—statement is that after decades of investigation, these claims have not produced a robust, independently replicated body of evidence that would justify belief in paranormal powers. There are occasional studies that report statistically significant results, but these effects tend to be small, fragile, and disputed, and they do not survive replication under tighter controls (better blinding, preregistration, fixed stopping rules, and independent labs). Most apparent “successes,” in practice, are better explained by coincidence, selection effects (remembering “hits,” forgetting “misses”), motivated interpretation, and the cognitive biases that flourish in emotionally charged settings.

I am aware, too, of some influential figures in the “new age” / self-help spiritual milieu who, as people, have had a genuinely delightful, warm, and thoughtful style. Louise Hay is an example. Many of her self-help affirmations are beautiful—arguably a more poetic and intimate cousin of cognitive-behavioural therapy (CBT). One shortcoming of how CBT is often presented is its cool, mechanistic tone, and the affirmations approach can feel refreshingly humane. So I do sometimes encourage patients to use affirmations.

But alongside the affirmations, this same genre often carries dogmas about disease causation—claims that illnesses are produced by emotional states like resentment, criticism, or guilt, and that changing one’s attitude can dissolve even severe disease. Versions of this claim have been widely quoted from the preface of a best-selling affirmations text, and they are not just scientifically implausible—they are ethically hazardous, because they imply that people with tragic illnesses are partly to blame for having the “wrong” emotional life. Even when there is a kernel of truth (stress matters; psychology affects coping and health behaviour), this is a massive distortion of complex causation.

Most importantly, these beliefs become dangerous when they delay or obstruct timely evidence-based care. A “spiritual” frame that provides comfort and meaning is one thing; a causal dogma that misleads people away from effective medical treatment is another.

A related issue is accountability. In medicine and licensed psychotherapy, there are training standards, ethical codes, professional regulation, and at least some recourse when someone harms you. Spiritual markets are much murkier: the more grandiose the claims (“I can see your future,” “I can heal your cancer,” “the universe told me”), the less often there is oversight commensurate with the potential harm. The result is a predictable asymmetry: vulnerable people—often frightened, grieving, or desperate—are asked for trust, money, and obedience, in exchange for claims that are difficult to test and easy to excuse away when they fail.

And we should not flatter ourselves that education inoculates against this. Even very intelligent people can be drawn into false frameworks when the framework meets a psychological need: relief from uncertainty, the soothing of grief, a sense of control, a narrative that restores meaning, or simply the comfort of being seen. In fact, verbal intelligence can sometimes make the problem worse: it supplies better rhetoric to defend the belief, better stories to rationalize disappointment, and sharper arguments to dismiss critics as “closed-minded.” The vulnerability here is not stupidity—it is humanity, under stress, doing what human minds do best: turning ambiguous experience into a story that feels coherent and safe.