Wednesday, February 25, 2026

The Psychology of Religion, Chapter 16: Sacrifice

Most religions incorporate some form of sacrifice into their theology or practice. Sometimes this involves literal offerings—killing and burning animals, or destroying valuable objects. Other times it is “bloodless”: giving money, time, or obedience, or renouncing pleasures through fasting, abstinence, or celibacy. In all these cases, the underlying idea is similar: something costly is offered up, with the hope of securing meaning, favor, purity, forgiveness, protection, or communal belonging.


There are also sacrificial motifs that move disturbingly close to human sacrifice. In the Abrahamic traditions, for example, the willingness of Abraham/Ibrahim to sacrifice his son is presented as a peak test of obedience—and in Islam it is commemorated annually in Eid al-Adha, the “Festival of Sacrifice,” in which animal sacrifice functions as a memorial of that story. And in Christianity, the theme of sacrifice is carried into the central story of Jesus: a dramatic moral and symbolic reframing of sacrifice into self-sacrifice, offered “for others.”


Nor is sacrifice some oddity of the Abrahamic traditions. Across much of the ancient world, sacrificial traditions were common, and they were often brutal. Ancient Greek religion had animal sacrifice. Vedic religion in India revolved around yajña, sacrificial ritual. Ancient China too had elaborate sacrificial practices directed toward ancestors and higher powers, sometimes involving animals and at times human beings. The Aztecs are especially notorious for human sacrifice. And the roots of all this may go back shockingly far. A 2019 archaeological paper on symbolic destruction says that “the earliest evidence, dated to about 26,000 BP,” comes from Dolní Věstonice, in the form of making and then “exploding” clay figurines. If that interpretation is right, proto-sacrificial or ritually destructive behavior belongs among the earliest traces of symbolic culture that we have.


Why would an all-powerful deity, especially one associated with the highest standards of morality, want a dead animal or a burnt work of art as a gift? One might think that a god worth revering would consider it a gift if you were to help other people, or care for the natural world, rather than to destroy objects or kill things. But sacrificial systems do not usually work that way.

Reciprocity, Magical Thinking, and Social Technology


Sacrifice is, in my view, an extension of ordinary human ideas about reciprocity and gratitude—infused with magical thinking. In a community we do favors, give gifts, and care for one another. These behaviors can be altruistic, but they are also supported by norms of reciprocity. If one believes that a mystical power controls destiny, fertility, weather, health, wealth, or military success, it becomes psychologically “reasonable,” within that worldview, to give that power a gift—hoping for a return.


And once a person enters this mindset, the logic can become self-sealing. If you make sacrifices and misfortune still comes, you can conclude the offering wasn’t sufficient, wasn’t sincere enough, or wasn’t given with the right purity of heart—so you must increase it next time. If something good happens afterward, it feels like proof that the sacrifice worked, and should be repeated. In this way, practicing sacrifice can become an escalating, brutal, and destructive behavior. The sacrificed animals—often the most vulnerable and least able to “consent” to the human story being told about them—do not get much say in the matter.
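
The self-sealing structure is easy to make concrete. Below is a toy simulation of my own (an illustration, not a model from the scholarly literature) in which outcomes are pure chance, yet the asymmetric interpretation described above guarantees that the practice both survives and escalates:

```python
import random

def sacrificial_loop(trials=20, seed=1):
    """Toy model: outcomes are coin flips, independent of the offering.

    Interpretation is asymmetric: a good outcome counts as proof that
    the sacrifice worked; a bad outcome means the offering was too
    small, so the next one is larger. No outcome can ever count
    against the practice.
    """
    random.seed(seed)
    offering = 1.0        # size of the offering, in arbitrary units
    confirmations = 0     # outcomes read as "the sacrifice worked"
    for _ in range(trials):
        good_outcome = random.random() < 0.5  # pure chance; ignores the offering
        if good_outcome:
            confirmations += 1                # "proof": keep the ritual
        else:
            offering *= 1.5                   # "not enough": escalate
    return offering, confirmations

final_offering, hits = sacrificial_loop()
print(f"offering grew to {final_offering:.1f}x its original size; "
      f"{hits} of 20 outcomes were read as confirmation")
```

Chance alone supplies the “evidence,” while every failure ratchets the cost upward.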


Another motivation for sacrificial rituals likely came from the brutal necessities of ancient life: hunting animals, or killing domestic animals for food. Most humans bond with animals easily, and it would be psychologically troubling to watch an animal struggle and suffer. Ritual can function as a moral anesthetic: a way to consecrate violence, to assuage guilt, and to turn a grim necessity into a story of gratitude, order, and meaning.


Sacrifice can also be political performance. Public ritual can consolidate hierarchy, especially priestly hierarchy, display power, intensify fear, and signal unity. It is not hard to see how sacrifice functions as a kind of social technology: it makes shared belief visible and costly. It puts loyalty on display. It shows who is serious, who is obedient, who can be trusted, and who has the authority to declare what counts as holy.


This is also where sacrifice connects to group psychology. Some scholars have argued that costly rituals—things you would not do unless you were committed—operate as signals that strengthen trust and cooperation within a group, partly by filtering out free riders. A community bound together by shared sacrifice can feel safer, warmer, and more morally serious to its members. But that same mechanism can harden boundaries and intensify suspicion of outsiders.
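
A minimal sketch can make the filtering mechanism concrete (the notation here is mine, for illustration, and is not drawn from any particular model in that literature). Suppose a ritual imposes a fixed cost $c$ on anyone who performs it; long-run membership in the cooperative group is worth $V_{\text{committed}}$ to a genuinely committed member, while the short-run payoff to a free rider who joins, exploits, and leaves is $V_{\text{free rider}}$. The ritual screens out free riders whenever

$$ V_{\text{free rider}} < c < V_{\text{committed}} $$

since then only those who intend to stay and cooperate find the cost worth paying, and performing the ritual becomes a hard-to-fake sign of commitment.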


And costly sacrifice does not merely send a signal to other people; it also works on the person making the sacrifice. People are generally reluctant to admit that they have suffered for nothing. So the greater the sacrifice, the stronger the pressure to reinterpret the suffering as meaningful, noble, or necessary. That helps make sacrificial systems self-protective and self-reinforcing. The cost itself becomes part of the “evidence” that the belief must matter.

Kin Altruism


Speaking of reciprocity: favoring and helping genetic relatives, sometimes even in self-sacrificial ways, is a strongly selected trait. A trait that causes its bearer to selectively help close relatives will tend to persist in the family line, because close relatives are more likely to carry the same genes that helped produce that tendency in the first place. This is a simple evolutionary logic: kin altruism increases the survival and reproductive success of the shared family “pool,” even when it costs the individual something in the short run.
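
This logic has a standard formalization in evolutionary biology, Hamilton’s rule: a gene promoting altruistic help can spread when

$$ rB > C $$

where $r$ is the coefficient of relatedness between helper and recipient (about 1/2 for full siblings, 1/8 for first cousins), $B$ is the reproductive benefit to the recipient, and $C$ is the reproductive cost to the helper. With $r = 1/2$, helping a full sibling is favored whenever the benefit to the sibling exceeds twice the cost to the helper.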


But humans do not walk around calculating degrees of genetic relatedness. Instead, we rely on crude, fast estimates—cues that, over most of human history, were often correlated with kinship and shared ancestry. People who live near each other, marry each other, and raise children together will, over generations, tend to share not only genes but also language, accent, customs, dress, habits, and social norms. They may also tend, on average, to resemble one another physically more than they resemble people from a distant village, tribe, or lineage. Conversely, people who look different, speak differently, or practice very different customs are often from a different village, tribe, or family network, and are therefore, on average, somewhat less closely related genetically than the people who share your immediate cultural and familial world.


Similarity of appearance, familiarity of accent, shared habits, shared rituals, shared dress, and shared taboos can all become proxies—very imperfect proxies—for “one of us.” Religion gives people common dress, common restrictions, common foods, common sacrifices, common songs, common stories, and common enemies. In other words, it manufactures the feeling of kinship, even among people who are not literally kin.


The mind has evolved to be slightly more generous, trusting, and self-sacrificing toward those who are more likely to be “one of us,” and, by the same token, it may also be less generous, more suspicious, or more emotionally distant toward those who feel like “not us.” These tendencies are not destiny, and they are not moral justification—but they are part of the psychological and evolutionary foundation of prejudice. These are precisely the sorts of inherited inclinations we must learn to recognize, challenge, and actively override.


Belonging and Group Boundaries


Religion can sometimes widen the circle of felt family. But it can also strengthen the distinction between those inside the group and those outside it. Once sacrifice, loyalty, and group identity are fused together, shared customs can take on unusual emotional and moral weight, and group boundaries can begin to feel especially important. The stronger those boundaries become, the easier it is for outsiders to be viewed with suspicion, distance, or moral distrust. This does not mean religion always produces hostility, or that it does so uniquely. These are broader features of human social psychology. But religion can give them a sacred language, a ritual structure, and a greater sense of seriousness. In that way, stronger religious boundaries can contribute to increased exclusion and, in some cases, increased hostility between groups. Religion does not invent this psychology, but it can reinforce it.




The Psychology of Religion, Chapter 15: Spirituality & Superstition

Humans have cognitive tendencies that make superstitious beliefs easy to generate—and hard to extinguish. By “spirituality” here I do not mean awe, contemplation, or reverence in a broad sense; I mean the more specific belief that hidden forces—fate, synchronicity, spirits, or nonphysical “energy”—are actively guiding events. Beliefs in spirits, ghosts, magic, luck, or fate guided by mysterious forces are widespread across cultures. The specifics vary wildly from place to place—local spirits, protective rituals, sacred objects, invisible dangers—but the underlying psychological grammar is familiar.

Meaning, Pattern, and Agency

A core ingredient is pattern-seeking. The mind craves meaning, and when the world is uncertain or painful it will often manufacture meaning rather than tolerate ambiguity. This is not a sign of low intelligence; it is ordinary cognition under stress. Pattern-seeking is only part of the story: humans also readily detect agency, intuit purpose, and imagine hidden minds or forces operating behind ambiguous events. When people feel a loss of control, they become more likely to perceive patterns—even illusory ones—in the environment, and to treat coincidence as signal. Superstition can be emotionally satisfying precisely because it converts randomness into a story.

Stories, dreams, unusual experiences, and compelling anecdotes can then become socially transmissible. Once a few people begin to interpret events through a “hidden forces” framework, the framework spreads: it gives language to fear and hope, it creates a sense of specialness, and it offers the pleasure of explanatory closure. Coincidences become “signs.” Ambiguous perceptions become “messages.” A confusing life becomes a legible plot.

Beliefs about fate, synchronicity, or “good and bad energy” fit neatly into this same psychology. A person has a strong feeling—dread, relief, attraction, foreboding—and the mind is tempted to treat that feeling as information about the outer world. A difficult decision can then feel as though it has been answered by “the universe.” A coincidence becomes destiny. A run of bad luck starts to feel orchestrated. The step from “this feels meaningful” to “this is objectively meaningful” is, for many people, quite small. In cultural settings where unusual feelings are already given a supernatural vocabulary, it becomes even easier for an ordinary human experience to be interpreted as fate, guidance, or invisible force.
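
Part of what shrinks that step is an underestimate of how often striking coincidences should occur by chance alone. As a worked example, with illustrative numbers of my own choosing: suppose some uncanny event (dreaming about a person the night before they unexpectedly call, say) has a one-in-a-thousand chance on any given day. Over a thousand days, a little under three years, the probability of at least one occurrence is

$$ 1 - (1 - 0.001)^{1000} \approx 0.63 $$

so most people will eventually experience it, and across millions of people someone experiences it every day. The hits are remembered and retold; the uneventful days are not.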

Why It Can Feel Helpful

Sometimes these beliefs can even confer a short-term psychological benefit. A ritual, a talisman, or a conviction that “good energy” is on one’s side can reduce anxiety, increase confidence, and make a person feel more ready to act. In that sense, superstition can work a little like prayer, placebo, or a pre-performance routine: it changes the person’s emotional state, and that changed emotional state can sometimes improve performance or endurance. But this does not validate the supernatural explanation. It shows that belief can alter mood, attention, and confidence—not that a mystical force is operating in the background.

When Meaning Hardens into Causation

The trouble begins when a poetic or emotionally satisfying interpretation hardens into a literal theory about reality. At that point there is no longer only a harmless sense of wonder; there is a false model of causation. There is still no robust, independently replicated body of evidence that psychic forces, spirits, or nonphysical “energies” of this sort are objectively guiding events in the way believers often suppose. And there is no good reason to treat a strong feeling of destiny as evidence that destiny is real. Once such feelings are treated as evidence, judgment begins to drift away from probability, base rate, character, and practical consequence. A person may stay in a bad relationship because it feels “meant to be.” They may avoid a sound medical treatment because the illness is thought to be spiritual. They may take reckless risks because fate is presumed to be protective. Life planning becomes poorer when omens and vibes displace sober thinking about what is actually happening.

When Superstition Turns Social

There is a darker social risk as well. Once people begin to believe that invisible moral or spiritual contamination clings to persons, places, or groups, superstition can become a license for prejudice. History offers grim examples of what happens when communities weaponize these causal illusions. The early modern European witch crazes, which claimed tens of thousands of lives, were driven in large part by the urge to assign occult causality to illness, infant mortality, crop failure, or social misfortune. Medieval blood-libel accusations and later pogroms during epidemics drew on related fantasies of hidden contamination and malevolent agency. A more contemporary example can be seen in the persecution of people with albinism in parts of Sub-Saharan Africa, where witchcraft beliefs and ritual myths still endanger lives. In modern, everyday life, the seeds of this same pathology are more banal but equally insidious: a neighbor is said to have “dark energy.” A house is called cursed. A child is treated as spiritually tainted. A stranger is felt to be threatening in some occult way rather than simply unfamiliar. Once a group shares such assumptions openly, they no longer remain private quirks of interpretation; they coalesce into a moral atmosphere in which exclusion, suspicion, and even physical violence feel justified. This is how irrational belief can slide from the sanctuary of private comfort into the arena of public harm.

A Humane Response

At the same time, this topic calls for sensitivity. For the person immersed in such beliefs, the experience does not feel frivolous. It may feel visceral, self-evident, and woven into memory from early life. It may have been reinforced for years by trusted friends, family, charismatic figures, selected anecdotes, online communities, and a steady diet of “paranormal” documentaries or videos that showcase apparent hits while ignoring the endless misses. When a belief has been stabilized by familiarity, repetition, and community endorsement, challenging it can feel less like an intellectual correction than like an invalidation of lived experience. The humane response is not to mock the feeling. The feeling is real. What deserves challenge is the conclusion drawn from it.

A Psychiatric View

From a psychiatric point of view, there is also genuine individual variation in proneness to unusual, mystical, or numinous experience. Some people reliably feel awe, presence, synchronicity, and “spiritual certainty,” while others rarely do. Some people seem to have a more absorptive mind: more prone to inner vividness and felt significance. This variation is shaped by personality and temperament, by culture and reinforcement, and by biology; none of it means the experience is itself pathology. One useful but imperfect metaphor is that some minds run with higher “gain”: experience arrives vivid and compelling, but with a greater risk that noise is interpreted as signal. Salience systems in the brain are part of this story, though the biology is not reducible to dopamine alone. A related literature suggests that paranormal belief is associated, on average, with more intuitive thinking styles and some weaknesses in probabilistic thinking and analytic reasoning, though of course none of this maps neatly onto any one individual person.

Spirituality and Religion

Many members of organized religions disparage “superstition” or free-floating “spirituality.” Yet in psychological terms—at the level of cognitive ingredients—the differences are often of degree rather than kind. Organized religions tend to formalize these human tendencies into institutions: they standardize the stories, professionalize the interpreters, and link belief to group identity and obligation. “Spirituality,” in contrast, often keeps the intuitions while loosening the institutional grip. But both draw on the same human appetite for meaning, comfort, narrative, and relief from uncertainty.


The Psychology of Religion, Chapter 14: Religion as a Business

Many religions and other spiritual practices operate partly like a business. There is marketing (proselytization, outreach), branding (symbols people wear on clothing or necklaces), encouragement to be loyal to your brand, and criticism of other brands. There is also a financial commitment, organized through a formal financial structure. There is work to be done by members of this structure, with an ultimate goal—explicit or implicit—of retaining and expanding membership, eliciting volunteering efforts and financial contributions, and maintaining morale.

With some intensely tribal, high-commitment groups (fraternities are the obvious benign example, gangs the darker one), there can be an onerous initiation ritual. Social psychologists have shown that when people have to work hard, endure discomfort, or pay a steep price to join, they often become more loyal afterward—partly because the mind naturally tries to justify what it has sacrificed. Religions also commonly have initiation processes: potential members may be vetted, attend educational sessions, and then take part in some public ritual in which solemn commitments are made.

Sometimes, as with luxury business models, broad proselytization does not occur; instead, the “product” is restricted. Only a select few gain entry. In some traditions you need advanced membership—often taking years—before you are allowed to enter certain beautiful buildings such as temples, or partake in certain deep rituals. Sometimes only men are allowed into certain leadership roles or ritual spaces. These obstacles increase the allure and tend to attract people willing to contribute more commitment, time, and money. If everybody had a Rolex watch or a Gucci bag, it would cease to be as special; exclusivity is part of what makes the object feel “high-end.”

One particular feature of religion that resembles a corporate tactic is the elevation of belief alone—faith—as a key virtue. Belief without evidence is not merely tolerated; it is often praised. If a corporation could successfully propagate that idea, it would be extremely useful for marketing, since people would form loyalty to the brand without looking too closely at “reviews.” Doubt could be reframed as weakness, betrayal, or impurity. Meanwhile, “true believers” are rewarded: their status, trust, and esteem in the community rise in proportion to their loyalty.

In many cases religious institutions amass vast wealth: in property, buildings, and investments. In at least some prominent modern examples, credible reporting and public filings have described religious investment holdings on the order of tens of billions of dollars, with wider claims in some cases exceeding $100 billion—figures that are difficult to reconcile with the ordinary believer’s image of humble spiritual stewardship. And these structures often operate with significant tax advantages. In the United States, churches are generally treated as tax-exempt. In Canada, registered charities (including many religious organizations) are exempt from paying income tax while registered. 

And yet, some of the most insightful cautions about wealth come from within religion itself. One of the sharpest is the line attributed to Jesus (present in all three Synoptic Gospels): “it is easier for a camel to go through the eye of a needle than for a rich man to enter the kingdom of God.”

The Psychology of Religion, Chapter 13: To Be a Scholar, You Had to Study Theology

For long stretches of European history, many able people who wanted to study the biggest questions ended up studying religion. That was not always because religion had the best answers. Often it was because religious institutions controlled much of the schooling, the books, and the road to public influence. If you wanted literacy, training, status, or a respected voice in your community, religion was often the main gate you had to pass through.

Before the printing press, and before secular universities became common, the Church often controlled many of the material conditions of scholarship as well: the copying of books, access to manuscripts, and much of the economic support that allowed a person to read, write, and think rather than spend life in manual labor. In practice, that often meant that to be a scholar was to be funded, housed, trained, or at least tolerated by a religious institution. That inevitably shaped what could be said, what could be explored, and how far a person could go.

This created a built-in bias. It was not just that highly capable people happened to like theology. It was that theology sat near the center of educated life. If you wanted mentors, libraries, credentials, or a place to teach and write, you often had to work inside a religious setting, or at least learn to speak its language. In that kind of system, religion could borrow prestige from the educated people who passed through it.

That borrowed prestige mattered. When people saw a brilliant, educated, generous person who was also a priest, minister, or theologian, they could easily draw the wrong lesson: if someone this thoughtful believes it, the religion itself must be true. But that does not follow. A person can be wise, morally serious, and deeply useful to society while still believing things that are false. Learning and talent do not make a doctrine true.

There was another pressure as well. In many periods, if you were a serious thinker and openly challenged the religious system, you were not just risking an argument. You could lose your position, your audience, your safety, or even your life. So the historical record is not a fair contest in which every idea had the same chance to survive. People who stayed within accepted limits were more likely to keep teaching, keep publishing, and keep being remembered.

Galileo is a clear example. He used a telescope to study the sky and argued publicly that Earth moved around the sun. Today that sounds ordinary. In his time it crossed powerful religious limits. He was put on trial in 1633 and spent the rest of his life under house arrest. Whatever one thinks about all the details of the case, one lesson is plain: a scholar’s standing and safety could depend on not going too far beyond approved belief.

Giordano Bruno shows a somewhat different version of the same pattern. He put forward bold ideas about the universe and about religion itself, and authorities judged some of those ideas unacceptable. He was executed in Rome in 1600. The point is not to turn him into a simple hero of science. The point is that religious power helped set the boundaries of acceptable thought, and crossing those boundaries could be deadly.

And this pressure was not limited to astronomy. Reformers, translators, and other dissidents were also punished severely in different times and places. Even when the issue was not a new scientific discovery, the deeper conflict was often the same: who gets to define truth, and what happens to you if you say otherwise in public?

Europe was not unique in this broad pattern. In the Middle East and the wider Islamic world, serious learning often grew around mosques, religious schools, and scholars of law and scripture. In China, higher learning and public advancement often depended on mastery of the Confucian classics and success in the imperial examination system. In India too, advanced learning often grew around religious traditions, temple settings, monasteries, and learned priestly circles. The details differed from place to place, but the larger point remains: when one tradition controls the road to education and status, its ideas can start to look far more intellectually established than they really are.

To be fair, religious institutions also preserved and passed on learning in many eras. They copied books, trained students, and helped keep scholarship alive. Several well-known universities began in religious settings or still carry that identity. So the story is not simple. But that is exactly why the bias is easy to miss. When the same institution both protects learning and sets the limits of belief, educated people will naturally be overrepresented inside that system.

You can still see a milder version of this today. Many excellent private schools, colleges, and universities are sponsored by religious organizations and maintain very high academic standards. Some of them also require students to take courses in religion, attend chapel, or absorb a broader moral and intellectual outlook shaped by a faith tradition. That is no knock on their academic quality. But it does mean that strong education and religious commitment can be packaged together in a way that makes the religion seem more intellectually confirmed than it really is.

There is a second modern echo too. In some religious schools, seminaries, tightly bound communities, and other closed systems, career paths and social acceptance can still depend on affirming the right beliefs. The penalties are usually softer now—not execution, but loss of job, loss of status, loss of community, loss of belonging. The basic pattern is similar: a belief system becomes part of the ticket into a valued world. And once again, this can make it look as though the best minds support the belief, when some critics have left, have been forced to stay quiet, or have been pushed out.

So the old link between theology and scholarship was often not straightforward proof that theology was true. Much of it grew out of history, institutional power, and control over the means of education. For long stretches of time, if you wanted a serious education, you often had to study religion; and if you wanted to keep your place in public intellectual life, you often had to stay inside religion’s limits. That alone can make religion look intellectually stronger than the evidence for its literal claims really is.



The Psychology of Religion, Chapter 12: Benefits of Unfounded Belief

Another troubling angle on all of this is that people can sometimes latch onto a belief system whose claims are unfounded or causally misconceived, and yet experience real improvements they had not found otherwise. Those improvements can entrench the belief further, because the person now has what feels like lived proof: It worked for me. The mechanism is similar to the self-deception dynamic Trivers describes, and also similar to the way psychoanalysis could help some people even when much of its causal theory was exaggerated, overstated, or wrong.

For example, some people latch onto an extremely rigorous diet with a spurious rationale, and yet end up stabilizing a prior eating problem, bingeing pattern, or weight problem. Often what makes the diet “work” is not the theory, but the frame: the diet becomes a totalizing structure, a rule-set, a ritual, a commitment device, sometimes even a moral identity. Strong belief in the diet’s narrative can increase adherence—sometimes dramatically—especially when the belief is reinforced by “spiritual” practices, authoritative texts, a charismatic leader, and enthusiastic support from fellow adherents. The resulting improvement may have little to do with the supposed mechanism (“toxins,” “energy,” “impurity,” or whatever the myth is) and much to do with ordinary behavioral ingredients: reduced ultra-processed food, fewer calories, more routine, more attention to quantities and timing, and stronger social accountability. In other words, the distinguishing features of the theory may be fictional, while the behavior change is real.

This is an extension of the nonspecific-factors argument, but with an important twist: people may become more loyal to the false theory because the surrounding practices genuinely helped them. And expectancy itself matters. In psychotherapy, patients’ early outcome expectations are associated with better outcomes, and placebo research shows that ritual and a warm practitioner relationship can produce real symptom change. In some settings, even open-label placebos—interventions people are told are inert—can still help. That matters here because it shows that the human benefit does not necessarily require the theory to be true. In some cases it does not even require deception.

It is tempting to treat these forays into unfounded belief as harmless whenever they produce visible gains. But there is a dark side. Some dietary regimens are medically dangerous; some aggravate eating disorders; and some cultivate a loyalty to the framework that discourages critical thinking. Social media can intensify the problem, especially when health-focused communities reward purity, rigidity, and bodily control.

When a person’s identity becomes fused with a belief system, they may reject better treatments when those treatments are indicated—especially if a setback is interpreted as evidence of insufficient “faith,” insufficient purity, or insufficient devotion. The harms here are not merely theoretical. In patients with curable cancers, complementary medicine use has been associated with greater refusal of conventional treatment and worse survival, with the excess mortality appearing to be mediated by delay or refusal of effective care. 

These frameworks also often come packaged with community. People who join one cluster of unusual health beliefs can sometimes be pulled, by social gravity, into neighboring clusters: new spiritual doctrines, anti-vaccine attitudes, conspiratorial styles of explanation, and monetized ecosystems of coaching, supplements, retreats, and memberships. The pattern is not inevitable, but it is real enough to take seriously. 

And there is another distortion worth naming: we mostly hear from the success stories. The people for whom the diet failed, harmed them, or simply became an expensive obsession rarely become public evangelists. The community’s narrative therefore becomes skewed toward “miracles,” while the quiet attrition and collateral damage remain largely invisible.

Finally, just as in religions, the next step is often proselytizing. People who believe they have found salvation—whether dietary, medical, or spiritual—tend to recruit. They may pressure friends and family to “convert,” and disparage outsiders as ignorant, impure, or closed-minded. In the context of fad diets and alternative medicine, that can do real harm to public health.

So the point is not that unfounded belief never “helps.” The point is that when it helps, it often does so through common human mechanisms—structure, community, meaning, identity, expectancy, and accountability—while smuggling in risks that are easy to deny and hard to reverse once the belief becomes an emblem of belonging. The deeper problem is not merely that the belief is false. It is that the felt benefit is then misread as proof that the belief was true all along.