Saturday, May 16, 2026

Chapter 29: The Psychology of Cults

Few words in religion are as morally loaded as the word “cult.” This term usually refers to a strange, intense, controlling group with a charismatic leader and loyal followers who appear to have surrendered their independent judgment. But in scholarship, the word is more complicated. Sociologists of religion have often preferred terms such as “new religious movement,” partly because “cult” has become so pejorative that it can stop careful thinking before it begins.

Still, I do not think the word should be abandoned entirely. There really are destructive cults. There really are groups in which people are manipulated, isolated, humiliated, exploited, terrified, financially drained, sexually abused, or induced to surrender ordinary moral judgment to a leader or ideology. The term is useful if it points toward a pattern of coercive group control. It is much less useful if it simply means “a religion I dislike.”

A destructive cult is a group in which a leader, ideology, or inner circle becomes so authoritative that members are pressured to subordinate conscience, evidence, relationships, money, sexuality, bodily autonomy, and ordinary freedom to the group’s demands. The content may be religious, political, therapeutic, commercial, conspiratorial, or self-help oriented. The key issue is not whether the group believes in God, aliens, enlightenment, revolution, purity, healing, or business success. The key issue is whether the group uses those beliefs to control people.

Some groups have mild cult-like features: excessive admiration of a leader, hostility to critics, ritualized confession, pressure to donate, suspicion of outsiders. Others become severely cultic: members are isolated from family, discouraged from outside education, punished for doubt, pressured into sexual access to leaders, or threatened with damnation, exile, violence, or cosmic catastrophe if they leave.

Almost every intense human group has some version of the ingredients. A military unit, fraternity, elite sports team, political movement, therapy community, monastic order, or religious congregation may require sacrifice, discipline, shared language, and strong group identity. That alone does not make it a cult. A demanding group can be healthy if participation is voluntary, leaders are accountable, criticism is allowed, members can leave without catastrophic punishment, finances are transparent, and outsiders are not automatically treated as enemies. The danger begins when loyalty becomes more important than truth, obedience more important than conscience, and belonging more important than freedom.

Examples of Cults, and the Problem of the Label


The most infamous cult examples are now part of public memory. Jonestown, led by Jim Jones and the Peoples Temple, ended in 1978 with more than 900 deaths in Guyana, including many children. Heaven’s Gate ended in 1997 when 39 members died by suicide, linked to beliefs about a UFO associated with the Hale-Bopp comet. Aum Shinrikyo, a Japanese apocalyptic movement, carried out the 1995 Tokyo subway sarin attack, killing 13 people and injuring thousands. The Branch Davidians, an offshoot of the Davidian Seventh-day Adventist tradition, became internationally known after the 1993 Waco siege, which ended with nearly 80 deaths.

These examples shape public imagination, but they can also mislead us. Most high-control groups do not end in mass death. Most do not become global news. Many simply create years of quiet suffering: estrangement from family, lost education, sexual shame, financial exploitation, fear of leaving, and a long aftermath of confusion.

There are also groups that were once called cults—or sects, heresies, fanatical movements, or dangerous outsiders—that later became accepted, or at least normalized. Early Christianity itself was viewed by many Roman observers as a strange and deviant sect. Protestants were once heretics to Catholics; Catholics were regarded with paranoia in many Protestant countries, and many Protestants to this day consider Catholicism a cult. Quakers, Methodists, Shakers, Mormons, Seventh-day Adventists, Jehovah’s Witnesses, Christian Scientists, Pentecostals, and many others have at various times been treated as dangerous outsiders by dominant religious cultures. Some of these groups remain controversial; some retain high-control features; some have liberalized; some have become legally protected, socially familiar, and in many communities quite ordinary.

This raises an uncomfortable point: “cult” is partly a social status label. A new religion that is small, poor, foreign, sexually unusual, secretive, or socially embarrassing is much more likely to be called a cult than an older religion with cathedrals, lawyers, tax status, universities, famous donors, and tasteful music. If a venerable institution teaches supernatural claims, demands money, restricts behaviour, discourages apostasy, and venerates ancient leaders, we call it a religion. If a new group does similar things with less history and less social capital, we may call it a cult. Age does not make a belief true. Respectability does not make a social structure harmless.

None of this means that religion as such is a cult. Many religious groups are open, kind, pluralistic, self-critical, and psychologically healthy, and many people participate in religion without being controlled. The more defensible point is that religion and cults share underlying mechanisms, and that the difference is often one of degree, accountability, and freedom.

How Common Are Cults?


The prevalence of cults is hard to measure, because everything depends on definition. If by cult we mean a small new religious group, there are many. If we mean a destructive high-control group, there are fewer, but still more than most people realize. If we include nonreligious groups—political extremists, conspiratorial movements, therapy cults, human-potential movements, multilevel marketing groups with quasi-spiritual rhetoric, militant ideological communities, and online radicalization clusters—the number becomes much larger.

The public tends to notice catastrophic endings, but cult-like behaviour exists in quieter forms all around us. A small church in which members are shunned for dissent can be cult-like. A self-help group that pressures vulnerable people to pay for endless levels of training can be cult-like. A political movement in which the leader cannot be criticized, facts are treated as enemy propaganda, and members are rewarded for visible loyalty can be cult-like. A family system can even become cult-like when one controlling parent defines reality for everyone else, punishes dissent, recruits allies, and frames escape as betrayal.

Historically, cult-like movements often flourish during periods of dislocation: war, migration, economic distress, rapid modernization, humiliation, family breakdown, cultural change, or institutional collapse. People seek certainty when the world becomes confusing. They seek belonging when ordinary community has failed. They seek moral clarity when ambiguity becomes exhausting. This is not a sign that people are stupid. It is a sign that human beings are social creatures whose minds are built for attachment, identity, and meaning.

Why Do People Join Cults?


A common mistake is to imagine that people join cults because they are unintelligent or gullible. That is comforting to believe, because it allows everyone else to say, “That could never happen to me.” But this is not what the evidence suggests. A review of cultic environments noted that people entering such groups do not necessarily show mental health problems, and that current members may appear well-adjusted in many respects.

The initial experience of joining a cult often feels good. It may feel wonderful. The person is lonely, grieving, uncertain, ashamed, bored, rejected, idealistic, or in a period of transition. Then a group appears offering warmth, certainty, purpose, moral seriousness, friendship, structure, and perhaps even practical support. This is almost exactly the same mechanism by which ordinary religion can function as refuge. In an earlier chapter I compared religious benefit to nonspecific therapeutic factors: the frame, the relationship, the structured attention, the ritual, the group support. Cults often begin by offering those same nonspecific factors in concentrated form. They do not usually begin with the most bizarre doctrine. They begin with love, interest, inclusion, and a sense that your life is finally becoming meaningful.

Many people join because they happen to meet the group at the wrong moment, or the right moment, depending on one’s point of view. A new friend invites them. A romantic partner is involved. A campus recruiter is kind. A meditation group seems peaceful. A Bible study group seems sincere. A charismatic teacher gives a talk after a period of depression or grief. The group may provide housing, food, music, emotional intensity, or a sense of being chosen. The person does not feel captured. They feel rescued.

Mental illness can sometimes be a vulnerability, but it is not the main explanation. Depression, trauma, substance use, psychosis, personality vulnerabilities, or severe anxiety may increase susceptibility in some cases, especially if the group offers certainty and containment. But many recruits are healthy, intelligent, educated, and morally serious. In fact, idealism can be a risk factor. People who want to change the world, purify themselves, help the poor, find truth, heal from trauma, or live a life of moral depth may be especially vulnerable to a group that claims to offer exactly that.

Poverty and lack of social support can also matter. If a group provides housing, food, friendship, or protection, it becomes much harder to evaluate it objectively. Leaving may mean not just changing beliefs but losing shelter, work, childcare, social standing, and one’s entire identity. A cult is most powerful when it becomes not only a belief system but a life-support system.

The Psychological Machinery of Cults


The machinery of cults is not mysterious. It is a set of ordinary human processes intensified and organized.

First, there is love-bombing or intense early inclusion. The new person is treated as special. They are welcomed, listened to, praised, and given a role. In ordinary life, many people feel unseen. A group that suddenly sees them can feel intoxicating.

Second, there is a charismatic leader or sacred authority. Sometimes this is a literal leader: a prophet, guru, pastor, therapist, revolutionary, coach, or founder. Sometimes the leader is dead but still controlling through texts, recordings, rules, or institutional descendants. Sometimes the authority is an ideology rather than a person. In all cases, the same psychological structure appears: ordinary doubts are subordinated to a higher truth.

Third, there is isolation from competing sources of reality. This may be physical isolation, as in a commune, compound, retreat centre, or remote residence. But isolation can also be informational and relational. Members are encouraged to distrust outsiders, avoid critical media, reduce contact with family, or reinterpret criticism as persecution. The group gradually becomes the main source of facts, values, relationships, and meaning. A modern cult-like phenomenon all around us is the “media echo chamber,” in which people are exposed mainly to a single news source that espouses a particular political or cultural position.

Fourth, there is escalation of commitment. The person gives time, money, labour, secrets, sexual access, public testimony, or obedience. Each sacrifice makes departure more psychologically expensive. People do not like to admit they have suffered for nothing. So the more they give, the more they need the group to be worthy of the gift. This is one reason costly rituals and high-demand commitments can strengthen groups.

Fifth, there is public signaling. The member does not simply believe privately; the member is seen believing. They attend meetings, wear symbols, donate publicly, confess, sing, testify, recruit, or participate in rituals. This connects directly with the idea of common knowledge. Once everyone has seen you declare loyalty, your identity becomes socially fixed. Doubt is no longer a private intellectual event. It becomes a public betrayal.

Sixth, there is language control. Cults often develop a specialized vocabulary. Ordinary words are replaced by group terms or jargon. Critics are “suppressive,” “worldly,” “demonic,” “asleep,” “impure,” “unenlightened,” “bourgeois,” “apostate,” or “negative.” The words differ, but the function is similar: to make dissent feel morally contaminated before it is even heard. Language becomes a fence around thought.

Seventh, there is confession and surveillance. Members may be encouraged to reveal doubts, sins, sexual thoughts, family conflicts, fears, or traumas. In a healthy therapeutic context, disclosure can be healing. In a cultic context, disclosure becomes leverage. The group learns where the person is vulnerable. The person becomes more dependent on the group’s approval. Private life becomes group property.

Eighth, there is fear. The fear may be supernatural: hell, demons, curses, divine punishment, loss of salvation. It may be social: shunning, humiliation, loss of family, loss of marriage, loss of children, loss of reputation. It may be practical: poverty, deportation, loss of housing, loss of job. It may be apocalyptic: the world is ending and only the group will survive. Fear keeps people attached even after love has faded.

This combination produces what Janja Lalich has called “bounded choice”: the member may appear to be choosing freely, but the social and ideological world has been narrowed so dramatically that only one choice feels morally possible.

The Long-Term Outcomes of Cults


Cults do not all end the same way. Some collapse quickly when the leader is exposed, imprisoned, dies, loses charisma, or fails to deliver a prophecy. Some fragment into rival factions. Some remain small but stable for decades. Some become less extreme over time as children are born, finances become formalized, legal scrutiny increases, and the group learns to survive by appearing more normal. Some become mainstream religions or denominations. Some self-destruct catastrophically.

Catastrophic self-destruction becomes more likely when several conditions converge: apocalyptic belief, isolation, paranoia about outside persecution, leader instability, weapons or poison, sexual or financial scandal, legal pressure, and a doctrine that reframes death as victory. Jonestown and Heaven’s Gate are the public nightmares, but the psychological ingredients are not exotic. They are ordinary group processes pushed to a terrifying extreme.

Many groups do not implode. They simply become chronic containers of control. Children grow up inside them. Marriages are arranged or heavily influenced by them. Education is restricted. Doubters learn to remain silent. The group may never make the news, yet it quietly consumes the lives of its members.

Cults and Ordinary Religion: Difference in Kind, or Difference in Degree?


We often talk about cults as though they are qualitatively different from religion. That is understandable, because it protects ordinary believers from feeling accused. But psychologically, many of the same mechanisms are present across the spectrum. The difference is usually intensity, accountability, and freedom.

Ordinary religions often include sacred authority, ritual, public profession, moral language, behavioural restrictions, costly sacrifice, insider-outsider boundaries, suspicion of apostasy, and emotionally charged group experiences.

A church that teaches children they are loved, encourages charity, welcomes doubt, cooperates with science, protects dissenters, and allows people to leave peacefully is not a cult in any meaningful sense. A church that teaches children that outsiders are evil, threatens hell for unbelief, shuns dissenters, hides abuse, demands unquestioning obedience, and gives leaders unchecked power is moving along the cultic continuum, even if it has beautiful buildings, tax-exempt status, and a long history.

This is the uncomfortable thesis: cults are not alien intrusions into religious life. They are one possible intensification of religious life. They are what happens when normal human tendencies—belonging, reverence, loyalty, trust, moral aspiration, fear of death, love of authority, need for certainty—are captured by a closed system.

The same point applies outside religion. Political cults, therapy cults, business cults, and conspiracy movements show that the problem is not supernatural belief alone. The problem is totalizing allegiance. Any group can become cult-like when it treats loyalty as the highest virtue and dissent as contamination.

Helping Someone Who Is in a Cult


The old movie image of an “intervention” is dramatic: family members kidnap the person, lock them in a room, and bring in a deprogrammer to break the spell. This approach is ethically and legally fraught, and it can easily reproduce the coercion it claims to oppose. Historically, “deprogramming” declined partly because of legal risks, public criticism, and the rise of less coercive exit-counseling approaches.

The more humane approach is relational. Do not begin by humiliating the person. Do not mock the group’s beliefs. Do not treat the person as stupid. Do not force them to choose between the cult and the family in a single dramatic confrontation, because the cult has usually already prepared them to interpret criticism as persecution. If outsiders behave with contempt, the group’s warnings are confirmed.

The first task is to keep the relationship alive. Be kind, steady, patient, and curious. Ask questions rather than delivering lectures. The second task is to provide a non-cultic attachment. Many people stay because leaving would mean loneliness. If the outside world contains only critics, shame, and confusion, the cult remains emotionally safer. Outsiders need to become a credible alternative: not a debate team, but a humane place to land. Megan Phelps-Roper’s departure from the Westboro Baptist Church is a useful example. Her account emphasizes that compassionate relationships with outsiders, including people who disagreed with her profoundly but treated her with dignity, helped create the conditions for doubt.

The third task is safety. If there is imminent risk—suicidal intent, violence, child abuse, sexual exploitation, medical neglect, forced confinement, weapons, or credible threats—then ordinary clinical, legal, and protective steps are necessary. But even then, intervention should be careful. Heavy-handed state action can sometimes intensify the group’s persecution narrative and increase danger, especially if members believe the outside world is evil, corrupt, or violent.

How likely is it that someone will leave a cult? There is no single answer. Many people do leave high-demand groups, but the timing is unpredictable. Some leave after a leader’s hypocrisy becomes undeniable. Some leave after a failed prophecy. Some leave because they fall in love with someone outside the group. Some leave when they become parents and cannot bear to subject their children to the same control. Some leave after education, travel, illness, grief, burnout, or private moral disgust. Some do not leave for decades. Some never leave.

Leaving is not a single event. It is often a long psychological migration. A person may physically leave before mentally leaving, or may mentally leave years before saying so publicly. This is especially true when family remains inside. To leave a cult may mean losing one’s parents, siblings, spouse, children, friends, language, cosmology, moral identity, and sense of destiny all at once. Outsiders sometimes underestimate this. They say, “Why don’t you just leave?” But leaving can feel like stepping off the edge of the world. In many ways, leaving a cult is analogous to leaving an abusive family.

After Leaving


Former members often need practical and psychological support. They may need housing, money, employment help, education, legal assistance, medical care, and protection from harassment. They may also need gentle, empathic therapy.

Research on former members describes an “in-between” period: loss of worldview, confusion, grief, identity disruption, and distress as the person tries to adapt to life outside the group. Leaving does not automatically produce joy. It may produce terror, emptiness, guilt, and longing. The cult may have been abusive, but it was also home.

Former members may need to relearn ordinary decision-making. What do I wear? What do I eat? Who am I allowed to date? What music do I like? What do I believe about death? How do I spend money? How do I disagree with someone without panic? How do I trust my own perception? These questions can be profound for someone whose preferences were controlled for years.

There is also moral injury. Former members may have recruited others, shunned family, repeated hateful doctrines, donated money, punished children, or participated in humiliating rituals. Some were victims and perpetrators at the same time. A good therapeutic approach has to make space for both accountability and compassion. Shame alone can drive people back into closed systems. But denial is not healing either.

A healthy exit requires new community. This is a recurring theme in the psychology of religion. A person rarely leaves a total social world merely because an argument is correct. They need somewhere else to belong. The secular world often fails here. It can be rational but socially thin. Cults and religions know how to gather people, sing together, eat together, help each other move apartments, visit the sick, celebrate births, mourn deaths, and give people roles. A purely intellectual critique of cults is not enough if we do not offer better human structures outside them.

Preventing Cults at a Community Level


At a community level, preventing cults requires more than warning people. It requires reducing the vulnerabilities that cults exploit: loneliness, poverty, untreated mental illness, poor education, family estrangement, lack of belonging, and lack of meaning.

People need to learn about cognitive biases, social influence, probability, media literacy, financial literacy, and the warning signs of coercive control. They need to understand that intelligence does not immunize anyone against manipulation. In fact, intelligent people can become very skilled at defending a false belief if the belief protects identity and belonging.

Religious communities themselves can help prevent cultic drift by building safeguards: transparent finances, rotating leadership, independent abuse reporting, no secret doctrines for inner circles, no punishment for leaving, no shunning, no leader above criticism, no sexual access disguised as spiritual privilege, no pressure to cut off family, no discouragement of education, and no claim that doubt is evil.

When authorities do intervene, they should of course minimize aggression. Some cults depend on a persecution narrative. If the state behaves stupidly or brutally, the leader’s warnings are confirmed. This does not mean tolerating abuse. It means that intervention should be intelligent, patient where possible, legally precise, and informed by people who understand high-control groups.

Participating in Religion in a Less Cult-Like Way


The final question is not only how to identify cults, but how to participate in religion—or any intense community—in a less cult-like way.

A less cult-like religion allows members to leave without punishment. It allows doubt to be spoken aloud. It teaches children accurate science and history, not only sacred stories. It does not require hatred or fear of outsiders. It does not make sexuality the obsessive focus of moral worth. It does not protect leaders from consequences. It does not treat financial giving as proof of holiness. It does not confuse humility with obedience. It does not turn family love into a hostage situation.

A less cult-like religion treats doctrine as a framework for moral reflection, not as a weapon. It values truth enough to tolerate questions. It values love enough to refuse shunning. It values conscience enough to let people disagree. It values children enough not to terrify them with hell. It values community enough not to trap people inside it.

Of course, one could ask why supernatural belief is needed at all. If the best parts of religion are community, music, moral reflection, care for the vulnerable, gratitude, beauty, mourning, celebration, and humility, then perhaps those goods can be cultivated without insisting on literal fictions. But while religion persists, it is worth asking how it can become less coercive, less dogmatic, less tribal, and less cult-like.

A cult is not just a strange group somewhere else. It is a warning about human social nature.

Annotated Bibliography


Aronoff, J., Lynn, S. J., & Malinoski, P. (2000). Are cultic environments psychologically harmful? Clinical Psychology Review, 20(1), 91–111. doi:10.1016/S0272-7358(98)00093-2

This review is useful because it complicates the simplistic idea that people who join cults must be mentally ill or otherwise psychologically impaired.


Barker, E. (1984). The making of a Moonie: Choice or brainwashing? Blackwell.

Barker’s study of the Unification Church remains one of the most important empirical works on conversion to a controversial new religious movement. Its central value is that it resists both extremes: the idea that converts are simply brainwashed robots, and the equally naive idea that conversion is always a free, rational, individual choice. Barker’s work is especially useful for describing how social context, personal searching, recruitment settings, and group experience interact over time.


Chambers, W. V., Langone, M. D., Dole, A. A., & Grice, J. W. (1994). The Group Psychological Abuse Scale: A measure of the varieties of cultic abuse. Cultic Studies Journal, 11(1), 88–112.

This article introduced the Group Psychological Abuse Scale, an attempt to operationalize psychological abuse in cultic or high-control groups. While the journal and field are more specialized than mainstream psychiatry or psychology, the article is important because it tries to move the discussion away from vague accusation and toward measurable dimensions: isolation, dependency, fear, manipulation, and exploitation.


Coates, D. D. (2010). Post-involvement difficulties experienced by former members of charismatic groups. Journal of Religion and Health, 49(3), 296–310. doi:10.1007/s10943-009-9251-0

Coates examines the difficulties former members can experience after leaving charismatic or high-demand groups. This is important because many outside observers assume that leaving is the end of the problem, when in reality it may be the beginning of a difficult identity transition. Former members may struggle with guilt, fear, loneliness, distrust, and loss of worldview.


Festinger, L., Riecken, H. W., & Schachter, S. (1956). When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world. University of Minnesota Press. doi:10.1037/10030-000

This classic study introduced one of the most memorable accounts of cognitive dissonance in a religious-apocalyptic setting. Festinger and colleagues studied a group whose prophecy failed, showing how disconfirmation can sometimes intensify rather than weaken belief, especially when members have already sacrificed publicly for the worldview. This remains highly relevant to cult psychology because failed predictions, scandals, and contradictions do not automatically dissolve closed groups. Sometimes they generate more fervent proselytizing and more elaborate rationalization. This is also a caution for some of the world’s cult-like political phenomena—even as more and more disasters occur as a result of poor leadership, the “base” following may become even more fervent or extreme.


Galanter, M., Rabkin, R., Rabkin, J. G., & Deutsch, A. (1979). The “Moonies”: A psychological study of conversion and membership in a contemporary religious sect. American Journal of Psychiatry, 136(2), 165–170. doi:10.1176/ajp.136.2.165

This psychiatric study of Unification Church members is important because it brought the topic of charismatic religious sects into mainstream psychiatric discussion. It explored conversion and membership without reducing the issue simply to psychosis or individual pathology.


Galanter, M. (1990). Cults and zealous self-help movements: A psychiatric perspective. American Journal of Psychiatry, 147(5), 543–551. doi:10.1176/ajp.147.5.543

Galanter’s article is valuable because it broadens the discussion beyond obviously religious cults to include zealous self-help movements. This is a crucial point: cultic dynamics are not limited to supernatural belief. They can appear in therapeutic, commercial, political, and “human potential” settings as well.


Gómez, Á., Martínez, M., Martel, F. A., López-Rodríguez, L., Vázquez, A., Chinchilla, J., Paredes, B., Hettiarachchi, M., Hamid, N., & Swann, W. B. (2021). Why people enter and embrace violent groups. Frontiers in Psychology, 11, Article 614657. doi:10.3389/fpsyg.2020.614657

This article is not about cults narrowly, but it reviews why people enter and internalize commitment to violent groups. Concepts such as identity fusion, sacred values, social influence, and radicalization are directly applicable to cultic dynamics. The importance here is that it places destructive religious cults within a broader family of intense group allegiances, where belonging and identity can override ordinary self-interest and moral caution.


Hadding, C., Semb, O., Lehti, A., Fahlström, M., Sandlund, M., & DeMarinis, V. (2023). Being in-between; exploring former cult members’ experiences of an acculturation process using the cultural formulation interview (DSM-5). Frontiers in Psychiatry, 14, Article 1142189. doi:10.3389/fpsyt.2023.1142189

This recent qualitative study explores former cult members’ experiences after leaving, describing the “in-between” quality of post-cult adjustment. It is significant because it treats exit as an acculturation process: the person is not merely changing opinions, but moving from one cultural world to another.


Kent, S. A. (2002). Deprogramming, exit counseling, and the decline of deprogramming. Cultic Studies Review, 1(3).

Kent’s article is important for understanding the history of attempts to remove people from cults. Earlier “deprogramming” sometimes involved coercive tactics, including abduction or confinement, which raised serious ethical and legal concerns. The shift toward exit counseling reflects a more humane and defensible approach: rapport, information, family support, and voluntary engagement.


Lalich, J. (2004). Bounded choice: True believers and charismatic cults. University of California Press.

Lalich’s concept of “bounded choice” is one of the most useful ways to describe the subjective experience of cult members. From outside, members appear to be making irrational choices. From inside, the group has narrowed the moral and informational universe so dramatically that obedience feels like the only possible path. This concept is significant because it avoids crude “brainwashing” language while still taking coercive control seriously.


Lifton, R. J. (1961). Thought reform and the psychology of totalism: A study of “brainwashing” in China. W. W. Norton.

Lifton’s work is foundational for understanding totalistic environments: milieu control, confession, sacred science, loaded language, doctrine over person, and the demand for purity. Although the original context was political thought reform rather than religion, the framework has been influential in later discussions of cults and high-control groups. Its significance lies in describing how environments can reshape identity and reality-testing without requiring that every member be mentally ill or unintelligent.


Lofland, J., & Stark, R. (1965). Becoming a world-saver: A theory of conversion to a deviant perspective. American Sociological Review, 30(6), 862–875. doi:10.2307/2090965

This classic article is one of the foundational sociological accounts of conversion to a deviant religious perspective. Lofland and Stark emphasized strain, religious seeking, affective bonds, and intensive interaction with believers. The article is significant because it helps explain why conversion is rarely just an abstract intellectual shift. It is a social and emotional process.


Pretus, C., Hamid, N., Sheikh, H., Ginges, J., Tobeña, A., Davis, R., Vilarroya, O., & Atran, S. (2018). Neural and behavioral correlates of sacred values and vulnerability to violent extremism. Frontiers in Psychology, 9, Article 2462. doi:10.3389/fpsyg.2018.02462

Pretus and colleagues examine sacred values and vulnerability to violent extremism using behavioral and neural measures. This is important for cult psychology because sacred values are not experienced as ordinary preferences. Once a belief becomes sacred, compromise can feel like betrayal. The article helps explain why cult members may endure suffering, reject evidence, or even accept violence when the group’s core values have become fused with identity.


Richardson, J. T. (1993). Definitions of cult: From sociological-technical to popular-negative. Review of Religious Research, 34(4), 348–356. doi:10.2307/3511972

Richardson’s article is essential for any chapter that uses the word “cult.” It shows how the term has changed over time and how its popular use has become strongly negative. The significance is that it forces conceptual discipline: we should not call a group a cult merely because it is new, strange, small, foreign, or theologically unappealing. The better question is whether the group uses coercive, exploitative, or abusive control.


Saldaña, O., Rodríguez-Carballeira, Á., Almendros, C., & Escartín, J. (2017). Development and validation of the Psychological Abuse Experienced in Groups Scale. The European Journal of Psychology Applied to Legal Context, 9(2), 57–64. doi:10.1016/j.ejpal.2017.01.002

This article is significant because it attempts to measure psychological abuse in group settings empirically. The authors developed and validated a scale focused on abuse experienced in groups, drawing on samples that included former members of manipulative or coercive organizations. This helps support the chapter’s process-based definition of cults: the concern is not theological oddness but patterns of control, manipulation, dependency, isolation, and harm.


Sosis, R., & Bressler, E. R. (2003). Cooperation and commune longevity: A test of the costly signaling theory of religion. Cross-Cultural Research, 37(2), 211–239. doi:10.1177/1069397103037002003

Sosis and Bressler’s study is important because it links costly ritual and group survival. Their analysis of communes supports the idea that demanding rituals and sacrifices can strengthen cooperation by signaling commitment and filtering out free riders. This is directly relevant to cults because high-control groups often require costly behaviors—donations, dress codes, confession, celibacy, public testimony, isolation—not only for theological reasons but because cost itself creates loyalty and makes commitment visible.


Stark, R., & Bainbridge, W. S. (1980). Networks of faith: Interpersonal bonds and recruitment to cults and sects. American Journal of Sociology, 85(6), 1376–1395. doi:10.1086/227169

This article is central to understanding recruitment. Stark and Bainbridge argue that interpersonal bonds play a major role in conversion to cults and sects. This is crucial because it challenges the image of recruitment as simply ideological persuasion. People are often drawn in through friends, partners, mentors, and emotionally meaningful relationships.


Ungerleider, J. T., & Wellisch, D. K. (1979). Coercive persuasion (brainwashing), religious cults, and deprogramming. American Journal of Psychiatry, 136(3), 279–282. doi:10.1176/ajp.136.3.279

This psychiatric article is historically important because it reflects the late-1970s clinical concern about brainwashing, cults, and deprogramming. Its significance today is partly historical: it shows how psychiatry was grappling with cultic influence during a period of intense public concern. It also helps frame why later writers became more careful about the language of “brainwashing,” preferring more nuanced models of social influence, coercive persuasion, bounded choice, and high-control environments.


Whitehouse, H. (2004). Modes of religiosity: A cognitive theory of religious transmission. AltaMira Press.

Whitehouse’s work is highly relevant because it explains how different ritual styles transmit religious commitment. His distinction between rare, intense, emotionally memorable rituals and frequent, repetitive, routinized practices is useful for understanding how groups create cohesion. Cults often combine both: dramatic initiations, confessions, retreats, or ecstatic experiences, alongside repetitive meetings, slogans, and behavioural rules.
