Setting the altar
The polite academic attempt to introduce the term “New Religious Movement” has largely lost ground in the public sphere. Canadaland Commons, for example, in releasing their most recent season on — whatever we’re talking about — chose the punchier and more communicative term: CULTS. When people witness bad behaviour in a spiritual group they want to employ a hard, derogatory word to make their opinions on the behaviour known. This human urge is hard to control. Our desire to communicate our value assessments surely accounts for much of the needless, constant and wonderful expansion in our vocabulary.
I guess folk semantics would define “cult” as a term that applies to the intersection of two separate sets of social types. The first set are “New Religious Movements” proper: this term would presumably suffice to designate a social group with strange spiritual beliefs, not similar enough to an organized religion to fall under their aegis, so long as the group didn’t provoke our disdain. The second set are high-control groups: social groups that require members to perform especially difficult or unusual demonstrations of obedience, whether or not those demonstrations are spiritual in nature.
Because of the rhetorical effect of the term “cult”, other groups that don’t quite fit into this intersection also get called “cults”. I want to tease them out here.
- Some relatively established and organized religious groups get referred to as cults if the person speaking is unsympathetic to them or suffered abuse within that organization. Mormonism or Old Order Anabaptism might serve as examples.
- Sometimes, small spiritual groups that are unusual but not high-control get referred to as “cults,” such as Wiccan covens.
- Sometimes, non-spiritual high-control groups are referred to as “cults”, such as the military or the New York City Ballet.
- Finally, the term “cult” sometimes gets used to describe social groups that are neither spiritual nor, relatively speaking, very high-control, but that the speaker happens to dislike. Maybe they have some particular in-group vocabulary and a high level of enthusiasm. This is how, for example, CrossFit and Goop get described as cults. Are there people living in an off-grid CrossFit compound where they aren’t allowed to wear the colour red? If so, I retract my criticism.
Amanda Montell of the popular podcast Sounds Like a Cult (genre: comedy) talks about the “cultish spectrum”, in which groups like the latter could be described as “more cultish” than other groups. I am personally irritated by this easy use of “cult” or even “cultish”, which, in my opinion, tends toward the trivializing. However, since my background is in linguistics, I know better than to rail against a dysphemism treadmill. Language will do what it will do. If people hate CrossFit so much that they feel the need to call it a “cult”, or even claim that it exists adjacent to a cult on a cultish spectrum, to get their point across, God knows, I can’t stop them.
Dozens of popular documentary works have emerged in the past few years on cults and extremist religious movements. I’ll name a few of these programs literally off the top of my head: “Shiny Happy People”, “Love Has Won: The Cult of Mother God”, “How to Become a Cult Leader,” “Wild Wild Country”, which was a trailblazer back in 2018, “Keep Sweet: Pray and Obey”, “Escaping Twin Flames” and also “Desperately Seeking Soulmate: Escaping Twin Flames Universe”, and so on.
It’s easy enough to deride the tone of much popular material on cults. This LA Times listicle is a typical example. The descriptions salivate: “Love Has Won” opens on the image of a woman’s dead body and, quote, “only gets more disturbing from there”. The programs and their subjects are “riveting”, “unsettling”, “unthinkable,” “chilling”, “extreme”, “bizarre”, “baffling, heartbreaking”, “bizarre” again, “absurdly entertaining”. I’m only halfway down the page.
The momentum of this type of prose is usually modestly restrained by apologia: a series on Heaven’s Gate, for example, “reveals a group of people who were all too human, despite desperately wanting to be more.” Our voyeurism is recast as sympathetic, as if we turned the program on in the first place only as an opportunity to better understand people who believe things that seem strange to us. But surely if our primary desire was really to enter into a pleasant state of human-to-human understanding we would be reading Dostoevsky, or something.
How sinister that we could describe these moral annotations as a “humanizing urge”. As if humanity is a status that can be removed if your feelings about the nature of the world are far enough from mainstream; a status which can then be conferred again by a sufficiently large crowd of onlookers who share a mental model of reality that is normal and righteous.
I, of course, have watched most of these programs, so I have no moral ground to stand on.
Ask not what your cult can do for you
Our fascination with the cult must have multiple social uses, which strike me as both paranoid and soothing. (Both, because a paranoid might be soothed by a complete, even excessive, accounting of the environment.)
There are the apotropaic benefits, for one. Cult-watching shares this trait with its cousin, true crime. It seems nearly obvious at this point to suggest that female fixation on true crime results from the fact that it’s primarily women who get true-crimed. As Kate Tuttle writes in the linked New York Times op-ed, the much-derided motto of true crime’s vanguard party, My Favorite Murder (genre: comedy), takes on a pleading quality when seen in this light: “Stay Sexy and Don’t Get Murdered.”
There happens to be some evidence to suggest that women are more likely than men to join a cult-in-the-classic-sense. The explanations for this, in the many thinkpieces on the topic, mainly feel insufficient or cloying. I suspect that this fact has most to do with what type of high-control groups men and women are likely to join: more men than women join militias. However, trying to get down to the root of all sex difference will have to wait for another essay.
We could say that true crime represents a physical or existential risk, and the cult represents a spiritual risk, an epistemic risk. Epistemic risk is, simply, the chance that you believe something that is wrong. The result of epistemic risk is epistemic anxiety, the fear that your beliefs are incorrect. A lot of academic work on this topic seems to have arisen in the last five years, in my brief survey (remember, I’m only “Beginning to Think About Cults”). I am wary of all tea-leaf-reading that purports to explain why some social trend has arisen at this moment in history. The conclusions often strike me as apophenic, even when poignant. That said, it does seem poignant: the Internet has expanded our access to contradicting and convincing information beyond all prior experience, and subsequently, both academic philosophers and true crime aficionados are beginning to think about epistemic risk.
There’s a real sense in which spiritual abuse in the cult could be equally described as epistemic abuse. The particular danger of the cult is that it perverts what you know, what you believe, until you find yourself acting in a way you would never have justified in your prior life. The entire time you believe that action to be correct, even necessary. I believe our fascination with the cult is, at some levels, an epistemic fascination. And the other major function of the archetype of the Cult is an epistemic function.
You don’t know what you don’t know what you don’t know what you don’t know
Our fixation on the Cult doesn’t come only from possible harms to us. Many of these groups never harmed anyone but their members; some of these groups don’t even seem to be very interested in recruiting. If you’re not already Mormon, and you’re an independent adult, you are probably safe from the FLDS.
I’m curious whether we are also fixated on the Cult because it helps the rest of us, outsiders, define a window of acceptable belief. The Cult is threatening because, in lying outside those boundaries, it offends our sense of consensus reality.
We each take on beliefs based on the sources we trust, the evidence we believe, and the other beliefs with which the new belief coheres. I have heard this called doxastic deference. Of course, we have other mental mechanisms which help us decide what to believe. This one is merely particularly interesting for this case. For some of us, for example, our trusted sources of knowledge are government officials. For some, activists and grassroots leaders. For some, academic commentators. For some, peers and family. Isn’t a spiritual teacher also a perfectly good source?
(Doxastic deference is an idea I’m really interested in exploring further, which I encountered, reading for this essay, in Richard Pettigrew’s 2022 work Epistemic Risk and the Demands of Rationality. The relevant section starts around p. 144. Almost every time I write this title, I accidentally write, “the Limits of Rationality”.)
One frightening aspect of the Cult, which causes us to turn its image over and over in our minds, is that the process of coming to believe in Amy Carlson, Mother God, must surely feel subjectively similar to any other change in belief. It might feel quite like coming to believe in small government, climate change, Universal Basic Income, or the war in Afghanistan. When we believe those things, our social networks reassure us that our beliefs are legitimate—that a reasonable, imaginable person could hold our same belief. Correspondingly, our social networks also tell us, overtly or by omission, which beliefs are not legitimate, which beliefs are not imaginable. The special threat of a high-control group is that it contracts your social world until nobody you know and trust personally is willing to profess a belief contrary to dogma.
“Love Has Won” is a particularly interesting work for this reason because, as opposed to most other similar documentaries, the interviewed members are mainly still believers. Their calm conviction is unsettling. They aren’t appalled or apologetic at their disclosures, as exited cult members often are in interviews. They aren’t signalling that they know what they’re saying lies outside the window of permissible belief. As a result, they just don’t seem that crazy. They could be talking about anything at all.
I wonder if the Cult helps us define the boundaries of belief, as a social collective. We want to assert and absorb opinions on which beliefs, particularly those we call “spiritual” beliefs, are acceptable and which are unacceptable. (The Cult also asks us to maintain a distinction that seems increasingly arbitrary to me between spiritual beliefs and other types of beliefs.) Proceeding too far outside the window of legitimate belief is a kind of social death. The Cult is a group with such abnormal beliefs that they have become inhuman, frightening, monstrous. And it is a group: most cult-doc aficionados would profess to sympathize with those individuals who fall under the sway of a dominating spiritual leader, but the term “cult” retains an undiscerning rhetorical effect. Words like “guru” lack the piquancy of “cult”. Even “cult leader” feels diluted. The whole group, as an undifferentiated mass, is the derided and feared Cult. Belief-setters and belief-receivers are treated alike: merely holding the belief is enough to mark you.
Building epistemic social consensus is important. Any popular enough belief has a certain amount of safety built in. I can safely make decisions under the presumption that birds exist. Otherwise, I can reason, somebody among the other billions of people who also believe birds exist would have come to some harm, and I would have heard about it. So I feel secure and happy, and I stop thinking about whether birds exist, which is a load off my mind. If we want to build this type of social consensus, we need to define its boundaries. Otherwise, “Birds don’t exist” competes as a valid belief with my “Birds exist,” and I am thrown again into uncertainty. This is not only frightening, but time-consuming, since I must try to verify that birds exist. Once we define a boundary, certain objects must lie outside it: beliefs that are so pathological we don’t even have to review the evidence.
The trouble is that few beliefs lay themselves out as neatly as “Birds exist”. (I certainly hope this statement won’t age too poorly.) Beliefs are complex. We know our social consensus can fail. We know that false beliefs can be very popular, and can motivate extraordinary harm (witch-burning, Crusades, race “science”). So it does us good at times to review the supposedly pathological beliefs that we have pre-dismissed.
This is to say, we shouldn’t rely too heavily on the Cult to reassure us of the normalcy of our beliefs without further reflecting on the character of those beliefs. Epistemic complacency threatens to leach us of a certain humility that it does us good to retain. I fear that the sensationalism we have drummed up around this very specific type of interpersonal powermongering turns spiritual high-control groups into a kind of epistemic fidget toy, and blinds us to other lessons spiritual high-control groups might teach us.
John 20:27
I suppose the main thing that interests me is attempting to integrate the cult into our model of how social relations and social knowledge-making function more generally. The cult is not a mysterious pit trap that people sometimes inexplicably fall into. In some way, it must be a natural, foreseeable output of tendencies in our nature. An edge case, to be sure, but an edge case with explanatory potential. And here, I have tended to hit an asymptote of sympathy. Indeed, this asymptote feels to me somewhat like a sign of the abject as I poorly understand it. It is extremely difficult to even imagine taking the cult’s beliefs seriously—seriously enough to perform some harmful action on the merit of those beliefs. I mean, Robin Williams?
Even ex-members of unusual spiritual groups often seem somewhat baffled at their past actions. A plethora of unexamined assumptions and unconscious emotional impulses must be enumerated to make something that seems irrational come clear. The gap seems too wide. And yet in another way it also seems so narrow, terribly narrow. You grow up Buddhist, you become interested in expanding your spiritual horizons, you meet an articulate and charismatic teacher, and the next thing you know, you’re releasing sarin gas on the Tokyo subway.
In the case of Aum Shinrikyo, maybe more broadly in religious high-control groups, the missing middle input which provides a familiar motivation might be fear: for example, deep existential fear of the end of the world. That same fear which motivates climate activists to climb bridges and glue themselves to walls in public. Can’t we understand that when time feels short, when the consequences are near, previously unthinkable things start to draw within reach? But if most of us really believed day-to-day in the possibility that the world as we know it will fall apart within our lifetime, we would all be teetering on some kind of brink.
Perhaps we purposefully avoid giving ourselves over completely to sympathy with the Cult. The best way to understand at a human-to-human level how a person could reach the point of such action is to consider your own beliefs in relation to theirs. Isn’t there some idea you have which you don’t confess to the people around you? Isn’t there some idea we all seem to hold, but which nobody is taking seriously enough to act on? Isn’t there some fact you avoid because its consequences make you feel too desperate and bizarre? If we forced ourselves to consider what extremity of feeling could drive us to the edge of the epistemic world, we might find that we already half-believe certain things that could take us there if we let them. We are held back mainly by trust in our fellows and by convention. I think many of us are frightened that if we embarked on that journey, we couldn’t come back.
