Beliefs – really
Wherever beliefs are foundational and crucial, you’re looking at a cult
“Belief” is a word everyone uses as if we all mean the same thing by it, but we don’t. It means very different things depending on where you’re standing when you use it.
Inner sanctum
From the inside—our native perspective—a belief is a position taken, a commitment made, something staked. We “believe” it because we’re sure it’s true—but not sure enough to just state it as a fact. When we have no doubt that something is true, we don’t package it as a belief. Believing doesn’t even enter into it. “Hey, it’s raining!” isn’t the same as, “Hey! I believe it’s raining!”
“It’s raining” plainly states what we’ve taken as a fact. “I think it’s raining” reports what we’ve taken as a likely fact along with our mental state, our degree of certainty regarding the fact: less-than-certain, open to revision. “I’m sure it’s raining” conveys a bit more certainty. All of these expressions remain focused on the question of whether it’s raining. The self-reporting is tangential. Our state of certainty about it raining serves our operative concern: Will I get wet if I go outside, or am I going to need an umbrella?
“I believe it’s raining” does something unique and magical. It announces our commitment to a claim despite our uncertainty. Suddenly, not only are we conveying an opinion about it raining along with how sure of it we are, but we’re announcing an entirely new element that doesn’t exist in “it is”, “think it is,” or “sure it is”—an element that has dick-all to do with whether it’s raining or not: our committed position on the raining question.
This new, egoistic element often ends up eclipsing the question it was supposed to bolster. Just notice your own reaction when a belief important to you gets criticized or contradicted, and you’ll see it for yourself: the matter you believe becomes secondary to the fact that you took a stand on it that you’ll defend.
If we say, “It’s raining!” and someone says, “No, it’s not!” what’s our first impulse—to argue? No. We’re natural-born scientists. We go look. We inform ourselves. We obtain evidence. Maybe the neighbor’s sprinkler is set too high and it’s watering our lawn, too.
Is that our first impulse when someone criticizes or contradicts one of our beliefs?
Stating a belief is a different kind of statement than stating something we’ve taken as a fact. A fact is not a mental state, and it’s not a commitment. Each form of expression above, in the order I presented them, signals less certainty than the one before. “I believe…” is the least certain of them.
If we’re certain, “It’s raining” covers it. If we’re not sure, “I think it’s raining” covers it. “I’m sure it’s raining” implies that we had enough doubt about it that we double-checked and, sure enough, we were right the first time.
In stark contrast, saying, “I believe it’s raining,” just triggers the question: Since you’re uncertain about it, what the hell possesses you to “believe” it? Which only sparks another: But if you’re certain, then what the hell needs to be “believed”?
This is the denial aspect of belief. In an effort to project certainty, believers deny the truth: they’re absolutely not certain. They want to be certain, and they especially want to look like they’re certain, but to do that they must deny the facts: they do not feel certain because they do not have sufficient reason to be certain. They lack certainty and the confidence that comes with it, so they just make that shit up. “Belief” is their rickety plank across the sinkhole.
“I believe” is the oddball: despite being the least certain position, believers want others to see that they’re confident. Often, they’re trying to convince themselves—the old, so-called “fight of faith”. Play-acting conviction and commitment is the device to pull off the ruse. Gaslighting is the technique.
But why do they want to look convinced and committed beyond what they actually are? They might not know themselves.
How, then, is believing not incoherent? And, given the denial required, how is it not dishonest?
I’ll let you ponder that one. My answer is below, under “What these walls don’t stand on”.
In or out?
So that’s belief from the inside: claiming something is true while lacking the reason and certainty to state it as plain fact.
From the outside—an observer watching us—“belief” is a label slapped on our behavior, an inference about what must be going on in our heads.
These aren’t just two angles on the same thing. They’re two different things disguised by the same word.
Inside-out (IO) belief: This is what’s going on when you believe something—your experience of holding a position, being committed to it, defending it.
Outside-in (OI) belief: This is what an observer infers you hold true, based on what you say or do. They’re outside your perspective, modeling what’s going on in your mind.
Most of what academics say about belief is OI. Bayesian epistemology, belief-desire psychology, all those propositional attitude frameworks—these are models observers build to describe what they think is happening inside their theoretical (philosophy) or experimental (psychology) study subjects. The idealized rational agent who operates by probability distributions and revises their thinking as they get new information? That’s a figment, not a person.
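For concreteness, here is a minimal sketch (in Python, with invented numbers) of the kind of OI model Bayesian epistemology builds: an “agent” that holds not a commitment but a probability, and revises it mechanically whenever new evidence arrives.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Revise P(hypothesis) after one piece of evidence, via Bayes' rule."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# An idealized "agent" wondering whether it's raining.
p_rain = 0.3  # prior degree of "belief" (invented number)

# Evidence arrives: the window is wet.
# Assumed likelihoods (also invented): P(wet | rain) = 0.9, P(wet | no rain) = 0.2
p_rain = bayes_update(p_rain, 0.9, 0.2)

print(round(p_rain, 3))  # the "belief" is now just a revised number: 0.659
```

Notice what the model leaves out: there is no attachment, no defense, no ego anywhere in it—just a number that moves when evidence arrives. That is the sense in which the Bayesian “believer” is a figment rather than a person.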
IO belief is relevant to the real lives of us all. OI belief is relevant only to people who care about the work of academicians.
This article is about us as believers—what’s actually happening when we hold a position, whether we need to hold positions in the first place, and why we put “I believe” in front of things we want others to take seriously. What follows now tightly focuses on IO belief. It doesn’t apply to OI belief.
Maybe “believing” is just what we do when we lack what we need to state a matter as a simple fact, but we’re not OK being openly, frankly uncertain about it—or, more to the point, when we’re not OK with the vulnerability of that kind of honesty. Let’s find out.
Abstraction: Functional baseline
Separating an idea, symbol, name, label, truth statement, image, etc., from the actuality it refers to or represents is the whole point of abstracting at all. (I’m ignoring imaginary abstractions which don’t refer to anything real but, instead, serve as conceptual starting points—like math axioms and operators, or game rules.) It’s much easier to say “elephant” than to drag your friend to the zoo, march them over to the elephant habitat, then point and say, “That, there!”
This separability is the defining feature of non-imaginary abstractions.
An abstraction is a type of “thing” that has a kind of quasi-existence—enough so that our minds accept it as a mental “object”, a proxy for whatever it represents or refers to.
A photo of a loved one is not the loved one, but it’s close enough to evoke our feelings for them. We might love the photo, too—but we don’t confuse our love for the photo with our love for the loved one in the photo. The photo exists, so we could say the image of our loved one exists, but we don’t confuse the existence of an image with the existence of its subject.
Someone might spite us by drawing an absurd goatee on the photo of our loved one’s face, but that would have no effect whatsoever on their actual face. The photo could be torn up or mutilated or burned without our loved one feeling the slightest discomfort.
All the discomfort over these actions would be ours.
Still, we wouldn’t confuse that discomfort with the discomfort we’d feel if someone actually harmed our loved one.
None of this is especially complicated—we just don’t normally think about things at this level of detail.
You might say, “But a photo isn’t immaterial!”
True enough, but then neither do the paper/plastic, pigment, or pixels constitute an abstraction. The abstraction is the pattern that we, in our minds, take as a representation of something that actually is material—a loved one’s face. (We’ll ignore, for the sake of simplicity, that an abstraction might refer to another abstraction, which gets real sticky real fast but doesn’t change the basic structure of these relationships one whit.)
This is how we normally handle abstractions—we use them, we don’t confuse them with what they represent, and we don’t stake ourselves to them. Think of them as flags we plant in the ground of some territory. As long as they’re useful to mark something, to use for navigation, or even just as decoration, they stay put unless we find better spots to put them. When they lose their usefulness, they become litter, and we pull them up and discard them. Abstractions are low investment—fences, not walls. Lean-tos, not forts. This is the baseline. We revise or discard abstractions at will—precisely because we don’t invest much in them or attach ourselves to them.
Belief: Bye-bye baseline
Beliefs primarily concern a specific type of abstraction: truth statements.
The difference between a belief and a simple, abstract truth statement lies in the belief’s durability. The very word “abstraction” connotes something intangible, iffy, provisional, and derived. When we wrap a truth statement up as a “belief”, we imply none of those things. Strictly speaking, beliefs are intangible, of course—just like any other abstraction. But you’d never know it to listen to most believers. Their beliefs often matter more to them than the supposed actualities they say they believe.
So where does this durability come from? Beliefs have three characteristics that simple truth statements don’t: attachment, resulting privileged status, and defensive position.
A believer has stopped planting flags and started building fortifications.
You take three steps in forming a belief, unless someone else took them for you and the belief was indoctrinated into you: by the explicit “instruction” of “education” or “religious training”, by social osmosis/impregnation (“Everybody knows that…”), or simply by being left ignorant of other options, as people often discover when they experience cultures other than the one they grew up in.
First, you attach. Something about this abstraction matters to you—it’s not just useful, you’ve made it yours. You’ve got skin in it.
Second, attachment confers privileged status. The abstraction takes on weight. It’s no longer a flag you’d pull up when a better spot appears. It occupies specific territory in your head. It’s got psychological mass and inertia. It resists being moved.
Third, you take a defensive position. “I believe X” is only secondarily about X. What you’re really announcing is your personal attachment to X and the fact that you’ve dug in. The flag doesn’t just mark a position—it stakes you to what, for you, is now quasi-holy ground where you’ve taken a stand that you’ll defend, building walls around it. You’re not going to let anyone break in and capture that flag. If someone goes after your belief, they’re attacking your treasured position—not merely the abstraction, i.e., the idea itself.
Put those together—attachment, privileged status, defensive position—and you get durability. The belief isn’t just a useful idea anymore—it marks hallowed, fortified ground you’ve invested in.
Believing attaches ego to abstraction.
What these walls don’t stand on
Here’s where it gets interesting.
No one comes indoors from a recon run, sopping wet, and says, “I believe it’s raining!”
If you do the epistemic work—scout the territory, check the weather, test the ground, gather the evidence needed to show a position is worth occupying—you don’t need to “believe” it’s raining. You know. Your epistemic work—being out there in the rain getting soaked—relieves your need for belief entirely. If not, you’ve got more work to do—but don’t come back pretending you did it and act like pretending substitutes for doing.
More importantly, if you did the necessary work, you’ll feel no need to defend the position. If people object to or disagree with your findings, they’re free to go out, scout the territory, check the weather, test the ground, and gather the evidence themselves—just like you did. In science, this is called reproducibility. If you’re right, they’ll find out. If you’re wrong, they’ll find out—and you can be sure they’ll let you know about it! 😁
This is the best way we currently know of to discover and establish what’s really going on, aka “the truth”. Either way, right or wrong, it’s a win for all. The fact that belief comes into the picture at all signals a failure of intelligent investigation, reasoning, and evidential support. The widespread notion that it makes sense to preface statements with “I believe…” without reason, evidence, or right to state them as facts isn’t just wrong—it’s dumb. Substituting two words for doing the necessary work is a way to pretend you have a claim that’s grounded—has a basis in reality—when you did little to nothing to make sure of it.
Adopting that posture is the dishonesty of belief.
Belief is what happens when the walls go up without any reconnaissance, scouting, testing, evidence. You’re here because you’re here. You’re attached to the spot, but you’re not honest enough to leave it at that. You need something to convey that its value isn’t merely subjective. A fortification does just that. “They must have had a good reason to go to all this work!” But you don’t have any good reason. You’re attached—and that is all. Attachment posing as justification. No sane, rational grounds—just commitment wrapped around a position that you sanctified because it’s important to you, that you defend as if it were gold when you have no reason to think it’s anything but dust. The walls don’t just defend the ground—they hide the fact that there’s nothing special there beyond your attachment to it.
At the beginning, I pointed out that we use “I believe” when we’re not certain enough to state something as a fact, or even to say we think it’s true. I said that it’s the “oddball” and asked how it is not incoherent, because I’ve been unable to find any coherence in it.
Far from coherent, beliefs are incoherent on two levels.
First, the stance itself is plain: commitment without grounds. That’s incoherent, because it’s arbitrary. You’re admitting you have no better grounds for the belief than you would for a belief in flying pink unicorns—none that you’re capable of showing, because if you had them, you’d show them. Instead, “believe” covers the hole. No grounds is none—which is all there is in either case.
Second, it’s a sleight of hand. “I believe God exists” answers a question no one asked. The real question is whether God exists. Speaking bluntly: no one cares whether you believe it or not. That’s important to you alone. Your belief is wholly beside the point, but you’re treating it as the most important point. That’s incoherent, because you and your commitments are not the issue. This is the narcissism inherent in “belief”.
People do a funny thing when confronted by the fact that they have no grounds for a belief. They point to related aspects for which they do have reason and evidence, then act as if that credibility transfers to the aspects for which they have none. It doesn’t. If you follow the first three steps of a recipe correctly but pull step four out of your ass, no one’s going to eat the cake.
If you have grounds, it’s not belief—it’s knowledge. If you have grounds for some aspects of a belief, they stand on their own. They don’t need to be believed, and they can’t rescue the parts that do. “Belief” doesn’t apply to them at all. The only aspects that make it a “belief” are precisely the ones without grounds. Believing covers that hole. It’s misleading and actually dishonest to pretend that the ground around the hole is part of the hole.
Questions of truth become irrational once you’ve inverted the fundamental relationship by which truth is established. Meaning what? Beliefs are like photos of the Pope on your wall: when he condescends to pay your family a visit, you start talking to the photos, asking them whether this guy’s for real, because he looks completely different in the flesh.
These incoherencies of belief are instrumental and crucial in cult formation.
First, believers build walls out of thin air, disguising personal attachment as objectively real value. Second, they defend their fort. Third—the trick—they treat the very act of defending it as evidence that it’s worth defending—so, of course, it must bear defense-worthy value, i.e., truth. The defense itself becomes proof of truth.
If these constructs were rational, evidence would come first, then walls. Instead, walls come first, and defending them becomes the “evidence”. The believer pretends their position is self-justifying: “I wouldn’t have taken this position if it weren’t true, so the fact that I took it proves it’s justified.”
So what’s the difference between someone who did the necessary epistemic work and someone who just built sky-castles? Supposedly, they both ended up with beliefs. Academicians don’t differentiate them. Yet, from either vantage—inside or outside—they look nothing alike.
An honest person finds a spring and shares it openly. Taste—see for yourself whether the water is sweet.
The believer claims to have a spring too—hidden behind walls. To taste the water, you must go inside—but they control the gates. By the time you’ve given the password and paid the toll, you’re already invested in the water, so you’re already sympathetic to the believer’s estimation of the sweetness of their wonderful truth-spring.
I’m sorry, but that is not an open, honest, unbiased, intelligent way of handling information claimed to be true.
If you could see this one cognitive turd for what it is, you’d never get conned again, believe it or not! Abusive situations would disintegrate in a short time. It would be impossible to form cults. Not kidding even a little bit.
Abs got privileged
A belief surrounds a truth statement. No matter which angle you come at the statement from, you have to get through “believe” first. Both, being abstract, are separable from reality. But beliefs do something simple truth statements don’t: they can be turned against the very realities they were derived from—subverting, dominating, and overruling them.
We observe things falling. It’s a consistent pattern. So, we name it “gravity”. It’s so consistent, it’s just as if there were a “law” governing it. We name it, too: the “law of gravity”. Before you know it, we’ve taken for granted that this “law” actually exists out there, “governing” how things fall. We have no evidence for any such “law” that “governs” the very phenomena we derived it from, but the idea is appealing, and it provides psychic consonance. It feels better than “things fall and we don’t know why.” So we believe: there’s a law. Then something falls in an unexpected way. First, we resist—it broke the law! But since it persists, we eventually revise the “law” to fit. The actuality determined the law all along—and always will. And yet, we go right back to believing the law “governs” the actuality.
This circularity is unique to beliefs, and it’s not accidental. A belief is an abstraction constructed in such a way that it not only can contradict what’s really going on, delegitimize it, and come out on top in the minds of believers—but, by its nature, is highly prone to do so. Think Christian Scientist or Pentecostal refusing medical attention because they believe God has healed them.
This is why beliefs are critical to cult formation. You can’t make a cult without them.
Do beliefs necessarily create cults? That depends on whether you think cults are just weird fringe groups. If you focus on essentials—what makes a cult cultic—it all comes down to beliefs: how they operate and the incoherencies they inject into and spread throughout groups.
If the Roman Catholic Church operates by the same mechanisms and in similar ways as the Church of Scientology—same tricks, same motives, same goals—what makes one a reputable institution and the other a cult? Size? Age? Social acceptance? Judged by essentials, the question isn’t which one is the cult, but which is the bigger one.
When you look at how beliefs actually function, they always drive toward cult formation. The cultism consists precisely of the incoherencies of belief, which are baked in. The weirdness and harmful cult practices are just symptoms.
Some will say I’ve merely redefined “belief”, reducing it to pathological edge cases. I’d ask them to show how non-pathological IO belief differs from simple abstraction.
They won’t—they can’t—because “non-pathological belief” is a ruse, a trick, a con job. They’re welcome to try, and I’ll listen if they come up with anything.
What I’ve described is what belief does, always: attachment confers privileged status; privileged status implies value; value demands defense; and defense fuses the believer to their position. As they see it, they cannot leave their position without grave risk. And all this happens without making any effort to ensure their believed “truth” has the slightest connection to reality. That work was outsourced, if it ever happened at all. But once a truth statement gets wrapped in belief, not only is that grounding epistemic work irrelevant—because believing makes it unnecessary and beside the point—but making it irrelevant is precisely the point.
This egocentrism and ego attachment accounts for why beliefs are contentious in a way all their own. Controversies over beliefs typically wrangle about whether the belief is “true”—the equivalent of arguing whether a caricature is “true”. But who defines “true”? Who judges whether the belief adequately resembles the reality it claims to represent? Who decides which reality it represents—or even if it refers to anything real at all?
You might say: “But it’s no different with any other kind of abstraction!”
No, not true at all. Granted, those same questions are relevant to plain old vanilla abstractions, too—but that’s where the similarity ends. Simple abstractions aren’t remotely as contentious and conflict-prone as beliefs.
If beliefs were no different, then wouldn’t mere abstractions generate the same kind of disruption, conflict, even animosity that beliefs can and often do? When was the last time you got in an argument over an abstraction? I bet you can tell me all about the last time you got in an argument over a belief, though—that is, if you’re not like most people, who go out of their way and bend over backward to avoid bringing up beliefs precisely to prevent the kind of fireworks they tend to set off.
Abstractions just aren’t fireworks material. Beliefs are, because the believer simply can’t afford to let them not be. Challenge a flag usefully marking a spot, and you’re quibbling over placement—easy enough to move. Challenge an occupied, fortified position, and you’re attacking defended, hallowed ground. Of course there’s gonna be a fight.
Proxy Wars
“Debates” over beliefs are never in good faith. How could they be? The openness, honesty, and suspension of judgment required for good-faith discussion (debates are just pitiful knock-offs) were excluded the instant the believer erected their first wall. These debates are like puppet shows, where the real contention—the struggle for narrative domination of the audience—never gets exposed. It stays obscured behind scenery and costumes and antics, hidden in the subtext, while the competing puppet masters pretend scripted dialog is “discussion”, even though none of them is listening to the others. Theism vs. atheism, evolution vs. creation, pro-life vs. pro-choice, and every debate between rival forms of politics or economics or religion—all are examples.
These dialogues of the deaf, chronic and interminable, burn on for one reason only: no one involved is being honest. They pretend they care about truth, having already rejected its requirements.
We live on the same planet, where life takes place in largely the same ways, where physics behaves the same whether you’re in North America or South America or Africa or Europe or Asia or Australia or India or Antarctica. If we all look at the same thing happening without harboring preexisting, ulterior agendas, we’ll all see pretty much the same thing. And if we’re honest, we’ll all report pretty much the same thing. Any variations in perspective or timing, any distortions of perception or inadvertent misrepresentations will be no more difficult to locate and rectify than a coding bug in an app. We all know how to locate glitches like that, and we already do it routinely in our communication and relationships with each other... when we’re being honest. But when we’re being dishonest, we leverage those same glitches to gain illegitimate advantage.
The problem isn’t just that this ideological gaming happens between people who aren’t honest and confident enough to put all their cards on the table. Beliefs are devised precisely to get games started and to let players game each other. Beliefs create wiggle room that inevitably expands into vast stretches of bullshit-smothered territory. Without abstract constructs that can be manipulated to obfuscate, confuse, and misrepresent what’s really going on, resolving our differences would be a skill we’d master—and in fact do master—by the time we’re 12.
Where This Leads
A belief is an abstraction taken as true that the believer converted into a fortified position. A believer doesn’t defend either the abstraction itself or the reality it represents. They defend their position and its fortifications—because those are where they invested their work. This means the believer isn’t primarily interested in the reality involved nor the truth about it. Neither is their treasure. Their treasure is their position and the work they did to fortify it.
Believers defend their occupation of the positions they chose—whether or not their choice itself makes any sense. This is why all IO beliefs are egoistic artifacts. Think back to “It’s raining” vs. “I believe it’s raining”. Which one explicitly represents ego and which completely ignores it?
Understanding this is all you really need to make sense of “identity politics”—or any kind of politics, for that matter. Believers don’t defend anything real, nor the truth about it. They defend their occupation, right or wrong. They defend the walls that protect their occupation of a chosen position. They defend their investment. All of that revolves around ego, not truth.
This explains what we all see but don’t usually name: believers will defend their beliefs against reality itself. Their occupation of their chosen position matters more to them than whether the position is valuable, accurate, legitimate, or functional. They claimed it, they staked it, they occupied it, so it’s “theirs”. They now have a position, and woe to anyone who threatens it.
Haven’t you ever seen people instantly start speculating when they hear something they don’t know what to make of—often gossip about someone not present? They can’t tolerate not knowing what’s really going on, but no method they’re willing to follow would actually get them the information they’d need to figure it out. So they come up with one hypothesis or conjecture, then another and another, groping toward some kind of opinion, itching to draw a conclusion—any conclusion.
The instant they land on a plausible idea they like—or hate—their tension over what’s going on dissolves, maybe replaced by tensions induced by what they “realized”. This isn’t because they figured anything out, but because they now have a position. The relief comes from concluding, not from understanding, let alone being right about it. The facts of what’s really going on have nothing to do with the fact that now they have a place to take a stand, one which they feel confident they can defend.
Truth? What’s that? 🤣
It’s effectively impossible for believers to sit with the discomfort of not “knowing”. They simply must manufacture a situation where they can plausibly extract the comfort of closure—even if it means bullshitting with fake “knowledge” to do it. Bottom line, that’s what “beliefs” do for them. They simply must have a position to occupy. Until they establish one, it’s all that matters. Whether or not their choice of position is sane and smart or psychotic and destructive, it’s theirs, by God, and forever will be until a “stronger man” invades, ties them up, and plunders them.
Go back to the photo. We don’t confuse tearing up the photo with tearing up the person it depicts. But believers are so invested that they take an attack on the position they staked as an attack on the judgment that led them to stake it—which means an attack on them. They’ve left the baseline of functional abstraction entirely behind.
A believer doesn’t just occupy their fortified position—they’re one with it. Breach the walls and you breach them.
And if those walls are made of nothing but attachment and its defense—if the fortification was never built around what’s true and valuable but boils down to a monument to the builder’s commitment to the position, true or valuable or not—then what’s being defended is patently irrational. This is not a position worth holding, but one whose only value derives from the mere fact that it’s held.
Assembled together, the various fortresses a believer builds form an entity, an encampment of sorts. Some call this a “worldview” or a “belief system”, but we’ve already seen how biased, egocentric, and ego-attached these are from the ground up. It actually comprises the mentality of a pseudo-self—the notorious ego—in a delusory world where simple say-so makes it so. No matter how well it all seems, superficially, to hang together, it still just floats in air like a castle in the sky.
This is the mental world of a cultist.
Infantilism
Tommy and Billy are in the midst of bonking each other over the head because neither will share his Legos with the other. Mom suddenly enters the room. What happens next? Or, maybe the better question is: what stops happening?
Lou and Steph just got into their car after Sunday morning Mass and quickly get into a heated argument. On the drive to church, they’d agreed to have Thai food for lunch—but by the time the service ended, Steph was craving Mexican. Lou had his heart set on green papaya salad. Plus, now, he’s miffed with Steph, since she thought she could break an agreement so casually. The priest notices them on his way over to the rectory and stops to say hi. What happens next?
Clyde and Madge and their three kids are one hour into a 5-hour drive to a vacation cabin when the kids kick up a ruckus in the back seat. Madge tries to settle them down, but Clyde is getting angrier by the second, both at the kids and at Madge for failing to shut them up. Suddenly he explodes, veers the car abruptly onto the shoulder, and grinds to a stop in a cloud of dust. Everyone is screaming at everyone else. A trooper who wasn’t too far back sees the erratic driving and stops behind them. He gets out of his car and approaches theirs. He’s all the way to Clyde’s window before anyone notices him. What’s their reaction?
In each case, 100 to 1, the brothers, the couple, and the irate family (except little Lester, 4-y-o and a firecracker who hasn’t “learned” to respect much of anyone yet, let alone “authority”) assume the position and the attitude, more or less, of an infant confronted by a god.
Infantilism.
The Three-Legged Stool
There are only three requirements for a cult. When all three are met, a cult has formed.
1. Infantilism
2. Exploiters/extorters who leverage and exploit the infantilism
3. Occultism: the darkening or obscuring of information that, in honest relationships, would be plain and openly available. For example: taboos on questions and criticism; communication control/censoring that creates self-justifying loops; banning or demonizing information from outside sources; making secrets of relevant truths/facts; passing metaphorizing and euphemizing off as virtues; a culture of hiding/lying as love.
If you think that’s far too broad for “cult”, it’s not because some of the cases it covers aren’t actually cults—abusive domestic situations, for example. It’s because you’ve been conditioned to think of cults behaviorally instead of in terms of the essential characteristics common to them all. You’ve been conditioned to ignore the biggest and most powerful cults, thinking that they’re legit.
Those three “legs of the stool” create a cult, every time. Eliminate just one of them, and the stool will topple and disintegrate in short order. All three legs need continual maintenance and reinforcement to keep the stool standing strong.
Cults are extremely labor-intensive and, like their counterpart operations in general society, profit the few at the top on the backs of the many.
You’ll discover how essential these “legs” are to a cult if you even marginally appear to pose the slightest threat to any of them. The immediate and vehement reactions from everyone involved—leader and minion and underling alike—will expose that it’s actually a cult, whether people “define” it as one or not.
The stool must stand.
The floor is lava.
Testability
These things are testable.
Find IO beliefs that don’t play a role in ego edification. Find IO beliefs that don’t entail ego issues. Find IO beliefs that have not formed cliques and parties and cults and entire “schools” of thought including religions, as well as the institutions which service them—that is: beliefs that don’t sport the pathological characteristics I’ve outlined.
Then, show how these non-pathological IO beliefs (not OI “beliefs”) differ from simple abstractions. Only then will you have identified a disconfirming counterexample.
If you succeed, then I’ll gladly revise my thinking. But then, you’ll be faced with a quandary. You’d have to answer the question why you call them “beliefs” and not, simply, “abstractions” or “ideas”. If you’ve got no reason for it, and if there’s no value added to offset violating Occam’s razor with unnecessary entities, introducing ambiguity, and making room for confusion, it just leaves me wondering what you’re actually trying to do.




"You are right" OR "You believe you are right"... tough call.