A few years ago, I came across a fascinating and provocative academic article by Dan Sperber called The Guru Effect. It rewired how I saw the world. One particular passage really stuck with me.
Historically, the transition to modernity can be described as the replacement of authority by argument as the main basis of justified beliefs.
But let’s go in order. In the article, Sperber dives into the role of authority in belief formation, emphasizing how a runaway dynamic of overappreciation can promote the power of authority to shocking heights. He sets his sights on philosophy, especially the continental kind, and argues that some philosophers might milk this effect a little too much—building an aura of profundity with dense, cryptic writing that elevates them to the status of intellectual gurus. However, Sperber’s arguments transcend the confines of philosophy, offering broader insights that can be applied to all kinds of social domains.
1. A Short Summary
For something published in an academic journal, the article doesn’t pull punches. Even the abstract comes out swinging, laying down a pretty sharp and provocative tone right from the start.
Obscurity of expression is considered a flaw. Not so, however, in the speech or writing of intellectual gurus. It is not just that insufficiently competent readers refrain, as they should, from passing judgment on what they don’t understand. All too often, what readers do is judge profound what they have failed to grasp. Obscurity inspires awe, a fact I have been only too aware of, living as I have been in the Paris of Sartre, Lacan, Derrida and other famously hard to interpret maîtres à penser. Here I try to explain this “guru effect.”
Leaving aside disquisitions on tone, the article makes several compelling points that led me to reconceptualize the power of the authority bias.
One of Sperber’s key observations is how authority automatically benefits from confirmation bias. Once someone is granted authority, it becomes surprisingly easy to interpret disparate outcomes as evidence of their credibility, while dismissing counterexamples as mere outliers. The more ambiguous the evidence, the more space there is for confirmation bias to take root, reinforcing the belief that the authority figure is credible—regardless of what the evidence actually suggests.
Another insight concerns ambiguous statements and their potential to create a phenomenon of runaway authority. When someone in a position of power makes vague or obscure remarks, their perceived importance often stems less from the content itself and more from the speaker's status. This can create a feedback loop where the authority of the speaker enhances the impact of their statements, which in turn further inflates their perceived authority—a self-reinforcing vicious cycle exacerbated by the natural human tendency toward interpretive charity.
As if that weren’t enough, there’s a strong social incentive to align with the powerful. Those in power rarely welcome dissent; they’re more likely to reward compliance while actively suppressing those who challenge them. So, authority tends to naturally self-strengthen; once people begin assigning legitimacy to someone or something, the number of followers increases—likely in an exponential manner—compounding the possibility of runaway authority.
However, all is not lost. Sperber suggests that we may possess a natural safeguard against the power of authority: argumentation. He proposes that our ability to craft and evaluate arguments may have evolved specifically as a defense mechanism against deception and manipulation—tools often wielded by those protecting an unjust power structure. In a subsequent article, Sperber explores this idea further, delving into the possible adaptive functions of argumentation and its role in countering undue influence.
To conclude, Sperber provides a sociological analysis of the dynamics within certain intellectual traditions:
Here emerges a collective dynamics typical of intellectual schools and sects, where the obscurity of respected masters is not just a sign of the depth of their thinking, but a proof of their genius. Left on their own, admiring readers interpret one recondite passage after another in a way that may slowly reinforce their admiration (or else render them wary). Now sharing their interpretations and impressions with other admirers, readers find in the admiration, in the trust that others have for the master, reasons to consider their own interpretations as failing to do justice to the genius of the interpreted text. In turn these readers become disciples and proselytes. Where we had the slow back-and-forth of solitary reading between favourable interpretation and increased confidence in authority, now we have a competition among disciples for an interpretation that best displays the genius of the master, an interpretation that, for this purpose, may be just as obscure as the thought it is meant to interpret. Thus a thinker is made into a guru and her best disciples into guru-apprentices.
Told you the article doesn’t pull punches.
2. Implications for Academia
When I first entered university, I thought of it as a sort of intellectual heaven. A self-critical bastion of truth, free from the trappings of authority that bog down the outside world. But, as had happened before, I had underestimated my own authority bias and the role it played in shaping my perception. One of my early mistakes was assigning almost untouchable levels of authority to some of the thinkers mentioned in Sperber’s article. The truth is that the issue of runaway authority is everywhere in human society, at every level—and unfortunately, academia is not immune. I soon discovered that even in paradise, there are thorns.
Much of academic research operates on implicit trust and influence. The peer review system, in theory, is designed to mitigate these problems, but it also creates strong incentives to carve out insular research niches. A particularly eyebrow-raising practice, even in some top journals, is the request for authors to provide a list of possible peer reviewers. I don’t think I need to explain how this could lead to obvious conflicts of interest. When a niche is created, scholars often band together, citing one another and forming orthodoxies of thought that can be difficult to challenge. I have my suspicions about certain areas of methodological research, in my own field, that may have fallen prey to this dynamic.
Now, peer review has been criticized to death, and I don’t intend to pile on excessively. Yes, something should be done to address its flaws—maybe making all peer reviews double-blind, supporting negative and null results more actively, or offering better incentives for being a high-quality reviewer. But the real challenge is that the issues we are facing stem from biases deeply rooted in human nature itself. Our natural tendency to trust authority and follow established norms creates fertile ground for runaway authority. It’s likely a very hard problem to fully solve, no matter how well the peer review system is designed.
One thing I sometimes wonder about is how many enclaves of unjust authority could be out there, hidden in plain sight.
Psychoanalysis is one of the usual suspects. Critics have long accused it of being more of a dogmatic belief system than a rigorous science, with its tight-knit community of adherents reinforcing its authority despite a lack of empirical rigor.
String theory might be another contender. The murmurs around it seem to be growing louder. I know mathematicians in my own department who quietly suggest that string theory has been unproductive for decades, hinting that it’s gone off the rails of runaway authority. For all its elegance and theoretical beauty, they argue, it hasn’t delivered much in the way of concrete results for over 40 years.
Sperber, in his article, suggests that Roger Penrose might have benefited from runaway authority too.
Honest gurus are not trying to deceive their audience. Nevertheless, they may produce arguments that will persuade most of their readers not by their logical force, but by their very difficulty. A recent illustration is provided by The Emperor’s New Mind by the eminent physicist Roger Penrose.
Without the relevant expertise, it’s difficult to judge which of the cases mentioned are truly suffering from runaway authority and which are simply misunderstood.
And if there are examples that stand out, how many smaller, less visible enclaves of runaway authority might exist in other corners of academia? Every niche field, every obscure research domain, could potentially harbor its own orthodoxy—where a handful of influential figures define the conversation, and everyone else simply aligns to maintain a semblance of credibility within the group.
3. Implications for Society
If we step outside the academic domain, the problem doesn’t just go away—in fact, it might even get significantly worse.
Take the workplace: How many bosses are coasting on runaway authority, throwing their weight around despite being less competent than some of the interns? Consultants roll in with PowerPoints full of models that sound brilliant until you realize that nobody has checked their assumptions since the Obama administration. And that CEO who speaks only in buzzwords—is he a genius, or did he just get lucky in his previous merger?
But it doesn’t just stop at the office. Many would say organized religions are the OGs of runaway authority. Sacred texts, divine mandates—authority dialed up to eleven. Forget corporate mission statements or consultant slideshows; what’s a PowerPoint compared to "because God said so"?
Then there are politicians, who don’t just benefit from authority bias—they dream of engineering it. Every campaign slogan, every carefully staged photo-op, every soundbite is a calculated move to project infallibility. Politicians understand our human tendency to conflate authority with competence and morality, and they exploit it masterfully.
Historically, this trust in authority has led humanity to believe bizarre things and commit shocking atrocities. Dictatorships silenced dissent with bloodshed. Witch hunts burned the innocent. Eugenics cloaked bigotry in a veneer of science. Time and again, authority has been wielded to justify the unjustifiable.
Sometimes I catch myself wondering: what authority bias am I still captured by?
Is philosopher David Benatar right that bringing children into the world is immoral, a selfish gamble that dooms new lives to suffering? I’m skeptical, but I’ve never seriously grappled with his arguments.
And what about Peter Singer’s haunting critique of humanity’s indifference to animal suffering? Is humanity blinded by a consensus effect, brushing off the moral weight of eating meat simply because everyone else seems to do it? Could it be that factory farming—our silent approval of animals raised and slaughtered in misery—is not just wrong but, as Substacker Bentham’s Bulldog warns us, the greatest crime in human history?
4. The Extreme Value of the Good Dissenter
The quintessential way to dismantle runaway authority and bring us closer to the truth lies in the deliberate and courageous actions of a dissenter.
A dissenter is someone who publicly challenges the established authority or dominant narrative, often pointing out flaws, inconsistencies, or blind spots that others overlook. They’re the innovators, the relentless troublemakers, the people willing to say, “Wait a minute—are we sure about this?”
Unfortunately, there’s a catch: many dissenters tend to be...a little crazy. Some of them are genuinely insightful, but others latch onto fringe theories or embrace contrarianism for its own sake, muddying the waters. The problem is that authority dynamics are so entrenched that dissent often requires a certain degree of stubbornness—or outright obsession—to persist in the face of ridicule and resistance. And while that determination is admirable, it can also attract people whose arguments lack coherence or grounding in reality.
Some, like Terrence Howard, are so far off their rocker that they might need a search party to find it again. Others, like Bret Weinstein, can perhaps still see their rocker—but it’s hard to say they’re sitting on it when they propose hypotheses like the idea that countries in a position to expand have an evolutionary mechanism for creating armies by producing excess males.
Which raises an interesting question: what makes a good dissenter?
A) Good dissenters generally need a deep understanding of the field they are challenging, including its foundational principles and methods. Without this, their critique risks being shallow or misguided.
B) They must also be open to dialogue, willing to engage with opposing views and defend their reasoning with clarity and rigor.
C) Their dissent should be driven by a commitment to truth, fairness, or justice—not by personal gain, malice, or a desire to simply stir the pot.
D) Good dissenters must be courageous and patient, with strong epistemic practices that prioritize evidence and reason over impulse or ideology.
E) They should also be willing to engage with constructive criticism, incorporating it into their arguments. The greatest dissenters not only challenge others but remain open to the possibility that they themselves could be wrong.
When you consider this list of qualities, it’s easy to see why good dissenters are so rare. Being a good dissenter is incredibly challenging and psychologically taxing—it’s all too easy to lose your way, become virulently bitter, or veer off the rails.
Substack is a kind of safe haven for dissenters, offering an outlet for voices that challenge the mainstream. And while it’s flooded with every kind of harsh contrarian, I think we’re fortunate to have some genuinely good ones. I’ve already mentioned Peter Singer and Bentham’s Bulldog arguing against factory farming. Another example is Michael Huemer, known for his thought-provoking dissent on various topics, including a radical skepticism of all forms of political authority.
Personally, I also follow a smaller community of dissenters who are sharply critical of the analytic philosophy tradition—voices like Lance Bush, Digital Gnosis, and Stan Patton. I’m quite confident they’d argue that parts of analytic philosophy have fallen victim to runaway authority. I’m not sure I’m fully equipped to evaluate their claims, but I find their dissent consistently clear, well-argued, and open to engaging with critics.
The problem of runaway authority—let’s remind ourselves—is pervasive throughout society. A dissenter often takes on significant personal risk, facing ridicule and exclusion to challenge the status quo. Some, like Ignaz Semmelweis—who advocated for handwashing in hospitals to prevent infections—endure lifetimes of derision, their ideas vindicated only after their deaths, leaving behind a story too tragic to even succeed at the box office.
The good dissenter is extremely valuable. Those rare few who manage to be good dissenters deserve nothing less than our utmost respect and appreciation. Yet, all too often, they don’t receive it. If we found ways to properly reward good dissenters, their role might feel less frustrating, fewer would lose their way, and their critiques would likely be delivered with less vitriol and a more collaborative focus.
Perhaps, to borrow Sperber’s words, we could also accelerate the replacement of authority by argument as the main basis of justified beliefs.