Why Nobody Believes Anything Anymore
We’re living through an epistemological crisis, which is a fancy way of saying nobody knows what’s true anymore.
It’s not that people don’t believe things. They believe all sorts of things, passionately and confidently. It’s that there’s no shared basis for belief, no common ground for establishing facts, no trusted arbiters of truth.
Ask ten people about any significant political or social issue and you’ll get ten different sets of “facts.” Not interpretations. Facts. They’ll cite sources, show you evidence, explain in detail why their version is correct and everyone else is deluded or lying.
They can’t all be right. But increasingly, they can all find support for their positions. The internet has made it possible to construct internally coherent alternative realities, each with its own evidence base, expert opinions, and explanatory frameworks.
This is historically unprecedented. For most of human history, information was scarce. The challenge was access. Now information is abundant to the point of meaninglessness. The challenge is figuring out what’s true.
Traditional gatekeepers have collapsed. Newspapers, once trusted, are now dismissed as biased or fake. Scientists, once respected, are accused of corruption. Experts in any field face immediate challenges from people with internet connections and strong opinions.
The result is the predicament philosopher Bruno Latour warned about when he asked why critique had “run out of steam”: the tools of deconstruction turned against facts themselves. We’ve spent the last several decades deconstructing authority, questioning narratives, revealing hidden biases. This was often valuable. Authorities sometimes did lie. Narratives were sometimes propaganda. Biases were often real.
But we’ve gone too far. We’ve gotten so good at questioning everything that we can’t believe anything. Every fact is contested, every source is suspect, every claim has a counter-claim. We’ve achieved perfect skepticism and lost the ability to know.
Part of the problem is structural. Social media rewards engagement, not accuracy. Outrageous claims spread faster than careful truth. Conspiracy theories are more interesting than mundane reality. Algorithms optimize for attention, not for understanding.
But part of the problem is psychological. We’re all susceptible to motivated reasoning, the tendency to accept information that confirms what we already believe and reject information that challenges it. In an environment of infinite information, we can always find support for our preferred conclusions.
This creates what researchers call “epistemic bubbles” and “echo chambers.” The terms are related but distinct: a bubble merely leaves opposing voices out, while an echo chamber actively trains its members to distrust them. Either way, we surround ourselves with sources that agree with us, people who think like us, information that reinforces our worldview. We rarely encounter genuine challenges to our beliefs, and when we do, we dismiss them as propaganda or stupidity.
The consequences are severe. A functioning democracy requires a shared understanding of basic facts. When we can’t agree on what’s real, we can’t have productive disagreements about what to do. We can only shout past each other from incompatible realities.
This shows up in personal relationships too. Families split over politics not just because they disagree, but because they literally inhabit different factual universes. One person’s obvious truth is another person’s obvious lie. There’s no neutral ground to retreat to, no trusted source both sides accept.
So what do we do? How do we navigate a world where everything is contested and nothing is certain?
First, we need epistemic humility. This means acknowledging that we could be wrong, that our sources could be biased, that our reasoning could be flawed. It means holding our beliefs a little more loosely, being willing to update them when we encounter good counter-evidence.
Second, we need to rebuild trust in institutions, not blindly but carefully. Not all experts are corrupt. Not all journalism is propaganda. Not all science is fraudulent. We need to get better at distinguishing between legitimate criticism and conspiracy thinking.
Third, we need to practice better information hygiene. This means checking sources, reading beyond headlines, seeking out perspectives we disagree with, being suspicious of information that makes us feel good about our existing beliefs.
Fourth, we need to accept uncertainty. Not everything is knowable. Not every question has a clear answer. Sometimes the honest response is “I don’t know” or “the evidence is mixed” or “reasonable people disagree.”
This is uncomfortable. We want certainty. We want to feel like we understand what’s happening. But false certainty is worse than acknowledged uncertainty. Better to know what we don’t know than to be confident in falsehoods.
The epistemological crisis won’t resolve quickly. We’ve broken something fundamental about how societies create shared knowledge, and fixing it will take time and effort.
But the alternative is continuing to drift apart into separate realities, unable to communicate across the gaps, unable to solve collective problems, unable to function as a society at all.
The question isn’t whether we believe things. It’s whether we can learn to believe things together again.

