Philosophers Steven Nadler and Lawrence Shapiro open their book When Bad Thinking Happens to Good People with a dire warning. “Something is seriously wrong,” they write. “An alarming number of citizens, in America and around the world, are embracing crazy, even dangerous ideas.” These ideas include the beliefs that vaccines cause autism, that the scientific consensus on climate change is a hoax, and that 5G cellular networks contributed to the spread of COVID-19. According to Nadler and Shapiro, the problem with those who hold such beliefs is not that they are unintelligent or uneducated. Rather, it is that they “think badly”—they should be “perfectly aware that they are forming and holding beliefs irrationally and irresponsibly, and even doing so willfully.”
Nadler and Shapiro are not alone in thinking that liberal democracies are experiencing an epidemic of willful ignorance. In the last decade, many observers have lamented the advent of a “post-truth” era, an era in which a growing number of citizens have little or no interest in the truth and would rather believe what is convenient for them to believe than what they have reason to believe. According to this diagnosis, so-called climate deniers, for instance, are literally in denial about climate change—they would rather believe that climate change is a hoax than accept the inconvenient truth that human activity is irreversibly changing the Earth’s climate.
Despite its initial plausibility, however, the post-truth diagnosis faces two serious problems. The first is that, if science deniers are actually in denial, then it is unlikely that their beliefs are as dangerous as Nadler, Shapiro, and others assume. This is because, when people are in denial, it is their decisions that shape their beliefs rather than the other way around. Consider, for example, the smoker who is in denial about the dangers of smoking. It is not that he keeps smoking because he is in denial. Rather, he is in denial because he does not want to quit smoking. His state of denial does not motivate his decision not to quit—it merely rationalizes it. The same seems true of climate deniers. If they are indeed in denial about climate change, then their state of denial serves to rationalize their allegedly harmful decisions rather than to motivate them. If they are willing to go to such great lengths to justify those decisions, it seems unlikely that, even if we could somehow change their minds about climate change, they would suddenly make different decisions. It is more likely that they would simply find a different way to justify those decisions (as many of us already do).
The second problem is that many alleged science deniers do not seem to actually be in denial. Consider, for example, the working-class Republican voter who believes that climate change is a hoax because he has been repeatedly told so by several sources he trusts, including his favorite news channel, his favorite talk-show hosts, his favorite politicians, and the pastor at his church. While he might be ignorant about climate change, his ignorance does not seem to be willful. He is not in denial about the reality of climate change—he has only been systematically misled about it. Or, to pick another example, consider the middle-class mom who worries that vaccines might cause autism. She does not seem to be in denial either. After all, why would she be in denial about the existence of a safe and effective way to protect her child from serious illness? She, too, is likely to have been misinformed. In both cases, the problem is not that the individuals in question would rather believe what is convenient for them to believe than what is true—the problem is that both have placed their trust in unreliable sources of information.
But isn’t this enough to blame them for their beliefs? Shouldn’t they have been more careful about which sources to trust? Shouldn’t they have trusted the overwhelming majority of scientific experts instead of a handful of “hired guns” or pseudo-experts? The answer to these questions depends on how much we can reasonably expect of ordinary people. While we might trust a friend or a coworker because we have plenty of firsthand evidence of their trustworthiness, we do not seem to have any comparable evidence of the trustworthiness of the various sources of scientific information. Instead, we tend to trust sources that enjoy a good reputation in our community and to distrust the ones that do not. If we are lucky, we might belong to a community in which only trustworthy sources are trusted and untrustworthy ones mistrusted; but, if we are not that lucky, then we, too, would place our trust in the wrong sources, just as the two individuals in my examples did.
This, however, raises a deeper question: Why have so many communities come to place their trust in unreliable sources in the first place? As we have seen, the main problem is not that people care too little about the truth, as the post-truth diagnosis suggests. Rather, the problem is that many communities have lost trust in the institutions that are tasked with finding it. The post-trust diagnosis maintains that liberal democracies have entered a “post-trust” era—an era in which a growing number of communities have come to mistrust the institutions that form the backbone of society’s epistemic infrastructure, including science, government, academia, and the press. According to the post-trust diagnosis, misinformation about vaccines spreads more easily through communities that do not trust science to investigate the safety of vaccines, government to regulate their use, and the press to hold both science and government accountable.
What’s more, according to the post-trust diagnosis, the mistrust is not entirely unwarranted, as the epistemic infrastructure of contemporary liberal democracies is increasingly dysfunctional. The opioid crisis is perhaps one of the most dramatic examples of these dysfunctions. As profit-driven pharmaceutical companies aggressively promoted the use of increasingly potent opioid painkillers, communities were failed by every level of the epistemic infrastructure. Medical experts vouched for the safety of the new drugs, regulatory agencies approved them without adequately assessing their risks, physicians prescribed them in increasingly large quantities, and lawmakers weakened the power of law enforcement agencies to prevent their abuse. As a result, hundreds of thousands of lives were lost. In light of such a catastrophic systemic failure, it is hard to blame the communities most affected by the opioid crisis for mistrusting the very institutions that failed to prevent it.
Critics might argue that the opioid crisis was an isolated incident and, as such, does not justify the levels of mistrust needed to support skepticism about the safety of vaccines or the reality of climate change. However, this response faces three problems. The first problem is that the kind of systemic failure that led to the opioid crisis would seem to be enough to justify some mistrust of our society’s epistemic infrastructure. As the saying goes, “Fool me once, shame on you; fool me twice, shame on me.” The second problem is that this justified mistrust provides fertile ground for more extreme and less justified forms of distrust to take root, which can be exploited by those intent on sowing disinformation and cultivating confusion. The third problem is that the opioid crisis is far from an isolated incident. Racialized, disabled, and LGBTQ+ communities, for example, have been given plenty of reasons to be mistrustful of a scientific establishment that has pathologized their identities, conducted unethical research on them, or used its findings to justify their oppression.
The post-trust diagnosis maintains that, to address the spread of “crazy, even dangerous ideas,” we should work to fix our increasingly dysfunctional epistemic infrastructure and to rebuild trust with mistrustful communities. This is, admittedly, easier said than done. However, it is a more promising and constructive approach than the one suggested by the post-truth diagnosis, which limits itself to blaming our fellow citizens for their alleged epistemic failures.

Gabriele Contessa
Gabriele Contessa is Associate Professor of Philosophy at Carleton University. He has recently published a short monograph titled Science Denial: Post-Truth or Post-Trust? (Cambridge University Press, 2025) and is currently completing a longer monograph on public trust in science, which expands the communitarian account sketched in his article "It Takes a Village to Trust Science" (Erkenntnis, 2023).
