Well-Informed About Misinformation

Misinformation. Disinformation. Fake research. Fake news. Nowadays, these terms are everywhere. Misinformation, in its various guises, seems to threaten every aspect of our lives: political elections, medical advice, and public health policy, to name just a few. By some accounts, the spread of misinformation has acquired epidemic proportions. And, of course, as misinformation has proliferated, research on this phenomenon has also grown: from tracking how it spreads to analyzing its nefarious effects to learning how to fight it.

Research shows that misinformation is a complex notion. A better understanding of the nuances of the misinformation ecosystem would undoubtedly help improve our information environment, but my goal here is to caution about what research on misinformation can and cannot tell us. This is not to cast doubt on its value but to show that such research is easy to misconstrue, potentially compounding the problem.

My focus is on misinformation regarding scientific claims. Although the phenomenon may seem a more obvious peril in the political realm, scientific information forms the basis of life choices that have a significant effect on people’s well-being: what to eat, whether to smoke, how often to exercise, whether to get a vaccine, which car to drive, and what medicines to take. Scientific information also underpins public policy in fields like water management, pollution control, and public health mandates, all of which have an impact at a societal level.

Why is the problem of scientific misinformation of such great concern? The answer seems obvious. Misinformation and its close relatives, such as disinformation, are bad for epistemic reasons: they create confusion and produce false, inaccurate, or poorly justified beliefs. If misinformation is prevalent, it threatens knowledge not just for those who directly consume and believe it but for all of us. Where misinformation is widespread, it is tougher to determine what is true and what is false or misleading. We must work harder to gain true or justified beliefs. When misinformation is presented in sophisticated ways, even those who are vigilant and have good media evaluation skills risk acquiring false or misleading beliefs.

Misinformation is also bad for practical reasons. Our beliefs inform our decisions and actions; thus, false or misleading beliefs can lead us into trouble. For example, believing that smoking is fine for our health can contribute to a decision to smoke or oppose tobacco restrictions. Likewise, believing that global warming is not produced by human action can lead to personal choices that exacerbate the problem, such as driving or challenging policies regulating oil products. Indeed, misinformation in its various forms is thought to be a significant reason for opposition to various science-based policies, including mask mandates, vaccine recommendations, and climate change legislation.

In democratic societies, conducting research on misinformation—its prevalence, how it is produced, and how to combat it—is crucial. To fight misinformation, we first need to know what it is, how to identify it, and who is producing it. But this presents several challenges.

There are various related concepts—misinformation, disinformation, malinformation, fake news, and propaganda—that are not always easy to disentangle. Different researchers use different definitions, putting more or less emphasis on inaccuracy, untruth, falsity, deception, unintended tendencies to mislead, and so on. With no consensus about the basic concepts, it is difficult to quantify the phenomenon, with some research indicating that claims about how much misinformation is disseminated and consumed might be overblown.

Problems persist even when researchers reach conceptual agreement. Often, misinformation refers to false or inaccurate information, while disinformation and malinformation imply a deliberate attempt to mislead or cause harm. Unfortunately, the false and the intentional are not quite the litmus tests they seem.

Although some false statements are easy to identify, others are trickier to assess. For instance, to those who understand the technologies involved, the notion that Covid vaccines could contain tracking chips is ridiculous, but even people who keep up with new developments in science and medicine can be confused about cutting-edge claims, since such knowledge changes very rapidly.

It is even harder to assess the intentionality element. Mistakes happen, and there are many legitimate areas of scientific controversy. Furthermore, determining intentionality is far from a straightforward task. For example, whose intentionality is relevant? Although the producer of disinformation might intend to deceive, those subsequently spreading the false information might not.

But even if characterizing and quantifying misinformation were unproblematic, the question of its impact would remain. That a claim is false or inaccurate says nothing about whether it is believed, much less acted on. A claim can be misleading yet still fail to mislead anyone. Much is made of the extent to which people engage with information, such as whether and how often they share it, but this tells us little about whether they actually believe it. Sometimes people share misinformation to mock or debunk it. Sometimes they do so simply because others are doing the same. And if it is challenging to infer from how people engage with a claim whether they actually believe it, it is even more difficult to ascertain what role misinformation plays in actual behavior. Some evidence suggests that people are more likely to attend to misinformation that matches beliefs they already have. If so, they might have behaved the same way without it.

These challenges call for caution in making statements about how serious and widespread a problem scientific misinformation is. Poor information environments are not, after all, a recent problem. People have produced, shared, and believed false or misleading information in the past. Modern communication tools may make misinformation easier to spread, but they also provide unprecedentedly easy access to accurate information.

Another reason for caution is that the current emphasis on the role of misinformation risks minimizing, or even disregarding, the role of social, political, and ethical values in people’s acceptance or rejection of science-based public policies. That misinformation can create opposition seems obvious, but resistance to many science-based policies can be fully understood only by recognizing that scientific evidence alone is insufficient to support public policies. Fears about vaccine safety are not the only reason for opposition to mandates. Value judgments about the relative importance of liberty and health, as well as concerns about the trustworthiness of public health officials or pharmaceutical companies, also play a role. Likewise, resistance to policies aimed at reducing climate change can also result from disagreements about the weight given to future generations’ well-being or the significance of economic effects.

By minimizing or obscuring the role of values in the acceptance of public policy, the current focus on scientific misinformation prevents engagement with stakeholders about the values they hold and the need to subject those values to critical evaluation. Although addressing value disagreements is not an easy undertaking, that is no reason to give up. Identifying and combating scientific misinformation are not straightforward tasks either, yet that has not stopped us from trying.

The Current Events Series of Public Philosophy of the APA Blog aims to share philosophical insights about current topics. If you would like to contribute to this series, email rbgibson@utmb.edu or sabrinamisirhiralall@apaonline.org.

Inmaculada de Melo-Martín
Professor of Medical Ethics in Medicine

Inmaculada de Melo-Martín is Professor of Medical Ethics at Weill Cornell Medicine. She holds a PhD in Philosophy and an MS in Biology. She works on Philosophy of Science, Ethics, and Applied Philosophy. Her most recent books are Rethinking Reprogenetics (OUP, 2017) and, with Kristen Intemann, The Fight Against Doubt (OUP, 2018).

Website: https://vivo.weill.cornell.edu/display/cwid-imd2001

