Misinformation about science, technology and public health poses serious challenges to society, challenges magnified by rapid advances in communications technology and the growth of online social networks. As enabling as these developments have been for the sharing and dissemination of credible information, they have been equally enabling for misinformation, and there is no silver bullet to address it.
Misinformation comes from a variety of sources, exploiting both the tendency of many people not to evaluate the veracity of the information they receive and their preference for information aligned with their political beliefs. If it were benign, the prevalence of misinformation, and of fake news more broadly, could be dismissed; but exposure to misinformation causes misperceptions among the general public that shape how people act politically. Nowhere is that truer than in public health. Misinformation has been particularly problematic in science, technology and health policy because it preys on people's predisposition to react strongly and intuitively to scientific advances while lacking the knowledge base to distinguish facts from falsehoods. Fueled by misinformation, many people endorse science-related beliefs that run counter to established scientific consensus, and as a result they are less likely to heed the advice of scientists and medical experts.
While the measured proliferation of misinformation and fake news appears low, little data actually tracks its exposure and consumption. This report seeks to answer three questions about science communication and misinformation: How is misinformation spread? Who is most likely to fall prey to it? And how do we combat misinformation and its effects? It does so in part through case studies on climate change, vaccines and COVID-19.
Broadly speaking, there are three approaches to this problem: controlling the spread of misinformation; correcting its effects through debunking (fact-checking) or persuasion; and intervening pre-emptively so that the public can resist misinformation when they encounter it. With this in mind, five recommendations are presented:
- Track misinformation and debunk when needed;
- Promote accuracy-focused messaging;
- Invest in targeted persuasion focusing on downstream behaviours;
- Build relationships with trusted community leaders;
- Start early to build digital literacy and interest in science.
Taken in concert, these recommendations have the potential to mitigate the consequences of misinformation in science and public health.