Our information ecosystem has become a massive false-balance machine. Fringe positions that have already been studied and shown to be wrong are legitimized, given a huge profile and presented as reasonable and respectable alternatives to the existing body of evidence.
False balance – also known as bothsidesism – occurs when opposing views are presented as more evenly supported than the evidence warrants. At its core, false balance is a misrepresentation of the scientific consensus. And it has become a significant issue.
It is happening with debates on the causes of climate change, the safety of GMOs, the effectiveness of unproven therapies, the value of transgender care, abortion and, of course, the benefits and risks of vaccines.
While the issue of false balance is usually linked to how journalists represent topics, false balance is increasingly driven by social media echo chambers, the fragmentation of the news media and the ideologically motivated embrace of fringe ideas. Too often this has allowed a small cohort of vocal contrarians to have an outsized impact on public policy and public perceptions.
A recent study published in the Journal of the American Medical Association highlights how a small group of U.S. physicians spread “inaccurate and potentially harmful assertions” about COVID and the vaccines in a manner that, the authors speculate, had a significant impact on the public understanding and adoption (or lack thereof) of effective public health measures, contributing to the deaths of hundreds of thousands of Americans.
Other studies have found that false balance can have an adverse impact on the intention to vaccinate, which has been associated with a reduction in vaccine coverage. And research I did with my colleagues at the University of Alberta found that both-sides media coverage of COVID public health policies – such as the value of a natural herd immunity, let-it-burn approach – potentially undermined the adoption of more widely accepted mitigation strategies.
Much of the harm that flows from false balance happens because it creates an impression that the science is more contested than it is. An interesting study from 2022 illustrates how this can fuel harmful misperceptions about important health topics. The study, done in the Czech Republic, found that 90 per cent of the public underestimated how much physicians trust the COVID vaccines. Indeed, almost half believed that only 50 per cent of doctors trust the vaccines. And four out of five “assumed the medical community is undecided.” In reality, almost 90 per cent of physicians say they trust the COVID vaccines and only 2 per cent say they don’t. Importantly, the lack of knowledge about the strong physician consensus had an adverse impact on the intention to vaccinate.
False balance is likely another reason that those who believe conspiracy theories and misinformation wildly overestimate how many people agree with them. A study by Gordon Pennycook and colleagues, for example, found that while most (but, alas, not all) conspiracy theories are believed by a relatively small minority, “conspiracy believers thought themselves to be in the majority 93 per cent of the time.”
Valuing the body of evidence over fringe views may feel like giving in to conformity, embracing groupthink or stifling innovation. I understand these concerns. Contrarian and controversial scientific views are important parts of the scientific process. For example, plate tectonics and the idea that bacteria cause ulcers were both once viewed as fringe-y. But once these controversial views were raised – at scientific venues, it is worth noting – they were subsequently supported by data and cogent arguments. The evidence wasn’t ignored, and there was no ranting about conspiracy theories on social media, popular podcasts and cable news shows.
So, I’m not arguing for silencing or ignoring controversial ideas. In fact, many of the topics that have been the subject of the most intense bothsidesism – vaccines’ link to autism, the therapeutic value of ivermectin for COVID, the dangers of GMOs, the role of human activity in climate change, Trump’s “Big Lie” – have not been, and have never been, ignored or silenced. They have been openly debated (probably too much). They have received a ridiculous amount of attention in popular culture (probably too much). And they have been studied and studied and studied (probably too much). And the fringe version has been consistently and demonstrably shown, over and over, to be wrong.
Indeed, one could argue that our current bothsidesism culture has also led to a significant waste of scientific resources. But for the false balance fuelling public perceptions, would we have spent so much time, energy and money studying ivermectin, homeopathy or the lie that vaccines cause autism?
What is needed is a more accurate representation of the science and the degree and nature of the scientific consensus.
Yes, the scientific consensus might shift. Scientific paradigms evolve. And conventional wisdom should be continually tested and questioned. We must also be honest and transparent about areas where there is a plurality of legitimate views and scientific uncertainty. But failing to recognize the value of the existing body of evidence – especially in the context of decision-making at the level of both the individual and public policy – is to largely gut the value of science. It is a paralyzing position. Agreed-upon, science-informed conclusions become meaningless.
Notably, in this cultural moment, the false-balance problem seems to emerge most often with politically polarized topics, highlighting the degree to which bothsidesism misrepresentations are more about ideological positioning than about what the science actually says. Consider these questions: Would you rather fly with an airline that designs its planes around the scientific consensus or with All Knowledge Is Relative Airlines? Would you want to drive across a bridge informed by the existing body of evidence or one whose builders embraced a contrarian view of physics? The fact that these questions are so obviously absurd illustrates that in many, perhaps most, areas there is an acceptance that the scientific consensus matters.
There is good news. Studies have consistently shown that both explaining the scientific consensus and using a weight of evidence approach – that is, accurately representing what the available evidence says – can have positive impacts on correcting misperceptions. To this end, journalists should take great care in how they represent contrarian views by, for example, reporting how other reputable scholars view the issue and, when possible, referencing the scientific consensus on point. The scientific community – including universities, public funding entities and scientific and professional organizations – should create accessible and shareable content about the scientific consensus on important topics. There is a new project at Durham University, called the Institute for Ascertaining Scientific Consensus, that is designed to help with this goal by measuring and compiling the strength of scientific consensus on a range of key topics. The team, led by Professor Peter Vickers, hopes it will be a useful tool for policymakers and will “serve to inform laypersons, fighting against ‘fake news’ and misinformation.”
Finally, we also need to correct misrepresentations wherever they emerge, be it on social media, in the news, on popular podcasts or out of the mouths of our political leaders.