Before the experience of the past two years of Covid-19, I had assumed that a deadly viral outbreak would be quickly contained by governments, especially in rich countries with good health systems, public health infrastructure, economic support packages and trust in public institutions. I thought there would be general consensus on what needed to be done, and that people would get their information directly from experts working in universities and public health authorities. I mean, who would want to risk getting a deadly infection?
Now, I wonder if an even more deadly virus – like a pandemic-ready version of Mers, a coronavirus which killed 20% of those it infected in South Korea before being contained in 2015 – would be treated the same way as Covid-19. Would thousands of people show up at protests because they had read on Facebook that Mers was a hoax? Would there be similar scenarios at every pandemic after that? After my experience of actively working through the Covid-19 pandemic, these scenarios seem horrifyingly plausible.
One of the most unpredictable aspects of the past two years, and one of the most disheartening, has been the rise of widespread misinformation. The line between facts and lies has disintegrated. Years of experience in infectious disease control and a doctorate or medical degree quickly came to count for no more than the word of an influencer on YouTube or Facebook who has garnered hundreds of thousands of followers by promoting exciting-but-untrue “facts”.
You can see this clearly in the rise of anti-vaccine sentiment, where popular conspiracists share stories about alleged side-effects such as how vaccines are microchipping our bodies, or changing our DNA, or poisoning us. This has gone far beyond social media chatter and personal resistance, becoming an aggressive real-world campaign that has led to protests at hospitals, health workers being attacked and scientists being mailed death threats.
I’ve personally learned that lies spread faster than truth. People have written entire blogs attacking my expertise and sharing clear falsehoods – such as the claim that I have no published scientific papers, or that I’m a global plant by the World Economic Forum or Gates Foundation, or that I am a philosopher rather than a scientist (because I have a DPhil from Oxford).
It’s easy to laugh at such obvious untruths, until it sinks in that this clickbait gets shared thousands of times. People believe it, and then they too share it. And there is no way to counter every single falsehood. These lies carry more weight among some internet communities than the fact that Edinburgh University evaluated my expertise and granted me a professorship.
I always try to counter these claims by sharing my funding sources, being transparent about what data I’m using for my analyses and advice, and acknowledging that while I might get things wrong, I always have tried to conduct myself with integrity and academic professionalism.
But this isn’t always enough. There are many people voicing strong opinions who aren’t constrained in the same way. Lots of politicians and leaders are seeking popularity, often by actively opposing basic public health measures. There are social media celebrities who build followings by sharing emotionally appealing lies. There are even academics who staked a claim with a certain camp early in the pandemic, and haven’t shifted their thinking despite new tools such as vaccines, mass testing and antivirals becoming available.
The power of misinformation can be seen in the number of people appearing in hospitals, desperately sick with Covid-19 and struggling to breathe, who still say it is a hoax, and who don’t believe that this is an actual virus (like many other viruses) that infects humans and makes us ill – a basic fact that is somehow harder for them to believe than the idea that this is a global conspiracy for government takeover of citizens.
How is someone supposed to sift facts and real data from manufactured and glitzy stories on social media? There is little to no regulation of quality of data or sources on the internet. Some of the small lessons I’ve learned are to explain science clearly and simply so that it is comprehensible to everyone, and to try to reach people in small groups or at an individual level; fighting misinformation on social media is a losing battle. Important too is for experts to engage with TV, radio and newspapers, as mass media still carries considerable influence.
Now is the time to understand and address these problems. I’ve written a book – out next year – about the experience of advising government and trying to communicate the fast-moving science and policy of the pandemic to the public, and hopefully how to do everything better next time. Reflecting and thinking ahead isn’t a theoretical exercise because we know that the next pandemic is coming – whether it is Mers or Sars or something else – and there may not be a quiet time to do it. Like everything else in the past two years, we have to learn these lessons as we go.