
Misinformation in the era of COVID

  • Writer: Steffi Evans
  • May 15, 2023
  • 7 min read

Originally written August 2021


If there's one thing that's driven me to despair throughout this pandemic, it's misinformation. Not the seemingly endless and tedious lockdowns, or the pubs and restaurants being shut, or being separated from family and friends... but people reading, hearing, seeing and believing things that are just plain misleading.


And misinformation isn't limited to the pandemic and vaccination - it extends into other critical areas like global warming and the environment, racism, sexism and LGBTQIA+ rights, and to more insidious depths which aim to destabilise governments and societies.


Recent work has led me to examine what motivates those who consume misinformation, and how we can try to tackle it.


At the heart is fear and mistrust

Most people will agree that from 2020 onwards... it's been quite a ride. Beyond the pandemic there's been news reaching us about all kinds of threats. The recent resurgence of the Taliban in Afghanistan and the harrowing images and stories reaching us from those affected, the ongoing migrant and climate crises, the rise of the far right politically, wildfires, floods, the death of George Floyd and the largest civil rights movement since the 1960s... all this and more fuelled what became an Oxford English Dictionary word of the year in 2020: doomscrolling - an excessive amount of screen time devoted to the absorption of negative news.


Negative news and fear can be weapons. By destabilising a person and filling them with despair and anxiety, unscrupulous communicators make their audience more suggestible and increase the likelihood that they will accept information that supports their thinking rather than challenges it (a 'confirmation bias', or an 'easy way out' as it were... which you must admit is tempting when it feels like the world is going to shit around you). Importantly, it gives them a tribe - people who share their ways of thinking, and who in turn will also share information that algorithms learn to give them. Eventually they create a closed loop of misinformation, as both the people in their tribe and the technology at their disposal reject anything that doesn't fit the mould.


And we can't have a conversation about communications and people's ability to interpret them without recognising that the playing field is far from even for everyone. Inequality and inequity play an enormous role in the spread of misinformation.


To use myself as an example: I am a heterosexual white woman. I had a (relatively) stable family unit. I moved around as a child and living in different cultures broadened my horizons and made me open-minded. I went to good, private schools that taught me beyond what's in a curriculum. Nobody ever told me that what I dreamed for myself as a child was out of my reach. I went to university. I have a stable job that pays well and means I can afford some luxuries in life and save something on the side. I am in a stable and committed relationship. I live in a nice neighbourhood. I don't list these things as some kind of weird boast, but to highlight that I am privileged, and that this privilege has contributed to my ability (and in some ways allowed me the luxury of free time) to consume and appraise information. But many people don't have that same level of privilege. We have to recognise the myriad factors that interplay and help define somebody's socioeconomic status, and how accessible important technical information (like that related to the pandemic, vaccines, masks, hand hygiene etc.) ultimately is for them.


Critically, at the intersection of doomscrolling and inequality/inequity sits social media. People who consume misinformation are frequently exposed to it on such channels. And if you think about it, you can see why. In 2019, 55.5 million people in the UK owned a smartphone - that's 83% of the population with access to a network of millions of other people through a device that fits comfortably in their pocket. 42 million people have a Facebook account, almost 29 million are on Instagram, 16 million have a Twitter account, and 6 million are on TikTok. In some ways, social media and the internet have been a great equaliser in how they increase access to information, even if some of it isn't necessarily from reliable sources.


And we may be seeing an emerging trend in the consumption of unreliable sources of information on social media. According to a report by NewsGuard, in 2020 nearly a fifth (17%) of engagement among the top 100 news sources on social media came from sources that the authors deem generally unreliable. That's compared with 8% in 2019.[1]


And to quote an oft-used phrase, "the pen is mightier than the sword". True. Misinformation is just opinions and words. But what happens when the pen is a rallying cry to potentially dangerous actions? When fear and social inequity weaponise misinformation? We've seen it happen already, all over the world. Anti-mask rallies and vaccine hesitancy put the vulnerable at risk and lead to increased pressure on healthcare systems. Protesters storm the Capitol building in Washington. Domestic terrorism increases. People are abused, attacked and even killed because of their gender, race or sexual orientation...


If anything is clear in the tangled web of it all, it's that misinformation is something to be worried about.


So how do we even begin to overcome this?

On the face of it, misinformation almost feels like too big a problem. Where do we even start? No one person, company or campaign is going to solve the problem on their own, but with some adaptations to how (and importantly where) we communicate, we might start to turn the tide.


Trust

Becoming a trusted source that people will turn to is key. And trust isn't easy to cultivate - especially in certain industries like pharmaceuticals, which are (rightly) highly scrutinised. Trust is a precious thing, and once lost it isn't easily regained.


Clarity

Nine million people in England (around 14% of the population) have what are described as 'low literacy levels' - equivalent to those of a nine to eleven-year-old. These individuals may struggle to read simple passages of text, understand the instructions on a medicine bottle, or even interpret something like the petrol gauge on a car.[2,3]


So how can we expect them to understand complicated information relating to public health?


Consider also that sources of information typically considered less reliable (tabloids, for example) pitch the reading level of their content at this population. A study of 15 leading newspapers from the US and the UK found that The Sun is highly readable - with a reading age of eight - a feat it achieves through the use of imagery, formatting, scannable content and sentimental/emotionally-charged language. At the other end of the scale are newspapers considered more reliable sources, like The Guardian, which has an average reading age of 14, making it less understandable for many readers.[4,5]


The average person relies heavily on these outlets. They might not even know where to begin when it comes to doing their own research (admittedly, the first time I heard of PubMed during my Bachelor's, my mind went to Club Med and I assumed the whole thing was some witty naming banter from the NIH), much less how to interpret a published article with a critical eye. They need news and social media to make information accessible to them.


When the average reading age in the UK is that of a 9-year-old, superfluous words (while professional-sounding) can make your communications inaccessible.

Obviously the level at which we pitch information depends on the audience, but with misinformation relating to COVID largely affecting the general public, we have to look at the bigger picture and develop simple, easy-to-understand communications as much as possible.
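As an aside, if you want a rough, do-it-yourself check on how readable your own copy is, readability formulas such as Flesch-Kincaid estimate a US grade level (and, roughly, a reading age) from average sentence length and syllables per word. Here's a minimal Python sketch - the syllable counter is a crude vowel-group heuristic and the example sentences are my own invention, so treat the numbers as indicative rather than definitive:

import re

def count_syllables(word: str) -> int:
    # Crude syllable estimate: count groups of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    # Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

simple = "Wash your hands often. Wear a mask on the bus. Stay home if you feel ill."
jargon = ("Adherence to recommended prophylactic measures, including respiratory "
          "protection and hand hygiene, substantially attenuates transmission risk.")

for label, text in [("simple", simple), ("jargon", jargon)]:
    grade = flesch_kincaid_grade(text)
    # Adding roughly five years to the US grade level gives an approximate reading age
    print(f"{label}: grade {grade:.1f} (reading age ~{grade + 5:.0f})")

Run as written, the plain sentences come out at a very low reading age while the jargon-heavy one lands far above the UK average of nine - which is exactly the gap that makes official health guidance inaccessible to so many readers.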





Balance

One of the great problems with misinformation is that it is presented in an 'all-or-nothing' manner. And for reasons we've touched on already, people don't go looking for alternative opinions when their confirmation bias is being satisfied. But let's also remember that you'll always catch more flies with honey than vinegar. So while the temptation may exist to create a 'myth buster' in your content to quickly nip any misinformation in the bud, you rapidly risk alienating your audience. Instead, create short, concise arguments. Acknowledge existing thoughts and beliefs, but back these up with clearly written alternatives, and links to sources that are accessible and relatable to your audience too.


Go to your audience - because they won't bother coming to you

This might be the most crucial step in some ways. You have to put your content out where they will find it. This post isn't about the strategic steps to achieve that, but logically, if consumers are getting most of their information on social media, it makes sense to start there. I've worked with pharma for almost a decade, and while they've embraced creativity and digital marketing, their social media presence has been lacking. And it's obvious why. Legislation and regulation tightly govern what pharma can and can't do. And considering codes like the ABPI Code of Practice have a 'do not bring discredit upon the industry' clause, developing a social media campaign that will have impact in this area, with the audience you might be trying to target, is challenging.


Pharma are gradually embracing social media and all of the delightfully weird ways we communicate online, though. Take Roche's ReDefine Haemophilia campaign - developed around the insight that haemophilia patients made memes to share how they were feeling. Or that time Sanofi announced that racism isn't a known side effect of any Sanofi medication after Roseanne Barr's racist late-night Twitter stream-of-consciousness.


Well played, Sanofi.

But ultimately, a problem like healthcare misinformation needs a consistent and engaging presence across channels - something pharma have yet to demonstrate, but which I'm keen to see them - and other industries - deliver.



I hope you found this article interesting! Below I've listed a few resources I admire which challenge fake news and misinformation, or which are great at communicating science in an engaging way on social media, in case you'd like to check them out:


@partialnews: compares headlines from papers around the world reporting on the same issue. Great example of how some news outlets use emotionally-charged language to make themselves more accessible to the public


@simplepolitics: a group of activists who do their best to share important political information in an accessible way, so people can be better informed. Importantly, they readily admit their own mistakes when they make them, which helps cultivate trust and honesty


@mrs.b.tv: TikTok scientist who explains all sorts of things about the human body in an easily digestible format


@josefineholum: TikTok med student who develops interesting quizzes people can play along with or 'duet' to


@washingtonpost: a sensible and respected Washington newspaper with a slightly insane social media presence - and I love it

 

References

 
 
 
