We’re in the midst of a global health pandemic, with no end in sight. So why does everyone’s aunt on Facebook seem to have the answer?
Writer: Lia Bote
Editor: Daniel Jacobson
Artist: Sophie Maho Hamada
All you have to do is ingest inordinate amounts of vitamin C, eat twelve whole cloves of raw garlic, and yell at the telecommunications companies sending virus particles from their 5G towers. Then it will all be okay.
At least that’s what social media wants us to believe. With the devastating COVID-19 pandemic causing tragic fatalities, overwhelming healthcare systems, and crashing economies around the globe, people are growing increasingly desperate for a solution. Most medical researchers estimate that we are at least a year away from a viable vaccine, and yet we constantly hear stories about home remedies that supposedly soothe symptoms and prevent infection. Have Facebook posts, Instagram stories, and WhatsApp text chains figured out something that our top scientists haven’t?
The rapid propagation of misinformation on social media – aptly dubbed an “infodemic” by World Health Organization (WHO) Director-General Tedros Adhanom Ghebreyesus – has turned into a pandemic of its own, and it is far from inconsequential. A few weeks ago, for instance, a man died after ingesting fish tank cleaner containing chloroquine, a drug that has been touted as a potential COVID-19 treatment but remains unapproved by the FDA for that use. The more we learn about the disease, the more important it becomes to communicate those facts effectively. Social media has made information more accessible than ever, but with nearly half of UK adults reporting in 2019 that they get their news from social media, it is increasingly difficult to filter fact from carefully crafted fiction.
So why do we share content, even when it may be untrue?
A study led by Anuja Majumdar, from the University of Southern California, asked participants to select the reasons why they tweet from a list of options, including to “spread knowledge” and “get (my) followers to join the discussion”. However, there were four unsurprising, but somewhat unsettling, factors that prevailed: gaining approval, soliciting attention, entertaining, and arguing.
Given what we know about social media algorithms, perhaps this is understandable. Content that attracts more interaction is more likely to be displayed, which means the most attention-grabbing, entertaining posts tend to gain the most exposure. Combined with the rapid scrolling and paper-thin attention spans that characterize social media use, it is easy to latch on to and share the dangerously misleading buzzwords, falsely comforting claims, and drastic oversimplifications that circulate in our feeds.
Understanding these dangers, most social media companies are now relying more heavily on artificial intelligence to filter the content being published. While the companies acknowledge that this may produce more “false positives”, that is, removals of content that does not actually violate their policies, this level of caution is uncharacteristically high for them, and it is a step in the right direction in combating the spread of harmful fake news.
Still, because rapid spread is the very nature of social media misinformation, it is unlikely that this infodemic will end before the pandemic does. In the meantime, we can at least do our part by staying at home, keeping informed, sharing information only from reliable sources, and taking our vitamin C responsibly.