The COVID-19 ‘Infodemic’: A Covert Killer

Why immunity against the ‘infodemic’ can only happen with structural change 

It’s often said that there are always three stories: his, hers, and the truth. While this nugget of wisdom applies to just about every account or anecdote, the truth becomes the only story that should be told when the matter is one of life and death. 

Facing the COVID-19 crisis has been an isolating experience the world over. Through our anguish comes another type of pandemic that is proving deadly in itself: the viral spread of false information.

Popularized by World Health Organization (WHO) Director-General Tedros Adhanom Ghebreyesus in mid-February, the ‘infodemic’ involves the dissemination of misinformation (unintentionally shared) and disinformation (intentionally shared) about COVID-19–be it its epidemiology, response efforts, or narratives of the human experience during an unprecedented moment in history. 

Although an enabler, the Internet remains our most powerful tool for controlling the spread of false information. But the nature of the virus–and the times in which it exists–has allowed the infodemic to become so dangerous that the stability of nations is at stake. 

Jumpstart speaks with three academics who have highlighted the structural issues that must be overcome to combat the infodemic, proposing fundamental changes in how we communicate as a global society.

A promising start

The infodemic infected our Facebook timelines with conspiracy theory-level intensity within days of the virus becoming known around the world. It was created in a laboratory as a biological weapon; it spreads through 5G; its vaccine will help Bill Gates implant microchips in billions of people–these are but a selection of the more outrageously unfounded theories that have popped up in the past several months. 

Of such statements, the most devastating are those relating to COVID-19 prevention or cures, as fear and uncertainty push people to extreme lengths in search of immunity. According to the Associated Press, Iranian media have reported around 300 deaths from methanol poisoning after claims about the chemical’s preventative abilities spread across the country’s social media platforms. 

It’s hard to overstate our susceptibility to believing false information, as our willingness to trust friends often supersedes our better judgment. A 2017 study by the Media Insight Project found that respondents are more likely to believe information shared by a trusted person than information from a reputable media source. And our inclination to partake in this exchange is innate–and easier to act on than ever. 

“Social media doesn’t do anything more than what gossip did prior to social media. But now, it’s the exponential ability to share information. You can repost on social media sites; it’s that kind of functionality that makes it very, very dangerous,” says Dr. Nilanjan Raghunath, an Assistant Sociology Professor at the Singapore University of Technology and Design. 

Fanning the flames is the inextricable link between false information and malice. From fake information websites that serve as digital traps to harvest user data to spear-phishing attacks (a highly targeted form of phishing), bad actors have shown little restraint in taking advantage of the public’s vulnerability during this time (The New York Times). Worse still, the indirect consequences of sharing false information are much more difficult to address. 

Dr. Rajib Shaw, a professor at Keio University’s Graduate School of Media and Governance, says that “fake news often creates tension, mistrust in the community, and hampers the response efforts by the authorities.” Although the perils of fake news are widely acknowledged, especially following the 2016 U.S. presidential election and ‘Brexit’ referendum, it’s the health-related repercussions of the infodemic that have prompted Big Tech to finally step up. 

On February 13, WHO representatives met with Amazon, Dropbox, Alphabet, Twitter, YouTube, and Facebook, among others, on the latter’s Menlo Park campus to discuss ways to share accurate information with users (CNBC). Andrew Pattison, manager of digital solutions at the WHO, attended the meeting and said in an interview with The New York Times that the “tone is changing” in Silicon Valley.

Where previous calls for these platforms to police content were met with lukewarm efforts, such as downgrading a post’s position in users’ feeds, platforms are now flagging or removing harmful content, prohibiting the advertising of COVID-19-related products, and providing free ad space to non-profit organizations. WhatsApp is limiting the number of times a message can be forwarded, and YouTube is demonetizing COVID-19-related videos that fail to meet its strict compliance requirements for “factuality and sensitivity” (World Economic Forum). 

Some actions have been more targeted; the Infowars app was removed from Google’s Play Store after its founder, Alex Jones, questioned the efficacy of quarantining on his alt-right radio show (Wired). 

Far from the finish line

Although the public and private sectors’ efforts to combat the infodemic are as unprecedented as the crisis itself, academics have been quick to point out the many obstacles that still stand in the way. 

In an April Issues in Science and Technology paper, Isabelle Freiling, a scholar who studies ‘Trust and Communication in a Digitized World’ at the University of Münster, highlights significant communication challenges that politicians, journalists, scientists, and other communicators must overcome. She and her three co-authors argue that in a continually evolving situation, “efforts to counter misinformation by narrowly focusing on ‘accuracy’ and ‘the facts’ are likely to backfire.” 

“Communicating uncertainty requires [you] to show that the issue at hand is still uncertain, which may be less convincing than clearer and simpler statements,” adds Freiling. “This dilemma gave room to those spreading misinformation and even conspiracy theories, especially when those claims were phrased clearly.”

This problem is made exponentially worse when the communicator is a political leader, for whom the pressure to instill collective calm is high–as is the potential for damage. According to an Oxford University study from April, politicians, celebrities, and other public figures were the source of around 20% of false information about COVID-19, but generated 69% of total engagement. 

In late March, Brazilian President Jair Bolsonaro stated that hydroxychloroquine was a “cure” and “working everywhere”–a statement immediately refuted by the medical community and removed by Facebook. Beginning in March, U.S. President Donald Trump also made several statements about the efficacy of the anti-malarial drug and claimed at one point to be taking it himself. 

Thus far, there have been over 100 deaths in the U.S. related to chloroquine overdoses (Newsweek). Doctors in the U.S. were also reported to have hoarded the drug, possibly at the expense of those who needed it for its approved uses, such as treating lupus and rheumatoid arthritis (The New York Times).

The politicization of the pandemic doesn’t stop with state leaders. The Issues paper notes that the partisan discursive environment further muddies the waters of truth and can delegitimize the already meager number of trusted sources. 

Freiling uses perceptions of the WHO in the U.S. as an example. She first cites a Kaiser Family Foundation study from March, which found that 70% of respondents trusted the organization. But a Pew Research Center poll from June paints a different picture: only 28% of Republicans or right-leaning Independents think that the organization has “done at least a good job in handling the pandemic,” compared to 62% of Democrats or left-leaning Independents who feel the same. 

“And this could affect their trust in the organization, as trust is built slowly but destroyed easily,” she adds.  

Freiling also warns the public against getting too comfortable with Big Tech’s efforts to supervise content, suggesting that such measures only offer a reprieve. Concerns about censorship accusations and about the effect of removing content on traffic and ad revenue, she says, are considerations that Big Tech must grapple with for every new crisis and piece of information. 

“Platforms are deleting posts they declared as false on COVID-19, [but that] does not mean that they will do the same from now on for misinformation on every topic, nor that they should get to decide what is true and what is false,” she adds. 

A structural deficiency 

The academic community has acted quickly to propose remedies for the spread of false information. Along with Jinling Hua, Shaw published one of the first papers that looked at the issue through a data analysis lens. The study assesses actions taken by the public and governing bodies through newspapers, social media, and data from other digital platforms, finding that early corrective measures were crucial to containing the infodemic in China.

The paper states that “at an early stage, data management was an issue, but once the virus was confirmed and declared by the government, strict data management measures were put into place.” Shaw and Hua note that Tencent’s ‘rumors exposed’ website also played a role in bringing the problem of false information to the public’s attention. 

Additionally, they stress the importance of local response efforts; for instance, village-level volunteers who implemented mitigation measures during the early days of the pandemic were vital in sharing “the right information” before the infodemic could reach their communities.  

For countries facing a more contentious discursive environment around COVID-19, Freiling and her co-authors believe that non-partisan, highly trustworthy actors, such as scientists, should become key communicators as a way forward. They argue that, since many questions about COVID-19 still do not have a straightforward answer, scientists must change their communication strategy. 

“One way to balance accuracy against uncertainty is to separate questions that can be answered with the available scientific knowledge from those that science cannot answer yet,” she says. In an environment where ideologically divergent groups contentiously debate even the most basic findings, such as the efficacy of reducing the virus’s spread by wearing face masks or coverings, a nuanced approach that “[considers] values and emotions, but at the same time without getting partisan” is needed.  

From a systemic perspective, Raghunath believes a long-term solution to the infodemic and crises of its kind is to reframe the problem altogether. She urges the public and communicators to look at it through an ethical lens because “every human being has a right to information.” 

She alludes to an extensively studied idea in sociology, which holds that information equals power because one’s access to information directly corresponds to one’s access to capital. Therefore, a proper response to the infodemic would mean broadening access to resources that allow individuals to distinguish correct information from incorrect information. 

“There’s a problem because different levels of education and different access to resources affect the way we perceive information. There’s also the digital divide; we might think that the digital divide has disappeared, but actually, it exists,” adds Raghunath. 

Since the start of the pandemic, politicians, media outlets, and the like have been playing what can only be described as ‘the blame game,’ debating who should be held responsible for the virus, its spread, and the economic fallout. Such exchanges do little to alleviate the condition of those who have been most impacted. 

Although the collaborative efforts by Big Tech and health organizations have been promising, the pandemic’s partisan nature will continue to create divides and weaken international response efforts. A lack of cohesion in approach and ideology has already caused avoidable setbacks, especially for socially and economically disadvantaged groups.

The unwelcome tradeoffs of the pandemic have presented a quagmire, as policymakers weigh the risks of loosening social distancing and travel restrictions to keep their economies afloat. But as the infodemic has shown, no concessions should be made when it comes to the truth, however hard it is to find. 

Min was Jumpstart’s Editor in Chief. 

