
Flattening the curve of an “infodemic”

By Amanda Quinn and Tamara Grigoryeva  

May 13, 2020

When distancing from disinformation becomes critical 

Fake news about the virus could be just as harmful as COVID-19.

When dozens of people in Nigeria, Iran, Russia, Uruguay and Arizona were hospitalized in late March 2020, they were not victims of the notorious coronavirus. They were deceived by what the World Health Organization refers to as an infodemic. After reading false information on social media, they had attempted to use harmful substances, including industrial methanol and chloroquine phosphate, to prevent contracting the virus.

According to WHO, an infodemic is “an excessive amount of information about a problem, which makes it difficult to identify a solution.” It can include misinformation, disinformation, fake news and rumors during a health emergency that hamper an effective public health response by creating confusion and distrust. 

This phenomenon didn’t come as a surprise, as Creative Associates International’s Development Lab has been focusing on disinformation for years in the context of its projects around the world; however, the mix of a rapidly developing social media space and the just as swiftly spreading coronavirus gave the infodemic a new spin. Because disinformation will continue to deepen the global health crisis, any efforts to offset the infodemic’s effects are urgent and necessary. As a global development implementer, Creative has a role to play in supporting its communities in this space. 

The delicate pandemic-infodemic dance 

The spread of rumors and untruthful information can be traced as far back as Roman times, but the notorious term disinformation was introduced almost a century ago by the Soviet Union’s Joseph Stalin. And while the USSR ceased to exist almost three decades ago, disinformation wars have been escalating across the globe, especially in the past five years, due to the fast development of social media networks and the very nature of disinformation and fake news. 

The use of social media has increased the dissemination of news and information, according to a 2017 study by the Arkansas Journal of Social Change and Public Service. At the same time, the scandalous nature of fake information and fake news allows it to spread much faster than truthful information, a 2018 MIT study found. Disinformation has also been adopted, in unprecedented ways, as a tool of conflict by several major political influencers, including Iran, Russia, China, Pakistan, India, Saudi Arabia and Venezuela. By September 2019, at least 70 countries had been exposed to their disinformation (up from 48 in 2018 and 28 in 2017), according to an Oxford University study.  

Amid COVID-19, disinformation narratives have centered on health concerns, conspiracy theories, lockdown fears and false cures, as well as societal and political polarization, according to a study by the EU Disinfo Lab. 

A joint Reuters Institute and Oxford University study found that while the amount of disinformation has grown like never before during COVID-19, so has the number of institutional and individual fact-checkers (900 from January to March 2020). According to the study, 59 percent of the disinformation was reconfigured content, and the rest was completely false content; the reconfigured content received 87 percent of social media interactions. Twenty percent of the content was top-down disinformation (either purposeful or accidental) shared by high-level politicians, celebrities and security agencies engaged in inauthentic online behavior, yet this kind of content received over 69 percent of reactions.  

Disinformation is often viewed as merely a political tool, but when a pandemic breaks out, the rumor mill kicks into gear. Past epidemics, including cholera and Ebola, have been accompanied by disinformation or an infodemic. As emotions and fears run high, pandemics offer fertile ground for the spread of disinformation. For example, Russian disinformation has relied on emotion to attract people to its narratives for the past 10 to 15 years, and the current crisis is no different.  

Yonder, a startup that studies disinformation, has found that conspiracy narratives spread faster during uncertain times like COVID-19. Under normal circumstances, it takes them six to eight months to fully penetrate, but amid COVID-19 it takes only three to 14 days. As people around the world practice social distancing, many have turned to social media – the fastest disseminator of fake news – for more information, and almost everyone is exposed to the infodemic. And there’s another twist: the more the infodemic spreads and sows distrust in healthcare authorities (like the WHO or the CDC), the more the pandemic evolves. 

Pandemics are not the only time when disinformation spikes. Elections are another common occasion for disinformation to flourish, especially foreign-sourced disinformation, according to the National Endowment for Democracy. Disinformation also spikes during violent conflicts, which in extreme cases can lead to genocide. 

Disinformation might seem like a mere buzzword to some. But it is arguably as harmful as any other weapon of mass destruction, because it provokes mass real-life action. And while combating disinformation is important in general, during a pandemic this task becomes critical.  

Creative’s work on disinformation  

Leaning on its decades of experience in citizen security, Creative took interest in disinformation’s effect on fundamental freedoms and societal resiliency long before COVID-19. Creative has implemented programming to strengthen independent and unbiased media in places like Syria and Afghanistan. Creative’s Development Lab has also begun to engage in the disinformation detection and response space, specifically looking at disinformation on online networks. The Lab has conducted social media monitoring for several of Creative’s programs, such as analyzing thousands of social media posts from public Boko Haram-affiliated accounts to investigate the group’s mentality and strategies.  

Creative’s Development Lab is producing quick guides on differentiating fake news from professional news reporting and distinguishing inauthentic, coordinated social media campaigns from genuine posts, as well as developing trainings and supporting materials for Creative projects and the communities we support.  

Even though the infodemic is mostly taking place online, the consumers and the debunkers of disinformation (such as fact-checkers, community leaders and journalists) are under physical distancing restrictions, which makes many traditional fact-checking and journalistic activities more difficult. This challenge has amplified the need to expand the use of open source intelligence (OSINT) tools and to train civil society and journalists in finding and investigating disinformation through OSINT. At the same time, because this work has largely moved to the online space, it is vulnerable to cyber-attacks and technological sanitation, so digital security and encryption, integrated with physical and psycho-social security, have become key.   

On the other hand, as people’s wider concerns around disinformation and fake news grow, an expansion of social behavior change communications (SBCC) is needed. SBCC could equip people with deeper critical thinking techniques and turn them into analytical information consumers. 

Disinformation efforts have been multiplying amid COVID-19, making it difficult to deliver authentic information about the virus to populations. At the same time, it is a pivotal moment for news reporting, news consumption habits and responsible technology. Creative’s teams are focusing on using existing communications channels to disseminate accurate, vetted information on the pandemic with partners and community members, as well as supporting new efforts to combat disinformation as it arises in the contexts where we work. 

For more information and updates about Creative and our programs’ response to COVID-19, click here.
