COVID disinformation group surges in popularity as Facebook ‘fails to enforce its own policies’

Facebook is unable or unwilling to block COVID-19 disinformation on its platform. That is the conclusion the Institute for Strategic Dialogue (ISD) has drawn in a new report on the issue. According to the organisation, Facebook is “failing to tackle prominent groups and individuals who spread false claims about COVID-19 and vaccines”.

Focusing on the World Doctors Alliance (WDA), a COVID-19 denial and scepticism group led by Dr. Dolores Cahill, the ISD found that the group’s popularity has surged on Facebook. And this is despite the company’s promise to deal with disinformation by removing content and by fact-checking the WDA’s posts.

A surge in disinformation

Authored by Aoife Gallagher, MacKenzie Hart, and Ciarán O’Connor, the ISD published Ill Advice: A Case Study in Facebook’s Failure to Tackle COVID-19 Disinformation on Wednesday 20 October. It begins by stating the authors’ intent to draw attention to the role of social media “in hosting and amplifying dangerous and false information about the pandemic and vaccinations”.

Paying close attention to the WDA’s presence on Facebook, the ISD found that between January 2020 and July 2021 its number of followers grew from 3,456 to 460,179, an increase of 13,215%. Over the same period, pages associated with the organisation accumulated a total of 553,669 followers on Facebook, posting 1,580 times and generating 5.7m “interactions”. The average number of interactions also grew, reaching 679,157 per month between February and July of this year, compared to an average of 299,100 per month across the full period from January 2020 to July 2021.

Analysing 89,315 posts mentioning the WDA and its leaders made in Arabic, English, German, and Spanish, the researchers found a number of common threads. 

Looking at the 50 most popular posts, the ISD noted that they are “conspiratorial in nature, implying some sort of overarching ‘master plan’” on the part of health systems, the press, and world governments. The most popular piece of disinformation is the claim that some combination of hydroxychloroquine, vitamin C, and zinc can cure COVID-19. Along similar lines, the next most popular posts questioned the death rates for the virus and promoted the idea that the pandemic is a “scamdemic” or was planned in some way.

Former UCD professor Dolores Cahill appears to be the most popular of the WDA’s members. A total of 24,060 Facebook posts mentioned her, with Dr. Scott Jensen coming second with 17,434 mentions. Because these mentions came from “across the platform” and not solely from the WDA’s own pages, the ISD researchers contend that the WDA’s leaders “have established a broader presence and reputation beyond content produced on their own pages/accounts”.

Ignoring the problem

The report also details issues with Facebook’s fact-checking system, arguing that “debunking claims or posts one by one presents an almost impossible task for fact-checkers” trying to deal with COVID disinformation.

For example, Dr. Cahill posted 149 times on her Facebook profile between March 2020 and August 2021. Of these posts, 75 were fact-checked. But, as with other leading members of the WDA, “a large number of these fact-checks debunk the same claims or videos across multiple articles and languages”. Spreaders of disinformation are also able to avoid fact-checks on their content by publishing “slight variations” of previous claims. This situation, the ISD says, “is untenable”.

Yet another problem the ISD identified with Facebook’s fact-checking system was the lack of attention it gives to languages other than English. In one WDA video the report draws attention to, Dr. Vernon Coleman discusses what he says is a “‘hidden agenda’ behind the pandemic” as well as claiming COVID-19 isn’t as lethal as it’s made out to be. As the authors point out, such disinformation is prohibited by Facebook’s own guidelines. In spite of this, “the video has reached viral levels of popularity in Arabic-language communities”, and Facebook has not fact-checked it. The same pattern emerges in other languages, especially German and Spanish. Given that most of Facebook’s new users are coming from the developing world, the company’s lack of attention to languages other than English “is a huge issue”.

Results like these “call into question” Facebook’s stated efforts to limit disinformation on its platform. The company, the authors declare, is “failing to enforce its own policies” and content that should be removed “is receiving significant engagement” instead. The result of this, the ISD argues, is “real world harm and tragic loss of life”.

“A defining role” in democracy

Although not entirely surprising, the report’s findings are shocking in how they show the huge growth of just one COVID-19 denial group. They also fit the wider pattern of Facebook failing to deal with issues that have a very real effect on our societies. Civil society groups and activists have long highlighted that the social media giant allows disinformation and hatred to fester on the platform, as well as how extremists use it to organise and spread propaganda.

Professor Eileen Culloty, a lecturer in the School of Communications at Dublin City University (DCU) and vice chair of Media Literacy Ireland, has pointed out that “uninformed or misinformed” people have always been “at risk of being exploited by vested interests”. Professor Culloty told The Beacon that the traditional media’s “incredible power” in promoting the war in Iraq and the false weapons of mass destruction story is an example of how such influence has been “abused”.

But now a small number of social media companies “play a defining role in public exposure to information”. When it comes to Facebook, “there’s a complete lack of accountability”; she also insisted the company should be described “as an advertising company rather than a social network”. What’s more, how Facebook and similar platforms “facilitate radicalisation is a major concern”. Considering this, policymakers and researchers must be given access to data from social media companies in order to fully understand what’s going on. But this access “hasn’t materialised”. And it’s caused a “bizarre situation of worrying about the dangers and impacts of social media, but without the ability to research it systematically”.

Similar to the ISD, earlier this year Professor Culloty, as part of a research team, analysed a number of transparency reports social media platforms submitted to the EU’s COVID-19 disinformation monitoring programme. They discovered that the reports “were highly repetitive and often irrelevant”. Going on, she revealed that:

For example, more than a quarter of the actions reported by Facebook were unrelated to COVID-19 disinformation. The implementation of basic actions, such as applying generic labels to posts about COVID-19, was highly inconsistent. We found cases where the same piece of content was labelled in one instance, but not in another. 

In the short term the professor has called for social media platforms to be “compelled to share data” and open themselves up to external audits. In the longer term, she told The Beacon, she’s against regulating social media in the same way as traditional media. The DCU professor insisted they’re not just publishers: they’re “for-profit advertising companies”. And the amount of power they wield “reflects a major failure in applying competition rules and oversight”.

An ongoing threat

Previous reports have shown how Facebook’s own algorithms were of benefit to the far right, potentially creating an “echo chamber of white supremacism”. Holocaust denial was thriving on Facebook as recently as last year when the company finally announced a crackdown on such content. 

In the Irish context though, pages and individual accounts spreading conspiracy theories and racism continue to remain online and unchallenged. In May this year far-right agitator Rowan Croft posted a video to his Grand Torino page in which he argued that politicians in favour of COVID-19 vaccinations should be “hung by the neck”. In the same video he also encouraged followers to attack vaccination centres.

Facebook eventually removed the video in question for violating its policies but left the Grand Torino page, as well as Croft’s personal account, online. It wasn’t until earlier this month that the company finally deleted the Grand Torino page. When contacted by The Beacon, a Facebook spokesperson said the platform deleted the page “for violating our policies”. Croft’s personal account, however, remained online, where he continued to post similar content. In recent days, though, he has chosen to deactivate it.

Further driving home Facebook’s problem with anti-vaccination and COVID-denial activists was the disclosure last month that the Health Service Executive (HSE) had made dozens of complaints to the company regarding COVID-19 disinformation on the platform. Ken Foxe revealed that the HSE reported 291 comments and posts for spreading obvious lies about COVID-19. Yet a random sample selected by The Beacon showed that many of the comments and posts in question remain online.

In the meantime Facebook continues to allow disinformation and racism to be posted and shared on its platform. Perhaps as a consequence of this, the company appears to be planning a rebranding in the coming weeks. It’s unlikely to be successful. The company has long been a haven for conspiracy theories and hatred. A name change and new colour scheme won’t change that.

Featured image via Pexels
