Facebook potentially an ‘echo chamber of white supremacism’ as algorithms benefit the far right

Facebook is failing to do enough to stop white supremacist groups using the platform to recruit followers and spread racism. That’s the conclusion of a new report published by the Tech Transparency Project (TTP). It’s argued that the findings “cast doubt” on Facebook’s claim to have clamped down on hate groups.

Instead, the website has become “an echo chamber of white supremacism”.

Facebook is also home to a number of Irish-based groups which promote far-right conspiracy theories around asylum seekers.

A haven for white supremacy

Titled White Supremacist Groups Are Thriving on Facebook, the report was published by the TTP on 21 May. In it, the TTP reveals that it tested the claim of Facebook’s CEO, Mark Zuckerberg, that his company bans hate groups from using the platform. The TTP searched Facebook for the names of 221 white supremacist groups. Of these, 113, or 51%, were on Facebook in some form.

According to the TTP, these organisations were “associated with a total of 153 Facebook Pages and four Facebook Groups”. In some cases the pages are “auto-generated” by Facebook. As explained in the report:

These Pages are automatically created by Facebook when a user lists a job in their profile that does not have an existing Page. When a user lists their work position as “Universal Aryan Brotherhood Movement,” for instance, Facebook generates a business page for that group.

The TTP writes that this problem “has existed for some time”. And, even more troublingly, a whistleblower revealed that these pages are “a way for the groups to identify potential recruits”.

Fooling the system

One reason white supremacist groups are able to continue using Facebook is the company’s moderation system. As noted by the TTP, the system “relies heavily on artificial intelligence (AI)” as well as reports from individual users. But this method “doesn’t work well”.

Relying on users to report content, it’s pointed out, won’t work because “the platform is designed to connect users with shared ideologies”. Given this, “white supremacists are unlikely to object to racist content they see on Facebook”. 

AI-based moderation also fails because users are able to skirt around it. The TTP relates that intentional misspellings, letters replaced with symbols or numbers, and even the removal of spaces between words have been enough to fool Facebook’s AI.

Far-right algorithms

The report also draws attention to Facebook’s “Related Pages” recommendation system. It asserts that:

Facebook’s algorithms can create an echo chamber of white supremacism through its “Related Pages” feature, which suggests similar Pages to keep users engaged on a certain topic.

The research by the TTP showed that, of the 113 white supremacist groups that had a Facebook presence,

77 of them had Pages that displayed Related Pages, often pointing people to other extremist or right-wing content.

This is in keeping with the findings of The Beacon. When The Beacon “Liked” pages related to the National Party, Facebook sent an email with a list of recommended pages. Included on this list were the pages for Grand Torino, a well-known Irish far-right activist, and the Irish Freedom Party, a far-right Euro-sceptic party whose leader has voiced his belief in the “Great Replacement” conspiracy theory.

Facebook has claimed that it has “expanded its efforts” to clamp down on hate speech in recent years. However, as the TTP noted:

Research suggests there continues to be a gap between Facebook’s public relations responses and the company’s enforcement of its own policies.

Inconsistent

As reported by The Beacon last year, a number of far-right organisations were active on Facebook. This included pages pushing conspiracy theories related to asylum seekers, vaccines, and 5G. It also included private Facebook groups dedicated to opposing the housing of asylum seekers around Ireland. The anti-asylum seeker groups were founded by a member of the Irish Freedom Party, Conor McZorba, also known as Conor Rafferty.

As of the time of writing, all of the pages and groups reported on last year are still active.

Facebook does sometimes take action, though. As a result of the TTP report and an email from the Huffington Post, 55 of the pages for the white supremacist groups identified by the TTP have now been removed.

But the far right and racists continue to find a haven on Facebook. In the Irish context, the company has done next to nothing to tackle the increasingly dangerous rhetoric emerging on its platform. It’s putting lives at risk. And this is especially the case now that the far right is using the COVID-19 pandemic to spread its propaganda.

Facebook needs to do more. But for now, it likely remains too profitable for the company to do much of anything about the problem. The fact that we’re still having this conversation after the Christchurch attack says a lot about Facebook’s attitude towards the issue. And it says a lot about how much work remains to be done.

Featured image via Pikrepo
