White Supremacy Thriving Online Despite Prevention Efforts

As major tech companies step up efforts to curb extremist content on their platforms, far-right white extremists continue to find ways to spread violent messages and attract sympathizers on the internet.

Experts warn that far-right extremists in the West are turning to fringe sites such as Gab, BitChute, 4chan, 8chan and others to propagate their conspiracies.

While Facebook, Twitter and YouTube have made significant progress in removing extremist content, much of the white extremist material on fringe platforms can go unnoticed.

The logo of 8chan, an anonymous online forum.

“White supremacists are typically early adopters of technology. They go and hang out in places where there aren’t strong rules — places they’re more likely to get a foothold,” Keegan Hankes, interim research director at the Southern Poverty Law Center, told VOA.

In September, the U.S. Department of Homeland Security released its framework for countering terrorism and targeted violence, which states that online spaces appear essential to the recent growth of white supremacist movements in particular.

“Celebration of violence and conspiracy theories about the ‘ethnic replacement’ of whites as the majority ethnicity in various Western countries are prominent in their online circles,” the department said.

As companies adopt stricter regulations and extremists are banned, “we see a lot of these individuals moving towards smaller, more secretive, and harder-to-detect platforms,” Hankes told VOA. “Those places tend to be more extreme.”

The far-right, anti-Islamic British group Britain First was banned from Facebook in March 2018, but it migrated its videos to BitChute, a less-regulated, YouTube-like platform.

The group also moved its social media presence to Gab and Telegram, as other white extremist organizations have done.

FILE - People mourn outside the synagogue in Halle, Germany, a day after two people were killed in a shooting.

Shortly after a gunman opened fire on a synagogue in the German city of Halle in October, a video of the shooting circulated in nearly 10 white supremacist Telegram channels, reaching tens of thousands of users, many of whom hailed the shooter as a “hero” and a “saint,” according to NBC News.

The suspect reportedly livestreamed the assault on the gaming platform Twitch before the site removed it.

Global network

Some observers charge that by posting videos of mass shootings online and glorifying the perpetrators, extremist groups hope to draw vulnerable people into their ranks.

“We are witnessing the internationalization of the white supremacist movement,” the Anti-Defamation League found in a recent report. “European and American adherents are learning from each other, supporting each other and reaching new audiences. They feel empowered and emboldened because they perceive that they are influencing the political climate and reaching disaffected whites.”

Role of major outlets

Some social media platforms say they have curbed content that promotes hate, while also stressing that everybody deserves a voice.

In a speech given last week at Georgetown University, Facebook CEO Mark Zuckerberg said, “More people being able to share their perspectives has always been necessary to build a more inclusive society.”

In September, Facebook said it banned more than 200 white supremacist organizations from its platform. Other major tech companies, including Twitter and YouTube, say they have achieved similar success through enforcing a set of standards for policing content.

FILE - Facebook CEO Mark Zuckerberg walks to meetings for technology regulations and social media issues on Capitol Hill, in Washington, Sept. 19, 2019.

Some watchdog organizations say hate groups are still able to use these platforms because many of the measures taken to counter them are not properly enforced.

Change the Terms, a coalition working to stop hate online, states on its website that “major tech companies have repeatedly failed to adequately address the problems that their own platforms are creating.”

In a petition on its site directed at Twitter CEO Jack Dorsey, the organization states, “Twitter has done very little to stop white supremacists from organizing, fundraising, recruiting and normalizing attacks on women and people of color on its platform.”

Twitter said in a blog post in May that it, along with a few other tech giants, is committed to “working collaboratively across industry to attack the root causes of extremism and hate online.”

Restricting extremists online

When it comes to imposing regulations on both mainstream social media and fringe websites, experts have differing views.

Maura Conway, professor of international security at Dublin City University, told VOA that the key consideration in regulation is not the size of a site but “the nature of the material being circulated, and the nature of the interactions taking place on the sites, and whether these are of a violent extremist or even terrorist nature or not.”

Alina Polyakova, a fellow at the Brookings Institution and founding director of the Project on Global Democracy and Emerging Technology, says regulating content directly is a “dead end” because of First Amendment protections, and that examining the distribution channels, rather than the content itself, is essential.

“They (algorithms) help to promote extremist messages online. … When we know how the content is distributed, then we can regulate it,” she said last week at an event hosted by Public Knowledge, a nonprofit organization advocating freedom of expression and an open internet.

Josh Lipowsky, a senior researcher with the New York-based Counter Extremism Project, told VOA that more pressure is needed on fringe sites to ensure they meet their responsibility to thwart supremacist propaganda on their platforms.

Lipowsky said, “Restricting this type of rhetoric is not limiting freedom of speech under the First Amendment. We are talking about private businesses that provide services. They are well within their rights to set limits on how their services are used to avoid abuse of their platforms.”

He continued, “Sites like Gab pride themselves on being open forums that guarantee free expression. But they do not and cannot exist in a vacuum. They still need the web service companies to operate, and that is where pressure can be applied.”

He added that governments should step in to address these issues to “ensure across-the-board compliance in the interest of public safety and security.”

Matthew Grady and Rikar Hussein contributed to this story.
 

