Sanctioned Online Communities May Become More Radicalized, New Study Finds
New research suggests that users of online communities sanctioned by social media platforms for dangerous content often migrate to standalone websites, where they can become more toxic and potentially cause even greater harm to society.

When social media platforms ban or “quarantine” online communities for posting toxic, radical content that can harm society, such as hate speech, is doing so an effective way to moderate them? And how do users of these sanctioned communities react when platforms act against them?

New research by Assistant Professor Shagun Jhaver, who examined Reddit’s attempt to moderate the online communities r/The_Donald and r/Incels, found that “community-level moderation measures decrease the capacity of toxic communities to retain their activity levels and attract new members, but that this may come at the expense of making these communities more toxic and ideologically radical.”

Further, Jhaver said, their study, which explored what users of online communities do when they change platforms as a result of being sanctioned, concluded: “As platforms seek to moderate content, they should also consider their impact not only on their own websites and services, but in the context of the Web as a whole. Toxic communities respect no platform boundary, and thus, platforms should consider being more proactive in identifying and sanctioning toxic communities before they have the critical mass to migrate to a standalone website.”

The study, “Does Platform Migration Compromise Content Moderation? Evidence from r/The_Donald and r/Incels,” was published in Proceedings of the ACM on Human-Computer Interaction and will be presented at the CSCW 2021 conference, where it received a Best Paper Honorable Mention award. Jhaver’s co-authors include Manoel Horta Ribeiro, EPFL; Savvas Zannettou, Max Planck Institute; Jeremy Blackburn, Binghamton University; Emiliano De Cristofaro, University College London; Gianluca Stringhini, Boston University; and Robert West, EPFL.


The aim of their study, Jhaver said, was to focus on online communities’ migration to fringe websites explicitly created to support the growth of offensive communities, and to answer the question: “In the face of platform-sanctioned bans, offensive communities often move en masse to other, less-moderated websites. Do such migrations strengthen these communities?”

Jhaver said they found that the Reddit bans were “effective in decreasing activity and the capacity of the community to attract newcomers on the new sites. Moreover, we found evidence of an increase in relative activity, i.e., fewer users posting more and that this increase in posts-per-user is likely due to self-selection: the users who migrated to the new community were more active on Reddit to begin with.”

On the other hand, Jhaver said they also found “significant increases in radicalization-related signals for one of the communities studied (r/The_Donald).”

They chose to study r/The_Donald and r/Incels because of the nature of the groups’ content. “The r/The_Donald subreddit (TD) was created on 27 June 2015 to support the then-presidential candidate Donald Trump in his bid for the 2016 U.S. Presidential election. The discussion board, linked with the rise of the Alt-right movement at large, has been denounced as racist, sexist, and islamophobic,” the authors wrote.

The r/Incels subreddit, the authors wrote, “was created in August 2013. Short for involuntary celibates, it was a community built around ‘The Black Pill,’ the idea that looks play a disproportionate role in finding a relationship and that men who do not conform to beauty standards are doomed to rejection and loneliness. Incels rose to the mainstream due to their association with mass murderers and their obsession with plastic surgery. The community has been linked to a broader set of movements referred to as the ‘Manosphere,’ which espouses anti-feminist ideals and sees a ‘crisis in masculinity.’ In this world view, men and not women are systematically oppressed by modern society.”

Their findings differ from those of previous studies, Jhaver explained. “After a group gets ‘deplatformed’ from a given social media site, they can choose to participate in other communities within the same platform or they can migrate to an alternative, possibly fringe platform (or platforms) where their behavior is considered acceptable. In both scenarios, things could backfire.”

How could they backfire? In two possible scenarios, Jhaver said. In the first, sanctioned users could simply migrate to other communities, existing or newly created, within the same platform. Would they continue their problematic behavior there?

The second scenario concerns what the users would do if they migrated to an alternate platform. Could the ban unintentionally strengthen a fringe platform (e.g., 4chan or Gab) where users’ problematic content would go largely unmoderated? If so, the toxic community could inflict even greater harm on society from the new platform.

For the first time, this paper explores what happens when users from sanctioned online communities change platforms.

Previous work has largely addressed the first scenario, Jhaver said, and it showed that “users who remained on Reddit after the banning of several communities in 2015 drastically reduced their usage of hate speech and that counter-actions taken by users from the banned communities were promptly neutralized.”

However, what makes their findings new and significant is that they explored, for the first time, what happens in the second scenario, when users change platforms.

It is critically important that platforms proactively moderate their content and consider this broader context because “the lack of moderation on fringe websites can provide fertile ground for radicalizing end-users and motivating them to conduct offline infractions. Even a small number of such violations can have enormous societal harms. Platform managers should therefore consider it their civic and moral responsibility to proactively address the growth of online hate groups on their site,” Jhaver said.

