Facebook banned ‘Boogaloo’-related groups, but new research suggests a ‘boomerang’ effect


Eighteen months after Facebook banned communities and users linked to the anti-government “Boogaloo” movement, the group’s extremist ideas had returned and flourished on the social media platform, a new investigation has found.

The paper, from George Washington University and Jigsaw, a unit within Google that explores threats to open societies, including hate and toxicity, violent extremism and censorship, found that after Facebook banned the Boogaloo militia movement in June 2020, the content “boomeranged,” first decreasing and then recovering to almost its original volume.

“What this study says is that you can’t play whack-a-mole once and walk away,” said Beth Goldberg, Jigsaw’s director of research and development. “You need sustained content moderation – sophisticated, adaptive content moderation, because these groups are sophisticated and adaptive.”

The research, which has not been peer-reviewed, comes at a time when many tech companies, including Facebook, are cutting back on their trust and safety departments, and content moderation efforts are being reviled and abandoned.

Teams inside Facebook and external workers hired by Facebook monitor the platform for users and content that violate its policies against what the company calls Dangerous Organizations and Individuals. Facebook designated the Boogaloo movement as “a US-based violent anti-government network” under that policy in 2020.

Meta, Facebook’s parent company, said in a statement that the new study is a “snapshot of a moment in time two years ago. We know this is an ever-changing, adversarial space, where bad actors are constantly trying to find new ways to evade our policies. It is our priority to keep our platforms and communities safe. That’s why we continue to invest heavily in people and technology to stay ahead of the evolving landscape, and then study and refine those tactics to make sure they’re effective in keeping our users and platforms safe.”

Members of the loosely organized Boogaloo movement were known for showing up at anti-lockdown rallies and George Floyd protests in 2020, carrying assault rifles and wearing bulletproof vests and Hawaiian shirts. They called for a second civil war, and some have been charged with a series of violent crimes, including murder.

In its summer 2020 ban announcement, Facebook said members of the Boogaloo militia movement were using the social media platform to organize and recruit. Facebook also acknowledged that removing hundreds of users, pages and groups was just a first step, as the company expected the group would try to come back.

“As long as violent movements operate in the physical world, they will seek to exploit digital platforms,” Facebook said in a blog post. “We are intensifying our efforts against this network and we know that there is still more to do.”

Facebook was right, according to new research from George Washington University and Jigsaw.

The researchers’ analysis used an algorithm that identified content from far-right movements, including Boogaloo, QAnon, white nationalist, and patriot and militia groups. The data used to train the algorithm included 12 million posts and comments from nearly 1,500 communities across eight online platforms, including Facebook, from June 2019 to December 2021.

The algorithm was trained to recognize text in posts from these extremist movements and identify re-emerging Boogaloo content, not through identical keywords or images, but through similarities in rhetoric and style. Once the algorithm identified these posts, the researchers confirmed that the content was Boogaloo-related and explicitly found Boogaloo aesthetics and memes that included Hawaiian shirts and red laser eyes.
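The paper does not describe the classifier at the level of code, so the sketch below is only a rough illustration of the general technique it alludes to: scoring posts by textual similarity (here, character n-grams) rather than fixed keywords, then routing high-scoring posts to a human for confirmation. The library choice (Python with scikit-learn), the model, and the toy posts and labels are all assumptions for the example, not the researchers’ actual pipeline.

```python
# Illustrative sketch only: flag posts by stylistic similarity
# (character n-grams) instead of exact keyword matches.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for labeled posts (1 = movement-related, 0 = unrelated).
train_posts = [
    "the big igloo is coming, boys, kit up",    # older coded slang
    "fun day at the lake with the family",
    "spicy times ahead, keep your gear ready",  # evolved, keyword-free
    "anyone have a good banana bread recipe?",
]
train_labels = [1, 0, 1, 0]

# Word-boundary-aware character n-grams (3-5 chars) capture spelling
# variants and style, so renamed slang can still score as similar.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5)),
    LogisticRegression(max_iter=1000),
)
model.fit(train_posts, train_labels)

# Score new posts; in practice, high scorers would go to human review,
# mirroring the study's step of manually confirming flagged content.
new_posts = ["big luau soon, wear your best hawaiian shirt"]
scores = model.predict_proba(new_posts)[:, 1]
for post, score in zip(new_posts, scores):
    print(f"review-priority score {score:.2f}: {post}")
```

Because character n-grams overlap partial words and spellings, a post can still score as similar after a movement renames its slang, which is the evasion pattern the researchers describe.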

“The coded language they were using had completely evolved,” Goldberg said, noting that the movement seemed to have ditched terms like “boog” and “big igloo.” “This was a very intentional adaptation to avoid removals,” she said.

According to the researchers’ analysis, the initial Boogaloo ban worked, greatly reducing content from the militia movement. But 18 months after the ban, in late 2021, the amount of Boogaloo-related content in public Facebook groups, including discussions of custom weapon modifications and preparations for a civil war, had nearly returned to its pre-ban level.

Research published by Meta last week seems to support Jigsaw’s conclusions about the effectiveness of the initial deplatforming: removing hate groups and their leaders hampered the groups’ online operations.

However, in a blog post about the research published Friday, Jigsaw noted that growing interconnection between mainstream platforms and alternative platforms, including Gab, Telegram and 4chan, which may lack the will or the resources to remove hateful and violent extremist content, makes curbing this content a more complicated problem than ever.

Experts at circumventing moderation enforcement, Boogaloo extremists were able to evade moderators by migrating to (and eventually back from) alternative platforms.

The new study builds on a small body of academic research, which generally shows that when extremists are banned from major platforms, their reach decreases. The research also suggests that pushing users toward alternative platforms may have the unintended effect of further radicalizing followers who move to smaller, unmoderated or private online spaces.

Jigsaw also published the results of a recent qualitative study in which researchers interviewed people who had had posts or accounts removed on major social media platforms as a result of enforcement actions. They reported turning to alternative platforms and being exposed to more extreme content, some of which, in another boomerang effect, they reposted on mainstream platforms. Jigsaw reported that almost all of the people interviewed eventually returned to the main platforms.

“People wanted to be online where public debates were taking place,” the Jigsaw researchers wrote.

Bans initially work, but they must be enforced, said Marc-André Argentino, a senior fellow at the Accelerationism Research Consortium, which studies extremist movements online and was not involved in the Jigsaw article.

Of Meta, Argentino said: “They have basically put a Band-Aid on a gunshot wound. They stopped the bleeding, but did not treat the wound.”

Technology platforms’ efforts to keep online spaces safe rely too heavily on algorithms and automated tools, Argentino said.

“They want to automate the process, and that is not sophisticated enough to counter the human beings who are actively working to get back on the platform and get their messages heard,” he said.

Meta has reportedly terminated contracts or laid off hundreds of workers in content moderation and trust and safety positions, according to documents filed with the US Department of Labor. The recent cuts echo the gutting of teams that monitor hate speech and misinformation at other tech companies.

“They are cutting teams and there are no resources to deal with the multitude of threats,” Argentino said. “Until there is a way to hold the platforms accountable and make them play ball in a meaningful way, it will just be these patchwork solutions.”
