Facebook was warned about a Kenosha militia group that rallied armed counter-protesters to the city — but says users' reports didn't make it to the right team in time
- Facebook was warned multiple times about the "Kenosha Guard" militia group, but initially refused to take action, The Verge reported Wednesday and Business Insider confirmed.
- Facebook eventually removed the group and a counter-protest event it had organized, several hours after a gunman, whom the company says wasn't connected to the group, shot and killed two protesters in Kenosha, Wisconsin, on Tuesday night.
- A Facebook spokesperson told Business Insider that the company has a "specialized team" that handles enforcement of its new anti-militia policy, which it eventually determined the group violated, but that team didn't see users' warnings.
- The miscommunication between Facebook's moderation teams is the latest example of how the company has struggled to consistently enforce its policies, especially amid quickly unfolding events.
A self-described militia group in Kenosha, Wisconsin, used Facebook to organize a "call to arms" event hours before two people were shot and killed during protests Tuesday night.
On Wednesday morning, hours after the shootings, Facebook removed the Kenosha Guard group as well as the event it organized, where it urged people to "take up arms and defend our city tonight from the evil thugs."

But a new report from The Verge on Wednesday, which Business Insider has confirmed, reveals that Facebook had been warned multiple times about the group's violent intentions ahead of Tuesday night's shootings.
In both cases, Facebook determined that neither the group, its "call to arms" event, nor specific posts — which, according to The Verge, included threats to stick nails in the tires of protesters' cars and discussion of which weapons to bring — violated its policies.

Facebook had previously banned content explicitly calling for violence, but introduced a new policy last week targeting "movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior."
The policy, which specifically calls out accounts and pages "tied to offline anarchist groups that support violent acts amidst protests," says Facebook will continue to let people post content supporting such groups but will take a variety of actions to either remove or limit the groups' spread.

Facebook eventually reversed course, with a spokesperson telling Business Insider: "The Kenosha Guard Page and their Event Page violated our new policy addressing militia organizations and have been removed on that basis." But the company initially declined to take action, according to the spokesperson, because users' warnings on Tuesday didn't make it to the team that deals with militia-related content.
"This work is done by our specialized team whose primary role is to enforce our dangerous organizations policy and who specifically enforces this new policy update. This team continues studying terminology and symbolism used by these organizations to identify the language used that indicates violence so we can take action accordingly," the spokesperson told Business Insider.
The discrepancy between how Facebook's normal team of content moderators and its "specialized" team dealt with the Kenosha group reveals its challenges in consistently enforcing an increasingly complex and constantly evolving set of policies — particularly during quickly unfolding events.

Between the time Facebook's moderators were first alerted that the group violated its policies and the time it actually took action, a gunman shot and killed two protesters at the same demonstrations where Kenosha Guard had urged people to "take up arms."
"At this time, we have not found evidence on Facebook that suggests the shooter followed the Kenosha Guard Page or that he was invited on the Event Page they organized," the spokesperson said. The spokesperson added that Facebook has designated the incident a mass shooting, is taking a variety of steps to remove content praising or supporting the shooter or his actions, and is working with law enforcement on the matter.
Facebook recently announced several crackdowns on groups promoting violence, white supremacy, and conspiracies as pressure has mounted on the social media giant to curb toxic and dangerous content on its platform. But many of the groups have already used Facebook to recruit millions of members and reach even more people with their content, and critics say the company has waited too long to take action.

Despite the purported crackdowns, Facebook has continued to struggle to stamp out such groups, and recent reports have found that Facebook has long been aware that its powerful recommendation algorithm encourages extremist and polarizing content but that executives including CEO Mark Zuckerberg have ignored the issue.