Facebook says it has a special team taking down content supporting or praising the Taliban

A Taliban fighter runs towards a crowd outside Kabul airport, Afghanistan, August 16, 2021, in this still image taken from a video. REUTERS TV/via REUTERS
  • Facebook is proactively taking down Taliban content, a spokesperson told Insider.
  • The spokesperson said it had a team of "Afghanistan experts" monitoring its platform.
  • Facebook has faced criticism for failing to stop incitement and hate speech on its platforms.

Facebook says it is actively identifying and taking down Taliban content on its platform after the group seized Afghanistan's capital, Kabul, on Sunday.

A spokesperson told Insider that, as a terrorist organization sanctioned under US law, the Taliban is banned from Facebook as well as Instagram and WhatsApp under the company's "Dangerous Organization" policy.

"This means we remove accounts maintained by or on behalf of the Taliban and prohibit praise, support, and representation of them," the spokesperson told Insider.


"We also have a dedicated team of Afghanistan experts, who are native Dari and Pashto speakers and have knowledge of local context, helping to identify and alert us to emerging issues on the platform," the spokesperson said.

Dari and Pashto are the most widely spoken languages in Afghanistan.


"Facebook does not make decisions about the recognized government in any particular country but instead respects the authority of the international community in making these determinations. Regardless of who holds power, we will take the appropriate action against accounts and content that breaks our rules," the spokesperson added.

"We are relying on that policy to proactively take down anything that we can that might be dangerous or that is related to the Taliban in general," head of Instagram Adam Mosseri told Bloomberg.

"Now this situation is evolving rapidly, and with it I'm sure the risk will evolve as well. We are going to have to modify what we do and how we do it to respond to those changing risks as they happen."

The Washington Post reported that, ahead of arriving in Kabul, the Taliban sent out messages on WhatsApp to residents, declaring "we are in charge of security."

Facebook did not immediately respond when asked by Insider how many people were on its team of Afghanistan experts.


The social media giant has faced criticism in the past for not doing enough to prevent its platforms from being used for incitement and hate speech during humanitarian crises.

In 2018, the company admitted it had failed to act when its platform was used as a tool to incite hatred against Rohingya people in Myanmar.

At the time, Reuters reported that only 60 people were dedicated to moderating Myanmar's 18 million Facebook users.

It also reported that Facebook employed only three native Burmese speakers.
