- Facebook says it has a special team taking down content supporting or praising the Taliban, a spokesperson told Insider.
- The spokesperson said it had a team of "Afghanistan experts" monitoring its platform.
- Facebook has faced criticism for failing to stop incitement and hate speech on its platforms.
Facebook says it is actively identifying and taking down Taliban content on its platform, after the group seized Afghanistan's capital, Kabul, on Sunday.
A spokesperson told Insider that, as a terrorist organization sanctioned under US law, the Taliban is banned from Facebook as well as its other platforms, including Instagram and WhatsApp.
"This means we remove accounts maintained by or on behalf of the Taliban and prohibit praise, support, and representation of them," the spokesperson told Insider.
"We also have a dedicated team of Afghanistan experts, who are native Dari and Pashto speakers and have knowledge of local context, helping to identify and alert us to emerging issues on the platform," the spokesperson said.
Dari and Pashto are the most widely spoken languages in Afghanistan.
"Facebook does not make decisions about the recognized government in any particular country but instead respects the authority of the international community in making these determinations. Regardless of who holds power, we will take the appropriate action against accounts and content that breaks our rules," the spokesperson added.
"We are relying on that policy to proactively take down anything that we can that might be dangerous or that is related to the Taliban in general," head of Instagram Adam Mosseri told Bloomberg.
"Now this situation is evolving rapidly, and with it I'm sure the risk will evolve as well. We are going to have to modify what we do and how we do it to respond to those changing risks as they happen."
The Washington Post reported that, ahead of arriving in Kabul, the Taliban sent out messages on WhatsApp to residents, declaring "we are in charge of security."
Facebook did not immediately respond when asked by Insider how many people were on its team of Afghanistan experts.
The social media giant has faced criticism in the past for not doing enough to prevent its platforms from being used for incitement and hate speech during humanitarian crises.
In 2018 the company admitted it failed to act when it was used as a tool to incite hatred against Rohingya people in Myanmar.
At the time, Reuters reported only 60 people were dedicated to moderating Myanmar's 18 million Facebook users.
It also reported Facebook only employed three native Burmese speakers at the time.