Black Instagram users were 50% more likely than white users to have their accounts automatically disabled, internal research reportedly showed

Reuters
  • Internal researchers at Facebook reportedly discovered in 2019 that an automatic moderation algorithm at Facebook-owned Instagram was 50% more likely to auto-ban Black users than white users.
  • But when the researchers reported their findings to higher-ups, they were told to stop researching the topic and not to discuss their findings with others, current and former employees told NBC News.
  • Instagram later implemented a slightly different version of the automatic moderation tool, but staff were reportedly blocked from researching its potential racial biases.
  • Facebook said in a statement that the researchers who first discovered the disparity were using flawed methodology.

Facebook told staff to stop researching its products' racial bias after they found that an Instagram moderation tool was disproportionately banning Black users, according to a new NBC News report.

Current and former employees told the outlet that internal Facebook research found in 2019 that an automated content moderation tool on Instagram, which Facebook owns, was 50% more likely to automatically ban Black users than white users.

But after reporting their findings to their superiors, the researchers were reportedly told to stop investigating possible racial bias related to the tools. Instagram later implemented a slightly different version of the automatic moderation tool, according to the report, but employees were barred from researching whether it had racial biases.


In response to the report, Facebook did not deny that researchers were told to stop investigating the tool's possible racial bias, but said that the researchers who discovered the disparity were using flawed methodology. The company added that it is currently investigating how best to test its products for racial bias.

"We are actively investigating how to measure and analyze internet products along race and ethnic lines responsibly and in partnership with other companies," Facebook spokeswoman Carolyn Glanville said in a statement.


The new report comes as Facebook faces increasing scrutiny over its handling of hate speech and racial bias. The company has struggled to develop content moderation policies capable of distinguishing posts that criticize dominant demographics, like men and white people, from those that criticize oppressed groups, Vanity Fair reported last year.

Facebook has set ambitious goals to improve the racial and gender diversity of its staff. Its diversity report published last month showed that it is making incremental progress toward those goals, but that many segments of the company — especially technical roles and top leadership — remain disproportionately white and male.

Employees have criticized the company for downplaying the potential racial biases of its products.

"I've seen people be driven insane as leadership ignores them or outright shuts them down and commits us, again and again, to doubling down on this same path," one Facebook engineer told NBC News.

Earlier this week, Facebook confirmed that the company is looking to build new internal teams to study potential racial bias in its products.


Read the full NBC News report on Facebook's response to research on racial bias.
