Meta Oversight Board says some Facebook and Instagram users were treated unequally and calls for overhaul of 'cross-check' moderation programme

The Oversight Board made the comments in a report issued on Tuesday. Getty Images
  • Meta's cross-check moderation programme led to unequal treatment for users, per the Oversight Board.
  • The board made the comments in a blunt report issued on Tuesday.

Meta has treated some Facebook and Instagram users unequally and given some of them greater protection than others, according to a report issued by the company's Oversight Board on Tuesday.

The board said it found several shortcomings in Meta's cross-check programme, which was designed to improve moderation.

According to Meta, cross-check aimed to provide "additional layers of human review" to address the problem of content being mistakenly removed when it does not violate policies.


However, the Oversight Board said that "by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period."

This could potentially cause harm, it added.


The board said Meta had failed to track data on whether cross-check actually resulted in more accurate decisions, and it expressed concern about the lack of transparency around the programme.

Nick Clegg, Meta's VP of global affairs, responded to the report in a statement posted to Meta's website.

He said: "We built the cross-check system to prevent potential over-enforcement (when we take action on content or accounts that don't actually violate our policies) and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe."

Clegg noted that the company had already made some changes in line with the board's recommendations but added that to "fully address the number of recommendations, we've agreed with the board to review and respond within 90 days."

Meta did not provide further comment beyond Clegg's statement.
