Facebook penalized clickbait-parody site Reductress for being too clickbaity

Reductress produces parodic clickbait articles. Reductress/Facebook

  • Women's satirical online magazine Reductress had its distribution restricted by Facebook for posting too much clickbait.
  • Reductress articles are parodies of clickbait.
  • Editor Sarah Pappalardo told The Verge the account had never had its distribution restricted before.
  • The episode shows how Facebook's attempts to limit harmful and spammy content can miss the mark and fail to appreciate humor.

Facebook's moderation problems spilled over into the realms of satire when the site penalized a parody clickbait outlet for writing too much clickbait.

Founded in 2013, Reductress is a satirical online magazine. It publishes articles parodying clickbait aimed at women, often in a comically surreal fashion.

Recent headlines include: "Eyeliner Smudge Reaches Gulf of Mexico," "4 Things Your Vagina Is Trying To Tell You Ever Since A Witch Cursed Her And She Became Sentient," and "Man Needs More Protein." The site's Facebook account has around 226,000 followers.

On Thursday, editor Sarah Pappalardo said in a tweet that Facebook had issued the account a notice saying its distribution on the platform had been limited due to "repeated sharing of clickbait."

Pappalardo told The Verge that this is the first time this has happened to Reductress.

"This appears to be a case of just ignorant regulation," they said. Pappalardo added that the lack of transparency surrounding moderation was a frustrating factor.

"You have no idea who is reviewing this content, or if they even bother to research who they are throttling," they said. ("They" is Pappalardo's preferred pronoun.)

Read more: Facebook content moderation firm asked on-site therapists to disclose counseling details with employees, according to report

Facebook was not immediately available for comment when contacted by Business Insider.

Facebook has been making concerted efforts to convince the public that it's ramping up its moderation of harmful and spammy content, citing an increased safety workforce and increasingly powerful AI.

This isn't the first time Facebook's moderation processes have misfired, missing nuance in posts. Last year the company's AI automatically took down a post containing excerpts of the Declaration of Independence, on the grounds that it was hate speech.
