
Meta is cracking down on revenge porn targeting children on Instagram and Facebook by funding a new tool to remove explicit images

Mar 1, 2023, 00:06 IST
Business Insider
Meta provided funding for a new tool created by the National Center for Missing and Exploited Children which takes down sexually explicit images of minors online. SOPA Images/Getty Images
  • Meta is cracking down on revenge porn targeting children under the age of 18 on its platforms.
  • It funded a new tool for removing explicit images online, released by a child protection organization.

Meta is helping to clamp down on the spread of revenge porn targeting children and teenagers on Instagram and Facebook by funding a new tool that helps users remove sexually explicit images online.

The National Center for Missing & Exploited Children announced the release of Take It Down on Monday – a site that removes nude, partially nude, or sexually explicit images and videos of children under the age of 18 that have been posted online or are believed to have been posted online.

The tool can also be used by those over 18 to remove explicit pictures taken when they were a minor, and it is available globally.


NCMEC said in its release that Meta provided initial funding for building the infrastructure of the program.

Users can select images and videos on their devices that they don't want to be posted online or that have already been posted online. Take It Down will generate a unique digital fingerprint, called a hash value, for the specific content to help participating companies identify any copies, according to its website.


The hash is added to a secure list and shared with participating companies who will then scan their public and unencrypted platforms for it. If an image or video is identified that matches the hash value, it will be taken down.
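
The release doesn't specify which hashing scheme Take It Down uses, so the sketch below uses an ordinary cryptographic hash (Python's hashlib) purely for illustration; the file names and the shared-list structure are hypothetical. It shows the core idea: the image itself never leaves the user's device, only its fingerprint does, and platforms compare fingerprints of uploads against the shared list.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Compute a hex digest of a media file's raw bytes.

    Illustrative only: the article doesn't say which hash Take It Down
    uses, and a production system may rely on perceptual hashing instead.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

# The user submits only the fingerprint; the image stays on their device.
shared_hash_list = {fingerprint(Path("reported_image.jpg"))}  # hypothetical file

def should_take_down(upload: Path) -> bool:
    """Platform-side check: flag an upload whose fingerprint is on the shared list."""
    return fingerprint(upload) in shared_hash_list
```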

Other platforms that have signed up with NCMEC include Pornhub, OnlyFans, MindGeek, and Yubo.

"We created this system because many children are facing these desperate situations." Michelle DeLaune, president and CEO of NCMEC, said in the release. "Our hope is that children become aware of this service, and they feel a sense of relief that tools exist to help take the images down."

But there are caveats to this tool. According to The Associated Press, if someone were to alter the image – by cropping it or turning it into a meme, for example – it becomes a new image and therefore needs a new hash. Moreover, images will still appear on sites that haven't signed up for this service.
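
With an exact-match fingerprint like the one sketched above, even a tiny change to a file's bytes yields a completely different digest, which is why a cropped copy or a meme needs its own hash. The values below are stand-ins for illustration:

```python
import hashlib

original = b"example image bytes"   # stand-in for the original file's contents
altered = original + b"\x00"        # stand-in for a cropped or re-encoded copy

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())
# The two digests bear no resemblance to each other, so a hash list built
# from the original will not catch the altered copy.
```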

Meta's global head of safety, Antigone Davis, said in a separate release shared with Insider: "Meta has worked with NCMEC, experts, and victims to develop this platform and help young people get the resources they need when facing these horrific situations. We look forward to other tech companies joining this effort so we can collectively combat this issue across the internet."


Insider reached out to NCMEC and Meta for further comment but did not immediately hear back.

Meta recorded more child sexual abuse material on its platforms than any other tech company in 2019 and was responsible for 99% of all reports to the NCMEC at the time. It also detected over 20 million images of child sex abuse on Instagram and Facebook in 2020.

Around 15,000 people globally work as moderators for Facebook and Instagram, according to a 2020 report from the New York University Stern Center for Business and Human Rights. Most of these are contracted through third-party firms. Jennifer Grygiel, a social media scholar at Syracuse University quoted in the report, described these numbers as "woefully inadequate."

"To get safer social media, you need a lot more people doing moderation," she said.

There have also been complaints that Facebook has treated its moderators poorly in recent years.


One moderator, hired through an outsourcing company in Kenya, told Insider that he had to sift through traumatic content including beheadings, sexual exploitation of children, graphic violence, and more, and was paid only $1.50 an hour.
