YouTube warns more videos than usual could be removed as content moderation is automated amid coronavirus outbreak
FILE PHOTO: Silhouettes of mobile users are seen next to a screen projection of the YouTube logo in this picture illustration taken March 28, 2018. REUTERS/Dado Ruvic/Illustration
  • Nearly all of Google's more than 100,000 full-time employees worldwide, including those who work for YouTube, have been ordered to work from home due to the coronavirus outbreak.
  • YouTube said Monday that because of reduced in-office work, the platform is relying more on machine learning than humans to review videos.
  • YouTube warned creators that an increase in automation in the review process could lead to more videos being removed, including some "that may not violate policies."

YouTube has warned creators they could see more videos than usual being removed from the platform, as a result of its employees being told to work from home amid the coronavirus outbreak.

YouTube announced Monday that its video review process will be "temporarily" more automated, relying more on machine-learning tools instead of its bevy of human content moderators. Google, YouTube's parent company, has ordered most of its nearly 120,000 employees to work remotely as a precaution against the spread of coronavirus.

Nearly 200,000 coronavirus cases have been reported worldwide, including more than 4,000 in the US.

"This means automated systems will start removing some content without human review, so we can continue to act quickly to remove violative content and protect our ecosystem," YouTube said in a blog post. "As we do this, users and creators may see increased video removals, including some videos that may not violate policies."

Content moderation at YouTube is a responsibility shared between the company's automated systems and its human reviewers. Although videos flagged by machine learning are often first reviewed by humans before action is taken, YouTube says it will skip this secondary step "so we can continue to act quickly."

Machine learning is not perfect, as past incidents have demonstrated: YouTube struggled to quickly take down videos posted to its platform following the terror attack on mosques in Christchurch, New Zealand. YouTube warned that relying on automated content moderation could lead to the removal of videos that don't violate community guidelines.

Google is one of many companies around the world that has taken measures to prevent the spread of coronavirus by telling its employees to work remotely. Some Google employees whose jobs require them to be onsite are still working in offices, but the company says it has adjusted shifts and schedules to encourage social distancing.

Many of the people who handle content moderation for YouTube, as well as for Facebook and Twitter, are contractors employed by companies like Accenture and Cognizant. Some Facebook contractors have been told they must work onsite to keep their jobs amid the coronavirus outbreak, The Intercept reported last week.

It's unclear how many of YouTube's moderators are being allowed to work remotely. Google has committed to paying contractors and hourly workers whose jobs are affected, and has also set up a fund to offer paid sick leave to anyone who can't work because of coronavirus symptoms and quarantines.