YouTube says it will soon recommend fewer conspiracy theory videos on its platform


YouTube silhouettes. Dado Ruvic/Reuters

  • YouTube announced in a company blog post on Friday that it would recommend less "borderline" content, or videos that are untruthful in potentially harmful ways.
  • Examples of videos YouTube hopes to promote less often include ones that claim that the Earth is flat or promote phony cures for serious illnesses.
  • "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube said in its blog post.
  • YouTube has long struggled with its recommendations algorithm, catching backlash for promoting conspiracy theories and leading users to more extreme corners of the Internet.

The Earth is not flat, and soon you should start seeing fewer videos on YouTube that say it is.

On Friday, YouTube announced in a company blog post that it would recommend less "borderline" content, or videos that are untruthful in potentially harmful ways.

Essentially, YouTube, which is owned by Google, thinks it has created a better solution for stopping the spread of conspiracy theory videos on its platform.

Examples of videos YouTube hopes to promote less often include those claiming the Earth is flat, as well as ones that promote phony cures for serious illnesses or make blatantly false claims about historical events like 9/11.


Many of these "borderline" videos don't necessarily violate YouTube's Community Guidelines, but the company says that limiting their reach will provide a better experience for its users. "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube said in its blog post.

These videos will not be removed entirely from the platform, and they may still appear in search results or recommendations if a user follows certain channels, the company explained.

YouTube also provided a bit of insight into how its recommendation model works, which involves "human evaluators and experts from all over the US" reviewing videos and using that feedback to train its machine learning systems.
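
YouTube did not publish technical details, but the general pattern it describes, human reviewers labeling examples and a machine learning system generalizing from that feedback, can be sketched in a few lines. The Python snippet below is a purely illustrative toy using scikit-learn; the sample labels, threshold, and demotion logic are invented for the example and do not represent YouTube's actual system.

```python
# Illustrative sketch only: how human-evaluator labels could, in principle,
# feed a simple "borderline content" classifier that a recommendation
# pipeline then consults to demote (not remove) flagged videos.
# All data, thresholds, and helper names here are hypothetical.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical labels from human evaluators (1 = borderline, 0 = not).
evaluator_labels = [
    ("the earth is flat and nasa is hiding it", 1),
    ("miracle cure reverses serious illness overnight", 1),
    ("how vaccines are tested in clinical trials", 0),
    ("documentary on the history of space flight", 0),
]

texts, labels = zip(*evaluator_labels)

# Turn video titles into features and fit a simple classifier on the labels.
vectorizer = TfidfVectorizer()
classifier = LogisticRegression().fit(vectorizer.fit_transform(texts), labels)

def rank_candidates(candidates, demotion_factor=0.1, threshold=0.8):
    """Re-rank (title, score) pairs, demoting videos flagged as borderline."""
    scored = []
    for title, base_score in candidates:
        prob_borderline = classifier.predict_proba(
            vectorizer.transform([title]))[0][1]
        if prob_borderline >= threshold:
            base_score *= demotion_factor  # demote rather than remove
        scored.append((title, base_score))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

print(rank_candidates([
    ("flat earth proof compilation", 0.9),
    ("how orbital mechanics works", 0.7),
]))
```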

YouTube has long struggled with its recommendations algorithm, catching backlash for promoting conspiracy theories and facing criticism for leading its users to more extreme corners of the Internet.

Read more: One viral thread shows how quickly YouTube steers people to wacko conspiracy theories and false information


"It's just another step in an ongoing process," the company said in its blog post on Friday. "But it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube."
