YouTube is reportedly considering fundamental changes to its platform as it continues to face backlash over child safety
- YouTube is considering new strategies in response to ongoing concerns about child safety on the massive video platform.
- Both Bloomberg and the Wall Street Journal report that YouTube executives are considering moving all children's content to YouTube Kids, the platform's secondary app.
- While YouTube Kids has existed since 2015, a majority of children use the regular YouTube app to access the platform.
- YouTube will also consider disabling automatic video recommendations for children's videos.
- Visit Business Insider's homepage for more stories.
New reports indicate that YouTube's executives are considering fundamental changes to the video platform to help create a healthier environment for children.
Both Bloomberg and The Wall Street Journal report that YouTube will consider shifting all children's content to YouTube Kids, a secondary app with increased moderation. YouTube will also consider disabling the platform's recommendation feature for children's programming. The site has been criticized for automatically steering both kids and adults toward inappropriate or radical content, since the feature normally starts playing a new video seconds after a chosen video ends. Recommended videos are based on the viewer's habits and prior videos watched.
This year, YouTube has already removed more than 800,000 videos that violated its child safety policies, but some of the platform's essential features, like the automatic recommendations, have undermined the company's own safety initiatives.
In February, Wired reported that a "network of pedophiles" was using YouTube to find videos that showed children in varying states of undress. Many of the videos were seemingly harmless, showing kids practicing gymnastics or playing in the pool. But these users left time stamps in the comment sections of those videos to mark moments where children's genitals were exposed, betraying their intentions.
As more viewers left comments, these seemingly harmless home videos eventually gained thousands, and in some cases millions, of views. In response, YouTube disabled comments for videos featuring children and deleted more than 400 accounts associated with the time-stamp comments.
But even after the initial controversy about the time-stamped comments, The Verge found that YouTube's video recommendation system could automatically steer users to similarly questionable content featuring children. While there's been much speculation about what factors drive YouTube's algorithm for recommended videos, it has been demonstrated that YouTube can surface inappropriate content with minimal input from the viewer.
YouTube established YouTube Kids in 2015 for users under 13 years old. Users under the age of 13 are actually not allowed to have a standard YouTube account, though minimal age verification is required to start one. YouTube says it deletes thousands of accounts per week that belong to underage users. Along with a more family-friendly appearance, YouTube Kids is more heavily moderated than the regular platform and has additional parental control features.
However, Bloomberg reports that a majority of kids on YouTube are using the main site instead of YouTube Kids. Childhood development experts told Bloomberg that many children find YouTube Kids too infantilizing and refuse to use the app after experiencing regular YouTube. A YouTube spokesperson told Bloomberg that YouTube Kids is used by more than 20 million people a week, while the main YouTube platform commands more than 2 billion active users each month.
Beyond child safety, YouTube continues to field criticism for its handling of a wide range of issues, including hate speech, radicalization, conspiracy theories, and monetization for creators. But changing the way the massive platform treats kids could go a long way toward regaining the trust of parents and the public.