YouTube's algorithm is steering viewers away from anti-vaccine videos
- A new study of YouTube's recommendation algorithms shows the filter bubble in full effect.
- A user's history of watching misinformation about key conspiracy theories results in more such videos being pumped toward them.
- But one exception is lies about vaccines. "YouTube might be reacting differently to different topics based on the pressures they're getting from the media," said the author.
YouTube's propensity to push people down rabbit holes has been repeatedly probed by journalists and academics alike, and a new research paper shows the filter bubble in action.
Tanushree Mitra and colleagues at Virginia Tech's department of computer science analyzed the way YouTube's algorithmic recommendations pushed videos on lightning-rod topics for conspiracy theories.
The academics looked at how YouTube suggests videos related to 9/11 conspiracy theories, chemtrails, the idea that the Earth is flat, that we didn't land on the Moon, and that vaccines are harmful or don't work.
"We saw all these media reports and opinion pieces talking about how YouTube is driving people down the rabbit hole," said Mitra. "But I was like: 'All these reports are talking without any empirical evidence. Is this actually happening?'"
They gathered 56,475 videos on those five topics and audited YouTube's search and recommendation algorithms.
They created bot accounts on YouTube that then engaged with those topics and videos by watching them and searching for them.
In the search audit, the bot accounts searched for videos on a particular topic using common search terms, and the researchers recorded what YouTube's search algorithm recommended in response.
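The search-audit design can be sketched roughly as follows. This is an illustrative reconstruction, not the study's actual code: the queries, stance labels, and the stubbed search results below are all invented, whereas the real audit issued queries against YouTube itself and had humans annotate the returned videos.

```python
from collections import Counter

# Hypothetical stand-in for YouTube's search results:
# query -> stances of the top-ranked videos, in rank order.
STUBBED_SEARCH = {
    "vaccines cause autism": ["debunking", "pro-vaccine", "anti-vaccine"],
    "flat earth proof": ["flat-earth", "flat-earth", "debunking"],
}

def search_audit(queries, search_fn):
    """For each common query on a topic, tally the stances of the results."""
    tally = Counter()
    for query in queries:
        for stance in search_fn(query):
            tally[stance] += 1
    return tally

result = search_audit(list(STUBBED_SEARCH), STUBBED_SEARCH.get)
```

The audit's finding is then a comparison of such tallies across topics: for vaccines, debunking results dominated regardless of query; for the other topics, conspiracy-aligned results did.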
They found that YouTube was better at pulling people out of the anti-vaccine rabbit hole than out of any of the other conspiracy theories.
"No matter how much you search for anti-vaccines, or if a user goes and searches for anti-vaccine videos, the resulting recommendations from the algorithm would still be pointing them to debunking videos, or pro-vaccine videos," said Mitra. "That's not the case for other ones, which potentially proves it'll push you down the rabbit hole if you're looking for chem trails, but not for vaccines."
A similar watch audit involved the bot accounts watching different types of videos related to each topic.
One set of bot accounts would watch solely anti-vaccine videos; another would watch videos debunking anti-vaccine conspiracy theories; and a third would consume a video diet that both supported and punctured misinformation about vaccines.
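The watch audit's three-condition setup can be sketched in the same spirit. Again, everything here is a hypothetical illustration: the stub recommender simply hard-codes the asymmetry the paper reports (counter-recommendations for vaccines, filter-bubble behaviour elsewhere) so the tallying logic has something to run against.

```python
from collections import Counter

def stub_recommend(topic, watched_stance):
    """Invented recommender stub encoding the paper's reported finding."""
    if topic == "vaccines":
        # Anti-vaccine watch histories still got pro-vaccine/debunking picks.
        return ["pro-vaccine", "debunking"]
    # Other topics: recommendations mirror the watch history (filter bubble).
    return [watched_stance, watched_stance]

def watch_audit(topic, watch_history, recommend_fn):
    """Tally the stances of recommendations served across a watch history."""
    tally = Counter()
    for stance in watch_history:
        tally.update(recommend_fn(topic, stance))
    return tally

# Three bot conditions for the vaccine topic, plus a contrast topic.
vax_anti = watch_audit("vaccines", ["anti-vaccine"] * 3, stub_recommend)
vax_mixed = watch_audit("vaccines", ["anti-vaccine", "debunking"], stub_recommend)
flat = watch_audit("flat earth", ["flat-earth"] * 3, stub_recommend)
```

Comparing `vax_anti` with `flat` is the crux: the former contains no anti-vaccine recommendations at all, while the latter is saturated with conspiracy-aligned ones.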
"We found even if the behaviour is watching anti-vaccine videos, the algorithm still gives pro-vaccine recommendations on the Up Next and top five recommendations sidebar, which was not the case for the other topics," she said. "That's where the difference lies between vaccine topics and the other topics we audited."
Mitra hypothesizes that YouTube is more proactively policing anti-vaccine videos given the current importance of the topic to the world's battle against coronavirus.
"A lot of these media articles are initially about how these platforms in general are pushing people towards vaccine controversies," she said, "so it's not surprising that's the first topic they want to tackle, and the other ones aren't a high priority for them."
A YouTube spokesperson said: "We're committed to providing timely and helpful information, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, to help combat misinformation."
They added: "We also have clear policies that prohibit videos that encourage harmful or dangerous content, impersonation or hate speech. When videos are flagged to us that break our policies, we quickly remove them."