How Google's 'autocomplete' search results spread fake news around the web
There's a flaw in Google's autocomplete feature, which attempts to predict search queries as they are being typed: it actively directs users to fake content on the web, even when they're not looking for it.
Bizarre examples of this include suggestions that Michelle Obama is a man, Tony Blair is dead, and that president-elect Donald Trump can still win the election. We've mapped them out below.
Google's predictive "autocomplete" search feature is "designed to reflect the range of info on the web" and is generated by "what other people are searching for, including Trending stories."
The side effect of that design is that when false content becomes more popular than factual content across the web, the algorithm generating the predictions cannot tell the difference, and it begins steering readers toward fake news. This became a real problem during the US election, when fake news went viral and false rumours drew wide readership. (One such bogus story claimed President Obama had banned the Pledge of Allegiance in schools - he had not.)
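To see why popularity alone can't filter out falsehoods, here is a minimal, purely illustrative sketch of a frequency-ranked autocomplete. The query log, the `suggest` function, and the example queries are all hypothetical; the point is simply that ranking by search volume has no notion of truth.

```python
from collections import Counter

# Hypothetical query log: a viral false rumour is searched more often
# than a mundane factual query.
query_log = [
    "is obama planning a coup",      # widely shared false rumour
    "is obama planning a coup",
    "is obama planning a coup",
    "is obama a good president",     # less popular query
]

def suggest(prefix, log, k=3):
    """Return the k most frequently logged queries starting with prefix,
    ranked purely by count - truthfulness never enters the ranking."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(suggest("is obama", query_log))
# The rumour ranks first simply because it was searched more often.
```

A real system is vastly more sophisticated, but any ranking signal driven chiefly by search volume inherits this blind spot.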
A Google spokesperson told Business Insider:
"Autocomplete predictions are algorithmically generated based on users' search activity and interests. Users search for such a wide range of material on the web - 15% of searches we see every day are new. Because of this, terms that appear in Autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we acknowledge that autocomplete isn't an exact science and we're always working to improve our algorithms. "
"Our search results are a reflection of the content across the web. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what search results appear for a given query. These results don't reflect Google's own opinions or beliefs - as a company, we strongly value a diversity of perspectives, ideas and cultures."
Scroll down to see how innocuous search queries can generate fake news results.