Tech companies are embarrassing themselves with how they handle fake news

Photo: The Mandalay Bay hotel window. AP Photo/Marcio Jose Sanchez

If you searched for "Las Vegas Shooter" or "Stephen Paddock" on YouTube after Sunday's mass shooting, here's a sampling of some of the videos you would have seen:

One video was titled "Proof Las Vegas Shooting Was a FALSE FLAG Attack - Shooter on 4th Floor."

Another: "Las Vegas Shooting Narrative Debunked in 3 Videos."

Yet another: "Las Vegas (Antifa?) Shooting."

And: "Las Vegas Gunman Stephen Paddock Was Anti Trump Far Left Activist."

You get the idea.

BuzzFeed's Charlie Warzel had the best roundup of the YouTube search results in a story he published Wednesday. While I shouldn't have to point this out, none of those videos, nor the slew of other clips that YouTube recommended to its users as related content, were based on fact. Police believe Paddock was the only perpetrator of the shooting that left 58 people dead, and there is no evidence of any motive or association with militant groups.

Not all the videos YouTube promoted about the incident were fabrications. There were also plenty of videos from reputable news sources. But it's clear that YouTube failed miserably in serving as a useful hub for news and updates about an important event.

Visitors to YouTube, the world's largest video site, were fed the ramblings of tinfoil hat-wearing conspiracy theorists. The fact that YouTube also had legitimate news videos about the incident on its site may actually have made things even worse: The ludicrous and the legitimate, the absurd and the authoritative, were all mixed together, free of any context or differentiation, on equal terms in the eyes of viewers.

A cycle of failure

Photo: A YouTube logo reflected in a person's eye. REUTERS/Dado Ruvic

It wasn't until Thursday, more than three days after the shooting, that YouTube acted, rushing out a planned change to its search algorithm designed to promote videos from authoritative news sources. A source close to YouTube told me the earlier-than-planned rollout was a response to criticism of how the site handled the Las Vegas shooting videos.

It's been 11 months since the 2016 US election, when the plague of fake news and abuse on internet platforms became a mainstream problem. Since then, Facebook, Google, and Twitter have all made attempts to change their algorithms and services to combat the problem. But despite their efforts and all that time, it still requires a journalist or noisy critic to point out a problem before something gets fixed.

Then come the half-hearted apologies and promises to do better. And the cycle repeats itself.

In April, for example, Google said it had changed its search algorithm to favor authoritative sources for news, just as YouTube did this week. Facebook has made several changes of its own, such as automated tips for spotting fake news and a pop-up window that shows you information about the source of the article you're reading.

And yet, all these platforms continue to screw up. Every new update, every new press release, every new headline designed to show investment in the problem ends up being trumped by a new controversy. When Facebook announced this week that it was throwing 1,000 new employees at the ad-abuse problem on its site, Scott Galloway, an NYU professor and the author of the tech business book "The Four," called it "pissing in the ocean."

The problem is too big for incremental updates and promises to do better after each screwup. And these sites have become so massive and influential that they have a responsibility to get it right.

Unfortunate timing

Photo: Sundar Pichai. Justin Sullivan/Getty

In an interview with The Verge last week, Google CEO Sundar Pichai discussed the responsibility Google bears as a distributor of massive amounts of information.

"Today, we overwhelmingly get it right. But I think every single time we stumble. I feel the pain, and I think we should be held accountable," Pichai said.

In a telling case of unfortunate timing, at the very moment Pichai's comments were published, the online platforms he oversees were promoting a bunch of garbage about Las Vegas.

Google, YouTube's parent company, promoted a conspiracy-theory thread from 4chan in its search results and took hours to remove it. Competitors performed equally poorly: Facebook's "crisis response" page for the Las Vegas shooting promoted unconfirmed reports and conspiracy theories from sites like Gateway Pundit.

One way to be held accountable is to be proactive and transparent, something none of the companies claiming to feel a sense of responsibility have adequately done. Facebook's pledge to hire those 1,000 new employees to monitor ad abuse and tweak its algorithm, for example, came with a shocking lack of detail and transparency. There's no clarity on how the employees will be trained, whether they'll be contractors or full-time workers, or what standards they'll operate under. It's the latest promise to do something from a company that has proven over and over that it's ill-prepared, or unmotivated, to handle the darker edges of its platform. The same goes for its peers.

These companies employ some of the most brilliant minds in the world. They should be more than capable of tackling this problem and fixing it. Instead, they routinely wait for someone on the outside to point out a mistake before fixing it.

That's embarrassing. But even worse, it's dangerous.
