We're asking tech executives the wrong question about fake news


Facebook COO Sheryl Sandberg. Getty Images

On Thursday, Sheryl Sandberg became the latest tech executive to play down the fake news problem.


The Facebook COO was asked on the "Today" show whether fake news stories spread on the social network had an influence on the US election. Her answer was the same as her boss Mark Zuckerberg's: Nope.

But it's also the wrong question.


Of course executives at Facebook and Google are going to say they don't believe fake news on their platforms had any influence on the election, even as they promise to work on the problem. (That claim doesn't hold up, of course: everyone from Pizzagate truthers to the president-elect has fallen for fake news stories.)

The better question for tech execs like Sandberg, Zuckerberg, and the rest is this: Do you think large distributors of news media, whether the content is user-generated or not, have a responsibility to vet that content for the truth?


It's a responsibility that the tech community doesn't appear to understand. I spoke with one high-level tech executive this week who told me the vast scale of content being posted online makes it nearly impossible to police for accuracy.

But while that argument makes sense on the surface, it falls flat when you consider that companies like Facebook and Google already filter out plenty of other types of content, like porn and copyrighted material, from their platforms. They don't have to block people from posting conspiracy theories, but they should have the capability to keep that content from bubbling to the surface and going viral.

It benefits these platforms to allow as much content as possible and deliver it to the people who want to see it. Otherwise, they risk alienating huge swaths of their audience. As CNN's Brian Stelter put it Wednesday at Business Insider's IGNITION conference, if people can't find the content that makes them feel good on Facebook, then Facebook risks losing them to some other site that will peddle that content.

So it's not a question of whether fake news can be tamed. It's a question of whether tech companies want to tame it. Whether they admit it or not, distributing news comes with editorial decisions about what best serves the public.

With such a massive scale comes an equally massive responsibility. And I think we can all agree that that responsibility is to distribute the truth.


This is a column. The opinions and conclusions expressed above are those of the author.
