This Facebook 'Manipulation' Scandal Is Ridiculous - Companies Test Products (And You) All The Time

Mark Zuckerberg, "The Great Manipulator."

The Internet found something new to be outraged about last week, when it was revealed that a Facebook product test might have temporarily affected the way some of its users felt.

This test, one of many the company conducts routinely with the aim of improving its product, involved 700,000 English-speaking Facebook users for a week in 2012.

The test was designed to examine a prevailing "meme" at the time: the hypothesis that the upbeat tone of the content people shared and saw on Facebook, all those "likes" and "friends" and happy group photos, might be causing people in the real world to become depressed, in the same way, presumably, that Photoshopped cover models are thought to make real people insecure and unhappy about their normality.

Facebook's data scientists designed a clever way to test this theory. They set up a word filter to evaluate all the posts eligible to appear in each user's News Feed, then tweaked the selection algorithm so that some users got slightly more "positive" content and others got slightly more "negative" content. Then the scientists used the same word filter to compare the posts of users who saw more "positive" content with those of users who saw more "negative" content.
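
As a rough illustration only, here is a minimal sketch of what such a design looks like in code. The word lists, group names, and 10% skip rate are invented for this example; nothing below is Facebook's actual implementation.

```python
import random

# Illustrative word lists standing in for a real sentiment filter.
POSITIVE_WORDS = {"happy", "great", "love", "awesome"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify(post: str) -> str:
    """Label a post positive, negative, or neutral by simple word matching."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def build_feed(eligible_posts: list[str], group: str, k: int = 10) -> list[str]:
    """Select up to k posts, skipping a small share of the sentiment the group should see less of."""
    feed = []
    for post in eligible_posts:
        sentiment = classify(post)
        # Hypothetical treatment: drop ~10% of the opposite-sentiment posts.
        if group == "more_positive" and sentiment == "negative" and random.random() < 0.1:
            continue
        if group == "more_negative" and sentiment == "positive" and random.random() < 0.1:
            continue
        feed.append(post)
        if len(feed) == k:
            break
    return feed

def positive_rate(user_posts: list[str]) -> float:
    """Share of a user's own subsequent posts that contain a positive word."""
    if not user_posts:
        return 0.0
    return sum(classify(p) == "positive" for p in user_posts) / len(user_posts)
```

Comparing positive_rate() across the two groups is the whole experiment: lightly skew what each group sees, then measure the tone of what its members write afterward.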

The results of the test suggested that the "meme" of the moment was wrong.

Rather than being depressed that their lives didn't measure up to those of the happy people on Facebook, the users who saw slightly more positive content actually posted slightly more positive status updates. The users who saw slightly more negative content, meanwhile, posted slightly more negative updates.

In other words, far from getting depressed by the happiness portrayed on Facebook, people seemed to find the upbeat mood modestly infectious. And vice versa.

This information was helpful to Facebook in deciding what to do (and not do) to its product.

For understandable reasons, the company did not want its product to make people depressed (what company does?). And if Facebook's News Feed algorithm was selecting content that made people depressed, as the prevailing meme was suggesting, then Facebook wanted to know that. But the results of the test suggested that the theory was wrong. So, according to one person familiar with the situation, Facebook did not change its product in response to the test.

But that didn't stop a large number of people from exploding in indignation when they heard about the test.

Facebook is intentionally manipulating our emotions, these people howled.

Facebook is using us as unwitting guinea pigs!

Facebook is abusing its users' trust!

And so on. Even now, almost a week after the news broke, people are still acting as though Facebook is some sort of digital-age Mengele, conducting evil, harmful experiments on millions of unwilling human test subjects.

Come on.

This was a product test.

Companies conduct product tests all the time.

And these product tests are often designed to evaluate the emotions of those who use the products.

Yes, it's true that many such product tests are conducted via focus groups or test screenings or other venues in which people know they are participating in a product test. But many others are not.

When a direct-mail company is testing messages, for example, it doesn't tell you it's conducting a test. When politicians test talking points, they don't tell you they're testing them. The same goes for companies that test different advertisements or promotions and then compare response rates.

What's more, in the digital realm, companies like Facebook conduct product tests on "unwitting" participants (their users) almost every day.

When, as a Google executive, Marissa Mayer famously tested 41 shades of blue to see which shade prompted the highest user response rate, she was conducting a product test. Google's users didn't know they were being used as test subjects. Marissa Mayer didn't ask their permission to test how they felt about each particular blue. She just tested them.

On the advertising side, moreover, Google tests millions of advertisements every day. Google presents you, its users, with different ad placements and wordings and pictures, until it figures out which ads you are most likely to respond to. Google, in other words, tests what you see and do all day long, all without your knowing. And so do Amazon and dozens of other digital companies with the technology resources necessary to continually A/B test with the aim of improving their products.
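
To see how simple that kind of comparison is, here is a hedged sketch of an A/B response-rate test: two ad variants, their impression and click counts, and the comparison that decides which one keeps running. The variant names and numbers are made up.

```python
# Invented counts for two ad variants shown to different user buckets.
impressions = {"variant_a": 50_000, "variant_b": 50_000}
clicks = {"variant_a": 1_250, "variant_b": 1_400}

# Response rate per variant, then pick the better performer.
rates = {v: clicks[v] / impressions[v] for v in impressions}
winner = max(rates, key=rates.get)

for variant, rate in sorted(rates.items()):
    print(f"{variant}: {rate:.2%} response rate")
print(f"Keep {winner}")
```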

But Facebook was trying to manipulate its users' emotions, people still protest!

Actually, Facebook wasn't trying to do that.

Facebook was trying to see whether a prevailing theory about the impact of Facebook's product on emotions was correct or not. And it turned out it wasn't.

In the process, Facebook may (may) have temporarily and slightly influenced the emotions of some of its users.

But it's important to note just how slight this possible influence was.

Facebook's study, a person familiar with the situation says, found that users whose News Feeds were set to include modestly more "positive" content than the average user's made modestly more "positive" comments than average, to the tune of one (1) extra positive word in every thousand (1,000) words.

Similarly, users whose News Feeds were set to include modestly more "negative" content than average were found to write one (1) extra negative word per thousand (1,000) words compared with the average.
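
To put that figure in perspective, a back-of-the-envelope calculation helps. Only the "one extra word per thousand" comes from the reported result; the baseline counts below are invented.

```python
# Invented baseline: a user writes 10,000 words, 47 of them "positive".
words_written = 10_000
baseline_positive_words = 47

# Reported effect: one extra positive word per 1,000 words written.
effect_per_thousand = 1
treated_positive_words = baseline_positive_words + effect_per_thousand * words_written // 1_000

print(baseline_positive_words / words_written * 1_000)  # 4.7 positive words per 1,000
print(treated_positive_words / words_written * 1_000)   # 5.7 positive words per 1,000
```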

That's not exactly massive emotional influence.

But Facebook can manipulate our emotions, the indignant netizens still shout. Facebook can shape our thoughts. Facebook can affect elections!

Facebook is way too powerful, in other words.

Facebook is certainly powerful. And, in the hands of the wrong people, Facebook could certainly be used as a tool for mass-market mind control.

But when it comes to concerns about that - about companies that have the power to manipulate emotions, shape thoughts, and affect elections - there are probably other, fatter targets to worry about.

Such as television networks.
