Facebook Makes It Clear Users Are Playthings

Facebook's management has a message for users: you are our playthings and we'll manipulate you in any way that we see fit. That message was made clear when researchers released the results of a "massive experiment on Facebook" that manipulated the newsfeeds of hundreds of thousands of unwitting, involuntary participants.

The study's intent was to measure the emotional state of users when exposed to a preponderance of positive or negative posts. Shocker, there's an effect. People who saw only negative posts tended to make more negative posts themselves, while the converse was true, too. The control group was unaffected, and no one was asked for permission to be experimented on.

According to the researchers, "We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."

Those researchers published a handy chart to show the results of their morally bankrupt effort:

A Chart Demonstrating Facebook's Complete Lack of Ethics

The results are as unimportant as they are unsurprising. What's at issue is Facebook's belief that it's OK to manipulate people; it's OK to skew what people see when they want to check out what their friends are up to; it's OK to push people into artificially positive and negative mental states without their permission or even knowledge to benefit Facebook's corporate agenda.

It's this last thing that is the most worrisome aspect of this disgusting affair. Hundreds of thousands of people were sent into negative and positive states of mind to gratify Facebook's curiosity. For some of those people, there were consequences.

How many arguments got started because of this "massive experiment?" How many car accidents took place because someone was distracted or feeling aggressive after reading a deluge of negative posts on Facebook, unaware that their world view was being skewed? How many people proposed or quit their jobs when they weren't ready to do so because they were artificially buoyed by a stream of positive sentiments?

These are just some of the ways that this experiment might have affected users.

Facebook researcher Adam D.I. Kramer claims that wasn't the researchers' intent. He told The Atlantic, "And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. […] In hindsight, the research benefits of the paper may not have justified all of this anxiety."

You think, Mr. Smartypants? Also, bite me.

Erin Kissane summed up the opinions of many people on Twitter with this little gem:

Speaking of awful, Facebook's initial reaction was to point out that it has permission to do this sort of thing because apparently we signed away that right when agreeing to the terms and conditions. That agreement hands over our data for "data analysis, testing, research."

But the difference here is that Facebook was manipulating us to get the data it needed. That's a far cry from merely anonymizing our data and mining it for information. The company was messing with people's lives, and it has me on a raging tilt.

I noted on TMO's Daily Observations Monday morning that with great power comes great responsibility. It's the comic book take on John Emerich Edward Dalberg-Acton's observation that:

Power tends to corrupt, and absolute power corrupts absolutely. Great men are almost always bad men.

That last sentence often gets left out of that quote. It also doesn't apply to this situation—Facebook CEO Mark Zuckerberg is hardly a great man.

Either way, these are concepts that I believe in. Facebook has always mistaken what it can do for what it should do, and I'm getting sick of it.

The best thing, though, is that it was preventable. Facebook could have looked for volunteers to participate in an experiment. The psychology research world has spent decades developing ways to recruit volunteers without skewing the results. Facebook could have used those methods here and saved itself tons of negative sentiment.

Oh, the irony.