Facebook Makes It Clear Users Are Playthings

The Back Page

Facebook's management has a message for users: you are our playthings and we'll manipulate you in any way that we see fit. That message was made clear when researchers released the results of a "massive experiment on Facebook" that manipulated the newsfeeds of hundreds of thousands of unwitting, involuntary participants.

The study's intent was to measure the emotional state of users exposed to a preponderance of positive or negative posts. Shocker: there's an effect. People whose feeds were skewed negative tended to make more negative posts themselves, and the converse was true, too. The control group was unaffected, and no one was asked for permission to be experimented on.

According to the researchers, "We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."

Those researchers published a handy chart to show the results of their morally bankrupt effort:

[Chart: "Facebook's Disgusting Experiment." Caption: "A Chart Demonstrating Facebook's Complete Lack of Ethics"]

The results are as unimportant as they are unsurprising. What's at issue is Facebook's belief that it's OK to manipulate people; it's OK to skew what people see when they want to check out what their friends are up to; it's OK to push people into artificially positive and negative mental states, without their permission or even knowledge, to benefit Facebook's corporate agenda.

It's this last thing that is the most worrisome aspect of this disgusting affair. Hundreds of thousands of people were pushed into negative and positive states of mind to gratify Facebook's curiosity. For some of those people, there were consequences.

How many arguments got started because of this "massive experiment"? How many car accidents took place because someone was distracted or feeling aggressive after reading a deluge of negative posts on Facebook, unaware that their worldview was being skewed? How many people proposed or quit their jobs before they were ready because they were artificially buoyed by a stream of positive sentiment?

These are just some of the ways that this experiment might have affected users.

Facebook researcher Adam D.I. Kramer claims that wasn't the researchers' intent. He told The Atlantic, "And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. […] In hindsight, the research benefits of the paper may not have justified all of this anxiety."

You think, Mr. Smartypants? Also, bite me.

Erin Kissane summed up the opinions of many people on Twitter with this little gem:

[embedded tweet]

Speaking of awful, Facebook's initial reaction was to point out that it has permission to do this sort of thing because, apparently, we signed away that right when agreeing to the terms and conditions. That agreement relinquishes our data for the purposes of "data analysis, testing, research."

But the difference here is that Facebook was manipulating us to get the data it needed. That's a far cry from merely anonymizing our data and mining it for information. The company was messing with people's lives, and it has me on a raging tilt.

I noted on TMO's Daily Observations Monday morning that with great power comes great responsibility. It's the comic book take on John Emerich Edward Dalberg-Acton's observation:

Power tends to corrupt, and absolute power corrupts absolutely. Great men are almost always bad men.

That last sentence often gets left out of the quote. It also doesn't apply to this situation: Facebook CEO Mark Zuckerberg is hardly a great man.

Either way, these are concepts that I believe in. Facebook has always mistaken what it can do for what it should do, and I'm getting sick of it.

The kicker, though, is that all of this was preventable. Facebook could have looked for volunteers to participate in an experiment. The psychology research world has spent decades developing ways to recruit volunteers without compromising the results. Facebook could have used those methods here and saved itself tons of negative sentiment.

Oh, the irony.

Comments

Arnold Ziffel

Facebook continues to give its users reason upon reason to ditch it.

Listen, people!

Lee Dronick

This is a serious question: what social media sites can replace Facebook? What else is out there? I would switch my focus to another one if my Facebook friends would come along.

geoduck

The biggest question is why. Why do this at all? Government leaders and corporate marketing types have manipulation of public opinion via the skewing of news stories down to a science. They've been doing it for decades.

There was no reason to repeat what is well known and established.

geoduck

Lee:
I do most of my stuff on Tumblr.

zewazir

I’m not one for frivolous lawsuits, but this is a situation where the offender needs to be, literally, sued into oblivion. Class action in the $100 billion+ range.

Lee Dronick

Thanks Geoduck, I will look into getting a Tumblr account.

A reporter on the evening TV news just said that this study upset many Facebook users. I would disagree with that. A few days ago I posted a link to the story and only 2 out of 90 or so friends "liked" it, and none commented. Also I have not seen any of my friends post links to other news stories about it. Now of course not all of my friends are following me, the feeling is mutual, and Facebook may have suppressed my post. Anyway, it is business as usual on Facebook with "shares" of cute photos, surveys, urban legends, guilt trips, check-ins to Floyd's Muffler Shop, and other fluff.

Arnold Ziffel

People, remember there was life before Facebook, and it was good.

wab95

Bryan:

I have already posted a somewhat detailed comment on this immediately after listening to you and your team's podcast of 20140630, providing the perspective of someone experienced with the conduct of clinical trials, as well as related interventional, observational, and other studies, so I will not repeat those points here. Permit me to make another one.

FB’s assertion that, by signing a contract giving them permission to dredge, mine, analyse, spindle and otherwise do with our personal data what they will in perpetuity does not, and cannot, grant them the right to prospectively conduct an intervention, in which they effectively provide a controlled exposure, to their clients. This is a fundamentally, different issue, and although we can (and should) argue the legality and morality of their claiming perpetual and absolute ownership of users’ data, no ethical review committee, which should be involved as the vetting agency for any study on human beings, would - it has been my experience - agree that passively acquired observational data is synonymous with, or grants licence to acquire, data that arises from an intervention to test a specific question (also known as an hypothesis).

By their own admission, FB were testing a question. This is precisely why an ethical review board of some form or fashion, independent of the institution sponsoring the study, should be involved to provide final approval, and to ensure that the study is conducted according to industry-standard ethical conduct: namely, informing people that an experiment or study is being done, and giving them exclusive control over whether or not they wish to participate. Failing to do this, irrespective of the degree of invasiveness of the study or the risk of harm, let alone the capacity to intervene and treat in the event of harm (nowhere in evidence here), is, by industry definition, unethical. There is simply no excusing or explaining this away.

Even if, as Lee suggests, most FB users are not perturbed by this, that does not change the nature of what happened. One can only hope that lawmakers and others charged with ensuring public safety will get wind of this and take action, both to penalise FB for what they have done and to reduce the likelihood that this will occur in the future.

Lee Dronick

"One can only hope that lawmakers and others charged with ensuring public safety will get wind of this and take action, both to penalise FB for what they have done and to reduce the likelihood that this will occur in the future."

The United Kingdom is looking into it: http://www.bbc.com/news/technology-28102550
