Facebook’s Human Research Program

by Jim Gearhart

If you read English and are a Facebook user, you might have been part of a research study on human behavior. In January 2012, Facebook picked out 689,003 of its users and began to manipulate the emotional content of the posts they received. Anyone who read Facebook in English was considered eligible for the study.

For a week, Facebook used its algorithms to manipulate some of our News Feeds to see whether different mixes of emotions affected our moods. Take away posts with happy words, and maybe we will respond online with less cheer. Block posts with sad words, and see whether our subsequent posts seem more positive. After collecting data from four different groups of users, a Facebook researcher teamed up with academics from Cornell to analyze the results, which they published in PNAS (the Proceedings of the National Academy of Sciences of the United States of America) on June 2.

Do you remember giving Facebook permission to test your emotions? I don’t, but the published report says we all said it was okay when we registered and accepted Facebook’s Data Use Policy. This morning I read the Data Use Policy (ahem; re-read, because I am a responsible Internet user who peruses the Terms of Use for every web page I access. Right after I finish my three servings of vegetables a day. And floss.) I did not see any clear statement that Facebook will pick some of us at random, manipulate what we’re seeing, test our reactions, and publish the results in an academic journal.

Did we agree to participate in Facebook’s experiment?

I can see bits and pieces there, places where Facebook could point and say: there, that is where we all agreed to be in the study. Here’s an excerpt from the current Data Use Policy, which Facebook says constituted our consent to the study (emphasis added):
How we use the information we receive

We use the information we receive about you in connection with the services and features we provide to you and other users like your friends, our partners, the advertisers that purchase ads on the site, and the developers that build the games, applications, and websites you use. For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you:

  • as part of our efforts to keep Facebook products, services and integrations safe and secure;
  • to protect Facebook’s or others’ rights or property;
  • to provide you with location features and services, like telling you and your friends when something is going on nearby;
  • to measure or understand the effectiveness of ads you and others see, including to deliver relevant ads to you;
  • to make suggestions to you and other users on Facebook, such as: suggesting that your friend use our contact importer because you found friends using it, suggesting that another user add you as a friend because the user imported the same email address as you did, or suggesting that your friend tag you in a picture they have uploaded with you in it; and
  • for internal operations, including troubleshooting, data analysis, testing, research and service improvement

Yes, the last bullet point says Facebook will use our information for research, but nowhere does it say the research will be published. It is supposed to be for internal operations.

But it turns out that this language means nothing for the experiment in question. Forbes and the Washington Post report that the references to research were only added in May of 2012, months after Facebook wrapped up its emotional reaction test. The Data Use Policy back in January of 2012 said nothing about conducting research on us. So, really, it seems Facebook told us nothing about it.

It took a few weeks for the criticism to start, but it’s been spreading fast.

The Atlantic, the Washington Post, the Wall Street Journal, Forbes, and the New York Times have all run articles. Cornell University has released statements noting its distance from the research. Among the many blog posts appearing, cancer researcher Dr. David Gorski over at Science-Based Medicine posted a detailed critique of the ethics, the design, and the results of the study (spoiler alert: he didn’t like any of them). Bloomberg BNA’s Medical Research Law and Policy Report (subscription required) will have an article later this month that includes comments from Quorum’s own Claire Carbary. Criticism has focused on whether we gave permission for Facebook’s research and on what responsibilities a company like Facebook has when its research involves people. It’s been a lively discussion.

It is important to note that Facebook’s Data Use Policy does address at least one significant ethical concern: privacy. Facebook says that it shares and uses only anonymous data. So, by the terms of the policy and a reading of the study results, Facebook did not share or examine any personal information from the 689,003 of us in the study.

What are the consequences of Facebook’s testing our emotional reactions?

What harm has Facebook done? Certainly, this was no Tuskegee study, in which a federally funded research program left hundreds of men ignorant of their syphilis infections. Nor was it the Milgram experiment, where a university professor made people think they were administering electric shocks to others as punishment.

Facebook conducted a study of online responses to positive and negative words, one that may not have presented more than a minimal risk to those who participated, even involuntarily. Even in the highly regulated pharmaceutical industry, such a benign level of risk does not always require consent from participants. But reactions to how Facebook handled this have been strong. One consequence looks to be a loss of trust in how Facebook handles our information. And distrust spreads easily, which could weaken already tepid interest in volunteering for the kind of research that needs people to succeed.

Where is Facebook’s objectivity?

Facebook said it approved this study through an ‘internal review process.’ But people who get experimented on may not put much faith in internal reviews. Where is the objectivity? The FDA and OHRP recognized this long ago, and they require that any ethical review of research include members who are independent of the sponsoring organization. But those requirements do not apply to Facebook. They apply to others: to researchers using federal funds, to universities and hospitals, to companies that seek FDA approval for new drugs and devices. Facebook may not have to follow the same rules, but it could look to them for some pointers. With a different process, Facebook might have heard the protests about manipulating our News Feeds before its experiment. Not after.
