Facebook apologizes for manipulating news feeds in psychology experiment


"There are things we should have done differently."

In June, Facebook researchers announced the results of a 2011 study that manipulated the news feeds of nearly a million users to see how positive or negative posts affected their behavior. The experiment encompassed only a tiny fraction of Facebook's more than 1.3 billion users, but it drew intense backlash from users who hadn't been asked whether they wanted to take part in the study. Today, Facebook CTO Mike Schroepfer finally apologized in a blog post and outlined plans for more structured research in the future.

"It is clear now that there are things we should have done differently," writes Schroepfer. "For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people." Schroepfer then listed a few ways Facebook will undertake research going forward:

  • Guidelines: Future studies will go through an "enhanced review process" before research can begin — doubly so for topics considered "deeply personal."
  • Review: A panel of individuals at Facebook will include senior researchers, along with engineers, lawyers, and the company's privacy and policy teams.
  • Training: Facebook has added research education to its six-week training bootcamp for new engineers.
  • Research website: Facebook's academic research is now available in a single location.

Facebook can't take back what it did, but today's measures go a long way toward reforming the internal structures that allowed such an aggressive study to take place without the knowledge of Facebook's higher-ups. They should also help new engineers understand that Facebook users aren't just numbers on a chart. 1.3 billion users is a whole lot of people, but that doesn't mean you can experiment on them — even a tiny percentage of them — without being more transparent about exactly what you're doing.