Facebook conducted hundreds of psychological experiments with few boundaries: WSJ


Data Science team accused users of being robots

Facebook has come under fire this week for a controversial experiment it performed on 689,000 of its users in 2012. The experiment, in which the company promoted either positive or negative posts in a percentage of its users' news feeds, was orchestrated by its data science team. The team, founded in 2007, is made up of around 30 researchers with doctorates, scientists, and ex-academics who, The Wall Street Journal reports, have been able to conduct hundreds of tests on Facebook's 1.3 billion users with few boundaries or limitations.

The Wall Street Journal describes one such test, in which thousands of Facebook users received a message from the company telling them they would be blocked from the social network because Facebook believed they were either robots or using fake names. In reality, Facebook believed no such thing: the message was deliberately sent to real users in a bid to bolster the company's anti-fraud capabilities.

Tests were run on users without a formal review process

Many of the data science group's tests have provided the basis for published studies. These include papers detailing how families communicate, examining the causes of loneliness, and a 2010 study questioning whether "political mobilization messages" sent to 61 million people influenced how they voted in congressional elections.

Experiments such as these were reportedly not subjected to a formal review before they began. Speaking to The Wall Street Journal, Andrew Ledvina, who worked as a data scientist at Facebook between 2012 and 2013, said "anyone on that team could run a test" without following the kind of stringent review process academic experiments are subject to. Tests were so commonplace, Ledvina said, that scientists worried users would be enrolled in too many experiments at once, yielding inaccurate results.

Facebook now has stricter guidelines

Facebook's controversial 2012 News Feed experiment has prompted ethical and privacy questions from both its users and data watchdogs. The company has defended itself, saying it has adopted stricter guidelines since the study: it updated its terms of service to specifically state that user data may be used for research, and it now subjects its research to internal review by a panel of 50 experts.

The company says it's also considering additional changes to avoid "upsetting" its 1.3 billion-strong user base again, but experts say the practice of testing on unwitting users is commonplace. Kate Crawford, MIT professor and Microsoft researcher, says companies "really do see users as a willing experimental test bed," while Ledvina says such research is designed to manipulate: "They're always trying to alter people's behavior."