
Meta has a ‘moral obligation’ to make its mental health research transparent, scientists say

An open letter asks the company to set up a scientific oversight trust

Illustration by Alex Castro / The Verge

In an open letter to Mark Zuckerberg published Monday, a group of academics called for Meta to be more transparent about its research into how Facebook, Instagram, and WhatsApp affect the mental health of children and adolescents. The letter calls for the company to allow independent reviews of its internal work, contribute data to external research projects, and set up an independent scientific oversight group.

“You and your organizations have an ethical and moral obligation to align your internal research on children and adolescents with established standards for evidence in mental health science,” the letter, signed by researchers from universities around the world, reads.

The open letter comes after leaks from Facebook revealed some data from the company’s internal research, which found that Instagram was linked with anxiety and body image issues for some teenage girls. The research released so far, though, is limited and relies on subjective information collected through interviews. While this approach can produce useful insights, it can’t prove that social media caused any of the mental health outcomes.

The information available so far suggests that the studies Facebook researchers conducted don’t meet the standards academic researchers use to conduct trials, the new open letter said. The information also isn’t complete, the authors noted: Meta hasn’t made its research methods or data public, so they can’t be scrutinized by independent experts. The authors called for the company to allow independent review of past and future research, including the release of research materials and data.

The letter also asked Meta to contribute its data to ongoing independent research efforts on the mental health of adolescents. It’s a longstanding frustration that big tech companies don’t release data, which makes it challenging for external researchers to scrutinize and understand their products. “It will be impossible to identify and promote mental health in the 21st century if we cannot study how young people are interacting online,” the authors said.

The company likely has the data on platform usage and other user behavior to show the ways its platforms do or do not affect the mental health of kids and teens, Kaveri Subrahmanyam, a developmental psychologist at California State University, Los Angeles, told The Verge this fall. “Why are they not releasing the data that they have that shows the clicks and other behavior? I think they should be inviting researchers who have that expertise, and giving them that data and letting them do that analysis,” she said.

The open letter also called on Meta to establish an independent scientific trust to evaluate any risks to mental health from the use of platforms like Facebook and Instagram and to help implement “truly evidence-based solutions for online risks on a world-wide scale.” The trust could be similar to the existing Facebook Oversight Board, which helps the company with content moderation decisions.

The internal research available from Meta at this point cannot conclusively say whether or how social media platforms affect mental health. But the leaked findings, coupled with other research on social media, raise enough concerns to warrant more rigorous investigation that could help clarify the relationship. Understanding how life online affects kids and teenagers is a critical question, the letter’s authors note. But the secrecy with which Meta has handled its research so far has “somewhat predictably” engendered skepticism from the research community, which prizes transparency, and left other stakeholders, like lawmakers and parents, concerned. Taking the steps the authors outline could help address those concerns and contribute to a more complete picture of mental health online.

“If the right scientific and ethical tools were in place, data collected by Meta could inform how we understand digital technology use and its influence on mental health in unprecedented ways,” the letter reads.