Facebook reportedly ignored its own research showing algorithms divided users

Facebook fears harming engagement or enraging conservatives

Illustration by Alex Castro / The Verge

An internal Facebook report presented to executives in 2018 found that the company was well aware that its product, specifically its recommendation engine, stoked divisiveness and polarization, according to a new report from The Wall Street Journal.

Yet despite warnings about the effect this could have on society, Facebook leadership ignored the findings and has largely tried to absolve itself of responsibility for the partisan divides and other forms of polarization it directly contributed to, the report states. The reason? Changes might hurt engagement and might disproportionately affect conservatives.

“Our algorithms exploit the human brain’s attraction to divisiveness,” one slide from the presentation read. The group found that if this core element of its recommendation engine were left unchecked, it would continue to serve Facebook users “more and more divisive content in an effort to gain user attention & increase time on the platform.” A separate internal report, crafted in 2016, said 64 percent of people who joined an extremist group on Facebook only did so because the company’s algorithm recommended it to them, the WSJ reports.

Facebook found that its algorithms were pushing people to join extremist organizations

Leading the effort to downplay these concerns and shift Facebook’s focus away from polarization has been Joel Kaplan, Facebook’s vice president of global public policy and former chief of staff under President George W. Bush. Kaplan is a controversial figure in part due to his staunch right-wing politics — he supported Supreme Court Justice Brett Kavanaugh throughout his nomination — and his apparent ability to sway CEO Mark Zuckerberg on important policy matters. Kaplan has taken on a larger role within Facebook since the 2016 election, and critics say his approach to policy and moderation is designed to appease conservatives and stave off accusations of bias.

Kaplan, for instance, is believed to be partly responsible for Facebook’s controversial political ad policy, in which the company said it would not regulate misinformation put forth in campaign ads by fact-checking them. He’s also influenced Facebook’s more hands-off approach to speech and moderation over the last few years by arguing the company doesn’t want to seem biased against conservatives.

The Wall Street Journal says Kaplan was instrumental in weakening or outright killing proposals to change the platform to promote social good and to reduce the influence of so-called “super-sharers,” users who tended to be aggressively partisan and, in some cases, so hyper-engaged that they might be paid to use Facebook or might be bots. Kaplan pushed back against many of the proposed changes, a number of which were crafted by News Feed integrity lead Carlos Gomez Uribe, for fear they would disproportionately affect right-wing pages, politicians, and other parts of the user base that drove up engagement.

One notable project Kaplan undermined was called Common Ground, which sought to promote politically neutral content on the platform that might bring people together around shared interests like hobbies. But the team building it said the project might require Facebook to take a “moral stance” in some cases by choosing not to promote certain types of polarizing content, and that the effort could harm overall engagement over time, the WSJ reports. The team has since been disbanded.

In a statement, a Facebook spokesperson tells The Verge, “We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve. Just this past February we announced $2M in funding to support independent research proposals on polarization.”

Update May 26th, 8:01PM ET: Added a statement from Facebook.