
After 18 years, Meta’s finally building Facebook a customer service division

Though it sounds like early days

It’s hard to tell what it’ll look like at this point.
Illustration by Alex Castro / The Verge

Meta is trying to make it easier for its users to get support when their accounts or posts are removed, according to a report from Bloomberg. The report quotes Brent Harris, Meta’s vice president of governance, who says the company is “spending a bunch of time on” customer service. Since Facebook’s inception, users have had little recourse or any real way to talk to the company about its moderation decisions.

At this stage, it’s not clear what Meta’s customer service division will actually do. Last year, the company piloted a live chat support program that gave some English-speaking users a way to talk to a human at Meta if they needed help using a new feature or got locked out of their accounts. At the time, the company said it was “the first time Facebook has offered live help for people locked out of their accounts.”

The company didn’t immediately respond to The Verge’s request for comment on whether its recent efforts were related to that “small test” or whether those capabilities have expanded in the months since they were introduced.

Providing support for all of Instagram, Facebook, WhatsApp, Horizon VR, and other properties, though, would be a massive undertaking for the company. According to Bloomberg, Meta’s focus on fixing its customer support experience is in part due to feedback from its Oversight Board. Last year, the “independent” body set up to monitor and overturn Meta’s decisions (independent is in quotes because it’s funded by the company) reported that it received almost a million user appeals about Meta’s content moderation.

Even as Meta looks to help users who accidentally find themselves on the wrong side of its automated moderation tools, it’s still working to remove people who are purposefully evading bans. According to a tweet from Meta’s counterterrorism policy lead, Dina Hussein, the company recently took down around 500 accounts, pages, groups, and events associated with the Proud Boys, a far-right group that was banned from the platform in 2018.