Myanmar activists say Facebook’s plans to stop violent speech are ‘nowhere near enough’

In a letter obtained by The New York Times, Mark Zuckerberg responded to a group of Myanmar activists who have criticized Facebook’s handling of material meant to incite violence in the country. But the groups continue to say Facebook isn’t doing enough.

Last week, Zuckerberg was interviewed by Vox’s Ezra Klein. During the interview, Zuckerberg was asked about violence in Myanmar, where investigators say Facebook drives hate speech against the minority Rohingya population. To illustrate Facebook’s work in the country, Zuckerberg described a situation where Facebook “systems” successfully flagged attempts to spread violence.

After the interview was published, Myanmar civil society groups released an open letter addressed to Zuckerberg, which said the groups themselves uncovered the messages. The letter further criticized the CEO’s characterization of Facebook’s work in Myanmar, saying the events showed the company had “an over-reliance on third parties, a lack of a proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions and a lack of transparency.”

Zuckerberg said the groups have an “important role”

Zuckerberg’s new response to the groups’ letter apologized for the slight and highlighted their “important role” in the country. He said Facebook was adding dozens of Burmese-language reviewers, and that the company was bringing on more people dedicated to Myanmar issues.

But the groups, in another response, immediately criticized Zuckerberg’s letter, saying it “doesn’t change our core belief that your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the U.S. or Europe.”

“When things go wrong in Myanmar, the consequences can be really serious — potentially disastrous,” the groups write. “You have yourself publicly acknowledged the risk of the platform being abused towards real harm.”

Here is a copy of Mark Zuckerberg’s email response:

Dear Htaike Htaike, Jes, Victoire, Phyu Phyu and Thant,

I wanted to personally respond to your open letter. Thank you for writing it and I apologize for not being sufficiently clear about the important role that your organizations play in helping us understand and respond to Myanmar-related issues, including the September incident you referred to.

In making my remarks, my intention was to highlight how we’re building artificial intelligence to help us better identify abusive, hateful or false content even before it is flagged by our community.

These improvements in technology and tools are the kinds of solutions that your organizations have called on us to implement and we are committed to doing even more. For example, we are rolling out improvements to our reporting mechanism in Messenger to make it easier to find and simpler for people to report conversations.

In addition to improving our technology and tools, we have added dozens more Burmese language reviewers to handle reports from users across all our services. We have also increased the number of people across the company working on Myanmar-related issues and we now have a special product team working to better understand the specific local challenges and build the right tools to help keep people there safe.

There are several other improvements we have made or are making, and I have directed my teams to ensure we are doing all we can to get your feedback and keep you informed.

We are grateful for your support as we map out our ongoing work in Myanmar, and we are committed to working with you to find more ways to be responsive to these important issues.

Mark

And here is the Myanmar groups’ response:

Dear Mark,

Thank you for responding to our letter from your personal email account. It means a lot.

We also appreciate your reiteration of the steps Facebook has taken and intends to take to improve your performance in Myanmar.

This doesn’t change our core belief that your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the U.S. or Europe.

When things go wrong in Myanmar, the consequences can be really serious — potentially disastrous. You have yourself publicly acknowledged the risk of the platform being abused towards real harm.

Like many discussions we have had with your policy team previously, your email focuses on inputs. We care about performance, progress and positive outcomes.

In the spirit of transparency, we would greatly appreciate if you could provide us with the following indicators, starting with the month of March 2018:

■ How many reports of abuse have you received?

■ What % of reported abuses did your team ultimately remove due to violations of the community standards?

■ How many accounts were behind flagging the reports received?

■ What was the average time it took for your review team to provide a final response to users of the reports they have raised? What % of the reports received took more than 48 hours to receive a review?

■ Do you have a target for review times? Data from our own monitoring suggests that you might have an internal standard for review — with most reported posts being reviewed shortly after the 48 hrs mark. Is this accurate?

■ How many fake accounts did you identify and remove?

■ How many accounts did you subject to a temporary ban? How many did you ban from the platform?

Improved performance comes with investments and we would also like to ask for more clarifications around these. Most importantly, we would like to know:

■ How many Myanmar speaking reviewers did you have, in total, as of March 2018? How many do you expect to have by the end of the year? We are specifically interested in reviewers working on the Facebook service and looking for full-time equivalents figure.

■ What mechanisms do you have in place for stopping repeat offenders in Myanmar? We know for a fact that fake accounts remain a key issue and that individuals who were found to violate the community standards on a number of occasions continue to have a presence on the platform.

■ What steps have you taken to date to address the duplicate posts issue we raised in the briefing we provided your team in December 2017?

We’re enclosing our December briefing for your reference, as it further elaborates on the challenges we have been trying to work through with Facebook.