Facebook admits it screwed up on Myanmar — but it refuses to take all the blame

Consistent policies, a moderation team, and Unicode

Facebook has released the conclusions of an independent assessment of its role in the recent genocidal violence in Myanmar. In short, the company admits it previously wasn’t doing enough to prevent its network from “being used to foment division and incite offline violence,” but it argues it has already begun making the changes needed to stop that from happening again. However, while the report shows the company has become more transparent about its moderation, it stops short of any firm commitment to future audits like this one, a key demand from activists.

Facebook’s handling of the Myanmar crisis has been criticized by everyone from activists to the United Nations. Back in May, a coalition of activists from Myanmar, Syria, and six other countries made three specific demands of the social network: sustained transparency, an independent and worldwide public audit, and a public commitment to equal enforcement of standards in every territory where Facebook operates.

99 native speakers and 64,000 pieces of content removed so far

Compared to these demands, Facebook’s report is a mixed bag. It was conducted by Business for Social Responsibility (BSR), an independent nonprofit based in San Francisco, so it certainly qualifies as independent, but it stops short of the worldwide audit the coalition called for. And although Facebook says it agrees on the value of transparently publishing enforcement data, pointing to a recent example covering its Myanmar moderation (it also posted a similar report about Iran), it makes no specific commitments about how regularly it will publish such reports in the future.

The coalition’s final demand, that Facebook enforce its standards equally worldwide, is much harder to evaluate. Every country is unique, and identical enforcement everywhere risks missing crucial context. Facebook notes, for example, that because of Myanmar’s long isolation from the outside world, it is one of the largest online communities that hasn’t standardized on Unicode for its text. Instead, much of the country uses Zawgyi, a font whose nonstandard encoding of the Myanmar script Facebook says makes it much harder to detect offending posts automatically. Facebook wants Myanmar to transition to Unicode, and it says it has removed Zawgyi as an option for new users.
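The encoding problem is worth unpacking: Zawgyi isn’t a separate character set but a font that reuses code points from the Unicode Myanmar block for different glyphs, so the same bytes can mean different text depending on which encoding the author used, and keyword- or classifier-based moderation built for Unicode text can miss Zawgyi posts entirely. One open-source approach is Google’s myanmar-tools library, which ships a statistical Zawgyi detector. Below is a minimal sketch using its Java API; the 0.95 cutoff and the moderation workflow implied by the comments are illustrative assumptions, not anything Facebook has described:

```java
import com.google.myanmartools.ZawgyiDetector;

public class ZawgyiCheck {
    public static void main(String[] args) {
        // The detector compares how likely the input is under Zawgyi
        // versus standard Unicode usage of the Myanmar code block.
        ZawgyiDetector detector = new ZawgyiDetector();

        // "Myanmar" written with standard Unicode code points.
        String text = "\u1019\u103C\u1014\u103A\u1019\u102C";

        // Returns a probability in [0, 1], or negative infinity if the
        // string contains no Myanmar-range characters at all.
        double score = detector.getZawgyiProbability(text);

        // 0.95 is an arbitrary cutoff chosen for illustration; a real
        // pipeline would tune it on labeled samples and convert flagged
        // text to Unicode before running any matching or classification.
        if (score > 0.95) {
            System.out.println("Likely Zawgyi-encoded (score " + score + ")");
        } else {
            System.out.println("Likely standard Unicode (score " + score + ")");
        }
    }
}
```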

Facebook has also created a team dedicated to Myanmar-specific issues on the platform, including 99 native Myanmar-language speakers. The company says it has already taken action on around 64,000 pieces of content from the country for violating its hate speech policies, and that its systems proactively identified 63 percent of those posts before anyone reported them. Similar claims about automatic flagging have been disputed before: Myanmar civil society groups say it was their own reports that surfaced messages Facebook’s systems took credit for identifying.

Globally, Facebook has expanded its credible violence policy to cover posts containing misinformation that could cause imminent violence or physical harm, and it’s “looking into” establishing a separate moderation policy for human rights abuses.

Every country’s problems are distinct, but this report suggests that Facebook has struggled to grasp the particular context of Myanmar’s recent violence. With more elections looming in the country in 2020, it’s critical that the platform dedicate enough attention to this previously isolated nation and its 20 million Facebook users.