As Facebook deals with global criticism for its platform’s role in promoting violence, the company says it’s taking new steps to slow the spread of hate. In a blog post published yesterday, the company noted action it’s recently taken in two countries: Sri Lanka and Myanmar, both of which have been hit by social media-fueled conflict.
In Sri Lanka, where posts on Facebook have fueled anti-Muslim violence, the company says it’s limiting how many times users can forward messages — a change it previously made to WhatsApp amid similar concerns. The forwarding limit in the country is reportedly set at five threads at a time.
The company said it is also focusing on “borderline content,” which may be sensationalist but does not directly violate the platform’s rules. In Myanmar, where the minority Rohingya population is being persecuted, the company said it would reduce the distribution of all content from people who show “a pattern” of violating Facebook’s community standards.
Meanwhile, the company says it will continue to ban people who directly promote violence. “Reducing distribution of content is, however, another lever we can pull to combat the spread of hateful content and activity,” the company said in its post.
Facebook’s handling of violence in countries like Myanmar has been widely scrutinized, with advocates saying its efforts have been “nowhere near enough.” After an independent assessment of its role in Myanmar was released last year, Facebook concluded it “can and should do more.”