Facebook clarified its real name policy on Sunday night, offering something of a response to complaints that the social network discriminated against users who chose not to use their legal name online, or had names that Facebook saw as unacceptable. In a post that linked to the updated community standards, the company also specified how it selected content for removal from the social network, and explained how it responded to government requests for information.
The social network said that users did not have to sign up with their legal name, and could indeed use what Facebook terms their "authentic identity" — the name they choose to go by. Monika Bickert, Facebook's head of global product policy, told Recode that there had been "a lot of confusion from people who thought we were asking them to use what's on their driver's license." Bickert said that wasn't an accurate interpretation. "We want people communicating using the name they actually use in real life," she said.
Users can use their "authentic identities"
Facebook has come under fire for cracking down on users who choose not to use their birth or legal names online. Last month, radio DJ Jay Smooth was locked out of his account for using his business name on his profile rather than his birth name. Smooth, who complained that it was the only name he'd used publicly for 20-plus years, was quickly reinstated after a Twitter outcry, but other users without a vocal fanbase have suffered under the social network's apparently arbitrary rulings. A number of drag performers had their accounts suspended for using their chosen names, while others were booted out of their profiles even though they did use the names given to them at birth — Native American user Shane Creepingbear was suspended from the service for not having a "real" enough name.
In addition to its real name policy, Facebook has explained what it classes as hate speech, adding wording that specifically allows for satire, humor, and social commentary related to sensitive topics. The new guidelines also let users include or quote hate speech that would normally be removed, for the purpose of raising awareness of a given topic.
Facebook says it hasn't added any new rules
Facebook says it hasn't added any new elements to its community guidelines — the company says this update simply offers more detailed wording to explain the existing policies — but some areas have been expanded upon. A new section in the community standards is devoted to sexual violence and exploitation. Any "sexual content involving minors, threats to share intimate images and offers of sexual services" falls under the guidelines, along with photos or video depicting incidents of sexual violence. When content is removed, Facebook now offers more explanation as to why, with the guidelines stating that items aren't more likely to be removed if they receive more flags. Another new section details what happens when a user dies. The policy appears to be the same as before: if proof of death is offered, an account can be memorialized, but it will only be deleted if a verified family member requests it.
The social network was criticized earlier this month after it censored l'Origine du Monde, a 19th-century French painting of a vagina. Facebook's updated standards say that it will allow artistic nudity, including "photographs of paintings, sculptures and other art that depicts nude figures," but that because of its global audience, it has to be quick to respond to reports of nakedness. Bickert says that having to write universal rules for hundreds of millions of Facebook users means the guidelines can be "more blunt than [Facebook] would like them to be." According to Recode, the process of building those universal rules takes a long time. This latest update has reportedly taken a year to develop, write, and fine-tune.