
Facebook is expanding its unconventional approach to combating revenge porn

The social network is asking victims to upload nude photos to help keep them from circulating

Illustration by Alex Castro / The Verge

Facebook made headlines last fall when news surfaced of a somewhat counterintuitive approach to combating revenge porn that involved asking users to upload nude photos to the company’s servers. Now, Facebook’s official safety division says it’s expanding that initial pilot program to the US, UK, and Canada. Previously, the system had been deployed only in Australia, in partnership with the Australian government’s Office of the eSafety Commissioner.

In a public post on the official Facebook Safety page, Global Head of Safety Antigone Davis says the company is “partnering with safety organizations on a way for people to securely submit photos they fear will be shared without their consent, so we can block them from being uploaded to Facebook, Instagram and Messenger.” Essentially, Facebook uses a technical approach known as hashing to tag sexually explicit photos that a user fears may be circulating without their consent. A hash is a numerical representation of a file that Facebook says cannot be read by humans, and the actual viewable image is then removed from the company’s servers once the hash is logged.
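Facebook hasn’t published the exact algorithm behind these fingerprints, and photo-matching systems in production generally rely on perceptual hashes that survive resizing and re-encoding rather than cryptographic ones. Still, a minimal sketch of the hash-and-discard idea, using SHA-256 and a hypothetical file name purely for illustration, looks like this:

```python
import hashlib

def fingerprint_image(path: str) -> str:
    # Reduce the raw bytes of the file to a fixed-length hex digest.
    # Illustrative only: Facebook has not disclosed its actual algorithm.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Only the fingerprint needs to be retained; the viewable image
# itself can then be deleted, as Davis describes.
stored_hashes = set()
stored_hashes.add(fingerprint_image("submitted_photo.jpg"))  # hypothetical file
```

The key property is that only the digest is kept: it identifies the file, but reversing it back into a viewable image is computationally infeasible.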

Facebook logs images with a numeric code and then deletes them from its servers

Facebook says this can be done when an image is already circulating, or when a user suspects a malicious third party, like a vindictive former partner or an online harasser, may at some point try to post the file to any Facebook-owned service or messaging product. The process involves contacting one of Facebook’s trusted partners, which include the Australian Office of the eSafety Commissioner, the Cyber Civil Rights Initiative and the National Network to End Domestic Violence in the US, the UK Revenge Porn Helpline, and YWCA Canada. The user then fills out a form, receives a one-time upload link, and uploads the images.

Facebook is careful to stress that a team of employees reviews the photos manually to confirm the content violates the company’s terms of service and meets the generally accepted definition of a “non-consensual intimate image,” understood colloquially as revenge porn and typically involving nudity of some variety. The hashing process then lets Facebook detect whenever the same file is uploaded at a later date. “Once we create these hashes, we notify the victim via email and delete the images from our servers — no later than seven days,” Davis explains. “We store the hashes so any time someone tries to upload an image with the same fingerprint, we can block it from appearing on Facebook, Instagram or Messenger.”
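Once the digests are logged, blocking a re-upload reduces to a set-membership check: hash the incoming file and reject it if the fingerprint is already on record. Here is a self-contained continuation of the earlier sketch, again with hypothetical file names and SHA-256 standing in for whatever matcher Facebook actually runs:

```python
import hashlib

def fingerprint_image(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def should_block(upload_path: str, stored_hashes: set) -> bool:
    # A match means the same file was previously reported and fingerprinted.
    return fingerprint_image(upload_path) in stored_hashes

# Hypothetical check at upload time against previously logged fingerprints.
reported = {fingerprint_image("submitted_photo.jpg")}
if should_block("incoming_upload.jpg", reported):
    print("Upload blocked: matches a reported image.")
```

Note that an exact cryptographic digest only catches bit-identical files, which is one reason real photo-matching systems favor perceptual hashes that tolerate cropping and recompression.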

What Davis’ post does not mention is anything having to do with Cambridge Analytica and the ongoing data privacy scandal that has shaken trust in Facebook as a custodian of users’ personal data. While there is no reason to believe Facebook would act so carelessly with data as sensitive as explicit photos of its users, there’s no telling how damaging the Cambridge Analytica scandal has been to the general perception of the company. The company’s stock price has largely recovered, and there does not appear to be any measurable lasting drop-off in users so far. Yet general sentiment toward Facebook, at least anecdotally, appears lower than it was last fall, which may make users less willing to trust the company with nude photos of themselves.

Davis says the approach is a work in progress, and that Facebook’s safety team has been traveling the world to listen to victims’ stories about the impact of revenge porn and all the shapes it can take online. “This is one step to help people who fear an intimate image will be shared without their consent,” Davis writes. “We look forward to learning from this pilot and further improving our tools for people in devastating situations like these.”