Facebook unveiled a new system today that it hopes will cut down on revenge porn shared on its services.
The photo-matching system will flag previously reported images
In a blog post, the company said it will begin using a photo-matching system to prevent intimate images posted without permission from being reshared on Facebook, Messenger, or Instagram. Once an image has been reported and taken down by Facebook, the system will flag any subsequent attempt to share it and alert the person trying. (Facebook also says that “in most cases” it disables accounts that share intimate images without permission.)
“If someone tries to share the image after it’s been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” the company said in the post. Facebook writes that it partnered with online safety organizations to build out the system.
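Facebook hasn’t detailed how the matching works under the hood, but systems like this typically rely on perceptual hashing, which fingerprints an image so that re-encoded or lightly edited copies still match. Here’s a minimal sketch in Python of how that kind of matching could work, assuming a simple average-hash scheme; the file names, hash store, and threshold are all hypothetical illustrations, not Facebook’s actual implementation:

```python
from PIL import Image


def average_hash(path, size=8):
    """Compute a simple 64-bit perceptual hash: downscale to 8x8,
    convert to grayscale, then set each bit based on whether the
    pixel is brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a, b):
    """Count differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical store of hashes from images that were previously
# reported and removed.
reported_hashes = {average_hash("reported_image.jpg")}


def violates_policy(upload_path, threshold=5):
    """Flag an upload if its hash is within `threshold` bits of any
    reported image, tolerating recompression and minor edits."""
    h = average_hash(upload_path)
    return any(hamming_distance(h, r) <= threshold for r in reported_hashes)
```

Because similar images produce hashes that differ by only a few bits, matching against a small bit-distance threshold catches copies that an exact byte-for-byte comparison would miss.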
The company has offered ways to remove revenge porn images from its platforms before, but the problem has persisted. In one recent high-profile incident, Marines in a secret Facebook group used the service to spread nude images of servicewomen.
“We look forward to building on these tools and working with other companies to explore how they could be used across the industry,” the company writes.