
Bumble’s ‘private detector’ AI will automatically detect and blur lewd images

Lewd photos shared in chats will come with a warning

Image: Bumble

Bumble is launching a “private detector” feature that uses AI to automatically detect lewd images and warn users before they open them. Recipients can then decide whether to view the photo, block the sender, or report the image to moderators. The feature is part of a new safety initiative, and starting in June it will also come to Badoo, Chappy, and Lumen, the other apps owned by Bumble’s parent company.
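Bumble hasn’t published how the detector works under the hood, but the flow it describes (score an image, warn if it’s flagged, then let the recipient choose) can be sketched in a few lines. Everything below is an assumption for illustration: the `nsfw_score` stub, the 0.5 threshold, and the helper names are hypothetical, not Bumble’s actual API.

```python
from dataclasses import dataclass
from enum import Enum, auto

LEWD_THRESHOLD = 0.5  # hypothetical cutoff; Bumble doesn't publish its decision rule

class UserAction(Enum):
    VIEW = auto()
    BLOCK = auto()
    REPORT = auto()

@dataclass
class ChatImage:
    sender: str
    payload: bytes  # raw image bytes

def nsfw_score(payload: bytes) -> float:
    """Stub for the AI classifier; a real system would run a trained
    image model here and return a confidence in [0, 1]."""
    return 0.9  # fixed placeholder so the example is deterministic

def deliver(image: ChatImage, choose) -> str:
    """Gate a chat image behind a warning when the classifier flags it."""
    if nsfw_score(image.payload) >= LEWD_THRESHOLD:
        # The recipient sees a warning over a blurred preview and decides.
        action = choose()
        if action is UserAction.BLOCK:
            return f"blocked {image.sender}"
        if action is UserAction.REPORT:
            return f"sent {image.sender}'s photo to moderators"
    return "photo shown"

# Example: the recipient opts to report a flagged image.
print(deliver(ChatImage("James, 23", b"\x00"), choose=lambda: UserAction.REPORT))
```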

As one of the few dating apps that allow photos to be sent in chat, Bumble already blurs all images by default. Recipients have to press and hold a photo to view it, and the revealed image carries a watermark of the sender’s profile picture. The idea is that tying photos to a sender’s profile should discourage unwanted lewd images. However, as users have experienced, there isn’t much to stop anyone from making a fake profile. For example, don’t be like “James, 23,” below.

Image: Bumble
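The blur-and-watermark mechanic itself is straightforward image manipulation. Here is a minimal sketch of that behavior using Pillow; the function names, blur radius, and stamp placement are illustrative assumptions, not Bumble’s implementation.

```python
from PIL import Image, ImageFilter

def blurred_preview(photo: Image.Image) -> Image.Image:
    """Every incoming chat photo is shown blurred by default."""
    return photo.filter(ImageFilter.GaussianBlur(radius=24))

def reveal_with_watermark(photo: Image.Image, profile_pic: Image.Image) -> Image.Image:
    """On press-and-hold, show the photo stamped with the sender's profile
    image, so the picture stays tied to the account that sent it."""
    revealed = photo.copy()
    thumb = profile_pic.copy()
    thumb.thumbnail((64, 64))  # shrink the profile shot to a corner stamp
    revealed.paste(thumb, (8, revealed.height - thumb.height - 8))
    return revealed
```

A fake profile defeats the watermark, of course, which is exactly the gap the new detector is meant to close.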

Now, lewd photo messages will at least come with a warning that the AI has detected potentially inappropriate content, which the company claims it does with 98 percent accuracy.

In addition to the new feature, Bumble CEO and co-founder Whitney Wolfe Herd has been working with Texas legislators to pass a bill that would make sharing unwanted nude images a crime punishable by a fine of up to $500. Republican Texas State Rep. Morgan Meyer drafted the bill on the reasoning that just as it’s illegal to expose yourself in public, it should be illegal to do the same online. “Something that is already a crime in the real world needs to be a crime online,” Meyer told NBC News.