
Nude is a next-generation photo vault that uses AI to hide your sensitive photos

Making the camera roll safe for sharing

Nudes are an inconvenient truth of the mobile era. The combination of ever-more-powerful cameras and ever-more-convenient sharing mechanisms has made the exchange of explicit pictures a fact of life for nearly everyone seeking romantic connections online. Yet when it comes to managing explicit photographs, technology generally has not been our friend. Mobile camera rolls seem not to take the existence of nudes into account, as anyone who has ever stumbled across an odd penis while scrolling through a friend’s device can tell you. And as we saw during the 2014 Celebgate hack, photos stored online using services like iCloud can be vulnerable to breaches.

In the absence of attention from the makers of iOS and Android, entrepreneurs are rushing to fill the void. Private photo vault apps have existed for years. Nude, a new app from two 21-year-old entrepreneurs from UC Berkeley, attempts to create the most sophisticated one yet. Its key innovation is using machine learning libraries stored on the phone to scan your camera roll for nudes automatically and remove them to a private vault. The app is now available on iOS, and I spent the past week testing it.

camera rolls seem not to take the existence of nudes into account

Jessica Chiu and Y.C. Chen, who built the app together with a small team, said they received constant inquiries when promoting the app at the recent TechCrunch Disrupt conference. “Everyone said, ‘Oh I don’t have nudes — but can you tell me more?’” Chiu said. “Everyone’s like, ‘Oh man, I need this.’”

Chiu says she became interested in nudes-related business models after speaking with Hollywood actresses as part of a movie project she’s working on. Each had sensitive images on their phones or laptops, she said, and expressed doubts about how to keep them secure. When Chiu returned to Berkeley, friends would pass her their phones to look at recent photos they had taken, and she would inevitably swipe too far and see nudity.

She teamed up with Chen, whom she had met at an entrepreneurship program, and an Armenian developer named Edgar Khanzadian. Together they built Nude, which uses machine learning to scan your camera roll for nudes automatically. (This only works for photos in the first release, so you’ll need to manually import any sensitive amateur films that may be on your camera roll.)

When Nude finds what it believes to be nude photos, it moves them to a private, PIN-protected vault inside the app. (Chiu said Nude would monitor your camera roll in the background; in my experience, it’s more reliable to simply open Nude, which triggers a scan.) After showing you a confirmation dialog, the app deletes any sensitive files that it finds — both from the camera roll and from iCloud, if the photos are stored there as well. Nude even uses the device’s front-facing camera to take a picture of anyone who tries to guess your in-app PIN and fails.
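The scan-confirm-move flow described above can be sketched roughly as follows. This is a purely illustrative Python sketch, not Nude’s actual code: the `looks_nude` stand-in, the file layout, and the confirmation callback are all assumptions made for the example.

```python
from pathlib import Path
import shutil

def looks_nude(photo: Path) -> bool:
    """Stand-in for the on-device classifier. A real app would run a
    trained ML model here; this hypothetical rule just checks the name."""
    return "nude" in photo.stem.lower()

def scan_and_vault(camera_roll: Path, vault: Path, confirm) -> list:
    """Scan the camera roll, and for each flagged photo the user confirms,
    move it into the private vault (removing the original)."""
    vault.mkdir(parents=True, exist_ok=True)
    moved = []
    for photo in sorted(camera_roll.iterdir()):
        if photo.is_file() and looks_nude(photo):
            if confirm(photo):  # stand-in for the app's confirmation dialog
                destination = vault / photo.name
                shutil.move(str(photo), destination)
                moved.append(destination)
    return moved
```

The key property is that the delete only happens after confirmation — flagged photos the user declines to vault stay put.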

Crucially, the images on your device are never sent to Nude itself. This is possible thanks to CoreML, the machine learning framework Apple introduced with iOS 11. (TensorFlow performs a similar function on Android devices; an Android version of Nude is in the works.) These libraries allow developers to run machine-learning tasks such as image recognition on the device itself, without transmitting the image to a server. That limits the opportunity for would-be hackers to get access to any sensitive photos. (For devices running iOS 10 and below, Nude uses Facebook’s Caffe2, but still manages to do the analysis locally on the phone.)
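On-device inference simply means the model’s math runs over the pixels in local memory; the only thing that ever leaves the computation is a score, never the image. As a toy stand-in for the real neural network (a crude hand-written heuristic of my own, nothing like Nude’s actual model), here is a skin-tone score computed entirely locally:

```python
def skin_like(r: int, g: int, b: int) -> bool:
    """Very rough skin-tone test in RGB space (illustrative only;
    real detectors use trained neural networks, not hand rules)."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and r - min(g, b) > 15)

def local_nudity_score(pixels: list) -> float:
    """Fraction of skin-like pixels in a list of (r, g, b) tuples.
    Runs entirely on-device: the output is a single number,
    never the image data itself."""
    if not pixels:
        return 0.0
    hits = sum(1 for (r, g, b) in pixels if skin_like(r, g, b))
    return hits / len(pixels)
```

An app built this way would compare the score against a threshold and act on the result, with no network call anywhere in the pipeline.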

“If you have man boobs, those will be imported.”

Chiu and Chen attempted to use existing, open-source data sets to detect nudes. But they found that the results were often inaccurate, especially for people of color. And so they built software to scrape sites like PornHub for representative images, eventually amassing a collection of 30 million images. The algorithm still isn’t perfect, the founders say. (“If you have man boobs, those will be imported,” Chen says.) But the service will improve over time, he says.

Of course, you can use Nude to store more than nudes: the founders say it’s a good place to put photos of your passport, driver’s license, and other sensitive documents. But it’s aimed at naked photos — the marketing tagline bills it as “the sexiest app ever” — and of all the photo vault apps, it may be the most direct in its pitch. The app also has the makings of a sustainable business model: it will charge users a dollar a month for the service.

Of course, the big platforms could go after this market themselves, if they wanted to. But then they might have to acknowledge the rampant trading of nudes — something that, so far, they have been loath to do. And Chiu and Chen couldn’t be more grateful. “Under the surface,” Chen says, “we’re all human beings.” And human beings in 2017 are sending lots of naked photos.