Nude is a next-generation photo vault that uses AI to hide your sensitive photos

Nudes are an inconvenient truth of the mobile era. The combination of ever-more-powerful cameras and ever-more-convenient sharing mechanisms has made the exchange of explicit pictures a fact of life for nearly everyone seeking romantic connections online. Yet when it comes to managing explicit photographs, technology generally has not been our friend. Mobile camera rolls seem not to take the existence of nudes into account, as anyone who has ever stumbled across an odd penis while scrolling through a friend’s device can tell you. And as we saw during the 2014 Celebgate hack, photos stored online using services like iCloud can be vulnerable to breaches.

In the absence of attention from the makers of iOS and Android, entrepreneurs are rushing to fill the void. Private photo vault apps have existed for years. Nude, a new app from two 21-year-old entrepreneurs from UC Berkeley, attempts to create the most sophisticated one yet. Its key innovation is using machine learning libraries stored on the phone to scan your camera roll for nudes automatically and remove them to a private vault. The app is now available on iOS, and I spent the past week testing it.

Jessica Chiu and Y.C. Chen, who built the app together with a small team, said they received constant inquiries when promoting the app at the recent TechCrunch Disrupt conference. “Everyone said, ‘Oh I don’t have nudes — but can you tell me more?’” Chiu said. “Everyone’s like, ‘Oh man, I need this.’”

Chiu says she became interested in nudes-related business models after speaking with Hollywood actresses as part of a movie project she’s working on. Each had sensitive images on their phones or laptops, she said, and expressed doubts about how to keep them secure. When Chiu returned to Berkeley, friends would pass her their phones to look at recent photos they had taken, and she would inevitably swipe too far and see nudity.

She teamed up with Chen, whom she had met at an entrepreneurship program, and an Armenian developer named Edgar Khanzadian. Together they built Nude, which uses machine learning to scan your camera roll for nudes automatically. (This only works for photos in the first release, so you’ll need to manually import any sensitive amateur films that may be on your camera roll.)

When Nude finds what it believes to be nude photos, it moves them to a private, PIN-protected vault inside the app. (Chiu said Nude would monitor your camera roll in the background; in my experience, it’s more reliable to simply open Nude, which triggers a scan.) After sending you a confirmation dialog, the app deletes any sensitive files that it finds — both from the camera roll and from iCloud, if the photos are stored there as well. Nude even uses the device’s front-facing camera to take a picture of anyone who tries to guess your in-app PIN and fails.

Crucially, the images on your device are never sent to Nude itself. This is possible thanks to CoreML, the machine learning framework Apple introduced with iOS 11. (TensorFlow performs a similar function on Android devices; an Android version of Nude is in the works.) These frameworks allow developers to run machine-learning-intensive tasks such as image recognition on the device itself, without transmitting the image to a server. That limits the opportunity for would-be hackers to get access to any sensitive photos. (For devices running iOS 10 and below, Nude uses Facebook’s Caffe2, but still manages to do the analysis locally on the phone.)
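On-device classification with CoreML is typically driven through Apple’s Vision framework. The sketch below shows the general shape of such a pipeline, not Nude’s actual implementation: the model name `NudityClassifier`, the `"nude"` label, and the confidence threshold are all hypothetical placeholders.

```swift
import CoreML
import Vision

// A minimal sketch of on-device image classification with Vision + CoreML.
// "NudityClassifier" is a hypothetical .mlmodel bundled with the app.
func scanPhoto(_ image: CGImage, completion: @escaping (Bool) -> Void) {
    guard let model = try? VNCoreMLModel(for: NudityClassifier().model) else {
        completion(false)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Inference runs entirely on the device; nothing is uploaded.
        guard let top = (request.results as? [VNClassificationObservation])?.first else {
            completion(false)
            return
        }
        // Treat a high-confidence "nude" label as a hit (threshold is illustrative).
        completion(top.identifier == "nude" && top.confidence > 0.8)
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

A hit would then trigger the move into the PIN-protected vault and the confirmation dialog described above.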

Chiu and Chen attempted to use existing, open-source data sets to detect nudes. But they found that the results were often inaccurate, especially for people of color. And so they built software to scrape sites like PornHub for representative images, eventually amassing a collection of 30 million images. The algorithm still isn’t perfect, the founders say. (“If you have man boobs, those will be imported,” Chen says.) But the service will improve over time, he says.

Of course, you can use Nude to store more than nudes: the founders say it’s a good place to put photos of your passport, driver’s license, and other sensitive documents. But it’s aimed at naked photos — the marketing tagline bills it as “the sexiest app ever” — and of all the photo vault apps it may be the most direct in its pitch. The app also has the makings of a sustainable business model: it will charge users a dollar a month for the service.

Of course, the big platforms could go after this market themselves, if they wanted to. But then they might have to acknowledge the rampant trading of nudes — something that, so far, they have been loath to do. And Chiu and Chen couldn’t be more grateful. “Under the surface,” Chen says, “we’re all human beings.” And human beings in 2017 are sending lots of naked photos.


Oh man, an app which scans and automatically identifies only nudes from your library?

Absolutely not.

Crucially, the images on your device are never sent to Nude itself. This is possible thanks to CoreML, the machine learning framework Apple introduced with iOS 11.

Sure, Jan.

Inb4 Nude is breached and everyone’s naughty pics are leaked.

Well, they seem to say that it’s locally stored, which would theoretically prevent this worst case scenario.

But having an app fully dedicated to storing nudes does paint an easy target on its back. If it comes to Android in particular (where the filesystem is easier to access if you know what you’re doing), I suspect it wouldn’t take long for a counter-app to appear that gives people direct access to the nudes stored on a device, as long as they have access to said device (potentially even when it’s locked).

I don’t think you can access app files and data in any way unless you enable adb and have an unlocked device. And users who do have adb enabled most likely don’t need an app like this; file encryption is not difficult.

A photos app that hides naughty photos but is named such that others would find you creepy…

They could name it Zupe and when you open the app the Z & P in the logo rotate so it spells Nude!


The Gateway Service to storing your illegal opioid distribution records, records of "campaign donations," accounting docs for sketchy gun sales, etc.

And human beings in 2017 are sending lots of naked photos.

Are they though? Any surveys whatsoever to back that up?

Here. Granted, it doesn’t specifically say that they’re sending nudes. However, with the rise of Snapchat (which was originally known for its ability to send nudes) you can see how Casey’s statement is a fair assumption. There’s even a guide on how to send some…

Seems like just adding a PIN to the hidden album feature on iOS and an equivalent on Android would quickly shoot down the market for this. Especially since it wouldn’t require an app named Nude on your phone.

When the obvious happens and this company gets hacked, then what?

Tested it and it does work, except for the tons of pictures it detected of my dog and arms/legs. A hassle to filter through and defeats the ease of use. Not to mention $1/month subscription is steep long term. I’ll stick to my photo vault calculator apps that cost a few bucks and are incognito.


What happens if you stop using the service? Are you able to release the photos from the vault back into the camera roll/into iCloud? Will it preserve the original EXIF data?
