
Artificial intelligence can spot skin cancer as well as a trained doctor

Photo by Joe Raedle/Getty Images

Researchers at Stanford University have created an AI algorithm that can identify skin cancer as accurately as a trained doctor. The program was trained on nearly 130,000 images of moles, rashes, and lesions using a technique known as deep learning. It was then tested head-to-head against 21 human dermatologists and, its creators say, performed with accuracy on par with the humans (“at least” 91 percent as good). In the future, they suggest, it could be used to build a mobile app for spotting skin cancer at home.

The algorithm could help with the early detection of skin cancer

Each year in the United States, some 5.4 million new cases of skin cancer are diagnosed. The usual process for identifying the many varieties of the disease involves a visual examination of moles or other marks on the skin by a dermatologist. It’s well established that the earlier the disease is detected, the better the chances of survival. For example, the five-year survival rate for melanoma detected early on is around 97 percent, but when the disease is caught in its later stages, that figure falls to around 14 percent.

To teach their AI how to identify skin cancer, the Stanford researchers started with an existing deep learning algorithm built by Google for image classification. As described in a paper on their work published in the journal Nature, the researchers fed the program tens of thousands of images collected from all over the world, along with labels identifying the type of cancer each showed, or whether the lesion was benign.
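The article doesn’t name the Google network, but the approach it describes is standard transfer learning: take an image classifier pretrained on everyday photos and retrain it to map lesion images to disease labels. Below is a minimal sketch of that setup in Keras, using an Inception v3 backbone as a stand-in; the directory path, class count, and training settings are illustrative assumptions, not the authors’ actual pipeline.

```python
# Minimal transfer-learning sketch: reuse a pretrained image classifier
# and retrain it to label skin-lesion photos. Illustrative only.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

NUM_CLASSES = 2032  # disease categories in the Stanford taxonomy

# Load Inception v3 with ImageNet weights, dropping its original classifier head.
base = InceptionV3(weights="imagenet", include_top=False, input_shape=(299, 299, 3))

# Attach a new classification head for the skin-disease labels.
model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),  # map pixels to [-1, 1], as Inception expects
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

# Fine-tune on labelled lesion images (hypothetical directory layout:
# one sub-folder per disease label).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "skin_lesion_images/train", image_size=(299, 299), batch_size=32)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```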

“There’s no huge dataset of skin cancer that we can just train our algorithms on, so we had to make our own,” Brett Kuprel, a co-author of the paper said in a Stanford University blog post on the topic. “We gathered images from the internet and worked with the medical school to create a nice taxonomy out of data that was very messy — the labels alone were in several languages, including German, Arabic and Latin.”

“on par with all tested experts”

The team ended up with a database of 129,450 images spanning 2,032 different diseases. The deep neural network then scanned these pixel by pixel, looking for the characteristics common to each diagnosis. By the end of its training period, the network was able to identify diseases “on par with all tested experts,” say the researchers. With melanomas, for example, the human dermatologists correctly identified 95 percent of malignant lesions and 76 percent of benign moles. In the same tests, the AI was correct 96 percent of the time for the malignant samples, and 90 percent of the time for harmless lesions.
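Those melanoma figures are, in effect, sensitivity (the share of malignant lesions correctly flagged) and specificity (the share of benign lesions correctly cleared). A minimal sketch of how those two numbers fall out of a set of predictions, with made-up labels standing in for real test data:

```python
# Toy example of computing sensitivity and specificity from a classifier's
# calls on a handful of lesions. Labels are illustrative only.
import numpy as np

# 1 = malignant, 0 = benign (ground-truth labels)
y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0])
# the classifier's calls on the same lesions
y_pred = np.array([1, 1, 0, 0, 0, 1, 1, 0])

true_positives  = np.sum((y_pred == 1) & (y_true == 1))
false_negatives = np.sum((y_pred == 0) & (y_true == 1))
true_negatives  = np.sum((y_pred == 0) & (y_true == 0))
false_positives = np.sum((y_pred == 1) & (y_true == 0))

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)

print(f"sensitivity: {sensitivity:.2f}")  # 0.75 with these toy labels
print(f"specificity: {specificity:.2f}")  # 0.75 with these toy labels
```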

The Stanford team says the aim of the program is not to replace human dermatologists, but to offer people an inexpensive option for early screening. The hope is that a more advanced version of the algorithm could be turned into an app and used at home. However, that would require additional training for the AI (it currently works with high-quality medical images, not the sort of shots a smartphone would produce), and its safety would need more rigorous assessment before such a program could go public.