Computer scientists from Stanford University have created an artificially intelligent diagnosis algorithm for skin cancer. Combining visual processing with deep learning, they have trained a computer to detect melanoma in its early stages.
The researchers used a database of nearly 130,000 skin disease images to train their algorithm to visually diagnose potential cancer. According to the press release, it performed with impressive accuracy from the very first test.
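The underlying idea is supervised learning: show the model many labeled examples, then ask it to classify new ones. The actual system is a deep convolutional network trained on roughly 130,000 clinical images; the sketch below is a deliberately minimal stand-in, a tiny logistic-regression classifier on hypothetical four-number feature vectors (standing in for image features), meant only to illustrate the train-then-predict workflow, not the paper's method.

```python
import math

def sigmoid(z):
    """Squash a score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, epochs=200, lr=0.5):
    """Fit weights by stochastic gradient descent on the logistic loss."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the loss w.r.t. the pre-activation
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Return 1 (malignant) if the model's probability exceeds 0.5, else 0."""
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5 else 0

# Hypothetical training data: feature vectors labeled benign (0) or malignant (1).
benign = [[0.1, 0.2, 0.1, 0.3], [0.2, 0.1, 0.2, 0.2]]
malignant = [[0.9, 0.8, 0.7, 0.9], [0.8, 0.9, 0.9, 0.8]]
w, b = train(benign + malignant, [0, 0, 1, 1])

print(predict(w, b, [0.15, 0.15, 0.15, 0.25]))  # a benign-looking sample
print(predict(w, b, [0.85, 0.85, 0.80, 0.85]))  # a malignant-looking sample
```

In practice, the deep-learning approach described in the article replaces the hand-chosen features here with features the network learns directly from raw pixels, which is what makes a dataset of this size necessary.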
“There’s no huge dataset of skin cancer that we can just train our algorithms on, so we had to make our own,” said Brett Kuprel, co-lead author of the paper and a graduate student in the Thrun lab. “We gathered images from the internet and worked with the medical school to create a nice taxonomy out of data that was very messy – the labels alone were in several languages, including German, Arabic, and Latin.”
With the assistance of dermatologists at Stanford Medicine, as well as Helen M. Blau, professor of microbiology and immunology at Stanford and co-author of the paper, the team built a database of about 130,000 images of skin lesions representing over 2,000 different diseases.
About 5.4 million people are diagnosed with skin cancer in the US every year. The five-year survival rate for melanoma detected in its earliest stages is around 97 percent, but if melanoma is detected in its latest stages, the rate drops to approximately 14 percent. Early detection therefore has an enormous impact on skin cancer outcomes. Detection begins with a visual examination, an initial step that could be performed by a computer capable of visually identifying melanoma.
“Advances in the computer-aided classification of benign versus malignant skin lesions could greatly assist dermatologists in improved diagnosis for challenging lesions and provide better management options for patients,” said Susan Swetter, professor of dermatology and director of the Pigmented Lesion and Melanoma Program at the Stanford Cancer Institute, and co-author of the paper.
The team hopes the algorithm will be compatible with smartphones in the near future.
“My main eureka moment was when I realized just how ubiquitous smartphones will be,” said Andre Esteva, co-lead author of the paper and a graduate student in the Thrun lab. “Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”
The algorithm currently exists only on a computer, but the team believes it will be relatively easy to transition it to mobile devices.
The findings are published in the Jan. 25 issue of Nature.