Early detection is key to surviving melanoma, a type of malignant tumor responsible for more than 70% of skin-cancer-related deaths worldwide, but "suspicious pigmented skin lesions" (SPLs) are so common that it's impractical for doctors to evaluate every one. Now MIT researchers have developed a tool that can analyze skin photos taken with a smartphone to determine which SPLs should be evaluated by a dermatologist.
The researchers, who include professors Martha Gray, SM ’81, PhD ’86, James Collins, and Regina Barzilay and postdoc Luis Soenksen, PhD ’20, made use of deep convolutional neural networks, machine-learning algorithms often used to classify images.
The team had dermatologists visually classify the lesions in 20,388 images from 133 patients at the Hospital Gregorio Marañón in Madrid, along with a number of publicly available images. The system was trained on 80% of those images and tested on the rest. It distinguished SPLs from nonsuspicious lesions, skin, and complex backgrounds with more than 90.3% sensitivity, and it was also able to rank lesions by level of suspiciousness.
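The researchers' actual pipeline isn't reproduced here, but the 80/20 train/test split described above is a standard first step. A minimal sketch, using placeholder file names in place of the real dermatology images, might look like this:

```python
import random

def split_dataset(image_paths, train_frac=0.8, seed=0):
    """Shuffle image paths and split them into train/test sets (80/20 by default)."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    paths = list(image_paths)
    rng.shuffle(paths)
    cut = int(len(paths) * train_frac)
    return paths[:cut], paths[cut:]

# Hypothetical file names standing in for the 20,388 study images.
images = [f"lesion_{i:05d}.jpg" for i in range(20388)]
train, test = split_dataset(images)
print(len(train), len(test))  # 16310 4078
```

The split is done on file paths rather than loaded pixels, so the same partition can be reused across training runs without holding the full image set in memory.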
“Our research suggests that systems leveraging computer vision and deep neural networks, quantifying such common signs, can achieve comparable accuracy to expert dermatologists,” Soenksen says. The screenings could be done during routine primary care visits, or even by patients themselves.