Skin cancer and derm apps not trustworthy… yet


AI mobile apps are generally not transparent about their algorithms, consistently validated, backed by clinicians, or regulated.


The quality of dermatology-related AI phone apps is “concerning” according to a review of both Android and Apple apps published in JAMA Dermatology earlier this month. 

While the apps hold promise, “in their current state, they may pose harm due to potential risks, lack of consistent validation, and misleading user communication,” the authors said.  

The US researchers found almost a thousand English-language mobile phone apps promising to identify skin and hair conditions, track moles, detect cancers, and manage acne and dermatitis. 

The problem with the majority of the apps they studied was that they lacked transparency about how they generated their advice, making it impossible to validate the information users were getting from them.

Of the 41 apps selected for in-depth analysis, a quarter claimed to be able to make a diagnosis. “Notably, none of these had scientific publications as supporting evidence, and [two] lacked warnings in their descriptions cautioning about the potential inaccuracy of results and the absence of formal medical diagnoses,” the authors said. 

Only five apps (12%) were supported by research published in peer-reviewed journals, and only one of these was a prospective multi-centre clinical trial.  

Only six apps (15%) gave any information about their data and training sets, and this consisted of only “vague details”, the authors said.

Over half (51%) had no information at all about the algorithm itself. And most had no information about whether a clinician was involved in developing the algorithm, with 16 saying they had dermatologist input and one involving a GP. 

User data privacy and ownership were also matters of concern.  

Only 12 apps (30%) said they didn’t store customer-submitted pictures. Of the 16 apps (39%) that said they did store them, only 12 said they used secure cloud servers.

Almost half of the apps (19 apps, 46%) didn’t say how they used the submitted images. Twenty apps (49%) said they used them for data analysis to provide results, and 12 apps (29%) said they used them for research and app development.

Only four apps provided users with information about EU or FDA regulatory approval, depending on country of origin. 

Despite the drawbacks, the authors were positive about the potential for apps to improve access and outcomes for consumers with dermatological needs.

It wouldn’t necessarily take a randomised controlled trial to assess whether an app was effective, they said, but currently it was not possible to evaluate them at all, which meant the risks outweighed the benefits.

“App developers should, at a minimum, disclose information on the specific AI algorithms used; the datasets used for training, testing, and/or validation; the extent of clinician input; the existence of supporting publications; the use and handling of user-submitted images; and the implementation of measures to safeguard data privacy,” the authors wrote. 

JAMA Dermatology 2024, online 7 March 
