Google Lens is now able to offer more information on that pesky rash that you’re not sure whether to worry about. In a blog post published this week, Google outlined how the Lens image search feature built into its apps on iOS and Android can “search for skin conditions” like an “odd mole or rash.” It’ll also work on other parts of your body if you want more information about a bump on your lip, line on a nail, or hair loss from your scalp.
“Just take a picture or upload a photo through Lens, and you’ll find visual matches to inform your search,” the blog post reads. Crucially, however, Google specifically warns that results are “informational only and not a diagnosis” and says users should “consult your medical authority for advice.”
Google says Lens can identify skin conditions from a photograph. Image: Google
Google has been exploring the use of AI image recognition for skin conditions for years. At its I/O developer conference in 2021, the company previewed a tool that attempted to identify skin, hair, and nail conditions using a combination of photos and survey responses. At the time, Google said the tool could recognize 288 different conditions and would present the correct condition in its top three suggestions 84 percent of the time.
That’s all well and good, but it won’t stop people from trying to use tools like these for diagnosis anyway. Arguably, adding that sort of disclaimer only shifts liability onto the user while letting Google offer the same underlying service.
There’s good reason, too, to be cautious about AI diagnostic tools. One persistent criticism of software that identifies skin conditions is that it tends to be less accurate for users with darker skin tones. Research cited by The Guardian in 2021 noted a lack of skin type category data across many freely available image databases used to train AI systems, and a lack of images of dark-skinned individuals in the databases that did include this information.
Google, however, suggested in 2021 that its deep learning system was actually more accurate at identifying skin conditions for Black patients. In slides provided by Google to Motherboard, the company said its system had an accuracy rate of 87.9 percent for Black patients, higher than for other ethnicities.