Google says its Lens image search can now help identify skin conditions

Google Lens is now able to offer more information on that pesky rash that you’re not sure whether to worry about. In a blog post published this week, Google outlined how the Lens image search feature built into its apps on iOS and Android can “search for skin conditions” like an “odd mole or rash.” It’ll also work on other parts of your body if you want more information about a bump on your lip, line on a nail, or hair loss from your scalp. 

“Just take a picture or upload a photo through Lens, and you’ll find visual matches to inform your search,” the blog post reads. Crucially, however, Google specifically warns that results are “informational only and not a diagnosis” and says users should “consult your medical authority for advice.” The feature is available to everyone in the US, across all languages, Google spokesperson Craig Ewer confirmed to The Verge.

Google says Lens can identify skin conditions from a photograph.
Image: Google

Google has been exploring the use of AI image recognition for skin conditions for years. At its I/O developer conference in 2021, the company previewed a tool that attempted to identify skin, hair, and nail conditions using a combination of photos and survey responses. At the time, Google said the tool could recognize 288 different conditions and would present the correct condition among its top three suggestions 84 percent of the time.

That’s all well and good, but it won’t prevent people from trying to use tools like these for diagnosis anyway. Arguably, adding that sort of disclaimer only shifts liability onto the user while letting Google offer the same underlying service.

There’s good reason, too, to be cautious about AI diagnostic tools. One persistent criticism of software for identifying skin conditions is that it tends to be less accurate for users with darker skin tones. Research cited by The Guardian in 2021 noted a lack of skin type category data across many freely available image databases used to train AI systems, and a lack of images of dark-skinned individuals in the databases that did include this information.

Google, however, suggested in 2021 that its deep learning system was actually more accurate at identifying skin conditions for Black patients. In slides provided by Google to Motherboard, the company said its system had an accuracy rate of 87.9 percent for Black patients, higher than for other ethnicities.

In response to The Verge’s questions about how well the feature works across different skin tones, Google spokesperson Craig Ewer said the company attempted to build the feature in an equitable way by working with organizations and clinicians that serve patients from “diverse backgrounds.” He added that the company worked with dermatologists who are experts in different skin tones to curate the thumbnail images.

Update June 16th, 3:15AM ET: Updated with comment from Google.
