Google is trying to make its image processing more inclusive

It should do a better job rendering dark skin tones and natural hairstyles



Google says its tweaked image processing will avoid over-brightening black and brown faces.
Image: Google

It’s a long-standing problem that dates back to the days of film: image processing tends to be tuned for lighter skin tones rather than those of black and brown subjects. Google announced an effort to address that today in its own camera and imaging products, with a focus on making images of people of color “more beautiful and more accurate.” These changes will come to Google’s own Pixel cameras this fall, and the company says it will share what it learns across the broader Android ecosystem.

Specifically, Google is making changes to its auto-white balance and exposure algorithms, informed by a broader data set of images featuring black and brown faces, to improve accuracy for darker skin tones. With these tweaks, Google aims to avoid over-brightening and desaturating people of color in photos, for more accurate representation.
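Google hasn’t published the retuned algorithms, but for a rough sense of the knobs involved, here is a minimal, purely illustrative sketch in Python of a textbook “gray-world” auto-white balance plus a flat exposure gain. The function names and the simplistic gain model are assumptions for illustration, not Google’s pipeline; the fix Google describes lies in how parameters like these are tuned, and on what data.

```python
import numpy as np

# Illustrative only: a textbook gray-world white balance and a flat
# exposure gain. Google's production algorithms are not public; what
# changed, per the announcement, is how such steps are tuned and on
# what range of faces they are validated.

def gray_world_white_balance(img: np.ndarray) -> np.ndarray:
    """Scale each RGB channel so its mean matches the overall mean.

    img: H x W x 3 float array with values in [0, 1].
    """
    channel_means = img.reshape(-1, 3).mean(axis=0)  # per-channel averages
    gray = channel_means.mean()                      # target neutral level
    gains = gray / np.maximum(channel_means, 1e-6)   # guard divide-by-zero
    return np.clip(img * gains, 0.0, 1.0)

def apply_exposure(img: np.ndarray, gain: float) -> np.ndarray:
    """A flat exposure gain; too high a gain over-brightens dark skin."""
    return np.clip(img * gain, 0.0, 1.0)
```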

Google has also improved portrait mode selfies, building a more accurate depth map for curly and wavy hair types rather than simply cutting around the subject’s hair.
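Again as a hypothetical sketch rather than Google’s actual method: a per-pixel depth map lets portrait mode fade the background blur smoothly through fine hair, instead of slicing along a hard cutout mask. The threshold and blur radius below are arbitrary illustration values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def portrait_blur(img: np.ndarray, depth: np.ndarray,
                  subject_depth: float = 0.3) -> np.ndarray:
    """Blur pixels the depth map places behind the subject.

    img:   H x W x 3 float array in [0, 1]
    depth: H x W float array in [0, 1], where 0 is nearest the camera
    """
    blurred = np.stack([gaussian_filter(img[..., c], sigma=8)
                        for c in range(3)], axis=-1)
    # A soft, per-pixel mask lets wavy or curly hair fade into the blur
    # instead of being cut off at a hard segmentation edge.
    mask = np.clip((depth - subject_depth) / (1.0 - subject_depth), 0.0, 1.0)
    return img * (1.0 - mask[..., None]) + blurred * mask[..., None]
```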

The company says it still has much to do — and it has certainly stumbled in the past on the image recognition and inclusion front — but it’s a welcome step in the right direction.
