Google is improving how skin tones are represented across all its products as part of its push for image fairness

Skin color is part of who you are, and while that might not sound like a big deal, it’s fair to say that a lot of people feel excluded because of how their skin color is represented.

Too often, the camera that takes an image doesn’t capture skin tones correctly, and Google is looking to change that. Last year, Google announced Real Tone for Pixel, just one example of the company’s efforts in this regard.

Today, Google introduced a new step in its commitment to “image fairness” and to improving representation across all of its products. Google has teamed up with Harvard professor and sociologist Dr. Ellis Monk, and the company is releasing a new skin tone scale that aims to be more inclusive of the range of skin tones we see in our daily lives.

The Monk Skin Tone (MST) Scale, a 10-shade scale designed to be easy to use when developing and evaluating technology, aims to change how different skin tones are represented. As Google explains:

Updating our approach to skin tones can help us better understand representation in images, as well as assess whether a product or feature works well across a range of skin tones. This is especially important for computer vision, the artificial intelligence that lets computers see and understand images. If not intentionally built and tested to include a wide range of skin tones, computer vision systems have been found to not perform well enough for dark-skinned people.

Google says that quantifying skin tone will help it and the broader tech industry build more representative datasets, so AI models can be trained and evaluated for fairness, resulting in features and products that work better for everyone, across all skin tones. For example, Google uses the scale to evaluate and improve the models that detect faces in images.
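
To make that idea concrete, here is a minimal sketch (in Python, and not Google’s actual tooling) of what evaluating a face detector across the ten MST categories might look like: detection outcomes are bucketed by skin tone category and a per-group detection rate is computed, so any gap between lighter and darker groups becomes visible. The records, numbers, and function names below are purely illustrative.

```python
# Hypothetical sketch: measure a face detector's performance per
# Monk Skin Tone (MST) category (1 = lightest, 10 = darkest).
from collections import defaultdict

# Illustrative evaluation records: (mst_category, face_was_detected)
eval_records = [
    (1, True), (1, True), (5, True), (5, False),
    (9, True), (9, False), (10, False), (10, True),
]

def detection_rate_by_tone(records):
    """Group detection outcomes by MST category and compute each group's rate."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for tone, detected in records:
        totals[tone] += 1
        hits[tone] += detected  # True counts as 1, False as 0
    return {tone: hits[tone] / totals[tone] for tone in sorted(totals)}

rates = detection_rate_by_tone(eval_records)
for tone, rate in rates.items():
    print(f"MST {tone:2d}: detection rate {rate:.0%}")

# A large gap between the best- and worst-served categories would flag
# a fairness problem to address in the training data or the model.
print(f"Gap between best and worst group: {max(rates.values()) - min(rates.values()):.0%}")
```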
