How Real Tone helps make a more equitable camera.
Camera technology has long misrepresented darker skin tones. Real Tone is one effort to make photography more inclusive.

The saying “the camera cannot lie” is almost as old as photography itself. But it’s never actually been true.

Skin tone bias in artificial intelligence and cameras is well documented. And the effects of this bias are real: Too often, folks with darker complexions are rendered unnaturally – appearing darker, brighter, or more washed out than they do in real life. In other words, they’re not being photographed authentically.

That’s why Google has been working on Real Tone. The goal is to bring accuracy to cameras and the images they produce. Real Tone is not a single technology, but rather a broad approach and commitment, developed in partnership with image experts who have spent their careers beautifully and accurately representing communities of color. That work has resulted in tuning computational photography algorithms to better reflect the nuances of diverse skin tones.

It’s an ongoing process. Lately, we’ve been making improvements to better identify faces in low-light conditions. And with Pixel 7, we’ll use a new color scale that better reflects the full range of skin tones.

Inclusive photography in a world of faces

Photographing skin tones accurately isn’t a trivial issue. People are among the most important and most frequently photographed subjects in images, so representing them properly matters.

“Photos are symbols of what and who matter to us collectively,” says Florian Koenigsberger, Google Image Equity Lead, “so it’s critical that they work equitably for everyone – especially for communities of color like mine, who haven’t always been seen fairly by these tools.”

Real Tone isn’t just about making photographs look better. It has an impact on the way stories are told and how people see the world – and one another. Real Tone represents an exciting first step for Google in acknowledging that cameras have historically centered on light skin. It’s a bias that’s crept into many of our modern digital imaging products and algorithms, especially because there hasn’t been enough diversity in the groups of people they’re tested with.

Why many smartphone cameras struggle to photograph darker skin tones well

This racial bias in camera technology has its roots in the way the technology was developed and tested. Camera sensors, processing algorithms, and editing software were all developed and calibrated largely against data sets that treated light skin as the baseline, with only a limited range of skin tones taken into account. This bias lingered for decades, because these technologies were never adjusted for people with darker skin.

In 2020, Google began looking for ways to change this, with the goal of making photography more fair. We made a number of improvements, notably to the way skin tones are represented. We expanded that program again in 2021 to introduce even more enhancements across exposure, color, tone-mapping, face detection, face retouching, and more, in many apps and camera modes.

It was a group effort, because building better tools for a community works best when they’re built with the community.

“We continue to work with image experts – like photographers, cinematographers, and colorists – who are celebrated for their beautiful and accurate imagery of communities of color,” says Koenigsberger. “We ask them to test our cameras in a wide range of tough lighting conditions. In the process, they take thousands of portraits that make our image datasets 25 times more diverse – to look more like the world around us.”

How Real Tone helps

The result of that collaboration isn’t an app or a single technology, but a framework that we’re committing to over many years, across all of our imaging products. Real Tone is a collection of improvements that are part of Google’s Image Equity Initiative, which is focused on building camera and imaging products that work equitably for people of color, so that everyone feels seen, no matter their skin tone. Here are some of the things we’re doing to help make pictures more authentic:

  • Recognize a broader set of faces. Detecting a person’s face is a key part of getting a great photo that’s in focus. We’ve diversified the images we use to train our models to find faces more successfully, regardless of skin tone and in a variety of lighting conditions.

  • Correctly represent skin color in pictures. Automatic white balance is a standard camera feature that helps set the way colors appear in an image. We worked with various partners to improve the way white balance works to better reflect a variety of skin tones.

  • Make skin brightness appear more natural. Similarly, automatic exposure is used to determine how bright a photograph appears. Our goal with Real Tone was to ensure that people of color do not appear unnaturally darker or brighter, but rather exactly as they really are.

  • Reduce washed-out images. Stray light in a photo setting can wash out any image – such as when you are taking a picture with a sunlit window directly behind the subject. This effect is even greater for people with darker skin tones, often leaving their faces in complete shadow. A new algorithm Google developed aims to reduce the impact of stray light on finished images.

  • Sharpen blurry images even in low light. We discovered that photos of people with darker skin tones tend to come out blurrier in low-light conditions. We leveraged the AI features in the Google Tensor chip to sharpen photos, even when the photographer’s hand isn’t steady and there isn’t a lot of light available.

  • More ways to tune up any photo. The auto-enhancement feature in Google Photos works with uploaded photos that were taken any time and with any camera, not just Pixel. It optimizes color and lighting in any picture, across all skin tones.

Together, these improvements make every picture more authentic and more representative of its subject, regardless of skin tone.
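To give a rough sense of what automatic white balance and automatic exposure do, here is a textbook-style sketch of a classic “gray-world” white balance and a global exposure gain. This is a generic illustration only, and deliberately simplistic; it is not Real Tone’s tuned pipeline, which relies on far more diverse training data and subtler adjustments.

```python
def gray_world_awb(pixels):
    """Gray-world white balance: scale each channel so the average
    color of the image becomes neutral gray.

    `pixels` is a list of (r, g, b) floats in [0, 1]. This naive
    assumption is exactly where skin-tone bias can creep in if the
    algorithm is only tuned against light-skinned subjects.
    """
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]  # per-channel mean
    gray = sum(means) / 3
    gains = [gray / m if m > 0 else 1.0 for m in means]        # correction gains
    return [tuple(min(1.0, p[c] * gains[c]) for c in range(3)) for p in pixels]

def auto_exposure(pixels, target=0.5):
    """Apply one global gain so mean luminance reaches `target`."""
    mean_lum = sum(sum(p) / 3 for p in pixels) / len(pixels)
    gain = target / max(mean_lum, 1e-6)
    return [tuple(min(1.0, c * gain) for c in p) for p in pixels]

# A flat image with a blue cast comes out neutral after balancing.
cast = [(0.2, 0.4, 0.6)] * 16
balanced = gray_world_awb(cast)
```

The point of the sketch is the failure mode: a single global gain chosen against a non-representative baseline can darken, brighten, or wash out faces it was never tuned for, which is precisely what Real Tone’s per-skin-tone tuning addresses.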

Real Tone is a work in progress

This is an area we’ve been focused on improving for some time, and we still have a lot of work to do. Making photography more inclusive isn’t a problem that’s solved in one go – rather, it’s a mission we’re committed to. We’re now looking at images from end to end, from the moment the photo is taken to how it shows up in other products. This will have ongoing implications for multiple products as we continue to improve.

Recently, Google partnered with Dr. Ellis Monk, a Harvard professor and sociologist who for more than a decade has studied the way skin tone impacts people’s lives. The culmination of Dr. Monk’s research is the Monk Skin Tone (MST) Scale, a 10-shade scale that has been incorporated into various Google products. The scale has also been made public, so anyone across the industry can use it for research and product development. We see this as a chance to share, learn, and evolve our work with the help of others, leading to more accurate algorithms, a better way to determine representative datasets, and more granular frameworks for our experts to provide feedback.
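One common use of a scale like MST is mapping an observed color to its nearest reference shade, for example to check whether a dataset covers the full range. The sketch below shows that idea with nearest-neighbor matching; the ten swatch values here are invented placeholders, not the official MST swatches, which are published separately by Google.

```python
def nearest_mst_shade(rgb, swatches):
    """Return the 1-based index of the swatch closest to `rgb`
    by squared Euclidean distance in RGB space."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(swatches)), key=lambda i: dist2(rgb, swatches[i]))
    return best + 1

# Ten placeholder swatches from light to dark (illustrative values only;
# the real MST Scale publishes its own reference swatches).
PLACEHOLDER_SWATCHES = [(v, int(v * 0.92), int(v * 0.85))
                        for v in range(245, 45, -20)]

shade = nearest_mst_shade((150, 138, 128), PLACEHOLDER_SWATCHES)
```

Binning samples this way makes it easy to count how many examples a dataset has per shade, which is one way a 10-point scale can surface gaps in representation.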

“In our research, we found that a lot of the time people feel they’re lumped into racial categories, but there’s all this heterogeneity with ethnic and racial categories,” Dr. Monk has said. “And many methods of categorization, including past skin tone scales, don’t pay attention to this diversity. That’s where a lack of representation can happen… we need to fine-tune the way we measure things, so people feel represented.”

Inclusive photography is a work in progress. We’ll continue to partner with experts, listen to feedback, and invest in tools and experiences that work for everyone. Because everyone deserves to be seen as they are.

Takeaways
  • Real Tone is a part of Google’s product inclusion and equity efforts to address skin tone bias in camera technology.

  • Google worked with the community and used more images of people of color to train Pixel’s camera.

  • By optimizing settings, people with darker skin tones won’t appear darker or brighter, but exactly as they really are.