The saying “the camera cannot lie” is almost as old as photography itself. But it’s never actually been true.
Skin tone bias in artificial intelligence and cameras is well documented. And the effects of this bias are real: Too often, folks with darker complexions are rendered unnaturally – appearing darker, brighter, or more washed out than they do in real life. In other words, they’re not being photographed authentically.
That’s why Google has been working on Real Tone, an effort to bring accuracy to cameras and the images they produce. Real Tone is not a single technology, but a broad approach and commitment, developed in partnership with image experts who have spent their careers beautifully and accurately representing communities of colour. That work has resulted in tuning computational photography algorithms to better reflect the nuances of diverse skin tones.
It’s an ongoing process. Lately, we’ve been making improvements to better identify faces in low-light conditions. And with Pixel 7, we’ll use a new colour scale that better reflects the full range of skin tones.
Photographing skin tones accurately is no trivial matter. People are among the most important – and most frequently photographed – subjects in images, and representing them properly matters.
“Photos are symbols of what and who matter to us collectively,” says Florian Koenigsberger, Google Image Equity Lead, “so it’s critical that they work equitably for everyone – especially for communities of colour like mine, who haven’t always been seen fairly by these tools.”
Real Tone isn’t just about making photographs look better. It has an impact on the way stories are told and how people see the world – and one another. Real Tone represents an exciting first step for Google in acknowledging that cameras have historically centred on light skin. It’s a bias that’s crept into many of our modern digital imaging products and algorithms, especially because there hasn’t been enough diversity in the groups of people they’re tested with.
This racial bias in camera technology has its roots in the way it was developed and tested. Camera sensors, processing algorithms, and editing software were largely built and evaluated against data sets that treated light skin as the baseline, with only a limited range of skin tones taken into account. The bias lingered for decades, because these technologies were never adjusted for people with darker skin.
In 2020, Google began looking for ways to change this, with the goal of making photography more fair. We made a number of improvements, notably to the way skin tones are represented. We expanded that program again in 2021 to introduce even more enhancements across exposure, colour, tone-mapping, face detection, face retouching, and more, in many apps and camera modes.
It was a group effort, because building better tools for a community works best when they’re built with the community.
“We continue to work with image experts – like photographers, cinematographers, and colourists – who are celebrated for their beautiful and accurate imagery of communities of colour,” says Koenigsberger. “We ask them to test our cameras in a wide range of tough lighting conditions. In the process, they take thousands of portraits that make our image datasets 25 times more diverse – to look more like the world around us.”
The result of that collaboration isn’t an app or a single technology, but a framework that we’re committing to over many years, across all of our imaging products. Real Tone is a collection of improvements that are part of Google’s Image Equity Initiative, which is focused on building camera and imaging products that work fairly for people of every skin tone. With Real Tone, the team worked to:
Recognise a broader set of faces. Detecting a person’s face is a key part of getting a great photo that’s in focus. We’ve diversified the images we use to train our models to find faces more successfully, regardless of skin tone and in a variety of lighting conditions.
Correctly represent skin colour in pictures. Automatic white balance is a standard camera feature that helps set the way colours appear in an image. We worked with various partners to improve the way white balance works to better reflect a variety of skin tones.
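To give a sense of what automatic white balance is adjusting, here is a minimal “grey-world” sketch in NumPy: scale each colour channel so the scene averages to neutral grey. This is the textbook baseline, not Google’s algorithm – a naive global estimator like this can drift when skin dominates the frame, which is exactly the kind of behaviour the tuning described above addresses:

```python
import numpy as np

def gray_world_white_balance(img):
    """Scale each channel so the average colour becomes neutral grey.

    A classic baseline for automatic white balance; real camera
    pipelines use far more sophisticated, scene-aware estimators.
    """
    x = img.astype(np.float64)
    channel_means = x.reshape(-1, 3).mean(axis=0)   # per-channel averages
    gray = channel_means.mean()                     # target neutral level
    gains = gray / channel_means                    # per-channel correction
    return np.clip(x * gains, 0, 255).astype(np.uint8)

# Example: a frame with a warm (reddish) colour cast.
warm = np.zeros((4, 4, 3), np.uint8)
warm[..., 0] = 200   # strong red
warm[..., 1] = 150
warm[..., 2] = 100   # weak blue
balanced = gray_world_white_balance(warm)
```

After correction, all three channels share the same average, which is what “neutral grey” means here.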
Make skin brightness appear more natural. Similarly, automatic exposure is used to determine how bright a photograph appears. Our goal with Real Tone was to ensure that people of colour do not appear unnaturally darker or brighter, but rather exactly as they really are.
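To illustrate what automatic exposure adjusts, here is a toy global version in NumPy: choose a gamma curve that moves the frame’s mean brightness to a target. The target value and the global-mean metering are simplifying assumptions of this sketch – real pipelines meter the scene locally and weight faces:

```python
import numpy as np

def auto_expose(img, target_mean=0.5):
    """Apply a gamma curve that moves mean luminance toward a target.

    A toy stand-in for automatic exposure: solve mean**gamma == target
    for gamma, then apply that curve to every pixel.
    """
    x = img.astype(np.float64) / 255.0
    mean = x.mean()
    gamma = np.log(target_mean) / np.log(mean)   # so that mean**gamma == target_mean
    return np.clip((x ** gamma) * 255.0, 0, 255).astype(np.uint8)

dark = np.full((4, 4, 3), 40, np.uint8)          # an underexposed frame
brightened = auto_expose(dark)
```

A uniform frame at 40/255 is lifted to mid-grey; on a real photo the same curve brightens shadows more than highlights.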
Reduce washed-out images. Stray light in a photo setting can wash out any image – such as when you are taking a picture with a sunlit window directly behind the subject. This effect is even greater for people with darker skin tones, often leaving their faces in complete shadow. A new algorithm Google developed aims to reduce the impact of stray light on finished images.
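For intuition on why stray light washes an image out: it lifts black levels and compresses the tonal range. A classic, deliberately simple counter-measure is a percentile levels stretch, sketched below. Unlike the algorithm described above, it is global, so it cannot by itself rescue a face lost in backlit shadow:

```python
import numpy as np

def stretch_contrast(img, low_pct=2, high_pct=98):
    """Re-stretch a washed-out image toward the full tonal range.

    Maps the low percentile to black and the high percentile to white;
    an illustration only, not the algorithm described in the text.
    """
    x = img.astype(np.float64)
    lo, hi = np.percentile(x, [low_pct, high_pct])
    stretched = (x - lo) / (hi - lo) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)

# A low-contrast gradient confined to the 80-180 range.
washed = np.linspace(80, 180, 16, dtype=np.uint8).reshape(4, 4)
restored = stretch_contrast(washed)
```

The restored image once again spans the full 0–255 range.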
Sharpen images even in low light. We discovered that photos of people with darker skin tones tend to be blurrier than others in low-light conditions. We used the AI capabilities of the Google Tensor chip to sharpen photos even when the photographer’s hand isn’t steady and there isn’t much light available.
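For reference, the classical, non-AI way to sharpen is unsharp masking – add back the difference between the image and a blurred copy of itself. The sketch below illustrates that idea only; it is not the learned, motion-aware processing that runs on Tensor:

```python
import numpy as np

def unsharp_mask(img, amount=1.0):
    """Sharpen by adding back the difference from a blurred copy."""
    x = img.astype(np.float64)
    # 3x3 box blur built from shifted copies (edges handled by padding).
    padded = np.pad(x, 1, mode="edge")
    blurred = sum(padded[i:i + x.shape[0], j:j + x.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0
    sharp = x + amount * (x - blurred)
    return np.clip(sharp, 0, 255).astype(np.uint8)

soft = np.zeros((6, 6), np.uint8)
soft[:, 3:] = 100              # a soft vertical edge
crisp = unsharp_mask(soft)
```

Pixels near the edge are pushed further apart (the bright side overshoots), while flat regions are left untouched.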
More ways to tune up any photo. The auto-enhancement feature in Google Photos works with uploaded photos that were taken any time and with any camera, not just Pixel. It optimises colour and lighting in any picture, across all skin tones.
Together, these improvements make every picture more authentic and more representative of its subject – regardless of their skin tone.
This is an area we’ve been focused on improving for some time, and we still have a lot of work to do. Making photography more inclusive isn’t a problem that’s solved in one go – rather, it’s a mission we’re committed to. We’re now looking at images from end to end, from the moment the photo is taken to how it shows up in other products. This will have ongoing implications for multiple products as we continue to improve.
Recently, Google partnered with Dr. Ellis Monk, a Harvard professor and sociologist who for more than a decade has studied the way skin tone impacts people’s lives. The culmination of Dr. Monk’s research is the Monk Skin Tone (MST) Scale, a 10-shade scale designed to better represent the full range of skin tones – the new colour scale that Pixel 7 will use.
“In our research, we found that a lot of the time people feel they’re lumped into racial categories, but there’s all this heterogeneity with ethnic and racial categories,” Dr. Monk says.
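As a rough sketch of how a tonal scale like this can be used in software, the snippet below matches a sampled pixel to the nearest of ten shades. The RGB values in `SHADES` are illustrative placeholders, not the official Monk Skin Tone swatches, and `nearest_shade` is a hypothetical helper, not a Google API:

```python
import numpy as np

# Ten illustrative shades from light to dark. These RGB values are
# placeholders for the sketch, NOT the published MST swatch colours.
SHADES = np.array([
    [246, 237, 228], [243, 231, 219], [247, 234, 208], [234, 218, 186],
    [215, 189, 150], [160, 126,  86], [130,  92,  67], [ 96,  65,  52],
    [ 58,  49,  42], [ 41,  36,  32],
], dtype=np.float64)

def nearest_shade(rgb):
    """Return the 1-based index of the shade closest to a sampled pixel."""
    dist = np.linalg.norm(SHADES - np.asarray(rgb, dtype=np.float64), axis=1)
    return int(dist.argmin()) + 1
```

A production pipeline would compare colours in a perceptual space such as CIELAB rather than raw RGB, since Euclidean distance in RGB does not track how different two colours actually look.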
Inclusive photography is a work in progress. We’ll continue to partner with experts, listen to feedback, and invest in tools and experiences that work for everyone. Because everyone deserves to be seen as they are.