The saying “the camera cannot lie” is almost as old as photography itself. But it’s never actually been true.
Skin tone bias in artificial intelligence and cameras is well documented. And its effects are real: darker complexions often appear unnaturally dark, overly bright, or washed out. In other words, people aren’t being photographed authentically.
That's why Google introduced Real Tone. It’s a comprehensive approach, developed in partnership with image experts who’ve dedicated their careers to accurately representing communities of colour. By tuning computational photography algorithms, Real Tone aims to better reflect diverse skin tones.
This work is ongoing. Recently, we've focused on improving face detection in low-light conditions and using a colour scale on Google Pixel that truly captures the full range of skin tones.
People are among the most important and most frequently photographed subjects. Representing them properly matters.
“Photos are symbols of what and who matter to us collectively,” says Florian Koenigsberger, Google Image Equity Lead, “so it’s critical that they work equitably for everyone – especially for communities of colour like mine, who haven’t always been seen fairly by these tools.”
Real Tone isn’t just about better photos; it shapes how stories are told and how we see each other. It’s an exciting first step for Google in recognising that cameras have long favoured light skin, a bias that’s crept into modern imaging products due to a lack of diversity in testing. By addressing this, Real Tone aims to bring authentic representation to everyone.
Early sensors were calibrated, and algorithms and editing tools trained, mostly on data sets that treated light skin as the baseline, neglecting the broader range of skin tones. That oversight persisted for decades because no adjustments were made for people with darker skin.
In 2020, Google set out to change this, aiming to make photography fairer. Improvements focused on better skin tone representation, and in 2021, the effort expanded to enhance exposure, colour, tone-mapping, face detection, and face retouching across various apps and camera modes.
It’s been a collaborative effort. “We continue working with image experts, photographers, cinematographers, and colourists, celebrated for their accurate imagery of communities of colour,” says Koenigsberger. “They test our cameras in tough lighting, taking thousands of portraits, making our datasets 25 times more diverse – to look more like the world around us.”
The result of that collaboration isn’t an app or a single technology, but a framework that we’re committing to over many years, across all of our imaging products.
Recognise more faces. Detecting faces is crucial for a sharp photo. We've diversified the images used to train our models to identify faces accurately, regardless of skin tone or lighting.
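Google hasn’t published the details of its training pipeline, but the basic idea of diversifying a dataset is easy to sketch. The Python below is a hypothetical illustration, not Google’s actual method: it oversamples under-represented groups in a labelled image set, assuming each record already carries a skin-tone annotation.

```python
import random
from collections import defaultdict

def rebalance(records, label_key="skin_tone", seed=42):
    """Oversample under-represented groups so every skin-tone
    label appears as often as the most common one."""
    groups = defaultdict(list)
    for record in records:
        groups[record[label_key]].append(record)

    target = max(len(members) for members in groups.values())
    rng = random.Random(seed)

    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Pad smaller groups by sampling with replacement.
        balanced.extend(rng.choices(members, k=target - len(members)))
    rng.shuffle(balanced)
    return balanced

# Hypothetical annotated records: an image path plus a coarse label.
dataset = [
    {"path": "img_001.jpg", "skin_tone": "dark"},
    {"path": "img_002.jpg", "skin_tone": "light"},
    {"path": "img_003.jpg", "skin_tone": "light"},
    {"path": "img_004.jpg", "skin_tone": "medium"},
]
print(len(rebalance(dataset)))  # 6: two records per group
```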
Accurately represent skin colour. Automatic white balance adjusts an image’s colours so they look the way they do in real life. We’ve worked with partners to improve this, ensuring a variety of skin tones are rendered correctly.
Make skin brightness natural. Automatic exposure controls brightness. With Real Tone, we’ve aimed to make sure people of colour don’t appear unnaturally dark or bright but look just as they are.
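White balance and exposure are classic, well-understood controls. As a simplified illustration of the levers being tuned – emphatically not Real Tone’s production algorithms, which are far more sophisticated and learned from data – here is a textbook grey-world white balance followed by a mean-brightness exposure gain:

```python
import numpy as np

def gray_world_white_balance(img):
    """Grey-world assumption: scale each channel so the channel
    means match, neutralising a global colour cast."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(img * gains, 0, 255)

def adjust_exposure(img, target_mean=118.0):
    """Scale brightness so the image mean lands on a mid-grey target."""
    img = img.astype(np.float64)
    gain = target_mean / max(img.mean(), 1e-6)
    return np.clip(img * gain, 0, 255)

# Synthetic 4x4 RGB image with a warm colour cast and low exposure.
img = np.full((4, 4, 3), (90, 70, 50), dtype=np.uint8)
out = adjust_exposure(gray_world_white_balance(img))
print(out[0, 0])  # roughly neutral grey near the target brightness
```

A real pipeline replaces both hand-tuned heuristics with models tuned on diverse imagery, which is exactly where the partner feedback described above comes in.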
Reduce washed-out images. Stray light can wash out photos, especially for darker skin tones. Our new algorithm reduces this effect, even when there’s a sunlit window in the background.
Sharpen images in low light. Photos of darker skin tones can come out blurrier when there’s little light. Using the AI in the Google Tensor chip, we sharpen images even with shaky hands or in very dim conditions.
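Pixel’s Tensor-based sharpening isn’t public, so as a rough stand-in here is the classic unsharp-mask technique, which restores edge contrast by adding back the detail a blur removes. It illustrates sharpening in general, not the Pixel’s actual multi-frame processing:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, sigma=2.0, amount=1.0):
    """Sharpen by adding back the detail a Gaussian blur removes:
    out = img + amount * (img - blurred)."""
    img = img.astype(np.float64)
    # Blur spatially only (sigma 0 on the channel axis).
    blurred = gaussian_filter(img, sigma=(sigma, sigma, 0))
    out = img + amount * (img - blurred)
    return np.clip(out, 0, 255).astype(np.uint8)

# A soft vertical edge: dark half, bright half.
img = np.zeros((8, 8, 3), dtype=np.uint8)
img[:, 4:] = 200
sharpened = unsharp_mask(img)  # contrast across the edge increases
```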
Tune up any photo. The auto-enhance feature in Google Photos works on any uploaded photo, no matter which camera took it – not just Pixel – optimising colour and lighting for all skin tones.
Together, these improvements make every picture more authentic and more representative of its subject – regardless of skin tone.
This is an area we’ve been focused on improving for some time, and we still have a lot of work to do. Making photography more inclusive isn’t a problem that’s solved in one go – rather, it’s a mission we’re committed to. We're looking at every step, from taking the photo to how it appears across different products.
Recently, we partnered with Dr. Ellis Monk, a Harvard professor and sociologist, to launch the Monk Skin Tone (MST) Scale – a 10-shade scale designed to better represent the full range of skin tones.
The scale has also been made public, so anyone across the industry can use it for research and product development. We see this as a chance to share, learn, and evolve our work with the help of others, leading to more accurate algorithms, a better way to determine representative datasets, and more granular frameworks for our experts to provide feedback.
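One concrete way a public scale helps with determining representative datasets is coverage auditing: counting how many images fall into each of the scale’s ten shades and flagging gaps. A minimal sketch, assuming each image has already been annotated with a shade value from 1 to 10 (the annotation step itself is the hard part):

```python
from collections import Counter

MST_SHADES = range(1, 11)  # the Monk Skin Tone Scale spans ten shades

def audit_coverage(annotations, min_share=0.05):
    """Return each shade's share of the dataset and whether it
    falls below a minimum representation threshold."""
    counts = Counter(annotations)
    total = sum(counts.values())
    report = {}
    for shade in MST_SHADES:
        share = counts.get(shade, 0) / total
        report[shade] = (share, share < min_share)
    return report

# Hypothetical shade annotations for a small evaluation set.
labels = [1, 2, 2, 3, 4, 4, 5, 6, 7, 8, 8, 9]
for shade, (share, flagged) in audit_coverage(labels).items():
    if flagged:
        print(f"MST shade {shade} under-represented: {share:.0%}")
```

On this toy set, shade 10 is flagged with zero coverage, signalling that more images from that part of the scale are needed before the set can claim to be representative.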
“In our research, we found that a lot of the time people feel they’re lumped into racial categories, but there’s all this heterogeneity with ethnic and racial categories,” says Monk.
Inclusive photography is a work in progress. We’ll continue to partner with experts, listen to feedback, and invest in tools and experiences that work for everyone. Because everyone deserves to be seen as they are.