Google Tensor, the custom-built chip that powers Pixel phones, was years in the making.
And now it helps to push the limits of helpfulness in a smartphone. Google Tensor enables new experiences that require state-of-the-art machine learning, including Assistant voice typing, Live Translate,[1] Motion Mode, and Face Unblur. And it allows Pixel to adapt to you, accommodating the different ways we use our phones.
But this secret ingredient of Pixel phones – what makes them more powerful and helpful[2] – didn’t happen overnight.
A few years ago, Google’s researchers across hardware, software, and machine learning came together to build a mobile chip that could realize their vision of what should be possible on Pixel smartphones. The result was Google Tensor.
While all advanced smartphones come equipped with powerful processors, Google Tensor stands out because it focuses on running complex artificial intelligence and machine learning workloads on the phone itself instead of sending data to the cloud, essentially bringing cloud computing to your smartphone. And because these algorithms run on-device rather than through a network and server, they work fast.[3]
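To make the on-device idea concrete, here is a minimal sketch, assuming a generic Android app, of how a developer might run a machine learning model entirely on the phone using TensorFlow Lite with the NNAPI delegate, which hands work to on-device accelerators. The model file, input format, and 1,000-class output are illustrative assumptions, not details of how Pixel's own features are implemented.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.io.File

// Illustrative sketch: run a bundled .tflite classifier fully on-device.
// The model file and the 1,000-class output shape are assumptions.
fun classifyOnDevice(modelFile: File, pixels: FloatArray): FloatArray {
    val delegate = NnApiDelegate()  // route ops to on-device accelerators via Android's Neural Networks API
    val interpreter = Interpreter(modelFile, Interpreter.Options().addDelegate(delegate))
    try {
        val input = arrayOf(pixels)              // a batch of one preprocessed image
        val output = arrayOf(FloatArray(1000))   // one score per class
        interpreter.run(input, output)           // inference runs on the phone; no network round trip
        return output[0]
    } finally {
        interpreter.close()
        delegate.close()
    }
}
```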
These AI and ML capabilities are becoming “more and more prominent and integrated into our daily lives,” says Nesra Yannier, who researches human-computer interactions at Carnegie Mellon University. As people become used to the benefits of personalized, intuitive technologies, she adds, “it’s becoming something they expect and feel is natural in technology products.”
Pixel phones use artificial intelligence and machine learning powered by Google Tensor to perform increasingly advanced tasks, like instantly translating messages and videos[1] without an internet connection,[3] or surfacing the content you need when you need it.
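For developers, Google's ML Kit library offers the same kind of fully on-device translation as a public API. The snippet below is a minimal sketch using ML Kit's Translation API rather than a look inside Live Translate itself; the German-to-English language pair and the sample sentence are arbitrary choices.

```kotlin
import com.google.mlkit.common.model.DownloadConditions
import com.google.mlkit.nl.translate.TranslateLanguage
import com.google.mlkit.nl.translate.Translation
import com.google.mlkit.nl.translate.TranslatorOptions

// Illustrative sketch: translate German to English with ML Kit's on-device models.
val options = TranslatorOptions.Builder()
    .setSourceLanguage(TranslateLanguage.GERMAN)
    .setTargetLanguage(TranslateLanguage.ENGLISH)
    .build()
val translator = Translation.getClient(options)

// The language model is downloaded once; translation itself then runs offline.
val conditions = DownloadConditions.Builder().requireWifi().build()
translator.downloadModelIfNeeded(conditions)
    .addOnSuccessListener {
        translator.translate("Wie geht es dir?")
            .addOnSuccessListener { translated -> println(translated) }  // "How are you?"
            .addOnFailureListener { e -> e.printStackTrace() }
    }
```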
And Pixel phones get better with time, becoming more personalized and helpful,[4] evolving from one-size-fits-all hardware into a device that adapts to you. Pixel can distinguish a tree from a person in a photo, or recognize that a picture is blurry and needs a touch-up.
“Your phone needs to understand your world – your context, your unique needs and preferences, how you speak, and what you care about,” said Rick Osterloh, SVP of Google Devices and Services, at the Pixel Fall 2021 Launch, when Pixel 6 phones were unveiled. The phones go a long way toward delivering on that promise:
Take better photos – The most visible example of machine learning in action on your phone is in your photos. ML powers advanced photography features that intuitively understand how to get the best picture for you. For example, Pixel 6 phones can detect faces in a shot and adjust the focus and brightness to perfect the exposure. With Real Tone, Pixel represents all people and skin tones accurately. If someone moves, Face Unblur can fuse multiple shots to remove the blur from their face,[5] and the Magic Eraser feature lets you remove unwanted items from a picture without losing what’s behind them.[6] Night Sight combines multiple shots into one frame to optimize for low-light scenes.
Communicate faster – Machine learning recognizes speech and turns it into text through Live Caption, in real time. Google Tensor also makes real-time language translation possible with Live Translate, either speech-to-text or directly to audio.[1]
Use your phone longer – Google Tensor runs more advanced, state-of-the-art ML models at lower power consumption than previous Pixel phones.[4] This helps save power so your battery lasts longer.
Protect your data – A separate core on the Google Tensor chip is set apart from the application processor, so sensitive tasks and controls run in an isolated environment, making them even more resilient to attack. The Tensor security core works with the next-gen Titan M2™ co-processor, adding another layer of protection for personal information like texts, photos and other sensitive data.[7] A short code sketch after this list shows how apps can take advantage of this kind of secure hardware.
Get more relevant suggestions – Google Tensor helps Pixel surface the content you need, when you need it. Simply glance at Pixel’s always-on display to find out when you need to leave for your next appointment, or see what song is playing with Now Playing song detection.
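As a rough illustration of what isolated security hardware enables, the sketch below uses Android's standard Keystore API to generate an encryption key inside the phone's StrongBox-backed secure hardware, so the key material never leaves that chip. The key alias and cipher parameters are arbitrary examples; this is the general Android API, not a description of the Tensor security core's internals.

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import javax.crypto.KeyGenerator

// Illustrative sketch: generate an AES key that lives inside StrongBox-backed
// secure hardware. The alias and parameters are arbitrary examples.
fun createHardwareBackedKey() {
    val spec = KeyGenParameterSpec.Builder(
        "example_key_alias",
        KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT
    )
        .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
        .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
        .setIsStrongBoxBacked(true)  // key material stays in the dedicated security chip
        .build()

    val keyGenerator = KeyGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore"
    )
    keyGenerator.init(spec)
    keyGenerator.generateKey()  // apps use the key through the Keystore; they never see the raw bytes
}
```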
Pixel phones learn the different ways you use them so they can provide the right information at the right time.
For instance, understanding the nuances of human speech is a challenge that’s unique to each person. It means understanding syntax, intent, and the context of your request, and knowing the terms and names that are important to you but potentially uncommon to others. It also calls for understanding your accent and dialect, isolating your voice when there’s background noise, and even hearing you correctly when your mouth is full.
“If these products are invisible and user-friendly and natural, they will become part of the everyday experience,” Yannier says. “That’s how technology should really be.”
1. Not available in all languages or countries. Not available on all media or apps. See g.co/pixel/livetranslate for more information.
2. Based on internal benchmark testing on pre-production devices.
3. Internet connection required during setup only.
4. Compared to Pixel 5. Based on internal CPU benchmark testing on pre-production devices.
5. Deblurring may not work on all photos or videos with faces.
6. Magic Eraser may not work on all image elements.
7. Compared to earlier models of Pixel phones.