Hear directly from Google’s product and engineering teams on the development of the latest Pixel devices.
Earbuds demand a lot: high-performance audio, Active Noise Cancellation, and increasing amounts of machine learning, all in a tiny package. Our biggest challenge was maximizing all of these without compromise. To do this, we first defined the ideal user experience, then created the algorithms and systems around that, which we then translated into the final product. It was a long journey, but customizing every part of the system was the best path to optimizing the product.
One of the more challenging components of Audio Magic Eraser is the sound separation engine, which pulls sounds apart into individual tracks so they can be processed individually and remixed together to make the final audio. Many early iterations of the system were either too slow or pushed the different processing cores so hard that they crashed. We had to pivot numerous times to find the right balance that we have today.
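That separate, process, remix flow can be sketched at a high level. The outline below is only illustrative: the band-splitting `separate_sources` is a toy stand-in for the trained separation model that runs on-device, and the function names and `gains` parameter are invented for the example.

```python
import numpy as np

def separate_sources(mix: np.ndarray, sr: int) -> dict[str, np.ndarray]:
    """Toy stand-in for the separation model: split the mix into low, mid,
    and high frequency bands with an FFT. The real engine uses a trained
    model to isolate semantic sources (speech, wind, music, and so on)."""
    spectrum = np.fft.rfft(mix)
    freqs = np.fft.rfftfreq(len(mix), d=1.0 / sr)
    bands = {
        "low": freqs < 300,
        "mid": (freqs >= 300) & (freqs < 3000),
        "high": freqs >= 3000,
    }
    return {
        name: np.fft.irfft(np.where(mask, spectrum, 0), n=len(mix))
        for name, mask in bands.items()
    }

def remix(mix: np.ndarray, sr: int, gains: dict[str, float]) -> np.ndarray:
    """Separate the mix into tracks, scale each track, and sum them back."""
    tracks = separate_sources(mix, sr)
    out = sum(gains.get(name, 1.0) * track for name, track in tracks.items())
    return np.clip(out, -1.0, 1.0)

# Example: attenuate the low band (for instance, traffic rumble) to 10%
# of its original level while keeping everything else untouched.
sr = 16_000
t = np.arange(sr) / sr
mix = 0.5 * np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)
cleaned = remix(mix, sr, {"low": 0.1})
```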
I listen to people who are excited about our devices, and I read reviews from people who are critical of our products or our competitors'. I look at development opportunities and think: what problem might we solve for our customers? What could we give them that would be exciting in their daily lives? All this helps me create a mental model of how technology is perceived today and of the opportunities that might exist in the future.
Working on the integration of Gemini Live for Pixel Buds Pro 2 offered a glimpse of what we might accomplish by combining this sort of technology with custom hardware. Google is in a unique position: we create advanced AI like Gemini, and we can deploy it to internally developed hardware that contains custom silicon. I believe we might be the only company in the world that can do all of those things simultaneously.
I use Gemini a lot. I often wonder, ‘Would this work?’ Then I test it. One of the most frequent use cases for me is revising my writing. I don’t always take its suggestions, but using Gemini as a critic-in-a-box can help me see another perspective. I'll give it some text, then ask it to make suggestions or read it through the eyes of a specific audience. This can help me see what I might have missed or taken for granted.
I’ve used Gemini as an empathy machine, which helps me put myself in someone else's shoes. It can be really interesting to get inside the head of a specific character and try to see things from their perspective. For example, I wrote ‘You're an English pirate in 1712, sailing in the Caribbean; what are your key concerns? Write in second person, in a style appropriate to a pirate.’
As part of a hackathon, I built a choose-your-own-adventure game that generated and illustrated a story you could play through. That was super fun, and I was surprised how well it worked. This used the Gemini and Imagen3 APIs, so it was a bunch of prompts chaining into prompts, but it was great validation of how Gemini can help create custom experiences.
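At a high level, that kind of prompt chaining looks something like the sketch below. The `generate_text` and `generate_image` functions are hypothetical wrappers standing in for whatever Gemini and Imagen3 API calls you use; only the chaining pattern itself is the point.

```python
# Hypothetical wrappers around the text and image generation APIs; the real
# calls depend on which SDK you use, so treat these two as placeholders.
def generate_text(prompt: str) -> str: ...
def generate_image(prompt: str) -> bytes: ...

def play_turn(story_so_far: str, player_choice: str) -> tuple[str, bytes, list[str]]:
    """One turn of a choose-your-own-adventure loop built by chaining prompts."""
    # 1. Continue the story from the player's latest choice.
    scene = generate_text(
        "Continue this adventure in three or four sentences.\n"
        f"Story so far: {story_so_far}\n"
        f"The player chose: {player_choice}"
    )
    # 2. Feed that output into a second prompt that writes an illustration prompt.
    image_prompt = generate_text(
        f"Write a one-sentence illustration prompt for this scene: {scene}"
    )
    illustration = generate_image(image_prompt)
    # 3. Chain again to generate the next set of choices for the player.
    choices = generate_text(
        f"List three short choices for what the player could do next: {scene}"
    ).splitlines()
    return scene, illustration, choices
```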
I'm personally super excited about the launch of Imagen3, our highest quality text-to-image model for generating incredibly detailed images. It does a great job of adhering to your prompt, so you get pretty much what you ask for. I can't wait to see what people create when we get it into the hands of users.
Michael Specht: There is no one setting that results in ‘the best picture.’ The Pixel Camera can produce amazing photos with no extra settings. Just point and shoot. However, our goal is to let users add their own touch. With Pixel 8 Pro we added Pro Controls, which let users adjust even more settings and achieve a personal aesthetic. Another of our favorite controls is brightness. This lets you take a scene like a cat lying in a ray of sunlight, then create a dramatic image that isolates the cat in a sea of black. This almost gives the illusion that the cat had a full-on studio session with a black backdrop and strobes!
Isaac Reynolds: For us, the challenge is testing. Every year we test every scene, from sunlit to backlit to nighttime, on every camera, on every device. It’s how we ensure you get great photos, even for those point-and-shoot moments. And new innovations are always helping with this. Just this year we used AI to detect faces and make them look more natural and consistent, especially when zoomed in on Pixel 8. But it took a lot of testing to make sure you always look like you.
Navin Sarma: We see AI as a means to help users get better results and become better photographers. For example, AI can suggest a mode that might be useful for framing a specific scene. Or it can perform an edit that wasn’t possible during capture, such as unblurring or removing distractions. We’re also looking to use AI to enable creativity for our users so they can do things like select a one-tap preset to make a photo look more compelling. AI remains an important area for us as we help our users get the most out of their photography journey.
Isaac Reynolds: Fun fact! The folks who build roadmaps and features and go to launch events have remained overwhelmingly the same since the first Pixel. We’ve of course added many new friends and leaders, but there is a core group of longtime folks. When you look at a Pixel Camera, you’re not just looking at Google’s interpretation of a camera, but also the interpretation built and shared by the folks who have been here for so long. I hope you love it!
I think my favorite will always be the generation we’ve just launched! In many ways Pixel is like a concentrated version of Google that you can hold in your hands. So each version is developed to bring the latest advances in Google’s AI technology, along with software and hardware innovation, to make your life easier and better. The newest capabilities for our users in Pixel 8 and Pixel 8 Pro with Tensor G3 are a great example of this in action. When I reflect on our journey so far with Pixel, one generation that stands out is Pixel 6, when we first introduced Tensor, our mobile system on a chip (SoC). It was an important milestone that helped build the foundation of where we’re heading with Pixel and how we can continue to deliver Google’s cutting-edge technology for our users. From the start, we’ve envisioned Pixel as the AI-centric mobile computer. Tensor helps unlock a lot of important capabilities as part of this vision.
I love using the new Best Take feature in Pixel Camera — it might be the only time I’ve been able to get my whole family smiling in one photo! Magic Eraser has always been one of my favorite things to show people who haven’t tried it out yet, because the reactions are so much fun to see and now I get similar reactions with the new Magic Editor. We also just announced some terrific new computational audio capabilities with Audio Magic Eraser on Pixel 8, which can help reduce distracting sounds in your videos.
I really appreciate some of our newest features that make information more accessible, especially when things are busy. For example, with Pixel 8, Google Assistant can summarize articles for me — or even translate and read them out loud, which is great for when I’m on the go.
There are some terrific features that I love to use on my Pixel regularly — like Assistant voice typing. It lets you use your voice to type, send messages or insert emojis, on average 2.5X faster than typing on the keyboard. Using our Pixel products together as a portfolio can also really enhance your experience. For example, we just announced our biggest update to Pixel Buds Pro, including better audio quality — and the ability to automatically pause your music if you start talking to someone. Plus, you can use your phone to find your Pixel Buds if you misplace them. Pixel Watch 2 includes the best heart rate tracking on any of our wearable devices, and there are some great new coaching features to help you stay on pace during a workout. And as Google’s AI gets better, our portfolio of devices keeps getting better too.
In general I’ve been fascinated by working on consumer electronics products like phones because they are such a complicated and important piece of technology in people’s daily lives. I’ve worked on other things but have always come back to this space, because it feels like there is so much potential to help people. I love working on Pixel because it brings together many different elements of Google’s innovation into one package for our users. I can’t wait for you to see how we’re building the Pixel portfolio with Google’s AI advances. We’re so grateful for the role that our Pixel Superfans play in this journey — thank you again for all you do to inspire us and the feedback you provide along the way.
Caity: I’m especially excited about bringing together Fitbit and Google’s machine learning expertise. As you know, having an accurate heart rate throughout your day – not just when you’re exercising – gives you a really good insight into your health and fitness.1
The watch brings the best of Google to your wrist in an LTE2-connected smartwatch, which means you can track your runs,3 buy your groceries with Google Pay,4 and get turn-by-turn directions.2 We’ve also added an interactive map mode: you can actually see where you are on the map right on your wrist, and pan around to see which turns you need to take.
Caity: Well, we designed this watch during a pandemic. Engineers and designers really like to be in the room together when designing beautiful hardware products, but we didn’t have that opportunity. So we had a lot of meetings over video conferences where we got very good at drawing pictures and then holding them up to the camera. We even had to pass parts around to each other’s homes.
Also, since it’s something you wear on your wrist, you want to make sure it’s comfortable. So we sent lots of watches and prototypes out to our testers at their homes to wear. We’re all very glad that we get to be back in person again.
DeCarlos: Readiness and HRV tracking. There’s a lot of focus on providing data and motivating people to do more. But we know that if we want to see improvements, rest is just as important. Features like Readiness and HRV make it possible to know how to optimize your routine: when to turn up the dial, when to rest, and when to make changes in your daily life.
DeCarlos: Technology has always been a way to solve problems and to help people achieve goals. In the space of wearables we’re excited about creating products that help people live healthier and more productive lives. As technology continues to improve, we’re able to do things that were once only possible in clinical settings and sports science labs. We’re bringing those right to your wrist.
Xinxing: That varies based on how many TPU (Tensor Processing Unit) cores we use and how large the model is. The Google Translate team constantly improves the models’ quality by finding more and higher-quality training data, tuning the model parameters, and experimenting with new technologies.
Xinxing: There are a lot of translations on the open web. The Google Translate team can mine the web data to find those translations and use them to train the machine translation model. More and more data are created every day, and we keep improving how effectively we can find useful data online to help make improvements.
Vince: Everyone on the team, from engineer to designer, speaks more than one language. Our conversation designer even speaks over a dozen, and tests everything himself.
Vince: Recently, I’ve been touched hearing how Live Translate helps those fleeing crises and brings people together, as we’ve seen in Ukraine.
Alan: We live in an exciting age of technology, where each decade brings bigger changes than the one before. We look forward to what the next few years will bring for Live Translate and the translation landscape!
Nidhi: They’re the first Pixel Buds with Active Noise Cancellation. To make ANC effective, you need to process outside sound before it reaches your eardrum and cancel it with extremely low latency. To achieve this we’re using a custom 6-core audio chip. Additionally, how well the eartip seals in your ear canal can make a big difference in how much noise gets canceled. Our Silent Seal technology compensates for audio leakage, helping to maximize how much noise is canceled and giving you a blank canvas with no distractions, so you can zone into your music and silence everything else.
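The cancellation principle itself, measuring outside sound and playing back an inverted copy so the two waves cancel at the eardrum, can be illustrated with a toy example. This is a simplified feed-forward sketch with an assumed `leak_gain` standing in for how much sound gets past the eartip seal; it is not the adaptive processing that runs on the custom audio chip.

```python
import numpy as np

def anti_noise(outside_mic: np.ndarray, leak_gain: float) -> np.ndarray:
    """Simplified feed-forward ANC: estimate the sound that leaks past the
    eartip and play back its phase-inverted copy so the two waves cancel."""
    leaked_estimate = leak_gain * outside_mic
    return -leaked_estimate

# Example: a 200 Hz hum sampled at 48 kHz, with half of it leaking past the seal.
sr = 48_000
t = np.arange(sr) / sr
hum = 0.3 * np.sin(2 * np.pi * 200 * t)
leak = 0.5                                   # assumed seal leakage
residual = leak * hum + anti_noise(hum, leak_gain=leak)
print(np.max(np.abs(residual)))              # ~0.0 when the estimate matches the leak
```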
Nidhi: I’m excited about being able to leave my window open for a breeze this summer while canceling out the traffic outside.
Laura: I’m looking forward to time to concentrate in loud surroundings. There are so many distractions in a day, so I’m looking forward to finding balance. While I’m tuning out, I’ll let Google help me know when to tune back in and switch to Transparency mode for quick conversations.
Laura: Testing and refining details, including optimizing the snout and cap angles, was a major focus to ensure sound quality and comfort. The team was determined to get it right, so there must have been over 1,000 prototypes throughout development.
Laura: People don’t realize that your ear is as unique as a fingerprint, and the inner architecture of your left ear can differ from your right ear. Every ear is different, so we recommend trying different sized eartips in each ear until you get the right fit. Pixel Buds Pro offer an eartip seal test – we’ll play music for a few seconds and then recommend which eartip size might work best for you.
Nidhi: I’m looking forward to a future where mundane tasks are sped up and automated, so that I can have more time to do the things I love with the people I love.
Laura: Technology is going to get even more helpful, accessible, and inclusive as time goes by, and I can’t wait to see the ease and joy that brings to currently underserved communities.
1. Works with most phones running Android 8.0 or newer. Requires a Google Account and internet access. Paid subscription required for some features. See g.co/pixelwatch/specs for technical and device specifications.
2. Data rates may apply. Requires compatible 4G LTE wireless service plan (sold separately). Google Pixel Watch and paired phone must use the same carrier network. Contact carrier for full details. See g.co/pixelwatch/networkinfo for more information.
3. Some features may require Fitbit account and mobile app.
4. Google Pay is not available in all countries or languages. Data rates may apply.