Podcasts - Season 4, Episode 4
Pixel Buds Pro: Elevating your audio
Discover new features of Pixel Buds Pro, powered by Google AI to deliver clearer, crisper, and more natural audio¹ – completely hands-free
AI doing sound work

This episode of the Made by Google Podcast walks us through the exciting new features of Pixel Buds Pro, powered by Google AI to deliver intuitive listening. Google product manager Pol Peiffer joins the conversation to talk about the improvements.

Hands-free helpfulness 

Whether it’s juggling car keys, coffee, luggage, you name it, we often find ourselves wishing for an extra set of hands. Enter Conversation Detection. It identifies when you start speaking and responds by pausing your music and switching to Transparency mode, so you can focus on what’s in front of you – all without lifting a finger, much less an extra appendage. Pol explains how Conversation Detection uses AI to determine what’s happening in the world around you. 

Here’s to your ears

Pol also talks about a renewed focus on hearing wellness and helping you protect your hearing long term while enjoying the music you love now. With this update, Pixel Buds Pro learn your listening behaviors and provide suggestions to maintain hearing wellness over time. 

Tune in to the full episode on the Made by Google Podcast today – maybe even listen with your Pixel Buds Pro – to learn more about how new features are elevating your audio.

Transcript

Welcome to the Made by Google podcast, where we meet the people who work on the Google products you love. Here’s your host, Rachid Finge…

Today, we’re talking to Pol Peiffer who works on things like audio wearables!

Rachid: Pol, great to have you on the Made by Google podcast. Tell me what it's like to work on the smallest Pixel product we currently have.

Pol (00:18.078) Hey, thanks. Thanks for having me. Yeah, it's great. I'm Pol. I'm a product manager here at Google. I work on our audio wearables, like the Pixel Buds Pro that we launched last year. And yeah, the smallest product, as you mentioned, is also maybe one of the most challenging ones: bringing all of that small hardware and all the software and all the AI into a small form factor that sits in your ear. So it keeps us busy, but it's exciting.

Rachid (00:53.255) Amazing. So you're from Luxembourg, which is down the road from where I am, near Amsterdam, relatively speaking at least. But you're in the U.S. now. So how did you end up there?

Pol (01:02.034) I actually started as an associate product manager intern back in Zurich, working on the Assistant back in the day, and seeing firsthand what Google was up to, I was just sure I had to come back.

And so I joined the associate product manager program back in Zurich and spent a year working on travel products like Google Flights and putting ticket information on Google Search and Maps. And then I thought I'd see what it's like at the mothership, so I moved over to the US and joined the Pixel teams, and I've been working on our audio wearables ever since.

Rachid (01:57.72) So what is it that you do on Pixel Buds Pro? And does that mean you're an audiophile as well?

Pol (02:03.358) Yeah, so I'm a product manager on the team. Over the last two years, I've covered a lot of stuff across our hardware and software, including launching the Pixel Buds Pro last year, and I've been working on a lot of exciting new features ever since that we can talk about more in a bit. Am I an audiophile? I totally am, yes. I have a bunch of cans and headphones. I've played the violin since I was six years old, so I've always been into audio and music. It's a great fit that way.

Rachid (02:39.635) Now, the Pixel Buds Pro got a huge software update this year. I actually don't think I've seen such a comprehensive update for a hearable ever. Could you sort of summarize, or tell us what you're most excited about with this huge update?

Pol (02:53.514) Yeah. I think really for the Pixel portfolio in general, we want our customers to buy a device that only gets better over time. And that is just as true for our phones as it is for earbuds. We had a great launch last year with the Pixel Buds Pro, really our most premium Pro product with great audio and the first time we launched a noise-canceling product. And we've been really happy with how it's landed and the reception we've been having.

And so for this year, we were focused on how we can be even more helpful, bringing more of Google's AI out there to the market. And I think we've done that across a few things. We're improving call quality, and we can go into more detail there, to really make them the best out there in the market for calls.

We're adding a cool new feature that lets you leave your Pixel Buds in for even more of the day with Conversation Detection. We're improving things for gamers by cutting latency by up to half. And we're also adding features around your hearing health and preserving that hearing wellness.

And finally, we're launching two new beautiful colors that will go well with our new Pixel 8 and Pixel 8 Pro phones.

Rachid (04:50.279) So Pol, let's take all the features you mentioned one by one. I thought call quality was already great on Pixel Buds Pro. So what is it you did to make it even better this time?

Pol (05:13.674) Yeah, I think we saw a really good reception of our call quality even at our first launch. We were already leveraging a lot of ML, that is, machine learning, on the earbuds to suppress environmental noises so that you come across really clearly to whoever you're talking to. What we're doing this year to make that even better is we're bringing Clear Calling onto Pixel Buds Pro.

And then the other feature we're adding that we're super excited about is super wideband calling. So I'm gonna get a bit technical here, but with Bluetooth headphones so far, what you've generally had is wideband calling. And what that means is: the human hearing range goes from roughly zero to 20,000 hertz, and Bluetooth calls could only transmit up to about 7,000. So that's wideband speech.

Rachid (06:19.003) Let's try it.

Pol (06:39.978) And that is why, if you're calling on Bluetooth headphones, you can sometimes sound a little bit less natural than you would in person. Super wideband calls double that bandwidth, so you will now get 14,000 hertz. And that will make you just sound fuller and more natural. Voices with higher frequencies, like female voices, for example, will sound a lot more natural because they will go into that 7,000 to 14,000 hertz range.

And that has been a year-long project, very close collaboration with Pixel, with Android, and also with the Bluetooth SIG that we work closely with to push this back into the Bluetooth standard.
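
For readers who want to make those bandwidth numbers concrete, here is a minimal sketch. It assumes the usual telephony convention that wideband speech is sampled at 16 kHz and super wideband at 32 kHz (the episode quotes only the resulting 7,000 and 14,000 hertz bands), and simply shows how the Nyquist limit ties sample rate to the audio bandwidth a call can carry:

```python
# Minimal sketch: the Nyquist limit means a codec can represent frequencies
# up to half its sample rate. Sample rates below follow common telephony
# conventions and are assumptions; the usable bands are the ones quoted
# in the episode.

CODECS = [
    # (name, assumed sample rate in Hz, approx. usable speech band in Hz)
    ("wideband (typical Bluetooth call)", 16_000, 7_000),
    ("super wideband (Pixel Buds Pro with Pixel 8)", 32_000, 14_000),
]

for name, sample_rate, usable_band in CODECS:
    nyquist_ceiling = sample_rate // 2
    print(f"{name}: sampled at {sample_rate} Hz "
          f"-> theoretical ceiling {nyquist_ceiling} Hz, usable band ~{usable_band} Hz")
```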

Rachid (07:30.855) Amazing. And then that improvement, all you need is just a Pixel Buds Pro and a Pixel phone, and you just have better call quality.

Pol (07:38.102) That's right. We will start rolling this out on Pixel 8 and Pixel 8 Pro.

Rachid (07:43.639) Amazing. Okay. Then something that sounds really useful. And I just want to make sure I get it right. Conversation detection. What does it do? It detects my conversation and then once it detects it, what does it do?

Pol (07:56.862) Yeah, I think we've all been in these moments where you're wearing wireless earbuds and you're walking around the office and somebody comes up to you and says hi, but you're carrying a cup of coffee and your laptop in the other hand, and things get a little awkward because you try to take out your earbuds. A friend actually just told me that they tried to do this and dropped their earbud into the coffee, which is not what you want to happen. So this feature tries to address that.

It lets you leave your earbuds in to have short conversations, be it with a colleague, be it ordering a coffee, saying hi to a neighbor while you're carrying your groceries. And what's going on here under the hood is that we have an ML model that runs all the time and uses the microphones and the IMUs, which are the accelerometers on the earbuds, to detect when you're speaking.

And so it will listen for both your voice coming through the microphones, but also the bone conduction going through your head, shaking that accelerometer just that tiny little bit.
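
To picture how the microphone and accelerometer signals might be combined, here is a heavily simplified, hypothetical sketch. The real feature uses a trained ML model, as Pol describes; the energy thresholds and the rule that both sensors must agree are illustrative assumptions only:

```python
import numpy as np

# Toy own-voice detector: flag speech only when BOTH the microphone and the
# in-ear accelerometer (bone conduction) carry energy. The real feature uses
# a trained ML model; these thresholds are invented for illustration.

def frame_energy(frame: np.ndarray) -> float:
    """Mean squared amplitude of one short analysis frame."""
    return float(np.mean(frame.astype(np.float64) ** 2))

def is_own_voice(mic_frame: np.ndarray,
                 accel_frame: np.ndarray,
                 mic_threshold: float = 1e-3,
                 accel_threshold: float = 1e-5) -> bool:
    mic_active = frame_energy(mic_frame) > mic_threshold
    accel_active = frame_energy(accel_frame) > accel_threshold
    # A nearby talker excites the microphone but barely moves the wearer's
    # accelerometer, so requiring both keeps other voices from triggering it.
    return mic_active and accel_active
```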

Rachid (09:00.571) Right, because when you're speaking, your head sort of emits micro-vibrations, I suppose, that are then picked up by the accelerometer. Okay, I see.

Pol (09:08.502) That's right, yes. So we can really make sure that it's you that's speaking and not somebody next to you, right? We wanna listen to those tiny vibrations that are really your head shaking from your voice doing that. We then stop your music and switch you into transparency mode so you can have your conversation. And then what's really exciting is our model doesn't just listen for your own voice, but it also listens for voices around you.

And so when you're in transparency mode and you're having that conversation, you can have that very natural back and forth. There's no fixed timer where, you know, after X seconds, it will revert. And so when you're done with that conversation and you stop talking and your counterpart stops talking, we'll take you back into your music, back into noise cancellation mode or whatever mode you were in before, and start playback again.
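
A rough way to picture the flow Pol describes is a two-state machine: music plays until the wearer's own voice is detected, transparency stays on while either party keeps talking, and playback resumes once the conversation goes quiet. The short silence grace period in this sketch is an assumption for illustration; the episode only says there is no fixed conversation timer and that it reverts when both sides stop talking:

```python
from dataclasses import dataclass

# Toy state machine mirroring the Conversation Detection flow described in
# the episode. The 3-second silence grace period is an invented placeholder.

MUSIC, CONVERSATION = "music", "conversation"

@dataclass
class ConversationDetector:
    silence_grace_s: float = 3.0
    state: str = MUSIC
    silent_for_s: float = 0.0

    def update(self, own_voice: bool, other_voice: bool, dt_s: float) -> str:
        if self.state == MUSIC:
            if own_voice:                      # wearer starts speaking
                self.state = CONVERSATION      # pause music, enable transparency
                self.silent_for_s = 0.0
        else:                                  # in a conversation
            if own_voice or other_voice:       # either side is still talking
                self.silent_for_s = 0.0
            else:
                self.silent_for_s += dt_s
                if self.silent_for_s >= self.silence_grace_s:
                    self.state = MUSIC         # restore previous mode, resume playback
        return self.state
```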

Rachid (10:03.587) It's a good thing you mentioned, of course. It's not only about you speaking, you need to keep that transparency mode on while the other person is speaking as well. So how do you pull that off? How do you make sure that you understand who the other person is?

Pol (10:19.166) Yeah, I think that's really kudos to our engineering teams here at Google. There's a lot of AI and machine learning magic that goes on in the background. We've done a lot of data collection across various people with various voices in various environments. You can imagine as a feature that is always on, there are many situations that you could be in that could sound like speech, where we obviously wouldn't want to interrupt your music. And so that's really what the feature is.

A lot of last year has been about tuning this so that you can leave it on all the time and it will trigger when you need it and not trigger when you don't want it.

Rachid (11:58.619) So when it comes to conversation detection, I'm curious how you sort of test this. I mean, I guess you could put in a Pixel Buds Pro with the latest software and you go around grocery shopping, you speak to people in the Google office. What's a way to maybe scale that sort of testing?

Pol (12:16.658) Yeah, so we do both, right? So we have very repeatable setups in our lab where we can perfectly control through speakers the environments that you're in. And we have various people with very different voices. You can imagine a female voice will shake the accelerometer less than a male voice that is very rumbly. And so we have a lot of people come in to do recordings and test the software.

And then we also do the second, which is the fun part. We go out in the world and, you know, try it out in various scenarios, note where it works, note where it doesn't, collect data, and then improve the models there. So it's really an iterative process. A story that I can tell is that, you know, when we first started building this, a lot of our recordings were from various places, and then we trained the model in the lab.

And what we noticed is that in the real world, you actually have a lot of wind. And wind can shake these microphones quite harshly, especially if you're in San Francisco, where it tends to get quite windy at times. And so that's a real example where we had to collect more data. We drove out to the windiest spots we could find to do recordings, and the model has really improved dramatically, to where wind doesn't trigger it at all at this point.

Rachid (13:39.471) That's amazing. That's definitely something you wouldn't have learned had you only tested this in a lab situation, for sure. And something I learned from an earlier conversation on the pod is that when you start working on a feature like this, you're not sure if it will actually come out, because of course it needs to be good enough for all of our users to use. So I'm wondering, do you remember that first time when you were using this feature?

Pol (14:13.906) Yeah, I think the first time I actually had it running on a set of Pixel Buds Pro, we've had it running on other kinds of demo devices before, was probably at a Google Cafe, trying to order that first coffee end to end. I think that is pretty exciting. It's quite a magical thing that these earbuds just feel like they did before, and then you have this new feature on there, and you start speaking and something happens.

And so I remember that clearly. And yeah, you're absolutely right. In the beginning, especially with these machine learning projects, you never really know how good you can make the quality or where it's headed, and you run into an issue like this wind thing and you're left wondering, can we solve this, is this a showstopper? So it keeps you on your toes for sure. But we're very happy with where we landed now.

Rachid (15:04.711) Amazing. So in that list of updates, you also mentioned latency, which I guess, maybe we should explain a little bit what it is, but it's probably like the biggest enemy of gamers. So tell us a little bit about what is latency and why did you do work to cut down on latency? Why is that important?

Pol (15:23.658) Yeah, latency is a tricky problem for wireless earbuds, right? So by cutting the cord and giving you all the benefits that true wireless earbuds have, you can move around, you can leave your phone in a different room, et cetera. What ends up happening is, you know, all of this information now needs to travel via Bluetooth through the air to get to you. And so for gaming in particular, obviously you want that latency to be as short as possible.

When I press something on my screen and that has an audio feedback, I want to hear that as immediately as possible to feel really immersed. And it can even be related to performance, right? If there's a certain rhythm that I need to hit and the audio in my ear is off, obviously I won't do as well at my game. And so it's been a long-standing problem for wireless earbuds, and Pixel Buds Pro were, we believe, already really good at launch.

But we've spent this year really refining and working with the Pixel teams and the Android teams at Google to cut out all the little pieces in this chain, from your finger hitting the screen and something happening in the game, to that going through all of the engines that eventually create the sound and send it up to Pixel Buds, to just cut whatever we can. And so now we're cutting latency by up to half for games that are supported, which we're really excited about, and I think it will be an amazing improvement for gamers.

Rachid (16:51.835) So it sounds like the people who were working on this, because we're talking milliseconds, right, maybe found two milliseconds here, three milliseconds there. It seems like quite a tough job to make these improvements that, in the end, add up to something that's really useful.

Pol (17:10.01) Oh yeah, and again, kudos to our engineering teams here. I think it's really been about finding, as you say, 10 milliseconds here, 10 there, optimizing some stuff in the core audio framework, et cetera, to shave off whatever we can. And it takes patience and it takes real diligence to dig in deep and optimize wherever you can.
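
To see why those small savings matter, here is an illustrative tap-to-sound latency budget. Every number below is invented purely for the sake of the example; the episode only says the combined work roughly halves latency for supported games:

```python
# Hypothetical latency budget for a tap-to-sound path. All figures are
# made up for illustration; the episode states only that the combined
# savings cut latency by up to half for supported games.

baseline_ms = {
    "touch input and game logic":    30,
    "audio mixing and framework":    25,
    "Bluetooth encode and transmit": 60,
    "earbud decode and playback":    25,
}

optimized_ms = {
    "touch input and game logic":    20,
    "audio mixing and framework":    10,
    "Bluetooth encode and transmit": 30,
    "earbud decode and playback":    10,
}

before, after = sum(baseline_ms.values()), sum(optimized_ms.values())
print(f"baseline: {before} ms, optimized: {after} ms "
      f"({100 * (before - after) / before:.0f}% lower)")
```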

Rachid (17:35.759) The final thing you mentioned for the update for the Pixel Buds Pro is hearing health. That wasn't the first thing I was thinking about. So how did this come about?

Pol (17:45.65) Yeah, it's something that we at Google are pretty passionate about in general. How can these wearable devices help you with maintaining and maybe even improving your health? You see this across our Pixel Watch and Fitbit and phone features. And we came across a statistic that actually 12% of adolescents aged 6 to 19 and 17% of adults aged 20 to 69 have some sort of permanent hearing damage. And that comes from many things.

We all like to go to very loud concerts, but it also comes from listening to music too loudly for too long a time. And really what we found is that we're not super well educated as to how you actually do damage here, because it's not just a function of listening super loudly for a short moment; it can also be listening at what a lot of people would consider moderately loud for prolonged periods of time. One long flight where you're just cranking up the volume could already do some permanent damage here. And so what we wanted to do is build a feature that can educate people about their listening habits, and then give them a proactive warning once they're getting close to what is generally considered the threshold that is good for you.

And so that's what we did. The hearing wellness feature shows you your listening habits, how loudly you have been listening to music over the last 24 hours and over the past week. And it will also send you a little notification once you're getting close to the point where you maybe should consider lowering your volume.
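
As background for how level and duration trade off, here is a small sketch using the equal-energy rule of thumb that is common in hearing-safety guidance (every 3 dB increase halves the recommended listening time). The 80 dB for 40 hours per week reference point mirrors widely cited safe-listening guidance; the episode does not say which thresholds Pixel Buds Pro actually use:

```python
# Rough safe-listening-time estimate using the equal-energy rule of thumb:
# every 3 dB increase halves the recommended duration. The 80 dB / 40 hours
# per week reference is a commonly cited guideline, not a value from the
# episode or from Pixel Buds Pro themselves.

REFERENCE_LEVEL_DB = 80.0
REFERENCE_HOURS_PER_WEEK = 40.0

def recommended_hours_per_week(level_db: float) -> float:
    return REFERENCE_HOURS_PER_WEEK / (2 ** ((level_db - REFERENCE_LEVEL_DB) / 3.0))

for level in (80, 86, 92, 98):
    print(f"{level} dB -> roughly {recommended_hours_per_week(level):.1f} hours/week "
          f"before reaching the reference dose")
```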

Rachid (19:41.191) That's interesting, because I think up until now we had a feature where, if you wanted to go above a certain volume threshold, it would say, are you sure you want to do this? But now it will actually measure the volume that's sort of in your ear and then notify you once it's been a little bit too long, maybe.

Pol (19:58.454) That's right. And we're doing calculations on the earbuds to actually measure how loudly, how strongly, we're driving the earbud speakers. There's a lot of variation on the content side, depending on how things are mixed, so you can't just look at, you know, is my phone volume set at 80% or 85%. For some content that might be completely fine because the content is mixed quite quietly; for other content that might be really bad. And so we had to put this on the earbuds to really track what is hitting your eardrum and give you an accurate result.
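
To illustrate why the volume slider alone is not enough, here is a hypothetical sketch that estimates output level from the content actually being played plus the volume gain. The full-scale calibration value is a made-up placeholder; real earbuds rely on per-device acoustic calibration rather than a single constant:

```python
import numpy as np

# Toy estimate of playback level from the signal itself, not the volume
# slider alone. FULL_SCALE_SPL_DB is an invented calibration constant
# standing in for real per-device acoustic calibration.

FULL_SCALE_SPL_DB = 100.0   # assumed level of a full-scale signal at this volume

def estimated_level_db(audio: np.ndarray, volume_gain_db: float) -> float:
    rms = np.sqrt(np.mean(audio.astype(np.float64) ** 2))
    content_dbfs = 20 * np.log10(max(rms, 1e-12))   # how loud the mix itself is
    return FULL_SCALE_SPL_DB + content_dbfs + volume_gain_db

# Same volume setting, very different results depending on how the content is mixed.
quiet_mix = 0.05 * np.random.randn(48_000)                      # quietly mastered track
loud_mix = np.clip(0.8 * np.random.randn(48_000), -1.0, 1.0)    # loud, compressed track

for name, track in (("quiet mix", quiet_mix), ("loud mix", loud_mix)):
    print(f"{name}: ~{estimated_level_db(track, volume_gain_db=-6.0):.0f} dB (estimated)")
```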

Rachid (20:35.591) Pol, we always ask our guests for a top tip. So let's say I have a pair of Pixel Buds Pro, I have the latest software on it. What would you recommend I try and do with Pixel Buds Pro?

Pol (20:46.526) Yeah, I know you're a Googler, so you're probably using a Chromebook. The one feature I haven't mentioned yet is that we're bringing the companion app that you now have on Pixel phones and Android to Chrome OS as well. Starting from October 12, 2023, you'll be able to go to mypixelbuds.google.com, where you can install this web app, and you'll be able to get updates and change settings there directly from your Chromebook without having to pull out your phone to change any settings that you might want.

Rachid (20:58.384) Mm-hmm.

Rachid (21:16.839) Amazing. That's a great tip for all those Chromebook users. So if you already have Pixel Buds Pro, make sure you're on the latest software for all these great new features that Pol talked about. If you don't have a pair of Pixel Buds Pro, well, why not consider them and go to GoogleStore.com to check them out. Pol, thank you so much for your time and thanks for joining the Made by Google podcast.

Pol (21:37.666) Thank you for having me.

Rachid (24:21.039) The Pixel Buds Pro got a massive software update, adding a lot of fantastic new features. Let's hear all about it from Pol Peiffer, who is a product manager for audio wearables.

  1. Compared to Pixel Buds Pro with earlier software. Call quality depends on signal strength, environment, network, and many other factors. Actual results may vary.