Podcasts - Season 1, Episode 4
Chip Chat: All About Tensor
Explore Tensor, the silicon chip powering Google Pixel devices, in this episode of the Made by Google Podcast
Powering Pixel 

It’s what’s on the inside that counts – and when it comes to the silicon powerhouse within every Pixel, that couldn’t be more true. This episode of the Made by Google Podcast features Monika Gupta, senior director of product management for the Google Silicon team, as she walks us through the flagship piece of the Pixel portfolio: the Tensor G2 chip.

Eyes on the horizon

In partnership with Google’s AI scientists and researchers, Monika focuses on making incredibly complex technology function and perform on a smartphone. Just as a human brain manages every part of the body, the Tensor mobile SoC (system on chip) enables your phone to process and share information instantaneously.

The Tensor silicon chips, unique to the Pixel portfolio, are designed to fully incorporate Google’s innovative AI expertise. As Monika says, “We know where machine learning models are today, but we also know where they’re heading in five years.” Google builds Tensor’s architecture to keep pace with machine learning research. This visionary approach to Tensor development ensures Google Silicon can bring every ounce of Google innovation to Pixel customers.

Innovating for the future

Google Cloud researchers started Tensor’s smartphone journey while working with devices that operated at 1000x higher power capacity than what a handheld device can provide. Over the years, Google has harnessed that power and reshaped it to function within a smartphone. Amazing progress, but what’s next? For Monika, the future lies in collaborating across Google teams to develop ambient computing capabilities – or, in layman’s terms, technology that makes everyday tasks easier for users. From photography features like Photo Unblur¹ to options like Direct My Call², developers are continuing to make Tensor the most versatile and helpful chip on the market today.

Tune in to the Made by Google Podcast to hear more from Monika Gupta and explore Tensor’s wide range of capabilities and functions. 

Transcript

[Main Intro by Rachid]

[00:00:01] Rachid Finge Hey, I'm Rachid Finge, and you're listening to the Made by Google podcast. I'm here to introduce you to some amazing people who work on Google's products. In previous episodes, you met Isaac, for example, who works on Pixel's amazing camera system. We also heard from Jessie and Rae who work on keeping your devices secure. And last week, you met Isabelle, who designs many of our products. I learned so much from Isabelle about colors, materials and finishes. And now I also know why fast wifi is even faster when it's beautiful. If you don't know what I mean by that, then feel free to listen to last week's episode. While you're at it, why not subscribe to the Made by Google podcast? Because we still have great guests to come until the end of the year. Now, today we have a cracker for you because we're going truly deep inside in a literal sense, pretty much as we get to find out more about Tensor, the chip inside Pixel that powers pretty much everything that it's famous for. Tensor is a chip that Google designs itself, and I wanted to find out more about why we're doing that. So we're fortunate to have somebody with us today who knows Tensor inside out. It is our senior director of product management for Silicon, Monika Gupta. Monika, welcome to the Made by Google Podcast. It's great to have you.

[00:01:22] Monika Gupta Hi Rachid. Thanks for having me and super happy to be here.

[00:01:25] Rachid Finge Great. So you're a product manager for Google Silicon Teams, and I know that for people working in Silicon Valley, they probably know what silicon is. But for listeners that don't exactly understand that, what is silicon?

[00:01:37] Monika Gupta Silicon or semiconductors are chips. And when I first started working on chips, I did have to tell people it's not like Doritos chips, but semiconductor chips. But yeah, I lead product management for Google for our custom silicon in our consumer hardware devices, our flagship product being the Tensor mobile SoC.

[Meet the Googler]

[00:01:57] Rachid Finge Today's guest is Monika Gupta, who knows everything about our Tensor chip, which you could call the heart and soul of Pixel. Monika is a senior director of product management and, as she'll tell you, there are very important reasons for Google to have its own chip. In short, without Tensor, Pixel wouldn't be what it is today, or more specifically, it couldn't do what it can today. Together with Google's A.I. scientists and researchers, Monika's focus is on making amazingly complicated technology work on a phone. Decisions that she takes today affect our products years from now. You'll also find out how Tensor got its name and why Google doesn't look at benchmarking quite the same way as many others. Talking about computer chips usually isn't easy, but Monika made it a breeze. I hope you'll enjoy our conversation. So, Monika, what is it that you do then as a product manager for silicon teams?

[00:02:55] Monika Gupta I lead the product management for custom silicon inside our Consumer Devices division and the flagship of our custom silicon being the Tensor mobile SoC. It is the brains of a smartphone. It connects to the camera, to the microphone, to the speakers, to your Bluetooth or Wi-Fi and all the other sensors that you have in a smartphone. And it takes all that information and processes that. So as a product manager for silicon, while our consumer devices teams are busy planning and launching the next generation of our device, I get to focus on what we need five years from now.

[00:03:27] Rachid Finge So at Google, we have this internal system where you can look up all your coworkers and then everyone has this like little mission statement in there, right? And yours is: unlock Google innovation with AI-first silicon. What did you mean by that when you wrote that?

[00:03:41] Monika Gupta Okay. First off, I wrote that like probably years ago. So it sounds kind of dorky now that you read it back to me, but it still holds. What I mean by AI-first silicon is really a nod to Google as a company: we're an AI-first company, and the reason we are doing custom silicon is to bring that Google AI innovation to the smartphone. And we felt like the only way we could do that was if we had our own silicon.

[00:04:07] Rachid Finge Right, so we make our own chips. I think there are many companies that make Android phones who do not make their own chips. So why did we decide to go a different route?

[00:04:16] Monika Gupta When we looked at what our vision for Pixel was, we just felt like we weren't getting to it fast enough and we were sort of held back. Google has some of the best machine learning and A.I. researchers in the world, and we felt like we couldn't bring that to the market in the way or as fast as we wanted to. And so for us, doing custom silicon was a necessity.

[00:04:38] Rachid Finge It was important because without it we could probably not run all the artificial intelligence that is in Pixel seven today.

[00:04:46] Monika Gupta There's no reason why we can't bring all of that to this market, but there's a challenge there. The field of machine learning is rapidly evolving. So it's not like you have decades worth of hardened processing units that can run them really well. And it's a rapidly evolving field. Right? So the only way to keep pace with that rapidly evolving field of machine learning is to design hand in hand both the silicon architecture and the machine learning models. They have to be co-designed. And that's really what Tensor brings to the table: we know where machine learning models are today, but we also know where they're heading in five years. So we can build our architectures for our chip to keep pace with our machine learning research. And I think that's really the uniqueness of Tensor and that's what allows us to bring all the innovation that Google has as a company to the Pixel phone.

[00:05:32] Rachid Finge So we then have our own system on a chip, which people know by now is called Tensor, which comes from math I guess. But why Tensor?

[00:05:40] Monika Gupta The name Tensor really represents, for us as a company, ushering in this new era of AI and ML based computing for smartphones. And like you just mentioned, some of your listeners probably know that a tensor is actually a mathematical building block of machine learning. There's a funny back story. Anyone that's ever had to do naming and branding knows that there's a lot of due diligence that goes into settling on a brand name. Especially for a big initiative like this one. Taking on custom mobile SoCs is not like some small thing. It was either in 2019 or 2020, the marketing director told me out of frustration in a meeting, Hey, naming your chip is harder than naming my first born, and I felt so bad for him because you can imagine everybody has an opinion and he has to be the one to do the due diligence, the legal work, all of it. How we settled on Tensor was actually spearheaded by our CEO Sundar. He really set the direction. He said, Hey, I don't care what you guys pick, but I really need this to truly express this AI and ML based approach to mobile computing, right? That's what we are doing and our brand needs to represent that. So that's how we came up with Tensor.

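For listeners curious about the math behind the name: a tensor is simply a multi-dimensional array of numbers, the basic data structure that machine learning models operate on. A minimal illustration in Python using NumPy (purely illustrative, and not tied to the Tensor chip itself):

```python
import numpy as np

# Tensors are n-dimensional arrays; the "rank" is the number of axes.
scalar = np.array(3.0)                   # rank-0 tensor (a single number)
vector = np.array([1.0, 2.0, 3.0])       # rank-1 tensor
matrix = np.eye(3)                       # rank-2 tensor
image_batch = np.zeros((8, 64, 64, 3))   # rank-4 tensor: batch, height, width, channels

print(image_batch.ndim)   # 4
print(image_batch.shape)  # (8, 64, 64, 3)
```

Machine learning workloads are essentially long chains of arithmetic over arrays like these, which is the kind of work a dedicated unit such as the TPU is built to accelerate.
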
[00:06:50] Rachid Finge So you mentioned that a system on a chip is like the brain of the phone. What else does Tensor bring, then, compared to a traditional system on a chip?

[00:06:58] Monika Gupta I would say front and center is the machine learning. If you look inside Tensor, if you look under the hood, we have this thing called a TPU, a Tensor processing unit. So that is the heart of the machine learning engine within our chip. However, machine learning doesn't just run on one block. If you actually deconstruct all the applications and features that Google launches, like computational photography, for example, you'll see they actually light up the entire chip. Almost every major subsystem is used. And so with Tensor, I would say there are two very unique things. Number one, the actual machine learning engine was co-designed with Google Research, right? And all of our history in machine learning. And then second, we were very thoughtful when it comes to this notion of heterogeneous computing. And what that means is every part of the chip is involved in running all these complex things that we are doing. And so those are really the two unique things when it comes to tensor and how it's different from how others may have designed their mobile SoCs.

[00:08:07] Rachid Finge Right, so our chip has a part that is called a TPU and it's great at running all sorts of AI and machine learning magic that researchers at Google have been coming up with over the past few years. I guess something like that.

[00:08:20] Monika Gupta Well, I would say the past few decades, but yes.

[00:08:22] Rachid Finge Few decades even. Right. And is that a main reason to build a TPU? Because then you have teams that do the research and then we have teams that build the chips and those can be lined up and tuned to each other as best as possible.

[00:08:37] Monika Gupta Yeah, we have some of the best researchers in the world when it comes to AI and machine learning, and we just couldn't bring it to the smartphones as fast as we wanted to. And it was really important for us to work with those researchers and take what you do in the data centers. Right. So if you compare a data center to a smartphone, in a data center you essentially have infinite compute capabilities and infinite power. Right. Compared to something you hold in your hand. And for us, designing Tensor was working with those same researchers and all the innovations they've done in our data centers. And that's a really tough sort of technical problem. Right? Like, how do you go from infinite compute, infinite power and bring that down into something that's tiny and can be held in your hand?

[00:09:20] Rachid Finge So the main challenge with building Tensor is scaling down, I guess, what is running in a data center? Make sure it works in a way that it doesn't, like, I don't know, run out of battery in 5 minutes or so.

[00:09:31] Monika Gupta Yeah. The bar that we held for ourselves is we don't want to compromise on quality. So you can always take something big and scale it down, right? That's easy. Anyone can do that. But for us we wanted to preserve that quality that Google is known for with some of our features and services that we run in our data centers. But we wanted to bring that same quality but within a power budget that fits inside a phone.

[00:09:52] Rachid Finge So whenever I speak to Googlers who work on hardware, they all tell me hardware is hard in a way. Maybe that's why they call it hardware. And then doing the chip, as you mentioned, is maybe even harder. So we go through a lot of pain to do that. And then, of course, the question is: what kind of benefits have the users listening right now seen, thanks to having Tensor in their Pixel phones?

[00:10:15] Monika Gupta Yeah, I mean, hardware is hard, but when you work at a company like Google, it actually becomes a lot easier because we have so much leverage, right, in terms of software and machine learning. Now, if I look at it from a user perspective. Mm hmm. So what does this all bring to you? There's probably two areas that really stand out for me personally as a user. So number one, and you know, I haven't always been a Pixel phone user, I'll admit that, but I am now. And the number one thing that first immediately stood out to me was the camera. Right. Like the Pixel camera is just amazing. I went to Bali back in July this year and it was a group of ten of us and I was literally the only Pixel user. And by day two, it was actually really annoying. It was flattering and annoying at the same time that the nine other people on the trip kept sending me their photos and they would say things like, Can you Magic Eraser for me? Can you fix this for me? Can you make this look better? Because of course you can imagine the Bali setting: lots of sunsets, a lot of challenging light conditions. So the camera is a stand out thing, it is just so good and everybody sees it, right? Like somehow you become the designated photographer when you go on a trip. The other thing I noticed in everyday life is our speech recognition and the quality of our speech recognition is so good. It's as good as our data centers, but we can fit it in the power budget of a phone. I hardly ever type anymore. For text messages, I just press on the microphone icon and I type complicated sentences, complicated responses. It's really easy. And then the other thing is all of our call screening services, I don't know if you use them, but my goodness, they're amazing. I can be working and I'll have my phone on hold for me while I'm trying to do some personal errand or something. Right. And I have to do it during the working hours for whatever, you know, business or whatever that I'm calling. And to be able to see the menu options right there, to know when it's on hold music versus a live person. And all of that is based on having really good speech recognition on device. And so I would say the camera and the speech recognition are the two standout things for me.

[00:12:20] Rachid Finge Yeah. And that's only possible because we have this TPU that works perfectly well together with whatever Google researchers came up with over the past few decades.

[00:12:29] Monika Gupta If you look at the software that enables a lot of these features I just mentioned, whether it's the camera or the speech, it doesn't just run on the TPU, it runs on the entire chip. And that's why this notion of heterogeneous computing is so important. It's not just do you have the fastest individual block? Actually, what's more important is how these blocks interact with each other because you are literally taking, for example, your speech data or your camera data and you are moving it across the entire chip. For camera, for example, or videos, it's like a Christmas tree. You're literally lighting up the entire chip to process photography or videos.

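To make the heterogeneous computing idea a bit more concrete from a developer's point of view: on Android, an app typically hands a model file to an on-device runtime and lets delegates decide which hardware blocks handle which operations. Below is a minimal sketch using the TensorFlow Lite Python interpreter; the model path is hypothetical, and this is not how Google's own camera or speech pipelines are implemented, just the general pattern:

```python
import numpy as np
import tensorflow as tf

# Load a (hypothetical) on-device model. On a Pixel, a delegate such as the
# NNAPI or GPU delegate can route supported operations to accelerators,
# while the CPU handles the rest -- the "heterogeneous computing" idea.
interpreter = tf.lite.Interpreter(model_path="face_detector.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input with the shape and dtype the model expects, then run it.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)
```
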
[00:13:04] Rachid Finge Right. It's starting to sound like a team sport. Then you need the whole team to be up there in order to make this work.

[00:13:09] Monika Gupta Yeah, absolutely. I mean, if you're an NBA fan without mentioning names, there's plenty of examples where one person can't do it all. It's definitely a team sport.

[00:13:20] Rachid Finge Need more than just the MVP, I guess. Now, you were talking about different blocks. And I know there are folks out there who love running benchmarks, which is like putting a car on a German autobahn and then buying it for the maximum speed, I guess. And then they come up with certain conclusions and we say, we don't really care about that, so why don't we care as much about classical benchmarks?

[00:13:43] Monika Gupta I think classical benchmarks served a purpose at some moment in time, but I think the industry has evolved since then. And if you look at what Google is trying to do by pushing A.I. innovations into a smartphone, because we feel like this is the approach that will deliver helpful experiences, like some of the ones I just mentioned, classical benchmarks were authored in a time where A.I. in phones didn't even exist. They may tell some story, but we don't feel like they tell the complete story. And so for us, what we benchmark are the actual software workloads that we are running on our chip. And then we strive with every generation of Tensor chip to make them better, whether it's better quality, better performance, lower power. So that's our approach. Classical benchmarks, unfortunately, a lot of times are synthetically engineered workloads, right? They don't reflect the actual software that's running on the device. And I think that's fundamentally the problem with classical benchmarks, that they might be like some tool, right? If you don't know the actual workloads that will run on your device, there's some tool to give you some kind of indicator. Are you trending in the right direction or not? But they don't actually help when it comes to real world software that needs to run on these chips.

[00:14:52] Rachid Finge Right. So better to measure the real world impact rather than just a tiny part of it that maybe reveals a number that is not very relevant for most people who use a device.

[00:15:03] Monika Gupta Yeah, and it's not even tiny. A lot of these benchmarks are all synthetic, right, because you don't have anything else to go off of. But because we're an in-house chip supplier, we're what we call vertically integrated, meaning the silicon, the hardware and the software and the services all come from Google. We're in a pretty good position to know what should we optimize for? What can be good enough? When you design a mobile SoC, you're probably making hundreds of decisions, right? You don't just say, Hey, I just want the best of everything, right? You don't say that. The essence of product management is having the judgment and experience to make the right tradeoff calls. You're always going to be forced into that situation. And for us, being vertically integrated makes it really easy. There's no guesswork. We know exactly what we are building for. And if that means we're not going to win on benchmarks, or not look as great on benchmarks, we're perfectly comfortable with that. Because the end result speaks for itself. Like on Pixel six and Pixel seven, you can see all the amazing innovations that we have landed and a lot of them were like the first on Pixel. So we're very comfortable with that approach.

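A toy sketch of the benchmarking philosophy Monika describes: time the workload you actually ship, warm it up first, and track a robust statistic such as the median across chip or software generations. The "workload" below is just a stand-in function, not Google's methodology:

```python
import statistics
import time
import numpy as np

def real_workload():
    """Stand-in for an actual shipped workload (e.g., one inference pass)."""
    x = np.random.rand(256, 256).astype(np.float32)
    for _ in range(8):
        x = np.tanh(x @ x.T)   # a few dependent, compute-heavy steps
    return x

def measure(fn, warmup=3, runs=20):
    """Warm up, then report the median latency in milliseconds."""
    for _ in range(warmup):
        fn()
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(samples)

print(f"median latency: {measure(real_workload):.2f} ms")
```
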
[00:16:04] Rachid Finge I'm curious about what you just said about being a product manager, and it's basically about prioritizing, like, what are we going to do? What are we not going to focus on? How hard is it to do that when it comes to building a chip? Seems like there are a lot of things you can say yes to, but maybe you say no many times. How does that work in practice?

[00:16:21] Monika Gupta Yeah, I mean, it's hard. And especially with silicon, our development timelines are so long. So we're not even just saying yes or no to what we think is needed next year. We're saying it to what we think is needed like five years from now. And I think there you just really have to trust. Like I said, it's so much easier when you're vertically integrated because I know exactly where machine learning models are trending in five years. I'm not making decisions based on where machine learning is today. And I can say that because I work at Google. Right. Same with the software that our software team is doing. I know where the software team wants to take the user experiences five years from now, and that's the benefit of not being a merchant silicon supplier, but an in-house silicon supplier. So those tradeoff decisions are very tough, but I think they get a little easier when you're vertically integrated.

[00:17:10] Rachid Finge So is it that maybe a software team knocks on your door and says, Hey, Monika, in five years from today, I might want to be able to do this and this and I need the silicon that can make it sing on the phone. Is that like a conversation that could actually happen in practice?

[00:17:25] Monika Gupta Well, it's actually even better than that. So we'll have software folks being like, hey, I want to do this, this and this. We have researchers that have lots of ideas across Google. And then we also have, of course, the hardware team and the design team that want to push the envelope on those areas as well. So yeah, it's pretty fun to be able to sit in a room with all those different functions and brainstorm, and just know directionally where we are headed. We all have that clarity of thought.

[Made by Numbers]

[00:17:55] Rachid Finge Monika, we have now arrived at a section we call Made by Numbers, where we ask our guests to bring a number that is important to them, or important to the development process that they were just in. We've had a wide array of numbers going from one and a half billion to half a billion to just 100%. So, Monika, I'm really curious, what's the number you brought for us?

[00:18:15] Monika Gupta Yeah, I heard your last podcast and everyone was trying to up number each other. I'm not going to do that. Okay. So the number that I came with is 1000 x.

[00:18:25] Rachid Finge 1000 x. All right. And why is that?

[00:18:28] Monika Gupta So it's really funny. I mentioned before that we work with the same researchers that develop machine learning for Google Cloud, Google data centers. And this Tensor journey has been, you know, a long one. We didn't start this yesterday. So when I think back to the original days of engaging with Google Research, it's really funny because you're talking to people that, like I said, have infinite compute, infinite power compared to a smartphone. And now we're like, Hey, bring all your great innovations to this tiny little thing that fits in your pocket. And 1000x represents, like, they think in watts, right? Watts, like, for power consumption. They think in that unit, watts. And we're like, no, no, no. We operate in milliwatts, which is a thousand x different.

[00:19:11] Rachid Finge So the amount of power you get to use in your field is a thousand times smaller than what the cloud engineers get to work with.

[00:19:17] Monika Gupta At least. Yes, right. And it was really funny, just those first early-days conversations where you realize, okay, we come from different worlds. But it was actually quite amazing to see the evolution of the same researchers that developed for Google infrastructure, Big Google, as we like to call it, how they have literally taken their technology and adapted it to preserve the quality, but at a power budget that works for us in a smartphone.

[00:19:46] Rachid Finge So that makes me curious. So you work on making things from Google Cloud a thousand times more power efficient for use on mobile phones. And I'm wondering, does it also go the other way around? So maybe someone who works in Cloud says, hey, that thing on Pixel is a thousand times more efficient. Maybe we should bring some of that over to the cloud because efficiency is great for anyone, right?

[00:20:06] Monika Gupta Yeah, absolutely. And it's not really like the device people versus the cloud people. Right. If you look at the machine learning or A.I. community within Google, it's one community. And so if we drove innovation for the Pixel phone and it's a more efficient way of doing things while preserving quality and things like that, absolutely, it will get adopted everywhere. So it's not like one versus the other. And even on Cloud, like I say, I joke that it's infinite compute and infinite power, but really everything has a cost to it, right? Of course. So serving on the Cloud, I mean, there's a cost there. And if there are ways to improve the cost, whether it's lowering the cooling or the power management, absolutely. But we're like one sort of organization when it comes to machine learning and they fully adopt any new innovations everywhere.

[00:21:00] Rachid Finge My favorite feature on Pixel seven for sure is Photo Unblur, and that's something that your silicon is intimately involved in to make that even work. I'm just wondering, do you remember the first time using Photo Unblur yourself on the phone and what was your thought when you did it?

[00:21:16] Monika Gupta So my cousin, who I haven't seen since high school, I was about to say how many years, but I'm not going to date myself. She sent me a photograph of our trip to India when we were teenagers. And it's like me, my brothers, my mom, her mom, and it was really cool to use Photo Unblur for that. It's just amazing. I mean, you can imagine the cameras back then, right, to take that photo. And it was a really nice moment. She sent me that picture and then I sent her back the Photo Unblurred version of that picture.

[00:21:49] Rachid Finge And what was their response?

[00:21:51] Monika Gupta And, of course, amazement. Yeah. And then she started sending me more pictures and I was like, Oh, here we go again.

[00:21:57] Rachid Finge It's just like that Bali trip all over again. Yeah. As we discussed, you are working on future versions of our silicon. And I know we cannot go into details because you and I both would get fired, probably. But I'm still wondering, what would you like our future silicon to be able to do?

[00:22:15] Monika Gupta Yeah, so I would definitely get fired for talking about our future silicon roadmap. Right. But I think the sort of overall vision for us and for the Tensor family is really all about ambient computing. And ambient computing means that the technology is making your life easier. And I think we have a lot of evidence of this that we talked about today, whether it's making photography easier, whether it's making phone calls and how you use your phone, like your day-to-day tasks, easier. I would say we build upon that vision of ambient computing and figure out how to do super complex, nuanced things in the chip in a power efficient way that are going to unlock some of those ambient computing experiences.

[00:22:59] Rachid Finge Well, sounds like we have an exciting future ahead of us. So maybe three or four years ago, when today was the future, what kind of experiences were you trying to unlock that you're probably proud of and that are possible today? Maybe in the camera world, for example.

[00:23:13] Monika Gupta So if you look at the camera, even before Tensor, Google had amazing camera innovations, thanks to this AI-based approach to photography we call computational photography. We are known in the industry for that, and now, bring the Tensor SoC onto the scene: we can just do so much more when it comes to computational photography and videos, and that's what we did with Pixel six and Pixel seven. So for example, on Pixel seven, Night Sight, an amazing feature invented by Google where even in low light situations, right, your photos come out amazing. And now with Tensor G2, Night Sight runs two times faster. So that's a pretty big achievement because you're working in super tough, low light conditions and you're trying to make your photos more beautiful. There's so many things around the camera, like you can go down to the nitty gritty details of fundamental things, things like focus, autofocus, right? Like in typical Google fashion, we throw ML at that problem. That is not the classical way of doing autofocus. But at Google it is. And that's something you'll find in Tensor G2.

[00:24:19] Rachid Finge And what does that mean? Does that mean, like, that we show the chip in the system what faces look like in order to make sure that it can focus on them?

[00:24:27] Monika Gupta That's called face detection. In a photo what's more important than the face? Right. That's pretty foundational to photography. And last year in Pixel six, we talked about this face detection model again. In typical Google fashion, we throw ML at the problem. And now on that one, there's actually if I can geek out for a second, of course, there's an even-

[00:24:46] Rachid Finge Of course, that’s what we’re here for.

[00:24:46] Monika Gupta More interesting story about machine learning. So face detection, pretty foundational, right? In photography, would you not say? Yeah. And then we as Google number one, we apply machine learning techniques to do face detection, but the story doesn't end there. If you want to know how we really created face detection, we actually used machine learning to create the ML model for face detection.

[00:25:09] Rachid Finge You got to help me out here. So you used ML to create the ML?

[00:25:14] Monika Gupta Yeah.

[00:25:14] Rachid Finge How does that work?

[00:25:16] Monika Gupta This is what I mean about working at Google. When you're an in-house silicon supplier, you've got so much ML goodness around the company. So we have this thing called, shoot, I'm not remembering the external name for it right now. Is it AutoML? Yes, that's the external name, AutoML. And the concept behind AutoML, like automated ML, is that machine learning is optimizing machine learning models, making them higher quality, making them lower power, more efficient, all of it. So the face detection in Pixel six was actually taking advantage of AutoML, where ML made ML better.

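To make the "ML creating ML" idea concrete, here is a minimal sketch of the general AutoML pattern using the open-source KerasTuner library as a stand-in: a search procedure, rather than a human, picks the model's architecture. This is not the internal tooling or the actual face detection model discussed above:

```python
import keras
import keras_tuner

def build_model(hp):
    """The search space: the tuner, not a human, picks depth and layer widths."""
    model = keras.Sequential()
    model.add(keras.Input(shape=(64, 64, 1)))
    model.add(keras.layers.Flatten())
    for i in range(hp.Int("num_layers", 1, 3)):
        model.add(keras.layers.Dense(hp.Int(f"units_{i}", 32, 256, step=32),
                                     activation="relu"))
    model.add(keras.layers.Dense(1, activation="sigmoid"))  # face / no face
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

tuner = keras_tuner.RandomSearch(build_model, objective="val_accuracy",
                                 max_trials=10, directory="automl_demo")

# With real labeled data, the search would train candidates and keep the best:
# tuner.search(x_train, y_train, validation_data=(x_val, y_val), epochs=3)
# best_model = tuner.get_best_models(num_models=1)[0]
```

Production AutoML systems add tricks like optimizing for on-device latency and power alongside accuracy, but the core loop is the same: a model-selection algorithm evaluating candidate models.
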
[Top Tips for the Road]

[00:25:57] Rachid Finge Monika, we ask every guest for their top tips for the road. So for example, we asked Isaac Reynolds from the Pixel Camera about his top three tips for people who love taking pictures and want to get better at it. Maybe there are people listening who want to become a product manager or who want to become a better product manager? I'm just wondering, what would you tell those people? What's your tip for the road?

[00:26:22] Monika Gupta Okay, so how to become a better product manager? I think for me over my career, I don't know if it was coincidentally or by design, I always end up working on the new and emerging technologies. So meaning that it wasn't like Gen 10 of something, right? It was new, a clean sheet of paper. There's no playbook. And consistently that's been the case over the decades. And I think like with joining Google and working on Google Tensor, I mean, the mobile industry is a very well-established market. However, how we are approaching Tensor is very different. And as a product manager, I think working on new and emerging technologies, what it teaches you over the years is, number one, to really hone your analytical skills, not fall into the trap of going through the motions. And the hardest piece is having the bravery to take risk. And I think with Tensor, that's what we've done, right? Like we are changing the paradigm of mobile computing with this chip and we are deviating from the norm and we are leading with AI. We're not leading with benchmarks, for example. And I think that takes some bravery and some risk taking and we're pretty happy with the outcome, right? Like the results speak for themselves, but for a product manager that can be a very uncomfortable place to be. So my tip for the road would be: take the risk, be brave.

[00:27:38] Rachid Finge That sounds like a great tip. And indeed, it made some beautiful hardware possible at Google with the Pixel 7 and Pixel 7 Pro, powered by Google Tensor G2. Monika, thank you so much for joining us on the Made by Google Podcast. It's great talking to you.

[00:27:51] Monika Gupta Thanks for having me.

[00:27:53] Rachid Finge Well, if there's something Monika did, it is betting on AI and taking a risk there. And I'm glad she did because we're reaping a lot of benefits from that today. It's been great learning more about Tensor. And if there's one thing I'll remember, it is that Tensor is all about making cutting edge artificial intelligence possible on a mobile device. So that's it for this week's episode. But don't you worry, there's a new one next week, so please take a moment to subscribe to the Made by Google podcast because we'd love to have you again. Have a wonderful day wherever you are. Take care and talk to you next week.

  1. Requires Google Photos app. May not work on all image elements.

  2. Not available in all countries or languages. Toll-free numbers only. May not detect every on-hold scenario.