Podcasts - Season 1, Episode 1
Zooming in on Smartphone Cameras
Learn about the latest advancements in the Pixel Camera.

In the very first episode of the Made by Google Podcast, Rachid Finge hosts a discussion with Isaac Reynolds, a director of product management at Google, who shares his excitement about the latest advances in the Pixel Camera. Isaac explains that Google aims to make people more confident photographers and help them create beautiful memories. 

By studying high-end cameras and why people carry expensive equipment, Google has brought advanced imaging capabilities to something that easily fits in your pocket, from Guided Frame to Night Sight.

Isaac highlights several unique features of the Pixel Camera, such as Guided Frame, Magic Eraser, and Photo Unblur.1,2 Guided Frame is designed to help people with low vision take better selfies, while Magic Eraser allows users to remove unwanted objects from their photos. And Photo Unblur uses machine learning to produce an image that is less blurry than the original, even if it was taken on another phone. 

Reynolds also describes the technical advancements in the Pixel 7 Camera, like the multi-camera system that allows for sharper images of moving objects in low light, plus new zoom technology that uses AI to enhance image quality. And he discusses the breakthrough feature Real Tone, which focuses on achieving accuracy in color and skin tone. 

Overall, the conversation highlights how Google Pixel Camera features are designed to help people take better photos and preserve meaningful memories while making technology more accessible to all users. 

Transcript

[00:00:01] Rachid Finge This is it. Our very first Made by Google Podcast episode. My name is Rachid Finge. I'm a Googler here in Amsterdam. So proud to welcome you to the Made by Google Podcast. So what is it about? It's about bringing you behind the scenes to meet all the fellow Googlers I work with who make all those devices and services that you love to use, the people who work every day to turn a dream into a reality. So in the next few weeks and months, I'll take you on a journey, so you get to meet all those amazing people and hear what goes into making all these products. Now, if you haven't yet, this is a great time to subscribe or follow, so you don't miss any episode of our podcast. It is October 6 today, so that's Made by Google Day, on which we announce a lot of new Pixel devices. So of course, in this very first episode, we're talking Pixel, or should I say megapixels. Isaac, it's great to see you. Thanks for joining.

[00:00:57] Isaac Reynolds Thank you. Thanks for having me, too.

[00:00:59] Rachid Finge So you're the product manager for the Google Pixel Camera. Does that also mean that you are a photographer yourself or have you always been interested in the field way before working at Google?

[00:01:09] Isaac Reynolds Oh yeah. My first quarter in freshman year at my university, I took a photography class. So I've always been into photography. I love cameras, I love the technology as well as the artistic side of it. I think it's really common for people in engineering to enjoy photography because it can be such a technical, technology driven sort of hobby. But yes, I've always been interested in photography. I'm not a pro, that's for sure. I'm only professional at one thing, and that's product management. But I do love photography.

[00:01:34] Rachid Finge Our first guest has helped shape what people think about when they hear Pixel: its camera system. For years and years, he's been pushing the boundaries of what technology can do to make photography better. And as you'll hear during our conversation, better is not a matter of piling on more and more megapixels and things like that. No, in his mind, better means something else. Now, that doesn't mean that our guest can't talk about megapixels or machine learning. It's just that there's a bigger picture to keep in mind. Oh, and also our guest has some creative uses for Pixel's Magic Eraser that you don't want to miss. So without further ado, I'm proud to say that our very first guest on the Made by Google Podcast is none other than the Pixel Camera product manager Isaac Reynolds. Well, it sure came a long way. When it comes to photography and smartphones, that is. I remember my first camera phone, I was like, why is it even on it? Because the resolution was so low. And then maybe it took another eight years before it amounted to something. And now it's probably sort of on the same level as professional cameras in certain aspects. Like, when did you see that a smartphone camera could be that good?

[00:02:48] Isaac Reynolds For me, I actually remember a very specific moment. It was in 2017. I was traveling and I was able to bring my Pixel with me because I was on a business trip, but I wasn't bringing a big camera with me. Right. And at that time, I was really familiar with what my SLR could do. And I was walking around downtown at sunset, on the water, and I took a photo of the sunset. And I thought to myself, you know, my SLR could not have gotten that richness of the colors in the sunset. At that time, there were still things I would turn to my SLR for, but for just walking around, taking snaps, and wanting to capture the richness and the beauty and the contrast, I honestly think that smartphones, and especially the Pixel, do a much better job of capturing certain scenes than SLRs do, especially high dynamic range scenes.

[00:03:30] Rachid Finge Yeah. So tell us about that. So how can a smartphone camera, which is much smaller, still be better than those large and often more expensive professional cameras?

[00:03:39] Isaac Reynolds Yeah, it's all about computational photography. If you scroll the clock back decades, cameras were pieces of hardware and that was it. It was a chemical reaction on a piece of film, and light traveling through a glass lens. These days, it's so much driven by software. Modern SLRs are still primarily hardware driven, and smartphones are primarily software. And we found some really interesting techniques that we can do in the smartphone space because we basically have the capabilities of a little tiny computer. You'd be surprised how much horsepower is in one of these little phones. There are new techniques that we can really only do on smartphones.

[00:04:15] Rachid Finge That creates a whole new world of processing images. This is also the day that we are launching Pixel 7 and Pixel 7 Pro. What is the biggest change? Like, if you need to compare it to previous generations of phones with computational photography, what is the thing you're most proud of?

[00:04:30] Isaac Reynolds So if you scroll the clock back just ten years, you've seen us transition from what used to be the megapixel wars into something that's a little more photography and software focused. And still, through those ten years, we've gone through multiple phases. The phase that we're in now, where a lot of the exciting technology and a lot of exciting algorithms are happening, and where we're making the biggest difference for the casual snapshots that people take, is in two technical areas that I'm really excited about. One where we had a big success just this year is multi-camera. All of the Pro phones, like the Pixel 7 Pro, come with three cameras in the back: on the Pixel 7 Pro you get an ultrawide camera, you get a primary camera, and you get a telephoto camera, a 5x telephoto camera. And we can do things by running multiple cameras simultaneously that we could not do with just one. One of the things that is really, really exciting on the Pixel 7 is we can run the ultrawide camera in the background quietly with a really fast exposure speed that freezes motion, and we can run the main camera with a longer exposure speed that minimizes noise. And then we can take two simultaneous images and fuse them into one. That gives us the sharpness from the ultrawide and the low noise from the main in one image at the same time. And we use that to help you take sharper images of people's faces in low light, or when they're moving quickly, or when they can't hold still.
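
Here's a minimal sketch of that fusion idea in Python, assuming the two frames are already aligned and brightness-matched; the per-pixel motion weighting is illustrative, not Pixel's actual pipeline:

```python
import numpy as np

def fuse_fast_and_slow(fast_frame, slow_frame, motion_weight):
    """Blend a short-exposure frame (sharp but noisy) with a
    long-exposure frame (clean but motion-blurred).

    All arrays share the same shape; values are floats in [0, 1].
    motion_weight is a per-pixel map in [0, 1]: near 1 where motion
    was detected (trust the sharp fast frame), near 0 where the
    scene held still (trust the low-noise slow frame).
    """
    return motion_weight * fast_frame + (1.0 - motion_weight) * slow_frame

h, w = 2160, 3840
fused = fuse_fast_and_slow(np.random.rand(h, w), np.random.rand(h, w),
                           np.zeros((h, w)))  # no motion: all slow frame
```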

[00:05:56] Rachid Finge Oh, that's great. As a parent, that will probably be very useful when you have a one and a half year old running around. What else is a large change compared to the previous generation, you'd say?

[00:06:06] Isaac Reynolds We're also introducing some exciting zoom technologies. One of the new things in Pixel 7, getting back to the multi-camera concept, is we have a telephoto lens that lets you have 5x zoom. Mm hmm. And we have a main lens that's 1x. Now, traditionally, you might not be able to use that 5x lens until you've zoomed in to 5x. By using a fusion technique, running the main camera and the telephoto camera simultaneously, we can use the detail from the telephoto camera once you reach about 2.5x. So now you're getting benefit from that telephoto camera starting at 2.5, not just starting at 5. And that's really cool. It means more value for the hardware that you're paying for and that you're buying. One of the things I'm really excited about in the zoom space is, it's a little outside the multi-camera space, but it's kind of a novel approach to megapixels. You'll see the Pixel 7 and the Pixel 7 Pro have a 50 megapixel main camera. Normally we produce 12.5 megapixel images from that, but we can produce 12.5 megapixels two different ways. One: when you're at 1x zoom, we give you a super low noise, high dynamic range version of 12.5 megapixels. But now on Pixel 7, when you zoom in to 2x, we can just use the 12.5 megapixels in the center quarter of the 50, right?
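
In rough numbers: at 1x, the 50 megapixel mosaic is binned 2x2 down to about 12.5 megapixels for low noise; at 2x, the center quarter of the sensor is itself about 12.5 megapixels. A toy sketch of both paths (the sensor dimensions here are made up for illustration):

```python
import numpy as np

def image_1x(sensor):
    """1x: 2x2-bin the ~50 MP sensor down to ~12.5 MP for low noise."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def image_2x(sensor):
    """2x: crop the center quarter of the sensor, ~12.5 MP of native detail."""
    h, w = sensor.shape
    return sensor[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]

sensor = np.random.rand(8160, 6144)                    # ~50 MP (hypothetical dims)
print(image_1x(sensor).shape, image_2x(sensor).shape)  # both ~12.5 MP
```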

[00:07:22] Rachid Finge So you don't actually lose quality when you zoom in to 2x. So Isaac, we'd like to do something we call Made by Numbers, where we ask every guest to bring a number that is somehow important to the work that they do. And Isaac, what is the number you brought along for your episode?

[00:07:43] Isaac Reynolds Ooh. I think the number I would choose is 498,000,000. 498 million.

[00:07:49] Rachid Finge I'm wondering if there's going to be someone who's going to top that number. It's oddly specific. Isaac why 498 million?

[00:07:56] Isaac Reynolds It's a very memorable number because I computed it once and it got stuck in my brain. 498 million is the number of pixels per second that a phone has to process in 4K video. I'm sorry, 4K 60 video specifically is 498 million pixels per second. So when you talk about running auto exposure, auto white balance, autofocus, HDRnet tone mapping, you're talking about touching or analyzing 498 million pixels per second. It takes a lot of computing power to do that and a lot of dedicated hardware to do that.
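
The arithmetic behind that number: 4K is 3840 by 2160 pixels, at 60 frames per second.

```python
width, height, fps = 3840, 2160, 60   # 4K at 60 frames per second
print(f"{width * height * fps:,}")    # 497,664,000 — Isaac's ~498 million px/s
```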

[00:08:32] Rachid Finge Yeah, so half a billion, essentially half a billion pixels every second that are processed by multiple algorithms if you're recording a high quality video on your Pixel device. I think a lot of people are going to remember that number 498, and that's just in one second; so many more pixels have to be processed when they're recording a longer video than that. A few times a year, something happens at Google that blows my mind, like, how is that even possible? And a few years ago that happened when I first experienced Night Sight. It was like, how is it even possible that a camera can create that much light in a space that doesn't even have light in it, and then make it so easy? Right, because I'm not on a tripod, I'm just holding it in my shaky hand, and it looks great. How is that even possible?

[00:09:21] Isaac Reynolds Yeah. One of the first things that Pixel introduced that really kind of changed the entire market, the entire technical market for smartphone cameras, was this thing called temporal merging. The idea is this: in very low light, you typically would like to take a 1 second exposure. Traditionally, you need a tripod to do that; otherwise things are going to get blurry, because your hands are moving and subjects move. And so 1 second exposures weren't always realistic, and so you had noise. Instead, what we found at Google is you could chop that 1 second exposure up into five or six or even 15 little bits, spread over an entire second. And then you could merge those 15 into one moment in time, one true representation of one moment in time, that had the noise characteristics of a 1 second exposure but the blur of a 1/15 second exposure. And fundamentally, that's what Night Sight does. Night Sight can do up to a 6 second exposure, but shot in little segments so that you get a nice sharp photo. The big improvement with Night Sight this year is we're able to take a photo with the same noise characteristics as a Pixel 6 but with half the blur: Night Sight shots on Pixel 7 now take half as much time as they did before, with equivalent quality otherwise. So this technical area of temporal merging, which we pioneered on Pixel with HDR+ and then brought to Night Sight and then brought to Super Res Zoom, is continuing to show us how much quality we can get out of one of these little smartphone sensors.
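
The heart of temporal merging is averaging aligned short exposures; random sensor noise falls roughly with the square root of the frame count. A toy version, which skips the alignment and motion rejection a real pipeline needs:

```python
import numpy as np

def temporal_merge(frames):
    """Average N short, already-aligned exposures into one frame.
    Averaging cuts random sensor noise by roughly sqrt(N), while each
    short frame stays sharp enough to shoot handheld."""
    return np.mean(np.stack(frames), axis=0)

# Fifteen 1/15 s exposures standing in for one 1 s exposure:
merged = temporal_merge([np.random.rand(100, 100) for _ in range(15)])
```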

[00:10:57] Rachid Finge So let me get this straight. In Pixel 7, compared to Pixel 6, you can take a Night Sight photo in half the time with the same end result, basically, yes? Does that also mean that Pixel 7 works better in even lower light scenarios?

[00:11:11] Isaac Reynolds Yes, it does work even better in low light scenarios. Yes.

[00:11:15] Rachid Finge That is incredible. So what else goes into deciding which features you bring to a smartphone camera? Definitely there is this, let's call it a capability side, right? You may have an idea, but you need the right set of tools and the right set of algorithms to actually make it work.

[00:11:31] Isaac Reynolds We start by looking at the pinnacle of what cameras can achieve today. We start by looking at: why are you going to be willing to lug around a 15 pound, $10,000 dedicated camera? What is it that makes that worth it? Is it file formats? Is it low light? Is it frame rate? Is it resolution? Is it bitrate? Is it dynamic range? Is it zoom? Is it creative controls? What makes that device worth it? And then we start trying to bring those capabilities down to something that fits in your pocket. And we try to take out those controls, take out the hard edges, I call them sharp edges, the things that make those devices difficult to use, so that someone who doesn't have the time or the energy to deal with those controls can just snap a shot once, or press record once, and get the same kind of output and capability and results as someone who is willing to put in hours and hours and fill their backpack with equipment.

[00:12:28] Rachid Finge For people who don't know: internally at Google, you can look up every Googler, and you can have your own little mission statement on that page. Right. And yours says: create more confident photographers and more beautiful memories. What you said before definitely explains the first part, right, creating more confident photographers. And it essentially also explains the more beautiful memories.

[00:12:48] Isaac Reynolds I love, by the way, that you pulled out my internal corporate directory motto.

[00:12:52] Rachid Finge Definitely have to check it out.

[00:12:54] Isaac Reynolds I also love, by the way, that when I wrote that, and I actually get made fun of for this by one of my friends at Google, but "more beautiful memories and more confident photographers", there are two different ways to interpret that: "more" either modifies "confident" or it modifies "photographers", and likewise for "beautiful memories". So I mean it all four different ways. But yes, we have to think about the creative side as product managers here. Cameras are not the goal, they're tools. They achieve something for you. The camera is not the thing that you want. It's the pictures and the videos; those are the things that you want. So my goal is to get people in and out of the camera as fast as I can, and get them on their journey to share, to post, to edit, to create, to curate. So it's really not about the camera for me. It's about the sharing and the art and the creation.

[00:13:39] Rachid Finge That actually brings me to maybe one of the biggest breakthroughs that Pixel had, which is the Real Tone feature. Can you tell us a little bit about Real Tone and how you decided to build it? And then, what's new for Real Tone in Pixel 7?

[00:13:52] Isaac Reynolds Yeah, lots of people ask me, to be honest, what feature on the phone I'm most proud of, and I've always had a hard time answering, because I've been here for seven years and I've launched hundreds of features, and I never really had an answer for what I'm most proud of until we launched Real Tone. What's really new in Pixel 7 is we're doubling down on the process. Real Tone fundamentally is a process. It's collaborating with the community that understands what the problems are, really listening, and giving that community an opportunity to speak to the people who have the skills and the power to fix it. So where we started from with Pixel 7 was talking to some of the same people we talked to for Pixel 6, some folks who are really amazing at showing us how to build a better camera. And the main thing we focused on was bringing a lot of the Real Tone improvements across the camera. So not just baked into photo mode; we wanted more of those things to land in video mode, for example. So we continue to work on the accuracy of color and skin tone, and the richness of that skin tone. And in particular, we tried to achieve more of what we internally call temporal consistency. You can imagine that when you take two photos or three photos in a row, those photos are probably going to look similar, but there might be little tiny shifts in the color or the detail or the focus or something like that. That's sort of acceptable for pictures, because you're going to post them one at a time. It's less acceptable for videos, because in video you start to see the wavering, right? You don't want to watch those shifts happen in real time. So focusing on temporal consistency, making sure that color is consistent over time instead of wavering back and forth between two different colors, is especially important in video. So we made some improvements in video that help with the temporal consistency in particular. And then we also just focused on optimizing the skin tone and skin color across the board.
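
One common way to get temporal consistency, sketched here as a generic technique rather than Pixel's actual method, is to smooth per-frame color estimates over time, for example with an exponential moving average over white-balance gains:

```python
def smooth_gains(per_frame_gains, alpha=0.1):
    """Exponentially smooth per-frame white-balance gains so the
    color of a video doesn't waver from frame to frame."""
    smoothed, state = [], per_frame_gains[0]
    for gain in per_frame_gains:
        state = alpha * gain + (1.0 - alpha) * state
        smoothed.append(round(state, 3))
    return smoothed

# Jittery per-frame estimates settle into a steady value:
print(smooth_gains([2.0, 2.2, 1.9, 2.1, 2.0]))
```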

[00:15:43] Rachid Finge You know, Isaac, Pixel 7 is of course the second generation of phones where we have our camera bar. And I'm wondering, how does that actually come together? There is this interplay, right, between what a phone looks like and what the camera capabilities of a device are.

[00:16:01] Isaac Reynolds Like everything, it's a collaboration. No one gets to say "we're doing it this way" and the other person just has to get on board. So realistically, there's a collaboration and a give and take between the teams. But it's certainly true that when you look at the back of the Pixel 7, it's a nice exterior, but it's fairly featureless until you get to the cameras. Right. And the cameras are really what make it look like a Pixel 7, because it has that bar, that camera bar. What I love about that is, there used to be these debates about form over function or function over form. I love that Pixel has now aligned on this: form is function and function is form, and there's no difference between the two these days. So when you look at the back, what you see is the cameras, which is great, because when you bought the Pixel 7 Pro, it's in large part because you wanted the better camera. And there's no mistaking that when you look at the back of the Pixel 7 Pro, you have the best Pixel camera there is.

[00:16:51] Rachid Finge There can be no doubt. That's absolutely true. And, you know, we've been running through some of the new features that we have on Pixel 7. I think another one to talk about is making macro pictures.

[00:17:02] Isaac Reynolds Yeah, well, we talked a lot about multi-camera, right? Mm hmm. And there's no reason to put two cameras in the back of the phone unless they do two different things. That's why we have, for example, an ultrawide and a telephoto. One of the other reasons people buy special lenses for their big cameras, you can imagine, is astrophotography, and we have an astrophotography mode. Another reason they buy them is for macro focus; there are special macro lenses that you can buy. So we wanted to make sure that there were more reasons to feel comfortable walking out of your house with just your Pixel. What we did on the Pixel 7 is we took that ultrawide lens and we gave it an autofocus module. Autofocus means that we can shift the point of focus wherever we need it to go, between very close and infinity. Right. And on that ultrawide lens, we're able to focus around three centimeters away from the subject. Honestly, you can get so close it feels like you're about to touch what you're taking a photo of with the ultrawide lens. And that gives you incredible detail, incredible resolution. You can see things in a macro photo that a lot of people couldn't see with just their eyeballs alone. And it's just another reason to feel confident walking around with a Pixel in your pocket.

[00:18:11] Rachid Finge That's like the Swiss Army knife being extended once more a little bit. Again, I think we've described the back of the Pixel phone now. What if we turn it around, and then we come to the selfie camera? It's estimated that 92 million selfies will be taken every day across devices this year. So how do you help selfie makers, which is basically all of us, make that easier on Pixel 7 and Pixel 7 Pro?

[00:18:34] Isaac Reynolds Yeah, the first thing you have to recognize is selfies are very different from rear camera photos. They're taken in different contexts for different purposes. They're created differently, they're composed differently, and it's almost an entirely different set of problems and solutions that you have to create for a front-facing camera versus a rear-facing camera. What you're seeing on a Pixel 7 Pro is that we are really aligning what the front sensor and front camera can do with what the rear cameras and sensors can do. On Pixel 7 Pro and Pixel 7, you now have an ultrawide capability on the front, just like you have on the back. You have 4K 60 video recording on the front, just like you have on the back. You now have a sensor that's around 12 megapixels; in this case, it's 10.8 specifically. But we're starting to get into higher resolution cameras. It has big pixels as well, 1.22 micron pixels.

[00:19:26] Rachid Finge And that helps in low light, right?

[00:19:28] Isaac Reynolds Oh right, it helps in low light. You get more detail in low light, and it has something called a high full well capacity, which helps for dynamic range, like when you're in front of a sunset. The other thing you have to recognize is that selfies live in third-party apps. Not just the default apps; the apps that you download from the Play Store are where a lot of selfies are taken. Probably the majority of selfies happen in apps people download from the Play Store. So we're actually partnering with some third-party apps this year on Pixel 7 to introduce new features there that'll help you take selfies. For example, you can use ultrawide selfies in a couple of key apps. You now have video stabilization available in a couple of key apps. And you can now also take video from the front and back camera simultaneously, so that you can see both what you are seeing as the photographer or videographer and your reaction to that moment in the same video stream at the same time. So giving people the best selfies, for us, is about aligning the capabilities in low light, dynamic range, and high contrast video, and then partnering outside of the default apps to make sure that the apps you're using to take selfies work really, really well.

[00:20:38] Rachid Finge Isaac, I think there's another Pixel feature where, if people read about it, they might not immediately understand what it is, even though it is very important, and that is Guided Frame. Can you explain what that is?

[00:20:50] Isaac Reynolds This is probably one of the reasons I came to Google in the first place seven years ago, so it takes me back a little bit, but I really appreciate being able to work at a company like Google that lets us and encourages us to go the extra distance for users who maybe have a harder time with technology these days. This is one reason I'm most proud of Real Tone, right? Because we're able to build a better product for users who have historically not been served as well as they should have been by technology, and Guided Frame is another example of that. Guided Frame is a way to help people with low vision take better selfies. It's a way to make sure that the camera is pointed right at your face, that you're in focus, that you're framed, that you're centered, and, as the mode says, ready for selfie, right? The idea is that once you turn on certain accessibility controls and you open the selfie mode, it will give you audible, haptic (so, vibration), and visual guidance that helps you get your face framed just right for a perfect selfie. A really nice feature, and a really great way to let people with low vision take better photos of themselves.
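
A toy version of the framing logic, assuming a face detector has already produced a bounding box; the function and thresholds here are hypothetical, just to make the idea concrete:

```python
def framing_hint(face_box, frame_w, frame_h, tol=0.1):
    """Turn a detected face bounding box into a guidance message.
    face_box is (x, y, w, h) in pixels, origin at the top left."""
    x, y, w, h = face_box
    dx = (x + w / 2) / frame_w - 0.5   # horizontal offset from center
    dy = (y + h / 2) / frame_h - 0.5   # vertical offset from center
    hints = []
    if dx > tol:  hints.append("face is right of center")
    if dx < -tol: hints.append("face is left of center")
    if dy > tol:  hints.append("face is below center")
    if dy < -tol: hints.append("face is above center")
    return ", ".join(hints) if hints else "Ready for selfie"

print(framing_hint((600, 300, 400, 400), 1080, 1920))
# -> "face is right of center, face is above center"
```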

[00:21:55] Rachid Finge You wouldn't think that Guided Frame and Real Tone are related, but actually they're very much related, exactly in the way you mentioned: you go the extra mile to make the camera more capable and useful for people who previously may not have had that great access to cameras and the results that they produce. And we cannot have a camera conversation about Pixel without talking about Magic Eraser, and all the things we have been erasing over the past years. Does anything come to mind where Magic Eraser saved your picture, where it was very useful and maybe created a much, much better picture than it otherwise would have been?

[00:22:34] Isaac Reynolds Oh, yeah. Oh, for sure. I love Magic Eraser. It always saves my landscapes, I'd say, because I will often go on road trips, and I will make sure that I'm driving around sunset and sunrise so I can pull over on the side of the road at just the right spot at the right time and take a photo. And I often end up with fences, because, you know, you're on the freeway or whatever; there's a power line or there's a fence. And I can just get rid of the power line or get rid of the big fence beams, and then it's like I was out in the wilderness taking my landscape photo, when really I was just pulled over to the side of the road. So it saves photos like that for me, for sure.

[00:23:12] Rachid Finge Yeah, that is great. I just keep getting back to what you say about helping people create better memories, or storing them in a better way. Now we have this feature called Photo Unblur that sort of helps people after the fact, maybe when things didn't exactly work out the way they wanted them to. But how do you unblur something? How do you know what it was supposed to look like without the blur? It sounds so incredibly complicated, yet it's so easy to use. How do you do it?

[00:23:40] Isaac Reynolds I think you've probably seen in the news all of these websites now where you can type in a sentence and it produces an image, right? Yeah. Okay. There's a category of machine learning models in imaging that are known as generative networks. A generative network produces something from nothing, or produces something from something else that is lesser than, or less precise than, what you're trying to produce. And the idea is that the model has to fill in the details, has to take a guess by itself. What these websites are doing is taking your sentence, which is the guess, and producing an image based on it. What the model that does the face unblurring is doing is it takes an understanding of what a human face looks like. It takes an understanding of what a human face looks like after it's been blurred as well. Mm hmm. And it kind of takes an understanding of what blur itself looks like, because you can actually measure the amount of blur in an image and what its character is. Okay. It takes all those things, and it says: here's what I think your face would look like after it's been unblurred; here's what it must have looked like in order to look like this after it was blurred. And that does amazing things. And it never changes what you look like. It never changes who you are. It never strays out of the realm of authenticity; it's a very authentic representation. It's very true and honest. It never changes who you are, but it can do a whole lot of good for blurry photos, in low light, when your hands were shaking, if the person you're photographing is just slightly out of focus, things like that.
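
Photo Unblur itself is a learned generative model, but the "measure the blur, then invert it" half of the idea has a classical cousin. As a contrast, here is a toy Wiener deconvolution, which only works when the blur kernel is known and is emphatically not how the Pixel feature works:

```python
import numpy as np

def wiener_deblur(blurred, kernel, k=0.01):
    """Invert a *known* blur kernel in the frequency domain.
    k regularizes frequencies the blur almost erased, so noise
    isn't amplified without bound."""
    kh, kw = kernel.shape
    padded = np.zeros_like(blurred, dtype=float)
    padded[:kh, :kw] = kernel
    # Center the kernel on the origin so the output isn't shifted
    padded = np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    H = np.fft.fft2(padded)
    B = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(B * np.conj(H) / (np.abs(H) ** 2 + k)))
```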

[00:25:11] Rachid Finge Isaac, you mentioned you have been in love with photography for a long time and now you help shape how people take photos. So if there is someone who has top tips for the road, it must be you. So, Isaac, what would you tell our audience about tips for creating beautiful pictures?

[00:25:34] Isaac Reynolds The first thing I would say is: light is everything in photography. If you do something as simple as spin someone 90 degrees, or move your subject over six or eight feet from direct sunlight into a little mixed shade, you can turn a photo that's boring and not great looking into something that's very dynamic, very exciting, very interesting, more beautiful. So I would say, don't be afraid to make little shifts in lighting to achieve a really big result.

[00:26:26] Rachid Finge All right.

[00:26:27] Isaac Reynolds The next thing I would say is: be aware of your background, but be ready to push Magic Eraser. Know what's in your background, but honestly, you can push Magic Eraser pretty far and you can get away with a lot, even when there are people in the back of your photo. So know it's there, but use Magic Eraser to its fullest. And I think this one does go out to the Pixel 7 customers: you can launch your camera at any time by quickly pressing the power key twice, and the camera will launch instantly no matter what. Locked, unlocked, screen off, screen on, whatever; it'll always launch the camera. That is by far the fastest way to get into the camera. Honestly, I can get my hand in my pocket, press the power button with my thumb twice, and by the time the phone is in front of my face, the cameras are running. That's the best way to get the moments in front of you, for a Pixel specifically.

[00:26:51] Rachid Finge I'm thinking that might be the best Pixel feature, you know, since the original Pixel back in 2016, because it already had that feature then. Right.

[00:27:01] Isaac Reynolds We've had it for a long time and it's still one of the most beloved fan favorites to this day.

[00:27:06] Rachid Finge So thanks, Isaac, for the three tips: light is important; background, sort of important, but be aware and use Magic Eraser to fix everything else; and double tap the power button to take photos really fast. It seems like Pixel 7 in general focuses a lot on video, for example with the addition of 10-bit color. And is it true that that is very useful when you have a lot of contrast in the scene? Like, in the past, when I tried to take a picture of, you know, the sun coming up, it always looked better in real life than it looked in the picture. Do I have a better chance now with 10-bit video?

[00:27:43] Isaac Reynolds Yes, you do. If you were going to take a picture of a flat gray wall, then 8-bit and 10-bit are just as good, right? But once you start to get into scenes like a sunset, where you have this beautiful gradient from orange-red to purple to blue to black, 10-bit starts to become really important. Or if you're trying to take a photo of flowers, like a close-up of some flowers, especially a really beautiful red rose, and I think we've all had occasion to take a picture of a red rose, then 10-bit starts to become really important for those colors as well. So it's the gradients, and it's the saturation like you see in a sunset especially.
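
The difference is easy to put in numbers: 8-bit gives 256 shades per color channel and 10-bit gives 1,024, so a gradient that occupies only a slice of the brightness range gets four times as many steps before banding appears.

```python
levels_8bit, levels_10bit = 2 ** 8, 2 ** 10   # 256 vs 1,024 shades per channel

# A sunset gradient might span only, say, one eighth of the brightness range:
span = 1 / 8
print(int(levels_8bit * span), "steps in 8-bit")    # 32  -> visible banding
print(int(levels_10bit * span), "steps in 10-bit")  # 128 -> 4x finer gradations
```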

[00:28:21] Rachid Finge What else would you say is new when it comes to video? Because it seems to me, and maybe that's the question I want to ask: if you want to record a 60 FPS video, is it actually 60 times harder to create a great video feature compared to a photo?

[00:28:38] Isaac Reynolds Oh, is it 60 times harder to do? I think it's a thousand times harder to do.

[00:28:42] Rachid Finge Okay. And why is that?

[00:28:43] Isaac Reynolds Well, on Pixel we are really bringing software to bear on imaging. Take a 12 megapixel photo: we're going to spend about 3 seconds processing that photo. That's 3 seconds for 12 megapixels, about 4 megapixels per second. Okay. Now in 4K 60, we're doing 498 megapixels every second. So there are things we've been bringing to video; we've really been trying to bring a lot of things that we have in photo modes to video, with different implementations and different technologies. If you look at the Pixel 7 and the Pro, and you look at some of the phones that came before them, we now have a really outstanding video package. We already talked about the 10-bit HDR color. Now we have 4K 60 on all cameras on the Pixel 7 Pro. So whether you're on the ultrawide, the main, the telephoto, or the front-facing camera, 4K 60 is always there, and generally 4K is always there across modes, or maybe you're doing 1080, for example. We have Cinematic Blur now in video, where you can take what are almost like portrait mode photos, but they're videos instead; that capability is present now. You also have an automatic time lapse setting to make sure that when you take a timelapse, you always get a length of video between 15 and 30 seconds, which is highly shareable. That's going to get you good engagement on social platforms, because, you know, there's that sweet spot for engagement on social, so we always make sure you get that. We've brought manual white balance controls into video as well. We've also brought focus locking, exposure locking, and white balance locking into video. So you have a lot more capabilities there. Beyond that, we have more creative items, right? We have four different stabilization modes on Pixel Camera, so you can choose the stabilization mode that works best for you, not just the default mode. We have a locked mode for zoomed-in cases: you're in the crowd, you're watching a stage, maybe it's a dance recital, you can use locked mode. Maybe you're walking or running; you can use the active stabilization mode that we've had since Pixel 5. We also have a mode that's made for B-roll footage. These are the very epic cinematic shots that you put under a voiceover or use when transitioning between segments of a video, and we have a special mode just for that on Pixel.
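
Putting those two rates side by side shows why video is so much harder:

```python
photo_rate = 12_000_000 / 3        # 12 MP photo processed in ~3 s
video_rate = 3840 * 2160 * 60      # 4K 60 throughput, ~498 MP/s
print(f"photo: {photo_rate:,.0f} px/s, video: {video_rate:,} px/s")
print(f"video pipeline must run ~{video_rate / photo_rate:.0f}x faster")  # ~124x
```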

[00:30:58] Rachid Finge It’s also another one where it slows down a little?

[00:31:00] Isaac Reynolds It does, yes. Just a little. A touch. Just a little variety. Not too much. Not too little. And then we also have some great audio features. One of the big ones that I use the most and like the most is we have a speech enhancement mode that runs in the front and the rear cameras that lets you get like lapel mic quality speech.

[00:31:19] Rachid Finge Even if the mic, meaning the phone, is far away. Correct?

[00:31:21] Isaac Reynolds It is. It's targeting the speech and it's turning down the volume on everything else. But it's better than that: it's not just targeting the speech, it's targeting the speaker. That's why we call speech enhancement "lapel mic mode" internally, because it's focusing on your speech, not everyone else's speech. And the only reason we can do that is because the machine learning model that powers speech enhancement is looking at both the audio and the video. So we have one learning model that combines two different inputs, and it looks at who's facing the camera and literally looks at: are your lips moving?

[00:31:56] Rachid Finge Oh, it does that even.

[00:31:57] Isaac Reynolds It does. If your lips are moving, it matches them up to the words it's hearing, and it tries to figure out: do your lip movements match that speech? If so, it captures that speech and saves it, and then it turns down everything else that doesn't match your lip movements. So it's a really, really incredible model, but it's just one example of what machine learning is doing for imaging on Pixel.
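
A toy sketch of just the gating step, assuming a hypothetical model has already scored, per video frame, whether the visible lips match the audio (the real system is a single learned model over both signals):

```python
import numpy as np

def gate_speech(audio_frames, lips_match, duck=0.1):
    """Keep the audio whose timing matches the on-camera speaker's
    lip motion; turn everything else down rather than muting it.

    audio_frames: per-video-frame arrays of audio samples.
    lips_match: per-frame booleans from a (hypothetical) matcher.
    """
    return [a if keep else a * duck
            for a, keep in zip(audio_frames, lips_match)]

frames = [np.random.randn(800) for _ in range(4)]     # audio per video frame
gated = gate_speech(frames, [True, False, True, False])
print([round(float(f.std()), 2) for f in gated])      # ducked frames are quiet
```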

[00:32:17] Rachid Finge If this were a video podcast, you could see me now: I'm like, wow, this is much more involved and complicated and sophisticated than I thought. So it does beg the question: where can you bring this technology in the future?

[00:32:33] Isaac Reynolds I think what's lovely here is that speech enhancement is a use case, Magic Eraser is a use case, Face Unblur is a use case, and zoom is a use case, but they're all powered by a lot of the same underlying technologies. We've talked about temporal merging today, we've talked about multi-camera fusion today, we've talked about generative models today. Those technologies are going to power improvements across the board: in photo, in video, from the default app to the third-party apps you get from the Play Store, from zoom to dynamic range to low light to creative things like Magic Eraser does. They're going to power a whole bunch of improvements across the entire camera system. Where I'm most excited to see what the next stage is, is probably in those two final areas, generative models and multi-camera fusion, because we've really shown with the last generations of Pixel what generative models can do for deblurring faces and for Magic Eraser, and we've shown what multi-camera fusion can do for zoom, for fusion zoom and for fusion deblur, which we talked about today. I think you're going to see more use cases in those areas.

[00:33:41] Rachid Finge I wanted to say, I bet you're already working on some of these things. Absolutely. Well, you know, maybe in a year's time we'll speak again and see what else you've brought with all these new technologies to help people become better photographers and save beautiful memories. That's such a great motto to have.

[00:34:02] Isaac Reynolds Thank you. Yeah, I love that motto. I've had it up there for years.

[00:34:04] Rachid Finge Isaac, thank you so much for joining the first ever Made by Google podcast. Whatever happens, they cannot take that away from you, you know.

[00:34:11] Isaac Reynolds I'm staking my flag.

[00:34:14] Rachid Finge Thanks so much. Thank you for listening to the Made by Google Podcast. We have a new episode every Thursday and our next one is going to be great. We'll talk security. If you think that securing a smartphone is like building a castle, well, you're right. Learn much more about how Google secures Android and Pixel by subscribing or following. And join us next time on the Made by Google podcast.

  1. Requires Google Photos app. May not work on all image elements.

  2. Requires Google Photos app. May not work on all photos or videos with faces.