Podcasts - Season 1, Episode 6
State of the Heart
Explore the Google Pixel Watch’s innovative biometric tracking capabilities, powered by Fitbit, and how they support overall wellness in the latest episode of the Made by Google Podcast.¹
Huddle up! 

Bringing Fitbit’s health tracking to the Google Pixel Watch adds another layer to an already premier device: the ability to monitor and improve personal health benchmarks. In this episode of the Made by Google Podcast, product manager DeCarlos Love takes a page from his college football days to give a play-by-play account of the science behind biometric tracking and how innovative sensors can provide insights for everyone.²

Strength in the sensors 

Whether it’s our heart rate or step count, health data is more visible to us than ever before thanks to technology in devices like the Pixel Watch. As Love explains, the green light on the back of the watch drives a photoplethysmography (PPG) sensor, a heart rate monitor. Blood absorbs green light, so as each heartbeat changes the volume of blood in the capillaries of your wrist, the amount of light reflected back to the sensor varies. The watch translates that rhythmic variation into an accurate heart rate on your Pixel Watch screen.
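As a rough illustration of the principle (not Fitbit’s actual algorithm), a heart rate can be estimated from a PPG trace by counting the reflected-light peaks, one per heartbeat. The peak finder below is deliberately naive and assumes a clean, motion-free signal; all names and values are illustrative.

```python
import numpy as np

def estimate_heart_rate(ppg, sample_rate_hz):
    """Estimate heart rate (BPM) from a PPG trace by counting peaks,
    one reflected-light peak per heartbeat."""
    # Center the signal so heartbeat peaks sit above zero.
    centered = ppg - np.mean(ppg)
    # A sample counts as a peak if it is positive and larger than both neighbors.
    peaks = [i for i in range(1, len(centered) - 1)
             if centered[i] > 0
             and centered[i] > centered[i - 1]
             and centered[i] >= centered[i + 1]]
    duration_s = len(ppg) / sample_rate_hz
    return 60.0 * len(peaks) / duration_s

# Synthetic 10-second PPG trace at 25 Hz with a 72 BPM pulse.
fs = 25
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * (72 / 60) * t)
print(round(estimate_heart_rate(signal, fs)))  # 72
```

A real wearable pipeline has to cope with motion artifacts and noise, which is where the machine-learning filtering discussed below comes in; this sketch only shows the underlying counting idea.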

Tracking a resting heart rate is one thing, but thanks to machine learning, PPG sensors take things up a level with accurate measurements during activity. Between wind, sudden movements, heavy breathing, and everything else that comes with a workout, Pixel Watch algorithms learn to separate interference from genuine data.

Resting up

Heart rate monitoring isn’t just essential for activity levels. From sleeping in to sprinting out, Google’s machine learning and AI capabilities work to provide data and actionable insights for everyday experiences. Take sleep, for example: Rest is an essential part of overall wellness and readiness to take on the day, so the Pixel Watch’s sensors track movement, sleep stages, and inter-beat intervals to help you understand your sleep patterns and make improvements to your routine.³

Right on your wrist, you have a true partner in your wellness journey.

Tune in to the Made by Google Podcast to hear more from DeCarlos Love about Pixel Watch data insights and health tracking.

Transcript

Rachid Finge (00:00): Paras and Edward, welcome to the Made by Google Podcast. Great to have you. Just to start off, could you tell us a bit about your roles at Google and how you ended up here? Paras, maybe we can start with you.

Paras Unadkat (00:10): Yeah, so I'm a product manager on the Fitbit team, and I was the product lead for the fall detection feature. I work on a number of our sensor- and algorithm-based features, basically using the sensing technology on wearables along with our amazing machine learning capabilities at Google to create great features for our users.

Rachid Finge (00:34): Amazing. And what about you, Edward?

Edward Shi (00:37): Yeah, I'm also a product manager on our safety team, the Android and Pixel safety team. I've been working on products that aim to help users feel and be safer, such as car crash detection and, in this case, fall detection. When I heard about the fall detection feature from Paras, I thought it'd be a great way to collaborate, given the cross section between the algorithm and the safety themes of the feature.

Rachid Finge (01:00): All right. Well, let's bring this a little bit into reality. I actually fell down the stairs last December. Now let's say that had happened sometime this week, I was wearing my Pixel Watch, and I had fall detection enabled. Edward, what happens when I fall and I'm wearing my Pixel Watch?

Edward Shi (01:18): Well, first off, I hope you're okay. I think you're okay, and I'm glad to see that.

Rachid Finge (01:22): Yeah, pretty much.

Edward Shi (01:23): When Pixel Watch first detects a hard fall, it'll wait and see if it detects you moving. Now, if you don't move for about 30 seconds, it'll display a fall-detected on-screen notification to check in and see if you need help. During this time, it'll also vibrate and sound an alarm, so you know that Pixel Watch has detected a potential fall, and you can respond accordingly. In the case that you need help, you can tap ‘I fell’ and be connected immediately to emergency services, or if you're okay and don't need help, you can tap ‘I'm okay’ on your watch face to dismiss the notification. Otherwise, if you're unable to respond for about a minute, Pixel Watch will automatically attempt to call emergency services, play an automated message letting them know that it has detected a possible fall and that you didn't respond, and share your location so emergency services know where to send help. If you recover or you're able to speak, you can also talk to the emergency operator yourself and, if possible, let them know that you need help.
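The response flow Edward describes can be sketched as a small decision function. The timings, button labels, and return strings below are illustrative placeholders based on the episode's description, not Google's actual implementation.

```python
MOVEMENT_WAIT_S = 30   # stillness window before the on-screen check-in
RESPONSE_WAIT_S = 60   # further window before the automatic call

def fall_response(seconds_until_movement, tap=None):
    """Decide the watch's action after a hard fall is detected.

    seconds_until_movement: when the wearer first moves again
        (use float('inf') if they never do).
    tap: "I fell", "I'm okay", or None if the wearer never taps.
    """
    # Movement within the first window: the wearer recovered on their own.
    if seconds_until_movement < MOVEMENT_WAIT_S:
        return "no alert"
    # The wearer responded to the on-screen notification.
    if tap == "I fell":
        return "call emergency services"
    if tap == "I'm okay":
        return "dismiss notification"
    # Significant movement before the second window elapses cancels the call.
    if seconds_until_movement < MOVEMENT_WAIT_S + RESPONSE_WAIT_S:
        return "cancel automatic call"
    # No response at all: call automatically and share location.
    return "automatic call with location"

print(fall_response(5))                            # no alert
print(fall_response(float("inf"), tap="I fell"))   # call emergency services
print(fall_response(float("inf")))                 # automatic call with location
```

The two constants capture the balance discussed later in the episode: a first window so a wearer who recovers is never bothered, and a second so an unresponsive wearer still gets help.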

Rachid Finge (02:19): Okay. And we'll get back to those 30 seconds and the additional minute a little bit later. Now, Paras, I guess your job is to make sure that the watch detects the fall in the first place. So how does it do that?

Paras Unadkat (02:32): So, the fall detection feature uses an algorithm based on the motion sensors on your Pixel Watch, and it uses machine learning to differentiate between the motion signatures of fall events, like you were saying, falling down a flight of stairs, and things throughout your day-to-day life that might look like falls: exercising, doing burpees, kettlebell swings, and things like that. There's no way to build a perfect system, but by collecting a lot of data and training our models across all these different data sets and configurations, our model does a pretty good job of differentiating between different types of motion patterns.

Rachid Finge (03:07): I find it interesting because, like you mentioned, burpees give me one sort of headache, and I guess for you as an engineer they give a completely different kind of headache, where you need to be able to distinguish them from an actual fall. So how do you keep the two apart?

Paras Unadkat (03:22): So it really just comes down to the data sets that we've collected. By collecting a large data set of real and simulated falls, along with a huge data set of real-world activities, things like burpees, people driving in cars, different types of vehicles braking hard, all sorts of things like that, we've been able to build a model that does a really good job of learning the difference between those different motion signatures.

Rachid Finge (03:46): Okay. And we'll get back to how you train the model a little bit later. But Edward, we were just discussing the 30-second notification and the 60 seconds after which emergency services will actually be called if you're unresponsive after a fall. So how did you decide on 30 seconds at first, and then an additional minute, before these things kick in?

Edward Shi (04:06): Yeah, and to start, I did wanna say that at any time when a fall is detected, users can immediately contact emergency services by tapping the ‘I need help’ button on the on-screen notification. That's really important to us. That being said, we wanted to ensure that users have the opportunity to respond after a fall if they're able to. For instance, we know that sometimes when a user falls, they're able to recover themselves and don't actually need to call emergency services, so we wanted to give them the opportunity to dismiss that call during those 60 seconds if help isn't needed. At the same time, we know that unintentionally calling emergency services can be a really stressful experience for the user and burdensome for emergency call centers. So the 30 and the 60 seconds give users an opportunity to indicate that they're okay and cancel or stop automatic calls. And if we detect significant movement, we can also stop that automated call from going through. This is our approach, in essence, to balance getting users help quickly while minimizing stress caused by unintentional calls.

Rachid Finge (05:09): And how does that go during the development of the product? At some point you're gonna say it's not 25 seconds, not 35, it's going to be 30. So how does that come about?

Paras Unadkat (05:19): Yeah, so one of the big things that we were trying to do with our feature was make sure it's providing as much value as possible: being able to detect when our users actually need help versus when they're just doing burpees, or just getting up really quickly after an event happens. Because of that, we built in this stillness threshold, where we make sure that before we place any emergency calls, or even just bother the user and say, "Hey, it looks like you fell," they're actually motionless, or underneath a particular stillness threshold, after that fall event. The idea is that if the user actually needs help and is unable to make a call for themselves, we'll be there for them in that situation, but we won't spam them with alerts otherwise.
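The stillness threshold just described could look something like the sketch below: compare each accelerometer sample's magnitude against gravity (about 1 g at rest) and flag any large deviation as movement. The threshold value and function names are made-up placeholders, not product values.

```python
import math

STILLNESS_G = 0.15  # illustrative: allowed deviation from 1 g when at rest

def is_still(samples):
    """Return True if every accelerometer sample (x, y, z) in g-units
    stays close to 1 g in magnitude, i.e., the wearer is motionless
    and the sensor reads gravity alone."""
    for (x, y, z) in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if abs(magnitude - 1.0) > STILLNESS_G:
            return False
    return True

resting = [(0.0, 0.0, 1.0), (0.02, 0.01, 0.99)]   # near-pure gravity
moving = [(0.0, 0.0, 1.0), (0.8, 0.3, 1.4)]        # a jolt of acceleration
print(is_still(resting))  # True
print(is_still(moving))   # False
```

A check like this, run over the window after a detected fall, is what lets the watch skip the alert entirely when the wearer gets right back up.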

Rachid Finge (06:05): Right. So let's say I'm playing football, sorry, I actually mean soccer, that's football here in Europe, and I'm wearing a Pixel Watch and I fall, but I stand up immediately to continue the game. Then I won't get any notification, because the Pixel Watch notices and says, okay, it seems you're fine, you're moving again.

Paras Unadkat (06:21): We know you're okay. Exactly. We know you're okay. We don't wanna take you outta your game. We don't want to cause a bunch of chaos. So we just won't trigger at that point.

Rachid Finge (06:29): Perfect. Now for the people who don't know, within Google we have this tradition of what we call dogfooding, where we test products that are about to be launched with the whole Google population, so the product teams get as much feedback as they can. Now, I'm just wondering, Paras, did I miss any dogfood invitation for maybe, I don't know, volunteering to fall and train your model? I'm just so curious how you gather data about falling, hopefully without hurting people.

Paras Unadkat (06:58): Yeah, no, you definitely didn't miss any dogfooding communications about us collecting data from Googlers. We wanted to make sure we collected all of our data in a way that was safe, and nobody got hurt. So no Googlers were harmed in the creation of this product.

Rachid Finge (07:18): All right. And what was the actual way of doing it? How do you gather that data?

Paras Unadkat (07:23): Yeah, it's a really good question. Obviously, the problem of gathering fall data is quite difficult: you can't exactly ask people to just go and take falls for you, and there are a lot of different things that impact how a fall looks to a set of sensors. If you ask somebody to fall on a crash pad, that doesn't look like a real fall, because the crash pad is there and the level of impact is significantly attenuated. The impact that causes harm is the very thing you're trying to detect, which makes it really difficult to get people to actually do that. So we had a few different ways of going about this. Early on in the program, we worked with a number of university labs and external fall labs to really understand what a human fall looks like. We collected a ton of data that way through harness-assisted falls: we had people come into the lab, set them up on a bunch of elaborate rigs to have them lose balance, see how they react to that, and figure out what that all looked like, to really understand the different components of a fall, the different types of falls, and that sort of thing. We also had other mechanisms for collecting real-world and simulated data. And one of the biggest problems you run into in building a model like this is dataset variability: you can collect a number of falls, but you can't collect every single type of fall in the world.

Paras Unadkat (08:50): And there are so many different ways and configurations in which people might fall and might be harmed. So one of the ways we approached this was a partnership with the Google research team to use AI and computer vision: we took videos that we'd collected of people falling, mapped them into a simulation space, and simulated the physics of different fall types to augment our data set of real-world falls. We still had that big data set of real falls, but this let us make it even bigger and collect a really massive variety of different falls, to make sure our algorithm was picking up on all the varieties across different scenarios. We were able to tune things like different lengths of limbs, different body types, different weights, different parts of the body, different surfaces people fall on, slopes, just different ways of falling, to really make sure we had broad coverage. And then one of the more interesting ways we were able to validate that our algorithm actually works on real-world data: our entire team drove down to LA and worked with a team of stunt doubles for a week. We spent the entire week basically directing this team of stunt doubles, saying, hey, here are different types of falls that we wanna validate against. Like your example of people falling down flights of stairs: we've actually validated that our algorithm works in the real world on somebody falling down a flight of stairs. We had people getting into bike crashes, falling off ladders, even getting hit by cars. We had one day where we went down to the Six Flags parking lot and just had a bunch of people getting hit by cars.

Rachid Finge (10:27): That's amazing.

Paras Unadkat (10:27): And, you know, it was incredible. But I think the combination of all these things gave us a lot of confidence that our algorithm and our feature work really well and can actually detect these types of events when they happen in the real world.

Rachid Finge (10:42): I think we can hopefully safely add a disclaimer that no people were harmed during the development of this feature, right? Maybe that's a good thing to say.

Paras Unadkat (10:50): It was done very safely. You know, the people we collected this data with were professionals; they do this type of thing and know how to be in these situations safely, which is why we would never do it with a Googler or a person who doesn't have experience with this. But it was all done very safely, with all the right safety precautions in place, and really with an eye towards collecting as much realistic real-world data as possible in the safest possible way.

Rachid Finge (11:16): Now, Edward, you also worked with the safety team on something like crash detection in a car. I'm wondering, from a product perspective, when do you decide that the system is good enough to go out to real users? Is there something you can say about that?

Edward Shi (11:31): Truthfully, the balance is a little bit nuanced, and we're always looking at the data in particular. A lot of the criteria that Paras looks at for fall detection also apply in similar ways to crash detection. We really look at, okay, what are the different scenarios that we want to capture? We wanna make sure we capture, or optimize for, crashes above a certain speed, for example. And so when we have our data and we look at those particular scenarios that we're really optimizing for, we wanna make sure of both ends: of course, that an actual crash is detected, and at the same time, that anything that may look like a crash but isn't one is mitigated. It's striking a good balance between those two, and that's what we want to see in the data before we actually release.

Rachid Finge (12:17): Amazing. And of course, fall detection is new on Pixel Watch for people who own the watch. I'm just wondering, from maybe your personal experience testing it, have you already experienced a benefit from it? Or have you seen testers who were already happy that they had the feature on the device?

Edward Shi (12:35): It's funny that you mention it. When it was said that no one was harmed, you know, I did take some voluntary falls, just out of my own curiosity, and I may have had a bruised elbow, but it was worth it. In that sense, I've been fortunate that I haven't had an actual fall, but I'm definitely relieved, and I have that extra peace of mind knowing that if I did take a hard fall, I trust and believe in our product to help me if I need it.

Rachid Finge (13:01): Amazing. Definitely. You took one for the team. So that's absolutely great. Now we close every episode of the Made by Google podcast with a top tip for our listeners. So how can they get the most out of fall detection?

Paras Unadkat (13:14): I think one big thing with fall detection in general is that a lot of times people have this image of falling as something that's really only a concern for people at risk of falling, right? You kind of imagine those old Life Alert ads, the "I've fallen and I can't get up" type of thing. But really, falls are a much broader, more widespread problem. Kind of to your point about tripping and falling down a flight of stairs, or doing household chores: you're climbing up on a stool for something, the stool tips over, and you fall and hurt yourself. There are all sorts of situations you can get yourself into that a feature like this could be really valuable for. We were very focused on keeping all that in mind while developing this feature. So it really is a feature that's built for everyone and can add this element of peace of mind and safety no matter who you are.

Rachid Finge (14:06): Amazing. And indeed, I think it's especially useful for people who might have a harder time getting up, definitely the case. And Edward, to use fall detection, is there anything you need to be mindful of? Do I need to enable anything on the device or in settings?

Edward Shi (14:22): Yeah, tactically, I'd say for fall detection specifically, you wanna make sure you've granted the personal safety location permission on your Pixel Watch. We can still operate without that, but in the case that you do fall, we wanna make sure we can share your location with emergency services so they know precisely where to send help. So that's my top tip for fall detection. And then separately, I know we talked about crash detection here; please feel free to check out our other safety features as well. We have a number of features to help give you peace of mind, from crash detection to emergency location sharing, all to hopefully help everyone be a little safer.

Rachid Finge (14:58): Excellent. Paras and Edward, thank you so much for your time. Hope you stay safe out there while you undoubtedly test new personal safety features for Pixel Watch. Thank you so much for talking to us.

Edward Shi (15:09): Thank you.

  1. Works with most phones running Android 8.0 or newer. Requires a Google Account and internet access. Paid subscription required for some features. See g.co/pixelwatch/specs for technical and device specifications. Some features may require Fitbit account and mobile app.  

  2. The Health Metrics dashboard should not be relied on for any medical purposes. It is intended to provide information that can help you manage your well-being.  

  3. Some features may require Fitbit account, mobile app and Fitbit Premium membership. Not intended for medical purposes. Consult your healthcare professional for questions about your health.