Alphabet's (GOOGL -1.97%) (GOOG -1.96%) Google segment recently held an event called The Checkup by Google Health, in which teams across the company discussed advancements being made in healthcare around the globe. In this episode of "The Health & Fitness Show" on Motley Fool Live, recorded on March 25, Fool.com contributor Brian Orelli talks about several specific initiatives in Google's "companywide effort to help billions of people be healthier."

Brian Orelli: They bought Fitbit recently, and they're using Fitbit for more than just tracking people's steps. They got an FDA approval for an EKG app that runs on the Fitbit, and they've tested its ability to detect atrial fibrillation. Some 25% of people don't learn that they have A-fib until after they've had a stroke or heart attack. The prospective study that they ran found the app could identify A-fib 98% of the time. That's pretty interesting and useful. They're also partnering with CVS MinuteClinic: when you do a search for a CVS MinuteClinic, you'll be able to book your appointment right on the Google website, basically the way you book a hotel room. If you search for hotels in a city, you have the ability to book right there, and they're doing the exact same thing with CVS MinuteClinics. Then they're hoping to add additional partners.

The YouTube division has partnered with authoritative health experts to provide videos, and it's highlighting those videos at the top of search results. Just as a search for a news event surfaces articles from authoritative outlets, YouTube is doing the same thing with these videos: when you search for a specific disease, you'll get videos from these partners. They didn't say this, but the reason they're doing it is to push less authoritative videos on the same subject further down the page. Because they know the partner videos are authoritative, they promote them in their own special section. They also have a partnership with the New England Journal of Medicine, which has its own channel and is creating videos based on the research published in the journal.

Then on the AI side, they've developed something called Automated Retinal Disease Assessment, or ARDA. They used AI to train the system to detect diabetic retinopathy, a disease of the eye that's often associated with diabetes. They ran a prospective study of ARDA, and it identified diabetic retinopathy with accuracy comparable to eye specialists. The idea is that it could be used in rural areas where the ratio of population to eye specialists is much higher: you would screen everyone with ARDA, and only the people who showed positive for diabetic retinopathy would be referred to an eye specialist. They're also hoping that, as cellphone cameras keep improving, they could eventually use a cellphone camera rather than a special device. And they're developing something called Derm Assist, which is basically the same thing, but instead of diabetic retinopathy, it looks at issues on your skin to determine whether the machine thinks they look suspicious enough that you should go have a biopsy.
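Google hasn't published ARDA's code, but the screen-and-refer workflow Orelli describes is easy to picture. The sketch below is purely illustrative: it assumes a hypothetical score_retinal_image model that returns a probability of referable diabetic retinopathy, then refers only the patients above a threshold to a specialist.

```python
# Illustrative sketch only -- not Google's ARDA code.
# Assumes a hypothetical model function that scores a retinal photo
# and returns the probability of referable diabetic retinopathy.
from typing import Callable, Dict, List


def triage_patients(
    images: Dict[str, bytes],
    score_retinal_image: Callable[[bytes], float],
    referral_threshold: float = 0.5,
) -> List[str]:
    """Return the patient IDs whose retinal photos should go to an eye specialist."""
    referrals = []
    for patient_id, image in images.items():
        probability = score_retinal_image(image)  # hypothetical AI model call
        if probability >= referral_threshold:
            referrals.append(patient_id)
    return referrals


if __name__ == "__main__":
    # Stand-in "model" so the example runs: flags images whose payload is large.
    fake_model = lambda image: 0.9 if len(image) > 100 else 0.1
    photos = {"patient-001": b"x" * 200, "patient-002": b"x" * 50}
    print(triage_patients(photos, fake_model))  # ['patient-001']
```

The point of the design is the funnel: the model doesn't replace the eye specialist, it just narrows down who needs to see one.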

On Google Fit, which is their app, they've developed a way to measure heart and respiratory rate using the phone's camera, and now they're testing whether they can detect heart and respiratory issues using the phone's microphone. Then they have these things called Google Tags, which they're using to measure the movement of joints. You've probably seen motion-capture setups where they put reflective balls all over a person and turn the movement into a computer model; there are labs that do basically the same thing, but they're pretty limited, and Google wants to measure movement in real-world settings. You would wear these little tags, which basically have a computer in them. The tag on your upper leg can figure out what angle that segment is at, talk to the tag on your lower leg, and together they can work out the angle of your knee. That way they could monitor people recovering from knee surgery, see how well each person is improving, and bring back in the ones who are having problems with their recovery.
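The tags' firmware isn't public, but the knee-angle idea Orelli describes comes down to simple geometry: if each tag reports the direction its leg segment is pointing, the knee angle is just the angle between the two segment vectors. Here is a minimal sketch with made-up sensor readings:

```python
# Illustrative sketch only -- not Google's actual tag firmware.
# Assumes each wearable tag reports a 3D vector for the direction
# its leg segment (thigh or shank) is pointing.
import math


def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm1 = math.sqrt(sum(a * a for a in v1))
    norm2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to avoid floating-point values just outside [-1, 1].
    cos_theta = max(-1.0, min(1.0, dot / (norm1 * norm2)))
    return math.degrees(math.acos(cos_theta))


# Made-up readings: thigh pointing straight down, shank swung back about
# 40 degrees, as you might see during a rehab exercise.
thigh = (0.0, -1.0, 0.0)
shank = (0.0, -math.cos(math.radians(40)), math.sin(math.radians(40)))

knee_flexion = angle_between(thigh, shank)
print(f"Knee flexion: {knee_flexion:.1f} degrees")  # ~40.0
```

Tracked over days of recovery, a number like that is what would tell a surgeon whether a patient's range of motion is coming back on schedule.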

They have a new thing called Care Studio, which is for doctors looking at patients' electronic health records. There's so much data in an electronic health record; the idea is for Care Studio to bring the most important things up front so the doctor can see them. And because Google is big on search, they want doctors to be able to use search to pare the data in the electronic health record down to exactly what they want to see at that moment. They just established a partnership with Meditech, which is in electronic health records. Google isn't trying to duplicate or build a new electronic health record with Care Studio; they're trying to design it so doctors have the best possible data, and it will be integrated into other companies' electronic health record systems.
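Care Studio's internals aren't public, but the "search instead of scroll" idea can be shown with a toy example: a keyword filter over a pile of chart entries that surfaces only what the clinician asks for. Everything below, including the record format and the entries, is invented for illustration.

```python
# Illustrative sketch only -- not Care Studio's actual implementation.
# A toy "chart search": filter a patient's record entries by a clinician's query.
from typing import Dict, List

chart_entries: List[Dict[str, str]] = [  # made-up records
    {"date": "2022-01-10", "type": "lab", "text": "HbA1c 8.2% (elevated)"},
    {"date": "2022-02-02", "type": "note", "text": "Patient reports blurred vision"},
    {"date": "2022-02-15", "type": "med", "text": "Metformin 500 mg twice daily"},
    {"date": "2022-03-01", "type": "lab", "text": "Blood pressure 128/82"},
]


def search_chart(entries: List[Dict[str, str]], query: str) -> List[Dict[str, str]]:
    """Return only the entries whose text mentions the query term."""
    q = query.lower()
    return [e for e in entries if q in e["text"].lower()]


for hit in search_chart(chart_entries, "hba1c"):
    print(hit["date"], hit["text"])  # 2022-01-10 HbA1c 8.2% (elevated)
```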

Then there are two more. Google is working with the WHO, the World Health Organization, in low- to middle-income countries. Smartphones are fairly plentiful there, but there's no standard for the health apps that run on them. The WHO and Google are building tools around a standard called Fast Healthcare Interoperability Resources, or FHIR. The idea is that all the different smartphone apps for rural doctors could be built on the same standard, which would let the apps talk to each other and make life a lot easier for the doctors using them. They're also designing it to store data locally, so the apps don't need a remote connection.

Finally, they're looking at ultrasound assessments, using small probes that connect to a cellphone. Apparently 50% of women, especially in low- and middle-income countries, aren't screened with an ultrasound when they're pregnant. The idea is to train midwives to use these small probes and a cellphone to identify high-risk patients, with the goal of cutting infant mortality in half. That's all the stuff they went over, and it was really interesting to me.
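To make the interoperability point a bit more concrete: FHIR defines common, JSON-friendly "resources" that any compliant app can read. The sketch below is a minimal heart-rate Observation in roughly the standard FHIR shape; the patient reference and values are made up. Because every app agrees on this structure, a reading recorded offline by one app can later be understood by another.

```python
# A minimal FHIR-style Observation (heart rate), shown as a Python dict.
# The structure follows the public FHIR spec in broad strokes; the patient
# reference and values are invented for illustration.
import json

heart_rate_observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [
            {"system": "http://loinc.org", "code": "8867-4", "display": "Heart rate"}
        ]
    },
    "subject": {"reference": "Patient/example-123"},  # made-up patient ID
    "effectiveDateTime": "2022-03-25T10:30:00Z",
    "valueQuantity": {
        "value": 72,
        "unit": "beats/minute",
        "system": "http://unitsofmeasure.org",
        "code": "/min",
    },
}

# Serialized to JSON, this is what one app could hand to another,
# or store locally until a connection is available.
print(json.dumps(heart_rate_observation, indent=2))
```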