Q&A: Verily on using generative AI within healthcare

Alphabet’s life science subsidiary Verily announced a strategic restructuring last year, shifting its focus to using AI and data science to improve precision health options. 

Andrew Trister, chief medical and science officer at Verily, sat down with MobiHealthNews to discuss the company’s use of AI technology, its work with tech giant Google and what excites him about the business’s future.

MobiHealthNews: Can you tell our audience about Verily and what it does?

Andrew Trister: Verily came out of Google X in 2015. It has had a long history of looking across the entire ecosystem in healthcare, starting with research. So, building devices, finding new ways to measure disease (kind of the metrology problem), and being able to do care delivery. That care delivery has been largely through Onduo, our diabetes care management system. And then working in care financing. So, Granular is a stop-loss insurance product.  

So, if you think about each of the different pain points across how a person gets better care, you can think about the applications of having better measurements. That’s the kind of data generation question, making greater insights and knowledge from that data and then taking action. 

And so these are all of the parts that have been built since the initial investment made by Google X, and we see ’24 into ’25 as the time where all of these things become mutually reinforcing so we can really drive differentiated viewpoints on how people access their care in an equitable way and then have better outcomes.

MHN: Does the company often work alongside Google?

Trister: So, there have been a number of projects that have started within Google Health Research that have come over into Verily to build products. Some of those have been almost as partnerships across a memorandum of understanding between the two companies.  

The best example of that is there was an algorithm that was published in Nature by Google Health Research, looking at the back of the eye, so images of the retina, to determine whether a person might have diabetic retinopathy. They also went on to do other things like classification of whether it’s a male or female eye, things that humans couldn’t do to demonstrate the utility of the AI application.  

But the issue with any algorithm like that is where does that fit into the workflow today, and how do you really bring impact to people? So one of the major questions that arose, and this is where the partnership with Verily and the hardware excellence that is within Verily came to bear, it really came to the point of how do you even obtain some of the images of the back of the eye?  

So we built an almost fully automated retinal camera, called the Verily Retinal Camera, and that device allows a person to take back-of-eye images, and then there's AI on top of it. We used Google's AI, and then we're working with other companies that have built other AI models as well that will demonstrate this utility, starting first with classification diagnostics and then looking at the larger horizon. We're excited about the applications of just what you can measure, similar to Google's efforts to do things like sex classification.  

You don’t necessarily need to do that from retinal images, but maybe there are other diseases that we could determine from those types of things. So, once you have the device in place, what else could you do? So that’s an area of continuous discussion.

MHN: Is Med-PaLM something Verily is looking to utilize within its offerings?

Trister: We’ve been discussing the utility of Med-PaLM, the models there, and how we might be able to leverage novel multimodal approaches. So there’s Med-PaLM, and obviously, there’s a lot of work being done on Gemini. So, we’re certainly exploring what that could look like, but we’re not fixed to only working with Google. 

We’ve been looking at other applications. We can be agnostic to which of the major [genAI] models are out there if we can find that we have the right data infrastructure in place. So that’s a lot of what we have built that’s differentiated at Verily, different from, say, Azure or Google Cloud or AWS. And then what really brings impact for people.

I think a lot of what generative AI applications have been focused on has been more on the back office piece. We do some of that work primarily through our insurer, but where we see the biggest change that could be made using these tools is going to be in front of people even before they become patients. So, how can we help people navigate this crazy thing that is our healthcare system?  

MHN: What are you most excited about within Verily right now? 

Trister: There are so many really strong engineering applications and really hard problems in healthcare that Verily has decided to just tackle head on. But some of them in the past have been kind of siloed projects.  

So, the Retinal Camera, as an example, is a tremendous product, but it isn’t built in a way that, you know, holistically drives across all of the different pain points that we see within health systems. We’re now in a position where we can start to tie things together in novel ways.  

If we can start to show that things actually become mutually reinforcing across multiple different points, I think that’s where the real value is created for people because they can live healthier, better lives and not drive costs up.

Technology has been such a major driver of cost in healthcare for decades at this point that this may end up bending the cost curve if we run it down far enough into the future.