How GE HealthCare and NVIDIA Are Partnering to Deliver Better Health Outcomes

Expert Analysis: How GE HealthCare and NVIDIA Are Using AI to Revolutionize Imaging, Diagnostics, and Global Health Access


Published: April 11, 2025

Zeus Kerravala

When we hear about artificial intelligence, it’s often about the chatbots and virtual assistants we encounter daily. However, some of the most interesting—and valuable—applications for AI occur in healthcare, where people’s lives are on the line.

At NVIDIA’s recent GTC conference in San Jose, the company’s technology partners took part in the event’s 1,000 sessions. Two experts from GE HealthCare delivered one of the week’s most interesting and meaningful presentations.

NVIDIA and GE HealthCare announced a collaboration to advance innovation in autonomous imaging. GE HealthCare will focus on developing ultrasound and X-ray applications using the new NVIDIA Isaac for Healthcare medical device simulation platform.

The Isaac platform includes pre-trained models and physics-based simulations of sensors, anatomy, and environments. The platform accelerates R&D workflows, which enables GE HealthCare to train, test, and validate autonomous imaging system capabilities in a virtual environment before they are deployed for patient care.

Roland Rott, president and CEO of imaging, and Parminder Bhatia, chief AI officer at GE HealthCare, presented how their company is transforming diagnostic imaging by applying cutting-edge AI technologies. The pair walked through several of the innovative AI applications GE HealthCare is using to improve medical diagnoses and patient outcomes. Here are some of the highlights.

Addressing key healthcare challenges

Rott laid the foundation for what GE HealthCare strives to accomplish with its medical applications by focusing on some key challenges facing medical teams and patients. He said cancer, cardiovascular diseases, and the fact that “4.5 billion people don’t even have proper access to health care” are among the most significant global healthcare challenges.

He said even when a patient has access to health care, traditional processes can move too slowly to diagnose and treat a serious illness. He described the case of a woman in her 50s who spent months seeing different specialists and undergoing a range of tests, then had to wait three more months for all the data to be analyzed and shared by a team of medical professionals before a diagnosis could be made and a treatment plan started. He said this lengthy process underscores how fragmented the healthcare system still is in how all these constituents work together.

Rott said the average hospital generates about 50 petabytes of data annually but only uses a fraction of it. He cited a study that found 97% of that data is simply generated and stored; it is never really retrieved. He also noted that about 60% of doctors spend more time with electronic medical records (EMRs) than with patients because they are working to retrieve the right data and make sense of it.

This is a problem I’ve talked to many healthcare experts about. Healthcare generates a veritable cornucopia of data, but most of it sits in silos, and no one is able to connect the dots between the data points. This is where AI can create a step function in advancing healthcare.

On the surface, more diagnostic testing and the ability to generate data about a patient seem like advantages for medical teams. However, all that time spent examining medical records, in addition to their many other responsibilities, increases the stress on medical professionals. This stress leads to churn due to burnout and frustration.

Bhatia said GE HealthCare is examining how to tackle these challenges with its D3 strategy, which the company defines as “using smart devices across disease states enabled by digital tools” to enable precision care tailored to the patient and “addressing systemic challenges like burnout and patient backlogs.”

AI-powered innovations

Bhatia said the company’s D3 strategy is focused on addressing the following questions: “How do we make our devices smarter? How do we integrate more smartness into our devices with AI? Secondly, how do we provide digital solutions for patients across their care journey, from screening to diagnosis, treatment, and therapy? Finally, how do we address the key challenges and burden on providers with the amount of overwhelming data that is getting created, which is multimodal in nature?”

The company’s goal of making its devices smarter is reflected in AIR Recon DL, a deep learning-based image reconstruction technology for its magnetic resonance imaging systems, launched in 2020. He stated, “It makes our images scan faster and accurately by reducing the scan time by 50%. Now, what does it mean for providers? In the past, if they could do three scans in an hour on an MR machine, now they can go through up to six scans with the same resources. So there’s a huge lift on the throughput for the providers, as well as providing earlier access to the patients in need.”

Bhatia explained GE HealthCare is working to streamline hospital operations with an AI-enabled command center. The system uses predictive analytics to reduce the length of hospital stays for patients. He described how Deaconess Health Care of Indiana leveraged Command Center to improve capacity utilization, resulting in 2,000 more beds annually. He added that GE HealthCare is looking to bring in more AI and innovation to streamline the process further, focusing on staffing, patient flow, and bed management so that problems can be predicted days or weeks in advance, solutions can be recommended, and the system becomes more proactive.
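
To make that kind of prediction concrete, here is a minimal Python sketch, with entirely invented census numbers, of how a hospital might forecast bed demand a few days ahead from recent occupancy patterns. It only illustrates the general idea; a real command center uses far more sophisticated models than a weekday average.

```python
import statistics
from collections import defaultdict

# Toy illustration only: forecast bed demand a few days ahead from recent
# daily occupancy, using weekday averages as a crude stand-in for the
# predictive models a real command center would use. All numbers are invented.
recent_census = [  # (weekday, occupied_beds) for the past three weeks
    (0, 412), (1, 431), (2, 445), (3, 438), (4, 420), (5, 388), (6, 375),
    (0, 418), (1, 436), (2, 450), (3, 441), (4, 425), (5, 391), (6, 380),
    (0, 421), (1, 440), (2, 455), (3, 446), (4, 428), (5, 394), (6, 383),
]
capacity = 460
days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

by_weekday = defaultdict(list)
for weekday, beds in recent_census:
    by_weekday[weekday].append(beds)

# Forecast next week's demand and flag days likely to run short on beds.
for weekday in range(7):
    expected = statistics.mean(by_weekday[weekday])
    utilization = expected / capacity
    flag = "  <-- plan extra staffing/discharges" if utilization > 0.95 else ""
    print(f"{days[weekday]}: expect ~{expected:.0f} beds ({utilization:.0%} of capacity){flag}")
```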

Better imaging quality with AI

Even with the incredible capabilities medical imaging delivers, the GE HealthCare team is continuously working on further enhancements. These include “smarter devices which actually have logic inside, which are AI-enabled, which give guidance to users and ultimately make them more confident or accelerate a diagnosis,” Rott said.

The objective, he said, is to acquire better images for every patient. Cardiac MRI, for example, is typically a very long procedure, taking up to 90 minutes per patient and requiring patients, who are often elderly and all have heart issues, to hold their breath repeatedly so the machine can capture images as clearly as possible.

To solve that problem, GE HealthCare is applying deep learning to the raw MRI data itself. “We take the raw data… to identify and reduce artifacts, and ultimately to optimize this particular exam…for a single heartbeat,” Rott explained. “All these images get optimized. The noise gets out of the signal. We have been able to improve the speed here by 12 times for this particular examination. At the same time, the image quality is better.”

He said the overall exam time could be reduced by 83%… to a few minutes for this particular, important examination. “It can actually mean that certain patients can be assessed the first time with this technology. And it also means that for a healthcare system, for the physicians, and the hospital, they can ultimately provide access to patients faster, and they can ultimately also run more patients through their equipment,” stated Rott.
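
To illustrate the general idea Rott described, here is a toy Python sketch of reconstructing an image from noisy raw MRI data (k-space), with a simple coefficient-thresholding filter standing in for the trained deep-learning model. Everything here, from the square “anatomy” to the noise level, is invented for the example; it is not GE HealthCare’s reconstruction pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "anatomy": a bright square on a dark background.
clean = np.zeros((128, 128))
clean[40:88, 40:88] = 1.0

# Raw MR data lives in k-space, the 2D Fourier transform of the image.
kspace = np.fft.fft2(clean)

# Simulate measurement noise on the raw data.
noise = 20.0 * (rng.standard_normal(kspace.shape) + 1j * rng.standard_normal(kspace.shape))
noisy_kspace = kspace + noise

# Naive reconstruction: inverse FFT of the noisy raw data.
naive = np.abs(np.fft.ifft2(noisy_kspace))

# Stand-in for the learned model: keep only the strongest 5% of k-space
# coefficients, where the signal is concentrated, and zero out the rest.
threshold = np.percentile(np.abs(noisy_kspace), 95)
filtered = np.where(np.abs(noisy_kspace) >= threshold, noisy_kspace, 0)
denoised = np.abs(np.fft.ifft2(filtered))

print("naive reconstruction error:   ", round(float(np.linalg.norm(naive - clean)), 1))
print("filtered reconstruction error:", round(float(np.linalg.norm(denoised - clean)), 1))
```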

He proudly added that the company has similar successes in CT and ultrasound scans, X-rays, and mammography. “We can apply these principles broadly, and that has been a major breakthrough over the last several years,” he added.

Using imaging devices such as ultrasound in conjunction with AI can be extremely valuable, whether a patient is far from a large medical center or in the middle of one with many other patients waiting for their turn under the imaging machines. “It’s like in your car: your navigation system will tell you where to go, when to turn left, when to turn right,” explained Rott.

“Similar concept here. You apply an ultrasound, you hold it in a certain position, and the system will, with AI feedback, let you know whether you are at the right angle, whether you have the right depth, and ultimately give a confidence indicator, a quality meter, of whether what you see here is really diagnostically relevant and good enough, and then ultimately save these best images right away.”
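
As a rough illustration of that navigation-style feedback, here is a toy Python sketch that scores a probe reading against a target angle and depth and emits guidance, with a simple confidence score standing in for the quality meter. The targets, tolerances, and weights are invented; this is not GE HealthCare’s algorithm.

```python
from dataclasses import dataclass

# Purely illustrative: score a probe reading against a target view and give
# navigation-style feedback, the way a guidance feature might nudge a user.
@dataclass
class ProbeReading:
    angle_deg: float   # probe tilt relative to the skin surface
    depth_cm: float    # imaging depth setting

TARGET_ANGLE, TARGET_DEPTH = 45.0, 6.0  # invented target view for the example

def quality_score(reading: ProbeReading) -> float:
    """Return a 0-1 confidence that the view is diagnostically usable."""
    angle_penalty = min(abs(reading.angle_deg - TARGET_ANGLE) / 30.0, 1.0)
    depth_penalty = min(abs(reading.depth_cm - TARGET_DEPTH) / 4.0, 1.0)
    return round(1.0 - 0.5 * (angle_penalty + depth_penalty), 2)

def guidance(reading: ProbeReading) -> str:
    hints = []
    if reading.angle_deg < TARGET_ANGLE - 5:
        hints.append("tilt the probe up")
    elif reading.angle_deg > TARGET_ANGLE + 5:
        hints.append("tilt the probe down")
    if reading.depth_cm < TARGET_DEPTH - 0.5:
        hints.append("increase depth")
    elif reading.depth_cm > TARGET_DEPTH + 0.5:
        hints.append("decrease depth")
    return "; ".join(hints) if hints else "hold position and save the image"

for reading in [ProbeReading(30, 8.0), ProbeReading(44, 6.2)]:
    print(f"quality={quality_score(reading)} -> {guidance(reading)}")
```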

Rott said these are examples of smart devices GE HealthCare is building to support the patient journey. “We are currently working with the Bill & Melinda Gates Foundation to bring these technologies to low- and middle-income countries,” he said.

A cloud-based application for diagnosing cancer

Bhatia discussed the time burden on physicians who must learn about each patient in depth to treat them appropriately. “Today, physicians are spending a significant amount of time getting up to speed on the patient’s history, where, for a new patient, it can actually take up to two hours. The process is time-consuming and frustrating for multiple reasons because they have to search through large amounts of unstructured, multimodal data that is spread out across multiple silos and is fragmented for them to come out with the best treatment options.”

In 2024, GE HealthCare introduced CareIntellect for Oncology, a cloud-based application that, Bhatia said, “helps you to synthesize this multimodal healthcare data that is accessible in a single pane of glass to actually streamline decision-making.”

CareIntellect for Oncology enables medical teams to use AI not just at the output level but to “aggregate or synthesize this unstructured and multimodal data, bringing in more benefits across the value chain as well. It actually leverages Gen AI capabilities to summarize patient data so that meaningful insights and actions can be taken across the spectrum of things as well. Providing this longitudinal, multimodal data synthesis can actually empower clinicians toward looking into the disease, how the disease is doing, how is the disease progression happening, as well as flag potential deviations that can happen from the treatment,” Bhatia said.
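
To show what longitudinal, multimodal synthesis can look like in the simplest possible terms, here is a hypothetical Python sketch that collates records from different silos into one chronological timeline and packages it as context for a generative summarizer. The patient records are fabricated for illustration, and no CareIntellect APIs are shown, only the general pattern.

```python
from datetime import date

# Invented sample records standing in for data scattered across silos:
# each tuple is (date, source/modality, note).
records = [
    (date(2024, 3, 2),  "pathology", "Biopsy confirms invasive ductal carcinoma."),
    (date(2024, 3, 18), "radiology", "CT chest/abdomen: no distant metastases."),
    (date(2024, 4, 5),  "oncology note", "Started neoadjuvant chemotherapy, cycle 1."),
    (date(2024, 6, 20), "radiology", "MRI: primary lesion reduced from 3.1 cm to 1.8 cm."),
    (date(2024, 7, 1),  "labs", "Neutropenia grade 2; dose delay recommended."),
]

# Step 1: a single longitudinal timeline instead of per-silo views.
timeline = sorted(records)
for when, source, note in timeline:
    print(f"{when}  [{source:<14}] {note}")

# Step 2: package the timeline as context for a generative summarizer.
# (The actual call to a model is omitted; only the prompt assembly is shown.)
prompt = (
    "Summarize this patient's cancer treatment course, note how the disease "
    "is progressing, and flag any deviations from the treatment plan:\n"
    + "\n".join(f"{when} ({source}): {note}" for when, source, note in timeline)
)
print("\n--- prompt for the summarization model ---\n" + prompt)
```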

The future of healthcare

GE HealthCare has a significant goal of making medical data—especially imaging results—more useful for medical teams that diagnose and treat potentially life-threatening conditions.

“The future is very much enabled by what’s possible today. There’s so much more possible today than what was possible five years or 10 years ago,” said Rott.

GE HealthCare launched a research project, Health Companion, where multiple AI agents work together “much like a virtual tumor board,” said Bhatia. “We are leveraging agent AI technology on one of the hardest problems in cancer treatment, which is disease progression; determining how to treat cancer hinges on what the cancer is doing at a particular point in time. Answering this question requires going into longitudinal, multimodal data that spans multiple disciplines.”

“One agent is looking into the clinical data, another agent is looking into the biological, another is looking into the radiological, genomic, and even into the coverage data,” explained Bhatia.

“These agents then provide their recommendations to the supervisory agent, and then a supervisory agent collates that information and then comes out with a best treatment plan.”
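
A minimal Python sketch of that supervisory pattern, with stubbed-in rules standing in for the specialist agents and a simple vote standing in for the supervisor’s reasoning, might look like the following. The inputs and decision rules are invented; this illustrates the orchestration pattern, not GE HealthCare’s Health Companion.

```python
from collections import Counter

# Minimal sketch of the pattern: specialist "agents" (stubbed out here) each
# review one slice of the patient record and return a recommendation, and a
# supervisory agent collates them. The inputs and rules are invented.
def clinical_agent(record):
    return "continue current regimen" if record["performance_status"] <= 1 else "switch regimen"

def radiology_agent(record):
    return "switch regimen" if record["tumor_growth_pct"] > 20 else "continue current regimen"

def genomics_agent(record):
    return "add targeted therapy" if "EGFR" in record["mutations"] else "continue current regimen"

def coverage_agent(record):
    # Checks whether the candidate therapy is reimbursable before recommending it.
    return "add targeted therapy" if record["targeted_therapy_covered"] else "continue current regimen"

def supervisory_agent(record):
    votes = Counter(agent(record) for agent in
                    (clinical_agent, radiology_agent, genomics_agent, coverage_agent))
    plan, support = votes.most_common(1)[0]
    return {"proposed_plan": plan, "supporting_agents": support, "all_votes": dict(votes)}

patient = {
    "performance_status": 1,
    "tumor_growth_pct": 35,
    "mutations": ["EGFR"],
    "targeted_therapy_covered": True,
}
print(supervisory_agent(patient))
```

In a real system each agent would reason over its own data sources and the supervisor would weigh evidence rather than count votes, but the division of labor is the same.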

Both Rott and Bhatia said the partnership between GE HealthCare and NVIDIA, which has been innovating for 15 years, is ready to continue blazing a trail to better healthcare results by leveraging technology far into the future.
