by Mark Stewart

AI promises to play a game-changing role in radiology, but the algorithms have a long way to go.

No topic has been hotter in the media this year than the coming of artificial intelligence. AI is getting better and getting faster and, some fear, getting a bit ahead of our ability to understand and control it. That is up for debate—and there will be debate—but most of AI’s proponents are more focused on the best ways to harness it. AI has the awesome potential to make us smarter, safer, faster and more productive. And that has people in the medical profession very interested.

The fact is that artificial intelligence is nothing new. You’ve been using it since the first time you followed directions from a GPS app. AI has been a part of medicine for a while now, too. Computer-aided detection, for example, was being used in mammography in the 1990s. And many FDA-cleared algorithms are currently in use, including at Trinitas. What’s new are some profound advances in deep learning, which combines algorithms with massive amounts of input data to create a level of “expertise” that will enable doctors to perform faster and more efficiently. And theoretically, one day, AI will be able to see and predict things that humans cannot.

In radiology, AI has already revealed its game-changing potential. The digital images that have traditionally been interpreted by radiologists are now being translated into quantitative data, which is then used to create algorithms to detect anomalies that might be invisible to the human eye. This in turn can alert a doctor to a situation that he or she can address sooner than ever before. If you know someone in the radiology field, then you know that it can be a high-stress job with little or no margin for error—either missing something important or producing a false positive. Sophisticated AI-supported clinical decision-making can only help.

Far From Perfect

Dr. Albert Li, a vascular and interventional radiology specialist—who considers himself the “champion of AI” in his practice—cautions that, although the potential is exciting, currently it’s far from perfect. “Right now,” he says, “it’s meant to augment the experience of radiologists, who are trained to interpret what the AI is telling them and then either accept it or dismiss it. The thing to understand about AI is that it doesn’t ‘know’ what it’s looking at—or what it means—it only understands how to look for the patterns it has been programmed to look for.”

Patients should be reassured, Dr. Li adds, that the algorithms currently being deployed are indeed meant to help radiologists perform more accurately and faster. They are good at detection (of tumors or brain bleeds, for example) and at triage—in the sense that, when the computer sees a grave abnormality, it flags the study so the radiologist can prioritize its interpretation.

“My goal as a radiologist is that AI must be safe for my patients—not to make money, not to generate revenue. Whatever we employ, it needs to be effective and safe.”

While the benefits of AI with respect to outcomes and patient safety are obvious, less obvious is the fact that the same deep learning skills can be applied to almost every aspect of the patient experience—from scheduling to imaging decisions to examination protocols, as well as managing workflow for the hospital staff, which has become a higher priority throughout the industry post-COVID. AI also has the potential to cut the time it takes to produce a scan and pull together an image by 30 to 50 percent, which enables radiology departments to see many more patients a day. And of course, AI doesn’t get “tired.”

Needless to say, some big tech companies are working to make AI a bigger part of healthcare, including Google, Siemens and Canon.

Big Picture

One area that could benefit greatly from AI is the whole-body scan. Measuring and analyzing hundreds of biomarkers, both individually and as parts of the body’s systems, greatly increases the chances of identifying and accurately predicting future medical issues. Right now, whole-body scans are too time-consuming and expensive to be available to the general population. That could change in the not-so-distant future as better deep-learning algorithms and scanning equipment are developed. For instance, AI may observe that a patient has a high probability of developing a disease like cancer in the future, even though there is absolutely no sign of the disease at that moment. Dr. Li jokes that this reminds him a little of the movie Minority Report.

“I don’t know that we’ll see something like this in the next 10 years,” he says. “A lot of people talk about diagnosis, but that’s far away. The training data is not robust enough yet—we’ll need more clean data to make that happen. On the one hand, it will take time; but on the other, it’s becoming more robust because of better, cheaper computer power. That’s actually where we’ve seen the biggest jumps recently.”

Finally, AI holds considerable promise in an aspect of oncology that can be daunting for both the doctor and patient: deciding on exactly the right treatment strategy, especially as the number of therapeutic options continues to broaden. An AI-supported decision would take into account hundreds of data points (including genomic information) above and beyond the radiological data—and help doctors understand which patients might respond best to various therapies.

Again, the premise of deep learning is that, given enough data, algorithms will begin revealing patterns that humans don’t see.

Will AI replace radiologists? That is highly unlikely. A machine can’t look you in the eye and deliver good news or bad news or discuss options. Also, there is the issue of “machine drift,” a somewhat alarming phenomenon that arises because AI models are trained on past data—so as the real world changes, their predictions can quietly degrade, producing some weird results. In the end, AI is merely a medical device, and a doctor needs to be on the interpretive end of things.

While many in the medical profession were skeptical when AI’s role was first being proposed, few if any need much of a push to recognize its potential anymore. There is still some natural trepidation, but the promise of being better and more efficient at what they do—and having back-up support driven by an ocean of data—is increasingly appealing.

In the meantime, AI developers will continue working with radiologists and others in the medical field to ensure that their products not only meet expectations, but exceed them.

Editor’s Note:

Dr. Albert C. Li is an Interventional Radiologist at Trinitas. He received his medical degree from Rutgers New Jersey Medical School and has been in practice for more than 20 years. For LDCT screening appointments, call (732) 955-8825. University Radiology at Trinitas is located at 415 Morris Avenue in Elizabeth.