Automated diagnoses in radiology

  • Eyal Toledano picture 1
    Eyal Toledano Founder and CTO of Zebra Medical Vision LTD

Making the best of your PACS

The idea that computers make decisions about health and disease troubles patients and doctors alike: the former because of widespread mistrust in technology, the latter because they see their jobs in peril. We cannot yet tell whether these fears are justified, but in view of the trails algorithms are already blazing towards greater diagnostic certainty and earlier detection of common diseases, we have cause to be optimistic rather than skeptical.

This, at least, seemed to be the consensus at an ETIM (Emerging Technologies in Medicine) event that took place in Essen, Germany, last February. While the event, hosted by the University Hospital Essen, focused on artificial intelligence and bio-printing, it also offered inspiring perspectives on emerging trends in medical treatment. Eyal Toledano, co-founder and CTO of Zebra Medical Vision LTD, illustrated one of these trends in his presentation “How to revolutionize medical imaging using artificial intelligence and big clinical data”. The Israeli company has developed an algorithm for a product called “radiologic assistant” and is working on expanding the concept to other medical fields. Talking to VISUS VIEW, Eyal Toledano explained not only the algorithm but also his company's vision.

What exactly are you working on and how can your products and ideas improve patient care?

Our medical vision focuses on the ability to provide automated readings of medical images. One part of that vision is to support the radiologist by offering additional evidence. Our radiologic assistant can find additional findings or conditions that support the reading and thus increase its quality. The assistant is a kind of plug-in that is integrated into a number of PACS, for example CareStream PACS. Artificial intelligence thus takes the PACS to the next level and strengthens its original functions. Another product Zebra Medical Vision provides is Imaging Analytics, which uses the same algorithms to scan the entire VNA and detect populations at risk – for example patients with a high coronary calcium score, since they are at higher risk of cardiac events. Imagine being able to scan all images available in the hospital archive to detect anomalies of any kind, independently of the original reason for acquiring the image. This cannot be done by humans. But the algorithm can easily do it, and it issues a message as soon as something abnormal is found. The hospital or care provider can then invite the patient for a diagnostic workup and – if necessary – start preventive treatment. In short: the improvement in patient care is about detecting diseases more reliably and at an earlier or even pre-clinical stage.
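The batch-screening idea described here can be sketched in a few lines of code. Everything below is a hypothetical illustration: the function names (`load_study`, `calcium_score`, `flag_patient`) are placeholders, not Zebra Medical Vision's actual API, and the threshold is only a commonly cited high-risk cutoff.

```python
# Minimal sketch of archive-wide screening for patients at cardiac risk.
# All names are hypothetical placeholders, not a real vendor API.

CALCIUM_RISK_THRESHOLD = 400  # Agatston score; a commonly cited high-risk cutoff


def screen_archive(study_ids, load_study, calcium_score, flag_patient):
    """Scan every study in an archive and flag patients above the risk cutoff."""
    flagged = []
    for study_id in study_ids:
        image = load_study(study_id)
        score = calcium_score(image)  # automated reading, no human in the loop
        if score >= CALCIUM_RISK_THRESHOLD:
            flag_patient(study_id, score)  # e.g. notify the care provider
            flagged.append(study_id)
    return flagged


# Demo with stubbed data: two of three studies exceed the threshold.
scores = {"s1": 120, "s2": 650, "s3": 480}
alerts = []
flagged = screen_archive(
    study_ids=scores,
    load_study=lambda sid: sid,            # stub: the "image" is just the id
    calcium_score=lambda img: scores[img],  # stub: look up a precomputed score
    flag_patient=lambda sid, sc: alerts.append((sid, sc)),
)
print(flagged)  # → ['s2', 's3']
```

The point of the sketch is the shape of the workflow: the loop runs over every study in the archive regardless of why the image was originally acquired, which is exactly what a human reader cannot do at scale.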

Is this pie in the sky or close to medical reality?

It is very close to reality. We have already concluded five pilots in major US hospitals and the tool is now commercially available. And although this kind of automated diagnosis requires a new workflow, the way radiologists interpret a study is not affected. Artificial intelligence simply provides insights which the doctor can include in his reading – or not. It complements, rather than replaces, existing information.

So in the near future, we might imagine the following scenario: a patient presents with lung problems and a CT scan is performed. The scan is read by the radiologist while, in the background, an algorithm checks the existing data for other anomalies?

Yes, all studies can be scanned. The algorithm will look for cardiac anomalies and check bone density in the spine for signs of osteoporosis. It will evaluate your liver tissue to determine whether it is fatty, which would indicate a blood disease or diabetes. It will look at the texture of the lung and quantify emphysema. Theoretically, the list is more or less endless. We have a few FDA-cleared pilots, including emphysema, fatty liver and osteoporosis. Additionally, some diagnoses are currently in the FDA clearance process, for example detecting compression fractures in the spine, segmenting and detecting brain bleeds in CT, and classifying different lung pathologies within the COPD family.

Are there any figures yet regarding the false positive/negative rate?

We did a very large study at Clalit Healthcare Services with 196,892 CT cases for the detection of osteoporosis, and we showed that, looking at the imaging data alone, we could increase annual osteoporosis diagnoses by 50 percent. Those results were double-checked by physicians. The accuracy was 85 percent for people suffering from osteoporosis. And the false positive risk is very low: when the algorithm was tuned to detect patients with a DEXA score lower than -2.5, we had less than two percent false positives, which really is a good score.
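The two figures quoted here measure different things: accuracy among patients who have the disease (sensitivity) and the share of healthy patients wrongly flagged (false-positive rate). A small sketch makes the distinction concrete; the cohort counts below are hypothetical round numbers chosen only to reproduce figures of the same magnitude, not data from the Clalit study.

```python
# Illustrative only: hypothetical confusion-matrix counts, not study data.
# Shows how sensitivity and false-positive rate are computed for a
# threshold-based screening classifier (e.g. a DEXA-score cutoff).

def screening_metrics(tp, fn, fp, tn):
    """Return (sensitivity, false_positive_rate) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)          # share of true cases correctly flagged
    false_positive_rate = fp / (fp + tn)  # share of healthy cases wrongly flagged
    return sensitivity, false_positive_rate


# Hypothetical cohort: 1,000 osteoporotic and 9,000 healthy patients.
sens, fpr = screening_metrics(tp=850, fn=150, fp=170, tn=8830)
print(f"sensitivity: {sens:.0%}, false-positive rate: {fpr:.1%}")
# → sensitivity: 85%, false-positive rate: 1.9%
```

Note that both numbers can look good at once: a screening tool can catch 85 percent of true cases while disturbing fewer than two percent of healthy patients, because the two rates are computed over different groups.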

Where do you get your reference data?

We have around 14 million indexed studies that allowed us to develop our algorithms. Those data are based on partnerships, for example with Clalit Healthcare Services and Intermountain Health.

What is the reaction of the doctors at your pilot institutions?

In the beginning they were all skeptical. But after a while they understood that if you have a robot performing routine tasks in a very precise way, you can focus on different, more advanced and more important things, such as taking care of patients or concentrating on complex cases. Smart automated algorithms really have the potential to address inefficient processes in radiology by doing the routine work. As a result, the radiologist has more time to focus on complicated multimodality studies, such as looking for genome-related characteristics of a specific cancer. Doctors very quickly recognize this win-win situation.

What is your vision for the next 15 years?

The vision is a kind of autopilot, similar to the one in an aircraft. When you are on a flight, you know that for most of the flight time the pilot is only supervising the flight, which is actually performed by the autopilot. This can become possible in medicine as well, if we are able to develop algorithms that improve on human accuracy. I expect the role of radiologists is going to change, but in a positive way. They will have more time, for example for science, and will further improve patient care.