The AI will see you now

This article is part of the Global Policy Lab: Decoding Cancer.

BERLIN — When a woman gets a mammogram in Europe, it's standard practice virtually everywhere for two radiologists to examine the X-rays for signs of breast cancer.

Soon that second opinion could come from a computer.

A range of private companies around the world, from small startups to global tech behemoths, are developing software that uses artificial intelligence to analyze medical images — a field that has been the domain of radiologists since German engineer Wilhelm Röntgen discovered X-rays in 1895.

Advances in the field mean computers will be able to spot irregularities in medical images and decide independently whether a second physician needs to take a look at the scan.

“Those who are willing to work with [AI] as early adopters will be the winners” — Jonas Muff, CEO of Berlin-based MX Healthcare

The technology effectively upgrades computers from tools in the hands of physicians to digital peers that can make diagnoses and even give tailored recommendations — and with that, it is likely to radically reshape the role radiologists play in diagnosing and treating cancer.

It's a world that is deeply skeptical of the newcomers.

“The claims made by the start-ups are mostly targeted at very specific cases,” said Stefan Schönberg, a professor at the University Hospital of Mannheim and president of the Deutsche Röntgengesellschaft, an influential group that has represented the interests of radiologists for some 110 years.

Many are “based on very small amounts of data,” according to Schönberg, who added that he expects “many of them to fail at the certification hurdle.”

Deep-learning AI systems are set to transform the world of medicine | Image via iStock

Tech interlopers like Berlin-based MX Healthcare — a spinoff of the AI incubator Merantix — aren't surprised by the criticism and maintain they want to work with, rather than against, radiologists.

MX Healthcare trained its software on a data set of “up to 1 million” mammograms collected from partnering radiology offices and hospitals, said CEO Jonas Muff. The collection of images — “in my knowledge the largest data set in the world” — was checked and enhanced by the startup's own team of more than 10 radiologists, according to Muff.

He said he hopes their product — an “Uber”-style cloud-based platform that radiologists can access in their internet browsers — will get certified within the first three months of next year.

The new technology, he insists, will allow doctors who are open to it to automate large parts of their workflow without revenue losses. “All in all, of course there will be fewer radiologists who do [mammograms],” said Muff. “But those who are willing to work with [AI] as early adopters will be the winners.”

* * *

Modern medicine has evolved with emerging technology since its inception.

Over the course of the last century, physicians have come to rely on electronic equipment and digital devices. Talk of applying AI — technology that allows machines to do jobs that previously required human thinking — to medicine has been around for more than 50 years.

But it's only in recent years that the rapid rise of a new machine-learning method has brought AI to the forefront of cancer research, diagnosis and treatment. That seismic shift has seen an explosion of innovation from firms jostling to make their mark in the new field.

One such startup is Helsinki-based Kaiku Health. Founded by a team of five software developers in 2012, its first telemedicine platform had little to do with what's understood as cutting-edge AI today.

The app it developed allows cancer patients to report their well-being to their doctors around the clock via their smartphones. At the same time, doctors can monitor their patients' symptoms and react to irregularities in real time.

Tens of thousands of people have used a Finnish telemedicine platform since it launched in 2012 | Vesa Moilanen/AFP via Getty Images

“It makes life so much easier,” said Tuire Lehtinen, a 65-year-old retired nurse, who was diagnosed with multiple myeloma, a form of blood cancer, in 2014 and started using the Kaiku app about a year and a half ago, after undergoing a stem cell transplant and radiotherapy.

More than 70,000 patients like Lehtinen in over 40 European hospitals and clinics have used the app since it was first launched, according to Kaiku.

Now, Kaiku wants to transform its platform by infusing it with new AI technology, CEO Lauri Sippola said in an interview at the company's offices in Helsinki's hip Sörnäinen district. “We will be able to preventively, based on the patient-reported data, forecast how a patient will be doing the next week.”

The ultimate goal is to be able to predict future symptoms, based on data collected by the app, and help physicians decide whether a patient will benefit from certain therapies, he said.

These efforts rely on the same technology MX Healthcare in Berlin uses to spot signs of breast cancer in X-rays: a machine-learning method known as “deep learning,” which made global headlines in 2016 when a computer beat the world champion in the ancient game of Go.

* * *

“Deep learning” allows machines to “learn” by recognizing patterns and finding correlations in troves of data so huge that no human could ever process them in a lifetime.

Physicians across disciplines have started using it in cancer treatment, but nowhere is its impact more evident than in the image-intensive areas of medicine: pathology, dermatology and, in particular, radiology.

Deep-learning systems are already outperforming humans on some tasks, according to a number of studies. The possibilities have kicked off a race among tech innovators gunning to establish themselves as leaders in the field.

When it comes to how AI will revolutionize cancer treatment, radiology is the canary in the coal mine.

According to Schönberg, the head of the German radiologists' association, efforts to replace radiologists or increase the role of machines “only make sense if they happen on the basis of large amounts of well-structured and curated data.”

The German group has partnered with other organizations to build a platform that would house such a large-scale database, he said, adding that the plan is to eventually give developers working on AI-powered medical systems access to it.

But Berlin's MX Healthcare is already pushing forward with its own data set. Some of its partners, including radiology offices and hospitals that provided data, have already signed up to trial the software, according to CEO Muff — if it's approved as a medical device early next year, that is.

In the future, machines may be able to analyze tissue to determine the most appropriate course of treatment | American Cancer Society via Getty Images

Regardless of who does the data collecting, what's certain is that the fundamental role of radiologists will change.

“It's quite possible that when it comes to the radiological follow-up assessment of cancer patients, our core business — comparing A versus B — is diminishing and that the role of radiology is shifting toward looking at completely unsolved questions,” Schönberg, the radiology professor, said.

While today, for example, radiologists can tell from the white-and-gray distribution on an X-ray that certain spots are signs of cancer, they can't say whether such metastases will respond to certain therapies.

“In the future, a machine could perhaps take over the task of analyzing … whether tissue is a metastasis,” Schönberg said. “And we radiologists take over the holistic process, and say for example that a metastasis contains certain genetic information and will react to certain molecular medication.”

* * *

But while deep learning is already considered quite accurate in its predictions and recommendations, and is expected to become more powerful in the years ahead, today's technology still faces major hurdles.

Deep learning turns computers into de facto black boxes, making it impossible, even for the developers who wrote the code, to understand exactly how they arrived at an output.

It's also only as good as the data it's fed, and it still depends on human experts to label that data.

And perhaps most importantly, a deep-learning algorithm can only “know” what has happened in the past; unlike humans, it is unable to consider things it has never seen before.

The key will be to get computers to a point where they learn similarly to how we humans do: based on only tiny amounts of data, but highly effectively.

To close that gap, experts are working on what they call “unsupervised” deep learning: systems that — just like human brains — don't need anyone to label data for them.

Geoffrey Hinton, a computer scientist working for Google's deep-learning research team Google Brain, predicted in a September paper in the Journal of the American Medical Association that the discovery of these types of “unsupervised” algorithms could be only a few years away.

“How the brain does this is still a mystery,” Hinton wrote, “but will not remain so.”

The POLITICO Global Policy Lab is a collaborative journalism project seeking solutions to pressing policy problems.
