'It's going to create a revolution': how AI is transforming the NHS

Artificial intelligence is making impressive inroads into cancer treatment, saving lives and money


The tumour is hard to miss on the scan. The size of a golf ball, it sits bold and white on the brain stem, a part of the organ that sends messages back and forth between body and brain. In many ways it is the master controller: from the top of the spinal cord, the brain stem conducts every heartbeat, every swallow, every breath.
For this young man, the cancer came to light in dramatic fashion. The growing tumour blocked fluid draining from his brain, triggering a huge seizure. Now doctors must work out the best way to treat him.
Raj Jena, a neuro-oncologist at Addenbrooke’s hospital in Cambridge, has pulled up the image to explain how doctors plan radiotherapy for patients. For a case like this he might need to study more than 100 images, each showing a thin slice of the brain. Then, image by image, Jena must carefully mark out the border of the tumour and the contours of sensitive brain regions that should be spared the radiotherapy beams: the hypothalamus, the pituitary gland, the pathways to the brain’s vision centres, for example. The process can take hours. But only once it is done can computers start calculating how to hit the tumour with radiotherapy beams without frazzling important parts nearby.
“Until we define where the tumour is and have defined the healthy tissues we want to protect, we cannot start the treatment,” says Jena. “This is the bottleneck. The quicker you get this done, the quicker you can get the patient into treatment.”
With artificial intelligence (AI), the painstaking task can be completed in minutes. For the past six months, Jena has used a Microsoft system called InnerEye to mark up scans automatically for prostate cancer patients. These men make up a third of the 2,500 cancer patients his department treats every year. When a scan is done, the images are anonymised, encrypted and sent to the InnerEye program. It outlines the prostate on each image, creates a 3D model, and sends the information back. For prostate cancer, the entire organ is irradiated.
The software learned how to mark up organs and tumours by training on scores of images from past patients that had been marked up by experienced consultants. It already saves time for prostate cancer treatment. Brain tumours are next on the list.
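In broad strokes this is supervised image segmentation: pair past scans with the contours consultants drew on them, fit a model, then apply it slice by slice to a new scan and stack the results into a 3D outline. The short Python sketch below is only illustrative; the per-voxel logistic regression, the synthetic "consultant" masks and the array sizes are stand-ins for the deep network and real clinical data a system like InnerEye is built on.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in training data: 20 past slices plus the masks consultants drew on them.
# (Synthetic intensities and a threshold "ground truth" are assumptions made only
# so the example runs; the real system learns from expert mark-up of real scans.)
past_slices = rng.random((20, 64, 64)).astype(np.float32)
consultant_masks = (past_slices > 0.8).astype(int)   # 1 = voxel inside the organ

# Learn a per-voxel classifier from raw intensity alone (deliberately crude).
model = LogisticRegression().fit(past_slices.reshape(-1, 1), consultant_masks.reshape(-1))

def mark_up_scan(scan_3d):
    # Outline the organ on every slice, then stack the masks into a 3D model.
    masks = [model.predict(s.reshape(-1, 1)).reshape(s.shape) for s in scan_3d]
    return np.stack(masks, axis=0)

# A new, anonymised scan of 100 slices is marked up in one pass.
new_scan = rng.random((100, 64, 64)).astype(np.float32)
organ_volume = mark_up_scan(new_scan)
print("voxels inside the contour:", int(organ_volume.sum()))

In practice the model learns far richer cues than raw intensity, which is what lets it approach the contours an experienced consultant would draw.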

Automating the process does more than save time. Because InnerEye trains on images marked up by leading experts, it should perform as well as a top consultant every time. The upshot is that treatment is delivered faster and more precisely. “We know that how well we do the contouring has an impact on the quality of the treatment,” Jena says. “The difference between good and less good treatment is how well we hit the tumour and how well we avoid the healthy tissues.”
A mile or so from Addenbrooke’s, Antonio Criminisi, the lead researcher on InnerEye at Microsoft Research, explains how automatic processing could pave the way for even smarter radiotherapy. Because it is so time-consuming and expensive, tumour images today are marked up only once, before radiotherapy begins. If it were fast and cheap, patients could have “adaptive radiotherapy”, in which scanning, image mark-up and beam planning are done before every treatment session. That way, the radiotherapy beams are sculpted to the tumour’s size and shape on the day, not when it was first imaged. “This could be transformative,” says Criminisi. “It could enable a new way of treating cancer that is faster and a lot less burdensome for patients and the NHS.”
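The adaptive workflow Criminisi describes is, in effect, a loop: once the mark-up is cheap, contouring and beam planning can be repeated before every session instead of once at the start. A hypothetical Python outline, with acquire_scan, auto_contour and plan_beams standing in for the scanner, the segmentation model and the planning software; none of these are real APIs.

# Hypothetical adaptive-radiotherapy loop; every function is a named placeholder,
# not a real system. Today, contouring and planning happen once, before treatment
# starts; automated mark-up would let them run afresh before each session.
def acquire_scan(patient):
    return {"patient": patient, "slices": 100}      # today's scan of the tumour

def auto_contour(scan):
    return {"tumour": "contour", "healthy_tissue": "contours", "scan": scan}

def plan_beams(contours):
    return {"plan": "beam angles and doses", "based_on": contours}

def deliver_treatment(plan):
    print("treating with", plan["plan"])

for session in range(20):                           # number of sessions is illustrative
    scan = acquire_scan("patient-001")
    contours = auto_contour(scan)                   # redone each time because it is now fast
    plan = plan_beams(contours)                     # beams match the tumour's shape today
    deliver_treatment(plan)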
Computer engineers are fond of asserting that data is the fuel of AI. It is true: some modern approaches to AI, notably machine learning, are powerful because they can divine meaningful patterns in the mountains of data we gather. If there is a silver lining to the fact that everyone falls ill at some point, it is that the NHS has piles of data on health problems and diseases that are ripe for AI to exploit.
Tony Young, a consultant urological surgeon at Southend University hospital and the national clinical lead for innovation at NHS England, believes AI can make an impact throughout the health service. He points to companies using AI to diagnose skin cancer from pictures of moles; eye disorders from retinal scans; heart disease from echocardiograms. Others are drawing on AI to flag up stroke patients who need urgent care, and to predict which patients on a hospital ward may not survive. “I think it’s going to create a revolution,” he says.
Technology will not transform the NHS overnight. Like any other innovation, AI systems must be tested, validated and approved. And systems that learn often need careful interpretation: a patient’s blood test may reveal sure signs of a life-threatening cancer, yet an AI trained to predict deaths could still rate the patient as low risk, simply because that cancer responds so well to treatment that few patients die of it.
What may help drive AI through the NHS is the hope that, in some instances, the innovations can save money as well as lives. If patients are triaged faster, tests performed more efficiently, and good diagnoses made more swiftly, the whole system becomes streamlined. One technology the NHS has embraced is called HeartFlow. Spun out of Stanford University, it draws on CT scans that are taken routinely for patients suspected of having coronary heart disease. HeartFlow uses AI to create a personalised 3D model of the heart and the flow of blood around it. From this, doctors can see how specific blockages disrupt blood flow in individual blood vessels and better decide what treatment, if any, is needed. In tests, more than half of the patients who had a HeartFlow analysis avoided an invasive angiogram, a common but costly procedure that squirts dye into the heart, and overall costs fell by a quarter. “People ask how can we afford to have these kinds of technologies in the NHS? My answer is we cannot afford not to,” says Young.
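The decision doctors take from that model comes down to how much pressure is lost across each narrowing. The toy Python step below assumes the simulated pressures are already in hand; the vessel names and numbers are invented, and the 0.80 cut-off is simply the threshold commonly used for fractional flow reserve, the measure HeartFlow estimates.

# Toy decision step on top of a blood-flow simulation. The pressures are invented;
# in reality they would come from the patient-specific 3D model.
simulated_pressures_mmhg = {
    # vessel: (pressure upstream of the narrowing, pressure downstream of it)
    "left anterior descending": (93.0, 68.0),
    "right coronary artery": (94.0, 88.0),
    "circumflex": (92.0, 85.0),
}

FFR_CUTOFF = 0.80   # ratios at or below this are usually treated as flow-limiting

for vessel, (upstream, downstream) in simulated_pressures_mmhg.items():
    ffr = downstream / upstream                     # fractional flow reserve estimate
    verdict = "may need treatment" if ffr <= FFR_CUTOFF else "not limiting blood flow"
    print(f"{vessel}: FFR {ffr:.2f} -> {verdict}")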
It is early days for Vishal Nangalia, a consultant anaesthesiologist at the Royal Free hospital in London, but his company, Life Engine.AI, is honing an AI that crunches blood test results and other data to predict which patients are most likely to die, or have serious problems such as kidney failure, when they are admitted to hospital. Trained on nearly 1bn blood test results from 20 hospitals, the program spots subtle changes in red and white blood cells, and electrolytes such as sodium and potassium, which suggest a patient is going downhill. It does not tell doctors what to do, but helps them intervene sooner by flagging up those patients who might benefit from tests, a scan, or a review from a specialist. “What machine learning can do is help identify issues and bring them to the attention of doctors,” Nangalia says.
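Stripped to its essentials, this is a classifier over routine blood results: train on past admissions whose outcomes are known, then flag new patients whose results resemble those who deteriorated. The Python sketch below is not Life Engine.AI's model; the four features, the gradient-boosted classifier and the synthetic data are all assumptions, chosen only to show the shape of the approach.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)

# Synthetic stand-in for historical admissions: each row is one patient's bloods
# (sodium, potassium, haemoglobin, white cell count); the label records whether
# they later deteriorated. Real data would be far richer and would include how
# values change between repeat tests, which is where the subtle signals live.
n = 2000
bloods = np.column_stack([
    rng.normal(140, 4, n),     # sodium, mmol/L
    rng.normal(4.2, 0.5, n),   # potassium, mmol/L
    rng.normal(135, 15, n),    # haemoglobin, g/L
    rng.normal(8, 3, n),       # white cells, 10^9/L
])
# Invented rule linking abnormal results to deterioration, purely for the demo.
deteriorated = ((bloods[:, 0] < 132) | (bloods[:, 3] > 14)).astype(int)

model = GradientBoostingClassifier().fit(bloods, deteriorated)

# Score today's admissions and flag the ones worth an early specialist review.
new_patients = np.array([
    [128.0, 4.0, 120.0, 16.0],   # low sodium, raised white cells
    [141.0, 4.3, 140.0, 7.0],    # unremarkable results
])
for i, risk in enumerate(model.predict_proba(new_patients)[:, 1]):
    flag = "  <- flag for early review" if risk > 0.5 else ""
    print(f"patient {i}: predicted risk of deterioration {risk:.0%}{flag}")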
Will AI replace doctors, or diminish their role? Back at Addenbrooke’s, Jena shakes his head. “I’d rather spend my time thinking about how to optimise a patient’s treatment than clicking a mouse,” he says. “For many oncologists, we are coming in at weekends and on evenings. With this, we are freed up to do the things that we bring real expertise to.”
