Artificial intelligence (AI) is everywhere, and medicine is no exception. It has been shown, for example, to help radiologists interpret X-rays and scans more accurately. And there are potential applications for artificial neural networks in research, in information transfer and retrieval, and in clinical decision support systems.
But does it have a role in helping patients get more out of healthcare?
The jury is still very much out on this, although one area that is showing promise is the use of AI as a medical scribe.
Ambient artificial intelligence scribes are software programs that use a microphone to listen to the conversation between clinician and patient and then summarise the visit into a structured medical note, which the doctor can share with other health professionals and which becomes part of the patient's medical file. One big disadvantage is that an AI scribe cannot pick up everything that goes on during the appointment, such as how a patient appears or acts. If someone is upset and starts crying, this does not automatically appear in the generated notes, although the doctor can add it to the record afterwards.
But it is a proven timesaver. One family doctor estimates the tool saves her at least 20 per cent of her time; by being more efficient, she says, she is now able to see patients on time and to carry out a more complete physical examination. "I can spend quality time" with patients and "be a lot more empathic", she says.
Dr Ronan Kavanagh, consultant rheumatologist at the Galway Clinic, is a fan. “I believe that this technology has cracked a nut which has eluded everyone working in healthcare up until now – that of fast, accurate, efficient, secure and cost-effective documentation of medical encounters,” he says. “It has transformed our practice since we started using it two months ago.”
Dr Vincent Liu, chief data officer for the Permanente Medical Group and senior research scientist at the Kaiser Permanente Division of Research in California, wrote about the benefits of incorporating AI into electronic health records in the New England Journal of Medicine earlier this year. “I see AI in health becoming part of the standard way that we take care of patients, and I think that it won’t be long in the future where patients themselves are wanting to receive care that’s AI informed,” he told the Canadian Broadcasting Corporation.
I was glad to see him strike a note of caution, however: “I don’t think that patients or clinicians are ready to take humans out of the loop or let AI make critical decisions or be, you know, driving those decisions without human oversight.”
Equally cautious is a leading British GP. In a discussion about patients using chatbots for health information, Dr Helen Salisbury told the British Medical Journal last week: "Humans want human judgments ... AI can be an assistant, but diagnosis or rationing decisions should not be left solely to AI."
I remain sceptical of AI's supposed future ability to empathise with patients. For those of us who practise narrative-based medicine (NBM), a key question is: will AI have the capacity to listen closely to patients and, even more importantly, to respond in a natural, empathetic way to the issues they raise?
Close listening and close reading are essential elements of NBM. By realigning consultations to place the patient at the centre of clinical interactions, we emphasise the importance of active listening and of reflective summarising statements that focus on people's emotions.
I'm afraid I just cannot see AI being able to undertake these complex tasks. In fact, I reckon an untrammelled AI could pose a significant risk to patients' health.
AI is not deus ex machina, just machina. And it’s a machine I believe doctors and patients must engage with very carefully indeed.