AI unchecked could pose a significant risk to patients’ health
Artificial intelligence (AI) has its strong points:
Intelligence isn’t one of them.
Mark O’Connell, author,
The Irish Times, 8 June 2024
Artificial intelligence (AI) is everywhere at present – including in healthcare. Despite its proven propensity to offer false information, and the potentially serious risks this poses to patients, AI has its advocates within health systems.
There are potential applications for artificial neural networks in research, information transfer and retrieval, as well as in clinical decision systems.
An international conference, titled ‘Insights into AI healthcare’, was organised by the UK’s Royal Society of Medicine earlier this year.
At the meeting, a broad range of contributors spoke about how AI had made significant progress in areas such as radiology and surgery. An emergency medicine physician and AI specialist spoke about how AI can help close communication gaps with our patients. He created an AI package designed to help patients refine their questions prior to a doctor or nurse practitioner visit.
There are some initial benefits to using AI in primary care. A number of practitioners have begun using an AI scribe to reduce the time spent on administrative tasks. Some have noted that the tool allows them to focus on listening to patients more directly, minimising the need to engage with the computer during appointments.
In a recent perspectives piece in The Lancet, ‘Clinician as editor: Notes in the era of AI scribes’, the authors argue that the time has come for doctors to learn to be good editors. They say they have noticed that AI scribes can miss information from longitudinal clinician–patient relationships and non-verbal communication.
“To counter these risks of automation, clinicians using AI scribes should view their new role as note editor – a transition best supported by training in editorial and narrative skills…,” they write. They add that, more broadly, “clinicians must ask: If note writing supports the cognitive processes of doctoring, how might this be recaptured in the age of AI scribes? And, if the clinician is now not author but editor, what new expertise does this work entail?”
The widespread attention to AI isn't solely positive. Recently, two computer scientists from Princeton University released a book titled AI Snake Oil: What Artificial Intelligence Can Do, What It Can't, and How to Tell the Difference. They caution that the notion of predictive AI as a replacement for human decision-making is not grounded in reality.
However, a key question for those of us who practise narrative-based medicine is: Will AI have the capacity to listen closely to patients and, even more importantly, respond in a natural, empathetic way to the issues they raise?
Despite article headlines such as ‘Google AI has better bedside manner than human doctors – and makes better diagnoses’ (Nature Communications, 12 January 2024), I have not been able to find quality research to back up such sweeping statements. But this does not mean future iterations of the technology will not develop such an ability.
Close listening and close reading are crucial components of narrative-based medicine. By reshaping consultations to centre more on the patient, we emphasise the importance of active listening and the use of reflective summarising statements that highlight patients’ emotions.
The technology behind AI systems such as ChatGPT (GPT-4) is known as a large language model. These are artificial neural networks, a type of AI designed to mimic the human brain. They can generate general-purpose text and perform language-processing tasks such as classifying concepts, analysing data, and translating text.
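To make that idea concrete, the short sketch below shows how a program might ask such a model to perform one of these language tasks – here, summarising a clinical note in a single sentence. It uses the openai Python library; the model name, the invented example note, and the one-sentence instruction are illustrative assumptions, not anything drawn from this article.

```python
# A minimal sketch of asking a large language model to perform a language
# task (summarising clinical text). Assumes the `openai` package is installed
# and an OPENAI_API_KEY is set in the environment; the model name and the
# example note below are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

note = (
    "Patient reports three weeks of intermittent chest tightness on exertion, "
    "relieved by rest. No syncope. Family history of ischaemic heart disease."
)

response = client.chat.completions.create(
    model="gpt-4",  # illustrative; any chat-capable model would do
    messages=[
        {"role": "system", "content": "Summarise the clinical note in one sentence."},
        {"role": "user", "content": note},
    ],
)

print(response.choices[0].message.content)
```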
We are informed that the next version, GPT-5, is expected to have an enhanced capacity for understanding dialogue context and interpreting language nuances. It is said to incorporate a higher degree of emotional intelligence, enabling more empathetic interactions with users.
I struggle to envision AI being capable of handling such complex tasks. In fact, I believe that AI unchecked could pose a significant risk to patients’ health.
Curious, I decided to ask ChatGPT if it saw a role for itself in narrative-based medicine. It confidently responded with a ‘yes’. When prompted for examples, this was the reply I received:
“Certainly! AI in narrative medicine can involve sentiment analysis of patient narratives to understand emotional states, natural language processing for extracting relevant medical information from stories, and machine learning models to identify patterns in patient experiences that may contribute to more accurate diagnoses or treatment recommendations. Additionally, chatbots or virtual assistants can engage with patients to collect their narratives, aiding in more comprehensive healthcare support.”
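To give a flavour of the first of those suggestions, the sketch below applies off-the-shelf sentiment analysis to an invented patient narrative. It uses the Hugging Face transformers library and its default general-purpose English sentiment model – a minimal illustration, not a clinically validated tool.

```python
# A minimal sketch of sentiment analysis applied to a patient narrative,
# using the Hugging Face `transformers` pipeline. The narrative is invented;
# the default model is a general-purpose English sentiment classifier, not
# one tuned for clinical text.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

narrative = (
    "Since the diagnosis I have felt anxious most mornings, "
    "but talking to my GP last week was a real relief."
)

result = classifier(narrative)[0]
print(f"label={result['label']}, score={result['score']:.2f}")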
The final suggestion, that chatbots or virtual assistants might collect patient narratives, raises an intriguing question: Could we reach a point where human listeners become obsolete in storytelling? Might the very qualities we cherish in narrative-based medicine – nuance, empathy, and humanity – come to be seen as too subjective? Could the virtual collection of narratives be preferred for its perceived objectivity?
Perhaps AI could enable us to record these conversations more accurately and with a new level of consistency.
My answer to these questions is that I deeply value the individuality and fluidity that emerge when two people engage in a free-flowing conversation. As a practitioner of narrative-based medicine, this human connection is something I strongly wish to preserve.
But, like it or not, AI is coming to all parts of healthcare and medicine.
It’s something I believe we must engage with very carefully indeed.
A version of this article has been published by Narrative-Based Medicine Lab.