ChatGPT, AI, and the future of healthcare

By David Lynch - 10th Sep 2023

The influence of artificial intelligence in healthcare, including the rise of ChatGPT, is the subject of growing interest and concern. David Lynch reports

The high-profile release of ChatGPT late last year has led to significant discussion on the impact of artificial intelligence (AI) on society, including healthcare.

Recent months have seen a number of papers published in leading medical journals on the issue.

In February, The Lancet Digital Health ran an editorial titled ‘ChatGPT: Friend or foe?’. The editorial concluded that widespread use of ChatGPT “is seemingly inevitable, but in its current iteration careless, unchecked use could be a foe to both society and scholarly publishing.”

Earlier this year, Prof Erwin Loh from the Monash Centre for Health Research and Implementation, Monash University, Australia, wrote a commentary in the BMJ Leader on ChatGPT and medicine.

The title of the article was ‘ChatGPT and generative AI chatbots: Challenges and opportunities for science, medicine, and medical leaders’.

“The biggest challenge posed by generative AI for medicine is its potential for biased results,” Prof Loh told the Medical Independent (MI). He added that this potential for bias may mislead clinicians into following a treatment path that “may be erroneous or based on inaccurate or incomplete data”.

“The AI model is only as good as the data it is learning from, and, unfortunately, the internet has a lot of sources that may have data that is skewed or focused on a population that cannot be generalised. This is a problem for research and clinical trials in general, which may become enhanced if AI tools are over-relied on.”

However, the technology also has many potential positive consequences for the field of medicine, he noted.

In Prof Loh’s view, the biggest benefit of generative AI models in medicine “is their potential to transform how we diagnose, treat, and predict diseases”. In addition, there is the potential for tools to be “democratised” so that they are accessible to not only healthcare professionals, “but with appropriate training and guard rails, the general public.”

According to Prof Loh: “Large language models have the ability to synthesise vast amounts of evidence from existing literature and summarise them in simple language, which has implications for how we diagnose illnesses when it comes to radiology and pathology, choose treatment protocols, for example with oncology, or predict the risk of diseases such as dementia or diabetes.”

Writing on ChatGPT in this newspaper in April, Dr Muiris Houston stated that like most new technologies “it will take time to bed down and work around”.

In the meantime, he stated, “it’s probably best to treat ChatGPT as an incomplete tool.”

“One doctor described asking it to do a literature review that she had already done and it came up with fake, but real-looking, references, and fake, but real-sounding, information,” according to Dr Houston. “This could be especially dangerous if you asked it for information to give to patients. At this point, it is clearly a case of caveat emptor where ChatGPT and its use in healthcare is concerned.”

How are medical regulators responding to the issue? When questioned by MI, a spokesperson for the Medical Council confirmed only that the draft content for the updated Guide to Professional Conduct and Ethics for Registered Medical Practitioners is “under final review and the revised ethical guide will be launched in the autumn”. The Council spokesperson said the “existing guidance remains relevant for the moment”.

HSE

AI, in general, has also been on the agenda of the HSE. For instance, at the end of last year, the HSE Spark Innovation Programme held a forum on AI and machine learning in healthcare. The event in Dublin was attended by frontline healthcare workers as well as corporate representatives, researchers, and academic institutions.

Speaking at the forum in December, the head of the Spark programme, Mr Jared Gormly, said the HSE recognises that AI and machine learning have the potential to “revolutionise” the health sector. He added that the HSE “firmly believe cross-discipline innovation in this space can not only impact healthcare service delivery, but also improve health outcomes”.

The Government’s national AI strategy, AI – Here for Good, was launched in July 2021. Last month, the first progress report on its implementation was published. The report contained a number of references to healthcare, including a new HSE project cited as an “example of AI at work in the public service”.

The HSE project is focused on the automated extraction of kidney failure concepts from clinical notes using AI.

The kidney disease clinical patient management system (KDCPMS) is an electronic health record managed by the HSE. It has been used in every haemodialysis and transplant centre in Ireland since 2007. KDCPMS has a mixture of structured data fields (eg, name, date of birth, medications, past medical history) and unstructured or free-text data fields (clinical notes, outpatient letters).

A recent national survey conducted by the Irish Nephrology Society’s research committee showed that although many nephrology centres used KDCPMS for day-to-day management, few (15 per cent) used it for audit or research. Most of the data is stored in unstructured free-text format, which means users can describe the same issue using different words, making audit, quality improvement, and research difficult to perform. At present, even the most basic audits require hours of manual clinical chart review.

Funded this year through the Public Service Innovation Fund, the HSE project aims to use AI natural language processing technology to “extract structured information from unstructured or free text data… and populate structured data fields”.

“This will make it easier and more efficient to carry out a range of clinical audits, quality improvement projects, and clinical research using the KDCPMS data,” the report stated.
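As a rough illustration of the general technique only, and not a description of the HSE’s actual system, the short Python sketch below shows how varied free-text phrasing might be mapped onto structured fields using simple keyword patterns. The concept names, synonym lists, and example note are all hypothetical, and a production clinical NLP pipeline would rely on trained language models rather than hand-written rules.

import re

# Hypothetical synonym patterns: the same concept can be written many ways
# in free-text notes, which is what makes manual chart review so slow.
CONCEPT_PATTERNS = {
    "acute_kidney_injury": [r"\bacute kidney injury\b", r"\bAKI\b", r"\bacute renal failure\b"],
    "chronic_kidney_disease": [r"\bchronic kidney disease\b", r"\bCKD\b", r"\bchronic renal failure\b"],
    "haemodialysis": [r"\bhaemodialysis\b", r"\bhemodialysis\b"],
}

def populate_structured_fields(record: dict) -> dict:
    """Scan a record's free-text note and fill in structured yes/no concept fields."""
    note = record.get("clinical_note", "")
    for concept, patterns in CONCEPT_PATTERNS.items():
        record[concept] = any(re.search(p, note, re.IGNORECASE) for p in patterns)
    return record

# Invented example record mixing structured fields with a free-text note.
patient = {
    "name": "Example Patient",
    "date_of_birth": "1960-01-01",
    "clinical_note": "Known CKD stage 4, admitted with acute renal failure, commenced on haemodialysis.",
}
print(populate_structured_fields(patient))

In a real system, the fields filled in this way would sit alongside the existing structured data, so that audits and research queries could run directly against them instead of requiring manual review of each note.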

According to the progress report, the expected outcomes of the project include efficiencies from enabling the use of KDCPMS for national clinical audits and quality improvement projects; reductions in operational costs, time, and resources; and benefits for end users and healthcare providers in better understanding the needs of their patients.

Legislation

On the wider legislative level, Minister of State at the Department of Enterprise, Trade and Employment, Dara Calleary, told the Dáil in July that the Government supports the draft European Union AI Act, which will set out harmonised rules for the design, development, placement on the market, and use of AI systems in the Union.

“The proposal aims to address the risks generated by specific uses of AI through a set of complementary, proportionate, and flexible rules,” said Minister Calleary.

“Given that AI has evolved significantly in the past few months, it is important that this regulation is flexible and future-proofed in order to ensure that it continues to protect the safety and fundamental rights of the individual while also ensuring that innovation for good continues in this area.”

Final agreement on the Act is expected by late 2023 or early 2024.

RCSI

As reported last month in MI, the challenges posed by ChatGPT and AI were discussed within the RCSI in recent months. At the RCSI Council meeting in February, the role of ChatGPT was raised during a discussion on ‘academic integrity’.

An update was presented to the Council on “the increased recognition of the need to promote and maintain academic integrity given the recent considerable interest in… AI and ChatGPT… which has immediate implication for certain assessment strategies”.

The minutes, seen by MI following a Freedom of Information request, further noted that “the use of paraphrasing tools and file sharing sites also bring their challenges”.

An RCSI spokesperson told MI that the impact of AI on teaching, learning, and assessment approaches is “a rapidly developing area” with best practice on the appropriate and ethical use of AI in education “still emerging”.

The spokesperson said that alongside these challenges “posed to academic integrity”, it is also considered that both healthcare education and practice are likely to be “positively impacted” by AI.

“As such, a complete ban on the use of generative artificial intelligence is not currently expected to be implemented.”

The spokesperson said that after the Council meeting in February, the RCSI determined that further consideration was required in this area to inform policy and practice. As part of this process, Prof Phillip Dawson, Centre for Research in Assessment and Digital Learning, Deakin University, Australia, attended the International Education Forum at the RCSI in June to discuss ‘Academic integrity in the emergent environment of artificial intelligence’.

The RCSI spokesperson said most of the College’s assessments “are less susceptible” to AI products such as ChatGPT. This is because healthcare degree students take many in-person exams.

However, students on taught postgraduate programmes are required to complete written assignments. “For these programmes, course leaders are alert to and considering how to minimise any possible inappropriate use of AI,” said the spokesperson. 

The RCSI academic integrity working group will continue “to lead our thinking” on the issues of ChatGPT and AI within the College, they added.
