Chatbots like ChatGPT have great potential in healthcare settings, but users should be aware of the technology’s limitations, as the AI may present results that seem convincing, but are not necessarily based on scientific medical evidence.

Image source: Adobe Stock/ProstoSvet

Article • Potential of large language models for nursing and health science education

ChatGPT: the next big thing in cardiology?

Hardly a day passes without new headlines about ChatGPT, the AI-powered large language model, and its potential applications in healthcare. After the initial hype, first results have been somewhat sobering, with the AI’s “hallucinations” often replacing scientific truth. Still, Professor Philip Moons is convinced that the chatbot could bring real benefits to nursing and health science education. At the ESC Congress in Amsterdam, the expert outlined how the technology could be put to use, despite – or even because of – its liberal approach to facts.

Report: Wolfgang Behrends

First, the expert from KU Leuven in Leuven, Belgium, provided a short rundown for the uninitiated: ChatGPT (short for “Chat Generative Pre-trained Transformer”) is an artificial intelligence language model, trained on a massive dataset of written text: ‘Basically, it has swallowed the entire internet,’ Prof Moons explained.

The chatbot developed by OpenAI – and similar products such as Google Bard, Pi from Inflection, Meta’s LLaMA, or the ERNIE Bot from Baidu – can generate a huge variety of texts, some of which might be useful for clinical practice: ‘ChatGPT could be used to generate patient education materials, automatic charting and documentation from structured data, medication lists and instructions for patients,’ Moons pointed out. It could provide up-to-date information on the latest research and guidelines by summarizing articles in a concise and easy-to-understand format. Another important aspect is the chatbots’ translation feature – highly relevant in a setting where not everyone speaks the same language.

Born to chat, not to research

The key word in this list, however, is “could” – because in many cases, the technology is not quite there yet: even though AI-generated texts may look convincingly natural, it is important to keep in mind that ChatGPT has no real knowledge of healthcare topics, Moons stressed. ‘They are chatbots, developed to have conversations, not to look things up.’ So, until this issue is resolved, users need to be wary of potentially false or misleading information.

The industry, aware of these shortcomings, is already developing solutions geared towards medical professionals. For instance, the “BrightSpark” AI1 aims to create a more reliable chatbot by only incorporating trusted guidelines. Currently, the tool only covers cardiology, but the developers have announced plans to expand their AI model to other specialties as well.

AI translating medical terminology to lay language: a partial success

First attempts to employ chatbots in cardiology have been promising yet flawed: in a recent paper published in the European Journal of Cardiovascular Nursing2, Prof Moons and his colleague Liesbet van Bulck used two chatbots – ChatGPT and Google Bard – to generate education material that is easy for patients to understand. ‘We all know that the patient information brochures and flyers that we make are too complex,’ the expert said. Manuals, for example from the American Medical Association3, recommend that information material intended for patients be written at a sixth-grade reading level, i.e. understandable for twelve-year-old readers – a target that is arguably missed more often than not.4

Professor Philip Moons

Image source: Göteborg University; photo: Johan Wingborg

For their proof-of-concept study, the experts fed patient information sections from scientific journals to the chatbots and tasked them with simplifying the texts to a sixth-grade reading level. This goal was achieved – in a way: while complexity did decrease, the AI models tended to leave out significant chunks of text in their translation – in one case, 83% of the overall word count. ‘It was oversimplified by removing important pieces of the information,’ Moons reported.
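Both quantities discussed here – the reading grade of a text and how much of the original wording a simplified version retains – can be estimated mechanically. The following is a minimal Python sketch (not from the study; it uses the standard Flesch-Kincaid grade-level coefficients and a crude vowel-group heuristic for syllable counting, and the sample sentences are invented for illustration):

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: one syllable per group of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Approximate Flesch-Kincaid grade level of a text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59

def retention(original: str, simplified: str) -> float:
    """Fraction of the original word count kept in the simplified text."""
    return len(simplified.split()) / max(1, len(original.split()))

# Invented example sentences, for illustration only:
original = ("Patients exhibiting symptomatic bradycardia may require "
            "implantation of a permanent pacemaker following evaluation.")
simplified = "You may need a pacemaker if your heart beats too slowly."

print(f"original grade: {fk_grade(original):.1f}")
print(f"simplified grade: {fk_grade(simplified):.1f}")
print(f"words kept: {retention(original, simplified):.0%}")
```

A metric like this shows why grade level alone is a misleading success criterion: a text can hit the target grade simply by dropping most of its content, which is exactly the failure mode the study observed.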

Despite these shortcomings, chatbots can be put to good use in healthcare settings – as long as the task is chosen carefully.5 ‘They can be used to summarize large amounts of text, such as research articles, clinical notes and patient records,’ the expert said. Other strengths of existing builds are generating structured data from unstructured texts, language translation of articles and writing assistance. The AI can even serve as a source of inspiration, Moons added: ‘When you are given a new task, and really don’t have a clue how to proceed with it – go consult ChatGPT.’ While many of the AI’s suggestions can be discarded immediately, there is often one approach that can at least start a new train of thought. 

A knowledgeable sparring partner for patient communication roleplay

The chatbots’ liberal approach to facts can even work in favour of medical education, for example in medical consultation training, Moons suggested: Nowadays, communication skills are often conveyed to students via role-play. Setting up ChatGPT as a would-be patient could make these lessons more efficient, less time-consuming, and easier to embrace for more introverted students. 

While this already works reasonably well with existing AI models, the expert shared his vision for taking the concept even further: A specialized chatbot imbued with the knowledge of the relevant textbooks and materials for a given course. This AI could not only play the part of the patient to be consulted, but also immediately answer the students’ questions, possibly even providing them with the relevant literature to enhance their learning experience. ‘This is really something that should not be too difficult to train,’ Moons concluded with an appeal to the developers, ‘and I’m sure that this would be a great asset for health education.’ 


Philip Moons is Professor of Healthcare and Nursing Science at the Department of Public Health and Primary Care at KU Leuven, Belgium. He is also visiting professor at the Institute of Health and Care Sciences of the University of Gothenburg, Sweden, and honorary professor at the Department of Paediatrics and Child Health of the University of Cape Town, South Africa. Prof Moons is the editor-in-chief of the European Journal of Cardiovascular Nursing.


  2. Moons P, Van Bulck L: Using ChatGPT and Google Bard to improve the readability of written patient information: A proof-of-concept; European Journal of Cardiovascular Nursing 2023 
  3. Weiss, BD: Health Literacy: A Manual for Clinicians; American Medical Association 2003 
  4. Rooney MK et al.: Readability of Patient Education Materials From High-Impact Medical Journals: A 20-Year Analysis; Journal of Patient Experience 2021 
  5. Moons P, Van Bulck L: ChatGPT: Can artificial intelligence language models be of value for cardiovascular nurses and allied health professionals?; European Journal of Cardiovascular Nursing 2023

