Technology

AI translation is replacing interpreters in GP care – here’s why that’s troubling

Across GP surgeries and hospitals, as migration increases and health systems strain, doctors are turning to an untested helper: Google Translate.

Doctor tapping on his mobile phone. Krakenimages.com/Shutterstock.com

Published: December 22, 2025 12.24pm GMT

Authors

Anne Cronin and Anthony Kelly, University of Limerick

Disclosure statement

Anthony Kelly receives funding from Innovation Fund Denmark.

Anne Cronin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Partners

University of Limerick provides funding as a member of The Conversation UK.

DOI

https://doi.org/10.64628/AB.jkgtdnfhn

When a doctor can’t find an interpreter, many now reach for Google Translate. It seems like a practical fix to a pressing problem. But a new study warns this quick solution may be putting refugee and migrant patients at serious risk – exposing them to translation errors that could lead to misdiagnosis, wrong treatment or worse.

The study, led by an interdisciplinary team of researchers at the University of Limerick – of which we were part – examined how artificial intelligence (AI) is being used to bridge language gaps between doctors and patients. The findings reveal a troubling pattern: AI translation tools are increasingly replacing human interpreters in GP surgeries, even though none of these apps have been tested for patient safety.

Anyone who has tried to explain themselves across a language barrier knows how easily meaning can slip away. In everyday situations – from the nail salon to the car mechanic – we often manage with gestures, guesses and good humour. But healthcare is different.

Communication between a patient and their doctor must be clear, accurate and safe. It is the cornerstone of good medical care, especially when symptoms, risks or treatment decisions are involved, and it allows patients to feel heard and to participate meaningfully in decisions about their own health.

When a patient and doctor do not speak the same language and rely instead on an AI translation app such as Google Translate, communication becomes less certain and more problematic. What appears to be a convenient solution may obscure important details at precisely the moment when clarity matters most.

The recognised standard for cross-cultural communication in healthcare is access to a trained interpreter, whose role is to provide impartial support to both the patient and the doctor. In practice, however, interpreters are often hard to access, owing to limited availability, time pressures and constrained resources in general practice.

Consequently, doctors report that they increasingly turn to the device in their pocket – their phone – as a quick, improvised solution to bridge communication gaps during consultations. Google Translate is now being used as an interpreter substitute, despite not being designed for medical communication.

Together with colleagues, we examined international studies from 2017 to 2024 and found no evidence that an AI-powered tool can safely support the live, back-and-forth medical conversations needed in clinical consultations.

A mobile phone with the Google Translate app open. Not designed for medical translation. Yaman2407/Shutterstock.com

Errors create serious risks

In all the studies we reviewed, doctors relied on Google Translate, and they consistently raised concerns about its limitations. These included inaccurate translations, failure to recognise medical terminology and the inability to handle conversations that unfold over multiple turns.

The studies reported translation errors that risk misdiagnosis, inappropriate treatment and, in some cases, serious harm. Worryingly, the research found no evidence that Google Translate has ever been tested for patient safety in general practice.

In other studies, Google Translate was shown to misinterpret key medical words and phrases. Terms such as congestion, drinking, feeding and gestation, along with words for the vagina and other reproductive organs, were sometimes mistranslated in certain languages.

It also misinterpreted pronouns, numbers and gender, and struggled with dialects or accents, leading to confusing or inaccurate substitutions. Alarmingly, researchers also reported “hallucinations” – where the app produced fluent-sounding but entirely fabricated text.
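
To see how easily such errors can slip through unnoticed, consider a simple "back-translation" check: translate a phrase into the patient's language and back again, then compare the result with the original. The sketch below is an illustration only, not part of the study and not a clinical safety tool. It assumes the unofficial deep-translator Python package, and the example phrases and the choice of Somali are arbitrary.

```python
# A minimal sketch of a round-trip ("back-translation") check; an
# illustration of how meaning drift can be surfaced, NOT a clinical
# safety tool. Assumes the unofficial deep-translator package
# (pip install deep-translator); network access is required.
from difflib import SequenceMatcher

from deep_translator import GoogleTranslator


def round_trip_similarity(text: str, target_lang: str) -> float:
    """Translate English -> target language -> English, then score
    how much of the original wording survives (0.0 to 1.0)."""
    forward = GoogleTranslator(source="en", target=target_lang).translate(text)
    back = GoogleTranslator(source=target_lang, target="en").translate(forward)
    # Crude lexical similarity; a high score does NOT prove the
    # intermediate translation the patient sees was accurate.
    return SequenceMatcher(None, text.lower(), back.lower()).ratio()


if __name__ == "__main__":
    # Hypothetical example phrases echoing the terms flagged above.
    phrases = [
        "Your baby's feeding should happen every three hours.",
        "Nasal congestion is common during gestation.",
    ]
    for phrase in phrases:
        score = round_trip_similarity(phrase, "so")  # "so" = Somali
        print(f"{score:.2f}  {phrase}")
```

Even a high round-trip score is only weak reassurance: the intermediate text the patient actually sees may still be wrong, which is precisely why tools without safety testing are risky in consultations.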

Relying on Google Translate to support doctor-patient communication carries the risk of displacing human interpreters and creating an overdependence on AI tools that were not designed for medical interpretation. It also normalises the use of AI apps that have not undergone the safety testing expected of healthcare technologies.

It is difficult to imagine any other area of medical practice where such an untested approach would be considered acceptable.

The study found that refugee and migrant advocates prefer human interpreters, particularly in maternal healthcare and mental health. Patients also raised concerns about consenting to the use of AI and about where their personal information might be stored and how it might be used.

To deliver safe healthcare to refugees and migrants, doctors should ensure that patients have access to trained interpreters, whether in person, by video, or by phone. Clear instructions for accessing these interpreters must be available in every healthcare setting so that staff can arrange support quickly and confidently.

The evidence shows that AI tools not specifically designed and tested for medical interpreting should no longer be used, as they cannot yet provide safe or reliable communication in clinical situations.

The Conversation asked Google to comment on the issues raised by this article but received no reply.
