Two in five Australian GPs use AI scribes to record patient notes – but do they trade care for convenience?
Some doctors argue it allows them to better connect with patients, but advocates warn the AI technology risks the opposite
When a patient walks into a GP’s office in Australia today, the doctor may begin with a question: “Do you give consent to use an AI scribe to record our conversation?”
That’s what is supposed to happen, at least.
According to an online poll by the Royal Australian College of GPs (RACGP), use of AI scribes by doctors in Australia nearly doubled from 22% in August 2024 to 40% in November 2025.
Those AI tools – such as that offered by Australian company Heidi – record, transcribe and summarise the conversations between doctors and patients for medical notes.
“We make a big effort to let patients know we are using AI, and give them the option to opt out. That’s a really key bit,” says Dr Max Mollenkopf, a GP based in Newcastle. “Just telling patients what’s going on, not trying to be subtle about it.”
Heidi is increasingly being used by GPs, with the Melbourne-based startup saying it has supported more than 115m sessions in 18 months across the globe.
The increasing popularity of AI medical scribes may help to relieve doctors’ administrative burden, but experts point to concerns about consent, privacy and accuracy.
Doctors Guardian Australia spoke to stressed it was important to gain consent from patients before using such AI tools.
But Dr Elizabeth Deveny, the chief executive of the Consumer Health Forum, says not all practices are having explicit conversations about the tools. “I went to my GP recently and in the waiting room there was one poster for an AI scribe,” she says. “It basically said: ‘By reading this, you understand that your consent is being given.’”
When doctors do ask, Deveny says it is framed as a “you don’t mind if I use this?” conversation. “Consider the power differential between a consumer and the clinician. What are they going to say?”
For some GPs, AI scribes are an administrative tool: during a consultation they can focus directly on the patient rather than furiously typing notes, and afterwards the scribe helps with record-keeping.
“They also use it as a compliance tool,” Mollenkopf says. “The nice thing about the scribe tools is that they can take that consult and then put it in a format that’s suitable if you’ve got a Medicare audit.”
“So it gives GPs a lot of reassurance that if they do get audited, they can prove to Medicare the work they did, because the scribe tool heard the whole consult.”
Mollenkopf was a beta tester for Heidi AI but was not paid by the company for that work or testimonials, and continues to pay for his own use of the app. He has also been using Heidi’s Comms service, an AI bot that can call patients on behalf of doctors to get an update on their condition between consultations.
“You would think that [some patients] would be furious about it … but that’s actually quite a small minority,” Mollenkopf says.
Deveny says there is concern that outsourcing note-keeping to AI risks doctors failing to retain and recall their conversations with patients.
“What consumers are saying is when they go back to see their GP, it feels to them like a GP did not emotionally connect with them … because the GP doesn’t seem as familiar with what happened last time,” she says.
Dr Caitlin Curtis, a University of Queensland researcher specialising in responsible AI, agrees. “Note-taking isn’t just administrative – when we write and summarise things, it’s part of how we think,” she says.
“It helps us process information, reflect, prioritise, and really understand what’s going on. If that process is automated or removed, it may save time – but it raises the question of what else might be lost.”
RACGP digital health and innovation deputy chair Dr Janice Tan says, however, that freeing up some of the administrative burden from doctors could help.
“Clinicians might actually have room to think again – to be present in a consultation rather than half-distracted by paperwork,” she says. “Burnout in general practice is bad right now, and if AI can take some of that load, that’s worth paying attention to.”
AI note-takers also fail to record the tone, emotion and nonverbal signals a patient conveys, which can matter in a mental health consultation, for example, Curtis says.
The tools are currently exempt from Therapeutic Goods Administration regulations because they do not directly diagnose patients.
The risk to patient data is a constant in health, and Australia has already had a number of privacy breaches related to medical data, including Australian Clinical Labs, Medibank, and Genea.
The RACGP president, Dr Michael Wright, said he was optimistic AI tools would help patients and GPs work more closely together in deciding the best course of action for individual patients, but said privacy and consent remained concerns.
“The GP – and potentially the patient, too – needs to confirm that any AI output is correct,” he says.
Heidi co-founder and chief executive, Dr Tom Kelly, says data is processed in the country the patient is in and is not used to train the AI or sold to others. The company uses third-party testing and auditing to keep the data secure, he adds.
He says the company holds itself to a high standard in ensuring transcriptions – which Heidi’s model processes itself – are accurate, but doctors still need to check their AI-assisted notes.
“Even clinicians mistype and misspeak and get things wrong,” he says. “But the errors we [Heidi] make are weirder because they’re unusual mishearings that humans wouldn’t make.”