Modern technology could put your medical data at risk, legal expert warns
By Gian T
As technology continues to evolve, the healthcare industry is also seeing shifts, with an increasing number of professionals exploring AI tools for administrative tasks.
Reports indicate a growing trend in AI-assisted note-taking during patient consultations, reflecting broader changes in medical practices.
However, this development has sparked concerns, with experts warning about potential risks to sensitive patient information.
The next time you visit your GP, you might be greeted with an unexpected question regarding your consent for AI to record and process your medical consultation.
This scenario is becoming increasingly common as clinics and healthcare providers seek to improve efficiency and reduce the administrative burden on doctors.
Dr Robert Holian from Bendigo has been trialling Lyrebird Health's AI scribe for five months and has seen positive results, with the technology now fully implemented at the Bendigo Primary Care Centre.
He believes the AI scribe allows him to remain fully engaged with his patients and reduces fatigue, even when dealing with multiple complex conditions in a single appointment.
Dr Holian is confident that patients' private information is securely contained within their medical files and not accessible to the software provider.
Despite these assurances, the legal community is voicing concerns.
Professor Mimi Zou, head of the University of New South Wales (UNSW) School of Private and Commercial Law, warns that Australia's lack of AI regulation poses a significant risk.
‘We don't have AI regulation in Australia, and it is a risk that these AI systems may not have the requisite security controls,’ she said.
Patients must provide informed consent before their doctor uses AI technology, but currently, there are no mandatory guidelines governing AI use in medical settings in Australia.
Professor Zou is advocating for legally binding safeguards, particularly in high-risk settings where AI use would be subject to more stringent obligations.
‘It might seem very innocent, note taking, but that could potentially lead to all sorts of risks,’ she cautioned.
The federal government is aware of these concerns and is considering proposed mandatory guardrails for AI in high-risk settings.
‘Consultation on these proposed regulations closed late last year, and the government is now considering the feedback received,’ a spokesperson said.
At the Bendigo Primary Care Centre, general manager Callum Wright says the AI tool has been integrated into the clinic's existing software, with patient data remaining protected within its network.
However, Lyrebird Health, the provider of the AI scribe, declined an interview and did not respond to questions from the ABC.
The adoption of AI scribes is not without its critics within the medical community.
Perth GP David Adam, a member of the Royal Australian College of General Practitioners' (RACGP) expert committee on practice technology and management, notes significant variation in functionality and cost among the products available.
A small RACGP survey found that while GPs saved some time on note-taking, they often spent that time proofreading and editing the AI-generated notes.
Dr Adam suggested that it is too early to determine whether AI scribes will become the new normal in medical practice.
In other news, many Australians may not realise that social media platforms and online services use their personal data to train artificial intelligence.
A LinkedIn setting enabling AI data sharing was automatically activated for Australian users. You can read more about this development here.
What are your thoughts on AI scribes in healthcare? Have you had an experience with them during a medical appointment? Share your stories and concerns with us in the comments below.
Key Takeaways
- Around one-quarter of Australian GPs are utilising AI for note-taking during patient consultations.
- Legal experts are concerned about the risk to patients' sensitive medical data due to the use of AI scribes.
- There are currently no mandatory guidelines for AI usage in medical settings in Australia, leading to calls for legally binding safeguards in high-risk settings.
- Although some GPs have embraced AI scribe technology for its efficiency gains, there are mixed feelings about its effectiveness and concerns about data security and accuracy.