AI for Healthcare: Practical Applications in Australia (2026)

2 March 2026, by Chris Raad

Clinical scribes, recall automation, and admin AI for Australian GP practices. What works, what is safe, and what other practices are doing in 2026.

Key Takeaway

I'm Chris from Studio Slate. We build custom AI systems for Australian healthcare practices, including patient lifecycle workflows, recall automation, and intake systems that integrate with Cliniko, Halaxy, and Best Practice. I wrote this guide because most practice owners I talk to hear the AI noise, see the vendor pitches, and have one question: what actually works, and is it safe?

This guide covers the current state of AI in Australian healthcare based on published clinical data, regulatory filings, and vendor disclosures. No speculation. Every claim is sourced.

Clinical scribes: the breakout use case

AI clinical scribes are the single most adopted AI tool in Australian general practice. They listen to the conversation between clinician and patient, then generate structured clinical notes, referral letters, care plans, and other documentation.

Two Australian companies dominate the market: Heidi Health (Melbourne-based, raised $15 million AUD) and Lyrebird Health (integrated directly into Best Practice Premier). Coviu offers an AI scribe as part of its telehealth platform, handling in-person, phone, and video consultations.

The clinical data on time savings is consistent across studies:

| Metric | Source | Finding |
| --- | --- | --- |
| Documentation during appointments | Heidi/Modality Partnership trial (47 GPs, 2,879 consults) | 51% reduction |
| After-hours admin | Same trial | 61% reduction |
| Clinician-reported note quality | Lyrebird Health/Best Practice data | 88% reported improvement |
| Workflow efficiency gains | Same source | 84% reported gains |
| Capacity released per day | Heidi 2025 Impact Report | 59 minutes per clinician |
| Clinicians who would continue using | Same report | 98% |
| Patient acceptance rate | Same report | 99% |

The Heidi/Modality trial is the most rigorous published study. It ran for 25 days across a network of 53 GP surgeries. After the pilot, Modality expanded to over 200 clinicians and reported 530 clinical hours saved per month at scale, with over 17,000 consultations transcribed monthly.

What scribes actually produce

The output goes beyond transcription. Both Heidi and Lyrebird generate structured notes in formats like SOAP, draft referral letters, create mental health care plans and chronic condition management plans, and populate clinical fields in the electronic medical record. Lyrebird's Best Practice integration writes directly into the patient file, including observations like blood pressure and weight that were discussed during the consult.

Lyrebird also handles government-mandated paperwork: chronic care management plans, mental health treatment plans, Centrelink and WorkCover certificates, and NDIS PDFs, all drafted from the consultation and ready for clinician review.

Pricing

Heidi offers a free tier with core features and a Pro subscription. Lyrebird is now free for all Best Practice Premier customers for core scribing, with Pro features at an additional cost. Coviu bundles its AI Scribe with telehealth at $79 per clinician per month as an add-on, or $99 per clinician per month in the full Clinic Bundle.

For context, a human virtual medical scribe in Australia costs approximately $6,728 per month. The AI scribe costs between $0 and $99 per month.

The RACGP recommends that GPs obtain patient consent before using any AI listening tool. Medical defence organisations back this position. In practice, not every GP is following through. A Guardian investigation in March 2026 found that some practices frame consent as a "you don't mind if I use this?" afterthought rather than a genuine opt-in.

Dr Elizabeth Deveny, chief executive of the Consumer Health Forum, flagged the power imbalance: "Consider the power differential between a consumer and the clinician. What are they going to say?"

The best practice approach: tell the patient what the tool does, offer the option to decline, and confirm that declining does not change the care they receive. Heidi and Lyrebird both include consent workflows built into the product.

Need custom AI workflows for your practice?

We build private AI systems for Australian clinics. Recall engines, intake automation, care plan management. Integrated with Cliniko, Halaxy, and Best Practice. From $4,997.

See healthcare AI services

Administrative AI: the next frontier

Clinical scribes solve the documentation problem inside the consultation. The larger administrative burden sits outside it: recall management, patient intake, referral triage, billing, and document processing.

Recall and chronic disease management

For most general practices, chronic disease management plans are a significant revenue driver. From 1 July 2025, the old MBS items 721 (GP Management Plan), 723 (Team Care Arrangements), and 732 (Review) were replaced by the new GP Chronic Condition Management Plan (GPCCMP) items (965, 967) at $156.55 per plan or review.

The recall workflow is where AI can have immediate financial impact. A practice with 4,000 active patients and a 15% chronic disease prevalence has 600 patients who are eligible for annual GPCCMPs. At $156.55 per plan, that is $93,930 in potential annual Medicare revenue. The challenge is identifying who is overdue and getting them booked.
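The arithmetic behind that estimate takes a few lines to check. A minimal sketch, using the figures from the paragraph above; swap in your own patient count and prevalence:

```python
# Estimate potential annual GPCCMP revenue from an overdue-recall backlog.
# Figures match the worked example above; adjust for your own practice.
active_patients = 4000
chronic_prevalence = 0.15      # share of patients with an eligible chronic condition
gpccmp_rebate = 156.55         # MBS items 965/967, per plan or review (from 1 July 2025)

eligible_patients = int(active_patients * chronic_prevalence)
potential_revenue = eligible_patients * gpccmp_rebate

print(f"{eligible_patients} eligible patients")        # 600 eligible patients
print(f"${potential_revenue:,.2f} potential revenue")  # $93,930.00 potential revenue
```

The point of the exercise is not the exact dollar figure but the order of magnitude: even capturing half the backlog is a material revenue change for most practices.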

AI systems can watch the patient database for overdue recalls, match patients to eligible MBS items, draft personalised recall messages, and flag the ones most likely to respond based on past attendance patterns. This is not theoretical. HotDoc already automates clinical recalls and reminders across its platform of 21,000+ practitioners and 13 million patients, though its tools are rule-based rather than AI-driven.

The gap is intelligence. A rule-based system sends every overdue patient the same message at the same interval. An AI recall engine can prioritise patients by complexity, personalise the outreach, identify patients who have lapsed from the practice entirely, and suggest the most effective contact channel based on past interactions.
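To make the rule-based-versus-intelligent distinction concrete, here is a toy sketch of a prioritised recall queue. The data model, field names, and weights are all illustrative assumptions, not any vendor's API; a production system would learn weights from attendance history rather than hard-code them:

```python
# Sketch of a prioritised recall queue (hypothetical data model).
# A rule-based system treats every overdue patient identically;
# here patients are ranked before any outreach is drafted.
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    months_overdue: int
    chronic_conditions: int
    attended_last_recall: bool

def recall_priority(p: Patient) -> float:
    """Higher score = contact sooner. Weights are illustrative only."""
    score = p.months_overdue * 1.0 + p.chronic_conditions * 2.0
    if p.attended_last_recall:
        score += 3.0   # past responders are the likeliest to rebook
    return score

patients = [
    Patient("A", months_overdue=2, chronic_conditions=1, attended_last_recall=False),
    Patient("B", months_overdue=6, chronic_conditions=3, attended_last_recall=True),
    Patient("C", months_overdue=4, chronic_conditions=1, attended_last_recall=True),
]
queue = sorted(patients, key=recall_priority, reverse=True)
print([p.name for p in queue])   # ['B', 'C', 'A']
```

The ordering is the whole difference: the high-complexity, historically responsive patient gets contacted first instead of sitting in the same batch as everyone else.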

Document triage and inbox management

MediRecords launched Evolve Direct in early 2026, an AI tool that classifies incoming clinical documents, matches them to the correct patient and provider, and files them in the right place. Beta testing showed up to 120 minutes saved per day for practices with 10 doctors, with an 83% reduction in manual steps.

This is the kind of "boring" AI that delivers immediate ROI. No clinical risk. No patient-facing interaction. Pure administrative efficiency.

Patient intake and pre-consultation

HotDoc's platform handles digital forms, online bookings, check-in, and routine requests (including repeat prescriptions and referrals). Practices using HotDoc's automated recalls see a 51% reduction in no-shows compared to non-automated systems.

The next step is AI-powered intake: collecting patient history, medications, and presenting complaints before the appointment, summarising them for the clinician, and flagging anything that needs attention. Some of this already exists in fragments across different platforms. The opportunity is in connecting the pieces into a single workflow that runs from first contact to post-consultation follow-up.

Patient-facing AI: triage and symptom assessment

This is the most cautious category, and for good reason. AI that interacts directly with patients and influences clinical pathways carries higher risk and higher regulatory scrutiny.

The TGA's updated guidance on clinical decision support software (January 2026) draws a clear line. AI that only presents information for a clinician to review can be exempt from full TGA regulation, provided it meets three criteria: it does not directly process medical images or device signals, it only supports (not replaces) clinical judgement, and the clinician can independently reach their own conclusions from the source data.

AI that uses "black-box" machine learning to generate diagnostic recommendations does not qualify for exemption and must be included on the Australian Register of Therapeutic Goods (ARTG) before it can be supplied.

In practice, this means:

| AI type | TGA status | Example |
| --- | --- | --- |
| Clinical scribe (listens, transcribes, structures notes) | Not regulated (not a medical device) | Heidi, Lyrebird, Coviu |
| Recall engine (identifies overdue patients, drafts outreach) | Not a medical device | HotDoc Recalls, custom builds |
| Symptom checker (asks questions, suggests possible conditions) | Likely a medical device, may be exempt | Depends on design |
| Diagnostic AI (analyses images/data, generates diagnosis) | Regulated, must be on ARTG | Harrison.ai, Annalise.ai |
| Treatment recommendation AI (suggests medications/interventions) | Regulated, must be on ARTG | Not yet common in AU GP |

The safe approach for practice-level AI: stay on the administrative and documentation side of the line. If the AI never gives a patient clinical advice, never suggests a diagnosis, and never recommends treatment, the regulatory burden is minimal and the risk is low.
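The three exemption criteria from the January 2026 guidance amount to a simple conjunction. The sketch below is a reading aid only, not regulatory advice; the function name and inputs are my own shorthand for the criteria described above:

```python
# Illustrative check of the three CDSS exemption criteria described above.
# A reading aid, not regulatory advice; real classification decisions
# need the TGA guidance itself (and usually legal review).
def likely_exempt(processes_images_or_signals: bool,
                  replaces_clinical_judgement: bool,
                  clinician_can_verify_from_source: bool) -> bool:
    return (not processes_images_or_signals
            and not replaces_clinical_judgement
            and clinician_can_verify_from_source)

# A scribe-style tool: no image/signal processing, supports (not replaces)
# judgement, and the clinician can check the notes against the consult.
print(likely_exempt(False, False, True))   # True

# A black-box diagnostic model fails the test and needs ARTG inclusion.
print(likely_exempt(True, True, False))    # False
```

Failing any one of the three conditions is enough to pull a tool back inside the regulated category, which is why design decisions like "does it touch imaging data" matter early.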

Privacy and compliance

Healthcare data is the most sensitive category of personal information under Australian law. The regulatory framework for AI in healthcare rests on three pillars.

The Privacy Act 1988 and Australian Privacy Principles (APPs)

The 13 APPs regulate how personal information (including health information) is collected, stored, used, and disclosed. Any AI tool that processes patient data must comply. The key requirements for healthcare AI:

  • APP 3 (Collection): Only collect health information that is reasonably necessary for your functions. An AI scribe that records a consultation and discards the audio after generating notes handles this differently from one that retains audio indefinitely.
  • APP 6 (Use and disclosure): Health information collected for one purpose cannot be used for another without consent. If patient data is used to train an AI model, that is a secondary use requiring explicit consent. Both Heidi and Lyrebird state that patient data is not used to train external AI models.
  • APP 11 (Security): Take reasonable steps to protect personal information from misuse, interference, loss, and unauthorised access. For healthcare AI, this means encryption at rest and in transit, access controls, and audit logging.

The My Health Records Act 2012

The My Health Records Act adds specific protections for data in the national My Health Record system. Information downloaded from a My Health Record to a local system falls back under the Privacy Act and state health information laws. Unauthorised collection, use, or disclosure of My Health Record data carries mandatory data breach notification requirements and potential criminal penalties.

Notifiable Data Breaches scheme

Under the Privacy Act, healthcare providers must report eligible data breaches to the Office of the Australian Information Commissioner (OAIC) and affected individuals. Healthcare has consistently been among the top sectors for reported breaches in Australia. The Medibank breach (2022), Australian Clinical Labs breach (2022), and Genea breach remain cautionary examples.

What this means for AI adoption

Every major Australian healthcare AI vendor processes data onshore:

| Vendor | Data processing location | Compliance claims |
| --- | --- | --- |
| Heidi Health | Australia | SOC 2, ISO 27001, Australian Privacy Principles |
| Lyrebird Health | Australia | APP, HIPAA, GDPR aligned |
| Coviu | Australia | ISO 27001, APP compliant |
| MediRecords | Australia | Enterprise-grade security, clinical advisory co-design |
| HotDoc | Australia | 21,000+ practitioners, PMS integration |

The OAIC has assessed GP clinics for compliance with My Health Record security requirements. Findings included practices without written security policies and potential breaches that had not been notified. Any practice adding AI tools should ensure its Security and Access policy under Rule 42 of the My Health Records Rule 2016 covers AI-assisted data handling.

What the RACGP says

The RACGP's position statement on AI in primary care sets out three principles:

  1. GPs must be involved in the development and integration of AI tools
  2. The sector must be appropriately regulated, including post-market surveillance
  3. The RACGP will support GPs to develop AI skills

The position is pragmatically optimistic. The RACGP recognises administrative efficiency gains but warns that AI "should not create unnecessary and low-value work for GPs" through task substitution, such as having to check and correct poor AI outputs.

In its October 2024 submission to the AI legislation review, the RACGP was more specific: administrative AI is the clearest opportunity. Clinical AI needs tighter regulation. Consumers should be informed about AI use. And a dedicated oversight body for healthcare AI may be needed.

RACGP President Dr Michael Wright told The Guardian: "The GP, and potentially the patient too, needs to confirm that any AI output is correct."

What is not ready yet

Three areas where practice owners should exercise caution or hold off entirely.

Clinical decision-making. AI that diagnoses conditions, recommends treatments, or interprets pathology/imaging results is subject to TGA regulation and requires ARTG inclusion before supply. Diagnostic AI companies like Harrison.ai and Annalise.ai operate in hospital and specialist radiology settings with formal clinical validation. This is not yet a tool for the average GP desktop.

AI-generated clinical advice to patients. No credible Australian vendor offers unsupervised AI that gives patients clinical guidance. Lyrebird's usage policy explicitly states the platform "does not replace professional judgement, trigger clinical actions, perform patient monitoring, or recommend treatments." This is the right position.

AI receptionists. Tools that answer phones and interact with patients using synthesised voices carry reputational risk. Patient perception of AI phone answering in healthcare skews negative. The better path is AI that works behind the scenes: automating recall messages, triaging documents, preparing intake summaries for the clinician.

Adoption data

The numbers tell a story of rapid but uneven adoption.

| Data point | Source | Date |
| --- | --- | --- |
| 40% of Australian GPs use AI scribes | RACGP online poll, reported by The Guardian | November 2025 |
| Up from 22% in August 2024 | Same source | August 2024 |
| ~33% of GP supervisors use AI clinically | GPSA Annual National Supervision Survey | 2025 |
| 64% of GP supervisors rate themselves as AI novices | Same survey | 2025 |
| 66% of doctors have not used any AI tools | AusDoc survey, 212 doctors | February 2025 |
| 76% see AI as beneficial for administrative efficiency | Same survey | February 2025 |
| Heidi supported 115M+ sessions in 18 months | The Guardian | March 2026 |
| Lyrebird supports 1M+ Australian consults per month | Lyrebird/Best Practice announcement | 2026 |

The AusDoc survey from February 2025 and the RACGP poll from November 2025 show different numbers because they measured different things: the AusDoc survey asked about all AI tools, while the RACGP poll focused specifically on scribes. Both surveys show the same trend: adoption is accelerating, but most practitioners still consider themselves beginners.

The GPSA found that fewer than one-third of GP practices have a formal policy on AI use. Registrars (trainee GPs) are often adopting AI tools before their supervisors, creating a knowledge gap that practices will need to address.

Investment flowing in

The investment numbers confirm that healthcare AI is not a niche experiment.

Australian healthtech companies raised $380 million in the first two months of 2026, exceeding the entire first quarter of 2025. Healthcare and biotech account for 40% of all AI startup funding in Australia, more than any other sector.

Globally, healthcare AI raised $7.8 billion in 2025, up 287% from 2024. Clinical documentation and drug discovery led the categories. Heidi Health and Harrison.ai were cited as proof that clinical AI can scale from Australia.

The Australian Digital Health Agency's 2025-26 Corporate Plan prioritises interoperability, share-by-default data access, and AI governance. The National Policy Roadmap for AI in Healthcare, developed by the Australian Alliance for AI in Healthcare, calls for a fully funded national plan to create an AI-enabled healthcare system by 2025.

The $400 million Strengthening Medicare Fund in the 2025-26 federal budget includes allocations for digital health integration, with early indications that funding will prioritise technologies reducing pressure on general practice.

Practical first steps for a practice

A practice owner or practice manager looking at AI adoption in 2026 does not need to build anything custom. The proven path starts with off-the-shelf tools and moves to custom solutions only when the generic options do not fit.

Step 1: Start with a clinical scribe. Heidi or Lyrebird, depending on your practice management system. Both are free for core features. Lyrebird has deeper Best Practice integration. Heidi works across more platforms. Run a 4-week pilot with two or three willing clinicians. Measure time saved per consultation and after-hours documentation reduction.
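The pilot measurement does not need anything fancier than a spreadsheet, but the calculation is worth being explicit about. A back-of-envelope sketch with illustrative numbers; collect your own before-and-after samples per clinician:

```python
# Pilot evaluation sketch (illustrative numbers only): compare average
# documentation minutes per consult before and during the scribe pilot.
before = [9, 11, 10, 12, 8]    # minutes per consult, pre-pilot sample
after  = [4, 5, 6, 5, 4]       # minutes per consult, during pilot

avg_before = sum(before) / len(before)
avg_after = sum(after) / len(after)
reduction = (avg_before - avg_after) / avg_before

print(f"{reduction:.0%} reduction in documentation time")  # 52% reduction in documentation time
```

Track after-hours documentation the same way (minutes per day, not per consult), since that is where the published trials report the largest gains.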

Step 2: Audit your recall and chronic disease management workflow. Count the number of patients eligible for GPCCMPs (items 965/967) who are overdue. Calculate the uncaptured Medicare revenue. If the number is material (and for most practices it is), look at what HotDoc, your PMS, or a custom recall engine can automate.

Step 3: Automate document triage. If your practice runs on MediRecords, Evolve Direct handles this out of the box. For practices on other systems, evaluate whether incoming document processing is consuming meaningful reception hours.

Step 4: Write an AI policy. The GPSA found that fewer than one-third of practices have one. Covering which tools are approved, how consent is obtained, how outputs are reviewed, and who is responsible for errors is basic governance that regulators expect.

Step 5: Consider custom builds when off-the-shelf tools leave gaps. If your practice has specific recall protocols, complex intake workflows, or integration needs that no vendor product covers, a custom AI system built to your specifications and running on infrastructure you control may be the right move. We built Angels Link, a healthcare-adjacent NDIS platform with Next.js and Shopify dual architecture, and Equal Legal, an AI triage system that handles intake conversationally and hands every decision to a human before it reaches the client. The same architecture applies to clinic workflows: AI handles the administrative preparation, clinicians retain full control of clinical decisions.


The bottom line

AI in Australian healthcare is real, growing, and already saving clinicians measurable hours every week. Clinical scribes are the entry point. Administrative automation (recalls, document triage, intake preparation) is the next layer. Clinical decision-making AI exists but operates in specialist settings under TGA oversight, not on the average GP desktop.

The practices adopting now are recovering 1 to 2.5 hours per clinician per day in documentation time alone. The practices that also tackle recall automation are capturing tens of thousands of dollars in Medicare revenue that was previously falling through the cracks.

The regulatory framework is clear enough to act on. The Privacy Act and APPs cover data handling. The TGA draws a line between documentation tools (not regulated) and diagnostic tools (regulated). The RACGP supports adoption with appropriate oversight.

Start with a scribe. Audit your recalls. Write an AI policy. The tools are proven, the data is published, and the cost of waiting is measured in clinician hours lost and patient recalls missed.

Frequently Asked Questions

Is AI safe to use in Australian healthcare?

AI tools used for documentation and administration, such as clinical scribes and recall systems, are considered low-risk and are widely adopted in Australian general practice. Tools that influence clinical decisions are subject to TGA oversight. The RACGP recommends that GPs review all AI outputs before acting on them, and that patient consent be obtained before using listening tools during consultations.

What percentage of Australian GPs use AI scribes?

According to an RACGP poll, 40% of Australian GPs were using AI scribes as of November 2025, up from 22% in August 2024. The GPSA reports that roughly one-third of GP supervisors use AI in their clinical work, though 64% still rate themselves as novices in AI proficiency.

Do patients need to consent to AI being used in their consultation?

Yes. Medical defence organisations and the RACGP recommend that patient consent be obtained before using any AI tool that listens to or records a consultation. The RACGP advises GPs to explain what the tool does, offer the option to opt out, and ensure that declining does not affect the quality of care.

How much time do AI clinical scribes save?

Published data from Heidi Health shows documentation time during appointments reduced by 51% and after-hours admin dropped by 61% across a 47-GP trial. Lyrebird Health reports an 80% reduction in post-consult documentation time for some clinicians. Heidi's 2025 Impact Report found 59 minutes of capacity released per clinician per day.

What privacy laws apply to AI in Australian healthcare?

The Privacy Act 1988 and its 13 Australian Privacy Principles (APPs) govern how personal health information is collected, stored, used, and disclosed. The My Health Records Act 2012 adds specific protections for data in the national My Health Record system. All major Australian healthcare AI vendors process and store data within Australia and comply with APP requirements.

Does the TGA regulate AI clinical scribes?

No. AI scribes that only transcribe and summarise clinical conversations are currently exempt from TGA regulation because they do not directly diagnose patients or make clinical decisions. AI tools that do provide diagnostic recommendations or treatment plans may be classified as medical devices and require inclusion on the Australian Register of Therapeutic Goods (ARTG).

Written by Chris Raad

Founder of Studio Slate. Law degree from Macquarie University. Fell in love with programming at law school when he discovered he could automate his study workflows. Now builds digital infrastructure for professional services firms on the same technology as TikTok and Uber.
