What AI can and cannot safely do in an Australian law firm. Proven use cases, hallucination risks, confidentiality obligations, tool landscape, and ROI maths.
Key Takeaways
- 98% of Australian legal professionals now use AI in some capacity, but only 16% of Australian firms use legal-specific AI daily compared with 49% globally. Most firms are using generic tools like ChatGPT rather than purpose-built legal AI integrated into their practice management systems.
- The proven use cases are administrative, not advisory: document review (cited by 54% of Australian firms), legal research, client intake triage, billing automation, and first-draft document assembly. AI is not safe for generating substantive legal advice or court submissions without full human review.
- An Australian practitioner has already been referred to the NSW Legal Services Commissioner for filing AI-generated fake case citations, following the same pattern as the landmark US case Mata v Avianca. Hallucination risk is a professional conduct issue, not a theoretical concern.
- The ROI case is straightforward: a partner spending two hours per day on admin at $600/hr is leaving roughly $300,000 per year on the table. Legal technology saves professionals one to three hours daily on routine tasks according to Thomson Reuters. The firms adopting AI are recovering billable time, not replacing legal judgment.
I'm Chris from Studio Slate. I studied law and built Equal Legal, an AI-powered legal document platform for the Australian market. I have watched the AI conversation in legal circles go from curiosity to anxiety to vendor overload in about 18 months. Managing partners at the small-to-mid firms I speak with all have the same question: what is real, what is safe, and what is marketing fluff? This guide is the answer.
Where Australian law firms actually stand on AI
The headline numbers look impressive. 98% of Australian legal professionals report using AI in some capacity, according to Clio's 2025 Legal Trends Report. That figure outpaces the US, Canada, and the UK. But the details tell a more honest story.
A 2026 survey by LEAP Legal Software of 700 legal professionals across six countries found that only 16% of Australian respondents use legal-specific AI tools daily or as part of core workflows, compared with 49% globally. While 57% of respondents globally said they regularly use integrated AI, the Australian figure was 37%.
The gap is between using ChatGPT occasionally and embedding AI into how the firm operates. Most Australian firms are doing the former.
Trust remains a barrier. 32% of Australian respondents reported low or no trust in AI integration, the highest level across all surveyed markets. This caution is not irrational. Lawyers carry professional obligations that make careless AI use genuinely dangerous, as several recent court cases have demonstrated.
The Thomson Reuters 2025 Generative AI in Professional Services Report found that 26% of legal organisations are now actively using generative AI, up from 14% in 2024. Document review (77%), legal research (74%), and document summarisation (74%) are the top use cases. But only 41% of law firms have established policies governing AI use. The technology has outpaced the governance.
| Metric | Australia | Global |
|---|---|---|
| Use legal-specific AI daily | 16% | 49% |
| Regularly use integrated AI | 37% | 57% |
| Report AI saving moderate-to-significant time | 50% | 71% |
| Low or no trust in AI | 32% | Lower across all other markets |
| Have established AI governance policies | 37% (AU, Clio 2026) | 41% (global, TR 2025) |
What AI can do in a law firm today
The use cases that work are the ones that eliminate admin, not the ones that replace legal reasoning. Five categories have proven track records.
1. Client intake triage
A prospective client fills in a web form or chats with an AI interface. The AI classifies the matter type, assesses urgency, identifies conflicts, and routes the enquiry to the right solicitor with a summary. The alternative is a paralegal or office manager reading every email and making the same judgment calls, often after a delay.
When I built Equal Legal, the conversational triage layer was the highest-value component. A user describes their problem in plain language. The AI determines whether the matter is simple enough for a templated document or complex enough to require a solicitor referral. High-risk matters (family law, criminal) are automatically blocked from self-service and referred to a qualified practitioner.
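The routing logic behind this kind of triage layer is simple to express. The sketch below is illustrative only: the matter types, urgency labels, and rules are hypothetical, not the Equal Legal implementation.

```python
# Hedged sketch of intake triage routing. Matter types, urgency labels,
# and routing rules are hypothetical examples, not a real firm's rules.
HIGH_RISK = {"family", "criminal"}  # blocked from self-service

def route(matter_type: str, urgency: str) -> str:
    """Route an intake enquiry based on matter type and urgency."""
    if matter_type in HIGH_RISK:
        return "refer_to_solicitor"   # always escalated to a qualified practitioner
    if urgency == "urgent":
        return "priority_queue"       # same-day review by a solicitor
    return "standard_intake"          # templated self-service path

print(route("family", "standard"))  # → refer_to_solicitor
print(route("lease", "urgent"))     # → priority_queue
```

In a production system the classification itself would come from an LLM or a rules engine, but the high-risk block should remain a hard-coded gate, not a model decision.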
For a firm receiving 50 enquiries per week, intake triage alone can save 10 to 15 hours of admin time weekly. At paralegal rates of $80 to $120 per hour, that is roughly $40,000 to $90,000 per year in recovered capacity.
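As a back-of-envelope check, that arithmetic can be sketched as follows. The figures are the illustrative assumptions from this section (48 working weeks), not firm-specific data.

```python
# Back-of-envelope intake triage savings. All figures are the
# illustrative assumptions from the text, not firm-specific data.
def annual_intake_savings(hours_saved_per_week: float,
                          paralegal_rate: float,
                          working_weeks: int = 48) -> float:
    """Annual value of admin hours recovered by automated intake triage."""
    return hours_saved_per_week * paralegal_rate * working_weeks

low = annual_intake_savings(10, 80)    # conservative: 10 hrs/wk at $80/hr
high = annual_intake_savings(15, 120)  # upper end: 15 hrs/wk at $120/hr
print(f"${low:,.0f} to ${high:,.0f} per year")  # → $38,400 to $86,400 per year
```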
2. Document review and contract analysis
This is the most mature legal AI use case and the one cited by 54% of Australian firms as their primary AI application. AI reads contracts, leases, and disclosure documents, then flags unusual clauses, missing provisions, and deviations from standard positions.
Tools like Luminance, which was developed at the University of Cambridge, use a combination of supervised and unsupervised machine learning to analyse complex legal texts. For M&A due diligence, a document set that would take a team of four associates two weeks to review can be triaged by AI in hours. The associates still review the flagged items, but the volume of human reading drops by 60 to 80%.
Harvey, backed by OpenAI and adopted by firms including Herbert Smith Freehills Kramer, handles document review, discovery, contract analysis, and compliance research at enterprise scale. For smaller firms, the document review features built into LEAP, Smokeball, and Clio provide a lighter-weight version of the same capability.
3. Legal research assistance
AI-powered legal research does not replace Westlaw or LexisNexis. It sits alongside them. LEAP's LawY research assistant lets lawyers ask questions in natural language and receive citation-backed answers that can be verified against the underlying source. Jade by BarNet provides free access to Australian case law with AI-enhanced search. Both Westlaw AU and LexisNexis have introduced AI assistants that can answer research questions in natural language.
The time saving is real. LEAP reports that LawY saves firms 10 to 15 hours per week, and that figure comes from firms already using traditional research tools alongside it. The AI handles the initial search and summary. The lawyer verifies the citations and applies judgment.
The critical distinction: AI research tools that are integrated into legal databases and trained on verified case law are fundamentally different from asking ChatGPT a legal question. The former retrieves from a verified corpus. The latter generates plausible-sounding text that may not correspond to any real authority.
4. Document drafting and assembly
AI can produce first drafts of engagement letters, standard agreements, file notes, and routine correspondence. Josef, an Australian-built platform, allows firms to create automated document workflows without coding. A firm can build a diagnostic tool that walks a client through their situation and generates a customised document from pre-approved clause templates.
Smokeball's AI features include automated document assembly from matter data. Clio's Manage AI generates document drafts using firm-specific context from within the practice management system. LEAP's AI Prompts and Generator creates state-specific legal forms with automated client and case information.
The human-in-the-loop principle applies here without exception. AI drafts. A solicitor reviews, edits, and approves. Nothing generated by AI should reach a client or a court without a practising lawyer having read it.
5. Time recording and billing review
Missed time entries are one of the most common sources of revenue leakage in law firms. Smokeball's AutoTime captures time in the background as lawyers work, recording activity that would otherwise go unbilled. LexUnits converts meeting recordings, transcripts, emails, and documents into structured billing entries in 6-minute increments, formatted for direct export to Actionstep, LEAP, Clio, or Smokeball.
For a five-lawyer firm where each lawyer under-records by 30 minutes per day, the lost revenue at blended rates of $400 per hour is approximately $250,000 per year. Automated time capture recovers a significant portion of this without requiring any change in behaviour from the lawyers themselves.
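The leakage figure above follows from a simple multiplication, sketched here with the section's own illustrative assumptions (250 working days per year):

```python
# Revenue leaked to unrecorded time. Inputs are the illustrative
# assumptions from the text: 5 lawyers, 30 min/day missed, $400/hr blended.
def annual_leakage(lawyers: int, hours_missed_per_day: float,
                   blended_rate: float, working_days: int = 250) -> float:
    """Annual revenue lost to unbilled time across the firm."""
    return lawyers * hours_missed_per_day * blended_rate * working_days

print(f"${annual_leakage(5, 0.5, 400):,.0f} per year")  # → $250,000 per year
```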
What AI cannot safely do (yet)
Two categories of legal work remain off-limits for unsupervised AI, and for good reason.
Substantive legal advice
The joint statement from Australian legal profession regulators (the Law Society of NSW, the Victorian Legal Services Board and Commissioner, and the Legal Practice Board of WA) is unambiguous:
"AI chatbots/copilots and other LLM-based tools cannot reason, understand, or advise. Lawyers are responsible for exercising their own forensic judgement when advising clients, and cannot rely on the output of an AI tool as a substitute for their own assessment and advice."
This is not a conservative overreaction. It is an accurate description of how large language models work. An LLM does not understand the law. It generates sequences of tokens that are statistically likely given the input. It cannot distinguish between a binding High Court authority and a superseded lower court decision. It cannot assess whether a novel factual scenario falls within or outside an established legal principle. It cannot weigh competing policy considerations.
The regulators' guidance recommends limiting AI to "tasks which are lower-risk and easier to verify (e.g. drafting a polite email or suggesting how to structure an argument)" and prohibiting its use for "tasks which are higher-risk and which should be independently completed by a qualified person (e.g. providing legal advice on a novel concept or executive decision-making)."
Court submissions without independent verification
This is where the consequences have already materialised, in multiple jurisdictions.
The hallucination problem: from theory to sanctions
In June 2023, Judge P. Kevin Castel of the US Southern District of New York sanctioned attorneys Steven Schwartz, Peter LoDuca, and their firm Levidow, Levidow & Oberman a total of $5,000 for submitting a brief containing six fabricated case citations generated by ChatGPT. The case, Mata v Avianca (No. 22-cv-1461, 678 F. Supp. 3d 443, S.D.N.Y. 2023), became the founding document of legal AI compliance globally.
The citations looked real. They had case names, reporter volumes, jurisdiction designators, and page numbers. Not a single one existed. When Schwartz asked ChatGPT to verify the cases, the model generated fake confirmation. As Judge Castel wrote in his 46-page sanctions opinion:
"Many harms flow from the submission of fake opinions. The opposing party wastes time and money in exposing the fraud."
Australia followed. In January 2025, Judge Skaros of the Federal Circuit and Family Court referred an Australian legal practitioner to the NSW Legal Services Commissioner in Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95. The practitioner had used ChatGPT to identify Australian cases and incorporated the results into court submissions without checking them. The submissions contained citations to Federal Court cases that did not exist and quotes from a Tribunal decision that were fabricated.
The practitioner explained that he had accessed ChatGPT, "inserted some words and the site prepared a summary of cases for him. He said the summary read well, so he incorporated the authorities and references into his submissions without checking the details."
The Court referred the matter to the Legal Services Commissioner, noting that "the misuse of artificial intelligence in legal proceedings was a matter of current public interest" and that there was "a strong public interest" in the regulator being made aware of such conduct.
This is not an edge case. Stanford HAI research has documented that even purpose-built legal AI tools hallucinate at measurable rates. According to reporting from AI Outlooks, Lexis+ AI and Ask Practical Law AI produced incorrect information more than 17% of the time, while Westlaw's AI-Assisted Research hallucinated more than 34% of the time. These are the premium, legal-specific tools, not consumer ChatGPT.
The lesson is not "avoid AI." The lesson is that every AI output touching a court submission or client advice must be independently verified by a practising solicitor against a reliable primary source.
Professional conduct implications of AI hallucination
Under ASCR Rule 19, solicitors have a duty not to deceive or mislead the court. Under Rule 4.1.3, solicitors must deliver legal services competently and diligently. Filing AI-generated citations without verification breaches both rules and can constitute unsatisfactory professional conduct or professional misconduct. The defence "I did not know ChatGPT could fabricate cases" has been explicitly rejected by courts in both the US and Australia.
Confidentiality: where your client data goes
Confidentiality is the second major concern, and the regulators have addressed it directly.
The joint statement from Australian legal profession regulators states:
"Lawyers cannot safely enter confidential, sensitive or privileged client information into public AI chatbots/copilots (like ChatGPT), or any AI tool where the inputs may be used to train or improve the model or could be accessed by third parties."
The distinction that matters is between consumer-grade AI tools and enterprise or legal-specific platforms.
Consumer tools (ChatGPT free tier, Google Gemini free, Claude.ai free): Inputs may be used for model training. Data may be processed on servers outside Australia. There is no contractual guarantee of confidentiality. Using these tools with client matter data creates a real risk of breaching ASCR Rule 9 (confidentiality) and potentially waiving legal professional privilege.
Enterprise and legal-specific tools (LEAP AI, Smokeball AI, Clio Manage AI, Harvey): These operate under enterprise agreements with data processing terms. Clio states that client data stays within the same platform the firm already uses for matters, billing, and client management. LEAP processes AI queries through enterprise endpoints. Custom builds can be deployed to AWS Sydney or the firm's own infrastructure with zero-day data retention.
The practical implication: if a solicitor would not email the information to a stranger, they should not enter it into a public AI tool. Legal-specific platforms with proper data processing agreements are the appropriate alternative.
The Law Society of NSW's guide to responsible use of AI (updated January 2026) reinforces this, listing confidentiality under ASCR Rule 9 as the first obligation solicitors must consider when using generative AI, and recommending that firms implement "clear, risk-based policies" specifying what AI tools are permitted, who can use them, for what purposes, and with what information.
The AI tools available to Australian law firms
The landscape breaks into three tiers based on firm size and budget.
Built into your practice management software
These are the lowest-friction options. If you already use LEAP, Smokeball, or Clio, you have AI features available at no additional cost.
| Platform | AI features | Strength |
|---|---|---|
| LEAP | LawY research assistant, Matter AI document analysis, AI Prompts and Generator for drafting, AutoTime recording | Deepest Australian legal document library, citation-backed research |
| Smokeball | Archie AI assistant, automated document assembly, matter workflows, time recording from email | AutoTime captures billable work passively, strong in conveyancing and family law |
| Clio | Manage AI for matter summaries, document drafting, client communication | Best API ecosystem, works with firm's existing data, strong client intake tools |
Smokeball's 2024 State of Law report found that 74% of legal professionals are eager to explore AI benefits. The simplest path is starting with the tools already bundled into the software your firm pays for.
Standalone legal AI tools
| Tool | What it does | Best for | Approximate cost |
|---|---|---|---|
| Josef | No-code document automation and client intake workflows | High-volume standardised documents (leases, NDAs, employment contracts) | Contact for pricing |
| Luminance | AI contract review and due diligence | M&A, large contract review, compliance | Enterprise pricing ($500+/mo) |
| Harvey | Enterprise legal AI for research, drafting, analysis | Large firms with enterprise budgets | Enterprise pricing |
| Habeas | Australian-built legal research trained on local case law and statutes | Firms needing AU-specific research | Contact for pricing |
| LexUnits | Meeting recordings and documents to billing entries | Firms with poor time capture discipline | From A$19/month |
| Jade (BarNet) | Free AU legal research with AI-enhanced search | Any firm needing case law access | Free |
Custom AI builds
For firms with specific workflows that no off-the-shelf product addresses. A custom build connects AI directly to the firm's practice management system, matter data, document templates, and escalation rules. The AI knows the firm's tone, its engagement letter templates, and its conflict checking requirements.
We built this for Equal Legal: a conversational triage layer that classifies matters, generates documents from lawyer-drafted clause templates, and refers complex matters to qualified practitioners. The architecture uses clause-based assembly rather than freeform generation, which eliminates hallucination risk for supported document types.
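The clause-based assembly pattern can be sketched in a few lines. This is a minimal illustration of the general technique, with hypothetical clause names and templates; it is not the Equal Legal codebase.

```python
# Minimal sketch of clause-based document assembly: output is composed
# only from lawyer-approved clause templates filled with matter data,
# never freeform generation, so supported documents cannot hallucinate
# legal text. Clause names and templates here are hypothetical.
APPROVED_CLAUSES = {
    "parties": "This agreement is between {client_name} and {counterparty}.",
    "term": "The term of this agreement is {term_months} months.",
    "governing_law": "This agreement is governed by the laws of {state}, Australia.",
}

def assemble(selected: list[str], fields: dict[str, str]) -> str:
    """Fill pre-approved clause templates with matter data; reject unknown clauses."""
    unknown = [c for c in selected if c not in APPROVED_CLAUSES]
    if unknown:
        raise ValueError(f"Unapproved clauses requested: {unknown}")
    return "\n\n".join(APPROVED_CLAUSES[c].format(**fields) for c in selected)

doc = assemble(
    ["parties", "governing_law"],
    {"client_name": "Acme Pty Ltd", "counterparty": "Beta Pty Ltd",
     "state": "New South Wales"},
)
```

An AI layer can choose *which* approved clauses apply and extract the field values, but the text that reaches the client comes only from the lawyer-drafted templates.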
Custom builds start at around $5,000 for a single workflow and $15,000 to $30,000 for a full firm deployment. The economics work when the firm has a specific bottleneck that no existing product addresses, or when data confidentiality requirements rule out third-party SaaS platforms.
Need AI built for your firm, not the whole legal market?
We build custom AI for Australian law firms. Trained on your matters, integrated with LEAP, Smokeball or Clio, owned by you. Human-in-the-loop on every workflow.
See AI for lawyers

The ROI maths
The value of AI in a law firm is measured in recovered billable time, not in headcount reduction. The Thomson Reuters ROI of Legal Tech and AI Report 2025 surveyed 1,000 legal professionals across Australia, Asia, New Zealand, and the Middle East. The findings:
- Legal technology saves professionals one to three hours daily on routine tasks including contract drafting, legal research, matter management, and discovery
- 36% of law firm users say technology gives them a competitive edge
- 33% report reduced stress, and 32% feel more confident in their work
- Nearly half of decision-makers see "level of innovation" as a key ROI metric, while headcount reduction appears near the bottom for both in-house counsel and law firms
The Thomson Reuters Future of Professionals Report 2025 estimates that professionals using AI will save five hours per week within the next year, worth approximately $19,000 per professional annually.
For an Australian law firm, the calculation looks like this:
| Role | Hourly rate | Admin time saved per day | Annual value recovered |
|---|---|---|---|
| Senior partner | $600 | 2 hours | $300,000 |
| Mid-level associate | $350 | 1.5 hours | $131,250 |
| Junior associate | $250 | 1 hour | $62,500 |
| Paralegal | $120 | 2 hours | $60,000 |
A five-lawyer firm recovering an average of 1.5 hours per person per day at a blended rate of $350/hr generates roughly $656,000 in additional billable capacity per year. Not all of that converts to billed work, but even capturing 30% of it represents nearly $200,000 in additional revenue against an AI investment that ranges from $0 (using existing PMS features) to $30,000 (custom build).
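The five-lawyer calculation can be checked directly. The inputs below are the illustrative assumptions from this section (250 working days, 30% conversion of recovered time to billed work), not survey data.

```python
# Recovered billable capacity for a five-lawyer firm, using the
# illustrative assumptions from the text (not survey data).
def recovered_capacity(lawyers: int = 5, hours_per_day: float = 1.5,
                       blended_rate: float = 350, working_days: int = 250,
                       conversion: float = 0.30) -> tuple[float, float]:
    """Return (gross capacity recovered, portion realistically billed)."""
    gross = lawyers * hours_per_day * blended_rate * working_days
    return gross, gross * conversion

gross, billed = recovered_capacity()
print(f"capacity ${gross:,.0f}, billed at 30%: ${billed:,.0f}")
# → capacity $656,250, billed at 30%: $196,875
```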
The firms with clear AI strategies are already seeing this. Organisations with a visible AI strategy are 3.5 times more likely to realise benefits from AI than those with no significant plans, according to Thomson Reuters.
What the regulators expect
Australian legal profession regulators have been unusually clear on AI compared with other professional bodies. The key documents:
1. Joint Statement on AI in Australian Legal Practice (November 2024)
Issued by the Law Society of NSW, Victorian Legal Services Board and Commissioner, and Legal Practice Board of WA. Applies to solicitors in NSW and to solicitors and barristers in Victoria and WA. Key principles:
- Maintain client confidentiality (ASCR Rule 9). Do not enter confidential information into public AI tools.
- Provide independent advice (ASCR Rule 4.1.4). AI cannot substitute for the solicitor's own forensic judgment.
- Be honest and deliver services competently (ASCR Rules 4.1.2 and 4.1.3). Verify all AI outputs.
- Charge fair, reasonable, and proportionate costs (LPUL ss 172-173). Do not bill full manual rates for AI-assisted work without disclosure.
2. Law Society of NSW Guide to Responsible Use of AI (Updated January 2026)
The full guide covers accuracy, confidentiality, intellectual property, supervision obligations, and costs. It reminds practitioners that under s 34 of the LPUL, law practice principals are responsible for the legal services provided by the practice and must take reasonable steps to ensure all practitioner associates comply with their regulatory, professional, and ethical obligations.
3. Court practice notes on generative AI
The NSW Supreme Court and District Court have issued practice guidance distinguishing between AI-assisted drafting and the creation of evidence. AI may assist with grammar, language, or formatting, but must not be used to generate affidavits, witness statements, character references, or other evidence purporting to reflect a person's knowledge or belief.
The practical takeaway: AI is permitted and increasingly expected, but with guardrails. Every firm needs a written AI policy. Every AI output must be reviewed before it reaches a client or court. And clients should be informed about how AI is used in their matter.
Building an AI policy for your firm
The Clio 2026 State of Legal Tech report found that only 37% of Australian law firms have strong AI policies and governance. Given that practitioners are already using AI (whether the firm has sanctioned it or not), this gap creates risk.
A practical AI policy for a small to mid-sized firm covers five areas:
Approved tools. List which AI tools are permitted for use with firm data. Distinguish between tools with enterprise data agreements (LEAP AI, Smokeball AI, Clio Manage AI) and public tools (ChatGPT, Gemini) that should never be used with client information.
Permitted use cases. Specify what AI can be used for: administrative tasks, first-draft documents, research summaries, time recording. Specify what it cannot be used for without partner sign-off: client advice, court submissions, regulatory filings.
Verification requirements. Every AI output that will be relied upon in a matter must be independently verified by a practising solicitor. For legal research, this means checking every citation against a primary source. For document drafts, this means reviewing every clause against the firm's precedent library and current legislation.
Confidentiality protocols. No client names, matter details, or privileged information entered into public AI tools. De-identification procedures for any client data used in non-enterprise AI systems. Logging requirements for all AI-assisted work.
Costs and disclosure. How AI-assisted work is billed. Whether and how clients are informed about AI use. The Law Society guidance notes that charging full manual rates for AI-assisted work without disclosure raises questions under LPUL ss 172-173.
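The verification requirement above can even be partially enforced mechanically, by refusing to pass a draft onward until every citation appears in a solicitor-verified register. This is a minimal sketch under stated assumptions: the register, the draft text, and the medium-neutral citation pattern are hypothetical, and a regex check supplements (never replaces) the solicitor's check against primary sources.

```python
# Sketch of a verification gate: citations in a draft must appear in a
# solicitor-maintained register of verified authorities before filing.
# The register contents and citation format here are hypothetical.
import re

verified_register = {"[2025] FedCFamC2G 95"}  # citations a solicitor has checked

def unverified_citations(draft: str) -> list[str]:
    """Return medium-neutral citations in the draft not yet in the register."""
    cited = re.findall(r"\[\d{4}\] \w+ \d+", draft)
    return [c for c in cited if c not in verified_register]

draft = ("See Valu v Minister [2025] FedCFamC2G 95 "
         "and Smith v Jones [2024] NSWSC 120.")
print(unverified_citations(draft))  # → ['[2024] NSWSC 120']
```

A human still verifies each flagged citation against AustLII, Jade, or a commercial database; the gate only guarantees nothing slips through unchecked.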
Practical first steps
For a managing partner at a small to mid-sized firm who has read this far and wants to act:
Week 1: Audit what you already have. Check whether your practice management software (LEAP, Smokeball, Clio, Actionstep) has AI features you are not using. Most do. These are the safest starting point because client data stays within the system you already trust.
Week 2: Draft a firm AI policy. Even a one-page document that specifies approved tools, prohibited uses, and verification requirements is better than nothing. The Law Society of NSW guide provides a framework. Circulate it to all staff.
Week 3: Start with one workflow. Pick the highest-volume administrative task in your firm. For most practices, this is intake triage, first-draft correspondence, or time recording. Implement one AI tool for that single task. Measure the time saved over 30 days.
Week 4: Review and expand. Assess the time savings and error rate from the first workflow. If the results justify it, add a second workflow. If not, adjust the tool or the process before scaling.
The firms that are getting value from AI are not the ones that bought the most expensive platform. They are the ones that identified a specific bottleneck, chose a tool that addressed it, and trained their people to use it within appropriate boundaries.
Sources
- Clio 2025 Legal Trends Report - 98% AU AI adoption, firm benchmarks, billing data
- Clio 2025 Legal Trends for Solo and Small Law Firms - Solo/small firm AI adoption rates, digital intake impact
- Clio 2025 Legal Trends for Mid-Sized Law Firms - 93% mid-sized AI adoption, billing strategies
- Clio: AI in Australian Law Firms (2026) - 37% have strong AI policies, governance gaps
- LEAP Legal Software 2026 Survey - 16% daily legal AI use in AU vs 49% global, trust barriers
- LEAP Legal AI Features - LawY research assistant, 10-15 hours/week saved
- Thomson Reuters 2025 Generative AI in Professional Services Report - 26% active adoption, document review as top use case
- Thomson Reuters ROI of Legal Tech and AI Report 2025 - 1-3 hours saved daily, 65% have AI strategy
- Thomson Reuters Future of Professionals Report 2025 - 5 hours/week savings, $19,000 annual value per professional
- Smokeball: 8 Best Legal AI Tools for Australian Lawyers (2026) - Archie AI, Habeas, Harvey, tool comparison
- LexUnits: Best AI Tools for Australian Lawyers (2026) - Josef, Luminance, Spellbook, billing automation
- Law Society of NSW, VLSB+C, LPBWA: Joint Statement on AI in Australian Legal Practice (2024) - Regulator expectations, confidentiality, verification duties
- Law Society of NSW: Solicitor's Guide to Responsible Use of AI (January 2026) - ASCR Rules 4, 9, 17, 19, 37 applied to AI
- Law Society of NSW: AI Guidance Press Release - Joint regulator statement context
- Mata v Avianca, Inc., 678 F. Supp. 3d 443 (S.D.N.Y. 2023) - Sanctions for ChatGPT-fabricated case citations
- Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95 - Australian practitioner referred to OLSC for AI-generated fake citations
- Carroll & O'Dea: The AI Lawyer, Opportunity, Hallucination and Professional Risk (2026) - AU court AI guidance, Valu case analysis, NSW practice notes
- Herbert Smith Freehills Kramer Adopts Legora AI Firmwide (2026) - Enterprise legal AI adoption
- Stanford HAI AI Index 2025 - Global AI investment, hallucination research
- AI Outlooks: Australian Lawyer Sanctioned for AI-Generated Citations - Lexis+ AI 17% error rate, Westlaw AI 34% hallucination rate
- Lawyers Weekly: Two-Thirds of Australian Lawyers Using ChatGPT (2024) - LawCPD poll, 66% ChatGPT adoption, 20% have policies
- PracticeProof: The AI Strategy Imperative for Australian Law Firms (2026) - Framework analysis, 98% AU adoption figure context
Frequently Asked Questions
Is AI safe to use in an Australian law firm?
Yes, when used within appropriate boundaries. The Law Society of NSW, the Victorian Legal Services Board, and the Legal Practice Board of WA have jointly issued guidance confirming that AI tools can be used in legal practice provided solicitors maintain client confidentiality, verify all outputs, exercise independent legal judgment, and charge fair costs. The key rule is that a practising solicitor must review every AI output before it reaches a client or a court.
What are the best AI tools for Australian law firms?
For small to mid-sized firms, the most practical options are AI features built into existing practice management software: LEAP's LawY for legal research, Smokeball's Archie AI for document drafting and time recording, and Clio's Manage AI for matter summaries and client communication. Josef is an Australian-built platform for automating document workflows. Larger firms use enterprise tools like Harvey and Luminance for document review and contract analysis.
Can AI replace lawyers in Australia?
No. AI tools handle administrative and research tasks that consume lawyer time but do not require legal judgment. Document review, intake triage, billing reconciliation, and first-draft research are the proven use cases. Substantive legal advice, court submissions, and strategic counsel require a practising solicitor. The Law Society of NSW guidance is explicit: AI chatbots and LLM-based tools cannot reason, understand, or advise.
How much does legal AI cost for a small law firm?
AI features built into practice management platforms like LEAP, Smokeball, and Clio are typically included in existing subscription costs at no additional charge. Standalone tools range from $19 per month for billing automation to $500 to $1,000 per month for enterprise document review platforms like Luminance. Custom AI builds for specific firm workflows start at around $5,000. Most small firms start with the AI features already bundled into their practice management software.
What is the ROI of AI for a law firm?
The Thomson Reuters ROI of Legal Tech and AI Report 2025 found that legal technology helps professionals save one to three hours per day on routine tasks like contract drafting, legal research, and matter management. For a partner billing at $600 per hour who recovers two hours of admin time daily, that represents roughly $300,000 in additional billable capacity per year. The Thomson Reuters Future of Professionals Report 2025 estimates AI will save professionals five hours per week on average, worth approximately $19,000 per professional annually.