If you use AI tools to help manage legal matters, whether to draft documents, seek information or review advice you have received, two significant court decisions from early 2026 suggest you may be doing so at your peril.
Legal professional privilege is one of the most important protections available to anyone involved in legal proceedings. Put simply, it means that confidential communications between a client and their lawyer are protected from disclosure to the other side. Once privilege is lost, it cannot be recovered.
Both courts and regulators are now making clear that feeding information into publicly available AI tools can destroy that protection entirely.
What happened in the US: United States v Heppner
In February 2026, a New York judge ruled that documents created by a defendant using a publicly available AI tool were not legally privileged and were therefore disclosable to the prosecution.
The defendant had input information about his dealings with his legal advisers into the AI tool, in what appeared to be an attempt to help prepare his defence. The court found that the resulting documents were not protected for three reasons.
- They were not themselves communications between the defendant and a lawyer.
- They could not be treated as confidential, because the AI tool’s terms and conditions expressly reserved the right to disclose user data to third parties, including government authorities.
- The AI tool itself expressly disclaimed the ability to give legal advice, meaning the conversations could not be characterised as being for the purpose of obtaining it.
The judge’s conclusion was unambiguous: the use of AI does not place information outside the reach of established legal principles, including those governing privilege.
What happened in the UK: The Upper Tribunal warning
At almost the same time, the Upper Tribunal in England and Wales issued a pointed warning to legal professionals, one that carries equally important implications for their clients.
In a judgment published in February 2026, the Tribunal stated that uploading confidential documents into a publicly available AI tool, such as ChatGPT, amounts to placing that information in the public domain. The consequence is a breach of client confidentiality and a waiver of legal privilege, with potential referrals to the Information Commissioner’s Office to follow.
The judgment also addressed the risk of AI-generated misinformation, so-called “hallucinations”, and the supervisory obligations this places on legal professionals. But it was the privilege point that represented genuinely new territory in this jurisdiction.
Why this matters for you
These cases are directed, in the first instance, at legal professionals and how they use AI in their work. But the implications extend to clients too.
If you are involved in a dispute, investigation or legal proceedings of any kind, be cautious about using publicly available AI tools to process or discuss anything connected to that matter. That includes drafting documents, summarising advice you have received, or asking an AI to help you think through your legal position.
The UK judiciary’s own guidance on AI, updated in October 2025, puts it plainly: any information entered into a public AI chatbot should be treated as published to the world.
The practical risk for anyone involved in litigation or legal proceedings is real. If the opposing party can obtain access to documents you thought were protected, because you processed them through a publicly available AI tool, the consequences for your case could be severe.
What you should do
If you are involved in any legal matter, whether a commercial dispute, an employment issue, regulatory proceedings or anything else, keep the following in mind:
- Treat AI tools as public spaces. Anything you share with a publicly available AI tool should be considered potentially disclosable. Do not input privileged communications, legal advice you have received, or confidential documents connected to your case.
- Speak to your legal team before using AI. If you want to use AI tools to assist with your matter in any way, seek advice on whether doing so is safe. More secure, bespoke tools do exist, and your advisers can help you understand the risks.
- Do not use AI as a substitute for legal advice. AI tools cannot advise you on your specific legal position, and courts have confirmed this. If you have questions about a legal matter, those questions are best directed to a qualified solicitor.
Speak to our Dispute Resolution solicitors
Daniel Brumpton is a Partner and heads up our expert Dispute Resolution team. He specialises in professional negligence claims, advising on mishandled litigation, business and personal tax advice, and pension disputes.
Legal professional privilege is a vital protection. Losing it through inadvertent use of AI could significantly weaken your position in any legal proceedings.
Our dispute resolution solicitors have extensive experience advising clients across commercial litigation, employment disputes and regulatory matters. If you have questions about how to protect your legal interests, we are here to help.
Call us on 0800 024 1976 or complete our online enquiry form. Our offices are in Nottingham, Derby and Leicester, and we advise clients across the East Midlands and nationally.
If this article relates to a specific case or cases, please note that the facts of the case or cases are correct at the time of writing.