On February 10, 2026, Judge Jed Rakoff issued a pivotal ruling in U.S. v. Heppner that underscores the legal risks of relying on AI tools for generating documents intended for attorney review.
Key Facts of the Case
- The defendant, a financial services executive accused of fraud, used Anthropic’s Claude AI tool to prepare 31 “Documents” relating to his legal situation.
- He later shared these AI-generated materials with his attorneys.
- Government investigators seized Defendant’s computer hardware and uncovered the Documents. The defense argued they were protected under attorney-client privilege and attorney work product protection, but the court disagreed.
Court Findings
- AI tools are not attorneys. Claude, like other AI tools, holds no law license and owes no professional duties to users. The tool’s terms of service explicitly disclaim any attorney-client relationship and note that user inputs are not confidential.
- Privilege cannot be applied retroactively. Documents a defendant creates independently and sends to his or her attorneys after the fact do not thereby become privileged.
- Maintaining privilege requires attorney involvement. The only potential way to preserve privilege over AI-generated materials is for the content to be created by, or at the express direction of, legal counsel.
Implications for Employers
This ruling carries important lessons for organizations that use AI in legal or sensitive business processes.
- Don’t assume AI outputs are confidential. AI-generated content, even when it relates to legal matters, is not automatically protected. If you want attorney-client privilege to apply, seek the advice of counsel before the project begins.
- Establish clear AI usage policies. Employers should define when and how AI tools can be used, especially for drafting legal documents, compliance memos, or risk assessments.
- Use secure, privacy-conscious tools. If attorneys plan to leverage AI for legal analysis, ensure the platform has strong confidentiality safeguards and operates under agreements that maintain privilege. Again, consult counsel before undertaking such projects.
- Train employees on privilege rules. Educate teams on attorney-client privilege and work product protections, including what does and does not qualify when using AI.
- Coordinate with legal counsel. Any AI-assisted drafting or other work intended for litigation or regulatory purposes should be done under the guidance of attorneys, not independently.
Conclusion
U.S. v. Heppner confirms that using AI, even for purposes involving legal matters, does not by itself create privilege protection. Employers and executives must treat AI outputs with caution, particularly in legal contexts. Work closely with counsel before beginning any such project to ensure privilege and confidentiality are preserved.
Brody and Associates regularly advises management on compliance with the latest local, state and federal employment laws. If we can be of assistance in this area, please contact us at info@brodyandassociates.com or 203.454.0560.