Your AI Chats Could Be Used Against You
Artificial intelligence (AI) tools like ChatGPT, Copilot, and Claude are increasingly used by individuals and businesses to draft documents, analyze cases, and think through legal issues. A recent federal court ruling, however, showed that communications with AI tools are not confidential and may be discoverable in litigation or criminal investigations.
Lawyers are advising clients to exercise caution when using AI for anything related to their case.
This year, a federal judge in New York ruled that a former financial executive could not withhold documents generated through an AI chatbot from prosecutors pursuing fraud charges. The court rejected the argument that the materials were protected by attorney‑client privilege, saying that no attorney‑client relationship exists between a person and an AI platform.
Attorney‑client privilege can be lost if legal advice or confidential information is voluntarily shared with someone other than a lawyer. As AI becomes more prevalent, most courts are treating these tools as third parties for privilege purposes.
Judges have also noted that AI providers’ terms of use typically state that users should not expect confidentiality and that data may be shared under certain circumstances.
It is important to note that courts are not yet uniform in how they treat AI‑generated materials. In a separate federal case in Michigan, a different judge ruled that a self‑represented plaintiff did not have to turn over her AI chats, treating them as personal work product rather than discoverable communications.
These conflicting rulings show that the law in this area is evolving and that clear, consistent rules have not yet emerged. Until they do, using AI in legal matters carries real risk.
In response to the ruling, many law firms have issued client warnings and updated engagement agreements to address AI use. Common guidance includes not using AI tools as a substitute for legal advice and not sharing confidential facts about an investigation, dispute, or legal strategy with public AI platforms. Do not upload or summarize communications from your lawyer into an AI tool, as doing so may waive attorney‑client privilege. It is also important to exercise caution even with "closed" AI systems, as their legal protections remain largely untested.
Where AI tools are used in a legal context, lawyers increasingly recommend doing so under the direction and supervision of counsel. Even then, AI should be viewed as a tool, not as a person or an advisor.
AI can offer efficiency and convenience, but when legal exposure is at stake, it carries significant risk. Courts have made it clear that what you type into an AI chatbot may be accessed, reviewed, and used by others.
Until courts provide clearer guidance, the safest approach remains:
If the matter involves potential legal liability, speak only with your lawyer, not AI.
Source: money.usnews.com