ChatGPT Sued for Unauthorized Practice of Law: What It Means for Legal Tech

April 7, 2026

On March 4, 2026, Nippon Life Insurance Company of America sued OpenAI in the Northern District of Illinois, alleging that ChatGPT crossed the line from providing information into practicing law without a license.

The case, Nippon Life Insurance Co. of America v. OpenAI Foundation and OpenAI Group PBC, No. 1:26-cv-02448 (N.D. Ill.), is a federal lawsuit accusing a major AI developer of unauthorized practice of law through a consumer chatbot.

The case is in its earliest stages. The allegations are unproven, and OpenAI has stated the complaint "lacks any merit whatsoever." But regardless of outcome, the case forces a question the legal technology industry has been deferring: when does an AI tool cross from information into counsel, and who is liable when it does?

Why This Matters for Legal AI Product Design

The facts are relatively simple. A disability claimant who had settled her case with Nippon Life uploaded her attorney's correspondence to ChatGPT and asked whether she was being gaslighted. ChatGPT concluded she was. She fired her lawyer and, acting pro se with ChatGPT as her drafting tool, filed nearly 50 motions over the next year, including one citing a case that does not exist. Nippon claims it spent roughly $300,000 responding to filings it characterizes as frivolous.

What matters for legal technology is not the individual facts but the product design question they expose. ChatGPT is a general-purpose tool. It has no concept of jurisdiction, no understanding of case status, no awareness that a settlement agreement is binding, and no mechanism to verify that the citations it generates correspond to real cases. Yet it provided legal analysis, drafted court filings, and guided litigation strategy, activities that many states may treat as the practice of law.

The complaint brings three claims: abuse of process, tortious interference with contract, and unauthorized practice of law (UPL). That third claim, UPL against a software company, is the one the legal technology industry should be watching most closely.

Disclaimers Are Not Product Design

OpenAI updated its terms of use in October 2025 to prohibit users from seeking legal advice through ChatGPT. Nippon points to that update as evidence that OpenAI recognized the risk and chose a contractual fix over a product-design fix.

This distinction, between disclaiming liability and designing against it, is the core architectural question for every legal AI product on the market.

A terms-of-service clause that says "don't use this for legal advice" does not change what the product does. It changes who the company blames when the product does it anyway. The product still generates legal arguments. It still drafts court filings. It still tells users how to proceed in specific disputes. The disclaimer shifts responsibility to the user without changing the system's behavior.
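The distinction can be made concrete. A disclaimer is text appended to output the system produces anyway; a design constraint changes what the system will produce in the first place. A minimal sketch of the two approaches, with all function names, marker phrases, and messages hypothetical (this is not OpenAI's or FlowCounsel's actual code):

```python
# Hypothetical illustration: a disclaimer decorates output that still
# gets generated; a design constraint prevents that output entirely.

# Illustrative phrases that suggest a request for legal strategy.
LEGAL_ADVICE_MARKERS = ("should i sue", "draft a motion", "is my lawyer")

def generate_answer(query: str) -> str:
    # Stand-in for a model call; a real system would invoke an LLM here.
    return f"General information about: {query}"

def disclaimer_approach(query: str) -> str:
    # Contractual fix: the system answers anyway, then appends a footer
    # that shifts responsibility to the user.
    return generate_answer(query) + "\n\n(Not legal advice.)"

def design_approach(query: str) -> str:
    # Product-design fix: requests for legal strategy never reach the
    # model; the user is routed toward a licensed professional instead.
    if any(marker in query.lower() for marker in LEGAL_ADVICE_MARKERS):
        return ("This looks like a request for legal advice. "
                "Here are attorneys and legal aid resources in your area.")
    return generate_answer(query)
```

Under the first approach the system's behavior is unchanged; only the blame moves. Under the second, the behavior itself is constrained.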

New York's Senate Bill S7263, introduced in April 2025 and advanced again in 2026, shows where this debate is heading. The bill bars AI chatbots from providing substantive responses, information, or advice that would constitute the unauthorized practice of law, creates a private right of action for damages, and makes clear that disclaimers are not enough by themselves.

How We Built FlowCounsel and FlowLawyers Differently

We designed FlowCounsel and FlowLawyers with these boundaries in the product architecture, not in the terms of service.

Attorney supervision, not attorney replacement. FlowCounsel is built for attorneys. Our tools for matter management, pro bono coordination, CLE tracking, and document workflows make attorneys more efficient; they do not replace attorney judgment. On the consumer side, FlowLawyers routes users toward attorneys and legal aid, not away from them.

Statute-grounded, not opinion-generating. When FlowLawyers provides state-specific legal information, it cites the statute. It does not tell users what to do with that information. When it generates a demand letter, it grounds the content in the user's state's actual statutes, and every letter includes a recommendation to have an attorney review it before sending.

Routing, not advising. FlowLawyers helps users understand their situation and connects them to the right resource: an attorney, a legal aid organization, a pro bono lawyer, or a document tool. The goal is to get people to the right professional faster, not to become that professional.

This is a product design choice. The difference between a tool that provides legal information with attorney routing and a tool that generates legal strategy for active litigation is not a matter of disclaimers. It is a matter of what the system is built to do.
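The routing-first shape described above can be sketched in a few lines. This is an illustrative simplification, not FlowLawyers code: the type, the statute table, and the handler are all hypothetical, and the one statute cited (Illinois's Security Deposit Return Act, 765 ILCS 710) is included only as an example of statute-grounded output.

```python
# Hypothetical sketch of routing-first design: classify the request,
# then either cite a statute (information) or hand off to a resource
# (referral). The system never generates litigation strategy.
from dataclasses import dataclass

@dataclass
class Response:
    kind: str        # "information" or "referral"
    body: str
    citation: str = ""

# Illustrative statute lookup keyed by (state, topic).
STATUTES = {
    ("IL", "security_deposit"): "765 ILCS 710/1",
}

def handle(state: str, topic: str, wants_strategy: bool) -> Response:
    if wants_strategy:
        # Litigation strategy is attorney work: route, don't advise.
        return Response("referral",
                        "Connecting you with an attorney or legal aid.")
    citation = STATUTES.get((state, topic), "")
    if citation:
        return Response(
            "information",
            f"Under {citation}, here is the relevant statutory text. "
            "Have an attorney review before acting.",
            citation,
        )
    # No grounding available: refer rather than improvise an answer.
    return Response("referral",
                    "We couldn't find a statute; here are local resources.")
```

The point of the sketch is the branch structure: every path ends in either a cited statute with an attorney-review recommendation or a referral, and no path ends in advice.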

What Legal Tech Builders and Attorneys Should Take From This

For legal tech builders: The era of operating in gray areas with nothing more than a terms-of-service disclaimer is closing. State legislatures are actively writing rules. Courts are hearing cases. The companies that designed their products with UPL boundaries in the architecture, not the footer, will be in the strongest position.

For attorneys: Clients are already using ChatGPT and similar tools for legal guidance, whether lawyers ask about it or not. Ask them. Explain the difference between general legal information and advice tailored to their specific facts. Explain that AI models fabricate citations, ignore jurisdictional differences, and owe no fiduciary duty. The value of a licensed attorney has not changed. What has changed is that clients now have access to tools that sound authoritative, respond confidently, and never send an invoice.

For everyone: Regulation by court doctrine alone is structurally inadequate for this problem. Court opinions arise only when institutional actors have the resources to litigate them, which means the cases that define the boundaries of AI-assisted legal practice will tend to reflect the interests of insurers, large companies, and well-resourced firms. Legislative bodies are better positioned to draw these lines proactively, with public input, rather than reactively through litigation that only the well-resourced can bring.

The question is not just whether ChatGPT practiced law. It is who gets to decide what that means, and whether we will have an answer before the harm compounds or only after.


FlowCounsel is the AI-native operating system for legal teams. FlowLawyers is the consumer-facing legal help platform with attorney discovery, legal aid routing, state-specific legal information, and document tools. Neither provides legal advice. Attorney supervision of all AI output is required.
