Should You Obtain Legal Advice From ChatGPT?

“It's generally not a good idea to rely solely on ChatGPT for legal advice, especially for important or high-stakes matters.” – ChatGPT
Although AI tools such as ChatGPT are becoming more common in everyday life, it is unwise to rely on ChatGPT for legal advice, for several reasons. AI applications can help explain legal concepts in plain language, provide general information about laws and procedures, and offer educational guidance on legal issues, but relying on AI for legal advice or to draft documents or submissions can lead to unforeseen problems.
AI applications, including ChatGPT, are obviously not licensed to practice law. More importantly, lawyers offer several protections to clients if a problem arises.
If an AI application makes a mistake that costs you money or results in you losing your case, there is no insurance to correct the mistake or compensate you. In contrast, licensed lawyers carry professional liability insurance that protects clients.
If you ask ChatGPT about your legal issue, there is no client confidentiality: your searches and questions could be disclosed to others by court order, warrant, or otherwise, and you could be forced to reveal what you have searched to the other side in your dispute. There is no lawyer-client relationship, and no privacy at all, with ChatGPT. Communications with a lawyer, by contrast, are protected by solicitor-client privilege, and neither you nor your lawyer can be forced to disclose them.
ChatGPT owes no loyalty to you or anyone else and can provide advice, including conflicting advice, to anyone, including the people with whom you have your dispute. Although unlikely, it is possible that ChatGPT could use information you provide to modify the advice it gives to other people, potentially the very people you are in a dispute with. In other words, ChatGPT could use your information to hurt your case directly. Lawyers, on the other hand, owe a duty of loyalty to their clients and cannot act against a client’s or former client’s interests.
ChatGPT does not have access to current or updated legal cases and commentary, and it can supply outright wrong information. AI programs have even provided people with fake court cases for use in disputes. In one widely publicized British Columbia family law case, a lawyer was reprimanded by the judge for mistakenly citing, in a draft court document, fake case law that had been invented by ChatGPT. The court in that case referred to a 2024 study finding that AI programs suffered from “alarmingly prevalent legal hallucinations” and failed to correct users’ incorrect legal assumptions. In other words, ChatGPT is prone to serious errors that can be difficult to detect, even for lawyers.
Accordingly, despite the usefulness of AI applications like ChatGPT, it can be dangerous to rely on them for legal advice. When in doubt, consult a lawyer.