Late last year, at a small suburban office in Darwin that looks much like a suburban law firm, a chatbot, the Artificially Intelligent Legal Information Research Assistant (Ailira), started helping clients make a will for $150.
A bot is a conversational interface that uses programmed logic and, in some cases, machine learning to determine how to interact within a specified topic or function, producing a conversational exchange with a machine. You have probably interacted with a bot on multiple occasions without realising it.
Will makers enter answers prompted by Ailira's questions, and can also put questions to Ailira. Through this process Ailira assists them to create a simple will, or refers them to a lawyer if a more complex document is required. Office staff are on hand to pick up on issues such as lack of capacity or undue influence. Concerns have been raised regarding assessment of the testator's capacity: can a staff member properly gauge a person's capacity to make a will through a chatbot? Questions might also arise if Ailira's programming contains errors and important aspects of the will are omitted.
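The prompted question-and-answer flow described above can be sketched as a simple rules-based script. The questions, field names, and the referral rule below are hypothetical illustrations of how such a system might decide between drafting a simple will and referring the client to a lawyer; they are not Ailira's actual logic.

```python
# Hypothetical sketch of a rules-based will interview.
# Answers either fill in details of a simple will or, if they flag a
# complex situation, trigger a referral to a lawyer.

REFERRAL_TRIGGERS = {"blended_family", "overseas_assets", "trust_required"}

def run_interview(answers: dict) -> dict:
    """Decide, from the collected answers, whether to draft or refer."""
    flags = {k for k, v in answers.items() if v and k in REFERRAL_TRIGGERS}
    if flags:
        return {"action": "refer_to_lawyer", "reasons": sorted(flags)}
    return {
        "action": "draft_simple_will",
        "testator": answers.get("full_name"),
        "executor": answers.get("executor"),
    }

result = run_interview({
    "full_name": "Jane Citizen",
    "executor": "John Citizen",
    "blended_family": False,
    "overseas_assets": True,   # complex situation, so the bot refers out
})
print(result["action"])  # -> refer_to_lawyer
```

The point of the sketch is that the "intelligence" is just branching on answers, which is why the article later notes that most chatbots are rules-based rather than AI-driven.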
In Australia there is a prohibition on a person who is not a legal practitioner engaging in legal practice. Concerns have been raised that non-lawyers have been using will-drafting software like Ailira: the software prompts clients to answer pre-generated questions and automatically generates the will, without the will maker ever meeting a solicitor.
Ailira's creator, a legal practitioner, believes the bot is more like a will kit you can purchase from a newsagent, and therefore is not engaging in legal practice because it is "legal stationery".
However, this raises questions about the distinction between legal information ("legal stationery") and legal advice. Courts decide these issues on the facts of each case; but as digital technology continues to broaden, regulators will need to scrutinise new delivery systems in the interests of consumer protection, and what the consumer thinks they are getting from such a service would be an important consideration.
In Attorney General at the Relation of the Law Society of Western Australia v Quill Wills Ltd & Ors, Quill Wills was a company that produced 'do-it-yourself' will kits. It claimed that it was not providing legal advice; however, its representative assisted clients to select clauses held in a computer program, which were then drafted into the will.
The court held that the defendants were drawing and preparing a document
“relating to or in any manner dealing with or affecting real or personal estate or interest therein or any proceedings at law, civil or criminal or in equity.”
Quill Wills had gone beyond “merely giving abstract information as to legal rules and was assisting in the production of a will appropriate to the individual circumstances of the customer”.
There are now websites that provide legal forms and documents across a variety of areas of law, ranging from simple templates that customers access and personalise themselves to websites that generate a document specifically for the user. It may be as simple as the user entering details when prompted and a legal document being generated containing those details, which is then purchased and downloaded.
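The "enter details, generate document" model described above amounts to substituting user input into a pre-drafted template. A minimal sketch, assuming a hypothetical template and field names (the clause wording is illustrative only, not a real will precedent):

```python
# Sketch of prompted document generation: user details are substituted
# into a pre-drafted template. Template text and fields are hypothetical.
from string import Template

WILL_TEMPLATE = Template(
    "This is the last will of $full_name of $address. "
    "I appoint $executor as my executor."
)

def generate_document(details: dict) -> str:
    # safe_substitute leaves any missing placeholder visible (e.g. "$executor")
    # rather than raising, so gaps can be caught on review.
    return WILL_TEMPLATE.safe_substitute(details)

doc = generate_document({
    "full_name": "Jane Citizen",
    "address": "1 Example St, Darwin",
    "executor": "John Citizen",
})
print(doc)
```

Note that the program itself exercises no judgment about whether the resulting document suits the user's circumstances, which is precisely the gap the Quill Wills reasoning turns on.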
If this technology is used by non-lawyers to provide services directly to the public, it may create consumer protection concerns. Although Quill Wills was decided before the recent advances in artificial intelligence, such technologies may drive higher insurance premiums for the legal profession as a whole, while non-legal professionals may discover that their activities are barred by legislation and not covered by their own professional indemnity insurance.
Software, for better or worse, is cheaper and faster. Artificial intelligence ("AI") software is already doing discovery, due diligence, drafting and precedent management: jobs that used to be performed by legal practitioners and law graduates.
As with the online generation of documents, law firms using AI in their legal work are covered by professional indemnity insurance, which works to protect the legal practitioners and the clients. That same level of protection is unlikely to exist should non-lawyers be permitted to provide similar technology directly to the public.
However, most chatbots are not driven by artificial intelligence but by rules-based processing; in a support setting, they are fundamentally following the same rules a human agent would have to follow. This means it may make little difference in execution whether a bot or a human is conducting the interaction.
Although it is happening more slowly, technology is transforming the legal profession, enabling those who adopt it to provide better and more cost-effective legal services and representation for their clients. Importantly, because of regulations that ultimately protect clients, and the professional judgment and expertise of lawyers, the legal profession will not be made obsolete by technology.