


Legal Services and Digital Technologies

Updated: Nov 11, 2018


Digital technology is advancing rapidly, driven by mounting investment and our growing dependence on it. Globalisation has carried that technology everywhere, making it more accessible and affordable. The combination of advancing, accessible and affordable technology is enabling almost every aspect of society to move online, including the legal industry.


Technology is clearly the future of the legal industry, given its positive impact on certain legal processes and procedures. Lord Woolf supported its integration when it was only beginning to take effect, writing in his report on access to justice that “appropriate technology is fundamental to the future of our civil justice system”. Now, in the 21st century, law firms, chambers and other legal services providers can apply technology directly to legal work, increasing efficiency and making some of the longest legal processes among the shortest. With this change, the new relationship between technology and legal services also brings challenges, particularly concerns about ethical responsibilities.


The legal sector in England and Wales has never had an official path for technology, and it has always adopted new tools cautiously, given the range of effects their use can have on the work lawyers do and on the result and impact of that work. Much of the current wave of adoption rests on predictive coding, the fundamental mechanism behind most of the technologies legal services providers have taken up. These systems were first deployed in the US justice system; Charles Hollander QC predicted their arrival in England, saying:


“At present, the population of documents identified is normally searched for relevance by lawyers or paralegals. In the United States electronic searching is beginning to be introduced. Tests have shown that it is more reliable than review by humans. No doubt this will be with us soon.”


In Pyrrho Investments Ltd v MWB Property Ltd, a decision was sought by both parties on the court’s view of the use of predictive coding programmes. The High Court ruled that use of such technology was acceptable and found:

“There is no evidence to show that the use of predictive coding software leads to less accurate disclosure being given than, say, manual review alone or keyword searches and manual review combined...”.


Master Matthews approved the use of predictive coding and noted that it promoted the overriding objective in Part 1 of the Civil Procedure Rules. This supports the observation that law firms and other legal businesses are using predictive coding not only for its efficiency, but also for its cost-saving potential.
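The mechanics behind predictive coding can be illustrated with a toy example. The sketch below is purely hypothetical (invented documents and a deliberately simple word-count model; real e-disclosure platforms use far more sophisticated classifiers): it trains on a small lawyer-reviewed seed set and then scores unreviewed documents for likely relevance.

```python
import math
from collections import Counter

def train(seed_docs):
    """Build per-label word counts from a lawyer-reviewed seed set.
    seed_docs: list of (text, label) pairs, label "relevant" or "irrelevant"."""
    counts = {"relevant": Counter(), "irrelevant": Counter()}
    for text, label in seed_docs:
        counts[label].update(text.lower().split())
    return counts

def score(counts, text):
    """Score an unreviewed document: positive suggests 'relevant'.
    Add-one-smoothed log-likelihood ratio of the two word models."""
    rel, irr = counts["relevant"], counts["irrelevant"]
    rel_total, irr_total = sum(rel.values()), sum(irr.values())
    vocab = len(set(rel) | set(irr))
    s = 0.0
    for w in text.lower().split():
        p_rel = (rel[w] + 1) / (rel_total + vocab)
        p_irr = (irr[w] + 1) / (irr_total + vocab)
        s += math.log(p_rel / p_irr)
    return s

# Hypothetical seed set, as a reviewing lawyer might label it.
seed = [
    ("merger agreement indemnity clause breach", "relevant"),
    ("contract warranty liability disclosure", "relevant"),
    ("office party catering invoice", "irrelevant"),
    ("holiday rota car park booking", "irrelevant"),
]
model = train(seed)
for doc in ["draft indemnity and warranty clause", "catering invoice for party"]:
    label = "relevant" if score(model, doc) > 0 else "irrelevant"
    print(doc, "->", label)
```

In practice this loop runs iteratively: the lawyer reviews the highest-scoring documents, corrects the labels and retrains, which is precisely where questions of lawyer proficiency in operating the system arise.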


The High Court decision has thus recognised the use of predictive coding technologies where they are more efficient and necessary, enabling legal services providers to feel more comfortable investing in and adopting them. It has also prompted academics, legal professionals and practitioners to consider the ethical responsibilities arising from the use of artificial intelligence systems and assistants. In America, a research facility has been created at Carnegie Mellon University specifically to explore the ethics of artificial intelligence.


Ravn Systems, which recently appeared at the new Legal Geek Conference in London, offers an example of an artificial intelligence system where ethical questions could be raised. Ravn’s engine, ‘ACE’ (Applied Cognitive Engine), can automatically analyse documents for due diligence and e-disclosure work, as well as conduct contract risk analysis. Matthew Whalley, Head of Legal Risk Consultancy at BLP, commented on the use of ACE:


“Our ‘contract robot’ can now finish in less than two seconds work which would have taken a team of people 100 days to complete.”


This demonstrates the impact artificial intelligence systems can have on the legal industry in terms of efficiency, but it could also affect the ‘human-based system’ on which law firms currently rely and operate. We could see a reduction in lawyer recruitment across the globe as artificial intelligence systems begin to complete work that associates would otherwise do. This technology is clearly going to effect significant change in the way the legal sector operates, and because of that, the regulatory bodies of the legal profession may begin to regulate the application and use of some technologies. According to Ray Worthy Campbell in “The Digital Future of the Oldest Information Profession”, heavy manual work could be diverted away from young lawyers, ultimately leaving too little work to justify and maintain current intake levels. This could cause a restructuring of the human-based law firm and bring increasing attention from partners to the ethical and regulatory issues surrounding advanced and disruptive technologies.


Artificial intelligence could also affect the work that experienced lawyers do daily; Ross Intelligence has developed and advanced technologies centred on legal research. Legal research is notoriously time consuming, even with the assistance of online databases such as Westlaw and LexisNexis. ROSS requires no substantial training: it can simply be asked questions about legal issues or points of law, and it will provide answers in seconds. William Caraher, Chief Information Officer at von Briesen & Roper, commented on the efficiency of using the artificial assistant:


“With ROSS, the associates at our firm can do on point research much faster and then quickly drill down on the main issues that help support the best possible outcome for our clients.”
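ROSS’s internal pipeline is proprietary, but the workflow described above can be caricatured as ranking a corpus of authorities by relevance to a natural-language question. The sketch below uses simple word overlap over invented case summaries, purely to illustrate the idea of question-driven retrieval; it is not a representation of how ROSS actually works.

```python
def rank(question, corpus):
    """Return case names ordered by word overlap with the question,
    dropping cases with no overlap at all."""
    q_words = set(question.lower().split())
    scored = []
    for name, summary in corpus.items():
        overlap = len(q_words & set(summary.lower().split()))
        scored.append((overlap, name))
    return [name for overlap, name in sorted(scored, reverse=True) if overlap]

# Hypothetical case summaries standing in for a research database.
corpus = {
    "Case A": "predictive coding approved for disclosure in high court litigation",
    "Case B": "landlord tenant repair obligation lease dispute",
}
print(rank("is predictive coding allowed for disclosure", corpus))
```

A production research assistant would layer natural-language parsing, citation networks and ranking models on top of this basic retrieve-and-rank shape, but the confidentiality question raised below applies even to the simple version: the client’s question itself is routed through the system.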


Whilst the use of this software is in its adolescence, ethical issues can arise from its use, one being the confidentiality of client information. Should an artificial body be able to receive and store privileged information? It is already understood that when lawyers use online storage facilities and other digital communications, they have an ethical, legal and regulatory responsibility to establish that their client’s information is sufficiently protected and that the use of the digital facility is necessary. The regulatory bodies for legal professionals may well establish similar rules for artificial intelligence systems in the future. The Ministry of Justice could set out a new practice direction on the use of such systems to ensure proficiency in their use and sustained protection of client information. At present, there is no specific, official or clear guidance on artificial intelligence systems and the other technologies currently in use, which is one reason ethical concerns are being raised.

This is why law firms are beginning to take steps to ensure that lawyers using artificial intelligence systems are competent with such technologies. It has been established that predictive-coding-based technology is no less reliable or accurate than the current or past methods used for due diligence, disclosure or contract liability analysis. For this to remain the case, lawyers must be sufficiently proficient in the use of such systems, both to let the technology work efficiently and to protect the best interests of lawyers’ clients.


At present, liability for the use of artificial intelligence systems in the provision of legal services is neither clear nor sufficient to protect consumers. Should a lawyer using predictive coding software fail to input the initial data correctly, the result would be distorted and ineffective. Although this has not yet happened in England, a practitioner in the Oklahoma Bankruptcy Court was suspended “for his continued failure to properly and accurately fill out electronic bankruptcy forms”. This was an online submission procedure in which he was found not to be reasonably proficient or competent, leading to suspension from the Bankruptcy Court until a disciplinary hearing. To protect litigants and regulate lawyers, it could soon be necessary for the Ministry of Justice to set out guidelines for the use of technology assisted review. The Technology and Construction Solicitors’ Association has already published a “Guide to eDisclosure” for its members, setting out basic guidance on the use of technology assisted review or predictive coding software and demonstrating the need and demand among lawyers for such guidance and regulation.


If technologies were to become more significant in the everyday operation of law firms, action would be more likely to be taken to regulate their use. Questions will be raised such as: who is liable for the result or answer of an artificial intelligence system? The answer will be key to client security. If a client’s case were to suffer as a result of a lack of proficiency or negligence in the use of an artificial system, an identifiable individual or body must be responsible and liable.


As an artificial intelligence system can only do as instructed, it would be logical to presume that the lawyer operating it and inputting the data is responsible for the result the system reaches. A lawyer using such a system needs a certain level of competence to ensure that the result it achieves is a reasonable and likely one. Without guidance on that level of competence, however, users of such technologies cannot accurately gauge whether a lawyer is sufficiently competent. To ascertain whether a lawyer was reasonably competent in using artificial intelligence technology, should something go wrong, official guidelines must be provided so that lawyers can act in the best interests of their clients. Although no such liability has yet been clearly established, it remains a significant ethical concern for professionals within the legal sector.


The legal services industry has also seen the rise and regulation of Alternative Business Structures (ABSs), which allow businesses to provide unreserved work, such as legal advice, under the Legal Services Act, with regulation provided (subject to approval) by The Law Society. ABSs are an alternative to the traditional, strictly legal-orientated law firm, acting as a legal services provider alongside other useful services and goods. Some ABSs have taken on law graduates to provide such legal services and unreserved work, reducing staffing costs. This is another factor contributing to the rise of a consumer legal market, alongside the provisions of the Legal Services Act and the ability to source and use technologies that expedite legal processes and procedures, increasing profitability and accessibility through the volume of work that can be completed in a short period.


Entrepreneurs can now create innovative business models around legal services and hire staff at lower cost, keeping their overheads below those of a traditional law firm. They are also usually funded by capital investment, enabling the purchase of digital technologies that assist staff with their work, increase efficiency and reduce costs. This can threaten the existence of, and need for, law firms from the high street to the magic circle, which could be one reason such firms are becoming actively involved in adopting technology to remain competitive and forward thinking.


Traditional law firms have become more focused on costs and aim to develop efficiency and progression within staff and systems; larger law firms and companies with in-house legal departments are paying more attention to the technology market and to the legal tools arriving on it that can effectively assist their workforce. As mentioned above, this could be strategic: a way of remaining necessary within the industry and competitive amid the rise of alternative business structures.


As artificial intelligence has developed significantly in such a short period, the industry has seen a sudden influx of technology, which is becoming ever more integrated into the provision of legal services once conducted only by humans with legal qualifications. Because of this influx of artificial intelligence assistance, ethical queries and concerns have been voiced, as the current codes of conduct for solicitors and barristers are worded and structured for humans and human thought, not artificial systems.


It can be concluded that the increasing adoption of technologies in the provision of legal services is due to an abundance of factors. The advancement and impact of technology on the legal sector is changing the global market. The effects of introducing and using such technologies are redefining the legal industry’s structure, with an impact great enough, possibly, to cause a complete restructuring of the traditional law firm.

It is clear, however, that those within the legal industry, and academics teaching and reading the law, are increasingly concerned with the ethical issues arising from the use of these technologies, and the lack of official guidance and regulation concerning their use remains an open question.


Fraser Matcham