As an in-house lawyer, your legal tasks for the day can range from the simple to the very complex. It’s no wonder that in-house teams everywhere are looking at legal technology to unburden lawyers of simple legal tasks. Augmented intelligence (or intelligence augmentation (IA), to avoid confusion with AI) is not a new technology. Rather, it’s a more realistic representation of the current state of legal technology.

Current legal technology is definitely not able to replicate the legal mind, even in respect of simple legal decisions. Instead, it is only able to automate specific legal tasks with human assistance, hence the use of the term ‘augmentation’. This distinction is important in understanding the strengths and limitations of legal technology. Further, the distinction has particular importance for understanding your ethical obligations as a solicitor when using legal technology to automate legal tasks in your organisation.

Hands-Off? Not Quite Yet

Many people hope (and some fear) that artificial intelligence will remove the need for human lawyers, much like the driverless car. Unfortunately (or thankfully, depending on your perspective), lawyers will need to keep one hand on the wheel. Augmented intelligence still requires a human legal mind to refine, train and analyse the software. Two examples of augmented intelligence in the legal industry include:

  1. predictive coding in discovery; and
  2. contract review software.

Predictive Coding

As many lawyers know, discovery can be a mammoth legal project, whereby lawyers need to identify all relevant documents from a much larger set. In predictive coding, software developers have mapped the logic of this identification process into software.

The software examines the content and metadata of each document and identifies characteristics (such as the presence or absence of certain words and phrases, or relationships between words and dates) that indicate whether the document is more likely to be relevant or irrelevant.

However, the software still requires a lawyer to provide it with the documents from which it can examine the content and metadata. More importantly, lawyers must provide examples of coding for relevance that enable the software to identify the characteristics unique to that matter. Hence the term ‘machine learning’: the software ‘learns’ and gets better at identifying relevant documents thanks to the lawyers’ input.
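To make that training loop concrete, here is a minimal sketch in Python of how a relevance classifier of this kind could work. The library (scikit-learn), the sample documents and the relevance labels are all illustrative assumptions, not a description of any particular product; real predictive coding platforms train on far larger document sets and more sophisticated models.

```python
# A minimal sketch of supervised 'predictive coding': lawyers code a small set
# of documents as relevant/irrelevant, a model learns the distinguishing
# characteristics, and new documents are scored for likely relevance.
# Assumes scikit-learn is installed; documents and labels are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Lawyer-coded training examples (1 = relevant, 0 = irrelevant).
training_documents = [
    "Email re breach of supply agreement and liquidated damages",
    "Board minutes discussing termination of the distribution contract",
    "Office social club newsletter - Friday drinks",
    "IT helpdesk ticket: password reset request",
]
training_labels = [1, 1, 0, 0]

# Turn document text into word-frequency features, then fit a classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(training_documents, training_labels)

# Score unreviewed documents: a higher probability means more likely relevant.
new_documents = [
    "Letter of demand alleging breach of the supply agreement",
    "Reminder: fire warden training next Tuesday",
]
for doc, prob in zip(new_documents, model.predict_proba(new_documents)[:, 1]):
    print(f"{prob:.2f}  {doc}")
```

The lawyers’ coded examples are the training data; every additional coded document gives the model more to learn from, which is the sense in which the software improves with lawyer input.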

Contract Review Software

There are various manifestations of contract review software, but, generally, the software can:

  • classify clauses (for example, classifications such as ‘change of control’ and ‘assignment’);
  • pull information from clauses; and
  • identify clauses which are consistent or inconsistent for specific attention by lawyers.  

Lawyers can provide human assistance, as with the machine learning process above.  Alternatively, the software may have pre-loaded rules for certain types of clauses with consistent phrasing.
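As an illustration of the ‘pre-loaded rules’ approach, the sketch below classifies a clause by matching keyword patterns. The clause types, patterns and sample clause are assumptions for illustration only; commercial contract review tools use far richer rules and machine-learned models.

```python
# A minimal sketch of rule-based clause classification of the kind a contract
# review tool might ship 'pre-loaded'. Clause types and patterns are
# illustrative assumptions, not any vendor's actual rule set.
import re

CLAUSE_RULES = {
    "change of control": re.compile(r"change\s+of\s+control", re.IGNORECASE),
    "assignment": re.compile(r"\bassign(s|ment|ed)?\b", re.IGNORECASE),
    "governing law": re.compile(r"governed\s+by\s+the\s+laws?\s+of", re.IGNORECASE),
}

def classify_clause(clause_text: str) -> list[str]:
    """Return the clause types whose keyword patterns appear in the text."""
    return [label for label, pattern in CLAUSE_RULES.items()
            if pattern.search(clause_text)]

clause = ("Neither party may assign this agreement without consent, "
          "including on a change of control of either party.")
print(classify_clause(clause))  # ['change of control', 'assignment']
```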

The above processes are referred to as ‘supervised learning’, whereby lawyers act as the supervisors to the software’s learning. Some newer software is also using ‘unsupervised learning’, meaning it identifies relationships that exist in the documents without guidance. This is useful when dealing with unstructured documents, where you are not sure what relationships exist.
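For contrast, here is a minimal sketch of unsupervised learning over documents: no labels are supplied, and the algorithm simply groups documents by textual similarity. Again, scikit-learn and the sample documents are illustrative assumptions rather than any product’s actual method.

```python
# A minimal sketch of 'unsupervised learning' over unstructured documents:
# no lawyer-provided labels; the algorithm groups documents by similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "Lease of premises at 1 Example Street, rent payable monthly",
    "Renewal of the office lease and rent review mechanism",
    "Employment agreement - salary, leave and confidentiality",
    "Offer of employment and probation period terms",
]

# Vectorise the documents and group them into two clusters.
features = TfidfVectorizer().fit_transform(documents)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for doc, cluster in zip(documents, clusters):
    print(cluster, doc)  # documents with similar wording share a cluster number
```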

Get Your Hands Dirty

Although you can delegate some legal tasks with augmented intelligence, you cannot delegate your ethical obligations.  

Your primary ethical obligation is to your client (your organisation) to ensure:

  • compliance with the law;
  • good corporate governance; and
  • robust legal compliance processes and procedures.  

The point is that, when adopting legal technology, it is your responsibility to ensure it is properly implemented and managed. Accordingly, you need to identify the legal risks of augmented intelligence.

Software never tires, never gets distracted by Facebook and never deviates from its programming. But while software can eliminate human error and inconsistency, it can also compound human error. Improper training or a misunderstanding of the software’s limitations can lead to incorrect outcomes. Further, some of the training will have been carried out by third parties, such as the pre-loaded rules in contract review software.

It is, therefore, essential to know what training has occurred and be comfortable that the training has been undertaken properly. In addition, the people designing and writing the software are humans and humans are fallible. Keeping this in mind will mean you can correctly assess the legal risks of augmented technology and take steps to ensure it helps, rather than hinders, you.

Clean Hands Are Safe Hands

However, you don’t have to read every line of code, understand algorithms or review every output of the software. This would be impossible for most lawyers and would undermine the efficiency gains you’re achieving with the software.

Rather, consider how you outsource legal work to other firms: when you do, you are still responsible for the task within your organisation. Therefore, it’s best practice to put processes in place to ensure you:

  • have a clear selection strategy to identify the right law firm with proven credentials;
  • communicate clear instructions and expectations and brief the firm with the relevant documents;
  • ensure the law firm clearly sets out their assumptions, methodology and reasoning; and
  • review the final work product.

Adopting these processes will allow you to understand and manage the legal risks of using legal technology and augmented intelligence. The same processes can also apply to choosing, and working with, a software provider.

Clear Selection Strategy

Just like when you’re evaluating a law firm, you also need to understand the track record of the software and examples of its use in the market. It’s also best practice to talk to users of the software to get direct feedback.

Clear Instructions, Expectations and Brief

You need to make your needs and expectations clear from the outset. Ask the software provider to step through some relevant real-life examples. Even better, brief them beforehand with some examples of tasks you have previously completed, for them to demonstrate through their software. This will give you a reference point when evaluating different software providers.

Assumptions, Methodology and Reasoning

It is important to clearly work through the assumptions, methodology and reasoning behind the software.

Do not be afraid to ask the provider to clarify their answers. The meeting of law and technology has had the frustrating side effect of combining jargon from each field and creating its own set of jargon.  

The purpose of this exercise is not only to understand what the software can do but, more importantly, to understand its limitations. This will enable you to properly manage those limitations if you decide to adopt their software.

Review the Final Work Product

With augmented intelligence, it is often not feasible to review every output of the software, especially in the context of high volume legal tasks.  

Instead, to ensure quality and compliance with your ethical obligations, you can conduct spot checks of both the:

  • input you are using to train the software; and
  • output.  

For predictive coding in particular, these kinds of spot checks have been accepted by some Australian courts as a way to demonstrate the accuracy and reliability of augmented intelligence.  

For example, you might regularly spot check the software’s coding while training it, evaluating its effectiveness until it reaches a level of accuracy and reliability you are happy with.
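A spot check of this kind can be reduced to a simple sampling routine. The sketch below draws a random sample of the software’s coding decisions, compares them against a lawyer’s own review, and reports the agreement rate. The sample size, agreement threshold and simulated inputs are all assumptions for illustration, not a court-endorsed protocol.

```python
# A minimal sketch of a spot check: draw a random sample of the software's
# coding decisions, have a lawyer re-code the sample, and measure agreement.
# Sample size and agreement threshold are illustrative assumptions.
import random
from typing import Callable

def spot_check(software_codes: dict[str, bool],
               lawyer_review: Callable[[str], bool],
               sample_size: int = 50,
               required_agreement: float = 0.95) -> bool:
    """Return True if lawyer agreement on a random sample meets the threshold."""
    doc_ids = random.sample(list(software_codes),
                            min(sample_size, len(software_codes)))
    agreed = sum(1 for doc_id in doc_ids
                 if lawyer_review(doc_id) == software_codes[doc_id])
    agreement_rate = agreed / len(doc_ids)
    print(f"Agreement on {len(doc_ids)} sampled documents: {agreement_rate:.0%}")
    return agreement_rate >= required_agreement

# Example run with simulated inputs: the software's coding for 500 documents,
# and a stand-in 'lawyer' who disagrees with roughly one document in twenty.
software_codes = {f"DOC-{i:04d}": (i % 3 == 0) for i in range(1, 501)}
simulated_lawyer = lambda doc_id: (software_codes[doc_id]
                                   if int(doc_id[-4:]) % 20
                                   else not software_codes[doc_id])
passes = spot_check(software_codes, simulated_lawyer)
```

In practice the check would be repeated through training, with the software retrained and re-sampled until the agreement rate stays above your chosen threshold.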

Key Takeaways

Augmented intelligence is an emerging area of legal technology that is providing significant efficiency and productivity gains. However, as augmented intelligence involves the automation of legal decision making and still requires the assistance of a human lawyer, it cannot be treated as simply another technology implementation. Instead, its implementation and use have to be carefully managed.

If you would like to explore how legal technology can help your in-house team, get in touch on 1300 544 755 or fill out the form on this page.
