Last week, doubters of legal AI tech had a fleeting moment of validation – or at least they thought so – after a major ediscovery project involving United Airlines appeared to go wrong. Some suggested there had been an ‘AI Snafu’ after only 17% of the millions of docs analysed turned out to be ‘responsive’, i.e. of any potential use to the case.
The problem with pointing the finger at the technology is that experts think (see the response below from leading ediscovery consultant Jonathan Maas) that this had far less to do with the Technology Assisted Review (TAR) itself and far more to do with the way the humans involved ran the matter and used the tech.
It’s also worth noting that TAR is focused on sorting millions of complete documents into two piles (responsive and non-responsive) and is a long way from the clause-level document review carried out by companies such as Kira Systems, iManage/RAVN and Seal Software, which is more usually termed the stuff of Legal AI.
So, was this a cock-up of the ‘AI’ and ‘algorithms’ or just humans making a human mess of what is a massive and highly complex project involving millions of docs?
First, the basics of the case taken verbatim from the court documents. If you already know this one, please feel free to skip to the comment piece by Maas directly below: ‘Don’t Shoot the AI Puppy!’
The Case
‘This case involves a multidistrict class action litigation brought by Plaintiffs, who are purchasers of air passenger transportation for domestic travel, against [remaining] Defendants, Delta Air Lines, Inc. (“Delta”) and United Airlines, Inc. (“United”), two of the four largest commercial air passenger carriers in the United States, based on allegations that Defendant airlines willingly conspired to engage in unlawful restraint of trade.
…..Plaintiffs assert that [a] request for an extension of discovery is predicated on a recent “issue with United’s “core” document production,” which constitutes good cause to extend the discovery deadlines.
More specifically, Plaintiffs assert that United produced more than 3.5 million [core] documents to the Plaintiffs, but “due to United’s technology assisted review process (“TAR”), only approximately 17%, or 600,000, of the documents produced are responsive to Plaintiffs’ requests,” and Plaintiffs must sort through them to determine which ones are responsive, which requires additional time.
Plaintiffs contend that they “could not have foreseen United’s voluminous document production made up [of] predominantly non-responsive documents resulting from its deficient TAR process when they jointly proposed an extension of the fact discovery deadline in February 2018.”
Because Plaintiffs are unable to segregate the large number of non-responsive documents from the responsive documents within the time remaining before the discovery deadline, so that they can use the responsive documents to prepare for depositions, motions practice, and trial, they have asked this Court for an extension of the deadline.
Defendants Delta and United oppose Plaintiffs’ request for an extension, but for the reasons set forth herein, this Court shall GRANT Plaintiffs’ Motion for an Extension of Fact Discovery Deadlines, with the proviso that no further extensions of discovery will be considered by this Court.
ORDERED that the Court GRANTS Plaintiffs’ Motion for an Extension of Fact Discovery Deadlines.
The Law Firms involved (from court data):
For UNITED CONTINENTAL HOLDINGS, INC., a Delaware corporation, Defendant: Kent Alan Gardiner, LEAD ATTORNEY, Cheryl Anne Falvey, David Martin Schnorrenberg, CROWELL & MORING LLP, Washington, DC; Paul Laurence Yde, LEAD ATTORNEY, FRESHFIELDS BRUCKHAUS DERINGER US LLP, Washington, DC.
And that is the long and the short of it. But what does it mean? Did the tech fail or did the humans fail? What went wrong here? Ediscovery expert Jonathan Maas explains:
‘Don’t Shoot the AI Puppy!’
By Jonathan Maas of the Maas Consulting Group.
All the reports I have read on this ‘glitch’ wrongly indicate that the technology went rogue.
This is no ‘I, Robot’ scenario. We have not slid, unnoticed, into the Matrix. The Paginator [which is a handheld document numbering tool used in the days of hard copy] is not striding down the corridors of justice seeking to drown Sarah Connor in a sea of irrelevant documents.
This is human error, plain and simple. Probably compounded by good old ignorance. The technology works fine if set up and monitored correctly.
Set up: the technology is not intelligent; it is a slave to its human masters. It is repetitively taught what to retrieve by those masters until it satisfies them that it can largely be left alone to do its job correctly.
Like an eager puppy it shoots off to find the red ball, returning to its master’s feet tail a-wagging, blue ball in mouth. So master says “no” and sends it off again, and again, and again until the puppy brings back the red ball more often than not. These are known as training rounds, and they must continue until the master is happy that the hound understands its task, more often than not.
More often than not: it’s only a puppy. There will be errors, there will still be some blue balls retrieved and some red balls missed but, hopefully, small in number.
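To make the analogy concrete, here is a minimal sketch of what such training rounds might look like in code, assuming a simple active-learning loop built with scikit-learn. The function names, batch size and stopping rule are hypothetical simplifications for illustration only; commercial TAR platforms implement far more sophisticated versions of the same feedback loop, and nothing below reflects the specific tools used in the United case.

```python
# A hypothetical sketch of TAR 'training rounds' as an active-learning loop.
# All names and thresholds are illustrative inventions, not a vendor's API.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def tar_training_rounds(texts, ask_reviewer, batch_size=50, max_rounds=20):
    """texts: the document collection as a list of strings.
    ask_reviewer: a callable (the human master) that takes a list of document
    indices and returns 0/1 labels, where 1 = responsive (a 'red ball').
    For simplicity this assumes the seed batch contains both classes."""
    X = TfidfVectorizer(max_features=50_000).fit_transform(texts)
    rng = np.random.default_rng(0)
    # Seed round: the master labels a random starter batch.
    labelled = list(rng.choice(len(texts), size=batch_size, replace=False))
    labels = list(ask_reviewer(labelled))
    model = LogisticRegression(max_iter=1000)
    prev_scores = None
    for _ in range(max_rounds):
        model.fit(X[labelled], labels)
        scores = model.predict_proba(X)[:, 1]  # responsiveness scores
        # Stop once successive rounds barely change the scores: the puppy is
        # bringing back red balls 'more often than not'.
        if prev_scores is not None and np.abs(scores - prev_scores).mean() < 0.01:
            break
        prev_scores = scores
        # Next round: send the master the documents the model is least sure
        # about, i.e. the balls whose colour it cannot yet tell.
        unlabelled = np.setdiff1d(np.arange(len(texts)), labelled)
        uncertain = unlabelled[np.argsort(np.abs(scores[unlabelled] - 0.5))]
        batch = uncertain[:batch_size].tolist()
        labelled += batch
        labels += list(ask_reviewer(batch))
    return scores  # rank the collection; high scorers go to the review pile
```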
Monitored: Best that the master checks from time to time to make sure the puppy’s still finding more red balls than blue ones. If it seems to be getting distracted by conkers, or the tone of the red is subtly changing, master will need to retrain the faithful pooch (and give it an encouraging treat) to get it back on track.
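A tiny, equally hypothetical illustration of that kind of monitoring, assuming the review team records what fraction of each freshly checked batch the model got right; the floor and window here are numbers a team would agree in advance, not industry standards:

```python
# Flag drift in the model's hit rate across recent review batches.
def needs_retraining(batch_hit_rates, floor=0.6, window=3):
    """batch_hit_rates: per-batch fraction of model-suggested documents that
    human review confirmed as responsive. Returns True if the recent average
    has dropped below the agreed floor (the puppy is chasing conkers)."""
    recent = batch_hit_rates[-window:]
    return sum(recent) / len(recent) < floor

# e.g. needs_retraining([0.8, 0.75, 0.7, 0.5, 0.4, 0.35]) -> True: retrain.
```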
Once confident that the training has stuck, the master can go off and leave the diligent puppy to repeat what it knows to be the correct retrieval process until there appear to be no more red balls out there. What a diligent master should then do is check the pile of red balls to make sure that the number of blue balls mistakenly retrieved by the puppy is statistically acceptable given that it was only a puppy making the call.
A diligent master would then also check the yard to make sure that the number of red balls the puppy missed in the sea of blue balls is statistically acceptable given that it was only a puppy making the call.
A sensible master would also have sought agreement from whoever else is impacted by the puppy’s retrieval skills to make sure that the number of blue balls mixed up with the red balls and the number of red balls left in the yard by the puppy are both acceptable error rates given the process adopted and the cost of that process. Indeed, the whole exercise is best kept transparent to avoid having someone later complain that master and puppy got it all wrong.
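Those two checks – blue balls in the bag, red balls left in the yard – correspond to sample-based estimates of what ediscovery practitioners call precision and elusion. Below is a hedged sketch using synthetic data: the 17% produced-pile rate echoes the figure reported in this case purely to make the point, while the discard pile’s size and 1% rate are invented, and the sample size of 400 is an illustrative choice rather than a standard. It shows how even a modest random sample of the bag would have surfaced the roughly 17% responsiveness rate before anything was handed over.

```python
# A hedged sketch of sample-based precision and elusion checks on synthetic
# data. In practice the sample labels come from human review, not an array.
import numpy as np

rng = np.random.default_rng(1)

def sampled_responsive_rate(pile, n, rng):
    """Estimate a pile's responsive ('red ball') rate from a random sample."""
    return rng.choice(pile, size=min(n, len(pile)), replace=False).mean()

# Synthetic stand-ins: 1 = responsive (red ball), 0 = non-responsive (blue).
produced = rng.random(3_500_000) < 0.17    # the bag handed to the plaintiffs
discarded = rng.random(1_000_000) < 0.01   # what was left in the yard

precision = sampled_responsive_rate(produced, 400, rng)  # blue balls in bag?
elusion = sampled_responsive_rate(discarded, 400, rng)   # red balls missed?
print(f"estimated precision: {precision:.0%}, estimated elusion: {elusion:.1%}")
```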
In this instance it would appear that the parties had reached agreement as to how to proceed with their respective discoveries and a Special Master had assisted them. All well and good up to that point. And, it would seem, no further for the defendants.
All that I can conclude from the various reports I have read is that the defendants thought it sufficient to hit the predictive coding “Go” button and, in effect, let the puppy run all over the yard uncontrolled until it collapsed exhausted at their feet. At which point they bagged up the balls and handed the bag unchecked to the plaintiffs. And then shut the yard door without even glancing at the ignored balls.
So, we don’t know how the defendants trained the poor pooch, and it looks as if they failed to look in the bag before handing it over. They then misguidedly thought their position strong enough to argue the toss with the plaintiffs before the judge, accusing the plaintiffs of needlessly seeking an extension of time to review the excessive rubbish the defendants had served on them (the plaintiffs complained that only 17% of the produced documents were responsive to their requests). Interestingly, and not really surprisingly, there is no mention of the vendors concerned.
As I said, there is nothing wrong with the technology if correctly deployed on the correct type of document populations. This is just human error on the public stage.
Please, don’t shoot the AI puppy!