UK Gov-Backed Report Calls for AI Data Trusts; Praises Legal Sector

An ambitious UK Government-backed report into the nation’s AI industry has called for the creation of Data Trusts, while also praising the progress of AI in the legal sector.

The Government-commissioned report, ‘Growing the Artificial Intelligence Industry in the UK’, is by Dame Wendy Hall, Professor of Computer Science at the University of Southampton, and Jérôme Pesenti, Chief Executive of AI company BenevolentTech.

Among the 18 main recommendations by the experts is that the UK Government, AI companies and private businesses should work together to create Data Trusts, i.e. legally binding, long-term framework arrangements under which valuable data is shared with AI companies to help them create new applications.

‘To use data for AI in a specific area, data holders and users currently come together, on a case by case basis, to agree terms that meet their mutual needs and interests,’ the report authors explain.

‘To enable this to be done more easily and frequently, it is proposed to develop terms and mechanisms for these parties to form, between them, individual ‘data trusts’ to enable AI to be developed… and allow data transactions to proceed with confidence and trust,’ they proposed.


‘These trusts are not a legal entity or institution, but rather a set of relationships underpinned by a repeatable framework, compliant with parties’ obligations, to share data in a fair, safe and equitable way,’ they conclude.

As a growing number of AI experts, including Mike Lynch, the backer of legal AI company Luminance, have noted at public hearings on AI, an industry based primarily on machine learning needs access to large volumes of good quality, relevant data. Without enough of the right data it is very hard to build models robust enough to return good results.

For example, how can you train a legal AI document review system to identify anomalies in certain clauses if you don’t have enough real-world samples of the range of clauses that exist and may appear in contracts? The short answer is: you can’t, at least not to a high level of accuracy.
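To make that concrete, here is a minimal sketch of a toy clause classifier; the pipeline, clause text and labels are illustrative assumptions, not anything taken from the report. With only a handful of examples, a model like this simply memorises the samples in front of it; it only learns what ‘anomalous’ actually looks like once it has seen the real variety of clauses that appear in contracts.

```python
# Minimal illustrative sketch, not the report's method: a toy clause classifier
# built with scikit-learn. All clause text and labels below are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = unusual clause wording, 0 = standard.
clauses = [
    "Either party may terminate on thirty days' written notice.",
    "Liability is capped at the total fees paid in the preceding twelve months.",
    "The supplier shall indemnify the customer against direct losses.",
    "The customer irrevocably waives all rights to terminate for any breach.",
]
labels = [0, 0, 0, 1]

# Bag-of-words features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(clauses, labels)

# With four training examples the prediction below is little better than a
# guess; useful accuracy needs large volumes of real-world clause samples.
print(model.predict(["The supplier accepts unlimited liability for all indirect losses."]))
```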

The idea of Data Trusts is of course controversial, as it tends to mean State-backed organisations, e.g. the NHS, or perhaps a very large institution, such as a bank, insurance company or transport company, sharing data over the long term with others so that AI applications can be built.


New GDPR rules further complicate this, especially if any personal identifiers are to be included, because consent then becomes an issue. And stripping out all potential identifiers for anonymised use is not always simple.
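As a rough illustration of why, the sketch below applies naive pattern-based redaction; the patterns and sample text are assumptions for demonstration only. Obvious identifiers such as phone numbers and email addresses are caught, but a person’s name slips straight through, and GDPR-grade anonymisation also has to contend with indirect identifiers and re-identification risk.

```python
# Illustrative sketch only: naive redaction of a few obvious identifiers.
# These regexes are demonstration assumptions, not a compliant anonymiser.
import re

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+"),
    "PHONE": re.compile(r"\b0\d{2,4}[\s-]?\d{3,4}[\s-]?\d{3,4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a bracketed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

sample = "Contact Jane Doe on 020 7946 0123 or jane.doe@example.com."
print(redact(sample))
# -> "Contact Jane Doe on [PHONE] or [EMAIL]."
# The name survives untouched, which is exactly why stripping out all
# potential identifiers is rarely as simple as it sounds.
```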

It’s unclear yet how AI Data Trusts would work in the legal sector. Legal AI companies have already taught and continue to teach their natural language processing systems with examples of real contracts.

And litigation data presented in open court has already been widely used by legal AI companies, especially in the US, to create applications.

But, perhaps this could be taken further with Data Trusts, with groups of companies coming together to provide large volumes of anonymised contractual data to certain AI companies to help them build new applications.

Perhaps also elements of the UK’s justice system could do the same, handing over anonymised case files to any legal AI company that wanted to develop applications for the Ministry of Justice and HM Courts & Tribunals Service.

Either way, the area of Data Trusts could become an interesting practice stream for lawyers to advise on and explore hand in hand with legal AI companies.

And, as well as potentially creating some interesting new legal issues for law firms to grapple with, the report praised the way that lawyers were embracing AI. In particular, the authors highlighted UK law firm Pinsent Masons and its homegrown legal AI application, TermFrame, as well as global law firm Allen & Overy’s work with Deloitte to create MarginMatrix, which helps with derivatives compliance and automated drafting.

The Government Position on AI 

Following the publication of the report, members of the UK Government have added their voices to calls to make the country a centre of excellence for the global AI industry.

The Government also repeated claims in the report that AI could ‘boost productivity, advance healthcare, improve services for customers and unlock £630bn for the UK economy’.

This certainly suggests that lawmakers here see AI as a force for good, rather than as something that will harm the economy due to job replacement.

Culture Secretary Karen Bradley said: ‘I want the UK to lead the way in Artificial Intelligence. It has the potential to improve our everyday lives – from healthcare to robots that perform dangerous tasks.’

‘We already have some of the best minds in the world working on Artificial Intelligence, and the challenge now is to build a strong partnership with industry and academia to cement our position as the best place in the world to start and grow a digital business.’

Meanwhile, Business Secretary Greg Clark said: ‘Artificial intelligence presents us with a unique opportunity to build on our strengths and track record of research excellence by leading the development and deployment of this transformational technology.’

‘This important review exemplifies the world-class expertise the UK already has in AI, demonstrating the huge social and economic benefits its use can bring. We will continue to work with the sector in the coming months to secure a comprehensive Sector Deal that makes the UK the go-to place for AI and helps us grasp the opportunities that lie ahead,’ he concluded.