Create A National Register of Algorithms – Law Society AI Commission

A special commission created by the President of the Law Society, Christina Blacklaws, to examine the impact of AI and algorithms on the legal world has concluded in its final report that the UK needs to create a ‘national register of algorithms’ used in the criminal justice system, including a record of the datasets used to train them.

Another key recommendation in the report, which focuses on the criminal justice system, is that the UK’s Information Commissioner’s Office, the regulator responsible for matters such as the GDPR and privacy, should now take on a proactive role in examining the use of algorithms.

Here is what the report says about the register:

‘Sub-Recommendation 1.7 National Register of Algorithmic Systems – A register of algorithmic systems in criminal justice should be created, including those not using personal data, alongside standardised metadata concerning both their characteristics, such as transparency and discrimination audits and relevant standard operating procedures, and the datasets used to train and test them.

Leadership of this could be taken by the Centre for Data Ethics and Innovation, as the Centre matures, in an open consultation procedure considering the criteria and thresholds for systems included in this register.’
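The report does not prescribe a schema or format for the proposed register. Purely as a rough illustration of the kind of standardised metadata the recommendation describes (system characteristics, transparency and discrimination audits, standard operating procedures, and training/test datasets), an entry might be sketched along the following lines. All field names and example values here are hypothetical and not drawn from the report.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch only: the Law Society report does not define a schema
# for the register. These fields simply mirror the kinds of metadata the
# recommendation mentions.

@dataclass
class DatasetRecord:
    name: str
    description: str
    used_for: str  # e.g. "training" or "testing"

@dataclass
class RegisterEntry:
    system_name: str
    operator: str                       # e.g. the force or agency deploying the system
    purpose: str                        # e.g. "facial recognition", "predictive policing"
    uses_personal_data: bool            # register would also cover systems that do not
    transparency_audit: str             # reference to the relevant audit document
    discrimination_audit: str           # reference to the relevant audit document
    standard_operating_procedures: str  # reference to the relevant SOPs
    datasets: List[DatasetRecord] = field(default_factory=list)

# Illustrative entry with placeholder values.
entry = RegisterEntry(
    system_name="Example facial recognition system",
    operator="Example police force",
    purpose="facial recognition",
    uses_personal_data=True,
    transparency_audit="transparency-audit-2019.pdf",
    discrimination_audit="discrimination-audit-2019.pdf",
    standard_operating_procedures="sop-v1.pdf",
    datasets=[DatasetRecord("watchlist-images", "Images used to build the watchlist", "training")],
)
```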

Blacklaws launched the commission at the start of her presidency last year. It held four public hearings, conducted 75 interviews with experts and received 80 written submissions. The move was driven by the fear that AI and algorithmic tools are being used across the justice system in an unregulated environment.

Although Blacklaws emphasised that the tools offer considerable benefits, at present they are being used without any clear underpinning in law.

The recommendations are not legally binding; rather, they are a call for the Government to respond.

She stressed that algorithmic systems are already widespread in the justice system, from use by police for facial recognition and predictive policing, to crime labs, to courts and the parole system.

The full recommendations can be found here.

Blacklaws concluded: ‘These are ambitious recommendations…but they map out a framework…to allow the public to reap the benefits but avoid the dangers.’

1 Comment

  1. This is not a good idea. Define “algorithm.” Define “used in the criminal justice system.” (Hint: If I calculate the average age of convicted criminals, I’ve used an algorithm in the criminal justice system. Does it still qualify if I calculate the average age of their lawyers?)

    Now that you’ve defined those amorphous concepts, tell me how fast new algorithms “to be used in the criminal justice system” are created. Or altered slightly to get a better result, or a faster one, or because the programmer felt like it. When do such alterations make a new algorithm?

    There is no way this “register” can possibly accomplish anything meaningful. Unless you think soaking up a lot of time and money for absolutely no good reason is meaningful.
