In line with Artificial Lawyer’s campaign for the better regulation of facial recognition technology (FRT), here are some of the key developments that are shaping the issue around the world.
USA: California bans law enforcement agencies from using FRT body cameras
On October 8, California Governor Gavin Newsom signed legislation that prohibits local law enforcement agencies from equipping body cameras with facial recognition software and other biometric scanners for three years. The law comes into effect on January 1, 2020.
Assembly member Phil Ting (Democrat – San Francisco), who proposed the law, said: ‘The public wanted their officers and deputies to use body cameras to provide accountability and transparency for the community. The addition of facial recognition technology essentially turns them into 24-hour surveillance tools, giving law enforcement the ability to track our every move. We cannot become a police state.’
The statewide rule follows a more locally focused ban in San Francisco. The move by California matters because it is not just America’s most populous State, but also home to many of the world’s largest tech companies, some of which have been pioneering FRT.
Australia: Plans for facial recognition database rejected (for now…)
Meanwhile, in Australia, a Parliamentary security committee has ordered the Government to redraft plans for a national FR database over privacy concerns.
Immigration Minister David Coleman, when introducing the bill in July 2019, had said that it would enable identity-matching services, making it easier for documents containing facial images to be safely verified online.
However, the Parliamentary Joint Committee on Intelligence and Security raised concerns and sent it back to the drawing board, saying: ‘The Committee recommends that the Identity-matching Services Bill 2019 be re-drafted taking into account the following principles: the regime should be built around privacy, transparency and subject to robust safeguards, [in addition] the regime should be subject to Parliamentary oversight and reasonable, proportionate and transparent functionality.’
This suggests that the project will return in some form at a later date, and that the FRT regulation debate in Australia is certainly not over yet.
UK: Update on two key court cases
In the United Kingdom, Artificial Lawyer spoke to human rights campaigners for updates on their court cases against British police forces over the use of FRT.
Last month, the High Court ruled that South Wales Police’s use of the technology was justified. A Liberty spokesperson said they had applied to the Court of Appeal earlier this month, on behalf of Ed Bridges, in response to the ruling.
Another case of interest is that of London-based Big Brother Watch, which has challenged the Metropolitan Police’s use of FRT. The case has been on hold for some time now, after the Met finished its trials in February this year.
Asked for an update, Big Brother Watch’s Legal & Policy Officer Griff Ferris said that the case is expected to resume if the police force decides to use FR surveillance again.
‘Since February, we have been waiting for the Metropolitan Police to decide whether they intend to use live facial recognition surveillance again. If they do, we’ll take them to court.
‘Live facial recognition doesn’t fit in a democracy and we will fight until it is banned,’ he said.
Expert View: What Are Your Rights With FRT?
Speaking to Artificial Lawyer, Renzo Marchini, a partner at UK law firm Fieldfisher who focuses on privacy, security and information, explained some of the ethical and legal concerns around FRT.
Generally, FRT is considered useful to the Government for law enforcement purposes; however, there are widespread concerns that it violates individuals’ human rights and privacy, as well as worries that it tends to be biased against ethnic minorities. There are also issues around data protection.
‘The legal concern is that we have these laws which say that you need to comply with data protection law if you process data about people, and data protection law sets out high level principles about giving people choice, giving people control. But, how do you give people choice and control, if these cameras are out there automatically scanning our faces?’ Marchini said.
‘There are exceptions to the requirements to get consent … those exceptions might in theory be applied to the police, [but] they still need to be applied properly, proportionately, with due regard to the legitimate concerns of individuals,’ he added.
So, what legal action can be taken by people or organisations regarding FRT?
‘People could sue for damages. The regulators [and courts] can say “stop, you are behaving unlawfully”. But proving damages, [and] suing for money, is hard because you normally have to prove that this cost you money out of your pocket, and that money is being taken out of your pocket [because] your face is on a database or being checked against a watchlist,’ Marchini explained.
He added: ‘In theory you could sue for something called “distress”. [For instance] “I was distressed that I was being filmed and checked,” but that is going to be hard to prove. So it’s going to be actions to tell people to stop, rather than actions to get compensation [that will be brought to court], at least I think in the short term.’
While there are clearly now multiple efforts to curtail the use of FRT in the public realm, the reality is that the genie is already out of the bottle and there is no way to put it back.
The efforts above range from limited bans within the public sector, to reviews of new implementations of the tech, to specific court cases against police use of FRT. In short, it’s a patchwork of efforts, with huge gaps between them. Many examples also tend to focus on State-backed projects rather than the private sector, which is also experimenting with the tech, often in public settings.
Meanwhile, the technology and its use are still rapidly spreading around the world, and there is as yet no fully tested national position on its use in countries such as the US and the UK.
By Irene Madongo