The Automatic Facial Recognition (AFR) system rolled out by the Metropolitan Police in London, which is also being trialled by three other police forces in South Wales, Leicestershire and Humberside, has come up against a legal challenge launched by lawyers for the civil liberties group Big Brother Watch, who argue that the use of AFR breaches the rights of individuals under the Human Rights Act.

The technology behind AFR uses surveillance cameras and CCTV to record and then compare facial characteristics with images stored on police databases.
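
For readers curious about the mechanics, systems of this kind typically reduce each detected face to a numerical "embedding" and compare it against embeddings computed from watchlist photos, raising an alert when the similarity clears a threshold. The short Python sketch below is a hypothetical illustration of that matching step only; the similarity measure and the threshold value are assumptions, not a description of the Met's actual system.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings, ranging from -1 to 1.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face: np.ndarray, watchlist: dict, threshold: float = 0.6):
    # Compare one camera-derived embedding against every stored watchlist
    # embedding and return the best match if it clears the threshold.
    # The threshold here is illustrative: set too low, a system like this
    # will generate many false positives.
    best_name, best_score = None, -1.0
    for name, stored in watchlist.items():
        score = cosine_similarity(face, stored)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None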

Metropolitan Police Commissioner Cressida Dick said: “If there’s a technology that we can use lawfully – which we can, this is one – and is available, which we are trialling with massive safeguards… (and there is) the notion that that technology might be used in limited circumstances to identify against a small list of wanted offenders for serious violence, I think the public would expect us to be thinking about how we can use that technology, seeing if it’s effective or efficient for us. And that’s exactly what we’re doing.”

However, she indicated she isn’t expecting much more from the tech, saying: “It’s a tool, it’s a tactic. I’m not expecting it to result in lots of arrests.”

She also acknowledged that the trials had met with little success to date. The AFR system is reported to have had a 98 per cent false positive rate and to have made only two accurate matches. The Met has nonetheless said the technology will help keep London safe.

The watchdog described its use in public places as “very intrusive”. Court documents claim the Home Office has failed in its duty to properly regulate the use of AFR.

The system’s manufacturers say they can monitor multiple cameras in real time, “matching” thousands of faces a minute with images already held by the police. Big Brother Watch say that research published by the Met in May shows that only two genuine matches were made out of 104 system alerts during the trials. The group also takes issue with the length of time the images gathered by AFR are held.
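
For context, the 98 per cent figure quoted earlier follows directly from those numbers: if only two of 104 alerts were genuine matches, the other 102 were false positives. A quick back-of-the-envelope check, using only the figures cited above:

alerts = 104
genuine_matches = 2
false_positives = alerts - genuine_matches       # 102
false_positive_rate = false_positives / alerts   # roughly 0.98
print(f"{false_positive_rate:.0%} of alerts were false positives")  # ~98%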

In South Wales, it was reported that the police used AFR at least 18 times between May 2017 and March 2018. Cameras in Cardiff city centre and at a demonstration at an “arms fair” were used to gather the images of members of the public. As of April this year, AFR had generated 2,451 alerts, with only 234 proving accurate. Police officers stopped 31 people who had been incorrectly identified and asked them to prove their identity.

Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act, including the right to privacy and freedom of expression.

Silkie Carlo, director of the civil liberties group, said: “When the police use facial recognition surveillance they subject thousands of people in the area to highly sensitive identity checks without consent.”

“We’re hoping the court will intervene, so the lawless use of facial recognition can be stopped. It is crucial our public freedoms are protected,” she added.

The body that advises London Mayor Sadiq Khan on policing and ethics last week called on the Met to be more open about the use of AFR – and to set out where and when it will be used before undertaking any further pilots.

Dr Suzanne Shale, who chairs the London Policing Ethics Panel, said: “We have made a series of key recommendations, which we think should be addressed before any further trials are carried out.

“We believe it is important facial recognition technology remains the subject of ethical scrutiny.”

A similar challenge has been launched against South Wales Police by Cardiff man Ed Bridges, backed by Liberty. South Wales Police said this week that they would not be challenging the application for a judicial review, meaning the case moves on to the next stage.