Met Police officers to begin using facial recognition cameras

Assistant Commissioner Nick Ephgrave said the technology will help tackle serious and violent crime, and will also be used to trace missing persons.

Police in London will soon start using facial recognition cameras for the first time.

The Metropolitan Police said the technology will be used in the fight against serious and violent crime, and also to help find missing children and vulnerable people.

Suspects wanted by police or the courts will be on “watchlists”, and if they are spotted by the cameras they will be approached by officers.

Assistant Commissioner Nick Ephgrave said the technology will be deployed for the first time “within a month”.

Trials of the cameras saw them used in locations including the Westfield shopping centre in Stratford and London’s West End.

Scotland Yard said the public will be aware of the surveillance, with officers handing out leaflets and the cameras placed in open locations.

Mr Ephgrave said: “Every day our police officers are briefed about suspects they should look out for; live facial recognition improves the effectiveness of this tactic.

“Similarly if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this.”

Civil rights groups have raised concerns over the technology, and in July last year, the data watchdog warned police forces testing the scanners that privacy and data protection issues must be addressed.

A camera used during trials at Scotland Yard for the new facial recognition system (Stefan Rousseau/PA)

At the time, Information Commissioner Elizabeth Denham said: “We understand the purpose is to catch criminals but these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives.

“And that is a potential threat to privacy that should concern us all.”

She has also called for a legal code of practice to be established before the technology is deployed.

But Mr Ephgrave said the Met is “in the business of policing by consent” and thinks the force is effectively balancing the right to privacy with crime prevention.

He said: “Everything we do in policing is a balance between common law powers to investigate and prevent crime, and Article 8 rights to privacy.

“It’s not just in respect of live facial recognition, it’s in respect of covert operations, stop and search, there’s any number of examples where we have to balance individuals’ right to privacy against our duty to prevent and deter crime.”

The force claims the technology has a very low error rate, with the system generating a false alert only once in every 1,000 cases.

However, using a different metric, research from the University of Essex last year found the technology made only eight correct matches out of 42 across the six trials it evaluated.

The Information Commissioner’s Office said the technology has “potentially significant privacy implications” and called on the Government to implement a code of practice for live facial recognition.

It said in a statement: “The code will ensure consistency in how police forces use this technology and to improve clarity and foreseeability in its use for the public and police officers alike.

“We believe it’s important for government to work with regulators, law enforcement, technology providers and communities to support the code.”