UK police face legal challenge over facial recognition system

Published 25.07.2018 00:00
Updated 25.07.2018 20:47
Commuters travel in the London Underground, July 24, 2018. (EPA Photo)

Human rights campaigners launched a legal challenge to the United Kingdom's use of "China-style" facial recognition technology Wednesday, saying it breaches civil liberties and has not been properly debated by parliament.

Rights group Big Brother Watch and lawmaker Jenny Jones said they had asked the High Court for a judicial review of the use of facial recognition surveillance by the government and London's Metropolitan Police.

Big Brother Watch said the police had recently targeted London's Westfield shopping center with the "China-style surveillance cameras" after previous deployments at events including last year's Notting Hill Carnival.

The group said it had obtained data in May suggesting that 98 percent of facial recognition "matches" from the London force's system had wrongly identified innocent people.

Once people have been matched, even if they are innocent, the police store their biometric images for up to a year without their consent, the group said.

"Facial recognition cameras are not only authoritarian, they're dangerously inaccurate," said Silkie Carlo, the group's director.

Jones, a Green Party member of the Lords, parliament's unelected upper house, said facial recognition surveillance "lacks a legal basis, tramples over civil liberties, and it hasn't been properly debated in parliament."

"The idea that citizens should all become walking ID cards is really the antithesis to democratic freedom," Jones said.

The government made no immediate comment on the lawsuit but Metropolitan Police Detective Superintendent Bernie Galopin, who is overseeing trials of the technology, promised "a full, independent evaluation" at the end of this year.

"The Met is currently developing the use of live facial recognition technology and we have committed to 10 trials during the coming months," Galopin said in a statement.

"At the end of the year, there will be a full, independent evaluation," he said, promising that the use of images gathered would be "intelligence-led and temporary."

"Only images that come up as a match to a targeted individual will be retained for a limited period," Galopin said.

"The use of live facial recognition technology aims to support standard policing activity to ensure everyone's safety."

Rosa Curling of law firm Leigh Day, which is representing Big Brother Watch and Jones, said her clients had provided "compelling evidence to the court" to show that automated facial recognition (AFR) contravened several articles of the European Convention on Human Rights.

"The Home Secretary has failed to show that the use of AFR is either proportionate or necessary in our democratic society," Curling said.

"Our clients hope the issuing of proceedings will result in an immediate halt of its use by the police and reconsideration by both the police and Home Office as to whether it is suitable to use in the future."
