Sunday, 26 May, 2019

Rights Group Calls for End to Police Facial Recognition

Jonathan Symcox, 8:17am, 15th May 2018 | Image: Stock representation of FRT (Shutterstock)
Adrian Cunningham | 17 May, 2018, 03:32

South Wales Police have admitted that they retain images of innocent people wrongly identified by their facial recognition cameras for a year. This means that every innocent person wrongly identified at these events (over 2,400 people in South Wales Police's case) has their image held on a police database, and these people are completely unaware of it.

Denham welcomed plans to establish an oversight panel for facial recognition technology, which she will sit on along with the biometrics commissioner Alastair MacGregor QC and the surveillance camera commissioner Tony Porter, and also the appointment of a National Police Chiefs Council lead for the governance of the use of the tech in public spaces. Despite a court ruling in 2012 that the retention of innocent people's images was "unlawful", the Home Office has refused to delete them, claiming it's "too expensive".

Particular controversy was caused when the Metropolitan Police targeted Notting Hill Carnival with the technology two years in a row, with rights groups expressing concern that comparable facial recognition tools are more likely to misidentify black people.

Civil liberties group Big Brother Watch today (15 May) published a report outlining serious claims about the accuracy of facial recognition tools employed by United Kingdom law enforcement bodies.

The ability to track suspects anywhere there is a camera offers, as police see it, a major leap in crime-fighting capability: from finding vulnerable or missing people, to hunting terrorism suspects, to keeping tabs on one-time suspects for whom there are not the resources to maintain surveillance by officers.

On 31 occasions police acted on the system's alerts that it had spotted people of concern, only to find they had stopped innocent people and the identifications were false. Innocent citizens being constantly tracked, located and identified (or, as is now most likely, misidentified as criminals) by an artificially intelligent camera system conjures up images of a futuristic dystopia beyond even Orwell's imagining.

A privacy group has slammed the police's use of facial recognition systems at public events in the United Kingdom as "dangerous and inaccurate".

A member of staff from the human rights organisation Liberty who observed the Met Police's operation at Notting Hill Carnival last year claimed the technology led to at least 35 false positives, five people being unduly stopped and one wrongful arrest.

"This new technology poses an unprecedented threat to citizens' privacy and civil liberties, and could fundamentally undermine the rights we enjoy in public spaces".

Big Brother Watch said it planned to bring its report to Parliament and demand that police stop using automated facial recognition, citing potential violations of the Human Rights Act 1998. "It must be dropped".

"We're seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals".

A number of forces are conducting trials and working with legal experts to better understand how the technology could be deployed.