London Metropolitan Police Are To Use Face Scanning Technology

Does this pose more harm than help?

London's Metropolitan Police will begin using facial recognition cameras to pick out suspects from street crowds, raising concerns about automated surveillance and the erosion of privacy rights. The cameras are due to go into service within a month at likely crime hotspots. The technology, supplied by Japan's NEC (a multinational information technology and electronics company), scans faces in crowds and checks them against watchlists of people wanted for serious and violent offences. Signs will warn passers-by about the cameras, and officers will hand out leaflets with more information. The systems will not be linked to any other surveillance network.

The police have not said how the deployment locations will be chosen, or how many sites and cameras they plan to use.

Amnesty International researcher Anna Bacciarelli said, “Facial recognition technology poses a huge threat to human rights, including the rights to privacy, non-discrimination, freedom of expression, association and peaceful assembly."

This is not Britain's first use of surveillance cameras; security forces have monitored public spaces to counter terror threats for decades. According to Comparitech, London is the sixth most monitored city in the world.

London police previously carried out a series of trial deployments which, they say, identified 7 out of 10 wanted suspects who walked past the camera while incorrectly flagging only 1 in 1,000 people.

An independent review of the trials last year by University of Essex professors raised concerns about their legal basis and the equipment's accuracy: only 8 of 42 matches were verified as correct.
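The two sets of figures are less contradictory than they appear: because wanted suspects are rare among passers-by, even a 1-in-1,000 false positive rate can mean that most alerts are wrong. The short Python sketch below illustrates the arithmetic; the crowd size and number of genuinely wanted people in it are illustrative assumptions, not figures from the police or the Essex review.

# A minimal sketch of why a "1 in 1,000" false positive rate and an
# "8 of 42 matches correct" figure can both be true. Crowd size and the
# number of wanted people are assumed values for illustration only.

false_positive_rate = 1 / 1000   # police figure: wrong flags per innocent passer-by
true_positive_rate = 7 / 10      # police figure: wanted suspects correctly spotted

crowd_size = 10_000              # assumed number of people walking past a camera
wanted_in_crowd = 3              # assumed number of genuinely wanted people among them

false_alerts = (crowd_size - wanted_in_crowd) * false_positive_rate
true_alerts = wanted_in_crowd * true_positive_rate
precision = true_alerts / (true_alerts + false_alerts)

print(f"False alerts: {false_alerts:.1f}")   # about 10 wrong flags
print(f"True alerts:  {true_alerts:.1f}")    # about 2 correct flags
print(f"Share of alerts that are correct: {precision:.0%}")
# Under these assumptions only around one alert in six points at a wanted
# person, in the same ballpark as the 8-of-42 (roughly 19%) result reported
# by the Essex review.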

Pete Fussey, a University of Essex professor who co-authored the report, said NEC has upgraded its algorithm since then, but that there is evidence the technology is not 100% accurate. A recent test of nearly 200 algorithms by a U.S. government lab found that most exhibit ethnic bias.

By Swarnim Agrahari