
London police are testing facial recognition tech this week

Facial recognition technology isn't just a handy way to unlock your phone. There are many situations where fast identification is useful, and AI is becoming more efficient at this all the time. Obviously, law enforcement is interested in this, and a new test is about to begin in London. 

London’s Metropolitan Police Service will be testing facial recognition technology in a handful of locations in central London this Monday and Tuesday, specifically in the vicinity of Soho, Piccadilly Circus, and Leicester Square.

According to the Met, the tests "will be used overtly with a clear uniformed presence and information leaflets will be disseminated to the public." Does that give the game away? While the idea behind this kind of surveillance is to identify criminals, at the moment it is only the technology that is being refined. The Met have reassured the public that anyone who declines to be scanned "will not be viewed as suspicious by police officers." Well, that's nice of them.

Trials like this have been run in London since 2016, with today's marking the seventh such attempt; three more are reportedly planned. It's easy to see why law enforcement is interested in deploying this technology, but the public has good reason to be wary: there are still many kinks that can lead to false identifications.

In China, where state surveillance is more widely accepted, facial recognition cameras flagged a famous CEO as a jaywalker and publicly shamed her after recognizing her face on a bus ad. In the UK itself, a facial recognition system used at the June 2017 Champions League soccer final in Cardiff, Wales produced a 92 percent false positive rate: roughly nine times out of ten, it wrongly flagged someone as suspicious or worthy of arrest.

Is state surveillance going too far?

Technical issues aside, many citizens are uncomfortable with the police leveraging the power of AI over the wider populace. Privacy advocacy group Big Brother Watch is protesting the trials and appears to have spotted plain-clothes police around the trial zones, casting doubt on the openness professed in the Met's statement.

CCTV cameras are a common sight around London and other UK cities, but the onboard AI of facial recognition camera networks takes surveillance to another level: it could enable faster identification and tracking of suspects. Of course, its deployment could also lead to a nightmare scenario in the case of false positives, could enshrine racial or social biases in an unaccountable automated police system, or could simply make the police too effective. Many ordinary people just don't want to be watched that closely by the authorities.

Could Chinese-style face recognition glasses be used by UK police? / © Reuters

What do you think of the use of facial recognition by police? Effective crime-fighting, or state oppression? Let us know in the comments!

Source: Ars Technica