Facts for You

A blog about health, economics & politics

The Metropolitan Police is set to install the UK’s first permanent network of Live Facial Recognition (LFR) cameras on London Road and North Road, alleged ‘crime hotspots’ in Croydon town centre in south London. In this pilot project in London’s most violent borough, due to go live in June 2025 or thereafter, cameras attached to lampposts and buildings will scan the streets, matching the faces of pedestrians and other street users against a database of people on police watchlists wanted for serious offences, including theft, violent crime, drugs offences, criminal damage, terrorism, sexual offences, fraud, firearms offences, and immigration offences. The images will be “streamed directly to the Live Facial Recognition system and compared to a watchlist”, which includes images of “Sought Persons” as well as missing persons, vulnerable persons, and even the victims of serious crime. The Metropolitan Police’s recognition algorithm has been independently tested and validated by the National Physical Laboratory in a March 2023 report entitled ‘Facial Recognition Technology in Law Enforcement: Equitability Study.’

These static LFR cameras will not operate continuously, being switched on only as and when deemed necessary by the police. London’s police force has already been using mobile facial recognition cameras in marked MPS-liveried vehicles for the past two years, monitoring crowds at busy central London locations, including during the King’s Coronation. In the words of the UK government, such mobile deployments are “targeted, intelligence-led, time-bound, and geographically limited.”

Facial recognition software detects faces within digital camera images or video footage and extracts identifying features from them, such as the shape of the nose and eyes, the distance between the eyes, and the contour of the jawline. A recognition algorithm then compares these features against a predetermined database of known faces of persons of interest. This database excludes people not suspected of criminality or otherwise of police interest, as their details are automatically deleted following capture on camera. Matching can succeed rapidly and in real time even when the face is partly obscured by heavy makeup, beards, or sunglasses, although face masks and visors can prove problematic. A positive match triggers an alert, allowing the identification, subsequent questioning, and even possible arrest of suspected criminals. Facial recognition does not lead to automatic arrest, requires corroborating evidence for a successful conviction, and cannot be relied upon as evidence in isolation.
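In outline, the matching step works by reducing each face to a numerical feature vector and scoring its similarity against every entry on the watchlist, alerting only when the best score clears a threshold. The sketch below illustrates this in Python; the four-dimensional vectors, identity labels, and threshold are purely illustrative (real systems use much higher-dimensional embeddings and operationally tuned thresholds), not the Metropolitan Police’s actual algorithm.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors: 1.0 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return (identity, score) of the best match above threshold, else None.

    `probe` is the feature vector extracted from a camera frame;
    `watchlist` maps identity labels to stored feature vectors.
    """
    best_id, best_score = None, -1.0
    for identity, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    if best_score >= threshold:
        return best_id, best_score   # would trigger an alert for review
    return None                      # no alert; probe data is discarded

# Illustrative watchlist with toy 4-dimensional "feature vectors"
watchlist = {
    "sought-person-001": [0.1, 0.9, 0.3, 0.4],
    "sought-person-002": [0.8, 0.1, 0.5, 0.2],
}
probe = [0.12, 0.88, 0.31, 0.41]  # features from a scanned face
print(match_against_watchlist(probe, watchlist))
```

Note that a below-threshold probe returns no match at all, mirroring the claim above that the details of people not on the watchlist are not retained; the threshold is the operational dial that trades false alerts against missed matches.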

The use of facial recognition software promises a way forward in the ongoing battle against crime. But it has many opponents. Some claim this is an Orwellian attack on privacy and personal liberty, whereby the state seeks to snoop on, and thereby secure greater control over, members of the general public. Driven by such concerns, Rebecca Vincent, interim director of Big Brother Watch, a non-partisan civil liberties and privacy campaigning organisation, has warned of a “steady slide into a dystopian nightmare” with “no oversight or legislative basis.” In addition, there are concerns over the reliability and accuracy of the technology. Ageing, non-binary gender and gender reassignment, race/ethnicity, and altered emotional states may all lead to inaccurate identification, as will poor-quality video images. Training datasets may be derived from lighter-skinned subjects and therefore incomplete, while some suspect that ‘structural racism’ within the police force and racial profiling may disadvantage people of colour. The British Standards Institution has thus set out six principles of ‘trustworthiness’ in a code of practice for the ethical use and deployment of facial recognition in video surveillance-based systems (BS 9347:2024). These principles comprise “governance and accountability, human agency and oversight, privacy and data governance, technical robustness and safety, transparency and explainability, diversity, non-discrimination, and fairness.”

Any measure that can potentially prevent crime and help apprehend alleged criminals and other wanted people is welcome, but hard evidence supporting the likely benefits of LFR technology is somewhat lacking. The results of the Metropolitan Police’s pilot project will therefore be keenly awaited. It is reassuring to know that the process will not be extended elsewhere until it is shown to be beneficial and not unduly intrusive. As it stands, the law does allow the use of facial recognition for policing purposes as felt necessary, provided it is proportionate and fair. Unfortunately, given the times we live in, a libertarian “light-touch” approach cannot be relied upon to guarantee our security in either public spaces or private domains.

Ashis Banerjee
