
PSNI testing viability of facial recognition


The Police Service of Northern Ireland (PSNI) is investigating the potential deployment of Live Facial Recognition Technology (LFRT or LFR) in Northern Ireland. According to a report in The Irish Times:

The force has set up a Facial Recognition Governance Board which is monitoring programmes elsewhere in the UK and engaging directly with industry providers, though it insists no decision has been taken over whether to deploy the controversial technology.

The PSNI haven’t exactly been transparent about such plans up to this point, with no public references to the Facial Recognition Governance Board available online prior to today’s revelation. LFRT involves the use of cameras combined with automated facial recognition software to scan and identify faces. The system then matches the results against a police watchlist of wanted persons.
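In outline, such a system converts each detected face into a numerical embedding, then compares it against pre-computed embeddings of people on the watchlist. The sketch below is a minimal illustration of that matching step only, not the PSNI’s or any vendor’s actual system; the embeddings, the similarity threshold, and the watchlist entry are all hypothetical.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray, watchlist: dict, threshold: float = 0.6):
    """Return (name, score) for the closest watchlist entry above the
    threshold, or (None, score) if nothing clears it.

    The threshold is purely illustrative: set too low, the system flags
    innocent passers-by; set too high, it misses genuine matches.
    """
    best_name, best_score = None, -1.0
    for name, stored in watchlist.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_name, best_score = name, score
    if best_score < threshold:
        return None, best_score
    return best_name, best_score

# Hypothetical 128-dimensional embeddings standing in for the output
# of a face-recognition model applied to camera frames.
rng = np.random.default_rng(0)
watchlist = {"wanted_person_a": rng.normal(size=128)}
print(match_against_watchlist(rng.normal(size=128), watchlist))
```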

The PSNI say they don’t currently use the technology, meaning officers manually operate cameras and review any footage collected. However, they say they are:

…monitoring national LFR programmes, including those implemented by the Metropolitan Police, South Wales Police and, most recently, British Transport Police.

At this stage, we are engaging with these programmes and their industry providers solely in order to assess operational feasibility.


PSNI turn to ‘Israeli’ surveillance tech already in use by British police

A crucial question is whether any of those “industry providers” include Corsight AI, an ‘Israeli’ firm whose LFRT program has been adopted by British police. That adoption is already a breach of the Palestinian-led Boycott, Divestment and Sanctions (BDS) movement’s guidelines, which stipulate no economic dealings with the Zionist entity, or even with non-‘Israeli’ companies which support the terrorist land theft project.

Purchasing Zionist tech is one of the worst imaginable cases of this, as it gives a direct boost to the military-surveillance sector of ‘Israel’s’ economy. Further use of Corsight’s product provides funding to, and refinement of, a system used to violate Palestinian rights.

The British government is planning to roll out LFRT systems further, expanding from 10 vans with the technology installed to 50. Al Jazeera outline how even “Israeli intelligence operatives” have “concerns about its accuracy”. This appears to be another case of much-heralded AI ending up like the fictional RoboCop prototypes shooting themselves in the head.

Big Brother Watch have flagged the unreliability of the dodgy tech, saying it:


…discriminates against women and people of colour. 80% of people misidentified by facial recognition in London in 2025 were Black.

This sort of bias is a commonly recognised flaw of AI platforms.

Misidentification is a crucial flaw which would result in potentially illegal surveillance. If a system incorrectly identifies someone as a suspect on a watchlist, it could result in their data being stored in the system. This would be a breach of the Protection of Freedoms Act 2012, which outlawed the storage of data like DNA and fingerprints from people not convicted of a crime.
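A rough sense of the scale of the problem (every figure here is an assumption for illustration, not a reported statistic): even a false-match rate that sounds tiny produces a steady stream of wrongful flags once whole crowds are scanned.

```python
# Hypothetical illustration only: suppose one LFR deployment scans
# 50,000 faces in a day with a 0.1% false-match rate. Neither figure
# is from any official source.
faces_scanned = 50_000
false_match_rate = 0.001

wrongful_flags = faces_scanned * false_match_rate
print(f"Expected wrongful flags per day: {wrongful_flags:.0f}")  # 50
```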

A chilling effect on basic rights

Beyond that, use of LFRT in a public space is inherently indiscriminate and likely breaches other laws, such as those relating to freedom of expression and freedom of assembly. Its use at protests will discourage attendance, especially from minoritised communities. Lancashire police are known to have shared footage of disabled people with the Department for Work and Pensions in an attempt to have their benefits stripped.

The above issues have been cited in a challenge to the Met Police’s use of facial scanning which has just been heard in the High Court. Big Brother Watch argued that its use amounted to “stop and search on steroids”. They cited the case of a man detained for 20 minutes by the cops, despite providing ID to show he’d been falsely identified.


The Met’s justification is that London’s scale makes tracking suspects too hard:

Locating these individuals within a vast, bustling metropolis is akin to looking for stray needles in an enormous, exceptionally dense haystack.

Though it may currently make mistakes, AI is steadily improving. Its increasing capacity to sift through enormous amounts of data and make sense of it amounts to a power too excessive to grant to increasingly authoritarian states. When Edward Snowden revealed the extent of the US surveillance apparatus in 2013, he didn’t just criticise its immorality. He also lambasted it as ineffective, since excessive data collection simply added more hay to the haystack.

In the age of AI, a giant haystack becomes less of an issue. What would previously have required hours of human intervention to interpret can now be churned through and summarised by AI in seconds. Such a power seems too much in the hands of even an accountable state, never mind an undemocratic and abusive one arresting thousands of innocent people for opposing genocide.

PSNI can’t be trusted with mass surveillance power

The PSNI has played its part in that. In the last week alone, it has been shown to have behaved in a discriminatory manner. Internment and collusion are grim historical examples of what happens when police are granted excessive powers.


We could achieve zero crime, but it would require total surveillance and ensure zero freedom. Mass face scanning is a step too far towards the latter, and the PSNI’s secretive Facial Recognition Governance Board should rule out its use.
