AI facial recognition wrongful arrest embarrassed Labour

Alvi Choudary, a young British-Asian man, was arrested earlier this week for a burglary in Milton Keynes. The only problem? He’s never even entered the city, which lies 100 miles from his home in Southampton. Nevertheless, racist AI facial recognition technology placed him at the scene of the crime.

The news comes after calls from Labour to increase the use of AI in both policing and the court system. On top of that, just two days ago — 24 February — a police chief in charge of AI use admitted that a new £115m national police data centre will produce biased and racist results.

AI facial recognition — Wrongful arrest

Choudary appeared on Good Morning Britain to talk about what happened to him. He explained to presenters Kate Garraway and Richard Madeley that he was held in custody for around 11 hours before police would even speak to him or hear his evidence against the arrest.

Adding insult to injury, the arrest turned out to be based on a custody photo taken of Choudary some four years previously. That, too, was a wrongful arrest. And, despite the fact that officers assured him at the time that his DNA and information would be removed from the system, his photograph was kept on record.

It was this photo that police AI facial recognition software matched to an image of the suspect. As it turned out, the suspect didn’t resemble Choudary in the slightest. The police officers were laughing at this racist error even as they released Choudary.

Madeley then asked why exactly the AI technology performs so poorly for BAME individuals. Akiko Hart, director of the human rights organisation Liberty, explained that:

It’s because the AI is trained on white faces. And so, essentially, you are more likely to be misidentified if you are young, you are more likely to be misidentified if you’re a woman, and obviously we’ve seen there’s really shocking statistics about how you’re more likely to be misidentified if you’re Asian, if you’re Black, and 250 times more likely to be misidentified if you’re a black woman. And that is because of the way the AI is trained.

AI racism

Choudary’s wrongful arrest is a case in point for the problems being caused by the increasing use of AI in the justice system. However, Labour remain hellbent on pushing AI into policing and the courts, in spite of its well-documented biases.

Earlier this month, on 12 February, the Ministry of Justice announced plans to use predictive policing to overhaul the youth justice system. Part of the proposal was to use “machine learning and advanced analytics” to “support early, appropriate intervention” in youth crime.

At the time, the Canary published an article on how this initiative would automate the discrimination that had already been part of the lives of racialised individuals for decades.

Then, on 24 February, justice secretary David Lammy proudly announced that Labour was also pushing AI use into the courts. He stated:

we are going to invest more in our in-house Justice AI Unit – a specialist team within my department, forward-deployed to the frontline, working with staff to tackle the challenges they face.

Over £12 million in additional funding in the next financial year will expand our AI capabilities, putting this powerful tool, finally, into the hands of staff.

A problem, here and now

On that same day, 24 February, police AI lead Alex Murray acknowledged that a new £115m police data centre would produce discriminatory results. However, he also tried to assure the public that the police would work to reduce that discrimination.

As if, that is, they’ve ever done the work of addressing their non-digital discrimination. Beyond that, the police already have form for failing to reduce or act on bias in their use of AI.

On the failure of a previous police venture in facial recognition technology, the Association for Police and Crime Commissioners (APCC) stated that:

System failures have been known for some time, yet these were not shared with those communities affected, nor with leading sector stakeholders.

As Alvi Choudary’s wrongful arrest clearly demonstrated, AI racism in policing is not a problem to deal with in the future. It is already here, and it’s already affecting people’s lives. 

Labour and the police know that this is a problem; they know it’s racist. But never mind — they’re going ahead with it anyway. Oh, and they’ll try to mitigate the risk, honest.

Featured image via 3DIVI
