Politics

Labour is playing AI roulette with our kids’ lives

The UK government is rolling out an AI-powered predictive policing system to assess the “likelihood” of criminal offending among adolescents. Open Rights Group is warning of the risks and limitations of using AI to identify the “most likely offenders”.

In conversation with the Canary, Mariano delli Santi of Open Rights Group explained that these systems are likely to single out children in care and flag them to the authorities for targeted interventions. In other words, so-called predictive data models will be used to target vulnerable children.

AI is racist because it mimics society

Delli Santi said:

They say they want to help, that they will use this system to target children who are at risk of criminality with support, and therefore to prevent them from becoming criminals. However, the way artificial intelligence and predictive policing work tells us that this may not be everything in this story.

These systems will inevitably reflect society’s prejudices and stereotypes, making them inherently racist and classist, as the record of predictive policing in practice already shows.


The system, delli Santi explains, risks reproducing:

bias and stereotypes at scale. Black people, migrant people, poor people, people from geographic areas which have been historically over-policed are more likely to be identified as at risk of committing a crime.

AI, as we’ve all come to discover, is massively racist. As the Canary has previously reported, four out of five people misidentified by facial recognition are Black.

Harvesting NHS data

Beyond bias, delli Santi identifies the unethical sourcing of data as another concern. The government wants to pull data from the NHS, the police, the Department for Work and Pensions (DWP), and the Department for Education. That means anything you tell your doctor could be used to box children into the ‘future criminal’ category, with AI-driven predictive policing drawing on NHS records to calculate risk.

Commenting on this, delli Santi told the Canary:


When you go to your doctor, you expect to be able to tell them what you need to tell them in order to receive medical treatment, and you trust them not to use this data in a way that you would not expect.

If, however, the government starts to grab data from your general practitioner, from your schools… in order to predict whether your child is going to commit a crime or not, this relationship of trust is going to be broken. People will go to their doctor with less trust, and will think more carefully about what they’re going to disclose and reveal to them or not.

His warning is clear:

Predictive policing is a dangerous thing that has no place in a democratic society.

The government is hell-bent on pushing through this policy, which will criminalise your kids. However, Open Rights Group is campaigning for a ban on predictive policing. You can join the campaign here.

Featured image via the Canary
