What Is Predictive Policing?

Police around the world are ramping up their use of computer programs that predict where crimes are going to happen and who’s going to commit them.

Here in Australia, Victoria Police have said that they secretly used one of these programs only a couple of years ago.

There are some big concerns that come with this kind of technology, and right now it's all being used behind closed doors.

So, what do we need to know about it?

What Did Victoria Police Do?

Victoria Police only recently admitted that they were using predictive policing software between 2016 and 2018 in some suburbs of south-east Melbourne.

They were using it to try to identify teenagers and children as young as ten who were supposedly at high risk of committing crimes.

Police around the world are increasingly using this kind of software because, in theory, it saves resources.

You don’t need to spend as much time looking for crime if an algorithm can just tell you who’s going to be committing it.

But there are some pretty obvious problems with this system, and probably the most important one is that it creates a sort of feedback loop.

Jake Goldenfein: “If you have a group of children and you say, ‘these kids are more likely to commit crimes moving forward’, you’re effectively painting a big ‘X’ on their back and telling police, ‘Go dig here to find crime’ … So they’re not predicting crime, they’re predicting the likelihood of an arrest.”

That's Jake Goldenfein, an expert in these kinds of programs.

One senior Victoria Police officer said that the software they used was 95% accurate. But it's easy to see how police could make the program look accurate simply by hounding the people it picks out: flag a kid, watch them closely enough, and an arrest becomes far more likely.
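To make that feedback loop concrete, here's a toy simulation of our own — a sketch, not anything Victoria Police actually ran. Two hypothetical neighbourhoods have exactly the same real crime rate, but patrols are allocated according to past arrest figures. Because police can only record crime where they're actually looking, the arrest data ends up confirming its own initial skew.

```python
import random

# Toy simulation of the predictive-policing feedback loop (illustrative
# only; this is NOT the software Victoria Police used). Neighbourhoods
# A and B have exactly the same underlying rate of offending.
random.seed(0)

TRUE_CRIME_RATE = 0.05   # chance a single patrol witnesses an offence
PATROLS_PER_DAY = 100    # total patrols the force can field each day

# The historical data starts with a tiny skew: one extra arrest in A.
arrests = {"A": 2, "B": 1}

for day in range(365):
    total = arrests["A"] + arrests["B"]
    for hood in ("A", "B"):
        # The "prediction": send patrols wherever past arrests happened.
        patrols = round(PATROLS_PER_DAY * arrests[hood] / total)
        # Police only record crime in places they are actually watching.
        arrests[hood] += sum(
            1 for _ in range(patrols) if random.random() < TRUE_CRIME_RATE
        )

print(arrests)
```

Run it and neighbourhood A ends the year with roughly twice as many recorded arrests as B, purely because it started with one more arrest on the books — the system "validates" itself without real offending ever differing.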

Technological Bias

This type of predictive technology is generally based on how many times a person has been arrested and charged, not how many times they’ve been convicted of a crime.

That can really exacerbate biases that already exist in policing.

JG: “There is evidence that crime committed by particular racial and ethnic groups gets reported more … So there’s an unequal reporting of crime amongst different class, economic, and ethnic or racial groups.”

There's a predictive policing scheme in New South Wales called the Suspect Targeting Management Plan that's been heavily criticised for disproportionately targeting First Nations kids.

In 2015, 44% of the targets on the New South Wales Police list were Aboriginal or Torres Strait Islander.

Victoria Police said that the predictive tool wasn’t widely used in Melbourne because they were worried about these biases.

But even if that particular program has ended, it's pretty much impossible to tell whether law enforcement agencies in Australia are using any of this technology.

That kind of secrecy means two things: we can't be sure how law enforcement is operating, and there's no accountability when things go wrong.

Nobody in the community knows how the Suspect Targeting Management Plan in New South Wales actually works, because police have said that explaining it would compromise their intelligence.

Jake said that there really needs to be more oversight, because this is a problem that will only get bigger as the technology develops.

JG: “Why not let people understand technological tools police are using and how they work, so that we can evaluate collectively as a society whether we think this is the kind of thing that we want … We are unable to understand how we’re policed and governed and that’s pretty anti-democratic.”

And even beyond that, maybe we should be allowed to question whether we want these programs used at all.

JG: “Do we really think that we can predict dangerousness and do we get better social outcomes doing that? … These systems are harmful when they don’t work because they’re biased, but they’re probably also harmful when they work perfectly.”

The Takeaway

These predictive policing programs are full of problems that make the biases that already exist in the world even more entrenched and damaging.

And the fact that Australians are being denied the ability to even scrutinise this technology is more concerning still, particularly because it's only going to become more widespread in the future.