The rapid growth in the use of computer programs to predict crime hotspots and people who are likely to reoffend risks locking discrimination into the criminal justice system, a report has warned.
Amid mounting financial pressure, at least a dozen police forces are using or considering predictive analytics. Leading police officers have said they want to make sure any data they use has “ethics at its heart”.
But a report by the human rights group Liberty raises concern that the programs encourage racial profiling and discrimination, and threaten privacy and freedom of expression.
And just what is this amazing new technology actually doing?
The programs used by police work in two main ways. Firstly, predictive mapping looks at police data about past crimes and identifies “hotspots” — areas likely to experience more crime — on a map. Police officers are then directed to patrol these areas.
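For what it's worth, the “predictive mapping” described here amounts to little more than counting where crimes happened before. A toy sketch — the incident coordinates, grid size, and function name are invented for illustration, not drawn from any actual police system:

```python
# Hypothetical sketch of "predictive mapping": bin past incident
# coordinates into grid cells and flag the busiest cells as hotspots.
from collections import Counter

def find_hotspots(incidents, cell_size=1.0, top_n=2):
    """Group (x, y) incident locations into grid cells and return
    the top_n cells containing the most past incidents."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in incidents
    )
    return [cell for cell, _ in counts.most_common(top_n)]

# Invented example: a cluster of past incidents near (0, 0) and one outlier.
past_incidents = [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4), (5.5, 5.1)]
print(find_hotspots(past_incidents))  # the (0, 0) cell ranks first
```

Real deployments dress this up with regression models and decay weights on older crimes, but the core logic is the same: yesterday's hotspots are predicted to be tomorrow's.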
Right. Predicting that there are crime hotspots. We really need a machine to do this?
Secondly, “individual risk assessment” tries to predict the likelihood of a person committing, or even being the victim of, certain crimes.
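And the “individual risk assessment” side is, at bottom, a weighted checklist. A hypothetical sketch — the risk factors and weights below are invented for illustration and bear no relation to any real force's scoring system:

```python
# Hypothetical sketch of "individual risk assessment": sum the weights
# of whichever risk factors are recorded against a person.

def risk_score(person, weights):
    """Return the total weight of the risk factors present for a person."""
    return sum(w for factor, w in weights.items() if person.get(factor))

# Invented factors and weights, purely for illustration.
WEIGHTS = {"prior_offences": 3, "known_associates": 2, "missed_curfews": 1}

person_a = {"prior_offences": True, "missed_curfews": True}
print(risk_score(person_a, WEIGHTS))  # prints 4
```

Note that such a score is only as good as the recorded factors — which is exactly where Liberty's worry about baked-in bias comes from: if the input data reflects skewed policing, so does the score.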
Right. Predicting that certain individuals have overwhelming risk factors. Again, we really need a machine to do this? Albeit one that can do it faster.
Methinks the Liberty wonks doth protest too much.