AI Is Racist!

The rapid growth in the use of computer programs to predict crime hotspots and people who are likely to reoffend risks locking discrimination into the criminal justice system, a report has warned.

Amid mounting financial pressure, at least a dozen police forces are using or considering predictive analytics. Leading police officers have said they want to make sure any data they use has “ethics at its heart”.

But a report by the human rights group Liberty raises concern that the programs encourage racial profiling and discrimination, and threaten privacy and freedom of expression.

And just what is this amazing new technology actually doing?

The programs used by police work in two main ways. Firstly, predictive mapping looks at police data about past crimes and identifies “hotspots”, areas on a map that are likely to experience more crime. Police officers are then directed to patrol these areas.
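For the curious, the “hotspot” bit is hardly rocket science. Here is a minimal sketch of the idea (my own illustration, not any force's actual system): bin past crime locations into grid cells and rank the busiest ones.

```python
from collections import Counter

# Hypothetical past crime records: (x, y) coordinates of each incident.
past_crimes = [(1, 2), (1, 2), (1, 3), (5, 5), (1, 2), (5, 5), (9, 0)]

def hotspots(crimes, cell_size=1, top_n=2):
    """Bin crime locations into grid cells and return the busiest cells."""
    counts = Counter((x // cell_size, y // cell_size) for x, y in crimes)
    return counts.most_common(top_n)

print(hotspots(past_crimes))
# The cell with the most past incidents comes first.
```

In other words: count pins on a map, then send officers where the pins cluster.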

Right. Predicting that there are crime hotspots. We really need a machine to do this?

Secondly, “individual risk assessment” tries to predict the likelihood of a person committing, or even being the victim of, certain crimes.

Right. Predicting that certain individuals have overwhelming risk factors. Again, we really need a machine to do this? Admittedly, it can do it faster.
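And “individual risk assessment” is, at heart, just an actuarial checklist: a weighted sum of risk factors squeezed into a probability. A toy sketch, with entirely invented factors and weights (not any real tool's):

```python
import math

# Invented risk factors and weights for illustration only.
WEIGHTS = {"prior_convictions": 0.8, "age_under_25": 0.5, "unemployed": 0.3}
BIAS = -2.0

def reoffending_risk(person):
    """Weighted sum of factors through a logistic function: a 0-1 score."""
    score = BIAS + sum(WEIGHTS[k] * person.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-score))

print(round(reoffending_risk({"prior_convictions": 3, "age_under_25": 1}), 2))
```

Any probation officer with a clipboard could do the same sum; the machine just does it at scale.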

Methinks the Liberty wonks doth protest too much.

12 comments for “AI Is Racist!”

  1. Lord T
    February 15, 2019 at 12:24 pm

    AI is as racist as we all are because facts are racist as far as the snowflakes are concerned.

    We don’t really use AI properly imo. This sort of data was being generated 100 years ago by sticking pins in maps and looking to see where the most pins are. Any plod in an area would be able to tell you where the hotspots are off the top of their head.

    AI however is already showing its worth by analysing social media and predicting where mobs are going to be congregating with an intent to cause trouble. I would think the French Plod are using this already on those troublesome yellow vests. This could also be done by a person but a lot slower and AI can make the prediction that can be checked and confirmed to help it refine its ‘thinking’.

    The true power of AI is unlikely to be recognised here for some time because, unlike that treasonous May, most of the government don’t want people locked up because of a Minority Report-type prediction. That is a good thing. AI is totally uninterested in guilt or innocence, only probabilities.

    • February 17, 2019 at 7:25 am

      “Any plod in an area would be able to tell you where the hotspots are off the top of their head.”

      When they walked a beat, maybe…

  2. Stonyground
    February 15, 2019 at 7:39 pm

    I think that racist computers have the potential to be very interesting. For instance, I believe that in the US far more black people are imprisoned than are white people. Now this could be because the police and the courts are biased against black people. It could also be because black people are more inclined to be criminals. The truth could be one or the other, or somewhere in between. Isn’t a computer more likely to get to the bottom of the matter by being demonstrably free of prejudice?

    • Errol
      February 15, 2019 at 11:18 pm

      It’s down to demographics, unemployment and lifestyle.

      They’re poor because they’re uneducated. They’re uneducated because their parents can’t instill in them the value. Thus you get a demographic of poor, uneducated criminals that perpetuate through generations.

      • February 16, 2019 at 2:00 am

        That ‘value’ word is spelled incorrectly, sir. It should read, VIRTUE.

      • February 17, 2019 at 7:26 am


  3. Ted Treen
    February 15, 2019 at 10:28 pm

    “…The rapid growth in the use of computer programs to predict crime hotspots and people who are likely to re-offend risks locking discrimination into the criminal justice system…”

    Of course it’s discriminatory:- discriminatory against criminals & recidivists. Isn’t that the bloody point? God give me strength.

  4. Errol
    February 15, 2019 at 11:15 pm

    Encourage discrimination and racial profiling – so if 30 black kids are knifing one another, we should look at the white one.

    For goodness sake. The reason these kids are killing one another is because we, the humans, are pretending that there isn’t a problem. Because we are NOT profiling the black youths.

    But hey. Who cares about the body count. Let’s be PC and ignore the blacks killing one another.

    This is why humans shouldn’t be allowed to run things. They’re dumb and won’t do what needs to be done.

    • February 17, 2019 at 7:27 am

      The latest cause célèbre is the banning of drill rap videos, which is the same thing as adults railing against Elvis Presley’s hip movements… *rolls eyes*

  5. Valentine Gray
    February 16, 2019 at 8:56 am

    Yes, the police use AI because they are too dumb to work it out, shackled with “Political Correctness” and “Common Purpose”; these two moral illnesses are the real threat, and they feed off low-IQ crime like your stabbers and muggers. Try dealing with criminals from a different social stratum: giving millions to phantom ferry companies, pharmaceutical bandits ruining the NHS and phoney referendums. Want to see real black crime? Go to Africa.

  6. Mark Matis
    February 16, 2019 at 1:36 pm

    AI is not “racist”. AI is merely HONEST. And that is not acceptable these days.

    • February 17, 2019 at 7:27 am

      Good point!

Comments are closed.