The British newspaper “Daily Mail” notes that the model is controversial, as it does not take into account systemic biases in police enforcement and their complex relationship with crime and society; similar systems have been shown to perpetuate racial bias in police work, a bias this model could replicate in practice.
But the data scientists at the University of Chicago who created this computer model, using public data from eight major US cities, counter that their model can be used to expose such bias — their analysis found that police responsiveness in poorer areas was disproportionately lower than in wealthier neighbourhoods — and say it should only be used to guide current police strategies.
In 2016, the Chicago Police Department experimented with an algorithm that produced a list of the people most at risk of being involved in a shooting, either as a victim or as a perpetrator.
Lawrence Sherman, of the Cambridge Center for Evidence-Based Policing, told New Scientist that he was concerned that the model looked at data vulnerable to bias.
The computer model itself was trained on historical data on criminal incidents in the city of Chicago from 2014 to the end of 2016, and was then used to predict crime levels in the weeks following that training period.
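The train-then-forecast workflow described above can be illustrated with a minimal sketch. Note that this is not the researchers' actual model (the article gives no implementation details); the `forecast_next_weeks` function, the moving-average baseline, and the synthetic weekly counts are all assumptions introduced purely to show the shape of the task: fit on a historical window, then predict the weeks that follow.

```python
# Toy illustration of the workflow in the article: train on historical
# weekly incident counts, then forecast the following weeks.
# A naive moving-average forecaster stands in for the real model.

def forecast_next_weeks(weekly_counts, horizon=4, window=8):
    """Predict the next `horizon` weekly counts, each as the mean of
    the last `window` weeks (observed or previously predicted)."""
    history = list(weekly_counts)
    predictions = []
    for _ in range(horizon):
        recent = history[-window:]
        pred = sum(recent) / len(recent)
        predictions.append(pred)
        history.append(pred)  # roll the window forward
    return predictions

# Synthetic weekly incident counts for one hypothetical city area
train = [30, 28, 35, 31, 29, 33, 32, 30, 34, 31]
preds = forecast_next_weeks(train, horizon=2)
```

A real system would work at a much finer spatial grain (the article's model used public data from eight cities) and with a far more sophisticated statistical model, but the evaluation idea is the same: hold out the weeks after the training period and compare predictions against what actually happened.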