Leak Shows Crime Prediction Software Targets Black and Latino Neighborhoods

Here’s some news from the beginning of the month that I missed. Gizmodo and The Markup analyzed PredPol, a crime prediction software used in the U.S.

Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went years without a single crime prediction.

By contrast, neighborhoods the software targeted for increased patrols were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.

Check It Out: Leak Shows Crime Prediction Software Targets Black and Latino Neighborhoods

2 thoughts on “Leak Shows Crime Prediction Software Targets Black and Latino Neighborhoods”

  • Andrew:

    This appears to be a study in bias. It is extremely difficult for any individual, or collection of individuals who share common reference frames, to recognise their own assumptions and therefore their biases. As political scientists have pointed out, this is what condemns most would-be revolutionaries to replicate the very systems they overthrew, with the same results and only new players.

    It appears that the creators of this platform may well have fed into their algorithms the variables and parameters associated with heavily patrolled areas where crimes have historically been identified, in order to predict where future crimes are most likely to be committed. The predictable result is that crimes are forecast as imminent in precisely those areas that are already heavily patrolled. Anyone who collects socioeconomic status (SES) data knows that, without collecting data on race or ethnicity at all, you can still identify with high precision the areas where particular ethnic groups reside. It is recursive.

    In any case, this highlights the importance of a diverse set of designers and reviewers who are more likely than a less diverse set to identify potential bias blind spots.

  • Is the software racist, or does it reflect a racist society?

    Minorities are forced to live in lower-income neighborhoods, and thus are forced to live with more crime and violence than whites on average.

    How is the software to know how to compensate for this? If it were to make it look like there’s no difference between white and minority neighborhoods, that’s when I would assume the software had been modified illegitimately. Not because minorities are more likely to commit crimes based on who they are, but because they are victims of racism to this day.

    Correlation is not causation.
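The recursion the first commenter describes can be sketched as a toy simulation. This is only an illustrative model, not PredPol's actual algorithm, and every number in it is hypothetical: two neighborhoods with the same true crime rate, where patrols follow past recorded crime and new records follow patrols.

```python
# Toy sketch of a predictive-policing feedback loop (all numbers hypothetical).
# Both neighborhoods have the SAME underlying crime rate; "A" merely starts
# with more recorded crime because it was historically patrolled more.
TRUE_RATE = 0.1                    # identical actual crime rate in both areas
recorded = {"A": 30.0, "B": 10.0}  # biased historical record

for year in range(10):
    total = sum(recorded.values())
    for hood in recorded:
        # Patrols are allocated in proportion to past recorded crime...
        patrols = 100 * recorded[hood] / total
        # ...and new recorded crime is proportional to patrol presence.
        recorded[hood] += patrols * TRUE_RATE

print(f"A/B recorded-crime ratio after 10 years: "
      f"{recorded['A'] / recorded['B']:.2f}")
# prints 3.00 — the initial 3:1 disparity never washes out,
# even though the true crime rates are identical
```

Even in this simplest deterministic version, the historical bias perpetuates itself indefinitely: the record-driven patrols reproduce exactly the disparity they were seeded with.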
