Predictive Policing in Phoenix, Arizona
Most people aren’t aware that a method of policing straight out of science fiction is alive and in enforcement in Phoenix. Predictive policing, similar to what was depicted in Minority Report and various other sci-fi films and books, seems to put more people at risk than it protects. So what happens when someone is the subject, or possibly the victim, of predictive policing?
Back in 2015, Senator Steve Smith sponsored a predictive policing bill that allotted over $1,000,000 to several Arizona police departments. The pilot program took place in Phoenix, Mesa, Sierra Vista, and Maricopa. The question is, how has predictive policing affected these communities over the last five years?
Is AI Contributing to Racial Profiling?
The two sides of the argument point to the same problem. One side claims that AI is contributing to racial profiling because the historical data inferred trends showing greater crime in specific neighborhoods, neighborhoods whose racial makeup may have changed over the years.
The other side argues that the AI perpetuates racial profiling by targeting, or attempting to predict, crimes only in specific neighborhoods. Either way, the root of the matter is concerning: predictive policing concentrates reporting on very specific neighborhoods, and the resulting rise in crime reports from those neighborhoods shows that there’s a problem.
Predictive Policing Methods
How does predictive policing work? There is a small handful of software options that police departments use to input their recent data. The software uses that historical data to discover hot spots of crime within the area. Police then become more present in those hot spots, increasing surveillance and analyzing the social networks of potential criminals.
The trouble here is that everything hinges on the idea of potential. Many bills for predictive policing, including Arizona’s, state that it should serve to protect potential victims. But how can you protect a potential victim if there is no crime?
These software options generate clusters where they predict a high volume of criminal activity. Essentially, the software is doing what police have done intuitively for decades: putting focus on areas that are known for high volumes of crime.
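To make the clustering idea concrete, here is a minimal sketch of how hot-spot detection can work in principle. This is not the code of any actual predictive policing product (the article does not name one); it assumes a simple grid-binning approach, and all coordinates and thresholds below are invented for illustration.

```python
# Hypothetical sketch of grid-based "hot spot" detection: bin historical
# incident coordinates into grid cells, then flag cells whose incident
# count crosses a threshold. Real products are far more complex.
import math
from collections import Counter

def hot_spots(incidents, cell_size=0.01, threshold=3):
    """Return the set of grid cells with at least `threshold` incidents.

    incidents: iterable of (latitude, longitude) pairs
    cell_size: grid cell width in degrees (illustrative value)
    """
    counts = Counter(
        (math.floor(lat / cell_size), math.floor(lon / cell_size))
        for lat, lon in incidents
    )
    return {cell for cell, n in counts.items() if n >= threshold}

# Fabricated example: four incidents cluster near one downtown location,
# one incident is isolated. Only the cluster's cell is flagged.
incidents = [
    (33.4484, -112.0740),
    (33.4486, -112.0741),
    (33.4490, -112.0738),
    (33.4487, -112.0743),
    (33.5100, -112.1000),  # isolated incident, never reaches threshold
]
print(hot_spots(incidents))  # one flagged cell
```

The sketch also shows the core weakness the article describes: the output depends entirely on which historical incidents are fed in, so biased or outdated input data produces biased clusters.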
Can You Defend Yourself Against AI and Predictive Policing?
The majority of predictive policing software and programs rely on outdated information. A number of civil liberties groups have dug deep into the analytics of AI and predictive policing software. What they’ve determined is that many of the clusters found are either wrong or biased. How many people have had negative interactions with the police, faced charges, or experienced incarceration because of inaccurate or outdated predictive policing?
If you were arrested on charges because of predictive policing, it is likely that racial profiling had a substantial hand in that decision. Predictive policing technology has resulted in lazy police work, clear violations of privacy, misunderstandings between police and the community, and an increase in racial profiling. If predictive policing technology had anything to do with your charges, then you need an Arizona criminal trial attorney.