Being mistakenly branded as a criminal can ruin the lives of youths and adults alike. According to Michelle Alexander, the author of “The New Jim Crow,” employment discrimination severely diminishes the job prospects of people with convictions. Whether they spend three weeks, 10 months, or 30 years in prison, they are still required to disclose their prior felony status to employers.
According to the National Institute of Justice, “predictive policing tries to harness the power of information, geospatial technologies and evidence-based intervention models to reduce crime and improve public safety.” The biggest threat of predictive policing is that the information it depends on is racially prejudiced and incomplete. Because the method works in certain cases, however, I do not encourage a complete ban on predictive policing. Instead, I would apply caution when using such a system.
Due to declining resources, predictive policing was introduced to help departments focus on areas with higher crime rates. A flaw in this proposal is that the underlying analytics can be racially skewed or outdated. Faiza Patel, the co-director of the Liberty and National Security Program at the Brennan Center for Justice at New York University Law School, argues that the predictive method conflicts with the constitutional requirement that police target people only when their actions arouse suspicion. That requirement does not extend to reliance on AI-generated probabilities.
Utilizing these trends will promote racial bias. Instead of discovering ways to reduce skewed crime statistics, predictive policing will increase enforcement in targeted areas. As a result, the new cases from those areas are fed back into the already distorted analytics, creating a never-ending cycle that can worsen the relationship between the government and minority communities.
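The feedback cycle described above can be illustrated with a minimal sketch. The numbers here are hypothetical: two neighborhoods are assumed to have identical true crime rates, but the historical arrest data over-represents neighborhood A, and patrols are allocated in proportion to that recorded data.

```python
# Hypothetical sketch of the feedback loop: equal true crime rates,
# but biased historical data that drives patrol allocation.
TRUE_CRIME_RATE = 0.1              # assumed identical in both neighborhoods
recorded = {"A": 60.0, "B": 40.0}  # assumed biased historical arrest counts

for year in range(10):
    total = sum(recorded.values())
    # patrols are allocated in proportion to the recorded (biased) data
    patrols = {n: 100 * count / total for n, count in recorded.items()}
    for n in recorded:
        # more patrols -> more recorded incidents, regardless of the true rate
        recorded[n] += patrols[n] * TRUE_CRIME_RATE

share_a = recorded["A"] / sum(recorded.values())
print(round(share_a, 2))  # prints 0.6: A's inflated share never corrects itself
```

Even though both neighborhoods commit crime at the same rate, neighborhood A's share of recorded incidents stays at its inflated starting value forever, because each year's enforcement is proportional to the distorted history it then reinforces.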
Predictive policing is evidently in its infancy. Thus, analysts must thoroughly investigate the roots of crimes rather than rely solely on the data they are given, and they must not hastily infer cause and effect. Even if an AI reports that many crimes occur between certain hours, that may simply mean other events are taking place in the same setting. Police departments should follow the St. Louis County Police Department’s approach to discovering the main causes of crimes. According to the Marshall Project, that department meets on a weekly basis to compare various regions’ susceptibility to distinct offenses.
In Los Angeles, the police department is using Palantir and predictive algorithms to direct its search for crime. Deputy Chief Sean Malinowski defended the use of data analytics, claiming that the department’s resources are scarce. However, the method is essentially flawed because the data are gathered by law enforcement itself, resulting in implicit bias.
The main issue with predictive policing is that it accentuates structural violence. Recorded crimes are concentrated in neighborhoods populated mostly by minorities, but that does not mean all of their residents are guilty. Typically, the same people get arrested multiple times for the same crimes. If algorithms were used to gauge the number of arrests in a community, officers would develop an inclination to wrongly suspect minority residents of potential crimes, even when those residents’ actions were not suspicious. The cycle of injustice will prevail, even with the use of advanced technology. Predictive policing is filled with imperfections that can still be fixed. Until the system can be judiciously framed, moral exclusion will remain the root of this repression.