Professionalism/Predictive Policing
Background
Predictive policing is a law enforcement practice that uses mathematical, predictive, and analytical methods to anticipate and prevent potential crimes.
Minority Report
One of the chief popularizers of predictive policing was Philip K. Dick's 1956 short story The Minority Report, along with the 2002 film of the same name. The short story takes place in a future society in which nearly all crime has been eliminated. This feat was accomplished by the Precrime Division, created and headed by the protagonist, John Anderton. The Precrime Division makes use of three mutants, called "precogs," to arrest would-be criminals before they can commit their crimes.
“You’ve probably already grasped the basic legalistic drawback to precrime methodology. We’re taking in individuals who have broken no law.”
“But surely, they will,” Witwer affirmed with conviction.
“Happily, they don’t—because we get to them first, before they can commit an act of violence. So the commission of the crime itself is absolute metaphysics. We can claim they are culpable. They, on the other hand, can eternally claim they’re innocent. And, in a sense, they are innocent.”
— Philip K. Dick, The Minority Report
This story accurately predicts many issues facing predictive policing today. Rather than only forecasting crimes, the three "precog" mutants mindlessly babble about everything they can see in the future. This creates an intractable volume of data, envisioning one of the problems of big data decades in advance. Dick correctly predicts that this volume of data will be processed by machines. Moreover, he predicts that big data analysis will have an inherent uncertainty due to the statistics involved: the mutants' predictions often conflict, and a "majority report" is all that's needed to make an arrest.
Despite the uncertainty of the Precrime system, Anderton believes completely in its authority—until it accuses him of murdering a man he has never met, Leopold Kaplan. This demonstrates an important bias: people tend to support a system that harms others until it harms them. In modern predictive policing, systems accused of bias against certain populations may still be accepted by officers, and the officials above them, who are outside the affected groups.
Another lesson comes from the story's anticlimactic ending. When Anderton himself examines the data behind his accusation, it becomes clear that the system has made a mistake. This points to the dangers of over-reliance on predictive models. According to researcher Sonja B. Starr, the literature suggests "that providing judges with risk predictions that are framed as scientific and data driven will likely increase the weight placed on them."[1] It is essential, therefore, to ensure that the limitations of such predictions are well understood.
PredPol
PredPol is one company that offers this type of software to police departments. PredPol claims that crimes tend to cluster, and its algorithm attempts to replicate this perceived behavior using a self-exciting point process model[2]. The algorithm takes as input a police department's record management system (RMS) data and outputs 500-by-500-foot regions on a map where it predicts crime is most likely to happen. It uses three basic data points from the department's RMS: crime type, crime location, and crime date/time. Police departments typically use PredPol's output to place officers in these high-risk areas for a small fraction of their shifts, depending on the officers' availability. The officers' presence is said to deter crime in these areas.
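PredPol has not published its exact model, but the self-exciting point process idea it cites can be sketched in a few lines: each recorded crime temporarily raises the predicted rate in its own grid cell, which is how the model captures clustering. The parameter values, function names, and grid logic below are illustrative assumptions, not PredPol's actual implementation.

```python
import math
from collections import defaultdict

CELL_FT = 500.0  # side length of each grid cell, matching PredPol's 500 ft boxes
MU = 0.1         # assumed background rate of crime per cell per day
THETA = 0.5      # assumed strength of self-excitation
OMEGA = 0.1      # assumed decay rate (1/days) of the excitation

def cell_of(x_ft, y_ft):
    """Map a location in feet to a 500 ft x 500 ft grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

def hotspot_scores(events, now_days):
    """events: (x_ft, y_ft, t_days) triples from the RMS (location and
    date/time; crime type would be used to filter the list beforehand).
    Each cell's score is a background rate plus exponentially decaying
    boosts from past crimes in the same cell."""
    scores = defaultdict(lambda: MU)
    for x, y, t in events:
        if t <= now_days:
            scores[cell_of(x, y)] += THETA * OMEGA * math.exp(-OMEGA * (now_days - t))
    return scores

# Rank cells and patrol the top few, as departments do with PredPol's boxes.
events = [(120, 480, 1.0), (140, 300, 5.0), (2600, 2700, 6.0)]
ranked = sorted(hotspot_scores(events, now_days=7.0).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked[:2])
```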
Proponents
Police departments have generally reported good results after implementing predictive policing software. Santa Cruz, for example, saw burglaries decrease by 11% and robberies drop by 27% in the first year, though some suggest these numbers may reflect normal crime-rate fluctuations[3]. In 2012, 70% of 200 police agencies surveyed about their use of predictive policing technologies said they planned to increase usage within the next two to five years[4]. With PredPol's user base expanding to more than 60 police departments, including its largest clients, Los Angeles and Atlanta, predictive policing may become a prevalent trend in modern law enforcement.
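Whether an 11% drop exceeds normal fluctuation depends on the underlying counts, which the sources above do not give. A back-of-the-envelope simulation, assuming a purely hypothetical baseline of 100 burglaries per year, shows how easily such a drop can arise by chance:

```python
import random

# Toy check of the "normal fluctuation" objection: how often does an 11%
# one-year drop happen with no real change? The baseline of 100 burglaries
# per year is a hypothetical assumption, not Santa Cruz's actual figure.
random.seed(0)
BASELINE = 100
TRIALS = 10_000
drops = 0
for _ in range(TRIALS):
    # Normal approximation to a Poisson(BASELINE) yearly count.
    year = random.gauss(BASELINE, BASELINE ** 0.5)
    if year <= BASELINE * 0.89:  # a drop of 11% or more
        drops += 1
print(f"Chance of an 11%+ drop with no real change: {drops / TRIALS:.0%}")
```

At this baseline the drop occurs by chance in roughly one run in seven; with several hundred crimes per year it would be far rarer. The raw percentages alone cannot settle the question.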
Opponents
Many are not convinced that predictive policing software like PredPol will work as expected. Although PredPol's supporters stress that it serves only as a helpful crime deterrent, some claim that labeling areas as high-risk may lead the officers patrolling them to exaggerate the danger[3]. Others suggest that because predictive policing does not actually stop crime, it does nothing to fix an area's underlying problems. Indeed, statistics showing declining crime rates may be misleading: criminals may simply have moved to another area.
One large concern is that predictive policing software may reinforce bad habits and amplify biases[5], creating a slippery slope toward racial discrimination. In particular, if the software reaches far enough back into a police department's RMS data, it is likely to find arrest records shaped by personal biases, and its predictions may be racially discriminatory as a result. If police officers, in turn, rely too heavily on the system, these biases could produce more prejudiced arrests, further contaminating the data fed back into the software.
In addition, while PredPol has a broad U.S. user base, some police departments have been reluctant to adopt algorithmic policing. In 2014, Birch, the new head of research and planning at the Oakland Police Department, lobbied for the use of PredPol. He was unfamiliar with the community but felt predictive policing could augment the police force. However, the assistant chief, Paul Figueroa, argued against Birch's proposal. In the following months, while Birch awaited $100,000 in funding to implement PredPol, he began to learn about the community. By the time the funds were approved, he had come to recognize the flaw Figueroa had identified earlier: the technology would increase distrust of police officers. Based on this, and on news of other Bay Area police departments abandoning the technology, Birch rescinded his request for funding.[6]
Similar Technologies
Northpointe Criminal Risk Assessment
The Northpointe Criminal Risk Assessment program uses statistical data to estimate the chance that an individual convict will reoffend. Unlike PredPol, the Northpointe score operates at the individual level and informs decisions such as parole approval. Because this makes it possible to single out specific individuals, ProPublica conducted a study to investigate the possibility of bias.[7] ProPublica released a long report outlining the statistics behind what it claims is proof of bias. While the statistics admit multiple interpretations, the study highlighted a few individual cases where the software clearly failed. In one case, a black woman convicted of a minor offense as a teenager was given a much higher score than a white man convicted of armed robbery. The woman has never been arrested again, while the man has returned to prison on similar charges. It is difficult to determine whether this was due to a glitch or to deeper bias in the software (many predictive programs are "black boxes" whose algorithms are hidden from users), but ProPublica raised valuable concerns about its use in deciding people's futures.
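ProPublica's central statistical claim was a disparity in error rates: black defendants who did not reoffend were flagged as high-risk far more often than white defendants who did not reoffend. A minimal sketch of that comparison follows, using a handful of invented records in place of ProPublica's roughly 7,000 Broward County cases:

```python
# Hypothetical records: (group, labeled_high_risk, reoffended).
records = [
    ("black", True,  False), ("black", True,  True),
    ("black", False, False), ("black", True,  False),
    ("white", False, False), ("white", False, True),
    ("white", True,  True),  ("white", False, True),
]

def false_positive_rate(group):
    """Share of non-reoffenders in the group wrongly labeled high-risk."""
    negatives = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

for g in ("black", "white"):
    print(g, f"FPR = {false_positive_rate(g):.0%}")
```

A score can be well calibrated overall and still show this kind of error-rate gap, which is one reason the same statistics support multiple interpretations.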
Ethical Issues
Issues Affecting Civilians
As the PredPol and Northpointe examples show, predictive policing raises many potential ethical issues. One of the biggest criticisms is that regardless of whether the algorithm itself is biased, the data it consumes are inherently skewed, owing to historical trends of certain groups being unfairly targeted by law enforcement. If such data are used in predictive policing, police will target those areas more intensively, creating a feedback loop of bias. It is therefore very difficult to guard against biased historical data.
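A toy simulation makes the feedback loop concrete. Assume two neighborhoods with identical true crime rates but skewed historical records, a single patrol that always follows the records, and crime that is recorded only where officers are present. Every number below is invented for illustration:

```python
import random

# Feedback-loop sketch: neighborhood A starts with more recorded incidents
# only because it was historically policed more heavily.
random.seed(1)
TRUE_RATE = 0.3               # equal daily chance of a crime in each area
records = {"A": 30, "B": 10}  # skewed historical RMS data

for day in range(365):
    patrolled = max(records, key=records.get)  # send the patrol where the data say
    for hood in records:
        if random.random() < TRUE_RATE and hood == patrolled:
            records[hood] += 1  # only observed crime enters the RMS
print(records)  # A climbs to roughly 140 while B stays at 10
```

Even though both neighborhoods generate crime at the same rate, the records diverge, and the data keep "confirming" the original bias.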
Furthermore, to protect their intellectual property, the companies that develop these algorithms do not release the technical details. While this is an understandable business practice, it breeds distrust, since it makes it impossible for outsiders to judge the ethics of the developers' implementations. Because these programs are in their infancy, it could be years before whistle-blowers reveal bias in the software, or before more conclusive evidence emerges that it actually helps reduce crime.
Issues Affecting Officers
While it is tempting to consider only predictive policing's impact on civilians, it is important to consider the ramifications for police officers as well. In Burbank, California, a department-wide survey found that 75% of officers had "low or extremely low" morale, due in part to the department's use of PredPol. Sergeant Claudio Losacco explained, "It's like telling a fisherman of 20 years that we're going to tell you how to fish."[8] As a result, officers in the city are no longer required to spend time in the highlighted boxes[9].
Another matter of concern for officers is automation. On PredPol's homepage, Los Angeles Police Chief Charlie Beck is quoted as saying, "I’m not going to get more money. I’m not going to get more cops. I have to be better at using what I have, and that’s what predictive policing is about."[10] While this certainly sounds appealing from an administrator's perspective, it should raise concerns for patrol officers. Beck's statement has a corollary: if the promises of predictive policing come to fruition, demand for officers in their current roles may fall. Police officers are then placed in a difficult position: if predictive policing allows more crime to be stopped, they should support it, even at the cost of their own livelihoods. This issue of automation leading to greater productivity, and to fewer needed workers, extends to virtually all professions.
References
1. Starr, S. (2014). Evidence-Based Sentencing and the Scientific Rationalization of Discrimination. Stanford Law Review, 66(4), 803–872.
2. PredPol. (2015). How PredPol Works. Retrieved May 08, 2017, from http://www.predpol.com/how-predpol-works/
3. Huet, E. (2015, February 11). Server and Protect: Predictive Policing Firm PredPol Promises to Map Crime Before It Happens. Forbes. Retrieved May 08, 2017, from https://www.forbes.com/sites/ellenhuet/2015/02/11/predpol-predictive-policing/#5c37d1b64f9b
4. Police Executive Research Forum. (2014). Future Trends in Policing. Washington, D.C.: Office of Community Oriented Policing Services.
5. Smith IV, J. (2016). Crime-Prediction Tool PredPol Amplifies Racially Biased Policing, Study Shows. Mic. Retrieved May 08, 2017, from https://mic.com/articles/156286/crime-prediction-tool-pred-pol-only-amplifies-racially-biased-policing-study-shows#.9Voi4FTYF
6. Thomas, E. (2016, December 28). Why Oakland Police Turned Down Predictive Policing. Motherboard. Retrieved May 08, 2017, from https://motherboard.vice.com/en_us/article/minority-retort-why-oakland-police-turned-down-predictive-policing
7. Angwin, J., Mattu, S., Larson, J., & Kirchner, L. (2016, May 31). Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Blacks. ProPublica. Retrieved May 08, 2017, from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
8. Smith IV, J. (2016, October 10). Crime-Prediction Tool May Be Reinforcing Discriminatory Policing. Business Insider. Retrieved from http://www.businessinsider.com/predictive-policing-discriminatory-police-crime-2016-10
9. Tchekmedyian, A. (2016, October 4). Police Push Back Against Using Crime-Prediction Technology to Deploy Officers. Los Angeles Times. Retrieved from http://www.latimes.com/local/lanow/la-me-police-predict-crime-20161002-snap-story.html
10. PredPol. Home page. http://www.predpol.com/