PredPol Crime Predictions Target Poor, Blacks, Latinos

This article was reported by The Markup.

More than one in 33 U.S. residents was potentially subject to police patrol decisions directed by crime-prediction software called PredPol.

The company that makes it sent more than 5 million of these crime predictions to law enforcement agencies across the country, and we found them on an unsecured server.

The Markup and Gizmodo analyzed them and found persistent patterns.

Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went without a single crime prediction.

Residents of the neighborhoods targeted for increased patrols were more likely to be Black, Latino, or members of families that qualify for the federal free and reduced-price lunch program.

In some cases, these communities were targeted relentlessly: thousands of crime predictions, made over the course of years, at multiple locations in the same neighborhood. A few neighborhoods each received more than 11,000 predictions.

The software recommended daily patrols in and around public and subsidized housing, targeting the poor.

Jay Stanley, a senior policy analyst at the American Civil Liberties Union's Speech, Privacy, and Technology Project, said more patrols are not what communities with troubled relationships with police need; they need resources to address basic social needs.

The pattern was repeated everywhere we looked.

In Michigan, PredPol recommended that police focus patrols on neighborhoods with nine times the proportion of Black residents as the city average. "It's giving them a reason to patrol these areas that are predominantly Black and Brown and poor folks," Bryant said.
The areas with the fewest crime predictions were overwhelmingly White, while the neighborhoods with the most predictions had double the city's share of Latino residents. An anti-hunger advocate said the heavier police presence compounds the harm those communities already experience.
In Los Angeles, even when crime predictions seemed to target a majority-White neighborhood, like the Northridge area, they were clustered on the blocks that are almost 100% Latino. The neighborhoods where the software recommended police spend the most time were disproportionately poor and more Latino than the city as a whole. Thomas A. Saenz, president and general counsel of the L.A.-based Latino civil rights group MALDEF, said these are the areas of L.A. that have historically had the greatest problems with biased policing.
PredPol recommended police focus patrols on neighborhoods with three times the Latino population and twice the low-income population of the city average. Bill Spirdione, executive director of the Common Ground food pantry and associate pastor of the Newlife Christian Assembly of God, works in one of those neighborhoods.
In the Chicago suburb of Elgin, Illinois, the neighborhoods with the lowest crime predictions had a larger share of families earning $200,000 a year or more than the city average; the neighborhoods with the most predictions had more low-income and more Latino residents than the city average. Adam Schuessler, the police department's deputy chief, said in an interview that he would call it bias-by-proxy policing. The department has since stopped using the software.

The more Black and Latino residents lived in an area, the more likely PredPol was to predict a crime there. The same disparity held between rich and poor areas.

Black and Latino populations were higher in the neighborhoods targeted by the prediction software.

Andrew Ferguson, a law professor at American University and a national expert on predictive policing, said no one had done this kind of analysis before: it is not a continuation of existing research but the first of its kind, which he called striking given that departments have been paying hundreds of thousands of dollars for the technology for a decade.

In most cases it is not possible to know whether officers actually patrolled the prediction areas, made arrests there, or used force as a result. The National Association of Criminal Defense Lawyers said its members are not informed when crime prediction software leads to charges, and the police departments that answered that question either couldn't recall such cases or said the predictions didn't result in any arrests.

The lack of information is a fundamental hurdle to providing a fair defense, according to Jumana Musa, director of that group's Fourth Amendment Center.

Musa said it was like trying to diagnose a patient without fully knowing the symptoms: the prosecution never discloses that the tool it bought from this company said police should patrol here.

Prosecutors don't know either, according to the National District Attorneys Association, because the issue has never come up in a case they've heard about.

The only patrol data we got from the Plainfield Police Department was a few days of PredPol-produced records indicating when officers were inside prediction boxes. The agency also provided arrest reports, though the data was imperfect.

We found the crime predictions for our analysis through a link on the Los Angeles Police Department's public website, which led to an open cloud storage bucket containing PredPol predictions not just for the LAPD but also for dozens of other departments. It held 7.4 million predictions when we downloaded the data on January 31, 2021. Public access to that page has since been blocked.

We analyzed only law enforcement agencies with at least six months of predictions and removed predictions that fell outside contract dates. That left 5.9 million predictions delivered to 38 agencies over roughly three years.
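The filtering step described above can be sketched in pandas. The column names, dates, and contract table here are invented for illustration; the cache's actual schema isn't public.

```python
import pandas as pd

# Hypothetical schema: one row per prediction, with agency and date.
preds = pd.DataFrame({
    "agency": ["a", "a", "b", "b", "b"],
    "date": pd.to_datetime(
        ["2019-01-01", "2019-02-01", "2018-06-01", "2019-01-01", "2019-12-01"]
    ),
})
# Hypothetical contract windows per agency.
contracts = {"a": ("2019-01-01", "2019-12-31"), "b": ("2019-01-01", "2019-12-31")}

# Step 1: drop predictions that fall outside each agency's contract dates.
def in_contract(row):
    start, end = contracts[row["agency"]]
    return pd.Timestamp(start) <= row["date"] <= pd.Timestamp(end)

preds = preds[preds.apply(in_contract, axis=1)]

# Step 2: keep only agencies whose remaining predictions span six months or more.
span = preds.groupby("agency")["date"].agg(lambda d: d.max() - d.min())
keep = span[span >= pd.Timedelta(days=182)].index
preds = preds[preds["agency"].isin(keep)]
print(sorted(preds["agency"].unique()))
```

In this toy data, agency "a" is dropped because its predictions span only a month, and agency "b" loses its out-of-contract 2018 row but survives the six-month cutoff.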

Who uses PredPol?

PredPol criticized our analysis as being based on reports found on the internet, but the company acknowledged that the predictions appeared to be generated by its software.

PredPol CEO Brian MacDonald said our data was incomplete and erroneous. He said one department had accidentally doubled up on some shifts, and that the cache included predictions for at least 20 departments that were never delivered to the agencies.

We explained that we had already discovered date discrepancies for 20 departments and had excluded that data from our final analysis, and we offered to share the analysis dates with him for confirmation. Instead, he offered to let us use the software for free on publicly available crime data rather than reporting on it. After we declined, he didn't reply to further emails.

Only 13 of the 38 departments responded to our requests for comment; several said in written statements that they no longer use PredPol.

The Decatur Police Department in Georgia defended the software, saying the program, combined with officers' knowledge of where crime is occurring, helps the department use its patrol resources more efficiently and effectively. Yet a third of Decatur's low-income households were concentrated in two neighborhoods that were the subject of more than 11,000 crime predictions in two years.

The average household income decreased as predictions increased.

None of the 38 agencies that used PredPol expressed concern about the demographic differences between the neighborhoods that received the most and least predictions.

We asked MacDonald whether he was concerned about the racial and income disparities our analysis found. He didn't address those questions directly, but he did say that the software mirrors reported crime rates in order to direct scarce police resources to the neighborhoods most at risk. The company has long held that because the software doesn't include race or other demographic information in its analysis, it eliminates the possibility for the privacy or civil rights violations seen with other intelligence-led or predictive policing models.

The co-founders of PredPol determined in a research paper that, in Indianapolis, the program would have targeted Black and Latino neighborhoods at up to 400 percent the rate of White ones.

MacDonald said in his email that the company did not give the study to its law enforcement clients because it was an academic paper. It was presented at an engineering conference, outside the usual policing circuit.

The study's authors found that the adjusted predictions were somewhat less accurate than the original algorithm's, but still more accurate than predictions made by humans.

MacDonald said the company didn't change its algorithm.

He said that a change would reduce the protection provided to vulnerable neighborhoods.

The company's leaders wouldn't agree to an interview for this story, even though MacDonald responded to some written questions by email.

Police departments that use PredPol set up an automatic feed of crime reports, which include incidents reported by both the public and officers, and choose which crime types they want predicted. The software uses the date, time, location, and type of past crime reports to generate its future crime predictions.

The locations where those crimes are most likely to occur during the coming police shift are marked as boxes on a map, and PredPol advises officers to "get in the box" during free time between calls. In some cities, officers often simply drove to prediction locations and completed paperwork there.
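A heavily simplified sketch of how place-based prediction boxes might be chosen: PredPol's published research describes a self-exciting point-process model, but this toy version just scores grid cells by exponentially decayed counts of past incidents and marks the top cells as the shift's boxes. All data and parameters here are invented.

```python
import math
from collections import defaultdict

# Invented incident log: (grid_x, grid_y, days_ago).
incidents = [
    (2, 3, 1), (2, 3, 4), (2, 3, 10), (5, 1, 2), (7, 7, 30),
]

DECAY = 0.1  # per-day decay: recent incidents weigh more than old ones

# Score each grid cell by the decayed sum of its past incidents.
scores = defaultdict(float)
for x, y, days_ago in incidents:
    scores[(x, y)] += math.exp(-DECAY * days_ago)

# The top-scoring cells become the shift's "prediction boxes".
boxes = sorted(scores, key=scores.get, reverse=True)[:2]
print(boxes)
```

Because recency dominates, a cell with several fresh reports outranks one with a single month-old report, which is also why a burst of recorded incidents in one place keeps that place "hot" on subsequent shifts.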

Does predictive policing work?

MacDonald told Gizmodo and The Markup that the company's choice of input data ensures the software's predictions are not biased.

He said that crime data is put in by the victims themselves: if your house is broken into or your car is stolen, you are likely to file a police report.

According to the federal Bureau of Justice Statistics, that is not always true. In 2020, less than a third of property crimes and 40% of violent crimes were reported to police, which is in line with previous years.

White crime victims are less likely to report crime to the police.

A special report found the same pattern by income: people who make more than $50,000 a year are less likely to report crimes to the police than those who make less.

Wealthier and White victims of crime, in other words, are less likely to report it.

Those disparities in crime reporting would be reflected in the predictions.

Phillip Goff, co-founder of the Center for Policing Equity, which focuses on bias in policing, said there is no such thing as crime data, only reported crime data, and the difference between the two is huge.

MacDonald didn't respond to questions about the studies and their implications, but PredPol's founders acknowledged in their research paper that place-based crime prediction can focus on areas that are already receiving police attention, creating a feedback loop that leads to even more arrests and more predictions there.
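The feedback loop the founders describe can be illustrated with a toy simulation. This is an invented model, not PredPol's algorithm: two areas have identical true crime, but patrols are sent wherever the recorded incident count is higher, and patrols in turn generate more records.

```python
# Two areas with the SAME underlying crime, but area 0 happens to start
# with slightly more *recorded* incidents (e.g., from higher reporting).
recorded = [6.0, 4.0]

for step in range(20):
    # Patrols go to whichever area the data says is "hot".
    hot = 0 if recorded[0] >= recorded[1] else 1
    # Old reports age out of the window (factor 0.9), while patrols
    # surface ~5 extra incidents in the patrolled area that would
    # otherwise have gone unrecorded.
    recorded = [0.9 * r for r in recorded]
    recorded[hot] += 5.0

print([round(r, 1) for r in recorded])
```

A small initial disparity captures all the patrols, so the patrolled area's record keeps growing while the other area's record decays toward zero, even though the true crime rates never differed.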

We looked at more than 270,000 arrests in 11 cities using PredPol and found that locations with lots of predictions tended to have high arrest rates in general, suggesting the software was mostly recommending officers patrol areas they already frequented.

Five cities gave us data on officers' use of force, and we found a similar pattern: the neighborhoods with the most predictions had per capita use-of-force rates nearly double the city average. In Niles, Illinois, the rate was more than double the city average, and in Piscataway, the arrest rate in the most-predicted areas was more than 10 times the city average.

Arrests per capita were measured relative to the city average.

One activist said the software gave police a reason to keep doing what they were already doing, and the data to prove it.

Buena Vista is a low-income housing complex in Elgin. In the neighborhood where it sits, the share of Black residents is six times the city average.

The police made 121 arrests at the complex between January and October of 2020.

Schuessler, the police department's deputy chief, said that the incidents fed the algorithm.

PredPol produced 2,900 crime predictions for the area over the course of 29 months.

The software predicted only about 5% as many crimes in an area north of Buena Vista where White residents are the majority.

The neighborhoods with the most predictions had the lowest share of White residents.

Schuessler said that police spent a lot of time at Buena Vista because of a couple of police programs.

For one family, that police presence had dire consequences.

She had waited two years to get in, she said, and she broke down in tears in her kitchen when she found the intent-to-evict notice on her door. It was November 2020, and hospitals in Illinois were filled to capacity with the sick and the dying as covid-19 infections surged.

Jonathan King had stopped by a few months earlier to give money to her and their three small children.

He was sitting on her car in the parking lot, waiting, when an officer from the police department's Crime Free Housing Unit rolled by in a car.

"You're not supposed to be here, right?" the officer, Miller, asked King, according to King.

The city has a crime-free housing law that requires all leases to allow eviction if the tenants are involved in criminal activity, even nearby, and allows the city to punish landlords that don't deal with it.

King had been banned from living in Buena Vista years earlier, he said, after he committed a robbery as a minor.

They told him that once he was out of the court system, he would be able to return. Apparently, that didn't happen.

King had been arrested three times simply for being at Buena Vista. During one arrest, officers said they found a gun nearby, which King denied was his, and he was charged with weapons possession. Schuessler said that arrest came during the window of a PredPol prediction. The case is still pending.

"I know he is banned, but what can a man do?" she asked. "He has kids."

She said the eviction notice was issued after the arrest; Buena Vista's management wouldn't confirm or deny it. Her children kept asking why they were going to a hotel, why they were moving their stuff, why this and why that. She wanted to cry, she said.

The creator of a PredPol competitor said he wrestled with the vicious cycle that crime prediction algorithms can create.

He said design decisions matter: if the patrol-area maps give people an excuse to be in a neighborhood too much, that wouldn't necessarily be helpful. His company tried to solve the problem by evening out the number of predictions delivered to each neighborhood, he said.

We spoke to advocates in at least six cities who were unaware the software was being used locally. Even people involved in local social justice committees didn't know about it.

One pastor of a predominantly Black and Latino church, who chaired a task force on diversity and inclusion last year, said the software never came up in his meetings.

The sheriff's office in Calcasieu Parish, La., refused to confirm that it was using the software, even though the data shows its predictions began on April 9, 2019. Robert McCorquodale, an attorney with the sheriff's office who handles public records requests, said he wouldn't want criminals to outwit the software, citing public safety and officer safety.

He said he didn't claim to be an expert in the area but felt the predictions were not public records.

We kept Calcasieu in our data because it was a legitimate new client whose predictions simply began in the middle of our analysis period; removing them did not change the results of our analysis.

Some police agencies used the software to predict crime types that the company itself warns against. MacDonald said the company advises clients not to try to predict sex crimes and drug crimes, which research has shown are not equally enforced.

We found that PredPol was used to predict drug crimes in four places, including Niles, Illinois; Piscataway, New Jersey; and Clovis, California. Three departments used the software to predict sexual assaults; two of them were in Florida.

MacDonald told us that policing agencies make their own decisions on how to use the software.

He wrote that the company gives agencies guidance when they are set up, telling them not to include event types without clear victimization, which can involve officer discretion. If agencies decide to add other event types later, that's up to them.

In an interview, Piscataway's police chief said he didn't recall receiving instructions not to predict certain crime types. The other agencies didn't comment.

Every agency combined fundamentally different crime types into a single prediction. In Grass Valley, California, assaults and weapons crimes were mixed with car accidents.

MacDonald said research and data support the idea that multiple crime types can be concentrated in the same areas.

Christopher Herrmann, a criminologist at the John Jay College of Criminal Justice, disagreed.

Herrmann said crime is highly type-specific: a serial murderer won't one day start robbing people, stealing cars, or selling drugs. A serial shoplifter is not going to steal cars. A serial rapist is not going to start robbing people.

A study of crime patterns in Philadelphia found that hot spots of different crime types did not overlap much.

Police departments that made arrests at the times and locations of PredPol predictions wouldn't comment when we asked whether the software brought officers to those locations.

On Feb. 11, 2019, during an active crime prediction window, the LAPD stopped a man named Corey Moses for smoking a Newport cigarette in a nonsmoking area by a train station in MacArthur Park. The officer ran his name and found a warrant for his arrest. He was cuffed, searched, and thrown in jail.

Sometimes you have to do stupid things for the police to bother you, he said; other times you can just be at the wrong place at the wrong time.

It's unclear whether the officer was responding to the prediction; the LAPD didn't respond to questions about it.

We didn't try to measure how accurately PredPol predicts crime. The software's main promise, after all, is that crimes won't happen because officers responding to predictions will prevent them, which is inherently hard to measure.

Several police departments, including Piscataway; West Springfield, Massachusetts; and Los Angeles, have dropped PredPol's software in the last few years because they didn't find it useful or effective.

The Tracy Police Department's chief of staff said PredPol was not the program the department thought it was when it first started using it. He didn't reply to a request to elaborate.

Other agencies were unhappy with the software too. One evaluation, written by a police lieutenant after his department signed up, called it time-consuming and impractical and found no evidence that it lowered crime rates.

MacDonald stated in his email that the number of U.S. law enforcement agencies in the data set was not an accurate count of the company's clients. By his count, only 15 of the 38 agencies are still clients, and two of those told us they are no longer using the software.

The LAPD stopped using it last year.

The department said dropping the software was a financial decision. But it came after the LAPD's inspector general said he couldn't determine whether the software was effective and the Stop LAPD Spying Coalition protested at a police commission meeting.

The relationship began years earlier, when then-chief Bill Bratton sent one of his lieutenants to UCLA to look at research on crime-fighting. There he ran across P. Jeffrey Brantingham, an anthropologist who had studied the first settlements of the Tibetan plateau.

In a 2009 National Science Foundation grant application, PredPol co-founder George Mohler wrote that mathematics is rejuvenated when it interacts with a new discipline. Brantingham's parents pioneered environmental criminology, the study of the intersection of geography and crime, and he said he learned a lot at their feet.

In a profile in UCLA's student newspaper, Brantingham said he accumulated knowledge about crime and criminal behavior simply by spending time with his parents.

Criminals, in his view, are foragers: choosing a target is like choosing which animal to hunt.

Collaborating with the Los Angeles Police Department, the two developed an algorithm to predict property crime. Property crimes in the division using it were 9% lower than in the rest of the city.

The National Science Foundation gave more than $1.7 million to the academic research that led to PredPol. The venture itself was funded by UCLA Ventures and a pair of executives from Plantronics.

The U.S. Department of Justice encouraged law enforcement agencies to experiment with predictive policing and gave grants to PredPol clients in Newark, New Jersey; Temple Terrace, Florida; Carlsbad and Alhambra, California; and the LAPD.

More than 1,400 mathematicians signed an open letter urging their colleagues not to collaborate with law enforcement on research, specifically singling out PredPol. A separate letter was signed by 13 professors, researchers, and graduate students at UCLA.

MacDonald pushed back at the critics, writing in his email that it seemed irresponsible for an entire profession to say it wouldn't cooperate in protecting vulnerable communities.

Ferguson said software-generated crime predictions are here to stay, though not necessarily as a standalone product. He said they are becoming part of a buffet of police data offerings from larger tech firms such as ShotSpotter, which uses sound detection to report gunshots and bought the crime prediction software HunchLab.

Those companies have distanced themselves from the term predictive policing, even though all of them have pitched or publicized their products being used for it, and HunchLab was a direct PredPol competitor.

PredPol's name was formed from the words predictive and policing, but the company is branching out into other data services and now calls the term a misnomer.

Ferguson had a similar point.

Ferguson said the big companies that hold police contracts are going to do predictive analytics.

He said they are not going to call it predictive policing, and that will make it harder for journalists and academics to pull apart.