
Science

Algorithms prevent crime before it happens

Police in the US, and increasingly in Europe, are using statistical models to predict where crime will happen next. This computer-driven law enforcement has reduced crime, but has also raised civil liberties concerns.

The London Metropolitan Police Service has become the latest in a string of law enforcement agencies to adopt statistical software that aims to prevent crime by predicting where it will happen next. Called PredPol, the computer program has been used for years in the United States, where it has apparently helped reduce the rate of assault, burglaries and robberies in major metro areas.

Developed at UCLA in California by mathematician George Mohler and anthropologist Jeff Brantingham, PredPol runs historical crime data through an algorithm that then predicts which locations in a city are at greater risk for repeat offenses.

PredPol is similar to statistical programs long used by major private sector companies. The online retailer Amazon, for example, collects data on customers' buying habits to predict what they may want to purchase in the future. It then makes recommendations based on that data.
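The "customers who bought X also bought Y" idea behind such recommendations can be illustrated with a toy co-occurrence counter. This is only a minimal sketch of the general technique, not Amazon's actual system; the order data and function names are invented for illustration:

```python
from collections import Counter
from itertools import combinations

def recommend(baskets, bought, top_n=2):
    """Suggest items that most often co-occur in past orders with
    items the customer already bought (a crude stand-in for a real
    recommendation engine)."""
    co = Counter()
    for basket in baskets:
        # Count each unordered item pair once per order.
        for a, b in combinations(sorted(set(basket)), 2):
            co[(a, b)] += 1
    scores = Counter()
    for (a, b), n in co.items():
        if a in bought and b not in bought:
            scores[b] += n
        elif b in bought and a not in bought:
            scores[a] += n
    return [item for item, _ in scores.most_common(top_n)]

# Hypothetical order history.
orders = [["book", "lamp"], ["book", "lamp", "pen"], ["lamp", "pen"]]
print(recommend(orders, {"book"}))  # lamp co-occurs with book most often
```

A production system would weight by purchase recency and normalize for item popularity, but the core signal is the same: past co-occurrence predicts future interest.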

"A lot of police departments, and indeed other agencies of the state, have been looking quite enviously over at what large commercial companies are able to do," Jamie Bartlett, with the British think tank Demos, told DW.

"You've seen all these incredible algorithms that can often predict with a remarkable degree of accuracy where someone is going to be, what sort of behavior they're likely to undertake on the basis of their expressed opinions or attitudes or behaviors," Bartlett said.

Predictive policing

Based on its algorithm, PredPol generates maps that display 500-by-500-foot hot spots where crimes are likely to occur again. Law enforcement can then deploy to those locations at specific times of the day to deter crime before it happens. The strategy is called "predictive policing." Other companies such as IBM have developed similar software.
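The hot-spot mapping described above can be sketched as binning historical incidents into a grid of 500-by-500-foot cells and ranking the cells. This is a deliberately crude stand-in for PredPol's proprietary model (which is based on self-exciting point processes, not raw counts); the coordinates are invented:

```python
from collections import Counter

CELL_FT = 500  # grid cell size in feet, matching the reported hot-spot size

def to_cell(x_ft, y_ft, cell=CELL_FT):
    """Snap a coordinate (feet from a city reference point) to its grid cell."""
    return (x_ft // cell, y_ft // cell)

def hot_spots(incidents, top_n=3):
    """Rank grid cells by historical incident count."""
    counts = Counter(to_cell(x, y) for x, y in incidents)
    return [cell for cell, _ in counts.most_common(top_n)]

# Hypothetical burglary locations.
history = [(120, 80), (450, 300), (460, 310), (1600, 900),
           (480, 290), (1550, 950), (3000, 3000)]
print(hot_spots(history, top_n=2))  # the two most burdened cells
```

A real model would also weight incidents by recency and by "contagion" from nearby recent crimes, since repeat and near-repeat offenses cluster in time as well as space.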


Law enforcement is taking a page out of Amazon's playbook

"There's the idea that somehow you're making a forecast of the future, and it's really just making a prediction based on historical data," John Hollywood, with the Rand Corporation, told DW. "It's just now we're using more input variables."

The input variables can include locations, types of crimes, times, dates and an array of other data. Police in Charlotte-Mecklenburg County, North Carolina, even used foreclosure data to map out areas that were at higher risk for crime.

Savings through prevention

In Santa Cruz, California, PredPol helped reduce assaults by 9 percent, burglaries by 11 percent and robberies by 27 percent, according to police.

The Foothill Division of the Los Angeles Police Department claimed similar success, with a 25 percent drop in burglary. PredPol is now being used by several other major American cities, including Seattle and Atlanta.

In the United Kingdom, pilot programs have been implemented in Kent, Greater Manchester, West Yorkshire and the West Midlands. In their national policing vision, the 43 police departments of England and Wales have set a goal of adopting predictive analysis programs like PredPol by 2016.

According to Bartlett, law enforcement has joined a growing public sector trend of trying to save scarce money by focusing on prevention.

"Across a number of departments there have been lots of studies done that consistently demonstrate it is better value for the public purse to prevent bad things from happening than to clean up after them," Bartlett said.

Civil liberties concerns

But critics warn that predictive policing could simply entrench racial profiling. Legal expert David Harris says that if a police officer already operates with a racial bias, predictive information that marks a particular location as higher risk could encourage the officer to detain someone with little real cause for suspicion.

"What's happening in the police officer's mind is that the racial characteristics or ethnic characteristics are proxies or substitutes for actual suspicious behavior," Harris, a law professor at the University of Pittsburgh, told DW.

"When you add to that the supposedly iron-clad, data-based predictions that crimes are going to be going on in this place, the potential for stops, frisks, detentions based on very little real evidence just grows," he said.

But Hollywood claimed that during a study he conducted for the Rand Corporation, privacy and civil liberties advocates saw predictive policing as a potential improvement over past practices, so long as the right protections were in place.

"[That's] because the focus was actually on places and people that there was genuine data to suggest a threat, as opposed to just declaring an entire neighborhood or an entire class of people as a likely threat and going after all of them," Hollywood said.


The trend toward policing by algorithms raises legal issues

If the courts accept predictive crime modeling as a legitimate factor in concluding probable cause for suspicion, a host of new legal questions will have to be answered. That's according to legal expert Andrew Ferguson with the David A. Clarke School of Law at the University of the District of Columbia in Washington D.C.

"Other questions get opened because then we need to know how accurate is the data, how transparent is it, how are we supposed to evaluate whether this algorithm is actually crunching the right numbers – what are the right numbers?"
