Artificial intelligence turned out to be a so-so cop

The AI predictor of crimes promised brilliant results. How are things now?

An AI algorithm that predicts possible crimes a week in advance has been tested by many US police departments over the past few years. Some departments have even implemented it in their work. However, a new study by The Markup and Wired has shown that the technology is nowhere near as effective as initially expected.

For the city of Plainfield, New Jersey, for example, the AI predictor proved too expensive and ineffective. Experts analyzed 23,631 forecast reports produced by the Geolitica program, formerly known as PredPol, for the period from February 25 to December 18, 2018, and found that the predictions were accurate less than half a percent of the time.

Initially, the Geolitica software was supposed to identify "hot spots" - areas with an elevated risk of criminal activity - based on in-depth data analysis.

Captain David Guarino of the Plainfield Police Department was outspoken about the inefficiency of the technology: "Why did we acquire PredPol? We wanted to make the fight against crime more effective. If we knew where to expect trouble, the process would really become easier. But I very much doubt that it helped... I don't think we've ever used it much, if at all. As a result, it was decided to abandon it."

PredPol's operating principles have long raised questions among experts, especially after an earlier study showed that the AI mainly focuses on low-income neighborhoods and ethnic minorities.

Geolitica has not yet commented on the results of the study. However, according to Wired, it plans to stop operating at the end of this year, and part of its team has already joined another law enforcement company.