LAPD to stop using biased data programs after criticism

Los Angeles Police Chief Michel Moore plans to end a program that uses data to identify individuals who are most likely to commit violent crimes, bowing to criticism raised in an internal audit and by privacy groups.

In a memo sent on Friday to the Police Commission, the civilian panel that oversees the LAPD, Moore detailed changes in response to an audit by Inspector General Mark Smith.

Smith found that the department's data analysis programs lacked oversight and that officers used inconsistent criteria to label people "chronic offenders." Smith also couldn't determine the overall effectiveness of a geographic component that tried to pinpoint the location of some property crimes.

Moore told commissioners that the department will not use programs that do not produce results and will strive to "identify new or emerging ideas that hold promise."

"Crime reduction strategies are never static," Moore wrote. "We will continue to learn and evolve in our work."

For years, critics have lambasted the data-driven programs - which use search tools and point scores - saying the underlying data are tainted by racial bias and result in heavier policing of black and Latino communities. After the "chronic offender" lists created an uproar among civil liberties and privacy groups, the LAPD suspended that tool in August.

Andrew Ferguson, a law professor at the University of the District of Columbia who studies policing data and wrote a book on the topic, credited Smith's audit for exposing problems. Police leaders, he said, need to be transparent if they want to build community trust and accountability around data programs.

"You have to have the courage to go to the community to tell them how you're using it and if it is broken," said Ferguson, who called for annual audits of the programs. "If they do that, they're going to be leaders" in the country.

Smith's audit focused on several tools.

For violent crime, the department draws "LASER" zones - devised by a human crime analyst, not a computer - to identify crime hot spots and where to focus more officers.

Many of the department's divisions also used data to compile lists or "bulletins" of people calculated to be among the top 12 "chronic offenders." The program assigned points to people based on criminal histories, such as arrest records, gang affiliation, probation and parole status and recent police contacts.

Smith found that 44 percent of the so-called chronic offenders had either no arrests for violent crimes or just one. About half had no arrests for gun-related crimes. Others were in custody or had been arrested only for nonviolent crimes.

Officers will now rely more on old-school tactics, using physical descriptions of those suspected in reported crimes. They will also focus more closely on perpetrators recently released from custody and those who have committed similar crimes in the past, the memo said.

Moore also plans to increase oversight by developing "precision policing" manuals tailored to each of the agency's four geographic commands. The manuals, which are expected to be completed this summer, will incorporate the inspector general's recommendations. The four area commands will provide a "centralized model of oversight from the Office of Operations," Moore wrote.

A recent study from New York University School of Law and NYU's AI Now Institute examined data programs in troubled police departments in Chicago, New Orleans and Maricopa County, Arizona.

The study concluded that "dirty data" lead to biased policing and unlawful predictions. Those policies repeatedly send officers to the same neighborhoods, regardless of the crimes committed, the study found.

Rashida Richardson, director of policy research at the AI Now Institute and a co-author of the report, said Smith's findings mirrored suspicions that police target specific communities.

"This shows a larger policing problem," she said. "None of this is standardized. A lot of this system is one-sided."