The Justice Department has failed to convince a group of U.S. lawmakers that state and local police agencies will not receive federal funding to purchase artificial intelligence-based “predictive policing” tools that are known to be inaccurate, if not to exacerbate long-standing biases in American policing.
In a letter to the Justice Department, first obtained by WIRED, seven members of Congress wrote that the answers they have received from the agency only heightened their concerns about the department’s police grant program. Nothing in the responses so far, the lawmakers say, suggests the department has bothered to investigate whether grant recipients used the funding to buy discriminatory policing software.
“We urge you to halt all Department of Justice grants for predictive policing systems until the DOJ can ensure that grant recipients will not use such systems in a manner that has discriminatory effects,” the letter reads. The department has admitted that it did not track whether police departments used funds provided by the Edward Byrne Memorial Justice Assistance Grant Program to purchase so-called predictive policing tools.
Lawmakers, led by Sen. Ron Wyden, D-Ore., said the law requires the Justice Department to “periodically review” grant recipients’ compliance with Title VI of the nation’s Civil Rights Act. That statute, they noted, plainly prohibits the department from funding programs that discriminate on the basis of race, ethnicity, or national origin, whether the discrimination is intentional or not.
Independent investigations by the press have found that popular “predictive” policing tools trained on historical crime data often replicate long-standing biases, offering at best a veneer of scientific legitimacy while directing police disproportionately toward predominantly Black and Latino communities. The Markup’s headline in October put it bluntly: “Predictive Policing Software Terrible at Predicting Crimes.” The story recounted how the publication’s reporters examined 23,631 police crime predictions and found they were accurate about 1 percent of the time.
“Predictive policing systems rely on historical data distorted by falsified crime reports and disproportionate arrests of people of color,” Wyden and the other lawmakers wrote, adding that, as many researchers have warned, the technology serves only to create dangerous “feedback loops.” The letter noted that “biased predictions are used to justify disproportionate stops and arrests in minority neighborhoods,” further skewing statistics about where crimes occur.
Senators Jeff Merkley, Ed Markey, Alex Padilla, Peter Welch, and John Fetterman, along with Representative Yvette Clarke, also signed the letter.
The lawmakers are asking that an upcoming presidential report on policing and artificial intelligence examine the use of predictive policing tools in the United States. “The report should assess the accuracy and precision of predictive policing models across protected classes, their interpretability, and their validity,” they added, including “any limits on assessing their risks posed by the lack of transparency from the companies developing these products.”
The lawmakers said that if the Justice Department wants to continue funding the technology after that review, it should at least establish “evidence standards” for determining which predictive models are discriminatory, and then deny funding to any that fail to meet them.