id:        www-technologyreview-com-4006
author:    
title:     Predictive policing algorithms are racist. They need to be dismantled. | MIT Technology Review
date:      
pages:     
extension: .html
mime:      text/html
words:     4583
sentences: 271
flesch:    70
summary:   "Cities have been going broke for years, and they've been replacing cops with algorithms." Exact figures are hard to come by, but predictive tools are thought to be used by police forces or courts in most US states. "It's really just in the past few years that people's views of these tools have shifted from being something that might alleviate bias to something that might entrench it," says Alice Xiang, a lawyer and data scientist who leads research into fairness, transparency and accountability at the Partnership on AI. Police like the idea of tools that give them a heads-up and allow them to intervene early because they think it keeps crime rates down, says Rashida Richardson, director of policy research at the AI Now Institute. It depends what you mean by "work." In general it is practically impossible to disentangle the use of predictive policing tools from other factors that affect crime or incarceration rates.
cache:     ./cache/www-technologyreview-com-4006.html
txt:       ./txt/www-technologyreview-com-4006.txt