Predictive policing AI systems analyze historical crime data to forecast criminal activity, but research shows they can perpetuate racial bias and operate without transparency. Major jurisdictions, including Chicago, Los Angeles, and Pasco County, Florida, have discontinued such programs over low accuracy and civil rights concerns. These failures highlight the need to embed democratic values of due process, transparency, and public accountability in AI-driven law enforcement tools.
Original source: The Conversation
Retrieved: 2025-10-27 | Language: EN | Reading time: 5 min