AI Policing is Dangerous

Discriminatory AI tools like facial recognition, automated license plate readers (ALPRs), gunshot detection technology, and predictive policing programs are being used throughout the criminal legal system, supercharging biased policing practices.

Through May 28, 2024, the Department of Justice accepted public comments on the use of AI in policing, focused specifically on privacy, civil rights, and civil liberties concerns.

We sent over 3,000 of your comments to the DOJ, making it clear that biased, rights-infringing AI should not be used.

Thanks for signing the petition!

Please consider sharing this page with your friends and family.

Background:

The Biden Administration’s Executive Order on AI requires the Attorney General, the Secretary of Homeland Security, and the Director of the Office of Science and Technology Policy to produce a report recommending ways to use AI in law enforcement that are consistent with privacy protections, civil rights, and civil liberties.

The reality is that AI tools like facial recognition, automated license plate readers (ALPRs), gunshot detection technology, and predictive policing inherently threaten our privacy, rights, and freedoms. There is no way to make the use of these technologies safe.

The report will address and make recommendations for a number of AI use cases, including tools for police surveillance and predictive policing. We know that the AI tools used by law enforcement reproduce the biases in historical crime data and are responsible for expanding discriminatory policing practices. U.S. police and law enforcement systems have a long history of weaponizing data against Black communities. Numerous studies show that the algorithms powering AI technologies exacerbate racism and discrimination rather than solve the problem of discriminatory decision-making. Institutionalizing this technology will only increase the surveillance, wrongful arrest, and imprisonment of Black, Brown, immigrant, and other targeted communities.

No amount of regulation, transparency, or oversight will fix the dangers inherent in the widespread use of these tools in policing.

Fight for the Future has campaigned against the use of specific tools in policing, including facial recognition, ALPRs, and ShotSpotter. This is an opportunity to highlight the problems with these technologies, demand that any AI used by police account for the harms we’ve already seen, and stop the use of any tech that perpetuates those harms.