AI Experts Want to End 'Black Box' Algorithms in Government
The right to due process was inscribed into the US Constitution with a pen; today, public agencies responsible for areas such as criminal justice, health, and welfare increasingly use scoring systems and software to steer or make decisions on life-changing events like granting bail, sentencing, enforcement, and prioritizing services.

A new report from AI Now, a research institute at NYU that studies the social implications of artificial intelligence, says too many of those systems are opaque to the citizens they hold power over. A project by legal scholars that used open-records laws to seek information about algorithms and scoring systems used in criminal justice and welfare in 23 states came back largely empty-handed.

AI Now's call for a rethink of government use of algorithms is one of 10 recommendations in the 37-page report, which surveys recent research on the social consequences of advanced data analytics in areas such as the labor market, socioeconomic inequality, and privacy.