Researchers say Amazon face-detection technology shows bias
Associated Press

NEW YORK — Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Matt Wood, general manager of artificial intelligence with Amazon's cloud-computing unit, said the study used "facial analysis" and not "facial recognition" technology. Wood said facial analysis "can spot faces in videos or images and assign generic attributes such as wearing glasses; recognition is a different technique by which an individual face is matched to faces in videos and images."

In a Friday post on the Medium website, MIT Media Lab researcher Joy Buolamwini responded that companies should check all systems that analyze human faces for bias.

Amazon's reaction shows that it isn't taking the "really grave concerns revealed by this study seriously," said Jacob Snow, an attorney with the American Civil Liberties Union.

Wood said Amazon has updated its technology since the study and done its own analysis with "zero false positive matches."

Amazon's website credits Rekognition with helping the Washington County Sheriff's Office in Oregon speed up how long it took to identify suspects from hundreds of thousands of photo records.