Apple sued for dropping CSAM detection features from services, devices
Announced in 2021, the plan was for Apple to scan images on iCloud for child abuse material using on-device technology. While Apple retained a nudity-detection tool for its Messages app, the CSAM detection feature was dropped in 2022 amid privacy and security concerns raised by experts and advocacy groups.

The lawsuit argues that Apple's decision has perpetuated the sharing of illegal material online and broken the company's promise to protect abuse victims. It also highlights that other tech companies, such as Google and Meta, employ CSAM-scanning tools that detect significantly more illegal material than Apple's nudity-detection features.