Announced in 2021, Apple's plan was to scan images stored in iCloud for child sexual abuse material (CSAM) using on-device technology. While Apple retained a nudity-detection tool for its Messages app, the CSAM detection feature was dropped in 2022 amid privacy and security concerns. The latest lawsuit argues that Apple's decision has perpetuated the sharing of illegal material online and broken …