Apple sued for dropping CSAM detection features from services, devices
1 month ago

Firstpost  

Announced in 2021, the plan was for Apple to scan images uploaded to iCloud for child sexual abuse material (CSAM) using on-device technology. While Apple retained a nudity-detection tool for its Messages app, the CSAM detection feature was dropped in 2022 amid privacy and security concerns raised by experts and advocacy groups. The lawsuit argues that Apple’s decision has perpetuated the sharing of illegal material online and broken the company’s promise to protect abuse victims. It also highlights that other tech companies, such as Google and Meta, employ CSAM-scanning tools that detect significantly more illegal material than Apple’s nudity-detection features.

History of this topic

Apple faces $1.2 billion lawsuit for failing to prevent distribution of child sexual abuse material via iCloud
1 month ago
Australia takes aim at Apple, Microsoft over child protection online
2 years ago
UK cybersecurity chiefs back Apple’s controversial photo-scanning feature
2 years, 5 months ago
Apple’s child sex-abuse scanning tool is too clever for its own good: Tae Kim
3 years, 4 months ago
Apple appeals against security research firm while touting researchers
3 years, 4 months ago
After criticism, Apple to only seek abuse images flagged in multiple nations
3 years, 4 months ago
Apple's own employees are not happy with its move to scan users' phones for child sexual abuse images
3 years, 4 months ago
Apple defends scanning iPhones for child abuse images, saying algorithm only identifies flagged pics
3 years, 5 months ago
Apple to scan U.S. iPhones for images of child sexual abuse
3 years, 5 months ago
Apple will scan iPhone for nude photos, will report user to police if it finds child porn
3 years, 5 months ago
Apple accused of creating 'backdoor' into iPhones with child abuse image scanning
3 years, 5 months ago
iPhone to detect and report child sexual abuse images: Why security experts are calling it a bad idea
3 years, 5 months ago
Apple to scan U.S. iPhones for images of child abuse
3 years, 5 months ago
Fury at Apple's plan to scan iPhones for child abuse images
3 years, 5 months ago
Apple may soon scan your iPhone for child abuse photos, here is how this will work
3 years, 5 months ago
Apple reveals it scans iCloud photos to check for child sexual abuse
5 years ago
