Apple to scan U.S. iPhones for images of child abuse
The Hindu

Apple unveiled plans to scan U.S. iPhones for images of child abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.

The tool, which Apple calls "neuralMatch," will detect known images of child sexual abuse without decrypting people's messages. Tech companies including Microsoft, Google and Facebook have for years been sharing "hash lists" of known images of child sexual abuse.

"With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material," one child protection advocate said.

Julia Cordua, the CEO of Thorn, said that Apple's technology balances "the need for privacy with digital safety for children." Thorn, a non-profit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.
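The hash-list approach the article describes can be illustrated with a minimal sketch. This is not Apple's actual system: real deployments (Microsoft's PhotoDNA, Apple's NeuralHash) use perceptual hashes that survive resizing and re-encoding, whereas this example uses a plain SHA-256 digest, which only matches byte-identical files; the digest in `KNOWN_HASHES` and the function names are invented for illustration.

```python
import hashlib

# Placeholder "hash list" standing in for the databases that
# clearinghouses distribute to tech companies. The digest below is
# simply SHA-256 of the bytes b"foo", used here as dummy data.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_hash(data: bytes) -> str:
    """Return a hex digest identifying the image bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Check an image's digest against the shared list; the file's
    contents are never interpreted, only fingerprinted and compared."""
    return image_hash(data) in KNOWN_HASHES

print(is_known(b"foo"))         # True: matches the placeholder digest
print(is_known(b"other data"))  # False: not on the list
```

The key property, reflected even in this toy version, is that matching happens against fingerprints of already-identified images rather than by inspecting the content of arbitrary files or messages.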