AI means anyone can be a victim of deepfake porn. Here’s how to protect yourself
New York CNN — "All we have to have is just a human form to be a victim." That's how lawyer Carrie Goldberg describes the risk of deepfake porn in the age of artificial intelligence.

While revenge porn — the nonconsensual sharing of sexual images — has been around for nearly as long as the internet, the proliferation of AI tools means that anyone can be targeted by this form of harassment, even if they've never taken or sent a nude photo.

A group of teens and parents affected by AI-generated porn testified at a hearing on Capitol Hill, where Republican Sen. Ted Cruz introduced a bill — supported by Democratic Sen. Amy Klobuchar and others — that would make it a crime to publish such images and require social media platforms to remove them upon notice from victims.

"My proactive advice is really to the would-be offenders which is just, like, don't be a total scum of the earth and try to steal a person's image and use it for humiliation," Goldberg said.