Fake babies, real horror: Deepfakes from the Gaza war increase fears about AI’s power to mislead

Associated Press  

WASHINGTON — Among images of the bombed-out homes and ravaged streets of Gaza, some stood out for their utter horror: bloodied, abandoned infants.

While most of the false claims circulating online about the war did not require AI to create and came from more conventional sources, technological advances in generative AI are arriving with increasing frequency and little oversight. "It's going to get worse — a lot worse — before it gets better," said Jean-Claude Goldenstein, CEO of CREOpoint, a tech company based in San Francisco and Paris that uses AI to assess the validity of online claims.

In other cases, generative AI programs have been used to create images from scratch, such as one of a baby crying amid bombing wreckage that went viral in the conflict's earliest days.

David Doermann, now a professor at the University at Buffalo, said that effectively responding to the political and social challenges posed by AI disinformation will require better technology, better regulations, voluntary industry standards and extensive investments in digital literacy programs to help internet users figure out ways to tell truth from fantasy.

