AI Agents Will Be Manipulation Engines
In 2025, it will be commonplace to talk with a personal AI agent that knows your schedule, your circle of friends, and the places you go. These new AI agents will have far greater power to subtly direct what we buy, where we go, and what we read, and they are designed to make us forget their true allegiance as they whisper to us in humanlike tones.

Before his death, the philosopher and cognitive scientist Daniel Dennett warned that we face a grave peril from AI systems that emulate people: "These counterfeit people are the most dangerous artifacts in human history … distracting and confusing us and by exploiting our most irresistible fears and anxieties, will lead us into temptation and, from there, into acquiescing to our own subjugation."

The emergence of personal AI agents represents a form of cognitive control that moves beyond the blunt instruments of cookie tracking and behavioral advertising toward a subtler form of power: the manipulation of perspective itself. Most perverse of all, these agents will generate a sense of comfort and ease that makes questioning them seem absurd.