Will artificial intelligence turn on us? Robots are nothing like humans, and that's what makes them so terrifying.
Adapted from Superintelligence: Paths, Dangers, Strategies by Nick Bostrom.

One side of the debate argues that if a machine ever achieved advanced intelligence, it would automatically know and care about human values, and so would pose no threat to us. But we have little reason to believe that a superintelligence must share human values, and no reason to believe it would place intrinsic value even on its own survival. Whatever its final goal, such an agent would have a convergent instrumental reason, in many situations, to acquire an unlimited amount of physical resources and, if possible, to eliminate potential threats to itself and its goal system. Taken together, these points indicate that the first superintelligence may shape the future of Earth-originating life, could easily have non-anthropomorphic final goals, and would likely have instrumental reasons to pursue open-ended resource acquisition.
