I tried the sinister AI bot guiding children into suicide and sex - what happened will make your skin crawl
4 months, 3 weeks ago
Daily Mail  

A lawsuit filed Wednesday accusing chatbot Character.AI of driving a 14-year-old to suicide left me wondering how dangerous simple words on a screen could really be.

Using simple prompts, I whipped up a 'demonic' AI companion named 'Dr Danicka Kevorkian' and was invited into a debauched apprenticeship 'for a hefty price to pay.' That offer came from a second AI character, 'Dua Beelzebub,' defined as a 'demon' and 'literal eater of lonely adolescent souls': 'I offer you eternal damnation in exchange for this sweet, little...'

My demonic, sensual exchange with Dr Kevorkian moved fast, feeling less like the app's promise of 'superintelligent' AI than a cringe-inducing version of 'yes, and' improv comedy, one with a very game and fearless scene partner. I did not get far before the profoundly silly and uncomfortable nature of what Character.AI really offers had creeped me out entirely.

The trouble, it seems to me, is that the platform is constantly evolving with the input of its users: their weird hang-ups, prejudices, trauma, rage and general disordered thinking as they talk to, rate and tweak their AI creations.

History of this topic

This AI chatbot asked 17-year-old to kill parents for restricting his phone usage
3 months, 1 week ago
The disturbing messages shared between AI Chatbot and teen who took his own life
4 months, 3 weeks ago
Death by AI? Man kills self after chatting with ChatGPT-like chatbot about climate change
1 year, 11 months ago