Google's AI chatbot Gemini verbally abuses student, tells him ‘Please die’: report

Hindustan Times  

A 29-year-old college student claimed that he faced an unusual situation that left him “thoroughly freaked out” while using Google’s AI chatbot Gemini for homework: the chatbot reportedly sent him what he described as ‘death threats’, telling him to “Please die.” Google reportedly reacted to the incident and labelled the AI’s replies as “non-sensical responses.”

Talking to the outlet, Vidhay Reddy said, “This seemed very direct.” His sister, who was with him at the time, added, “There's a lot of theories from people with thorough understandings of how gAI works saying 'this kind of thing happens all the time,' but I have never seen or heard of anything quite this malicious and seemingly directed to the reader, which luckily was my brother who had my support in that moment,” she explained.
