Google's AI Chatbot Tells Student Seeking Help with Homework 'Please Die'
When a graduate student asked Google's artificial intelligence (AI) chatbot, Gemini, a homework-related question about aging adults on Tuesday, it sent him a dark, threatening response that concluded with the phrase, "Please die. Please."
Google AI chatbot tells user to 'please die'
Google chatbot Gemini told a user "please die" during a conversation about challenges aging adults face, violating the company's policies on harmful messages.
Google AI chatbot responds with a threatening message: "Human … Please die."
In an online conversation about aging adults, Google's Gemini AI chatbot responded with a threatening message, telling the user to "please die."
PCMag on MSN
Asked for Homework Help, Gemini AI Has a Disturbing Suggestion: 'Please Die'
A student received an out-of-the-blue death threat from Google's Gemini AI chatbot while using the tool for essay-writing ...
Why It Matters That Google’s AI Gemini Chatbot Made Death Threats to a Grad Student
AI chatbots put millions of words together for users, but their offerings are usually useful, amusing, or harmless. This week ...
Opinion
The Register on MSN
Google Gemini tells grad student to 'please die' while helping with his homework
When you're trying to get homework help from an AI model like Google Gemini, the last thing you'd expect is for it to call ...
Digital Information World
Google’s Gemini AI Chatbot Under Fire For Releasing ‘Out of the Blue’ Death Threat To Student
Disturbing chatbot response prompts Google to pledge strict actions, highlighting ongoing AI safety challenges.