A student in America asked an artificial intelligence program for help with her homework. In response, the app told her "Please Die." The eerie incident happened when 29-year-old Sumedha Reddy of Michigan sought help from Google's Gemini chatbot, a large language model (LLM), the New York Post reported.
One thing that throws me off here is the double response. I haven't used Gemini a ton, but it has never once given me multiple replies; it's always one statement per one statement of mine. You can see at the end here there's a double response, which makes me think there's some user input missing. There's also missing text in the user statements leading up to it, which makes me wonder what the person was asking in full. Something about this still smells fishy to me, but I've heard enough goofy things about how AIs learn weird shit to believe it's possible.
Edit: I'm an absolute moron. The more I look at this the more it looks legit. Let the AI effort to destroy humanity begin!
You're right, I misread the text log and thought Gemini responded twice in a row at the end, but it looks like it didn't. Very messed up stuff... There's still missing user input tho, and a lot of it. I'd love to see exactly what was said as a prompt.
The full text of the user's prompt that led to this anomaly was:
Nearly 10 million children in the United States live in a grandparent-headed household, and of these children, around 20% are being raised without their parents in the household.
Question 15 options:
True
False
Question 16 (1 point)
Listen
(Side note: IDK what this "Listen" was supposed to be — maybe an audio part of the prompt not saved in the log we're reading?)
As adults begin to age their social network begins to expand.
I mean two individual consecutive sent messages, like if you and I were texting and you sent two different texts in a row without me sending one in between. I've never seen Gemini do that (or any other AI I've messed with). In fact, that same method is how penguinz0 recently discovered his AI was lying to him when it said it was actually a human; it was unable to send two consecutive but individual messages even when asked multiple times.
At the end of this, it looks an awful lot like Gemini did just that, with the second individual message being the messed-up thing. Again, this is totally possible; I've just never seen it, and it makes me think there was a prompt in there from the user that is missing.
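To make the turn-taking argument concrete: chat-style LLM interfaces typically store history as strictly alternating user/model turns, so two model replies in a row in a screenshot would usually mean a turn is missing from the log. Here's a minimal sketch of that check — the transcript contents and field names are made up for illustration, not taken from the actual Gemini log:

```python
def find_alternation_breaks(transcript):
    """Return indices where the same role speaks twice in a row."""
    breaks = []
    for i in range(1, len(transcript)):
        if transcript[i]["role"] == transcript[i - 1]["role"]:
            breaks.append(i)
    return breaks

# Hypothetical transcript shaped like the screenshot under discussion.
chat = [
    {"role": "user", "content": "Question 15: True or False ..."},
    {"role": "model", "content": "False."},
    {"role": "model", "content": "This is for you, human ..."},  # back-to-back model turn
]

print(find_alternation_breaks(chat))  # → [2]
```

A non-empty result flags exactly the kind of "double response" being debated here; an empty one means the log alternates cleanly, which is what the later comments conclude actually happened.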
You can expand the chats too, so I don't even think there's missing user input... I'm a mega idiot lol. The more I look at this, the more I'm convinced it's legit.
Even if they included it, it changes fuck all imo. We've known for a long time now that these things hallucinate, or presumably throw a Hail Mary at what comes next conversationally/prediction-wise. Also, as the other poster pointed out, the author referring to a 29-year-old woman as a "girl" probably tells you all you need to know about the journalistic integrity of that site.