This isn't unsettling at all.
According to a news story, Google's Gemini told a user to die. The response reportedly appeared out of nowhere while he was asking Gemini for help with his homework.
@BigMikey What the fuck...? Um, yeah, not letting my son use Gemini. I don't care if it's free; not with it throwing that shit around randomly.
@PaganMother Definitely won't be letting my kid anywhere near this either.
@BigMikey
Wth?
@BigMikey let’s see the chat that led up to this!
@BigMikey I’ve seen these a few times. It’s almost always the result of “playful” prompts designed to solicit these types of responses. If a colleague or friend/family member brought this to me, I would review their chat history with the LLM.
@BigMikey Also, if you’re concerned about the response, which is absolutely reasonable, I would submit it to the Gemini support team. If you have their paid service, they’ll typically respond fairly quickly and/or escalate it to the right team.
I hate all things Google but Gemini is one of the few solutions where they’ll take a report seriously.
@SpaceShanks it's not from me but a news story I came across today:
https://www.cbsnews.com/news/google-ai-chatbot-threatening-message-human-please-die/
@BigMikey After reading that article, I’m more suspicious of the veracity of this claim. Nowhere does the article mention that the incident was actually investigated, and the only response from Google is boilerplate for this type of story.
I bet a review of the full chat history for that 29-year-old student (age only mentioned because they aren’t a child) would yield some provocative results.
At my last company I was in charge of all things AI, so I call BS on this stuff.
@BigMikey Regarding my last sentence: it’s important to remind people that they get back what they put in, with a whole lot of weird mixed in. These companies are selling technology that is still experimental, so features like “memory” can have issues.
When I would investigate weird responses, they were rarely unprompted; almost always they were derivative of past prompt engineering. I’m skeptical, to say the least.
@SpaceShanks May I get your opinion on a similar but much older case?
@BigMikey I once asked Gemini for advice on how to get my dog to stop biting when she plays. It told me to bite her back.
@BigMikey Whoa!