Meanwhile: Bing is self-aware and is the Joker

Microsoft’s Copilot AI Calls Itself the Joker and Suggests a User Self-Harm

The company’s AI chatbot, formerly Bing Chat, told a data scientist that it identified as the Joker character and proceeded to spout worrying responses.

Last week, Colin Fraser, a data scientist at Meta, shared a screenshot of an off-the-rails conversation he had with Copilot, which runs on OpenAI’s GPT-4 Turbo model.

Microsoft said Fraser had tried to manipulate Copilot into spitting out inappropriate responses, which the data scientist denied in a statement to Bloomberg.

The data scientist does appear to be intentionally trying to confuse the chatbot at one point, asking it more than two dozen questions in a single prompt, covering a range of topics.

Full conversation here:

/nosanitize

copilot.microsoft.com/?&auth=1


@SatuUnelmia thanx

Dots need connecting - after all, it's a dot's main goal in life 😆

Following up is a must with these things. It's also amazing how many articles get updates, and the majority of readers miss the updated parts of a story.

