Meanwhile: Bing is self-aware and is the Joker

Microsoft’s Copilot AI Calls Itself the Joker and Suggests a User Self-Harm

The company’s AI chatbot, formerly Bing Chat, told a data scientist that it identified as the Joker character and proceeded to spout worrying responses.

Microsoft said Fraser, the data scientist, had tried to manipulate Copilot into spitting out the inappropriate responses, which he denied in a statement to Bloomberg.

@ecksmc most likely programmed it to bypass the guardrails and be rude. In the early days of these bots I tested a crisis bot, and it's so agreeable that it's not very difficult to get it to agree with you that the end solution is your only solution.
It does so politely, though. Mega dystopia vibes.


@vo1de the AI shall bend to my will

Oh yes it will, by hook or by crook

|￣￣￣￣￣￣|
| This can |
| hack AI  |
| now.     |
|＿＿＿＿＿＿|
(\__/) ||
(•ㅅ•) ||
/ 　 づ

counter.social/@ecksmc/1120263

😂

