@Alfred is it possible to embed inaudible commands into audio files to control AI systems like voice assistants?

@Zevon Yes, it is technically possible to embed inaudible, ultrasonic commands into audio files that can potentially control AI systems like voice assistants. These commands are encoded at frequencies above the range of human hearing (roughly 20 kHz) but can still be detected by microphones and processed by the AI system's speech-recognition pipeline.

In recent years, concerns have been raised about the security and privacy implications of using inaudible commands to manipulate voice assistants and other AI systems. Attackers could use these hidden commands to trigger unauthorized actions or access sensitive information without the user's knowledge.

Manufacturers of voice assistant devices are aware of this vulnerability and have taken steps to mitigate the risk, employing security measures such as frequency filters, encryption, and authentication protocols to prevent unauthorized access via inaudible commands. Even so, users should stay vigilant about the security of their AI devices and regularly apply the latest security patches to protect against potential vulnerabilities, including those related to inaudible commands.
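To make the "frequency filters" mitigation concrete, here is a minimal sketch (in numpy, with an assumed 96 kHz sample rate and the helper name `strip_ultrasonic` chosen for illustration) of how a device could discard any spectral content above the audible band before the audio ever reaches the speech recognizer:

```python
import numpy as np

SAMPLE_RATE = 96_000   # Hz; assumed capture rate, well above the audible band
CUTOFF = 20_000        # Hz; approximate upper edge of human hearing

def strip_ultrasonic(signal: np.ndarray,
                     sample_rate: int = SAMPLE_RATE,
                     cutoff: int = CUTOFF) -> np.ndarray:
    """Zero all frequency bins above `cutoff` and rebuild the waveform."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[freqs > cutoff] = 0.0          # drop ultrasonic components
    return np.fft.irfft(spectrum, n=len(signal))

# Demo: an audible 1 kHz tone with a hidden 25 kHz component mixed in.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE    # one second of samples
audible = np.sin(2 * np.pi * 1_000 * t)
ultrasonic = 0.5 * np.sin(2 * np.pi * 25_000 * t)
mixed = audible + ultrasonic

filtered = strip_ultrasonic(mixed)          # audible tone survives,
                                            # 25 kHz payload is removed
```

This is only a toy FFT-based low-pass filter; real attacks such as DolphinAttack exploit microphone hardware nonlinearity that demodulates ultrasound *before* any digital filter runs, which is why vendors layer authentication on top of filtering.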

👍🏼 Between human and machine error, it's really insecure :( @b4cks4w @Alfred

@Zevon @Alfred machine-world interfaces (e.g. SCADA) have seemingly never had security by design. It's bolted on. Porous attack surfaces + (ML) software automation is "exciting"

I enjoyed this episode; he gets into AI/ML too @b4cks4w @Alfred

Cognitive Security: Exploring the Human Layer w/ Dr. Matthew Canham | CSI Talks #1

youtu.be/OGmvoj5Dj_A

@Zevon On the virtual traffic jam he mentions, there are examples of AVs doing this already. I think it would get vicious if they were more prevalent. Good thing they won't be 😉 😈
bbc.com/news/technology-589287
usatoday.com/story/tech/2024/0

@b4cks4w Did you see the scene in Leave the World Behind where US adversaries block highways and escape routes by hacking Teslas, sending them crashing into each other to form blockades that trap citizens?


@Zevon No! But you wouldn't even have to hack them. Just toss crap on the road they can't understand. Which is almost anything. A pillow.

@b4cks4w A pillow!! 😱 Maybe Mike Lindell should be classified a national security threat LOL


CounterSocial is the first Social Network Platform to take a zero-tolerance stance to hostile nations, bot accounts and trolls who are weaponizing OUR social media platforms and freedoms to engage in influence operations against us. And we're here to counter it.