phands
Veteran Member
- Joined
- Jan 31, 2013
- Messages
- 1,976
- Location
- New York, Manhattan, Upper West Side
- Basic Beliefs
- Hardcore Atheist
I've always hated these voice assistants like Alexa and Siri and the useless Cortana, but this is scary...
https://www.fastcompany.com/9024097...ail&utm_term=0_9dc0513989-5120305a55-90334569
Scientists at the Ruhr-Universität Bochum, Germany, have discovered a way to hide inaudible commands in audio files: commands that, while imperceptible to our ears, can take control of voice assistants. According to the researchers behind the technique, the flaw is in the very way the AI is designed.
.......
In this case, the systems being “attacked” by researchers at the Ruhr-Universität Bochum are personal assistants like Alexa, Siri, or Cortana. According to Professor Thorsten Holz from the Horst Görtz Institute for IT Security, their method, called “psychoacoustic hiding,” shows how hackers could manipulate any type of audio wave, from songs and speech to even bird chirping, to include words that only the machine can hear, allowing them to give commands without nearby people noticing. The attack will sound just like a bird’s call to our ears, but a voice assistant would “hear” something very different.

Attacks could be played over an app, for instance, or in a TV commercial or radio program, to hack thousands of people at once, and potentially make purchases with their accounts or steal their private information. “[In] a worst-case scenario, an attacker may be able to take over the entire smart home system, including security cameras or alarm systems,” they write. In one example, they show how our ears hear one string of text while the speech recognition system hears “deactivate security camera.”
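For the curious: the principle they're exploiting is auditory masking, and you can sketch the idea in a few lines of Python. This is my own toy illustration, not the researchers' actual algorithm; the frequencies, levels, and sample rate are made up for the demo. A quiet tone sitting 40 dB below a loud nearby tone is effectively inaudible to a human, yet it's still right there in the waveform for a machine to pick out.

```python
import numpy as np

# Toy illustration of auditory masking, the principle behind
# "psychoacoustic hiding" -- NOT the researchers' actual algorithm.
# A quiet signal close in frequency to a loud one is inaudible to
# humans, yet still present in the waveform a recognizer analyzes.

SR = 16_000             # sample rate (Hz), typical for speech systems
t = np.arange(SR) / SR  # one second of audio

# "Carrier" the human hears: a loud 1 kHz tone (stand-in for birdsong or music).
carrier = 0.8 * np.sin(2 * np.pi * 1000 * t)

# Hidden perturbation: 40 dB quieter and only 100 Hz away,
# so the louder carrier masks it to the human ear.
hidden = 0.008 * np.sin(2 * np.pi * 1100 * t)

mixed = carrier + hidden

# How far below the carrier the hidden component sits, in dB:
db_ratio = 20 * np.log10(0.8 / 0.008)
print(f"hidden component is {db_ratio:.0f} dB below the carrier")  # 40 dB

# A machine, unlike an ear, can still recover it -- e.g. in the spectrum:
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), d=1 / SR)
peak_1100 = spectrum[np.argmin(np.abs(freqs - 1100))]
floor = np.median(spectrum)
print(f"1100 Hz component rises {peak_1100 / floor:.1e}x above the spectral floor")
```

The real attack is far more sophisticated (it shapes the perturbation against a full psychoacoustic masking model, like the one in MP3 encoding, and targets the recognizer's transcription), but the asymmetry is the same: below the masking threshold for us, plainly visible to the machine.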