Alexa, Siri and Google can be tricked by commands you can’t hear

Posted on May 18, 2018

As tens of millions of delighted owners know, Siri, Alexa, Cortana and Google will do lots of useful things in response to voice commands. But what if an attacker could find a way to tell them to do something their owners would rather they didn't? Researchers have been probing this possibility for a few years, and now, according to a New York Times article, a team at the University of California, Berkeley has shown how it could happen.

Their discovery is that commands can be hidden inside audio, such as spoken statements or music streams, in a way that is inaudible to humans: a listener hears something innocuous, while the virtual assistants interpret it as specific commands. The researchers had previously demonstrated how this principle could be used to fool Mozilla's DeepSpeech speech-to-text engine.
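The general recipe behind attacks of this kind is gradient-based optimization: start from a benign waveform, then search for a small perturbation that the target model transcribes as the attacker's chosen phrase while staying quiet enough to go unnoticed. Below is a minimal conceptual sketch of that idea in PyTorch. To keep it self-contained, the ToyASR network is a hypothetical stand-in for a real engine such as DeepSpeech, and the target phrase, amplitude budget, and loss weight are illustrative assumptions rather than values from the published research.

```python
# Sketch of a targeted adversarial-audio attack: optimize a quiet
# perturbation "delta" so the model transcribes (audio + delta) as an
# attacker-chosen phrase. ToyASR is a hypothetical placeholder model.
import torch
import torch.nn as nn

BLANK = 0                                  # CTC blank symbol (index 0)
CHARS = "abcdefghijklmnopqrstuvwxyz "
char_to_idx = {c: i + 1 for i, c in enumerate(CHARS)}

class ToyASR(nn.Module):
    """Hypothetical stand-in for a real speech-to-text model."""
    def __init__(self, n_classes):
        super().__init__()
        self.conv = nn.Conv1d(1, 32, kernel_size=160, stride=80)
        self.head = nn.Conv1d(32, n_classes, kernel_size=1)

    def forward(self, wav):                # wav: (batch, samples)
        x = torch.relu(self.conv(wav.unsqueeze(1)))
        logits = self.head(x)              # (batch, classes, frames)
        return logits.permute(2, 0, 1)     # (frames, batch, classes) for CTC

torch.manual_seed(0)
model = ToyASR(n_classes=len(CHARS) + 1).eval()
for p in model.parameters():               # freeze the model; only delta trains
    p.requires_grad_(False)

benign = torch.randn(1, 16000) * 0.1       # stand-in for 1 s of benign audio
target = "ok open the door"                # illustrative attacker phrase
target_ids = torch.tensor([[char_to_idx[c] for c in target]])

delta = torch.zeros_like(benign, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-3)
ctc = nn.CTCLoss(blank=BLANK)

for step in range(500):
    adv = benign + delta
    log_probs = model(adv).log_softmax(dim=-1)   # (frames, batch, classes)
    frames = log_probs.shape[0]
    loss = ctc(log_probs, target_ids,
               input_lengths=torch.tensor([frames]),
               target_lengths=torch.tensor([target_ids.shape[1]]))
    # Penalize loudness so the perturbation stays hard for a human to notice.
    loss = loss + 0.1 * delta.pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():                  # keep delta in a small amplitude budget
        delta.clamp_(-0.05, 0.05)
```

The published work follows roughly this loss shape, pairing a transcription term that pulls the model's output toward the target phrase with a constraint that keeps the perturbation small, but applies it to the full DeepSpeech network and real recordings rather than a toy model.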

Source: sophos.com