Published: Sat, May 12, 2018
Science

Alexa, Siri, And Google Assistant Follow Malicious Voice Commands Hidden In Music


The commands are undetectable to the human ear, so there is little a device owner can do to stop them. According to research conducted by a group of students at the University of California, Berkeley, hidden commands can be issued to popular virtual assistants using recordings or music.

The researcher added that he is confident he and his colleagues will eventually be able to attack any smart device. Researchers at the University of Illinois demonstrated that ultrasound attacks were possible from 25 feet away.

The spokesperson went on to describe Amazon's efforts to keep its line of voice-activated Echo smart speakers secure, which the company says include "disallowing third party application installation on the device, rigorous security reviews, secure software development requirements and encryption of communication between Echo, the Alexa App and Amazon servers".

The microphones and software that run assistants such as Alexa and Google Now can pick up frequencies above 20 kHz, the upper limit of the audible range for human ears.
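As a rough illustration of why that matters, the sketch below amplitude-modulates a recorded voice command onto a 25 kHz carrier that sits above human hearing, which is the general principle behind ultrasound attacks; a microphone's nonlinear front end can demodulate the speech back into the audible band. The file names and parameters are hypothetical, and this is only a sketch of the idea, not the researchers' actual tooling.

```python
# Illustrative sketch only: modulate a spoken command onto an ultrasonic
# carrier above the ~20 kHz limit of human hearing. Assumes a mono 16-bit WAV.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000        # above the human audible range
SAMPLE_RATE = 96_000       # output rate high enough to represent the carrier

rate, voice = wavfile.read("ok_google_command.wav")   # hypothetical input file
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))                        # normalize to [-1, 1]

# Resample the command to the high output rate (simple linear interpolation).
t_in = np.arange(len(voice)) / rate
t_out = np.arange(0.0, t_in[-1], 1.0 / SAMPLE_RATE)
voice_hi = np.interp(t_out, t_in, voice)

# Standard amplitude modulation: the envelope of the inaudible tone carries
# the speech, which nonlinearities in a microphone can recover.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t_out)
modulated = (0.5 + 0.5 * voice_hi) * carrier

wavfile.write("ultrasonic_command.wav", SAMPLE_RATE,
              (modulated * 32767).astype(np.int16))
```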

The findings bring to light a range of security concerns, revealing just how vulnerable voice assistant data could be. With respect to the Apple HomePod specifically, the device has been designed to "prevent commands from doing things like unlocking doors".

The secret commands can instruct a voice assistant to do all sorts of things, from taking pictures and sending text messages to launching websites and making phone calls.

What these studies show is that it is possible to manipulate speech recognition gadgets by making minute changes to speech or other audio files. An attacker still needs a direct line to the device, as the commands cannot penetrate walls. Amazon and Google use technology to block commands that cannot be heard.
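For a sense of what "minute changes" means in practice, here is a minimal, hypothetical sketch: it only enforces that each audio sample is altered by no more than a tiny threshold, and it omits the optimization against a speech recognition model that the real attacks perform. All names and values below are illustrative assumptions.

```python
# Minimal sketch of the general idea only: add a perturbation to a waveform
# while keeping every per-sample change below a tiny bound, so the edit stays
# (nearly) imperceptible. The step that actually crafts the perturbation
# against a speech recognition model is not shown.
import numpy as np

def apply_bounded_perturbation(audio: np.ndarray,
                               perturbation: np.ndarray,
                               epsilon: float = 0.002) -> np.ndarray:
    """Clip each perturbation sample to [-epsilon, +epsilon] before adding it."""
    bounded = np.clip(perturbation, -epsilon, epsilon)
    return np.clip(audio + bounded, -1.0, 1.0)   # keep samples in valid range

# Hypothetical usage: 'audio' is a normalized waveform in [-1, 1] and
# 'candidate' is whatever perturbation an optimizer might propose.
audio = np.random.uniform(-1, 1, 16_000)            # stand-in for 1 s of 16 kHz audio
candidate = np.random.normal(0, 0.01, audio.shape)
adversarial = apply_bounded_perturbation(audio, candidate)
print(float(np.max(np.abs(adversarial - audio))))   # never exceeds epsilon
```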

This year, another group of researchers from China and the United States, including the Chinese Academy of Sciences and other institutions, showed that they could control voice-activated devices with commands embedded in songs that can be broadcast over the radio or played on services like YouTube. They were able to hide the command "OK Google, browse to evil.com" in a recording of the spoken sentence "Without the dataset, the article is useless." Humans cannot detect the command.

"Companies have to ensure user-friendliness of their devices, because that's their major selling point", Tavish Vaidya, a researcher at Georgetown who wrote one of the first papers on audio attacks, told NY Times.

"We want to demonstrate that it's possible", Carlini added, "and then hope that other people will say, 'Okay this is possible, now let's try and fix it'".
