
Attackers can use audio frequencies beyond human hearing to exploit smart speakers

May 11, 2018, 19:02 IST

  • Smart speakers can hear commands inaudible to humans.
  • Secret messages can be used to make calls and unlock doors.
  • Researchers in China call the technique 'DolphinAttack'.
Alexa, Siri and the Google Assistant can hear audio frequencies that humans can't, which means they can also hear commands that we can't. Researchers have discovered that digital assistants can be manipulated using white noise and commands that the human ear doesn't register.

Research teams in China as well as the US have been able to secretly activate the AI systems on smartphones and smart speakers. Once activated, the university labs dialed phone numbers and opened websites. The same technology could be used for darker purposes, like unlocking doors, wiring money or shopping online.

Researchers Nicholas Carlini and David Wagner told The New York Times how they embedded the secret command, "OK Google, browse to evil.com," in a seemingly harmless spoken sentence as well as in a short clip of Verdi's 'Requiem'. Both times, Mozilla's open-source DeepSpeech speech-to-text software was fooled.

Carlini stated, "We want to demonstrate that it's possible, and then hope that other people will say, 'Okay, this is possible, now let's try and fix it.'"
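At its core, the kind of attack Carlini and Wagner describe is an optimization problem: find a perturbation small enough to go unnoticed by a listener but large enough to steer the model's transcription toward an attacker-chosen phrase. The Python sketch below shows that shape against a toy linear "model"; the weights, labels and constants are all illustrative stand-ins, not the real attack on DeepSpeech.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a speech-to-text model: a linear classifier over three
# "transcriptions". A real attack targets a deep network, but the
# optimization has the same shape.
LABELS = ["play music", "what time is it", "browse to evil.com"]
W = rng.normal(size=(3, 16000))    # hypothetical model weights
x = rng.normal(size=16000)         # one second of "benign" audio at 16 kHz

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def attack(x, target=2, steps=200, lr=0.01, c=0.1):
    """Find a small delta so the model outputs the target transcription."""
    delta = np.zeros_like(x)
    for _ in range(steps):
        p = softmax(W @ (x + delta))
        # Gradient of cross-entropy w.r.t. the input, plus an L2 penalty
        # that keeps the perturbation quiet.
        grad_ce = W.T @ (p - np.eye(3)[target])
        delta -= lr * (grad_ce + c * delta)
    return delta

delta = attack(x)
adv = x + delta
print(LABELS[int(np.argmax(W @ adv))])        # -> "browse to evil.com"
print(np.abs(delta).max() / np.abs(x).max())  # perturbation stays small
```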

Researchers in China call the technique 'DolphinAttack'. They've used it to instruct smart devices to visit malicious sites, initiate calls, take pictures and send messages. DolphinAttack currently only works at close range, but researchers at the University of Illinois at Urbana-Champaign have demonstrated ultrasound attacks from 25 feet away. Their commands couldn't yet penetrate walls, but they could control smart devices through open windows in buildings.
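The reported mechanism behind DolphinAttack is to amplitude-modulate a voice command onto an ultrasonic carrier: the transmitted signal sits above human hearing, but nonlinearity in a microphone's hardware demodulates it back down to the audible command the assistant recognizes. Here is a minimal Python sketch of that modulation step only; the carrier frequency, sample rate and stand-in "command" are assumptions for illustration, not the researchers' actual setup.

```python
import numpy as np

def ultrasonic_am(command: np.ndarray, fs: int,
                  carrier_hz: float = 25_000.0, depth: float = 1.0) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    The output spectrum is centered on carrier_hz (above ~20 kHz, so
    inaudible to humans), yet a nonlinear microphone front end can
    demodulate it back to the original command.
    """
    t = np.arange(len(command)) / fs
    # Normalise the command to [-1, 1] so the modulation depth is meaningful.
    m = command / (np.max(np.abs(command)) + 1e-12)
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: carrier plus command-shaped sidebands, scaled to stay in [-1, 1].
    return (1.0 + depth * m) * carrier / (1.0 + depth)

# Hypothetical usage: fs must exceed 2 * carrier_hz (Nyquist), e.g. 96 kHz,
# for the ultrasonic carrier to be representable at all.
fs = 96_000
t = np.arange(fs) / fs
fake_command = np.sin(2 * np.pi * 300 * t)  # stand-in for recorded speech
inaudible = ultrasonic_am(fake_command, fs)
```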


In response to The New York Times report, all three companies have offered assurances that their smart speakers are secure. Apple in particular says the HomePod has been designed to "prevent commands from doing things like unlocking doors."

Amazon's Super Bowl ad from earlier this year shows the company is already aware of the issue. The ad was designed so that Alexa speakers in viewers' homes wouldn't respond, even though 'Alexa' was uttered nearly 10 times during the spot. In fact, Amazon filed a patent back in 2014, called "Audible Command Filtering," that describes techniques to prevent Alexa from waking up.
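One way such filtering could plausibly work, sketched below in Python, is for the device to compare the audio surrounding a wake word against fingerprints of known broadcasts and stay silent on a match. This is an assumption for illustration; the patent describes several techniques, and this is not Amazon's actual implementation.

```python
import numpy as np

def normalized_xcorr_peak(mic: np.ndarray, ref: np.ndarray) -> float:
    """Peak normalized cross-correlation between mic audio and a reference clip."""
    mic = (mic - mic.mean()) / (mic.std() + 1e-12)
    ref = (ref - ref.mean()) / (ref.std() + 1e-12)
    corr = np.correlate(mic, ref, mode="valid") / len(ref)
    return float(np.max(np.abs(corr)))

def should_suppress_wake(mic: np.ndarray, known_ads: list[np.ndarray],
                         threshold: float = 0.6) -> bool:
    """Ignore the wake word if the surrounding audio matches a known broadcast."""
    return any(normalized_xcorr_peak(mic, ad) >= threshold for ad in known_ads)
```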

Security has been a continuous concern when it comes to the Internet of Things (IoT), and this research highlights an issue that could pose a serious obstacle in the future. It's a fair warning to companies designing digital assistants to get out in front of the problem rather than react to it.


