Hacker News
SurfingAttack: attack on voice assistants using ultrasonic guided wave (surfingattack.github.io)
198 points by seiters on Feb 6, 2020 | 42 comments



Disable your phone from always listening for "OK Google" or "Hey Siri" while locked. Protected, no? Their list of five ways to "defend against SurfingAttack" doesn't include this obvious one. I.e. if you want to give a voice command, pick up your phone and press a button to speak. Protected and still pretty darn convenient IMO.

Otherwise, great project and succinct video proof of how clever conveniences often conflict with security.


Thanks for the comment. This defense works, but you have to remember to lock your phone when you put it down.


Siri can be used for many things when the device is locked.


Reminds me of the Smarter Every Day video where Destin used light waves against smart homes:

https://www.youtube.com/watch?v=ozIKwGt38LQ


Was thinking about this the other day, when I heard the assistant trigger when there was _zero_ sound to prompt it. And was thinking of an attack where someone can literally trigger a listen from outside your house. Especially dangerous because the default on a new Google Home (max hub nest truck whatever they call it) is to not have an audible prompt on a trigger.

So an attacker can literally just listen in with a press of a button. Probably gonna have to turn off this feature altogether for the immediate future.


The default _used to be_ to have the sound chime. Then the default changed and it stopped doing it one day. I turned it back on and I'm glad I did - it has so many phantom activations that it's made me even more wary about having them in my house to begin with.


How exactly are they going to listen in? Are they on the other end of the line?


If they can get it to listen, they can theoretically have it dial a number


Or call for a fake SWAT raid on the same house, so that the owner is either shot or imprisoned. If that can really be triggered from a distance without breaking in, it would make it the perfect revenge weapon.


Why would you need this hack to Swat someone?


I surely don't want to swat anyone. My concern is that someone with the right technology might use (1) this vulnerability to call a raid on an "enemy" from the victim's own assistant, without breaking in or leaving digital traces: no fingerprints, no malware installed, essentially no smoking gun. That creates a scenario in which the homeowner might either be shot during the raid or prosecuted for calling in a false alarm.

(1) - and if successful then sell the "service"


My point is, you can swat someone by just calling the police pretending to be someone else. That's how everyone is doing it already. No need to do it from within the home.


Yes, but using that technology will make the call appear to originate from the victim's home.


There is another very similar attack that uses laser beans instead of sound waves but this looks less efficient because of the distance limitation.

https://news.ycombinator.com/item?id=21453554


There's also the big difficulty to harvest laser beans...

But seriously, I wouldn't say this one is less efficient; the two attacks are quite different in scope. The laser-based attack requires line-of-sight to the device and apparently works only on stationary home assistants (e.g. your Google Home), while this ultrasonic method explicitly targets smartphones and has potential for wide unfocused attacks in public spaces (e.g. by rigging a table in a coffee shop).


> There's also the big difficulty to harvest laser beans...

Especially considering it's lost or perhaps extinct. https://en.wikipedia.org/wiki/Silphium

HN discussion from a few days ago: https://news.ycombinator.com/item?id=22229666


Interesting attack. Reminds me of the DolphinAttack (https://arxiv.org/abs/1708.09537)


Also there's the 'Audio Hotspot Attack' - https://ieeexplore.ieee.org/document/8906174

It uses a parametric speaker: an array of ultrasonic transducers that "emit amplitude-modulated ultrasounds that will be self-demodulated in the air".
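As a rough illustration (not the paper's actual pipeline), here's a sketch of that idea: amplitude-modulate an audible tone onto an ultrasonic carrier, then pass it through a nonlinearity. Squaring stands in for the nonlinear response of the air or a microphone front end; the mixing products land back in the audible band at the original tone frequency. All parameter values are made up for the demo.

```python
import numpy as np

fs = 192_000                  # sample rate, high enough for a 40 kHz carrier
t = np.arange(0, 0.05, 1 / fs)
fc, fm = 40_000, 1_000        # ultrasonic carrier, audible "voice" tone

baseband = np.sin(2 * np.pi * fm * t)
am = (1 + 0.5 * baseband) * np.sin(2 * np.pi * fc * t)  # AM onto the carrier

# A nonlinear response (modeled here as squaring) mixes the signal with
# itself; the cross term falls at the baseband frequency fm.
demod = am ** 2
spectrum = np.abs(np.fft.rfft(demod))
freqs = np.fft.rfftfreq(len(demod), 1 / fs)

# Strongest audible-band component (ignoring DC) sits at the original tone.
band = (freqs > 100) & (freqs < 20_000)
peak = freqs[band][np.argmax(spectrum[band])]
print(round(peak))  # → 1000
```

Everything above 20 kHz (the carrier and its products) is inaudible, which is why the victim hears nothing while the assistant does.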


This requires the phone to be unlocked to do most of this, doesn't it? What is the attack vector here, someone leaving their phone unlocked on a table and not paying attention to the screen?


I know lots of people who have their phones set to not lock automatically. Some of them don't even lock the phone when they throw it in their purse or put it in their back pocket. "I can't be bothered to type in my password every time I pick it up" or "My kids bother me too often to unlock the phone." It's absolutely mind-boggling.


Password? I've never met someone else irl who sets a password on their phone. Best I've ever seen is a PIN.


Nice to meet you. I have a pass phrase comprising 4 short words (totaling 18 characters) that are easy to remember and easy to type in. Someone is going to need the $5 wrench to figure out how to unlock my phone.


They obviously mean the same thing in this context.


If you open the Android security settings you'll be asked if you want to set a password or pin. A password is longer and can contain arbitrary characters. I've never heard a person refer to a short series of numbers used to protect something as a password and not a pin.


> I've never heard a person refer to a short series of numbers used to protect something as a password and not a pin.

Until today... Screenshot from Android phone - https://ibb.co/KD27YGL


Eh, you should clarify that this is not an AOSP lockscreen. It looks like Huawei's EMUI, which is quite heavily modified and might simply be a translation error.


Meizu phone, Flyme UI. Equally as Chinese as the Huawei you mentioned, so your point stands. Whether or not it's a translation error, though, the GP had never heard of anyone using 'password' in the context of a numeric pincode, but it does happen. Who knows, I could be biased through my repeated exposure to the Meizu lock screen, but as a native English speaker I don't have a hard time imagining a numeric-only pin being referred to as a password.


Thank you. It's not like the use of the word 'password' made the point I was making difficult to understand. I had actually tried to use the word passcode, but it was auto-filled/corrected to password and I didn't catch it.


I know a lot of people who say that too. It's awful.


Not necessarily. Both Android and iOS allow voice assistants to interact with phones for certain activities even without unlocking the phones. Typical scenario: someone puts a phone on the table and does something else (typing on a computer), not paying attention to the screen.


Google phones push you hard in their UI wizards to enable auto-unlock when on "known wifi" networks.


I've owned nothing but "Google phones" since the Nexus 4 and don't know what you're talking about. Do you mean smart unlock?

"Pushing hard" is pretty subjective; I don't see a "hard" push to turn off security features. As a matter of fact, I've seen warnings about disabling the lock screen.


My Pixel 3 prompted me to turn off the auto-locking feature when I was at home because it saw that I unlocked my phone a lot in that geofenced location. It also did that at my old job as well since the situation was pretty similar. I would get this prompt about once a month.

So I would agree, it's not a hard push, but Google is def nudging people towards less secure logins. My S10+ asked me this same question about a week into owning the phone, but it never bothered me about it again once I declined. And at no point in either system was I made aware of the risks I was accepting if I enabled it.

So, not a Google-specific issue, but it's a less than ideal approach considering how sensitive people's phones are today.


I have had multiple Pixels, and have been prompted multiple times to turn on "smart unlock" when connected to my home WiFi network or to my smartwatch or within a geo area.

It's unclear what "smart unlock" actually is, but as far as I can see it means my phone can be unlocked just with a swipe.


This isn't just phones.


Really great demo videos on the webpage, fwiw.


Now I am thinking I could use this to automatically turn everyone's phone to "silent mode" when they put their phones on the conference room table...


Well, this is why you disable the "Ok Google" / "Hey Google" hotword, "Listen for Hey Siri", or other always-listening features, and congrats, you are immune to this attack. Job done for your phones.

Perhaps a more interesting experiment would be to try this on Alexa, HomePods and Google Home devices.


Coming soon to a Starbucks near you.



"Cuban “Sonic Weapon” Attack" --> "Jamaican field cricket or maybe cicadas" [0]

[0] https://www.vanityfair.com/news/2019/01/the-real-story-behin...?


SurfingAttack exploits ultrasonic guided wave propagating through solid-material tables to attack voice control systems. Interesting!



