Hello all, I'm Sriram, the first author of this paper. We were inspired by the idea of laser microphones as an audio eavesdropping vector, and tried to find a way to use LIDARs similarly, even though they're not designed for this purpose at all.
In the near future, what I think is scarier is the possibility of executing the same attack through self-driving cars' LIDARs. Perhaps this would allow attackers to spy on conversations in cars that are driving beside you or stationary next to you at traffic lights.
What materials did you find were most difficult to perform this attack on? I am guessing materials like wood or rubber do not vibrate enough to observe a pattern? Also, wouldn't you be able to prevent this attack by having audio generators that produce near-random audio signals that mask the data the attacker is seeking?
Right, good question. Anything that is very rigid and heavy doesn't vibrate enough when we play sound near it. I would say it's more about the thickness and weight of the target object than the material itself. For example, a very thin piece of wood would work much better than a wooden table leg. In the paper we actually test against ten different objects that are likely to be within reach of the robot vacuum cleaner.
The difficulty with masking this attack is that you need audio playing at a comparable volume near the legitimate speech sources, which might be pretty disruptive. In the case of background white noise generators, for legitimate audio playing at around 70 dB SPL, we don't lose much accuracy until the background white noise exceeds 75 dB SPL.
Very interesting. All the new phones have lidar built in that works up to 30 feet away. In theory, with the technology you applied here, it would be possible for them to eavesdrop without a microphone. This is almost as unsettling as the cameras that can visualize WiFi reflecting off human skin through walls. It's a matter of time before a commercial model hits the market. https://youtu.be/fGZzNZnYIHo What a brave new world we are about to live in.
Thank you! For the Xiaomi Roborock S5, the plastic housing around the LIDAR makes it hard to see if it's rotating when you're standing above it. If you bend down and look at it side-on, you can tell whether it's rotating or not.
Some feasible ways to stealthily perform the attack when the LIDAR is not rotating could be: a) attack while docked at the charging station, or b) hide under furniture.
If only my Roomba was that smart, I probably wouldn't worry about eavesdropping: right now it can barely clean my floor and locks itself in the bathroom forever.
Jokes aside, which robot vacuum cleaners are equipped with a LIDAR? So far the only ones that I've seen barely have a proximity sensor, fall sensor, and IR sensors. It could be that I've only bought and seen the cheapest versions though.
Roborocks are really nice and not that expensive. I've actually been really impressed with just how well it maps the floors. The other day I had to clean some cat fur out of it mid-cycle, placed it in a totally different part of the room that wasn't in sight of the dock, and it was able to fairly quickly figure out where it was.
Have one of those and it is really impressive. It automatically detected all rooms, so I can just tell it to clean a specific one. No matter where I put it, it knows where it is, even if I have rearranged some chairs etc.
I bought an S50 from China and it constantly errors out on carpet. It seems to be a common thing; I guess China doesn't do full carpet like we have in the US, so it's something they didn't test for on my version.
The lidar is impressive though. Cleans way faster since it's taking efficient paths.
The Roborock S5 and older models can easily be rooted, and you can install alternative open-source firmware [0]. Newer models still require hardware disassembly in order to flash the firmware, so it's not quite user-friendly yet [1].
For newer models you can, however, extract the API token from the app and use it to control the vacuum without internet access, and block its external internet access entirely, since the vacuum speaks a known protocol [2].
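For anyone curious what that looks like in practice, here's a minimal sketch using the python-miio library, assuming you've already pulled the token out of the app (class names have shifted between library versions); the IP address and token below are placeholders:

    # Minimal local-control sketch with python-miio (pip install python-miio).
    # The IP and token are placeholders; extract the real token from the app first.
    from miio import Vacuum

    vac = Vacuum(ip="192.168.1.50", token="0123456789abcdef0123456789abcdef")

    print(vac.status())   # battery, state, error codes, all over the local network
    vac.start()           # begin a cleaning run
    vac.home()            # send it back to the dock

Everything goes over the local miIO protocol, so this keeps working even with the vacuum's internet access blocked at the router.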
I did this with my S5. It's not the easiest operation, but the nice thing about the device is that a factory reset is an actual factory reset, so you can always start over with factory firmware if you mess up.
I use a WireGuard VPN to access the vacuum's web UI to control it; it has no internet access at all.
Some, like the Deebot, even have common household object detection in addition to the lidar and can move around those objects. Not sure how well it works in practice.
My bad, they were acquired by a German company, not iRobot. At the same time iRobot bought another company that used indoor GPS, Mint. And Neato launched a bunch of meaningless model numbers to jack up the price, just like iRobot. Mixed it up because of all that (not sure it makes it any better :)
The "Evil Maid" class of attacks have a new vector: "Evil Digital Maid/Butler" (assume pervasive, fully compromised electronic assistants).
iPhone "Evil Maid" => GPS, Mic, Camera, Digital User Impersonation [post social network messages, iMessage, etc.]
HomePod "Evil Butler" => Control HomeKit, Mic, Playback Arbitrary Recordings [freeze, this is the police, etc., impersonate a significant other]
Roomba "Evil Maid" => Lidar (mm-resolution depth-camera?!?), Virtual Mic, Push/Close Doors, Push/Move Objects [tip over a table w/ candle]
WiFi Cams "Evil Maid" => Camera, sometimes speakers, sometimes motion control
...if this is how the robot uprising begins, we're a long way from Terminators / SkyNet, but it's easy to see entire classes of vulnerabilities which are pretty obvious in retrospect.
If you haven't seen "Enemy of the State" or "Conspiracy Theory", they're great movies with a similar premise: "What if 'the system' turned against you?"
I would also recommend “The Conversation” (1974). Not because the vision of surveillance is up to date, but because it’s a much better movie and (sort of) prequel to “Enemy of the State”.
It's interesting work. It's kinda like finding a really weak, seemingly impossible-to-use buffer overflow; now someone has to weaponize it and put it into easy-to-use Metasploit so it becomes just one of thousands of things available.
Personally I'm surprised all these robots don't have microphones yet. Not being able to talk to robots makes them pretty lame.
Hi, first author of the paper here. We also consider this as part of the increasing arsenal of smart-home attacks, which can be opportunistic and long-term. Also, given that it's an offline attack, as signal processing / machine learning methods improve, perhaps the lidar signals an attacker collects could eventually become intelligible audio.
I was also surprised that they don't have microphones. I guess the developers would prefer to have that on the companion app instead.
In reality, though, I never have my lidar robot vac running when I am at home, let alone while having a conversation, since all robot vacs are loud.
I personally would still be more concerned about voice-activated devices (Alexa etc.).
I believe there is previous work that re-wires / re-purposes speakers to be used as microphones. However, my understanding is that this requires the hardware itself to be modified.
This is stupid; if I'm going to be able to sneak an entire robot vacuum cleaner into the victim's environment, I'm putting an actual microphone and even camera in there, and not messing around with LIDAR bouncing off vibrating paper cups.
Who says you get to put a microphone and a camera in there?
You work with what you have. Ideally you'd have a microphone, but maybe that robot vacuum cleaner of your target doesn't have one. And maybe you also don't have access to other devices which have one.
But, TBH, I wouldn't be surprised if today's vacuum cleaners have a microphone in them. "For voice commands", you know?
Recently I bought another TP-Link HS110 Wi-Fi plug, and while working on reading it automatically every couple of seconds with a Python script, I noticed that a response contained a field labeled "mic_type":"IOT.SMARTPLUGSWITCH". "mic_type"?
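In case it helps anyone reproduce this, the plug speaks a widely documented local protocol on TCP port 9999 with a simple XOR "autokey" obfuscation (initial key 171). Here's a rough sketch of the kind of polling script I mean, with the plug's IP as a placeholder; details may differ across firmware versions:

    # Rough sketch: query an HS110's sysinfo over its local TCP port 9999.
    # Uses the widely documented XOR "autokey" obfuscation (initial key 171).
    import json
    import socket
    import struct

    def encrypt(cmd: str) -> bytes:
        key = 171
        out = bytearray(struct.pack(">I", len(cmd)))  # 4-byte length prefix
        for ch in cmd.encode():
            key ^= ch
            out.append(key)
        return bytes(out)

    def decrypt(data: bytes) -> str:
        key = 171
        out = bytearray()
        for ch in data:
            out.append(key ^ ch)
            key = ch
        return out.decode()

    with socket.create_connection(("192.168.1.60", 9999), timeout=5) as s:
        s.sendall(encrypt('{"system":{"get_sysinfo":{}}}'))
        reply = s.recv(4096)  # a real script should read until the advertised length

    info = json.loads(decrypt(reply[4:]))  # strip the 4-byte length prefix
    print(info["system"]["get_sysinfo"].get("mic_type"))  # the field in question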
Some time ago the German router manufacturer "AVM" had to explain why their DECT smart plugs had a microphone in them.
The attack presented in the paper replaces only the software, without hardware intervention. It requires someone to MITM the robot's update service, and that's not impossible considering that some vendors still deliver software updates over HTTP.
The scenario does not involve "sneaking" in a robot vacuum. It's just another attack vector to pursue when looking to bug a target. Maybe you can't get a 0-day on their Alexas or their Nests, but you do have one for their vacuum. You remotely update the firmware on the vacuum to exfiltrate the sound that way.
Most people didn't realize that the :visited selector represented a danger until someone figured out how to get your browsing history by abusing it.