Wi-LE: Can WiFi replace Bluetooth? (2019) [pdf] (ucla.edu)
107 points by docuru on Feb 27, 2021 | 80 comments



I think there is a wireless revolution around the corner being held up by patents, closed chipsets, and closed firmwares.

Look at what was necessary in this paper: the IoT device pretends to be an access point to bypass all the layers in the stack that would filter out the packet.

In this case they found a hack to work around chipset/firmware limitations, but what other innovations haven't happened due to inaccessible wireless physical layers?


Check out SDR, you can have access to anything in the physical layer with ease these days. Like here's a fancy SDR system running as a wifi AP: https://www.rtl-sdr.com/bladerf-wiphy-open-source-wifi-acces... You'll need a proper FCC license to do much transmitting of anything on non-standard frequencies though.


I can't tell if "thr" is a typo of "the" or an abbreviation of "their".


lol typo of "the." I hate typing on mobile.


This seems like the perfect type of error which is caught by autocorrect. Do you have it enabled?


I thought you were mocking wireless packet loss or something.


Yes, or at least it has been used to do so. Microsoft's Xbox 360/Xbox One USB controller adapter is actually a standard wifi adapter with a custom firmware.

Source: https://github.com/medusalix/xow#how-it-works


That's only for the xbox one; the 360 was very nearly the only implementation of the wireless USB standard.


I've owned both adapters; the Xbox 360 adapter was significantly more reliable, and the Xbox One adapter has weird latency and handshake issues.


That's due to range and especially latency issues with Bluetooth. It's one reason competitive shooters were more popular on the 360 than the PS3 with its Bluetooth gamepad.


There's been extensive study into the latency difference between 360 controllers and ps3 controllers. There is virtually none and it's definitely not noticeable with all the other latency sources in the signal chain.

Edit: one public, easy to digest source

https://www.reddit.com/r/RocketLeague/comments/8wvf9u/contro...


That's on a PC. I'm not saying console results aren't similar but they aren't necessarily the same.

Regardless, it was the common perception at the time and thus affected the behavior of players.


You might be surprised to discover that the radio waves don't care if they're going to a desktop PC in a traditional case, or a slightly different desktop PC in a Microsoft/Sony case!

It was not "the common perception". At best it was common among your peer group?


> It's one reason competitive shooters were more popular on the 360 than the PS3 with its Bluetooth gamepad.

I played a LOT of Call of Duty on PS3. I never noticed input latency issues.

I think shooters were more popular on Xbox because Halo took over a ton of the initial market share.


Slightly off-topic:

Caring about slight latency differences but then still using joysticks for a shooter seems rather weird.


And yet when the military wants to shoot for real, it uses joysticks and not keyboards.


That implies nothing about their suitability as an interface for shooters. The military has other concerns that don’t apply here, like field reliability, and decades of established practices and equipment.


Huh? Who suggested that keyboards are better than joysticks for aiming?

For aiming in a shooter you have: keyboard < joystick < mouse or gyro.

For the military applications, the inertia of the thing you are aiming is important. In the most concrete example, think of swinging a rifle around to aim it. For a slightly more abstract think of aiming artillery.


Of course, because a mouse cannot keep up with the real world. In a sense a mouse is too good. With a joystick it's fine if your actuation is slow because the joystick already soaks up a lot of that shitty interface.


Does that mean something like an esp32 could interact with it, or does the xbox one have DRM like the PlayStation controllers do?


Is it still the case for the new Xbox?


Interesting: the newer Xbox controllers can connect over Bluetooth with Windows 10, but I found it much flakier than using their dedicated wireless adapter (which I guess works much more like the hardware in an actual Xbox).


Yes, the Xbox Series controllers are fully interoperable with the Xbox One, which they wouldn't be if they switched to some other technique.


Actually I think they switched to just Bluetooth now, or at least the new ones can connect to my computer/phone through plain Bluetooth without an adapter (the old ones did require one, though).


The third revision of the Xbox One (1708, doesn’t have the glossy plastic piece at the top where the Xbox button sits) added Bluetooth support. The consoles and the Xbox Wireless Adapter still use the “Xbox Wireless” (basically WiFi Direct), however.


I hope so. I have issues with BT about once every time I use it.


I’ve heard multiple times that apparently a lot of proprietary Bluetooth stacks are filled to the brim with bugs. I’m not sure how complex the Bluetooth spec is but I wouldn’t be surprised if it were true.

It’s also part of why I can’t stand the trend of removing the headphone jack from devices.


I've read chunks of the bluetooth spec and while I didn't wind up spending very long actually using it, the vibe I got was that of a committee composed of architecture astronauts who pumped out pages by the truckload and applications engineers who could never agree on anything. The standard is very heavy on abstract cruft and very light on constraining decisions, which is probably why 30 years later we still have pairing issues with headphones.


Oh but the next version has fixed that, uses less energy and is going to be great, as per every Bluetooth press release every single year.


BLE actually fixed that and it uses less energy. I have used it both on desktops and on embedded devices, in C and in scripted languages, and it has been a pleasure.


The original design was utterly terrible: a low-data-rate frequency hopper with long packets, which meant the RF performance was terrible and not state of the art. By design. I think they thought spec'ing a low-data-rate frequency hopper automatically meant low cost and low power.

They messed up the low power part by spec'ing a complicated baseband that had everything but the kitchen sink. It basically duplicated the USB spec but then added authentication and encryption on top, which is even more complicated because everything runs over an unreliable frequency hopping RF channel. Frequency hopping messes with a lot of things.


> I’m not sure how complex the Bluetooth spec is but I wouldn’t be surprised if it were true

Pretty complex.


AFAIK the spec is a free download, I have the "Bluetooth Core 5.0" one and it's 2822 pages long.

I also have the one for WiFi (802.11-2012) and that is 2793 pages, so I'd say they are of roughly equal complexity.


Yeah, I have tried to read that WiFi spec and I was kinda overwhelmed.


I tossed bluetooth dongles aside and got an intel AX210 and it fixed connectivity issues for me. Turns out bluetooth actually needs antennas and reliable firmware.


I just got a new router today to try VR streaming from my desktop PC to my Quest 2, and it works flawlessly. Pretty exciting: if it can do that, it can probably do anything else. The part I don't know is what the energy use tradeoffs are. I know WiFi 6 is supposed to have lower energy cost?


Might be wrong, but I'm under the impression that the video encoding has a significantly higher energy cost on the Quest 2. Virtual Desktop even allows an OC setting if I remember correctly.

It helps, of course, that the bandwidth required for VR streaming is fairly low using the Oculus link protocol.


I can't see a reason to abandon BLE. Bluetooth is generally a bit problematic, maybe due to its complexity, but “Bluetooth Low Energy” is perfect for what it is (a super slow short range wireless communication implementation).


Programming BLE is basically bit banging. Regular BT has higher level protocols (which have their problems too -- I hate BT -- but are easier to use).

There's a user experience benefit of the semantic separation between BT and Wifi; you move around and change SSID all the time with a portable device like a phone, which is for connecting to "other" things. Your BT connections "travel with you" as they are part of a personal area network.

There are weird boundary conditions (connecting to your friend's BT speaker, using wifi to cast to a chromecast or AppleTV) for which neither is perfect; frankly I think those wifi cases are more like the BT use case.

Just because the two use the same BT spectrum doesn't mean they should be unified.


The problem with BLE is they make huge Bluetooth standards but forget the things people will actually want.

Their first revision was 20-23 byte packets. The first thing anyone looks for when doing BLE is “how to send larger data”. BLE 4.2 and packet length extension helped this, but it’s still pretty rough because you aren’t promised any specific MTU size by spec; you just have to hope you can get the packet size you need on connection.

I have a BLE product in the wild, but the amount of work I did to get around packet limitations was probably not worth it all said and done.

BLE is good for event based data transfer of flags and small data, but you outgrow that pretty damn quickly.
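To make the packet-limitation point concrete, here's a toy sketch of the chunking everyone ends up writing on top of BLE. `chunk_for_att` is a hypothetical helper; the only spec-derived number is the 3-byte ATT header, which turns the default MTU of 23 into a 20-byte usable payload:

```python
def chunk_for_att(payload: bytes, att_mtu: int = 23) -> list[bytes]:
    """Split a payload into ATT-notification-sized chunks.

    The usable size per notification is att_mtu minus the 3-byte
    ATT header (opcode + handle). With the BLE 4.0/4.1 default
    MTU of 23, that leaves just 20 bytes per packet.
    """
    max_data = att_mtu - 3
    return [payload[i:i + max_data] for i in range(0, len(payload), max_data)]

# A 100-byte message needs 5 packets at the default MTU,
# but only 1 if the peer happened to negotiate an MTU of 185.
msg = bytes(100)
print(len(chunk_for_att(msg)))        # 5
print(len(chunk_for_att(msg, 185)))   # 1
```

Reassembly, sequence numbering, and retry on the receiving side are where the real work (and the "probably not worth it" feeling) comes from.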


Any advice on how to start messing around with BLE? Is there a good adapter to buy for packet sniffing and whatnot?

I have a new interest in BLE on account of myself recently becoming bluetooth enabled.


Bluetooth, even with proprietary codecs like aptX, is lossily compressing already compressed audio, which is a shame. I really hope WiFi (or a future Bluetooth version) will alleviate this big issue.


If only there were some good flows for attaching sound inputs/outputs and input devices over wifi.

There are some technical means; one could use usb-ip[1] to bridge peripherals, for example. But few rise to the status of "flows", good well-paved paths.

[1] http://usbip.sourceforge.net/


Netcat and sox work splendidly:

    rec -p | nc -l 7777

    nc 192.168.1.11 7777 | play -v 0.1 -


That can work, but probably not very well. Without real-time resampling some buffer in the path will underrun or overrun depending on whether the recording soundcard is sampling slower or faster than the playback soundcard. The clocks are not synchronized.
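The drift argument above can be put in numbers with a small back-of-the-envelope sketch (the 100 ppm clock mismatch is an assumed, typical figure for consumer sound cards, not a measurement):

```python
def seconds_until_underrun(buffer_samples: int,
                           producer_hz: float,
                           consumer_hz: float) -> float:
    """How long a playback buffer lasts when the consumer's sample
    clock runs faster than the producer's and nothing resamples."""
    drift = consumer_hz - producer_hz   # net samples drained per second
    if drift <= 0:
        return float("inf")             # buffer grows instead (overrun)
    return buffer_samples / drift

# 100 ms of buffered audio at 48 kHz, with a 100 ppm clock mismatch:
buf = int(0.1 * 48000)                  # 4800 samples
print(seconds_until_underrun(buf, 48000.0, 48000.0 * 1.0001))  # ~1000 s
```

So the pipe can run for minutes before glitching, which matches the "works for me" reports below; it just can't run forever without adaptive resampling.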


It works! I have a microphone outside soundproof windows and it is very synchronized, within 200 milliseconds. Good enough for movies, methinks.


Correction: it is actually 3 seconds via WiFi. The previous test was a direct loopback on the same machine. Good for music and other non-synchronized activities anyway.


Correction #2: if you manage to start both the server and client at the same moment, it is indeed 200 ms.


Is usbip finally usable?


In my experience, no. I would really love for it to be a thing, as it's perfect for sharing devices on your computer with containers or VMs (i.e. program and debug your ARM MCU with a complex embedded toolchain and programmer running in a simple Docker container). But in my experience USB over IP is still at a "here's a bunch of kernel modules and not a lot of documentation; good luck on Windows" stage.


Sad, that was my experience too a few years back.


What I’d love to see is my Mac be able to connect my Zoom or Teams meetings to my AirPods via my iPhone (on wifi) so that I can freely run to the kitchen without losing Bluetooth connection.


Tangential, but I am amazed by the Bluetooth range I get out of my AirPods Pro and iPhone 12. What the hell are they doing? My laptop doesn’t even get to 1/4 of the range. I’ve walked into a vault (like a bank vault) with my phone outside of it, and the music never stopped. I’ve walked a good 20ft and everything was still smooth.


What other quality Bluetooth devices do you have or have you used? While Airpods and iPhones work excellently (I have both the gen2 Airpods and iPhone 12 mini), my Sony XM4s also have great range with my Windows desktop.


I’m comparing the same AirPods between iPhone and my Thinkpad and I can’t get the same range. Unfortunately the only other Bluetooth device I have is my aftermarket ECU in my Miata and there is no iPhone app to connect to that one.


Your music had probably been buffered.


And I just said a silly thing. It's Bluetooth, not Wifi we are talking about.


I just always use my phone for Zoom instead of my MacBook. About the only feature I like of MS Teams is the ability to switch devices easily. It only works for scheduled meetings, not calls, though.


The Teams iOS app does audio hand-off really well in my experience. Have you tried that?


I haven't - but I want to use my desktop for the camera and viewing shared screen as much as possible, I just don't like missing something if I have to step away.

As it is my Airpods ALMOST reach to the kitchen, just as I open the fridge they disconnect.

And they don't reconnect until I'm very close (though I suspect this is because they decide to connect to my phone).

Now that I think about it I should just join the meetings twice ...


Teams does this. Join call from phone. Then join from computer. Computer will ask if transfer or join w/o audio. Choose join w/o audio.

Best way to be on calls. Need to walk around, no problem. But can still see full screen when desired. Also works if your home internet is spotty and you transfer to your phone's internet service.


Get a mini fridge for your office


A recent development not mentioned in this article is Wi-Fi HaLow/802.11ah, a new standard using the 900MHz ISM band. Its advantages are better signal penetration/range for the same output power (or lower output power for the same range).

Compared to 2.4/5GHz wifi, HaLow uses narrower bandwidths and so has lower data rates, but they are still much higher than BLE's. BLE can do up to ~2MBit with a 2MHz BW; 802.11ah can do up to ~8MBit with a 2MHz BW and can go up to a 16MHz BW. Additionally, 802.11ah supports MIMO, so it can also scale data rates up with more antennas.

There are some other fun new features in 802.11ah, namely TWT (target wake time), a power-saving method that solves a lot of the problems the article mentioned with power saving in wifi.
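A rough sketch of why the higher PHY rate plus TWT matters for power, using the approximate rates above (protocol overhead and wake-up costs are ignored, and the 1 kB payload and one-minute wake interval are made-up examples):

```python
def radio_on_time(bits: int, phy_rate_bps: float) -> float:
    """Airtime needed to ship a payload at a given PHY rate."""
    return bits / phy_rate_bps

def avg_duty_cycle(tx_time_s: float, wake_interval_s: float) -> float:
    """Fraction of time the radio is awake when TWT lets it sleep
    between scheduled transmissions."""
    return tx_time_s / wake_interval_s

payload_bits = 1000 * 8   # a 1 kB sensor report
ble = radio_on_time(payload_bits, 2e6)      # ~2 Mbit/s BLE PHY
halow = radio_on_time(payload_bits, 8e6)    # ~8 Mbit/s 802.11ah in 2 MHz
print(ble, halow)                           # 0.004 s vs 0.001 s of airtime
# Waking once a minute under TWT, the radio is on only ~0.0017% of
# the time at the 802.11ah rate, a quarter of the BLE figure.
print(avg_duty_cycle(halow, 60.0))
```

The point is just that a 4x faster PHY means 4x less radio-on time for the same data, and TWT lets the chip actually sleep in between, which is exactly the knob the article says plain wifi power saving lacked.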


I can’t wait to get rid of Bluetooth for its issues with range and badly implemented devices and software where pairing doesn’t “stick” for long. The only good thing is the energy consumption with BLE. Like USB 3.whatever and USB-C, Bluetooth is generally a pile of garbage standards in a pile of garbage standards with poor implementations. Some Apple devices seem to be doing better with Bluetooth. Elsewhere it’s a bag of hurt. But that’s purely anecdotal from my end.


I hope not. For one thing, Bluetooth consumes less battery.

Also, when the topic of Bluetooth comes up, many people complain that they have Bluetooth issues. Sadly, some manufacturers do not care about interoperability. Maybe I'm just lucky but I don't even recall when I would've had issues with Bluetooth. I use several devices practically every day (car, headphones, speakers).


As the paper explains, in terms of actual energy consumption the modulation techniques used by WiFi are about 3 times more efficient than Bluetooth's.

The whole premise of the paper is that this allows for efficient use of WiFi hardware to emulate BLE-like behaviour.


At least read the abstract before commenting, man...


I did, man.

Table 1.

  Wi-Le Energy/packet: 84 uJ
  Wi-Le Idle current: 2.5 uA

  BLE Energy/packet: 71 uJ
  BLE Idle current: 1.1 uA
You can see this from Figure 4 as well.

So, like I wrote, Bluetooth consumes less battery.
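For a sense of scale, here's the arithmetic behind that conclusion for a hypothetical device beaconing once per second; the per-packet energy and idle current come from Table 1, while the 3 V coin-cell supply and the one-second interval are assumptions:

```python
def avg_current_uA(energy_per_packet_uJ: float,
                   idle_current_uA: float,
                   interval_s: float,
                   supply_v: float = 3.0) -> float:
    """Average draw for a device that sends one packet every interval_s
    and idles in between. I_avg = I_idle + E_packet / (V * interval)."""
    return idle_current_uA + energy_per_packet_uJ / (supply_v * interval_s)

# Table 1 numbers, one packet per second:
wi_le = avg_current_uA(84.0, 2.5, 1.0)
ble   = avg_current_uA(71.0, 1.1, 1.0)
print(round(wi_le, 1), round(ble, 1))   # 30.5 vs 24.8 uA
```

So at a 1 Hz beacon rate the gap is real but modest (roughly 20-25%), and it shrinks as packets get rarer only if the idle-current gap shrinks too, which is exactly where the ASIC argument quoted below comes in.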

In addition there are unsolved issues, which are likely to add to the overall consumption.

Edit: formatting


> We believe that an application-specific integrated circuit (ASIC) implementation will have much lower power consumption.

That line is from right above the table, so it could possibly go the other direction and use less power.


Where I live the Wi-Fi spectrum is already busted. There are several dozen visible access points right now, and if I go outside where no metal shielding is happening I'd probably see 100 or more of them.

And that's just the well-behaving honest Wi-Fi access points, not all the other crap on these frequencies that periodically destroys signal with interference.


Bluetooth uses exactly the same spectrum as one of the WiFi frequencies (2.4GHz).


Right, though Bluetooth uses narrower frequency bands, so you have about 10 times as many channels in the same amount of spectrum.

And Bluetooth hops around these channels, dynamically selecting the ones with the least interference.
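A toy sketch of the channel-map side of that hopping: the only spec-derived number is the 37 BLE data channels, while the interference scores, the threshold, and the `adaptive_channel_map` helper are all made up for illustration:

```python
import random

def adaptive_channel_map(interference: dict[int, float],
                         threshold: float,
                         n_channels: int = 37) -> list[int]:
    """Classify the BLE data channels (0..36) as usable or not based
    on a measured interference score, mimicking what adaptive
    frequency hopping's channel map does."""
    return [ch for ch in range(n_channels)
            if interference.get(ch, 0.0) < threshold]

# Pretend channels 10-20 overlap a busy WiFi network:
scores = {ch: 0.9 for ch in range(10, 21)}
good = adaptive_channel_map(scores, threshold=0.5)
print(len(good))                  # 26 usable channels remain
next_hop = random.choice(good)    # hop only among the quiet ones
```

Real controllers derive the hop sequence pseudo-randomly from the connection parameters rather than choosing uniformly, but the idea is the same: interference-heavy channels get masked out of the rotation.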


And when I have wifi turned on on my laptop, I get a .5 second gap in bluetooth audio approximately once every thirty seconds.


Turning on Wifi makes bluetooth audio unbearable on my Lenovo Chromebook.


Yeah, that's probably a contributing factor to the problem.


Wi-Le is interesting. Does it mean you don’t have to connect to a wifi hotspot to do configuration?

The product I made uses BLE to connect between device and mobile for things such as connecting the device to a wifi router. Otherwise the device connects to wifi normally.


I thought Wifi Direct already did this?

https://en.wikipedia.org/wiki/Wi-Fi_Direct


So only one-way (upstream) communication via "beacons"? What about downstream? That's where the complexity is.


And BT uses adaptive frequency hopping which is huge for avoiding interference.

Ultimately, your BT experience will vary depending on whether your device vendor has implemented advanced features, and done interoperability tests.

Lastly, BT could relatively easily move to the shiny new 6 GHz spectrum and make use of Very Low Power (VLP) assuming there is some spectrum harmonization done all over the world. https://docs.fcc.gov/public/attachments/DOC-363490A1.pdf


Beyond battery efficiency, another advantage of WiFi over Bluetooth is security.



