Ask HN: Why is Bluetooth so unreliable?
351 points by whitepoplar on July 12, 2017 | 261 comments



This isn't the first time I've talked about this. I've had some experience with Bluetooth on Linux, and as a radio guy. The answer is that there are problems from Layer 1 to Layer 7, needless complexity, and design by committee.

Bluetooth is an EXTREMELY complex radio protocol on Layer 1. It's like a mating dance between scorpions in the middle of a freeway. High chance something gets messed up.

Layer 1 keeps drastically changing too. Bluetooth 1 and 2 use completely different modulations and are not backwards compatible. Bluetooth 3 was simply an extension to 2: "Let's agree over Bluetooth 2.0 to use WiFi instead." Bluetooth 4, while much simpler, uses an entirely different scheme.

Instead of a "general purpose" wireless network like WiFi, Bluetooth tried to be application specific. Except the only profiles everyone wants are mice, wireless audio, and fitness trackers. If you look at the application layer spec, it reeks of design by committee. Everyone haphazardly jammed their pet projects together, and there are redundant and vestigial parts everywhere.

BlueZ, the Linux Bluetooth stack, is abysmal. Honestly, I don't even know how anyone does anything with Bluetooth on Linux besides a mouse and keyboard. And barely even that.

As much as I hate on the protocol, the Layer 1 spec is truly ahead of its time, in some areas. Watching two radios frequency hop, and negotiate to avoid a congested wifi channel, was unreal.


> This isn't the first time I've talked about this. I've had some experience with Bluetooth on Linux, and as a radio guy. The answer is that there are problems from Layer 1 to Layer 7, needless complexity, and design by committee.

I'll add on to this a bit.

The Bluetooth stacks out there are, in general, tested only for a few of the many profiles BT supports. The firmware running on the BT chips is always of questionable quality. It's a black box that, in the case of one major chip vendor, is not kept under source control(!). We figured this out empirically: we'd get firmware drops that fixed our latest reported issue, but bugs from previous FW drops would be present again. (Maybe they did use source control, and just had the world's worst regression test suite, i.e. none?)

That is the device side. On the PC/phone side, things are exactly the same. EVERY phone has problems with its Bluetooth stack. The Android Bluetooth stack introduces new bugs between major revisions. So does the iPhone BT stack. The iPhone BT stack had a problem for what I believe was 3 major OS revisions, where it would just drop connections to BTLE devices, and users had to turn BT on and off on the phone to reconnect. It was reported to Apple by multiple vendors, it took multiple years for a fix to come out, and it made BTLE on iPhone a horrible experience. But similarly severe bugs exist everywhere.

HID is the one profile that will work. Serial also probably works, more or less, but with serial you have to build a lot of other infrastructure on top of it. BT pairing on phones works nowadays, thanks in part to the popularity of wearable devices, but it barely functioned 5 years ago.

Other specs, including simple ones like media control (AVRCP, which is an awesome spec by the way; give it a read), will have differences in implementation between OSs (desktop and mobile); typically some basic features are not implemented, for who knows what reason.

All that said, BTLE is incredibly simple and easy to understand. :)


The biggest problem we have on the Android side of things is the steaming pile that is Broadcom's bluedroid. The codebase does its own allocation tracking, threading, and scheduling, and has state machines with call ins, outs, and backs (none of them used the same way). In my time on Glass, bluedroid was one of the biggest headaches we had with Bluetooth, not to mention the Bluetooth chip firmware. We tried to clean up as much as we could (and did a fair amount) but the stack truly needs a rewrite from scratch. Some things you just can't incrementally fix -- especially when new features are continually dropped into the codebase from afar, without regard to the damage they do to the codebase.


Some things you just can't incrementally fix -- especially when new features are continually dropped in from afar,

It is the new features here that are the problem. I think you actually can incrementally fix anything, but such a project can be as expensive as a rewrite -- and neither will succeed if new stuff is constantly being thrown in.


Some very insightful discussion! I work with BLE at the software level on iOS, and ever since iOS 8 it has been pretty stable; iOS 6 was terrible. However, it does seem to be quite flaky at times (interference, cheap BT chips, etc).

I've been doing iOS development for wearables for a few years and have found that you can generally make BLE for an iOS app very reliable. But it requires some work in the software to cover up some of the underlying flakiness. Reminds me a lot of building resilient networking code for apps back when cell service was slow and spotty. But if you do a little work to build in command queues, auto-reconnect, etc it can be very solid for the user.

Oh, and it helps a lot to pick a good chip. From my experience, the Nordic nRF5x series is the best. But as has been mentioned elsewhere in this post, you still need to have a quality antenna design. I've noticed a huge difference between using a reference design and one that spends some time in a chamber with a good RF engineer.


> BT pairing on phones works nowadays, thanks in part to the popularity of wearable devices, but it barely functioned 5 years ago

Huh? Pairing worked like 10 years ago. And sending files between phones. Everyone in school was sending J2ME games/apps around :D Audio with media control also worked. And modem (using the phone's GPRS/EDGE connection on a PC).


I don't know, just last week I was having issues pairing my iPhone 7 to my friend's new Jeep and ended up giving up. I usually just keep bluetooth turned off on my phone and don't buy bluetooth devices since I find it's often more frustrating than not.


Same experiences. I don't even bother trying bluetooth anything anymore. It's more frustration than it's worth. It's disabled on all my devices.


>just last week I was having issues pairing my iPhone 7 to my friend's new Jeep and ended up giving up

for audio, right?


TL;DR: Remove old paired devices! I just went through this this week with a 2013 Wrangler. Here's what fixed it for me: deleting the other bluetooth devices from the radio. It had my old iPhone and my brother's iPhone. Neither was anywhere nearby, but when I tried to pair with my new iPhone, it wouldn't work. I tried multiple times over multiple days. Finally, as a test I just removed the other 2 phones from the profile. It paired perfectly after that and has worked since.


Any suggestions for pairing a OnePlus 3 with a (2016) BMW for audio? Mine works like a champ for speakerphone, but the OnePlus refuses to play any kind of media. (Though my son's Samsung Galaxy S5 works.)


My experience was that Bluetooth was solid at the end of the dumbphone era (although the UI was often crap). Like you we were bluetoothing files back and forth, I was syncing phone contacts with my Mac over iSync Bluetooth, sending SMS/getting call notifications on my Mac (can't remember the name of that app), etc.

But when iOS/Android smartphones came on the scene it took a huge step back in reliability. A2DP also took a huge step back in audio quality until the new platforms tweaked their SBC parameters and added in MP3/AAC support.


Indeed.

The craziest setup i ran with back then was a Nokia N800 paired up with a Sony Ericsson C702 and a no-name folding keyboard. The C702 was also paired up with a pair of Jabra headphones/headset.

At certain times all of those could be in use: the C702 acting as the modem/router for the N800, while also playing music through the Jabras, and me typing comments on the Maemo forum etc.

The only time there was a hitch was when i used the modem profile between the N800 and the C702, rather than PAN. Something about the intensity of chatter between the two made the music and keyboard stutter.

Do note though that headphones can act up if you are out and about with little for the signal to bounce off. Best then to have all devices on the same side of the body for less obstructions.


I used to use Bluetooth PAN with my M600i (UIQ3): no wifi? No problem! I connected to my dodgy windows 2000 laptop with some BT software and drivers (and a dongle), and shared its connection to my phone :)


I'm glad your experience was very different from mine.


All due respect, it does work much better today than it did 5 years ago. Many things now "just work" and it used to be more fiddly.


> that in the case of one major chip vendor, is not coded using source control(!),

In this day and age it's absolutely insane to not have revision control.


I was using SVN for my personal projects 10+ years ago. It's crazy to think that some companies may not be using source control even today...


I used to work with SVN in a group of 3 developers and several branches (test, staging, master) and it was pretty messy.

When I read the original comment about bug fixes disappearing, it reminded me of merging test onto stable by hand, not knowing what was "the good code" and having to ask, research, etc. It was messy and sometimes we got regressions.

We had a term for this: a bug fix got "merged away".

Nowadays, with Git, feature branches, better test coverage, and continuous integration, things have gotten much better.


You guys must not work in electronics.


I worked in electronics 30 years ago and we had revision control.


They're definitely out there.

Even in places that think they do, it seems pretty common to have an additional codebase living in the database (stored procedures and the like) without any real version control.

I'd like to know how problems like this correlate to data breaches.


I worked for a client years ago where Dropbox was the source control. It wasn't as bad as none, but you lost changes at the 30 day mark.


Were the (technical) employees happy about it?

I.e. was it just a crazy requirement handed down, or did they not know better?

I just can't imagine it even for solo development, nevermind collaborative work! "Agh. We must have had a merge conflict, damn, the core part of my work is gone... Let me see if I have a recent copy... Oop, no, I don't, because I've run out of disk space from all these clone backups."


FWIW, I use git inside Dropbox for my personal solo projects. I don't want to have to use commits and git's remote syncing mechanisms to share in-progress code between my desktop at home and my laptop. I want to be able to pick up exactly where I was on the other machine, including my git staging area and local unstaged changes. It works perfectly for this.

I do occasionally get Dropbox-level merge conflicts for tmp editor/IDE files, but I can't recall getting any for source files.


So, neat trick: "git clone" your local Dropbox copy somewhere, and you get the benefits of "commit" and "push", but it's local; and then your remote is synced via Dropbox :)
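For anyone who wants to try this, here's roughly what the setup looks like. Paths are just examples; I'm keeping the Dropbox-side copy as a bare repo, since git will by default refuse a push into the checked-out branch of a plain working copy:

```shell
# One-time setup: a bare repo inside the Dropbox folder acts as the "remote".
mkdir -p ~/Dropbox/repos
git init --bare ~/Dropbox/repos/myproject.git

# Clone it somewhere outside Dropbox and work there as usual.
git clone ~/Dropbox/repos/myproject.git ~/src/myproject
cd ~/src/myproject

# Day to day: commit and push as normal. The "push" is entirely local;
# Dropbox then syncs the pushed objects to your other machines.
git add .
git commit -m "work in progress"
git push origin HEAD
```

The caveat (as mentioned elsewhere in this thread) is that two machines syncing into the same repo at the same time can garble it, so this is best kept to solo projects.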


> neat trick: ... and you get the benefits of "commit" and "push"

GP seems to be deliberately avoiding commands relating to remote management, such as push.


I use a similar setup. I use SpiderOak Hive since it's 'zero knowledge'. I don't, however, sync any binary or IDE directories. This allows me to have the source files and git repos synced, but still keep somewhat independent setups.


just so you know, git on dropbox is not a great idea in a team. solo it should be fine but you are probably better off using bitbucket for free.


All projects were single developer. It was a burn and churn environment. Contractors from all over the world: a folder was shared from the root, they worked in it, and when the work was completed, the folder was unshared and they were paid.

It worked OK for that purpose. And I still have dropbox mainly for that reason. My source code is always local. Everything goes into TFS, but the local is backed up to dropbox.


Is there any reason to hope things might be finally resolved in the somewhat near future?


I'm not sure, I'm no longer working on products that use Bluetooth. I know that BT pairing has gotten easier and easier over time (improvements to the spec), but the really important parts are the BT stacks and BT chip firmware. From what I understand working with BT engineers, the best BT stack was from Stonestreet One, who were recently bought up by Qualcomm. If someone had bought them up and open sourced their code, the world of BT would be a much better place right now.


Based on 16 years of the technology never failing to disappoint, none whatsoever.


Well, after 20 years, USB (another specification which aims at making a hundred kinds of different coffees, like about every specification designed in the last 20 years) kind of works nowadays, so there might be a bit of hope. Hmmm wait, except twice a week when I plug USB devices in the 'wrong' order on my computer. Oh well...


On the bleeding edge, the future with reversible connectors is here, with USB-C style connectors (on both ends!) for everything.

'course, now there's a hundred dollars in various adaptors to rebuy...


Serial has its issues too. We are using a serial port over BTLE and when you need to do anything with debuffering/buffering and online data... you no longer have a UI because the transmission is so slow.


> As much as I hate on the protocol, the Layer 1 spec is truly ahead of its time, in some areas. Watching two radios frequency hop, and negotiate to avoid a congested wifi channel, was unreal.

What are you using to watch Bluetooth at a low level?

My Magic Mouse at work would get very jerky with my Mac Pro. Resetting it would sometimes clear it up for a while. I swapped it with my Magic Trackpad from home, and the mouse is fine at home, and the trackpad now has the jerkiness although not as much as the mouse did.

I tried moving the Mac up onto the top of the desk so the mouse/trackpad is only a few feet away and has line of sight, instead of having a metal desk between it and the computer, but that made no difference.

I'm speculating that it is some kind of interference from the office downstairs. (It's an office of Azima DLI, which develops all kinds of stuff that might be noisy in RF).

I'd like to be able to take a look at what is actually going on in my office (1) with RF in general, and (2) specifically between my Mac and my trackpad or mouse.

I thought this might be an excuse to step up from one of those $15 SDR dongles I've played around with to a HackRF, but from what I've read the HackRF cannot fully deal with the Bluetooth frequency hopping. I think I read that using multiple HackRFs one can do it, but that's beyond what I want to spend.


The best tool for low-level Bluetooth/BLE sniffing and injection that isn't obscenely expensive is probably the Ubertooth One.[0]

[0] https://greatscottgadgets.com/ubertoothone/


USB 3.0 devices can interfere with Bluetooth. Do you have any external hard drives in plastic enclosures? Non-shielded cables? USB 2.0 doesn't present this kind of problem.


> Watching two radios frequency hop, and negotiate to avoid a congested wifi channel, was unreal.

Exactly. As I said in another comment, we did a test in a very busy office space and didn't lose a single bit of data at several 100 kbit/s, transmitting continuously for a week. We could see it hopping on the sniffer.


True, as long as both transceivers are 100% to spec. The problem is, it's a VERY large and complicated spec, and it's easy to make one little slip and have to completely renegotiate the connection.


You're a radio guy, I have two questions:

- Are there similar close range wireless protocols that are stupid simple (say, like IR communications, but on the radio spectrum)?

- Could someone or some team make something stupid simple that could be handled by today's microcontrollers? (The ESP32 is €7 for a dual-core 32-bit processor; I suppose it could be used as a dedicated comm processor.)

In a way, wifi is perfect for ~0-50m; we just need a similar idea for 0-10m and low bandwidth.


You want Nordic nRF24L01+ radios. They can be trivially handled by an 8-bit micro using SPI, and modules with these cost peanuts. However, if you want PC/phone connectivity you would have to build your own dongle - or reprogram one from a wireless mouse/keyboard. Many use these chips.

Another option is ANT+ - a very low power but also very low bandwidth (8-byte messages, 4x a second...) protocol used mostly by fitness gear like training bikes, treadmills and such. Many smartphones have radios that are able to use it. Again, something that is easy to make work even with an 8-bit micro.

For more advanced stuff there is Zigbee, but that is quite a bit more complex (though still manageable even by an 8-bit MCU), and there are some licensing issues which kept it rather obscure.


Another radio question: what about having long distance radio communication between 2 or more devices? Any projects around it?

Local radio text chat could also be fun... Think of an IRC channel, only for the people who can grab the signal...


It sounds like you'd be interested to get into ham radio. There are plenty of digital modes, and you can communicate all the way around the Earth in the right atmospheric conditions using a big enough antenna, or even bounce signals off the Moon.


There are videos of ham guys scanning randomly through the spectrum to find people. I remember one video: 80% of it was static, and then a blip.. the guy (somewhere in the US) started to poke around that freq. Another blip.. then a voice. It was a Russian dude answering; it was surreal.


APRS is something like that, popular with hams.


for just 'text' maybe something like LoRaWAN[1]

[1] https://en.wikipedia.org/wiki/LPWAN#LoRa


Oh yeah that nordic chip sounds familiar.

ANT+ could be enough for a control path or maybe keyboard.

I always wanted to try zigbee, but it felt too much like a closed world, although people seemed very satisfied with it.


Isn't zwave an option for more complex networks?


> As much as I hate on the protocol, the Layer 1 spec is truly ahead of its time, in some areas. Watching two radios frequency hop, and negotiate to avoid a congested wifi channel, was unreal.

Is it really ahead of its time?[1][2] I recall reading somewhere that Bluetooth traces much of its roots to military radio applications.

[1] https://en.m.wikipedia.org/wiki/HAVE_QUICK

[2] https://en.m.wikipedia.org/wiki/SINCGARS


Funny thing is that the profiles are what i actually like about Bluetooth. Particularly the ones imported from OBEX.

While wifi is perhaps more interesting, at least with OBEX push and FTP i know (and i have yet to have it fail, knock on wood) that i can transfer files, albeit slowly, between two devices.

Get two devices on the same wifi network and i still need to figure out a protocol both of them can talk so that they can exchange data (never mind that they had to introduce wifi direct because certain devices refused to connect to ad-hoc wifi hotspots).


FHSS is an amazing technology, from the standpoints of both efficiency of spectrum usage and being a clever loophole for regulators.


Everyone says this, yet I use bluetooth products (all Apple hardware) literally every day. Apple has really polished bluetooth to the point where I don't even remember I'm using it, and never have to deal with pair requests, signal loss, etc.

In most everything I own, bluetooth is frustrating crap. Apple's somehow gotten it (mostly) right, so why can't anyone else?


To provide an alternative data point, everything works flawlessly with my Bluetooth wireless speaker except my MacBook. It constantly skips.


This and the random disconnections of my BT trackpad and keyboard are really annoying.

I believe more people here will have encountered this behaviour, where all of your paired devices suddenly disconnect from the MacBook. What has helped me is to turn my Wi-Fi radio off, wait for the devices to reconnect, and then turn it on again. This has happened to me for at least the past 3 or 4 major versions of OS X/macOS.


That's interesting, I've never had any issues at all with my Bluetooth. (MBP late 2014, mid-2012 prior to that one). I use the keyboard, trackpad and mouse and never have used anything else.


This. I have the same experience with my BT Bose speaker, forever having issues. Sometimes it'll need to be re-joined; other times it works, but then starts skipping. Granted, I've had a pretty good experience with Apple-to-Apple BT, like an Apple BT mouse or trackpad, whereas the Logitech BT mouse is jumpy/sluggish.


I've been having a shitty experience with 2.4GHz WiFi combined with Bluetooth on my late 2013 MacBook Pro Retina for ages. The WiFi performance goes to shit with multiple devices connected, and the Bluetooth devices stutter and disconnect. It was completely unusable in Yosemite but slightly better in some version after that. It's currently "usable" unless I do heavy downloads over 2.4GHz with a bunch of Bluetooth devices connected. With 5GHz WiFi everything's fine and dandy.

It's not an issue when running Windows on the machine, and it's not due to congestion of 2.4GHz wifi networks. And it's not AP related. It's macOS related. I am also not alone in experiencing it. Not the world's biggest issue, but frustrating as hell.


Personally I've always had skipping problems with -ANY- Bluetooth speaker, be it with my Macbook, iPhone, Android handsets, etc. And I've got everything from a bluetooth enabled car stereo to a Bose Soundlink II, just doesn't seem to ever get better.


I can't understand why AirDrop is as unreliable as it is. It's abysmal and only Apple is to blame.


Yep. I took to emailing pictures to myself to transfer them because it was so flakey. The tricky part was that because it was supposed to Just Work there's not much configuration or anything you can twiddle to refresh or anything, so when it Just Doesn't you're completely out of luck.


Now that is 100% true. Is AirDrop bluetooth or Wifi though?


Seems there is a bit of both, depending on the generation of hardware and software involved.


That's probably why it's on the flaky side.


I can't even get the Apple Wireless keyboard working on my Mac :( Keeps disconnecting every odd minute


Close to a hotspot or a noisy microwave nearby?


It's pretty much transparent to me using Android and my Windows PC too. I didn't realize it was supposed to be "unreliable."


Oh I've definitely found it unreliable elsewhere, especially in the cheaper Android handsets I used to get.


I guess it's a matter of getting what you pay for. It works quite well with my Nexus 6P.


You can't have a bluetooth serial adapter on the iPhone, because it only supports a few bluetooth profiles. This cuts out a lot of complexity.


But bluetooth keeps getting rammed down our throats with bluetooth hardware being installed on motherboards (sometimes not even as removable modules) and bluetooth software in the O.S. turned on by default (lolsecurity), sometimes even at the BIOS/EFI layer.

Add in all the complexity of different bluetooth versions and you have something that "seemingly everyone" has but few people reliably use.


I've spent the better part of 3 years building products on the 2.4GHz spectrum (WiFi and BLE).

Most of the issues in this thread are related to poor hardware design more than a crowded spectrum. While the spectrum is in fact crowded in metropolitan areas, most Bluetooth communication doesn't require much bandwidth and can handle error prone areas with ease.

While the frequency hopping helps a ton on BT (and WiFi for that matter), the issues people outlined are due to:

1) Shitty firmware
2) Shitty hardware

Antenna design is black magic and only a few firms in the US do it well. It took us almost 10 months to fully design and test our antenna assembly(s) with a very capable third party firm.

It took dozens of trips to a test chamber, a dozen computer simulations that take a day to run, and PCB samples that take days to verify. They have to be tuned every time copper or mechanical parts move as well.

It's a real pain and most Bluetooth products use garbage chip antennas and baluns or reference designs for antennas. This increases the sensitivity to failure and provides a generally shitty experience.

Most of your product interactions around bluetooth are budget products connected on one side of the equation (e.g. a $50 bluetooth headset). So despite how capable your Mac or iPhone is, if you have a garbage headset on the other side with poor antenna design, it'll be a disaster of an experience.


> So despite how capable your Mac or iPhone is

this made me literally lol. my macbook pro is the shittiest bluetooth device i have ever used. just search google for macbook pro bluetooth audio issues, and you will find a slew of forum posts going back years. there's one terminal hack after another suggested just to try and get macbook pros to properly play audio over bluetooth. i don't even bother using my macbook pro these days to play audio through a bluetooth speaker. i just use my iPad or android phone.


> just search google for macbook pro bluetooth audio issues, and you will find a slew of forum posts going back years

Apple has also shipped hundreds of millions of devices over that same time period, so the number of people with failed hardware is non-trivial. More importantly, it is extremely rare for people to perform root cause analysis on the actual failures which combined with the number of vendors and protocol complexity in the Bluetooth world means that a significant number of times the blame is placed on the wrong component.

The main thing I take from this is that vendors need to think about how they can provide a visible way to learn about peer device errors so e.g. your Mac could tell you that sound quality is horrible because the Bluetooth device doesn't support a high-quality codec or sufficient bitrate, is seeing high levels of retransmits or dropped packets, etc.


I think you're assuming that there's an issue with the macbook's implementation instead of the device you're pairing it with.

Like I had mentioned previously, if one side of the paired devices has a garbage antenna setup, it will have poor performance. Things like your soundbar, or a generic bluetooth speaker probably use reference designs for their BT setups and have not tuned them for the enclosure or use case.


not an assumption. given that everything else works together, it's the macbook pro that's the odd man out. my other devices include multiple bluetooth speakers, mice, phones (android and even windows phone), surface pro, iPad, and others. they all work seamlessly with bluetooth, but the macbook pro struggles with most, especially when trying to play audio over bluetooth. like i mentioned, this is a well-documented problem if you do even a basic search.


I haven't had any issues with Bluetooth on a Mac so far, ever since the BT 1.2 days. I've used Mac OS X, macOS and Linux on PowerPC and Intel Macs with both built-in Bluetooth hardware and USB bluetooth adapters, and all of them were fine. Doesn't seem to be an issue in either the Apple-implemented bluetooth hardware or software to me...


My MacBook Pro Retina 2012 had really flakey Bluetooth. Speakers, Apple keyboards/trackpads, you name it, they'd randomly drop out and need to be power cycled. My 2017 seems a lot more reliable though (knock on wood).


Apple to this day doesn't support aptX. It's plain embarrassing.


Why? It's not like they say they do, but then don't. I've seen far more implementations where the product is listed as supporting a whole bunch of profiles and technologies, but then either doesn't implement it in a useful way or simply never works no matter the device. Most often this is with devices that originally used audio, but then added support for calling, address book, media controls etc. but implement them poorly or downright broken in a fashion that makes the base functionality even less usable than it was before.


Not supported on iOS, true, but it is definitely supported on macOS.


I'm not sure how capable my Mac or iPhone is: I paired them, and sometimes when I want to tether internet from the iPhone, it's faster to use the WiFi hotspot than to get them connected again.


you can just select your phone from your laptop these days and it auto-pairs and turns on tethering. pretty slick.


> 1) Shitty firmware 2) Shitty hardware

What quality is the hardware in something like this:

https://tehnoetic.com/tet-bt4

?


> 2) Shitty hardware
> Antenna design is black magic and only a few firms in the US do it well. It took us almost 10 months to fully design and test our antenna assembly(s) with a very capable third party firm.

I think that you are remiss if you don't actually name the companies/products/devices which are designed well, then...

There was a company in Aptos/Santa Cruz who was designing the antennas for a lot of Lockheed stuff back in about 2006/2007 or so, who we went to for antenna designs for our active RFID devices - but I can't recall for the life of me the name of the company now...


As a rule of thumb, the notable product manufacturers like Apple, Samsung, Nokia, Microsoft, Amazon, etc, have extremely capable RF design teams.

Your run of the mill bluetooth headset, mouse, or keyboard likely does not spend the time on the RF design beyond "does it work".


Anecdotally, I have a Microsoft Surface Pro 4 and a Microsoft Surface Keyboard (Bluetooth), and it is incredibly unreliable. Pressing a key after a period of inactivity will select an action from:

1. Do absolutely nothing.

2. Completely lock up the computer so it needs a hard reset.

3. Completely lock up the computer for a few seconds, then work.

4. Start working, but ignore ~40% of the key presses.

5. Swallow the first keypress or two, then work.

6. Actually function like a keyboard.

If Microsoft has a good RF team, I'd hate to see what worse teams do.

I wish there was a USB version of this keyboard, or at least a wireless dongle version (like Microsoft's cheaper keyboards).


Those sound like software/firmware problems, not RF problems. I think it's pretty well-known that Microsoft has an absolutely dreadful software team.


"Run of the mill bluetooth headset, mouse, or keyboard" manufacturers don't make Bluetooth chips, they buy them. What part of the BT stack do you work on? You're saying nonsensical things; wifi does not do AFH. I didn't know Apple made Bluetooth chips? You don't even mention Nordic or TI, and not all big BT companies are in America.


I never said the chip manufacturers were device manufacturers, perhaps you read that incorrectly?

The companies that deploy a TI, Nordic, or Broadcom chipset are subject to the scrutiny I just mentioned. The chipset manufacturers offer reference designs, but generally you need a custom matching circuit and a complex RF design to do it correctly. That's where most manufacturers skimp out, because it's generally "good enough".

As for WiFi, it does not frequency hop but is capable of "auto" selecting a channel on boot (generally).


Yes, I clearly misread that.


> I didn't know apple made bluetooth chips?

W1

https://9to5mac.com/guides/w1/


> You're saying nonsensical things, wifi does not do AFH.

Many, many Wi-Fi implementations are capable of frequency hopping, including my own at home.


That is not possible; it's not part of the spec. Maybe what you're thinking of is the wifi hotspot selecting the channel automatically when setting up a network, based on the neighbours it can see. Nothing to do with frequency hopping. AFH happens between packets and maps out bad channels. Wifi does not do that, never has and never will, because it's governed by different rules.


Just because a spec doesn't have it doesn't mean it can't be done. At the same time, implementing something that makes you out-of-spec (like Zero Handoff with a virtual floating BSSID between multiple orchestrated AP's -- totally unsupported and not in the spec) and designing it the right way can sometimes mean that a function you thought could not be there is there anyway.

What he is probably referring to isn't frequency hopping, because what it sounds like and what it actually refers to are different things to different people. For someone who is not an RF engineer, hopping might simply mean "an AP that has multiple radios that uses one to scan the 802.11 2.4 GHz space to see what channels have the least utilization, and then restarts the other radios on that channel, possibly forcing all clients to lose their connection as if out-of-range and reconnecting again on the new channel as they find it".

This, of course, has nothing to do with RF frequency hopping, but if you don't actually know anything about RF, you won't know what hopping means either, and just imagine that what your router with built-in AP is doing must therefore be that 'hopping'.


> For someone who is not an RF engineer, hopping might simply mean "an AP that has multiple radios that uses one to scan the 802.11 2.4 GHz space to see what channels have the least utilization, and then restarts the other radios on that channel, possibly forcing all clients to lose their connection as if out-of-range and reconnecting again on the new channel as they find it".

Bingo, except add 5 GHz (and soon, 900 MHz), too.


Nope, I’m thinking of when I transparently switch channels when my neighbor comes home and starts blowing up his own network. I’m aware it’s not part of the specification, which I’m sure you’re aware doesn’t mean anything except that it’s not part of the specification.


I don't know what you're thinking of, but no, you can't have that: if you had this magical hotspot, the network card in your iThing would not know what to do, because wifi does not do frequency hopping. Every time your hotspot changes channels it disconnects all your devices.


I can't speak for jsmthrowaway but I think that he's referring to Band Steering (please correct me if I'm wrong). It's not in the spec but my Unifi gear definitely has a checkbox to enable it if I opt-in to see "advanced settings".

As far as I know it's not dynamic, though. It just happens when a device connects to a base station and only "steers" the client to connect using one of the 2.4 or 5 GHz bands, and that doesn't interfere with channel selection.

The AP also has a function to scan bands to find the "best" channel to use but that disconnects all devices while the scan is in progress. I think some APs might be doing this automatically (?). My other AP (a Mikrotik) has an "auto" channel selector but I don't know what it does or when.


Right, and that's band steering, not frequency hopping.


Correct. I’m on Unifi gear with two bands. When it has opportunities to do so, it moves around the SSIDs themselves, too, which seems to happen after every client gets steered to one of them (it’ll reconfigure the other). I might have enabled this feature at some point without realizing it.

My neighbor moves around his channels, too. That’s what’s making me comment that it’s likely more common than we think. I’m not at home to prove this and I’m getting downvoted for sharing, but I’ve watched him move between 36, 40, and 44 at a frequency that suggests it is happening automatically, probably when all of his sessions go idle or something. I have observed this with multiple tools.

My 5 GHz was on 44 this morning, because I happened to look for unrelated reasons. I just SSH’d home and it’s now on 36 without interaction from me. Specifications are not implementations. I’m almost positive this is a common feature which I’ve seen in many APs over the years, even though it’s not per-packet like Bluetooth (which I was also aware of, given that I'm experimenting with a Bluetooth 5 mesh system for IIoT).


You're forgetting devices with two bands, and assuming a lot about my devices (and competence). I know my network and its behavior, and I'm not just spewing hot air here.


You are arguing with me about something that is easy to verify in 10 seconds by looking at the wiki page: https://en.wikipedia.org/wiki/IEEE_802.11

(it does mention a frequency hopping physical layer but that was part of the obsolete predecessor from 1997, the modern spec uses OFDM so unless all your equipment is from the last century you're not using frequency hopping)


I think there is a misunderstanding related to the terms here:

1. IIRC my router is supposed to switch channel (i.e. frequency) automatically unless I explicitly set it to a fixed channel.

I don't know how, but my devices figure it out and continue working.

2. I think the misunderstanding is related to the word frequency hopping or whatever was used further up in the discussion.

I normally take "frequency hopping" to mean changing frequency multiple times a minute or even faster, while I doubt my wireless access point will switch channels more than a few times in 24 hours.

Disclaimer: I never really sat down and verified if it ever changed.

Also agree with logicalle that I wish for a bit more civility.


Paul_S, could you be a bit more civil to jsmthrowaway so as to elicit what he does know and how he knows it?

jsmthrowaway can you give more information about your setup and what the devices do?

No need for a heated argument. This whole thread is quite instructive!


Why was this downvoted?

(No, I'm not complaining about being downvoted - this isn't even my comment - just trying to learn.)


Based on my frequent lurking, I would guess it was due to the personal (and off topic) nature of the comment. Addressing users specifically in this way seems to be looked down upon here.


Not really. People aren't allowed to be rude and uncivil with each other here, as it shuts down constructive discussion. This is in the guidelines.


This is a daily goddamn struggle for me. My Macbook Pro routinely forgets about my Apple trackpad, and the only thing that fixes it is restarting the laptop. The sound system on the laptop often selects the wrong mic for the input when a BT headset is connected. My iPhone keeps switching between headset and internal speaker/mic when on a call. Pairing the same device to multiple hosts (laptop and phone) is like birthing a hedgehog backwards. And let's not forget when you try to initiate pairing from a laptop or phone instead of the device. Why even provide the damn buttons to do it if they never work?


I would love a wired trackpad. Every couple of weeks I probably burn 20 minutes or so when either my home office trackpad or my work trackpad just stops being seen by my MBP. The fact that my Apple device frequently can't talk to my Apple device has kept me from pushing much deeper into other Bluetooth devices, as it is bound to be worse.


> I would love a wired trackpad.

You can use Apple's current trackpad (the "Magic Trackpad 2", I believe) on a Mac (at least) via USB even with Bluetooth entirely off.

Edit: I believe (but have not confirmed) that their current keyboard works similarly. Their current mouse does not, at least in any really useful way, because its Lightning port is on the bottom.


> because its Lightning port is on the bottom.

I have always been amazed how everyone touts Apple as the pinnacle of design, yet there are so many critically stupid design decisions the company has made, this being just one.


I totally agree, and had to Google for answers for this decision.

The rationale, at least what I found, was that with a different design people wouldn't let it charge completely, but with this one they would leave it charging overnight. Also something about users ignoring the fact that it's wireless and just always using it with the cable attached.


Those rationales don't make much sense to me since they seem like they'd be equally applicable to the keyboard and trackpad.

Every design is about tradeoffs. In this case, the mouse would have to have a somewhat different shape to accommodate a port at the front. That different shape would surely be a poorer fit for the visual aesthetic they were going for and it likely would've impacted the feel during usage, as well.

Some people would prefer that tradeoff — certainly those who would prefer to (or have to because of an RF-noisy environment) use the mouse wired all the time. I expect the vast majority of their customers don't do that, though; and, I, for one, prefer (or at least am neutral to) the choice they went with.

It definitely makes for a silly-looking photo while it's charging, but it's certainly not a blunder that none of the designers ever considered.


Being able to use it while it's still charging is a really useful fallback scenario, though. Imagine if you couldn't use your phone while it was charging, if it was unusable until it was fully charged again. No one would put up with it. Yet a mouse is vitally important too!

If you've reached the point when you need a backup mouse for your primary mouse, then the primary mouse has failed.


Except that charging the mouse during a biobreak/while getting more caffeine will charge it enough to last the rest of the day.


Confirmed for the trackpad. It can indeed be used solely via USB. I'm doing it right now.

EDIT: added 'for the trackpad'.


The Magic Trackpad 2 with the built-in battery and Lightning cable works like a wired trackpad when plugged in. No BT required when connected via USB.


Interestingly, those are probably the only devices that work flawlessly for me. MacBook Pro in ~3 different environments, automatically pairs input devices and speakers and selects the audio I/O accordingly (I think it switches to the BT device if it was the active I/O when it was last connected). Same thing with the iPhone when paired with the car BT. But with basically all other devices I have had the same experiences. I find myself regularly reinstalling BT and audio drivers on an HP EliteBook because it drops connections and can't find devices afterwards.


Tethering iPhone to my MacBook over Bluetooth? Forget it, never works.


Works perfectly for me these days. It was a little fiddly 5 years ago, but the newer OS functionality, where you can select your phone from the WiFi drop-down (with tethering turned OFF on your phone) and it automagically connects to your phone, enables tethering, and hops online, is like magic. YMMV.


Sierra?


I'm on Sierra now, yeah, but I want to say it worked on the previous release too.


Ironically works great with my iPhone and my Surface Book. :)


Oh the irony... sigh

:/

I just resigned myself to doing Settings > tether over WiFi > WiFi only every time.


The Apple trackpad doesn't work reliably in an environment with lots of stuff going on in the spectrum. It's basically unusable without a cable.

But otherwise it works fine.


I have a rule of never using bluetooth for anything critical. I have to fight bluetooth enough to get music working, I can't imagine having that same fight over my mouse or keyboard.


For me, the biggest problem with BT is that BT audio is almost entirely unbuffered. I wear a set of BT headphones connected to a fitness watch (Polar M600) when running. When the BT connection from the watch to the headphones is briefly blocked by part of my sweaty body (think arm movements when running), the BT signal is interrupted and the music cuts in and out with every stride. If BT audio could be buffered for 15-20 seconds, this would not be a problem.


This is one of the infuriating things to me too, yeah. On the one hand, Bluetooth is a horrifying mishmash of all these special-purpose protocols rather than a generic pipe. But then, when I want to take advantage of the fact that it's not a pipe and that I'm shipping down prepared audio (rather than live audio), which is something one of these specialized protocols ought to be able to negotiate, suddenly Bluetooth is just a dumb pipe relative to this use case. It's the worst of both worlds.


If you want a generic pipe, there is always the serial port profile.


Somehow my Honda Odyssey buffers bluetooth audio for ~3 seconds. I had thought that the designers/developers were just inept because none of my other bt audio devices work this way. It never occurred to me that this could be considered a feature.

... no, I still think it's a bug.


UGH this is a problem with all Hondas before ~2014. Drives me nuts. It's even worse when using it with Google Maps since the audio is delayed.


Agreed. My 2008 Audi A8 had around 3s of buffering. If I pressed "next track", it would take 3s to respond. Parked and watching a video? Out of sync by 3s with audio. Definitely not a feature.


If it was buffered and you changed track on your phone, you would then have to wait 15-20 seconds for the track to change, as it would be playing from the buffer.

I guess if it was 2 seconds it might be more bearable.


I think OP meant to pre-send music and keep the next 15 seconds available locally. This way you can handle 15 seconds of interruption.

So, when you start streaming music, it starts playing immediately, but it tries to send data faster than it is played until you have a decent buffer. Obviously, this would only work for non-realtime things.
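A toy simulation of that idea (made-up numbers, purely to show the mechanics of pushing audio faster than real-time):

```python
# Toy simulation: the source pushes audio faster than real-time whenever
# the link is up, so the sink accumulates a buffer that rides out short
# link dropouts. Numbers are illustrative, not real BT link rates.

def simulate(link_up, push_rate=1.5, max_buffer=15.0):
    """link_up: one boolean per 1-second tick (is the radio link usable?).
    push_rate: seconds of audio delivered per second while the link is up.
    Returns the number of ticks where playback stalled (buffer empty)."""
    buffered = 0.0
    stalls = 0
    for up in link_up:
        if up:
            buffered = min(max_buffer, buffered + push_rate)
        if buffered >= 1.0:
            buffered -= 1.0  # play one second of audio from the buffer
        else:
            stalls += 1
    return stalls

# 10 s of good link builds ~5 s of buffer, enough to cover a 4 s dropout:
assert simulate([True] * 10 + [False] * 4 + [True] * 6) == 0
# With push exactly at real-time (no buffering ahead), the same dropout
# stalls playback for all 4 seconds:
assert simulate([True] * 10 + [False] * 4 + [True] * 6, push_rate=1.0) == 4
```

As noted, this only works for non-realtime content; a phone call has nothing to send ahead of time.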


Not if it honored an "invalidate buffer" command.


Increase the size of the buffer to N > 1 songs and allow the device to tell the headphones to skip ahead in the buffer. If "next song prediction" is hard (e.g. the user is clicking around their library), support a device "burst push" mode that lets the device send the first M seconds of a song very quickly.

On a side note, it's interesting that this buffering problem is identical with switching channels on any streaming service, and that the buffering delay is a significant psychological barrier to moving around, and that traditional radio works really friggin' well for what it does.


Is it? The reason I ditched BT headset in favour of wired one was that, on my Linux desktop, it would over the course of an hour acquire ~1s of lag between video and audio. Could be fault of the Linux BT stack, but still, someone buffers audio somewhere.


You might be able to thank PulseAudio for that: https://bugs.freedesktop.org/show_bug.cgi?id=58746 From what I recall, with an unreliable enough Bluetooth connection it can easily hit tens of seconds of audio delay.


Bluetooth 5 is fixing a lot of these issues with Bluetooth audio. It'll be over BLE too, which is a simpler stack that's less prone to issues than the old Bluetooth 2-based stack.

I'm waiting to buy wireless headset until they hit the market. Could take a while though.


That's probably a problem with your headphones. I have the original JAMBOX by Jawbone and I can clearly see there is a delay (1 sec) between doing something and hearing it in the speaker. I suppose that is because of buffering.

It works like this on both iOS and Android devices.


I remember learning somewhere (perhaps a podcast discussion?) that Bluetooth headphones can tell the host device what their audio delay is. Given that information, the host device can adjust video playback so that the sound syncs up perfectly. I know iOS is capable of this; I'd assume macOS is as well.

So it depends on the headphones.

Of course that only helps with premade content. If you're playing a game, the OS can't adjust the game for 200 ms of lag; everything is just going to be out of sync.
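The compensation itself is simple arithmetic (a sketch with made-up names, not any real API):

```python
# Sketch of the sync idea above: if the audio sink reports its latency,
# the player delays video by the same amount so picture and (delayed)
# sound line up at the viewer. Names and numbers are illustrative.

def lipsync_error_ms(audio_latency_ms, video_delay_ms=0):
    """How far the audio lags the picture, as perceived by the viewer."""
    return audio_latency_ms - video_delay_ms

# Uncompensated: a headset reporting 200 ms makes sound trail the picture.
assert lipsync_error_ms(200) == 200

# Compensated: delay video presentation by the reported latency -> in sync.
assert lipsync_error_ms(200, video_delay_ms=200) == 0

# A game can't pre-delay rendering relative to your inputs, so for
# interactive content the 200 ms remains no matter what the OS does.
```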


I've always wondered how that worked! When watching a movie on my Linux laptop, it automagically syncs, even with a very poor quality promotional BT speaker that says "this bud's for you!" every time it pairs.


I've tried 4 different headphones from 3 different vendors with 2 different smart watches (Moto 360 Sport, now Polar M600). I've had the same problem with all of them.

At least one of the headphones has been a very high quality LG (HBS900?). Oddly, I've had the best connectivity with a cheapo freebie headphone set that has audio so bad that I can't stand to use them.


> If BT audio could be buffered for 15-20 seconds, this would not be a problem.

My phone (or my car) does about 2 seconds of buffering, which is usually just enough -- however, it applies the buffer regardless and doesn't synchronize to any other content. So, using BT to try and watch any videos completely fails.


This is shit gear. My Bose QC35 stay connected to my macbook pro and iphone 7 50-60 feet away through multiple walls.

See my comment elsewhere here, I had a set of Outdoor Chips bluetooth speakers for my ski helmet that would cut out if I turned my head the 'wrong' way and I returned that garbage posthaste.


That would require ~1Mb of memory (or potentially more), also would be unfeasible for some applications (like phone calls)


Phone calls use a different Bluetooth profile, so that's largely irrelevant.

1 megabit (did you mean megabyte?) of memory is not worth noting. It could be integrated into the IC that contains the entire system for headphones quite easily. That system already has memory in there anyway.


1 megabyte, and your average BT audio device has less memory than you think.

The standards were not written with today's availability of memory and resources in mind.


I'm not sure how much memory you think I assume is present. My point is that it would be a possible and simple extension, not that it is already present.

There is currently exactly as much memory as needed. That does not make it unrealistic to extend it with a buffer. Cheap SBC-only devices certainly won't do it as those are solely about cost optimization, but the beefier, pricier AAC/Apt-X multi-device capable bluetooth audio sink solutions that sell for premiums could easily incorporate a bit of extra memory. 1Mb (megabit) would give a few seconds of buffering, which would be sufficient. It'll take a bit more silicon, but nothing absurd. Hell, it could be external memory over SPI, so integrators could select buffering capability themselves, without significant cost/silicon area increase of the bt chipset

However, to actually utilize buffering, you'd need a higher bandwidth link than is required for live playback to catch up after connectivity dips. Buffering by itself only adds latency. The question is then whether you'd want to use your link bw for resilience of low quality audio, or instant high quality audio...
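A back-of-envelope check of those memory numbers, assuming SBC audio at ~328 kbit/s (a common high-quality A2DP setting; the exact bitrate depends on the negotiated bitpool):

```python
# Rough arithmetic for the buffer sizes discussed above. The 328 kbit/s
# figure is an assumption (a typical high-quality SBC configuration),
# not something mandated by the spec.

SBC_KBITS_PER_SEC = 328

def buffer_seconds(memory_bits):
    """Seconds of SBC audio that fit in a buffer of the given size (bits)."""
    return memory_bits / (SBC_KBITS_PER_SEC * 1000)

one_megabit = 1_000_000
print(round(buffer_seconds(one_megabit), 1))      # ~3 s from 1 Mbit, as stated
print(round(buffer_seconds(8 * one_megabit), 1))  # 1 MB buys ~24 s

# So the 15-20 s buffer proposed upthread needs on the order of
# 5-7 Mbit (roughly 0.6-0.8 MB): modest by desktop standards, but more
# RAM than a cost-optimized headset SoC typically ships with.
```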


Usually far less than 1 MB. It's about cost, footprint and power consumption.


What profile is used for phone calls depends on the configuration of the phone and the headset.

Last week I accidentally found out that I can use two phones at once with the integrated HF in my car's radio, as long as they use different BT profiles (somewhat unsurprisingly, while trying to fix completely unusable sound quality over BT).


That's so you can have a music device and calling device connected simultaneously. Both connections can be consumed by one device if you want both music and call functionality.

I don't think your phone will try to use A2DP for a call.


I used to absolutely abhor BT, but in the past few years it seems to have gotten really, really good about picking up, and maintaining a decent connection. Since then, I've picked up BT headphones, BT keyboard + mouse (Apple), and a nifty little waterproof BT speaker. Now, the only issue I sometimes have is when I want to connect to a new host device. Other than that, BT has been really nice to me.


I came here to say this. I've owned lots of Bluetooth devices, dating back to an Ericsson T39 and some kind of Bluetooth headset in 2002. It seems clear that at some point in the past couple of years, something changed, bigtime. Not sure if it's Bluetooth 4, or what.

It's always been terrible. Even just connecting my phone to my Tivoli Audio speakers from 2 feet away was a nightmare. I'd have to try to reconnect 10 times before it would finally work. This is across dozens of devices and now more than a dozen years.

My Mac stuff is mostly better although I found when I had a deskmate, in a room full of 40 people using multiple Bluetooth devices, sometimes my trackpad would disconnect and have a hard time reconnecting. When he quit, everything went back to normal, then when they hired his replacement (and gave him different equipment), the problem came back.

That said, recently it's been an absolute miracle. I bought a new VW Golf with bluetooth. It's AMAZING. It connects quickly, every time, and works flawlessly. After having a positive experience with a friend's bluetooth headphones, I bought some Bose QC35 bluetooth headphones, and THEY are amazing! They even connect to 2 devices at once so my computer can send me alerts while I listen to music streaming from my phone. And the range is outstanding -- 60+ feet through multiple tile walls in my office building. One time I got into my car wearing my headphones (on a conference call), started the car, shut off my headphones, and the call magically transferred to my car, instantly. Based on THOSE good experiences I bought some Jaybird X3 running headphones, which also work amazingly well.

I tried some Outdoor Tech Chips helmet speakers for my snowboard helmet, and it was IMMEDIATELY back to 2005. The sound would have static and crackle. When I turned my head left with my phone in my right pants pocket (the antenna is apparently in the left earpiece), it would LOSE CONNECTION. Hell, sometimes even sitting on the ground with my head bent, about 2 feet line of sight, it would lose connection. I took them back to REI and bought a Sena Snowtalk 10, so now my ski helmet is back to multi-room range as 2017 Bluetooth intended.

I'm interested in following this comments thread to better understand the tech, but I simply can't believe how good it's gotten. I'll look at my phone and it will tell me the charge of my bluetooth headphones and apple watch, and everything JUST WORKS, thanks god, for once.


Totally agree. I had a pair of Plantronics BackBeat Go (wireless earbuds with a wrap-behind cable) and the Bluetooth was horrible. Dropped in a pair of QC35s and it's been like a dream. You really seem to get what you pay for.


Let's trade lives


Lmao. Bluetooth is so random about working or not! So many conflicting views in this thread.


I know very little about this technology, but I believe that Bluetooth LE/Bluetooth 5 is actually quite different from earlier versions, and much better: the vendors basically swapped out the BT4 technology whilst keeping the name.


Yeah. This is pretty much true. Things like audio are not on BLE yet though, as far as I know. That will arrive with Bluetooth 5 (better bandwidth will enable it).


For programmers using Bluetooth Low Energy (BLE 4.0) on Linux/BlueZ, we're working on this handy BLE GATT library for Python: https://github.com/getsenic/gatt-python. BlueZ/GATT is very stable in our experience and supports most functions, such as BLE device discovery, listing services and characteristics, and reading/writing/subscribing values from/to characteristics.


I have a lot of gear in the Apple ecosystem that uses Bluetooth and I don't consider it unreliable at all. I use at least 6 different Bluetooth devices all day, every day (MBP, keyboard, trackpad, iPhone, Watch, Airpods, with additional car pairing, portable speaker and iPad pairings) in close proximity to a bunch of other developers behaving similarly. Looking around I can count at least 40 BT devices in active use around the office and I would still characterize my Bluetooth devices as more reliable than any wifi AP I've ever used.


A big part of the reason "Bluetooth" is unreliable is that there is no one "Bluetooth." Each manufacturer's implementation differs in strength and weakness, and depending on the potentially shoddy chips in the devices you are connecting to, a different Bluetooth standard will be used.

I have Bluetooth devices years old that I've never had problems with, and others that are a constant nightmare. The software stack behind the Bluetooth is also a major component in the reliability question.


Disclaimer: non-technical anecdotal evidence here—

I had a colleague for a time whose dad was a hardware engineer with Toshiba & worked with/on their part of the specification Working Group.

His pop said that the whole BT stack was unambiguously a steaming pile of poo from the get-go, and it was nearly miraculous it functioned as well as it did¹.

At that I had to chuckle, seeing how I'd wager that each of us have had enough woggy experiences with the tech to agree with the point he made so plainly.

But I do love the chosen icon & the history behind it, vis-à-vis the name ("Bluetooth"), so it's not all bad <wink>.

---

¹—this was around 2010 or so, to add some context wrt the relevant timeline(s).


Not 100% sure on this, but I believe devices utilizing Apple's W1 chip use a combination of Bluetooth + WiFi (WiFi for the initial handshake upon connecting probably or something like that). If anyone's ever used AirPods it's amazing how reliable they are compared to other bluetooth headsets.


AirPods pair and work fine when Wi-Fi is off, yet they aren't even detected by the phone if Bluetooth is off. That would indicate to me that WiFi isn't involved?


You're incorrect, they do not support Wi-Fi. I think you were thinking of the chips in the watches.

I have AirPods, and they work pretty fantastically. I've had occasional issues, but it's rare enough that it doesn't bother me at all.


Is it? I'm a pretty heavy user of Bluetooth in a few different use cases and it's pretty reliable for me.

Best way to improve reliability is to avoid dodgy or counterfeit radios in crappy electronics.


Likewise, I have no problem with it (for mouse, trackpad and keyboard, all in all-day use).

I'd still prefer wired for anything stationary (keyboard & trackpad in my case), just to eliminate the entirely superfluous batteries. In a world where that option isn't available, however, bluetooth works perfectly well for me.


Same here. Been using Bluetooth heavily for about 10 years with no problems I can recall. Certainly far more reliable than wi-fi where devices and routers frequently need restarting for no obvious reason.


I know this is a little tangential, but this brought some simmering anger back to the surface. Just today I was trying to communicate with a bluetooth device that provides a serial channel.

Back in the day I used to just run "rfcomm bind <mac-address> <channel>". But it turns out BlueZ decided to deprecate (read: stop building by default) the rfcomm command in favour of... wait for it... a barely-documented D-Bus interface.

How much do you have to hate your users before you decide to take away a useful, functional executable and replace it with an unspecified interface over D-Bus that requires hours of research to use rather than 30 seconds of reading a manpage?


The last few years, I have not had trouble with BT, but maybe it's because I simplified my use cases to ones which work after early failures. Here's what works for me and never fails:

- Macbook to Apple bluetooth mouse

- iPhone 6s to late model Mazda infotainment system

- iPhone 6s BTLE connection to Garmin Forerunner watch


That hasn't been my experience -- the same physical devices work for a while and then fail usually after an OS update.

My iphone6 paired inconsistently with my car until iOS 10.2 then it failed completely. Then the dealer did a routine maintenance and it started pairing again. Until 10.3, and now it's hit and miss.

Plantronics headset used to be my foolproof solution for Google Hangout and Bluejeans meetings. Now it won't pair successfully with the MacBook at all. Nor with the iPhone 6.

It's the "moving target" aspect that is most infuriating for me.


That's been my experience with BT as well... basically it's the "Swiss cheese model" where every BT implementation has problems, but you can sometimes find pairs of devices which don't hit each others holes, and once they work, tend to keep working. So you keep trying different sets of devices, and then once you've found the right ones, cling to them like a drowning man to a rock :)


Seconded (thirded?). Daily user of BT keyboard, mouse, and headset. Nightly user of BT connection between phone and speaker. I had my fair share of lumps in the beginning, enough to put me off of it completely until 4 or 5 years ago, but it's been a primarily pleasant experience ever since.


I have a Linux computer (Dell Chromebook 13) connected to the Microsoft Mouse 3600 Bluetooth (BLE 4?) and it was rock solid. The mouse picks up immediately whenever the computer is on. It was almost miraculous how well it works. The mouse is really quite darn responsive too.

That is, I use the cutting edge Linux distribution (Ubuntu 17.10) -- it was pretty darn painful even on 17.04. I have another keyboard that is on Bluetooth 3.0 that fucking disconnects every other day.

So YMMV - I think BLE mice and keyboards are much better in terms of 'just works' unless you pair them with a whole bunch of devices.


I have a Marshall bluetooth speaker, and I find it pairs and works more reliably with my Linux laptop (Ubuntu 16.04 on a Thinkpad) than it does with my Mac laptop (Sierra on a 2012 MBPr).


That sums up bluetooth in general. Trying to use a bluetooth device with multiple devices/computers is a pain.


From an experiential view, I say "crowded spectrum". My bluetooth keyboard takes ages to associate at work (which is close to a mainline rail station), but at home in the relative country, it works smoothly.


I wonder how much of this is related to hardware generally being developed and tested in isolated suburban office parks as opposed to big cities.


No matter where you develop the hardware, you really can't get over the fact that you have a small crowded spectrum where bunch of devices keep trying to transmit over you and causing collisions.


Oh wow. And I thought it was reliable. I used it only a few times on smartphones and laptops (I like my mice and keyboards with cables), but I still remember what a big deal it was compared to infrared, how mobile phones in the early 2000s would lose connection, and how the only sure way to use IR was putting them next to each other on a flat table with their IR thingies physically touching(!).

That makes me a little less excited about my plan of getting a DualShock 4 for gaming on my PC.


I gave up on Bluetooth at home around a year ago. Not sure what it is, but I'd pretty much have to put my phone right next to the bluetooth speaker for it it work reliably. Might as well use a cable.

I had high hopes for Google Chromecast Audio for my music at work and at home. Probably my fault for jinxing myself by asking "What could possibly be worse than Bluetooth?" Chromecast Audio has definitively answered that.

For one thing, you can't limit who can interact with the Chromecast. Anyone on the network can see it. At work, my music would usually pause ~4 times a day as someone else's phone would see it and connect to it. I'd have to set up a new wifi network that only I could use to fix this. Since I only listen to music a few hours a day, that's pretty frequent.

It also gets confused if I leave work and then try to use Google Play Music elsewhere: my Google Home in the bathroom will play a song and then stop, I think because "google play is being used on another device", but it doesn't tell you that.

Maybe I should just go back to using something like a Raspberry Pi with access to my music library. It's still mostly songs I have the CDs for and ripped; though I've added probably 50-100 songs over the last year on Google Play, my 600 CDs are all in FLAC.


I've never used a Chromecast Audio, but their website explicitly mentions a password and a guest mode w/ 4-digit PIN to let other people play music on it. Sounds like yours isn't working as intended, could it be something else causing the pauses?

https://support.google.com/chromecast/answer/6279388?hl=en

EDIT: jordanthoms points out that "guest mode" is for letting people stream audio without giving them your wifi password. Once they're on the network everyone has access.


The "guest mode" allows for users not on the WiFi network to see and interact with the Chromecast using WiFi Direct. I do not believe there are any options on the Chromecast itself to limit access to users already on the local WiFi.

FWIW, Chromecast Audios are one step further than this, as they do not support this "guest mode" as they do not have a way to display the PIN. Anyone on the WiFi network can freely connect to the device.

That said, I have never seen a device automatically connect to a Chromecast or Chromecast Audio. Normally you have to explicitly connect to a Chromecast.


As described in the linked page, the host gets the Chromecast Audio's guest PIN from the Google Home app


That's for allowing people to cast to the Chromecast if it's nearby even without connecting to the WiFi - if you are connected to the same WiFi as the Chromecast no password is required.


Ah, missed that when I skimmed through. Looks like there's no way to secure it against people who are on the same wifi network.


I remember laughing when I changed the YouTube video and my phone got a notification; indeed, anybody on the local network running Android with Google Home will be in on the party.


I switched from a bluetooth dongle of unknown provenance to a more powerful Zoom (brand) class 1 dongle and hung it from a usb cable off of a lighting fixture in my home office. I get complete coverage of a rather large screened-in porch to a Jabra headset, despite having to penetrate my pudding head, two interior walls, and one exterior wall. The class 2 dongle barely worked outside.
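(For a rough sense of why a class 1 dongle reaches so much further: the nominal figures are 100 mW / 20 dBm for class 1 vs. 2.5 mW / 4 dBm for class 2, and under idealized free-space propagation range scales with the square root of transmit power. A back-of-envelope sketch, ignoring walls, antennas, and receiver sensitivity:)

```python
def range_multiplier(tx_dbm_new: float, tx_dbm_old: float) -> float:
    """Free-space range gain from extra transmit power.

    Received power falls off as 1/d^2 in free space, so usable
    range scales as 10^(delta_dB / 20).
    """
    return 10 ** ((tx_dbm_new - tx_dbm_old) / 20)

# Nominal Bluetooth figures: class 1 = 20 dBm, class 2 = 4 dBm
print(round(range_multiplier(20, 4), 2))  # -> 6.31 (roughly 6x the range)
```

Real walls eat far more than free space does, but the ~6x headroom is why the class 1 part still gets through.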


I'll just leave here that the "official" linux bluetooth stack (i.e. BlueZ) has dogshit documentation.


I never have Bluetooth issues in my Rav4 between any of my phones (ZTE, OnePlus); it is always perfect. I cannot emphasize enough how amazing it is.

Both my and my wife's Fitbits have constant Bluetooth issues with our phones. This is completely and utterly annoying.

Driver related? Not sure.


> Driver related? Not sure.

Well, is the bluetooth as good when your wife is driving the Rav4?


He meant software drivers, not people drivers.


Probably hardware-related. Good, or at least thoroughly tested and characterized, design can make or break high-frequency RF applications.

Check out the PCB guidelines for the popular NRF51822 Bluetooth transceiver:

https://devzone.nordicsemi.com/blogs/655/general-pcb-design-...

It barely scratches the surface, and in reality there will always be some trial and error, as other nearby circuit elements can cause unpredictable interference or feedback. But it's a lot more finicky than laying out an Arduino board.


I would buy a wireless audio speaker that uses NFC instead of bluetooth to connect to Android or iPhone. You would have to set the phone on the device but that would be a small price to pay if the connection were more reliable.


NFC has an effective range of about 20cm; Bluetooth (and WiFi) up to about 50m. These have completely different use cases, and if you're limiting yourself to 20cm range for a semi-permanent fixture (like speakers) you might as well run a wire.


that's not how NFC works.


I would love it if all my Bluetooth devices used NFC to set up pairing.


Yes, maybe pairing could be accelerated with NFC, then switching to bluetooth for extended range and bandwidth.


Do you really have problems with Bluetooth connection when the speaker is touching the phone?


I am in the Apple ecosystem and the "pair" process is incredibly slow, even when it works. I'm only talking about seconds here, but it's usually something like 15 seconds.


NFC has distance and bandwidth limits.


I would be OK with the distance and bandwidth limit if it were high quality audio transmitted zero distance (contact) wirelessly, without a pairing step which is currently manual.


I think the bandwidth is too low for any half-decent quality of audio. Not 100% sure.


Despite multiple apps like ShareIt, I find Bluetooth to be the only mechanism of data transfer that just works. ShareIt and its ilk get new versions which break on my Android 7 with each upgrade, and it is a PITA to get them working across different Android versions; for some reason my device does not show up on a Moto phone, and I have to use hacks like getting a file from the other device to my device and then sending something over the established connection.

But there is one catch: Bluetooth is not useful if the file is big.


Part of the issue is that bluetooth as a whole is nothing more than a wireless serial connection. It's the various protocols built on top of it that determine its stability. The Bluetooth SIG only really control the pairing between the two devices, a low layer. You're hoping that the company you buy stuff from has implemented the protocol correctly, over which the SIG has no control.


Traditional "slow" IrDA is essentially HDLC over UART, with a packet-based protocol stack on top that, among other things, can emulate a dumb reliable duplex pipe (TCP-like), usually exposed as a virtual serial port. Bluetooth is similar, with a significantly larger set of "other things". The other relation to serial ports is that a UART is the usual interface between the HCI and the host system (using a fairly high-level protocol). In the lower layers it is a frame-based protocol that has no relation to serial ports.
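To illustrate the "HDLC over UART" part: HDLC-derived async framing (the same idea IrDA's wrapper layer uses, and the scheme PPP borrowed in RFC 1662) delimits frames with a flag byte and escapes any payload byte that collides with it. A toy sketch of that byte stuffing; note the flag/escape values here are the classic HDLC async ones, not IrDA's exact BOF/EOF bytes:

```python
FLAG, ESC, XOR = 0x7E, 0x7D, 0x20  # HDLC-style flag, escape, escape-xor

def frame(payload: bytes) -> bytes:
    """Wrap a payload in flag bytes, escaping flag/escape occurrences."""
    out = bytearray([FLAG])
    for b in payload:
        if b in (FLAG, ESC):
            out += bytes([ESC, b ^ XOR])  # stuff: escape then flip bit 5
        else:
            out.append(b)
    out.append(FLAG)
    return bytes(out)

def deframe(data: bytes) -> bytes:
    """Undo frame(): strip the flags and unescape stuffed bytes."""
    assert data[0] == FLAG and data[-1] == FLAG
    out, esc = bytearray(), False
    for b in data[1:-1]:
        if esc:
            out.append(b ^ XOR)
            esc = False
        elif b == ESC:
            esc = True
        else:
            out.append(b)
    return bytes(out)

msg = bytes([0x01, 0x7E, 0x7D, 0x02])
assert deframe(frame(msg)) == msg  # round-trips even with flag bytes in the payload
```

The point of the stuffing is that a receiver can always resynchronize on the next flag byte, which is exactly what you want on a lossy serial link.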


I have nothing to add except "yes, me too, oh how I have suffered". The countless crappy bluetooth devices I have connected and disconnected, and the hours and hours I have wasted trying to get them paired with various linux boxes, nearly all of them in short order choosing death rather than doing my bidding. I am looking at one right now, currently unconnected. "Dummy device". Why indeed.


1. fragile encoding schemes

2. fragile modulation techniques (UWB would have fixed this for good, but it died to patent trolls)

3. interference from WiFi (try using a BT mouse while downloading an HD movie)

4. and, because of all that, three different incompatible "wire protocols"

But the upside is that BT is super cheap to implement, and thus ubiquitous


I've always had trouble with bluetooth devices until I got AirPods. Whatever bluetooth setup they're using is very reliable. I use them with my phone, windows computer, ubuntu work machine, and I rarely ever have connection issues.


Plantronics seem to do it substantially better than anyone else somehow.


Because it's not as popular as WiFi or Ethernet or USB. It hasn't had the decades of hard-core, hard-knocks field usage of WiFi/Ethernet/USB. So the chipsets are less robust to errors and less tolerant of highly noisy environments, and the drivers aren't as battle-tested as the other connectivity.

WiFi in its initial days (802.11b) reminds me of bluetooth right now. Quirky, bad tools, weird errors. But WiFi caught on and manufacturers started throwing $B at R&D for better chips and better drivers for those chips.

Bluetooth just has a problem with scale.


Wifi and Bluetooth rose to consumer attention around the same time, and bluetooth appealed to people who weren't necessarily 'techy' since it was adopted heavily for car hands-free kits. Businesspeople outfitted company cars with it, for legal reasons, soon after it was released.

I wouldn't say adoption was lower than wifi. In fact bluetooth was probably more rapidly adopted because, since it's linking simpler devices, it's cheaper (wifi usually carries the whole TCP/IP stack on top of it), but you're probably right in that it has seen less investment in error robustness than wifi - momentary loss of voice through your handsfree is usually tolerable, wifi dropping a large number of packets is not.


I wonder if the things we commonly use Bluetooth for are also more noticeable when they error.

If my Bluetooth mouse stutters for a quarter of a second, I'll likely notice and be angry, but if a YouTube video stutters while loading, it'll have buffered and likely hidden that.


I thought 802.11 and Bluetooth were both brought to market in the late 90s. Am I mistaken?

Your statement seems like a 'first out of the gate', or 'battle tested' argument. They are approximately the same age, no?

They aren't competing technologies; they have differing use cases.

One has enjoyed wide success, and the other is still stumbling along.

Is it a case of unclear standards, bad implementations, or lack of interest in the blue-tooth use case of wireless accessory devices?


WiFi is complicated too, you just don't see the problems most of the time. The setup people generally use at home tends to work and the setup you use in large deployments is typically managed by people who turn off the features that don't work with the devices people use on the network.

Still if you start with features like the fast handoff or the more obscure authentication combinations you'll see the problems.


Yes! Wifi doesn't always work either.

Especially back in the A/B days it was awful.


You're not wrong, but I had early wifi (even on Linux) and Bluetooth today is worse. Bluetooth is not a new standard.

I've dealt with its internals to a small degree. It's overengineered and excessively stateful, leading to a lot of edge cases and failure modes that just do not need to exist. It could have been orders of magnitude simpler and nearly stateless, and it would have been a dream. State is evil in protocols and should be avoided if at all possible.

It really reeks of vendor clusterfuck with lots of requirement overloading, which is very typical in modern protocols that are vomited forth by consortia. WiFi is at least a little bit cleaner owing to the fact that it had its requirements clearly specified: Ethernet over the air.


Wouldn't WiFi in its initial days be 802.11 legacy or systems prior to the IEEE standard such as WaveLAN?

You are right though. I vaguely remember messing about with WaveLAN and I don't recall it being as flaky as some of the early 802.11 systems.


In the early 2000s I used BT (with the DUN profile, IIRC) for an in-home wireless network, for the sole reason that it was more reliable than wifi (at the same time, my consulting usually involved various wifi brokenness).


I wouldn't say WiFi is reliable either, especially once you add the whole stack of protocols run over it (DHCP, SMB, etc) that are equivalent to the stuff that's handled by Bluetooth and its profiles.


I would try eliminating Bluez5 and Pulseaudio first...


Simple: it inhabits the same band almost everything else inhabits, 2.4GHz. To an extent, the reason Bluetooth is unreliable is the same reason most WiFi is unreliable in crowded areas: there are a lot of appliances using that band over incompatible standards.

Even worse are the "spark" kind of 2.4GHz appliances that don't play nice, like wireless camera systems and baby monitors. If your strong-signal wifi or bluetooth keeps dropping, it's far more likely to be one of those at fault than anything else.


I am currently in the process of developing a BT-based embedded system, and in my experience Bluetooth is extremely resilient to the crowded 2.4 GHz space. It has very good and fast frequency hopping, so it maintains a connection even in the presence of several WiFi access points covering all but a small portion of the 2.4 GHz band. We ran a test in our office, which has many access points and laptops/phones, and even an alarm system that uses 2.4 GHz. At a distance of 10m between central and peripheral (with several dry-walls in between) we didn't lose a single bit over a full week of continuous data transmission at several hundred kbit/s.

There are limits of course. At some point the 2.4 GHz band is so saturated nothing low power gets through. But so far, I am pleasantly surprised by the reliability.

All this was done using BT 4.2 though, so maybe that helps.
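The "very good and fast frequency hopping" has a pleasingly simple core. A sketch (from my reading of the 4.x core spec, so treat the details as approximate) of BLE's pre-5.0 data channel selection, channel selection algorithm #1: hop a fixed increment over the 37 data channels, and remap any landing on a masked-out channel onto the set of channels the link layer still considers clear:

```python
def next_channel(last_unmapped: int, hop: int, used: list[int]) -> tuple[int, int]:
    """One step of BLE channel selection algorithm #1.

    used: sorted list of data channels (0..36) not masked out by the
    adaptive channel map (e.g. channels overlapping a busy WiFi AP
    get masked). Returns (new_unmapped_channel, actual_channel).
    """
    unmapped = (last_unmapped + hop) % 37
    if unmapped in used:
        return unmapped, unmapped
    # Landed on a masked channel: remap onto the used set
    return unmapped, used[unmapped % len(used)]

# Example: channels 0-8 masked out (say, they overlap a crowded WiFi channel)
used = list(range(9, 37))
ch, hops = 0, []
for _ in range(10):
    ch, actual = next_channel(ch, 7, used)
    hops.append(actual)
assert all(c in used for c in hops)  # every hop avoids the masked region
print(hops)
```

The hop increment is negotiated per connection, so two nearby links with different increments interleave instead of colliding, and the remap is why masking off a WiFi-occupied chunk of the band "just works".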


I did some similar experiments with the original Ericsson Bluetooth development kits and got the same results, BT won over WiFi.


Yes. This is why I stopped using wireless keyboard and mouse. Despite the fact that they are within close proximity to their receivers, there is frequent signal instability, resulting in spikes in input lag and outright input loss.


Why is that range so popular? When presented with what appears (perhaps naively) to be a vast range of frequencies to use, why do so many wireless specs settle on the same thing?


Much of the EM spectrum is licensed. 2.4GHz is one of the 'unlicensed' bands set aside for consumer electronics, meaning no special permit is required to create something that uses radio within it. That's the gist of it.

Different radio frequencies have different ranges and propagation effects depending on local weather conditions, absorption by humidity, etc. Generally high-frequency radio is short range but high bandwidth, which is perfectly suited to what consumers want.
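There's also a size angle to the frequency choice: antenna dimensions scale with wavelength (a quarter-wave element is a common baseline), which is part of why 2.4 GHz radios fit in earbuds while lower-band devices need bigger antennas. A quick sketch over a few common unlicensed bands:

```python
C = 299_792_458  # speed of light, m/s

def wavelength_cm(freq_hz: float) -> float:
    """Free-space wavelength in centimetres."""
    return C / freq_hz * 100

for name, f in [("433 MHz", 433e6), ("2.4 GHz", 2.4e9), ("5.8 GHz", 5.8e9)]:
    lam = wavelength_cm(f)
    print(f"{name}: wavelength ~{lam:.1f} cm, quarter-wave element ~{lam / 4:.1f} cm")
```

At 2.4 GHz the wavelength is ~12.5 cm, so a quarter-wave element is ~3 cm: small enough to etch straight onto a PCB.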


To add, the US Department of Commerce puts out a great graphic [1] showing how the spectrum is split up. The only bands available are labeled "Amateur" in a greenish box.

https://upload.wikimedia.org/wikipedia/commons/c/c7/United_S...


The green amateur bands are in no way "available". Those are the Amateur (Ham) Radio bands.

N9NAY


The bands typically thought of as "available to electronics" (meaning available without a license) are designated on that chart as ISM bands. (https://en.wikipedia.org/wiki/ISM_band)

This doesn't seem to be a hard and fast rule all the time (e.g. some WiFi runs at 5GHz, I believe, a bit off from the 5.8GHz ISM band). I believe WiGig is roughly centered around the 61GHz ISM band though.


Because it is the resonant frequency of water. Early microwave ovens polluted that frequency band so heavily that no other service used it. Nowadays, microwave ovens don't send out radio waves any more, so the frequencies are available for other services.

Which is a bit of a shame, since the resonant frequency of water is, by definition, absorbed by water, like wet walls or walking water-bags like you and me. Other frequencies would be much better-suited for human cohabitation, but that's historical evolution for you.


"Nowadays, microwave ovens don't send out radio waves any more"

Every microwave oven I've ever had, including name-brand ones purchased in the last year, sent out loads of interference on one or more wifi channels.

This morning I noticed that my connection at the breakfast table was really crappy. The upstairs hotspot likes to change channels every few days, and it had picked 11. When the microwave stopped, everything cleared up, so I forced it to channel 1 and tested with the microwave again. No problems there.


Every single sentence in your post is incorrect. https://www.indiegogo.com/projects/why-2-4ghz-chasing-wirele...


AFAIK, there are not many unlicensed frequency ranges that are available almost worldwide. 2.4 GHz was initially reserved for use by microwave ovens, which can cause interference in that range; then spread-spectrum technologies (such as those used in wifi) that are not sensitive to noise from ovens became cheap, so this frequency range became useful for that.

Another popular unlicensed range, 433 MHz, has much less bandwidth, so it's popular for slower data transfers, i.e. in doorbells and temperature sensors. Also, AFAIK it requires larger antennas because of the longer wavelength.

5 GHz wifi has only recently started to become popular because, I think, there are some problems in implementing such high frequencies, or because it's more affected by walls and furniture.


As a general rule, high-frequency transmissions suffer from short range, particularly in an atmosphere. 5GHz was implemented in 802.11a in the early days of wifi, but suffered shorter range than 802.11b despite its higher bit rate. However, as it's not used as much by consumer devices, the spectrum is generally clearer so you get more consistent data rates.

802.11n and 802.11ac get around this by dynamically hopping between the two bands as necessary to get the clearest signal at the current distance.


> 802.11n and 802.11ac get around this

11n is available on both 2.4GHz and 5GHz. 11ac is 5GHz only.

I don't think there's any specific hopping thing in n/ac. It's just normal roaming between APs with the same SSID, if you use the same SSID for both frequencies. Same roaming would happen with just 11a and 11g.


The FCC and other government bodies (ITU) in other countries tightly control the radio spectrum to prevent interference. There are a few bands that are set aside for general use called the ISM bands (industrial, scientific, medical). In those bands you can emit RF that might interfere with other devices. So you can run stuff like wifi, cordless phones, induction heaters, and microwaves that emit RF energy that can cause interference without a license (subject to some restrictions).

https://en.wikipedia.org/wiki/ISM_band


It's unlicensed worldwide -- free and easy to use. Same as the 900MHz and 5GHz bands. It's a political/regulatory decision, not technical.

(And the higher the frequency, generally the more bandwidth is available, but chipsets are a little more expensive.)


Actually, the 900MHz ISM band exists only in ITU Region 2 (i.e. the Americas); in the rest of the world it is usually allocated to GSM and, IIRC, some aviation-related use in the middle of the GSM up-down guard band, both of which are things you don't want to interfere with, because somebody will care and come after you. 5.8GHz is universally accepted as ISM, but sometimes shared with licensed users (e.g. weather radars), who have priority.


Fun fact: It's available worldwide because they wanted to use Microwaves on ships and planes. https://www.indiegogo.com/projects/why-2-4ghz-chasing-wirele...


The technical reason is that these frequencies do not travel through walls that well, so they would make very bad bands for things like radio stations or cell phones. So why not give them to people to use for local devices?


Use of the radio spectrum is regulated (e.g. by the FCC in the US). Different things are allowed or not, depending on the frequency. 2.4GHz is one of the few frequencies which are allowed for consumer use without much restriction; hence that's what everything uses.


There's a fair amount of restriction, the difference being if you agree to operate within the restrictions you don't need a license.


My guess is because most ranges are already reserved for other types of devices. I think the 2.4GHz spectrum is one of the few open for general consumer use.


Because it's unlicensed. It's unlicensed because it's close to the frequency designated for household microwave ovens.


It's an ISM band and not heavily regulated. There are only a few such bands.


As much as I dislike proprietary protocols, I'd be greatly in favour of Apple deciding to make a replacement for Bluetooth that works with all their products - and Just Works. It'd be no use to me, as my only Apple product currently is an iPhone, but if I saw that Appletooth Just Worked, I'd be looking at diving (back) into their ecosystem.

I know some people are saying Bluetooth works perfectly between their Apple products, but plenty of people are saying it doesn't, too.


Considering how poorly AirDrop works I don't think I'd trust them to get it right.

