Because it's not as popular as WiFi or Ethernet or USB. It hasn't had the decades of hard-core, hard-knocks field usage that WiFi/Ethernet/USB have. So the chipsets are less robust to errors and more sensitive to highly noisy environments, and the drivers aren't as battle-tested as those of the other connectivity standards.
WiFi in its initial days (802.11b) reminds me of bluetooth right now. Quirky, bad tools, weird errors. But WiFi caught on and manufacturers started throwing $B at R&D for better chips and better drivers for those chips.
WiFi and Bluetooth rose to consumer attention around the same time, and Bluetooth appealed to people who weren't necessarily 'techy' since it was adopted heavily for car hands-free kits. Businesspeople outfitted company cars with it soon after it was released, largely for legal reasons.
I wouldn't say adoption was lower than WiFi's; in fact, Bluetooth was probably adopted more rapidly because, since it links simpler devices, it's cheaper (WiFi needs the whole TCP/IP stack underneath it). But you're probably right that it has seen less investment in error robustness than WiFi: momentary loss of voice through your hands-free kit is usually tolerable, while WiFi dropping a large number of packets is not.
I wonder if the things we commonly use Bluetooth for are also more noticeable when they error.
If my Bluetooth mouse stutters for a quarter of a second, I'll likely notice and be angry, but if a YouTube video stutters while loading, it will have buffered and likely hidden that.
WiFi is complicated too, you just don't see the problems most of the time. The setup people generally use at home tends to work and the setup you use in large deployments is typically managed by people who turn off the features that don't work with the devices people use on the network.
Still, if you start using features like fast handoff or the more obscure authentication combinations, you'll see the problems.
You're not wrong, but I used early WiFi (even on Linux), and Bluetooth today is worse. And Bluetooth is not a new standard.
I've dealt with its internals to a small degree. It's overengineered and excessively stateful, leading to a lot of edge cases and failure modes that just do not need to exist. It could have been orders of magnitude simpler and nearly stateless, and it would have been a dream. State is evil in protocols and should be avoided if at all possible.
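To illustrate the point about state multiplying failure modes, here's a toy sketch (not actual Bluetooth code; the states and messages are invented for illustration). In a stateful protocol, every message is only valid in certain states, so each new state multiplies the (message, state) combinations an implementation must handle; a stateless design has essentially one code path.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    INQUIRING = auto()
    PAIRING = auto()
    CONNECTED = auto()

# Stateful style (hypothetical, loosely Bluetooth-flavored): only certain
# messages are legal in each state.
class StatefulLink:
    def __init__(self):
        self.state = State.IDLE

    def handle(self, msg):
        if msg == "inquire" and self.state == State.IDLE:
            self.state = State.INQUIRING
        elif msg == "pair" and self.state == State.INQUIRING:
            self.state = State.PAIRING
        elif msg == "connect" and self.state == State.PAIRING:
            self.state = State.CONNECTED
        else:
            # Every (message, state) pair not listed above is an edge case
            # someone must decide how to handle: drop? reset? retry?
            raise ValueError(f"{msg!r} is invalid in {self.state}")

# Stateless style: each request carries everything needed to answer it,
# so there is no "invalid in this state" failure mode at all.
def handle_stateless(request):
    return {"echo": request["payload"], "addr": request["addr"]}
```

With four states and three message types, the stateful handler already has nine undefined combinations to worry about; the stateless handler has zero.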
It really reeks of vendor clusterfuck with lots of requirement overloading, which is very typical in modern protocols that are vomited forth by consortia. WiFi is at least a little bit cleaner owing to the fact that it had its requirements clearly specified: Ethernet over the air.
In the early 2000s I used BT (with the DUN profile, IIRC) for an in-home wireless network, for the sole reason that it was more reliable than WiFi (at the same time, my consulting usually involved various kinds of WiFi brokenness).
I wouldn't say WiFi is reliable either, especially once you add the whole stack of protocols run over it (DHCP, SMB, etc.) that are equivalent to the stuff handled by Bluetooth and its profiles.
Bluetooth just has a problem with scale.