Bluetooth is an EXTREMELY complex radio protocol on Layer 1. It's like a mating dance between scorpions in the middle of a freeway. High chance something gets messed up.
Layer 1 keeps drastically changing too. Bluetooth 1 and 2 use completely different modulations and are not backwards compatible. Bluetooth 3 was simply an extension of 2: "Let's agree over Bluetooth 2.0 to use WiFi instead." Bluetooth 4, while much simpler, uses an entirely different scheme.
Instead of a "general purpose" wireless network like WiFi, Bluetooth tried to be application specific. Except the only profiles everyone wants are mice, wireless audio, and fitness trackers. If you look at the application layer spec, it reeks of design by committee. Everyone haphazardly jammed their pet projects together, and there are redundant and vestigial parts everywhere.
The Linux side of BlueZ is abysmal. Honestly, I don't even know how anyone does anything with Bluetooth on Linux besides a mouse and keyboard. And barely even that.
As much as I hate on the protocol, the Layer 1 spec is truly ahead of its time in some areas. Watching two radios frequency hop and negotiate to avoid a congested wifi channel was unreal.
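For the curious, the adaptive frequency hopping idea is conceptually simple. Here's a toy sketch in Python (channel numbers follow the BR/EDR band plan; treating WiFi channel 6 as the congested one is an assumption for illustration):

    import random

    ALL_CHANNELS = set(range(79))   # BR/EDR: 79 x 1 MHz channels, 2402-2480 MHz
    # WiFi channel 6 spans roughly 2426-2448 MHz, i.e. BT channels 24-46
    busy = set(range(24, 47))

    channel_map = sorted(ALL_CHANNELS - busy)   # mask out the congested span
    for slot in range(5):
        # stand-in for the (much hairier) hop selection kernel in the spec
        print("slot", slot, "-> hop to BT channel", random.choice(channel_map))

The real hop selection is derived from the master's clock and address rather than a random pick, but the channel-map masking is the part that dodges WiFi.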
I'll add on to this a bit.
The Bluetooth stacks out there are, in general, tested only for a few of the many profiles BT supports. The firmware running on the BT chips is always of questionable quality. It's a black box that, in the case of one major chip vendor, was not developed using source control(!). We figured this out empirically: we'd get firmware drops that fixed our latest reported issue, but bugs from previous FW drops would be present again. (Maybe they did use source control and just had the world's worst regression test suite, i.e. none?)
That is device side. On the PC/Phone side, things are exactly the same. EVERY phone has problems with its Bluetooth stack. The Android Bluetooth stack introduces new bugs between major revisions. So does the iPhone BT stack. The iPhone BT stack had a problem for, I believe, 3 major OS revisions where it would just drop connections to BTLE devices, and users had to turn BT on and off on the phone to reconnect. It was reported to Apple by multiple vendors, it took multiple years for a fix to come out, and it made BTLE on iPhone a horrible experience. But similarly severe bugs exist everywhere.
HID is the one profile that will work. Serial also probably works, more or less, but with serial you have to build a lot of other infrastructure on top of it yourself. BT pairing on phones works nowadays, thanks in part to the popularity of wearable devices, but it barely functioned 5 years ago.
Other specs, including simple ones like media control (AVRCP, which is an awesome spec by the way, give it a read), will have differences in implementation between OSes (desktop and mobile); typically some basic features are not implemented, for who knows what reason.
All that said, BTLE is incredibly simple and easy to understand. :)
It is the new features here that are the problem. I think you actually can incrementally fix anything, but such a project can be as expensive as a rewrite -- and neither will succeed if new stuff is constantly being thrown in.
I've been doing iOS development for wearables for a few years and have found that you can generally make BLE for an iOS app very reliable. But it requires some work in the software to cover up some of the underlying flakiness. Reminds me a lot of building resilient networking code for apps back when cell service was slow and spotty. But if you do a little work to build in command queues, auto-reconnect, etc it can be very solid for the user.
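To make that concrete, here's a rough sketch of the reconnect-plus-queue pattern. I work in CoreBluetooth, but the idea is the same everywhere; this version uses Python with the cross-platform bleak library, and the device address and characteristic UUID are made up:

    import asyncio
    from bleak import BleakClient

    DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"  # hypothetical peripheral
    CMD_CHAR_UUID = "0000fff1-0000-1000-8000-00805f9b34fb"  # hypothetical

    commands = asyncio.Queue()  # app code enqueues outgoing commands here

    async def link_worker():
        while True:  # auto-reconnect loop: assume the link WILL drop
            try:
                async with BleakClient(DEVICE_ADDRESS) as client:
                    while client.is_connected:
                        cmd = await commands.get()
                        try:
                            # one command in flight at a time
                            await client.write_gatt_char(CMD_CHAR_UUID, cmd)
                        except Exception:
                            commands.put_nowait(cmd)  # don't lose the command
                            raise
            except Exception as err:
                print("link dropped, retrying:", err)
                await asyncio.sleep(1.0)  # back off, then reconnect

The user never sees the drops; the app just keeps feeding the queue.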
Oh, and it helps a lot to pick a good chip. From my experience, the Nordic NRF5x series is the best. But as has been mentioned elsewhere in this post you still need to have a quality antenna design. I've noticed a huge difference between using a reference design and one that spends some time in a chamber with a good RF engineer.
Huh? Pairing worked like 10 years ago. And sending files between phones. Everyone in school was sending J2ME games/apps around :D Audio with media control also worked. And modem (using the phone's GPRS/EDGE connection on a PC).
for audio, right?
But when iOS/Android smartphones came on the scene it took a huge step back in reliability. A2DP also took a huge step back in audio quality until the new platforms tweaked their SBC parameters and added in MP3/AAC support.
The craziest setup I ran with back then was a Nokia N800 paired up with a Sony Ericsson C702 and a no-name folding keyboard. The C702 was also paired up with a pair of Jabra headphones/headset.
At certain times all of those could be in use: the C702 acting as the modem/router for the N800, while also playing music through the Jabras, and me typing comments on the Maemo forum, etc.
The only time there was a hitch was when I used the modem profile between the N800 and the C702, rather than PAN. Something about the intensity of chatter between the two made the music and keyboard stutter.
Do note though that headphones can act up if you are out and about with little for the signal to bounce off. Best then to have all devices on the same side of the body for fewer obstructions.
In this day and age it's absolutely insane to not have revision control.
When I read the original comment about bug fixes disappearing, it reminded me of merging test onto stable by hand, not knowing what was "the good code" and having to ask, research, etc. It was messy and sometimes we got regressions.
We had a name for this: a bug fix got "merged away".
Nowadays, with Git, feature branches, better test coverage, and continuous integration, things have gotten much better.
Even in places that think they do, it seems pretty common to have an additional codebase in the database without any real version control.
I'd like to know how problems like this correlate to data breaches.
I.e. was it just a crazy requirement handed down, or did they not know better?
I just can't imagine it even for solo development, nevermind collaborative work! "Agh. We must have had a merge conflict, damn, the core part of my work is gone... Let me see if I have a recent copy... Oop, no, I don't, because I've run out of disk space from all these clone backups."
I do occasionally get Dropbox-level merge conflicts for tmp editor/IDE files, but I can't recall getting any for source files.
GP seems to be deliberately avoiding commands relating to remote management, such as push.
It worked OK for that purpose. And I still have Dropbox mainly for that reason. My source code is always local. Everything goes into TFS, but the local copy is backed up to Dropbox.
'course, now there's a hundred dollars in various adaptors to rebuy...
What are you using to watch Bluetooth at a low level?
My Magic Mouse at work would get very jerky with my Mac Pro. Resetting it would sometimes clear it up for a while. I swapped it with my Magic Trackpad from home, and the mouse is fine at home, and the trackpad now has the jerkiness although not as much as the mouse did.
I tried moving the Mac up on to the top of the desk so the mouse/trackpad is only a few feet away and has line of sight instead of having a metal desk between it and the computer, but that made no difference.
I'm speculating that it is some kind of interference from the office downstairs. (It's an office of Azima DLI, which develops all kinds of stuff that might be noisy in RF).
I'd like to be able to take a look at what is actually going on in my office (1) with RF in general, and (2) specifically between my Mac and my trackpad or mouse.
I thought this might be an excuse to step up from one of those $15 SDR dongles I've played around with to a HackRF, but from what I've read the HackRF cannot fully deal with the Bluetooth frequency hopping. I think I read that using multiple HackRFs one can do it, but that's beyond what I want to spend.
Exactly. As I said in another comment, we did a test in a very busy office space and didn't lose a single bit of data at several hundred kbit/s over a week, transmitting continuously. We could see it hopping on the sniffer.
- Are there similar close-range wireless protocols that are stupid simple (say, like IR communications but on the radio spectrum)?
- Could someone or some team build something stupid-simple that could be handled by today's microcontrollers (the ESP32 is 7€ for a dual-core 32-bit processor; I suppose it could be used as a dedicated comm processor)?
In a way, wifi is perfect for ~0-50 m; we just need a similar idea for 0-10 m and low bandwidth.
Another option is ANT+ - a very low power but also very low bandwidth protocol (8-byte messages 4x a second...) used mostly by fitness gear like training bikes, treadmills and such. Many smartphones have radios that are able to use it. Again, something that is easy to make work even with an 8-bit micro.
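To put "very low bandwidth" in numbers:

    payload_bytes, rate_hz = 8, 4                # 8-byte messages, 4 per second
    print(payload_bytes * rate_hz * 8, "bit/s")  # 256 bit/s of payload

That's plenty for a heart rate or cadence reading, and it's why an 8-bit micro can keep up.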
For more advanced stuff there is Zigbee, but that is quite a bit more complex (though still manageable even by an 8-bit MCU), and there are some licensing issues which kept it rather obscure.
Local radio text chat could also be fun... Think of an IRC channel, only for the people who can grab the signal...
ANT+ could be enough for a control path or maybe keyboard.
I always wanted to try Zigbee but it felt too much like a closed world, although people seem very satisfied with it.
In most everything I own, bluetooth is frustrating crap. Apple's somehow gotten it (mostly) right, so why can't anyone else?
I believe more people here will have encountered this behaviour where all of your paired devices suddenly disconnect from the MacBook. What has helped me is to turn my Wi-Fi radio off, wait for the devices to reconnect, and then turn it on again. This has happened to me for at least the past 3 or 4 major versions of OS X/macOS.
It's not an issue when running Windows on the machine, and it's not due to congestion of 2.4GHz wifi networks. And it's not AP related. It's macOS related. I am also not alone in experiencing it. Not the world's biggest issue, but frustrating as hell.
Is it really ahead of its time? I recall reading somewhere that Bluetooth traces much of its roots to military radio applications.
While wifi is perhaps more interesting, at least with OBEX Push and FTP I know (and I have yet to have it fail, knock on wood) that I can transfer files, albeit slowly, between two devices.
Get two devices on the same wifi network and I still need to figure out a protocol both of them can talk so that they can exchange data (never mind that they had to introduce WiFi Direct because certain devices refused to connect to ad-hoc wifi hotspots).
Add in all the complexity of different bluetooth versions and you have something that "seemingly everyone" has but few people reliably use.
Most of the issues in this thread are related to poor hardware design more than a crowded spectrum. While the spectrum is in fact crowded in metropolitan areas, most Bluetooth communication doesn't require much bandwidth and can handle error prone areas with ease.
While the frequency hopping helps a ton on BT (and WiFi for that matter), the issues people outlined are due to:
1) Shitty firmware
2) Shitty hardware
Antenna design is black magic and only a few firms in the US do it well. It took us almost 10 months to fully design and test our antenna assemblies with a very capable third-party firm.
It took dozens of trips to a test chamber, a dozen computer simulations that take a day to run, and PCB samples that take days to verify. They have to be tuned every time copper or mechanical parts move as well.
It's a real pain and most Bluetooth products use garbage chip antennas and baluns or reference designs for antennas. This increases the sensitivity to failure and provides a generally shitty experience.
Most of your product interactions around bluetooth are budget products connected on one side of the equation (e.g. a $50 bluetooth headset). So despite how capable your Mac or iPhone is, if you have a garbage headset on the other side with poor antenna design, it'll be a disaster of an experience.
this made me literally lol. my macbook pro is the shittiest bluetooth device i have ever used. just search google for macbook pro bluetooth audio issues, and you will find a slew of forum posts going back years. there's one terminal hack after another suggested just to try and get macbook pros to properly play audio over bluetooth. i don't even bother using my macbook pro these days to play audio through a bluetooth speaker. i just use my iPad or android phone.
Apple has also shipped hundreds of millions of devices over that same time period, so the number of people with failed hardware is non-trivial. More importantly, it is extremely rare for people to perform root cause analysis on the actual failures which combined with the number of vendors and protocol complexity in the Bluetooth world means that a significant number of times the blame is placed on the wrong component.
The main thing I take from this is that vendors need to think about how they can provide a visible way to learn about peer device errors so e.g. your Mac could tell you that sound quality is horrible because the Bluetooth device doesn't support a high-quality codec or sufficient bitrate, is seeing high levels of retransmits or dropped packets, etc.
Like I had mentioned previously, if one side of the paired devices has a garbage antenna setup, it will have poor performance. Things like your soundbar, or a generic bluetooth speaker probably use reference designs for their BT setups and have not tuned them for the enclosure or use case.
What quality is the hardware in something like this:
I think that you are remiss if you don't actually name the companies/products/devices which are designed well, then...
There was a company in Aptos/Santa Cruz who was designing the antennas for a lot of Lockheed stuff back in about 2006/2007 or so, who we went to for antenna designs for our active RFID devices - but I can't recall for the life of me the name of the company now...
Your run of the mill bluetooth headset, mouse, or keyboard likely does not spend the time on the RF design beyond "does it work".
1. Do absolutely nothing.
2. Completely lock up the computer so it needs a hard reset.
3. Completely lock up the computer for a few seconds, then work.
4. Start working, but ignore ~40% of the key presses.
5. Swallow the first keypress or two, then work.
6. Actually function like a keyboard.
If Microsoft has a good RF team, I'd hate to see what worse teams do.
I wish there was a USB version of this keyboard, or at least a wireless dongle version (like Microsoft's cheaper keyboards).
The companies that deploy a TI, Nordic, or Broadcom chipset are subject to the scrutiny I just mentioned. The chipset manufacturers offer reference designs, but generally you need a custom matching circuit and a complex RF design to do it correctly. That's where most manufacturers skimp out, because it's generally "good enough".
As for WiFi, it does not frequency hop but is capable of "auto" selecting a channel on boot (generally).
Many, many Wi-Fi implementations are capable of frequency hopping, including my own at home.
What he is probably referring to isn't frequency hopping, because what it sounds like and what it actually refers to mean different things to different people. For someone who is not an RF engineer, hopping might simply mean "an AP with multiple radios that uses one to scan the 802.11 2.4GHz space to see which channels have the least utilization, and then restarts the other radios on that channel, possibly forcing all clients to lose their connection as if out of range and reconnect on the new channel as they find it".
This, of course, has nothing to do with RF frequency hopping, but if you don't actually know anything about RF, you won't know what hopping means either, and just imagine that what your router with built in AP is doing must therefore be that 'hopping'.
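A toy version of that AP behavior, with hypothetical scan results (not a real driver API):

    # channel -> fraction of airtime observed in use
    scan_results = {1: 0.72, 6: 0.35, 11: 0.88}

    best = min(scan_results, key=scan_results.get)
    print("restarting AP radio on channel", best)
    # Clients briefly drop off and reconnect on the new channel, which is
    # why this feels nothing like Bluetooth's per-packet hopping.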
Bingo, except add 5 GHz (and soon, 900 MHz), too.
As far as I know it's not dynamic, though. It just happens when a device connects to a base station and only "steers" the client to connect using one of the 2.4 or 5 GHz bands, and that doesn't interfere with channel selection.
The AP also has a function to scan bands to find the "best" channel to use but that disconnects all devices while the scan is in progress. I think some APs might be doing this automatically (?). My other AP (a Mikrotik) has an "auto" channel selector but I don't know what it does or when.
My neighbor moves around his channels, too. That’s what’s making me comment that it’s likely more common than we think. I’m not at home to prove this and I’m getting downvoted for sharing, but I’ve watched him move between 36, 40, and 44 at a frequency that suggests it is happening automatically, probably when all of his sessions go idle or something. I have observed this with multiple tools.
My 5 GHz was on 44 this morning, because I happened to look for unrelated reasons. I just SSH’d home and it’s now on 36 without interaction from me. Specifications are not implementations. I’m almost positive this is a common feature which I’ve seen in many APs over the years, even though it’s not per-packet like Bluetooth (which I was also aware of, given that I'm experimenting with a Bluetooth 5 mesh system for IIoT).
(It does mention a frequency-hopping physical layer, but that was part of the obsolete predecessor from 1997; the modern spec uses OFDM, so unless all your equipment is from the last century you're not using frequency hopping.)
1. IIRC my router is supposed to switch channel (i.e. frequency) automatically unless I explicitly set it to a fixed channel.
I don't know how, but my devices figure it out and continue working.
2. I think the misunderstanding is related to the word frequency hopping or whatever was used further up in the discussion.
I normally take "frequency hopping" to mean changing frequency multiple times a minute or even faster, while I doubt my wireless access point will switch channels more than a few times in 24 hours.
Disclaimer: I never really sat down and verified if it ever changed.
Also agree with logicalle that I wish for a bit more civility.
jsmthrowaway can you give more information about your setup and what the devices do?
No need for a heated argument. This whole thread is quite instructive!
(No, I'm not complaining about being downvoted - this isn't even my comment - just trying to learn.)
You can use Apple's current trackpad (the "Magic Trackpad 2", I believe) on a Mac (at least) via USB even with Bluetooth entirely off.
Edit: I believe (but have not confirmed) that their current keyboard works similarly. Their current mouse does not, at least in any really useful way, because its Lightning port is on the bottom.
I have always been amazed how everyone touts Apple as the pinnacle of a design powerhouse, yet there are so many critically stupid design decisions the company has made, this being just one.
The rationale, at least from what I found, was that people wouldn't let it completely charge with a different design, but with this one people would leave it charging overnight. Also something about users ignoring the fact that it's wireless and just always using it with the cable attached.
Every design is about tradeoffs. In this case, the mouse would have to have a somewhat different shape to accommodate a port at the front. That different shape would surely be a poorer fit for the visual aesthetic they were going for and it likely would've impacted the feel during usage, as well.
Some people would prefer that tradeoff — certainly those who would prefer to (or have to because of an RF-noisy environment) use the mouse wired all the time. I expect the vast majority of their customers don't do that, though; and, I, for one, prefer (or at least am neutral to) the choice they went with.
It definitely makes for a silly-looking photo while it's charging, but it's certainly not a blunder that none of the designers ever considered.
If you've reached the point when you need a backup mouse for your primary mouse, then the primary mouse has failed.
EDIT: added 'for the trackpad'.
I just resigned myself to do the Settings > tether over WiFi > WiFi only
But otherwise it works fine.
... no, I still think it's a bug.
I guess if it was 2 seconds it might be more bearable.
So, when you start streaming music, it starts playing immediately, but it tries to send data faster than it is played until you have a decent buffer. Obviously, this would only work for non-realtime things.
On a side note, it's interesting that this buffering problem is identical with switching channels on any streaming service, and that the buffering delay is a significant psychological barrier to moving around, and that traditional radio works really friggin' well for what it does.
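A sketch of the "start instantly, then over-fill" idea, with made-up illustrative rates:

    PLAYBACK_RATE = 32_000   # bytes/s the decoder consumes
    BURST_RATE    = 48_000   # bytes/s the link can deliver
    TARGET        = 64_000   # ~2 s of audio

    buffered, t = 0, 0.0
    while buffered < TARGET:
        buffered += BURST_RATE - PLAYBACK_RATE  # surplus accumulates
        t += 1.0
    print(f"playing from the first byte; buffer full after ~{t:.0f}s")
    print(f"dropouts under {TARGET / PLAYBACK_RATE:.0f}s are now inaudible")

Playback starts immediately; the surplus link rate quietly builds the safety margin behind it.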
I'm holding off on buying a wireless headset until they hit the market. Could take a while though.
It works like this on both iOS and Android devices.
So it depends on the headphones.
Of course that only helps with premade content. If you're playing a game the OS can't adjust the game for 200 ms of lag, everything is just going to be out of sync.
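Roughly, the trick for premade content looks like this (the 200 ms figure is an assumption; real stacks report the actual output latency):

    AUDIO_LATENCY_MS = 200  # assumed latency of the BT audio link

    def show_frame(frame_pts_ms, now_ms):
        # Delay video by the audio latency so lips and sound line up.
        # A game can't do this: its frames react to live input and
        # can't be shifted into the past.
        return now_ms >= frame_pts_ms + AUDIO_LATENCY_MS

    print(show_frame(frame_pts_ms=0, now_ms=150))  # False: hold the frame
    print(show_frame(frame_pts_ms=0, now_ms=210))  # True: present it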
At least one of the headphones has been a very high quality LG (HBS900?). Oddly, I've had the best connectivity with a cheapo freebie headphone set that has audio so bad that I can't stand to use them...
My phone (or my car) does about 2 seconds of buffering, which is usually just enough -- however, it applies the buffer regardless and doesn't synchronize to any other content. So, using BT to try and watch any videos completely fails.
See my comment elsewhere here, I had a set of Outdoor Chips bluetooth speakers for my ski helmet that would cut out if I turned my head the 'wrong' way and I returned that garbage posthaste.
1 megabit (did you mean megabyte?) of memory is not worth noting. It could be integrated into the IC that contains the entire system for headphones quite easily. That system already has memory in there anyway.
Standards have not been made considering today's availability of memory and resources.
There is currently exactly as much memory as needed. That does not make it unrealistic to extend it with a buffer. Cheap SBC-only devices certainly won't do it, as those are solely about cost optimization, but the beefier, pricier AAC/aptX multi-device capable Bluetooth audio sink solutions that sell for premiums could easily incorporate a bit of extra memory. 1 Mb (megabit) would give a few seconds of buffering, which would be sufficient. It'll take a bit more silicon, but nothing absurd. Hell, it could be external memory over SPI, so integrators could select buffering capability themselves, without significant cost/silicon-area increase of the BT chipset.
However, to actually utilize buffering, you'd need a higher bandwidth link than is required for live playback to catch up after connectivity dips. Buffering by itself only adds latency. The question is then whether you'd want to use your link bw for resilience of low quality audio, or instant high quality audio...
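Back-of-envelope on those numbers (the SBC bitrate and link throughput are assumptions):

    BUFFER_BITS = 1_000_000   # the 1 Mbit from above
    STREAM_KBPS = 328         # assumed typical SBC bitrate for A2DP
    LINK_KBPS   = 500         # assumed usable link throughput

    print(BUFFER_BITS / (STREAM_KBPS * 1000), "s of audio in the buffer")    # ~3.0
    surplus = LINK_KBPS - STREAM_KBPS
    print(BUFFER_BITS / (surplus * 1000), "s to refill after a full drain")  # ~5.8

So the buffer rides out ~3 s dropouts, but only if the link has the headroom to refill it afterwards.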
Last week I accidentally found out that I can use two phones at once with the integrated hands-free in my car's radio, as long as they use different BT profiles (somewhat unsurprisingly, while trying to fix completely unusable sound quality over BT).
I don't think your phone will try to use A2DP for a call.
It's always been terrible. Even just connecting my phone to my Tivoli Audio speakers from 2 feet away was a nightmare. I'd have to try to reconnect 10 times before it would finally work. This is across dozens of devices and now more than a dozen years.
My Mac stuff is mostly better although I found when I had a deskmate, in a room full of 40 people using multiple Bluetooth devices, sometimes my trackpad would disconnect and have a hard time reconnecting. When he quit, everything went back to normal, then when they hired his replacement (and gave him different equipment), the problem came back.
That said, recently it's been an absolute miracle. I bought a new VW Golf with bluetooth. It's AMAZING. It connects quickly, every time, and works flawlessly. After having a positive experience with a friend's bluetooth headphones, I bought some Bose QC35 bluetooth headphones, and THEY are amazing! they even connect to 2 devices at once so my computer can send me alerts while I listen to music streaming from my phone. And the range is outstanding -- 60+ feet through multiple tile walls in my office building. One time I got into my car wearing my headphones (on a conference call), started the car, shut off my headphones, and the call magically transferred to my car, instantly. Based on THOSE good experiences I bought some Jaybird X3 running headphones, which also work amazingly well. I tried some Outdoor Tech Chips helmet speakers for my snowboard helmet, and it was IMMEDIATELY back to 2005. The sound would have static and crackle. When I turned my head left with my phone in my right pants pocket (the antenna is apparently in the left earpiece), it would LOSE CONNECTION. Hell, sometimes even sitting on the ground with my head bent, about 2 feet line of sight, it would lose connection. I took them back to REI and bought a Sena Snowtalk 10 so now my ski helmet is back to multi-room range as 2017 Bluetooth intended.
I'm interested in following this comments thread to better understand the tech, but I simply can't believe how good it's gotten. I'll look at my phone and it will tell me the charge of my bluetooth headphones and apple watch, and everything JUST WORKS, thank god, for once.
I have Bluetooth devices years old that I've never had problems with, and others that are a constant nightmare. The software stack behind the Bluetooth is also a major component in the reliability question.
I had a colleague for a time whose dad was a hardware engineer with Toshiba & worked with/on their part of the specification Working Group.
His pop said that the whole BT stack was unambiguously a steaming pile of poo from the get-go, and it was nearly miraculous it functioned as well as it did¹.
At that I had to chuckle, seeing how I'd wager that each of us have had enough woggy experiences with the tech to agree with the point he made so plainly.
But I do love the chosen icon & the history behind it, vis-à-vis the name ("Bluetooth"), so it's not all bad <wink>.
¹—this was around 2010 or so, to add some context wrt the relevant timeline(s).
I have AirPods, and they've been pretty fantastic. I've had occasional issues but it's rare enough it doesn't bother me at all.
Best way to improve reliability is to avoid dodgy or counterfeit radios in crappy electronics.
I'd still prefer wired for anything stationary (keyboard & trackpad in my case), just to eliminate the entirely superfluous batteries. In a world where that option isn't available, however, bluetooth works perfectly well for me.
- Macbook to Apple bluetooth mouse
- iPhone 6s to late model Mazda infotainment system
- iPhone 6s BTLE connection to Garmin Forerunner watch
My iPhone 6 paired inconsistently with my car until iOS 10.2, then it failed completely. Then the dealer did a routine maintenance and it started pairing again. Until 10.3, and now it's hit and miss.
Plantronics headset used to be my foolproof solution for Google Hangout and Bluejeans meetings. Now it won't pair successfully with the MacBook at all. Nor with the iPhone 6.
It's the "moving target" aspect that is most infuriating for me.
Back in the day I used to just run "rfcomm bind <mac-address> <channel>". But it turns out BlueZ decided to deprecate (read: stop building by default) the rfcomm command in favour of... wait for it... a barely-documented D-Bus interface.
How much do you have to hate your users before you decide to take away a useful, functional executable and replace it with an unspecified interface over D-Bus that requires hours of research to use rather than 30 seconds of reading a manpage?
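For reference, the D-Bus dance looks roughly like this in Python with dbus-python. This is a sketch pieced together from BlueZ's doc/profile-api.txt, not battle-tested code:

    import dbus, dbus.service, dbus.mainloop.glib
    from gi.repository import GLib

    SPP_UUID = "00001101-0000-1000-8000-00805f9b34fb"  # Serial Port Profile

    class SerialProfile(dbus.service.Object):
        # bluetoothd calls back into this org.bluez.Profile1 object
        @dbus.service.method("org.bluez.Profile1", in_signature="oha{sv}")
        def NewConnection(self, device, fd, properties):
            # fd.take() hands you the raw RFCOMM socket -- what
            # /dev/rfcomm0 used to give you for free
            print("connected:", device, "fd:", fd.take())

        @dbus.service.method("org.bluez.Profile1", in_signature="o")
        def RequestDisconnection(self, device):
            print("disconnected:", device)

        @dbus.service.method("org.bluez.Profile1")
        def Release(self):
            print("profile released")

    dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
    bus = dbus.SystemBus()
    profile = SerialProfile(bus, "/test/serial")
    mgr = dbus.Interface(bus.get_object("org.bluez", "/org/bluez"),
                         "org.bluez.ProfileManager1")
    mgr.RegisterProfile("/test/serial", SPP_UUID,
                        {"Role": "client", "Channel": dbus.UInt16(1)})
    # then call ConnectProfile(SPP_UUID) on the peer's org.bluez.Device1
    GLib.MainLoop().run()

Compare that to one line of rfcomm.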
That is, I use a cutting-edge Linux distribution (Ubuntu 17.10) -- it was pretty darn painful even on 17.04. I have another keyboard that is on Bluetooth 3.0 that fucking disconnects every other day.
So YMMV - I think BLE mice and keyboards are much better in terms of 'just works' unless you pair them with a whole bunch of devices.
I had high hopes for Google Chromecast Audio for my music at work and at home. Probably my fault for jinxing myself by asking "What could possibly be worse than Bluetooth?" Chromecast Audio has definitively answered that.
For one thing, you can't limit who can interact with the Chromecast. Anyone on the network can see it. At work, my music would usually pause ~4 times a day as someone else's phone would see it and connect to it. I'd have to set up a new wifi network that only I could use to fix this. Since I only listen to music a few hours a day, that's pretty frequent.
It also gets confused if I leave work and then try to use Google Play Music elsewhere: my Google Home in the bathroom will play a song and then stop, I think because "google play is being used on another device", but it doesn't tell you that.
Maybe I should just go back to using something like a Raspberry Pi with access to my music library. It's still mostly songs I have the CDs for and ripped; though I've added probably 50-100 songs over the last year on Google Play, my 600 CDs are all in FLAC.
EDIT: jordanthoms points out that "guest mode" is for letting people stream audio without giving them your wifi password. Once they're on the network everyone has access.
FWIW, Chromecast Audios are one step further than this, as they do not support this "guest mode" as they do not have a way to display the PIN. Anyone on the WiFi network can freely connect to the device.
That said, I have never seen a device automatically connect to a Chromecast or Chromecast Audio. Normally you have to explicitly connect to a Chromecast.
That makes me a little less excited about my plans of getting a DualShock 4 for my PC for gaming.
My and my wife's Fitbits have constant Bluetooth issues with our phones. This is completely and utterly annoying.
Driver related? Not sure.
Well, is the bluetooth as good when your wife is driving the Rav4?
Check out the PCB guidelines for the popular NRF51822 Bluetooth tranceiver:
It barely scratches the surface, and in reality there will always be some trial and error, as other nearby circuit elements can cause unpredictable interference or feedback. But it's a lot more finicky than laying out an Arduino board.
But there is one thing: Bluetooth is not useful if the file is big.
2. fragile modulation techniques (UWB would've been a "final solution" to the problem, but died to patent trolls)
3. interference from wifi (try using a BT mouse while downloading an HD movie)
4. because of three different "wire protocols"
But the upside is that BT is super cheap to implement, and thus ubiquitous.
WiFi in its initial days (802.11b) reminds me of bluetooth right now. Quirky, bad tools, weird errors. But WiFi caught on and manufacturers started throwing $B at R&D for better chips and better drivers for those chips.
Bluetooth just has a problem with scale.
I wouldn't say adoption was lower than wifi; in fact Bluetooth was probably more rapidly adopted because, since it's linking simpler devices, it's cheaper (a useful wifi device needs a whole TCP/IP stack as well). But you're probably right in that it has seen less investment in error robustness than wifi - momentary loss of voice through your handsfree is usually tolerable, wifi dropping a large number of packets is not.
If my Bluetooth mouse stutters for a quarter of a second, I'll likely notice and be angry, but if a YouTube video stutters loading, it'll have buffered and likely hidden that.
Your statement seems like a 'first out of the gate', or 'battle tested' argument. They are approximately the same age, no?
They aren't competing technologies; they have differing use cases.
One has enjoyed wide success, and the other is still stumbling along.
Is it a case of unclear standards, bad implementations, or lack of interest in the blue-tooth use case of wireless accessory devices?
Still if you start with features like the fast handoff or the more obscure authentication combinations you'll see the problems.
Especially back in the A/B days it was awful.
I've dealt with its internals to a small degree. It's overengineered and excessively stateful, leading to a lot of edge cases and failure modes that just do not need to exist. It could have been orders of magnitude simpler and nearly stateless, and it would have been a dream. State is evil in protocols and should be avoided if at all possible.
It really reeks of vendor clusterfuck with lots of requirement overloading, which is very typical in modern protocols that are vomited forth by consortia. WiFi is at least a little bit cleaner owing to the fact that it had its requirements clearly specified: Ethernet over the air.
You are right though. I vaguely remember messing about with WaveLAN and I don't recall it being as flaky as some of the early 802.11 systems.
Even worse are the "spark" kind of 2.4GHz appliances that don't play nice, like wireless camera systems and baby monitors. If your strong-signal wifi or bluetooth keeps dropping, it's far more likely to be one of those at fault than anything else.
There are limits of course. At some point the 2.4 GHz band is so saturated nothing low power gets through. But so far, I am pleasantly surprised by the reliability.
All this was done using BT 4.2 though, so maybe that helps.
Different radio frequencies have different ranges and propagation effects depending on local weather conditions, absorption by humidity, etc. Generally high-frequency radio is short range but high bandwidth, which is perfectly suited to what consumers want.
This doesn't seem to be a hard and fast rule all the time (e.g. some wireless runs at 5 GHz, I believe, a bit off from the 5.8 GHz ISM band). I believe WiGig is roughly centered around the 61 GHz ISM band, though.
Which is a bit of a shame, since 2.4 GHz is strongly absorbed by water, like wet walls or walking water-bags like you and me. Other frequencies would be much better suited for human cohabitation, but that's historical evolution for you.
Every microwave oven I've ever had, including name-brand ones purchased in the last year, sent out loads of interference on one or more wifi channels.
This morning I noticed that my connection at the breakfast table was really crappy. The upstairs hotspot likes to change channels every few days, and it had picked 11. When the microwave stopped, everything cleared up, so I forced it to channel 1 and tested with the microwave again. No problems there.
The other popular unlicensed range - 433 MHz - has much less bandwidth, so it's popular for slower data transfers, i.e. in doorbells and temperature sensors. Also, AFAIK it requires larger antennas because of the longer wavelength.
5 GHz wifi has only recently started to become popular because, I think, there are some problems in implementing such high frequencies, or because it's more affected by walls and furniture.
802.11n and 802.11ac get around this by dynamically hopping between the two bands as necessary to get the clearest signal at the current distance.
11n is available on both 2.4GHz and 5GHz. 11ac is 5GHz only.
I don't think there's any specific hopping thing in n/ac. It's just normal roaming between APs with the same SSID, if you use the same SSID for both frequencies. Same roaming would happen with just 11a and 11g.
(And the higher the frequency, generally the more bandwidth is available, but chipsets are a little more expensive.)