I'm not elderly, and I prefer a jack socket. I just don't get the benefit of Bluetooth in most situations.
Jacks are more reliable and definitely plug-and-play. Unless it has a physical failure, your jack socket always just works. As you state it, Bluetooth requires you to "Turn on Bluetooth on your phone, then turn on your headphones, then hold down the button to pair, etc.", for no benefit. Oh, and now you need batteries in your headphones.
Look at professional musicians: you won't find anybody connecting his guitar to the amp through bluetooth. They are using good old wires and jack sockets: why is that? Are they too old?
Latency. When you're making music and interacting with it directly, you can't afford any noticeable latency. When you're passively listening to it, that doesn't really matter. For phone/tablet users the main place this matters is games, due to their interactive nature.
And price: a jack is dumb tech and cheap (cheap enough that I always had about 10 spare jacks of all kinds with me when I was doing a gig).
But to be fair, wireless transmitters/receivers do exist for musicians. They don't use Bluetooth, of course, but that means latency can be bearable with wireless. It's made for live playing though (freedom of movement); there's no point using it in the studio.
Yup. It's not the radio that causes the latency, but the software stack. You absolutely can make a low-latency, just-works wireless solution for musicians, but its name will not be Bluetooth.
The Bluetooth implementations all suck. My $40k car is about as reliable and predictable, from a Bluetooth perspective, as a Chinese knock-off Jawbone copycat from 2007.
WHY is this still the case? I mean, Bluetooth is fundamentally just a communications protocol broadcast over a radio antenna. Yet I don't have these kinds of issues making Wi-Fi or cell service work.
Who decided to use such a flaky protocol for audio and such, and then remove the reliable connection?
The problem with Bluetooth is the stack is too complex. Too many application-specific profiles baked directly into the protocol, which in turn need to be dealt with by device manufacturers. Bluetooth basically handles everything from OSI Layer 1 through 6 at the same time. See: https://en.wikipedia.org/wiki/List_of_Bluetooth_profiles
Add to that great complexity at Layer 1, and constantly changing specs. Bluetooth 1, 2, and 4 are all radically different from each other.
Compare with WiFi, which only has to do one thing (Layer 1-2). Cellular is much more complex, but that's why releasing a new phone involves endless rounds of (usually mandatory and very strict) carrier certification.
The RJ-45 connector is an open standard that can carry basically anything—internet, video, audio, low-level data, etc—at a high bandwidth and with low latency. Why can't we have a wireless standard that does that?
I want non-crappy Bluetooth mice that don't require a separate USB dongle. :(
Edit: To be clear, I'm sure ethernet cables are an entirely different thing and there's actually a very good reason why wireless makes everything more complicated. But, well, maybe that's one of the reasons people would like to stick with wires.
Even though wireless transmitters have gotten a lot better than they used to be (in the 1980s, one of the Spinal Tap gags was their wireless guitar system picking up radio interference, if I recall correctly), and now use frequency bands similar to Wi-Fi with proprietary protocols from what I see (definitely not Bluetooth!), you still see reviews reporting reliability problems in some circumstances. You'd definitely want to carry a standard guitar cable as a backup if you have one.
I do like wireless, for a different kind of reliability: never ever having to worry that I'm going to snap off the plug in my phone, pull the jack off the circuit board from sitting on it wrong, whatever. When I sit down on the train, last thing I want is to sit on my phone wrong and wreck the phone and the headphones at the same time.
Moving parts and physical connections are fragile. Wireless connections solve that fragility. Agreed that pairing sucks, but that's a solvable problem.
I personally think the 3.5mm TRS jack is more reliable than Bluetooth :) but I would like to comment that this is not the most durable jack out there, so the relative fragility of 3.5mm also is a solvable problem.
The music industry standard for instrument cables is 1/4" TS for unbalanced instrument audio, 1/4" TRS for a few things like monitoring headphones, and XLR cables for most everything else. XLR in particular is a very durable system; unless you get a very badly made XLR cable or jack, I would never call it "fragile". Even 1/4" is quite an improvement durability-wise over 3.5mm.
XLR is not exactly a very portable jack, but a mini-XLR jack does exist -- not as portable as 3.5mm TRS, of course. But if increasing jack durability is such a concern on a more portable device, this connector is an option. Same with 1/4" TRS, which has the advantage of more headphone options with that connector built in (most headphones marketed for studios should be 1/4" TRS or at least have that option).
None of the above are as portable as 3.5mm and I think this connector is "durable enough" to where I doubt it is such a priority for most phone consumers to give up some phone thickness for better durability. But in professional music applications there is a reason why 3.5mm is relatively rare.
Well, you can always sit on your phone wrong on the train (or do something physical "wrong") and wreck "something" -- most likely the screen; anecdotally, I've seen more phones with broken screens than with broken headphone jacks -- headphones or not. Why aren't you advocating for the removal of screens from phones?
While I'm not the most careful person in the world, I'm also not the most incautious; I think it's probably not an uncommon experience for people who do regularly ride trains--e.g. those of us living in big cities where you commute by train instead of your car--that you sit and you realize you're sitting on your coat in an odd way that's got your phone in the pocket underneath you where you're sitting.
I mean, at some level, come on and use your imagination. People riding the metro don't always have total control over getting their belongings situated properly when the train's packed and there's a lot of bustling around. I assure you this experience is not unique to me.
That's weird; it took me half a year of commuting daily to go from "carrying my phone in my jeans' back pocket is practical" to "I have to take my phone out of the pocket _every time_ before I sit". No problems ever since. But I guess I learn fast... unless it's setting up some wireless crap.
Hey that's great. You know what? With the wireless earbuds, I never have to think about that again. I can throw my phone in my pocket, any pocket, and leave it there while I get crammed into a metro car.
So, I guess there are tradeoffs -- and fixing pairing means everyone can enjoy freedom from one more wire, while keeping on using a wire means we keep adapting our behaviors around a wire.
Don't forget that you can still have a headphone jack AND Bluetooth; the problem here is the removal of one. I'm sure many people would be just as mad if next-gen phones dumped Bluetooth and went wired-only for audio.
And speaking of latency, technology has broken in another respect here: rhythm games. If I play Dance Dance Revolution from a PS2 on a flatscreen, the TV adds so much lag that the visuals are unusable, even in game mode. Others have told me they have the same problem even with games for newer consoles intended to be played on such TVs.
So I end up needing to find a CRT to play them.
I also saw this on the Nintendo Wii emulations of SNES games on new flatscreens.
Audio and display lag in rhythm video games is a 100% solvable problem provided:
1. The amount of lag is consistent (check, as long as you're not using Bluetooth)
2. The game is not key-sounded (pass for DDR)
3. You know how much lag there is.
Since rhythm games are deterministic, you can just offset the visuals and audio output to compensate for the delay. Keysounded games are imperfect, but it's rarely an issue unless there's a lot of lag.
The real problem is #3. Without extra, expensive equipment, there's no way to know exactly how much lag there is. It's annoying that the only real problem comes down to an inability to measure.
You could also solve the problem with a hardware database. For instance, I know my projector is delayed by 22ms because multiple reviews mentioned it, so that's the value I use in Project Diva. You can't find this info for most TVs.
Unfortunately, no one with the resources to really tackle this problem seems to care enough to do it.
I should also note that what many games do do is offer some type of user calibration, i.e., press the notes with as good timing as you can manage, and the game assumes the average offset is the amount of delay in your setup. This kind of works sometimes, but it's imperfect for obvious reasons.
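The user-calibration trick described above can be sketched in a few lines. This is a purely illustrative toy (the function name and the timing values are made up, not from any real game): play notes at known times, record when the player presses, and treat the average signed offset as the setup's total latency.

```python
# Toy sketch of rhythm-game latency calibration: the average offset between
# when notes were scheduled and when the player actually pressed is taken
# as the system's end-to-end delay. All names/values here are illustrative.

def estimate_latency_ms(note_times_ms, press_times_ms):
    """Average signed offset (ms) between presses and the notes they target."""
    offsets = [press - note for note, press in zip(note_times_ms, press_times_ms)]
    return sum(offsets) / len(offsets)

# Example: notes scheduled every 500 ms, presses arriving roughly 40 ms late
notes = [500, 1000, 1500, 2000]
presses = [542, 1038, 1541, 2039]
print(estimate_latency_ms(notes, presses))  # 40.0
```

The obvious imperfection: the average folds the player's own timing error into the estimate, so a player who habitually presses early will "calibrate away" their own habit along with the hardware lag.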
Actually, thinking about it again, the reason lag information isn't sent back via HDMI ARC is probably that actual lag times are something manufacturers don't want to share.
After going through the rigmarole of choosing a new TV recently, TV manufacturers seem very keen on discouraging actual comparison of specifications.
I don't think adding your own lag solves it. The game is still expecting you to hit the buttons at its own correct time. Even if you lagged the sound to match the screen, the game would think you're wrong. And you can't get the image from before the console outputs it.
You adjust the timing window. If the display adds 20ms of lag, you expect the player to press 20ms later.
This wouldn't work for most games, because in, say, Mario Kart, the world needs to immediately react to the player's input. But in a Rhythm game, no one will notice if the visual feedback comes in a few ms late, as long as inputs are being read at the perceived-correct time. (Delayed audio feedback may be slightly noticeable in particularly bad cases, hence my note about key-sounded games.)
N.B.
1. This doesn't make the timing window more forgiving. Presses made at +0ms are counted as too early, because it is too early on the player's display.
2. I say "display" for simplicity, but IRL it's often a mistake to assume that audio and the display are delayed by equal amounts.
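The timing-window shift described above can be sketched as follows (a minimal illustration with hypothetical names and values, not any real game's code): the judgment compares the press against the note's *perceived* time, i.e. its true time plus the display's lag, so a press at +0 ms relative to the note's true time lands outside the window on the early side.

```python
# Minimal sketch: shift the judgment window by the display's known lag.
# DISPLAY_LAG_MS and WINDOW_MS are assumed values for illustration
# (15 ms roughly matches DDR's "Marvelous" window mentioned elsewhere here).

DISPLAY_LAG_MS = 20
WINDOW_MS = 15

def judge(press_ms, note_ms, lag_ms=DISPLAY_LAG_MS, window_ms=WINDOW_MS):
    # Compare against the perceived note time (true time + display lag).
    offset = press_ms - (note_ms + lag_ms)
    if abs(offset) <= window_ms:
        return "hit"
    return "early" if offset < 0 else "late"

print(judge(1000, 1000))  # "early": +0 ms is too early on a 20 ms-lagged display
print(judge(1020, 1000))  # "hit": pressed exactly when the note appears on screen
```

Note this doesn't widen the window at all; it only slides it, which is exactly point 1 in the N.B. above.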
-----
Edit: To be clear, you can't fix this yourself; the developer has to add user-adjustable latency settings. PS2 games were designed for zero-latency CRTs and will never work properly on laggy LCD screens.
New games, however, usually let you adjust for latency. Although, they often make the mistake of not accounting for audio latency, or not allowing visuals and audio to be adjusted separately.
Btw, I've heard that the newer Rockband games in particular can auto calibrate for latency via hardware built into the peripherals.
Display latency sucks, but I'm a little sympathetic in the particular case of rhythm games, which can have timing windows as small as 15 milliseconds (e.g. for a "Marvelous" in Dance Dance Revolution).
For comparison, for Virtual Reality ≤ 20 milliseconds is usually considered the target latency.
And Playstation 2 developers couldn't have predicted how display technology would change.
It wouldn't matter at all if they didn't add noticeable lag to an already-computed image.
Remember, this is lag added by the television software, not the PS2 system. We know this because if you take the pure hardware approach of component cables, there's no noticeable lag.
VR has an excuse. VR has to do an expensive computation to decide where all the pixels have to appear to seem real to the human eye. The TV just has to take a signal and display it on the screen. That's the identity function.
>And Playstation 2 developers couldn't have predicted how display technology would change.
Rather, they couldn't have predicted slowness of upscaling and refusal to implement an optimize-for-time option.
Don’t forget video. Bluetooth adds latency, and “true wireless” (no cable between the earbuds) adds another layer of latency as the signal has to be sent from one earbud to the other. Noticeable enough that you have to manually add delay between the video and audio tracks, if your player allows it.
Exactly. I bought Bluetooth headphones to listen to music at work & home, and it was all fine... until I decided to watch some TV shows in my free time. It was then that I realized I have 0.5 - 1.5s lag in audio (growing with time), which made movies unwatchable. Very quickly I bought two pairs of wired headphones instead.
Something very wrong in the stack there. There should be a delay in the video in order to compensate for the latency sending the audio to your ears (Bluetooth does not add much here but it all adds up) and you should experience no latency.
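The compensation described above can be sketched trivially (illustrative names only, not any real player's API): if the wireless audio path's latency is known and fixed, hold each video frame back by the same amount so picture and sound arrive together. (This only fixes a constant offset; the parent's *growing* lag is clock drift, which a fixed delay can't cure.)

```python
# Illustrative sketch only: delay video presentation by the (assumed known)
# latency of the wireless audio path so picture and sound line up at the ear.

AUDIO_LINK_LATENCY_MS = 150  # assumed Bluetooth + earbud-to-earbud relay latency

def video_presentation_time_ms(frame_pts_ms, link_latency_ms=AUDIO_LINK_LATENCY_MS):
    # Audio arrives link_latency_ms late, so hold video back by the same amount.
    return frame_pts_ms + link_latency_ms

print(video_presentation_time_ms(10_000))  # 10150
```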
>I just don't get the benefit of bluetooth in most situations.
Aside from exercise, there isn't one.
OTOH, as you note, there are a large number of serious drawbacks -- battery failures being the biggest and most obvious. The other thing is that, heretofore, you could rely on a dumb audio jack if you wanted to play music with almost any device. Now you're limited to things with Bluetooth, and you have to extend some amount of trust in making that connection.
I use almost exclusively higher-end headphones -- say, $300 and up. Bluetooth has effectively zero presence in that market. There's a reason for this.
I also own AirPods, but their job is about 60% gym and 40% conversations. For serious listening, I reach for the Grados or the Etymotics or the Sennheisers.
Far too much of my life has already been wasted on pairing, unpairing, and re-pairing various Bluetooth devices. And I don't even use Bluetooth headphones.
Look at professional musicians: you won't find anybody connecting his guitar to the amp through bluetooth.
The first example that comes to mind is Angus Young, who has said that he considers his wireless rig to be part of his "sound". If any rock musician qualifies as "old", well, Angus ain't running around on stage like he used to. Anyway, that's just one example. Pros use wireless rigs all the time. They're not BT, though.
And they don't have some of the audio problems that Bluetooth and USB have.
I'm having a lot of trouble with my work laptop/docking station using USB for Skype, to the point where I'm going to try bringing in a real sound card and using one of my Shure dynamic mics with a 1/4-inch jack.
Squelchy artifacts in music were noticeable over bluetooth in a 2016 model year rental I drove recently. Sound quality was definitely superior with an aux cable. I will almost certainly never buy a phone without that jack.
> Unless it has a physical failure, your jack socket always just works.
What does that mean? What would it mean if I said "Unless it has a physical failure, your car always just works"?
I've had lots of problems with wired headphones and jacks. More often it's the headphone that's the problem, but I've had several issues with the jacks themselves. They can get stuff jammed into them, but they can also just stop working properly. I don't know why exactly, but I've had it happen.
It means you can have two perfectly functional wireless devices and not be able to make them communicate at all, but it won't happen with an analog wired connection.
Surely you wouldn't accept "Unless it has a connection failure, your wireless headphones always just work", because that implies more reliability than they actually have. And the reality is that wired headphones don't always just work either. I've had heaps of problems with the headphones themselves, and several instances of issues with the jacks.
Apple designs experiences, and your argument (which is a valid one) boils down to whether your use cases align with Apple's target. Like anything, wireless has pros and cons.
The big pro is that it's fully portable and gives you a freedom of movement that is difficult to match. Most of the cons are associated with poor care (corrosion, stuff jammed in there) or poor quality devices (which is a "con" shared with wireless).
There are real cons as well. Wireless devices are 5x more expensive than a wired device of equivalent quality, are susceptible to a variety of forms of interference, introduce pairing problems particularly as devices age (e.g. a 6-10 year capital asset like a 2014 Honda Odyssey may randomly not work with your iPhone XS), have higher latency, have unpredictable latency with different endpoints, etc.
The rage issue here is that by taking away the jack, you profoundly reduce the flexibility of the device. The only meaningful benefit is to drive revenue for Apple's secret weapon: high-margin accessories. Those $30 earbuds probably cost <$3 to make.
I mostly agree with your comment, but not this part
> The only meaningful benefit is to drive revenue for Apple's secret weapon -- high margin accessories. Those $30 earbuds have a cost that is probably <$3.
As I've argued in many other places in this thread, removing the headphone jack will speed the adoption of, and price reduction in, wireless headphones. While this is not a pro for a lot of people, it is for others, including myself.
I don't agree with the argument that the only thing is to drive revenue to Apple, as there are many different brands of wireless headphones, just as there are with wired headphones. I own non-apple bluetooth headphones.
I can see your perspective here, but I think that's an effect, but not the Apple strategy. Particularly given that they "just happened" to release magical wireless earbuds at the same time!
Apple doesn't need to own the whole wireless vertical, but they are adept at cross selling within the product line.