Apple AMX instruction set (M1/M2 matrix coprocessor) (github.com/corsix)
210 points by rostayob on Sept 5, 2022 | 188 comments



Knowing Apple's history with open standards, and given their success with their implementation of the ARM64 ISA, it is unfortunately highly probable that they will follow the proprietary route once again.

Indeed, they are already doing it; we're lucky they weren't in a dominant position when TCP/IP or HTML were invented.


There isn't any standard matrix multiplication instruction set, so there's nothing to standardize over. Machine-learning driven instruction sets (which motivate MM instructions, though not exclusively) like this have been generally bespoke because the field moves relatively quickly compared to hardware. Every vendor generally follows some basic principles but the specifics are dependent on the workloads and models they expect, e.g. quantization or how they expect to split models across accelerators. And ARM does not allow public proprietary instruction set extensions to ARM cores; one of their defining architecture features is that licensees literally are not allowed to do this.[1] The only reason Apple was allowed to do so in this case is likely 1) They negotiated it as part of their AAL (probably for a lot of money) and 2) They do not publicly document or commit to this feature in any way. It could get deleted or disabled in silicon tomorrow and Apple would be able to handle that easily, and in every other visible way they have a normal ARM64-compliant CPU core (there is the custom GIC and performance counters and some other stuff, but none of those violate the architectural license and are just IP they chose to work on themselves.)

So actually the thing you're complaining about is prevented by ARM themselves; Apple cannot publicly commit to features that would fragment the architecture. They don't have to do everything identical either, though.

[1] They have publicly said they will allow some future Cortex cores to contain custom instructions, but it is quite clearly something they're very much still in control over, you won't get a blank check, especially considering almost all ARM licensees use pre-canned CPU cores and IP. You'll probably have to pay them for the extra design work. There are no known desktop/server-class CPUs that fit this profile on the current ARM roadmap, or any taped out processor, that I am aware of.


> There isn't any standard matrix multiplication instruction set

The Scalable Matrix Extension supplement was released last year. Though obviously AMX predates it, having shipped in actual silicon 3 years ago.


In addition to being too new, Scalable Matrix Extension is for Armv9 - the M1 and M2 are Armv8 architectures


> The only reason Apple was allowed to do so in this case is likely 1) They negotiated it as part of their AAL (probably for a lot of money)

Apple fronted the cash that created ARM holdings in the first place, so yes, they invested quite a lot of money (well, relative to the other senior partners Acorn and VLSI and later investors), and ARM was hardly in a position to tell them "no".


History will tell, but I have a bad feeling about "Apple Silicon".

They would not use that naming if they intended to support the official ARM ISA in the long run.

The only thing that would prevent them from going the proprietary route is if they can't.


> They would not use that naming if they intended to support the official ARM ISA in the long run.

Given Apple's marketing priorities, my guess is that the intent you speak of had zero weight in their naming decisions either way. They have no interest in raising the profile of ARM chips in general, and every interest in promoting their specific chips as amazing.


Apple Silicon is no different from Qualcomm Snapdragon or Samsung Exynos.


Does Apple license theirs to other platforms?


I was referring to branding. To clarify my point, I believe having branding separate from Arm’s does not substantially indicate a desire to move away from Arm.


No, and given ARM's hostility to Qualcomm's acquisition of Nuvia they probably would pitch a fit if Apple started selling silicon to third-parties.


I suspect, given Apple's pivotal role in founding ARM holdings, that they have as close to carte blanche with respect to the ARM IP as one could imagine.


Huh, I hadn't realised Apple had been one of the investors when ARM Holdings was spun out of Acorn Computers. It seems their interest was in the Newton using ARM chips.


> "we're lucky they weren't in a dominant position when TCP/IP or HTML were invented"

TCP/IP and HTML became dominant because they were open standards. Should they have been proprietary, they would have floundered and something else would have emerged instead.


I guess people have forgotten that WebKit came out of KHTML from the KDE team and that Apple was a nightmare when it came to contributing code back. They just released a huge dump of the whole thing.


I remember this... it was just a fork. Projects get forked. It's unfortunate from some perspectives, but from other perspectives you can understand why forks happen.

When you have a long-running fork, especially one that is so active, merging it naturally becomes a nightmare. This is expected and ordinary.

The Linux kernel gets forked by Android vendors and others all the time. A lot of the changes never make it upstream, for various reasons. At least the story ends a bit better for KHTML / WebKit.


Every single Apple patch to GitHub projects is done by the same single indistinguishable user account. This isn't just "some long-running fork". It is Apple culture to actively prohibit contributions to open source projects unless 5 managers sign off on it.


They really don't like people "poaching" their employees/wildlife. Remember they illegally collided to stop other large tech firms with Apple board members from cross recruiting.


No, it’s usually just overzealous lawyers. Remember that Apple is still a pre-dot-com company and — like Microsoft — retains vestiges of those attitudes.


I think that's collusion not collision :) But probably a case of "Damn You AutoCorrect" :)


I don’t get this criticism; this is a case where open source _worked_. People complained about it. Apple cleaned up their act on it a bit, but still maintained WebKit as a fork. And Google forked WebKit when they decided they didn’t want to play in Apple’s sandbox anymore. This is how it’s actually supposed to work. It gets messy sometimes because humans are involved.


This isn't quite the same thing. Apple are really terrible at cultivating open source - more obvious than KHTML is the fiasco that resulted from Apple's half-hearted efforts to kindle a community around Darwin - but my impression is that they have been decent enough with the kind of openness that standards processes need.


Were they ever really interested in Darwin being a thing? I've been following Mac closely since OSX and I've never seen it getting any limelight.


At Apple's executive level, it doesn't look like anyone ever really cared. But people were hired, like Jordan Hubbard, who were supposed to liaise with the community and it's clear that both a nontrivial number of Apple developers were optimistic about the prospects for a healthy Darwin community and that many Apple users found Apple's choices in the early years of Darwin being open sourced and then partly closed to be very disappointing.


I was disappointed in how the Dylan project ended up.


They don't even remember how vertically integrated they were before OS X came to be, how would they remember that?


> TCP/IP and HTML became dominant because they were open standards. Should they have been proprietary, they would have floundered and something else would have emerged instead.

Anyone remember AppleTalk?

https://en.wikipedia.org/wiki/AppleTalk


ActiveX would like to have a word with you.


ActiveX was never 'dominant' really. It was always a duopoly with Java (and in many use cases also with flash!) and a pretty niche one at that (corporate software and crappy webcams)



DECnet, Token Ring, Novell, X.25, et cetera, would like to have a word with you.


Yes, that attempt to EEE the web was thwarted thankfully.


Right. If the government hadn't stepped in to encourage Microsoft to play nice, we might live in a world where "the web" simply means Internet Explorer.

I'm beyond the point of negotiating with the people on this website. Apple is due in for exactly the same treatment, it's only a matter of time before the US eats their favorite crow.


> If the government hadn't stepped in to encourage Microsoft to play nice, we might live in a world where "the web" simply means Internet Explorer.

I kinda doubt that. As soon as Microsoft had a virtual monopoly on the browser market, they let IE go stale for years. Hardly any feature development, hardly any bug squashing. Terrible security. By the time the browser choice thing in the EU and the antitrust thing in the US happened, the rot had already set in and everyone was fed up and yearning for a browser that didn't suck. Google drove their Chrome truck right into that gap.

If IE had actually been a decent browser, no amount of "choose your browser" screens would have been enough to sway people from it. Just like they cling to Chrome now because Google is too smart to make that mistake.

PS FWIW I don't like and hardly use Chrome, but technically as a browser it's great; I just don't like Google's attitude to privacy.


Instead we live in a world where the web means Chrome. How wonderful!


They usurped the minds of a generation with free email and YouTube.


Look, as someone who was rooting for MS to lose big time back then, I would have been happy for that to be true. But MS lost through hubris, not the antitrust settlement.


If it wasn't clear in my reply, I agree with you.

But as an aside, don't let this site get to you too much. There's a lot of arguing for sport that goes on here.


Pyrrhic victory, the modern Web simply means ChromeOS rebranded.


WebAssembly....


WebAssembly is an open standard and I don't really see how it's any bigger a problem than asm.js or obfuscated JS already was.


Agreed! Would be nice to reverse the tide on that stuff though. C'est la vie.


Apple AMX is a private extension. It is not documented and not exposed to the developer. You have to use system-provided HPC and ML libraries to take advantage of the AMX units. This gives Apple the freedom to iterate and change implementation details at any time without breaking backwards compatibility.

I am sure they will support open standards in time as they mature, but for now there is little advantage in doing so. Not to mention that open standards are far from being a universal panacea. Remember Apple’s last serious involvement in open standards - OpenCL - which was promptly killed by Nvidia. Apple has since learned their lesson and focuses on their own needs and technology stack first.


https://en.wikipedia.org/wiki/Thunderbolt_(interface)

What I liked about this is that Apple realized it made more sense for someone other than them to develop and supply the tech for something like Thunderbolt, so they co-engineered it with Intel for Intel to do it (and eventually open it, royalty free).


Well, they did create OpenCL.

But yeah it's ages ago and they've kept everything proprietary since then :(


I think OpenCL was a pivotal moment. Apple created the draft and donated it to Khronos, where it subsequently stagnated before Nvidia effectively sabotaged the effort in order to push its own proprietary CUDA. Since then Apple has been focusing on their own hardware and ecosystem needs. I think the advantage of this is well illustrated by Metal, which grew from being this awkward, limited DX9 copy to a fully featured and very flexible yet still consistent and compact GPU API. Sometimes it pays not having to cater to the lowest common denominator.

In recent years I have become more and more convinced that open standards are not a panacea. It depends on the domain. For cutting-edge specialized compute, open standards may even be detrimental. I’d rather have vendor-specific low-level libraries that fully expose the hardware capabilities, with open standard APIs implemented on top of them.


AMD and Intel were the ones that sabotaged the effort by never providing the same level of tooling and libraries.

Don't blame NVidia for AMD and Intel incompetence.

Ah, and on the mobile space, Google never supported OpenCL, rather coming up with their C99 Renderscript dialect.


Re: the "lucky" remark: they certainly tried! Remember AppleTalk and HyperCard?


This is seriously ahistorical revisionism. There was no universal networking standard in the 1980’s, and there certainly wasn’t anything universally suited for networking on microcomputers, especially not with the zero-configuration usability that Apple wanted. Remember that TCP/IP was largely a plaything for academics until the early 1990’s, and Apple had to create zero configuration and multicast DNS before they could consider deprecating AppleTalk for use on small LANs.


I think that if Apple had been as powerful as they are now back then, they would have pushed "their" tech more aggressively and refused to support "inferior" protocols.

Really, we've been lucky that it was Microsoft and not Apple that was the dominant player in the 90s.

And I am far from a Microsoft fanboy, but I think that Apple hubris has always been there, and their contributions have to be mitigated in some way to stay on the positive side.


All vendors of home computers were vertically integrated; the PC was the exception only because IBM messed up and Compaq was able to get away with their reverse engineering of the PC BIOS, while the OS was developed by a third party (Microsoft).

Ironically what we see nowadays with phones, tablets and laptops is a return to those days of vertically integrated software.


Maybe I’m just slow but it wasn’t immediately obvious to me how to use this for matrix multiplication. Let me now try to explain.

Suppose we have some matrices we would like to multiply, a_ij and b_jk (and let’s say their sizes line up with the hardware, because I think those details aren’t so relevant).

Their product is

  c_ik = a_ij b_jk = sum(a_ij * b_jk for all j).
The hardware lets us cheaply compute and accumulate an outer product (see picture in OP):

  r_ij = r’_ij + p_i * q_j
Now start with r = 0 and accumulate:

  r_ik = a_i1 * b_1k
       + a_i2 * b_2k
       + ...
       + a_in * b_nk
       = c_ik
Each row corresponds to one AMX op on all the cells of the matrix.

Writing it out like this it seems quite straightforward. I think I was caught up on thinking about the per-cell computation too much. When computing based on cells in the output, you take a row from the left hand side and dot it with a column from the right hand side (n*n dot products). Here, we instead take a column from the left hand side and a row from the right hand side and outer product them (n outer products) and add up the results. Perhaps this is partly a victory for this kind of symbolic index notation. I think this would all be much less obvious if I wrote it all out as a sum of outer products with e.g. the tensor product symbol.
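To make that loop structure concrete, here is the same accumulation written as a plain scalar sketch in C (a reference computation only; the sizes and layout are arbitrary rather than the AMX tile geometry, and the function name is made up for illustration):

  #include <stddef.h>
  /* C = A * B, all n x n row-major, computed as a running sum of outer
     products: each iteration of the middle j loop plays the role of one
     outer-product-accumulate op applied across the whole output tile. */
  void matmul_outer(const float *A, const float *B, float *C, size_t n) {
      for (size_t i = 0; i < n * n; i++)
          C[i] = 0.0f;
      for (size_t j = 0; j < n; j++)           /* one accumulate per j */
          for (size_t i = 0; i < n; i++)       /* rows of the output */
              for (size_t k = 0; k < n; k++)   /* columns of the output */
                  C[i * n + k] += A[i * n + j] * B[j * n + k];
  }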


So what software actually uses this / what compiler supports this if it's neither supported nor documented by Apple?


Applications use it via a higher level interface: Apple's Accelerate framework[1]

1: https://developer.apple.com/accelerate/
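For anyone wondering what that looks like in practice: you just make ordinary CBLAS calls against Accelerate, and whether any particular call is actually routed to the AMX units is an undocumented implementation detail, so treat that part as an assumption. A minimal sketch (build with clang demo.c -framework Accelerate):

  #include <Accelerate/Accelerate.h>
  #include <stdio.h>
  int main(void) {
      /* C = A (2x3) * B (3x2), row-major */
      float A[6] = {1, 2, 3, 4, 5, 6};
      float B[6] = {7, 8, 9, 10, 11, 12};
      float C[4] = {0};
      cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                  2, 2, 3,      /* M, N, K */
                  1.0f, A, 3,   /* alpha, A, lda */
                  B, 2,         /* B, ldb */
                  0.0f, C, 2);  /* beta, C, ldc */
      printf("%g %g\n%g %g\n", C[0], C[1], C[2], C[3]);
      return 0;
  }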


Speaking of which, is anyone aware of example code using the LSTM? I've been trying to get this to work, but there seems to be information missing, e.g. how to set up the input/output descriptors and how to manage input data: https://developer.apple.com/documentation/accelerate/bnns/us...


It’s just like with private APIs. It’s functionality that they don’t know (yet?) whether they want to support in the future, but they do provide calls that use it indirectly.

There also could be bugs in the hardware that they carefully programmed around.


Why do people act as if “private APIs” (which is somewhat of a contradiction in terms) are nefarious? An API is something that is publicly documented, that the vendor promises to support, and whose behavior won’t change.

Raymond Chen has been blogging for close to two decades about all of the hacks that MS had to put into Windows because vendors used undocumented APIs.


There are two ways of dealing with apps using private APIs though: Microsoft indeed goes out of its way to maintain backwards compatibility, sometimes at the expense of sanity, but Apple breaks stuff — this API is private for a reason, you had to reverse engineer how to call it, and you knew your risks when you decided to use it, so if your app breaks, you get to keep both pieces.


Yes, and I suspect this is informed by Apple historically having APIs that were much more open and prone to what we would now call abuse by developers — but was then just cleverly taking advantage of the environment — during the Apple II/early Mac era.


I agree that people tend to get a bit too worked up about it, but I don’t think it’s a contradiction in terms as such - an API is really just any interface where two distinct pieces of software interact in some way. It doesn’t need to be formally described or published or anything like that, and the idea of a private API is pretty common generally.

Where it starts to piss people off though is where those private APIs are used to allow first-party software access to platform features that third-party software doesn’t get. For some applications that’s not so bad, but in the case of a general-purpose operating system platform or similar it’s kind of an anticompetitive move and we should complain when companies do it.


Once you make an API public, no matter how badly designed it is, you have to support it forever.

I would much rather an API be private, let the company dog food it and let their internal employees use it, and then make it public. It also gives them the freedom of completely changing the internal workings.

The extensions API and the Siri integration for third parties are great examples. The Siri intent based API is very usable and reminds me of the Amazon Lex based API - the AWS version of the consumer Alexa skills SDK.


>Once you make an API public, no matter how badly designed it is, you have to support it forever

That's simply not true.



Presumably first party software that qualifies. I’m thinking siri, or their AR stuff. It’s possible it’s also used by something like CoreML where there is a public facing framework that utilizes this under the hood.

I have no special info, these are just guesses.


I don’t get why other chip manufacturers don’t go this same route. For example AVX is done on the same core that also supports integer math. Many companies have a separate GPU but AVX seems to always come prepackaged.


There is a cost to moving data to another chip(let). I think this is the main reason


Qualcomm most likely will go this route (proprietary instructions) as this is now officially blessed by Arm. I'd bet we'll see that in their 2023 chips.


Isn't ARM suing Qualcomm?


Which compiler would be required for this (https://github.com/corsix/amx/blob/main/aarch64.h)?

I understand the limitation is not at the OS side, as nothing can be done there, but at the compiler-side (I mean that the Apple-supplied compiler doesn't compile against the AMX instruction set, so you'd need a compatible one that, I understand, doesn't exist).

Or is it just undocumented and you can actually get it to work with a Standard xcode and macOS installation given the headers provided?


This header works with standard Xcode/macOS, by taking advantage of inline assembly in a slightly-cursed way (turning register names into numbers and encoding the instruction itself).
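For readers who haven't opened the header: the general shape of the trick is to hand the assembler a raw 32-bit word instead of a mnemonic. A rough sketch of the idea follows; the 0x201000 base and the 5-bit operand field are taken on faith from the repo's reverse-engineering notes, the sub-opcode is just whatever literal you pass in, and the real header uses an even more cursed variant that splices arbitrary register names into the encoding (this sketch simply pins the operand to x0 instead):

  #include <stdint.h>
  /* Pin the operand to x0 and emit the raw instruction word directly; the
     trailing '+ 0' is x0's register number baked into the encoding.
     Executing an unsupported encoding will fault, so this is illustrative
     only. 'op' must be an integer literal (it is pasted into the template). */
  #define AMX_RAW_OP(op, operand)                                      \
      do {                                                             \
          register uint64_t _x0 __asm__("x0") = (uint64_t)(operand);   \
          __asm__ volatile(".word (0x201000 + (" #op " << 5) + 0)"     \
                           : : "r"(_x0) : "memory");                   \
      } while (0)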


Slightly cursed is how I roll. I would like to know where the trick first originated from though (I found it at https://github.com/yvt/amx-rs/blob/main/src/nativeops.rs#L22 rather than inventing it de novo)


This happens often in the Linux kernel to continue to support older assemblers for newer instruction set extensions.

The x86 retbleed mitigation uses .inst to trick the hardware instruction decoder...different instructions are encoded/run than what is speculatively decoded.


Compilers wouldn’t target it. You’d get to it via https://developer.apple.com/documentation/accelerate


Is there a comparison with other fast cpu methods of matrix multiplication?



On M1, for single-precision, one AMX P-unit is ~1.64 TFLOPS, one P-core is ~102 GFLOPS. So ~16x core-for-core. But you have four P-cores for every AMX P-unit, so more like 4x. And for double-precision that shrinks to 2x (~410 GFLOPS to ~51 GFLOPS).

(This is a simplification that doesn't include the E-cores, nor the AMX E-unit, but their contribution isn't huge. I suspect AMX throughput may have doubled on M2, but I haven't verified that.)


That's a pretty huge amount of processing power hidden away! Are these experimentally confirmed performance numbers?

So this is 16x16 single precision fused multiply-adds at 3.2GHz with a throughput of one per clock, right? (16 x 16 x 2 x 3.2e9 = 1.6384e12) And does using fp16 quadruple throughput again? That would put AMX well above the GPU for fp16 matrix multiplication! (2.6 TFLOPs fp16/fp32 for 8-core GPU, 128 multiply-add per core at 1.278GHz)

How does it compare in terms of power, can it sustain 3.2GHz indefinitely or does it hit power/thermal limits fairly quickly?


Correct, yep. These are theoretical numbers, measured in cycles from a P-core (with no loads/stores), real-world performance tends to be a little less (~93%): https://twitter.com/stephentyrone/status/1455665595677085697

FP16 only doubles throughput rather than quadrupling.

I haven't looked at power/thermals, so I can't really comment. (Though it's possible it's always running a bit under 3.2GHz, since I was measuring in clock cycles - that might be part of the 7% difference.)


IIRC I measured something like 10-50% performance difference (don't remember exactly, but it was somewhere in there), vs a reasonably well-regarded blas implementation. This was for dgemm specifically; I don't know if the story changes for smaller floats.


Not too bad, I do scientific computing and choose Intel/Nvidia as their APIs for accelerated math operations are documented and supported for developers.

I've been paying attention to what Apple has been pushing with their M1/M2 chips, and I'm pretty tempted to try it out, but unless these features are documented and supported I can't feel comfortable writing programs relying on them.


The API apple wants you to use is documented and presumably is here to stay.

That doesn't help if there's some edge case you'd need access to the raw ISA but still.


Also, BLAS is part of that interface (https://developer.apple.com/documentation/accelerate/blas)

Of course it is a black box in that you can’t (realistically) try and speed it up. You still run the risk of Apple’s priorities being different from yours.


OTOH the interfaces are standard. Using Accelerate instead of the standard BLAS is just a compiler switch.


They are not supported directly, but you can use them through the Accelerate framework, which has optimised BLAS and FFT implementations (amongst many other things).


SGEMM / DGEMM using AMX2 (the first M1 has AMX2. The A14 has AMX1) is approximately 100% faster than the same running with NEON, which is already a specialized vector math system.


Doesn't M2 add support for SVE2 as well?


No, there are no publicly available chips with SVE2 support at all; the closest you can get is SVE1 on Graviton 3 from AWS.


Yeah, M2 doesn't. But I think there are some phones using the Snapdragon 8 Gen 1 that have SVE2.


Oh wow, that's quite impressive. I assume they're just using a 128-bit datapath like NEON? Thanks for the correction.


Yep, still 128-bit.


Is this a repost from the other day? I thought that was too good to have been missed.

Also I’m keen to see if this 60 Gb/s near-field wireless data link for Apple Watches for diagnosis will be able to be used in some sort of MagSafe/USB for iPhones.


(This isn’t a post about near-field, Watches, MagSafe, or USB; perhaps this comment was meant for another post?)


What is the “non proprietary” alternative, and why should Apple or its users be forced to wait on consensus?

This is the same reason that Apple wasn’t saddled with the horrible PC “standards” before USB became ubiquitous.

Not to mention even today, Bluetooth is a shit show outside of the Apple ecosystem as far as handoff and ease of pairing.


> Not to mention even today, Bluetooth is a shit show outside of the Apple ecosystem as far as handoff and ease of pairing.

I see people mention this, and I don't get it. I have a bunch of el cheapo Bluetooth audio adapters (car/garage/bathroom), one Bose headset, one Jabra headset, one chi-fi headset, one JBL speaker. They just work.

Very occasionally on my Android phone I have to pull down the system tray, click the little arrow next to Bluetooth and select the device I want. But in those few ambiguous situations there is no tech apart from clairvoyance that would allow the phone to know my intentions without my input.


Are you using multiple devices with each bluetooth product, or are you using multiple bluetooth products with a single device? In my experience the latter is mostly the same across Apple and non-apple. Where Apple's experience seems to be noticeably better is using an apple bluetooth product with multiple devices.

For example, I recently tried to switch from some airpods to a pair of Anker liberty earbuds. I have 4 devices that I regularly use the headset with, an iPhone, an iPad a macbook air and a work laptop. The airpods switch almost without any effort on my part between my personal devices, connecting automatically to whichever of the devices I am currently interacting with when I take them out. On the occasions when that doesn't happen, selecting them from the bluetooth menu connects them without issue. For my work laptop, because it's not registered with my personal accounts, the airpods don't connect automatically but after pairing them once during the initial setup, when I select them in the menu, they connect reliably and quickly.

By contrast the anker earbuds apparently are only able to store connection information for two devices at a time. While I can pair them to all 4 devices, when I take them out, they will always connect to whatever the last device they were connected to, and if I connect from the bluetooth menu/settings on the device, they will only connect to the second to last device they were connected to. For the other devices, they will attempt to connect and eventually timeout. Even though the device still has the earbuds registered, the only way to connect it to device 3 or 4 is to delete it from the device, and go through the entire re-pairing process, at which point the earbuds will stop connecting to what is now the 3rd to last device they were connected to.


That's odd. I have a pair of airpods pro shared with an m1 mbp, and an iphone. The airpods seem to randomly connect to 1 of the 2 devices, almost never the one I want them to connect to. This is the same experience I've had with other BT devices.

The only good BT experience I've had is my Bose QC35ii headphones, which can connect to multiple devices at the same time.


Seems kind of limited and you have to use the “Bose app” for complete functionality. Are there apps for the Apple Watch? My AppleTV? And I still have to pair them to each device and they only switch between the “two most recent devices”.


I have a Mac, an iPad, an iPhone, an AppleTV and an Apple Watch.

If I’m on my iPad and playing something and put my AirPods Pro in my ear, sound automatically gets sent to my AirPods. The same happens with my Mac and iPhone. It doesn’t happen with the AppleTV since multiple people might be watching.

Also once I pair my headphones to one device on my account, it’s automatically paired to all of my devices. The initial pairing process for my phone and iPad is just open the case the first time and a pop up shows up.

On the other hand, if I’m on my iPad watching a movie and get a call on my phone, it switches over automatically. It then switches back when I go back to my iPad. If I take one AirPod out of my ear to hear someone, video pauses automatically on whichever device I’m using.

Then there are the little touches like being able to control headphone options like noise cancellation and spatial audio from my phone, iPad, or TV, seeing battery remaining and automatically being registered with Find My.


> If I’m on my iPad and playing something and put my AirPods Pro in my ear, sound automatically gets sent to my AirPods.

And for me apple decides to do these switches on its own without my input at times when I do not want this to happen. I have not found a way to disable this yet.


This is really annoying in a family household. I use my AirPods connected to my phone when I am cleaning the house, cutting the grass outside etc. My son and daughter love to watch videos on the iPad. My AirPods randomly switch to the videos they’re playing on the iPad without me ever asking for that to happen. It irritates me that Apple does this.


You can disable this feature on just your iPad from the Bluetooth settings.


Can’t you just make a separate profile on the iPad for your kids?


What kind of videos are they watching?


Mostly video game walkthroughs on Hobby Kids TV.


https://support.apple.com/en-us/HT212204

But is there ever a time that you said “I really wish I had to pair my headphones to each of my devices individually and had to adjust settings by clicking on a button on my headphones and decipher the various beeps”?


I get the same on Android on almost all those things. The only thing I don't have is whatever "Find My" is. For controlling headphone options I have a couple of dedicated apps on my phone, but if you have to touch that more than once per month it's a clear symptom of bad headphone design. (For example the Jabra headset I use at work has a dedicated physical button for toggling noise canceling.)

The upshot is that I suffer zero vendor lockin. I bought Airpod equivalents from Aliexpress, the FIIL T1 Lite for $35. They are little marvels of Chinese engineering that do everything I need and actually sound great - they frequently outperform $100+ earbuds both in reviews on blogs and according to casual Reddit commentary. Since I only use them when out and about, it's not possible to experience any audio quality upgrade unless I sacrifice practicality and go with some closed-back over-the-ear cans.


You get almost the same thing as long as you use third party apps, still have to change settings manually, etc? Do you also have to install the same app on your phone, tablet, watch, streaming device and your computer?

When you pair to one device, do they automatically pair to all of your devices?

All Apple headphones work as standard BT headphones on non Apple devices. I fail to see any “vendor” lock in.

As far as price, you can also pick up a pair of $50 Beat Flex headphones that have most of the same functionality with Apple devices.

I assure you those $35 “AirPod equivalents” don’t have the noise cancellation, spatial audio, transparency mode, or microphone quality that the AirPods Pro have.


I get the same thing out-of-the-box. I get tap-to-pair, noise cancellation, spatial audio, transparency mode and a better microphone than the Airpods (can use a higher-quality codec than AAC). Settings stay on-device, too.

Airpods were a neat party trick maybe... 5 years ago? Wireless audio isn't complicated nowadays though, I've tried at least a dozen Bluetooth headsets that embarrass the Airpods Pro (often at a lower price point).

The last thing I want to do is stop people from buying overpriced headphones though. If Airpods make you happy, then by all means, buy them. You're mostly paying a premium for iCloud integration though, which I'd frankly pay extra to avoid.

Shit though, if you want proof that Airpods are a downgrade from regular headphones, just compare the audio quality: https://youtu.be/N6Y_Q7RYmmY?t=360


You can pair your headphones seamlessly to seven devices without unpairing? Yes I have a phone, tablet, watch, computer and two AppleTVs - one in the bedroom and one in my home gym.


Yep. Multipoint connection will tether to any available devices, and then switch between whichever ones are actively playing. Works like a charm, don't even need an AppleTV to pair it with my display.


I only have to use a third party app to change the settings on the headset (like equalizer and noise canceling level). For my devices I have no need to change these more than a couple of times per year, at most. For my Bose QC35 I think I've never touched the app after first configuration. Those settings are stored in the headset, so once I've set them they stay the same regardless what device the audio streams from.

The vendor lock-in is by definition there if there are any special Apple features. If there is no vendor lock-in, then there are no special Apple-exclusive magic features that justify the price premium either?

I know my FIIL buds don't have active noise canceling. They are IEMs and give about 20 dB passive noise reduction which is more than enough. The microphone quality is decent, but for any longer calls I use my Jabra which has a proper mic.


> The vendor lock-in is by definition there if there are any special Apple features.

This is ridiculous. Then no manufacturer, whether it be cars, or clothing, or industrial equipment, would offer anything different than their competitors, lest it be deemed "vendor lock in".


One of the non-extreme solutions is publishing the doc for the extension + explicit usage grant on any related patents. They don't need to go full standardisation route before the first release. It would still be a proprietary extension under their control, but not a haha-screw-you proprietary.


ARM literally doesn't allow public instruction set extensions by compliant license holders. Apple is presumably only allowed to do this precisely because they do not sell or otherwise offer their CPUs with any other software, with any other documentation, which hides this implementation detail entirely from all users, and I assume this allowance is worked directly into their specific ARM Architecture License.

Apple themselves designed probably half (or more) of the ARMv8 standard themselves. I assume they are pretty aware of what avenues are available to them in this case.


Apple are religious about interoperability within their platform, making it easy and reliable. If they were to do as you suggested with some of their proprietary tech, there would be products that implement it badly. Users would have no idea who’s at fault, and would probably blame the tech in general, damaging Apple.

Standardisation, in combination with certification to use the “label”, ensures that people developing on top of their innovation do so well enough that it doesn’t damage the brand.

(Somewhat less relevant to an instruction set, and not something I particularly agree with)


>interoperability within their platform

Isn't this called an oxymoron?


Quite right, I should have said “ecosystem” really. And obviously this comment, about Apple in general, was a little off topic as a reply to the parent. I know what I was trying to get at, but didn’t convey it well in this context. Oh well.


No I’ve seen platforms where interoperability is non existent but it’s still a platform.


This doesn't really make sense. There are already systems implementing connections to Apple stuff badly due to lack of documentation. The situation would only improve with the publication. For example we already have most of m1 hardware reverse engineered in Asahi - that's not going away. We've had things like air drop and earbuds charge state RE'd too. We'll get amx libraries as well soon.


“Apple’s stuff” is their hardware + operating system.

The benefit of buying “Apple stuff” is the integration between their software and hardware ecosystem.


Yes, and... That seems irrelevant to the point of my post?


It’s relevant because

> For example we already have most of m1 hardware reverse engineered in Asahi - that's not going away. We've had things like air drop and earbuds charge state RE'd too. We'll get amx libraries as well soon.

You’re trying to use Apple hardware with non Apple software.

If you put Windows on an x86 Mac, do you expect the same experience (or battery life) that you get if you’re running macOS on an x86 Mac?


Apple Bluetooth is not much better than run-of-the-mill Bluetooth, to be fair.

Bluetooth is one of the rare cases where they should have come up with something different and better.

(Just kidding, Bluetooth is an abomination and a very weak protocol)


Apple did not even bother to implement proper codec support for their flagship AirPods. When I enable the mic, audio quality goes to zero. Where are their proprietary standards when they're needed?


Isn't that more of a bluetooth issue? You're switching from one audio stream to two. Each now has only half the available bandwidth, at most.


That's a bluetooth issue indeed, but Apple could have implemented some proprietary extensions for when both devices are Apple ones.


I see. So they're damned if they embrace a standard, and they're damned when they don't.


Wow, Apple's own headphones have that issue as well? I thought it was just my Sonys that did that!


That's the case for all bluetooth headphones. The reason is that when microphones are enabled, the codec gets downgraded to some crappy one (presumably to free up bandwidth because there are two audio streams?).


The issue isn't really bandwidth - it's about the host and device agreeing on a suitable codec.

On a Mac, my Sony headphones will fall back to the SBC codec if the mic is active. Fine for voice, but music/video/gaming sounds terrible. On Android, however, they will negotiate bi-directional AptX or some similar modern codec, so the quality is much better.


Fun fact though, on Linux you can force high-bitrate SBC on almost any headphone to get almost the same result as AptX.


SBC and aptX have the same quality (at the same bit rate). You are most likely talking about SBC XQ and aptX HD: http://soundexpert.org/articles/-/blogs/audio-quality-of-sbc... & https://habr.com/en/post/456182/


Indeed, however the AptX family has other codecs that support even better quality (AptX Lossless) and lower latency as well as adaptive bitrates, so modern AptX support is still better (hence almost as good)


> Not to mention even today, Bluetooth is a shit show outside of the Apple ecosystem as far as handoff and ease of pairing.

I wish my iPhone's bluetooth worked as well as my Linux laptop's. I've no idea why Apple gets so much praise for its bluetooth; it's not "the worst", but it's not very good either. Is Android really _that_ bad?


Are you using Apple or Beats headphones? If not, BT is going to suck regardless.


We detached this subthread from https://news.ycombinator.com/item?id=32723422.


I agree which is also why I’m very disappointed that the EU is forcing Apple into the absolutely horrible USB-C standard.

People can buy android phones if they want to. Why remove choice?


Some things deserve to be standardized. Electrical plugs being a great example. Removing choice is a feature. WiFi standards are another example. It's tough to understand exactly where to draw the line, but making computer cables all standard seems like a worthy goal. I can see a future when any device from displays to external drives to your phone can all use the same cable. That would be a nice feature where removing choice would be the better outcome.

Out of curiosity, what's horrible about the USB-C standard?


Thought experiment, you pick up a random USB-C cord, now answer a few questions.

How much power can it deliver?

Does it do data and if so, at what speed?

Does it support video over USB and if so, at what resolution?


1. The same or more power than Lightning

2. The same or faster than Lightning

3. Either none or higher quality than Lightning

I don't need to see the specific cord - Lightning only carries USB 2.0 and compressed video streams through a weird proprietary protocol. The base spec for USB-C cables is USB 2 and low-speed charging - i.e. equivalent to Lightning for everything but video out.

The main complaint about USB-C that people have is that there's no consistent labeling for the cheap-o base-spec cables versus the ones that actually have high-speed data lanes in them. This doesn't matter for the USB-C vs. Lightning debate, since charging and data will be the same or better and video requires a special cable or adapter in either case.


As others have pointed out, you're wrong about USB-C's minimum standards.

But more important, markets work best when consumers have good information about what they're buying.

Lightning always works as expected. Give me a Lightning cable and a Lightning port and I know what they'll do. Comparison shopping for a Lightning cable is easy.

But making an educated decision about which USB-C cable to buy requires understanding an increasingly complex matrix. You cannot just look at a USB-C cable or port and know what it is; you've got to parse each device or cable's spec sheet (if you can find one). https://arstechnica.com/gadgets/2022/09/breaking-down-how-us...

The possibility of lock-in to a proprietary system is one piece of information, but consumers aren't getting screwed by lock in to Lightning connectors. It's easy to find a cheap Lightning cable that performs as expected; it's easy to comparison shop for them on price.

Consumers are, however, wasting a lot of money on USB C cables that don't do what they expect because the USB-C "standards" make it extremely difficult for ordinary consumers to know what they're buying.


When I first started traveling for work with my MacBook Pro in 2021 and my portable USB C monitor, I would often have the wrong USB C cable and I didn’t know the vagaries of USB C.

I was at one of my company’s sites (I work remotely) and even the IT department didn’t have a “standard” USB C cable that could do 100W power and video over USB.

I ended up ordering one from Amazon - and having it shipped to my company’s office. I work at Amazon (AWS).


Much as I like USB-C this is far from accurate.

My JBL speaker will only charge with a USBA -> USBC cable, but not with USBC->USBC.

I've a couple of cables that will charge headphones or other devices, but won't show data devices (like an external SSD, webcam, etc).

I've some cables that won't charge my laptop, however, other cables on the same charger do charge that same laptop.

Maybe some of these devices and cables are non-compliant, but they're what we see in the real world, regardless of what the spec says. USB-C is a mess. I still need distinct USBC cables, and need to remember which ones can charge which devices.


1. Some USB cables only support power up to 5W

2. Not all USB C cables support data; some are power only.

3. But USB C is supposed to be a “standard”. I can’t just assume any USB C cable is going to work with either my portable USB monitor or an iPad Pro that has a USB port


Industry bodies vary in quality; USB-IF is notoriously bad at UX. They focus on providing opportunities to participate in their newest standard. That's why you end up with monstrosities like renaming USB 3.0 to USB 3.1 Gen 1 and adding USB4 2.0 instead of simply calling it USB5, like it should.

Cables and docks, once again to give opportunities to participate to as many players as possible, only have to implement a few elements of the standard to be branded.

Apple and Intel basically used the Thunderbolt standard to get rid of this mess. Thunderbolt 3 and 4 are forcing implementation of all the optional elements of USB to be branded. So if you want to have a sane experience with USB-C, stick to TB ports, docks and cables.


The superior experience with an Apple branded Lightning cable and charger: 5 watts, USB 2.0 speed, and no. Now that's progress!


20 watts, actually.

But anyway, the point isn’t whether lightning is better than USB-C. It’s about whether USB-C is good enough that we want to accept being locked to it for all time. If Apple wants to invent a better connector, they can’t, assuming this European regulation goes through.


> for all time.

That's a strawman. Standards evolve and new standards come along. 2G -> 3G -> 4G -> 5G. Well look at that.

> If Apple wants to invent a better connector, they can’t

Not true. If Apple is willing to share, they can invent all they want and propose a new standard.


So they have to wait on a consensus. Wait for all vendors to agree. Wait for a government body to approve the new standard and then they can implement the new “standard” and still have it be as convoluted and half ass like USB C?


> So they have to wait on a consensus. Wait for all vendors to agree.

Yes, just like 5G. Interoperability is sometimes worth the tradeoffs so a nuanced view is needed instead of a reductive "standards bad, competition good".


Or on the other hand iMessages is far better than SMS and RCS.

Then you have Google who alone introduced three incompatible messaging apps in one year. It must be part of the promotion process for Google SWEs and PMs to introduce a new messaging app.


> Or on the other hand iMessages is far better

I have a friend in Brazil who is barely surviving as a farmer and has a tiny budget for a phone. He needs messaging. Which is better for him? iMessages or SMS?


Neither actually. The answer for his particular situation is almost certainly WhatsApp.


Not quite. He needs to communicate with everyone, even folks who don't have WhatsApp installed. Not many of them, but they do exist in Brazil.

Either way, it's not going to be an expensive iPhone with iMessages is it?


Probably not, but what's your point? Nobody was arguing that Apple phones are _cheaper_ than Android ones, nor that they're more popular in Brazil


My point is that iMessages is not "better". It depends on the use case, and cost is sometimes a factor in deciding which solution is the "best".


They didn't have a problem doing that with Thunderbolt, I don't see why it would become a problem now all of the sudden.


They didn’t wait on industry consensus. They basically worked with one company - Intel - and put the port on Macs. They definitely didn’t have to wait on the government to give them permission.


You are right, but the current topic is standardising on USB-C for power delivery. So in that case only the first question applies. And funnily enough that question is the same for every power cable and plug. Even simple wall plugs and extension cords can burn out if a device draws too much power.


How is the “current topic” only about power delivery? What happens when I plug my hypothetical iPhone with USB-C using a “standard” USB-C cable into my computer to transfer my 4K video?

What happens today if I pick up any random “standard” USB-C cable and try to charge an iPad Pro 12 inch or any other iPad that has a USB-C port, or try to connect it to a video source, a USB data source, or any other USB-C device?


> What happens when I plug my hypothetical iPhone with USB-C using a “standard” USB-C cable into my computer to transfer my 4K video?

Your phone will charge and you'll be able to transfer your 4K video. Isn't lightning still limited to USB 2 speeds? If so, then it won't be any slower.

USB-C isn't a problem for Macs and iPads that use it, so not sure why it would be a problem on the iPhone.

(I didn't understand your second question.)


Not all USB-C cords support data, and those that do support it at different speeds. Some USB-C cords support data but don’t support video over USB-C. I travel a lot and soon will be doing the “digital nomad” thing.

I travel with this portable monitor:

https://a.co/d/aFeMMK1

You can plug it into a computer and get video and power with one USB C cable. But you have to have the “right” USB C cable. Video over USB-C is “standardized”. But not all cables support the standard.

Ironically, you can have the same issue today. Some cheap third party Lightning cables don’t support data and they don’t work if I plug up my monitor to my Mac for a third monitor using Duet (I can’t use the Mac built in capability because of an incompatibility with corporate mandated malware).

The iPad Pro should also work with my monitor. But still, you have to have the right “standard” USB C cable. You would have the same problem With a hypothetical future USB C iPhone.

I’m assuming you didn’t know about how some USB C cables don’t support all of the standards. If you didn’t know - someone who posts to HN and I assume knows more about technology than the average person - what chance does the average consumer have or the people making laws to force USB C to be the standard?

That is not meant to be an insult. I thought all USB C cables were the same until two years ago when I got my first modern MacBook for work.

I’m not saying Lightning is better than having a USB C port on the iPhone that supports the maximum power possible on the iPhone, with higher speed data rates and video over USB C. Apple agrees and that’s why almost all iPads now have USB C ports.

But that doesn’t mean that you will just be able to pick up a random USB C cable and it just works.


>Isn't lightning still limited to USB 2 speeds?

It supports USB3

https://www.apple.com/shop/product/MK0W2AM/A/lightning-to-us...


What happens today if I pick up any random USB-C and try to charge my non-Apple phone is that it charges, in some cases fast, in others slow. I don't have to carry around a special cable. I would expect the same from Apple, once they implement it. As for the data stuff, I'm sure they will manage somehow.


Try your definition of “slow” on an iPhone 12 Pro Max using a cord that only delivers 5W of power. Better yet, try charging an iPad 12” Pro (which does have a USB C port) with a USB C cable that can only do 5W. Now try to use that same “standard USB C” cable to charge a 16 inch MacBook Pro.

Of course that standard cable is not guaranteed to support data at all.

So much for a “standard”.


iPhone 12 Pro Max has smaller battery than my 5y old Nokia, so I would not worry about that.


And how long do you think it would take to charge?


> How is the “current topic” only about power delivery.

Because that's what the law covers. All phone *chargers* must be USB-C compatible. And they must all interoperate. You must be able to buy a phone without the charger, so you can re-use your old one.

https://single-market-economy.ec.europa.eu/sectors/electrica...


How does that help reduce ewaste if you still don’t force standardization on the type of USB C cable being sold - ie a cable that supports 100W PD, video over USB C, and data? What are the chances that the $100 Android phone is going to come with a cable that supports the “standard”? What are the chances that your random convenience store is going to be selling cords that support “the standard”?

And your quoting of the law, which shows that the government also didn’t know enough to consider all of those questions, proves how incompetent the government is at writing laws concerning technology.


I answered your question, and now you've moved the goalposts... but I'll answer some more.

> How does that help reduce ewaste if you still don’t force standardization on the type of USB C cable being sold - ie a cable that supports 100W PD, video over USB C, and data?

The vast majority of people don't need that cable. Most people plug their cable into the wall socket (oh dear another government standard!) and recharge their device.

> And your quoting the law showing that the government also didn’t know enough to consider all of those questions proves how incompetent the government is at writing laws concerning technology.

I think this is a competent start. Industry was asked by the EU to self-regulate on this, and industry failed. I'm glad the government stepped in and in a few years I'll be able to grab any random USB-C cable to charge a wide variety of devices. Progress marches on.

In general, the EU has been very successful at writing laws around technology. Look at the mobile phone networks. I can travel anywhere in Europe and it just works. And my roaming charges are also kept lower thanks to laws. Lots of great technology laws out there if you could neutrally assess things instead of always reaching for "government bad".


Most people would be okay with a 5W cable to power a huge battery on the large iPhones? Try this: plug in a large phone that's at 20% battery with a cable that only supports 5W. Now start a video call using Zoom. Guess how long your phone will last.

The EU didn’t enforce a law to have an industry standard for cell phones, a private consortium of companies did.

I’m not “moving the goalposts”. The explicit aim of the EU wanting to enforce a standard was to reduce eWaste. If you have a cord that doesn’t support data, power delivery at an appropriate wattage, and video - something that the iPads with USB C already do and the hypothetical iPhone will - you will still be throwing away cables, just like I threw away all of my “standard” USB C cables that came with various devices and got some that supported 100W PD, 10 Gbps data and video over USB C.

The 11-chapter, 100-section GDPR that did nothing but give the world cookie pop-ups shows the incompetence of EU lawmakers better than anything.


Read the law more carefully. Chargers, cables, and the specified devices must all be interoperable, and they cannot slow down the rate of charging. You seem to be wrongly assuming that the current state of the art will not change with the new laws.

> The EU didn’t enforce a law to have an industry standard for cell phones, a private consortium of company’s did.

You're wrong. I worked in the EU for mobile phone companies helping make them compliant with some EU regulations. Just for example, EU law specified an industry standard for roaming charges. Roaming charges were absurd before the law and different in every country - to the point that everyone feared answering a call while in another EU country. Sometimes companies succeed at good standards without government regulations. Sometimes they fail and the government should step in.

> I’m not “moving text goalposts”.

You did. You asked why the discussion was focused on power. I answered that. Goal achieved. That wasn't good enough for you after you learned why the discussion was focused on power. So you moved the goalposts and came up with a new complaint. That's exactly what moving the goalposts look like.

> The 100 section 11 chapter GDPR that did nothing but give the world cookie pop up’s shows the incompetence of EU law makers better than anything.

You're wrong again. I use those cookie pop-ups to refuse everything but the necessary cookies. That's not "nothing". GDPR has done so much good in protecting our privacy and forcing companies like Google and Facebook to adapt. Excellent.


Mandating roaming charges is completely different than mandating 4G and 5G protocols.

You keep focusing solely on “charging” when USB C powered phones and the iPads that already support USB C also carry data and, in the case of the iPad, video using the “standard” for video over USB C. Seeing that the EU didn’t mandate any of that shows why the government has no business being involved in technical standards.

The “goal” of the EU regulation was to reduce ewaste. The proposed regulation fails because you still have to replace your cable if you want your phone to transfer data.

Google and Facebook didn’t have to adapt their business at all because of the GDPR. You want to see what an effective policy for increasing privacy looks like? One private company - Apple - introduced a pop up that gives users the ability to opt out of tracking and everyone including Facebook that lives or dies by ads announced billions in reduced revenue.


You keep repeating the same points so I'm moving on.


Clearly Apple doesn't think the USB-C standard is so horrible since they are using it everywhere else in their ecosystem.


And even Apple ships a power only USB C cable that doesn’t support data with their non MagSafe equipped MacBooks.


I wouldn’t go that far. While I don’t agree with the EU forcing the USB-C “standard”[sic] on private companies, it’s time for Lightning to die. Apple is already moving to USB C on iPads with only the low end iPad still being Lightning.


> the USB-C “standard”[sic]

There's no reason for scare quotes, USB-C is an official standard. IEC 62680-1-3:2021 (https://webstore.iec.ch/publication/66588) is USB-C, and IEC 62680-1-2:2021 (https://webstore.iec.ch/publication/66589) is USB PD.


Okay, still the same thought experiment. Pick up a random USB C cable. Now tell me:

How much power can it deliver?

Does it support data and at what speed?

Does it support video over USB C?

If I bought a cheap power-only 5W USB C cable and got the hypothetical iPhone 15 Pro Max with USB C support that could charge faster with a 20W cable, do 10 Gbps data transfer, and video over USB C, wouldn’t I still end up throwing away the USB C cable I got with the $100 Android phone, contributing to e-waste? Isn’t that the entire argument for forcing Apple to support USB C?

What happens when I buy a cheap USB C cord from the convenience store? Will it support “standard USB C”?


If that USB-C cable only supported 5W, then it does not meet the spec and must not carry USB-IF branding. At a minimum, USB-C cables must support 20 V/3 A (60 W), and may optionally support 20 V/5 A (100 W).
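To make that arithmetic concrete, here is a minimal Python sketch (names and layout invented for illustration; the 20 V tiers are the ones just mentioned, and the 48 V/240 W extended-power tier is my addition from the newer PD revision):

    # Illustrative only: how the cable power tiers fall out of voltage x current.
    # 3 A at up to 20 V is the baseline every C-to-C cable must handle;
    # 5 A requires an e-marked cable; 48 V / 5 A is the newer 240 W (EPR) tier.
    CABLE_TIERS = {
        "baseline C-to-C": (20, 3),
        "5 A e-marked":    (20, 5),
        "240 W (EPR)":     (48, 5),
    }

    for name, (volts, amps) in CABLE_TIERS.items():
        print(f"{name}: {volts} V x {amps} A = {volts * amps} W")
    # baseline C-to-C: 20 V x 3 A = 60 W
    # 5 A e-marked: 20 V x 5 A = 100 W
    # 240 W (EPR): 48 V x 5 A = 240 W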

I get you on the rest of the issues, because the simplified USB-IF branding (Hi-Speed, SuperSpeed, SuperSpeed+, etc) crucially isn't printed on the cable itself. Moreover, the constant renumbering of the standard means manufacturers often forgo the consumer-facing branding and market devices/cables with the latest standard, which means nothing regarding what capabilities a device/cable supports.

USB-IF needs to be better at enforcement, for sure. In the meantime, I just use Thunderbolt cables for everything that needs advanced capabilities and the pack-ins for everything else.


How do you explain to the average consumer that even though all of these cords:

- the white cord that came with my old MacBook Pro 13 inch

- the little cable that came with my Beats Flex

- the cable that came with my Anker battery.

- any random overpriced USB C cable that you pick up from the convenience store or the bodega.

have USB C ends and are really USB C cables, none of them support data?


Yeah this all is a mess, but if we restrict ourselves to considering only conformant cables then the problem is at least tractable.

All Type-C to Type-C cables support 60 W power delivery (3 A at up to 20 V); some support 100 W (5 A at up to 20 V), but those can no longer be certified; and the new 240 W cables must have a logo on them with 240W clearly visible (which means those cables can only be conformant if certified). And how much power a Type-C to Type-C cable can handle is completely orthogonal to the data it can transmit.

USB does allow conforming passive cables that only have USB 2.0 lines, which can support any of the voltages. These can often be differentiated from the cables that support USB 3.x/USB4 by the cable being surprisingly thin, but this becomes harder if it supports more than the minimum 60 W.

Passive cables that support USB 3.x can vary in the maximum speed they support, which will also impact some alternate modes. If you want to ensure video support on a passive cable, your best option is to look for a 0.8 m or shorter passive cable that says 40Gbps, as those will all support the maximum currently allowed DisplayPort bandwidth over Type-C. [1] But all passive cables that include the USB 3.0 wires should support the lower DisplayPort 1.x alternate modes.

However, to reduce confusion in the future, USB-IF has recently revamped the rules for certified Type-C to Type-C cables. Cables must be marked with a logo that indicates 60W or 240W. If the cable supports 3.x or newer, the logo will also include the max supported speed in Gbps. Failure to use the right logo for what your cable supports will result in failed certification.

Users are expected to assume that any cable that does not specify a wattage only supports 60W (since all USB C-to-C cables support that, except the optically isolated ones, which cannot be mistaken for a normal cable). Users are expected to assume passive cables do not support USB 3.0 data at all unless marked with: 1) a speed in Gbps, 2) a bare SuperSpeed logo (which implies a max of 10 Gbps [2]), or 3) Thunderbolt 3 branding (20 Gbps [3] unless a speed is otherwise shown).

Users are presumably expected to assume that active cables only support 5 Gbps unless otherwise marked, and that they won't support any alternate modes (unless otherwise marked) if not marked as 40 Gbps; if they are marked 40 Gbps, the DP 2.0 alternate mode should work (but I'm not sure that DisplayPort 1.x modes are guaranteed to work).

Active cables are also where many problems lie, especially as they don't always look different from passive cables. Active cables can mostly only support alternate modes that they were explicitly designed to support, which for some is none at all. For example, active Gen 1 or Gen 2 cables don't support USB4 at all. Active Thunderbolt 3 Gen 3 cables can be used for USB4 by some USB4 devices, but this is an optional feature, so not all USB4 devices and hosts will support it.

Footnotes:

[1] In theory, such cables should be able to handle DP 2.0 at UHBR 20 (80 Gbps), since they can reverse the 40 Gbps return communication lanes, going from 40 Gbps bidirectional to 80 Gbps unidirectional. However, VESA has not yet standardized that as an option.

[2] Since these would probably be Gen 1 with 5 Gbps per lane, and all Type-C cables have two lanes in each direction.

[3] Thunderbolt 3 implies Gen 2, which has 10 Gbps per lane, times two lanes in C-to-C cables.
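To make those rules of thumb concrete, here is a hypothetical little helper (the function name and marking strings are invented; it encodes only the assumptions described in this comment, not the full spec):

    # Hypothetical sketch: what a user should assume about a passive,
    # certified C-to-C cable given only its markings. Strings are made up.
    def assume_capabilities(markings):
        caps = {"power_w": 60, "data_gbps": 0.48}  # default: 60 W, USB 2.0 lines only
        if "240W" in markings:
            caps["power_w"] = 240
        if "SuperSpeed logo" in markings:
            caps["data_gbps"] = 10                 # bare logo: assume at most 10 Gbps
        if "Thunderbolt 3" in markings:
            caps["data_gbps"] = 20                 # TB3: assume 20 Gbps...
        for m in markings:
            if m.endswith("Gbps"):                 # ...unless an explicit speed is shown
                caps["data_gbps"] = int(m[:-4])
        return caps

    print(assume_capabilities({"240W", "40Gbps"}))
    # {'power_w': 240, 'data_gbps': 40}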


How is USB-C horrible?


The physical form factor of the “Type-C” connector is great, but a good example of how broken things are is that anyone can stick that connector on pretty much anything: it might be USB 1, might be power delivery only, might be USB 3.2, might be real Thunderbolt, might be Thunderbolt-esque PCIe framed over USB 3.2 messages - and this is before even getting into the wild world of HDMI 1.4 and DisplayPort over USB 3. This mess assumes the vendor implemented things correctly or is using a compliant controller chip; reality gets even worse.

A simplistic interpretation:

Because the consortium wanted to get everyone on board, they allow pretty much any part of the spec to not be complied with. In theory there are various profiles that should be adopted, but in practice that hasn’t happened.

What happens when I plug in a Type-C plug? You just can’t say… and I mean you REALLY just can’t say. Will high power delivery and HDMI work (I’m looking at you, broken Nintendo Switch USB-C implementation)? Will you get Thunderbolt packets wrapped over USB 3.2? Will you even get high speed? Is the cable active or passive? Will this cable give me high-speed data? Will this >3 ft cable give me high-speed charging, or just silently stay at 5 V and ~1 A because the resistance on the middle pin is too high on that particular cable?

To placate the many vendors who wanted to produce cheap crap and flood online stores, many parts of the spec do all of this without active protocol handshaking and simply fail silently.
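A toy model of that silent failure mode (function name and numbers entirely made up, just to illustrate that nothing in the chain ever reports an error):

    # Toy model: the source quietly offers whatever the weakest link allows.
    # No exception, no warning -- the user just sees slow charging.
    def negotiated_watts(source_max_w, cable_emarker_amps, sink_request_w):
        if cable_emarker_amps is None:
            cable_limit_w = 20 * 3           # no readable e-marker: treat as a 3 A cable
        else:
            cable_limit_w = 20 * cable_emarker_amps
        return min(source_max_w, cable_limit_w, sink_request_w)

    print(negotiated_watts(100, None, 100))  # 60 -- silently less than expected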


>The physical form factor of the “Type-C” connector is great

At least for phone charging, I find it worse than Lightning. It’s way too loose (whereas Lightning is snug), and I’m always worried that the plastic bit sticking out on the female side is going to break.


Before I standardized on the cables below, I had USB-C cables in my laptop bag that:

- were power-only up to 100W, but couldn’t do data.

- could do power up to 60W and data at USB 2 speeds, but couldn’t do video over USB-C. I have a USB-C powered portable display that can do power and video directly from my laptop.

- could do power, video, and data, but I’m still not sure how much data and power that one can deliver.

- were smaller cables that came with my Beats headphones and my Anker battery; I don’t know what they can do.

I finally threw away all of my cables besides my MagSafe cable and standardized on these for mobile devices (cheaper and not as thick).

https://a.co/d/7e3gH9u

But these are truly “universal”.

https://www.amazon.com/dp/B093YVRHMB


[flagged]


Nah, HN very much downvotes posts showing that maybe a company that controls the entire ecosystem may give users a better experience than a “consensus” of industry experts with an “open” standard.


And bonus downvotes if that standard isn't yet another UNIX clone.



