Why aren't motherboards mostly USB-C by now? (2021) (greenspun.com)
215 points by cute_boi 11 months ago | 496 comments



My experience has shown that USB C connectors are far more fragile than A, and every time I plug one in, for some reason the fit just feels vague and unsatisfying. An A plug, when you get it the right way, slides in solidly. As others have mentioned, C seems to have been designed to be the smallest possible, which makes sense for mobile devices, but not for a desktop and its peripherals.


Interestingly enough, they're specced the opposite way.

> Standard USB has a minimum rated lifetime of 1,500 cycles of insertion and removal, the Mini-USB receptacle increased this to 5,000 cycles, and the newer Micro-USB and USB-C receptacles are both designed for a minimum rated lifetime of 10,000 cycles of insertion and removal. [1]

> To accomplish this, a locking device was added and the leaf-spring was moved from the jack to the plug, so that the most-stressed part is on the cable side of the connection. This change was made so that the connector on the less expensive cable would bear the most wear. [1]

I suspect that would make it the responsibility of the cable, not the receptacle, to give you that satisfying feedback. Which is good, because the cable is much cheaper and easier to replace than the receptacle.

[1] https://en.wikipedia.org/wiki/USB_hardware


The insertion cycles are only for proper insertion. Like when you are looking at it and doing it at a correct angle.

Also, rated lifetime does not take into account things like tugging the cable to the side of your PC because your foot got tangled in it.

USB C is extremely fragile due to its size and not much can be done to help it.

USB-C may well be designed to withstand more insertion cycles. After all, it is meant to be used for devices like phones that are plugged in and out constantly. Part of this improvement is just more experience creating connectors (yes, connectors have gotten much better over the past few decades).

A cable connected to the back of a PC is not being constantly plugged in and out. Nobody I know is plugging in their keyboard or mouse multiple times a day.

--

When I design my boards, I choose connectors that match the size of the board. I don't put extremely tiny connectors on a very large and massive project. I want a connector that isn't too fragile for the project, and that means matching it to the forces you can expect.

We humans use different forces around large objects than around small ones.

A large PC standing on your desk means the forces used around it will potentially be larger than the forces around a small appliance like a phone. If you tug on a cable that is hooked up to a phone, only a small force is needed to move the phone. And we are usually more careful around a small appliance like a phone. If you tug on a cable that is hooked up to a PC, a much larger force is needed before you move the PC. And people are less careful and use larger forces when doing things around a PC -- a larger object than a phone.


Funny you say that - I can tell from family experience that USB type A ports are quite fragile to a vertical tug, as it breaks the internal plastic bar at the base, so you're left with a hole full of floating copper pins. It of course requires more than a light tug, but less than you'd think.

The USB-C housing is much more rigid than that of type A, so it should handle this much better. The thinner bar might also handle deflection better by virtue of flexibility.

Also, hardware engineers know that ports are operated by drunk gorillas. Incorrect usage and odd forces are considered when designing a physical interface.


Can't comment on USB-C yet, and obviously this is a sample size of one, but on my desktop tower (Fractal Design 6, so it should be quality) that I've had for circa 5 years, 2 out of the 4 front USB-A connectors, which are used most often, are already dead (I'd say maybe 500 insertions altogether, definitely not more). 1 is sort of in between, sometimes works, sometimes doesn't.

I generally know very well how to take care of electronics and those ports never experienced that dreaded 'vertical tug' you describe, at least to my knowledge.


In my two previous laptops I insisted on having four USB ports because I anticipated that they would wear out eventually and wear out they did - electrically they work, but mechanically half of them barely hold the plug in.

Meanwhile I have a USB-C to USB-A multi-dongle to which I have connected my mouse and keyboard and I move the whole thing between the work and private laptop on a daily basis and despite that after three years only the plug part is showing signs of wear. The socket is fine.

Same goes for the charge port in my phone. Still works as advertised even though I went through a few cables in those six years since I bought it.


I like this post. No trolling: Did you report the issue to Fractal? I am sure they wish to hear your valuable feedback. They might also offer to replace your case with the latest version!

P.S. I am not a shill for Fractal, but a faithful owner.


Thank you for the advice; I guess I'm too lazy to deal with the logistics and customs (living in Switzerland, some stuff that's trivial in the US is... not so trivial in more diverse Europe). On top of not really having a working desktop for who knows how long.

Plus I'm planning to build a new desktop soon, and I'll still probably go with them (the Define 7 is my pick so far). I think I just had bad luck, since they have quite a solid reputation and most other cases are not to my taste or needs.


> Fractal Design 6, so should be quality

Side note: I too wish this brand guaranteed quality. I have two Fractal Design cases, a Define R6 and an Era. The former is quite good, the latter is crap - panels don't fit. Btw, they cost the same (fucking expensive).


I really wish we’d end up with something like the old MagSafe.


MagSafe was quite bad, tbh. Exposed contacts that easily get contaminated, and magnets attracting nearby particles to degrade the connection. At the same time, the disconnect force in the vertical direction is far too small (place the laptop on something soft and the connector tilts), while in the horizontal and perpendicular directions it's high enough to still yank the laptop.

I do not know if the new MagSafe is an improvement over V1 and v2, but solid connections or non-contact charging are the way to go.


What is your opinion on magnetic usb c adapters? For about 10€ you get 140W charging and 40Gbps.

Compared to magsafe it is not recessed (though that has not been an issue in my experience), but on the plus side it also does data, video and is universal. Currently using the same type of adapter for my laptop, phone and tablet with the ends on various chargers and docking stations.


I had a number of them from two or three different brands.

They had a tendency to cause sparking, and I lost at least one phone due to it. The phone was fine, but it lost the ability to be charged through the USB port. And because it did not have wireless charging, when the battery ran out it became an expensive brick.


Yup, I lost a chromebook that way. I think magnetic connections are fine for 5V power with no data line, but as soon as you add more connections and try increasing voltage/power you are asking for trouble.


My deep fryer has a magnetic connection for the power cord. That's high voltage and wattage, and works just fine.


Because the only two pins are hot and neutral, which are designed to have high voltage across them.

Put 20 V Vbus on a CC line and see what happens.


That seems more likely to be because that design was bad rather than that the concept is bad.


Yikes. Haven't had an issue yet with the ones I use. Good to know.


can't it be repaired?


maybe we can get that for USB-D in 2025 after the Apple magsafe patent runs out


Microsoft had a magnetic connector on the Surface laptops, so I don't think patents are standing in the way.


Even before that, magnetic power connectors were common on kitchen appliances like kettles and fryers.


Can we just stop changing things up in the USB world for a while? USB-C causes me more headaches than the prior USB incarnations do, but changing these things is itself painful enough that I'd rather just put up with what we have now than change yet again.


Once you accidentally insert the USB-C cable into the wrong port and short out your mobo, you start inserting it properly and start fearing it :/


> yes, connectors got much better in the past decades

Really? How are, e.g., HDMI connectors "better" than VGA connectors? VGA connectors seem far more robust to me. Same with D-sub mouse connectors.

I guess the signal rating is better now but the old ones didn't need high frequency capabilities.


VGA connectors were incredibly irritating to deal with, especially since so many of them were equipped with smooth thumbscrews with very little area to grip with your fingers. I for one don't miss this.


If you wanted a quick-insert connection, you could just not tighten the screws. I almost never tightened them, but it was good on monitors which pivoted.

I am writing this on a screen where the HDMI connector pops out if I even touch it.


> I am writing this on a screen where the HDMI connector pops out if I even touch it.

Just one more reason to switch to DisplayPort. The only downside of DisplayPort compared to VGA is that whether it locks depends on the connector model, not on user input; but other than that, DisplayPort seems clearly superior to VGA. Apart from the obvious signal improvements, the slightly more compact design and the recessed port, even a non-locking cable sits far more securely than an unsecured VGA cable.


If I had to choose between a disconnection and damaged circuitry… I managed to pull too strongly on a DisplayPort cable plugged into a monitor, resulting in a broken PCB. Luckily, the HDMI wasn't affected.


Just tighten one of them, it's what I do half the time.


And the other half of the time you tighten the other one?


Yes :P. I’m a coward and never put it in without tightening at least one or both


I drilled two holes in my laptop as it had no room for VGA cable screws :D (about 2y ago)


Honestly, I have never tightened those screws.


Some of them actually had an end slot that could be tightened / untightened with a regular screwdriver. It took me ages to realise this.


In the 1980s and presumably before, plugs like this often only had screws you needed a screwdriver to tighten.

I don't know if this was to prevent people disconnecting things, or if the expectation was that once connected, things should stay that way for years.


It helped me a lot when I had a colleague who had strong workman's hands and who could not be found at the end of presentations where he had set up the projector's cabling.


You never had one of the pins bend slightly so the connector wouldn't fit anymore and you had to manually try to bend it back?


Sure. But if I remember correctly the cable ends are both male, so it was never the device that got bent. In some sense weakness on the right part is a feature.


every meeting room had a long semi-permanent cable wired through the walls / floors. bent pins on those were not fun.


Once you’ve been involved in repairing them, you start to insist on the sane approach - cable in the wall, connected to a faceplate which has a plug on the back and a plug on the front. That way any damage can be repaired without re-running the cable.


VGA, DVI and HDMI all have their major issues; DisplayPort is the clear winner in all aspects.


I bent a few pins in a VGA connector due to misalignment once. This was when the cable was hardwired into the monitor!


Yeah, maybe I am just nostalgic. Hard to tell.


You may only have dealt with your own devices. Yes, VGA devices were much harder to accidentally unplug or knock loose than HDMI. But they were often far more solidly connected than e.g. the mount on a video card, or the attachment of a laptop motherboard to its case. So you would find components which had been messed up internally by the use of a VGA connector as a leverage point.


Yep, being harder to accidentally unplug is a large negative for a generic cable. There are very few applications where it's a good thing, and connecting a computer to a display isn't one of them. IMO, DP in particular shouldn't have the locking mechanism.

But becoming loose without unplugging isn't good either.


I have never bent the pins on HDMI connectors. It doesn't even occur to me that it could happen. I have bent dozens of VGA connectors. At least you can bend them back.


I once bent the USB-C connector on a Dell dock just because the cable is stiff and the bend from going below my desk put too much force on the connector. It still works, but has an angle now. If the cable wasn't permanently attached, having the wear parts on that side would be fine. The EU should definitely mandate that; I don't see any technical reason for soldering one side of the cable.


On the WD19/22TB it's actually user-replaceable/upgradable, but parts aren't necessarily easy to find.


I've ruined many USB-C connectors like this, specifically Dell dock connectors.


There are many times I've tried to insert USB-A the wrong way, since it's rectangular but only has one correct orientation, and sometimes the fit is tight and sometimes it's hard to see the socket (behind a PC, under a monitor, etc...)

USB-C doesn't have this issue. Much better


> Nobody I know is plugging in their keyboard or mouse multiple times a day.

If you securely store a laptop at the end of each workday then you're up to 1 cycle per day minimum. It will be more than that if you're required to secure the laptop during coffee breaks and/or if you work from several places in a work-day.


The quote is specifically talking about desktops not mobile use cases like laptops


  > tugging the cable to the side of your PC because your foot got tangled in it.
  > USB C is extremely fragile due to its size and not much can be done to help it.
Zip tie the cable to the case?


    The insertion cycles are only for proper insertion.
Every time I see comments like this I think of the COVID-19 trolls who insisted that masks were only helpful if "worn correctly".

Proper? Please. Why not just say that the connector suffers from poor design and is difficult to plug in correctly. Everything else is humiliating the average user, like my parents, who have advanced degrees but are not in STEM.


Minimum is just that, minimum.

Have you ever seen a broken standard USB port or connector? Me not once in my entire life, unlike micro-USB or USB-C.


I surely have. Any even semi-public USB power socket is hanging on by a thread, some of them have the inside paddles snapped off entirely. I haven't seen a Type C one in such a state yet, but I eagerly await it.


The problem with USB-C (but also mini and micro) is the weak connector on the device itself. The way it's connected to the board is way weaker than a large USB-A connector on a motherboard. Accidentally pulling on an attached keyboard cable will for sure destroy a USB-C port. A USB-A will most likely survive.


Yup, bricked an old GoPro with a mini connector by tearing it off the circuit board. Didn't think that could happen, so I wasn't particularly careful, but it's physics: small things break easier.


Simply put your money where your mouth is.

Framework laptops come with expansion cards that are on sturdy rails and serve as the world-facing USB-C/A/HDMI/SD card/whatever connection. The inner connector on the mainboard is well protected.

https://frame.work/assets/pdp/expansion-card-9240577483dc9e2...


That's going to depend on the connector. Plenty have through-hole mounts. You can also get surface-mount type A connectors.


They all do. Yet it still doesn't help. Those 4 little retaining lugs are fixed into the board only by solder, which is fragile and will crack after a while, esp. when the connector is mated/demated constantly and forces are applied to it.

USB A has those lugs much larger and usually also bent under the board or twisted, so the connector isn't fixed into the PCB only by solder.

USB C and micro USB are mostly designed for clamshell enclosures such as a typical phone - in that case the connector shell is supported by the enclosure itself (ideally through some flexible padding) and not only by the flimsy contacts/solder lugs.

However, there are plenty of devices on the market where these connectors are not supported like that because the designer has been an idiot - and guess which devices end up in the landfill with broken off connectors the most.

Breaking the old school USB A, square USB B or mini USB like this is almost impossible unless one applies a really unreasonable amount of force. I have only ever managed to accidentally destroy a USB A port once like that, by accidentally hitting a USB stick hanging out of it - but the board and the adjacent ports were fine; only the port shell and contacts were destroyed.


Mini USB-B is still not very robust in terms of connection to the board. I’ve got a couple devices here where the port is ripped off the PCB. (I voluntarily [attempt to] repair devices for friends, so I see more than a typical rate of failed parts.)

By some definition, that’s evidence of unreasonable force, but it definitely happens and should be anticipated when you put the device in the hands of lots of gorillas.


> Those 4 little retaining lugs are fixed into the board only by solder,

You're saying this as though there's a single design of USB-C cable to board connector. There are, in fact, many different designs, including many with "lugs" which go through the board, as well as mid-mount connectors which use the material of the PCB as structural support.


To be fair, I haven't seen a USB C power port available in public once so far myself, and while it would be nifty for them to become the norm over USB-A, I really think it's a bad idea to subject users to unknown USB charging ports in public. A wall outlet can't pretend to be a keyboard, for instance.


Never seen a broken public A port, but maybe it was because the A port was already there for a decade and got good wear while public C ones are relatively new? And I think many people still use A to C cables on the go because those ports are way more common, so they just get a lot more wear.


The terminal seating at LAX has both 110 VAC plugs and USB-A plugs where the cable falls out as soon as you let go of it.


My laptop has an opinion about which side is up on USB-C and doesn't connect the other way around (the orientation docking stations use).

I basically have to replace my laptop to use my docking station just because usb c isn't sturdy enough :/


Did the USB-C port ever work that way? Extra electronics are needed to support flipping the high-speed connections like that. Maybe the laptop never had that?


I bought it used and only realized it later, but yes it's supposed to work on both sides.


I know the repair place I worked at sees a lot of broken USB-C ports. Way more than they ever had A. They hate the new, smaller ports because they're usually built very flimsily unless it's Apple.


As the go-to electronics guy, I have seen broken USB-A but not broken USB-C (yet). USB-C is sturdy; the weakness is either going to be the cable or the bonding of the copper with the PCB. Both are basically a matter of how much money the manufacturers are willing to spend.


My experiences differ. I've had plenty of USB-C devices of which the port becomes very loose after a while of normal use. I'm glad my current phone has wireless charging, not because of its convenience but because that way I don't wear out the USB-C port.

USB-A is so simple, two holes and some leaf springs make for a satisfying clunk that doesn't wear out so quickly.


> I've had plenty of USB-C devices of which the port becomes very loose after a while of normal use.

I used to think the same thing, until I looked into how the USB-C plug actually operates: The movable spring parts are all in the cable, so if there's anything wearing out, it's usually that.

Clean the plug/port (lint collecting in the back of the port prevents a proper connection and makes the haptic feedback feel off), possibly swap out the cable if the springs in it are really worn out, and things are as good as new.

> some leaf springs make for a satisfying clunk that doesn't wear out so quickly.

But once they do, you have a broken USB port.


> so if there's anything wearing out, it's usually that.

And yet, very nearly all of the USB-C connector breakage that I've experienced has been on the socket side, not on the cable.


Are you sure there isn’t just dirt in the port? Or a bad cable? I’ve consistently had luck cleaning gunk out of USB C ports and having them work like new.


My experience was that most of the time it was the cable, followed by dirt in the port.

Now PCB quality can also wildly differ and the bonding between substrate and pads can be a weak point if the manufacturer of the device went for the cheapest possible PCB. But for that you have through-hole USB-C connectors and cables that should break before the connector.

So if your USB-C connector breaks off the PCB that is on the manufacturer of the device.


I've seen many.

In more rental cars than not, the USB-A port is somewhat worn out, which is really annoying as it can make CarPlay connections quite fragile.

I've also seen many USB-A ports in airplanes not being able to hold the cable anymore due to the spring contacts being worn out or having been bent out of shape.


> Have you ever seen a broken standard USB port or connector? Me not once in my entire life, unlike micro-USB or USB-C.

USB-A has failed just as often as Micro-USB in my experience. The only connector I haven't seen break is USB-B, but that could be due to fewer samples, since only printers seem to use that connector, and how often do you unplug/plug a printer?


Yeah, but 10,000 connections is WAY past any reasonable motherboard lifespan - that's one insertion a day for over 27 years. Meaning the minimum for USB-C is ridiculously over-engineered for any motherboard.


As the user above points out, this is ideal insertions, and not necessarily representative of the kinds of situations it is likely to encounter.


> Yeah but 10,000 connections is WAY past any reasonable motherboard lifespan.

I just don't believe that those connectors last that long in the real world. My personal experience and observations indicate that they don't. Or, at least, they're much less robust than earlier USB connector designs.


i most definitely have seen more broken A ports in my life than I’ve seen broken C ports. I still remember being mystified by my elementary school janitor bending one back into shape(-ish) back when i was 6 or something lol


Absolutely, the two on the front of my HAF 932 were the first things to break, in fact.

Funny enough, I've never seen a micro or C break :D


My USB-C's sometimes fail. The slot is not firm, and loosens over time.


yeah, I had to replace the front ports on my current PC case because the center part of the plugs came out...


Is the solution to have easily-replaceable USB sockets?


Or standardizing USB-XLR.

People would complain about the price of the cables, the size of the cables, and the size of the devices.

Small is fragile, and fragile should be replaceable.


With an XLR connector if you pull you will rip the motherboard apart.


you're lucky usb-c ports are failing constantly


never had a broken USB-A, but definitely owned one and seen another USB C in which the tiny board in the middle of the socket broke off


Micro-USB had on paper better insertion/removal cycles than Mini-USB, but in practice Micro-USB connectors were flimsy garbage and Mini-USB fairly robust for the size.


Not exactly flimsy, but they often have super high friction, translating into some seriously rough handling.


And one messed up cable could force them enough so that they break.

USB-C solved this by demanding thick walls on every side of the cable's connector.


> To accomplish this, a locking device was added and the leaf-spring was moved from the jack to the plug, so that the most-stressed part is on the cable side of the connection. This change was made so that the connector on the less expensive cable would bear the most wear.

Not arguing that this isn't a good idea, but considering that the A port looks gigantic compared to C it seems those springs being a couple times larger also makes them a lot more durable, to the point where it's a non-issue.


Maybe they designed it this way, but in practice USB-C ports are getting destroyed over time, not the cables. On my two Samsung phones the ports started degrading after 2 years of use, and after 3-4 years they are trash, barely connecting to any cable without a fix. The same cables connected to less-used hosts, like a new Kindle or a new power bank, connect snugly and hold fine. I feel that the USB-C port design is just garbage, or maybe it pushes materials too far.

And I've never, ever seen a USB-A port break, or the cable connector part for that matter.


Personally I think these numbers are fabricated horseshit, drafted by some engineer behind a desk running endless simulations.

The USB industrial complex made the same claims with micro-USB vs mini-USB, and is now doing it again with Type-C.

My empirical evidence: I've now had to replace two laptop motherboards due to a (mechanically) failed USB-C port.

Guess what I haven't had to do in 25 years? Replace a failed USB-A socket. Ever.

USB-C connectors are fragile and companies are making more and more large, heavy cables with huge connectors especially common with Thunderbolt or port replicator-type devices. It is a recipe for disaster and they expect us to fall for this USB Jedi Mind Trick again.


The connector lifetimes are tested by some manufacturers. When I was working on USB (electrical) testing at Microsoft we had test jigs for testing connectors to failure and I can vouch for Microsoft's connectors having lifetimes exceeding 25,000 insertions.

We would also test how many miles a mouse would travel before the glides would wear out.


> we had test jigs for testing connectors to failure [...] 25,000 insertions

That's likely to be the exact reason why the reliability specs don't line up with reality: your test jigs are almost certainly going to be inserting the plugs directly with exact precision, whereas in practice people are going to just "jam it in" and apply significant side-loads as well as misalignment and levering forces.

It's not hard to design a connector that'll last 25k precisely aligned insertions by a test jig but will consistently fail after a few cycles of being mated by typical "hamfisted" users.


>That's likely to be the exact reason why the reliability specs don't line up with reality: your test jigs are almost certainly going to be inserting the plugs directly with exact precision, whereas in practice people are going to just "jam it in" and apply significant side-loads as well as misalignment and levering forces.

as someone who was involved with endurance/reliability testing of other physical products, I think you'd be surprised.

Randomized jitter and movement patterns are often replicated and considered.


I agree with the parent's comment. I also worked with automated test setups and did human tests. Nothing compares to a human tester. We tested small remote controls and it was amazing to see that a human hand can inflict more damage on the product than an automated test setup can. (The remote was given to different people just to push the buttons until we reached the desired number of button presses.)


> your test jigs are almost certainly

Why assume that? Why not be humble and just ask the person with first hand knowledge instead of basically attacking their setup based on guessing?


Because you can search for what connector testing machines look like, and they all look like they'll just destroy the connector if it isn't aligned correctly.

https://www.youtube.com/watch?v=XdIdEbr9wQQ

https://www.youtube.com/watch?v=8GkDnQlesUM


This is true. A test system will only apply force from one or more predefined directions. A user, on the other hand, will not take care to insert the connector at 45 degrees +/- 3 degrees.


I guess most damage actually does not happen during insertion itself, but when someone trips over the cable while the plug is in the socket. Simply because of its bigger size, that force is applied to a bigger grip surface on USB-A, and is thus lower per square millimeter, while the forces will be much bigger with C.


This is like Samsung testing their foldable phone... in a clean room.


The fragility comes from how easy it is to damage them by handling them incorrectly. The fact that you can insert them properly 25000 times without damage doesn't change this.


How can a large usb A port be weaker than a tiny usb c port? If you attach a keyboard and throw it out of the window the usb A port will remain ok whereas the usb c port instantly gets damaged.


> How can a large usb A port be weaker than a tiny usb c port?

Because for USB-A, the part that wears down is in the port, while for USB-C it's in the plug.

> If you attach a keyboard and throw it out of the window the usb A port will remain ok whereas the usb c port instantly gets damaged.

Have you tried that yourself, a statistically significant number of times?


> Because for USB-A, the part that wears down is in the port, while for USB-C it's in the plug.

I've had a number of USB-C ports break; I've never had a USB-A port break.


Of course you want the port to be weaker than the board it is attached to, so that the port breaks rather than the board.


My phone is 4 years old and its type-C port has been completely destroyed for the last year; it barely connects (and the cables are good, verified on newer hosts). Even if we assume I've connected it 2 times per day every day, that would mean about 2000-3000 insertions tops. This is an order of magnitude worse than that fantasy 25k number. Another phone with 1000-2000 insertions over its lifetime is also showing port degradation.

And I charge my phones in 99% of cases at home, stationary and undisturbed.


Then the tests must be faulty. At least, they don't reflect the real-world experience of me and my friends. USB-C has a much higher failure rate for us.


> My empirical evidence: I've now had to replace two laptop motherboards due to a (mechanically) failed USB-C port.

I have had laptops with only USB-C ports since 2015 and not a single one failed. I also don't know anyone who had a broken USB-C port. There is probably variance between manufacturers.


And variance between users, some people grab the cable and yank it to remove the plug from the socket, for example.


> Guess what I haven't had to do in 25 years? Replace a failed USB-A socket. Ever.

USB-A connectors are robust enough that you can't really pull them out by the cable. You have to pull the connector, which means you always pull them straight out. USB-C cables are smaller so some people yank the wire part of the cable instead of the connector, leading to a shearing force on the socket, which breaks it over time.

Broken connector ports are pretty much always user error. If you're a bit more careful to pull things out straight the port won't fail for years.


The shearing force is often exerted by the cable itself, with gravity doing the work. Some USB-C cables are so thick, and the connector so large to support the heavy shielded cable, that it hangs bent at an angle when plugged in.


> Broken connector ports are pretty much always user error.

After using GPIB connectors (with huge cables attached to them) for some time, I tend to disagree.


Exactly. There is no way a significantly smaller connector with way more pins can be as robust as a large one with fewer pins. It is 100% horseshit.

My anecdote to add, but this time comparing USB-C power delivery with the lowly ~8mm barrel jack. I recently got a laptop shipped to me by a remote client to use to connect to their network. I pull it out of the box, set it up, and notice the USB-C power cable has the slightest of kinks...

Of course it wouldn't charge. I managed to manipulate the cable, holding it and pressing at various angles, long enough to use the device until a replacement PSU was delivered. A barrel jack wouldn't care in the slightest about such a kink. And if it didn't work I could get another barrel jack from an electronics parts store the same day, chop off the old one and solder the new one on. Good luck doing that with USB-C.

BTW, let me also rant here on the (lack of) USB-C PD compatibility even between devices made by the same manufacturer at the same time.

As the above was happening, I happened to have another Dell XPS laptop sitting on my desk with another USB-C power supply. I thought to myself, I'll just use one PSU for both, right? But before I tried, my innate cynicism made me think "surely they would not make it so convenient for us", so I googled whether this was a good idea. Turns out no. The PSU I had was 90W while the other laptop required 120W. I read stories of it simply not working, but also of broken PSUs. I decided to try anyway, and the laptop that needs a 120W PSU wouldn't even boot with the 90W PSU. I thought maybe it would throttle down the components. Nope, a BIOS message said: unplug it or else.

So yes, I agree the magnificent claims of USB-C resolving all sorts of problems are all horseshit. The only real problem it resolved is that the connector is reversible. Also, the ideas of PD and sending eDP over it were good, just implemented poorly.


> And if it didn't work I could get another barrel jack from an electronic parts store the same day, chop off the old one and solder the new one on. Good luck doing that with usb-c.

Most consumers can't solder, and I wouldn't even know where to find an electronic parts store. I can buy a (probably bad, but better than nothing in a pinch) USB-C cable at a convenience store on almost every street corner.

I'm more than happy to never have to deal with barrel jacks anymore. The inevitable drawer full of incompatible (at best) or physically compatible, but electrically mismatching barrel jack chargers has plagued too many households.

> I decided to try anyway and the laptop that needs 120W psu wouldn't even boot with the 90w PSU. I thought maybe it would throttle down the components. Nope, a bios message said, unplug it or else.

Seems like Dell does not care a lot about compatibility then. This is entirely allowed by the spec for good reasons: The alternative would be more barrel jacks, and at least I can use a Dell charger for my phone or laptop.

It's definitely not the norm – my laptops charge even from a 5W USB-C charger (albeit very, very slowly).

> So yes, I agree the magnificent claims of usb c resolving all sorts of problems are all horseshit.

I can definitely see how it doesn't solve all the problems all people have had with connectors, but it's solved all sorts of problems at least for me. I really like it.


>Most consumers can't solder, and I wouldn't even know where to find an electronic parts store. I can buy a (probably bad, but better than nothing in a pinch) USB-C

But these PSUs haven't got any receptacle you can plug a replacement USB-C cable into. On the brick side, the cable goes straight into the brick. That's it.

Fair point about soldering and the general rarity of electronics parts stores. If things had remained as repairable as they were, let's say, 30 years ago, you wouldn't need to know how to solder, but you would have an electronics store within a short driving distance where they'd solder the barrel jack on for you.

A common argument is "but everything is miniaturised now", to which I say: so what? Microscopes and BGA rework equipment have been available even to hobbyists for quite some time.


Since 2018 I've charged, operated, and booted a variety of USB-C devices over a variety of USB-C PD power sources.

e.g. MacBooks of the 60W and 90W variety I've operated on 90, 60, 45, 30, 20, 15, and 5W chargers, car adapters, power banks, displays, powered hubs, or other USB PD devices. Of course on 5W they can't charge while in use (even idling, where they ~maintain charge), and booting requires them to have enough of a battery charge to support the boot power surge, but it has always worked: PD has always negotiated to the best power the charger could do. IOW, when manufacturers do their job, it works.

> A barrel jack wouldn't care in the slightest about such kink.

Oh boy, I would disagree. I've had decades of terrible experience with barrel jacks, many of whose faults were completely invisible to the eye. I would not be surprised if that small kink hid much worse internal damage or a defect. If it failed to negotiate power, that's because of an abundance of caution, and I'd rather have that than a fire hazard silently being accepted and internally arcing under load.

> The PSU I had was 90W while the other laptop required 120W.

What would be the non-USB-PD alternative? Proprietary PSUs, with one of: not interchangeable because of a different plug; the same plug but the most power-hungry computer still refusing to boot because of some proprietary charger ID protocol or hardware; or the same plug but a different voltage, which either doesn't work or fries the computer's power circuit (I've had all of those back in the day).

So your current USB-C situation isn't any worse than it was before USB PD existed. In fact it's better WRT failure modes because USB-C PD compliance means it will negotiate a safe voltage (including taking the cable into account). Of course if you plug in a non-compliant charger all bets are off but that's no worse than before when there was no power delivery standard whatsoever.

That said, as demonstrated by the MacBook example, a manufacturer doing their PD job would not lock the computer out at the BIOS level - instead waiting to have a safe charge on the battery - a kind of locking that also happened back in the day of proprietary plugs, even among the same manufacturer's devices.

Therefore I do not see any reason to blame that on USB PD "horseshit" instead of that particular manufacturer.
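To make the "negotiated to the best power the charger could do" part concrete, here is a deliberately simplified sketch (nothing like the real PD state machine, and the offers and wattages are made up for illustration): the source advertises fixed-voltage offers, the sink picks the best one it can use, and whether it refuses outright (like that Dell BIOS) or accepts a weaker offer and charges slowly (like the MacBooks above) is a firmware policy choice, not something the connector forces.

    # Simplified sketch, not the real USB PD protocol: the source (charger)
    # advertises fixed-supply offers as (volts, max_amps) pairs and the sink
    # (laptop) picks one. Refusing a weaker-than-ideal offer vs. accepting it
    # and charging slowly is a firmware policy decision.
    def pick_offer(source_offers, min_watts_to_run):
        best_v, best_a = max(source_offers, key=lambda o: o[0] * o[1])
        if best_v * best_a < min_watts_to_run:
            return None                 # refuse, like the "unplug it" BIOS message
        return (best_v, best_a)         # request this contract from the source

    offers_90w = [(5, 3), (9, 3), (15, 3), (20, 4.5)]    # a hypothetical 90 W brick
    print(pick_offer(offers_90w, min_watts_to_run=120))  # None
    print(pick_offer(offers_90w, min_watts_to_run=60))   # (20, 4.5)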


Just use the 120W on both…

Of course something that needs 120W isn't going to be able to be powered by a 90W supply.

This is not a problem. Frankly it sounds like you don’t understand how electricity works.

I use the same USB-C GaN charger for all of my devices including MacBook Pro and Sony A7RIV. It’s 200W…

Just buy a big enough charger and you’re good.


> Of course something that needs 120W isn't going to be able to be powered by a 90W supply.

That might be true for desktops (and even these could very well throttle their TDP based on available power), but most laptops I know can actually just supply the difference from their battery.

I personally just wouldn't get a laptop that can't run on a much more compact lower power wall adapter. Being able to use a tiny 20W power brick in a pinch is great.


In my experience, it's entirely manufacturer-dependent. Comparing Apple, Dell and Lenovo, the Dell and the Apple machines (MacBook Pros and XPS 13) have had no problems. Both feel as good as the day they were purchased. The Lenovo (ThinkPad P14s/x280) had connectors falling out after around six months - which is particularly annoying given that they charge over USB C. Overall, I've been massively disappointed by the build quality of the more recent ThinkPads, especially since they get a lot of individuals espousing their benefits here, especially over the MacBooks.


I once worked in an office where virtually everyone (30ish people) had failed USB-C ports on their MacBooks. This was around 2017. I don’t know if Apple fixed this, out of fear I baby the ports on my 2018 MBP, and they still click like brand new. So I do know that when treated with extreme care they will last. Just feels silly though.

I’m definitely one of the few people lamenting Apple’s forced hand in moving to C on the iPhone. Lightning is incredible. It takes so much abuse and yet still fits like the day I bought the phone. Not looking forward to having to baby my phone’s port as well when I upgrade.


Apple had the opportunity to make Lightning a standard. They could have dropped the royalties and enabled USB3/4 speeds on all their devices YEARS ago. They did neither and, predictably, an open standard came along and took over.

I should add that it's not a perfect connector. Every Lightning cable I've ever owned has eventually developed corrosion on the male connector, leading to a spotty connection with the device. I've never had that problem with USB-A or C. Probably because the pins aren't exposed, so they aren't ever touched by my sweaty/oily/acid fingers. That said, it's usually a quick fix with some rubbing alcohol.


My USB C connector failed on my Dell 5520 about 2 or 3 years in. None of the USB-A ports have failed yet, and that computer is pushing 6 years old now.

USB C is, in my experience, much more sensitive than USB-A to real-world usage.


These are requirements. There is a looong way from requirements to final product.


I think it's a bit ironic that the cable is considered more costly than the receptacle, which is generally true if you consider the device as a whole, but now we have multiple hundreds of dollars thunderbolt cables and the cap on the actual receptacle component is below $10 even for ones with support for higher speeds.

But then again, most people don't have the SMD rework tooling to be able to easily replace receptacles


> now we have multiple hundreds of dollars thunderbolt cables

This is mostly an exaggeration.

Apple has a $160 3m 40Gbps cable but Cable Matters has the same for $70 (and Trebleet has a 2.5m one for $50).

The 25-foot 40Gbps cable from Pure Fi is the only one I can think of that fits the "multiple hundreds" description at $200, but then again it's an active optical cable (it's not compatible with plain USB).

> the cap on the actual receptacle component is below $10 even for ones with support for higher speeds.

Maybe the actual C connector they started with is below $10 but you are looking at rather expensive redriver chips inside and careful engineering to care for the signal integrity at 40gbps.


It is rather less than 1 USD even on LCSC, where anyone can buy it.


It is probably easier to have two cables and replace a broken one than to replace your USB connector (or the lifted pads) in case of failure.

A broken cable means the cable will be replaced, a broken connector might mean the device is thrown out.

Now I am a person with the skill and tools to replace such a connector or patch up the PCB in most cases, but I still think I'd rather replace the cable. The expensive thing isn't the part. It is my time.


Paying multiple hundreds of dollars is a choice, not a necessity. An active 40 Gbps, 240W cable is around $50 these days. I doubt that most repair shops would even give you a repair estimate for less than that.

> But then again, most people don't have the SMD rework tooling to be able to easily replace receptacles

And that's exactly why a reasonable design assumes that buying a new cable will be the preferred option (over having to re-solder components on a device) by most consumers. It certainly is for me.


> Micro-USB and USB-C receptacles are both designed for a minimum rated lifetime of 10,000

The idea that micro-USB and USB-C are supposed to be equally durable is the wrongest thing I've read all day. Frankly, mini-USB is the single most delicate connector I've ever used. Other than a couple PDAs, I never owned a single mini-USB device whose receptacle didn't end up breaking off the board.


You’re mixing up mini-USB and micro-USB. Also, the board attachment is not strictly speaking part of the connector standard. Manufacturers are free to, and do, screw that up in myriad ways.


No, I'm not. The connector side of micro-USB is usually the failure point. I had mini-USB receptacles fail at an extremely high rate back when I was building and selling mini-USB 2.5 inch external hard drive enclosures back in high school. Micro-USB wasn't even invented until 2007.


If you’re not mixing them up, then I have no idea how to interpret your grandparent comment:

> The idea that micro-USB and USB-C are supposed to be equally durable is the wrongest thing I've read all day. Frankly, mini-USB is the single most delicate connector I've ever used. Other than a couple PDAs, I never owned a single mini-USB device whose receptacle didn't end up breaking off the board.

What do the second and third sentences have to do with the first sentence? They read like they’re supposed to provide the (otherwise completely missing) support for your statement that it’s the “wrongest thing you’ve read all day”, but they don’t.


Even if you are not mixing them up, your comment is confusing since you are intertwining commentary on both.

Splitting them into separate paragraphs with your assessment and experience on each would help the legibility.


Mini is quite fine; I think you are mixing it up with micro.


How many PDAs do you know with microUSB connectors? MicroUSB wasn't even introduced until 2007.


Same here. I've experienced flaky connections with several USB-C ports, and sometimes the port itself gets dislodged. I've had devices RMA'd because of it.

Nowadays I'm very careful when plugging in USB-C, doing my best to not pull on the cable, etc. It's overall a worse UX than with USB-A, and being reversible doesn't make up for it.

Then you have the mess of protocols and standards that use the connector, where I need to use specific cables for specific devices, and I'm never sure which one is which. The term "universal" in USB has completely lost its meaning.


Rather, the cable became universal and the protocols diverged. But I’m fairly sure the situation will be fixed by the adoption of USB-PD as the ultimate power standard, as all the other standards (QC and others) are including it in their own. And the remaining HDMI devices will be phased out in a decade or so.

Then you’ll need to only care about the wattage rating when charging devices. Seems like the future to me.


What about bandwidth? There are 10Gbps, 20Gbps and 40Gbps cables, along with "wattage" (really, amperage/current).

Both will continue to go up.


It's worth noting I suppose that newer USB power delivery standards are scaling voltage rather than amperage/current to deliver more power to the device.

The pins are specified for 1.25A each yielding 5A and I believe the 5A cable specification came at the same time as 20V with the 100W power delivery specification.

Every power increment since then has been a voltage rather than current increment.


Good point, I was aware that newer USB-C PD standard offers bigger power delivery (up to 240W, I think), but I didn't know it was achieved through increased voltage.

That means it goes up to 48V DC, so that's certainly curious, thanks!


I don't think this is any more than growing pains. Do you guys seriously misremember how terrible USB-A ports were? Now they're the gold standard, apparently.


I've never seen a USB-A port or cable connector break physically or become a loose fit. At most, wires inside cheap cables snapped, but that's unrelated to the port. On the other hand, USB-C port (host side, not the cable) quality is dogshit. All my type-C devices slowly degrade their ports over time. At the 3-4 year mark they are trashed.


At the risk of sounding a bit dumb: I shorted my motherboard with a USB-A plug. It felt like I got it right, I pushed it in, and at that point it shut down. Fortunately there didn't seem to be any permanent damage.


Happens to the best of us. I also had a bit of a scare like that recently. The only thing worse than a non-reversible connector is one that breaks (or fries your device) when you insert it the wrong way...


One issue with USB-C is that dust gets packed into the female side. I've found it can be in there so tight that even after attempting to clean it, the dust remains. But going in with a sewing needle, rubbing alcohol, and really scraping you can fix the port.


Yes! Every time my phone's had a problem charging, the issue has been cleared by doing exactly this.


I wouldn't do this on phones. Alcohol and the needle are conductive materials.


Connectors which have consistently shifting pressure applied, especially from side to side, are going to fail faster regardless of the design. It just so happens that the side that's most likely to be in this state is the side connected to the phone you're holding, the headphones you're wearing, etc. I have seen USB and Lightning ports fail on tons of different devices where the user is actively using the device while it is plugged in, even when they treat it gingerly and even without any identified sharp tugs or the like.

I've taken to using Volta plugs for all my USB devices at the device side to prolong the life of their ports, and it's great for unifying micro USB, Lightning, and USB-C onto one cable. Highly recommend them!

That said I agree USB-C connectors on desktop motherboards are particularly poorly made, and indeed make an unsatisfying connection.


I think that’s dependent on the port, there are some I’ve plugged into with a very satisfying click and it felt very secure. But you’re right the cheap ones are pretty vague.


I don't know. I can easily lift my phone by the USB-C cable. And as for the connectors, sure, USB-A usually has 4 through-hole mounting pins and through-hole live pins[^1], so the connection to the board is rock solid, but USB-C connectors, despite having surface-mounted live pins, tend to have shallow mounting pins as well[^2]. I find it pretty durable.

[^1]: https://jlcpcb.com/partdetail/ShouHan-10_0_4C6_3ZB/C6386909

[^2]: https://jlcpcb.com/partdetail/Dealon-USB_TYPE_C018/C2927038


I have the same experience. The USB-C on my phone is dying. Charging the phone has become a challenge. Yet on my 12-year-old computer, the front USB A ports are a bit loose, but still working properly.


100% agree with this. My 2.5-year-old phone no longer registers anything plugged into its USB-C port and I have to charge it wirelessly. My PC still has a few USB ports from 2009 that still work. (Although most are newer than that.)


We've had a problem with the cables provided with ThinkVision USB-C monitors. They're so horribly stiff that either end gets quite a lot of force exerted on it. We use them for hot-desk setups at work and I've seen multiple with the metal part of the plug broken off from the plastic shroud.

I've asked for some "Cable Matters" ones to be bought as that's a brand I recognise and _think_ is good, we'll see!


That mushy feeling has caused me to push it into the wrong port multiple times and caused my PC to turn off instantly.

I think the mushy feel is due to the I/O plate on the mobo.


I think it's just because the C cables tend to be newer and companies can get away with more disposable build quality now than when A was booming.


Yeah USB-C is horrendous.

Broke my light MacBook Air from the cable being so tightly connected that it got dragged off and dropped; the MacBook also had very short charger cables for a while.

And while being so tight, it often refuses to charge because the port is so good at collecting dust, and the connector has a long neck, so a small hit bends it and it stops working.


In a MacBook Pro build it certainly feels like this. I have an old 2015 MBP with A connectors that has been serving for almost ten years now. When I plug something in, it is tight and solid and feels right. On my latest MBP with C connectors, they are already loose after a year of service; I need to wiggle and shake the connector inside the port to get good contact and make peripherals work, which of course breaks it even more over time.


I recently moved my display over to running on USB-C, so I could have one cable to charge my laptop and feed the display. Plugging it in was a very unsatisfying experience. The port seemed misaligned, so I had to put a fair amount of effort into it, and once I got it, it felt like nothing. I was worried when I rotated the monitor back into position, or adjusted the height, that it would fall out. I haven’t thought about that in years with other ports. It works, and I like that it’s one cable, but the feel is awful.


Interesting. This is completely the opposite for me. I have bent a ton of USB-A cables over the years. That has never happened to me with USB-C; the connector seems indestructible to me.


they should make it as big as A, and reversible like C,

why do I have to think of such obvious solutions?


You’re a genius.

Of course, no one wants a big connector in their cell phone, so we’ll have to make a mini C. And then a micro C. And then they’ll make USB D to try to simplify all of these competing standards.

That said the form factor of type C is a little too small to be wieldy to me.


Those exist[1]. I reckon connector size is an issue on smaller devices, but there's no reason why USB-C couldn't be improved to make the connection more secure.

[1]: https://tripplite.eaton.com/main/search?q=reversible%20usb%2...


And this will help also with power delivery. As an EE, putting 100W in such a small connector gives me shivers.


No matter how much I baby USB-C connectors on phones, I eventually end up in a place where I'm wiggling the cable to get it to pick up that it's plugged in to charge - usually around the time i start looking for a new phone.

I'd like a Framework style swappable USB charging port already.


I got my HTC Vive out of the closet today and, as usual, I hate micro USB and love USB-C.

USB-C just fits.

Do you mean the real USB plug, the bigger one? Those depend on the device for me. The USB-A ports on my desktop case are okay; the ones at the back of my mainboard are way too tight.


> but not for a desktop and its peripherals.

This doesn't make sense to me. Why should it be bigger?


Because the bigger connector, if it's attached at its corners or along its whole length, can physically better withstand the shearing forces applied to it.


Not my experience at all. USB-A would very often break. I've never had a USB-C break.


That is a very good observation and I agree. I guess that can be fixed, and high-quality C connectors will provide this sensation; just look at the iPhone Lightning connector, which IMO provides something similar.


This is my experience as well. I have had substantially more problems with USB C connectors failing than with any of the prior USB types.


I notice a lot of people are arguing against USB-C, but in reality: devices are designed for the average consumer, not us special HN peeps. USB-C is the way forward; we can't use a bulky, almost industrial USB connection forever. Having a single type of connection has so many benefits, and if someone exceeds the plug/unplug rating or the socket breaks for some other reason... it can just be repaired.

Everyone arguing against c like they wanna go back to old style serial port connectors or something.


I like USB-C myself, but I don't think this is a good argument:

> if someone exceeds the plug/unplug rating or the socket breaks for some other reason...it can just be repaired.

...when consumer tech companies these days tend to make repair much more difficult than it used to be.


I’d honestly prefer usb a if it wasn’t for the fact that I have to try plugging it in four times before I succeed.


Seems fine on MacBooks


calling USB A “legacy” and a “relic” is so misplaced. we all still own USB A devices. I challenge the author to go find a wired keyboard or mouse that comes with a USB C plug. it’s possible, but not easy.

the thing is, we could just have both! this isn’t some winner/loser scenario. the real question should be why these computers with enough bandwidth for 11+ connectors don’t at least split the difference with 6/5 of each. gives device manufacturers enough confidence that customers will be able to plug in the thing they bought, and gives consumers the ability to plug in anything and everything they need. then in 5 years when things have advanced we can start talking about entirely consolidating on USB C.


> I challenge the author to go find a wired keyboard or mouse that comes with a USB C plug. it’s possible, but not easy.

My Moonlander Mk1 does; it has a C-to-C cable with an optional C-to-A adapter on one end. My mouse is type C, but it comes with a C-to-A cable.

But, obviously, this is a chicken-and-egg kind of problem. No one will build a cheap mass-market keyboard with a C-to-C cable for a market (desktop computer users) where type C connectors are nearly non-existent. Extra components (adapters or extra cables) won't be worth it. If the majority of desktops shift to having the majority of their ports be type C, you won't find a type A keyboard anymore. Just like you will have some difficulty finding a PS/2 keyboard today (sure, they exist, but they aren't exactly widespread anymore).


USB-A was released in 1996; it's older than, I suspect, the majority of HN readers at this point. An A-to-C cable is entirely passive, so it's not exactly a big deal to get a cheap adapter.

Not sure I'd rely on keyboards as the standard bearer for 'not a relic'; many of them still support PS/2 via a passive adapter. That was announced in 1987 and is older than me.


Comparing technology to human lifetimes doesn't seem very relevant. Even not counting broad categories like "fire" or "language", we're still using individual technologies from "long ago" like knives, forks, pen and paper, etc. It's fine to make improvements that keep backwards compatibility (more ergonomic scissors should still be able to cut the same paper), but changing a connector just because it's old is unwise.

The issue with type C is that it's trying to accommodate the entire spectrum of applications, from very high bandwidth to very low cost. The range of possibilities has increased greatly since the Universal Serial Bus was devised, and thus many recent solutions feel like they're bad at everything and good at nothing. Perhaps we should allow the specialization of type A as a cheap, reliable, slow, high-power connector, and save type C for only when high throughput is needed?


> ... but changing connector just because it's old is unwise

Well, not just because it's old; I'm a huge fan of my 1/8" headphone jack.

Type C is intentionally significantly more functional (not to mention the ergonomics of reversibility) but also fully backwards-compatible, requiring only passive routing to get you a type A adapter.

> The issue of type C is that it's trying to accommodate the entire spectrum of applications from very high bandwidth to very low cost.

That's up to the host. You can support just USB 2.0 with the same signals routed as a vanilla type A connector.

There are even super cheap USB-C receptacles that only have the relevant USB 2.0 pins routed out, like this one [1].

There's really no reason I can think of not to use one.

[1] https://www.digikey.ca/en/products/detail/gct/USB4125-GF-A/1...
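For reference, here's a rough sketch of how such a 2.0-only receptacle gets wired up as a plain host port (resistor values are from my memory of the type-C spec, so treat them as an assumption and check the standard before copying):

    VBUS, GND   -> 5 V and ground, same as on a type A port
    D+ / D-     -> the same USB 2.0 data pair a type A port would use
    CC1, CC2    -> one pull-up (Rp) each to 5 V, e.g. 56 kOhm to
                   advertise plain "default USB" power to the device
    SS / SBU    -> simply not present on a 2.0-only receptacle

So the added cost over a type A footprint is basically two resistors and a finer-pitch footprint.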


I've implemented USB-C with PD and SS speeds before, so I know. I've even used, if not that exact connector, at least a functionally identical one from LCSC.

It's just that these reasons seem so inconsequential to me. There's stuff like type C needing two extra resistors even for USB 2.0. Being able to adapt between connectors is also BS in my opinion, because you can convert anything to anything if you really want to. And if you only use USB HS speeds, you'll just waste the connector lifetime of your expensive higher-specced cable. Sure, you can get 2.0-only type C cables, but why? I think there's no good reason to use USB-C besides high-bandwidth purposes. Barrel jacks supply power better. All our normal peripheral devices aren't going to consume exponentially more bandwidth anytime soon.


Absurd comparison. It's completely irrational to compare computer hardware as if it's just some tool.

If you don't need the bandwidth, good for you. The rest of us choose modernity and nice things like USB-C hard drives, network adapters, etc.

USB-A is not just a connector; it's a whole spec. You can't get the same performance out of that connector as you can out of a USB-C connector.


You can push 10 Gbps over USB-A connectors with USB3.


I haven't seen a USB-A port implementation that exceeds 10Gbps (3.2, was it?).

I just assumed there aren't enough lines/pins in a USB-A connector for it to work.

A quick search gives me https://www.tomshardware.com/news/usb-3-2-explained which supports that experience (USB-A is not listed under 20Gbps).


I'm perfectly fine with higher bandwidth connectors, but don't force it on the majority of users who do not need the modernity and nice things. Have Type-C as a dedicated high bandwidth connector and don't force me to use it on my devices that don't need the high bandwidth.


The wheel was invented before anyone living today yet it remains relevant.


Only because this is living rent-free in my head - a good wheel analogy would be Type C is rubber wheels and Type A is wooden wheels. Compatible, but nobody uses wood anymore.


Actually some continue to use wooden wheels:

https://dcrwheels.co.uk/custom-wheelsets/building-with-woode...

"Ghisallo have been making rims since the 1940s"


That's really cool :) thanks for sharing!


I don't think that analogy works. On the female side, Type A is robust and Type C is fragile. I'd prefer to have my equipment with the more robust solution if at all possible, because fixing it is a real problem.


Weirdly, the port on a cheap mechanical keyboard I have is USB-C but the cable they packaged is USB-C -> USB-A instead. They likely didn't package C->C because of the chicken and egg problem. Not everyone has enough Type-C ports.


Type-C hubs are annoyingly expensive, because they all assume you want all the USB 3.x speed and/or power delivery bells and whistles - but for connecting simple peripherals none of that is necessary.

I built my own based on a USB 2.0 hub IC, but I never got around to publishing the design files. I should get on that!

Even building a single prototype unit cost me about the same as a single off-the-shelf type-C hub, but of course, building subsequent units would be substantially cheaper.


> Type-C hubs are annoyingly expensive, because they all assume you want all the USB 3.x speed and/or power delivery bells and whistles - but for connecting simple peripherals none of that is necessary.

I've also noticed there are next to no USB-C hubs, apart from "docks" that usually do many more things and are bigger. But don't they have to provide the specced power to be "compliant"?

What I find annoying is that even "higher end" docks don't have many USB-C ports. I'm typing this through an HP dock that has a big-ass power adaptor and is quite big and heavy itself (it has a huge heatsink), yet it still provides only one USB-C port. At least it seems to implement PD (I can charge a laptop through the downstream port), even though I think it only provides 15W.


I was outraged when I couldn’t find a small, simple USB 2.0 USB-C hub for my Framework! (1)

1: the four USB-C ports it provides aren’t enough for my use; one is occupied by Apple’s USB-C DAC (the Framework’s 3.5mm port is unusably noisy with my Shure IEMs). Then add my charger, tethered iPhone, and a peripheral like a mouse or drawing tablet, and you can see why I need a hub.

P.S. if you did release the files and it was a reasonably affordable DIY project (e.g some soldering and assembly), I’d love that. Although I guess you’d need to buy a fair number of all the components to bring the price down.


> Type-C hubs are annoyingly expensive, because they all assume you want all the USB 3.x speed and/or power delivery bells and whistles - but for connecting simple peripherals none of that is necessary.

Meanwhile, the most frequent complaint I read about USB-C is trickster/confusing cables or hubs, because they don't support everything.


My theory is that's why it's so hard to buy a "featureless" hub - consumers will demand refunds if feature X is not present.


I have more trouble using Amazon's useless filters or finding products that even try to be more specific than just "USB-C".


Mine, a fairly budget mechanical keyboard, came with a C-C cable, but includes a small C to A adapter.


Same. Keyboard and mouse


Almost every device I’ve bought in 2023 is usb-c now, even low end stuff, and keyboards and mice. By 2025 I’d expect it’ll be rare and unusual to see a micro port on any new device. USB A is definitely legacy.


PCs still come with half a dozen or more USB-A ports and only a single USB-C port (if you're lucky). Dongles, flash drives, webcams and the like are still predominantly USB-A. Logitech's lineup is still mostly USB-A. I'm not sure that there will even be a full transition to USB-C, but if so, it will take quite a while.


PC motherboards are often like a study in the history of the PC, with vestigial functionality that hasn't been in widespread use in decades. That's a fairly poor indicator, and I think it largely happens to pad the feature sheet, take advantage of cheap pricing on components over time, and copy-and-paste functionality in their design software. It's over. The only place for USB-A is on an aftermarket USB hub. That's ok. It was nice while it lasted, but it's time to move on.


There is no incentive for manufacturers to use USB-C over USB-A for applications that neither use charging nor need above-USB-2.0 speeds. A lot of peripherals and gadgets fall into that category. Another factor is the limitation on available PCIe lanes depending on chipset. If you can provide 8-10 USB ports, you won’t be able to give all of them USB 3 speeds, and having USB-C ports with different speeds on the same device would be weird, as there is no color-coding mechanism like with USB-A.


I bought a new machine last week (Intel NUC 13 Extreme). It has three Type-C ports, of which two are Thunderbolt. Meanwhile my phones, displays, and keyboards all use Type-C connectors, though some came with a USB-C to A converter.

We are slowly getting there.


Counterpoint: I also bought a new machine, a couple of weeks ago (Optiplex 7010 Micro). It has six USB A ports, four of them USB 3 and two of them USB 2 only, with the option to add a single USB-C port (or instead an extra DP or HDMI or even VGA or serial port); if you add that option, that USB-C port (which has DP alt mode) can be used to power the computer (instead of using the barrel plug), which is nice.

Very high end devices might have more than a single USB-C port (which is probably your case, given the "extreme" in the name), and these ports might even be Thunderbolt or USB4, but that's still rare. In my opinion, we are still in the USB-C equivalent of the "only two USB 2 ports; if you want more, get a PCI add-on card" phase we had in the serial/parallel/PS/2-to-USB migration back in the day.


I just bought a new device that came with a USB-A to USB-B cable!!!! The horror!! Makes sense, as it pretty much functions like a printer, only with a subtractive instead of additive process. Then it has Bluetooth instead of WiFi for wireless!?!?


I think the GP was talking about wired peripherals that plug into USB-A ports on the computer side, not wireless mice/keyboards which use USB-C to charge.

My current wireless mouse and wired keyboard have USB-C on the peripheral end, but I still use USB-A to connect them to the computer.


These days it's common for the wire to not be fixed in. So it's just a usb-c port on the keyboard and you can use whatever cable you want.


I'm still confused by the term GP. Shouldn't it be OP?


GP means grandparent. It refers to the comment exactly two up.

E.g., from my comment, GP is the comment by pynappo, starting with "i think the GP was talking... "

Similarly, you can have GGP for great-grandparent, etc.

OP refers either to the starter of the post or the top level comment.


The OP would be the greenspun.com post or cute_boi, depending on context. The grandparent, in this case, is aslilac.


It's always GP in this subreddit.


Going off of your saying "micro port", I think you're talking about USB-B.


More likely they mean micro-USB. USB-B is anything but micro.


Micro-USB is a variant of the USB-B connector. The whole point of A and B was that A ports were used on the "host" system, and B were used on the peripheral. If you remember USB On-the-Go, that was an attempt to reverse the trend and allow a USB-B device (including micro ports) to act as a host.

USB-C erased that distinction in favor of a full duplex network connection between two hosts.


USB-OTG ports are technically mini-AB and micro-AB ports that can fit both A and B plugs.

And USB-C is neither full-duplex (at least not for USB 2) nor host-to-host; there is a protocol negotiation and some devices can never act as hosts, although some can indeed assume both host and device roles.

Even in a “host-to-host connection”, only one side will act as the host.


> Micro-USB is a variant of the USB-B connector.

In the same sense that A and B are variants of each other, sure.

Even though it's called micro-B, the design is closer to A, and micro supports both ends with basically the same plug. I would never refer to it as just "B". "B" means the square plug.


"some devices can never act as hosts, although some can indeed assume both host and device roles"


If we're being pedantic, it's USB Micro-B - more specifically the High-Speed variant. There's also USB Micro-A and USB Micro-AB, and all three have SuperSpeed variants which are twice the size.


Thanks for the reminder on just how bad the USB working group is at naming things.


Thanks, this is new information.

I wasn't being pedantic though. USB-B is, in my experience, used to refer to the square plug (commonly found on printers). Micro-USB seemed like a closer approximation to what I thought that poster was referring to.


My mouse and keyboard are both usb-c and I wasn't specifically looking for usb-c when I ordered either of them. Logitech MX Master 3 and Keychron Q1. They are both relatively high end devices, but I don't think it's hard to find usb-c devices.

That said I do own lots of usb-a devices and will continue to own them for the foreseeable future. I would not purchase a laptop that was usb-c only today.


Is your MX Master the "mac" version? The reason I ask is because my 3s came with a USB-A dongle (which I prefer instead of Bluetooth). I've never used its charging cable, so I don't remember whether it was c-c or c-a. It also didn't have a C-A adapter provided, so if you want to use the dongle and only have usb-c ports available, you have to supply your own.


That's a good point, thanks for keeping me honest! Mine is not the Mac version.

USB-C to charge it, but dongle is A now that you mention it. It stays plugged into the back of my monitor.


This keyboard is brand new, Type A. I don't recall any I was shopping for being Type C.


> calling USB A “legacy” and a “relic” is so misplaced

Calling established technologies "legacy" and "relic" despite them being widely adopted and used should be recognized as the manipulative intellectual dishonesty that it is.



Note that these are keyboards and not mice, and they have USB-C sockets rather than permanently attached cables with USB-C plugs. They are also very premium products.

When it comes to the entire ecosystem, a $5 bargain-bin noname OEM keyboard with a permanently attached cable is more representative - and those definitely don't come with USB-C yet. That would probably add a few cents to their BOM, so it simply isn't worth it yet.


To be fair, I'd much rather we converge on 'sockets on both ends' now that the plug is the same anyway.

Throwing away a perfectly good device because the cable is kaput exacerbates our e-waste problem.


I was replying to this specifically

> I challenge the author to go find a wired keyboard or mouse that comes with a USB C plug. it’s possible, but not easy.

where they do specifically call out keyboards


More generally, I'd think many/most mechanical keyboards are USB C.

But I'd think that the budget, off-the-shelf keyboards would be USB A.


That is fairly meaningless, though. My six year old CODE keyboard could be USB-C today if I wanted, it uses a detachable USB cable. They wouldn't even need to make any changes to the keyboard other than including a different cable.


Those keyboards are really beautiful...


On the other hand, the introduction of USB-C will reach its 10 year anniversary in just a couple of years.

If this were 2008 you would definitely call a parallel port, a 56k modem, or a floppy drive a relic even though you probably used one in 1998.


My 1996 $2000 Packard Bell was effectively unusable by 2003 or so. My 2003 or so upgraded machine couldn't play Youtube videos by 2012. My early 2013 $600 Lenovo laptop is still going fine today (though it helps that I don't game anymore).


That's not a fair comparison considering how compatible USB-C is with USB 3.0 A/B.


Compatibility wasn’t what motivated people to switch away from the parallel port, 56k modem, nor the floppy disk.

Consider this exercise: The Nintendo Switch is the third best-selling console of all time. Walk me through how you would design it if the only port was USB-A.


Back in 1998 there were 4 or 5 main port types. And today there are still 4 or 5 main port types (with bluetooth as well). Compatibility was indeed the reason to switch to USB in the first place, as the universal serial bus made plug-and-play real. No reason not to have two USB port types, except device thinness.

The only issue the parallel port, 56k modem, and floppy disk share in common is throughput. This isn't a problem with USB-A compared to USB-C (until USB 4.0 becomes mainstream).


You can buy middle American mass market automobiles that have USB-C and do not have USB-A. That’s your bellwether.

Most of the USB user base doesn’t care about that nerdy stuff.


I'm not a tech nerd. I'm a lab biologist who just comments here. Multiple ports is my non-nerdy, non-rich use case.

My current car is almost 17 years old, and was bought used last year to replace the 20 year old car that I totaled. I certainly can't afford most cars with any USB ports, and wouldn't want them if I could.

Are car USB ports used for anything other than smart phones?


Car USB ports are generally only for smartphones, which is the #1 computing platform in the world.

That’s really the crux of my point. USB-C is the “winner” because it’s ubiquitous in the smartphone and tablet world (especially now with full adoption from Apple underway).

Cars are also a piece of technology that does a great job representing the lowest common denominator of the average consumer's relationship with technology. If your brand new $2,500 MacBook Pro doesn't have a CD player, that doesn't mean the CD is dead. It's an expensive early adopter's device. But if your brand new Toyota Corolla base model doesn't have a CD player, that means it's DEAD.

There’s also the answer to the question “what’s the most likely cable I have on me?” If you have an Android or Apple phone from the past 5 years you don’t have any rectangular USB-A cable with you in your travel bag.

I would also argue that the average person has very little practical use for USB for a lot of its data transfer capabilities anymore. The average person who owns a laptop probably never plugs in anything to the port besides things they are charging or maybe a mouse dongle.

It’s expected that your car (and most people’s cars) are older, but what I’m saying is that the fact that someone buying a new car doesn’t get USB-A pretty much shows us that it’s not long for this world if not dead already.

Sure, my desktop still has plenty of USB-A ports since space is not at a premium and nerdy custom builders demand them. But if you buy a desktop computer from Apple it doesn’t have any, and PC OEMs like HP and Dell won’t be far behind.

The only thing I’ve used the USB ports on my desktop for are mouse dongles and a fingerprint scanner. Basically, ~$20 disposable tech that would work perfectly fine as a USB-C version. My keyboard has already made the switch to USB-C and my mouse charges with USB-C as well, and I bought these items multiple years ago.


USB-C is simply more convenient to use, and you get it right 100% of the time, instead of your orientation being wrong 50% of the time with USB-A.


Just don't accidentally stick USB-C into a USB-A port. It will fit and it will short the port, causing the machine to crash. Speaking from experience...


Also don't accidentally stick a USB-A plug into an Ethernet port. It will fit but your peripheral won't work.


Sometimes I look at USB-C plugs and US power outlets and wonder if there’s a shock hazard there…


Once when I boarded a flight I noticed one of those multi-country mains power sockets next to a 2-hole headphone socket. They were arranged and aligned in a way that made it obvious that a 2-prong headphone cable would fit in the power socket. I took a picture of it and put it on Facebook before taking off.

After a nap I actually did put on headphones and plug them into the power socket by mistake. Luckily nothing happened.


I look at US outlets and wonder about shock hazards.


I think most people in the US have stuck a finger between the two completely uninsulated prongs of a plug and got a shock, usually during childhood. They usually only do it once.


The USB-C plug would be both too tall and too wide for the hot/neutral part of the outlet, and too wide for the ground portion.

So, no?


Would need a complete circuit. So unless you’re sticking in both ends of one cable to the same outlet…


Depending on how well you're grounded, you might just be the component that completes the circuit...

As far as I understand, not all US power outlets have a residual current device.


While you're at it, also don't accidentally plug a computer's Ethernet port into an rj45 wall plate that connects to an analogue pabx, it will fit and it will fry the NIC.

Don't put shit in the wrong hole is really one of the most basic lessons we should teach people in life.


On my computer, the USB-C port is at the back -- one time, I shorted the motherboard out due to trying to insert a USB-C cable without looking. Only did it once, but learned my lesson.


Typically I get it wrong 100% of the time, i.e. I'm in the right orientation but misaligned so it doesn't work, so I flip it, it fails again, and then I swear and flip it a second time.


Indeed, USB superposition strikes again. Image:

https://global.discourse-cdn.com/boingboing/original/4X/9/6/...


If you get it wrong 50% of the time let me substantially reduce that for you: logo goes 'up' for whatever the normal orientation of your device is. That should clear up 99% of the cases.


I had a Cooler Master case where the IO stack at the top of the case had the opposite orientation (logo down). Threw me off forever.

I'm at a point in my life where I've just given up and figure that I'm only going to get it right 1/3rd of the time anyway: Nope, turn; nope, turn; yup.


More like 55%. The USB ports on the back of my machine are sideways, and the one on the top is backwards-ass wrong from what I would expect.


The motherboard has an "up" direction if you lay it out flat.


Eww that's weird, ok in your case you are out of luck. That's pretty strange though, I have a ton of USB-A devices here and they are all in line with the way I outlined. So that's a bit of a surprise but it may well be an exception.

As for the ones that are sideways: the 'bottom' of the case is the bottom of the motherboard, so if you think of it that way it might still work for all but the top side ones.


We live in a right handed universe so even sideways has a correct answer.


My desktop has all vertical ports, now what?


Normally speaking: right. Because the circuit boards are usually on the left. If they are not you're out of luck.


My pixel slate has a USB-C port that will only charge in one orientation.


That sounds like it's either broken or there's some dust or lint in the plug/cable.

Google is usually pretty good about their USB-C implementations.


Of course it's broken. It also won't power on unplugged, even with a full battery.

Definitely not Google's finest device.


might be a cheap cable.


I had a similar issue on my old phone and I'm pretty sure it was because of something wrong with the connector on the phone's side.



I use a docking station to help give me dual monitors on both Mac & Windows. Some monitors don't work in Mac. Maybe I need to get a new docking station.


USB-A has a straightforward top and bottom. Once you know this, you almost never connect it wrong.


Maybe it's a chicken-and-egg thing, but why do we still have USB-A keyboards and mice and thumb drives? Wouldn't it make more sense for them all to be USB-C?


I have at least 10 computers in my house, and none of them have more than one usb-c port. More of them have zero. Two of those with a single usb-c are motherboards purchased in the last year or so.

If a mouse or keyboard or thumb drive expects to be used by an apple computer, or a phone, usb-c is the right answer. If it wants to be plugged into a PC, usb-a is a better choice.

USB-C instead of any flavor of USB-B makes a ton of sense, and everyone should adopt that, but USB-C instead of USB-A is a little soon for PC oriented products, IMHO. Wait a few years, or ship with an adapter.


Yeah, that's what I mean by "chicken and egg": We have all these usb A ports because that's what the peripherals expect, and we have usb-a peripherals because that's what we have ports for. My laptop has just one c port which I connect to a hub with a bunch of A ports. My desktop doesn't have any c ports at all. My car has one A port which connects to the phone for navigation (navigation using bluetooth doesn't work) and a whole bunch of c ports which are for charging only.


Having them be USB C would mean they could only be used on USB C ports (adapters from USB C socket to USB A plug are forbidden by the USB standard, because they would allow creating the forbidden USB A to USB A cable), while having them be USB A allows them to be used on both USB A ports and (with a simple passive adapter) USB C ports.


> forbidden USB A to USB A cable

In the early days of USB before flash drives were common, I was convinced such a cable would let me connect two PCs together to transfer data. Spent some time looking around in stores before a kind sales rep advised they did not exist.


> before a kind sales rep advised they did not exist.

Plot twist: they do exist, but not like you would expect. Unlike the forbidden USB A to USB A cable (which connects together the power supplies on both ends), there's a special debug-only USB A to USB A cable, which connects only the USB 3 pairs (and leaves both power and the USB 2 pair disconnected). Of course, that cable is useless unless you know how to put one of the devices in the special debug mode (and know which of the USB ports is the correct one, since AFAIK this debug mode usually works on only one of the USB ports).


But they do exist; they just aren't passive and require drivers: https://www.amazon.com/Plugable-Transfer-Compatible-Computer...


That’s a very good point. It doesn’t stop some manufacturers from still shipping the “forbidden” adapter type for just that use case, though.

I prefer USB-A for FIDO authenticators for that reason (and because the plug is more robust for USB-A and basically indestructible; C plugs can and do get bent on a keychain).


I have yet to see a bent USB-C (not saying it couldn't happen, but...).

I've seen multiple crushed USB-A connectors.


Ah, I should have been more precise: The (technically out of spec) "half-A" plug used by e.g. Yubikeys and some low-profile USB drives seems near indestructible to me. Regular A plugs can definitely be crushed.


As some anecdata, I work in facilities with hundreds of non-tech folks using USB-C Yubikeys and we see multiple bent connectors daily. Granted, our userbase isn’t known for treating electronics kindly…


Both my wireless keyboard and mouse are USB-C. I use the C connection for charging them.


My wireless mouse has a USB-C port for charging only (it doesn't have a wired mode). But its dongle is USB-A only and no adapter was provided. It's a rather recent and "high-end" model, too: an MX Master 3S.


I understand your point, but keyboards and mice aren’t the correct target I feel. I haven’t had a non USB-C peripheral in at least 4 years.


> I challenge the author to go find a wired keyboard or mouse that comes with a USB C plug..

I can confirm, it's a journey. It's the same kind of journey as finding a hair trimmer that charges over USB-C.

I see it more as makers being complacent and not giving a fuck, though. There's no technical reason for those to be USB-A, and the USB-C ones work great.

So yes USB A will be there for a while, and more often than not it's a symbolic middle finger to the buyer, a signal that a product should probably be avoided.


> There's no technical reasons for those to be USB A, and the USB C ones work great.

There's a good technical reason for keyboards and mice to still be USB A: adapters from USB A socket to USB C plug are allowed by the standard, but adapters from USB C socket to USB A plug are forbidden (because they would, together with a common USB C cable, allow creating the forbidden USB A to USB A cable). This means that USB A keyboards and mice can be used in both USB A ports and (with a simple passive adapter) USB C ports, while USB C keyboards and mice could be used only on USB C ports.

Therefore, until having enough free USB C ports in computers is common enough, using USB A ports (with an optional adapter to USB C on the box) on the keyboard or mouse makes sense. This is similar to how, during the transition from serial and PS/2 mice to USB mice, it was common for them to come with a adapter which allowed them to be used either on a USB port or (with the adapter) on a PS/2 port.


>because they would, together with a common USB C cable, allow creating the forbidden USB A to USB A cable

Not hard to do this regardless. Amazon sells A-to-A cables[0]; and Unicomp keyboards for example have a USB-A port in the back, and connect to PCs with a bundled A-to-A cable. Seems like preventing this is a lost cause.

[0] https://www.amazon.com/Monoprice-1-5ft-24AWG-Cable-Plated/dp...


I have more USB switches and KVMs that do A-A instead of A-B than I care to admit.


Yeah, USB switch for keyboard/mouse is the reason I expect to be using USB-A (or the "illegal" adapter) for few decades more.


On keyboard it's easy enough to just put a USB-C port on the keyboard and let users either use an A-C or C-C cable.

For mice, having a hardwired cable probably still makes sense in terms of bulk and strain relief, but I suspect that wireless mice are also far more common.


*looks at his MIDI controller that comes with a USB-C female to USB-A male converter plug*

You don't say...


Mmmmm forbidden cable....


A C-to-A adapter came with my laptop dock, but I had never thought of this.


> USB A will be there for a while, and more often than not it's a symbolic middle finger to the buyer, a signal that a product should probably be avoided.

You have exceptionally strong opinions about USB A and C.


Micro-USB is the real bane of it, but the time I spent finding USB-C versions of so many devices, including mice, made the issue a lot more personal, I think.

I kinda hate that we're stuck in dongle town for so long now. And going wireless brings in the charging issues. Computer makers are also to blame, but I think that ship has sailed.


> So yes USB A will be there for a while, and more often than not it's a symbolic middle finger to the buyer, a signal that a product should probably be avoided.

Maybe it's the opposite: it's a warm hug reassuring you that they're not going to change things just because some of the cool kids are.


> There's no technical reasons for those to be USB A, and the USB C ones work great

Except that all of their plans already have USB A in them. They'd have to rework the plans. Then, rework the lines making them. Then they'd have 2 versions for a period of time. Someone in accounting and logistics would have to do more work.

Seems like a perfect time to bring that conversation to a halt with the "if it's not broke, don't fix it" line.


Different industries move at a different pace.

Many common power plugs were standardized 50-100 years ago. Compared to them, even USB-A is still new. If all you need is power delivery, there is little reason to switch to yet another plug type, which only exists because of unrelated requirements in other industries.


At introduction there were billions of USB-A peripherals and few users with USB-C ports. The only sane thing for peripherals to do was to ship USB-A, not wanting to cut out 99% of the market, and given that, the only sane thing for PCs to do was to ship mostly USB-A ports. So we start with an obviously optimal choice on all parties' parts; now how do we break out of this equilibrium into a mostly USB-C universe?

If a PC ships with mostly USB-C ports, then, since the majority of accessories are type A, users are going to be frustrated when they need adapters or a hub to plug in anything, because they don't have enough of the ports their accessories actually use.

If accessories shift first then users are going to be frustrated when they need adapters or a hub to have enough ports to plug in their accessories.

Remember that the average user keeps a computer for 6 years, and they keep accessories longer, often throwing things out only when they literally stop working or can't be made to work with their new device.

Furthermore, even a slight increase in costs is problematic when your margins are razor thin. It's a really hard sell for anyone to move forward.

Apple has a substantial advantage here wherein they have enough good will from their users, enough margin, and enough sway to simply upgrade and tell their users to buy adapters while neither losing profit nor users.

That being said, being type A is hardly a middle finger for the vast majority of devices, which need neither more power nor more speed than USB 3.2 Gen 2x1 can provide; we are talking about 10 Gbps and 15 W. The most common accessories are mice, keyboards, sound, cameras, and small storage, and less commonly network adapters.

We haven't yet found a compelling case for a beefy connection, but surely there is one, right?

High-end video capture, high-end storage, displays, hubs that serve many fast devices, 10 Gbps Ethernet.

None of these are as common as the previously listed, and they're not fun to get working when not every port (nor every cord) supports high power, higher speed, or optional features. Using such features feels like the plug-and-pray of the early '90s, whereas plugging in a DisplayPort monitor or a standard stereo-jack speaker system is as boring as plugging in a toaster.


> I challenge the author to go find a wired keyboard or mouse that comes with a USB C plug.

Not the author, but all my Logitech keyboards have USB-C.


Yet on the opposite side we are still buying USB A peripherals _because_ we do not have USB C ports everywhere.


> I challenge the author to go find a wired keyboard or mouse that comes with a USB C plug. it’s possible, but not easy.

So let's see... I have the following in my house; most are at least 2 years old and all are USB-C:

MacBook Air

MX3 mouse

Keychron keyboard

iPhone 15

iPad Air

Xbox controllers (several)

LG 4K monitor

Razer headphones

Rode VideoMic Go II

Pebble V2 speakers

GoPro 11

Chromebook laptop


The dirt cheap mechanical keyboard I just bought is USB-C. Which was great, because I didn't need an adapter.


> calling USB A “legacy” and a “relic” is so misplaced. we all still own USB A devices.

All my client devices have detachable cables. All my host devices are USB-C and have been for a while, except for a) Xbox, b) RPis, c) Car.

a) I rarely plug anything into it except a dedicated charging cable or the odd Xbox-specialized devices.

b) for which client devices (if any) are never unplugged

c) A cable is plugged in and never leaves the port

For general-use client devices I either swapped cables for USB-C to uUSB-A or USB-B or had USB-C OOTB for recent devices, increasingly so on both ends.

I also recently moved from an iPhone 7 to an iPhone 13 Mini. If it wasn't for Apple refusing to do an iPhone 15 Mini I'd be USB-C all around.

> I challenge the author to go find a wired keyboard or mouse that comes with a USB-C plug.

Accepted: I've had both for over a year: Keychron K2, Keychron M1. Logitech has a lot of devices with USB-C client side.

The irony is that one of them came with a USB-A (host side) to USB-C (client side) cable. The other one was C-C with an "upgrade your computer connector" C-A adapter†.

> Then in 5 years

Huh. I've been going all in on USB-C since 2018, my strategy being to pick the new standard through and through and backport/polyfill USB-C on the USB-A host side† as needed, NOT the other way around.

It's absolutely ridiculous that any off-the-shelf device today has anything other than a USB-C port client-side (which is where most of the mess actually is, with all the mini x micro x hispeed x A x B x HDMI x DP connectors). I'm not throwing away any pre-C device or cable; I upgrade existing cables to be USB-C††. If a cable fails I get a C-to-whatever replacement.

As far as I'm concerned USB-A/B is a relic. Folks cling onto it like they clung to VGA (then, and now HDMI), floppy or optical drives.

† Upgrade the host port by leaving this permanently in the host device port: https://www.amazon.com/UGREEN-Adapter-Female-Converter-Charg...

†† Upgrade the cable by leaving this on the cable: https://www.amazon.com/UGREEN-Adapter-Thunderbolt-Compatible...


Almost all of the dozen keyboards I've bought in the last few years are USB-C.


It's very easy to bodge together a USB-A connector in a pinch in a way that USB-C just isn't. USB-A is simple and there's literally tens of billions of devices that would be made just a little bit closer to being e-waste if USB-A ports became substantially less common than they already are on mainstream devices. Adapters exist and should exist, but we shouldn't have to rely on them.


No. If we designed everything around its "bodge together in a pinch" factor, we wouldn't have technology, let alone industry. It's just not realistic or practical to live in the past, and I do not believe it would be oppressive to ask people who still have a use for type A to buy a $5 adapter. Also, the type C connector retains the exact pins you need to use it in 2.0 mode; they're just smaller. Literally all these adapters do is connect type A pins to type C pins and add a 56 kOhm resistor.

The real reason is bandwidth.


Still we live with power sockets and lamp sockets that hark back to Edison times, more than a century ago, and live relatively happily. Backwards compatibility with millions (and hundreds of millions) of existing devices is a powerful force.

While I'm in favor of USB-C connector eventually replacing USB-A connector completely, I think it's best done gradually, with new devices still offering the occasional USB-A socket for quite some time. Remember how slowly PS/2 ports disappeared from PCs which also featured USB sockets next to them, or how PCI and PCIe coexisted for many years on PC motherboards.


New mobos still have PS/2, especially the higher-end ones.

Edit: don't believe me? https://www.gigabyte.com/Motherboard/Z790-AORUS-TACHYON-rev-...

    1 x PS/2 keyboard port
    1 x PS/2 mouse port


And long may it last; there is no way to truly replicate a PS/2 port/header with adapters etc.

Hopefully, if they ever go away for good, someone makes a PCIe PS/2 card with proper interrupts (if that is even possible behind abstractions like MSI) rather than converting PS/2 to USB.


IIRC PS/2 ports are actually an extreme overclocking feature as they are more likely to continue to function at borderline stable clocks.


Yeah but I'd still argue PC manufacturers have it backwards. Offering 1 or 2 USB-C ports and 4 USB-A ports is the wrong ratio. The Mac Pro has it right: 2 USB-A ports in the front, one inside for software keys, and 8 Thunderbolt 4 ports for truly insane speeds impossible to achieve with an A plug.


The reason may be the cost. Four insane-speed ports, especially with DisplayPort routed to all of them, take more resources, more expensive chips, and trickier PCB routing.

My Thinkpad T14 has one USB-C port which works best for charging (also handles Ethernet and slow-speed stuff), and another that works best with an external 4K display. All else is USB-A. But I bought it for $500 used. A used Mac Pro 2019 is like $2500, to say nothing about a new one.


To my knowledge there haven’t been any recent leaps and bounds advances in electrical socket technology where a bunch of 200 year old houses are the only thing holding us back.

Early on, maybe 8 or 9 years ago, I would have made a similar argument to yours, but it's been quite a while; even mice and keyboards come standard with USB-C now.


I can imagine a much nicer power socket: symmetrical, safer, always grounded, smaller than the Euro socket. But it has no chance of taking over the world, which is dominated by either Edison's or Siemens's (I suppose) early designs.


> people who still have a use for type A

Isn't this basically everyone?

I can count on one hand the number of times I've seen something that plugs into type C and it's not either a phone charger or a type A adapter.


> I can count on one hand the number of times I've seen something that plugs into type C and it's not either a phone charger or a type A adapter.

By way of comparison, my experience is basically the opposite. Most of the USB devices I have attached to my computers are USB-C - my keyboard, local backup disks and non-perf-critical mass storage, my YubiKeys, one of my monitors, network dongles. Most of the USB-A things I have around are for recharging stuff, plus old external disks I keep out of paranoia, and things like my car's ports for connecting my phone for CarPlay.

The last one is something that annoys me about the EU's insistence on USB-C for phones. I had a bunch of perfectly serviceable USB-A to Lightning cables to discard because the connector changed for no benefit to me. It seemed driven more by dogmatism than any practical advantages in real life. None of the shit I need to plug the phone into changed its connectors, so I just need to replace cables for no good reason.


> I had a bunch of perfectly serviceable USB-A to lightning cables to discard

Can't you just give them to the person who is inheriting/buying your previous phone?


The word “discard” does not preclude what you suggested.


> ...people who still have a use for type A to buy a $5 adapter

I have literally tens of devices with USB-A to USB-B/mini/micro etc. cables, and only a few that have USB-C, and even most of those came with a USB-A to USB-C cable.

USB-A is far from dead and should stay... I don't want to live in an "Apple-like world", where you need an adapter for everything that should be included directly in the first place (like a USB-A port!!). And I already have to buy a serial (RS-232) adapter, because they removed it for no real reason.


> And I already have to buy a serial (rs232) adapter, because they removed it for no real reason.

RS-232 is a pain, it's one of the few users of negative voltages on a PC (others being ISA cards and PCI cards; PCI Express finally removed the last negative voltage pin), and the input needs to be tolerant of any voltage between -25V and +25V. I can see why computer manufacturers would love to not have to deal with it anymore.


And somehow a USB-to-RS-232 adapter can handle all that, including negative voltages, and costs less than $5, and a MAX232 chip less than $1.


The type C port is superior to, and entirely backwards compatible with, the type A port. For your tens of devices with type A, just get one of these: 7-Port USB 3.0 Hub with 19.7 Inch Cable, ORICO Ultra-Slim Data USB Splitter https://a.co/d/aRru1K1

It’s way better than plugging every device into the back of your PC anyway.


I still have plenty USB-A/B devices and I'd rather not buy a new cable for each of them.

Ain't broke, don't fix it. They will fade away eventually.


You don't have to buy a new cable for each one. 7-Port USB 3.0 Hub: https://a.co/d/aRru1K1


This hub sounds so convenient that maybe, just maybe, some genius had an idea to incorporate it into the motherboard?


It's also very easy to bodge together a USB-C connector in a pinch that only supports USB 2.0

USB-C is an extraordinarily flexible connector and is super backwards-compatible. If you wanted to provide the exact same support as a USB A port, it's not any more complex. Just finer pitch to solder, but not any finer than anything else on the motherboard.

An A to C cable is also entirely passive. I think it's ok to rely on them. The future is now, time to move on. It's been 27 years.


Have you seen how big the contact pads are on type A, and how small they are on type C? I don't think anyone is "bodging together" a type C connector or soldering anything on its contact pads.


>It's also very easy to bodge together a USB-C connector in a pinch that only supports USB 2.0

If I found out (some of) the USB-C ports on my mb were limited to USB 2.0 speed and power I’d be quite pissed.


I would hope nothing on your mobo is bodged from the factory.


USB-C was designed for highly-integrated mobile devices. Motherboard vendors don't want to deal with power delivery and DisplayPort, and users will be confused when that stuff doesn't work.

When a USB-C port is out in space, rather than flat on a desk, C-to-A adapters create a lever arm that increases the risk of damage. Though this can be solved by using adapters with ~100mm of wire in the middle.


Motherboard vendors don't need to deal with PD and DisplayPort. USB-C will work just fine without them - it'll just behave like any regular old USB port.


Right, but if they do that then people will complain that their USB-C port isn’t working, or about how USB-C is so confusing because you never know what a port does.

They’ve got a choice between doing nothing and having a few people complain, or doing a bunch of work and spending a bunch of money and still having people complain. One of those is a lot easier.


USB-C is like a regular old USB port, with twice the data lanes (or a mux[1]) and a transistor to turn on the power when requested.

In other words, it costs more.

[1] https://richardg867.wordpress.com/2020/02/29/usb-c-done-chea...


On the scale of a motherboard, those things are cheap and provide more value than half the marketing bullet points.


Hence why they have so few


USB-C is only a connector. It doesn't specify a protocol.

On Macs, USB-C ports are used for both Thunderbolt and various USB protocols.


I always found that my C-to-A adapters erred on the side of letting the cable fall out. So no damage, but fairly irritating.


Seems like anyone who would buy anything other than a laptop would be pretty tech savvy, unless they just bought it because of familiarity and habit thinking "Desktops are what we use for home stuff, I don't wanna mess with laptops I don't know much about".

I can't really think of why an average user would want one other than gaming, and PC gamers can likely figure out USB cables.


Power Delivery was an addendum to USB 2. It is specified for USB-A ports.


I haven't ever seen a controller that does the BPSK modulation required for A ports. I suspect the standard was defined, but everyone just decided to do signalling over the CC pins on USB-C and ignore the USB-A standard.

If you know of any controllers that implement the spec, I'm very interested in knowing part numbers! If for no other reason than as a historical curiosity.


You aren't going to get 20W of fast phone charging power out of USB 2; you get 2.5W tops. USB 3 gets you to about 5W.

Meanwhile USB C goes up to 240W. You're probably never going to get that directly out of a motherboard.
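For rough numbers (nominal values as I recall them from the various specs, so double-check before relying on them):

    USB 2.0 default:        5 V x 0.5 A =   2.5 W
    USB 3.x default:        5 V x 0.9 A =   4.5 W
    USB BC 1.2 (type A):    5 V x 1.5 A =   7.5 W
    USB-C current @ 3 A:    5 V x 3.0 A =  15 W
    USB PD 3.1 EPR (max):  48 V x 5.0 A = 240 W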


>you get 2.5W tops

False. USB BC (battery charging) allows up to 1.5A, or 7.5W, and doesn't require USB-C.

https://en.wikipedia.org/wiki/USB_hardware#USB_battery_charg...


The majority of USB C ports will never support 240W power delivery.


Two reasons:

1. USB-C is more technically complicated, especially when you want to create a host, and especially when you want to do anything faster than USB 2 speeds. Fully-integrated chips for USB-C are still quite rare, so you have to hook up a handful of different not-exactly-cheap chips to add a USB-C port. On the other hand, USB-A is pretty trivial and can be added at basically zero cost.

2. There's still a wide USB-A ecosystem out there. Having to use adapters suuuucks, so people want to have at least some USB-A ports in their system. Moreover, a lot of more-trivial peripherals feature USB-C ports rather than pigtails, so you can easily connect them to USB-A ports with A-to-C cables.

The balance will slowly shift towards USB-C, but it's not exactly surprising that USB-A is still quite common.


I just want at least 1 of both, for at least the next 10 years. I will still have USB-A peripherals I use daily to weekly until then. I would've paid $100+ extra for a USB-A port on this MacBook.


The aux cord removal is still felt by many


I refuse to buy a mobile phone without one. Just not happening.


What phone would you buy right now if you needed one?


Asus and Sony are still holdouts in the premium price range, Motorola and OnePlus in the budget range, and also a number of Chinese manufacturers I think


Sony, AFAIK, still guarantees bootloader unlocking, so they're cool. But the price...


I have an xcover 6 pro that I adore!

Pros:

- incredibly durable; in the first week I had it, it slipped out of my pocket and was run over by a car, screen side down. No lasting damage aside from a few marks on the glass

- headphone jack

- expandable storage

- removable battery

- dock-rechargeable (which makes up for its lack of wireless charging)

- pretty clean android skin, cleaner even than my Pixel (which had a nonremovable google search box)

- has 3 programmable buttons, each with two options, so you can have hardware buttons that launch your most-used/time-sensitive apps. I have one mapped to Shazam, for example.

Cons:

- pretty big

- can't remove some Samsung apps

- not a great camera or flashlight

Overall I would definitely buy another!


I bought a new phone a few months ago. Galaxy A14 5G. Of course, it's an extreme budget phone, but it works fine for me and - because there's less pressure to go for flatness - it has a 3.5mm port. In my experience, lots of budget phones still have them.


I might be in the minority here, but I prefer USB-A. The connector feels more robust and I don't worry about damaging it. My keyboard, mouse, webcam, flash drives, etc. all use USB-A. A disadvantage of USB-C is that lint can get in the port and prevent a proper connection (this happens on my phone all the time). I believe that USB-A will be with us for a long time; the wisest thing at this point is to make sure that the latest USB protocols are supported over both physical ports. This way you can use USB-C on devices where small size is a priority (such as smartphones) and USB-A on devices where backwards compatibility and physical ruggedness are priorities.


I have mixed feelings about it. Technologically I definitely prefer USB-C, and the "reversible cable" and "plug fits either way" parts are really neat.

However, they do indeed feel quite a lot more fragile. USB-A is an absolute tank, and I regularly just haphazardly plug it in without paying much attention to it, knowing that I'm not going to damage it. With USB-C I'm always a bit careful about not putting too much strain on the connector - especially as most connectors are rather long and act like a lever!

I can totally see myself breaking off USB-C ports because I forget to unplug the cable to my docking station. With USB-A, I pretty much expect to break the cable first.


> With USB-A, I pretty much expect to break the cable first.

It might seem that way, but the port supposedly breaks more easily than the cable in USB-A, since the A port contains springs (which will eventually get worn out and can be bent).

Anecdotally, I've seen tons of broken A ports with exactly that failure in airplanes, buses, and other places with public USB charging ports that people don't treat particularly nicely.

The leverage of C plugs does seem scary, but I have yet to encounter a broken port – fingers crossed!


Less than an hour ago I set my backpack down on top of a usb-c cable plugged into the back seat usb-c port in my car. The cable broke at the plug, but the port is fine. I'm a little upset with myself because it was my only 5A rated usb-c cable.


I don't even have any devices that use USB-C. Not a single one. The only USB-C cables I have are for charging phones.


The computer accessories shelves at e.g. Target are still dominated by A. Two years after this article.


Because wired computer accessories are almost exclusively used by desktop users, who all have USB-A.

Go look at phone charging cables which you might want to plug in to your laptop and see that C-C is more common than C-A now.


TFA is about desktop computers.


Yup, I don't own a single device that uses USB-C either: laptops, tablets, phones, keyboards, mice, flash drives, network storage devices, routers, wifi access points, printers, scanners, kindles, microcontroller boards, calculators, rechargeable battery chargers, wireless temperature monitors, etc. Everything is USB-A (to micro-B, sometimes mini-B).

Still mad at Apple for removing USB-A. I understand adding USB-C. Why remove them??


When buying an expensive (relatively) computer was a big deal in my house, I had a hard time convincing my dad about the iBook G3 which didn't come with a floppy drive.

Apple has been at it since the dawn of time.


I think Apple made a terrible mistake. But I am not worth $3 trillion, so who am I to argue? But I will piss into the wind anyway:

The biggest difference between USB-A and a floppy/CD/DVD drive is the network effect. USB-A is an interface between 2 devices, so its usefulness scales as O(N^2). It allows all my laptops to connect to any of my peripherals. To replace USB-A, I would need to replace all my laptops and all my peripherals. A floppy drive, on the other hand, is an O(N) device for the most part. Its usefulness is mostly limited to the single device that it is installed in. (Sure, floppy disks are shared between computers, but I would say that this is a less frequent use case.)

The other difference between USB-A and floppy is that the USB-A is "good enough" for almost all use cases. My keyboard, mouse, flash drives, USB-ethernet adapters, all work perfectly fine with USB-A. In contrast, going from a floppy to a CD increased the capacity by 1000X. From CD to a flash drive, we got another factor of 10X to 100X. And much faster random access. A floppy became "NOT good enough" very quickly.

I see USB-A existing for the foreseeable future, the next 10-20 years, because of the network effect, because it is good enough for most things, and because it is slightly cheaper than USB-C.


> To replace USB-A, I would need to replace all my peripherals.

You can get those USB A-C plug adapters for $1-2 a pop. Stick one on the end of each peripheral cable, now they are all USB-C.


I specifically avoid buying anything without USB-C. Not only do I like having everything on the same connector, MicroUSB is often the least reliable part of a device aside from battery calendar aging.


Phone, tablet, laptop, tyre pump, MiFi 5G hotspot, VPN router, portable monitor... most of my devices are USB C, except for my car (2019 - recent rentals are USB C), desktop PC, keyboard and mouse.

I think it's great for travel, combined with an Anker 737 or similar battery and Anker 736 100W charger.


Unless everything is fairly old, that must take some doing. Everything on my desk is USB-C: both keyboards, a Logitech mouse, phone, dev kits/boards, both work and personal laptops (PD charging; the personal one only uses USB-C). I'm getting used to the every-cable-is-a-double-ended-USB-C life.


I have a mouse, headphones, stream deck, phones with type C


The big upgrade in USB-C for mobile devices is power delivery; a secondary benefit is video output for laptops, enabling docking using a single cable instead of an old-fashioned dock.

PCs aren't in the power delivery game and don't really need an extra video output.

Meanwhile, USB 3 over a USB A socket is still able to transfer 500 MB/sec of data - I just did this earlier today, copying from an old M2 SSD in an enclosure to my new upgraded installed SSD. That's pretty quick.

USB C could double it, but how often do you need 1 GB/sec? Not many flash drives can deliver that outside of SSDs in enclosures.
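Back-of-the-envelope on those numbers (my arithmetic, using nominal line rates and encoding overheads):

    USB 3.0 / Gen 1:  5 Gbit/s x 8/10 (8b/10b encoding)  ~= 500 MB/s
    USB 3.1 / Gen 2: 10 Gbit/s x 128/132                 ~= 1.2 GB/s

So that 500 MB/s copy was already close to saturating a Gen 1 type A port.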


USB C is easier to use.

Machines should have more C ports even if they have to share controller lanes; rarely would anyone be in a situation to saturate the system, but being able to have multiple devices plugged in at once is useful.

I'd be perfectly happy with an internal USB-C hub, but I have never seen such a thing. (I *do* have an internal USB 3 hub. Same issue: I want the convenience of multiple ports; I don't need the bandwidth of them each having their own controller lanes.)


>USB C is easier to use.

Not in my experience. The last few times I went to try to use one of my USB-C peripherals it was a pain. For example, I wanted to use my portable external monitor with my laptop. Both are "USB-C"; however, the issue is that USB-C describes the physical connector and not much else. I already knew my laptop supported DisplayPort Alt Mode and I knew that was what the display was expecting (rather than being a USB display adapter). However, I then ended up digging through the box of USB-C cables, testing them until the third cable I tried lit the screen up. This was the general experience with my Thunderbolt M.2 interface as well. I don't have much that uses PD, but as I understand it, that's yet another dimension of permutations.

Without intimate knowledge of both devices /and/ the cable, it's hard to say what any combination of "USB-C" parts will do. Even OEMs fail to get this right. A friend recently told me of buying Dell laptops which offered a dock as an option. Unfortunately, the dock expected Thunderbolt & DP MST while the laptop only supported USB3.

My personal opinion is that the relative lack of USB-C in a highly competitive, consumer market like DIY motherboards is likely because, even after 10 years, the consumer demand is just not there. When judged against the loftier aspirations of USB-C being expressed here (replacing most if not all USB-A, HDMI, DP, Thunderbolt, and lower-wattage power cables), I consider USB-C to be a failure.


> mechanical hard drive and a feeble 8 GB of RAM (is that even enough to run Windows by itself?)

Windows 10 runs acceptably on a Core 2 Duo with 4 GB of RAM - this was my son's gaming PC until a couple of months ago. (It did have an SSD, though - a mechanical hard drive would certainly have reduced the snappiness.)


It’s pretty crazy that an operating system needs that much at all. I mean, we all get used to these big numbers but 4GB is actually a lot of bits. A single GB can store more books than a human can read in their lifetime. Somehow, we have accepted that the most fundamental piece of software on a computer should require many lifetimes worth of books of RAM. We have come a long way from thinking we would never need more than 640k!


I used to kill off services I didn't need on my Windows NT 4.0 Workstation ... I could get it down to thirteen megabytes of RAM used.


By point of comparison, my fairly minimal Linux partition uses ~200MB of RAM after booting, and I suspect a good chunk of that is cache.


But you’re conflating data at rest vs data in memory.

A pdf for example can be very small but then take up considerably more memory to display an embedded image once it’s done decompressing for render. Similarly text is more expensive just by virtue of higher resolution displays.

What we ask of our machines has expanded greatly, and the immediacy with which people want things to load is more exacting too, so a lot of content is kept warm in memory.

I’m not saying 4GB is a healthy amount and I agree there’s likely room for optimization, but let’s not also act like there aren’t huge advances in what computers do to necessitate that either


I’m not sure I’m conflating anything. A bit is a bit. Some bits are faster than others but it’s all just bits in the end.

Yes, compressed content can take up more bits when uncompressed. Also, compression can’t reduce size beyond what information is actually there without losing that information somehow. Just because we are not good at working with compressed content and require a lot of RAM to deal with raw/uncompressed data doesn’t mean it’s not wasteful or even that it’s the only way to do it.

I’m glad you agree we have come a long way too. Perhaps as we collectively go further, we will find ways of accomplishing even more with fewer resources.

— written from a pocket sized computer with far more bits than the total number of humans that have ever existed


I used such a computer (4 GB DDR3 RAM and a 2 GHz dual-core Pentium) for some very basic work tasks on Windows 10 LTSC (the one without the bloat). It had Windows due to some weird hardware incompatibility, so I used it that way for about a month. The computer is used for a browser with a couple of Google Spreadsheets tabs open and a couple of messengers once in a while to reply to some messages.

It was almost unusable in that very scenario. I could handle a couple of tabs, but once I opened a couple of extra ones (e.g. a search through Google or a local shop) it literally hung. I had to wait a couple of minutes for it (to swap, I suppose) to unfreeze.

I got angrier over time, swapped the Linux-incompatible component (the GPU) for another one, and now it runs Fedora 39 with default GNOME 45 installed. It works perfectly well and it hangs only when I open a relatively big number of tabs, 15 to 20, which is a lot for the use-case of that very machine. I thought of optimising the machine to run just swaywm (which eats like 300 MB compared to GNOME's 2+ GB), but I never faced any issue over a 3-month period.

So I would confirm that even 4 GB is plenty unless you need a big number of browser tabs open. But not with Windows. Windows was usable on that machine only if you used the browser very, very lightly.


SSD or mechanical hard drive?


Oh, I thought I mentioned! SSD of course!


I have two usb-c ports on my desktop. I’ve never used them.

Almost everything that has a usb-c connection can easily be connected with a c to a connector.

With my laptop, on the other hand, everything I use is USB-C, and everything that's not is USB-C through a multi-port adapter.


USB-C aside, I find that *TX motherboard manufacturers have been too unimaginative when it comes to I/O and haven't evolved for the past ~10y to serve growing demand in new and evolving segments.

My pet-peeve: If you want more than a single NIC in an AMD-based small form factor, your options are extremely limited. Even if you're open for importing and paying >$300 for just the board or going a couple of generations back. Meanwhile, you have a plethora of options in the exploding segment of NUC-like mini-PCs. There's almost nothing on the market in-between firmware-roulette Chinese mini-PCs and effectively ATX-sized breakout-rigs. Asrock RACK (with scarce availability) is the only exception I can think of over the past 5y. I don't get it.


You need a PCI card NIC anyway for when Realtek introduces another high throughput triggered bug in their driver and the motherboard NIC becomes useless.


If you're getting a motherboard with multiple NICs, it's probably not limited to lowest cost NICs, so Intel is on the table. (Although, I'm still not sure if they got their 2.5g NICs to work?)

Dual-NIC boards with AMD sockets do seem like an unfulfilled niche, though. There are some options on the Intel side; something like the ASRock Z690M-ITX/ax might work if you compromise on CPUs.


The server space is quite nice with the ASRock Rack models: last month, I bought a ROMED8-2T and got dual 10G NICs and EPYC compatibility in an ATX form factor. I also paid $100 extra for Intel NICs plus a USB-C port. The motherboard cost 920 dollars, though.


Something that might be useful to you for that: you can get M.2-slot network adapters which you could use on (some) ITX boards. There are at least 1 GbE and 10 GbE ones available.


Sure. But if you want to actually have two m.2 drives to either mirror your root filesystem or use one as a secondary? It gets tangly quickly.

inb4: CPUs as well as commonly used chipsets have enough lanes, it's not that.


It sounds like you're at the intersection of "tiny", "multiple network ports", "multiple M.2 drives", and "dedicated gpu". I wouldn't be surprised if that's a rare combination.


Could do without the dGPU given APUs. I don't think the rest of that intersection is rare at all for workstations and desktops.


I think a workstation with only one pcie slot is a bit rare?

Either way, if you're using an APU then that frees up an x16 PCIe slot, which on many motherboards can be adapted into 4 M.2 slots. That should get you plenty of networking.


Add 4- or 8-port GbE NIC to PCIe, add 64GB of RAM, add USB stick for router operating system, and you should be able to handle edge traffic for a small office.

What's your goal here??? Multiple NIC in ITX form factor, for what?


The vast majority of keyboards and mice are USB-A, so you want at least two USB-A ports for those.

Even Apple recognizes this, which is why the desktop Mac's still have a couple of USB-A ports.

(And most of the rest are Bluetooth, not USB-C!)

USB-C has its merits, but I think they are bigger on mobile devices where you only have a limited number of ports and they have to serve multiple roles. On a desktop, it's certainly nice to have a couple of USB-C ports, especially if they're USB4, but you also have dedicated ports for video, audio, etc, so the versatility isn't quite as valuable.


I can’t speak for other people, but I feel like my life is a revolving cocktail of USB Type-A, Type-B, Micro, and Type-C. I can’t just cut and re-crimp my old 2001 Sony Memory Stick USB adapter (which still has drivers available for Linux) to support USB-C.

One thing that stuck out to me about the article is that the author didn’t talk much about their use cases with USB.


I still have a bunch of devices on USB-mini, that's the hardest one to source stuff for.


> I can’t just cut and re-crimp my old 2001 Sony Memory Stick USB adapter (which still has drivers available for Linux) to support USB-C

Yeah but you can put a $2 USB A-C adapter plug on the end of it and just leave it there


All of the devices I have plugged in or use regularly on my desktop now are USB-A:

Only USB-A, no plug on the other side: keyboard, USB stick, Oculus Rift (CV1)

B/mini/micro on the other side: scanner, digitizer tablet, sound card

USB-C but shipped with USB-A to C cable: mouse

USB-C, shipped with both USB-A and C cable: HDMI capture card

Many of my devices that use USB for charging are still micro-USB

So yeah, many of these devices are quite old, but from a practical standpoint, I actually need zero USB-C sockets on my motherboard, and many USB-A. I guess it will change in the future, and it is good I have a USB-C (unused) port on my motherboard, but for now USB-A is simply more useful.


I hate the trend to remove USB-A from laptops. Put as many or as few USB-C ports as you want, but there should be at least two USB-A ports.


At this point, the only type A cables I use are bundled cables that have type C at the other end because I don't have enough type C ports for all C-C cables. In other words, I actually need zero USB type A ports - I'm just without choice.

There's still the oddball micro-B (or mini-B, which is an awful connector) device out there, but that does not require a type A port.


USB-A is only unsuitable for small handheld devices, but is SOOOOOOOOOOOOOO less confusing than all others!


I think the main reason is cost. A proper usb-c port should be able to be used for storage, display and powering devices with high power requirements. Even 15W is kinda high power unless you're buying a top end space heater from Intel.

Now multiply this circuitry by 6-10 ports...


There is no need to make every port fully featured. Even MacBooks don't have that, for the kind of price they demand.


Having different ports that are mechanically compatible but not feature-wise is a UX failure. The blue/black color labeling made that somewhat acceptable for USB-A, but now that motherboards have started to use blue/red color labeling for ports that are feature compatible (just connected differently internally), even that color labeling is becoming a UX failure.


I think at least the Mx do have only fully featured ports?

If they support just one display but you can plug it in any port is one thing.

On a motherboard you're more likely to see 'these 5 ports are data only, 2 of them can charge beyond 5W, and that 6th port can be used for a display'.


To be honest, I haven't seen a broken Type-C cable yet. But I have seen USB-A cables damaged by being inserted the wrong way up; someone just forced a cable into a port.


I'm 100% sure I can repair USB-A 2.0 host and cable connectors with common soldering iron.

I'm almost sure I can repair USB-A 3.0 connectors.

I'm 90% sure I won't be able to fix even 2.0 USB-C without help of special tools and a microscope, not to mention hand tremor. (


USB-A should be relegated to the rubbish heap of poorly designed interfaces for the simple reason that it has no physical keying (as, e.g., RS-232C has) or clear visual indicator for orientation, and you get it wrong 50% of the time. It's way past time the USB-A interface was eliminated. USB-C can't get here fast enough.

My MacBook pro only has USB-C ports, they work well for data, video out, as well as power (charging) and I am glad to be rid of the USB-A ports.


This said, USB-C needs to have its standards updated and standardised. E.g. all cables must support a minimum of USB 3.1 with power and data, with appropriate shielding. Stop yielding so many concessions. I'm already in a spot where I have a number of USB-C to C cable varieties that all look basically the same but have very different characteristics. One cable basically refuses to charge anything. Another has no data capabilities. Another can carry decent power, but data is seemingly slower than USB 2.0. It's an absolute shitshow that manufacturers can ship something with USB-C ends and carrying USB-C markings that meets an arbitrary, free-for-all standard.


A big part of this is that a lot of manufacturers are just completely ignoring the standards.

The standards specify a rather limited set of cables, and it specifies clear ways to label them which make it extremely obvious to consumers what the cable is capable of. But manufacturers just... can't be bothered to actually follow the standard.

The main reason why there isn't one cable is that it has mutually-incompatible goals. A high-speed data cable needs expensive, thick, shielded data wires - and is limited to about 80 cm in length due to signal integrity issues. All completely fine for connecting a laptop to a docking station.

On the other hand, a smartphone charging cable needs to be cheap, long, and flexible - but it doesn't need to carry high-speed data. The aforementioned docking station cable would technically work, but it'd be $30 rather than $3 and provide a far worse user experience.


>it'd be $30 rather than $3

I think this is the big one. Having only fast cables is already possible. Just toss out all your unknown and slower USB-C cables, and buy a bunch of known-good $30 Thunderbolt 4 cables to replace them all. You don't need to force manufacturers to do anything.

But I'm sure the majority of users don't care about USB4 or Thunderbolt on all their cables. They've probably never used USB-C for mass data transfers; to them it's a charging cable, or maybe a link to USB 2.0 data rate devices like a PlayStation controller or a headphone adapter now that the 3.5mm jack is dead. Even most of the enthusiasts here probably don't care about spending an extra $5 to upgrade their keyboard's USB-C cable to USB 3. It's just a waste of money.


For average consumers it's great. I think my phone is the only thing I own that could use 3.1 (I assume; I've never looked, because I do almost everything wireless).

Almost everything (including my 200 Wh solar generator!) will charge from 5 V at 5 W on a regular USB port, and almost nothing really needs more than that; slow charging is quite often fine for phones and tablets.

Phones tell you if they slow charge, so it's just a matter of keeping the good cable plugged in the good adapter.

For the way most people use USB-C, as a power cable that can do data with a little extra work when needed, it's already great.


> and you get it wrong 50% of the time

I've seen this mentioned before, and I don't get it. Never ever have I inserted a USB Type-A plug upside down. There have been instances where I had to fumble around to find the right port, but that never resulted in inserting the plug the wrong way. How much force do you need to apply to even achieve that?


> USB-A should be relegated to the rubbish-heap of poorly designed interfaces for the simple reason that it has no physical indication (e.g., RS-232C) or clear visual indicator for orientation, and you get it wrong 50% of the time.

That's a crap connector implementation problem. If the connector is compliant with the spec, it will only go in one way. (Looking at you, Yubikey.)


One criticism of Apple back in the Ive days was the lack of practicality. I like the fact that Apple relented on the MacBook Pros and they have an HDMI port.

I don’t want to have to use dongles unnecessarily.

> Apple fanboys/fangirls/fanothers: even the Mac Pro includes two legacy USB-A ports (admittedly it also has four Thunderbolt ports). Why is this relic of old tech cluttering the clean design of an Apple product in 2021?

This is naive. The chip hardware support to have 2 extra USB A ports is nothing compared to having two extra USB C/Thunderbolt ports and USB C/Thunderbolt is overkill for many situations.


> This is naive. The chip hardware support to have 2 extra USB A ports is nothing compared to having two extra USB C/Thunderbolt ports and USB C/Thunderbolt is overkill for many situations.

Type C doesn't require anything more than USB 2.0 with the same signals routed out. There's no difference whatsoever. Type C is just a connector. Yes there will be more chip support if you decide to make the connector more fully-functional but that design choice is on the device manufacturer. They do not need to do that to have a Type C port.


Yeah, but having Type-C ports with different capabilities is wildly confusing to users. Imagine that Apple had mixed Type-C ports. Imagine the number of support calls from people saying 'it doesn't work' when they try to plug their Studio Display into a USB 2.0 Type-C port. Or when the Studio Display's screen works because they plugged it into a port with USB3 and DP Alt Mode, but its downstream hub doesn't, because that needs Thunderbolt.

In the end the only thing that is understandable to most users is when all the ports on a system provide the same capabilities (and probably Thunderbolt 4).


> Imagine that Apple had mixed Type-C ports.

They actually do :) they don't all have the same capabilities. Some are Thunderbolt, some aren't.

USB4 goes some way to addressing this concern though, since anything marked USB4 has to be fully-functional.


Not on the same Mac, at least for MacBooks (I haven't paid much attention to desktop Macs).


I regularly wish my work MacBook Pro had a single A port. I understand my personal Air not having one—it’s lacking all ports but C—but it’s the one thing preventing my MBP from being as always-useful-entirely-on-its-own as the 2014 Pro I used to have.


I carry a usb a to c cable in my “go bag” to work. It’s a 3” cable with some flex to it. Weighs nothing.

They are in the range of $10. Put it in an expense report if IT doesn't have one for you?


Sure, but it still makes the Pro slightly worse than it could be. One of the things that quickly convinced me I'd been computing wrong, the first time I was issued an MBP, was that I could just pick it up and walk off and be able to work for hours. No need to grab the power brick or an extra battery (did I remember to charge it?) in case I was away more than two hours, and no need to take a mouse, since the trackpad was actually a viable input device beyond a few painful minutes. Enough ports that I'd almost certainly not need anything else (I wasn't doing network tech stuff anymore, so the absent Ethernet didn't hurt me).

The Windows machines at work have A ports (and C). They have a single notable leg up on the MBPs now, as far as pick-up-and-go capability goes. Of course they still lag in other areas, but it's gone from a pure win to "most of it's better… except that one thing".


I found the return of the HDMI and SD card reader so refreshing but also so puzzling.

On one hand that's what we asked for. On the other hand it was in the same wave as giving up on the touch bar, slowing (stopping ?) the evolution of the iPad, no touch support on the laptops when the design changes were hinting at it.

Basically it felt like they froze the hardware form factors in time to solely focus on the ARM transition and the Vision Pro. That's a good decision on many fronts, and also a sign that the Mac is now in its "conservative" phase where change isn't to be expected for a very long time, if ever.

Now that Panos is also gone from Microsoft, I wonder where innovation will happen on the laptop/desktop side. Asus and Lenovo? If it doesn't happen, desktops will slowly slide into the mainframes' position, and laptops will be legacy/ultra-niche tools.

Perhaps then it will truly be the year of the desktop for Linux?


> On one hand that's what we asked for. On the other hand it was in the same wave as giving up on the touch bar, slowing (stopping ?) the evolution of the iPad, no touch support on the laptops when the design changes were hinting at it.

Apple typically tries for a fixed design cycle - the 2016 MacBook Pro design changed in 2021, so you can hypothesize when the next significant revision will come.

Loud voices stated that pros hated the keyboard, hated the Touch Bar, hated having to use dongles to do certain tasks like give a presentation or pull in photos from their camera. People also waxed poetic about MagSafe.

Lo and behold, the next version changed the keyboard, dropped the Touch Bar, and added HDMI and SD card slots, and added back MagSafe as a charging option.

The iPad has had hardware revisions since then (it has gone from A15X to M2) but there has been some hold-up on rolling out new models. The oddity is that nobody reporting based on supply chain leaks has knowledge here.

My suspicion is that there was originally a launch planned for March 2023 of new Air, Pro, Mini, and two new Apple Pencil models. For some reason - all we got was a new base Pencil model this year, and that was only released last month.

I know of no design changes to laptop hardware that hinted there would be touch support. The software design changes in macOS 11 and beyond have been around UX alignment with iOS, and more specifically iPadOS. Apple wants the 'mobile' team that works on iOS apps to see that there is a supported and intuitive way to adapt their codebase for iPad and Mac releases, rather than having the 'web' team wrap the code in Electron for Mac and letting the iPad run in iPhone compatibility mode.

This is why iPadOS has been gaining Mac-like features (mouse/trackpad support, keyboard shortcuts, multitasking/multi-monitor enhancements to name a few larger ones)

> Basically it felt like they froze the hardware form factors in time to solly focus on the ARM transition and the Vision Pro.

The Pro models have a design expected to accommodate the M1 through M5. Several iPad models are all due for a design change, but for whatever reason that's gummed up.

I would expect if design resources are taken up by Vision Pro, they don't break or pause their iterative process - they just have less manpower to propose and implement changes in an iteration.


> Loud voices stated that pros hated the keyboard, hated the Touch Bar, hated having to use dongles to do certain tasks like give a presentation or pull in photos from their camera. People also waxed poetic about MagSafe.

Yes, if we compare the current laptop lineup to 2015's, the only really significant changes are USB-C and the 14" notched screens. The internals radically changed, as did performance and battery life, and that's where Apple is keeping the focus (rightly so, given the reaction to the current models, where the consensus seems to be not to touch anything anymore outside of the battery and the chips).

> The iPad has had hardware revisions since then (it has gone from A15X to M2)

Yes, same path as the mac where internal changes seem to be the priority. The biggest change for me was the introduction of the cantilever keyboard as an option, which as you point out went along with Mac-like features coming in the OS.

In contrast, in the same time span Microsoft introduced the Surface Studio line (the elusive 2+ desktop and the widely loved but crazy expensive laptop) and the Surface Pro line that directly answered the "What's a computer?" question Apple was asking. Looking at the numbers [0], it seems they found a decent niche where the iPad has stagnated for so long.

I think Apple could have interesting ideas for the next iPads, but as you say, no leaks whatsoever at this point probably means it's at least a year down the line, and if it's a real big change the iteration process will need that much time (and they should take their time. A half baked prototype would be killed on the spot after such a long wait)

[0] https://www.windowscentral.com/hardware/laptops/microsoft-su...


I'm ready for it. My desktop has a half dozen usb-c adapters stuffed into various ports and hubs.

Usb-c cables are the most versatile. You can get all kinds of pigtails and adapters to turn them into anything else. Other connectors don't have that property of universality, because they don't have enough cables.


USB-C is the bees knees if it works right. Which does not seem to always happen.

Plus, "active cable" is close enough to an oxymoron to make me afraid we overreached.

(I do like that a small pocket kit with an USB C cable and a couple of adapters seems to cover most needs.)


The one thing I don't quite like about active cables is that they didn't use Dallas One-Wire or i2c or uart. Otherwise, it seems to work fine.

Hobby level stuff would be a lot simpler if any old microcontroller could talk to a PPS supply, interrogate cables, etc.


I2C or uart would require an additional wire, so that'd be cost-prohibitive. But I do agree that something like One-Wire would be better from a hobbyist perspective.

Although probably easy to implement in silicon, USB PD is an absolute nightmare to DIY. Pretty much your only option is to get a dedicated PHY like the FUSB302B to do the actual electronic part of PD comms for you.


Seems like it must not be that easy to implement, or it would already be in all the chips the way CAN bus seems to be integrated in tons of stuff.

One-wire stuff is so cheap, and they have EEPROMs meant for exactly this kind of electronic marking (I believe it's what power adapters did before USB), and it can be done with pure software on an MCU costing pennies.


A lot of devices don't need what USB-C offers: USB-A is a concurrent, not legacy, standard. You want USB-C for high-power, high-transfer-speed connections. An incredible number of peripherals don't need either of those things, connecting them with USB-C just makes no sense.


"No sense" seems like a strong claim. USB-C is a better plug (reversible, durable). I don't know how much more expensive USB-C might be, but considering the number of cheapo devices that come with an unwanted cable with at least one (often 2) USB-C ends, I assume it's not all that high.

The main use I encounter where USB-A is actually better are dongles that don't stick out.


Can't say USB-C is all that much of a better plug. Reversible: certainly convenient, but the tiny little stubby, easily breakable plug? Strong minus. I've snapped a million micro-USB connectors (may it die as many deaths in favour of either real USB-A or USB-C), and more USB-C connectors than I wish I had, but I can't remember ever snapping a USB-A connector.


Eep, I've had much better luck than you! My bigger concern for tugs would be whether the plug or the socket broke first, I think, though I don't know what fares better by that metric


Usb4 does not define "type-a" connectors, thus type-a by definition is not "concurrent" with type-c in the usb spec. It is deprecated/legacy (ie still in use but not supported by newer versions of the spec)


That's very much not how USB version numbers works. Each version replaces a previous version "within the same generation", while guaranteeing backward compatibility with USB 2.0 (at least for the foreseeable future, given that it's unlikely that devices that only need USB 2.0 features will ever disappear).

- The USB 3.2 standard replaced 3.1/3.0, but explicitly did not replace USB 2.0

- Similarly, USB 4 guarantees USB 3.2 and USB 2.0 compatibility.

It doesn't matter that USB 4 does not define a Type A connector for USB 4 connections: a USB 4 controller can accept Type A on circuitry that's designed to operate in USB 2.0 compatibility mode.


Because every designer has a choice - build onboard A connectors and supply separate A-C adapters, or build C onboard and supply separate C-A adapters.

The incentives are to make the motherboard at a price point and A connectors are going to be 1cent or less cheaper. This is probably true for the cable adaptors too.

But if I had to put money on one thing, it's power - you can run a small laptop off USB-C, which is where the whole thing should be moving to - USB-C for everything, power and data. But making every connector hook up to the 5 V supply on the motherboard is awkward, and marking just one is confusing.

It's not a great argument but my take is USB-C is only great when every connection is USB-C and if it's not great then why bother


USB-C ain't all that great.

Though I say this, I hope that we all move forward to USB-C soon.

First of all, most of us who use PCs will use them for a very long time. A PC put together will often last someone a full decade. The specification was barely published and finalized less than 10 years ago (August 2014)!

USB 3.1 (still using the USB-A port) was introduced around the same time! USB-C had not hit widespread adoption until roughly 6 years ago - and the vast majority of computers that are in use today are from 5 years ago.

Secondly, USB-C cables are still relatively expensive. Many cables don't adhere to the standard; many can only deliver data and not power, while other still can do power delivery but not data.

Thirdly, USB-C ports may be the shiny new kind of port, but I find them to be incredibly brittle on motherboards and laptops. USB-A ports tend to be built like a tank, while USB-C ones tend to break if you push them in too far. They're supposed to be more resilient, but somehow they aren't.

Finally, there's still a bunch of stuff that uses USB-A... no one wants to use a bunch of dongles with their PC.


PS/2 connectors lived on motherboards for a very very long time after mice and keyboards went to USB.


PS/2 ports are still there on high-end and "gamer" mobos.

This is also related to the fact that a lot of high-end keyboards/mice are also PS/2.

They don't cost more than USB, are simpler in many ways, and if the mobo has dedicated ports for them, I think it makes sense to plug the keyboard and mouse there to free up USB ports for other peripherals that are almost exclusively USB.


There's also the matter of PS/2 interrupts vs. USB polling. I don't believe it truly matters much anymore, but there were distinct advantages to PS/2.

The link below has several outlined though I can’t speak to their veracity or relevance anymore

https://superuser.com/questions/341215/is-there-a-distinct-a...


You can actually damage the port if you unplug a PS/2 connector while power is on. Happened to me.


The initial rollout of USB mice was at a 125 Hz polling rate. PS/2 mice were faster at first because they sent the signal immediately, with no polling delay.

You can easily see the difference - at 125 Hz it's impossible to draw a good circle with a mouse, since the pointer always moves in short straight line segments.
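
As a rough back-of-envelope (the cursor speed below is an assumed number, purely to illustrate the effect), the polling rate sets how long each straight segment between two reports is:

    # Rough illustration (hypothetical numbers): each report moves the cursor
    # by one straight segment, so segment length = speed / polling rate.
    def segment_length_px(cursor_speed_px_per_s: float, polling_hz: float) -> float:
        return cursor_speed_px_per_s / polling_hz

    for hz in (125, 500, 1000):
        # Assume a quick swipe of ~2000 px/s while drawing a circle.
        print(f"{hz:4d} Hz -> {segment_length_px(2000, hz):5.1f} px per segment")

At 125 Hz a fast swipe comes out as roughly 16 px straight edges (a visible polygon); at 1000 Hz that drops to about 2 px and the curve looks smooth.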


No matter what devices I use, USB seems to give me a bit error every few terabytes or so, over cheap cables >1ft. This makes me wonder: how do I find good (certified) cables, and why isn't there forward error correction in the USB protocol such that cheap cables simply work?


USB does use error detection (CRC), but if you transfer enough data over anything, you'll eventually see undetected errors.
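
For the curious, a minimal sketch of the kind of check involved (my reading of the spec is that data packets use a CRC-16 with polynomial x^16 + x^15 + x^2 + 1, all-ones init, inverted output; treat the parameters as assumptions). Any single-bit flip changes the checksum, so detection only fails on certain multi-bit patterns:

    def usb_crc16(data: bytes) -> int:
        # CRC-16 over a data payload, to my understanding of the USB spec:
        # reflected poly 0xA001 (i.e. 0x8005), init 0xFFFF, output inverted.
        crc = 0xFFFF
        for byte in data:
            crc ^= byte
            for _ in range(8):
                crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
        return crc ^ 0xFFFF

    payload = bytes(range(64))
    corrupted = bytearray(payload)
    corrupted[10] ^= 0x01                                      # flip one bit "in transit"
    assert usb_crc16(bytes(corrupted)) != usb_crc16(payload)   # single-bit errors are always caught

An undetected error needs two or more flipped bits in the same packet that happen to leave the 16-bit remainder unchanged, which is why the undetected rate is so much lower than the raw error rate.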


Presumably, if it's a mass storage device with block-addressable storage, then the host's filesystem driver should be able (if it has its own checksums) to transparently identify the corruption and re-read the block, right? Or at least bubble it up to the user and fail the operation instead of allowing silent data corruption.


Perhaps in case of storage devices (although it would be far less efficient to do it that way). But this approach doesn't apply to e.g. high speed industrial cameras. In any case, it's USB's job to transfer the data, so imho the responsibility lies there.


> Perhaps in case of storage devices (although it would be far less efficient to do it that way).

That's in fact how it works for USB bulk transfers: The host learns about the checksum error having occurred and it will retry until it passes.

That doesn't seem inefficient to me, given the low error rates involved (many flavors of Ethernet do the same, for example) – are you saying there should be FEC?

> But this approach doesn't apply to e.g. high speed industrial cameras.

These usually use isochronous transfers, where errors are indeed only detected (and reported), but no retransmission is attempted.


That's how it works, as far as I understand: USB indicates the error, and the host driver can retry if it makes sense – and for storage devices, it definitely does.


If this is true, then the MTBF design parameter is unacceptably low.


USB does have a specified maximum bit error rate, and presumably the CRC length is tuned accordingly.

What it probably does not have is a detectable error counter that marks a link as broken (e.g. in case of a bad/damaged cable, plug or controller) before the chance of an undetected error becomes unacceptably large, although it seems like a host would be able to implement that as an optional feature?


It is rated for a maximum of 1 bit error per 10^12 bits, which makes the chance of enough errors within a single packet to defeat the CRC (or to cause more than 3 retries) astronomically small.
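
A quick back-of-envelope (the packet size and line rate below are assumptions, just to show the scale):

    # Back-of-envelope at USB's rated worst-case bit error rate of 1e-12.
    BER = 1e-12              # rated maximum bit error rate
    PACKET_BITS = 1024 * 8   # assume a full-size 1024-byte bulk packet
    LINE_BPS = 5e9           # assume a saturated 5 Gbit/s SuperSpeed link

    lam = BER * PACKET_BITS         # expected bit errors per packet (~8e-9)
    p_multi = lam ** 2 / 2          # Poisson P(>=2 errors in one packet), leading term
    packets_per_year = LINE_BPS / PACKET_BITS * 3.15e7

    print(f"one raw bit error per ~{1 / (BER * LINE_BPS):.0f} s of saturated transfer")
    print(f"P(>=2 errors in one packet) ~ {p_multi:.1e}")
    print(f"expected multi-bit packets per year ~ {p_multi * packets_per_year:.1e}")

That works out to roughly one raw (detected and retried) bit error every few minutes at full saturation, but only on the order of 1e-3 packets per year with more than one error in them, and even those still have to alias the CRC to go unnoticed.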


I have one of those USB-C to HDMI + USB + SD-Card etc adapters on my laptop. It definitely looks like it's not doing the connector on the computer any good. At some point I'll 3d print a holder to take the strain off the connector.


I own several devices that won’t work with a C-to-C cable. Somehow something gets confused. You need to use a C-to-A cable and if you have a computer that only has C ports like me, then you need to use an A-to-C dongle. What a mess.


> I own several devices that won’t work with a C-to-C cable. Somehow something gets confused. You need to use a C-to-A cable

These devices are probably missing the resistors between each of the two CC pins and the ground, which are necessary to tell the computer on the other end of the C-to-C cable "I'm a device and need you to put 5V on the power pins". The C-to-A cable, on the other hand, always has that 5V power (because the A end of the connection always supplies power; that's the reason why A-to-A cables are forbidden, they would short together the power supply of both ends), which means the device gets powered even if it's missing the CC resistors.

That is, it's a manufacturing defect, and the device was probably tested only with C-to-A cables; once C-to-C cables become more common (at least where I live, it took a long time before I managed to even find one, while C-to-A cables were plentiful and many devices came with one), this will become less of a problem.
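
A minimal sketch of what that resistor signalling boils down to (the threshold voltages are approximate values as I recall them from the Type-C spec, so treat them as assumptions):

    RD_OHMS = 5100  # the pull-down a compliant device puts on each CC pin

    def source_enables_vbus(device_has_rd: bool) -> bool:
        # The source only turns on the 5 V after it sees CC pulled low through Rd.
        # No Rd -> no attach detected -> no power. An A-to-C cable sidesteps this
        # because the A end supplies 5 V unconditionally.
        return device_has_rd

    def current_advertised(cc_volts_at_device: float) -> str:
        # Once attached, the device reads the CC voltage set by the source's
        # pull-up (Rp) against its own Rd to learn how much current it may draw.
        if cc_volts_at_device < 0.66:
            return "default USB current"
        if cc_volts_at_device < 1.23:
            return "1.5 A at 5 V"
        return "3.0 A at 5 V"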


I found calling USB-A ports “legacy” a bit funny. “Less modern” perhaps.


> Wouldn’t it make more sense for all of the USB ports on a new PC to be USB-C and then use adapters for legacy components?

It's annoying enough swapping wireless mouse adapter and keyboard plugs between my personal computer and my work computers multiple times per hour without also plugging and unplugging adapters.

Bluetooth would be even worse. But that raises the question that maybe we aren't getting a lot of USB-C ports because wireless is expected to leapfrog it to some extent?


That's why Logitech has multi device Bluetooth mice! Not sure about keyboards but they probably have it.


Thanks. It looks like they have it for the keyboards, too.


The whole idea that Type C should replace Type A is odd to me. Actually the number of times I've had to buy new cables and connectors for USB is irritating.


The main benefit vs. USB 3.0 A jacks is that it can also carry video and power. This makes a lot of sense for laptops, but much less for desktops. Thunderbolt is its own thing, and these wouldn't be Thunderbolt ports. That leaves speed, but there aren't many applications for more than 10 Gbps. You're either going to use PCIe or Thunderbolt.


My 2c: because peripheral devices have a HUGE life cycle, and tons are still using the legacy/big USB connectors. And MB manufacturers do not want to provide USB-C-to-legacy-USB adapters with their MBs, probably because of cost.


It would be nice to see USB-C replace HDMI etc. But USB-A has been adopted as a sort of low-power DC power delivery method. It's in cars and wall sockets, things that have very, very long lifetimes. USB-A will probably stick around for a long time.


It's moving there, but slowly. Recent Asrock motherboards have 2 USB 4 ports already.


The author is… delusional at best. USB has multiple standards that use the "A" type port, and many peripherals exist that are primarily "A" and not "C". Apple is on the USB-C track because the EU is forcing them to conform to it.


Some devices feature USB-C ports that do not conform to the specifications. They're essentially just a DC barrel port disguised in USB-C clothing, and various other combinations with partial features exist as well.


I built another PC recently and it got me thinking how archaic and clunky the whole thing is.

The biggest improvement in the building experience of the past 20+ years is probably getting rid of SATA and IDE cables with NVMe SSDs.


USB-C is obtuse about hubs. You're not supposed to be able to daisy-chain them, which is total crap, but it also means it's very limited in how you can just throw some extra ports around.


Any device that doesn't need a 20 Gbps or higher link is better off connected to your motherboard via a USB-A-to-USB-C cable. The "A" connector has not been deprecated. In some ways it is the superior connector: higher mating cycles, less fragile, less likely to get ripped loose, much more torque required to bend it, and so on. The "C" connector is more useful on the opposite end, where you will be plugging in and out more frequently.


> the "a" connector has not been deprecated.

Actually, it is:

> The three sizes of USB connectors are the default, or standard, format intended for desktop or portable equipment, the mini intended for mobile equipment, which was deprecated when it was replaced by the thinner micro size, all of which were deprecated in USB 3.2 in favor of Type-C.

https://en.m.wikipedia.org/wiki/USB_hardware#Connectors


> the "a" connector has not been deprecated

I mean... usb 4 literally doesn't define a type-a connector (or type-b). Usb3.2 is the "best" a usb-a port is ever going to deliver.

So while it may still be popular and in-use, from the view of usb as a standard, it is deprecated: it's not present in the latest version.


> higher mating cycles

It’s the other way around.

USB-C sockets are much more robust than their A counterparts, since the spring is in the plug for C, as opposed to in the socket for A.


Perhaps. But that's why you have the C at the other end: it doesn't make sense to be frequently plugging things in and out of the motherboard and wearing down those connectors.

But I doubt you will get more cycles out of C in real life. That might be what the spec says, but given the pitch of the pins on the C connector compared to the A, you have a world more tolerance to work with on the A.


If you don't frequently plug things into your mainboard, does it even matter?

> but i doubt you will get more cycles out of C in real life

You're free to doubt the designers of the USB specification and/or the manufacturer's compliance with the specification, but just logically, the part about having the springs in the plug, not the port, makes sense to me.

I've seen many broken USB-A ports in airplanes and other public charging ports with the spring connectors bent beyond recognition.


Manufacturers' compliance with the specification has always been a total joke, heh. I see what you're saying about the springs, but I shouldn't have brought mating cycles into the discussion. The side plugged into the motherboard should just be secure above all else. I can easily ruin any USB-C connector with just my thumb and index finger; the thickness of the A shell and its square profile prevent it from bending under torque pretty darn well.


Furthermore, I've never even seen a USB-C connector at an airport or for any public use, and I doubt I ever will. Hell, even IEC can barely withstand that use case; I usually have to bend the prongs for the plug to stay in. USB-C is not gonna make it 3 days.


I've seen a few already, both in airports and elsewhere (e.g. in rental cars and I believe even in an airplane once).


My sole interest for motherboards in a stationary computer are for Windows gaming.

Flight sticks, wheels, button boxes: basically everything that is wired uses USB-A.


USB-B is also alive and well in areas like audio production.


That’s device side however rather than computer side. But yeah, usually USB-B is paired with USB-A on the other end.

That said, a lot of new gear is USB-C only


USB-C cables are less flexible than USB-A cables, which makes connected devices harder to manipulate and puts more stress on the connectors.

It just feels more fragile.


Most people buying a desktop unplug their old desktop and plug the new one in in its place, swapping over all the peripherals.


Find me a device that needs the bandwidth of USB type C. The only thing that springs to mind are 4k HDMI capture cards...


M.2 SSDs can already saturate even a Thunderbolt port; then there are the alt-mode uses like DisplayPort, or the tunneling in USB4 that essentially gives Thunderbolt 3 functionality, such as running PCIe devices.


USB type C is a connector. It says nothing about the protocol it supports.

It could be anything from USB 2.0 up to USB 4/Thunderbolt 4.


Tons of things.

High resolution displays at high frame rates

High speed storage

External GPUs

High speed networking


Framework mainboards are all usb-c


Everyone wants their own, I guess.


Calm down bro, it's just another cable, it's not like your motherboard needs to switch between host and client.


I feel dumber after reading anything by Greenspun. Look at me, actually saying this out loud, wasting all your time with the sort of pointless comment that 5 minutes ago I'd have laughed at the very idea of posting. That's the power of the man's content. What a guy!


I dunno. I always enjoyed the pictures of his white dog.



