Since I got over the hurdle of buying cables and adapters, I've been mostly happy with USB-C. The one lingering concern is actually mechanical.
Many of the cables/dongles are long and skinny relative to the tiny metal connector, placing a fair amount of torque on the port. A number of the cables I've used are poorly fitting as well, exacerbating the mechanical strain.
A good example of this problem is the YubiKey 4C. The connector protrudes about half a millimeter from the port and has enough play to wiggle up/down/left/right. Having had to replace the logic board and I/O boards on my 2016 MBP 15 within 3 months due to port damage, I get a bit nervous when I notice fit problems like this.
Aside from outright damaging the ports, poorly built cables seem to wreak havoc on the mechanism that grips cables on Apple machines. Prior to the board replacement, the Apple power cables would gradually come loose by their own weight. Fortunately, the Geniuses did agree that this was a problem and replaced the top case along with the guts of my computer.
I've broken several motherboards and many cables over the years. It drives me crazy. USB-C is the worst. I'm on my third cable now. At least I finally figured out that I should get USB-C chargers with removable cables. Now I only have to replace the cable, which I've done twice in the last month.
The Dell XPS 13, which I'm getting in a week or so, has both a proprietary power jack and supports USB-C charging. I'm very happy that I won't be breaking many more USB-C cables.
Dell's proprietary charger is the worst. There's a dedicated wire in the cable for ensuring the use of original chargers, and it appears to be especially prone to breakage. If it breaks, your XPS won't charge anymore, though it still gets power. I managed to repair my charger, but it was a real clusterfuck.
Your problem is not with USB-C, it's with Dell. Everything they sell that I have seen was rubbish; we had to replace the majority of screens and keyboards year after year, and now we don't touch them with a barge pole. I can't imagine they have improved.
No clue why you're being downvoted. When I worked for Solectron Global, Dell and HP's repair departments were the absolute worst. Always backed up, always short on parts, always breaking mid-repair. Most frustrating job I've ever had, and a real eye opener as to the truly garbage nature of their products.
I've had a couple of recent XPS 13 laptops, and they seem pretty solid to me. I've seen plenty of crappy HP and Dell machines, but the good ones are pretty good.
I definitely agree... but also with material wear. One thing I've never quite understood is why USB connectors aren't made of a softer material than the ports. Even cars got this right in places many years ago, on metal parts that touch one another. As it is, I've already noticed a 'loosening' on a phone only a few months old.
This was a major improvement of micro-USB over mini-USB: the retention clips on the male connector are sacrificial, which is why micro-USB cables die regularly but micro-USB jacks rarely do. I don't have any USB-C devices yet, so I don't know how those will fare.
The proprietary connectors used by Nokia, Ericsson, etc., before USB were all "better" in the sense that the contacts were just pressed against each other, just like what used to be (?) on iPads and Macs.
I've killed a number of USB ports by walking away with or dropping the phone while it's connected.
Oof, tell me about it. I still have a N900 that's otherwise in decent condition, but the MicroUSB charging jack is rattling merrily around inside the case...
I think this means the size reductions of USB for phones (to mini, then micro, now Type-C) shouldn't be applied to desktop and laptop systems where size isn't at a premium. If I'm connecting a 10TB USB HD or a printer to my desktop, I want USB 3.1 Gen 2, and Type-A/B connectors are just fine. If I'm charging my phone, I can go wireless or Type-C.
With my 2016 MBP, the USB-C connectors have had no mechanical problems. I plug in various dongles, connectors, and USB drives every day. I'm left wondering what one needs to do to physically break the ports – I'm not very careful!
I've had a ton of problems with the TB2/miniDP port. I still use it for my display connection and GigE to both a 15" and 13" MBP, but it can be intermittent and flakey.
Oh yeah, I agree with you there. I do have a few Lightning cables that have died, but I treat them as consumables anyway.
I know it supports some form of USB 3, but the display out is handled by an H.264-decoding dongle (!). It unfortunately doesn't have enough pins to really support the range of alternate modes and high data speeds of USB 3.
I think these are actually the most minor problems with the USB-C vision. The VAST majority of consumers will never encounter a Thunderbolt device, and the ones that do can be expected to understand the cable difference.
What is far more of an issue for the standard is the established network effect. USB-A has become so commonplace that it's built into our furniture, and micro-USB is a legally defined requirement for device charging in some areas. These foundational forces will prove to be a massive deterrent to USB-C uptake, and far more of a risk to the vision than multiple cable types that are clearly labeled!
But the bright side of this, I think, is Apple's adoption of the Qi charging standard. It's clear that an evolutionary jump to wireless charging will entail new technology, so I see Qi taking up the "one way to charge everything" vision.
I agree, the USB-A form factor is by far the most widely used peripheral connector in the history of computers. The shift from USB-A to USB-C is without precedent.
Practically every mouse, keyboard, printer, camera, charger, and external drive made in the last 20 years has used USB-A. Practically every car in the last 10 years has USB-A. Every hotel room, every airplane, every DC power converter. There are still millions (billions?) of devices shipped every year with USB-A.
Luckily there is room for both devices using USB-A and devices using USB-C, and making cables with A at one end and C at the other is trivial enough that many such cables exist.
Two compatible cable standards are easy to handle. The hard part comes when you have USB-A, USB-C, Lightning, Micro-USB, and Mini-USB all competing. Trying to keep enough of each type of cable on hand gets to be pricey.
Mini-USB is dead, and micro-USB/USB-A are opposite ends of the cable - they don't get used for the same thing. I'm guessing Lightning must be some weird one-manufacturer proprietary thing, in which case they deserve to be punished for their antisocialness.
So you really don't need that many cables. C-to-C, C-to-micro, and A-to-C will do you.
Contra sibling comments, it technically isn't a legal requirement. It's entirely voluntary, a memorandum of understanding between manufacturers, facilitated by the EU.
Obviously 'voluntary' here deserves some scare quotes, since the EU had made it very clear that it was willing to swing its sledgehammer and formally regulate if the manufacturers couldn't agree on a standard amongst themselves. Which would have been less good for everyone (including the EU), e.g. since formal legal regulations are a lot harder to change with the times. So they were pretty incentivized to work something out, and happily they did.
In the EU. And it put an end to the insanity of having many incompatible chargers, to the point that some mobile phone producers stopped including a charger with the phones they sold, because they could rightly assume one was available from the previous phone.
One really good EU standardization effort that actually worked, except for Apple.
There was an exception clause specifically for the likes of Apple. As long as the proprietary plug was removable from the charger (either by having the other end plug into an A socket on the charger, or by being a proprietary-to-micro converter), the EU would accept it.
The main point was to facilitate that chargers could be reused, rather than fill up drawers and landfills.
Apple can still do so because their charger cable can be detached and has an A plug on the charger itself.
Thus the charger can be reused on a different device with a simple change of cable (a different story is that Apple has its own way of signaling the maximum amperage to the device).
They ship with a lightningstrike/thunderport/whatever-it-was-called-to-USB cable, though, which can be used to charge it with pretty much any USB charger out there.
Not perfect by far, but better than it used to be before the EU started to tell phone manufacturers to wisen up.
It’s easy to forget how crazy bad things used to be before the EU mandated a standardization.
From what I have read, it applies in Europe: if the device doesn't include a micro-USB port, it must have a dongle/adapter included at no extra charge, which is what I believe Apple does.
A committee goes into a room to design a single vehicle that will do all jobs: it will carry heavy loads, it will commute to work, it will fly between airports, it will work on the water and below the water. Such is the challenge facing these guys.
Is it surprising that it doesn't work well? Not really. The compromises are pretty extensive. And between users demanding thin phones and rugged connectors, it's really, really hard to do that with anything other than perhaps titanium.
I have a bunch of computers; my oldest was built in 1968, the latest last year. Invariably, port complexity goes up with the older computers, and connector reliability goes up with it. I've got AUI cables for Ethernet that work as well today as they did in 1984 when they were all the rage. I've got 50-pin SCSI-1 and 36-pin Centronics printer cables that are reliable and functional. I've got computers with three of the four USB 2.0 connections on them unusable, either due to fusing issues or mechanical strain. I've got a phone handset (my developer's version of a Nexus One) with a micro-USB connection that won't hold the plug in the socket.
I guess the bottom line is that we can make really reliable and ugly connectors.
Definitely those SCSI connectors with the two little screws holding them in aren't going anywhere. But the original ones had ribbon cables. Those would be impractical on today's devices.
Those thick cables also had a significant impact on the mechanical strain - they required a substantial connector to avoid disconnection due to cable movement. But they were mostly used for desktops anyway, and weren't really intended to support the idea of moving the device around.
So, I'm sure most people appreciate the progress on the electronics side: the amazing data rates, the engineering of the wiring, the flexible, lightweight cables.
But where we've gone wrong is the usability side of things. Why did it take so long to come up with a connector which can be inserted upside down? USB-C is still not as good in this respect as a 1/4" jack, which has supported insertion in ANY orientation since its use in telephone exchanges beginning in 1878!
1/4" jacks were also bidirectional - the same connector on either end of the cable. Another issue we have to deal with with USB.
I would prefer that any cable you can plug into a USB C jack is interchangeable. Conceptually, it's a pipe between two devices. If they connect together, it should work. I don't think people generally understand that the cable itself has electronics in it that negotiates power and data rates. And I don't think we should go down the route of marking cables either. They should just be universal.
I hope Apple takes the lead on cleaning up this mess. They've committed to USB C, and this is a major problem right now.
A major change I see between the cables and connectors you describe is the move from parallel communication to serial communication. This has brought with it a great improvement in speed. I suppose we could have a large connector with just a few pins for serial communication, but then those individual pins would be vulnerable to breakage, just as they were in the old parallel connectors. Maybe even more breakable due to their sparsity. I'm surprised your connectors lasted so long.
Contact-based connectors rather than pin-based connectors, as mentioned in some other thread, might be the answer. I don't know what problems they might lead to, though. Personally, I'm a huge fan of Apple's MagSafe 1/2 cables.
There are committees and there are committees. In this instance I feel it was a committee stuffed with company yes-men rather than electrical and computer engineers.
And thus we get a "standard" that is very adaptable to company "needs" but a nightmare in electrical and computing terms (never mind basic daily usage).
USB-A had all of these compatibility concerns too, when it first came out. I remember having a Gateway All-in-One with only two USB-A ports in '98. You had to worry about USB 1 versus USB 2, and some devices wouldn't get enough power through a hub, and so forth.
I'm really looking forward to seeing where USB-C is in two or three years.
He does address this at the end. Also, a cable that doesn't support Thunderbolt and 100W charging will probably always be cheaper than one that does, and thus we should expect suboptimal cables to keep being sold.
Oh, in those days, computers would fry themselves -- literally -- from overheating. CPU vendors raced to break the 1 GHz barrier, but thermal throttling wasn't as mature. If I pushed my computer too hard for too long (say, with a really tough ray trace) then it would just shut off uncleanly from being too hot.
The joke was you could fry an egg on your CPU because it got so hot.
I just love the fact that I can now charge my laptop and my phone using the same charger - and both charge super fast. I travel often, and going with one charger is a lifesaver. Being able to charge my laptop from my old power bank to get two more hours is also super cool.
I remember discovering that some of my micro-USB cables could only charge and couldn't transmit any data. I honestly don't understand why people talk about the issues with USB-C as if they were a new thing. Seems to me like most of the problems are theoretical and that most end-users are actually happy with the new standard.
1. They already have a lot of accessories for Lightning, and they would kinda lose that sweet 30% MFi cut on 3rd-party Lightning accessories
2. A lot of people who don't know why USB-C is great will probably bitch about how Apple is changing the port on the iPhone _again_, like they did when they switched from 30-pin to Lightning. Could be a PR nightmare
No PR nightmare is bigger than the loss of the headphone jack - I imagine a lot of people had great and expensive headphones that are no longer usable directly. Apple seems to have survived that.
If memory serves, Apple had a licensing deal with Intel over the Lightning port on the iPhones. I can't recall when the deal expires, but that's why the laptops already have USB-C and the phones/tablets don't.
Strongly disagree with the article; it seems the author is confused about the internal hardware. The actual pin connections within all USB Type-C cables are the same, no matter if they are shipped with a product that uses Thunderbolt, DisplayPort, or any other standard. The only variance cable to cable (and this exists with existing micro cables as well) is in things like shielding, AWG, and ferrites, which determine signal and power integrity over the length. I.e., don't use a very long and skinny cable for delivering lots of power or high-speed data unless the cable was bought for that specific purpose, which is something consumers already do today if they aren't using professional installers.
Most of the rant is about different device capabilities, and how cables being more or less the same makes it almost worse instead of better.
And that high variance in cable quality is a bigger deal than it needs to be. You can't tell by looking at a cable what bandwidth and amperage it's supposed to support. Even if it has thick wires you can't tell if it has a chip to enable 5A charging.
Also there's a non-quality factor. USB-C cables with no high speed wire pairs are valid and are quite common. You don't need more than USB 2.0 in a charging cable.
It would be nice if we had some standardized iconography for the cables (x amps, Thunderbolt/non-Thunderbolt, &c.). There are really only a few useful permutations of cables.
Devices are a bit more complicated, but I'm hoping that the Hub situation will sort itself out (early in the days of USB 1.1, I remember plenty of devices that wouldn't work with hubs, or only with certain hubs).
There is standardized iconography[0][1]. It's enforced under trademark law by the USB Implementers Forum. In addition to the old "basic" (USB 1.0/1.1), "hi-speed" (USB 2.0), and "on the go" icons there's now "superspeed" (USB 3.0), "superspeed+" (USB 3.1), and icons for the power delivery spec. The new icons are just as inscrutable as the old ones, but they exist and the USB Implementers Forum requires correct labeling for all cables and devices.
The various proprietary extensions might be less consistent with labeling, but at least every Thunderbolt-compatible cable I've seen has a little lightning bolt icon in addition to the standard USB ones.
When I was trying to clone the drive from my old MBP to my new one, Apple's restore tool refused to recognize the USB-C charging cable (that comes with the MBP power supply) as compatible with Thunderbolt. But running a classic Thunderbolt cable with USB-C dongles on both ends worked fine.
It turns out that the Apple USB-C high-speed charging cables just don't support the required Thunderbolt standard, and the software can somehow tell that the cable is incompatible despite it fitting in the slot.
There's no colour scheme that can adequately handle all the possible variations.
But there are other problems: it makes no sense to have all 4 ports on a 15" Macbook Pro (for example) support power. You only need one port to charge with. I think I read that not all the ports on the 13" Macbook Pro are the same (in terms of capabilities).
This just strikes me as a (hollow) victory for principle ("one port/cable to rule them all") over pragmatism. It always reminds me of the quote: a foolish consistency is the hobgoblin of little minds.
This whole thing is expensive, unnecessary, user-unfriendly and confusing (to the average user).
Insanity seems to be the right word for USB-C.
Let's imagine you buy a new monitor, which has a usb-c port. Fortunately, your device also has a usb-c port so it will just work, right?
Well, that monitor could support at least the following protocols over that cable:
- Displayport, obviously.
- HDMI, because that might be useful in some cases.
- MHL Alternate Mode for USB 3.1, which is not-quite-HDMI.
- USB itself, like the external mini video cards created by DisplayLink
- PCI-E, as the monitor could contain any regular video card. Unlikely, but technically possible.
Is the monitor going to work when plugged into your laptop? What about your phone? Will it charge either of those? I'm not sure how this has managed to go this horribly wrong, but somehow "supports USB-C" has become completely meaningless. Oh, and it can also support analog audio as well!
And if anyone thinks I'm exaggerating, please tell me which devices do and which do not work with this "usb-c to vga" adapter, because to me it is literally impossible to tell: http://www.belkin.com/uk/p/P-F2CU037/
In other words, they're still different ports (for all practical purposes), they just all have the same connector shape (which makes it harder, not easier, for the reasons you mention).
Put that way, I'm not entirely sure it's worse. Coming from a world where the connector helped you determine what you could do, it seems worse, but that's because we've all internalized the cost of not having the right cable and having to buy different cables for everything. Having a cable that works for most things (if you were lucky enough to buy a good cable, and I'll freely admit that is a clusterfuck of planning) leaves you with just having to know if your ports will work together. That seems solvable, even if it's not currently solved. I think we're a step closer to the dream of a simplified connector ecosystem, even if we aren't quite there yet.
The problem is that the unicorn cable that can safely support the dizzying array of power and data transfer options afforded by USB-C costs a lot and negates the cost advantage of a single port.
USB-C solves problems that nobody actually had, and in doing so creates a bunch of new ones that are expensive and difficult to solve for the end user. It isn't quite as dumb as 802.11ad docking, but it's close.
How long is a generation? USB-C products have been in production and shipping since 2014. It's been on the drawing board way before then.
When the first USB rolled out I don't remember any issues like this. The big problem with previous tech was lack of support (like only having one USB 2.0 port and the rest being USB 1.1) and expensive cables (we do have those with USB-C). These[1] were fairly ubiquitous for a good 10 years and I don't remember any issues.
I feel like there could have been better consumer-facing design, like color bands to mark support on cables and ports (just as USB 3.0 ports were blue). They would designate whether a cable or port supports Thunderbolt, USB 3, HDMI, DisplayPort, or high power. Resistors have been doing it for almost 100 years, and this version would be much simpler. Or, like PCI-E, have a lane system where more lanes support higher-level features.
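To sketch what I mean, here's a hypothetical band scheme in Python; the colors and categories are entirely made up, just to show the shape of the idea:

    # Hypothetical color-band scheme for USB-C cables, resistor-style.
    # Entirely invented for illustration; no standard defines these colors.
    BANDS = {
        "blue":   "USB 3.x data",
        "red":    "high power (5 A)",
        "green":  "DisplayPort alt mode",
        "orange": "Thunderbolt",
    }

    def describe(cable_bands):
        # No bands at all would mean the baseline: USB 2.0, charge-only.
        return ", ".join(BANDS[b] for b in cable_bands) or "USB 2.0 / charge-only"

    print(describe(["blue", "red"]))  # USB 3.x data, high power (5 A)
    print(describe([]))               # USB 2.0 / charge-only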
But this isn't only a branding problem with incompatible cables. You just can't have a USB-C hub with this standard. From my (possibly flawed) understanding, because USB-C switches modes to carry something like DisplayPort, for example, you can't also carry a USB-C signal at the same time, so you can't split USB-C (without one of the ports shutting off while in that mode). Maybe you could if your hub didn't support those exclusive modes? I haven't seen any actual USB-C hubs for sale (that give you more USB-C ports on the other side).
> From my (possibly flawed) understanding, because USB-C switches modes to carry something like DisplayPort, for example, you can't also carry a USB-C signal at the same time
A USB-C connector has one USB 2.0 pair, four high-speed differential pairs, and two sideband wires. For USB 2.0 you need only the USB 2.0 pair, for USB 3.1 you need the USB 2.0 pair and two of the high-speed differential pairs.
For DisplayPort, you need two or four of the differential pairs, plus the two sideband wires. If you're using two of the differential pairs for DisplayPort, you can use the full USB 3.1 at the same time; if you're using four of the differential pairs for DisplayPort, you still have USB 2.0 left.
The same applies to other alternate modes. As long as they need at most two of the differential pairs, you have USB 3.1, otherwise you have USB 2.0. And power delivery has its own set of wires, so it's also always available.
(The exception is the 3.5mm plug adapter alternate mode, where everything except analog audio and slow charging is unavailable.)
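A toy model of that pair budget, just restating the allocation above (my own sketch, not anything normative from the spec):

    # Toy model of the pair budget described above. A USB-C cable has one
    # USB 2.0 pair, four high-speed differential pairs, and two sidebands;
    # power delivery uses its own wires, so it is always available.
    HS_PAIRS = 4

    def remaining_usb(dp_pairs_used):
        """USB capability left after DisplayPort claims some pairs."""
        assert dp_pairs_used in (0, 2, 4)
        if HS_PAIRS - dp_pairs_used >= 2:
            return "USB 3.1"  # two high-speed pairs + the USB 2.0 pair
        return "USB 2.0"      # only the dedicated USB 2.0 pair is left

    for used in (0, 2, 4):
        print(f"DP on {used} pairs -> {remaining_usb(used)}")
    # DP on 0 pairs -> USB 3.1
    # DP on 2 pairs -> USB 3.1
    # DP on 4 pairs -> USB 2.0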
> From my (possibly flawed) understanding, because USB-C switches modes to carry something like DisplayPort, for example, you can't also carry a USB-C signal at the same time, so you can't split USB-C (without one of the ports shutting off while in that mode).
I'm not sure how all those one-cable USB-C docking stations work then, as they carry video and audio one direction (to the dock and then out to the screen), and multiple USB devices and power the other direction (from the wall and connected peripherals and through the dock). If it truly couldn't do multiple things at once, none of these docking stations would work with a single cable, and they do.
After looking some more, I do see a few hubs that offer 2 USB-C ports, but no more than that, and I see nothing but complaints: for example, that it doesn't carry power or can't send a signal to their monitor. A Belkin rep replied and said that their hub is designed for peripherals, but I don't see any technical reasons.
This post[1] says
> What the problem is - how the USB-C lines within a cable are used is determined at plug-in time, depending on what you plug in downstream of the port. For example, you can't have full-speed USB 3.1 and full video at the same time. So USB-C makes a choice for you. If you plug in both a high-res monitor and USB 3, then you can't have both at once.
Still not technically clear to me. It does sound like your transfer speed will drop when using a hub and driving a monitor at the same time. Maybe the limitation I'm thinking of is that if USB-C is carrying something like Thunderbolt, you can't carry a second Thunderbolt signal? Is that why USB-C splitters are so rare on the market - because splitting compromises so much?
I have heard that the order in which you plug in USB-C tells it which direction to charge (which gives credence to things being determined at plug-in time). For example, plugging the cable into your phone and then into a computer will make the phone charge the computer instead of vice versa. When I heard that, they were speaking about a specific phone and laptop, so it wasn't theoretical, but I'm having a hard time corroborating it. This [2] article says the direction power flows is a computer setting. What's scary is there are a lot of articles warning you not to use your USB-C laptop charger to charge your phone because it could fry it.
Thunderbolt combines PCI Express (PCIe) and DisplayPort (DP) into two serial signals and additionally provides DC power, all in one cable. Up to six peripherals may be supported by one connector through various topologies.
Yeah, seems like just putting icons near the port, showing the supported technologies, would solve the problem just as well as having different ports, while not requiring different cables or space on the devices.
No. As a hardware designer, I can tell you that is a terrible idea. Yes, it looks clever at first sight (all of us went through that phase of thought). But then when you get the physical device and you get to use it, it is only a matter of days or weeks before you encounter many problems with that choice of design (inconvenience at best).
Even if the signals of some ports are electrically compatible, I try to use different shapes of connector for them if they are not logically compatible (the functions are fixed, not swappable). Because you can be sure nobody will always look at icons which are faint, small, can only be seen correctly under a certain angle and light, etc. Supposing you even know which icon means what. Supposing you can even get a sight of the icon and port at all: it may be hidden behind something, so you have to blindly try whether it fits.
You do not want to be called every other day because "it doesn't work anymore" because someone connected the wrong device to the wrong port :-). So for some serial ports with incompatible functions, you do not put all DB-9 ports, you put 1 DB-9, 1 round DIN, 1 rectangular (and hope they do not manage to fit the rectangular connector in the DB-9), 1 DB-25... It looks messy, shambolic? Yes it does. Definitely. But it is efficient because it is idiot-proof. And I mean a very large definition of 'idiot' which encompasses about everyone.
I can't tell you the number of times I've 'plugged' a USB cable into a DisplayPort or the power cable into the telephone plug because they sit in the same area and have sort-of-compatible shapes... I generally notice quickly because it does not fit well, but just imagine the nightmare if they were truly the same physical connectors...
I think my proposal wasn't clear: you wouldn't have a DisplayPort and a telephone port on the device, distinguishable only by the icon.
I think the point of USB-C is that the same port can support both functions, so you don't have to care which port you actually plug the cable into. My proposal was just for when you are buying a new device, to make sure it supports the same technology as your existing devices - only then would you compare the icons.
You just reminded me of how I killed my first iPod: I somehow managed to plug the cable in upside-down. FireWire was designed to only fit in one way, but for whatever reason, the FW port I plugged it in to didn't fight back when I hurriedly tried to plug it in upside down.
Yes. What defines the protocol is the endpoints, not the cable. Just avoid the "USB 2.0" USB-C cables, which do not connect all the pins (only the ones needed for USB 2.0). Of course, higher speeds and higher charging currents need a higher grade of cable, but that higher-quality cable will work with lower speeds and lower currents. Take a look at the Wikipedia article: https://en.wikipedia.org/wiki/USB-C
AIUI: nothing is protocol-incompatible, but Thunderbolt-compatible cables are short, stiff, and expensive. So you can buy only those, but you do make sacrifices to do so.
It's certainly worse from one aspect: you get fewer ports because they have to support so much. My laptop has a power port, two Thunderbolt, two USB-A, and an HDMI. The new ones have just two USB-C ports, which can do all that stuff, but you only get two of them (or just one if you are plugged into power!).
I have the 15" MacBook Pro which has 4 USB-C ports. I also picked up the HDMI adapter. The adapter has a USB-A port, an HDMI port, and a USB-C power pass-through port. By plugging my charge cable into it I get power, external USB, and an HDMI monitor and it takes up only 1 USB-C port. I also like the fact that I can plug this contraption into the laptop from either side.
I would much rather have 4 USB-C ports than even 8 ports that all do one thing only. It's like having $40 cash instead of 8x $10 gift certificates (of which perhaps only one you might want to use).
I understand your point, but my experience is the opposite. On my old MBP, I'd be using nearly every port when docked at the desk. My 2016 MBPwT has only four connections which means that at power+GbE+DP+mouse&keyboard I'm now out of ports for anything else. My only option is spend even more money on a docking station which has its own share of problems (I've been searching for a decent one forever).
It wouldn't be so bad if it wasn't for the fact that USB-C/TB3 has been such a disaster and even mechanically a disappointment.
I was stoked before getting my Dell Precision 5510 (aka XPS 15 Pro) that it would have a USB-C docking station. It would be awesome with one tiny cable for power+displays+all my desktop USB devices.
But it's never actually worked, I ended up RMAing the docking station. Since the laptop itself can only support one display, I can't have dual monitors. So now I have a 43" 4K monitor, a USB hub and a power connector that I need to plug in/out all the time. And Win10 still gets confused about DPIs and scaling and stuff.
On my ancient-but-still-kicking Dell Precision, at least the docking station actually works and you can simply click in and out in three seconds.
For now. As those old ports become more obsolete they won't disappear; they will be repurposed as USB-C. Some laptops have both mini-DisplayPort and HDMI (HP Pavilion). My Dell XPS has USB-C/Thunderbolt 3 and HDMI. At some point, HDMI will become less common, and that will free up space and resources for other ports. As USB-A becomes less common, laptops will ship with an extra USB-C and one less USB-A, until eventually there's no USB-A at all. When's the last time you saw a PS/2 port on a laptop? It used to be standard.
For me, I'm enjoying the power of the USB-C/Thunderbolt connector on my Dell XPS. I use a docking station, which I'm sending 4K video and audio out to while receiving video, mouse, Ethernet and power over the same cable. One-cable docking is amazing, and really shows off the possibilities of the technology (although in this case I'm pretty sure it's Thunderbolt doing the heavy lifting).
But the CPU and chipset need to talk to all those ports. If any USB-C port can suddenly need to do Thunderbolt 3, that's a ridiculous bandwidth requirement. You probably can't do that for 8 ports on anything.
This is exactly the sort of problem that tends to go away over time (often before it becomes an actual problem for very many people). The PCI-E and even RAM bandwidth of modern chipsets would have been absolutely astounding 10 years ago.
You can share bandwidth among the ports. The default option for 8 ports would use 4 PCIe lanes for each pair of ports. Even full bandwidth for all ports is 32 lanes, which is not too hard to do.
And even my CPU could manage to have 8 lanes feeding a GPU and 8 lanes feeding a switch for the Thunderbolt ports, something that provides enough bandwidth for the vast majority of use cases.
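Rough version of that lane math, under my assumptions that each Thunderbolt uplink is PCIe 3.0 x4 at roughly 8 Gb/s raw per lane:

    # Back-of-the-envelope PCIe lane budget for 8 Thunderbolt-capable ports.
    # Assumes PCIe 3.0 and an x4 uplink per port (or per pair, if shared).
    LANES_PER_UPLINK = 4
    PCIE3_GBPS_PER_LANE = 8  # raw signaling rate, not usable payload

    def lanes_needed(ports, share_per_pair=False):
        uplinks = ports // 2 if share_per_pair else ports
        return uplinks * LANES_PER_UPLINK

    print(lanes_needed(8, share_per_pair=True))   # 16 lanes, pairs share an x4
    print(lanes_needed(8))                        # 32 lanes for full bandwidth
    print(lanes_needed(8) * PCIE3_GBPS_PER_LANE)  # 256 Gb/s of raw uplink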
> As those old ports become more obsolete they won't disappear, they will be repurposed as USB-C.
... But they have already disappeared. The 13" Macbook Pro that I'm talking about just has those two USB-C ports and none of the others. If you are plugged in for power you just have one port.
If you buy a 13" laptop you make concessions for on-device connectors, as you always have for ultra-portables. The solution is the same as it's always been, get a larger laptop, or use a connector hub. As for your Macbook Pro, that's Apple. Using the one company that's known for doing their own thing and obsoleting ports before every other company does as your case for USB-C having caused other ports disappear already isn't a convincing to me. For example, the touch bar version of the Macbook Pro 13" has four USB-C ports. Apple has made a decision to obsolete USB-A. I'm not surprised, for the reasons I gave before.
Yep - there's no obvious way to tell what accessories will work together anymore.
My Switch is another great example. It won't even charge with some of my adapters, and I had to get a C -> HDMI splitter that was designed specifically for the Switch, despite the fact that Nintendo isn't even using any proprietary protocols (just less common ones).
But in the case of wifi or bluetooth you don't have to think about if your transfer medium supports the protocol or not (it's just dumb vacuum). In the case of USB the cable can be the weak link.
Also in the case of wifi and bluetooth compatibility issues have a chance to be resolved in software on your end. In the case of USB the list of supported protocols is the property of the hardware interface and the options to resolve these issues are limited.
In the end, if you buy a device that has a "USB" port, the specification that it is "USB-C" is meaningless; you want to know the exact list of hardware protocols that the port supports.
> But in the case of wifi or bluetooth you don't have to think about if your transfer medium supports the protocol or not
Sure you can - it's gotten better in recent years, but it used to be that a lot of devices didn't support WPA2. This was (typically at least) not fixed in software.
A coworker of mine had a Cinema Display plugged in to the Thunderbolt 2 port on her old laptop. When she upgraded to the Thunderbolt 3 MacBook, she bought a Thunderbolt 2 -> 3 adaptor for the monitor.
Turns out that it was a mini-DisplayPort version of the Cinema Display, and that even though it worked fine with the built-in Thunderbolt 2 port, it didn't with the one in the adaptor. She ended up trading her Cinema Display with somebody who had the Thunderbolt version and an old laptop, because at the time that was easier than finding an mDP->USB-C adaptor.
> This whole thing is expensive, unnecessary, user-unfriendly and confusing (to the average user).
And yet, very profitable for the companies rolling it out so poorly, who can charge more for products featuring the new ports and nickel-and-dime us with expensive adapters, dongles, and cables to replace the perfectly functional USB-A ecosystem.
It's quite a clever trick they've managed to pull on their customers -- convincing us all our problems can be hand-waved away with a single, forceful utterance of "it's the future", thus relegating any criticism to "whining and moaning" about first-world problems.
That even Apple can't manage to simplify the mess within its own laptop line, let alone tablets and phones, is indicative of how poorly thought out this whole idea was from the start. They seem to have a device for just about every combination of incompatibility that is possible.
The goal was noble:
- provide enough power to charge an external device and ditch the bulky power supply and second cable
- support extremely high-speed devices like monitors and SSDs
- standardize the connector with a slimmer, symmetric design
And this is the same MacBook Pro that, for reasons that I can't fathom, moved the headphone jack to the right side despite every (single) headphone cable going to the left can.
> every (single) headphone cable going to the left can.
FWIW a number of studio headphones support plugging in a cable to either can rather than restricting you to just the left one, if you want to go that route. V-moda sells a few pairs like this and denotes the feature as "dual inputs".
Bonus: if you get up from your desk while wearing headphones, you don't have to worry about pausing or muting the audio. The laptop thinks you're still plugged in.
The best wireless Bluetooth headphones, the Sennheiser Momentum Wireless 2s, have their cord (when in wired mode) coming from the right, which is disconcerting but convenient for this one particular use case.
Can't you reverse the left/right channels in the OS? That seems like a simple solution if possible (I just looked it up and it looks like it was easy to do a few years ago, so maybe it still is).
I understand, but if you reverse the channels, you can wear the headphones reversed. Depending on the physical design of the headphones, this may or may not be feasible (if each side conforms to that ear). If it is feasible and you can reverse the headphones and reverse the channels, problem solved, as long as you remember to wear them reversed.
I think most earbuds are easily reversible. Over the ear ones might feel a little odd, but I suspect most that took the effort to be comfortable in a normal configuration (that is, have some flexibility) would be serviceable. In-ear headsets and earbuds that have an around the ear component probably wouldn't work at all.
Really? Any port on either side? With more than one port next to each other per side, do you need the option of charging from all the ports and not just the backmost on each side?
That's the ideal, isn't it, assuming it's even possible: a number of ports (2/4/6?) on your laptop, each of which you can plug any peripheral into, whether it's a monitor, power supply, hard drive, or keyboard. This, as far as I'm concerned, is best for the user: no thinking, no struggle, just plug things in and they work.
This was the promise of USB-C - at least, that's what many of us thought was the promise. Now we're finding out reality isn't quite up to it. And it's far worse for people who, naturally, assume that if the plug fits, it'll work.
Maybe that ideal is impossible, or financially prohibitive. USB-C implies it isn't, though.
It works exactly like that on modern Macs - whatever you plug into whichever USB-C port will work. The worst-case scenario is that your ultra-high-speed TB3 RAID array will work ever so slightly slower on the left-hand-side ports, but it'll still work and it'll still be plenty fast.
And I understand that USB-C ports on other high-end laptops are the same? They support TB3, HDMI, DP and other alternate modes, so this will quickly become a moot point, just like having to differentiate between USB 1.1 and USB 2.0 ports?
Never mind that with the older plugs, the cables were basically passive. But now there have to be special resistors in them to signal what kind of power delivery they can handle, and oh so many companies get it wrong. Because there is also a special resistor that needs to be used if you have an A-to-C cable, which overrules all the others.
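As far as I understand it, the signaling really is that crude: a pull-up resistor on the CC line advertises the source's current limit. A sketch, with the resistor values from memory (treat them as approximate):

    # Sketch of Type-C current advertisement via the CC pull-up resistor
    # (Rp). Values are from memory and approximate; pull-ups are to 5 V.
    RP_TO_ADVERTISED_CURRENT = {
        56_000: "default USB power (what an A-to-C cable must advertise)",
        22_000: "1.5 A",
        10_000: "3.0 A",
    }

    def advertised_current(rp_ohms):
        return RP_TO_ADVERTISED_CURRENT.get(rp_ohms, "invalid/unknown Rp")

    # The infamous bad A-to-C cables used the 10 kOhm resistor, telling the
    # device it could pull 3 A from an A port never built to source that.
    print(advertised_current(10_000))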
These differences are also not exposed in a user-friendly way: I know I have to plug the charger cable directly into my MacBook, and I know, in broad terms, why. But there's no alert telling a naive user they can't (actually they can, but shouldn't) plug it into a dongle and plug the dongle into the MacBook. It appears to work. Eventually you might discover that it works poorly. It's a fake simplicity. Not putting device-specific charging into USB-C would address many of the hidden gotchas, and would have maintained the cleverness of mag safe charger plugs.
At best you can say you won't damage equipment in the process of discovering the quirks of USB-C. Except, of course, for the cheap not-up-to-spec cables that might cause damage.
I’ve been using computers since I was 7 in 1985. I never had to think about the chain of custody of my power cord until very recently.
IMHO, regulatory bodies should be taking a hard look at multi-use cords over a certain power level. It’s literally a matter of time before people die as a result of some fake Amazon cable.
Could you explain why you need to plug power directly into your MacBook Pro or MacBook? I use both, and was unaware of this. Any source would be super helpful too.
You may not get the full wattage through a dongle. For example, the Satechi adapter with pass-through charging (recommended by The Wirecutter at https://thewirecutter.com/reviews/best-usb-c-adapters-cables...) is limited to 49W: go to https://smile.amazon.com/dp/B01J4BO0X8/ and search for "Important Notes". 49 watts may be enough for many uses, but (as I discovered recently) not when running a job on all the cores of a 15" MacBook Pro.
Now, presumably the designer of this dongle had some reason to limit the power output. Perhaps the heat dissipation is not enough (the dongle gets pretty hot even when using it just for USB and HDMI). What if some other manufacturer is not so careful and allows negotiating more power through a similar device?
My MBP wants an 87 watt USB-C charger. If you read the product page for the charger, it recommends plugging directly into the MBP's USB-C ports. When I tried charging through the dongle, the battery would charge slowly, or drain a bit when, for example, running gradle builds.
Say what? Most dongles with usb-c ports sold so far are specifically designed to relay power. If that's not working for you then you may have gotten a bad dongle.
I think that's the point. With off-the-shelf aftermarket parts, it's not always clear if you're getting something that can provide clean pass-through power or not. Or even if that's something you should be worried about.
How is that USB-C's fault? Mini-DisplayPort and Thunderbolt 1 ports look the same, and no one complained about it. There are different capabilities on USB-A ports, and Apple never marked their USB-A ports blue, and no one complained about it. The alternative is like, what, we settle all the capabilities this port can have, and say that's it, we cannot add new capabilities to this port to confuse consumers, we must use a new port for new capabilities?
The 13" MBP having non-equal USB-C ports, if true, is totally Apple's fault. I'm actually surprised that nobody has sued Apple for misleading their customers (I don't think Apple disclosed that not all ports on the 13" MBP have all the capabilities they bragged about).
> Mini-DisplayPort and Thunderbolt 1 ports look the same, and no one complained about it. There are different capabilities on USB-A ports, and Apple never marked their USB-A ports blue, and no one complained about it.
Plenty of us complained about it. Some even saw it as a reason to avoid Apple.
> The alternative is like, what, we settle all the capabilities this port can have, and say that's it, we cannot add new capabilities to this port to confuse consumers, we must use a new port for new capabilities?
It's insane to me that Apple sold a mini-DisplayPort edition of the Cinema Display for years, but doesn't bother explaining how to plug it in to the new Macs.
> Search "macbook thunderbolt pci" on Google? It's nowhere in sight by page 3...
With "macbook pro 13" thunderbolt speed" it's the very first result (for me at least)
In any case, no, they aren't going out of their way, nor should they IMO. I think that would confuse many more people than the amount that are looking specifically to use > 2 full-bandwidth Thunderbolt connections at once on a dual core machine. I'd wager that > 95% are going to be happy they have two extra 10Gbps USB and charge ports.
Why would any potential buyer search that specific term instead of going to Apple's official tech specs page[1], which says absolutely nothing about the differences between the 4 ports? That >95% of users won't need more than 2 full-bandwidth connections might be true, but a lot of them might expect to be able to use those two ports on either side.
Actually, on that official tech specs page Apple bluntly claims:
Four Thunderbolt 3 (USB-C) ports with support for:
Charging
DisplayPort
Thunderbolt (up to 40 Gbps)
USB 3.1 Gen 2 (up to 10 Gbps)
There's not even a footnote to say that 2 of them actually cannot reach 40Gbps bandwidth. If this is not false advertising, I don't know what is.
I agree it should be mentioned on that page too ("up to" would probably get them out of any false advertising claims). It's doubly confusing because there are 13" models with only 2 ports vs 4.
I definitely recall seeing complaints about Thunderbolt (and I can't remember if I ever commented publicly, but I definitely thought it was a poor decision). Even so, they were probably less frequent due to the relative uncommonness of mini-DisplayPort, and mitigated by the fact that it was essentially MDP or MDP+Thunderbolt - relatively simple next to the variety of modes/standards USB-C can use.
> There's no colour scheme that can adequately handle all the possible variations.
For cables, you can whittle it down to two things. Bandwidth, and whether it supports 3 amps or 5 amps. Supported modes is a function of signal bandwidth. 5 amp can be a little power icon, and bandwidth only needs about 3-4 options. So most cables would only have a bandwidth rating.
This is not much harder than the issue of 8P8C connectors.
Depending on cable configuration, pinout, wall plate and structured wiring system, that 8P8C might be usable (or not) for multiple different types of data networking, from the assorted ethernet speeds to E1 to token ring, or for a serial console, or delivering power and audio to a remote speaker, or hdmi-over-utp, or even -48V telephony, and let's not even get started on the only-subtly-different but actually incompatible RJ45S connector, or people sticking RJ11 plugs in 8P8C ports.
And yet the world has coped with this proliferation.
I'd be happy with a port that can accept a plug right side up or upside down. I get tired of peering at the back of my computer with a flashlight trying to figure out which way to put it in.
- USB-C is a connector and not all ports/cables/hubs support all of its features and modes. So not everything that can be connected using a USB-C will work the same way, or even at all.
- Due to their potential bandwidth demands, computers can’t have very many USB-C ports
- USB-C will be phased out and replaced before settling down
One of the folks involved in the original USB standards work once told me that the lack of reversibility (or at least a more obvious indicator of orientation) was the main thing they got wrong with the initial standard.
The other one was making it the same width as an 8P8C modular jack. The number of times I've put a USB-A connector in the Ethernet jack when reaching blindly behind my computer...
The port itself is nice; it's the cube of data rate, power rate, and port type that is the problem.
You can have A ports that deliver 20V and 3.1 data rates, and C ports that deliver only 5V and 2.0 data rates, and they are both valid according to spec.
Very few connectors look symmetric but can't be plugged in both orientations. It is actually a very big oversight by the design committee because, sadly, most of us are not as sharp as you.
Don't think of it as "designed for idiots", think of it as "reaching for the cable in the dark and trying to get your phone on the charger in the last few moments of consciousness before you fall asleep" and you'll have a better idea of why it's reversible.
How big a mess is USB-C? Check this note from Plugable: "We have had several confirmed cases where lowering the Power Output of the internal Wi-Fi adapter to 75% in Dell XPS models has helped with USB-C disconnect behavior."
Also, while you can get a small USB-C to dual DisplayPort MST hub http://a.co/hyXGdBA and a small USB-C to DisplayPort adapter with USB-C power passthrough http://a.co/1KE2imb, you can't get a USB-C MST hub with power passthrough in a single, small, relatively cheap device. You need an expensive, quite huge device like the ThinkPad USB-C dock to get this functionality. It seems as if every USB-C accessory maker took a blood oath to hardwire an HDMI converter to the second port of their MST hub. (Yes, taking a USB-C to DP converter and an ordinary MST hub is a cheap solution, of course, but it's not elegant at all, especially because both insist on using cords, creating a mess.)
Also, there have been laptop chargers with 65W or 90W on the laptop side and one or perhaps even two USB charging ports, if you are real lucky then 2.4A each. Now that both your laptop and phone charge via USB-C, you'd think you could get an AC adapter with two USB-C ports and perhaps a few USB-A thrown in for good measure. Dream on. The highest-wattage adapter I found with two USB-C ports is 55W, and it doesn't even charge a single laptop; it's all 5V. There's a well-known ;) brand called LVSun which sells an 80W charger which can do a non-USB-C laptop + USB-C phone (or a USB-C laptop + a non-USB-C phone), plus it has a few USB-A ports as well. Still, it's not two USB-C ports.
So... USB-C is bad because cable vendors haven't come up with a good way to easily identify what features a given cable supports? Which could be done extremely easily by things like color coding... exactly what we had to do with USB Type-A ports.
I'll never understand articles like this - let's not standardize on a form factor because not every single application of that form factor has the same requirements???
Hang on, it's not like Type-A where it's just a single parameter (1.1/2.0/3.0). A USB-C cable has tons of potential features, and a port has possibly more features still.
- Protocol: 3.0 or 3.1
- Power: QC, PD, or none of the above; how many watts
- Alt-modes: HDMI, DP, TB3
- Chipset features: UAS
Once you've assigned all of these to a color, the higher-end cables are going to look like a pride flag or something. The solution is really not that simple at all, and it's far from guaranteed that once adoption picks up you can just assume your device/cable combo "obviously" supports what you want.
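To put a number on it, a quick count over the axes above (my grouping, and not every combination is even legal per spec, which only makes labeling harder):

    from itertools import product

    # Nominal feature combinations a color scheme would have to encode.
    protocols = ["2.0", "3.0", "3.1"]
    power = ["none", "QC", "PD 3A", "PD 5A"]
    alt_modes = ["none", "HDMI", "DP", "TB3"]

    combos = list(product(protocols, power, alt_modes))
    print(len(combos))  # 48 nominal combinations to assign colors to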
On top of all of this, the construction of USB-C cables is far more complex than your run-of-the-mill cable. They're active devices with apparently absurdly tight tolerances. Did everyone forget about Benson Leung's spreadsheet of killer cables? Even if the industry can figure out user-friendly branding, it's still a roll of the dice whether you'll fry your laptop.
Lastly, USB-C seems like a mandatory weakening of security. In a few years time, you won't have any other choice but USB-C and now suddenly any random charger (or cable!) could be the easiest rootkit ever deployed. The cute pen testing exercise of dropping USB thumbdrives with backdoored Word docs is going to get a lot more serious. I wonder how long until we have "hardware firewalls" that attempt to rein in USB-C devices. We're past the "USB condom" at this point.
The check there is "does this A-to-C cable say it's A-to-C, or say it's a 3 amp source". If you use it with an incorrectly designed charger, the charger could get damaged. "Killer" is not really fair, because the only "fry your laptop" situation was an instance of a wiring mistake that could happen on any kind of cable.
cesarb is right, you can represent it as max protocol and max current, and type-A cables have approximately the same setup.
Hang on, I thought there were only two main axes for a USB-C to USB-C cable: maximum protocol (3.1 Gen 2, 3.1 Gen 1, 2.0) and maximum current (3A, 5A). Also, wouldn't most USB-C cables be passive, with the choice of alternate mode left to the endpoints? I don't think a passive cable cares whether the wires are being used to carry DisplayPort, HDMI, or something else.
Then each year, as new standards are added, you make the shade of green slightly lighter. Then you just need to know what shade of green your use case supports and make sure your cable and ports are as light or lighter.
I guess my question is, if USB-C has to be this complicated, then how is it an improvement over having multiple port types? I would argue that it's worse, and I think that's the point of the linked article. Compared to how things used to be, USB 2/3 is a marvel. Chain a dozen things off a couple of ports, all of which use the same plug. And it (almost) always Just Worked. The connector design was an improvement over what preceded it, but was still ultimately problematic. USB-C is a definite win in terms of form factor and usability of the connector itself. But if not all USB-C ports and cables can support Thunderbolt bandwidth, then all we're doing is sowing confusion. Why not stick to using mini-DP for Thunderbolt and have USB-C keep the USB dream alive?
This is a bit of a tangent, but mini-DP is a horrible connector and I'm really glad we're getting rid of it. It's fairly bulky for what it does, and it does a really bad job of holding the plug in the port. I'm glad that TB is moving away from it, although not necessarily that it's moving to USB-C.
Yeah, full-size DP is one of the nicest display connectors and mini-DP is one of the worst, though mini-DVI was nearly as bad. I've never seen mini-DVI on a non-Mac, but mini-DP is everywhere.
I have never heard about confusion with Ethernet cables. Everybody understood that to get gigabit speed one needs more expensive Cat6 cables, with no suggestion of a different connector. Why should USB-C be different?
That's not necessarily true. Ethernet speed negotiation runs at 1 Mbps and is based on announcing capabilities. If both sides are capable of GigE but connected with Cat3, they can negotiate GigE and fail to work. There's nothing in the standard that provides for negotiating a slower speed in this case (though some NICs or drivers may figure it out).
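A sketch of that failure mode, assuming capability announcement works as described (endpoints advertise what they support; the cable never gets a vote):

    # Toy autonegotiation: both NICs announce supported speeds and pick the
    # highest one in common. The cable's actual grade never enters into it.
    def negotiate(nic_a_speeds, nic_b_speeds):
        return max(set(nic_a_speeds) & set(nic_b_speeds))

    link = negotiate([10, 100, 1000], [10, 100, 1000])
    cable_max = 100  # say it's Cat3 wiring, good for 10/100 at best
    print(link)               # 1000: both sides happily agree on GigE
    print(link <= cable_max)  # False: a link the cable can't actually carry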
But again, like USB1/2/3, Cat5/6 is one additional parameter, not a matrix. You look for a single number and you're done. That's much more complex for USB-C, in a context where the industry is often (maybe even willfully) opaque on reporting correct specification. I hope they will come up with a simple way to signal all these combinations (maybe a standard color-coded label like you have for calories/salt/etc contents in many European supermarkets) but at the moment it's definitely an issue.
You're acting as if cables are picking and choosing tiny subsets of the spec. In reality, most cheap cables support almost nothing, and just about every expensive cable I've seen supports all of the spec as currently established. It's not like someone is going to choose to support Thunderbolt and then not support USB 3.
And there's also Power over Ethernet, which comes in multiple standards and several mutually-incompatible non-standard variants. And non-Ethernet systems that use the same 8p8c connector. It's still a matrix.
I'd argue that the people who even know about PoE are a small number whose jobs depend on understanding such intricacies (and even they can get it wrong, as proven by this very thread). USB is mass-market consumer technology and has to be simpler than that if it wants to succeed. The best we can hope for, as I mentioned in another post, is something similar to washing labels.
Not really. The cable thing is easily solved - don't buy cheap cables. If you ever plan on using USB-C for thunderbolt, only buy cables capable of thunderbolt.
As for devices - I guess I'd say know what you're buying? How is it any different from the fact that I have to know that some mini-DisplayPort ports can do Thunderbolt and some can't? Some cables can and some can't?
I mean... Belkin has already started putting the thunderbolt icon on the cables they sell that support it. This isn't anywhere near as complicated as he's claiming. The only gray area is people buying the cheapest possible cable they can find on Amazon then acting shocked when they get what they paid for.
Sadly, looking at some of Benson Leung's results, e.g. https://plus.google.com/+BensonLeung/posts/TkAnhK84TT7, it looks like bad cables and chargers cost the same as good ones and can come from name brands, and that's just on the power front. Looking through some of his suggested cable/charger spreadsheets shows things generally being a roll of the dice.
> The only gray area is people buying the cheapest possible cable they can find on Amazon then acting shocked when they get what they paid for.
Don't act like cheap cables frying your equipment is normal or expected. Off-brand products usually aren't as good as the expensive stuff, but we have a right to expect them to not be actively dangerous.
The article wasn't saying don't standardize on a form factor.
> It’s comforting to think that over time, this will all settle down and we’ll finally achieve the dream of a single cable and port for everything. But that’s not how technology really works.
It could be done by color coding (or standardized icons, or something), but the standards body didn't actually go to the trouble of doing that. I don't think the OP is objecting to the connector; I think he is objecting to the failure to anticipate the proliferation of cable types and to provide some standardized symbology to differentiate them.
When USB-C was announced and I saw the pin-outs, my immediate thought was, “This is going to be a support nightmare.” I should have written that down, so I could point at it as proof of my genius. :P But I didn’t know what a garbage fire USB-PD is.
I was thinking of Alternate Mode, which is what enables Thunderbolt 3 on USB-C. We already had an alternate mode for USB: MHL, which divides HDMI ports between plain HDMI ports and HDMI ports that support MHL. Alternate Mode multiplies the number of ways displays can connect to video sources via USB.
Not to mention what everybody else has commented on.
All these incompatible options are using the same connector. That’s obviously confusing to users and frustrating to support personnel.
That, and allowing OEMs to decide port, data rates, and power delivery spec individually.
In particular, we have a rudimentary power spec in the 3.1 "data" spec, involving resistors in the cables (?!), and a separate Power Delivery spec that goes above and beyond the 3.1 power stuff to define anything up to 20 V at multiple amps.
Frankly, to me it looks like there was an initial C port design that was a "simple" mechanical, reversible implementation of the existing 3.0 wiring. Then someone decided that rather than simply doubling the pins mechanically (either in plug or in port), they could make each one an individual wire and run even more stuff through the port. And thus we get Alternate Mode, which can carry just about anything the OEMs dream up...
>Much of USB-C’s awesome capability comes from Thunderbolt and other Alternate Modes. But due to their potential bandwidth demands, computers can’t have very many USB-C ports.
Why is this the case?
I read that Thunderbolt 3 can handle upwards of 40 Gb/s.
Ok... but what is limiting the number of USB-C Thunderbolt-enabled ports that a given computer can have?
Surely each of those ports is not being maxed out to 100% of its bandwidth 24/7?
Why can't the bandwidth just be limited as needed like in a cheap router for example?
Ironically, since then I've just made Thunderbolt 3 a requirement for all company laptops, and the whole company, Mac and Windows, gets to use the same dongles, chargers, and even docking stations, which I count as a win.
I thought the entire point was to have _cables_ that work everywhere, which is my preference. I think the ports themselves can and should do different things - even my USB-A ports on my laptop have little icons beside them indicating their purpose or capabilities. The point is to not have to keep many dozen different cables on hand for everything.
Regarding the 3.1 speeds: is there any rational explanation for "USB 3.0 (5 Gb/s)" being renamed to "USB 3.1 Gen 1 (5 Gb/s)" and "USB 3.1 (10 Gb/s)" being renamed to "USB 3.1 Gen 2 (10 Gb/s)"?
> And, of course, there’s usually no way to tell at a glance whether a given cable, charger, battery, or device supports USB-C PD or at what wattages.
Forget wattage -- voltage is also a big deal. Want to charge a Dell laptop? You need 20V. Charger support for PD at 20V is somewhat related to wattage, but it's easy to find 40-odd watt chargers that can't supply 20V. And there is usually nothing in the specs that tells you this.
Is there any reason the spec couldn't say all cables deliver 5 V and the receiving end converts that to what it needs? From my understanding of electronics, if you double the voltage, you halve the amperage. So supply 5V@20A and let the receiving end convert it to 20V@5A.
Less current means less heat generated in the wires, so you can use thinner copper; higher voltage requires better insulation on the wires. Altogether I think it's a good idea to use 20 V at 5 A for 100 W rather than 5 V at 20 A: you get much more heat and voltage loss at 20 A for the same size wire.
Cables are limited by both current and voltage. The thickness of the copper limits the current, and the thickness of the insulation limits the voltage. 20A is a massive amount of current and won't be easily carried by anything flexible (the wires in the wall of your house are only rated for 15A, and they are quite bulky). 20V on the other hand isn't that high and needs very little insulation.
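To put rough numbers on that, here's a back-of-envelope sketch; the 0.1 ohms is just an assumed round-trip resistance for a thin cable, not a measured figure:

    # Resistive loss in the cable for 100 W delivered two ways.
    R = 0.1  # ohms, assumed round-trip resistance of the cable

    for volts, amps in [(5, 20), (20, 5)]:
        heat = amps ** 2 * R   # P = I^2 * R, heat dissipated in the cable
        drop = amps * R        # V = I * R, voltage lost along the way
        print(f"{volts} V @ {amps} A: {heat:.1f} W of heat, {drop:.1f} V dropped")

    # 5 V @ 20 A: 40.0 W of heat, 2.0 V dropped (out of only 5 V!)
    # 20 V @ 5 A:  2.5 W of heat, 0.5 V dropped

Same 100 W either way, but at 5 V the cable wastes 16x more power as heat and loses 40% of the supply voltage before it even reaches the device.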
This may be a stupid question, but why does a serial bus require 24 pins? That’s only one less than the old DB-25 parallel cables. (I know it’s really only 12 mirrored, but still). USB 1-2 got away with 4 pins.
Also, why can’t we just switch to optical cables already? We’d only need +V, GND and the fibre. Just use the 3.5 mm audio plug (round, fits any way up) with a hollow point, like Mini-TOSLINK did.
It turns out to be cheaper and easier to aggregate multiple serial links than to run a wide parallel bus. Practically every modern high-speed interconnect uses multiple aggregated serial links -- USB, Ethernet, SATA, PCIe, DDR3+, DVI+HDMI+DP+LVDS, etc, etc.
> why can’t we just switch to optical cables already
Too expensive. A big part of USB's dominance in the mid-'90s was that it was way cheaper than the alternatives of the time.
Thunderbolt was originally designed for optical interconnects.
> Just use the 3.5 mm audio plug
It's physically too large for current and upcoming designs.
It's not even exactly mirrored. The way the 24 pins break down is this:
You need 2 power and 2 ground pins per side to carry the large currents needed. Those all get wired to the same (thick) cable (though usually split in two to make the cable more flexible). That's 8/24 pins, 2 wires total.
You have two pins (per side) to signal which mode the cable is being used for and which way around it's plugged in. Because there are so many alternate modes with each having its own signaling standard, you can't pack those in-band, particularly for the case of analog audio, which doesn't have a digital data frame at all. Two are connected across from one connector to the other, so there's only three new wires for the 4 pins. So that's 12/24 pins, 5 wires total.
You have two pins per side for USB 2.0. Those get wired to one side only, but are connected together in the device (not the cable). The opposite side pins are unconnected. So two connected and two unconnected new pins per side, and two new wires. 16/24 pins, 7 wires total. Some cables have active transceivers in them, and to ensure that the far-side transceiver is powered, there's an extra wire carrying its supply. This brings us to 8 wires.
The remaining 8 pins are mapped to four high-speed differential pairs. The important thing about these is that they ONLY support high-speed low-voltage differential data and are protocol-agnostic. To get the enormous bandwidth the standard needs, they have to be split into four separate serial lanes. So it's not a single serial bus: it's five serial buses plus a handful of power and configuration pins. This brings us up to the full 24 pins (22 of which are in use) and 16 wires (or 18 if you opt for the thinner, doubled-up power and ground wires).
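To tally that up (a quick sketch; the labels are my shorthand, not official spec names):

    pins = {
        "VBUS (power)":               4,  # 2 per side
        "GND":                        4,  # 2 per side
        "CC (config/orientation)":    2,
        "SBU (sideband/alt-mode)":    2,
        "USB 2.0 D+/D-":              4,  # only one pair wired through
        "high-speed TX/RX pairs":     8,  # four differential lanes
    }
    assert sum(pins.values()) == 24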
As for optical cables, that would work, but then you need to transfer all the control signals in-band and either stick to a single signaling standard (and have to do high-speed conversion for everything else) or somehow enclose various protocols in a single frame (like ethernet). This would require a fundamental redesign of peripherals, rather than a small change (a mux and control IC and new port). Additionally, optical transceivers are very expensive and power-hungry at high speeds, to the point where even for extremely high link speeds copper is still being used for ethernet. TOSLINK succeeded because at the very low data rates it has, the optical link hardware is cheap and doesn't use much power.
New MacBook Pro user here: the four USB-C (+ stereo mini audio jack) port configuration doesn't bother me as much as I thought it would. It's nice that I can plug the power cable in on either side of the laptop. One almost inevitably needs some kind of display dongle unless you have both HDMI and VGA on board (and VGA is an ugly port, let's be honest), although I'll say the lack of native HDMI is a pain point for me. But also, if I'm using the laptop without power, I now have 4 ports available, which is more than enough.
Another new MacBook Pro user here: The two ports the 13" model came with are not enough. And the official Apple USB C to HDMI/USB adapter charges slower through the passthrough than a direct connection, meaning if I'm low on battery I need to dedicate a USB port to charging.
My monitor at home (XR382CQK) supports USB C and that's ok, it charges, does video, and acts as a hub via one cable. The problem is the world hasn't had time to embrace USB C. I'd gladly trade my old setup of needing two cables (one Magsafe + one Thunderbolt) for not needing a dongle literally everywhere else I use my laptop with an external display.
With all the hubs, I still can't get my 2017 MacBook Pro 15-inch to reliably run an external monitor (it lags every now and then). So far I haven't figured out which component is actually causing it. Crazy how complex it is.
I don't see how lag could be caused by a cable or hub. It's not like a remote desktop or VNC connection, where what flows over the cable is commands to change a stored image; there is no stored image -- every pixel is re-sent for every frame. Any lag must therefore be internal to the laptop. If you have an image on the monitor, and it isn't scrambled and the colors are right, the cable and hub are working.
I was just last week forced to use a USB-C-to-DisplayPort cable, instead of a straightforward DP to DP cable to get a 4K monitor to work. Using the latter, the computer thought everything was fine but the screen flickered like a CRT, horizontal splits and all, and went black every 3s.
There are so many layers involved I don’t even want to think about it, and am just happy it worked...
On a marginal connection, the source/sink of a DP link will re-train, which interrupts main-link operation and causes brief lags in the display. If the connection is even worse, you get the usual black-picture-for-a-few-seconds dance.
I had all sorts of odd display problems until I discovered plugging in the adapter and then the display cable into the adapter resolved them all. Seems weird, but has been a sure fix so far.
Sounds like the adapter has some circuitry with an initialization delay involved.
Reminds me of when HDMI first started shipping and people found they had to power up their new TV and games console/BR player in a certain sequence to get HDCP to play nice...
A lot of the problems seem to be implementation... Apple's quality and value have done nothing but nose-dive while they focus on "impossibly useful" things like the Touch Bar. How removing ports is a "feature" on a "professional" line of laptops still baffles me.
> But due to their potential bandwidth demands, computers can’t have very many USB-C ports
Isn't this more a function of Intel product differentiation and closing off their bus to others? I get the feeling the killer server ARM is going to be a bandwidth monster.
Mostly it's the author's ignorance of the standard. Just because you can connect a display to a USB-C port doesn't mean it's taking up USB bandwidth or requires the CPU to do more work than previously. It's just more ports integrated into one.
It seems to me you are the one who is confused. There are two cases: (1) the port is directly connected to the CPU, and (2) the port is connected to a 'hub' that is connected to the CPU.
He's clearly talking about (1), which is by far the most common scenario, certainly on Macs, and in that case it does require bandwidth and CPU cycles to run at full speed. If you also want that USB-C port to support Thunderbolt and all its features, you need even more PCIe lanes and even access to the GPU.
If there's a port, there has to be sufficient bandwidth available, regardless if anything is connected and what that might be. USB 3.1 Gen 2 supports 10Gbps, so 4 ports means 40Gbps. If it's a MacBook Pro, it has to support Thunderbolt 3 on all ports. That's 160Gbps for 4 ports, or 16 PCIe lanes. That happens to be exactly the number of PCIe lanes the Intel i7-7920HQ CPU has.
If you want TB3 support on each USB port, you run out of PCIe lanes on the CPU: just with basic support of 2 PCIe lanes per port and 4 ports, you're at 8 already. Similarly, if you run DisplayPort to each port, your GPU doesn't have an unlimited number of encoders.
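Back-of-envelope for the full-TB3 case, assuming four PCIe 3.0 lanes per port (how Intel's current controllers are wired, as far as I know):

    ports = 4
    lanes_per_port = 4                     # PCIe 3.0 lanes tunneled per TB3 port
    lanes_needed = ports * lanes_per_port
    print(lanes_needed)                    # 16 -- every CPU lane the i7-7920HQ has

    # Note the raw numbers don't line up exactly: 16 lanes * ~8 Gb/s is ~128 Gb/s
    # of PCIe, short of 4 * 40 Gb/s, because TB3's 40 Gb/s also carries
    # DisplayPort traffic that doesn't come out of the PCIe budget.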
I am the only one at work with the new MBP and I get frequent questions about it and the ports. I usually tell them that I firmly believe USB-C is the way of the future, but alas I live in the present.
I like USB-C and I buy into the dream a lot. Additionally, the fact that I can plug it in either way around to charge my phone is amazing!
The USB-C cable is obviously very capable; it's just a bit of a mess with interoperability. But given that USB-C requires licensing, I feel like they can probably fix a lot of these issues after the fact by changing what they do and don't allow (in a way, kind of like software patches for the licence itself).
For example, I assume they could start charging more to use the older USB A/B connectors, drop some supported USB-C features that are kind of messy, and charge more for devices that lazily implement only bits of the specification.
The old USB shape has been around for 20 years and counting; if the new USB-C shape is around as long, we have plenty of time to get these issues worked out. (This is already going so much better than IPv6 adoption!)
If they were going to engage in this, they should have made clear and authoritative capabilities markings on the equipment a required part of the standard.
And, as others mentioned, I also worry about torque and stress on the port, particularly in mobile scenarios.
It's so confusing that, to ensure compatibility, a consumer might be tempted to just buy everything from one manufacturer. That would be very upsetting for Apple, I imagine. Proprietariness by stealth?
Not impossible, just unfinished. I didn't read all the points, because it feels like a list of bullet points rather than a running text. But I haven't seen a single problem that can't be overcome by fine-tuning the USB-C system. E.g., some ports have Thunderbolt, some don't; well, it looks like in the future all of them are expected to have Thunderbolt or a similar standard.
All I'm taking from it is "it's not fully usable yet", and I'll continue using normal USB for a while longer.
After a month or so of owning a brand-new 15” MBP my regret is that I didn’t buy one of the holdover 2015s. The charger is absurdly large and I have a growing collection of dongles that need to accompany me depending on the task.
I’m currently using a Rube Goldberg two-dongle solution to get photos off of an SD card.
Moreover I still can’t find what I need for my home office in USB-C.
I won't even get into how much less functional the Touch Bar is than the hard keys it replaced. Terrible user experience.
Just went from a MacBook Pro to a MacBook (I fly 120k miles a year) and got the Apple 4K (LG) monitor. Disks are USB 2 from the monitor, as is Ethernet, and the damn thing doesn't wake from sleep with a USB keyboard. I used to have an LG 21:9 34" display with Thunderbolt drives and networking. The 4K monitor is a great picture, but 21" is crap. Apple, just put Thunderbolt in the MacBook, please. USB 3 can die. Happy with the physical interface; the rest is crap.
The USB guys should never have allowed so many standards to live within "one" standard. It may be better for the OEMs, who both save money on ports and get to mislead customers into thinking they're getting the fast version of USB-C when they really aren't. But so many standards acting as one is very confusing to consumers.
Perhaps a similar simplification can be applied to NEMA connectors https://en.wikipedia.org/wiki/NEMA_connector, to reduce the number of extension cords and add thrills to our daily lives.
I think it is clear the future is mostly wireless. Why would we need a wired connection at all?
The only reasons we need wires are power and high-speed connections.
Power: it is highly unlikely that in the next 5 years we'll have true wireless power transfer (i.e., not sitting on a power mat) with enough energy to power a laptop, monitor, or desktop.
High speed: that is anything from monitor cables to massive external SSD storage to external GPUs, and... I can't think of anything else (suggestions welcome). Basically, what Thunderbolt was designed for: external PCIe and DisplayPort.
Sometime this year or next, we are going to finalize what I hope will be the wireless standards for decades to come: 802.11ax and 802.11ay.
802.11ax is the successor to 802.11ac: basically, the industry took what it learned from LTE and LTE-A and moved the best of that tech across to WiFi. It works in both 2.4 GHz and 5 GHz. We can expect real-world, uncontended speeds of 1 Gb/s in 2x2 and 2 Gb/s in 4x4, and both are very conservative numbers.
802.11ay is the successor to the failed 60 GHz 802.11ad, with real-world speeds of 10 Gb/s+.
Bluetooth 5 is an improved version of 4.1, and 4.1 is arguably the first Bluetooth that's really acceptable for general usage; if 5 improves on it, all the better.
Assuming no patent nonsense, we can expect to have WiFi chips in the things we now connect via cables, so you'll only need cables for power and high-speed connections. And coincidentally, Thunderbolt is going to be an open standard next year.
What we need to fix is power delivery and USB 3.1 branding; maybe a future revision of USB will do that. For now, sitting in the transition period between wired and wireless will always be a little painful in the short term.
Just have a rough industry agreement on which features should be represented and what their symbols should be, then put them on connectors (which are pretty long; there is plenty of usable space) and on packaging and listings.
Great idea. However, I once had to write software that took all the possible washing symbols as input: there were 30 or 40 of them. No human could possibly know them all.
Yeah, I know; I even downloaded an iOS app to read them ("Laundry Day"). But in practice I've noticed most people just learn the one or two they care about and ignore the rest, because if you don't own a tumble dryer, you don't really care about the symbol for that. So if you care about Thunderbolt 2, you would look only for that icon and ignore the rest; power might get a scaled icon like washing temperature, and so on.
As usual, though, the problem is that the industry has to come together to agree on these symbols, be it formally through the USB consortium or informally in some other venue.
This is the sort of well-written, detailed reporting that I believe to be true, but which ought to have citations for each claim. Of course, cataloging and adding those sources would mean the article takes a lot longer to write and publish.
It's not perfect (it may even be terrible) but I can charge my computer, my phone, and my Nintendo Switch with it which is a huge improvement over the last generation of cable mayhem.
To solve the problems the author is facing, we would need to invent yet another port and cable standard, which is far worse than keeping the USB-C-like port and only buying a new cable.
Sorry, but what exactly is wrong with USB A and micro USB? I've never understood why USB type-C was a thing. You can get 3.1 speeds on the old connector.
Can we? In a "consumer" role, I've never plugged an Ethernet cable into my computer only to find out I got the wrong type of thing for this particular doodad, then lamented about it in the comments of a shopping site.
Hubs and switches are not an issue, nor are the various Ethernet termination standards. Same with speed: users buy an Ethernet cable, and 99% of the time it'll be suitable for GigE (and thus suitable for 100 Mb/s down to 10 Mb/s).
Power over Ethernet: I haven't studied this one, so I'll assume it has some subtleties involved.
Hubs vs. switches: hubs are awful (I've used one); burn them with prejudice. Collisions bring everything down to 1 Mb/s, and hubs are disallowed in gigabit anyway, since there's no CSMA/CD.
Category cables: USB has different cables too, and at least Ethernet didn't change the number of conductor wires. But the biggest difference is that Ethernet cables are often in walls or inaccessible places, whereas USB cables are user-replaceable.
100 Mb/s vs. 1 Gb/s: one problem I've noticed is that a link might start at 1 Gb/s, but after many weeks a momentary error (e.g., a power blip) makes the link drop to 100 Mb/s or even 10 Mb/s, and it doesn't automatically recover to the higher speed without manual replugging. IIRC, USB doesn't drop down to a lower speed in this manner.
Crossover cables: IIRC, if you are using only switches (no hubs), you will never need a crossover cable. Beyond that, pervasive auto MDI/MDI-X has silently solved the problem, though I wish the standard hadn't created it in the first place.
I do appreciate the fact that Ethernet treats nodes as peers. Whereas USB is based on a tree rooted at the master and the master must poll devices to ask them to transfer data. This tidy idea gets into trouble when, for example, a phone can be either a host (OTG) or a client. Also it is impossible to directly share USB devices between two hosts (e.g. a webcam accessible to two computers simultaneously).
Ethernet is definitely a standard with fewer issues than USB, but I'd say that is down to a narrower scope.
That has allowed us to converge to essentially a working standard of 1Gbps over Cat5e.
The more technically competent you are, the more you want to stay focused on your own domain specialization: sysadmin, cloud, development, devops, networking, PM, etc.
Why would any professional wish to waste time learning the complex compatibility matrix of USB-C consumer cabling?
Exactly. Why should anyone? You should just be able to look at the cable and know what it does, which, as the article points out, is not the current situation. The person I replied to said googling it is the answer, but not everyone is that competent.