Total Nightmare: USB-C and Thunderbolt 3 (fosketts.net)
814 points by sfoskett on Oct 29, 2016 | 446 comments



We will survive this. We survived having RJ45 jacks (which I just learned are not really RJ45 jacks, but 8P8C connectors) in walls. Is it a phone, a token ring, or an Ethernet? Cat 3, Cat 5, Cat 6? Did the installer unwrap the pairs too far? Are there crossed pairs? Does it have two pairs or four (in days of old it was common to make a single Cat 5 cable serve two devices, since 10/100 Ethernet only used two pairs)? Does it have a DC voltage supplied between some of the pairs? What voltage, what current capacity, which polarity, which pairs? What speed is the Ethernet switch port? Maybe it is a VGA extender or an HDMI extender. Maybe it is a serial console. I had an office where one RJ45 went to an FM antenna above the steel roof; you really are not supposed to do that. (I have installed or used all of these conditions except for token ring.)

USB has already had this problem for 16 years. When they went from USB 1.x at 12 Mbps to USB 2.0 at 480 Mbps they had to change the shielding. The only visible change on the connectors was a tiny + sign in the three-branched USB tree molded on the end of the cable, which was apparently so useless to users that no one bothered to put them on. At least, my quick rummage of cables didn't find any. There is alleged to be a color code: the plastic inside the connector is white for 1.x, black or white for 2.0, blue for 3.0, and yellow or red for sleep/charge. My quick rummage of cables suggests this is not necessarily known to cable manufacturers. I think the ports on devices are more rigorous about this; at least, I think all my USB 3 ports are blue.


We won't solve the device incompatibility, but it seems we (as individual users) could solve the cable problem by always paying more to buy USB-C cables that were made to handle ALL of the allowed protocols (incl. power) up to the limit of each protocol. In other words, universally usable USB-C cables, where any failure is an incompatibility between devices, not the fault of the cable.

If I'm going to need a lot of cables, it will probably be worth it to me to invest in making them ALL the universal kind, so when trying to set up any system, I can just use any of MY cables for anything.

Of course, this will then fail if they introduce another protocol that uses the same connector but won't work through my "universal" cables, but unless they do, this could make life easier (at the cost of more expensive cables).


> If I'm going to need a lot of cables, it will probably be worth it to me to invest in making them ALL the universal kind

The problems will not only be the cables and connectors, but for a large part also the devices people will be trying to attach. When it doesn't work, is it a bad cable? Does my laptop even support the device I'm trying to attach? If it's a display, do I need DisplayPort 1.3 or HDMI 2.0? What would be optimal, and what does my laptop support? That's the biggest part of the mess.

Cables that don't support everything can easily be written off as "this is a bad cable" and have the user replace it. That the $1000 storage box you just bought can't be connected to your two-year-old laptop, even though they both have the same connector, is a harder sell.


"but unless they do"


> We survived having RJ45 jacks

We survived the 90s in general. Every damn cable was basically designed for one task.

RJ45 wasn't the worst one in my experience. The worst cable was the serial cable. Null modem, male-to-female adapters, DB9 vs DB25, and every combination thereof.

Some of the serial cables would work for mice, others for modems, others for printers, some for gamepads. We lived, we learned how to swap cables and label them.

---------

And once you got the right cable, you then had to configure the right settings. 9600 8N1 is common, but maybe there were 2 stop bits, or maybe the baud rate was 4800 or some other value.

Uggghhhhhhh.
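
For anyone who never lived through it, that guessing game translates directly into the parameters you still pass to a serial library today. A minimal sketch using pyserial (the port name is just an example):

    import serial  # pip install pyserial

    # The classic guessing game: try 9600 8N1 first, then the other
    # usual suspects until the device stops printing garbage.
    port = serial.Serial(
        "/dev/ttyUSB0",                # example port name, not universal
        baudrate=9600,                 # or 4800, 19200, 115200, ...
        bytesize=serial.EIGHTBITS,     # the "8"
        parity=serial.PARITY_NONE,     # the "N"
        stopbits=serial.STOPBITS_ONE,  # the "1" (or STOPBITS_TWO)
        timeout=1,
    )
    print(port.read(64))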


I know someone who got a lot of smoke out of mixing up his VGA and gameport ports.


I've seen countless people over the years plug their phone cable into the Ethernet port.


This precisely - I smoked a mouse plugging it into the video port circa summer 1988.


I believe Apple never used the blue convention on their MacBook Pros. I think they even mocked the competitors for uglifying their devices this way. Hell, I’m writing this on an rMBP with USB 3 and it uses a white plastic bit.


I think it's fine in that case, since Macbooks never mix USB port types. As far as I know, all the USB ports on any given Macbook have identical capabilities, so there's no need to distinguish them visually. You just need to be informed when you buy the device what port type you're getting.


This is not true now, see e.g. https://news.ycombinator.com/item?id=12824447

Not sure if it was true before.


That difference in Thunderbolt bandwidth is annoying, but there's no standard color or symbol to indicate it, so marking the port would at most vaguely signify that some difference exists, which is less useful.

Apple has never shipped a laptop with both USB 2.0 and USB 3.0 ports, which is what the blue convention is for.


The two USB 2.0 ports on my 2010 MacBook Pro also run at different speeds.


I mainly meant that all the slots spoke the same protocols. So anything that works in one slot will work in every slot. However, I wasn't aware that the speeds were different, so thanks for that information.


Heh, they certainly kept to the standard USB length even if by default the keyboard is going to have difficulty reaching the computer....


> Is it a phone, a token ring, or an Ethernet?

I remember the first one or two times I went to IBM's then meeting facility in Palisades, there were warning signs in the rooms to NOT plug your laptop's Ethernet into the token ring jacks. Eventually they rewired the place and put in Ethernet. :-)


"We will survive this"

But why do we have to play this game again?

Why haven't we learnt from previous mistakes?


Because reality is not straightforward, and breaking the many catch-22s requires making decisions that not everybody is happy about.


A few years ago, when I was a grad student at Indiana University, I was told that one of the dorms required students to use a specially wired Ethernet cable as recently as 10 years ago, because the building had been wired before the Ethernet standard was finalized and they chose the wrong wiring order.


couldn't they 'undo' the mistake between the wall and the switches in the basement?


A common mistake back in the early days of Ethernet was to wire them as four sequential pairs, instead of the required (1-2) (3-6) (4-5) (7-8). This worked most of the time on 10BaseT networks, but would get flakey at 100Mbit.

If they got the pairs wrong at the wall jacks, there's not much you can do without repatching the whole thing. If they realized that they could make short cables that fixed it so that the pairs were mapped properly to port pins, that's quite clever in my books!
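
To make the mistake concrete, here's a toy check of why sequential pairs mostly worked at 10 Mbit and fell apart at 100 Mbit (pin numbers per T568; a sketch, nothing more):

    # T568A/B put the cable's twisted pairs on pins (1,2) (3,6) (4,5) (7,8).
    # The classic mistake: punching down four *sequential* pairs instead.
    CORRECT = [{1, 2}, {3, 6}, {4, 5}, {7, 8}]
    NAIVE   = [{1, 2}, {3, 4}, {5, 6}, {7, 8}]

    # 10/100BaseT transmits on pins 1+2 and receives on pins 3+6; each
    # signal pair must ride one twisted pair to get noise cancellation.
    for signal in ({1, 2}, {3, 6}):
        print(sorted(signal), "on a twisted pair?", signal in NAIVE)
    # [1, 2] on a twisted pair? True   (TX happens to line up)
    # [3, 6] on a twisted pair? False  (RX is split across two pairs:
    #                                   marginal at 10Mbit, flaky at 100Mbit)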


At one of my local universities (also in the midwest US) ethernet jacks in the dorms were intentionally miswired so that only cables supplied by the university (and more specifically, their favored vendor) would work.

Of course you could always make your own cable, but this was in the 90's, when the tool for that was very expensive.


I would think so but I don't really know the ins and outs of wiring. I think about 5 or so years ago they remodeled the dorm and made it so you didn't need the custom cable.

They had a classroom in that dorm I taught in and remember near the end of the semester seeing signs telling people to remember to return their cable.


I agree on the "we will survive" point. When buying something, of course buyers must be careful and get informed.

Having only one plug shape simplifies a lot, and it doesn't change how difficult it is to choose a product. At all.


My concern is that within two years they will realize that the C plug doesn't support some new 2019 protocol and they will create USB Type D.


That's a legitimate concern but I have to say USB Type C is about as future proof as you can be at this point in time. So 2019, at least, is probably safe!

There were a lot of smart people working on that standard. In fact, amidst the uproar over the introduction of Apple's symmetrical Lightning cable and USB's opportunistic announcement that "Hey we're working on one too, honest!" people seemed to overlook that Apple had people working on the USB Type C committee as well.


But it's still too thick for charging my wireless earbuds so let's include type D mini... Or was that micro? So much for universal. USB has lost its way.


And we'll be better off in the long run.


we survived wired home networks because they are expensive, and a superior IO solution (wifi) caught up quickly.

Re usb-c: good luck not using the (only) io port on your device.


Superior if you're a laptop user sitting on the back deck. I run backups/Blu-ray movies/etc between my NAS and assorted devices. I will assure you that my desktop PCs, which are connected via 10GBASE-T, have ~30x faster access to said NAS than the small-business-class AC WiFi APs I use. The latency, throughput and most importantly the reliability of a decent wired network are still far superior to WiFi, and I have a far cleaner/stronger WiFi system than the vast majority of WiFi networks I've seen. Even so, I still run wires to my home theater devices because I don't have to worry about dropouts in the middle of streaming a Blu-ray, or more importantly the random driver/etc bugs that seem far more frequent on WiFi than boring old Ethernet. Plus, with the advent of reasonable-speed internet access (thanks Google!) the WiFi is frequently the bottleneck. One room away and that AC 1200 router is only actually delivering 200Mbits, or for that matter the 20 closest neighbors' APs/wireless phones/airport radar/etc are messing up the spectrum.


I think people are forgetting that most people (in the usa or globally) don't have the disposable income to (re)wire a house with ethernet. From a "humanity" perspective I stand by my point.

Now for the average HN reader, sure I'm wrong. Enjoy your first world problems :)


It cost me $250 ($100 of which was 300m of Cat6) and a very tedious, insulation filled, Sunday to wire my house up with 1Gbit ports in every room.

Comparing that to the price of the current batch of AC WiFi access points (that look like stealth bombers) I don't think it's too unreasonable.


You are not counting the cost of your expertise in networking and DIY. Most people don't get that for free.


Yeah these days even doing Cat 6 is pretty cheap. We ran Cat 6 throughout the apartment, and the biggest nuisance of it was figuring out the best way to do conduit (solid brick walls mean we can't just go in-wall).


Depends heavily on the house and where you are putting computers. A surprisingly large number of people have their computers right next to their cable modems.


In what universe is wifi a substitute for a proper wired network, for serious bandwidth and reliability? I've got so many other wifi routers polluting the radio waves that I can't stream Netflix on my TV that is 15 feet away from the router.

It's so bad that I bought a 50' ethernet cable and tacked it up along the ceiling from the router to a switch by my TV to create a redneck ethernet network. Works beautifully - well, not beautifully in the aesthetic sense, but it gets the job done well.


I had to do this too. There are at least 30 networks in range of my apartment at all times, so 2.4 GHz WiFi is useless in the evenings. I'm glad there are few 5 GHz networks here so my newer devices are still usable.


So why run your own Wifi hotspot? It sounds like your neighbours have already put the infrastructure in place for you!


5 GHz has horrible range and wall penetration (by design), so it can't really get as cluttered as 2.4 GHz, which has decent out-of-line-of-sight performance.


What are you talking about? WiFi is not superior. It's easier to wire all the desktops in your home rather than hoping your WiFi reaches. Ethernet is not very expensive. USB-C is good because it can use many protocols and pretty much the only reasons it is bad are: people that don't know what they're buying and people making cheap, out of standard knockoffs that will damage devices. This is mainly a manufacturer problem and will probably be fixed by the time the majority of people have USB-C ports.


This depends entirely on the size of your house. We're easily covered by a single wifi router. We also don't have easy to rewire walls... (Yay 1910 era construction. :) )


If your 110V wiring is vintage 1910, you should consider rewiring anyway. I was involved in renovating a 1923 building, and the old wiring was neat, but terrifying. Pulled it all out, didn't even want to think about anyone trying to run power through it.


Yeah, we've already upgraded the electric. Definitely could have rerun ethernet then, but did not want to deal with moving any outlets. (Or, really, thinking about the problem hard. We have barely 1000 sq ft, so not hard to cover with wifi.)


You don't have to move outlets, just combine them.


If I want outlets where my computers will be, I would have to move outlets.


Depends on your 1910 construction. My house is 1920s wooden frame. Makes it easy if you're brave enough to poke holes in the plasterboard.


Fair. My point is a lot more valid if you are trying to put in something that needs a lot of space, like HVAC. Wires are probably not that difficult to get in.


My Grace Digital music player is nearly unusable with wifi (see the Amazon reviews on it). But with wired, it works like a champ. Wired is simpler, faster, more secure, and more likely to work.


More secure? Everyone on the network can see everyone else's traffic, whereas WiFi only communicates between the router and each computer.


I think you have that backwards


you can have my ethernet cable when you pry it from my cold dead hands!


Exactly. Some WiFi is just too unstable, and I've decided that it's fine for a bit of evening web browsing. But in the office, I want the stability of ethernet.


The spec [1] is a long read, but interesting. Misc. stuff that's not well known:

- The 24 pin connector does not have symmetrical connections. The interface IC senses which way it is plugged in.

- It's still a master/slave system, but either side can be the USB master or the power master. Those need not be the same.

- Who powers whom is an interesting issue. It's up to the OS to decide. There's special support for the dead-battery case - what happens when you plug something with a dead battery into something else? Can you charge your phone from your laptop? Laptop from phone? Tablet from laptop? Phone from phone? It's complicated.

- There's something called the "billboard device", which is the interface IC's mechanism for sending error messages when both ends are not in agreement about modes. The devices at each end are supposed to display this information. Hopefully they do. At least the designers thought about this.

- Hubs are more restrictive. They don't pass through much more than USB mode and power. They don't pass any of the more exotic modes, like HDMI, since those are not multipoint protocols. It is supposed to be possible to pass power upstream through a hub, though.

- Anything with a female USB-C connector has to talk USB-C. It is prohibited to have cables with a female USB-C on one end and some other USB connector on the other. Male USB-C to other USB is permitted, and will provide backwards compatibility.

- There are extensions defined for "proprietary charging methods" to allow higher current levels. (I wonder who wanted that?)

- There's a mode called "Debug Accessory Mode". This is totally different from normal operation and requires a special cable as a security measure. (In a regular cable, pins A5 and B5 are connected together and there's only one wire in the cable for them. Debug test devices use a cable where pins A5 and B5 have their own wires and there's a voltage difference between them.) Debug Accessory Mode, once entered, is vendor-specific. It may include JTAG low-level hardware access. Look for exploits based on this. If you find a public charger with an attached USB-C cable, worry. Always use your own cable.

[1] http://www.usb.org/developers/docs/
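
As a rough mental model of the first two bullets, here's how attach and orientation sensing works on the CC pins (a toy sketch; the real logic reads analog voltage thresholds set by Rp/Rd resistors, not booleans):

    # Toy model of USB-C attach detection. A source (host) puts
    # pull-up resistors (Rp) on both of its CC pins; a sink (device)
    # puts pull-downs (Rd). A standard cable connects only ONE CC
    # wire through, so the source sees exactly one CC pin pulled low,
    # which tells it both "something attached" and the plug's flip.
    def sense(cc1_low, cc2_low):
        if cc1_low and cc2_low:
            return "both CC terminated: powered cable or debug accessory"
        if cc1_low:
            return "sink attached, orientation A"
        if cc2_low:
            return "sink attached, orientation B"
        return "nothing attached"

    print(sense(True, False))  # sink attached, orientation A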


- Who powers whom is an interesting issue. <...> It's complicated.

Yeah... I have a battery bank from Anker with two plugs: USB-A and USB-C. If I connect my Nexus 6P with a USB-C/USB-C cable, everything is good, the phone gets power. But if the only cable I have at the moment is USB-A/USB-C, then the phone starts to charge the battery bank.


This is crazy. In all the battery packs I have seen so far, the convention seems to be that the micro USB port is for charging the battery pack itself and the type A port is for delivering power to the device to be charged. Looks like a new set of rules apply to the USB-C ports.


That has to be a spec violation of at least one part of the system. How can a USB-A port possibly ever receive power? Does it actually charge the battery or does the phone just think it's delivering power - perhaps because of a non-compliant cable?


That's a problem I've seen with cheap Chinese powered USB hubs. You plug them into the wall and your Raspberry Pi and all of a sudden your Pi turns on (even though its power is unplugged) but doesn't boot.


That may be a Raspberry Pi problem. The Pi's non-power USB ports are "On the go" ports, and potentially bi-directional. But they don't have the proper power circuitry and protection on the power side. The Pi's power USB port doesn't do power negotiation, so if connected to a power source that demands negotiation, it may only get the default pre-negotiation 100mA, which isn't enough.

(I've been designing a board that gets power from USB, and had to read up on all this. The USB power system is complex but well designed, and if all the devices comply with the standards, immune to user mis-plugging. The problem is that doing it right usually requires an extra IC at each USB port just to manage power.[1] A lot of low-end devices don't do all this right.)

[1] http://www.linear.com/product/LTC4085
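
For reference, the bus-power tiers behind that 100mA figure (numbers from the USB 2.0/3.0 and related specs, arranged here for illustration):

    # USB bus-power budgets, in mA, before and after enumeration.
    POWER_TIERS_mA = {
        "USB 2.0, unconfigured":   100,   # one unit load -- the Pi's trap
        "USB 2.0, configured":     500,   # five unit loads
        "USB 3.0, unconfigured":   150,
        "USB 3.0, configured":     900,
        "BC 1.2 charging port":   1500,
        "Type-C 3A advertisement": 3000,
    }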


When you plug the phone in, a notification appears in the notification shade allowing you to choose how you want the power to flow. You should be able to choose to charge the phone, rather than use the phone as a charger. I know this option appears when using a Type C to Type C cable, but there's a chance it won't work with a Type C to Type A cable.


I have the same one - annoyingly I find the USB-C cable, when plugged in, isn't flush with the body of the battery.


Which hub? That doesn't even make sense since the only port the Anker accepts power in on is the USB-C port.


I'm not adding much here, but I wanted to say that comments like this are why I read HN.


The "usb-c female to other usb is not permitted" surprised me so I started searching why might be the reason. I've found this g+ post [1] which sheds a bit of light. It's actually good to know as I would've assumed that's ok to have this kind of connector. Probably is, if you know what you're connecting together. Btw you can buy those cable in Amazon :) [2]

[1] https://plus.google.com/+BensonLeung/posts/UFCHbSDRa2o [2] https://www.amazon.com/USB-C-Female-Macbook-Tablet-Mobile/dp...


Personally, I am rather sad to see the Magsafe connector go, and having to sacrifice IO for charging while mobile seems like quite the headscratcher.

It's somewhat funny that not only will you have to carry dongles for everything for a few years time, but also make sure you carry the right USB-C cables, as your friend's might not work. Yay we have superlight laptops, but need to carry a backpack full of spaghettied cables.


>having to sacrifice IO for charging while mobile

You're not sacrificing anything. The old MBP had two usb ports and the new one has 3 available while charging. The only thing that has changed is now you can use the charging port for IO when not charging. Framing this as a "sacrifice" doesn't make any sense.

>make sure you carry the right USB-C cables, as your friend's might not work

With the old Magsafe chargers, and for laptop chargers in general, it's the same deal. If you want to charge your laptop, you have to bring your charger. The fact that the charger is USB-C doesn't make the situation worse.


It's interesting to note that the fancy LG 5k monitor announced alongside the new MBP also charges the MBP. One cable connection and you have a monitor, a docking station with ports, and power. All while leaving three extra ports free.

IMO, that's the real potential of this style of multi-use connector.


In addition to spending $2400 on a basic 15" MBP, we can buy the fancy monitor that has actual USB connections for $1300 more! Don't forget a few $$$$ in dongles for the times when you aren't connected to the 5K beauty. +/- $4,000 for a "touch of genius"... Makes me vomit a little in my mouth.

I am the "tech guy" for my circle of friends. Apple has lost all recommendations from me.


Yes, because we should all still be using SCSI on our laptops. With cables thicker than your thumb that had to be handled with utmost care or else they'd fail. And single-purpose connectors, where even the compact form was bigger than a lot of device chargers these days.

USB-A is certainly ubiquitous, which is a good thing. But it was never going to last forever. USB-C vastly improves on the physical design (reversible, symmetric, easier insertion, etc.) and I expect will carry us forward for even longer than USB-A has.

Thus Apple and everyone else are moving towards adopting USB-C, which means there will be ever more widespread and cheap docking and peripheral options. Yes, these transitions are painful but at least there are dongles this time, which hasn't always been the case. Such transitions are also a part of living during the historically amazing times of an accelerated tech curve. Enjoy that while it lasts!


> Such transitions are also a part of living during the historically amazing times of an accelerated tech curve.

The MBP is not bleeding edge on that curve, it is (kind of, sort of) near the top-ish. They made a lot of changes for the Future! without adding, you know, actual cutting edge components (from ram to the cpu/gpu chip(s)).

The whole thing breaks down further when you compound this new (old) hardware with the problems macOS will inevitably have. As each release, in my experience, has had significant issues for months with sometimes show stopping bugs. Can't wait to be served an ad on that shitty little screen...


Why would you ever recommend a high-end 15" laptop to average people in the first place? I'd agree with the other commenter that recommends a new tech guy for your friends.


Why? I don't know... maybe it has to do with not wanting to buy a shit laptop at bestbuy/costco and calling it a day. Anyone can do that, and a lot of my friends ask a very specific question: "Dude, I want a good laptop that will last a few years without breaking like the POS stuff I usually buy for $500 at Costco... what do you recommend?" I would say, head to our local Apple store. I am no longer going to say that.


Are they throwing their laptops down a flight of stairs on a regular basis? What exactly is breaking? My Dad is still rocking a 4 year old "POS $500 Costco laptop" and it runs as well as it did the day I bought it.

I added an SSD and upgraded the memory to 8GB and it's more than fast enough for everything the normal person would do outside of kids that are trying to game (in which case you shouldn't be buying a laptop anyway).


The MacBook Pro has USB connections. 4 of them.

Maybe your friends need a new tech guy.


Good point, and that's why, six months back, I chose to buy an iMac 5k instead of a laptop. I saved a couple of thousand dollars, got four USB ports instead of two, and a gorgeous 27-inch 5k monitor that I couldn't buy separately [1].

In my case, mobility wasn't a requirement, so a desktop was fine.

If mobility is a requirement for you, maybe you could buy an iPad Pro or a cheaper laptop than you otherwise would buy, like the single-port Macbook.

[1] It wasn't available in India, and even if it was, it wasn't compatible with most Macbooks. The single model it was compatible with, it required two cables to be plugged in. And the monitor would have been out-of-date quickly, since it didn't support Thunderbolt 3.


Humorously, the $1300 monitor only has USB-C ports so you'll still need dongles to attach all your USB-A devices.


Wait a year and it'll be half the price. Problem solved. :)


$1,300 is a pretty standard price for 5k in my experience.


That happened 5 years ago with the Cinema display


Thunderbolt Cinema Displays required two cables to the machine -- data and power. So, not quite, right?


But the old MBP had two thunderbolts and an HDMI connector, on top of the two USB. You could plug in 3 displays, charge it and still have room for a mouse and a keyboard. Now it's always a trade-off, so on top of DVI/HDMI/DP dongles, you'll have to carry a USB hub.


Where could you possibly go where (a) there are three monitors, a mouse, and keyboard waiting for you, and (b) none of those devices have a built-in USB hub, and you did not think to keep a USB hub there?

Either you're mobile and need the USB ports, or stationary and there's a hub there. What use case am I missing?


> Where could you possibly go where (a) there are three monitors, a mouse, and keyboard waiting for you, and (b) none of those devices have a built-in USB hub

That describes my workplace. (But with two rather than three monitors.) Standard monitors that operate over HDMI/DVI/etc don't have USB hubs without a separate USB connection, most keyboards are the same way.

> Either you're mobile and need the USB ports, or stationary and there's a hub there. What use case am I missing?

Sure, you can work around the lack of IO by buying a hub, but the point was that previously you would not have needed yet another dongle.


So buy the USB-C hub/dock/whatever and leave it at work. No need to carry it around with you. This isn't really any different than the case of a traditional laptop with a dock, except instead of some weird giant proprietary connector, you have a small standardized connector.


> USB-C hub/dock/whatever

There aren't that many good ones. What I would need would be one which handles at least 4-6 USB, two HDMI, and Ethernet; charging would be a bonus.


Can I tempt you over to the Microsoft side? The Surface Dock has exactly those connections - I leave everything connected to the dock as my "workstation", and carry my Surface Book travelling when need be.


If you're being tempted over to the Microsoft side, you might as well just buy a Dell or HP or Thinkpad or whatever, and get Mac-level specs at half the price, and usually a good collection of today's standard ports.


It's hard to match the Surface Book on specs, at least if you care about the size and weight and resolution and touchscreen. I went shopping 6 months ago with requirements of: thin and light (13" or smaller), available in the UK, NVidia graphics, 12gb or more of RAM, reasonable processor, and touchscreen; I wasn't expecting to walk out with Microsoft hardware but the Surface Book was the only laptop I saw that (more-or-less - it's 13.5") matched up to that.


I think the Surface Pro is still pretty competitive, though the Book and Studio are really out there on price. But that's the great thing about the Microsoft side... You have choices like buying the HP or the Dell.


Who wants to run Windows?


Apple's idea is that your monitor will be the hub. See the new LG Ultrafine Display. Just connect your Mac with one cable and you're ready to work.


Can we say that, at this time, Apple's idea here is a bit dumb/half-baked/incongruent/silly/frustrating and it will materially hurt them in the longer term?


It's the short term where the pain will be felt, if there is any. The long term is clear: USB-C throughout, which I think is a good thing. What I'm confused about is why the iPhone 7 was not also USB-C. I fully expect to see the next iPhone switch over.


While I 100% agree the next iPhone SHOULD switch over to USB-C, I'd be very surprised if it actually does... Apple isn't going to want to go through a second port transition in such a short period of time (30-pin to Lightning) and certainly won't want to change anything after the headphone port this year. Especially b/c next year is the 10th anniversary, and it's understood that they really want it to be something special.


I agree they won't change so soon, but I just realized there have been as many lightning models as 30 pin models.


Thing is, it's not "just" the connection. They are losing hearts and minds for many reasons... it is not easy to replace that.


None of what you said is remotely true. Betting entirely on USB-C as the one true connector is not half baked and is quite reasonable. And by using it solely it will force accessory manufacturers to build the docks, monitors etc.

This has all happened before with the original iMac. And it was universally seen in the long run as one of the best decisions they made.


On the contrary, it is exactly what I want. And also exactly what I do now with the 27" thunderbolt display. The same way my TV is now my universal HDMI switch.

Monitor-as-dock makes a great deal of sense for the roving user with a fixed base.


I wouldn't buy a $1300 monitor that worked only with a Thunderbolt 3 device. It must work with USB-C and DisplayPort, even if only at UHD resolution rather than 5K. If I buy a Windows laptop tomorrow, and it doesn't have Thunderbolt 3, I can't throw away my monitor and buy a new one. Or if I want to plug my new monitor into any existing Mac model, which doesn't have Thunderbolt 3.

I would be happy if an adapter exists for this job. The monitor needn't natively support DisplayPort.


That would be pretty cool actually, so only a single connection for everything? Like a docking station! Why isn't Apple making one?


Because Apple does not NEED to make every single necessary object in the universe. LG is making them, they are 5k, they have wide color gamut and 3x USB-C out, and they are in the Apple Store. That's good enough. This enables Apple to focus on what they are good at.


If we stretch this approach, Apple will become good at making the iPhone and nothing else.

The display was the soul of the desktop Mac. It's easier to perceive your Macbook as another PC when your window into it is another mostly generic-looking LG display, even if the panel in the Apple branded display was manufactured by LG.

Perhaps they want to focus on the iMac again and don't want the MBP + display combo to cannibalize it. But again, with extreme focus soon they'll be abandoning the iMac for good.

(I still love macOS and I enjoy how the new MBP looks)


They already did, it's called the Thunderbolt Display.


http://plugable.com/products/ud-ultcdl/

came up in a quick search. Looks good to me!


Not available in my country though.


Look at Thunderbolt 3 docks instead; there's a better selection of options... and I'm sure in the next few weeks there will be even more.


> That describes my workplace. (But with two rather than three monitors.) Standard monitors that operate over HDMI/DVI/etc don't have USB hubs without a separate USB connection, most keyboards are the same way.

But your run of the mill Dell office monitors do have a USB hub, most often 2 ports on the back panel for keyboard/mouse and 2 ports on the side for flash drives and other transient accessories.

colanderman's point is that you should get a USB cable and use that hub instead of individually plugging everything into the laptop. Three video connections and one hub connection gets you all the connectivity you need at a desk. If you're mobile, you'll certainly need less.


My run-of-the-mill Dell monitors at work don't have USB ports.


I have a Dell U2410, which, while not exactly run-of-the-mill (it is wide-gamut), isn't that fancy yet does have all the ports wlesieutre mentioned. (Granted, they require a separate USB connection, but it is an older monitor.)

At work I have an Apple monitor that has multiple USB ports and an Ethernet port on the back of it. I think it has speakers too. All that and the display run across the mini-DP connection. Again, not exactly run-of-the-mill, but well within the price range of Apple's target market, and given Apple's push in that direction, it's liable to be commonplace in a couple years anyway.


I'm a VJ. You absolutely need that many ports when doing visuals at a concert or venue. That was the whole reason I bought a MBP. Now it looks like PC might be a better choice; Apple has abandoned the professional "workhorse" market and is leaning towards the Facebook/email machine market.


You can now get four DisplayPort outputs from two ports: https://www.amazon.com/gp/product/B01F81EIBC/ref=ox_sc_act_t...

And they can all be 4k.


First of all, that's Windows only. Says so right there in the listing.

Second of all, it's yet another dongle I have to carry around and potentially lose. There's often not a whole lot of room on stage, so space is precious.

Thirdly, those dongles that convert video are usually slow garbage. If I am performing as a VJ, I need the video to be frame-synced and perfectly synced to the music without lag. I can do that just fine on my current MBP (run 3 monitors at once from the ports built-in to the computer. That's specifically why I bought it.) And now I can do that just fine from some Asus laptop since Apple doesn't seem to want my business anymore.


Product doesn't indicate support for MBP.


My office? I would suspect many, many offices of people who post here?


Yup. My MBP at work: one thunderbolt port for an external monitor, the other for ethernet. One USB port for my nice keyboard, the other free. And the magsafe for charging. So I'm currently using 4 of 5 connections, and the new MBP will only have 3?


You'll be able to use a single thunderbolt 3 port to get power, monitor, ethernet, and USB accessories, sit down, plug one cable in and you're off to the races.


So now you can plug a single cable in and get power, display, and accessories. Where's the problem?


I'm guessing people don't upgrade their TVs, monitors, and projectors as often as they upgrade their laptops. I can't even find a USB-C adapter that would work with my current display (which was made by Apple)


Here you go: Kanex Thunderbolt 3 to Thunderbolt Adapter https://www.amazon.com/dp/B01EJ4XL08/ref=cm_sw_r_cp_api_B5tf...


> (a) there are three monitors, a mouse, and keyboard waiting for you, and (b) none of those devices have a built-in USB hub.

I don't know of a monitor that doesn't require a USB A-B cable to provide USB, so you're looking at three $70 adapters, plus three USB A-B cables. Sure maybe all that lives at the office. It's still less convenient.

I still can't get over the fact that you can't plug your new iPhone 7 into your new Macbook Pro without a dongle.


That's kinda the point, the new USB-C monitors have built-in USB hubs and charging. So you only need one cable to plug in all of your monitors and USB devices. It's going to take a little while for everyone to get there, but for now there are ~$80 hubs that will break out all of these ports.

> I still can't get over the fact that you can't plug your new iPhone 7 into your new Macbook Pro without a dongle.

You've never been able to though? There is a USB-C to Lightning cable.


I think he means, it doesn't even come in the box. Buy a MacBook Pro, buy an iPhone, and you can't even use one to charge the other without buying a separate cable or an adapter.


> You've never been able to though? There is a USB-C to Lightning cable.

I think it was more a complaint about the fact that you'd have to buy an additional adapter to be able to connect the two.


So your argument is now not "you don't need to buy all manner of dongles" and instead "You just need to buy one of these new USB-C monitors"?


No, I just think the one cable solution with a USB-C monitor is really neat. I won't be getting one for personal use because they're too expensive right now.

I'll be picking up one of the $60-100 hubs/docks that have a single USB-C connector and HDMI/Ethernet/USB-A to connect all of my devices. When the monitors are cheaper in 4 years I'll buy one of those.

My point was that we're actually moving to a really good solution where you can plug in a single, non-proprietary cable and get all the throughput and peripheral connectors you could ever need. It's not even particularly expensive compared to a dock 5-10 years ago.


My Apple monitor at work provides USB and Ethernet over the same cable (mini-DP) it uses for video. Every day I go in I just connect the mini-DP and MagSafe (both of which come from the monitor) and magically all my peripherals are connected.

It seems like in the new USB-C world, I'd only have to connect one cable. Bonus points if it's sturdier than mini-DP and less flaky than MagSafe. Sure beats those weird-ass proprietary docks that used to be common ~10 years ago.


> It seems like in the new USB-C world, I'd only have to connect one cable.

That's something I've never seen properly explained. Can you charge at 100W at the same time as using Thunderbolt 3 connectivity? It strikes me as unlikely since the power delivery and the signaling would have to share the same pins/wires.

Edit: I stand corrected after seeing that new LG monitor. Now I'm just curious how it works. Even PoE can't deliver 100W, and Thunderbolt 3 uses much higher speeds while having to inject power.


There's a couple reasons for PoE not supporting over 36W but USB-C supporting PD at up to 100W.

1) PoE only uses two pairs of wires for transmitting current, USB-C PD uses four pairs.

2) Ethernet cables are rated for much longer lengths, show me a 100m USB cable.

3) Ethernet is typically run in walls and more often than not is grouped into bundles, so heat dissipation is an issue.

4) PoE delivers power at ~50V, USB-C PD does 20V.

Also, piggybacking on the other commenter's reply, PoE voltage is carried on the same wires that are transmitting data (for 1000Base-T anyway)
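
Putting numbers on points 1 and 4 with plain P = V x I arithmetic (round figures for illustration):

    # Power = volts * amps, ignoring cable losses.
    usb_pd = 20 * 5        # USB-C PD: 20V at 5A on the VBUS pins = 100W
    poe_plus = 50 * 0.6    # 802.3at PoE+: ~50V at 600mA over 2 pairs = ~30W
    print(usb_pd, poe_plus)
    # USB-C gets its watts from high current over a short, fat cable;
    # PoE gets its watts from high voltage, which is what lets it
    # survive 100m runs at modest current.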


There's no physical reason that power and signal can't share lines. Any particular device or protocol might fail to implement that, but PLC and even POTS show that it's possible and not even new.


The principle of power and signal separation has been used to send multiple signals over a wire since the duplex telegraph in 1872. It's one of the few things in this world older than the headphone jack standard that we've been using since 1878.


You may be surprised when you stick a multimeter into a standard POTS jack and read 48V. (Admittedly, the voltage is lower when the line is in use, but it's still there, which is why old-fashioned phones don't need separate power cords.)


The USB-C connector dedicates 4 pairs of pins (8 total) to power transmission (VBUS and GND). It only does that to give enough contact area and enough conductor circumference in the cable to support 100w transmission.

It also has two pairs of unshielded twisted pair for non-superspeed (USB 2.0) though only one pair is used in the cable apparently; which USB 2 channel you get depends on which way the cable is plugged in I guess.

It has a pair for sideband use. A pair for configuration and control.

The data primarily travels over the four high-speed differential signal pairs; two TX pairs and two RX pairs.

All full-featured USB-C cables must have a chip in them. It communicates over the C&C channel to select how the high-speed pairs are used. When a display is connected, one TX/RX set is used for USB 3 and the other for DisplayPort.

USB-C also supports "alternate" modes, so the C&C can negotiate that all high-speed pairs are used for Thunderbolt, which is just the PCI-Express bus. Intel added the ability to interleave DisplayPort packets with the PCI-Express data. I assume that displays with USB 3 built in are simply connecting a PCI-E USB hub to the Thunderbolt interface, since in that mode there is no actual USB 3 signal in the cable. USB 2 can use the dedicated signal pair for that and not interfere with DisplayPort or Thunderbolt traffic.

Thunderbolt is a nicer standard in some ways - interleaving means if you aren't using a 16bpp 5K display at max refresh rate you have more bandwidth available for other devices. With USB 3 you get half the available bandwidth if you connect a display. Thunderbolt is also old-school in the sense that it projects the CPU's bus to external peripherals... something all early computers used to do. Everything old is new again!

It is obvious USB-C was very forward-looking. The C&C channel means some future version of the standard can drop USB 2 support and start connecting the unconnected pair if both ends negotiate for it. Now you have an extra TX/RX channel to give you +50% bandwidth. You can also imagine taking over the side-channel pins. If you assume the next-gen standard can double USB 3's 10Gbps and add in double the data pairs, that would be 40Gbps, which matches Thunderbolt 3. Apply the same math to Thunderbolt and that's 160Gbps.
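
For the curious, the A-side pin map being described looks roughly like this (from the Type-C connector spec; the B side is the same list rotated 180 degrees, and B5 doubles as VCONN to power the cable's chip):

    # USB-C receptacle pinout, A side only (B1-B12 mirror it):
    PINS_A = {
        "A1": "GND",   "A2": "SSTX1+",  "A3": "SSTX1-",  "A4": "VBUS",
        "A5": "CC1",   "A6": "D+",      "A7": "D-",      "A8": "SBU1",
        "A9": "VBUS",  "A10": "SSRX2-", "A11": "SSRX2+", "A12": "GND",
    }
    # Four VBUS + four GND pins across both sides carry the 100W budget;
    # the two SSTX/SSRX sets are the pairs alternate modes repurpose.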


Work and home, traveling between them.


"and you did not think to keep a USB hub there?"


The new 5K monitor[1] not only connects with a USB-C but it also provides charging power (85W) to the Macbook at the same time.

Plus, it has an additional 3 USB-C ports, meaning you plug one cable in to your Macbook Pro, and you are left with a total of 6 USB-C slots free, which you don't need because power is provided over the same cable anyway...

[1] http://www.apple.com/shop/product/HKN62LL/A/lg-ultrafine-5k-...


I wish it had an Ethernet port and a couple of legacy USB ports on the back to avoid the mess and cost of dongles.

The old Apple Thunderbolt Display was a nice everything-dock.


Here in Europe we have the Philips Brilliance 258B6QUEB http://www.philips.co.uk/c-p/258B6QUEB_00/brilliance-lcd-mon...

It has exactly what you want: hook it up with USB-C, and on the back it has ethernet, three USB-A, audio in and audio out. Fantastic thing. Resolution is 2560x1440. http://images.philips.com/is/image/PhilipsConsumer/258B6QUEB...


There are a ton of good T-Bolt 3 dock options out there, none provide power at the moment, but I'm sure someone will have one on the market shortly.


Yeah, but your thunderbolt or USB-c monitor can now provide power, wired Ethernet, etc all over one of the 4 ports.


Or a Bluetooth mouse/keyboard (which may need their own chargers; hopefully will be chargable via usb-c directly in the future as things transition).


You're right that if you've maxed out the old MBP's ports, the new one supports fewer connections at once.

But I still think it's a net win because:

- We need to buy and manage fewer cables.

- We'll no longer have the problem of having a free port, but not the right kind.

- When we buy a device, we don't have to decide which ports it should plug into. Like, when I buy a monitor, I don't have to decide whether it should have DisplayPort, HDMI, both or either.


And having to use a special Apple dongle to connect a normal mouse or flash disk is downright embarrassing.


Nothing special about it, it's standardized. Actually Apple just released a notebook that has NO proprietary ports anymore. As a Surface user I find this astonishing and great.

On top of this I find it weird that people say they have to carry 20 Apple-branded dongles. Depending on your use case there are already USB-C hubs with HDMI ports, or Ethernet ports, or both.[1]

The thing is, I love the USB-C TB3 ports and I love Apple's decision to drag the industry kicking and screaming into a reality where an office can have a monitor that is a Thunderbolt 3 hub, connected to Ethernet and peripherals, that works with both Windows and Mac. This will revolutionize flex-desk setups for big organizations and co-working spaces everywhere. Imo it's worth having this transition period where people drag a dongle with them to achieve this.

[1] https://www.amazon.com/Resolution-Aluminum-MacBook-ChromeBoo...


And then replace laptop with your phone. Continuum.


MacBook users tend to overlap with iPhone users, so there's that. No hub I know of also has Thunderbolt. It will happen given enough time and the current demand, but if this was the direction engineering went with the laptop, a revolutionary iPhone would have had two USB-C ports.


> no hub I know has also a thunderbolt

Belkin, Elgato, OWC and others sell them. I use the Belkin one and it works great.


Correct, Apple's decision to insist on lightning for the iPhone is a bad one considering where the wind is blowing.


Why would it be a special Apple dongle? Any USB-C to USB-A adapter would work, until you get a USB-C mouse.


USB C mouse? Let me know when you can pick one of those up at Staples or Best Buy


Why the snark? It'll probably take a year before you can get a cheap throw-away mouse with USB-C at Staples or Best Buy.

But assuming you're a professional, you either already have a bluetooth mouse/trackpad or you'll want some quality and order a mouse from a reputable brand.


Does Logitech make a USB-C receiver for their wireless mouse yet?


Heh, that's an interesting point you make. Assuming you can't or don't want to use Bluetooth, I was surprised to see that in The Netherlands, there's only 1 brand that carries a USB-C mouse, and it's a budget brand as well: http://www.trust.com/en/product/20969-usb-c-retractable-mini...

So I have to admit it's indeed very, very slim pickings.


How about a $10 USB-C mouse from Amazon?


My flash drives will be a minor pain in the rear, but the disks I use more often (4TB+ externals) just take a C-to-B cable and I'm done.

I don't think this is going to be as big a deal as people say. Honestly, the biggest beef I have is removing the SD card slot, because I do a lot of photo/video work.


>> I don't think this is going to be as big a deal as people say.

I think it depends on the type of user you are. If you're a true "mobile user", it's probably not a big deal, because you put a value on mobility over other stuff.

If you treat your laptop as a "portable desktop", it can get annoying. The ability to not have dongles for everything is one less thing to think about. Yes, it's not the end of the world to have dongles, but if you're used to not needing dongles, being forced to use them is going to make you unhappy.


I'm not really a "mobile user". But I'll be happy to be able to run my entire desk off of one plug rather than the five (power, two monitors, two USB, one of which is a hub) that are currently plugged into my 15" MBP.


He said "sacrifice IO". The old MBP (which I am typing on) has two USB and two Thunderbolt ports. So a total of 4 IO while charging with the old MBP.


But the charging cable can still be used for IO, as demonstrated by that new LG display


Great catch! Seems obvious since Thunderbolt can be daisy-chained.


This one has 4 USB/Thunderbolt/charging ports. They are all identical.


> You're not sacrificing anything. The old MBP had two usb ports and the new one has 3 available while charging. The only thing that has changed is now you can use the charging port for IO when not charging.

The MacBook Escape (which history says will be the most popular new MacBook Pro by far) has only two USB-C ports and a headphone jack. There are no other ports at all.


I really dig that name, it'd be awesome if it catches on.


People don't use as many ports as you think they do.


Two USB ports PLUS two thunderbolt/mini display PLUS an HDMI and 1/8 analog, all while charging. That's 6 vs. 3 available ports.


> If you want to charge your laptop, you have to bring your charger.

That's not really correct regarding MacBooks. If you've got the same MagSafe adapter (gen 1 or gen 2) the charger will work. It just charges at a faster or slower rate than intended.


Only one major generation of MacBooks (2013~2016) used a different MagSafe connector. AFAIK any charger will work with any MacBook, just at different rates of charging.


Somewhere in Apple someone hated this - having a charger with a strain reducer will be amazing.


If you miss the safety aspect of MagSafe, Griffin Technology sell a USB-C charge cable for the 12" Macbook with a magsafe-like magnetic breakaway connector:

https://griffintechnology.com/us/breaksafe-magnetic-usb-c-po...

It's not Thunderbolt 3 compatible yet but making a version that is would appear to be a no-brainer.


Wattage issue aside, it's half an inch of protruding metal.

What I really want (and I'm prepared to pay up to $100 for) is a low-profile (near-flush-mount, maybe 3mm max) Magsafe 2 female to USB Type-C male adapter that I can leave permanently in one of the ports.

I've read that Apple have consistently declined to license the Magsafe patent, so I'd settle for a magsafe-like solution.


As I noted in another comment, it's 60W rated while the new charger is 87W http://www.apple.com/shop/product/MNF82LL/A/87w-usb-c-power-.... Confusion indeed.


Phil Schiller stated in the presentation that the new LG USB-C monitors provide enough power with 60W to charge the new 15" Macbook Pro models.

The charger offers 87W because it is meant to be able to provide enough power to charge the battery while the CPU/GPU/external devices run at 100%.

The worst you are going to experience with 60W while everything is running at 100% is that the battery isn't charging, or is charging very slowly.


I have my MBPr (2013) running fine with a MB 60W Magsafe charger for overnight charging (think it's from my 2008 unibody MB). Been this way for 2 years, no problems.


If you're charging it overnight, you probably won't notice, but it does charge more slowly.

If you're running on the dedicated GPU with high load, a smaller adapter might not have enough power to charge the battery at all.


Yeah. You will notice that if the battery is dead flat it won't let you turn the laptop on for a bit with a 60W charger but with the 85W or whatever it is it will power up immediately.


I don't think it matters. I charge my MBP using a smaller MBA charger and vice versa with no ill effects.


It does matter that you're putting more current over the cable than it's designed for. Overloading conductors can be a melty, flamey time.


I think it just charges slower, it doesn't magically pump 87 watts over the cable, just 60.


The default condition of cables is that they can conduct much more current than is safe, and it takes extra effort to limit this. (Yes, there's a limit to what you can put through a wire, but it's enough to melt the metal, and you want to avoid softening the plastic around it.) If you put twice the rated current through a cable, the conductors will handle it just fine. However, the resistance of the wires will dissipate 4x the power the insulation is designed for (twice the current times twice the voltage drop), and the inside of the insulation will get somewhat warmer. How much of a problem this is in real life depends on the ambient temperature, the thermal conductivity of the insulation, jacket, and wire, the temp the plastics can withstand, etc.

The safe thing is to use wire from a vendor that rates it honestly or conservatively, and stay within the ratings.
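
The 4x figure is just I^2 R heating in the cable's own resistance; a quick worked example (the resistance value is illustrative):

    # Heat dissipated in the cable itself scales as I^2 * R.
    R = 0.2                 # example: total loop resistance, ohms
    for amps in (3, 6):     # rated current vs. double the rating
        print(f"{amps}A -> {amps**2 * R:.1f}W lost as heat in the cable")
    # 3A -> 1.8W
    # 6A -> 7.2W   (twice the current, four times the heat)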


Only if it detects it uses a 60W cable. Which I really doubt.

You can pump 100 watts through a 60W rated cable - it will probably work because the rule is "users are dumbasses, rate it for 60 on paper and 120 in reality".

Or it could heat up, maybe to the point of damage.


You guys really should spend a few moments doing a quick Google search before responding. USB doesn't just magically deliver power; it's negotiated between the device and the power source. In the case of USB-C, the cable is involved in the negotiation.

See slides 7&9 here specifically: http://www.usb.org/developers/powerdelivery/PD_1.0_Introduct...

More details here: http://www.usb.org/developers/powerdelivery/
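
The happy path in those slides is a short message exchange; the message names below are the real PD ones, while the code itself is only a toy transcript:

    # USB Power Delivery negotiation, happy path (toy model):
    # 1. Source_Capabilities: the charger advertises what it can supply.
    source_caps = [(5, 3.0), (9, 3.0), (15, 3.0), (20, 4.35)]  # (volts, amps)
    # 2. Request: the sink picks one of the advertised levels.
    volts, amps = max(source_caps, key=lambda va: va[0] * va[1])
    # 3. Accept, then PS_RDY once the supply has switched over.
    print(f"Accept + PS_RDY: {volts}V @ {amps}A = {volts * amps:.0f}W")
    # VBUS stays at plain 5V until this exchange completes, which is
    # why a charger "doesn't just magically deliver power".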


Until that negotiation doesn't quite work right.

https://plus.google.com/+BensonLeung/posts/TkAnhK84TT7


It's a whole 60W A/C adapter package, not just the cable. The adapter isn't pumping out more wattage than it's rated for.


It doesn't look like it. All of the pictures show just a cable, and the description has

> In the box: BreakSafe USB-C cable with quick disconnect magnetic connector


Yes, just because you charge up a computer with a 60W A/C adapter but it can take 87W, doesn't mean the A/C adapter is all of a sudden putting out 87W. The cable itself doesn't create more or less flow.


I live off-grid and observe my 13" MBP with an 85W power supply actually pulling around 10-19 watts during typical usage... (my inverter has an output readout) FWIW... doubtless heavy gaming would be different... compiling gets it up to the upper 30s...


How do you know?

There is nothing in the cable that would prevent the current from flowing.


[flagged]


It should still pass 87W through; it might heat up a tiny bit more, but I really, REALLY doubt that a power cord rated for 60W would be unable to handle 87W too, the safety margin would be way too low.

Also, your dramatic learning experience is dumb; shorting a car battery will probably do a few hundred amperes, two orders of magnitude more than in any laptop charger ever...


USB cables use resistors to indicate how much current they are rated for, so the charger would limit the current to avoid melting your cable.

(That's why out-of-spec cables are dangerous)
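
The advertisement works through a pull-up resistor (Rp) on the CC line; the sink measures the resulting voltage to learn its current budget. The resistor values below are from the Type-C spec; the table form is just for illustration:

    # Rp pull-up on CC (referenced to 5V) -> advertised source current:
    RP_ADVERTISEMENT = {
        56_000: "default USB power (500mA / 900mA)",
        22_000: "1.5A",
        10_000: "3.0A",
    }
    # The infamous bad cables Benson Leung reviewed: an A-to-C cable
    # whose C plug carries the 10k "3A" resistor invites the phone to
    # pull 3A from a USB-A port never designed to supply it.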


Well... Anything that requires more than 3 pins (which is apparently all that power requires) is going to be much harder to build. A 24-pin, magnetically aligned connector seems pretty difficult to me.


As I've mentioned before, they really should have recessed this port so a solution like BreakSafe could be done and not stick out like an exposed wart. Ideally such a device would also be first-party (with 3rd party version available to reduce price).


That's a really big wart on the outside of the computer. Hopefully it won't need to be that big on the Mac.


It seems more likely that we're in the midst of the early days of a transition away from USB-A cables to USB-C for most uses. There are a lot of advantages, namely connector size, reversibility, full power delivery, the unification of ports with Thunderbolt 3 and DisplayPort, etc. Plus, it's designed to be future proof for some time. Maybe I've missed something, but is there a specific benefit to USB-A beyond the current ecosystem and common use? Not that I'm discounting that. I'm not a dongle fan by any stretch of the imagination, though small USB-C hubs can alleviate the use of multiple dongles somewhat.

But I think that, as peripherals and other devices begin to switch over to USB-C, we'll eventually see dongle use become less and less common except for older peripherals and devices. And that's nothing new. Eventually, it'll go the way of the PS/2 port: included on motherboards, but not really used much (relatively speaking; the analogy breaks down a bit when you think of gamers and the mechanical keyboard community, where it still gets some use for n-key rollover).

Transitions between interfaces, as a matter of principle, suck. But we wind up going along with them because there's benefit to do so. I'm curious how this will play out in five, or even ten, years. Not that I want those years to speed by to find out :).


n-key rollover is a bit sad: there's nothing preventing USB from offering more than 6-key rollover; it's just not in the basic standard.


Razer or somebody could introduce a protocol using isochronous or interrupt transfers or something, and others could adopt it; the device would still support the HID boot protocol so it would retain compatibility. This would also offer better support for analog inputs or higher-resolution mice.


> Maybe I've missed something, but is there a specific benefit to USB-A beyond the current ecosystem and common use?

I'm worried about the DRM situation, myself. My suspicion is that all of the engineering effort that should have gone into ensuring cable quality and spec compliance is instead being spent on new, innovative ways to lock me out of my own hardware.


USB-A was large enough to fit entire devices (wireless, storage, crypto) inside the port, whereas the USB-C equivalents protrude, and risk getting snapped off.


We'll almost certainly see all those things shrink to USB-C size.


Maybe there could be some kind of switch program. Exchange old cables for USB-C + adapters at amortized costs... Just wondering.


Eventually there should be type-c charge cables that also act as usb hubs (if there aren't already). This is part of the promise of unifying charging and data into one port.


I have a Dell XPS13 Developer Edition that came with such a hub (they call it a dock, but you don't dock to it, you plug it in to the charge port with a USB-C cable). It has DVI, USB, Ethernet, and charges the laptop.

It's a nice concept but it does not work well under Linux. Most times I plug in the dock, a random port will not work.


That wouldn't be the TB15 dock, would it? I have one and it has issues even on Windows sometimes.


The new 5k monitor sold by Apple[1] has display, charge power (back to the Macbook) and a hub for 3 extra ports all off a single cable.

[1] http://www.apple.com/shop/product/HKN62LL/A/lg-ultrafine-5k-...


Right, we are definitely seeing this happening already, but I meant specifically chargers for on the go. There's no really good reason they can't also be hubs. There just aren't a lot of them yet.


Apple already sells something like that:

http://www.apple.com/shop/product/MJ1K2AM/A/usb-c-digital-av...

You can pass power through it and it adds USB-A and HDMI.


The removable cable is a huge win, though. The magsafe cables are horrendously prone to fraying and cable jacket tears. My own Magsafe 2 cable's jacket inexplicably tore after a month of using it.


There are pretty interesting options that we didn't have before with Magsafe.

You can connect a single USB-C port to a monitor that has a USB-C video in. This monitor will then charge your Macbook and on top of that if the monitor has a built in USB hub it will also do that.

Or if your monitor doesn't support this then you can buy a dock/hub to which you connect your charger, monitors etc and the Mac with one cable.

For me personally that's what I've been missing most since I switched from Lenovo to Macs a few years ago - a docking station. For me plugging in only a single cable is as good as that.



Except... BreakSafe is rated up to 60 watts (20 volts @ 3 amps) but the new Apple MacBook Pro is 87W http://www.apple.com/shop/product/MNF82LL/A/87w-usb-c-power-....


60W is fine unless you're pegging your CPU/GPU constantly.


For anyone reading this thread, using a 60W cable for a >60W load is NOT FINE.

Quote: "60W is fine unless you're pegging your CPU/GPU constantly."

Translation: Your house will only burn down if your laptop goes into a busy loop.


"MacBook Pro (15-inch, Late 2016) draws up to 85W. Use the Apple USB-C charge cable that came with your MacBook Pro, or a certified USB-C cable supporting 5A (100W), to power and charge your MacBook Pro (15-inch, Late 2016) at its full capability.

"MacBook Pro (13-inch, Late 2016, Four Thunderbolt 3 Ports) and MacBook (13-inch, Late 2016, Two Thunderbolt 3 Ports) draw up to 60W."

"You should not connect any power supply that exceeds 100W, as it might damage your Mac.

"Using a power supply that doesn't provide sufficient power can result in slow or delayed charging. It's best to use the power supply that came with your Mac.

"MacBook Pro can receive a maximum of 60W of power through the Apple USB-C VGA Multiport Adapter or USB-C Digital AV Multiport Adapter. For the best charging performance on MacBook Pro (15-inch, Late 2016), connect the power supply directly to your Mac."

Source: Apple https://support.apple.com/en-us/HT207256


You're incorrect. Load is negotiated during the connection process.

Sure a bad USB-C cable that doesn't negotiate properly may be a danger, but likely the Macbook simply won't charge on such a cable.

I trust Apple knows what they're dealing with (in regards to poor 3rd-party cabling) and won't end up in a Note 7 situation.


But the charger will output 87W, the concern is whether it is safe to put that through this cable.


Yes, it is designed for the MacBook, not the Pro.


This is a problem, but only for users of the 15" model. The 13" pro (both versions) will only draw 60 watts.

https://support.apple.com/en-us/HT207256


> and having to sacrifice IO for charging while mobile seems like quite the headscratcher.

I have an odd hypothesis about this: the less time people spend charging/running from AC, which is inevitably going to happen when there is no separate power input that can be used easily, the more they'll be cycling the batteries, and the faster they'll wear out.


Whilst the Magsafe port has saved my bacon when tripping on a power cord, its second variant, Magsafe 2, has proved to be a tremendous headache.

On my 2015 Macbook Pro the port seems pretty finicky and requires fiddling with to get it to light up. The Magsafe connector on my Thunderbolt Display is even worse and a real lesson in patience. It's a Magsafe 1 port which requires a Magsafe 2 adapter - this thing is quite possibly the flakiest Apple hardware I've ever used - needing constant cleaning, fiddling with and blowing on. An absolute and utter piece of garbage - so yeah, I'm not going to miss Magsafe (version 2 at least). Although I suspect I might regret this statement the day I trip over my shiny Thunderbolt 3 cable...


To the downvoters regarding my post about Magsafe being flawed. Here's a reference: the Magsafe adapter scores 1.5 stars and has mostly negative feedback: http://www.apple.com/uk/shop/reviews/MC556B/C/apple-85w-mags...


It has been like that for years. Look at this partial list https://dcs.rutgers.edu/dcs-faculty-resources/classroom-tech...

For every button Apple removes they add at least one dongle.


Magsafe was fundamentally flawed. It required a delicate thin and light power cable to work, which was subject to mechanical failure.


I liked when we just had USB2 and all you had to do was try to plug it in once, flip it over, try to plug it in again, flip it over once more, now it plugs in, and now you're reasonably sure it'll work.

And now the near-future seems to be full of dongles, shame.


I agree, we are moving away from "if it's the right shape it's compatible" to "everything fits in the port but they also need to speak the same protocol to talk".

This isn't completely new, a lot of non-standard USB hardware needed a driver (possibly in userspace) to work. However it is going to be much wider.

I think we will need to find ways of showing consumers how to figure out whether things work together. The single-port solution is technically superior, but there are definitely human disadvantages that I think we can overcome with a little work.


You forgot the step where you realize it's in an Ethernet jack.


At least if you tried to plug a USB cable into an ethernet jack it would be immediately obvious that it wouldn't work. In today's world the wrong cable will plug in to the wrong port just fine, and you can be scratching your head for a long time trying to figure out why it's not working. And, if you are particularly unlucky, while you are puzzling it out your device can fry.

[UPDATE] Well, I stand corrected. It is actually possible to stuff a USB A connector into an ethernet socket. (You learn something new every day.) But come on, folks, telling the difference between a USB socket and an ethernet socket is really not that hard. You don't even have to look, you can do it by feel. But all these different USB-C/Thunderbolt ports are physically identical. They are literally and by design impossible to distinguish.


> At least if you tried to plug a USB cable into an ethernet jack it would be immediately obvious that it wouldn't work.

Not necessarily. Ethernet jacks are often wide enough that a USB-A plug can be partially inserted, and it's pretty common to stack the two types of ports. If one isn't looking directly at the connector (common when dealing with the back side of a tower or docking station), it's a surprisingly easy mistake to make.


That's a problem that can be solved instantly with a visual inspection. Not necessarily the same for a protocol mismatch.


I have accidentally put a USB device in my laptop's ethernet port several times because the laptop is on the back corner of my desk and all the ports are on the back, so the cables all go behind the desk, but because of the proximity to the wall I can't easily see the ports. USB and ethernet are right next to each other. Wouldn't have it any other way. Don't know what I will replace my T420s with.


"I liked when we just had USB2 and all you had to do was try to plug it in once, flip it over, try to plug it in again, flip it over once more, now it plugs in, and now you're reasonably sure it'll work."

Ah yes, the old proof that USB-A connectors exist in the 4th dimension. :)


> Ah yes, the old proof that USB-A connectors exist in the 4th dimension. :)

No, it proves that USB-A connectors have half-integer spin:

> https://en.wikipedia.org/w/index.php?title=Spin-%C2%BD&oldid...


Life pro tip: the correct orientation in which a USB cable is to be connected is usually indicated by the presence of the USB symbol.


Unless the people making the cable decided to put their logo on top and the USB symbol on the bottom. I have one such cable from Belkin.


> And now the near-future seems to be full of dongles

That's not necessarily a bad thing. With the right dongle you can now connect almost anything you want to a laptop. It was just a few years ago that it was simply impossible to connect some devices externally or at minimum there was a major performance penalty.


> try to plug it in once, flip it over, try to plug it in again, flip it over

"logo up or to the left"

All these old adages will be lost in Internet Time, like tears in the rain...


That doesn't really work when whatever you are inserting into is also flipped in the wrong direction.

And sometimes it takes a long time to look for the logo (both sides have different logos, etc.).


Yes, having to flip your connector twice is a pain, but having to flip your drivers twice is going to be a much bigger problem to a lot of people.


It is not easy to see, but we ARE on a converging path, instead of the other way around. Current high-speed off-board point-to-point data links (SATA, USB3, DisplayPort, PCIe, etc.) have converged onto some sort of 8b/10b differential signaling. We used to have totally separate OSI stacks, but now we are seeing the potential to leverage the same physical layer (i.e. USB Type-C). Sure, we would have to carry different protocols, but I am optimistic that eventually ASIC makers will roll out adaptive chips (just like the crossed-pair RJ45 problem @jws mentioned was solved by Auto MDI-X) that are smart enough to negotiate the correct protocol (MAC layer upwards) between the two sides.


I dunno... I've been meaning to write a post about multi-protocols on the same connector for a while.

Thunderbolt can run across DisplayPort and now both can run over USB-C. M.2 NGFF sockets can run mSATA and PCIe SSDs (either AHCI or NVMe), but not every M.2 or mini-PCIe Wi-Fi card will work in every laptop because of blacklisting (wtf?) even though they both attach to the exact same PCIe bus. Some M.2 sockets are only mSATA and can't take PCIe/NVMe drives. I'm pretty sure any M.2 drive can attach to the mSATA controller via an mPCIe adapter, but I'm willing to bet there are some boards that don't support that.

Why the hell are we doing multiple protocols on the same wire? It's beyond confusing. I thought with USB3.1, at least there'd be a universal cable, and I didn't realize until reading this article that the cables themselves are different per protocol!

I agree with what the writer is getting at. This is really confusing and it all feels like weird questionable design decisions.


> the cables themselves are different per protocol

I feel this is a transient rather than inherent problem of running different stacks over the same physical layer. If the physical layer can "miss pairs" then I agree we lose the whole point.

One way is to have some sort of official certificate system enforced by USB.org and adopted by major manufacturers. Basically "guaranteed to support a few pre-defined subsets of protocols".


> If the physical layer can "miss pairs" then I agree we lose the whole point.

It's not about missing pairs, Thunderbolt 3 cables are active in the sense that they contain electronics. That's why a Thunderbolt USB-C cable has much larger plugs than a regular USB-C cable.


Having never used Thunderbolt myself, I feel this "cable chip" is an intentionally crippled design that violates the end-to-end principle. Is there anything technical that prevents it from being absorbed into the interface controller chips inside devices?


I'm not a Thunderbolt expert, but I think the reason is twofold.

Thunderbolt was originally designed as an optical link, and optical connectors are highly problematic. Misalign the fiber just a tiny bit and you have considerable losses. So they went with a hybrid design where the converter is in the plugs and the connector is electrical. Later it was changed to all-electrical, but the design, in which the plug-to-plug connection and the plug-to-device interfaces are two separate things, was retained.

The second reason is that this design still makes sense. With traditional designs the driver in the end device is always a compromise. For example, an Ethernet driver has to be able to drive lines up to 100m long, but how many Ethernet lines are really that long? For Thunderbolt they decoupled the physical characteristics of the line driver from the driver in the end device. They basically move all the problems of the physical connection (line length, EMI and so on) from the end device to the cable.


It might be providing timing information for the SERDES based on the length of the cord.

Edit: it looks like it might be that and/or an active PHY similar to 10Gbps Ethernet.


The cables are generally backwards-compatible, but most people will buy the cheaper cables. And those won't work.


And then there's this: http://www.macrumors.com/2016/10/28/macbook-pro-tb3-reduced-...

"MacBook Pro (13-inch, Late 2016, Four Thunderbolt 3 Ports) The two right-hand ports deliver Thunderbolt 3 functionality, but have reduced PCI Express bandwidth."


Yes, because the PCIe root complex in the CPU can only connect one other device besides the southbridge, and that's used for the Thunderbolt controller on the left-hand side. The second Thunderbolt controller is connected to the southbridge (as are all the other PCIe peripherals), so it doesn't have the same number of PCIe lanes available as the one directly connected to the root complex.

Apple could have solved this by connecting a PCIe switch to the root complex and attaching both Thunderbolt controllers below it, but that would have consumed additional energy. Alternatively they could have used a beefier CPU with more PCIe root ports on the CPU, but I guess those available would have been too energy hungry. Which kind of means this is Intel's fault for not providing a low-energy chip with enough PCIe root ports on the CPU.

I'm wondering what the situation is like on the 15" version with discrete graphics. This would require 3 root ports directly on the CPU to drive both Thunderbolt controllers and the GPU with full speed, I assume that's indeed the case since it's not mentioned in the document.

Another thing not mentioned in the document is that energy consumption will be suboptimal if one device is attached on both sides of the machine because it prevents one of the Thunderbolt controllers from powering down. One should connect both devices on one side to improve battery life.

Edit: On Skylake the PCH is apparently optional, the functionality is mostly integrated into the CPU, so the limitation is really the number of lanes provided by the CPU, and this wasn't sufficient to connect both Thunderbolt controllers with 4x. The CPUs used in the 13" model all have 12 lanes, the ones in the 15" model have 16 lanes. So for the top-of-the-line model this could be 4x for each of the Thunderbolt controllers, 4x for the GPU, 2x for the SSD, 1x for Wifi, 1x for HD Audio?


Re: your edit, the PCIe lane configurations from the CPU aren't that flexible. There is no 4x+4x+4x+2x+1x+1x configuration.

4x Thunderbolt + 4x Thunderbolt + 8x GPU, with everything else on the PCH would make sense for the 15". Or maybe they connected the SSD (also 4x PCIe) to the CPU, and one of the Thunderbolt controllers to the PCH.

Hopefully they used the full set of CPU lanes. Most laptop manufacturers have a tendency to underutilize the CPU lanes and put things on the bandwidth-constrained PCH for some reason.


This feels like something Steve would not have let out the door, those kind of rules should never be the customer's concern.


Yes, he did. As nsxwolf commented, the 2010 MacBook Pro had USB 2 ports that ran at different speeds. And I remember vaguely that that wasn’t the first time.

It’s just not your problem because, really, how many people are going to be inconvenienced by the right Thunderbolt ports being slightly slower than the left Thunderbolt ports?


Full Thunderbolt 3 for all four ports on the 15"?

I mean, I'm not sure how you can get 40 GBit/s with 4x PCIe 3.0 in the first place; every quote I find says 32 GBit/s. Maybe there is that much overhead in Thunderbolt.

But surely 4x40 Gbit/s would require 16 lanes at least. I don't think Intel makes any consumer CPUs with more than 16.


The 40 GBit/s is definitely confusing.

My understanding is that that's total bandwidth across all protocols. So the mix of DisplayPort and PCIe 3.0 can't exceed 40 GBit/s. The 4x PCIe 3.0 on its own is 32 Gbit/s, as you mentioned.

Each controller provides two Thunderbolt ports, which share bandwidth. For the 15", 4x PCIe to the left side Thunderbolt controller, 4x PCIe to the right side Thunderbolt controller, and 8x to the GPU would be a sensible configuration. Though who knows if Apple took this approach.


What do you mean by consumer CPUs?

I've got an Ivy Bridge i7 with 40 lanes right now in my system.


4x PCIe 3.0 (~30Gbit) + USB 3.1 Gen 2 (10 Gbit) = the 40 GBit number quoted.
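
If you're wondering where the odd numbers come from, it's mostly encoding overhead (quick arithmetic, nothing more):

  # PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding,
  # hence the "32 Gbit/s" figure quoted for a 4x link.
  def pcie3_gbps(lanes):
      return lanes * 8.0 * 128 / 130

  print(round(pcie3_gbps(4), 1))  # -> 31.5, usually rounded to 32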


Apple has a "long" history of differentiating ports by left and right (e.g. https://news.ycombinator.com/item?id=12711127) which has been frustrating to me. For instance sometimes the left USB port "dies" after a sleep, while the right one keeps working. I agree it's going to be a mess for a long while, similar to HDMI cables that work or don't without any clear reason for the casual buyer.


Wasn't USB supposed to be the connector to rule them all? Now it seems we're going backwards where just because you have a connector doesn't mean it supports what you want.


That was always going to be the end result of a single connector standard. Not all devices are going to support all modes of use.

In the end, what's really going to dictate how this shakes out is what chipset manufacturers decide to do. And there aren't as many of those as there used to be. Most likely, eventually, all intel chipsets are going to do thunderbolt 3 over usb type-c and everyone else will follow.


Exactly. Bluetooth for example is the same: there are loads of different authentication and communication schemes depending on the device used. The difference of course is that it's mainly the designer who has to worry about it.


We are not going backwards. Only Apple is.


> Things only Apple can do


And some vendors are apparently not having all ports do all things. So a machine has two USB-C ports, one can charge at high power and one can't, but from the outside they look identical; plugging into either will charge, but the low-power one will take forever.

While I'm a big fan of backward compatibility, I feel that at some point it is better to start fresh rather than try to wedge another solution into the same mechanical configuration. And while I get that people didn't appreciate motherboards that went ISA->PCI->AGP->PCIe, it saved people from the frustration of plugging in cards that wouldn't work.


So one of the things I wonder here is, if I plug all 4 ports into chargers, will it charge faster? I know the onboard battery is current limited, but I wonder how it works in practice.


Since there will be many occasions where power is being passed in on more than one of the USB-C connectors, the new Macs are designed to take power from the ONE charger that offers the highest power (capped at some limit) and reject power from all others. So, plugging in all four will NOT charge faster than plugging in just the one offering the highest power. This is a very explicit, intentional design decision, not some hidden side effect of something else.
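
The selection rule boils down to something like this (a toy sketch of the behavior described above, not Apple's actual firmware logic):

  # Toy model: take the single highest-wattage offer, capped at a limit,
  # and ignore every other connected power source.
  MAX_ACCEPTED_W = 100  # USB PD currently tops out at 100W

  def choose_power_source(offers_w):
      """Given the wattage offered on each port, return the one to use."""
      return min(max(offers_w), MAX_ACCEPTED_W)

  print(choose_power_source([29, 61, 87]))  # -> 87; the 29W and 61W offers are rejected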


That also makes me wonder whether devices that both charge and provide power get confused. Is there a hierarchy of charging beyond simple amperes provided? Otherwise I could see plugging in a spec-max 100w portable battery and draining it prior to pulling from the wall charger.


>Your MacBook Pro only draws power from one power supply, even if more than one is attached—so using multiple power supplies will not speed up charging.

https://support.apple.com/en-us/HT207256 (scroll down or click "Charging MacBook Pro")


Interesting! I hadn't even considered this.


I once had a large plastic tub full of SCSI cables; there were around 10 kinds of connectors, in both male and female configurations, and about 10 different kinds of cables. Disk drives would have one kind of connector and often computers would have a different style of connector, necessitating lots of A-to-B connection issues. And the cables were expensive, often over $100. It was an absolute mess.

It seems that the USB-C connector, finally, represents a small, robust, easy to use connector that is capable of high data bandwidths.

It wouldn't make anything any better to have different connectors and different cables for charging, mice, keyboards, disk drives, monitors, etc. I just hope that I'll be able to get by with a handful of different lengths of the highest end cables (e.g. the thunderbolt 3) and use them for everything.


Read the article. That's the crux of the complaint -- that there isn't just one type of cable for USB-C. Heck, the USB-C power cable that comes with the new MBPs can only do power and USB2 data. See for yourself: http://www.apple.com/shop/product/MLL82AM/A/usb-c-charge-cab...


Yes, maybe I wasn't clear. The USB-C connectors look like they will simplify the end points so that's an improvement. We should need fewer cables.

I'm hoping that by buying the more capable cables (the Thunderbolt 3 versions) that I will be able to use them on all of my devices, even the ones that don't require Thunderbolt 3 like my 12" MB where I only use the USB-C connector for charging, USB3, and sometimes an external monitor (not at Thunderbolt 3 speed).


Yeah, but I think the point is you will never be able to do this. There is no "ultimate" USB-C cable that can just do everything. You have to use the cable that ships with your device, or risk not only losing performance but actually damaging your hardware.


The only problem is that TB3 cables seem to be length limited, compared to USB PD cables.


There's actually a semi-legit reason to make a usb2-only c-c cable: because it can get away with not having the high-(super)speed differential pairs, it can be thinner, lighter and cheaper than a full-function cable. Compare to charge-only microusb cables - they are indistinguishable from real cables, but lack critical functionality. If they were easily distinguishable, this would not be nearly as much of a problem.

One really major (to me at least) concern with moving from USB to thunderbolt is that thunderbolt is a PCIe connection, with the same security issues as firewire (a device can basically access all your RAM, extract keys and passwords, plant exploits etc). By bundling that into the same form factor as the (by comparison) far safer USB and hdmi/displayport we're putting users at risk.


Recent MacBook pros use the IOMMU for isolating PCI devices. With that, the devices can't read arbitrary ram (if Apple configures it correctly).


Back when we had different connectors for different things, we knew one thing: if it fit, it worked. But the proliferation of incompatible connectors, driven by the advancement of technology, meant that nothing fit! So we created one connector to rule them all: USB-C. Now everything fits, but nothing works.


"Everything fits but nothing works" - I love that!


This would probably be too hard & expensive to implement, but it seems like the most user-friendly solution: create a connection protocol where both ends send all possible protocols they can work with (standard codes could probably be established) and the cable would include a small chip & LED in both connectors - when you connect two devices, it would either light up green (both ends share at least one protocol) or red (incompatible).

For example, I plug one end into my phone and the other one into my sound card; if the phone doesn't support external sound cards, the cable would light up red - otherwise, it would flash green for a few seconds, then turn the light off.

Another (maybe more practical) solution would be to implement this in both ports, but some vendors may not comply with the specification. (With the cable approach, you could just buy only those cables that support the negotiation protocol - it would be your choice whether to pay a bit more for a better cable.)
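
The core of the check is just a set intersection; a toy sketch of the proposal (all the names here are invented):

  # Toy model of the proposed green/red compatibility light.
  def compatible(end_a, end_b, cable):
      """Green if both devices and the cable share at least one protocol."""
      return bool(end_a & end_b & cable)

  phone = {"usb2", "usb3", "dp_alt_mode"}
  sound_card = {"usb2", "usb3"}
  cheap_cable = {"usb2"}

  print("green" if compatible(phone, sound_card, cheap_cable) else "red")  # -> green (USB2 works)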


I say establish matched color-coded ends for both the cables and rings around the ports.


I think that's too subtle for many consumers and it has been hard for manufacturers to stick to that, especially when selling cords in specific shades of colors is good product differentiation.


It worked for PS/2 keyboards and mice. When Microsoft first introduced PC'97 everyone mocked it, saying that even if the computer had coloured ports you never had a coloured keyboard and mouse or vice versa - but eventually after a few years the standard became established enough. And that standard relied on some truly nasty shades of purple and green.


Food for thought: will USB-C be the "last" standard connector? Speaking in terms of the physical connector, not the data protocol. I'm sure it will eventually prove not to be, but it's got a lot going for it and I suspect it will last for a very long time. If USB-A was the dominant connector for nearly 20 years I think C could see a run of 50 years or more. RJ45 connectors are around 40 years old and aren't going anywhere soon. I wonder what the qualities would be of a connector to replace USB-C.


Well, Lightning (which is older) is already significantly thinner than USB-C, 1.5mm rather than 2.6mm, though it has fewer pins. Looking at this visual comparison, I'm a little concerned that USB-C will start being too thick if the phone thinness war ever starts up again:

http://josh-ua.co/blog/2015/3/15/usb-c-dimensions-size-compa...

Lightning also differs in not being hollow on the male side, which, aside from reducing thickness, apparently has both advantages and disadvantages for durability.


Wow, this is an interesting idea. It looks like using wires for I/O will be outmoded in the not-too-distant future and, who knows, maybe in 50 years they may be outmoded even for recharging. So this could truly be the last universal connector standard.


[flagged]


Since they're far from the only manufacturers of USB-C cables and don't own any controlling IP (like they do for Lightning for example) I doubt they will. As long as I can get compatible cables from legitimate manufacturers I don't much care what Apple charges for their versions.


Why doesn't the USB consortium standardize (and, ideally, enforce) labeling of ports and cables by capabilities? Kind of like washing instructions labels on clothes, only printed on the cable.

The ports on a laptop wouldn't have to be physically labeled if the OS could display a list of their capabilities in a user-friendly manner. Or, perhaps, they should have the most important label (e.g. thunderbolt or not). Something the committee would decide.


Well the USB IF does have standard naming and labels for the protocols. The issue is that USB Type-C is a cable/port standard, not a protocol standard. Makers ought to label the USB Type-C port with the highest USB speed it can handle ("SuperSpeed" or "SuperSpeed+" typically) but they don't. And Apple left the Thunderbolt icon off the ports on the new MacBook Pro! So buyers have no immediate way to know if a MacBook supports USB 3.1 gen 1, 3.1 gen 2, or TB3. (hint: gen1)


How would a consortium force you to print stuff on the cables you produce/sell?


By making it one of the conditions for licensing the specification and permission to use the USB trademark to you.


Well it worked for the color of the ports up until USB3.


USB3 ports and cables are (when spec compliant) easily distinguishable from USB2 due to them being blue. Why was the same not done for USB-C (black for USB3, red for Thunderbolt)?


How is it going to help a regular user? I doubt regular users even know what those colors are for; adding more colors just brings more complexity.


At least you (or technical support) can look up the difference. If the cables all look the same then you need to physically test the cables to know their capabilities, which is a waste of time and resources.


Colour is one of the easiest communication methods we have with 'regular users', as they already use it as a natural key elsewhere - even things like credit/loyalty cards (Chase, Virgin) or games consoles (green vs blue vs white).


Very interesting article.

Can anybody here maybe explain a little bit more about the video (DisplayPort) alternate mode? As far as I understand now, both USB3 and Thunderbolt support it, but they support it with a different DisplayPort standard. How will that work if I plug in a future monitor with USB-C? Will there first be some negotiation in which both devices clarify whether to use USB or Thunderbolt, and then another one in which the alternate mode is set? Or is DisplayPort directly available on some dedicated pins of the cable, and if yes, would it be the same for both cases? Or is DisplayPort somehow modulated/multiplexed onto the remaining data stream, and in a different fashion for USB3 than for Thunderbolt?


"Alternate Mode" means "using the same USB Type-C pins for other protocols". That protocol can be HDMI or DisplayPort or Thunderbolt or even analog audio!

The USB consortium specifies using HDMI 1.4b and DisplayPort 1.3 (and MHL 3.0) on the Type-C port. So non-Thunderbolt machines have these specs as a maximum.

Thunderbolt 3 can also pass video signals, and Intel specifies different versions of the protocols: HDMI 2.0 and DisplayPort 1.2. Again, these are the maximums.

So if you have a USB Type-C monitor (not Thunderbolt) or a native USB Type-C to video cable, you're limited to HDMI 1.4b (1x4K at 30 Hz) but instead you might be able to use DisplayPort 1.3 (1x 4K at 120 Hz). If your video card, connector, cable, and display supports it, of course.

If you have a Thunderbolt-native monitor, you might be able to do 2x DisplayPort 1.2 (4K at 75 Hz or 5K at 30 Hz) or 2x HDMI 2.0 (4K at 60 Hz and much more). If your video card, cable, and the Thunderbolt controller support it.

Essentially, you can pass video directly over the port (in USB Alternate Mode) or over Thunderbolt. Then there are external video adapters that use USB or Thunderbolt data. But that's not in scope of your question.
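
Collapsed into a lookup, the maximums above would be something like this (illustrative only; real support still depends on every link in the chain):

  # The per-path video maximums described above, as a simple lookup.
  VIDEO_MAXIMUMS = {
      ("usb_c_alt_mode", "hdmi"): "HDMI 1.4b: 1x 4K @ 30 Hz",
      ("usb_c_alt_mode", "displayport"): "DisplayPort 1.3: 1x 4K @ 120 Hz",
      ("thunderbolt_3", "hdmi"): "2x HDMI 2.0: 4K @ 60 Hz and much more",
      ("thunderbolt_3", "displayport"): "2x DisplayPort 1.2: 4K @ 75 Hz or 5K @ 30 Hz",
  }

  def max_video(path, protocol):
      return VIDEO_MAXIMUMS.get((path, protocol), "unknown / unsupported")

  print(max_video("thunderbolt_3", "displayport"))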


Thanks for the answer. Also to fragmede.

That means from USB-C I would either go to Thunderbolt or directly to DisplayPort alternate mode. But how would it go on from Thunderbolt mode to video (the DisplayPort 1.2 version that Intel specified)? Would Thunderbolt again dedicate some pins for it, or is the signal somehow multiplexed into a big Thunderbolt stream which carries everything?


Thunderbolt multiplexes that in, yes.

Although, more accurately, it’s actually just a PCIe connection – you can even connect a GPU via thunderbolt via USB-C


Best explanation I found was from the following forum post which was discussing why the 4K LG only has USB2 ports on the back. Will quote it here. http://forums.macrumors.com/threads/apple-teams-up-with-lg-f...

With the new displays, its the 4k one that only supports USB 2.0 speeds, the 5k one has "Three USB-C (USB 3.1 Gen 1, 5 Gbps)".

Looks like the two displays are actually using different technologies: the 4k one connects via "plain" USB-C, which means it uses USB-C's "DisplayPort Alternate Mode". USB-C has 4 pairs of high speed data wires plus 1 pair for "legacy" USB-2 signals. In DisplayPort Alternate Mode, some or all of the high-speed pairs are physically dedicated to DisplayPort signals. Unfortunately, Intel's USB-C/TB3 controller only supports DisplayPort 1.2, so to get 4k at a decent refresh rate takes over all 4 of those pairs just to run the display, leaving only the legacy USB-2 channel for other uses.

As for 5k, you can't do that with just 4 DisplayPort 1.2 data pairs - remember the Dell 5k display needs two DisplayPort 1.2 cables. However, Thunderbolt 3 works in a different way: rather than physically allocating wires to DisplayPort alone, the displayport data gets moshed together with the rest of the Thunderbolt signal so the display & peripheral data shares the same physical wires - the DisplayPort connection is a "virtual" one. Plus, Thunderbolt 3 has enough bandwidth to provide two virtual DisplayPort cables (8 virtual "wires") down a single physical cable, so it can drive a 5k display and still fit some high-speed USB data around the edges.

That said - it probably doesn't make sense to hang your USB3.0 RAID array off the same pipe as a bandwidth-hogging 5k display. If you're gonna go 5k and want seriously fast external storage you can probably forget the whole single-cable docking thing until Thunderbolt 4 rolls around (...by which time you'll probably want a 25k display wall...)

USB-C DisplayPort mode can support 5k by using DisplayPort 1.3's higher data rate (although, again, that only leaves the legacy USB 2) but not when the computer's usb-c controller and GPU only support DisplayPort 1.2 - and that currently includes anything with an Intel USB-C controller (which is part of their Thunderbolt 3 controller). Complaints on a post card to Intel.
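
To put rough numbers on why four DP 1.2 lanes handle 4K but not 5K (back-of-the-envelope math, assuming ~20% blanking overhead):

  # DisplayPort 1.2 HBR2: 5.4 Gbps per lane raw, 8b/10b encoded.
  usable_per_lane = 5.4 * 8 / 10  # ~4.32 Gbps usable

  def needed_gbps(w, h, hz, bpp=24, blanking=1.2):
      # crude estimate: pixels/s * bits/pixel, plus blanking overhead
      return w * h * hz * bpp * blanking / 1e9

  print(round(4 * usable_per_lane, 1))          # ~17.3 Gbps across all 4 lanes
  print(round(needed_gbps(3840, 2160, 60), 1))  # 4K@60 ~14.3 Gbps: fits
  print(round(needed_gbps(5120, 2880, 60), 1))  # 5K@60 ~25.5 Gbps: does not fit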


Fantastic explanation. Right on!


No one can say at this point what your hypothetical future monitor will use. There's a standard, of course, but there's more than one of them, of course.

There's DisplayPort, HDMI (yes, it's different from DP), and now Thunderbolt 3 Alternate Mode for USB Type-C, and then also since it's still USB, there will also be video over USB (aka the displaylink chipset).

Alternate Mode is a feature of USB Type-C that allows a bunch of pins to be used however the host and device negotiate, keeping the USB control pins intact, so Thunderbolt 3 is just Thunderbolt using the AM pins after negotiation.

I'll bet that if Apple starts making 5k-iMacs-without-the-iMac (aka monitors) again, in true Apple fashion they'll run video over Thunderbolt only, just like the now-discontinued Thunderbolt-only displays did. This means there's a high likelihood they won't work with PCs, despite having a working cable.


Thanks!

Regarding Apple monitors: Apple seems to be partnering with LG now instead of producing its own monitors: https://www.computerbase.de/2016-10/lg-ultrafine-4k-5k-monit... . Sorry, I have only the German article here, but there's maybe also one in English somewhere. The info on the connector is also somewhat vague. Maybe it's the small model using USB3 alternate mode and the bigger 5k one Thunderbolt mode, which would make it Mac-only. For the expected Apple customers it probably doesn't matter as long as it works with the MBP. However, for other potential users (who do not seem to be officially supported) it would be quite interesting.


I'm not an expert, but my understanding is closer to your dedicated-pins option. Basically everything that connects over USB-C starts with the USB-C handshake; after that, alternate mode allows USB-C to get out of the way and (most of) the pins of the cable can be used for whatever the devices want.


Color-coding would have helped solve the cable mess, especially for users who are not tech-savvy. Imagine if USB-C were green, Thunderbolt blue, Lightning yellow, and DisplayPort red. If you have a rat's nest of cables behind your desk, it becomes easier to tell what's what, which makes it more manageable and less frustrating.

Also easier to guide non-tech-savvy family and friends on the phone: "See the red cable? Is one end plugged into the monitor?" "Yes" "Good, now plug the other end into the laptop."

is a better conversation than:

"There are a dozen cables here, all alike!"


I'm sure that would be great for everyone except Apple. Even blue ports (probably the least offensive color) are too ugly for Apple hardware :)


I think I can answer at least one of the questions, on why make 2.0 only C cables.

When I got my Nexus 5X, I bought some assorted A to C cables to go with it. I noticed that the 3.0-capable cables are awfully thick and heavy, and not so convenient to carry around with a mobile device. I bought some 2.0-only A to C cables that are much thinner, lighter, and more flexible, and use those instead. Considering that I will basically never need the extra 3.0 speed for connection to my phone, I'll take the cheaper, lighter, more flexible cable every time.


Is the long term dongle free? C to C everywhere? My TV could have USB-C instead of HDMI ports. I could use the same cable I use for charging my phone to hook my laptop up to the TV.

This solves the cable problem (every cable should support the full spec) but it doesn't quite solve the support question. Just because I have a cable that works between my phone and TV doesn't mean it will actually do anything.


I hope so. It also solves the hub or dock problem in one go. I just get back to my desk, plug in one USB-C cable and I now have all of my devices plugged in.


There are Thunderbolt (2) hubs that allow you to do just that even now, a bit pricey though.

Also, I don't think USB-C is the deciding factor here; it'll still have to use Thunderbolt to provide the hub functionality.


>The core issue with USB-C is confusion: Not every USB-C cable, port, device, and power supply will be compatible

Not every USB-A port, device, cable, and power supply are compatible. I'm not sure I understand what his point is. That people who refuse to do research are going to occasionally run into incompatibility problems? Like they have since the dawn of the computing age? And?


The consequence of this incompatibility is damage to the host device; that's what's new. Never an issue with RS232!


As far as I'm aware it is only out-of-spec cables that can cause damage. If everything is in spec then the worst case is just reduced performance or the connection not working. I'd add that – IMO – if an out of spec cable causes damage, then the cable manufacturer should be held responsible.

I'm sure that an out of spec USB 2 cable can cause damage as well.


The out of spec cable problem is slowly beginning to resolve itself. Major online retailers, starting with Amazon, seem to be taking action. Between that and possible product liability issues for bad cables (though that's more hope than anything likely for various reasons), we can expect cable manufacturers to get with the program by the time USB-C is further mainstream. Right now, aside from some brand-new Android devices, USB-C is still trickling in on laptops and other devices.

The really crazy thing is that there's very little reason to build crappy cables. Per unit, your savings from cutting major corners are as close to zero as you can get. Judging by Benson Leung's Surjtech[0] testing woes, even cutting out any QA testing to save money wouldn't be sufficient to explain all of the cable problems. QA testing certainly can't explain how you can completely forget to wire in the SuperSpeed wires despite labeling it as such. That's not the sort of thing to happen as a one-off. Losing your sales channels, facing potential class action lawsuits, and seeing your brand dragged through the mud all add up to far more significant costs.

The biggest problem, IMO, will be when USB-C becomes prevalent enough that we see the $1 cables in checkout lines and gas stations. If there's anywhere we're going to see bad cables being hawked, it's there. Hopefully by that point, manufacturers will have gotten burned enough that good QA will become the norm.

0. http://arstechnica.com/gadgets/2016/02/google-engineer-finds...


Right -- it's cables that do not comply with the USB-C spec that can cause damage. It's not as if plugging a Thunderbolt 3 device into a laptop that doesn't support Thunderbolt will damage the laptop. It just won't work.


and trying to argue that out-of-spec cables causing damage to your computer is a deficiency in the spec makes about as much sense as saying the etherkiller demonstrates a problem with the RJ45 spec.

(http://etherkiller.org/)


> As far as I'm aware it is only out-of-spec cables that can cause damage

But see http://www.computerworld.com/article/3133627/technology-law-...


Hmm, don't get me started on standards-conformant +12V/-12V RS232 and common 0/5V TTL RS232.


RS232 is a protocol, not a port. RJ45 is a port that RS232 uses, and if you confuse that with Ethernet, you will definitely damage something.


But not by using the "wrong" cable. The cables for RS232 and ethernet alike are passive.


Power over Ethernet might be a surprise for an RS232 device.


Like what happened to R2D2 on Cloud City.


Correct. I was referring to POE scenarios.


Unless the pins were wired wrong.


The point is that due to the much better backward compatibility and do-it-all nature of USB Type-C, consumers are much more likely to run into incompatible devices and cables than ever before. Walk into any store and you'll see dozens of cables and peripherals that look the same but perform differently (or not at all) based on a confusing array of protocols and standards.

It's true that USB has always been a bit confusing, but now we have many, many more incompatibilities to worry about.


> Not every USB-A port, device, cable, and power supply are compatible.

While strictly true (and you could drag up some pretty bad abuses of USB-A, especially as used for mobile charging), the peripherals/protocols usually just work. But here, if you have a cable in your hand with a USB-C and a DisplayPort connector, it's pretty much anyone's guess whether a) it'll work with the given machine at all, and b) if it does, which version of DP will be supported with the given machine.


So you'd rather have 8 different standards? I know I love having to have a DVI, displayport, VGA, and HDMI adapter in my bag in order to have my bases covered for presentations.

I'm confused by the scenario you're imagining in which someone both doesn't have their own known-good cable, and is also in a place that doesn't provide a cable, where DisplayPort would even come into play.

I can think of exactly 0. If the projector needs a displayport capable usb-c, it's going to already be attached to the projector. If I need one for home, I'm going to buy one that supports it.


> So you'd rather have 8 different standards? I know I love having to have a DVI, displayport, VGA, and HDMI adapter in my bag in order to have my bases covered for presentations.

If there are 8 different cable types I'd rather have 8 different connectors, yes. Remember going around the office asking to borrow a cable to charge your phone and having to figure out micro vs mini vs Sony's weird thing vs Apple's weird thing? It was a bit silly, but at least having physically different connectors meant that you could tell whether a given cable would charge your phone or not. Now it sounds like that's no longer true.


Now imagine the future abuses to USB-C :)


A meta comment. There are many other threads lamenting that this is not a "Pro" machine, but all this cable discussion is not foreign to audio, video, and IT professionals and prosumers. If you want to get the most out of your pro computer's IO you will need to learn your cable specs and protocols.

It does look like the future will require some rebuilding of our cabling. I have a Thunderbolt hub that connects to my screen, my external Thunderbolt drive, and a plethora of USB devices. I only use a single Thunderbolt port on my laptop. I like this future. With these bandwidths I can see us connecting more interesting devices to our laptops.


This strikes me as being the same situation as HDMI cabling. Anyone who has bought a 1080p TV, then a 3D TV, then a 4K TV, then an HDR TV knows that not all HDMI cables are made equal. This is not great, but it's very far from a nightmare.


I was going to post this very same thing; the upside is that it'll be easier with USB-C because it'll appear in far more devices over time than HDMI.


Is it possible to make one Type-C cable that supports every possible protocol that can go over Type-C and works as long as the devices are compatible? I.e., if the two devices can talk over Type-C, then the cable will work?


Type C cables only come in two types: USB 2.0 and 3.0/3.1. USB 2.0 cables have fewer wires and only support USB 2.0 data and power. USB 3.1 cables have all the wires (and are expensive) and support higher speeds and alternate modes.

Finally, Thunderbolt 3 requires active cables for longer lengths and higher speeds. It can only do 40 Gbps with a 0.5m passive cable, and 20 Gbps with a 2m passive cable. Anything longer requires active Thunderbolt 3 cables.
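
As a quick reference, the passive-cable tradeoff looks roughly like this (just restating the figures above; the cutoffs are approximate):

  # Thunderbolt 3 speed vs. passive cable length, per the figures above.
  def tb3_max_gbps(length_m, active=False):
      if active:
          return 40       # active cables hold 40 Gbps at longer lengths
      if length_m <= 0.5:
          return 40
      if length_m <= 2.0:
          return 20
      return 0            # passive TB3 beyond 2m isn't supported

  print(tb3_max_gbps(0.5))        # -> 40
  print(tb3_max_gbps(2.0))        # -> 20
  print(tb3_max_gbps(2.0, True))  # -> 40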


It might be possible.

There is a bit of a problem with Thunderbolt's highest-speed mode (40 Gbit): it requires active cables, which have a chip at each end of the cable.


Yes, except for HDMI alt-mode. The problem is they tend to be expensive/stiff cables because of the need to support Thunderbolt 3.

But HDMI alt-mode is a cluster anyway, because it takes over the conductors used for powering the device, so hopefully it will just go away and be replaced by the (imo) superior DisplayPort.


One interesting thing is that when the port is the same for everything, the port itself (the shape, size, look and whether it matches the other thing you're looking at) ceases to be a useful interface for connecting things together physically. Instead we need other indicators, labels, and on-screen error messages to tell us those things, which is a much more indirect and less clear way of understanding connectivity.

Did anyone ever stop to ask if we really wanted everything to go through one port, even if everything wasn't really inter-compatible? I think we had it pretty right before, with a mix of ports, some of which were exclusive to a purpose (like HDMI, power, audio), some of which were generic (like USB, FireWire, Thunderbolt). Now we've removed clarity for what exactly? Aesthetics? "Simplicity?" The technological advancement of a single standard? There could be good reasons, but we should be aware of the usability tradeoff.


The USB 3.1 gen1 and gen2 thing still really boggles my mind. It's almost as if the USB-IF was trying to confuse people. Who retroactively renames a standard?


They should have just named things this way:

USB 3.1 gen1 => USB 3.1, USB 3.1 gen2 => USB 3.2

I don't know why they would do differently.


Agreed, 3 levels of versioning is too much for most users. Even 2 levels isn't ideal when dealing with a novice user.

  - Good: Thunderbolt 1, 2, 3
  - Good: USB, USB-2, USB-3
  - OK: DisplayPort 1.2, 1.3
  - OK: HDMI 1.2, 1.3a, 1.4, 2.0
  - Bad: USB Hi-Speed, SuperSpeed, SuperSpeed+
  - Bad: USB-3.1 gen 1, USB-3.1 gen 2, ...
  - Bad: LEV, ULEV, SULEV, PZEV, AT-PZEV


Yeah, it's one of the stupidest choices they made. Also, because not all cables must support 100W (20V, 5A), the cheap Chinese cables are all going to report that they can, but will catch fire the instant they try.


Well they did change the encoding of USB 3.0 to become USB 3.1 gen 1. But that's about it. And it's backward-compatible so it doesn't matter.

Having the higher data rate be "same name gen 2" is beyond dumb.


I'm a bit confused. The ports on the computers themselves can have different protocols - that makes sense. But the cables themselves can also support different protocols? Maybe I'm just naive, but can someone explain how a "dumb" cable supports different specs?


They aren't dumb cables anymore is why.

I'll use Thunderbolt 3 as the example because it's the most stringent. At the 40G signalling rate it uses, the connector loss is extremely high. The maximum passive cable you can have is << 1 m before the RX will start to encounter bit errors. This forces manufacturers to put re-drivers in the ends of the cable to boost the signal, which is obviously expensive.

Even just USB-C charging cables that need high-voltage have chips in each end to negotiate the correct protocol for both the devices and the cable.


Specifically, different cables support different bitrates and wattages. This is due to the number and thickness of the wires in the cable.

I think Monoprice's labeling of a 5Gbps or 10Gbps USB-C to USB-C cable is false, however. There is nothing about the USB 3.1 Gen 2 (10Gbps) spec that requires anything more than a regular USB3 cable. I think the "5Gbps" version is just an older product description from when 5Gbps was the fastest USB available.


Plus, as noted by lee_s2, the higher-cost/higher-protocol cables are "active" with chips in them.


Different grades of shielding determine what kind of bandwidth you can achieve over those cables.


USB standards bodies can borrow a page from the Ethernet port and signaling standards. The 10 Mbps to Gbps evolution does not have to be painful for users.


The 8P8C/RJ45 connector used from 10Mbps to 10Gbps might be the same on the ends, but you get different speeds when different cabling grades (Cat 3/5/6, etc.) are used.

While most 1Gbps equipment can work in 10/100Mbps mode, most 10Gbps equipment can only downgrade to 1Gbps mode.

It's actually a pretty good comparison to USB-C - lots of different modes with one physical interface and many things that don't work quite right or have significant caveats.


Start getting involved with 10Gb Ethernet and it is no longer trivial. You've got Cat 6, 6A, and 7, and all sorts of different length and shielding considerations and often don't get anywhere close to 10Gb.


At least for most Ethernet cables I buy, the category is actually printed on the cable itself. I don't see that happening on USB cables.


It would be nice to see something like #lanes@XGhz + 1 .


But Ethernet isn't included on many laptops anymore...

This rush to be "thin" has required changing the physical ports, and has contributed heavily to this issue.

For a few years there will be confusion as cables/connectors/standards shake out, but it's gonna be awesome after that when we have super thin laptops that just use one connector.


There is no reasonable excuse that a laptop need be so thin that it cannot accommodate an ethernet port.

Oh well, I'm a late-twenties dinosaur that still expects to be able to remove batteries and pop off panels to upgrade hard disks and RAM...


I bet it has more to do with the amount of use these ports get (i.e. fewer and fewer people using ethernet). And besides, when I get to work, I hook up my laptop to a plain old USB hub which has an ethernet connection.


Exactly. Anyone sitting down at a desk would prefer to plug in one cable that includes power, ethernet, keyboard, mouse, and external display. These Thunderbolt cables are the dream; it's just going to take a couple of years for the peripheral market to catch up (surprisingly, the iPhone is included in this).


Generally the cable hasn't mattered in consumer devices - as long as the device and cable are good and you plug it in to the right port, it'll work. A cable's a cable, after all, right? Unfortunately, that's not true, hasn't been true for a bit, and Apple's only partially to blame. DVI-A, anyone?

Some of Apple's dongles have a microcontroller inside in order to do the signal conversion, so it's a wonder they're only $30. That Lightning-to-3.5mm jack adapter that comes with the iPhone 7? Tiiiiny DAC - http://www.macrumors.com/2016/09/20/lightning-earpods-teardo... (The other option being dumb signaling, with the iPhone itself doing the DAC and passing the signal, as USB-C allows with alternate mode.)

Past Apple's dongle madness though, the bleeding edge of technology has always had a few edges. Despite the connector at the end fitting, HDMI 1.0 cables won't work where HDMI High Speed cables are necessary (though Monster cables are still a rip-off). High-end 4K TVs need the proper cables or else they won't work, just like a random cable with RJ45s on the end won't necessarily support gigabit connection speeds (or even support ethernet, for that matter).

If Monoprice listing all the possible variations of USB-C cables seems frustrating, and you're allergic to details, only buy the expensive Apple cables and certified Apple accessories and you'll be fine, same as it's always been.

If you need to venture outside their walled garden, yeah, there are some details to know about that the article doesn't go into, but I'm quite excited for what's become known as the USB-C connector to become the global consumer connector standard. Once that's true, we'll all need fewer weird dongles, and you'll always be able to charge your phone-that-has-USB-C (we'll see if the iPhone 8 picks up USB-C).

What the author glosses over in the article is actually an interesting part of USB Type-C spec, which is Alternate Mode. This allows a device and host to negotiate to speak something other than USB on the pins, be it video, networking, or in Apple's case Thunderbolt 3.

Apple's definitely gone and made things confusing with Thunderbolt 3 - for everyone else. Buying only Apple stuff is going to "just work" as long as you keep buying their newest shiniest gadget, and, well, they're in the business of selling gadgets.


I agree that it's exciting to have a durable/flippable cable that can be used for all sorts of things. The issue is that the nomenclature is unclear, with everyone just saying "USB-C" when that can mean all sorts of things.

I actually spent quite a lot of time talking about "Alternate Mode", I just didn't call it that all the way through.


The proliferation of various ports and interfaces has been disturbing, even in the PC arena where Thunderbolt is rather hypothetical.

Displays alone drive me insane these days. Twenty years ago, you had a VGA connector, and that was it. Then came DVI, which allegedly worked better with TFT panels. Then came HDMI, but there is also DisplayPort, which appears to be similar, yet different. I have not seen a display or projector that will accept DisplayPort input. Does such a thing even exist?

And laptops have, of course, the "mini" version of these, so there is mini-DisplayPort (which looks suspiciously like ThunderBolt) and there is mini-HDMI (which looks suspiciously like USB-C).

I am still telling myself this is a transition, and in five years everything will be USB-C. Once we are there, that sounds like a nice future, but I am not certain we'll get there in time. (Plus, a tea leaf got stuck in my Galaxy Tab's USB-C port while riding the train - it took me an hour to scrape and shake everything out before that thing could be charged again. Something that never happened to me with good old USB ports for some reason, even though they were much larger.)


Plus, we have all the old devices that still work just fine using the old ports. I've been using some LCD monitors for five or six years that are VGA-only, and they are still going strong, with no need or reason to replace them.

DisplayPort vs HDMI is one of the real bafflers. Graphics cards always seem to have one HDMI out and two or three DPs, yet I have never encountered a monitor that has DP-in ports, just VGA or HDMI.


Wow VGA... even 13-14 years ago my monitors were on DVI.

As for DP, you've never seen a 27 or 30" monitor? All high resolution (QHD, UHD, 4K, 5K) or high refresh (120/144 Hz) or G-sync/FreeSync monitors use DP as their main input source.


Okay, for higher resolutions that may be a thing. I have only seen one 27" display, and it was "only" 1920x1080, with one VGA, one DVI and one HDMI input. :-/

(Also, I have seen many PCs with builtin graphics that have no DVI output, only VGA and HDMI (and sometimes DisplayPort).)


I recently got a Dell P2715q and it has DP and HDMI only. As time goes on you're less likely to see DVI connectors on monitors.


> I have not seen a display or projector that will accept DisplayPort input.

Both my Dell U2412m at home and whatever Fujitsu I have at work have a DisplayPort connector. The Dell is at least 4 or 5 years old.

It's kinda handy; I plug in my MacBook 12" with a hub and an HDMI-to-DVI cable. My wife uses a mini-DisplayPort-to-DisplayPort cable to connect her Air to the same screen.


> I have not seen a display or projector that will accept DisplayPort input. Does such a thing even exist?

High-end displays with DisplayPort definitely exist. I'm not sure about projectors, though.


> so there is mini-DisplayPort (which looks suspiciously like Thunderbolt)

You're right: Thunderbolt 1 and 2 use the mini-DisplayPort connector, while Thunderbolt 3 uses the USB-C connector.


Thunderbolt 3 is a great thing (really), though some time is needed for the transition period.


That's just the start of it. So the new MBP has what, four USB-C ports.

Can I put power into all of them? What if I try to do 4xHDMI for all of them? Surely I can't connect four external graphics cards over Thunderbolt 3? Can I chain Thunderbolt devices?

The author also missed "audio accessory mode". That's right: in certain configurations, some of these USB-C pins can be repurposed for pumping out analog audio! Supported? Who knows.

I think before long every USB-C accessory will have to come with some sort of EEPROM that the host reads first to figure out 1) what is this you are plugging in and 2) is this going to work, so that there is at least some user feedback instead of "plain doesn't work" or "oops, now the port is dead".


They're already required to [0]:

> All USB-C to USB-C cables are considered full-featured USB Type-C cables and must be active, electronically marked cables that contain a chip with an ID function based on the configuration channel and vendor-defined messages (VDMs) from the USB Power Delivery 2.0 specification.

[0] : https://en.wikipedia.org/wiki/USB_Type-C
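
For the curious, that ID function means the cable's e-marker answers a Power Delivery "Discover Identity" request with descriptor words (VDOs) stating what the cable can do. A minimal sketch of decoding the two most useful fields of a passive-cable VDO, in Python (field offsets as published in the PD 2.0 spec; the sample value is invented for illustration):

    # Decode a USB PD 2.0 passive-cable VDO (illustrative sketch).
    SPEEDS = {0b000: "USB 2.0 only",
              0b001: "USB 3.1 Gen 1 (5 Gbps)",
              0b010: "USB 3.1 Gen 1 + Gen 2 (10 Gbps)"}
    CURRENTS = {0b01: "3 A", 0b10: "5 A"}

    def decode_passive_cable_vdo(vdo: int) -> dict:
        return {"usb_speed": SPEEDS.get(vdo & 0b111, "reserved"),
                "vbus_current": CURRENTS.get((vdo >> 5) & 0b11, "default")}

    print(decode_passive_cable_vdo(0b1000001))
    # -> {'usb_speed': 'USB 3.1 Gen 1 (5 Gbps)', 'vbus_current': '5 A'}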


The answers to all of the questions in your second paragraph are easily available [1], and in most cases the answer is yes (you can do 4x HDMI if they're 4K or less on the 15", 2K on the 13" - video card limitation, not port limitation).

[1] http://www.apple.com/macbook-pro/specs/


It says two displays for both, so the answer is no. I'm not sure a user is particularly interested in the video vs port distinction.

It also says nothing about the four external graphics cards, but frankly the answer is probably going to be no as well.

That's the principal problem here: there is now a port for things that the underlying hardware can't even offer.


> Can I put power into all of them?

Yes, though it will only charge from one of them. A neat consequence of this is that you can now connect multiple power supplies for redundancy. Not at all important for most people, but there are some scenarios where it's a nice feature to have available.


Crap. I missed that. You're right.


It is a pity that they didn't introduce a common color coding for USB-C connector cables.


How many color bands would you have to have on each end? The point of USB-C is that it can support many, many protocols. Labeling that is legitimately hard without a massive book.


At least some basic coloring would be nice. Let's say (for the sake of discussion) blue for USB-C (maybe a different shade for 5 vs 10 Gbit/s if you want), a black one for a Thunderbolt 3 connection, a white one that only has charging pins (hello, MacBook Air cable). Still better than the mess we have now.


There is a real evolutionary fight going on with connectors at the moment.

    > Apple's fastest growing product category.
This tweet highlights the Apple problem right now [0]. What is damaging to users is the cost/availability of connectors. When was the last time this connector nightmare played out? Token Ring/Ethernet? Serial/DB-x/USB? It pays to be a bit conservative in hardware choice at the moment.

[0] https://twitter.com/dbreunig/status/792034409788518401


The fact that certain devices can be damaged by the wrong cable is inexcusable.


This is like complaining "mv ~ /dev/null" does what it is meant to do.

I know

- "Keep It Simple, Stupid"

is a thing, but so is

- "UNIX was not designed to stop its users from doing stupid things, as that would also stop them from doing clever things."


Disagree. This is an engineering failure. You have to look at reality; it's often not easy to tell what kind of cable you are planning to use. Any command line interaction is decidedly more involved than the typical user plugging in a cable. This type of interaction should have been planned for and mitigated.

This also has nothing to do with UNIX.


It has to do with designing a system to be used by people. UNIX is a system designed to be used by people.

- "Any command line interaction is decidedly more involved than the typical user plugging in a cable."

Right. "The typical user plugging a cable". I expect the user to become a "typical user" after learning how to choose and use a cable, and getting acquainted with his hardware and software. One is not born a typical user, as you seem to imply (plugging a cable is not a complex task). It is. Everything is complex. Using the command line is complex, and then you factor your "typical uses" into aliases or scripts.

Or maybe you curl | bash scripts from the web, and then cry when they fail / your box catches internet aids.

Or you use an ipad for all the computing you do, and expect things to just work.

See a typical everyday use case: https://github.com/alex/what-happens-when

Did the user write "oogle.com" instead of google, and get malware? ...Inexcusable, as you said? Should it just work?

I say: "why did the user write oogle.com? Did he want malware?"

Simple stuff.


The article is pretty good, but this amount of hyperbole is really unforgivable: "If you’re not careful, you can neuter or even damage your devices by using the wrong cable." First of all, the linked post says that C-to-C cables do not have this problem at all. The issue comes about in relation to how older standards report allowable power draw via resistor configuration. This is a problem that can only occur with legacy-to-C cables (USB-A-to-C and micro-B-to-C).
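
For context: a legacy-to-C cable advertises how much current a sink may draw with a pull-up resistor (Rp) on the CC wire, and the cables Benson Leung flagged were A-to-C cables wired with the 3 A resistor even though a USB-A port can't supply that. A rough sketch of the mapping, in Python (resistor values are the Type-C spec's Rp-to-5V values; the check encodes the rule that A-to-C cables must advertise default power):

    # Legacy Type-C cables advertise current with a CC pull-up (Rp to 5 V).
    RP_OHMS_TO_CURRENT = {
        56_000: "default USB power (500 mA USB 2.0 / 900 mA USB 3.x)",
        22_000: "1.5 A",
        10_000: "3.0 A",
    }

    def check_a_to_c_cable(rp_ohms: int) -> str:
        advertised = RP_OHMS_TO_CURRENT.get(rp_ohms, "unknown")
        if rp_ohms != 56_000:   # A-to-C must advertise default power only
            return "out of spec: advertises " + advertised
        return "ok: advertises " + advertised

    print(check_a_to_c_cable(10_000))  # the classic bad-cable case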


No, you can damage your devices with C-to-C cables: https://plus.google.com/u/0/+BensonLeung/posts/HakwCMmd346


  Q: Do C to C cables have the same problem?
  No. C to C cables do not have the same problem because they are required to be straight pass through


Did you actually read it? Like maybe the part with the question regarding damaging devices with C-to-C cables?


I think this understates the number of ways you can connect a monitor over USB-C if anything. Let's see, there's Alternate Mode DisplayPort, Alternate Mode HDMI, Alternate Mode Thunderbolt's video support, Thunderbolt to an external GPU, USB 3.0 graphics, possibly more, most or all of which can be converted to HDMI with different compatibility and performance tradeoffs.


...and most of which won't work with a given display. So you have to figure out which of the dozen or so possibilities works with your monitor and computer, and buy the right cable to go between them.


The politics behind standards committees are horrible. Just call the new standard USB 4.0, which supports Alternate Mode, Power Delivery, ...


4 is unlucky in some places. They'd probably go to 5.0 next.


Tetraphobia is the practice of avoiding instances of the number 4. It is a superstition most common in East Asian nations. … The Chinese word for four sounds quite similar to the word for death in many varieties of Chinese. Similarly, the Sino-Japanese, Sino-Korean, and Sino-Vietnamese words for four sound similar or identical to death in each language.

Wikipedia's "Examples of sensitivity to tetraphobia applied" section is interesting:

https://en.wikipedia.org/wiki/Tetraphobia


Sounds appropriate for this mess.


I was in Shenzhen a couple of days ago, and faced with the prospect of having different types of USB-C cables, I bought one of each standard (Thunderbolt, USB 3.0 and 3.1).

I now have a problem because I don't remember which colour is which. Is there a way to find out without having to break the nicely braided cables?


You could try them with appropriate peripherals and see if they connect at max throughput... But that's going to be very hard since there are pretty much no Thunderbolt 3 peripherals out there.


And here I am wishing HDBaseT would catch on, but I don't think the people behind that (and USB-C) have any intention of actually making our lives easier -- it's just about having something new to sell :(

http://hdbaset.org/


This makes my brain hurt:

Thunderbolt 3 is really an “Alternate Mode” use of the Type-C port/cable, just like HDMI. But in practice, Thunderbolt 3 is a super-set of USB 3.1 for USB-C since no implementation of Thunderbolt 3 will be USB 2.0 only.

Anybody care to explain?


"Alternate Mode" means "use these same pins for something other than USB". So Thunderbolt 3 uses the same connector but re-purposes the pins and wires to carry 2 or 4 PCIe 3.0 lanes rather than USB. HDMI similarly repurposes these pins to carry traditional HDMI signals.

Anyone implementing Thunderbolt 3 will also be implementing the full USB 3.1 stack in the same chipset. Intel, for example. It would be silly to implement just Alternate Mode Thunderbolt and skip the USB 3.1. That's all I meant.
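
To make that concrete: the negotiation itself runs over the CC wire using USB PD structured VDMs. Here's a sketch of the sequence for DisplayPort, in Python - `port` is a hypothetical PD-controller API, but the message names and the DisplayPort SVID come from the published specs:

    DP_SVID = 0xFF01  # DisplayPort Alternate Mode's Standard/Vendor ID

    def enter_displayport_alt_mode(port) -> bool:
        port.send_vdm("Discover Identity")       # what is on the other end?
        svids = port.send_vdm("Discover SVIDs")  # which alt modes are offered?
        if DP_SVID not in svids:
            return False                         # stay in plain USB
        modes = port.send_vdm("Discover Modes", svid=DP_SVID)
        port.send_vdm("Enter Mode", svid=DP_SVID, mode=modes[0])
        # From here on, the SuperSpeed pins carry DisplayPort, not USB.
        return True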


Here's a slightly unrelated question -

What happens if you plug in 4 power cords into the new MacBooks ?


Apple's documentation mentions that if you plug in multiple power sources, the MBP will draw from only one at a time.


Well, a similar situation exists with UTP cables, where Cat 5 offers 100 Mbps, Cat 5e offers 1000 Mbps, and Cat 6 offers 10000 Mbps. They all look exactly the same unless you go and read the label on the cable and are familiar with this.


It's even worse: getting consistent 10 Gbps on Cat 6 depends on cable length and electrical interference. Cat 6A and 7 also start coming into play for long cable runs and make the situation even more complicated.
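
Roughly, the common 10GBASE-T guidance looks like this (approximate figures, heavily dependent on installation quality; a sketch, not the TIA specs):

    # Approximate 10GBASE-T reach by cable category (common guidance).
    REACH_10GBASET_METERS = {
        "Cat 5e": 0,    # not rated for 10GBASE-T at all
        "Cat 6": 55,    # up to ~55 m in low-interference environments
        "Cat 6A": 100,
        "Cat 7": 100,
    }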


Yes, though crosstalk and interference are an issue with most cables. There is STP (shielded twisted pair rather than unshielded), which protects against external interference.


Such mixed feelings about this. Really nice content, but misleading title. :(

I already knew all that, but I appreciate the write up for others who don't already know all those details. I'm an enthusiast and obsessed with these details of ports, protocols and cables. I predicted this a year ago [0] and I'm very happy with this outcome. Yes, it's a transition period, which is unpleasant every time, but we will be in a fantastic state in a few years.

[0] https://twitter.com/shurcool/status/607351368387469312


What is so crazy about this is, if you can't risk just using any cable that fits in a socket because you could damage your hardware, end users would be better off with a completely unique shape for every cable.

This is a giant leap backwards.


Where does it say you would damage your hardware? Alt modes are negotiated with a standardized protocol as is power transfer. The defaults should be electrically compatible with all devices, though not necessarily functional.


    "If you’re not careful, you can neuter or even damage your devices by using the wrong cable. Seriously: Using the wrong cable can damage your machine! This should not be possible, but there it is."


This refers to out-of-spec cables, not the incorrect type of cable. As long as your cable is within spec, you won't have issues.


The mistake this article makes is thinking that the typical person will interact with many USB-C ports and cables other than his or her own. The reality is that people will get to know their own ports, buy their own cables and devices, and things will work 99% of the time.

Only occasionally they'll need to use a friend or coworker's device or cable and then there could be confusion. Although, even then, assuming the friend also has one of the most popular computers/phones/cables, it'll probably still work.


Ah for a world where you never have to go and make a presentation in another office.


Yah, apparently you don't have my wife/kids/in-laws/cousins/etc., where the rule for USB chargers seems to be: plug your phone into whatever charger you happen to find, and possibly take the cable/charger home when you're done, leaving your old crappy one behind.


Are you willing to bet your computer and/or peripherals on that?


> Thunderbolt 3 requires a special cable

Apparently this isn't quite right. You can use normal passive USB 3 cables to get 40Gbps at very short lengths and 20Gbps at medium lengths.

Unless they're too low quality.


It would be nice if calling things "Total Nightmare" did not become a trend.

I've seen more of this negative rhetoric lately and I suspect it's influenced by Trump's speech patterns of describing everything as a "Total Disaster, Sad!"

It's not a constructive way of speaking, and it's hurtful and discouraging to whatever or whomever it is criticizing. That's likely why Trump does it.

How about changing the title to "USB-C adapter confusion: what can we do to improve this?"


This just sounds like the same whining people did when the Mac went to USB back in 1998. "You mean I need an ADAPTER for my SCSI device?!"


Tim Cook is the Steve Ballmer of Apple. Ballmer led Microsoft to near oblivion. It's really nice to see Microsoft change into a more open company... I'm getting more and more impressed with things like the Linux subsystem.

I haven't bought a Mac or an iPhone in a while because their hardware is terrible compared to their competitors'. Gimmicky features like 3D Touch (I haven't used it once, intentionally), unnatural scrolling, and this Touch Bar are things I'll probably use once or twice. Literally the only reason I stick with OSX is that it's a commercially supported Unix system with a nice user interface.

What I don't understand is the "pro" in the name. Doesn't a "pro"fessional need to do things with their computer outside of a coffee shop; usable I/O, gigabit ethernet, slots for interfacing with their other professional equipment, etc. I can totally understand these features in a consumer edition laptop. But there is no longer a reason to call these "pro" laptops.

The silver lining I guess is maybe Apple drives a new wave of people to desktop Linux and we can finally get a nice, modern, desktop environment. Either that or another project to get OSX running on [superior] non-Apple hardware.

Anyway, just my opinions. I wonder if anyone has similar thoughts.


I'm not an Apple fan, but you seem pretty harsh.

- 3d touch: gimmick

- unnatural scrolling: I presume you mean "scroll the content", not "scroll the viewport". I thought the same, but when I tried it out myself I adapted really quickly and now find it more natural. Either way, this doesn't seem like a huge investment in anything; if you don't like it, turn it off.

- touch bar: I actually think this will be quite useful. It turns a row of numbered keys into a row of keys that can be matched to the application that is running. Maybe a slight downside to power users (less tactile) but I think it is a pretty reasonable feature.

- gigabit ethernet: Can't you get this over USB-C?

- slots for interacting... Maybe not right now but I think that soon everything will be USB-C. I don't really see any reason for another port.

- maybe Apple drives a new wave of people to desktop Linux: I can only hope :)

I can definitely see that, especially in the short term, the advantages aren't clear or significant. However, I am really looking forward to the universality of USB-C, and most of the other features appear to have a market, so I'm not so much losing faith in Apple overall as feeling that they are moving further away from the demographic I am in.


You didn't buy an iPhone in 'a while', but you imply that you own one where you can (unintentionally) use 3D Touch, so either a 6S or a 7. So you haven't bought an iPhone since... last year?


> What I don't understand is the "pro" in the name. Doesn't a "pro"fessional need to do things with their computer outside of a coffee shop; usable I/O, gigabit ethernet, slots for interfacing with their other professional equipment, etc. I can totally understand these features in a consumer edition laptop. But there is no longer a reason to call these "pro" laptops.

You can buy a single ~$80 hub that gives you all of those ports, AND you only need to plug 1 cable into your computer when you sit down at your desk. If you buy a new USB-C monitor, then all of those things are integrated in the display. Even the charger.

With multiple Thunderbolt 3 ports at 40Gbps, it has more I/O throughput than pretty much any laptop out there. Even most desktops don't have _multiple_ 40G ports.


Rather off-topic don't you think?


Pointless Mac vs PC debates are never off topic on internet forums :-)


> we can finally get a nice, modern, desktop environment

Have you tried Gnome 3? Not being sarcastic :). These days it's a very stable, slick environment with a very good window manager. Beats the pants off of OS X at least.


>Have you tried Gnome 3? Not being sarcastic :). These days it's a very stable, slick environment with a very good window manager. Beats the pants off of OS X at least.

I used to use OSX but now I use Arch with GNOME3, and your comment is not based in reality. GNOME3 doesn't even have feature parity with Snow Leopard. It's many years behind OSX in the features and polish department [0]. I encounter various GNOME3 issues and bugs daily and have subscribed to many issues on GNOME's Bugzilla to keep track of fixes. The worst issue, by far, is memory usage and memory leaks. On my PC, GNOME3 sometimes ends up using 1.5GB of RAM. GNOME is also infamous for being the only Linux DE that regularly removes features that people use [1].

Anyway, while I do like GNOME and have used it every day for the past 4 months, it's nowhere near OSX's Aqua.

[0] https://igurublog.wordpress.com/2012/11/05/gnome-et-al-rotti... [1] https://mail.gnome.org/archives/commits-list/2013-March/msg0...


> What I don't understand is the "pro" in the name

You can do all those things (and more) via Thunderbolt 3. You can even reuse old hardware by putting PCI-E cards into external enclosures. That's pretty "Pro", don't you think?


I imagine that there are very few companies worldwide that would refuse the sort of oblivion Steve Ballmer led Microsoft into


Does anybody know how the protocol/mode gets negotiated? It's unbelievable what capabilities such a small port has.


Well, that depends on the protocol being negotiated and the connectors and devices on either end, of course. Having just one or two methods of protocol negotiation would be too simple.


Well, I would expect that. But how is the detection done? I don't think USB was designed with an alternate mode/Thunderbolt in mind. I would expect some tricky solution to play nice with legacy USB devices, or not?


I think legacy USB 2 has dedicated pins and devices are expected to deal with all of the old legacy signalling methods in addition to the new ones.
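
For what it's worth, plain Type-C attach detection happens before any of that, using resistors on the two CC pins: a source pulls both CC pins up through Rp, a sink pulls its one connected CC pin down through Rd, and the pin where the voltage divider forms tells the source that something is attached and which way the plug is flipped. A sketch in Python (the voltage window is illustrative, not the spec's exact thresholds):

    def detect_attach(cc1_volts: float, cc2_volts: float) -> str:
        def looks_like_rd(v: float) -> bool:
            return 0.2 < v < 2.0   # illustrative Rp/Rd divider window
        if looks_like_rd(cc1_volts):
            return "attached, CC on CC1 (plug not flipped)"
        if looks_like_rd(cc2_volts):
            return "attached, CC on CC2 (plug flipped)"
        return "nothing attached"

    # Only after attach detection does USB Power Delivery run, as BMC-coded
    # messages on the active CC pin; alternate modes are negotiated on top.
    print(detect_attach(1.7, 0.0))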


You know, whenever there is bad press about an Apple product, they start to blacklist the reporting websites. Blacklisted companies don't get any pre-release news or products for review. I wonder how many websites have been blacklisted.


So am I right in understanding that… (i) When my new MacBook Pro arrives, I need to learn which is the Thunderbolt port. (ii) I can simplify matters by always buying the top-spec leads. If so, how do I know which to buy?


Answer for (i): They're all Thunderbolt.


But some might be slower than others. https://support.apple.com/en-us/HT207256


Is the article correct in claiming that the CABLES are backward-compatible?

I have a Nexus 5x, which uses USB-C. I want to buy a cable to charge it and connect it to my computer. Would a Thunderbolt 3 cable work?


Yes - Thunderbolt 3 cables are required to support the full 100W of power. It sounds like there are only really four types of USB-C cable, at least for the time being: USB 2, USB 3 without power, USB 3 with power, and Thunderbolt 3.


Interesting read on the technical details of the ports between the 13"/15" and port placement: "Thunderbolt 3 Ports on Right Side of 13-Inch MacBook Pro Have Reduced PCI Express Bandwidth" ~ http://www.macrumors.com/2016/10/28/macbook-pro-tb3-reduced-...


USB needs to go back to being "universal". The USB spec has been getting progressively more complicated over the years; it's time to cut back.


Has it ever been "universal"?


Yes, hence the name. It's quite easy to implement a USB 1 controller, and there was very little difference across implementations.


I'm reasonably certain there have always been differences in power delivery and connectors.


What a bunch of alarmist BS. Drop all the exclamation points. No one takes you seriously if you end every damn sentence that way.



I think the nightmare is if we give up and go back to separate ports and cables for various things.


What happened to Thunderbolt 3 over USB 3.1 Alternate Mode?


It's in there.


Anyone have a mirror?




I activated W3TC and CloudFlare so it should be working now


I love


> Although it looks exactly the same as a regular USB-C cable, you need a special Thunderbolt 3 cable to use Thunderbolt 3 devices!

They're clearly putting a lot of lead in the water in Cupertino lately.


Intel's headquarters are in Santa Clara. https://en.m.wikipedia.org/wiki/Thunderbolt_(interface)


Worst-case scenario: USB-C is gonna ruin the entire PC/Mac industry due to confusion and potential damage.


huge overreaction



