Hacker News
Apple Lightning (2020) (nyansatan.github.io)
279 points by aix1 on Dec 28, 2022 | 140 comments



I was disappointed the article didn’t include details about the physical connector. Lightning’s performance specs might be outdated, but I’ve always liked its physical interface. It’s rugged, reliable, simple, and pops in with satisfying but subtle tactile feedback.


>> It’s rugged, reliable

In my experience it works well for 2-3 years then wears out and charging becomes very difficult after that. The connector also breaks easily as it is essentially a totally exposed fragment of a circuit board.

The design was light years ahead of everything else for a long time but these days, everything should be USB-C.


I’ve never found Lightning connectors to fail, but the cable to connector junction does seem to wear out after a few years, especially on Apple cables with that terrible vinyl-like stress relief material.

But USB-C is pretty young. It remains to be seen if cables and connectors are really durable forever.

I am still suspicious of the complexity of the nested male/female design with the inner male blade being part of the device and hard/impossible to fix if broken. . . but have not seen any problems on my increasing number of USB-C devices yet.


I've broken one USB-C port since 2017, and it was in a power bank. The arcing was brief, and the smoke release moderate. Once I teased the remaining wire fragments inside the socket apart, I filled the socket with hot glue, and continue to use the power bank today.


I've broken every USB-C cable I've had except the last one (and one other that I lost), so I now have to use that one to charge everything :|


Charging issues after a couple of years are usually due to lint accumulation in the phone opening. A little careful time with a toothpick will usually get it sorted.


It isn't just that. It physically wears out as well.


One pin on the cable (male part) becomes oxidized (black) after time. One can rub the black stuff away using a regular rubber and it usually works again.


I've used contact cleaner and toothpicks to solve this problem before. It only works so many times, eventually it's just totally shot. I've never had a cable fray but I've had 3 or 4 lightning connectors become useless.


My first gen iPad Pro and iPhone 6s Plus say otherwise.


I didn't find lightning to fail so much as have the female end fill up with lint. Go digging with a wooden toothpick, pull out some bellybutton lint and it worked good as new.


I've always had more issues with the USB-C connector than with Lightning; it's very sensitive, and lint/dust gets into both ends of it. I absolutely hate the latching mechanism of USB-C.


I agree. I still use the iPhone dock that supports the whole weight of the phone with the lightning port. I've had no issues.

It was totally designed to enable a minimalist Apple retail store aesthetic. I'm convinced that's the only reason for the design. Totally unnecessary design constraint, but very cool...talk about a company with a holistic view of their products, thinking about how details of the design will aid retail display.

I'm still looking forward to a USB-C iPhone, if Apple doesn't go portless (fingers crossed on that).


The male part of the connector isn't the issue. It's the spring inside the device that keeps it plugged in (obviously doesn't matter with the dock because gravity does that) that wears down.


Agree, I do enjoy the feeling of plugging in a Lightning cable more than a USB C cable, but the spring-loaded mechanism wears over time and eventually there’s no snap at all.


From my experience, when there is no more snap it tends to be an accumulation of dust and grime in the port preventing the cable from fully seating. I usually clear it out carefully with a sim card ejection tool.


I use a sharpened wooden toothpick to prevent any possibility of shorting the pins


Look closely at a lightning cable. The edge is metal. When you plug in a lightning cable, you briefly short every single pin together. There's probably a greater risk of bending a pin inside the port than there is shorting pins out.


Yes, it’s a similar design to RJ45, where angled pins are lifted by the leading edge of the connector. Lightning doesn’t have grooves to guide the pins, but the tight tolerance and setback of the pins mean you really can’t land a pin on the wrong contact, let alone short across them.


Probably a good precaution to take - but I've always recklessly used a paperclip. Been doing it since the first Lightning connector devices came out and have never had any negative effects from doing so. Seems pretty robust.


I've had good luck with poster-tack goo... just have to make sure not to leave any behind.


Yup, have done this many times; my phone is definitely due for it.


Lightning is capable of USB 3 speeds. The first-generation iPad Pro had a Lightning port and supported USB 3 speeds.


Right. And likewise I doubt the USB-C iPhone is going to have USB 3 capability - given that Apple's had the opportunity to put USB 3 in those phones for a while now. For whatever reason it's not a cost they feel like paying, even on the Pro phones that shoot in ProRes and presumably could benefit from quick offload.


I agree. We already see that in the latest low-end iPad, which has USB-C but only transfers data at USB 2 speeds.

Maybe the Pro models will have the full USB C capabilities - video over USB C and full USB 3 speeds like the iPad Pro.


Rightly or wrongly, Apple is driving to pure wireless and portless devices.

The best hope for USB 3 will be an updated MagSafe connector on a portless phone, using 60 GHz networking from phone to MagSafe and USB-C from MagSafe out.


Are they? The only device that is pure wireless is the Watch.

The AirPods Pro 2 would have been the perfect opportunity to go wireless. They already work with both standard Qi chargers and the non-standard Watch charger.


I'm sure they are.

It reduces the attack surface (you can't break into the phone by plugging something into it)... which also means that jailbreaking will be a bit more difficult.

It also completely removes them from the debates and regulation of connectors. Lightning? Micro USB? USB-C? All of that goes away.

By removing cords and cables it also removes the "my device doesn't work because I was using a flaky 3rd party cable" ongoing support questions.


Why would you think that the EU can't regulate whatever wireless standards Apple chooses to use in the future as they are doing with current wireless technology?


Mostly because Apple is already aligned with the major technology (it's not something they're going off and doing on their own). The Qi wireless charging standard is one that a lot of phones and accessories are using ( https://makezens.com/phones-with-wireless-charging-technolog... )

> The MagSafe Duo Charger conveniently charges your compatible iPhone, Apple Watch, Wireless Charging Case for AirPods, and other Qi-certified devices. Just place your devices on the charger and a steady, efficient charge begins on contact. The charger folds together neatly so you can easily take it with you wherever you go.

They'd need to move the entirety of the wireless power transfer devices and accessories to some other standard.


It sounds to me like you described a situation where the regulations are working as intended.

In the scenario you describe, the EU wants Apple to remove a nonstandard lightning connector and replace it with a standard one for increased interoperability and decreased e-waste.

It looks like what you describe is exactly what happened, which is exactly what the EU wanted.


Don't put it past them to mandate that all phones must have a physical port, and that it must be compliant with a blessed EU standard.


You say this like it's a bad thing.

Why do you think it's a bad thing?

Do you think governments mandating that analog telephones use RJ-11 and twisted pair was bad?

What about mandates on electrical receptacle standards?


Did it require a special lightning cable (with more data pins) to use those speeds? If it was the standard cable, it's perplexing why Apple hasn't used that in every "Pro" iPhone/iPad.


So it’s more complicated than I thought.

When the iPad Pro was a host - i.e. you connected a USB 3 storage device to it via the Camera Connection Kit, basically a Lightning-to-USB adapter - you could get USB 3 speeds.

But if you tried to connect it to your computer, it would be USB 2.


> But if you tried to connect it to your computer, it would be USB 2.

And I believe this is because all Lightning to USB cables (be it USB-A or USB-C) only support USB 2.0.

The Camera Connection Kit supports/supported USB 3.0.


Via a weird hack where they used the pins on both sides of the port simultaneously. I imagine they had some issues with it since, as a feature, it didn't last too long.


Reliable? I find these cables break over and over. Many last only two months or so.

At the same time, my USB-C cables just work.


You must switch devices every year or two... Every old iPhone or iPad I've seen, it eventually wears out and becomes loose.


There's a lot more information on this stuff in the Chinese and Vietnamese parts of the internet, the parts that Apple's lawyers can't easily reach (nor Google and most other mainstream search engines these days, for better or worse). Repair and third-party accessory industries are also increasingly keeping what they've RE'd to themselves, because they know they're fighting a war against Apple with what they've found, and no one is going to dare pop the golden goose.

If the references to authentication and certificates weren't enough, there is definitely some DRM crypto going on in there. Yet the Chinese were able to clone the 1-wire authentication chip within a week or so of the official products' release.


> There's a lot more information on this stuff in the Chinese and Vietnamese part of the internet, the part that Apple's lawyers can't easily reach

Where? Will you post some links?


As far as I know often it’s just a late night run of exactly the same factory that makes the legitimate products.


No, they actually cloned the authentication chip using an 8051:

https://www.eevblog.com/forum/oshw/oshw-apple-lightning-conn...

It's likely they reverse-engineered and managed to extract a key from a real one.


This makes me hopeful that VIN spoofing on Tesla superchargers with a cloned adapter will work at some point for other vehicles. If a proprietary cable auth chip can get reversed that hard - why not an "open spec" ev charging protocol/cable/adapter?


> This makes me hopeful that VIN spoofing on Tesla superchargers with a cloned adapter will work at some point for other vehicles.

Some jurisdictions treat stealing energy as a crime. Germany, for example, has up to five years of prison on the books [1].

Kids, don't mess around with energy if you don't exactly know what the f..k you are doing and know the legal and technical risks involved.

[1] https://www.gesetze-im-internet.de/stgb/__248c.html


1. That's not really a deterrent to the hacker mindset, necessarily.

2. The idea is you'd be tricking the supercharger into letting you pay for it, not necessarily stealing it.


That law doesn't seem to apply.


I think the point was that stealing energy is likely to be seen as a crime in general, not that this specific law applies to this specific hypothetical.

In general stealing anything is frowned upon and is at best a temporary imbalance in technical ability to steal versus economic incentives to stop theft.


Someone using a VIN spoofer to free-ride or enjoy lower rates on Tesla's supercharger with their own vehicle is definitely acting unauthorized.

Besides, courts have ruled that even charging a phone at your workplace without authorization fulfilled the criteria, the case was only thrown out later on because firing someone over stealing 1.6 cents worth of electricity after 19 years of employment is ridiculous [1].

[1] https://www.rodorf.de/03_stgb/bt_14_248c.htm


Yes, but unauthorized is not what the law says. The law is about using a wrong conductor to steal energy. Not the case when the charging hardware is not modified.


On a related tangent: I'm really puzzled by these PWM-based _digital_ signaling protocols --- more so (apparently in this case) where the period isn't even fixed. The most prominent example is probably the WS2812 RGB LED, a.k.a. NeoPixel. I can get it when a PWM line carries an analog-ish duty cycle signal, but using positive and negative _widths_ to signal bit-for-bit when you control both sides of the link doesn't make sense to me engineering-wise --- if not for speed (think USB3+), not for robustness (think something like Ethernet), and not for impl simplicity (think I2C/SPI/SDIO), then for what? Maximum pain when implemented by a third party? (/s but not really)


The article mentions the protocol is very similar to the 1-Wire protocol, and links to the Wikipedia article, which provides some helpful clues: https://en.wikipedia.org/wiki/1-Wire

> To send a binary number "1", the bus master sends a very brief (1–15 μs) low pulse. To send a binary number "0", the master sends a 60 μs low pulse. The falling (negative) edge of the pulse is used to start a monostable multivibrator in the slave device. The multivibrator in the slave reads the data line about 30 μs after the falling edge. The slave's internal timer is an inexpensive analog timer. It has analog tolerances that affect its timing accuracy. Therefore, the pulses are calculated to be within margins. Therefore, the "0" pulses have to be 60 μs long, and the "1" pulses can't be longer than 15 μs.

The other “simple” protocols you mention all require a clock line, whereas 1-Wire/IDBUS do not. UART comes closer, except it requires both devices maintain their own independent, reasonably accurate clocks. (How do you tell whether that long high pulse was 4 or 5 1-bits in a row if your clock is terrible?) If you’re trying to design a very simple system that needs to work with wildly inaccurate clocks, you need some kind of self-clocking protocol, like this, where the difference between a 1 and a 0 bit is an order-of-magnitude difference in pulse width.
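To make that concrete, here's a minimal sketch (mine, not from the article) of how a 1-Wire-style receiver can recover bits with only a crude timer, using the numbers quoted from Wikipedia above: sample the line roughly 30 µs after each falling edge. A "1" pulse (at most 15 µs) has already returned high by then; a "0" pulse (60 µs) is still low.

```python
# Hypothetical sketch of 1-Wire-style bit recovery using only pulse widths.
# Timing numbers are the ones quoted from Wikipedia above. No shared clock
# is needed: the receiver samples the line ~30 us after each falling edge.

SAMPLE_DELAY_US = 30  # slave samples this long after the falling edge

def decode_bit(low_pulse_us: float) -> int:
    # A "1" is a short low pulse (1-15 us): by 30 us the line is high again.
    # A "0" is a long low pulse (60 us): at 30 us the line is still low.
    line_high_at_sample = low_pulse_us < SAMPLE_DELAY_US
    return 1 if line_high_at_sample else 0

def decode_bits(low_pulses_us):
    return [decode_bit(p) for p in low_pulses_us]
```

Even a sloppy analog timer works here, because the two pulse classes differ by roughly 4x in width — which is the whole point of the width-based encoding.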

When I was a teen, a friend and I built a device (for a competition) that could encode and decode messages onto a roll of receipt paper by moving the paper across a reader/writer head (a Sharpie attached to a servo, and a photosensor) to draw and read lines. Clocking turned out to be by far the hardest problem, because the speed at which the paper moved was too unpredictable and inconsistent (it depended on battery level, inconsistencies in the motor, how heavy the roll was, how hard the marker was making contact with the paper, etc.). We spent many hours designing, testing, and tweaking the software and hardware until we arrived at basically this exact encoding scheme — plus a Hamming code based on letter frequency, since we were encoding English messages. We felt very proud of this elaborate way to encode text — until we realized we had just reinvented Morse code.


(correction: we used a Huffman code based on letter frequency — though we did also use a Hamming code for error correction).


Is it this protocol we're talking about? https://cdn-shop.adafruit.com/datasheets/WS2812.pdf (Your comment piqued my curiosity and I looked it up.)

I think the desire to only use a single wire for signalling is understandable, at least in setups where there are physical wires, rather than tracks on a flex PCB, running from LED to LED.

Since every rising edge is the start of a new bit, encoding each bit with an H followed by an L makes the signal self-clocking. With this, the two sides don't need very accurate -- or shared -- clocks.

With the tolerances specified in the datasheet, to tell a zero from a one the receiver only needs to tell whether a pulse is shorter (T0H) or longer (T1H) than 0.5 µs.

T0L and T1L are indistinguishable (their allowed durations overlap) and don't need to be measured by the receiver. That said, I do agree that it's weird that they weren't chosen such that T0H+T0L=T1H+T1L at the centre of the tolerance range.
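For illustration, the nominal numbers from that datasheet can be written down directly (my sketch, not from the datasheet itself): a receiver only has to threshold the high pulse, and the nominal 0-bit and 1-bit periods do indeed come out unequal.

```python
# Nominal WS2812 timings from the Adafruit datasheet linked above, in
# microseconds (each spec'd as +/-150 ns). Illustrative sketch only.

T0H, T0L = 0.35, 0.80  # high/low durations encoding a 0 bit
T1H, T1L = 0.70, 0.60  # high/low durations encoding a 1 bit

THRESHOLD_US = 0.5  # the receiver only needs to classify the high pulse

def classify(high_pulse_us: float) -> int:
    # The low period is ignored entirely; only the high width matters.
    return 1 if high_pulse_us > THRESHOLD_US else 0

# The asymmetry discussed here: nominal bit periods differ,
# T0H + T0L = 1.15 us vs T1H + T1L = 1.30 us.
```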


> That said, I do agree that it's weird that they weren't chosen such that T0H+T0L=T1H+T1L at the centre of the tolerance range.

It's likely they intended them to be the same, but for whatever reason design characterisation discovered that the distribution of periods was actually slightly different from the ideal design, so instead of attempting a potentially very expensive redesign, they just documented the reality and left it at that.


This looks very similar to 1-Wire, which is for ultra-low power devices that communicate on and are powered by the same wire. The reason the pulse widths are different is that the falling edge of the pulse kicks off an oscillator-based timer that triggers a read of the value of the line at some point between the two durations.


The WS2812 protocol is simpler than it first appears. Here's a great blog post about it:

https://wp.josh.com/2014/05/13/ws2812-neopixels-are-not-so-f...


I think others have covered it in other posts but I’d just like to add that this protocol is only for negotiation, everything continues in the native protocol afterwards.


good catch, vouched. you're shadowbanned for some reason, fyi.


Discussed at the time:

Apple Lightning - https://news.ycombinator.com/item?id=23705546 - July 2020 (229 comments)


I guess it shouldn't matter... but why does this author omit the period at the end of each paragraph? It irks me.


Only the author knows, but if I’m to infer, it’s a habit picked up from text messaging so as not to show aggression?

https://lifehacker.com/dont-use-periods-in-texts-1843744818


or he could be an old Algol programmer, and considers the period to be a statement separator, unnecessary/redundant at the end of paragraphs


See also, related: The Hitchhacker’s Guide to iPhone Lightning and JTAG Hacking [1] from DEF CON 30

[1]: https://youtube.com/watch?v=8p3Oi4DL0eI


I upvoted this immediately just because I liked the design of the website. In a way, web design is a kind of new rhetoric.


Really cool. Will this (new?) info be able to help Apple fans in any way? Still wondering when they will go USB-C.


> Still wondering when they will go USB-C.

The EU is forcing them to do so by mid-2024, so it is widely believed that the iPhone 15 will be USB-C.

The Siri Remote and AirPods charging cases already are.


> AirPods charging cases already are

They are not. Even latest models from September still use Lightning and wireless only.


Sadly true. I was planning on buying the new AirPods Pro as the rumor mill was saying they’d be USB-C, but alas they are still Lightning. Weirdly enough, the new Apple TV 4K remote is USB-C (both were released at the same time). I guess it makes sense to keep the AirPods on Lightning as long as the iPhone is. Hopefully if the iPhone 15 is USB-C they will release a new version of the AirPods with USB-C, too.

Until then I’m sticking with my 8 euro Xiaomi Bluetooth earbuds.


The Beats Fit Pro are USB-C, use the same chipset as the AirPods Pro, and are (for my particular ears) much more comfortable.


For the avoidance of doubt: they use the same chipset as the old AirPods Pro (H1), not the new H2 that's used in the latest AirPods Pro.


The USB-C Apple TV remote was shown about a month later than the still-Lightning AirPods Pro 2.


Ah yes, you are correct. I'm not sure why I thought that as I clearly remember them being two events now that you've made me think about it!


iPad Pros also use USB-C, so they already have the requisite support on the portable electronics side. Right now it seems mostly a transition for the phone accessories.

Personally, I think apple has wanted to switch to USB-C for a while on iPhone. They waited until the EU actually passed legislation so they could find a person to point fingers at when their customers complained about accessories breaking compatibility.


> iPad Pros also use USB-C, so they already have the requisite support on the portable electronics side.

It's always been there, lightning is just a connector, the physical protocol is USB.

> Personally, I think apple has wanted to switch to USB-C for a while on iPhone.

Yes, but also, after the Dock Connector -> Lightning transition they kinda promised there would be no new transition for a while, which was pretty safe at the time, but USB-C landed a few years later.

I actually expected the iPhone 14 to switch, as 2022 is the 10-year anniversary of Lightning, and about how long the Dock Connector lasted (although technically it survived until this year and the discontinuation of the iPod Touch).


> although technically it survived until this year and the discontinuation of the iPod Touch

The iPod Touch has been using Lightning since the 5th generation released in 2012.


When it was announced by apple they said the 30-pin connector has lasted a decade, and lightning was the connector for the next decade. That was September 2012.

The decade is up.

Personally I would be happy to keep Lightning. It’s simple, it works. I have many Lightning cables. If I see a Lightning cable, I know exactly what it’s for. With USB-C you can’t know the capability of the cable by looking at it.

But I suppose Lightning has had its term in office and it’s time to move on.


All iPad models use USB-C now. The only caveat to that is that when they introduced the new iPad model with USB-C, they kept the old model as part of the line-up and that old model still has Lightning.


There’s some variation between them when it comes to software though - with custom USB drivers only being available for the M1 based iPads that are supported via USBDriverKit

I’ve got an iPad mini and what I really want is a USB oscilloscope that works with it


> The Siri Remote and AirPods charging cases already are.

That's patently not true; I bought the latest AirPods Pro last week for my wife, and they still have that obsolete Lightning port.


My bad, I saw a headline about it, but it was a one-off retrofit: https://9to5mac.com/2022/05/10/usb-c-airpods/

It is in the works, though: https://www.theverge.com/2022/8/9/23298151/apple-usb-c-airpo...


They’ll just go wireless and remove the port entirely


“Going wireless” doesn’t satisfy “must charge over USB-C”.


That’s not what the regulation says. It says if there is a port, it must be USB-C.


How would that work with CarPlay, or how would someone charge on the go?


I don't understand.

Just place a phone/charging case on a Qi charger? Doesn't CarPlay support wireless communication? Android Auto does.


Cars didn’t really start getting wireless CarPlay until around 2021. It would take a while for those cars to age out. My guess is that if Apple makes a portless iPhone, they will also make a wireless to wired CarPlay bridge.


There are already third party wireless to wired CarPlay dongles.


There is wireless CarPlay now. But it does have higher latency.


If Apple answer to that regulation is to go wireless and use a system that's compatible with other brands, that is a massive victory for the EU and the planet in terms of reduced waste.

The goal was to reduce waste and they already included in the same regulation that if companies don't agree on a common tech for wireless (like it is now) but go their own way to sell exclusive ones that can't be used with other brands then they would regulate that too.


Charging cables contribute such a minuscule amount to landfills it’s laughable. It’s feel good legislation that distracts from real environmentally conscious efforts.


Meanwhile, the power loss associated with mandatory wireless charging over the typical phone's lifetime will impose a carbon cost of its own.


>Charging cables contribute such a minuscule amount to landfills it’s laughable.

Do you also include the actual charger and not only the cable? Can you show me your numbers? Do you mean by weight/volume, or CO2 wasted in creating and shipping the chargers?

But there are more benefits to using a standard port: I can use a Samsung tablet charger with my Asus phone, and I can share one charger between two products instead of buying a new one if the old one is gone.


Apple chargers are USB chargers. To charge a Lightning device you use a USB to Lightning cable.


So can you use your Apple cable with the charger from a Samsung ?


Yes. And you can use your Samsung cable with a charger from Apple.


Yes


So why are some Apple fans complaining? They keep their chargers and will need a new cable when buying a new phone. Is Apple exploiting this and selling cables at giant prices? Also, if the iPad and the laptops already use USB-C, then you can now share the same cables if you are out of battery and don't have a phone charger with you.


IDK why people are complaining. For me the biggest issue was shared chargers, which was solved with USB-C-to-Lightning. USB-C is a mess of which cable does what. The biggest problem with Lightning isn't the connector but the slow data speeds. That's why it made sense to move the iPads. Though now, with iPhones taking 4K+ video, they need to move them also.

But, IMO, from a cable and connector standpoint, USB-C is worse than Lightning.


Every USB-C cable can do at least as much as every Lightning cable.

You can completely ignore the different cable capabilities.


Yes


Removing the connector entirely has the downside of creating a ton of dead or difficult-to-recover devices if there's a bug in the later steps of the boot ROM that provide some basic flashing interface. Samsung learned that the hard way recently with their faulty GVI3 update for the Watch 4 series [1]. It bricked the watch so hard it couldn't boot to WiFi-capable Odin mode any more, and lower-level interfaces on the SoC were not accessible because there were no ports.

[1] https://www.sammobile.com/news/new-galaxy-watch-4-update-sol...


Oh, sure, they will allow standard Qi charging - but they will also offer an Apple-branded charger that is incompatible with other devices and will charge a lot faster. Never underestimate the capacity for malicious compliance, especially when it comes to laws that are considered unfair.

Also, my arguably 20 year old high school-level knowledge tells me that charging by induction will by necessity waste more electric energy than a cable would.


You are right. Phones get pretty warm when using wireless charging.

On the iPhone X it was so bad that you couldn't use Facetime while on a wireless charger, because the phone would throttle the CPU after 10 minutes or so because the phone got too hot.

Not sure if they fixed it on later models, but I guess this is why they are limiting charging power. Glass and stainless steel aren't particularly good heat conductors.

Maybe MagSafe supports more power because the exact alignment reduces waste heat.


iPhones are compatible with the Qi standard which is limited to 7.5 watts.

MagSafe however can charge up to 15 watts.

So in some respects it is still a proprietary connector.


Qi supports at least the Baseline Power Profile (which is 5 W) and the Extended Power Profile (which is 15 W). I'm not sure what Apple did, or why, to introduce 7.5 W, which is a proprietary Apple extension.


The Qi 1.2 standard is up to 15 watts.


Which was released in 2015 and Apple still does not support.

I doubt they will let anything compete with MagSafe short of more EU regulation.


I have a usb-c iPad. I do prefer the lightning connector but usb-c is a close second and leaps and bounds better than the previous usb phone interfaces.


Who cares about proprietary standards? Build and sell something for this connector and you'll be sued into oblivion by Apple's lawyers.


Yes because USB C is an open source standard that anyone can freely implement…



> All implementation examples and reference designs contained within this Specification are included as part of the limited patent license for those companies that execute the USB 3.0


True, though I imagine the agreement you’re thinking of isn’t the one they’re talking about here.

https://www.usb.org/document-library/usb-30-adopters-agreeme... is the agreement- it’s royalty free, basically “sign here, don’t patent our stuff”. Not “open source”, but it’s pretty reasonable for what you get IMO- we live in a world all about IP. Many international standards are worse. Plus, it’s not like they’re gonna sue you if you don’t sign it, unless you attempt patent shenanigans.

The $$$$ agreements for VIDs and trademark rights on the other hand… can’t say I’m too fond of those ones.


Apple Lightning was better than micro USB, but surely not better than USB-C (a standard which apple itself helped promulgate). The king is dead. Long live the king.


> but surely not better than USB-C

For data transfer speeds (assuming you picked the right cable), yes. For everything else, no. IMO, Lightning is the better physical connector. With USB-C, the device-side connector tends to be what fails; with Lightning, it tends to be the cable that fails, which is what you want in a failure. USB-C cable capability is a mess, though at least we're past USB-C cables frying devices.

As far as ubiquity is concerned, depending on location, Lightning can sometimes be easier to find than USB-C. They are at least equal in the US.


The biggest problem I have is: if there is some new, better standard, can we ever move to it?

micro-USB would likely not have improved much without some kind of competition. a big part of why USB-C is reversible was because Apple (and others) made reversible connectors and people showed a lot of preference for that.


Will we want to move to it? Lightning has lasted this long in spite of a better standard, and plenty still don't want to change. Even if there was no law, and there was a new port, I can't imagine consumers caring enough to change again, or manufacturers like Apple caring enough to change seeing as their iPhones are still USB 2.0. Like a physical network effect. Samsung's Galaxy Note 3 and Galaxy S5 had a microUSB 3.0 port. They switched back to microUSB 2.0 in the Note 4.

Still it doesn't prevent devices from having multiple ports. Newer tech will be picked up first by bigger devices like laptops that will be able to do USB-C and a new port at the same time. Or maybe a phone with two charging ports, the same way the ASUS ROG phone does.


Lightning still has better ergonomics and sturdiness compared to USB-C.


Do we need something better or is USB-C "good enough"? Micro-USB was clearly terrible and had to be replaced that much was clear since its very introduction, but USB-C seems pretty good. I don't see it as desirable to move to something "better" in the coming decades, considering the immense cost of switching connector interface on something like this. (That cost also applies to micro-USB FWIW, micro-USB is just so bad that it's worth it.)


I can already imagine that USB-C could be superseded by something that allows surface mounting the pins (similar to magsafe 1&2? like the old pebble watches had: https://www.joom.com/sv/products/1491901009384399338-201-1-2... ).

USB-C's limitations (that I am already aware of) are: that it consumes a lot of internal space in devices (which we're feeling less and less as our phones get bigger, but our watches are relegated to wireless charging with no data pins, for example); that it's not clear which cable/device can do what (something original USB tried to address with "host" and "client" connector classes); and that it handles comparatively fewer insertion/removal cycles (than Lightning, at least).

Regardless: I’m not sure that I am ever happy saying something is “good enough” in perpetuity, even if I can’t see all the potential shortcomings of device requirements right now.

You are absolutely right, though: the switching cost is high, and monopoly makes it higher.

(FWIW I agree with the EU on this decision, I remember phones before the common charger mandate and that was a nightmare)


> Do we need something better or is USB-C "good enough"?

USB-C won't be "good enough" until all I need to use USB-C devices with USB-C computers is USB-C cables.

Suppose I have a computer with a USB-C port, and half a dozen USB-C peripherals that I want to connect at the same time.

Before USB-C it was simple when I had way more devices than ports: I'd buy an inexpensive USB hub. The hub connected to the computer's type A port using the same kind of cable my peripherals used, and its several type A ports could then be used for those peripherals with that same kind of cable.

If the peripherals needed more total power than my computer's type A port could provide, I could buy an inexpensive powered hub.

Now, with USB-C, the inexpensive USB-C hubs connect to the computer with a C-to-C cable but usually have only one (or zero) fully functional USB-C port for peripherals. The rest are type A ports.

To actually get a hub that adds multiple fully functional USB-C ports to a computer, it seems that right now you have to buy an expensive Thunderbolt 3 or 4 hub.

The best I've seen for inexpensive USB-C port expansion is this [1] or this [2]. The first provides 4 downstream USB-C ports, but they are data-only, with no power or video support. The second is the same, except it adds a fifth USB-C port that appears to be power-input-only, which you can use with a USB charger so the hub can power or charge your computer.

[1] https://www.amazon.com/Minisopuru-Ports-USB-Hub-Chromebook/d...

[2] https://www.amazon.com/Minisopuru-Ports-USB-Hub-Chromebook/d...


In retrospect, there's no way that USB-C was worth it! Let's look at what it added over B…

- Reversible: Useless gimmick that nobody asked for

- DisplayPort alt mode: Useful in theory, but no phones support it in practice

- Symmetric: Host-side USB-C never ended up being mainstream anyway (and AB ports already existed)

- USB3: micro-B already had the "3.0 appendage", if anything turning it internal made it easier for manufacturers to silently revert to USB2

- Power Delivery: Qualcomm QuickCharge already existed and could have been standardized without a new connector


- Reversible: Important feature everyone asked for. One of the main applications for a device-side USB connector is plugging the device into a charger at night. That usually happens in a dark room, where reversibility is incredibly important. It was literally touted as the main reason why Apple shouldn't switch to micro-USB. But more importantly, micro-USB is a garbage connector even by non-reversible standards: it's harder than necessary to insert without looking, it breaks way too easily, and it seems to take very few connection cycles before it stops making a good connection.

- DisplayPort alt mode: Every laptop uses this. It's what lets us have docks with display outputs.

- Symmetric: Every laptop has host-side USB-C.

- USB3: The micro-B "3.0 appendage" is ridiculous and huge, and no device which uses micro-B ended up supporting it in practice (except for I believe a couple Samsung phones?)

- Power delivery: Qualcomm QuickCharge is proprietary garbage. It could have been standardized yes, but is there any reason to believe Qualcomm would have wanted that?

- And it turns out it's incredibly useful to have both Power Delivery and DisplayPort and data transfer in one cable. It allows the world we currently live in where you can have one dock connected via one cable which handles video, power, USB, networking, audio, etc. (And yes, docks are generally too finicky, there's a lot to complain about in terms of implementation. But the idea is solid.)


> It was literally touted as the main reason why Apple shouldn't switch to micro-USB.

Apple using it as an excuse has nothing to do with it being a problem in the real world.

> it breaks way too easily and seems to take very few connection cycles before it stops making a good connection.

Never even heard of this actually happening in practice at the time. How hard were you pushing them in?!

> - Symmetric: Every laptop has host-side USB-C.

A few laptops have it. No peripherals (outside of those docks) actually use it because the hubs barely exist and very few have more than one.

> The micro-B "3.0 appendage" is ridiculous and huge

So? It's still tiny compared to the cable itself, this isn't a real problem.

> and no device which uses micro-B ended up supporting it in practice (except for I believe a couple Samsung phones?)

Those devices don't support USB 3 over USB-C either. If anything, USB 3 support in peripherals has declined over the last few years.

> - Power delivery: Qualcomm QuickCharge is proprietary garbage. It could have been standardized yes, but is there any reason to believe Qualcomm would have wanted that?

QCQC over USB-C was a thing too, for a while. I guess we'll just have to wait for USB-D for fast charging!


> I guess we'll just have to wait for USB-D for fast charging!

Huh? USB-C cables already support mandatory 60 watts, optional 240 watts.
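For what it's worth, the arithmetic behind those numbers is just volts times amps; the cable-marking requirements below reflect my reading of the USB PD rules and are worth double-checking against the spec:

```python
# Sketch of the power math behind the USB-C Power Delivery levels
# mentioned above. A compliant full-featured C-to-C cable must carry
# 3 A; 5 A and the 240 W "EPR" range need e-marked (and for EPR,
# 48 V-rated) cables.

def pd_watts(volts: float, amps: float) -> float:
    """Power delivered at a given voltage/current pair."""
    return volts * amps

print(pd_watts(20, 3))   # 60 W: the floor for any compliant C-to-C cable
print(pd_watts(20, 5))   # 100 W: needs a 5 A e-marked cable
print(pd_watts(48, 5))   # 240 W: EPR range, 48 V at 5 A
```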


> USB3: The micro-B "3.0 appendage" is ridiculous and huge, and no device which uses micro-B ended up supporting it in practice (except for I believe a couple Samsung phones?)

Nearly every external portable HDD I've seen has these (and comes with a corresponding short USB 3.0 micro-B to A cable).


I'd like some kind of dumb USB-C: somewhat compatible with USB-C, but with only two or four wires and a simpler connector. MicroUSB is used in many devices because it's dirt cheap; USB-C is much more complex mechanically and will probably never be as cheap.

Of course it should be visually different, like a cable with two contacts on each side, so you'd understand that it won't provide full speed but will still work and charge.


> if there is some new better standard can we ever move to it?

It seems inevitable that phones and most tablets will move to all-wireless eventually; the only thing holding it back is faster charging. As I understand it, the EU standard doesn't prevent Apple from moving to a wireless charging method; the regulation only covers the case where you charge by cable.

So yes, since the new standard will likely be wireless, or semi-wireless (like MagSafe), Apple can probably move to it.

My pet theory for why Apple hasn't already moved to USB-C is that they didn't want to signal that USB-C will get long-term support on their phones. Sticking with Lightning forces peripheral makers to use Bluetooth or WiFi if they want a solution that works for all phones and is future-proof, and I think that's exactly what Apple wants. The problem is that the transition has been delayed much longer than they hoped, since wireless charging technology hasn't developed as fast as expected (see the last-minute cancellation of the AirPower charging mat for a dramatic example of that miscalculation).


Wireless charging is significantly less efficient than wired charging. I could rather see it being outlawed for that reason at some point.


From a practical point of view, does USB-C avoid the fluff problem with Lightning (the yearly clear-out with a toothpick)?


I have had to clean out my Pixel phone’s USB C port multiple times. It requires shaving the toothpick down to fit. I eventually bought an iFixit cleaning kit with some tiny brushes.


It seems to be less of a problem. The USB-C receptacle doesn't have the moving parts (the springs that retain the cable and the flexible pins that make electrical contact are in the plug instead). While it does have the little protrusion with the electrical contacts in the middle, that's less subject to wearing out or being destroyed by sideways force.

In general, the USB-C receptacle seems better designed for longevity.


> While it does have the little protrusion that has the electrical contacts in the middle, it’s less subject to wearing out or being destroyed by sideways force.

... which is pretty easy to achieve when trying to clean out a socket from pocket lint. It's better than micro and mini USB, yes, but a Lightning-style inversion would have been way more durable.


> In general, the USB-C receptacle seems better designed for longevity.

I swear every other comment on C vs Lightning has a completely different view on which is more robust than the previous


Huh? USB-C has a tiny, fragile tongue inside that you can very easily break off. Lightning is much, much stronger.


Replying here, but you all got tricked by Apple’s misleading design.

In any connector of this sort, there are two pieces: the plug on the cable and the receptacle on the phone. Lightning has a very visually appealing and physically durable plug, and that’s the part that you see. But this hides several aspects of the connector that make a poor functionality tradeoff.

On a USB micro B or USB-C cable, the plug is retained in the socket by little spring-loaded clips on the plug. On micro, they’re on the outside and very obvious, and they stop working well if the plug gets a bit bent. On USB-C, they’re in the concealed portion of the plug. In both cases, they are components that can wear out, and, when they do, you replace the cable and you’re in good shape. These spring-loaded clips mate with fixed notches in the receptacle, and those notches are much longer lasting.

Lightning does this the other way around. The plug has notches, and the springs are concealed in the socket. If the springs wear out, replacing the cable doesn’t help.

Similarly, the electrical contacts are springy in the socket. They are very easy to damage with careless cleaning.

Lightning has an additional defect involving arcing. If you look at a charging cable that's been used for a while, you'll see that one electrical pad on each side of the plug will be burnt. Presumably this is where the connection arcs every time it's unplugged. I'm not sure how USB mitigates this, but it seems to be less of a problem. Maybe the contacts are simply thicker and more durable.

Sure, USB-C receptacles have a tongue, and it's a weak point, but it's much stronger than the ten fragile moving parts inside a Lightning receptacle.


> Presumably this is where the connection arcs every time it’s unplugged. I’m not sure how USB mitigates this, but it seems to be less of a problem. Maybe the contacts are simply thicker and more durable.

My guess is that it's probably just the good old "different length pins" trick. A quick web search shows me that the VBUS and GND pads (which carry the power) on the socket are longer, so the rest of the contacts unplug first. And unplugging the CC contact means that the pull-down resistor on it disappears, which tells the source to stop sending power to the VBUS contacts before they start to unplug.
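That CC-based power sequencing can be sketched as a toy model. The Rp/Rd resistor values below are the standard Type-C terminations, but the single voltage threshold is a simplification: the real spec defines several detection bands, and modern sources often use current sources on CC rather than plain pull-ups.

```python
# Toy model of USB-C attach/detach detection via the CC pin.
# The sink presents Rd = 5.1 kOhm to ground on CC; the source presents
# a pull-up Rp whose value advertises how much current it can supply.
# The source only drives VBUS while it sees CC pulled low by Rd, so on
# unplug the (shorter) CC pad breaks first and power is cut before the
# (longer) VBUS/GND pads separate.

RD_OHMS = 5_100  # sink's pull-down on CC

# Source pull-up Rp (to ~5 V) per advertised current level
RP_OHMS = {
    "default_usb": 56_000,  # legacy 500/900 mA
    "1.5A": 22_000,
    "3.0A": 10_000,
}

def cc_voltage(rp_ohms: float, sink_attached: bool, vcc: float = 5.0) -> float:
    """Voltage the source sees on its CC pin."""
    if not sink_attached:
        return vcc  # no Rd to ground: CC floats up to the pull-up rail
    return vcc * RD_OHMS / (rp_ohms + RD_OHMS)  # simple resistor divider

def source_should_drive_vbus(rp_ohms: float, sink_attached: bool) -> bool:
    # Simplified: treat anything below ~2.6 V as "a sink's Rd is present".
    return cc_voltage(rp_ohms, sink_attached) < 2.6

if __name__ == "__main__":
    for label, rp in RP_OHMS.items():
        v = cc_voltage(rp, sink_attached=True)
        print(f"{label}: CC = {v:.2f} V, "
              f"drive VBUS = {source_should_drive_vbus(rp, True)}")
    print("detached:", source_should_drive_vbus(RP_OHMS["3.0A"], False))
```

With an attached sink the divider lands well under the threshold for all three Rp values, and as soon as Rd disappears CC floats high and the model cuts VBUS, matching the unplug sequence described above.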


Thanks for the info about the springy parts and where they are!


Reverse engineering of Apple Lightning



