That was already called Full Speed in the USB 1 days though. There was also a mode called Low Speed, which was 1.5 Mbit/s and meant for low-throughput devices; Full Speed was 12 Mbit/s. This way stuff like keyboards and mice didn't need to include the highest-end controller chips.
Calling it Full Speed was more a lack of vision than a deliberate attempt to confuse consumers, which is what this HDMI thing looks like to me.
So then when 2 and 3 came out they were forced to find superlatives: Hi-Speed, SuperSpeed.
The same happened with radio. High Frequency was up to 30 MHz. Then they found they could go higher: Very High Frequency, VHF. Then came more advances and Ultra High Frequency.
Eventually they gave up after SHF and EHF and started using band letters :)
Using relative terms for constantly changing technologies is just a bad idea :)
The newest version of USB seems to have solved this by labeling ports with the supported speed (5, 10 or 20 Gbit/s). Assuming they start doing the same with the cables, at least that part of the problem becomes tractable.
USB-C itself has nothing to do with data rate at all. It's really the combination of USB-C and the USB 3.x standards that made things hell. It was supposed to be one single cable that does everything, but in fact I now have far more cables that look almost exactly the same yet have dramatically different capabilities, and I have no idea which can do what.
But nonetheless, USB-C indeed introduced one issue that I never imagined: sometimes my phone decided that it should charge the charger instead of being charged...
I don't know, sometimes I just want a very thin cable that is flexible, mostly for charging. I bought a proper USB-C Anker cable for my phone early on, and replaced it literally within a few days: it was a big, properly shielded cable with zero flexibility, horrible for charging. I didn't care that it could technically do 10 Gbit/s; it just wasn't necessary.
In theory, the choice of connector (USB-C, or A/B in standard, mini, or micro variants) is orthogonal to the choice of version: there are USB-A and even USB-B connectors (with additional pins) that support USB 3.0 as well. There is a logic to this.
micro-USB requires a different, extended connector for 3.0 [1], and I don't think there is a mini-USB 3.0. USB 3.0 Type-A is generally coded with a different color and the extra pins are clearly visible [2], especially when you hold a 2.0 cable side by side. So it is all quite clear except for USB-C.
I'd be fine with charge-only cables if there was a distinctive, mandatory, universally-honored way of indicating that.
I deliberately carry charge-only cables when I anticipate encountering untrusted chargers, and I've marked all of mine with a band of red heatshrink on both ends.
For Type-A chargers they are readily available online and can be tested using the approach mikepurvis wrote in a comment parallel to this. Or you can make a clunky one yourself, as it's just two wires.
I have seen them called "data blockers" or "secure charging cables", but what you want is a female-to-male device so you can attach it to charging cables, not just bricks.
Type-C is harder. I have one from the early days of Type-C that doesn't do PD, so in the end it's only useful for phones. I haven't seen one that does Power Delivery.
It's just design by committee plus long-standing efforts at backwards compatibility. Also, the people writing the standards are far too familiar with them, and thus a bit lost when it comes to making practical decisions.
Whenever you make changes there will be compromises and someone will have reason to be unhappy.
There is no way that eliminating 2.0, in such a way that everything that used to rate as 2.0 now qualifies as conforming to some subset of 2.1, can be justified as backwards compatibility.
I don't think it's particularly odd that the specifications are supersets of old versions; indeed that feels pretty common in the standards world. IETF specs are maybe the odd ones out, where you typically have to read like ten different RFCs to get a good picture of some standard.
The people writing the standards are also the ones implementing them. That's the kickback. That's why every USB 3.whatever device suddenly became a USB4 one.