Incredibly, their language usage specifications doc https://www.usb.org/sites/default/files/usb_3_2_language_pro... begins with this:
> USB-IF emphasizes the importance and value of consistent messaging on USB product packaging, marketing materials, and advertising. Inconsistent use of terminology creates confusion in the marketplace, can be misleading to consumers and potentially diminishes USB-IF’s trademark rights.
I simply don't understand USB-IF's motivation to make this so confusing for everyone. Their board consists of Apple, HP, Intel, Microsoft, TI, Renesas, and STMicroelectronics, so it isn't like it's controlled by low end trashy cable manufacturers trying to make a quick buck from confused customers.
This is just like the 4G-5G-5Ge debacle. Both being motivated solely by financial gain.
You are forgetting 4G LTE... ^__^;
When I was growing up, we had USB, FireWire, and a couple size/speed variations. It was easy to understand and you could tell what a cable was by looking at it.
For people who grew up in this era, the current situation is super annoying. Perhaps the young people of today will view peripheral standards as some sort of super-obscure language that isn’t intended to be decipherable by laymen?
Muddying the waters is a diabolically genius way of attacking the generic cable manufacturers.
And we felt damn lucky to have them.
If we put it in that context, then this new re-branding has to be done for the same reason - so that the same companies can claim to support a "newer" USB standard (and thus a reason for you to upgrade to the new devices), even though nothing has changed.
Then at a glance there is a distinction between the comment itself vs. quotations.
In other words, it's a crapshoot anyway, so who cares? We'll just keep plugging until something sort-of works.
(do I sound crushed by the mind-boggling trans-galactic humongousness of the borderless expanse of human (and especially committee) stupidity?)
That is assuming all USB-C cables support all power combinations (they don't), all USB 3.2 speeds (they don't), and all Thunderbolt specs (they don't).
And it is precisely this comment that makes me think about why USB-C has won. Until the problem becomes so widespread that you have to test each cable to see whether it works as intended, people won't call for changes.
The point is you will be able to look it up. With cables you can't. I wish it was mandatory to publish the rated max speed, amps and max number of insertions on every USB-C cable. It's only 3 numbers - but there is no chance of it happening.
To think, memory used to be the number soup......
Well, you will if you end up trying to use a part which supports 2×1 with a part which supports 1×2, since they both support the same top speed, but together they only support the lower speed of 1×1.
But maybe there is an electrical engineering reason why it might be practical to do 1×2 but not 2×1 in some device.
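The mismatch described above is easy to model: a link can only run a lane/generation configuration that both ends support, so two parts with the same top speed can still fall back to the slowest shared mode. Here is a minimal sketch of that idea (an illustrative model only, not the actual USB 3.2 link training protocol):

```python
# Illustrative model only -- not the actual USB 3.2 link training protocol.
# Each part advertises the (lanes, gen) configurations it supports;
# the link runs the fastest configuration common to both ends.

GEN_SPEED_GBPS = {1: 5, 2: 10}  # per-lane signaling rate by generation

def link_speed(a_configs, b_configs):
    """Return the speed (Gbps) of the best (lanes, gen) both sides share."""
    common = set(a_configs) & set(b_configs)
    return max(lanes * GEN_SPEED_GBPS[gen] for lanes, gen in common)

# A "2x1" part: 1 lane of Gen 2, with fallback to 1 lane of Gen 1.
gen2x1 = {(1, 2), (1, 1)}
# A "1x2" part: 2 lanes of Gen 1, with fallback to 1 lane of Gen 1.
gen1x2 = {(2, 1), (1, 1)}

print(link_speed(gen2x1, gen2x1))  # 10 -- one lane of Gen 2
print(link_speed(gen1x2, gen1x2))  # 10 -- two lanes of Gen 1
print(link_speed(gen2x1, gen1x2))  # 5  -- only 1x1 is shared
```

Both parts advertise "10 Gbps", but paired together the only common configuration is one lane of Gen 1, i.e. 5 Gbps.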
USB 3 20Gbit PD 100W 20V
So they only have to put the speed class and PD or not PD. It would make things so much better.
Alas, the standard needs standardising...
Even this is technically incorrect: USB 3 Type-A sockets, though backwards compatible with USB 2 for female-to-male and male-to-female connections, have 5 more contacts than USB 2 Type-A sockets.
USB-C-3 for the new connector format.
USB-C-2 for some (many) cellphones.
USB-Micro-2 (for many phones).
USB-Mini-2 (older phones)
- Edit -
I changed my mind, the connector is more important than the wire version.
Remember how, in the Nokia days, mobile phone manufacturers all made incompatible chargers and data cables? Eventually they were hit with regulations. I think this is similar, but a more advanced plot against consumers, with plausible deniability.
Add to that the fact that everyone is skimping on the actual USB 3 standards, and the large number of shoddy products on the market that don't do what they say they do (due to poor quality control, or making wires longer than spec while expecting them to work to spec), and we have to rely on guys like Benson to crowdsource working and non-working products.
That is exactly what the "SuperSpeed USB" branding is all about. The technical spec is USB 3.whatever, and the brand name is "SuperSpeed USB". But neither the press nor vendors seem to be able to stay in line and consistently use only the branding, so here we are.
This was the nice aspect of Firewire - once 1394B was out - everyone referred to them as Firewire 400 and Firewire 800. No ambiguity.
Heck even Thunderbolt does this better - 1 and 2 are interchangeable device/cable-wise, and just support higher speeds, but it's just a new 'version' and it just doubles the bandwidth.
With USB you have to read all the fine print to make sure you get what you want/need.
> NOTE: SuperSpeed Plus, Enhanced SuperSpeed and SuperSpeed+ are defined in the USB specifications however these terms are not intended to be used in product names, messaging, packaging or any other consumer-facing content.
Yes, guess how super “SuperSpeed USB 20Gbps” is
4.1 will be what was USB 1.1
4.2 will be what was USB 2.0
4.3.0 will be what was 3.0.
4.3.1GXY1 will be what was 3.2 1x1 (which was 3.1 Gen 1 (which was 3.0))
4.3.1GXY2 will be what was 3.2 2x1 (which was 3.1)
4.3.2GXY1 will be what was 3.2 2x2
4.4.0 will be double the speed of 4.3.2GXY1 (what was called 3.2 2x2) but will max out at 25mm (~1") cables and provide no power at all.
4.4.1 will be like 4.4.0 but will expand the length to 30cm (~1')
4.4.1ZORB22 will be like 4.4.1 but also carry up to 5W power for bus powered devices.
Go ahead and laugh, and then realise this isn't really that different than what we have now from USB-IF.
e.g. USB 3.2 has the same connector as USB-C
edit: I personally care far more about the connector than exact specs of each version
* USB 3.2 Gen 1 5 Gbps (formerly USB 3.1 Gen 1) (formerly USB 3.0)
* USB 3.2 Gen 2 10 Gbps (formerly USB 3.1 Gen 2)
* USB 3.2 Gen 2x2 20 Gbps
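The rename chain above is confusing enough that it practically begs for a decoder ring. A small lookup table, built only from the equivalences listed in this thread, captures it:

```python
# Decoder ring for the USB 3.x renames discussed above.
# Keys are names seen in the wild; values are (speed in Gbps, current official name).
USB3_NAMES = {
    "USB 3.0":         (5,  "USB 3.2 Gen 1"),
    "USB 3.1 Gen 1":   (5,  "USB 3.2 Gen 1"),
    "USB 3.2 Gen 1":   (5,  "USB 3.2 Gen 1"),
    "USB 3.1 Gen 2":   (10, "USB 3.2 Gen 2"),
    "USB 3.2 Gen 2":   (10, "USB 3.2 Gen 2"),
    "USB 3.2 Gen 2x2": (20, "USB 3.2 Gen 2x2"),
}

def decode(label):
    """Translate any of the historical labels into speed + official name."""
    speed, official = USB3_NAMES[label]
    return f"{label} = {speed} Gbps (officially: {official})"

print(decode("USB 3.0"))  # USB 3.0 = 5 Gbps (officially: USB 3.2 Gen 1)
```

Three distinct speeds, six names, and that is before adding the "SuperSpeed" marketing layer on top.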
Literally everyone that came up with this ridiculous naming scheme and literally everyone who approved it should never be allowed to participate in a working group ever again. This level of incompetency is just absurd. These people cannot be allowed to pollute critical standards with their terrible ideas.
USB-C is a connector, so you cannot say "has the same connector as USB-C".
It is the only connector for USB 3.2, and a connector that is compatible with USB 3.1 Gen 2.
Edit: I meant it is the only connector for USB 3.2 "Gen 2x2", and compatible with 3.1 Gen 2 (now called 3.2 Gen 2)
From what I've been able to find, not even that is reliably true. Looks like the older connectors can be labeled USB 3.2 as long as they use the "x1" scheme in the label and are capable of reaching the speed of that scheme.
Hopefully USB-C connectors are tougher than micro-USB's.
This is what happens when standards bodies have perverse incentives to confuse. See the SD Association for even worse branding.
Without the standards group:
If manufacturer A is confusing, then they can sell more than manufacturers B and C. So in turn manufacturers B and C will be confusing as well in order to keep up. This is bad for consumers and generates ill will against USB in general (globally bad for A B and C)
With the standards group:
Manufacturers A B & C all agree on something non-confusing, and in order to use the USB trademark, they can't deviate from what they agreed upon. USB is now sunshine and rainbows so USB customers are happy in general (globally good for A B & C).
Something has broken down in the system if they make names that are confusing since USB being confusing is bad for everybody, and none of the manufacturers get a leg up on the others if they all use the same confusing language.
It does Thunderbolt 3, USB-C 3.1 Gen 2, and up to 100W of power delivery. It's a shame it's only available in one kinda-short length, but I think that's a technical limitation (any longer would require an active cable for Thunderbolt).
The big mistake IMO was a lack of obvious keying. Very little stops me from using that 2w-capacity keyboard cable to try to charge my laptop. If we're lucky, there's signaling at the device to tell me what's wrong, but if they had made a different plug on the 100W cable, it would eliminate a lot of cockpit errors.
Having one cable for everything is a huge red herring. If you need a keyboard and a power brick and an external hard drive, you need three cables. It doesn't matter if they're different, as long as the OEM isn't pulling some Apple-style "one single port is enough" BS.
No thank you. The drama around USB-C is insanely overblown; the vast majority of use cases are fine with the 60 watt + 480 Mbps required minimum configuration for USB-C cables. In the few rare exceptions where you need >60 watts or higher speed, you're generally going to have purpose-built hardware anyway, so devoting a special cable to that isn't such a big deal.
But on the other hand doubling the power wire thickness and leaving the signals wires untouched shouldn't add more than 10-20% to the price of the cable.
> The big mistake IMO was a lack of obvious keying. Very little stops me from using that 2w-capacity keyboard cable to try to charge my laptop. If we're lucky, there's signaling at the device to tell me what's wrong, but if they had made a different plug on the 100W cable, it would eliminate a lot of cockpit errors.
Every cable supports at least 60 watts. It's not a big deal if you use the wrong one to charge with.
That means a device can ignore the signalling entirely and just keep taking more power till the supply turns off, and then scale back 10 percent.
More and more Android phones do that, and it works very well.
Only disadvantage is if you plug it into a hub, it can cause the whole hub to go out for a bit. Devices solve that by not using the above algorithm when they detect any active signalling.
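The probe-and-back-off charging heuristic described above can be sketched roughly like this. This is my own toy model, not any phone's actual firmware; `supply_ok` is a hypothetical stand-in for real hardware that reports whether the supply is still delivering at a requested current:

```python
# Rough sketch of the probe-and-back-off charging heuristic described above.
# `supply_ok(current_ma)` is a hypothetical stand-in for real hardware:
# it reports whether the supply still delivers at the requested current.

def find_max_current(supply_ok, step_ma=100, limit_ma=5000):
    """Ramp the draw until the supply gives out, then back off 10%."""
    current = 0
    while current + step_ma <= limit_ma and supply_ok(current + step_ma):
        current += step_ma
    return int(current * 0.9)  # scale back 10% for headroom

# Example: a supply that silently tops out at 2.4 A.
print(find_max_current(lambda ma: ma <= 2400))  # 2160
```

The hub problem mentioned above falls straight out of this: the "probe" step is exactly what browns out a shared supply, which is why a device would skip this loop whenever it detects active signalling.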
> USB 3.2 doubles down on this confusion. 5Gb/s devices are now "USB 3.2 Gen 1."
So when they eventually release USB 3.3, my rusty old memory stick will be auto-upgraded to USB 3.3 Gen 1? I like it!
I'd go as far as to say that articles like this actually help keep the confusion up, because they keep bringing up terms like "USB 3.2" as something that consumers should care about, when it is something that should be avoided.
> What this branding meant is that many manufacturers say that a device supports "USB 3.1" even if it's only a "USB 3.1 Gen 1" device running at 5Gb/s. Meanwhile, other manufacturers do the sensible thing: they use "USB 3.0" to denote 5Gb/s devices and reserve "USB 3.1" for 10Gb/s parts.
Both manufacturers are doing the wrong thing: referring to the USB standard version at all. Devices should not say that they support "USB 3.0" or "USB 3.1 gen 1". They should say they support "Superspeed USB" or "Superspeed USB 10 Gbps".
Superspeed might be a silly name, but the branding there is almost perfectly unambiguous, and nicely explicit about the data rate in a way that "USB 3.2 gen 2x2" is not.
What is most surprising is that USB-IF isn't able to keep the vendors in line despite being an industry consortium and owning all the relevant trademarks and other IP. They really should start enforcing their policies more aggressively.
"SuperSpeed" is useless in terms of branding because even technical users, much less your average consumer, have no idea what that means. Is "SuperSpeed" faster than "High Speed" or "Full Speed"? (Answer: "Full Speed" < "High Speed" < "SuperSpeed".)
"20 Gbps USB" is better than "USB 3.2 Gen 2x2", sure, but that's not exactly a high bar. It just means USB has failed so badly with version numbers that people would rather just quote a spec figure.
Wishing for everyone to switch over to "SuperSpeed" and "SuperSpeed+" will not make it so. Fact is, both users and manufacturers are accustomed to "USB 3.0", "USB 3.1", etc. And it's not hard to see why people prefer simple version numbers as opposed to meaningless labels.
USB-IF should have simply released USB 3.1 and USB 3.2 without this ridiculous business of retroactively renaming older versions.
I would point out that this is not a new thing. The situation was exactly the same with USB 2.0: technically it was perfectly correct to call a compliant full-speed device "USB 2.0", because 2.0 did indeed include full-speed mode (as presumably do all the USB 3.x standards).
Those are just some of the questions non-technical consumers are expected to know the answers to when they are buying USB products.
This is absurd.
What, I don't know. But their Joker-like approach to USB standards and practices is costing us all dearly.
It is (or has been) handy for connecting gadgets and drives over the years, but maybe we should go back to PS/2 and just use USB for "front of the machine" ports.
USB 3.2 Gen 1 = (3 + 2) x Gen 1 = 5Gbps
USB 3.2 Gen 2 = (3 + 2) x Gen 2 = 10Gbps
USB 3.2 Gen 2x2 = (3 + 2) x Gen 2 x 2 = 20Gbps
Pretty Clever ( Or Dumb ).
To make matters worse, they still have not made USB-C mandatory for USB 3.2. So you could have a USB 3.2 Gen 2 host with a USB-A cable that doesn't support the old USB 3.0 SuperSpeed (5Gbps), so it will fall back to USB 2.0 speed (480Mbps). Or a USB-A cable that doesn't support USB 3.1 SuperSpeed+ (10Gbps), which is also called USB 3.1 Gen 2 and is now called USB 3.2 Gen 2, so it will fall back to USB 3.0 SuperSpeed (5Gbps), which was called USB 3.1 Gen 1 and is now called USB 3.2 Gen 1.
You also have crappy USB-C cables that do not conform to the USB-C spec, and the consequences depend on the quality of the cable. In some extreme cases, you could have a USB-C cable that does not support USB 3.2 Gen 2x2 because somehow only the pins on one side work as intended.
Although things have gotten better since his post, and Google has been putting some pressure into fixing it, USB-C is still a bloody bag of hurt. So in case anyone is still rooting for an iPhone using USB-C without MFi, I seriously hope people will reconsider. Using MFi would defeat the purpose of USB-C, since you would need an MFi cable, which is one way of saying the current Lightning-to-USB-C cable spec and cables are fine. And in case you are wondering about the iPad Pro, you should take a look at the controller size; it is roughly 5 times bigger.
Or we make a USB 4.0 that standardises on USB-C and forces a much more stringent specification on power delivery, speed and quality. But as we should all be able to tell, this isn't a technical problem; it's more of a political, design-by-committee, marketing problem.
Would love to see "USB 4" .
No ".0" afterwards, just the simpler numeral.
And if you need a bigger connector, then buy the "USB 4 XL connector" cable.
I'd rather go for a (possibly faster) Ethernet with a redesigned smaller connector supporting POE. Smaller devices such as mice and keyboards could talk at PHY level while smarter ones with beefier microcontrollers could employ more and more layers of the network stack so that routing/tunneling data over the network when needed would become trivial. Security wise, there would be no physical difference between an external local disk and a NAS, but software could be used to filter devices from the network. The upside is that there would be a single stack for everything: from storage to security cameras, audio systems, data acquisition gear, sensors and actuators etc. All ports would be electrically insulated, the stack is already open and wonderfully documented, the hardware is already near realtime and there are no royalties to be paid to use that.
> My hope is that in the future — distant future perhaps — your computer will only need one wired communication technology. It will provide power on the connector like USB and FireWire, so it can power small peripheral devices. It will use IP packets like Ethernet, so it provides your wide-area communications for things like email and Web browsing, but it will also use Zeroconf IP so that connecting local devices is as easy as USB or FireWire is today. People ask me if I'm seriously suggesting that your keyboard and mouse should use the same connector as your Internet connection, and I am. There's no fundamental reason why a 10Mb/s Ethernet chip costs more than a USB chip. The problem is not cost, it is lack of power on the Ethernet connector, and (until now) lack of autoconfiguration to make it work. I would much rather have a computer with a row of identical universal IP communications ports, where I can connect anything I want to any port, instead of today's situation where the computer has a row of different sockets, each dedicated to its own specialized function.
Sadly, instead of using existing successful networking systems and automating the configuration, we've created a new union-of-all-possible-protocols with a new connector (or 5) and made understanding the compatibility matrix a nightmare.
Here's an idea to push for this technology. I have no way to develop this, having neither the technical knowledge nor the money required, but comments are welcome anyway. Essentially I would:
1- Design a smaller connector that could host all Ethernet pairs plus power supply.
2- For testing purposes, build a small bridge board (a matchbox-sized black box) between the socket and the original Ethernet plug, plus a nearby USB port for power alone. Ethernet pairs would pass through untouched (or possibly replicated through a switch chipset + magnetics) while the bridge board would contain any circuitry necessary to provide power.
3- Now we have some cheap hardware which is easy to replicate and use on every platform, including SBCs.
Some proof of concept would be needed to demonstrate that, so here's my question: how hard would it be to satisfy these points and build some hardware (mice, keyboards, audio etc.) that can use the new network standard to show its benefits to the industry?
That's what we need more of: a coherent vision to guide design and development. Early Apple had that, it feels like, then lost the coherence in the 2010s. The phrase "design by committee" is often used to characterize what not to do, so it seems design by a small focused group or individual tends to go in the right direction. But there must be real-world examples of successful "design committees" or "standards bodies" who avoided the gravity/entropy towards producing monsters of complexity and compromise...
In this spirit I think that 'USB 4' really needs to be 'USB 4G'. Then you can have phones and the connectors use the same number. My former workmate would understand it. A '4G' cable for a '4G' mobile phone signal, that would be simple and the least techie people could understand it whilst people on HN faceplant in perpetuity.
USB '5G' would come along with '5G' phones. Everyone could be happy, and cables could be fairly random; just try a selection until you get one that works, the clue being that the cheaper ones for power adaptors don't let you connect your '5G port' on your computer to your screen or move data to your phone, much like mini USB now. Who reads labels anyway?
This seems like a very bad idea, the two technologies don't move at the same speed and are totally unrelated. I think it would be even more confusing for consumers.
Ethernet at least had 10Mb, 100Mb 1000Mb/Gig, etc.
> USB-IF’s recommended nomenclature for consumers is “SuperSpeed USB” for 5Gbps products, “SuperSpeed USB 10Gbps” for 10Gbps products and “SuperSpeed USB 20Gbps” for 20Gbps products
You have to live with calling it a "SuperSpeed" cable instead of a USB cable, though. :p
They should have visibly differentiated charging and data cables, and required all cables with USB connectors at both ends to be rated for 100W, and for this new 20 Gbps (2x2) speed if they also support data.
Cables with USB-C and another USB port at the end should be required to meet both the power and data requirements of that connector. (so no charging compatibility connectors). USB-C to USB-A should have been restricted to 3.0 only.
Alt modes can use a plain USB-C cable (like Thunderbolt 3 does), but cannot place additional restrictions on the cable if they want USB connectors on both ends.
If the cable has any other connector on the end (such as DisplayPort), it would be up to the creator of the DisplayPort alt mode to indicate if there are any differing requirements.
Likewise, if the cable is integrated into a device, it can meet its own needs: be charging-only, use only USB 2, etc.
This imho would solve nearly all the problems - the problem is that I can pick up a USB-C to USB-C cable and it could literally be one of (at least) six things internally - with no markings to differentiate.
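The "(at least) six things" claim is easy to sanity-check by enumerating the common variants. This is my own rough taxonomy, not an official USB-IF classification, and it deliberately ignores Thunderbolt:

```python
from itertools import product

# Rough taxonomy of what a USB-C-to-USB-C cable can be internally.
# My own grouping, not an official USB-IF classification.
data_tiers  = ["USB 2.0 (480 Mbps)", "Gen 1 (5 Gbps)", "Gen 2 (10 Gbps)"]
power_tiers = ["3 A (60 W)", "5 A (100 W, e-marked)"]

variants = list(product(data_tiers, power_tiers))
for data, power in variants:
    print(f"{data:22s} + {power}")
print(len(variants), "variants, before even counting active Thunderbolt cables")
```

Six externally identical cables from just two axes of variation, and nothing on the plug tells you which one you are holding.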
To me it seems the "press" is distorting this press release.
USB 3.2 Gen 1
o Product capability: product signals at 5Gbps
o Marketing name: SuperSpeed USB
USB 3.2 Gen 2
o Product capability: product signals at 10Gbps
o Marketing name: SuperSpeed USB 10Gbps
USB 3.2 Gen 2x2
o Product capability: product signals at 20Gbps
o Marketing name: SuperSpeed USB 20Gbps
USB 3.2 Key Messages
• Defines multi-lane operation for new USB 3.2 hosts and devices, allowing for up to two lanes of 10Gbps operation to realize a 20Gbps data transfer rate, without sacrificing cable length.
• Delivers compelling performance boosts to meet requirements for demanding USB storage, display, and docking applications.
• Enables end-users to move content across devices quickly, conveniently and without worrying about compatibility.
• Backwards compatible with all existing USB products; will operate at lowest common speed capability.
One for power, one for data. It could be any scale, as long as products like monitors would need data > x, phones will charge faster up to power y, laptops need power z, etc. Two numbers are easy enough to emboss, engrave, print into a connector, cable or device...
Does anyone care about Gbps for USB? I've never hit the bandwidth limitation on USB, and I see USB Type-C as just an annoyance making me buy new cables.
> And 20Gb/s devices will be... "USB 3.2 Gen 2×2."
Pretty straightforward: USB 3.x supports 5 * 2^x Gbps. What we really need is MORE retrobranding!
- USB 2.0 becomes USB 3-3.4
- USB 1.0 full speed becomes USB 3-8.7
- USB low speed becomes USB 3-11.7
You have if you've tried to use multiple external monitors over USB (a la MacBooks).
Really, USB speed limits are mostly bound by flash memory limits so manufacturers struggle to justify a replacement. I guess faster charging is nice?
You haven't met new Windows (Dell, etc.) laptops yet? If you want multiple monitors with a single connector (there's often just one), it makes all the difference whether you can connect two 4k monitors @60 Hz or not.
External SSDs are probably the only time it matters, or even multiple drives connected through a USB hub.
I've also run into this using external ethernet adapters, which I still want when I'm copying large files or streaming video via airplay or steam. My last 3 laptops were missing inbuilt ethernet ports. USB 2, while much faster than the wifi speeds available at the time, only supports half the bandwidth of gigabit ethernet.
Those are hardly "ridiculously arcane". Full speed is not really relevant, so the only thing consumers need to know is that SuperSpeed > High Speed, which I don't think is a very high bar.
But fine, if we actually see everything branded as "SuperSpeed USB XXGbps" then it's overly long but it's understandable.
I am not convinced we won't see a hideous hodgepodge of different branding guides, though. Some saying "SuperSpeed USB __Gbps", some saying "USB 3.2", some saying "gen __", some saying "SuperSpeed+"
Therefore, if a motherboard has two USB 3.2 Gen 2 10 Gbps ports, can it be labeled as USB 3.2 Gen 2x2?
How does all this interact with USB 3.2?
But they need a new reason to sell you more cables, dongles, and adapters.
Maybe they can call the next one USB 5G! That not only helps confuse the branding even more, but . . . It worked for AT&T.