USB4 Specification (usb.org)
332 points by ingve on Sept 3, 2019 | 314 comments



Reading through this, I'm left unable to see the underlying forces. None of the people working on this spec are idiots, so how did they arrive here? What are the forces and unstated requirements behind the scenes that I don't understand that have led to a USB4 spec with this many potential flavors and absurd naming conventions to express them? Is it something about cost? Or patents? Or some random company who won't budge on something but needs to be included?


> Is it something about cost?

It's hugely about: cost, and the desire of the USB forum to ship a single "does everything" protocol that keeps a large chunk of the industry from choosing to stay on some old version of the spec forever.

Tons and tons of manufacturers of cheap low end devices out there aren't willing to replace a $0.10 USB controller with something that costs 30x as much when they have no need for the high-end speeds of Thunderbolt 3, fast charging, DisplayPort alternate modes, etc. Consequently the spec can either contain a lot of optional features they can choose to ignore, or it can mandate support for all of the 40 Gbps, active-cable, fast-charging, etc. options, and watch as devices partition themselves between the few that need all of that stuff and the majority that ship simple micro-USB 2.0 hardware.

If they did the latter, then USB 4 is essentially pointless: that's the exact situation we were already in 5+ years ago, except the cheap side was called "USB 2" and the expensive side was called "Thunderbolt". All we've done is rename the expensive side to "USB 4".

If they do the former, they ship a "spec" so full of options that it's already nearly impossible to tell exactly what devices or features any given USB C port will support without recourse to a specifications sheet.

Fundamentally, "one port and protocol to rule them all" is probably just a bad idea. Trying to serve the needs of both cheap slow devices and expensive fast ones puts the forum in a contradictory position, and the only "benefit" from combining the standards that serve each niche is opaque ports and cables that consumers can't easily reason about the compatibility of.


>Fundamentally, "one port and protocol to rule them all" is probably just a bad idea

I disagree. The situations are:

1) ports are the same, protocols match: works

2) ports are the same, protocols mismatch: won't work, some user confusion

3) ports are different, protocols match: could work, but won't without an adapter

4) ports are different, protocols mismatch: won't work. Adapters might exist, but those won't work either.

I'd much rather deal with situation 2 than 3 and 4. Display adapters already have situations where you can cobble together DVI and VGA ports with adapters and still fail to get a proper connection, so it's not really a fix anyway.

The USB foundation should just come up with better branding around what kinds of things a port supports. There's no reason we can't have a Thunderbolt icon next to a USB port. It's strictly better than a new physical port, IMO.


You forgot case 5): ports match, some protocols match, the port works at much reduced capability.

This is great for backwards compatibility, but can confuse users. You could have a TB3 drive attached to a TB3 computer by a USB cable with type C connectors that only operates at USB 2 speeds. It will silently fall back to a very slow transfer.

I don't know how the committee could have done much better though; this kind of fallback is important. Some mandated labeling would have helped a lot though.


Indeed, I've just had exactly this case with a high speed USB 3 camera. Tested it on 2 PCs and a Pi 4, both directly and via a USB 3.1 hub, and it just wouldn't deliver the theoretical frame rate. It finally turned out the 'premium' USB-C cables we were using are actually only USB 2.

From pictures on Amazon [0], a lot of cables being advertised as Type C USB3 are actually missing the 'SS' labelling, despite being advertised as 'Superspeed' (i.e. 5Gbps) in the description.

[0] https://www.amazon.co.uk/Anker-PowerLine-Durability-Devices-... - in this case, it currently provides a link "There is a newer version of this item" which links to a USB2 version(!).


All of this is a software UX problem. When I plug something in, that something should tell me what kind of connection it established, whether it's any good, etc. Try the fast modes opportunistically and tell me when it has to fall back. That's it. If it works, I'll know anyway.
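A rough sketch of how little it would take, assuming Linux (where the kernel already exposes each device's negotiated link speed through sysfs; the notification part is just a print here):

    import glob, os

    # The kernel reports the negotiated link speed of each enumerated USB
    # device in Mb/s (1.5, 12, 480, 5000, 10000, ...) via sysfs.
    for speed_file in glob.glob("/sys/bus/usb/devices/*/speed"):
        base = os.path.dirname(speed_file)
        try:
            speed = open(speed_file).read().strip()
            product = open(os.path.join(base, "product")).read().strip()
        except OSError:
            continue  # hubs/entries without a product string
        # A real implementation would compare this against what the device
        # claims to support and pop a notification on mismatch.
        print(f"{product}: link negotiated at {speed} Mb/s")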

Even a simple micro-USB on phones needed a pop-up to select what functionality the user wants to activate (charging, file transfer, host mode, debugging).

The problem is not new, but most software makers are stubborn miscreants that hate good UX (simplicity, discoverability, accessibility/introspectability, extendability). They usually champion one aspect to the total detriment of the others - e.g. GNOME 3, phone OSes.


The problem is they think users are too stupid to understand that much information and detail. And for the most part, they're right: users are very stupid.


Hiding information from them won’t help them to become less confused.


Tell that to the UI "experts".


> It will silently fall back to a very slow transfer.

Why would it be silent? The computer would hopefully show a small notification that the transfer is happening slower than it would with a newer cable


With the older cable it wouldn't even know what the other side could be capable of.


That's actually untrue because for USB2 vs USB3 my computers already showed notifications that the USB2 cable is slower than necessary. The computer KNOWS the other device, it has a driver for it, and the driver can say "this device is capable of more".


Sometimes yes, other times it might be possible in another way, but a lot of drivers are generic and don't have an exhaustive list of all devices in existence. Devices only need to be identified as a certain (generic) type.


It's still certainly possible for a device to report its capabilities, and plenty of devices seem to correctly do it already.


It would know, the wSpeedsSupported field on the device tells the host which speeds the device is capable of.


But it could at least report the capacity of the cable. That's still useful when you get a badly advertised cable that you thought could do more, or worse, a fake cable.


This. Apple does something similar if you try to pull too much power


Only in the power case -- how can it know that you didn't intend to use USB 1? It wouldn't be able to tell what the other side is capable of in that case, hence the fallback.


It could know via USB descriptors, and match the device capabilities from the descriptors against what the cable allows.
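For the curious, a rough sketch of that lookup with pyusb, assuming a USB 3.x-capable device (USB 2.0-only devices will just stall the BOS request) and a placeholder vendor ID:

    import usb.core  # pyusb

    dev = usb.core.find(idVendor=0x1234)  # placeholder VID; pick your device
    if dev is None:
        raise SystemExit("device not found")
    # Standard GET_DESCRIPTOR control request for the BOS descriptor (type 0x0F).
    bos = dev.ctrl_transfer(0x80, 0x06, 0x0F << 8, 0, 256)
    total = bos[2] | (bos[3] << 8)
    i = 5  # skip the 5-byte BOS header, then walk the capability descriptors
    while i < total:
        length, cap_type = bos[i], bos[i + 2]
        if cap_type == 0x03:  # SuperSpeed USB device capability
            speeds = bos[i + 4] | (bos[i + 5] << 8)  # wSpeedsSupported bitmap
            print("device claims 5 Gbps support:", bool(speeds & 0b1000))
        i += length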


Does any computer actually do such a thing?


Windows 7 did for USB2 devices plugged into a USB1 port: https://www.makeuseof.com/tag/usb-speederror/

If the computer can identify the device’s capabilities (using some kind of bus command, I’m not very familiar with the USB protocol) separately from speed testing the cable+device combination, no reason why it couldn’t work for USB 2/3/4.

Of course, this approach relies on the device accurately reporting its capabilities, which may not be the case for the more cheap-and-cheerful gadgets.


Windows 8 (and perhaps 10, can't remember) alerted me when I used a USB 3 thumbdrive in a USB 2 port


Windows 10 shows a notification whenever my USB-C portable monitor is using USB video (via the monitor's DisplayLink chip) rather than DisplayPort.


Just try to get some quick charge with third-party equipment going these days. It's a lottery.


> No reason we can't have a thunderbolt icon next on a USB port.

In fact, my ThinkPad X1 Yoga has two USB-C ports with Thunderbolt icons next to them. They support USB 3.1 Type-C Gen 2 with Thunderbolt 3 (including DisplayPort).


How many DisplayPort channels does each port support?

How many PCI-E lanes does each port get?

Do the ports share the channels/lanes?

Do they both support Power Delivery? How many Watts does the thinkpad need to charge?

Even with 'clear' labelling, it can still be a crapshoot as to the full capabilities of the device.


Well, this is where we get the phrase "Plug and Pray", isn't it! ;-)

Right now I have an Uptab Mini-DisplayPort adapter in one port, and a StarTech full size DisplayPort adapter in the other. The Uptab adapter has its own USB-C input for power, so that is where the power supply is connected.

The mini and full size DisplayPorts are each connected to a 4K monitor, and I use all three - the two externals and the ThinkPad's WQHD. This all works great when the Uptab is in the rearmost USB-C port and the StarTech is in the other port. But if I switch the two around, one of the monitors goes into low resolution mode!

This kludgy configuration is just because these were the adapters I had on hand. There are a number of adapters that plug into a USB-C port and give you two mini-DisplayPort outputs - I have been meaning to try one out.


> They support USB 3.1 Type-C Gen 2 with Thunderbolt 3 (including DisplayPort)

And therein lies the problem.

Used to be, we could call a USB port a USB port. Everybody knew that if the computer had the port, it could work with the peripheral; if the computer didn't have the port, it would not work with the peripheral. If the peripheral required different functionality then it required a different port that was equally straightforward and obvious.

Now, it's all supposed to run over USB - at least, over some USB. Consumers go out and buy a device and then are confused when it doesn't work. They know their computer has a USB port, but that's not enough. The port no longer represents a definite protocol, or even a definite power supply capacity. Are they supposed to check to see whether each of their USB ports says "USB 3.1 Type-C Gen 2 with Thunderbolt 3 (including DisplayPort)" next to it (which it doesn't)?


Pardon my ignorance, but isn't (2) really dangerous when it comes to things like power delivery?


It would be if the different protocols didn’t have the same baseline power specifications and require negotiation between controllers to supply anything else. This is generally the case across everything sharing the type-C connector, though occasionally you will see a company make the terrible decision of trying to pass off their out-of-spec implementation as a “proprietary port,” as Nintendo did with the Switch.


Is there something special about the USB-C connector on the Switch?



There were some slightly off-spec things in current draw but none of that is relevant to the article you linked.

That pin is supposed to use data signals of less than two volts. The Nyko charger uses nine volts. The Switch has no over-voltage protection on that pin, and breaks at more than six volts.

The part about a slightly thinner plug isn't relevant at all.


(2) has killed people, such as the patient who was electrocuted in 1986 due to an IV pump power supply using the same connector as EKG leads. (Casey, "Set Phasers On Stun" 2nd edition, pp.177-180)


USB doesn't automatically deliver maximum power to whatever is plugged into it. The client device has to ask for it. The default 5v 500mA isn't going to hurt anybody.


Neither is 5V 2A, but that's mostly due to skin resistance, which is really high at such low voltages.


5V at 1000A is completely safe too. The only thing that matters with electrical safety is voltage, and anything under 50V is generally safe for humans, all because of skin resistance.


No, because power delivery is negotiated up from safe, USB 1-type base levels. If the device doesn't request it, it won't get delivered.

The only outliers are moronic implementations like Nintendo's, where they ask for capabilities, receive the charger's capabilities and then just start pulling power at hardcoded levels even if the charger doesn't say it supports those power levels. But there's nothing you can do against that - even with a different port shape, a broken implementation can still fry your device.
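A toy sketch of what a well-behaved sink is supposed to do (the advertised levels below are made-up numbers; the broken implementations are the ones that skip the check and pull their hardcoded level anyway):

    # Source advertises its capabilities as (voltage_V, max_current_A) pairs.
    source_pdos = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0)]  # made-up charger

    def pick_contract(wanted_volts, wanted_amps):
        """Accept a higher-power contract only if the source explicitly offers it."""
        for volts, max_amps in source_pdos:
            if volts == wanted_volts and max_amps >= wanted_amps:
                return volts, wanted_amps
        return 5.0, 0.5  # otherwise stay at the safe USB default

    print(pick_contract(15.0, 3.0))  # offered -> (15.0, 3.0)
    print(pick_contract(20.0, 5.0))  # not offered -> falls back to (5.0, 0.5)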


You started with "no" (ie, it's safe)... then ended with "can still fry your device".

I infer your point: the shape of the connector in theory is decoupled from risks in power delivery... but in reality the prevalence of cheap knockoffs and "moronic implementations" makes me question the wisdom of combining [potentially dangerous levels of] power delivery with data transfer.


Didn't the charger from the Pixel implement this incorrectly and send full power down the line even if a device couldn't handle it?


The Pixel charger was fine AFAIK. The Nexus 5x and 6P had rogue chargers


There's nothing dangerous about drawing less current than is available, or a device trying to upgrade voltage and/or current from a base level (5V 500mA) but being disappointed when the source can't oblige.


> 4)ports are different, protocol mismatch : won't work. Adapters might exist but those won't work either.

Why would adapters necessarily not work in this case? It looks like you're claiming that translating one protocol to another is always impossible, but that can't actually be the idea... right?


Most of these kinds of adapters (USB-C/USB-A, DVI-D/HDMI) don't actually do any protocol conversion; they merely take advantage of fallback/cross-compatibility that the endpoints actually implement and rearrange some wires. Adapters that need to do something more complicated (like DVI-D to VGA) have to be powered. And in some cases, like USB 3 to Thunderbolt or DisplayPort, it's simply impossible for the host controller to provide the necessary IO. Thunderbolt expects to provide a full PCIe interface to the CPU, which doesn't fit within how USB 3 controllers communicate with your CPU. DisplayPort requires a connection to your graphics adapter, which is also not under the USB 3 controller's purview.


That would be case 3...


No, in case 3 there's a protocol match. In case 4 there isn't. I'm asking why an adapter is supposedly unable to deal with a protocol mismatch. Dealing with mismatch is the whole idea of an "adapter".


If an adapter exists we can categorize it as case 3 for the sake of this discussion. I think you're needlessly hung up on the word protocol when that really doesn't matter.


> I think you're needlessly hung up on the word protocol when that really doesn't matter.

In that case, you've described two situations:

1. Ports are the same.

2. Ports are different.

This doesn't seem like a particularly fruitful analysis.


"could work but won't without an adapter"

The adapter is the downside. And there's no guarantee that the perfectly suitable adapter actually exists.


Isn't that too late? They aren't going to publish any new specs after this.


Wait, isn't the whole problem with the cables and not the ports?

It matters little what subset of USB any given device supports. No reasonable person will expect, say, high-speed video from a cheap mp3 player, just because it's the same USB as their screen. However, it's perfectly reasonable to use the cable that came with aforementioned mp3 player to connect a computer to a screen, and expect the video signal to be carried through.

I don't see a good reason why cables shouldn't be required to carry everything USB supports.


> Wait, isn't the whole problem with the cables and not the ports?

> It matters little what subset of USB any given device supports. No reasonable person will expect, say, high-speed video from a cheap mp3 player.

No, it's both. Laptops ship that have one USB-C port that will carry DisplayPort alternate mode, and another, equivalent looking port, that won't, because it's not wired to the GPU. Consult a manual to figure out which port or ports are wired for video.

Your Nintendo Switch will work with an DisplayPort demuxing dock to output an HDMI signal, but put the forthcoming Nintendo Switch Mini, with an identical looking port in the same dock, and no video will output, because the mini lacks a DisplayPort crossbar mux IC connected to the port, and only uses it for charging and USB-C peripherals. This is a frequent source of complaint on gaming forums.

Laptops ship that will charge off of one USB-C port but not another, because the manufacturer saved a few cents by only wiring one for the optional high-speed charging spec.

Etc. Etc. Etc. While some things obviously don't support some USB 3 or USB-C optional features (obviously my USB-C keyboard does not output DisplayPort), consumers are often frustrated in other cases when their intuition and the equivalent looking ports suggests something should work when it doesn't.


> Laptops ship that have one USB-C port that will carry DisplayPort alternate mode, and another, equivalent looking port, that won't, because it's not wired to the GPU.

The solution is to mark ports with symbols. It already happened with USB 2.0. I remember charging and non-charging ports there, with a label on the charging one. Furthermore, people quickly learn which port does what on their laptop.

On the other hand, is this phone I'm writing on capable of video out? No way to know unless I google the spec. I think it is, but I don't remember. It's not like I'm doing that every day, or every month.


Remember when the ports that did a thing, only worked with other cables that were meant to do that thing, and generally if the cord fit, it was in the right spot? That was great, now I have to look at tiny little symbols, and hope I know what each obscure symbol means. The progress is amazing.


I literally have a box of pre-USB-C cables. S-Video, VGA, RCA×2, coax, RCA×3, VHS hi-fi, optical, HDMI, DVI-A, DVI-D, and a LOAD OF SIMILAR crapola! The box is 12" x 18" x 12" and it's FULL!!! Good riddance!


Won't that just be replaced by the same box full of USB cables: some that can only do DisplayPort, some that can only do networking, some that can only do storage, and some that can do a combination of the previous, probably at different speeds... The only difference would be that when you need a specific one, you would have to try them all until it works ;)


Depending on the cost, I could dump all cheap cables and replace them with high-quality ones that can handle everything.

That one cable to bind them all.


Now imagine buying a $70 cable each time you need a new one.


If each of the types of cables is probably less than $5, how do you get a $70 price tag for the combined all-capabilities cable? I'd be surprised if it would cost more than $10.


Because that's how American brick-and-mortar retailers make their profit: they take a cable that you can buy on Ebay for $5 and mark it up to $50, while keeping their prices on big-ticket items (like a TV or laptop) somewhat competitive because people price-shop on those things, but not on accessories.


@temporal

Have you seen the prices for VGA cables, Cat 5, etc. at Walmart, Best Buy, etc.? They absolutely will jack the price up because they know the only people who buy that stuff there are ignorant or too desperate to wait for shipping.


Brick-and-mortar prices are constrained from the top by competitive pressure too. They won't be selling full-capacity USB cables with 10x markup, because they'd lose even more business to on-line stores.


Nope, and I recently experienced this fun first hand.

My work desk has a built-in USB-3 C port, and says it supports USB Power Delivery. The desk is plugged into the wall. Excellent, I can use that port to charge the MBP, right? No! Plugging into that port will cause OS X to believe it's connected to a power supply — so the battery is "charging" according to the OS. Even more fun, is after leaving the laptop in this setup for some time: there's no visual indication in OS X that the battery is about to die and AFAICT, OS X would happily take no action, since it believes it's connected to a power supply¹.

Why? Because the USB "power delivery port" only delivers 10 W. (A normal MBP power adapter, connected to a wall, can deliver like 80 W. How much the laptop needs is dependent on use, but in my case, it's >10 W.) And AFAICT, the desk is compliant with the letter of the spec; USB PD ports are allowed to negotiate whatever power they can deliver; it just allows a device to draw as much as possible/needed, but doesn't guarantee any max rate.

Why anyone puts a max 10 W port on a desk basically hand-tailored for the SV 4-ft desk open office "experience" is beyond me, though.

¹and, technically, it is.


> there's no visual indication in OS X that the battery is about to die and AFAICT, OS X would happily take no action, since it believes it's connected to a power supply.

AFAIK, macOS is aware of three different states: 1) no power connected, 2) connected to power (but not charging), 3) charging battery.

2) is indicated by the black background of the battery charging icon, and 3) is indicated by a white background of the battery charging icon.

In the case where the power source can't deliver enough energy for charging (in your case), it is in state 2); and I'm pretty sure macOS always takes action when the battery is about to die, regardless of the state. It'll back up the RAM when the laptop's battery level is dangerously low and shut itself off, and the state will be restored later.


> AFAIK, macOS is aware of three different states

So, you're correct, and I actually did not know this when I wrote the post you've replied to.

But the point still stands: whichever of icons 2) or 3) you get, it still isn't going to alert you to the issue, because both are normal icons. 2) appears once the battery is full, since then you are connected to power but not charging. That is, both icons also appear during a normal, actually-charging charge cycle, so neither indicates a problem.

> I'm pretty sure macOS always takes action when the battery is about to die unrelated to the state

As I recall it, it was awfully close. But that's not to say that there wasn't still time, and that the OS couldn't have acted. (I didn't drive it to empty to find out!) So, it could be that I'm wrong on this point.


> Why anyone puts a max 10 W port on a desk basically hand-tailored for the SV 4-ft desk open office

If the people are the same, you have answered your own question. Open Offices are not a bright idea in the first place; not sure why you would be expecting anything other than similar ideas :)


That sounds like an issue with the Mac or MacOS mainly. I have a ThinkPad which charges over USB-PD, but if it can't negotiate 45W with the charger, it doesn't show it is connected to AC.


>Why anyone puts a max 10 W port on a desk basically hand-tailored for the SV 4-ft desk open office "experience" is beyond me, though.

Seems simple to me: it costs a lot less to put a cheap, crappy 10W port there, and then they can advertise that their overpriced desk has this fancy feature. The idiot managers at tech companies (probably non-technical HR types) who make the purchasing decisions don't know any better.


You could have the same problem if you connect your MBP to a 45W(MBA) charger


> Wait, isn't the whole problem with the cables and not the ports?

When buying cables from a reputable manufacturer there doesn't seem to be much of a problem. There are only two figures that matter: the speed they are rated for and the amperage. A good manufacturer will state both clearly on the package.

But in reality, if you don't want to pay the $50 that Apple or Belkin charge for a cable, it's a nightmare. (I assume the manufacturing cost couldn't be more than a tenth of that.) I've yet to buy a third-party USB-C cable that has lasted for 100 insertions (they are supposed to be rated at 10K insertions) - and I must have bought 20 of them by now.

I have no idea how to buy a reasonable quality USB-C cable. I've resorted to treating them like consumables - buying them 5 at a time, and ensuring I always have a few spares in the house. It's doubly frustrating because the OEM cables that come with the devices are invariably good, so they are out there, I just don't have a clue how to find them.


Same problem, now your cable costs 10x more


Since there would be only one type of cable, economies of scale would be massive. Even if they didn't eat all the cost, so what. Cost of the cable is a fraction of the cost of a device, and the whole point of doing a universal protocol is that you can use cables from one device with a different device.


The 40 Gbps speeds of Thunderbolt 3/USB 4 are sustainable over only very short runs with passive cables -- anything longer than about 50 centimeters requires active circuitry.

There is no level of manufacturing scale that will make an active signal cable cheaper than an equivalent-length passive one, and cable manufacturers, who live and die on razor-thin margins, are never going to be convinced to forgo the cost savings of building cheaper passive cables when their clients' devices don't even use the higher speeds the active circuitry is required for.


Huh. Fair point about data rates, but then again, I'm not sure if forcing manufacturers into a single, active cable would be impossible. Standardizing a single cable type shouldn't impact the margins on cable sales - whether the cable costs $0.1 or $10, every manufacturer will have the same costs (in the limit).

> to forgo the cost savings of building cheaper passive cables when their client's devices don't even use the higher speeds the active circuitry is required for.

That's why I think standardization should disallow this very attempt at costs savings. A given device may only need a passive cable, but as a user, I want to be able to use that cable with a different device. Moreover, cables are often bought separately (e.g. replacements). It's already a huge problem with USB 2/3 - buying a cable is a lottery wrt. transfer rates and charging amperages.

(The last time I lost that lottery I happened to be in China, so I went to a local seller with a voltage measurement app, and started testing cables they offered one by one. After finally finding a cable that could push proper 2+ amps, I bought something around 60 of them, because back home, everyone sells USB cables packaged in hard plastic, so I wouldn't even be able to run this test.)


> I'm not sure if forcing manufacturers into a single, active cable would be impossible.

> ...

> That's why I think standardization should disallow this very attempt at costs savings.

There's a somewhat famous Common Lisp FAQ along the lines of "Why didn't the standardization committee just force implementers to do X thing?"

And the answer is: your premise is wrong. Standards bodies don't have the power to force anybody to do anything. If implementors don't like your standard, they'll just ignore it. Standards live or die by their ability to raise consensus.

If Generic Device Manufacturer #4823738 looks at the USB 4 standard and decides there isn't a big market for USB peripherals whose 2 meter charging cable (say, a gamepad that charges via a fairly long 5 W USB, like a gaming console uses) runs $75 because it requires fairly heavy gauge wire to additionally support (unneeded by the gamepad) 100 W charging and has to contain (unneeded by the gamepad) 40 Gbps active signalling circuitry, they're just not going to implement the bloody USB 4 standard. They're not going to implement it if there's a $1 to be saved by going some other route, honestly.

Instead of the "universal protocol" dream dying on the rocks of a hodge-podge of subtly incompatible cables, you've simply killed it by convincing manufacturers to ship USB 2 devices until the end of time, instead.

The standard committee can impose onerous, financially punitive requirements on anything calling itself USB 4, but they can't then make manufacturers adopt USB 4, so in the end they can't solve this by forcing anyone to do anything, which is why the entire idea of "one universal protocol" is hamstrung by its own internal contradictions.


Let's say the USB standard mandated that all USB-C cables be fully-featured, such that they can't be manufactured for less than $10 each. What's to stop a vendor from selling cheap "cell phone charger" cables? They could print "Compare to Belkin USB-C cable* *not affiliated with Belkin or licensed by the USB Implementers Forum" on the box, and maybe even be able to survive a lawsuit. And even if they're not in the clear legally, these cheaper cables would be found in every "black-market" outlet, which these days includes Amazon, your local gas station, and many big box stores. How long would the USB-IF be able to maintain that stance, while their licensees clamor to be allowed to sell cheaper authorized cables?

I think the solution is to improve branding. When I look at the USB-C cables in my bag, I can't tell at a glance which ones are full-featured: one of them has a tiny logo that maybe indicates something like that, but it's not obvious or prominent. I tend to end up plugging in the wrong one and discovering that the device isn't charging, performing as well as expected, etc. If the USB spec defined consistent, prominent branding for full-featured cables, this would be much less likely to occur. (yes, I know I should replace the inferior cables, but I only use them when I work away from home, which is infrequent enough that it hasn't crossed the threshold to be worth the time)


And - fun addendum to this - Active Thunderbolt cables don't work as basic USB cables - https://appleinsider.com/articles/17/08/15/psa-thunderbolt-3...


There are trade offs other than just cost. Want a long, thin, flexible cable to charge your laptop? Then you prioritize power delivery at the expense of data speed. If you insist on 40 Gbps capability, something else has to give.


So maybe the laptop cable would have to become thick instead of thin, to support the data rates. There's infinite number of tradeoffs you can make to shave that extra fraction of a cent, or to segment your market, but at some point somebody has to unilaterally declare, "this far, no further!". A standards body is a good candidate for that someone.


But I really want that long flexible charging cable (that 3 m one I got was my best purchase ever), so I'm afraid that, as much as I too am in love with the idea of a single cable that works everywhere, I won't be happy if my charging cable becomes a stiff, inflexible wire.


Scaling factors don't work that way. The economies of scale are where the majority of the market is, and that's low spec cables. So low spec cables will always have better economies of scale than high spec ones, and will have cheaper materials costs.

Any company that tries to compete by upgrading all their output to the top 25% of the market, paying materials costs at the top 25% rate across the whole production run, is committing economic suicide. All it takes is one competitor to not do that and only address the low end, and the whole strategy collapses in ruins.


> All it takes is one competitor to not do that and only address the low end, and the whole strategy collapses in ruins.

That's why I was talking about putting this in the standard, so that only one type of cable is considered compliant. This way, all competitors would have to upgrade too.


And people of limited income that just need basic capabilities get screwed. They could have had significantly cheaper cables that would serve them just fine, but we're not going to let them even though they are the significant majority.


This argument can be used to justify any amount of wasteful production and environmental damage. It needs to die.

Prices are to some extent arbitrary. I'm pretty sure people with limited income would be able to buy devices with slightly more expensive cables just fine, especially if you remove the backpressure of a reduced capability alternative keeping the price higher.


Higher performance cables, with more robust materials, shielding, etc are almost certainly more wasteful and environmentally damaging than cheaper ones.


> It's hugely about: cost, and the desire of the USB forum to ship a single "does everything" protocol that keeps a large chunk of the industry from choosing to stay on some old version of the spec forever.

Why is it so bad if they stay on the old version? AFAIK, USB already bends over backwards to support every older version of the spec. IMO, it would be great if USB 3.x supported all the latest extensions, USB 2 was still kinda fast, and USB 1 could still be useful for that really old device you have sitting around (or if you just want to be able to trickle charge a phone with the only micro-USB cable you have around).


I feel that all of this would be MUCH less of a problem with a simple code required on each USB cable stating the exact specs it supports. This code could be put into a tool which then decodes it to show exactly what that cable supports.
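Something like this, say (a purely hypothetical encoding and decoder, not anything the USB-IF actually defines): print a short code on the plug for rated speed, rated current and alternate modes, and let a tool expand it:

    # Hypothetical cable marking: "<Gbps>-<amps>A[-DP][-TB]", e.g. "10-5A-DP".
    def decode(code):
        parts = code.split("-")
        return {
            "data rate": parts[0] + " Gbps",
            "max current": parts[1],
            "DisplayPort alt mode": "DP" in parts[2:],
            "Thunderbolt/USB4 40G": "TB" in parts[2:],
        }

    print(decode("10-5A-DP"))  # a full-featured cable
    print(decode("0.48-3A"))   # a plain USB 2.0 charging cable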


This works until some manufacturer decides it's easier to claim to support everything, ships a simple controller that does that, it gets widely used, and a lot of hardware becomes mysteriously incompatible as a result.

Trusting vendors to be honest in declaring capabilities is not always the best of all possible approaches, historically.


Make 'em get certified; if they're caught cheating they lose their license to stamp the logo.


As I understand it, one of the reasons earlier iterations of the USB standard were popular with manufacturers was that they were logistically and administratively easy. You didn't have to license a bunch of expensive patents, and certification was quite easy.

Going against that might be possible, but it strikes me as perhaps difficult to get implementors to go along with.


Do you really think a Chinese manufacturer of cheap cables cares if they’re displaying various logos with the proper license or not otherwise misleading customers? Look up the whole “CE” vs “China Export” thing...


What's strange is that stuff gets imported and sold in the west. It used to be illegal and you couldn't get pirate products in shops but now Amazon somehow just does it flagrantly. Doesn't customs ever seize a container of cables with the wrong logos on them anymore?


How would customs know? They would need to test them, which seems impractical.


As if a logo on a cable is important.


For DisplayPort cables, if they're not certified, you run the risk of strange hardware faults at best, and presumably smoking hardware at worst.

USB4 is tremendously more dangerous if the cable does something really wrong.


The problem is that you don't know if the logo is valid. Anyone can just print it on their product. Even buying from Amazon won't save you all the time.


To me it'd have been immensely helpful. If there had been 2 manufacturers with a logo (or a clear spec), I'd certainly have bought from them.


When it so blatantly doesn't work, it'll get called out in reviews and possibly delisted. It's really easy to stamp the actual capacity on a cable, and there's not much incentive to lie.


With a mix of capabilities, it may not be as simple as "doesn't work". Imagine a cable that 95% of reviewers say works just fine, but 5% say doesn't work at all. It maintains a 4.7 star rating or whatever.

I'm suggesting that they could mostly do fine with lying blatantly about capabilities in hardware. It would take very close analysis to find that it only had some of the capabilities advertised.


I would agree with this idea… except device manufacturers can't even manage to not put the port in backwards, or print the labeling on the wrong side of the connector. (Yes, the standard mandates a particular orientation for the ports. But there's so many non-compliant devices out there, good luck knowing that.) I don't think they'd get such a thing even remotely close. There's just too much low quality crap out there.


I thought this was the idea behind the USB-C eMarker.


> watch as devices partition themselves between the few the need all of that stuff, and the majority that ship simple micro-USB 2.0 hardware

What's wrong with that? Just create a little spec addendum for USB 2 that officially allows the use of modern physical connectors. It would be so much better in any metric other than deliberate customer confusion than a big, bold "USB $some_high_number" claim with practically invisible fine print admitting that it only supports some obscure subset of the new specification that happens to be exactly USB 2.

The partition happens no matter what.


> It's hugely about: cost, and the desire of the USB forum to ship a single "does everything" protocol that keeps a large chunk of the industry from choosing to stay on some old version of the spec forever.

Yep. Significantly, their incentives are the exact opposite of mine as a consumer.


"one port to rule them all" would work fine if the computer could just show a popup explaining why your peripheral wasn't running in the mode it's supposed to.


Back when I wanted to do Computer Engineering, I more-than-skimmed-less-than-read the USB 1,2,3 specs and the PCIe 3 specs. The contrast was startling. The USB specs were a complete mess. The PCIe specs were not. This extended from the highest top level decisions (understanding the benefits of packet switched networks) to the lowest (missing figures, ambiguous and redundant terminology, etc).

Something was systematically different between the USB-IF and the PCI-SIG, and whatever it was persisted over many years... and likely continues to this very day, if outward appearances are any indication. If anybody has insight into what makes these groups different, or even just speculation, I'd love to hear it.


That's not a very good comparison. USB covers up to the application layer, while PCIe is much more low level than that.

USB is a great standard. Or at least, was until 2.0. It was ahead of its time in performance and the same interoperability principles carried it forward for over 20 years. If you compare the first USB standards to other standards of the era, the amount of thought that has gone into it is fantastic. The fact it has no rivals even today is telling. Compare it to the clusterfuck that's Bluetooth, which took well over a decade to get to a point where its kinda reliable (and it can still vary between implementations).

Unfortunately USB became too complicated with the various modes and backward tidbits it now supports. Also, the rates it now supports have taken it to a whole new level of implementation complexity (ICs, board design, power options, etc).


> USB is a great standard. Or at least, was until 2.0. [...] Unfortunately USB became too complicated with the various modes and backward tidbits

Fortune had nothing to do with it.

USB's deliberations on packets were distinctly underdeveloped. PCI's were well developed. USB got it wrong. PCI got it right. The compatibility complexity explosion in USB and lack thereof in PCIe is a direct result.

This pattern continues across every single stratum of the standard that I dug into. Why?

(Every single stratum except the app layer. As you note, PCIe does not operate there and USB does.)

> It was ahead of its time in performance

I thought USB was always in the value segment and never competed on performance. That's the stated intent and that's what I remember: it was never the fastest external port on my computer.

> If you compare the first USB standards to other standards of the era, the amount of thought that has gone into it is fantastic.

My entire complaint compares it to PCIe and notes the comparative lack of thought.

> Compare it to the clusterfuck that's Bluetooth

Agreed, Bluetooth is worse. Maybe clusterfuck is the default state of affairs.

The question, then, is if there's anything to be learned from the USB/PCIe dichotomy. Is there an Engineering Steve Jobs keeping the PCI-SIG in line? Does it have fewer members? Is having one dimension on which you do not compete in the value segment critical to maintaining overall quality? I feel like there's a case study to be had, but I don't know enough to rank the causes or generalize with any certainty.


Parent post complained about "absurd naming conventions". Those were there in USB 2.0: Low, Full and Hi speed (and, with USB 3.0, Super speed) are a source of pure confusion. The messaging regarding speed versus USB revision is confusing to the public as well. Bus-powered versus self-powered is also impenetrable terminology to anyone who hasn't read the specification.

There are also some dubious technical decisions, even in USB 2.0. Toggle bits -- sequence numbers that have degenerated to a single bit -- are a frequent source of headaches for device implementors, but don't provide the intended integrity improvement. The VID/PID design seemed like a good idea in the USB 0.9 days, but it soon became clear that a large number of implementors would be making very different devices based on the same silicon, and the standard never evolved. Instead of developing a proper Battery Charging device class, USB-IF gave their seal of approval to the goofy schemes the wallwart manufacturers had come up with.

And "USB as she is spoke" is worse: high-power bus-powered devices claiming to the self-powered in their descriptors, Hi speed devices with ceramic resonators, hubs that source current into their upstream ports, devices with USB A receptacles and A male to A male cables, USB-RS232 bridges that require drivers despite the existence of the CDC ACM device class, crimped shield termination on A to B cables that will satisfy EMC standards at the time of manufacturing but not a month later when the cable reaches the consumer... all bearing the USB logo... and USB-IF does nothing to address any of these situations, only caring about getting their annual blood money from licensees.


> USB is a great standard. Or at least, was until 2.0. It was ahead of its time in performance and the same interoperability principles carried it forward for over 20 years.

Firewire was better


> Firewire was better

Firewire was faster but it wasn't better. It could only handle the minority usages (external storage), but not the majority ones (mice & keyboard - which were always wired back then).

Firewire also had hellish compatibility, whereas things like USB HID was amazingly plug & play in an era where that really didn't exist. And early on in Firewire's life it made the mistake of changing physical connector and not being backwards compatible.


> Firewire was faster but it wasn't better. It could only handle the minority usages (external storage), but not the majority ones (mice & keyboard - which were always wired back then).

I'm not sure why you say it could not handle mice and keyboards - I'm pretty sure it could, it just was never used for them due to license costs.

> Firewire also had hellish compatibility, whereas things like USB HID was amazingly plug & play in an era where that really didn't exist. And early on in Firewire's life it made the mistake of changing physical connector and not being backwards compatible.

I never experienced compatibility issues with it. The S800 standard was backwards compatible; sure, they used different connectors, but you could have a passive cable that converted an S800 port to S400 and plug that into a computer that only supported S400.


FireWire wasn't trying to support mice and keyboards. Nor was it trying to be all things to all users. It was trying to be an ultra-fast interface. And FireWire absolutely was backwards-compatible with an adapter. Still is. I never had any compatibility issues with FireWire.


Can you expand on that? IIRC it's not that simple, there are trade-offs for both solutions. Quoting wikipedia's Firewire article:

>USB requires the presence of a bus master, typically a PC, which connects point to point with the USB slave. This allows for simpler (and lower-cost) peripherals, at the cost of lowered functionality of the bus.

https://en.wikipedia.org/wiki/IEEE_1394#Comparison_with_USB


In the beginning, USB1/2 vs Firewire was a much clearer comparison.

Firewire could power a 2.5" spinning hard disk. (USB could not provide enough power.)

You could hook two macs together with firewire, and turn one into a hard disk via target disk mode. (peer to peer)

Firewire could transfer video from cameras without dropping frames. (guaranteed bandwidth)

On the other hand, USB was always the cost king.

I believe there was a $1/chip royalty for firewire, and that limited its growth.

USB in practical terms has caught up.


The other difference was the Firewire did a lot more in that chip, basically offloading a lot of work from the CPU, so it was higher-performing and lower CPU usage.

USB was made mostly by Intel, and they intentionally made it "dumb" so that the CPU had more work to do, in order to push high-performance CPU chips. But of course, the big factor was that license cost, which made Firewire devices cost more, whereas USB was very cheap to implement.


Firewire sucked for simple low bandwidth devices like mice and keyboards though. Even though the connectors were about the same size it really was quite different from USB. Firewire was mostly about bulk data transfer and streaming. It was much more specialized than USB, and it's really not a surprise that it died off once USB got fast enough to do the same streaming, even if it wasn't as good at it.


> Firewire sucked for simple low bandwidth devices like mice and keyboards though.

Did it? I know it was not used for it but I think this was more because of the patent license cost and the relatively low cost of mice and keyboards which made the license cost of $1 a big chunk of the total cost.

> It was much more specialized than USB

Not sure what you mean by this. If you mean less mainstream, yes, it was less mainstream, but still better. Support for DMA, memory-mapped devices, daisy-chaining, peer to peer. It really was revolutionary and still is.


The aforementioned PCI Express to some extent feels like "FireWire done right", mainly because the frame formats are strikingly similar. On the other hand, FireWire, like pre-3.0 USB, is not truly packet-switched, but uses a shared bus with TDMA-ish timeslots (and the whole mechanism of arbitration and root election is hellishly complex).


We're really going with a protocol that involves constant polling (rather than interrupts) as being ahead of its time? :)


The USB hardware "polls" by acting as a master. It's a master-slave protocol which is just fine for simplicity and bus utilization. The polling is abstracted from the software with the concept of hardware endpoints (mailboxes, really). How are you going to implement interrupts over a serial bus, unless it's multi-master (very complicated implementations, slow)? You don't have direct interrupt lines like in PCI.
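You can see that abstraction in the descriptors themselves: an interrupt IN endpoint just declares how often it wants to be polled (bInterval) and the host controller does the rest. A quick pyusb sketch (the vendor ID is a placeholder):

    import usb.core, usb.util  # pyusb

    dev = usb.core.find(idVendor=0x1234)  # placeholder; any HID device works
    cfg = dev.get_active_configuration()
    for intf in cfg:
        for ep in intf:
            # bmAttributes bits 0..1: 0=control, 1=isochronous, 2=bulk, 3=interrupt
            is_interrupt = (ep.bmAttributes & 0x03) == 0x03
            is_in = usb.util.endpoint_direction(ep.bEndpointAddress) == usb.util.ENDPOINT_IN
            if is_interrupt and is_in:
                print(f"interrupt IN endpoint 0x{ep.bEndpointAddress:02x}: "
                      f"host controller polls it every bInterval={ep.bInterval}")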


I don't have a strong opinion on the topic (I never really bothered to look under the hood but from a purely user perspective USB has been quite serviceable for me so far) but I think you're begging the question:

>You don't have direct interrupt lines like in PCI.

Surely the people who've designed USB could have opted to add an interrupt signal if they had deemed it necessary. Sure it would've added one more signal but given that there were only 4 pins on the original USB it doesn't seem overwhelming.

That being said, I agree with you that dismissing polling wholesale without digging deeper is quite silly. It can be a problem when it increases latency or wastes CPU cycles, but as far as I know latency isn't much of an issue in most uses of USB (even for things like keyboards, mice and controllers it's usually dwarfed by the latency of the video pipeline), and I doubt anybody handles the USB phy in software, so CPU usage shouldn't be a worry.


The whole point of USB is reduced wiring. Side channel interrupt would not have made sense.

"Polling" doesn't involves the host CPU, adds no latency. It is done by the USB controller, which sends poll packets to the device. This is all a philosophical decision by USB which is master/slave and doesn't support anything asynchronous by the slave.

The master can choose to ignore certain slaves on the bus if it desires, which isn't always possible with an asynchronous slave. Recall that many devices dangle off a single USB chain.


> it can be a problem when it increases latency or wastes CPU cycles

I think it wastes power. On a setup with interrupts, you could clock gate the host controller and if the device needs to say something it can raise the interrupt line and wake up the host. I don't think there's a way to clock gate a USB1 controller and then do e.g. wake on LAN.


PCIe does interrupts over a single serial lane just fine.


Conventional PCI can also do in-band interrupts (the mechanism is exactly the same), but many devices do not support that.

The reason is that PCI is multi-master by design and PCIe uses duplex packet-switched serial links. There is no master who decides who gets to use the bus at a given moment (for parallel PCI there is, but it is a relatively simple circuit that is conceptually part of the bus itself, not of any particular device connected to it).


> None of the people working on this spec are idiots

There are a lot of very smart people in the world that lose sight of simple concepts and their value, however.


I've witnessed some Standards negotiation in a previous job, involving the Khronos OpenMAX standard (which is DirectShow/GStreamer for mixed HW/SW multimedia codecs).

My company and another had opposing solutions to a problem, which made opposite tradeoffs and opposite implementations (ours worked better for a more HW-focused solution, theirs for a more SW-focused solution)

How does one choose? In the end, it's backroom negotiation, diplomacy, and compromise. In other words, politics.


In other words, and as the IETF demonstrates, consensus.


I suspect the underlying force is "A Committee".


One smart person will find the most elegant solution to a problem.

A committee of smart people will find the most elegant solution to fuck up a problem.


That kind of perpetuates the myth of the solo inventor as the only true innovator, when in reality it tends to be groups of people, formally or informally affiliated, that move the ball forward on science, engineering, etc. Like the teams of people employed by Edison, who was as much a project manager as he was an inventor. Or look at something like the Manhattan Project, where a vast team of very smart people worked together. Or the Bletchley Park folks, of whom Turing may have been the most prominent but by no means the only smart person to help solve the problem of decoding German communications. He worked closely with others who offered their own significant improvements to the bombe, including three Polish cryptanalysts (most notably Rejewski) that Turing worked with to further its development.


Perhaps it’s more relevant to consumer facing products that require a balance of features, performance, and user experience to be good. I think that’s harder for a committee to do than scientific work. While USB isn’t a singular product, it is a component of consumer products, however. I wonder if much of the success of Apple was due in large part to Steve Jobs’ heavy personal involvement at the design stage, and his famous rough-edged character.


In a committee of smart people everyone has a different problem.


As someone who has been to USB-IF committee meetings, back around 2006:

There are some smart people there; unfortunately, some are old-timers from large orgs who were pushed there "to do the least harm".

They couldn't have cared less how bad some of the proposals were, as long as they were in the required format.

We tried to fight a bit, but it was a lost cause :\


Can someone in the USB4 working group actually, well, speak out?


I can tell you war stories of USB-IF, but only from back in 2006. Prob nothing changed.


I am not sure they are on HN.


> None of the people working on this spec are idiots, so how did they arrive here?

I’m convinced it’s so they can advertise as supporting USB4 when they really only support the slowest (aka cheapest) method


Design by committee at its finest.


Profits are the reason. With so much confusion the safest bet for the consumer will be to use the cabling from the vendor of the device.

Vendors in turn can charge more.


The real crimes are:

1 - non-mandatory labeling. What does a given port support? What about a given cable?

2 - the name "USB4 Gen 3×2". Honestly I have never had the faintest idea what any of the various absurd names invented by the USB IF might mean. Superspeed? High speed? USB 3.1?


Oh, it's very simple.

USB 3.0 was renamed to 3.1, but that was confusing, conflicting with USB 3.1, so it was rerenamed to USB 3.2 Gen1, so that people understood that 3.0 was ACTUALLY the first generation of USB 3.2

But that caused problems with USB 3.1, which was faster than USB 3.2 Gen1, so it was renamed USB 3.2 Gen2, so that it crystal clear was faster than USB 3.0/3.1/3.2 Gen1.

But that caused problems with USB 3.2, because 3.2 is just two lanes of USB 3.2 Gen2, so 3.2 was renamed to USB 3.2 Gen2x2 so people knew it was TWICE as good as USB 3.2 Gen2.

The problem now is that USB4 eliminates all that clarity, so right now the big marketing push is to clarify it as USB 3.2^2 Gen1: The Reckoning.

To resolve the conflicts of historical naming with USB 1 and 2, those will be renamed USB 3.2 Gen √0 and USB 3.2 Gen √-1/pi.


You got some of that mixed up.

First there was USB 3.0. Technically the standard included older speeds, but people called the new one "3.0" and all was good.

Then 3.1 came out. But what about the poor manufacturers with 3.0-speed ports? Well, someone decided they could call those ports "3.1 Gen 1".

And it was all downhill from there. 3.1 Gen 1 became 3.2 Gen 1, and now becomes USB4 Gen 1. They added more "Gens", even ones where the word "generation" makes no sense. Now the number tells you nothing about speed, and there are two separate nomenclatures for speed that are both confusing.

And I'm still not sure if they did or did not ever implement Gen 1x2...

(It being USB4 instead of USB 4.0 is another stupid poke in the eye on top of everything else.)


> being USB4 instead of USB 4.0 is another stupid poke in the eye

This. I wish to nominate the lot of it for BJAODN


That's utter baffling madness! I wish that was some kind of joke instead of actually being a semi-serious explanation most of the way through.


The "×2" label is really confusing. I am a little disappointed that they still use it after "USB 3.2 Gen 2x2". if a device/motherboard has 2 of such port, either "2 of USB 3.2 Gen 2x2" or "USB 3.2 Gen 2x2x2" will looks like a joke.


The ports are supposed to be labeled using friendly customer names like “SuperSpeed USB”. https://en.m.wikipedia.org/wiki/USB_3.0#USB_3.2 See “USB-IF recommended marketing name”

* 5 Gbps transfer rate: “SuperSpeed USB”

* 10 Gbps transfer rate: “SuperSpeed USB 10 Gbps” - applies to both Gen 2x1 and Gen 1x2.

* 20 Gbps transfer rate: “SuperSpeed USB 20 Gbps”


I think the quotation marks should go around the phrase "friendly customer names".

I have never been able to differentiate between "Full speed", "High speed" and "Super speed", though I guess they are each different from "Low speed". Of course there are several speeds all labeled "SuperSpeed+".


5Gbps, 10Gbps, and 20Gbps seem like pretty clear names to me.


And then you connect two devices labeled with "SuperSpeed USB 10 Gbps"... and it turns out that only 5 Gbps works, because one device was Gen2x1 and the other Gen1x2?


Lately it seems like it would be most efficient to introduce the optional syntax from some modern languages. How about: "U?S?B?".


Someone for the love of god, please introduce the USB committee to the concept of semantic versioning. Stop this Gen madness.


Semantic versioning as it usually applies to software is a bit awkward for numbering the USB specs. The challenge is that USB has two largely orthogonal features that need to be described: link speed, and everything else that has been layered on top as the standard has evolved (power delivery, OTG, etc.)

USB 1.1 supported low speed (1.5Mb/s) and full speed (12Mb/s). USB 2.0 added high speed (480Mb/s) but didn't make it mandatory; you can have a USB 2.0 compliant keyboard that only supports low-speed or full-speed operation. USB 3.0 did the same thing with the introduction of SuperSpeed (5Gb/s).

What they should have done was keep making point releases on the 1.x, 2.x etc. specs when they added things like On The Go and Power Delivery, but keep the major version number representing the maximum supported link speed. By that scheme, we would have all 3.x devices capable of 5Gbps and no more, and what's now being introduced as USB4 would be more like USB 6.5 (if the minor version numbers were kept in sync across the speed grades). But this would deprive many companies of marketing opportunities, and instead force them to advertise USB 3.x support years after USB 4, 5 and 6 come into existence.
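To make that concrete, here's roughly what the renumbering I'm describing would look like (the speeds are from the real specs; the version numbers are the hypothetical scheme, not anything official):

    # Major number = maximum link speed; minor numbers would be reserved for
    # feature additions (OTG, Power Delivery, ...). Hypothetical mapping only.
    proposed = {
        "USB 1.x": "12 Mbps (Full Speed)",
        "USB 2.x": "480 Mbps (High Speed)",
        "USB 3.x": "5 Gbps (SuperSpeed)",
        "USB 4.x": "10 Gbps",
        "USB 5.x": "20 Gbps",
        "USB 6.x": "40 Gbps",  # i.e. today's "USB4" would land around here
    }
    for name, speed in proposed.items():
        print(f"{name} -> {speed}")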


They had the right idea by using a different symbol set for protocol vs. ports with USB A/B/C. Maybe they could use Greek letters for speeds.


The name needs a 1 in it too. Maybe "USB4 Gen 3x2 1st Class", so people can answer "yes, I guess?" when you ask whether it's USB 1.


Is it compatible with my Microsofts?


Only if it's not Bill Gates plugging it in!


> 2 - the name "USB4 Gen 3×2". Honestly I have never had the faintest idea what any of the various absurd names invented by the USB IF might mean. Superspeed? High speed? USB 3.1?

It's so fast that consumers have enough time to spend an additional two seconds pronouncing the whole "3x2", "Gen", or other marketing gibberish. Not to mention the connotations the "SS" abbreviation (as in the SuperSpeed logo) carries in Europe, especially in Poland. /s

Jokes aside, what was the reason for using the 3.1 name for the USB-C connector? It is very, very different from the early rectangle-shaped 3.0 connectors in how it looks and how much power it can transfer. And now the USB-IF is serving up the same drama again with an ambiguous name. I'm glad I'm a techie, but I believe 80% of people will end up wondering what they really need to connect a device to their laptop.

I wish we could completely drop "SuperSpeed"/"Gen 2" and use a much simpler name. Such a name should not follow the naming Wi-Fi has now, like Wi-Fi 5 or Wi-Fi 4, because even if we include an identifier for people more interested in the technical name (the standard), like Wi-Fi 5 (802.11ac), you still only have part of the information: 802.11ac operates at various power levels and stream counts, e.g. Wi-Fi 5 (802.11ac Wave 2; 4 streams @ 1024-QAM), which works very differently from the former example. However, I really like the more recently introduced naming like AC2600 or AC5300 [0]. When someone mentions "WiFi AC 2.6k", it tells me a lot more about the device, both as a techie and as a consumer. Also, to borrow how we query data from databases: AC2600 could be a unique indexed column (even a primary key!); it falls into a group within a defined standard, has various parameters, etc.

Unlike Wi-Fi, a USB type is easily identifiable by parameters like possible power delivery, data transfer rate, and connector type (shape, size). Why can't the USB-IF introduce similar naming that is clear for everyone to recognize? USB standards are not the same as an iPhone; they don't need a fancy name to be better than the previous generation.

[0]: https://en.wikipedia.org/wiki/IEEE_802.11ac

// Actually, I really got used to the 802.11a/b/g/ac naming; it has really rooted itself in my brain, so it's very rare to hear me say "Wi-Fi 5".


Edit is not available now, so I have to reply to myself:

// Actually, I really got used to the 802.11a/b/g/ac naming (when those standards had very distinct features); it has really rooted itself in my brain, so it's very rare to hear me say "Wi-Fi 5".


Contrast with the recent naming scheme change for Wi-Fi, which IMO went from overly technical and wonky to quite reasonable. Wi-Fi 5 is followed by, of all things, Wi-Fi 6!


But how will I know if it's wifi 2x2 gen 2 superspeed?


When the USB-IF announced that they're going to incorporate Thunderbolt into USB4 I already worried that this might further complicate USB hardware and software.

Turns out: my worst nightmares just came true. We now have USB running over Thunderbolt. I don't even know where to start facepalming about this clusterfuck.

Take DisplayPort for example: up to now we had the DisplayPort alternate mode for USB-C. That just worked for most devices with USB-C. Now we also have tunneling of DisplayPort over Thunderbolt (which Thunderbolt-enabled Macs have offered since Apple introduced Thunderbolt, and which they use when connecting to Thunderbolt-enabled monitors). Both methods seem to be optional for USB now, so in the future a host and a peripheral device might both support "DisplayPort over USB", but one might only support the USB-C DisplayPort alternate mode while the other only supports DisplayPort tunnels over USB4, leaving them unable to talk DisplayPort to each other.

I guess I'm going to facepalm for a few hours now, while I read the specification in more detail.


> A USB4 host and USB4 peripheral device may optionally support TBT3-Compatibility.

That sounds good from an implementor's point of view, but it opens up another area of possible incompatibilities when connecting USB4 and Thunderbolt 3 devices. Imagine both devices support tunneling DisplayPort signals, but the USB4 device doesn't support the Thunderbolt 3 compatibility layer: as in the example before, they won't be able to establish a DisplayPort connection, while other USB4 devices that do offer Thunderbolt 3 compatibility might.
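
A minimal sketch of the failure mode described above: each end supports some subset of DisplayPort transports, and a DP link only comes up if the two sets overlap (the transport labels here are informal shorthand, not spec terminology):

    def dp_possible(host_transports, device_transports):
        # DisplayPort works only if host and device share at least one transport.
        return bool(set(host_transports) & set(device_transports))

    host   = {"dp_alt_mode"}                       # only USB-C DP alternate mode
    device = {"usb4_dp_tunnel", "tbt3_dp_tunnel"}  # only tunneled DisplayPort

    print(dp_possible(host, device))  # False -> no picture, although both "do DisplayPort"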


How is this better than VGA again?


VGA has the downsides of the analog signal, like ringing and smearing when you have a higher resolution running over a long cable.

DVI ended up being kind of a mess with the different connector types and ultimately fairly limited digital bandwidth.

Displayport is pretty good except everything Displayport costs 50% more than it should.


DisplayPort also seems to randomly stop working, requiring unplugging/replugging the USB to reset it. Occasionally I'll even boot up one morning to find it's not working at all and something or other needs to be reinstalled.


It supports multiple 4k screens over one cable.


VGA caps out at 2048 pixels wide, if we are to believe a quick Google search, and the fidelity of the image may vary depending on the cable and graphics card in use.


I was a USB-C advocate until this point. I only had problems with non-compliant devices, and if I buy a cable I have to label it with what it's capable of. I didn't mind having to know that even if my mouse has a USB-C port it won't support DisplayPort alternate mode, but I reasonably expect that if I connect my laptop to a monitor they will be compatible.


As a consumer, can I please have a standard where I don't have to chase a new version every few years?

I know that many are backwards compatible, but they aren't forward compatible -- my MicroUSB cables don't help at all with USB-C, and type A and B connectors are still around.

By the time all of my micro USB devices have disappeared from my household, I'm sure a new thing will have emerged that's incompatible with USB-C, and so I'll still have to have several different cables around.


I'm mostly fine with how USB has turned out - USB-A ports on computers and chargers, microUSB on devices, and slowly everything is being replaced with USB-C. (and we can all forget about those wide microUSB superspeed plugs)

What really chafes me is MiniUSB. It was superseded by microUSB 12 years ago, but it STILL turns up in brand-new devices. I recently bought an action cam, a dashcam, and a retro input device converter that all had MiniUSB. I only had a single MiniUSB cable for my PS3 controller so I had to go out and buy a bunch of obsolete cables.

I've read threads where the independent maker of the retro converter was defending his choice: "I find MiniUSB is more sturdy." So now everyone buying his thing needs to stock another kind of cable for a device whose socket isn't even going to see much wear and tear. (And MiniUSB being more sturdy is debatable - the socket is designed for far fewer insert/remove cycles.)


Mini is more sturdy. I've destroyed a lot of micro devices over the years, never a single mini. Most of my workbench devices (serial converters, logic analyzers, etc) are mini, so they see a lot of use and abuse, but they just keep going. Micro is incredibly fragile in comparison.

Also, a lot of devices like the Gopro and Tomtom came out when Mini was current, and kept compatibility for years to respect their customers who had invested in accessories and stuff, and didn't want all that to become useless if they upgraded to the new unit. I appreciate that.


The problem is that Mini USB tends to fail on the device side, while Micro USB is designed to fail on the cable side.


That may indeed be the design intent, but it's completely at odds with my experience across hundreds of devices and cables over the years.

Micro has rendered numerous devices irreparable, or not-economical-to-repair, because the device-side connector fails. Thermal cameras, audio adapters, countless cellphones. Thankfully my daily-driver phones tend to be Samsungs which keep their USB connector on an easy-to-replace sub-board, but I can't say so much for FLIR's finest. I've thrown out my share of flaky micro cables, too, but those are cheap to replace. It's the device-side failures that have rained on my parade for years.

Mini, on the other hand, has been bulletproof. Countless Beaglebone Black boards, FTDI adapters, logic analyzers strewn about my bench, my TomTom, my GoPro, even my old Blackberry was mini. And not a single one of them ever died because of a USB connector problem. (The magic smoke has escaped from more than a few, but that's another story entirely!) I've thrown out plenty of mini cables, sure, but the device-side connectors simply don't fail like their supposed design would suggest.

Regardless of their intent, I consider inclusion of a micro port to simply be a design flaw at this point. Mini or C is the way to go.


The industry has decided that it is more profitable to make you buy adapters or a new device every couple of years (and create more e-waste in the process), hence the accelerating churn. And each new revision of the standard will introduce its own set of problems, guaranteeing them something to do for the next revision after that; but the churn situation with USB isn't actually that bad:

    USB 1.0 - 1996
    USB 1.1 - 1998
    USB 2.0 - 2000
    USB 3.0 - 2008
    USB 4.0 - 2019
Also, the vast majority of devices are still 1.x/2.x speed, so it might be more of a diminishing returns type of situation.


Apart from storage and video, there is just no need to go beyond USB 2.0. My brand new USB microphone uses USB 2 because there is no advantage at all to USB 3.0 for this device.


I have a single USB 3.0 external hard drive. Everything else is USB 2.0. It's fantastic, I never have to worry about whether I have the right cord or if the device supports it, and in the rare case where I need to transfer a file large enough that sending it over the Internet would be a bottleneck, I just toss it on the external HDD.


USB 4.0 doth not exist. The string '4.0' does not appear in the zipped PDF. There is only USB4™


I'm very excited that, just three weeks ago, I bought my first laptop with USB3 ports, which will soon be Yesterday's Hotness.


You know what's nice about USB3 ports though? They're also USB1 and USB2 ports, so they work with like every peripheral made for the last 20 years.

Honestly, I have a new Dell XPS, and in my opinion it has the ideal port combination: 2x USB3, 1x HDMI, 1x USB-C. I'm covered for the past, present, and immediate future. The only loser is a handful of peripherals that plugged into Thunderbolt on my old MBP (basically an Ethernet and Firewire adapter).


You know what's nice about Type-C ports though? They're also USB1 and USB2, and USB3 ports (with a cheap adapter), so they work with like every peripheral made for the last 20 years.


- my cassette player is a CD player (with a cheap adapter)

- my VGA port is an HDMI port (with a cheap adapter)

- my wall outlet is a micro-USB port (with a cheap adapter)

- my Lightning port is a headphone jack (with a cheap adapter)

Adapter-based compatibility is so convenient.


All of these are active adapters; USB-C to USB-A is passive (except for a resistor). You can use USB-C to Micro-USB-B cables instead of USB-A to Micro-USB-B cables if your device is not hardwired.


> Type-C ports

> They're also USB1 and USB2, and USB3

> work with like every peripheral made for the last 20 years

Too bad there is no guarantee they will work with devices with Type-C ports, as the Type-C port is a mess and there is no mention in the standard of how to communicate the capabilities of the port. Can it do Thunderbolt? Who knows?! Can it charge the device? Who knows?! Can it run PCIe lanes? Who knows?!


Actually the basic USB fallbacks and simple charging are all handled through passives (resistors) and 100% of Type C cables will support them. Sadly that's the only thing you can depend on by inspection.

Communication of port capabilities is described in the spec; I'm not sure about cable capabilities though.


You'd think that companies couldn't mess up a simple resistor. https://docs.google.com/spreadsheets/d/1vnpEXfo2HCGADdd9G2x9... proves that wrong.


The charging thing in particular seems like a disaster. The situation is bad enough with dedicated power supplies, where we have Amazon pages listing specific devices and laptop models that are known to work; but then there's a whole class of setups where the correct behaviour isn't even knowable without directly querying the user. If I plug that USB-C battery bank into my laptop, should it be charging the bank from the laptop, or charging the laptop from the bank? Does it depend on whether the laptop is plugged in or not? If the laptop is plugged in to a current-limited source, should it prioritize charging the bank or its internal battery? Gaah.

Here's what the Android prompt for this looks like: https://www.quora.com/Devices-can-charge-or-be-charged-via-U...


Aren't the adapters for USB-C quite a bit more expensive? Isn't there essentially a chip in the cable?


No, adapters from USB-C to the older USB ports are completely passive, they only have a pair of resistors to identify as an adapter.


To be fair, you've certainly taken your time; USB 3.0 has been with us for about 8 years at this point.


Out of the 4 computers I use regularly, only my work laptop has USB 3.0 and it was given to me less than a year ago.

Taking my time? The computers work exceptionally well. USB 3.0 doesn't seem reason enough to replace them, and much less does USB C which is not compatible with any of my devices.


USB-C is a revelation - power, display, storage, all through one cable.


Bah, the big innovation is the reversible connector. Not sure why that took so long.


Exactly. The old USB connector is just horrible... not only is it non-reversible, but it doesn’t “feel” good. I swear my average number of attempts to plug one in is >2, because even if you get the right orientation, if the alignment isn’t perfect it feels like it’s wrong.


The Type-C connector isn't perfect either, if it is only slightly misaligned my screen goes black, sometimes a small bump to the connector is enough. Good thing I don't have external drives connected to it (yet).

Shrinking these connectors has gone too far.


If you look back through old stack overflow questions about the reversibility you’ll find people saying it can’t be done any other way, giving all sorts of demonstrably false arguments about it being impossible to arrange wires like that!


I never understood why something like 4-pole 2.5mm phono wasn't used for USB. Then you'd have no orientation issues at all, and fumble-insertion would be trivial.

An argument I've heard against this is that a rotating connector wouldn't be electrically stable enough. But this is easily addressed: you could add a toothed collar to the plug preventing it from rotating, at the small cost of there being a finite number of dozens of possible orientations instead of a continuous infinity. Or better, just design the protocol to be tolerant to packet loss.


The audio jack has the problem that it shorts contacts during insertion/removal. It wouldn't be EMI compliant either.


Even if they didn't want to make it reversible they could have made it...something like a rectangle. VGA cables are easy to see the orientation of since they are trapezoidal.


I haven't seen these arguments, but not passively reversible perhaps? (Isn't there logic in the cable or something like that?)


Just picture a USB-A connector. You know the metal plates on the tongue? Put metal plates on the underside as well. Connect each plate on one side, to the plate on the other side, but at the reverse end of the tongue. There you go - passive reversible.


Passive reversible USB-A is actually rather common, as a quick search will show; I first noticed this on USB drives (double-sided tongue), but cables are available with those connectors too. The reversible females look quite a bit more fragile than the males; the latter is simply a double-sided tongue, while the former has the contact block thinned to a double-contacted septum.


Reversible micro-B is also a thing; I recently bought such a cable.


I read in an interview with the original USB instigator that he wanted reversible cables but in order to get a standard organized and agreed to they had to go for a non-reversible cable to cut BOM cost.



That's a revelation? Thunderbolt has had all that for nearly a decade now.


But that's not the point, is it? The point is that after 8 years, it's exceedingly reasonable for a standard to be superseded.


No, it's not. The longer a standard lasts the better. 8 years is really short. I can still connect new earphones to the jack of my CD player from 20 years ago, or my even older mini-cassette player. That's a good standard. What kind of standard lasts less than 10 years?


I mean, if you want to stick to the 1.5 MB/s transfer rates of USB 1, more power to you.

Analog audio technology was basically perfected 40 years ago with the Walkman. You couldn't develop a significantly better connector today, even the size was optimal for small portable devices.

Data transfer technology OTOH is still improving by leaps and bounds, and the old ports are too big for modern portable devices like tablets and phones.


Agreed! Now XLR, THAT'S a good standard. Been around since the 1950s at least, and still the de facto connector today. Sure, you need to insulate against background interference and account for ground hum, but honestly that makes for a clean connection when done right. XLR and Ethernet are all we need.


i still pine for the world envisioned by the first PoE working group; a world where every house's power outlets are just rj45s.


Wow! Never knew this was a thing. Would that be enough to power heavier load appliances (toaster ovens, window AC, etc)?


it's at that state now, yes. 100W PoE is standardized and available on the market. the standards body dragged their feet for a couple decades, vendors implemented their own proprietary high-power PoE injectors and devices, and a lot of the momentum behind PoE died out due to incompatibility hell. imagine every power outlet in your house was networked, could support gigabit data transfer, and the whole thing was so safe you could jam a fork into the outlet whenever the whim struck you -- that was the original dream.


60Hz Electric Grid: 122 years in service.

NEMA 1-15: 115 years in service.

IPv4: 41 years in service.

IPv6: 24 years in service.

Seatbelts: 53 years in service.

Some things are better not replaced every 8 years.


I super agree with everything, with the probable exception of Seatbelts. Is that even a standard with a concern for compatibility? I'm not sure if I can plug the belt of one car to the seat of another. Would be cool, I guess. If my seatbelt ever breaks, I guess that would mean I can obtain a "standard" seatbelt without worrying about compatibility.


There are "heavy" people who carry a seat belt extender because most cars don't have a long enough seat belt without it. So yes there is a compatibility concern.

Though there isn't a standard. Different manufacturers use different connectors, not to mention airplanes.


There's a standard in that cars must have them, and they must have certain properties in terms of abrasion, strength, surface area etc.


I agree with your general sentiment but NEMA 1-15 must go (it kills a few people each year) and so must IPv4 (it spawned NAT which destroyed the Internet).


My Logitech wireless headphones don't work well with USB 3.x; they only output stereo sound.

It is required to use a USB 2.0 port to get surround sound.

USB 3.x is also unreliable to no end. My external disks, all using USB 3, only worked at full speed for a week or so. Then the cable dies and it's basically just as fast as USB 2.

So, 8 years of unreliability. What a terrible record.


My desktop is plenty powerful, but doesn't have USB 3. Nice to know I can skip it and just get USB 4 on my next mobo.


Type C is the intended replacement for Micro, A and B.


then they should stop including USB C to USB C cables in every new phone I buy


It's the intended replacement, why would they stop including it in new purchases?


I think the OP was trying to argue that if USB-C is just a replacement for Micro and Mini, then why is USB-A -> USB-Micro being replaced with USB-C -> USB-C instead of USB-A -> USB-C? This shows that USB-C is a replacement not only for Mini and Micro, but for -A too.


It's intended as a replacement...


Why?


No... usb-c is just a connector


Micro, A, and B, are also just connectors.


USB is that standard.

The awesome thing about USB-C is that all you need is a couple of USB-A to USB-C adapters, which are cheap, and bam—all your devices are now forward compatible.

And if you already have a USB hub all you need is one adapter.


The USB-C Standard explicitly forbids use of USB-anything to USB-C adapters. Buy a USB-C to USB-other cable instead of trying to light your computer on fire.

The reason it is forbidden is because using two adapters you can convert a USB-C cable into a USB-A to USB-A cable. USB-A ports are not required to handle exciting and unexpected voltage on the wrong pins.


The prohibition is only against USB-anything to USB-C socket. Adapters to USB-C plug are allowed (and can't create the invalid combination you just described).


Apart from fast charging or connecting monitors via USB etc.


Which are things your device also did not do when you bought it. "Forward-compatible"!=="magic upgrade".


I'm sure USB-D will bring the solution for all of your problems!


can't wait for USBS


Stop buying stuff. If you can't do that, stop buying stuff that's incompatible with the stuff you already have.


This is a large but hardly the only aspect of compatibility:

- Can I use friends'/family's chargers? Can I use public chargers?

- Are my cables going to become difficult to find? Can I pick one up at the corner store in the middle of the night because mine failed?

- What's going to happen when I need to interact with old hardware? Are converters available?

Compatibility problems don't arise in perfect conditions, when you're at your desk with all your own current equipment. They arise in all the edge cases you will have to deal with over the at least several years you have a device.


Presumably they will obsolete use of USB4 over non-Type-C cabling since the older 3.x compatible connectors are hacks. That means, moving forward, everything from 1.1, 2.0, 3.x, and 4 will run on the same cabling.

Intermediate hubs will downgrade bandwidth if they can't support the same speed as their upstream host so there will be an element of mystery there without clear markings.
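
Put another way, the throughput a device can see is bounded by the slowest hop between it and the host; a trivial sketch (numbers are purely illustrative):

    # Link speeds along the path, in Gbps: host<->hub, hub<->hub, hub<->device
    path = [40, 10, 5]
    print(min(path))  # 5 -- the chain runs at the slowest link's speed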


Please father Christmas may I just have one present?

A standard that is actually standardized in practice.

To this day I still don't know exactly how to buy a cable for my Raspberry Pi 4s. (Yes, more a USB-C than a USB 3 issue, but you get my point - a standard that isn't standardized fails at its raison d'être.)


That's also because the RasPi 4 failed to implement their connector according to the standard. [1]

Now, why that passed muster is a valid question.

[1] https://medium.com/@leung.benson/how-to-design-a-proper-usb-...


Standards bodies need to sue to protect their standard. If a device does not adhere to the full standard, they should not be able to call it USB or use any of logos or imagery.


The Raspberry Pi Foundation being too cheap to use two resistors in place of one is not a problem with the standard.
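
For context, here's roughly what that means electrically (the resistor values are the standard Type-C terminations; the "single shared Rd" detail is the commonly reported explanation of the Pi 4 issue, per the Benson Leung article linked above):

    # A compliant USB-C sink puts a separate 5.1 kOhm Rd pull-down on EACH CC pin.
    # An e-marked cable presents Ra (~1 kOhm) on the plug pin it uses for VCONN.
    RD = 5100.0   # ohms
    RA = 1000.0   # ohms

    def parallel(r1, r2):
        return r1 * r2 / (r1 + r2)

    # Original Pi 4 boards tied CC1 and CC2 together behind ONE shared Rd, so with
    # an e-marked cable the e-marker's Ra lands in parallel with that Rd:
    seen_by_charger = parallel(RD, RA)
    print(round(seen_by_charger))  # ~836 ohms -- reads as Ra, not Rd, so the charger
                                   # doesn't recognise a sink and never supplies power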


I'm not qualified to say anything about the Pi OR USB, but I've definitely gotten the impression that USB has a standardization problem (see https://news.ycombinator.com/item?id=20443765 for some convincing-to-me material)

If that's wrong, can you clarify?


My comment was specific to the problem raspberry pi 4 has.


Just buy the official power supply. It's probably better than whatever cheap junk you've intended to abuse your Raspberry Pi with.


Actually, cheap junk will work, but any quality charger (e.g. Apple's) with an e-marked cable won't.


This is such a funny post, since USB-C cables which support more than the basics are actually the ones that don't work with the Pi.


How do you suggest they fix it? With a new standard?


Next revision of the pi will fix it.

It was an example of "layman has no idea which side is up"...but yes the rasp is the culprit here...for not following a rather complicated standard.


If I remember correctly though, the fault was in them not following the reference layout for a particular circuit diagram.



Sorry, what I meant was how should they fix the pains of using USB in practice? Not specifically the raspberry pi issue.


Less accommodating of everyone & everything.

It needs more simplicity.

Like the rasp issue - the cables marked active work while the unmarked ones don't. Or vice versa? I don't recall. I've never seen any such markings - but apparently there is a distinction there that somehow interacts with the Raspberry Pi not having enough capacitors.

^^ See that paragraph of confusion? That's what standardization chaos looks like. Neither I nor, apparently, the Raspberry Pi Foundation know what's going on.


So then what should they have accommodated and what should they have left out? Should they have dropped backwards compatibility? Should they have made different connectors for every use case? Those are the exact problems which caused USB to become popular in the first place!


Try finding a USB-C hub that actually works. The USB ecosystem is a complete nightmare. As a consumer this is completely infuriating.


Good. USB3 is still a mess, better make a new one before that gets sorted out.


>"Once the specifications are released, there will be a new round of confusion," the source told TechRepublic. "It's going to be USB4, but you have to qualify what USB4 means, because there are different grades. USB4, by definition, has to be [at least] Gen 2×2, so it will give you 10 Gbps by 2, that's 20 Gbps. There's going to be USB4 Gen 3×2, which is 20 Gbps per lane. 20 by 2 will give you 40 Gbps."

https://www.techrepublic.com/article/usb-if-to-continue-conf...

They really needed to take a page from the WiFi Alliance on the branding. But no, USB4 Gen 3x2. Awesome.
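
Spelling out the arithmetic from that quote (per-lane rates as stated there):

    # USB4 "Gen AxB" = B bonded lanes at the Gen A per-lane rate.
    USB4_LANE_RATE_GBPS = {2: 10, 3: 20}

    def usb4_speed_gbps(gen, lanes):
        return USB4_LANE_RATE_GBPS[gen] * lanes

    # usb4_speed_gbps(2, 2) -> 20  (the mandatory USB4 baseline, per the quote)
    # usb4_speed_gbps(3, 2) -> 40  ("USB4 Gen 3x2")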


I know you're being funny, but that seems like a pretty good option. USB3 did its best to remove the U from USB.

Starting over wouldn't bother me at all. The only USB-C device I have is a phone, and new phones come with new cables. Let's abandon this monstrosity and move on.


USB4 is basically Thunderbolt 3, i.e. USB3.x + PCIe + DisplayPort.

See the diagram in the linked spec, "Figure 2-1 USB4/USB3.2 Dual Bus System Architecture"


Not... quite.

Thunderbolt 3 did not carry USB 3.x over its bus; it's just PCIe and DP signals. All the Intel TB3 controller chips used in enclosures have a PCIe switch and a USB root hub attached to that. This switch was ... not flawless ... to be diplomatic. It was utter shit, to be less politically correct. TB3 eGPUs famously struggled with their occasionally lagging and hiccuping USB ports, so much so that the VisionTek Mini and the Razer Core V2 have two TB3 controllers to avoid this mess, with the second TB3 controller solely providing the USB 3 root hub; almost all other eGPUs just gave up on them.

USB 4 will carry 10 Gbps USB multiplexed with PCI Express and DisplayPort - native USB instead of a hotplugged controller. Simpler.


Thanks for the correction!


I remember hearing that on .. Dave2D I think? Or Linus Tech Tips .. one of them said USB4 is Thunderbolt 3.

I assumed the standards were just combining and the two would be equivalent protocols, but I take it that's not true then? Does USB4 basically just add PCIe onto the USB bus? How will that work with AMD boards that still don't officially have Thunderbolt/PCI hotplug support?


> Does USB4 basically just add PCIe onto the USB bus?

Yes. Which is kinda what Thunderbolt 3 does (except encapsulated in the TB packet format, not USB).

The silicon world has been gradually converging on its Serializer-Deserializer/PHY technology. I don't know quite the family history, but I believe the PCIe PHY is very similar to USB3 which is identical to Thunderbolt 3.

Since the link layer was the same, it was then "just" a matter of protocol design to make everything just use the same link. And that's how you got Thunderbolt 3 / USB 4.


PCIe uses different rates than USB, but at their most basic both use NRZ (non-return-to-zero) signaling: https://en.wikipedia.org/wiki/Non-return-to-zero. The bigger differences are in the channels they target. The channel characteristics drive the complexity of the signal processing circuitry required (e.g. power and cost) in the PHY devices. Hope that adds a bit of clarity to your PHY question.
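
To add a rough sense of the rate differences: the usable bit rate is the raw line rate scaled by the line-coding overhead. The encodings below are the ones I understand these links to use, so treat the figures as approximate:

    def effective_gbps(line_rate_gt_s, payload_bits, total_bits):
        # Usable rate = symbol rate * coding efficiency.
        return line_rate_gt_s * payload_bits / total_bits

    print(effective_gbps(5, 8, 10))      # USB 3.x Gen 1 (8b/10b):    4.0 Gbps
    print(effective_gbps(10, 128, 132))  # USB 3.x Gen 2 (128b/132b): ~9.7 Gbps
    print(effective_gbps(8, 128, 130))   # PCIe 3.0 (128b/130b):      ~7.9 Gbps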


> How will that work with AMD board that still don't officially have Thunderbolt/PCI Hotplug support?

They will have to add it. It will be easier now that Thunderbolt doesn't have to be licensed from Intel.


The more I look at the USB mess, the more I feel that a revised, faster Ethernet with provisions for both insulated and direct connections (magnetics-less == cheaper chipset, lower power consumption, etc.) for board-to-board communication, plus a smaller/sturdier connector and PoE built in, would solve every problem USB tried to solve, including the ones it introduced.


    Apple Inc. 
    Hewlett-Packard Inc. 
    Intel Corporation 
    Microsoft Corporation 
    Renesas Corporation 
    STMicroelectronics 
    Texas Instruments
Honestly, I'm a bit surprised at how short this list of authors(?) on such a wide-impact standard is. Where are other silicon designers? Where's Synopsys, or Mentor Graphics (who often provide the Hard IP used by chip designers)? Where's Qualcomm or Samsung? What about Google, for Android?

Edit: actually reading a bit further to the "Acknowledgement of Technical Contribution", this is the list of promoter companies. The contributors list is way longer and includes everyone you'd expect.


Look further down. You showed just the promoters list, there is a much longer contributors list.

Google and Samsung are both there.


Why does USB have two seemingly unrelated versioning schemes? There's USB 1, 2, 3 and now 4 as well as Type A, B and C. This mess looks like manufacturers wanted to create a way to trick the average customer into buying the wrong cable or adapter, whereas initially the reason for creating USB was to increase compatibility.


> Why does USB have two seemingly unrelated versioning schemes? There's USB 1, 2, 3 and now 4 as well as Type A, B and C.

They decoupled the standards for the protocol and the connector. "USB C" is just a port/connector shape -- there's no guarantee that any cable or port that is USB C shaped supports any version of USB-the-protocol. This is the case with eg) dedicated Thunderbolt 3 ports and cables.

It's every bit as silly and confusing as it sounds in practice.


And I forgot to mention the mini and micro versions of USB ports, which add to the overall confusion.


There are also the USB 3 versions of Micro and B, which don't fit in the USB 2 ports. This is part of the reason I have an entire drawer dedicated to different USB cables.


Good job completely failing to address actual problems, USB-IF. What do people actually need? They need to be able to easily tell what the port on their device is capable of doing and they need to avoid buying chargers, cables, and devices that violate the specification, especially when they do so in a way that might destroy a device or even endanger the user. Unfortunately the only thing the USB-IF cares about is "Moar megabits! Moar!"


probably really dumb question with a really obvious answer that i don’t see, but...

why not have a packetized protocol like ip over the cable and have devices just go over that? wouldn't things like enumeration, identification, connect/disconnect, interleaving protocols and encryption be much simpler since it's a known domain?

wouldn’t that be more “universal” than cramming usb2, usb super speed, pci-e, displayport (which afaicr also wraps a few things) etc over a single cable?

am i missing something or is this just a cost issue...?

[edit] slightly answering my own question: pci-e is a packetized protocol... so i guess cramming everything in is more backwards compatibility to get all the other connectors subsumed into a single cable...?


You had a problem. You added TCP/IP and all its associated kludges like mDNS, SSL, etc. Now you have two problems!

Plus now your mouse has 2 gigabytes of RAM inside and burns your hand when you touch it. And your monitor has stuttery motion.


USB is packetized. But without those separate modes of operation, every single adapter would need a microcontroller in it.


Is that really a problem?


Imagine a world where the reliability of your devices significantly depends on how much you paid for your USB 3->4 adapter. Is that a better situation than what we have today?


It already does in terms of USB cables. Buy a wrong cable, and your device won't charge, or won't charge fast enough, or won't communicate. And you can't tell the good/correct cable from the bad ones visually.

The standard can have any number of optional modes to be implemented by devices. I don't care. What I want is for it to have one cable type, because it's kind of the point of a universal port that people will use cables from one device with another.


It's unreasonable to expect one cable type, because the additional costs for cables that support very high data rates or very high power delivery are prohibitive for cheap devices that don't need those.

What they do need to do, though, is minimise the different cable options and make them easy to distinguish. My suggestion would have been to have a basic USB-C cable with exactly two optional designations: +FAST and +POWER. It's then easy enough for a device to say "this requires a USB-C+FAST cable" and consumers to then look for the same.


> What I want is for it to have one cable type

Cables have conflicting requirements:

-- high current requires thick wires and heat dissipation (if long cable). Possibly other safety-related requirements

-- high frequency wants more wire surface because of skin effect

-- flexibility and cost don't like either of the above.

I personally prefer to have a choice of a thin flexible cable when situation allows, and am ready to pay for it.


i see, makes more sense then... thanks!


That'll be USB5


Screw it. We're going with USB6.


So FireWire...


Will this bring higher PD? A lot of laptops require 100W-130W of power (see the Lenovo X1E or Dell 7590), while the highest PD I've seen from USB is the Apple charger at 87W. I know Dell produces a proprietary 130W USB-C charger, but I'm wondering if USB4 will help make it a standard.


This specification is only about the data bus, not the power bus, since they're separate in USB-C. So no, this will not bring higher power (and higher power won't need to wait for a future USB-5 standard).


thank you for the explanation! Do you happen to know if there's any work on bumping the power bus to 130W?

I see that USB PD rev 3.0 allows up to 100W - https://en.wikipedia.org/wiki/USB#Power-related_specificatio...

Is there any work on rev 4?
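
For reference, the 100 W ceiling falls straight out of the PD 3.0 power rules (a quick sketch; 5 A requires an e-marked cable, and whether a later revision goes beyond 20 V / 5 A is exactly the open question here):

    # USB PD fixed-supply ceiling: power = voltage * current.
    max_pd_w      = 20 * 5  # 100 W -- 20 V at 5 A, e-marked 5 A cable required
    plain_cable_w = 20 * 3  # 60 W  -- 20 V at 3 A with a non-e-marked USB-C cable
    print(max_pd_w, plain_cable_w)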


I think USB is terrible. One problem is that USB addressing is not by the physical port that a device is connected to (among other things, this is a security problem, but there are many other problems with it too); does USB4 correct that? Another problem is how vendor codes and that sort of thing work; devices should instead be identified by protocol type, and the user should be able to override the protocol type with operating system functions.

There is other stuff that is better, such as RS-232 and CompactFlash (unfortunately it lacks a write-protect switch; SD has one, but only on full-size SD cards and not the smaller sizes; SD cards have other problems, though), and then you could also add some of my own specifications, such as IMIDI and Digi-RGB, and have separate ports for each of these four things.


"rs232 is better" only if you ignore the incredible mess of horror that has been committed under the name "rs232". What we have today is only the most sensible bits that survived 40+ years of ablation in the real world. Even now its not a great situation: what voltages will your two chips of different types be able to be use for their "rs232" conversation? TTL? will you need +/-12v? 15v?


The Serial Time Link Protocol (STLP) looks really cool to me - I've thought for a while that the next gen in audio gear would be more modular front ends that sit on top of consumer protocols, which is where things are moving in broadcast/commercial audio installations today. I can imagine a whole host of consumer and professional devices that solve real problems at lower cost than we can today. Really interesting stuff.

edit: one weirdness, STLP has a recommended packet rate of 62.5kHz (actual quote "A Router shall periodically transmit a Serial Time Link Packet on TMU_CLK_OUT. The period between transmissions is implementation specific, but it is recommended that a Router transmit a Serial Time Link Packet every 16 μs"). If the only application domain (besides compliance testing) is audio, how did they arrive at that recommended rate, where it is rather different from standard sample rates (44.1k/88.2k and multiples of 8kHz)? I'm not sure how that affects clock synchronization.


USB already supports isochronous transfers for audio.


Had to read the USB1 spec in 1998 to make a HID Device using a small micro, took me over 3 weeks to get it working and was debugging shit for months afterwards. Would never go there again and for future projects I decided ignorance was bliss and just used whatever stack the chip designers supplied, I suspect that covers 99% of all devices out there.


Have fun when your product falls on its face because the vendor stack got the corner cases wrong.


So basically, they are attempting to redo the mess that is 3.x

very cool.

"When configured over a USB Type-C® connector interface, USB4 functionally replaces USB 3.2 while retaining USB 2.0 bus operating in parallel. Enhanced SuperSpeed USB, as defined in USB 3.2, remains the fundamental architecture for USB data transfer on a USB4 Fabric"


The Grrranimals standard worked. People would match up symbols to see what worked together. Why cannot technology do the same? Yes, the cables plug together, but one has a turtle icon and one has a car icon?


I am genuinely curious to read more about this, but a google search for Grrranimals doesn't give me anything to work with. Can you provide a bit more info, please?

(if this was a joke, consider this my admission of egg on my face)


It's "Garamimals". It's a clothing line for very young kids where they can match the tag on a shirt with a tag on the pants and have an outfit that "goes together."


So USB 4 is basically Thunderbolt 3, got it.

(per Fig 2-1 "USB4/USB3.2 Dual Bus System Architecture")

Edit: with Gen 3 speeds (20 Gbps per lane) and with optional dual-lane bonding for increased bandwidth.


USB3 and Thunderbolt (unsure about their exact relation) enabled eGPUs, which is very cool.

What will USB4 enable?


Is this a universal serial bus, or a universal docking station port?


why the bloody fuck don't we get a new, clean thunderbolt bus — as it was invented by apple/intel — and let the usb folks play their mindless games?

jesus.


Why not standardise on USB-C and just make -C with the 2 added Thunderbolt pins (à la Apple) the norm? Why more specs?


USB-C is not a protocol. It's just a connector spec.


Huh? Apple does not sell any devices with "2 added Thunderbolt pins".


Yes they do; USB-C cables with Thunderbolt are different from regular USB-C on account of the 2 added data pins.

So none of the USB-C cables I bought would enable me to use external monitors with my hardware, because it supported 2 displays over TB only. But the connectors aren't even keyed, so it's just an irritating mess to keep track of what is and isn't.

However, it would be a great system if TB over USB-C were the default and only option.


You are mistaken. Thunderbolt alternate mode does not require any additional pins; it does require special cabling, though. Most Thunderbolt cables are not capable of any 5-10(-20) Gbps USB, but they still use the exact same connector.

USB 4 will change this in that Thunderbolt cables will now be compatible with everything.

The only thing you possibly could think of is VirtualLink which repurposed the existing USB 2.0 pins for two more high speed lanes but even that is using the exact same physical connector.


Why would they attach this as a zip file? Makes reading on an iPhone impossible (CORRECTION: cumbersome)

EDIT: Learnt something new. But the spec on an HTML page would be a lot more convenient than having to install more apps just to read this document.


Is it truly impossible?

No fs access has been one of the things stopping me from becoming an iOS user. I thought it'd make some things harder, but I didn't realise something so regular-seeming (as in, I don't have to be a tech nerd to want to open a .zip) would not be possible.


It is of course not impossible, there are unzip apps for iPhone. It’s just a hassle.

Normal people typically don’t want to open zip files on an iPhone and Apple isn’t in the ‘can open any file’ market. They expect others to provide files their devices do open and mostly that strategy works.


You can save downloads to the Files app, which then has folders for all the storage apps you have installed, like dropbox or icloud or whatever. You can also save to the device local storage, but for some reason only in folders created by other apps.


iOS 13 allows you to store any file on your iPhone and zip/unzip it, without a separate application.


Maybe install an app that lets you open zip files?


Save it to Files, tap ‘preview content’ then swipe over to the second document.



