USB4 Specification Announced: Adopting Thunderbolt 3 Protocol for 40 Gbps USB (anandtech.com)
308 points by BogdanPetre 45 days ago | 271 comments



What's optional?

That's the most important thing to know: what's optional in the spec.

That could make it great or a major pain. I'm referring to how many features the specification defines as "optional" for manufacturers, and which of them the USB organization requires for its logos.

This is why even USB-C is a nightmare. Consider a USB-C cable or device port. It could be charge only, data only, or support monitors. There are even more permutations and sub-features. Oh, you wanted charging and found one that has it? Do you know whether the power it provides is enough or overkill? It could provide 15 watts, 100 watts, etc. Think you'll just google the specs page? Sure, they never miss providing any of these details or make any mistakes.

Whether USB4 means one thing with nothing optional (or at least a very small number of combinations) will probably determine how much you like it or get annoyed by it.

HP even made it worse with a laptop USB-C port that could technically be used for certain docking functionality, but they tried to FUD their way out of allowing it for marketing reasons.


Everything is optional.

USB-IF is for manufacturers, most of whom want to do whatever the cheapest, quickest thing is. The user experience absolutely comes second to manufacturing cost and marketing convenience.

The naming confusion exists because cheap manufacturers don't want to be branded as the "lesser" product or be offering a "lesser" computer. The confusion aids them. They can ship a slightly cheaper computer or device but market it as USB 3.2 Gen whatever, despite it basically being a USB 2 device.

All features will be optional, because mandating features means the chip costs $0.02 more, the cable they slap in the box would be more expensive, etc., and they don't want to pay for it.


It isn't only about cost; comprehensiveness might be a better way to look at it. I really don't want my laptop's keyboard and trackpad to require clocks to drive my clicks and keystrokes at 40Gbps, that's just going to murder batteries and bank accounts for no reason whatsoever.

So you're left with a choice in this wonderful alternative non-optional world: some expensive temperature-controlled oscillator with accompanying nuclear power plant integrated into every mouse, with a dual-core processor just in case the host wishes to speak mouse-over-Thunderbolt, with a 240V connector on the mouse just in case the host wants to charge from it; or mice living on a separate low-throughput bus where such requirements don't exist. We had that already; it was called the 1970s-1990s, it was an even bigger mess than what we have now, and it is exactly what USB's mandate is to avoid.

It's not like this is a new problem for USB; it's been a tri-modal specification from the outset to cope with the completely different requirements of the peripherals that were to be unified. These horrible recent complexity outcomes are just a natural extension of the early days.

It wasn't so long ago that every budget peripheral manufacturer outside of, e.g., printers and mice was forced to bundle expansion cards implementing custom busses just so you could talk to their scanner. This was still the reality of things as late as 1995 or so. Here we didn't just have custom connectors on the back of the machine, but custom cards that had to be installed to implement those connectors. In the 24 years since 1995, outside of display interfaces I count only 2 major new busses to date -- FireWire and USB. That sounds like a success to me.


I'm not sure you're countering my argument at all, because I of course agree we can't just blindly do things that make systems impractical to buy or use.

In fact, I would certainly be all for a few basic, well-thought-out combinations, as mentioned:

>>one thing with nothing optional (or at least a very small number of combinations)

The problem is, that's not what USB-C/USB-3-whatever is. Its permutations must be in the millions at least, literally. That degree of complexity is simply not necessary to satisfy (forget 80/20) 99% of users. The 1% (like people here) would, as usual, have more specialized/flexible/technically oriented options.

It's not comparable to counting the number of buses or standards. It's more about the sum total of complexity and confusion within a generation of an ecosystem. Especially when manufacturers could have better profits and simplify the equation quite a bit by looking at things longer term and as an ecosystem.

It's ironic that the "savings" you mention, which individual manufacturers are trying to achieve, are often a false economy, one that becomes visible only when they are willing to take a step back and look at things more holistically.

Often what appears to be only benefiting another company...well you know, the old rising tide lifting all boats thing and so forth and so on...


What are the degrees of freedom in the spec that are resulting in millions of permutations of optional features?


In addition to what RHN said, you've got to count the absence of some of those features as another combination, so you've got whatever power limits there are, and also no power at all.

You've also got cable lengths, which I don't believe are limited by the spec but are limited by the technology. For example, you can buy 10-foot cables on Amazon, but I believe it's currently impossible to buy any 10-foot cable there that supports all the features the spec allows for.

Effectively this adds just as much complexity as something formally defined as a feature because end users have to deal with the additional variables just the same.

I'm really trying to avoid doing the research and combinatorics that would allow for an exact number, but look, we're already at thousands, which is ridiculous, and you know how quickly the numbers grow with only a handful of additional headaches added in.

We're also not counting edge cases like Apple's adapters that are supposed to allow an iOS device to charge while using USB accessories. That's another niche batch of secrets introduced with the 2018 iPad Pros. It only charges with certain combinations, not based on simple requirements like having enough power.


Not millions, but:

- Power Delivery supports some 5 different maximum values

- 3 different Battery Charging modes

- 5 alternate modes according to Wikipedia (C only)

- >10 different plugs (not an independent variable though)

- 6 different signalling rates

All of those are baked in at the hardware layer. It gets worse when you realize that cables have to support the right combination.

It gets even worse when including things that piggyback on USB, like dual-purpose HDMI and USB ports, Thunderbolt, noncompliant chargers (Apple), Quick Charge, audio alternate mode.
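
Taking those counts at face value and treating each axis as independent (which overstates things, since the spec rules out some combinations and the plug types aren't really independent), a quick sketch of how they multiply:

    # Rough count of USB-C permutations, using the figures quoted above.
    # Each axis is treated as independent, which is an over-simplification:
    # the spec forbids some combinations, and real products cluster around a few.
    axes = {
        "power delivery maximums": 5,
        "battery charging modes": 3,
        "alternate modes": 5,
        "plug types": 10,
        "signalling rates": 6,
    }

    total = 1
    for name, count in axes.items():
        total *= count

    print(f"Naive upper bound: {total} combinations")  # 5*3*5*10*6 = 4500

So "not millions, but thousands" is about right even before counting cables separately.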


> I really don't want my laptop's keyboard and trackpad to require clocks to drive my clicks and keystrokes at 40Gbps, that's just going to murder batteries and bank accounts for no reason whatsoever.

If it makes you feel better: you can make an ultra-cheap Type-C cable that's basically the same as your old Type-A cables and will act like your old-fashioned cable, except you can plug it in either way in any port.


>The naming confusion exists because cheap manufacturers don't want to be branded as the "lesser" product or be offering a "lesser" computer. The confusion aids them.

Then force them to add appropriate clear markings (of the capabilities in a very simple format), or hit them with fines and deny them the ability to sell to the market without them.


I very much agree with you. I've deliberately avoided USB-C for this very reason. It's not that hard - I don't buy new gear very often, and I'm still using an old 5th-gen iPod every day - but there will be a day when I don't have much of a choice anymore. Bluetooth headphones, external hard drives, game controllers, media players, phones...

The worst part about USB-C is that it can be implemented so wrong it bricks your device! A Google engineer spent a while testing cables, and not infrequently they worked poorly, not at all, or outright fried devices. Granted this was years ago, but now with DisplayPort and Thunderbolt 3 over the same connector, the situation has become even more vague.

https://plus.google.com/+BensonLeung/posts/jGP5249NppF

https://www.reddit.com/r/Nexus6P/comments/3robzo/google_spre...


> The worst part about USB-C is that it can be implemented so wrong it bricks your device!

The miswiring that bricked his device, swapping GND and VBUS, could also happen with older USB plugs. There's nothing specific to USB-C about it; USB plugs have had a defined polarity for the power pins since the very first version of the standard.


Mitigating hardware damage is also the responsibility of the device manufacturers. Any external cable carrying power could experience a short circuit due to a fault. All USB hosts are required to limit current to deal with that. If they fail, it isn't the cable's fault.


USB-C can send significantly more power, which makes it harder to protect devices in these cases.


It only happens after negotiation, so the higher potential wattage plays no role here. You'd need both a bad cable and an out-of-spec wall wart.


Do you think that combination of circumstances is rare enough that we shouldn't wonder about it?


More or less.

Either scenario, considered alone, is enough to ruin a downstream device. I don't think it's reasonable to worry about the combination of the two as different from either alone.


The failure can happen after negotiation, right?


Yeah that's true, the cable could bend and cause a short after the negotiation, but how is a specification supposed to protect against this?


Many reference designs have a USB power surge protection circuit.


There's also the Nintendo Switch issue


That was due to non-compliance on Nintendo's part, hardly a USB Type C issue.


If the spec is so complex even a relatively big and stable corp like Nintendo can't get it right, well, maybe the spec is (partly) to blame?


Are there any widely used hardware specs that a "big and stable" corp hasn't gotten wrong in some implementation? Back when Ethernet was a lot simpler than it is now there were plenty of big players who couldn't even manage to provide unique MAC addresses. Is IEEE754 being too complex to blame for Intel's FDIV bug?


I haven't seen IEEE754 burning down devices though.


Wasn't it that the Switch asked for a lower voltage, but the charger didn't have that mode and gave the Switch a tad more juice?



> The worst part about USB-C is that it can be implemented so wrong it bricks your device! A Google engineer spent a while testing cables, and not infrequently they worked poorly, not at all, or outright fried devices.

No spec is going to look good if you start blaming the spec when manufacturers sell things that don't conform to it. That is what the Google engineer found: these cables didn't conform to the spec. If you are going to do that then bricking the device is the least of your worries as USB is clearly to blame for this as well:

https://www.abc.net.au/news/2014-06-27/knock-off-usb-charger...

The real USB 3.2 issue is different. As a way of telling the user what they are buying, the USB 3.2 2x2 scheme is so bad it's almost comical. They could have insisted every device be marked with the maximum it actually supported. I guess they thought making something like this mandatory:

  20G/3A/60W+dp+tb
for a device that supported 20 Gbit/sec, 3 amps and 60 watts with DisplayPort and Thunderbolt pass-through would use so much space there would be no place left for the marketing people to earn their wages. Or something.
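
As a rough illustration of how machine-readable such a marking could be, here is a small sketch that parses the hypothetical label format proposed above (the format and field names are invented for this example, not part of any spec):

    import re

    def parse_usb_label(label: str) -> dict:
        """Parse a hypothetical cable marking like '20G/3A/60W+dp+tb'.

        The format is invented for illustration: speed in Gbit/s, max current
        in amps, max power in watts, then optional '+' feature suffixes.
        """
        base, *features = label.split("+")
        m = re.fullmatch(r"(\d+)G/(\d+)A/(\d+)W", base)
        if m is None:
            raise ValueError(f"unrecognised label: {label!r}")
        gbps, amps, watts = map(int, m.groups())
        return {
            "speed_gbps": gbps,
            "max_amps": amps,
            "max_watts": watts,
            "displayport": "dp" in features,
            "thunderbolt": "tb" in features,
        }

    print(parse_usb_label("20G/3A/60W+dp+tb"))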


Market opportunity: A USB-stick-shaped device that, when you plug it into a USB-C port, probes the port and lights up LEDs to show the port's capabilities.

  [x] 5 Gbps
  [x] 10 Gbps
  [ ] 20 Gbps

  [ ] Power Delivery

  [x] DisplayPort Alternate Mode
And so on.
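
A minimal sketch of the report such a gadget might produce; the capability flags and rendering below are assumptions for illustration, and the actual PD/alt-mode probing (the hard part) is not shown:

    from dataclasses import dataclass

    @dataclass
    class PortCapabilities:
        """Flags a hypothetical USB-C port tester might light up after probing."""
        speed_gbps: int          # negotiated link rate
        power_delivery: bool     # USB PD supported
        displayport_alt: bool    # DisplayPort alternate mode
        thunderbolt: bool        # Thunderbolt tunnelling

        def render(self) -> str:
            rows = [
                (f"{rate} Gbps", self.speed_gbps >= rate) for rate in (5, 10, 20)
            ] + [
                ("Power Delivery", self.power_delivery),
                ("DisplayPort Alt Mode", self.displayport_alt),
                ("Thunderbolt", self.thunderbolt),
            ]
            return "\n".join(f"[{'x' if on else ' '}] {name}" for name, on in rows)

    # Example: a 10 Gbps port with DP alt mode but no PD or Thunderbolt.
    print(PortCapabilities(10, False, True, False).render())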


Yeah until the spec changes/expands and you now need another device to tell you whether that stick is up-to-date on the spec :D


As long as the actual port remains identical the widget could be made up to date with a software upgrade.


You cannot add an LED with a software update :)


Not with that attitude you can’t!


Sure, but you could work around that. Even with a fixed amount of LEDs, you have on, off, blinking, and color to work with.


You can add another word or symbol on an OLED screen though, which is what I'd assume a widget like this would use.


Also how many watts of power delivery. From observation it seems to vary.


Even better if it is actually a usb stick, too.


>This is why even USB-C is a nightmare. Consider a USB-C cable or device port. It could be charge only, data only, or support monitors. There are even more permutations and sub-features. Oh, you wanted charging and found one that has it? Do you know whether the power it provides is enough or overkill? It could provide 15 watts, 100 watts, etc. Think you'll just google the specs page? Sure, they never miss providing any of these details or make any mistakes.

They should be pressured (by law) to include all these details in a simple form (akin to nutritional labels in foods), either on the cable itself or on a sticker attached to it.

Ideally there should also be a few profiles, so that you know if you want a cable for your monitor, you get e.g. a "USB4 / profile 1" (and the cable could mark the profile it belongs to on top).


Or like the good ol' RCA cables: color-code them. Add some stripes on the head of the USB connector or similar, with meaningful colors.


The only way you can make a specification like USB "universal" is for parts of it to be optional. No one is going to build a USB mouse with a 40Gbit link speed, for example.


Right, but if it weren't for the USB forum's BS, that mouse would use a 12 Mbit/s link and be labelled "USB 1"

But no, thanks to their long-established BS you slap a USB C connector on that 12Mbit/s connection and congrats, you've got a USB 4 mouse.


AFAIK, a 12 Mbit/s link on a USB-C connector is still called "USB 2.0 full speed". It wouldn't surprise me if that keeps being the case even after the USB4 standard.


>No one is going to build a USB mouse with a 40Gbit link speed, for example.

No, but they could build cables with a "40Gbit link speed" that also work with any much smaller speed mouse.

Devices don't have to have the superset. But if a consumer wants to buy a cable that has the superset of powers, they should be able to find such cables and buy them.


Is that true though? Usually throughput comes at the cost of latency. Is that not the case with USB cables?


That's when the full throughput is utilized though. Why couldn't it have "modes" depending on the latency needs of the device?

(Plus, what low throughput device has low latency needs that can't be covered by cable able to push data to a 4K monitor at 120 hz or more)


Don't mice often run at 1000 hz?


What if it's a high DPI gaming mouse?


Back-of-the-envelope calculation: if you send a 100 byte status update 100,000 times per second, that's just 80 Mbits/s.

To saturate a 40Gbit/s link you could send a 1kB update every 200 nanoseconds (that's 5MHz), assuming perfect efficiency.
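
The same back-of-the-envelope figures, written out (the report sizes and rates are just the ones assumed above):

    # Back-of-the-envelope check of the figures above.
    reports_per_sec = 100_000
    report_bytes = 100
    mouse_bps = reports_per_sec * report_bytes * 8
    print(f"Hypothetical 100 kHz mouse: {mouse_bps / 1e6:.0f} Mbit/s")  # 80 Mbit/s

    link_bps = 40e9
    update_bytes = 1_000
    updates_per_sec = link_bps / (update_bytes * 8)  # one update every 200 ns
    print(f"1 kB updates to saturate 40 Gbit/s: {updates_per_sec / 1e6:.0f} million/s")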


Even if there were some hypothetical mouse that needed that kind of speed (there isn't[1]), it's irrelevant to your parent's actual point.

[1] https://www.howtogeek.com/193866/are-there-any-benefits-from...


Even then it could make do with much much much slower rates -- those are monitor rates, and we're talking about a mouse here (hi dpi or not).


USB-A can be 1.1, 2.0, 3.0, 3.1 Gen 1, 3.1 Gen 2.

Same for USB-C. It is just a standardized connector with a defined pinout.

If you force the standard to always provide 15 watts, then how will you do that on phones or a Raspberry Pi or other low-power devices? If you force SuperSpeed or SuperSpeed+ over the connector, then you won't see it on budget or midrange phones.

It makes perfect sense to differentiate between the supported protocols and the connector itself. Otherwise you would need a ton of different connectors that you may never need on your device.


I think there needs to be some brand for the full-fat port with everything connected to a certain standard.


Why is USB-C like that? Is it because all smartphones require USB chargers in the EU and so everything that manufacturers might want has to be in the one cable spec?


Ars Technica just wrote an article about the confusing names of USB versions https://arstechnica.com/gadgets/2019/02/usb-3-2-is-going-to-... and now they have another one about this name. https://arstechnica.com/gadgets/2019/03/thunderbolt-3-become...


I feel like the hand-wringing around that is a bit exaggerated - at the end of the day the differences really matter only for a tiny subset of devices that need those 10Gbit+ transfers, and in that case you can guide yourself by the "SuperSpeed 10Gbit" / "SuperSpeed 20Gbit" logos on the actual box.

For everything else the protocols are forward and backward compatible, so you just plug in the cable and it works.


So remind me, why is it that you can walk into any Apple store, pick up a new MacBook and a 5K LG monitor, and when you plug in the monitor using the USB-C cable that was bundled with the MacBook, it just won't work at all?

My point is - the standards might be back and forward compatible. But the cables and ports definitely aren't - and your average consumer has absolutely no way of knowing. They look absolutely identical on the outside and when they don't work it's for completely non-obvious reasons.


There are two problems:

1. the same connector is used for a wide variety of capabilities, and it can get confusing knowing what you can use together.

2. the names of these connectors are confusing.

You're complaining about the first, and that's a very hard problem. There are huge advantages to using the same connector and cables that can support 100W of power, 5K displays and also support peripherals that retail for less than $1 and be backwards compatible with peripherals over 20 years old. The disadvantage is the confusion you mention.

The complaint in the article is about the second, the stupid naming. There's no good excuse for that. It really exacerbates the confusion from the first issue.

But now it's on us and journalists just to not use the stupid naming. They've provided "marketing" names that aren't silly, so everybody should just use them, even if it's awkward. IOW "SuperSpeed USB 10Gbps", not "USB 3.2 Gen 2".


>There are huge advantages to using the same connector and cables that can support 100W of power, 5K displays and also support peripherals that retail for less than $1 and be backwards compatible with peripherals over 20 years old.

This, quite honestly, sounds wasteful. I am fairly certain that manufacturing these much more capable connectors is more expensive than either the old, less sophisticated connectors or simply new lower-performance connectors.


>You're complaining about the first, and that's a very hard problem. There are huge advantages to using the same connector and cables that can support 100W of power, 5K displays and also support peripherals that retail for less than $1 and be backwards compatible with peripherals over 20 years old. The disadvantage is the confusion you mention.

Well, in your example the connector and cable, specifically the cable, will have to be an expensive 100W PD cable for it to be correct; if the cable were designed for the $1 peripherals, it wouldn't work with the 100W load and the 5K display.

The sentence would have been correct without the word "cable", but without it the argument wouldn't stand, because you can't just pick up any cable with the same connector and expect it to work with 5/10/20-year-old peripherals if the cable doesn't support it.


I had this happen to me once with an HDMI cable I bought for my monitor, which I thought was defective until I found out that it worked on my TV. It was then that I remembered that HDMI has several speed grades, and my monitor happened to have a high enough resolution that not all HDMI cables would work with it. Happily, the HDMI cable I had been using on my TV worked on the monitor, so I just swapped the cables and all was fine. So this issue is not exclusive to USB.

In fact, with USB the situation should be better, since all cables except the slowest ones are supposed to have a built-in chip describing how fast they can go. So in the MacBook example, in theory it should be able to detect the issue, switch to a lower resolution, and present an on-screen warning telling you what happened and why.


This absolutely does need to be way more clear. There should be distinct and visibly obvious branding from the standards group.

I should be able to tell by looking at a connector:

  * If it's claiming to be a certified cable/device or not.
  * The USB standard it conforms to (version)
  * The speed of data transfer it supports
  * The maximum wattage it supports
  * The maximum voltage it supports
  * (the connector type, but this is obvious based on shape)


Why is that? You can't do that from looking at a file with an unknown extension.

Physical connector technology and signaling protocol are entirely different things. I'd be fine with literally everything using the same physical connector, but knowing that I can't and shouldn't plug my 5VDC battery-charger into a 110VAC outlet.

I still have speakers that are connected by stripping a lampcord pair with my teeth, pushing a little button, jamming bare copper strands into a hole, and releasing a little button. I think you're asking for too much. Everything using one reversible connector that almost always guarantees 5VDC in a fallback mode is better than what we have today.


Pick any random USB-C cable.

Can you tell by looking at it what speeds it supports? Is it USB 3 or USB 2? How about the power delivery aspect: how much power is it capable of transferring?

You might be able to tell some of those things from modern cables, if they're bragging about a speed factor, but otherwise it isn't clear.


I got a new laptop this year and I couldn't even work out what protocols it supported. Nowhere on the laptop spec sheet did it say that it supports HDMI and DisplayPort over USB-C, but when I use the plug for it, it works. When I use the same adapter/plug on my phone, it doesn't work.


Pick any random cable, regardless of what’s on the ends.

Yes, I can determine what it is used for, because I bought it for a purpose. What you are describing is the same issue with barrel-style AC or DC wall warts, and those powered the world for a few decades just fine.


Except that, as I said in another comment, the USB-C charger that comes with OnePlus phones will not charge the Nintendo Switch. Both devices have full USB-C certification and display the USB-IF logo on the box. But either OnePlus or Nintendo is doing something dodgy with the standard, and the Switch just doesn't charge.

So yes, I bought a USB-C certified charger, to charge things. What does it do? It charges some things when it wants to, without providing any kind of indication why it doesn't work when it doesn't.


That is the same outcome for when dodgy things break the pre-C USB spec as well. I’m not sure why OnePlus or Nintendo breaking spec is an indictment of the connector they chose.


Apple and most TB vendors at least make the distinction of putting the TB logo/name on the cable ends (similar to USB3.0 cables that have "SuperSpeed" on them to indicate... something higher than 480Mbit).


The cable problem with USB-C I think might be the largest annoyance when it comes to the promise of USB-C. For the highest speed stuff you can only really count on the cable that came with the device because for the extreme speeds you need a good cable with the right length.

(The monitor should have a compatible cable bundled with it though. There's some major cable length limits and shielding when it comes to the highest speed stuff like monitors and eGPUs with USB-C.)


No wiring issues necessary; the MacBook just isn't powerful enough to drive a 5K display.

(I think the question you meant to ask is why the MacBook can't drive an external 4K Thunderbolt display using its included cable, which is more interesting—IIRC, it can do so through an HDMI adapter plugged into the USB-C port, but it cannot do so over a USB-C cable. This still isn't about the cable, though; you can take the MacBook's cable and use it to connect an MBP to a Thunderbolt display just fine. Instead, it's about the MacBook's combined Thunderbolt/USB-C controller not supporting the recent-enough version of Thunderbolt to have the bandwidth over the wire required to feed a 4K@60Hz display. When an HDMI dongle is plugged into the USB-C port, you're taking a direct GPU->controller->HDMI path, which avoids the anemic old-Thunderbolt bottleneck path.)


Uhm... you can definitely connect the LG 5K UltraFine display to the latest MacBook using the USB-C (TB3) cable included with the monitor, and it will work (and even charge the MacBook at the same time). So the MacBook definitely has enough power to drive it. But if you use the cable that was bundled with the MacBook, it won't work, even though externally it looks the same and will fit the ports on each side.

Less extreme example is how the OnePlus USB-C charger won't charge the Nintendo Switch, even though they are both USB-C certified devices.

Edit: sorry, I just realized that I could have been misunderstood. I forgot that "MacBook" is a device that exists. Obviously(to me) by MacBook I mean the MacBook Pro.


A lot (most?) charge cables are USB 2.0 — just enough data to negotiate the higher power. If they supported faster data too, they'd be more expensive. But it definitely should be more clear! Apple's minimalism doesn't help in this case, either.


> Apple's minimalism doesn't help in this case, either.

Apple don't make a "USB-C data" cable - they make a USB-C charge cable, or a TB3 cable, and the TB3 one has the thunderbolt logo on each end.

They also refer to it specifically as "USB-C Charge Cable", everywhere.


Yes, if you buy it standalone. But what about in the computer box? I've opened a lot of MacBook Pros, and I can't remember seeing any labeling.


I’ve gotten into the habit of taking the supplied cable on all Mac portables and replacing it with a Thunderbolt 3 40gbs 100w cable from Monoprice before handing it over to the staff person. It cuts down on “why won’t this work” questions surrounding monitors and such.


Except that longer Thunderbolt 3 cables apparently don't support 10 Gbps USB 3.1 speeds. So if you want a cable that actually supports everything, it probably has to be the 0.5m Thunderbolt 3 40Gbps 100W cable from Monoprice (maybe the 1m one too, but the 2m version is an active cable that won't support USB 3.1).


I'm glad you mentioned this -- didn't show up on a "USB-C" search and I didn't think to search for thunderbolt. I'm switching!


That would fix most of the problems we have. The only thing left is how to make these cables much, much cheaper.


“Apple’s minimalism” that somehow spreads to every single USB cable by every single manufacturer?

IIRC it was a deliberate decision by USB-IF to not enforce any additional labels on USB cables and connectors beyond the USB logo.


I wasn't clear: I meant in the computer box. Most boxed cables will say "charge cable" if they are USB 2.0-only, including Apple's, but there's nothing when you open your new computer. It could at least have a label on the cord wrap or even underneath it in the box that says "charge cable".


The cables should have an SS, SS 10, or SS 20 label on them. The end devices should too, I believe.


Apple cables and ports have no logos on them whatsoever. Just plain white/silver if metal.


Not true. See their thunderbolt cables: https://www.apple.com/shop/product/MD861LL/A/apple-thunderbo...


Fair enough. But their USB-C cables and chargers do not:

https://www.apple.com/uk/shop/product/MLL82ZM/A/usb-c-charge...


That's because it's a USB Type-C cable that can handle only USB protocol. Thunderbolt 3 ones do have a logo on them [1].

Though, I do agree, having clear information directly on a USB Type-C cable on which features are supported would be much appreciated.

[1] https://www.apple.com/shop/product/MQ4H2AM/A/thunderbolt-3-u...


That's not a USB-C cable.


All OP said was that their cables don't have icons. He/she didn't specify just USB.


"should"

The USB logo on a USB 1.0 cable should be printed on the "upper" side. How many times did you insert it the wrong way?


The spec also mandates the orientation of the connector (socket) side, but many manufacturers have ignored that too.


Exactly. Without a qualification test, nobody would pay attention to those details.


This issue is neither new nor irrelevant.

When USB 2.0 was introduced, it was a huge increase in speed. USB 1.1 is 12 Mbps whereas 2.0 is 480 Mbps (theoretical). When this happened, some camera manufacturers (Nikon, I believe, was one of them?) were identified as changing the marketing on their cameras to "USB 2.0". What this actually meant, though, was "USB 2.0 Full Speed", which is USB 1.1 speed. USB 2.0 Hi-Speed is 480.

Consumers don't need this or want it. Manufacturers just don't want to say "USB <not latest version>".

You see the same thing with AT&T deciding its LTE iPhones are 5G because they want their network to sound more impressive than it is.

It's also why the strength bars on your phone show the strongest signal (2G/3G/4G) that your phone is getting, not the one it's actually using.

The truth matters. False advertising matters. Markets only work with a minimum level of trust. All of this legerdemain erodes that trust.

I'm Australian and this is one place where I think the ACCC (equivalent to the FTC/FCC rolled into one) has way more teeth than their US equivalents. The ACCC, for example, ended ISPs advertising "unlimited" plans that weren't truly unlimited (which was basically all of them). This also covered ISPs with hard or soft data caps where throttling would come into play. The ACCC also took ISPs to task on truth in the advertised speeds for services (which matters a lot for xDSL).

A more sensible standard would've been something like:

USB <version><plug>

eg USB 3.1C, USB 3.0A.

Or put the letter first.

But this whole USB 3.2 Gen 1 = USB 3.1 Gen 2 or whatever is just pure nonsense.


That sounds like '10Gbit ought to be more than enough for anybody' to me.

Plus the good thing about USB is it just works, you don't have to worry about any tiny subset (well that's the theory anyway).

I've had enough of my monitor, mouse, keyboard and printer all using different connectors thank you very much.

Edit: Making it make sense.


The issue is that with USB-C you can run into limitations at the cable level, where the cable is too long or too poorly shielded to provide devices with the right data speeds/power requirements, and some companies (cough Apple cough, though it's bleeding more into other places as the aesthetic gets copied) refuse to properly label things. And some of the differences are not spelled out anywhere; for example, to get the top Thunderbolt 3 data rate of 40Gb/s on a passive cord, you cannot have a cord longer than 0.5m [0]. To get a longer cable to pass the full speed (to drive, say, an eGPU or a high-res monitor) you have to get a more expensive active cable (which may not work, because companies online are unscrupulous and will try to rip you off). All of this is extremely opaque to users who just want a slightly longer cable to arrange their desk as they prefer.

At least with the different connectors and cables I didn't have to worry so much and things generally worked when plugged into the right place.

[0] https://www.cnet.com/how-to/usb-type-c-thunderbolt-3-one-cab...


>>> I've had enough of my monitor, mouse, keyboard and printer all using different connectors thank you very much.

Personally, I've had enough with my 1yo laptop, my 3yo desktop, my 5yo peripheral and that funny dongle I need for work all having different USB cables. USB was meant to save us from having multiple connectors but has done the opposite: every year there is a new USB cable/hole design.

I cannot be the only one here who has to inspect every online purchase for high-rez pictures of the USB ports. I really don't care that my laptop has 10+Gb/s potential. I just need to know that my current equipment is physically compatible.


This is almost as bad as SpaceX's rocket naming scheme: https://external-preview.redd.it/yQJH1H8y7ypKanxgZKwIaJom5h4...


As someone who knows next to nothing about how USB actually works and just uses it, what goes into making each USB generation faster?

At the physical layer, is it the materials in the cables are getting better? Or new ways of using the same materials?

At the protocol layer, is it newly developed computer science theory being applied or is it just old fashioned pragmatic engineering iteration, looking at usage and making the existing protocols more efficient with tricks and shortcuts etc?


I'm by no means an expert, but up to USB 2.0 there were 4 pins (2 for power and 2 differential signals for bidirectional data). With USB 3.0 they added 4 more signals for separate transmit/receive. Initially they used 8b/10b encoding and then switched to 128b/132b encoding. They also increased the signaling rate with each generation. Also, in USB 2.0 and earlier the higher-level protocols were kind of inefficient, with lots of back and forth, so they improved upon that. And I'm sure they also tightened up the cable and connection specs to increase the signaling rates. So a lot of little improvements added up.
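
To put numbers on the encoding change, assuming the nominal line rates of 5 Gbit/s for USB 3.0 (8b/10b) and 10 Gbit/s for USB 3.1 Gen 2 (128b/132b):

    # Effective payload rate after line-encoding overhead, for the nominal
    # signalling rates mentioned above.
    def effective_gbps(line_rate_gbps: float, data_bits: int, total_bits: int) -> float:
        return line_rate_gbps * data_bits / total_bits

    print(effective_gbps(5, 8, 10))      # USB 3.0, 8b/10b     -> 4.0 Gbit/s usable
    print(effective_gbps(10, 128, 132))  # USB 3.1 Gen 2       -> ~9.7 Gbit/s usable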


> With USB 3.0 they added 4 more signals for separate transmit/receive

So is it not actually serial any more?


Serial/Parallel have commonly accepted technical meanings in the context of data interfaces which aren't necessarily intuitive.

Parallel refers to taking a raw bit stream and sending N bits at a time down N wires to a receiver who reads N bits at a time and reconstructs the bitstream. This was a common technique in the early days of computing because it is very simple to implement and low clock speeds meant it was often the only practical way of increasing the throughput of an interface.

As clock speeds increased, keeping each of the N signals in sync became increasingly difficult. This eventually led to parallel interfaces falling out of favor and serial interfaces becoming dominant.

The defining characteristic of a serial interface is that the data signal itself also serves as the clock signal which keeps sender and receiver in sync.

Modern "parallel serial" interfaces like USB 3 consist of multiple serial links which are multiplexed using a framing protocol of some sort. So from a signaling standpoint they are still considered serial interfaces, even though data is in fact being sent in parallel.


Thanks! I don't know much about hardware design issues and this is a very readable explanation.


Great explanation - thanks!


Awesome comment.


It's a low-speed differential pair that negotiates the initial communication. Once the bus knows it can go to a faster speed, it will switch to a separate receive differential pair and transmit differential pair.

    -Power
    -Ground
    -Signal Ground
    -USB2 data+
    -USB2 data-
    -USB3 RX+
    -USB3 RX-
    -USB3 TX+
    -USB3 TX-
So it's a slow serial bus that negotiates traffic onto a faster serial bus.

https://en.wikipedia.org/wiki/USB_3.0#Pinouts


> It's a low-speed differential pair that negotiates the initial communication.

AFAIK, it's not; the SuperSpeed (USB3) wires have their own independent negotiation, and the device switches to the USB2 wires only when USB3 is not detected. This also explains how the proposed VirtualLink alternate mode can have USB3 without the USB2 pair.


Transmit and receive previously shared pins; USB 3 split transmit and receive into separate pairs, which are still serial.

edit: yikes, looks like USB-C doubled the number of those separate pairs, for a total of 6 data pairs (2x RX, 2x TX, 2x USB 2.0 RX/TX)


It’s better to say they were mirrored to make the connector work the same upside down, and then they expanded the spec to allow simultaneous use, effectively doubling the throughput.


Multiple serial connections in parallel.


It's more like… multiple serial connections over the same cable – kind of like PCI-E, SATA etc.


> At the physical layer, is it the materials in the cables are getting better?

Yes, and more expensive at that. Laptop makers totally hate USB 3.0+ for that.

The industry standard for internal digital connectivity was the FPC - flexible printed circuit board, a very economical and ergonomic solution, but then came USB 3.0.

Regular polyimide FPC can't reliably handle the high-frequency signalling. Still, many laptop makers use it, to regrettable results.

The few solutions left are micro-coax cable, running twisted pair inside the case, or passing it over a hard PCB bridge. All are very expensive, and hard on assembly lines.

This is why in most budget laptops manufacturers do the following trick: they put all the USB 3.0 ports on one side, and 2.0 on the opposite. This way they only have to run the low-frequency, low-lane-count USB 2.0 to the other side of the laptop.

Only very recently, there came an FPC maker that makes custom cabling with some proprietary dielectric that can handle 3.0. And they charge more for it than for hard pcb bridge.

Even if the cable material issue is solved, you still have to find way to connect it to PCB. High frequency connectors are not cheap either.


Thanks for sharing an interesting industry perspective.

Color me unconvinced on one point though:

> Only very recently, there came an FPC maker that makes custom cabling with some proprietary dielectric that can handle 3.0.

A brief glance at IPC-4203A (which has been around since 2013) tells me this is a consequence of market cost sensitivity, not one driven by the inherent performance limitations of available material.

The latest IPC-4203B was just published around March 2018. Is there something new worth looking into here, or is this "recent advancement" just a manufacturing-process-driven one which has made high-performance FPC more viable in the throwaway consumer electronics market?


It's more about finding FPC factories that deal with anything amounting to a "specialty product," and finding somebody who can engineer that FPC with all the attention needed for signal integrity and the fickleness of high-frequency circuits.



Others have mentioned tighter tolerance in the cable and PCB parameters.

Each successive frequency increase has required more complex PHY design.

On the digital side: more sophisticated bit-encoding schemes.

On the analog side: tighter tolerances for clocking and signal jitter, more sophisticated transmitter and receiver technology (better emphasis/de-emphasis, better equalization).

Here's a taste of the PHY changes: https://www.chipestimate.com/Type-C-USB31-PHY-Challenges-in-...


Nope. Much simpler. Just enabling the use of the other data lanes that were already in the cable waiting to be used one day.

USB-C has 4 differential twisted pair data lines.


Not 'just' at all.

USB 2 has a single twisted wire pair, specced for moderate speed.

USB 3 replaces that with two pairs with much tighter requirements, one for transmit and one for receive. This involves a tenfold speed boost (.5Gbps to 5Gbps), with much more modern transceivers using much higher frequencies on much better wires. But it also drops the length a single passive cable can be from about 5 to 2 meters.

USB C doubled the number of pins while making the plug reversible, so they went ahead and made the special case of C-to-C cables connect to all the pins and have four pairs of wires.

But that's only a doubling from 5Gbps to 10Gbps. How are we getting up to 20 or 40? It's mostly coming from restricting cable length even further, down to 1 meter or only half a meter. But on the bright side you can get an active cable, with chips inside, that can get you back up to 1 meter or 2 meters or 60 meters. As long as you're willing to pay enough.
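
For reference, a sketch of the nominal numbers behind that progression (per-lane signalling rate times lane count, before encoding overhead; the passive-cable length limits discussed above come on top of this):

    # Nominal link rates: per-lane signalling rate x number of lanes.
    generations = [
        ("USB 3.0 / 3.2 Gen 1",    5, 1),
        ("USB 3.1 / 3.2 Gen 2",   10, 1),
        ("USB 3.2 Gen 2x2",       10, 2),
        ("USB4 (TB3 signalling)", 20, 2),
    ]
    for name, per_lane_gbps, lanes in generations:
        print(f"{name:24} {per_lane_gbps:>3} Gbit/s x {lanes} lane(s) = {per_lane_gbps * lanes} Gbit/s")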


Now, if only someone would actually ship a USB-C 3.2 (or Thunderbolt) active cable. :(

I have a couple of situations where I would like a 7m-ish Thunderbolt 3 cable but can't seem to find one, even for the multiple hundreds it would probably cost.


Corning has several lengths of optical thunderbolt 2 cable, do you need it to be thunderbolt 3 specifically?


This is for 5K and docking station specifically.


Ah, so just barely more than thunderbolt 2 or USB 3.2 can handle at 60Hz.

It's really disappointing that Display Stream Compression hasn't been built into everything for the last 5+ years, and is only now barely getting into equipment. 2:1 or 3:1 compressed display data looks fine, and being able to add more pixels or bit depth or refresh rate far more than makes up for the tiny losses. Without it, tons of screens are stuck at significantly reduced framerates or 4:2:0 subsampling.
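
Rough numbers for why 5K at 60 Hz sits right at that boundary (raw pixel rate only; blanking and encoding overhead push it somewhat higher, and DSC pulls it back down):

    # Uncompressed video bandwidth for a 5K panel, before blanking/encoding overhead.
    width, height, refresh_hz, bits_per_pixel = 5120, 2880, 60, 24
    raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    print(f"5K @ 60 Hz, 24 bpp: ~{raw_gbps:.1f} Gbit/s raw")   # ~21.2 Gbit/s
    print(f"With 3:1 DSC:       ~{raw_gbps / 3:.1f} Gbit/s")   # ~7.1 Gbit/s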


Agree… but also, why isn't anyone making optical Thunderbolt 3 (or USB-C for that matter) cables? :(


That sort of explains why USB cables have been getting more expensive. More copper.


Probably more like "better tolerances" or something like that. The labor/time put into a cable is the most pricey part; raw materials are probably only a few cents, even if you double the copper in it.


It's more the process of qualifying technology that's increasingly faster with every generation than the cost of raw materials.


and more silicon. These cables are more than just wire.


Why not turn them on all at once?


Crosstalk with other pairs.


I still wonder why the 'S' in the name - USB stopped being serial (or a bus) quite some time ago.


> ... it will not be exactly Thunderbolt 3 as its functionality will likely be different.

I don't understand how this isn't simply ratifying/renaming current TB3-on-Type-C-connectors and this cryptic sentence in the page doesn't help. Anybody know?

This isn't to say that such a renaming might not be a good idea! But it would be nice to know if my current Type-C ports with TB and DP support were in fact already "USB 4.0."

Also: speaking of nomenclature: notice that according to the press kit slide show, USB 3.1 Gen 2, USB 3.2 Gen 2x2, and USB4 all have the same "Alternative Branding": "Super Speed+". Madness!


Here's an idea, let's make different connector shapes depending on the cable/connector's capabilities, lets do one for connectors that support display, one for power, one for thunderbolt, that way they're easy for the average consumer to identify!

Oh wait that's the issue USB was supposed to solve in the first place....


My hope is that Type-C is going to be all of those things so that in the future, I only need one kind of cable (until the inevitable new plug that supplants it, but hopefully universally used as well). Currently, I have to pay attention to a USB3 Type-C cable vs a Thunderbolt 3 cable as there is a difference (I was shopping for a long TB3 cable and was unable to find one; instead, it seems the longer ones tend to be USB3 Type-C cables).


That's because TB has length restrictions slower USB protocols do not.

Don't forget that higher-power PD cables (it can go up to 100W!) should have heavier-gauge wires as well.


This is not a realistic expectation, unless you are OK with your "one cable" being 0.5m long. The "one-cable" thing is a dream, unfortunately.


"Active Thunderbolt 3 cables support Thunderbolt at 40Gbps data transfer at lengths of up to 2m. Optical cables are targeted later, with lengths of up to 60m."

https://blog.startech.com/post/thunderbolt-3-the-basics/


They are, of course, stupidly expensive, and would cost more than many of the devices that use USB as their connection protocol.


Of course the optical cable won't transfer 100W of power, so we're back to needing two types of cable (power delivery USB-C and non-power-delivery USB-C).

I guess technically you could make a USB-C cable that transfers signals optically and has heavy-gauge wire for power, but that sounds unwieldy and expensive.


> Of course the optical cable won't transfer 100W of power

You sure about that? The optical cables will still have copper wires for power transfer. Only the high-speed data is carried over optical fiber. Eliminating electrical interference from the data path might even permit higher voltages, and thus thinner, more flexible wires for the same power.


And don't forget that the One Cable costs $40.


As long as the only difference between cable capabilities is speed, I'm happy. We're not quite there but we're close.

Also it would need to be marked on the cable somewhere.


This is one of those "it's a bug, not a feature" things. I don't mind having unified connectors so long as we have segregation between powered/unpowered connections at the cable level. It prevents faults by design.

There's no reason that an unpowered device should have the same connector as a powered one, and quite a few reasons why it shouldn't.


It's not going to happen in the future either. There is never going to be a desktop motherboard that delivers full-spec (100W) USB power delivery to, say, 6 ports at once, or a laptop even capable of delivering it to 1.

The USB spec is too ambitious, and it's inevitable that some use cases are mutually incompatible.


TB3 is already the "all of the above" cable, so hopefully USB4 will be as well.


The problem comes in practice: First, OEMs don’t bundle Thunderbolt cables with a device unless the device actually has a Thunderbolt controller onboard. Meanwhile, USB-C cables retail for $8 while Thunderbolt cables retail for $40.

So the end result is that no one actually has “all of the above” cables, but rather a collection of cables with different, often invisible constraints. Maybe the move to free licensing will bring TB cables down to USB cable prices, but it remains to be seen.


I like how this was solved for USB A/B connectors by color-coding the inside. I'm curious why this wasn't brought over to USB-C; it seems like such a simple solution, though maybe there are just too many variations/combinations for it to be practical at all.


So… maybe I'm missing the details, but is this just basically rebranding TB3 as USB4? That would make a lot of sense, if so.

Edit: Ah, I missed "it will not be exactly Thunderbolt 3 as its functionality will likely be different". That's as clear as mud, then.


Whoa. Does the addition of Thunderbolt imply that all USB4 host systems are exposed to DMA attacks[0]?

Doesn't this open up every USB system (all systems?) to arbitrary, uncontrolled memory access including silently flashing new firmware/microcode to system components?

0: https://en.wikipedia.org/wiki/DMA_attack


That's what the IOMMU is for. USB4 being Thunderbolt could therefore be a good thing, because it will force operating system vendors to actually set that up correctly, making the hardware more secure generally.

The option also exists to require approving or whitelisting devices before they are allowed to work over Thunderbolt.
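
On Linux, for example, that approval mechanism is exposed through sysfs; a minimal sketch of reading it (paths and attributes may differ by kernel version, and writing requires root):

    from pathlib import Path

    # Connected Thunderbolt devices appear under /sys/bus/thunderbolt/devices/.
    # When the security level is "user", each device exposes an 'authorized'
    # attribute: 0 = blocked, 1 = approved.
    base = Path("/sys/bus/thunderbolt/devices")
    if not base.exists():
        print("No Thunderbolt sysfs tree (not Linux, or no Thunderbolt hardware)")
    else:
        for dev in sorted(base.iterdir()):
            auth = dev / "authorized"
            if auth.exists():
                name_file = dev / "device_name"
                label = name_file.read_text().strip() if name_file.exists() else dev.name
                print(f"{label}: authorized={auth.read_text().strip()}")
                # To approve a blocked device: auth.write_text("1")  # needs root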


On my system (Dell XPS running Windows) I have to manually approve new Thunderbolt devices and cables. The driver prompts you to accept or ignore them. I assume this is implemented to prevent DMA attacks.


Not USB 3.2 Gen 2x2x2?? Come on, consistency people!


It ought to be USB 3.3 Gen 2x2x2, and USB 3.2 Gen 2x2 will be retroactively updated to USB 3.3 Gen 2x2 because everything is always the newest version of USB.


Best part is all USB 3 ports get to be rebranded as USB 3.3 as well.


I'm honestly surprised that they haven't rebranded USB 2 into USB 3.2 Gen 0 and USB 1 into USB 3.2 Gen -1.


Would USB 1.1 be USB 3.2 Gen -1.1 or USB 3.2 Gen -0.9? Hmm...


Neither. It gets to be USB 3.2 Gen -1 Plus.


Clearly we need a "semantic versioning" campaign for USB.


They're always free to skip ahead to the next major version on a whim. But for consistency's sake it should be something like USB 4.0 Gen 4x4.


The consistency is in not being consistent (unfortunately).


USB 4? You sure they don't mean USB 3.4 Revision 4 UltraSpeed HyperBus XP for Workgroups 3.11 Xtreme Edition?


I'd be curious at what cable length you get 40Gbps. TB3 is great; you can in theory dock a laptop with a single cable that will do charging, 10GbE, a USB hub, etc., but with a max 50cm cable, which makes it quite "un-lappable". I'd love to do the same with a 2m cable.


Copper thunderbolt limit is 3 m. Optical thunderbolt can go much farther (naturally) but I've never seen it in the wild -- it might even have been abandoned by now.


But you can't charge a laptop with an optical cable. As for copper, I understand you don't get 40gbps with a 3m cable (though I'm not a specialist).


You can, however, charge it with a combined cable that uses copper to bring power to the laptop (and optical converter), and optics to transfer the data.


Good idea, wonder if anyone will ever make one!


> But you can't charge a laptop with an optical cable.

I don't think that matters. If you care about full PCIe through a cable you presumably are charging through a different port.

> As for copper, I understand you don't get 40gbps with a 3m cable

Actually you do with TB: that's the spec and you can find independent verification via a web search.


I have a 2m TB3 cable that I bought off Amazon [0] a year ago. Some of the reviewers reported having trouble, but it has worked like a charm for me.

[0] https://smile.amazon.com/gp/product/B01H5QF2TK


CalDigit makes a 2m 100W TB3 cable. No backwards USB compatibility though, it's only Thunderbolt.


TB3 is only limited to 50cm for passive cables. Active cables can be longer.


Royalty free is the huge boon here.


Intel said that a couple of years ago, and yet you won't find an AMD laptop/desktop that supports thunderbolt.


Intel said they'd open the Thunderbolt spec and make it royalty-free years ago, but only now are they actually doing it (by donating it to the USB-IF).


Because no one other than Intel has yet bothered to develop the IP.

Much of the IP, especially for the PHY, is licensed from other companies, especially in the case of AMD.

AMD has only recently introduced CPUs that can be put into premium laptops, and they still don't have a CPU with more than 4 cores and an integrated GPU.

With Intel integrating TB into the CPU directly, it also means that no one is currently making discrete TB controllers. Titan Ridge is the last controller you can buy as a discrete unit; these can technically run on AMD systems, but the cost doesn't make it sensible.

With USB4 there will be TB3 compatible IP from multiple 3rd parties which AMD and OEMs could license to integrate it into the SoC directly or into the system itself via PCIe.


One, what AMD laptop?

Two, I'd be interested to know if AMD bans board manufacturers from including TB or if they just don't do it because their heuristics say users don't care or want it.

I'd imagine now with the rise of USB-C it would be the latter. Before that AMD was stuck on a 5 year old desktop platform that probably couldn't run thunderbolt ports even if board makers wanted to include it.


>Two, I'd be interested to know if AMD bans board manufacturers from including TB or if they just don't do it because their heuristics say users don't care or want it

No, AMD boards already include the required TB hardware. They just can't enable it (yet) for whatever technical or legal reasons. Someone hacked them to enable it (https://www.youtube.com/watch?v=uOlQbP63lDQ)


There is no required hardware needed for Titan Ridge to work other than a PCIe expansion slot; you need UEFI drivers for it, but that's pretty standard for anything today.


A couple of newer laptops are coming with Ryzen processors. Off the top of my head, the Huawei Matebook D has an AMD option.

Intel has just now opened Thunderbolt up (the topic of this article), despite the announcement coming two years ago.


Don't you still need to pay for a VID?


"The detailed USB4 specification will be published in the middle of 2019 and half-assed, cheap hardware that only implements a subset of the total features list in order to cut corners on cost but still use the new name should appear in 2030 behind flimsy out-of-spec connectors and loose, easily damaged cables."

I had to add a few things to manage expectations and more accurately reflect reality, but I think we've got it now.


Honestly, I'm waiting for old USB (USB-A, USB-B) to be deprecated and USB-C to become the new universal USB. As in "ATM machine", you get the drill. The current USB cable situation is a mess: my phone is USB-C, but it came with a USB-A to USB-C cable. Good thing I can connect it to my 2017 MacBook Pro using the laptop's original USB-C charging cable! However, then I can't charge the laptop. Sometimes I use a USB-C to female USB-A dongle to use my phone's USB-A-to-USB-C cable. And the whole thing has become expensive; good USB-C cables seem to be made out of gold. Life would become so much easier if everything was USB-C to USB-C.


Don't get your hopes up, as this is just a dream. USB-C only specifies the connector. What your cables support, and what end devices support is anyone's guess. See for example https://superuser.com/questions/1199917/what-is-the-d-shaped...

In my opinion, the mess is actually much worse with USB-C, as with earlier connectors I at least had a chance of guessing whether the cable/connection would work by looking at the connectors. With USB-C? No idea.


In practice this is a significantly smaller issue than most media and techies make it out to be.

Most devices are USB, they work and charge over the connector just fine. On laptops, connecting displays work as well. They will either use TB or negotiate for HDMI/DP which is also fine.

The only real outliers are crappy companies which deliberately break protocols like Nintendo on their Switch.


>Most devices are USB, they work and charge over the connector just fine.

There are USB-C charge cables that do no data (to save cost).

>On laptops, connecting displays work as well. They will either use TB or negotiate for HDMI/DP which is also fine.

That assumes the cable can do TB, which is not always true.

Once you get USB-C everywhere, everyday normal users expect it to plug in and work.


It's not like we already had a cable and protocol that could encapsulate arbitrary data and transmit it between devices without concern for what upper level protocols were in use or what the data type was, with the only concern with regards to the cable being the max speed supported.

Oh, wait, Ethernet is a thing. I wonder how they just completely missed that when making USB 3. The only thing that should be differentiated between cables is max power delivery rating and data rate.

Edit: typo


It's a software issue. No one cares what the port says. When I plug something in, the OS should notify me and try to autonegotiate with the device. And if there's a problem, let me know what it is. Not enough power? Port or cable issue? Great, I'll just plug a different cable into a different port, etc.

The OS knows the USB controller hierarchy, knows what the device wants, etc.

Cables might be a mess, but it's because low-level software is absolute bovine manure biogas plant on fucking fire level shit when it comes to user experience.

Just a few weeks ago I had to spend about 1.5-2 hours (not exaggerating) trying to get 2 Logitech gamepads working on Win10. (It worked fine at first, but wouldn't survive a reboot. It worked fine in Linux.)


> good USB C cables seem to be made out of gold

USB Type-C cables are active (they have silicon in them and handshake with the host, rather than simply being wire, possibly plus a resistor, as with old USB cables).

Cost will come down as volume goes up. Currently the only way to buy a cheaper cable is to buy a noncompliant and possibly dangerous one.

> Life would become so much easier if everything was USB C to USB C

I would agree with the added caveat of "... and if you could tell by inspection what sort of cable you have: speed, power, and protocols supported -- and some way to verify conformance"


> Cost will come down as volume goes up.

I have no doubt the overall cost will go down, but I'm curious how much we can reasonably expect, and whether said difference is actually meaningful.

Consider Type-C qualification testing[1] as it stands... it's quite convoluted. Any cost savings from volume economies of scale would be for nought if testing such high-performance products can't scale proportionally. Given that Type-C was specifically designed for the cost-sensitive consumer market from the outset, and how capability keeps increasing in ways that make one wonder if it's even the same fundamental underlying technology anymore, I wonder how close to the "market noise floor" a qualified device truly is.

In the defense/aerospace industry, it's commonplace for a $0.07 electronic component to have $10 of testing behind it before profit and indirect costs are baked in. Same outcome with MIL-DTL-38999 interconnect, which has literally been around for longer than I've been alive, supplied by multiple competing manufacturers, and leveraged by multiple industries (not just mil/aero)--testing requirements ultimately keep costs high. Although not the mil/aero market, it wouldn't surprise me if Type-C is already close to a similar point of diminishing returns.

[1] https://www.usb.org/usbc


Most of the cables aren't going to have anything more than a few-cent chip that lists the capabilities.


That's true; a few cents of chip and a couple of cents more complex assembly, propagated through the supply chain. (there's a little bit of handshake for PD, billboard etc but I agree with your point that the cost will be pennies).

Once they become ubiquitous I imagine the usual forces will drive the cost way down so we'll end up with something like the current mix of good and crap cables. Hopefully nobody will advertise 100 W over 30 gauge wire, but probably...!


What kind of BS phone is that? Pixels support USB-PD which is something I'd say should be mandatory if you have a USB-C port. I can charge other phones, charge from other phones, charge my Nintendo Switch, charge from my PC, etc.

That they gave you the USB-A to USB-C cable as your go-to cable is a red flag that they are peddling the connector as a marketing bullet point.


Huawei P20. It can charge my MacBook and vice versa, using the original MacBook Pro USB-C charging cable. It's a solid phone, but it comes with a USB-A wall plug.


Does this also mean that soon every computer will be making noise/radiating waves around 40GHz? Can someone with more depth in this area shine some light on this?


PCI Express is already working at those data rates.


Computer cases are electrically shielded.


As any radio amateur will tell you, PC shielding in the wild is generally poor. Plus the vast majority of PC RFI is radiated via the connected cables ( be that USB or the power cable ) or the monitor which form broad-spectrum antennas.

A PC and its appendages is a horrendous radio squawk-box and should be kept away from anything RFI-sensitive. The FCC 'must not cause harmful interference' clause is pretty worthless.


Thunderbolt is PCIe.


Not quite: https://twitter.com/whitequark/status/1097777102563074048

"Thunderbolt, uh, does not expose PCIe lanes directly. Thunderbolt is an MPLS-like packet switching network that can encapsulate PCIe TLPs over a PHY and MAC without a spec, chips without documentation, and software with barely any support."


Thunderbolt encapsulates PCIe frames. It's not PCIe any more than say ethernet is UDP.


The USB consortium is mostly EEs, and probably a majority analog EEs. I wouldn't worry about this.


One more question. Does this also mean that soon every computer will become software defined radio up to 40GHz?


No, they combine multiple lanes at a lower clock rate.

Additionally, the cables are twisted pair, which among other things minimizes emission. And high-speed cables are also supposed to be shielded.


I think the signaling is more like 10 GHz and data is scrambled to spread EMI over a wide range.


USB cables are supposed to be shielded.


Is tinfoil-hatesque EMF poisoning real?

I personally believe that it is real (just go camping where there's no service and you'll feel better) but it's hard to prove without decades of study and money, which won't happen because there's so much money to be made.

There are other reasons why camping makes you feel better. Earthing/grounding gets you in touch with the earth's natural schumann resonance (through the canvas tent) and the fresh air. Also too getting away from distractions.

There's a real study on this, but they don't speculate about the cause: http://time.com/4656550/camping-sleep-insomnia/


Could that just be the feeling of being outdoors, and away from distractions for a while?


Stop thinking about it and go with your gut feeling. EMFs are clearly the only reason you would feel better camping. /s


I dunno, but turning my home circuit breakers off does make me feel better


tl;dr: Intel made Thunderbolt 3 royalty free in 2018, so now it's essentially being adopted as USB4.

I had to look up what the point of 40Gbps was, and the industry evangelization site https://thunderbolttechnology.net/ explains that it will allow driving one 5k display or two 4k displays, which is not possible with the 20Gbps offered by USB 3.2 or Thunderbolt 2.

This also helps external GPUs, where TB2/USB 3.2 is the equivalent of about 2.5 PCIe 3.0 lanes, while TB3/USB4 is about 5.

With that, it seems the goal is to be a viable alternative to PCI-e and HDMI, rather than just improve on today's USB3 speeds for existing device classes.
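
Back-of-the-envelope math behind those lane equivalences (my own numbers, comparing raw link rates and ignoring protocol overhead):

    # PCIe 3.0: 8 GT/s per lane with 128b/130b encoding
    pcie3_lane_gbps = 8 * 128 / 130    # ~7.88 Gbps usable per lane
    print(20 / pcie3_lane_gbps)        # ~2.5 lanes' worth for a 20 Gbps link
    print(40 / pcie3_lane_gbps)        # ~5.1 lanes' worth for a 40 Gbps link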


Thunderbolt is PCIe so instead of a backplane you can use a cable. So basically anything you'd use PCIe for (drives, external GPUs etc) you can use TB for.


I'd be wary of plugging thunderbolt devices in, see http://thunderclap.io/


Not quite, because the equivalent pure pcie connection is faster than thunderbolt due to the overhead in thunderbolt.

We need another doubling in thunderbolt's speed for it to not have a penalty when using eGPUs


It's been a few years since I was programming those things, but IIRC the TB3 frames are comparable to PCIe 3 frames (terminology is slightly different between the two).

The real thing is that TB3 only has four lanes, while Intel's server (and I believe some desktop) chipsets have 16-lane PCIe. Not sure about the mobile chipsets (again, I no longer follow this as closely as I used to). So indeed you won't be able to pump data into an eGPU as quickly as you could a high-performance internal one. Or perhaps we should add "yet"?


Does 40Gbps mean two 4K displays daisy chained?


In theory, yes. Since driving one 4K display at 60 Hz with 4:4:4 chroma takes roughly 18 Gbps, daisy-chaining two displays should be possible.
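
The arithmetic behind that ~18 Gbps figure, using standard HDMI 2.0-style timings (my own back-of-the-envelope numbers, assuming 8-bit RGB and 8b/10b line coding):

    total_pixels = 4400 * 2250    # CTA-861 4K60 timing, including blanking
    bits_per_pixel = 24           # 8-bit RGB / 4:4:4
    link_gbps = total_pixels * bits_per_pixel * 60 * 10 / 8 / 1e9
    print(link_gbps)              # ~17.8 Gbps, so two such streams fit in 40 Gbps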


It just means that on the spec sheet there's enough bandwidth for two 4K streams. (HDMI 2.0 4K@60fps == ~18 Gbps)


Yeah. One 4K60 stream is between five and six Gbps with 4:2:0 chroma subsampling (closer to 12-18 Gbps uncompressed at 4:4:4).


Yes, or with a hub.


Will it be possible to connect a Thunderbolt 3 Macbook or Intel Ice Lake PC via USB-C cable to a USB 4.0 device and obtain 40Gbps throughput?


Article says "The USB4 specification will be based on the Thunderbolt protocol that Intel has contributed to the USB Promoter Group. The new interface will use USB Type-C connectors and will maintain backwards compatibility with USB 2.0, USB 3.2, and Thunderbolt 3 interfaces", so bet on probably. But maybe not, they might only mean that a USB4 computer can use a TB3 device.

I expect they'll continue the trend of inscrutable capability differences in USB-C ports, cables, and devices. To get the USB4 40 Gbps mode (effectively TB3) it's still going to require a more expensive active Thunderbolt-style cable, not a vanilla USB cable. Which means that standard markings for those cables will (hopefully?) be part of the USB4 spec.

Unfortunately, knowing USB-IF the major change from the TB3 to USB4 spec will be to drop the current Thunderbolt symbol on the ports and cables and replace it with a new variation of USB-super-hyper-mega-speed symbol to make cables as confusing as possible.


or port color. We've used blue and red at least, guess it is time for green (in which case, confusion for the Razer Blade users).


Apple will never use coloured ports.


Pulling my MacBook Pro out of my backpack to look, it appears they aren't fans of iconography to indicate what a port is capable of either. Can't remember if my old 2013 MBP had the icons or not, but my current one for sure doesn't.


All ports on Apple's laptops behave the same for charging - they'll continue to provide power while sleeping, but won't start providing power. If you want to use it as a battery in your backpack while it's closed you have to plug your phone in, wake the computer up, then put it back to sleep. At least that was the case last time I tested it.

For the most part they don't need colors because all of the ports are the same. One recent exception was that the 2016/2017 MBPs had inconsistent Thunderbolt bandwidth because they didn't have enough PCIe lanes. IIRC left side had full 40Gbps ports but the right side had 20Gbps. The 2018 version has full speed on all four.

EDIT - another potential point of confusion, they don't mark Thunderbolt ports vs USB-C ports. On any given device they're all the same, but they expect you to know that the 12"-mini-macbook is just USB-C while the Pro and Air are all Thunderbolt.

So yeah, it'd be helpful to have some symbols on the laptops. They do label ports on their desktops. I assume this comes down to Ive not wanting labels on his beautiful chunk of aluminum where someone might accidentally see them while using it.


Wow, "only starts charging while awake" really explains some confusing mornings waking up to a dead laptop.


To clarify, I'm talking about using the laptop as a power source to charge other devices (phones, tablets, etc) off your laptop's USB ports.

If you plug them in while the laptop is asleep they won't start charging. If you plug them in while the laptop is awake they'll start to charge, and they'll keep going if the computer goes to sleep later.

If your laptop isn't charging itself while plugged in to the wall, you have other problems.


we use yellow for power/charging


Yellow IIRC means "provides power even when the computer is sleeping."

It'd be harder to do markings like this on USB-C ports though. They don't have a big visible chunk of plastic in the middle.


I was being a bit facetious with the initial suggestion, honestly. Mostly poking at color being used as a universal indicator, and then you have a case like Razer where they introduce green to be complementary to their laptop's color scheme.

Beyond just switching to brand + version number (ie [USB logo] 4, [thunderbolt logo] 3, etc.), I don't know of a great way to really go about this. Otherwise you end up with the super speed championship edition thing you posted, which isn't going to help anyone really.

I think the color one was a great idea on their part in that you didn't have to look too close to see which port you were plugging into, but all it takes is one detractor to deviate from it (say if MSI came out with a gaming laptop with red USB2 ports), and then the whole idea is shot.


Probably depends on the specs of the cable, and other factors like which port you plug into, cable length, etc.

Some people are complaining about how this doesn't match expectations, which are that you get the same speed just by using the right shape of cable (and don't care about cable specs). I think that the reality is that this is really pushing the envelope, and 40 Gbit/s isn't that far from typical memory bandwidth (order of magnitude). There are only so many lanes of PCIe to go around, and you're going to end up with some ports getting more than others. We already have problems in Ethernet land with the different cat 5e / cat 6 / cat 7 cables, where bandwidth autonegotiation is mandatory. Most people just want it to work, and if you want your full bandwidth you have to pay attention to the exact specs of every device in the signal path because you can't be sloppy the way you could with 100BASE-T or even 1000BASE-T.

That's a bit of a rant, but I just think that some of the user-experience problems in this area come down to the fact that this really is a lot of bandwidth to push through an external cable, and you end up with the simultaneous goals of high bandwidth, lots of ports, all ports being the same, and not charging an arm and a leg for a laptop. Sacrificing bandwidth on a couple of ports makes sense to me in this context.

Seems like the user experience pays the price, though, because I'm not sure how I'm supposed to know which ports have the full bandwidth.
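
A rough sanity check on the memory-bandwidth comparison (my own figures; exact numbers depend on the DIMM configuration):

    usb4_gbps = 40
    # single-channel DDR4-2400: 2400 MT/s x 8 bytes/transfer = 19.2 GB/s
    ddr4_gbps = 2400e6 * 8 * 8 / 1e9
    print(ddr4_gbps / usb4_gbps)    # ~3.8x, i.e. the same order of magnitude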


This is a poisoned gift; there was a Twitter conversation posted a couple weeks ago about the clusterf*ck that Thunderbolt is... quite a scary read :-/

https://twitter.com/whitequark/status/1097777102563074048


Honestly, most of this addresses that, by opening the spec and getting rid of royalties.



Is it an unpopular opinion that USB is already fast enough and has enough features and USB implementers should instead work on cost reduction and compatibility so that we can all move away from USB 2 and pre-type-C connectors?


USB is already fast enough for all the things we've traditionally used USB for: wifi, mice, audio, smartphones, printers, thumb drives, HDDs, etc.

It's not fast enough for next generation use cases: external displays, eGPUs, or modern SSDs.


It's not fast enough. USB cannot drive a 4K monitor at 60 Hz. You can use the physical connector to do it, but that relies on Thunderbolt (or DisplayPort alt mode), not USB itself. This means the dream of a one-cable-for-everything setup is not here yet, unless you use a Thunderbolt dock, which multiplexes the USB traffic in with the monitor signal. A Thunderbolt dock costs around 10x more than a simple USB dock.


To be honest, I'm not really sure why this is necessary. I don't see why we couldn't have two interfaces: one simple, moderate-speed one for most use cases, and a fast one (which could incorporate the slower one) for when it's needed. Trying to cram everything into one interface seems to introduce too many compromises (I no longer know what the port does) and a lot of complexity.


The point is you shouldn't have to know what the port does; it should seamlessly interconnect. You shouldn't have to be thinking about wires, etc. It's like how Bluetooth somewhat seamlessly interconnects across generations of Bluetooth. That's how all standards should be.


One cable for everything will mean that the cables cost an insane amount though.

Oh, and they will either be short or require power to run the cable itself.


USB-C is already there up to 4k@30hz, and none of the stuff you're saying is true, unless you consider current USB-C cables insanely expensive. You can get a USB-C dock that provides power as well as the ports you need on one cable for ~$60.


Aren't USB-C cables limited to 2m though? And $60 sounds a lot to me.


I am of the extremely unpopular opinion that USB 2.0 is plenty fast for my needs. For those who do need more than 480 Mb/s, however, it seems logical to standardize on a single Type-C connector. But with so many competing and superseding protocols implemented over the Type-C cable, I feel like labeling should be mandatory so that users know what they can and can’t use a cable for (and don’t fry a power supply or run into incompatibilities).


Is it fast enough for 4K120 or 8k displays?



I haven't actually seen an 8K monitor yet, but a couple weeks ago I was at an Apple store and checked out a 5K monitor. PDF text was noticeably sharper and at the same time smoother than on my 4K monitor at home. Both are relatively recent 27" IPS panels made by LG.

So I don't know if I will be able to tell the difference between 5k and 8k in 27" size, but I know that 4k is not enough.


So the first video argues that framerate matters more than resolution for gaming, which isn't relevant to this discussion. The second is mostly about how screens and eyes work; I'm not sure why you linked it at all.

8K is useful for crisp lines and text on a big close monitor.


I probably can't see even 320x240, but that doesn't mean I don't like having the extra pixels since I don't stare at one spot the whole time. Upgrading to 4k felt extremely liberating compared with 1080, IMO - productivity is so subjective I'm not sure it can be said scientifically that more screen real estate is necessarily pointless. Plus, imagine a screen the size of a whole wall - if you want to project a life size image of a desktop computer onto it, you can only ever render some small subset of your pixels as its screen, so if you want to render it with a 1080 display you'll need a lot more than 4k. The number of pixels one needs is highly dependent on what one is doing with those pixels.


My favorite monitor is my 42.5" Dell ultrasharp. The screen real estate is amazing, but it has old school 100% scaling. I would absolutely love the same panel dimensions with twice the number of pixels.


That's approaching the visual acuity of humans. I think 8K is where we're going to end up for everything.


USB or the alternate modes over USB-C connector?


The modes don't matter if bandwidth is not there.

Ultimately, I just want every monitor and TV to support a single universal interface, fast enough for the highest resolution/refresh rate. I don't care what it's called or how it works. If "USB 4.0" can be it, great.


Well, you can already get cheap USB-C cables that only have USB 2 speeds and power delivery capacity, if that is what you're hinting at.


This is quite meaningful for external graphics solutions, PCI peripheral devices, and other modular pc components.


I am starting to see computers with USB C on the host side. The future of one single connector to rule them all will be here soon. Wait a minute, we already had that, it was usb 2.0?!?!


USB-A/B for USB 2.0 lacked the pin count to provide compatibility with other connector standards via a passive adapter (except a few like PS/2). It also lacked the data bandwidth to encapsulate many other connectors' electrical protocols when using an active adapter, except a few like RS-232 and 100 Mbps Ethernet. It couldn't deliver much power either: 500 mA at 5 V is only 2.5 W, enough for a mouse or webcam.
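
For scale, the nominal power budgets involved (figures quoted from memory of the specs, so treat them as approximate):

    print(5 * 0.5)    # USB 2.0:  5 V x 500 mA = 2.5 W
    print(5 * 0.9)    # USB 3.x:  5 V x 900 mA = 4.5 W
    print(5 * 3.0)    # USB-C @ 3 A:            15 W
    print(20 * 5.0)   # USB PD (20 V, 5 A):     100 W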


True, I just realized that this is for display bandwidth. And also in the future, possibly for inexpensive 40gbps networking/video/gpu


For me the true killer feature of USB-C is charging everything. Finally, a single charger for the laptop, the tablet, the phone, the router etc.


I never saw anyone hooking up a monitor via USB 2.0


I used a USB 2.0 to DVI adapter a while back on an old polycarbonate Mac that I wanted to drive a second monitor from.

It actually worked really well for that use case (coding/web browsing). It probably wouldn't work at all for anything stressful. Getting the driver set up was a little challenging compared to the usual no-effort-required.

The website for the product... well frankly not good. https://www.newertech.com/products/viddu2dvia.php


I've seen them for sale on Amazon and elsewhere. I always thought they'd be neat to throw in a supplemental kit bag for a second display on the go. I think some of the smaller ones were even bus powered.


I've seen those usb 2.0 display adapters and I never used one but I heard that they had low refresh rates and/or low resolution, due to the 5 volt limits.

I personally use a usb 2.0 NIC so that I can be packet capturing on multiple switch/vlan/ingress or egress at the same time.


Bus powered and decent res (not 4K), but very workable for writing code and casual web browsing. I put a comment about my experience with one above.


Those are USB 3.0-based DisplayLink connectors.

Any supposed USB 2.0-based display connectors (I’ve never seen any, I’m just speculating) would have been some VNC-like contraption involving an emulated frame buffer on the host.


I've used bus-powered USB 2.0 display adapters in the past, even up to 1920x1080. They're usually a software-based renderer writing to a frame buffer on the USB device. Big updates or lots of USB traffic meant dropped/torn frames, and lots of extra CPU power used to drive it. It worked well for word processing and light web usage, but you wouldn't be watching an HD movie on it.
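
The arithmetic makes the trade-off obvious (my own rough numbers; the real adapters got by with compression and partial-screen updates):

    raw_1080p60_mbps = 1920 * 1080 * 24 * 60 / 1e6   # ~2986 Mbps of raw frame data
    usb2_mbps = 480                                   # theoretical; real throughput is ~280-320
    print(raw_1080p60_mbps / usb2_mbps)               # ~6x more than the bus can carry uncompressed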


They weren't that great but we had them 15 years ago. VTLink. Dual monitors for laptop users, PC or Mac.

