Many of the cables/dongles are long and skinny relative to the tiny metal connector, placing a fair amount of torque on the port. A number of the cables I've used are poorly fitting as well, exacerbating the mechanical strain.
A good example of this problem is the YubiKey 4C. The connector protrudes about half a millimeter from the port and has enough play to wiggle up/down/left/right. Having had to replace the logic board and I/O boards on my 2016 MBP 15 within 3 months due to port damage, I get a bit nervous when I notice fit problems like this.
Aside from outright damaging the ports, poorly built cables seem to wreak havoc on the mechanism that grips cables on Apple machines. Prior to the board replacement, the Apple power cables would gradually come loose by their own weight. Fortunately, the Geniuses did agree that this was a problem and replaced the top case along with the guts of my computer.
The Dell XPS 13, which I'm getting in a week or so, has both a proprietary power jack and supports USB-C charging. I'm very happy that I won't be breaking many more USB-C cables.
The proprietary connectors used by Nokia, Ericsson, etc., before USB were all "better" in the sense that the contacts were simply pressed against each other, much like what used to be (?) on iPads and Macs.
I've killed a number of USB ports by walking away with or dropping the phone while it's connected.
I'm not entirely sure whether I prefer it over USB-A, but I also prefer it over an HDMI or full-size DisplayPort connector.
Glad I didn't bother with Type-C on my desktop, or laptop (but did get USB 3.1 Gen2 Type-A ports). And just got these connectors for when I need to transfer some data. https://www.amazon.com/TechMatte-Connector-Resistor-Approved...
With my 2016 MBP, the USB-C connectors have had no mechanical problems. I plug in various dongles, connectors, and USB drives every day. I'm left wondering what one needs to do to physically break the ports – I'm not very careful!
It looks to be a much simpler and sturdier connector.
I meant the lightning connector (ie. the one being used on iPhones).
I know it supports some form of USB 3, but display out is handled by an H.264 decoding dongle (!). It unfortunately doesn't have enough pins to really support the range of alternate modes and high data speeds of USB 3.
What is far more of an issue for the standard is the established network effect. USB-A has become so commonplace that it's built into our furniture, and micro-USB is a legally defined requirement for device charging in some areas. These foundational forces will prove to be a massive deterrent to USB-C uptake, and far more of a risk to the vision than multiple cable types that are clearly labeled!
But the bright side of this, I think, is Apple's adoption of the Qi charging standard. It's clear that an evolutionary jump to wireless charging will entail new technology, so I see Qi taking up the "one way to charge everything" vision.
Practically every mouse, keyboard, printer, camera, charger, and external drive made in the last 20 years has used USB-A. Practically every car in the last 10 years has USB-A. Every hotel room, every airplane, every DC power converter. There are still millions (billions?) of devices shipped every year with USB-A.
We are nowhere near the inflection point.
Two compatible cable standards are easy to handle. The hard part comes when you have USB-A, USB-C, Lightning, Micro-USB, and Mini-USB all competing. Trying to keep enough of each type of cable on hand gets to be pricey.
So you really don't need that many cables. C-to-C, C-to-mini, and A-to-C will do you.
Where? Wouldn't this outlaw iPhones?
Obviously 'voluntary' here deserves some scare quotes, since the EU had made it very clear that it was willing to swing its sledgehammer and formally regulate if the manufacturers couldn't agree on a standard amongst themselves. That would have been less good for everyone (including the EU), e.g. since formal legal regulations are a lot harder to change with the times. So they were pretty incentivized to work something out, and happily they did.
One really good EU standardization effort which actually worked except for Apple.
The main point was to facilitate that chargers could be reused, rather than fill up drawers and landfills.
Except for Apple, which doesn't offer any low-margin phone and thus can still use their custom charger and bundle an adapter.
Thus the charger can be reused on a different device with a simple change of cable (a different story is that Apple has their own way of signaling max amps to the device).
Still, it's kind of sad you can't buy an iPhone and plug it into your new MBP
Not perfect by far, but better than it used to be before the EU started telling phone manufacturers to wise up.
It’s easy to forget how crazy bad things used to be before the EU mandated a standardization.
Is it surprising that it doesn't work well? Not really. The compromises are pretty extensive. And with users demanding both thin phones and rugged connectors, it's really, really hard to deliver with anything other than perhaps titanium.
I have a bunch of computers, my oldest was built in 1968 the latest was built last year. Invariably the port complexity goes up with the older computers, and connector reliability also goes up. I've got AUI cables for Ethernet that work as well today as they did in 1984 when they were all the rage. I've got 50 pin SCSI-1 and 36 pin Centronics printer cables that are reliable and functional. I've got computers with 3 of the four USB 2.0 connections on them unusable either due to fusing issues or mechanical strain. I've got a phone handset (my developer's version of a Nexus 1) with a micro USB connection that won't hold the connector in the plug.
I guess the bottom line is that we can make really reliable and ugly connectors.
Those thick cables also had a significant impact on the mechanical strain - they required a substantial connector to avoid disconnection due to cable movement. But they were mostly used for desktops anyway, and weren't really intended to support the idea of moving the device around.
So, I'm sure most people appreciate the progress in the electronics side, the amazing data rates, the engineering of the wiring, the flexible, lightweight cables.
But where we've gone wrong is the usability side of things. Why did it take so long to come up with a connector which can be inserted upside down? USB-C is still not as good in this respect as a 1/4" jack, which has supported insertion in ANY orientation since its use in telephone exchanges beginning in 1878!
1/4" jacks were also bidirectional, with the same connector on either end of the cable; yet another issue we have to deal with on USB.
I would prefer that any cable you can plug into a USB C jack is interchangeable. Conceptually, it's a pipe between two devices. If they connect together, it should work. I don't think people generally understand that the cable itself has electronics in it that negotiates power and data rates. And I don't think we should go down the route of marking cables either. They should just be universal.
I hope Apple takes the lead on cleaning up this mess. They've committed to USB C, and this is a major problem right now.
Contact-based connectors rather than pin-based connectors, as mentioned in some other thread, might be the answer. I don't know what problems they might lead to, though. Personally, I'm a huge fan of Apple MagSafe 1/2 cables.
And thus we get a "standard" that is very adaptable to company "needs" but a nightmare in terms of electrical and computing (never mind basic daily usage).
I'm really looking forward to seeing where USB-C is in two or three years.
Try replacing "suboptimal" with "Pareto optimal" in your everyday life. It made a positive difference for me.
The joke was that you could fry an egg on your CPU because it got so hot.
And that high variance in cable quality is a bigger deal than it needs to be. You can't tell by looking at a cable what bandwidth and amperage it's supposed to support. Even if it has thick wires you can't tell if it has a chip to enable 5A charging.
Also there's a non-quality factor. USB-C cables with no high speed wire pairs are valid and are quite common. You don't need more than USB 2.0 in a charging cable.
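As an aside on the 5A point: my understanding of USB Power Delivery is that a source is only allowed to push more than 3A through a cable if the cable carries an e-marker chip that declares 5A support, which is exactly why you can't judge a cable by its wire gauge. A minimal sketch of that decision rule (illustrative function names, not a real API):

```python
# Sketch of how a USB PD source picks a current ceiling for a cable,
# per my reading of the spec: 3 A is the default limit, and 5 A is only
# allowed when the cable's e-marker chip reports 5 A capability.

def max_cable_current_amps(has_emarker: bool, emarker_rating_amps: float = 3.0) -> float:
    """Return the maximum current a source should drive through a cable."""
    if has_emarker and emarker_rating_amps >= 5.0:
        return 5.0
    # Non-marked (or 3 A-marked) cables are capped at 3 A regardless of
    # how thick their conductors look.
    return 3.0

print(max_cable_current_amps(False))       # plain cable -> 3.0
print(max_cable_current_amps(True, 5.0))   # e-marked 5 A cable -> 5.0
```

The point being: the capability lives in a chip you can't see, so the "thick wires" heuristic tells you nothing.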
Devices are a bit more complicated, but I'm hoping that the Hub situation will sort itself out (early in the days of USB 1.1, I remember plenty of devices that wouldn't work with hubs, or only with certain hubs).
The various proprietary extensions might be less consistent with labeling, but at least every Thunderbolt-compatible cable I've seen has a little lightning bolt icon in addition to the standard USB ones.
It turns out that the Apple USB-C high-speed charging cables just don't support the required Thunderbolt standard, and the software can somehow tell that the cable is incompatible despite it fitting in the slot.
I remember discovering that some of my Micro-USB cables could only charge and couldn't transmit any data. I honestly don't understand why people talk about the issues with USB-C as if it was a new thing. Seems to me like most of the problems are theoretic and that most end-users are actually happy with the new standard.
1. They already have a lot of accessories for Lightning, and they would kinda lose that sweet 30% MFi cut on 3rd-party Lightning accessories
2. A lot of people who don't know why USB-C is great will probably bitch about how Apple is changing the port on the iPhone _again_, like they did when they switched from 30-pin to Lightning. Could be a PR nightmare.
Take a given cable and there are a number of dimensions you need to consider:
- Supported power standards (none, USB, various other wattages and standards)
- Supported bandwidth. I think some don't even support data transfer (which might include the USB-C cable that comes with the Apple Macbook charger)
- Supported modes (USB, Thunderbolt, DisplayPort?)
There's no colour scheme that can adequately handle all the possible variations.
But there are other problems: it makes no sense to have all 4 ports on a 15" Macbook Pro (for example) support power. You only need one port to charge with. I think I read that not all the ports on the 13" Macbook Pro are the same (in terms of capabilities).
This just strikes me as a (hollow) victory for principle ("one port/cable to rule them all") over pragmatism. It always reminds me of the quote: a foolish consistency is the hobgoblin of little minds.
This whole thing is expensive, unnecessary, user-unfriendly and confusing (to the average user).
- Displayport, obviously.
- HDMI, because that might be useful in some cases.
- MHL Alternate Mode for USB 3.1, which is not-quite-HDMI.
- USB itself, like the external mini video cards created by DisplayLink
- PCI-E, as the monitor could contain any regular video card. Unlikely, but technically possible.
Is the monitor going to work when plugged into your laptop? What about your phone? Will it charge either of those? I'm not sure how this has managed to go so horribly wrong, but somehow "supports USB-C" has become completely meaningless. Oh, and it can also support analog audio as well!
And if anyone thinks I'm exaggerating, please tell me which devices do and which do not work with this "usb-c to vga" adapter, because to me it is literally impossible to tell: http://www.belkin.com/uk/p/P-F2CU037/
USB-C solves problems that nobody actually had, and in doing so creates a bunch of new ones that are expensive and difficult to solve for the end user. It isn't quite as dumb as 802.11ad docking, but it's close.
Once do-it-all chipsets get cheaper, all ports will do everything a user expects of them.
Type-C isn't great TODAY and that's because many implementations of Type-C are first generation.
Shitting on 1st generation technology implementations and saying it's a technology problem rather than an implementation problem isn't helping.
When the first USB rolled out I don't remember any issues like this. The big problems with previous tech were lack of support (like only having one USB 2.0 port and the rest being USB 1.1) and expensive cables (we do have those with USB-C). These were fairly ubiquitous for a good 10 years and I don't remember any issues.
I feel like there could have been better consumer-facing design, like color bands to mark support on cables and ports (the way USB 3.0 ports are blue). They would designate whether a cable or port supports Thunderbolt, USB 3, HDMI, DisplayPort, or high power. Resistors have been doing it for almost 100 years, and this version would be much simpler. Or, like PCI-E, have a lane system where more lanes support higher-level features.
But this isn't only a branding problem with incompatible cables. You just can't have a USB-C hub with this standard. From my (possibly flawed) understanding, because USB-C switches modes to carry something like DisplayPort, for example, you can't also carry USB data at the same time, so you can't split USB-C (without one of the ports shutting off while in that mode). Maybe you could if your hub didn't support those exclusive modes? I haven't seen any actual USB-C hubs for sale (that give you more USB-C ports on the other side).
A USB-C connector has one USB 2.0 pair, four high-speed differential pairs, and two sideband wires. For USB 2.0 you need only the USB 2.0 pair, for USB 3.1 you need the USB 2.0 pair and two of the high-speed differential pairs.
For DisplayPort, you need two or four of the differential pairs, plus the two sideband wires. If you're using two of the differential pairs for DisplayPort, you can use the full USB 3.1 at the same time; if you're using four of the differential pairs for DisplayPort, you still have USB 2.0 left.
The same applies to other alternate modes. As long as they need at most two of the differential pairs, you have USB 3.1, otherwise you have USB 2.0. And power delivery has its own set of wires, so it's also always available.
(The exception is the 3.5mm plug adapter alternate mode, where everything except analog audio and slow charging is unavailable.)
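The lane budget described above can be boiled down to a one-line rule. A toy model (my own illustrative names, based only on the pin counts given in this comment):

```python
# Toy model of the USB-C pin budget described above: one dedicated
# USB 2.0 pair, four high-speed differential pairs, two sideband wires.
# Given how many high-speed pairs an alternate mode (e.g. DisplayPort)
# claims, what USB capability is left over?

def remaining_usb(altmode_pairs_used: int) -> str:
    assert altmode_pairs_used in (0, 2, 4), "alt modes take 0, 2, or 4 pairs"
    free_pairs = 4 - altmode_pairs_used
    if free_pairs >= 2:
        # USB 3.1 needs the USB 2.0 pair plus two high-speed pairs.
        return "USB 3.1"
    # The dedicated USB 2.0 pair is always available.
    return "USB 2.0"

print(remaining_usb(0))  # no alt mode      -> USB 3.1
print(remaining_usb(2))  # 2-lane DP        -> USB 3.1
print(remaining_usb(4))  # 4-lane DP        -> USB 2.0
```

This is why a 4K monitor driven at full resolution typically leaves you with only USB 2.0 speeds on the same cable.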
I'm not sure how all those one-cable USB-C docking stations work then, as they carry video and audio one direction (to the dock and then out to the screen), and multiple USB devices and power the other direction (from the wall and connected peripherals and through the dock). If it truly couldn't do multiple things at once, none of these docking stations would work with a single cable, and they do.
This post says
> What the problem is: how the USB-C lines within a cable are used is determined at plug-in time, depending on what you plug in downstream of the port. For example, you can't have full-speed USB 3.1 and full video at the same time. So USB-C makes a choice for you. If you plug in both a high-res monitor and USB 3, then you can't have both at once.
It's still not technically clear to me. It does sound like your transfer speed will drop when using a hub and driving a monitor at the same time. Maybe the limitation I'm thinking of is that if USB-C is carrying something like Thunderbolt, you can't carry a second Thunderbolt signal? And because USB-C splitting compromises so much, hubs are so rare in the market?
I have heard that the order in which you plug in USB-C tells it which direction to charge (which gives credence to things being determined at plug-in time). For example, plugging the cable into your phone and then plugging it into a computer will make the phone charge the computer instead of vice versa. When I heard that, they were speaking about a specific phone and laptop, so it wasn't theoretical, but I'm having a hard time corroborating this. This article says the direction power flows is a computer setting. What's scary is there are a lot of articles warning you not to use your USB-C laptop charger to charge your phone because it could fry it.
Even if the signals of some ports are electrically compatible, I try to use different shapes of connector for them if they are not logically compatible (the functions are fixed, not swappable). Because you can be sure nobody will always look at icons which are faint, small, and can only be seen correctly under a certain angle and light, etc. Supposing you even know which icon means what. Supposing you can even see the icon and port at all, and it isn't hidden behind something so that you have to try blindly whether it fits.
You do not want to be called every other day because "it doesn't work anymore" because someone connected the wrong device to the wrong port :-). So for some serial ports with incompatible functions, you do not put all DB-9 ports, you put 1 DB-9, 1 round DIN, 1 rectangular (and hope they do not manage to fit the rectangular connector in the DB-9), 1 DB-25... It looks messy, shambolic? Yes it does. Definitely. But it is efficient because it is idiot-proof. And I mean a very large definition of 'idiot' which encompasses about everyone.
I can tell you about the number of times I 'plug' a USB cable into a DisplayPort or the power cable into the telephone plug because they sit in the same area and have sort of compatible shapes... I generally quickly notice because it does not fit well, but just imagine the nightmare if they were truly the same physical connectors...
I think the point of USB-C is that the same port can support both functions, so you don't have to care into which port you actually plug in the cable. My proposal was just for when you were buying a new device, to make sure it supported the same technology as your existing devices; only then would you compare the icons.
Is it possible to purchase USB-C cables that work for everything? Or are (some of) the different supported protocols incompatible?
I would much rather have 4 USB-C ports than even 8 ports that all do one thing only. It's like having $40 cash instead of 8x $10 gift certificates (of which perhaps only one you might want to use).
It wouldn't be so bad if it wasn't for the fact that USB-C/TB3 has been such a disaster and even mechanically a disappointment.
But it's never actually worked, I ended up RMAing the docking station. Since the laptop itself can only support one display, I can't have dual monitors. So now I have a 43" 4K monitor, a USB hub and a power connector that I need to plug in/out all the time. And Win10 still gets confused about DPIs and scaling and stuff.
On my ancient-but-still-kicking Dell Precision, at least the docking station actually works and you can simply click in and out in three seconds.
For me, I'm enjoying the power of the USB-c/Thunderbolt connector on my Dell XPS. I use a docking station, which I'm sending 4k video and audio out to while receiving video, mouse, ethernet and power over the same cable. One cable docking is amazing, and really shows off the possibilities of the technology (although in this case I'm pretty sure it's Thunderbolt doing the heavy lifting).
But even my CPU could manage to have 8 lanes feeding a GPU and 8 lanes feeding a switch for the thunderbolt ports, something that provides enough bandwidth for the vast majority of use cases.
... But they have already disappeared. The 13" Macbook Pro that I'm talking about just has those two USB-C ports and none of the others. If you are plugged in for power you just have one port.
My old 10.1" has:
* 1 power connector
* 1 ethernet connector
* 1 VGA connector
* 3 USB ports
* 2 audio jacks
* 1 memory card slot
* 1 mysterious connector
This is old, but still gold
1. Missing protocols in the IP layer can be solved by software. If your CPU is missing a thunderbolt chipset, a driver update will not help you.
2. The physical layer is not actually the same; some USB-C cables don't have all the necessary pins for power charging or high-speed data.
My Switch is another great example. It won't even charge with some of my adapters, and I had to get a C -> HDMI splitter that was designed specifically for the Switch, despite the fact that Nintendo isn't even using any proprietary protocols (just less common ones).
Also, in the case of wifi and bluetooth, compatibility issues have a chance of being resolved in software on your end. In the case of USB, the list of supported protocols is a property of the hardware interface, and the options to resolve these issues are limited.
In the end, if you buy a device that has a "USB" port, the specification that it has "USB-C" is meaningless; you want to know the exact list of hardware protocols that the port supports.
Sure you can - it's gotten better in recent years, but it used to be that a lot of devices didn't support WPA2. This was (typically at least) not fixed in software.
Turns out that it was a mini-DisplayPort version of the Cinema Display, and that even though it worked fine with the built-in Thunderbolt 2 port, it didn't with the one in the adaptor. She ended up trading her Cinema Display with somebody who had the Thunderbolt version and an old laptop, because at the time that was easier than finding an mDP->USB-C adaptor.
Google Chromebook Pixel
MacBook Pro (2016)
Dell XPS 13”
So those devices work and everything else is a bonus, I guess.
And yet, very profitable for the companies rolling it out so poorly, who can charge more for products featuring the new ports and nickel-and-dime us with expensive adapters, dongles, and cables to replace the perfectly functional USB-A ecosystem.
It's quite a clever trick they've managed to pull on their customers -- convincing us all our problems can be hand-waved away with a single, forceful utterance of "it's the future", thus relegating any criticism to "whining and moaning" about first-world problems.
That even Apple can't manage to simplify the mess within its own laptop line, let alone tablets and phones, is indicative of how poorly thought out this whole idea was from the start. They seem to have a device for just about every combination of incompatibility that is possible.
The goal was noble:
- provide enough power to charge an external device and ditch the bulky power supply and second cable
- support extremely high-speed devices like monitors and SSDs
- standardize the connector with a slimmer, symmetric design
So far it's been an utter failure of execution.
Different strokes. Being able to charge on either side is one of the biggest usability wins of the new MBP to me.
FWIW a number of studio headphones support plugging in a cable to either can rather than restricting you to just the left one, if you want to go that route. V-moda sells a few pairs like this and denotes the feature as "dual inputs".
This was a total amateur mistake by Apple.
> This whole thing is expensive, unnecessary, user-unfriendly and confusing (to the average user).
Please don't assume to speak for everyone. Being able to charge from any port from either side is very convenient, modern, and I love it.
This was the promise of usb-c - at least, that's what many of us thought was the promise. Now we're finding out reality isn't quite up to it. And it's far worse for people who, naturally, assume if the plug fits, it'll work.
Maybe that ideal is impossible, or financially prohibitive. Usb-c implies it isn't, though.
It works exactly like that on modern Macs: whatever you plug into whichever USB-C port will work. The worst-case scenario is that your ultra-high-speed TB3 RAID array will run ever so slightly slower on the left-hand-side ports, but it'll still work and it'll still be plenty fast.
And I understand that USB-C ports on other high-end laptops are the same? They support TB3, HDMI, DP, and other alternate modes, so this will quickly become a moot point, just like having to differentiate between USB 1.1 and USB 2.0 ports?
I actually see this as a feature. What would be insane if only one special port supported power. That would just be confusing.
There is actually a convenient color code, at least on ThinkPads, where black means USB 2, yellow means always-on, and blue means USB 3.
Never mind that with the older plugs, the cables were basically passive. But now there have to be special resistors in them to signal what kind of power delivery they can handle, and oh so many companies get it wrong. Because there is also a special resistor that needs to be used if you have an A-to-C cable, which overrules all the others.
It is frankly a fire hazard waiting to happen.
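For context on the resistor signaling above: per my reading of the Type-C spec, the current a device may draw is advertised by a pull-up resistor (Rp) on the CC wire, and an A-to-C cable must use the 56 kΩ "default" value so a phone never over-draws from a legacy USB-A port. A rough sketch (values from the 5 V pull-up case of the spec; treat as illustrative):

```python
# Sketch of Type-C CC-wire resistor signaling. The source's pull-up
# (Rp) value advertises how much current the sink may draw.

RP_TO_ADVERTISED_AMPS = {
    56_000: 0.5,   # "default USB power" (500 mA for a USB 2.0 host)
    22_000: 1.5,   # 1.5 A capable
    10_000: 3.0,   # 3.0 A capable
}

def advertised_current(rp_ohms: int) -> float:
    return RP_TO_ADVERTISED_AMPS[rp_ohms]

# The infamous bad A-to-C cables shipped with 10 kΩ instead of the
# required 56 kΩ, telling the phone it could pull 3 A from a port that
# might only be rated for 0.5 A:
print(advertised_current(10_000))
```

Which is exactly how a one-cent resistor mistake turns into melted ports.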
At best you can say you won't damage equipment in the process of discovering the quirks of USB-C. Except, of course, for the cheap not-up-to-spec cables that might cause damage.
IMHO, regulatory bodies should be taking a hard look at multi-use cords over a certain power level. It’s literally a matter of time before people die as a result of some fake Amazon cable.
Now, presumably the designer of this dongle had some reason to limit the power output. Perhaps the heat dissipation is not enough (the dongle gets pretty hot even when using it just for USB and HDMI). What if some other manufacturer is not so careful and allows negotiating more power through a similar device?
I think that's the point. With off-the-shelf aftermarket parts, it's not always clear if you're getting something that can provide clean pass-through power or not. Or even if that's something you should be worried about.
That the 13" MBP has non-equal USB-C ports, if true, is totally Apple's fault. I'm actually surprised that nobody has sued Apple for misleading their customers (I don't think Apple disclosed that not all ports on the 13" MBP have all the capabilities they bragged about).
Plenty of us complained about it. Some even saw it as a reason to avoid Apple.
> The alternative is like, what, we settle all the capabilities this port can have, and say that's it, we cannot add new capabilities to this port to confuse consumers, we must use a new port for new capabilities?
Yes. Different ports should look different.
Be careful what you wish for. New capabilities are added to ports all the time. If that's the case, we'll have a new port every year.
Isn't it Intel's fault given the limited number of PCIe lanes on their dual core CPUs?
> I don't think Apple disclosed that not all ports on 13" MBP has all the capabilities they bragged about
It's noted here: https://support.apple.com/en-us/HT207256
Search "macbook thunderbolt pci" on Google? It's nowhere in sight by page 3...
Search "thunderbolt pci" on support.apple.com? It's result 8, after info about connecting TB displays, network adapters, cables, and so on.
Yes it's there, but let's not pretend Apple is going out of their way to note it.
With "macbook pro 13" thunderbolt speed" it's the very first result (for me at least)
In any case, no, they aren't going out of their way, nor should they IMO. I think that would confuse many more people than the amount that are looking specifically to use > 2 full-bandwidth Thunderbolt connections at once on a dual core machine. I'd wager that > 95% are going to be happy they have two extra 10Gbps USB and charge ports.
Actually on that official tech specs page Apple bluntly claimed that:
Four Thunderbolt 3 (USB-C) ports with support for:
Thunderbolt (up to 40 Gbps)
USB 3.1 Gen 2 (up to 10 Gbps)
For cables, you can whittle it down to two things: bandwidth, and whether it supports 3 amps or 5 amps. Supported modes are a function of signal bandwidth. 5 amp can be a little power icon, and bandwidth only needs about 3-4 options. So most cables would only have a bandwidth rating.
For ports I agree that it's much too complicated.
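The parent's two-dimension scheme really does collapse to a handful of markings. A quick sketch (the tier names here are made up for illustration, not from any spec):

```python
# Sketch of the two-dimension cable labeling proposed above:
# a bandwidth tier plus a 3 A / 5 A flag. Hypothetical tier names.

TIERS = ("USB2", "5G", "10G", "40G")

def cable_label(tier: str, five_amp: bool) -> str:
    assert tier in TIERS
    return tier + ("+5A" if five_amp else "")

# Only len(TIERS) * 2 = 8 possible markings in total:
all_labels = [cable_label(t, amp) for t in TIERS for amp in (False, True)]
print(len(all_labels))        # 8
print(cable_label("10G", True))  # 10G+5A
```

Eight distinct labels is well within what a printed icon or color band could express.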
Depending on cable configuration, pinout, wall plate and structured wiring system, that 8P8C might be usable (or not) for multiple different types of data networking, from the assorted ethernet speeds to E1 to token ring, or for a serial console, or delivering power and audio to a remote speaker, or hdmi-over-utp, or even -48V telephony, and let's not even get started on the only-subtly-different but actually incompatible RJ45S connector, or people sticking RJ11 plugs in 8P8C ports.
And yet the world has coped with this proliferation.
I agree, MIDI is a bit shit.
Twist-lock appears to be a USA-specific thing.
- USB-C is a connector and not all ports/cables/hubs support all of its features and modes. So not everything that can be connected using a USB-C will work the same way, or even at all.
- Due to their potential bandwidth demands, computers can’t have very many USB-C ports
- USB-C will be phased out and replaced before settling down
I.e., no user testing.
Most people don't know about it anyway; making the C port reversible was a good decision.
You can have A ports that deliver 20V and 3.1 data rates, and C ports that deliver only 5V and 2.0 data rates, and they are both valid according to spec.
There should be a cutoff point for "designed for idiots": if you can't plug in a USB cable, you really should not be touching anything electrical.
How big a mess USB C is? Check this note from Plugable: "We have had several confirmed cases where lowering the Power Output of the internal Wi-Fi adapter to 75% in Dell XPS models has helped with USB-C disconnect behavior."
Also, while you can get a small USB-C to dual DisplayPort MST hub http://a.co/hyXGdBA and you can also get a small USB-C to DisplayPort adapter with USB-C power passthrough http://a.co/1KE2imb you can't get a USB-C MST hub with power passthrough in a single, small, relatively cheap device. You need an expensive, quite huge device like the ThinkPad USB-C dock to get this functionality. It seems as if every USB-C accessory maker took a blood oath to hardwire an HDMI converter to the second port of their MST hub. (Yes, combining a USB-C to DP converter and an ordinary MST hub is a cheap solution, of course, but it's not elegant at all, especially because both insist on using cords, creating a mess.)
Also, there have been laptop chargers with 65W or 90W on the laptop side and one or perhaps even two USB charging ports, if you are real lucky then 2.4A each. Now that both your laptop and phone charge via USB-C, you'd think you could get an AC adapter with two USB-C ports and perhaps a few USB-A thrown in for good measure. Dream on. The highest-wattage adapter I found with two USB-C ports is 55W; it doesn't even charge a single laptop, it's all 5V. There's a well known ;) brand called LVSun which sells an 80W charger which can do a non-USB-C laptop + USB-C phone (or a USB-C laptop + a non-USB-C phone), plus it has a few USB-A ports as well. Still, it's not two USB-C ports.
I'll never understand articles like this - let's not standardize on a form factor because not every single application of that form factor has the same requirements???
- Protocol: 3.0 or 3.1
- Power: QC, PD, or none of the above; how many watts
- Alt-modes: HDMI, DP, TB3
- Chipset features: UAS
Once you've assigned all of these to a color, the higher end cables are going to look like a pride flag or something. The solution is really not that simple at all and it's far from guaranteed that once adoption picks up that you can just assume your device/cable combo "obviously" supports what you want.
On top of all of this, the construction of USB-C cables is far more complex than your run-of-the-mill cable's. They're active devices with apparently absurdly tight tolerances. Did everyone forget about Benson Leung's spreadsheet of killer cables? Even if the industry can figure out user-friendly branding, it's still a roll of the dice whether you'll fry your laptop.
Lastly, USB-C seems like a mandatory weakening of security. In a few years time, you won't have any other choice but USB-C and now suddenly any random charger (or cable!) could be the easiest rootkit ever deployed. The cute pen testing exercise of dropping USB thumbdrives with backdoored Word docs is going to get a lot more serious. I wonder how long until we have "hardware firewalls" that attempt to rein in USB-C devices. We're past the "USB condom" at this point.
Previous discussion (710 days ago, 397 comments): https://news.ycombinator.com/item?id=10508494
not a property of the cable
> Alt-modes: HDMI, DP, TB3
> Chipset features
> They're active devices
Only long Thunderbolt cables are active.
> Benson Leung's spreadsheet of killer cables
The check there is "does this A-to-C cable say it's A-to-C, or say it's a 3 amp source". If you use it with an incorrectly designed charger, the charger could get damaged. "Killer" is not really fair, because the only "fry your laptop" situation was an instance of a wiring mistake that could happen on any kind of cable.
cesarb is right, you can represent it as max protocol and max current, and type-A cables have approximately the same setup.
Not a rhetorical question; I'm not well informed in this field.
then each year as new standards are added you make the shade of green slightly lighter. then you just need to know what shade of green your use supports and make sure your cable and ports are as light or lighter.
As for devices - I guess I'd say know what you're buying? How is it any different than the fact I have to know that some mini-displayport ports can do thunderbolt and some can't? Some cables can and some can't?
I mean... Belkin has already started putting the thunderbolt icon on the cables they sell that support it. This isn't anywhere near as complicated as he's claiming. The only gray area is people buying the cheapest possible cable they can find on Amazon then acting shocked when they get what they paid for.
Costly cables are usually better, but you don't have any guarantee.
You can use Benson Leung's tests to find a good one, but you need to hope they have not silently changed the cables since his test.
When you buy a new device, you have no idea of the capabilities of the cable that comes with it.
So you need to remember whether it uses PD or QC, or just always use its provided cable.
All in all we are still far from "let's take any usb-c cable and connect it to an usb-c device".
It is too bad, it is a very nice dream.
Don't act like cheap cables frying your equipment is normal or expected. Off-brand products usually aren't as good as the expensive stuff, but we have a right to expect them to not be actively dangerous.
That's 90% of people though.
> It’s comforting to think that over time, this will all settle down and we’ll finally achieve the dream of a single cable and port for everything. But that’s not how technology really works.
Like they refused to do blue USB-3 ports.
When USB-C was announced and I saw the pin-outs, my immediate thought was, “This is going to be a support nightmare.” I should have written that down, so I could point at it as proof of my genius. :P But I didn’t know what a garbage fire USB-PD is.
I was thinking of Alternate Mode. Alternate Mode is what enables Thunderbolt 3 on USB-C. We already had an alternate mode for USB: MHL. Which divides HDMI ports between plain HDMI ports and HDMI ports that support MHL. The Alternate Mode multiplies the number of ways displays can connect to video sources via USB.
Not to mention what everybody else has commented on.
All these incompatible options are using the same connector. That’s obviously confusing to users and frustrating to support personnel.
In particular as we have a rudimentary power spec in the 3.1 "data" spec, involving resistors in the cables(?!), and a separate power delivery spec that goes above and beyond the 3.1 power stuff to define anything up to 20V at multiple As.
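For the curious, the resistor scheme works roughly like this: the power source advertises its current capability with a pull-up (Rp) on the CC line, and the sink reads the resulting voltage. A sketch, with the nominal Rp values quoted from memory of the Type-C spec (treat them as illustrative):

```python
# Nominal CC pull-up (Rp) values and what they advertise, per the Type-C spec
# (values quoted from memory for a 5 V-supplied Rp; treat as illustrative)
RP_TO_CURRENT = {
    56_000: "default USB power (500/900 mA)",
    22_000: "1.5 A at 5 V",
    10_000: "3.0 A at 5 V",
}

def advertised_current(rp_ohms, tolerance=0.2):
    """Match a measured pull-up against the nominal Rp values."""
    for nominal, meaning in RP_TO_CURRENT.items():
        if abs(rp_ohms - nominal) <= nominal * tolerance:
            return meaning
    return "out of spec"

# A legacy A-to-C cable is supposed to use the default value; the bad cables
# Benson Leung flagged advertised 3 A instead, tempting devices to overdraw
# the port behind them.
print(advertised_current(56_000))
```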
Frankly, to me there seems to have been an initial C port design that was a "simple" mechanical reversible implementation of the existing 3.0 wiring. And then someone decided that rather than simply having the pins mechanically doubled (either in plug or in port), they could make each an individual wire and run even more stuff through the port. And thus we get Alternate Mode, which can carry just about anything the OEMs dream up...
It's against recommended practice on such topics, for what I think are obvious reasons.
Why is this the case?
I read that thunderbolt v3 can handle upwards of 40Gbit/s.
Ok... but what is limiting the number of USB-C Thunderbolt-enabled ports that a given computer can have?
Surely each of those ports are not being maxed out to 100% of their bandwidth 24/7?
Why can't the bandwidth just be limited as needed like in a cheap router for example?
(I know nothing about hardware, FYI)
Take the USB-C to HDMI adapter. The USB port is useless for connecting to external drives. When I plug it into my MacBook one of three things happens:
1 - nothing
2 - it works at USB 2.0 speeds
3 - it erratically works for a minute or two then locks up my laptop
Meanwhile, the plain USB-C to USB adapter works perfectly.
Ironically, since then I've just made Thunderbolt 3 a requirement for all company laptops, and the whole company, Mac and Windows, gets to use the same dongles, chargers and even docking stations, which I count as a win.
Forget wattage -- voltage is also a big deal. Want to charge a Dell laptop? You need 20V. Charger support for PD at 20V is somewhat related to wattage, but it's easy to find 40-odd watt chargers that can't supply 20V. And there is usually nothing in the specs that tells you this.
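For context, a PD source advertises a set of Power Data Objects (PDOs), each a voltage/current pair, and the sink picks one. The sketch below uses made-up charger data to show why wattage alone doesn't tell you whether 20V is on offer:

```python
# Each PDO as (voltage_V, max_current_A); both chargers are hypothetical
chargers = {
    "45W travel charger": [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 2.25)],
    "36W dual-port":      [(5.0, 3.0), (9.0, 3.0), (12.0, 3.0)],  # no 20 V PDO
}

def offers_20v(pdos):
    """True if the source advertises a 20 V fixed-supply PDO."""
    return any(v == 20.0 for v, _ in pdos)

for name, pdos in chargers.items():
    watts = max(v * a for v, a in pdos)
    print(f"{name}: up to {watts:.0f} W, 20 V offered: {offers_20v(pdos)}")
```

The point: both are "PD chargers" with respectable wattage, but only one can power a laptop that insists on 20V, and nothing on the retail box distinguishes them.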
Also, why can’t we just switch to optical cables already? We’d only need +V, GND and the fibre. Just use the 3.5 mm audio plug (round, fits any way up) with a hollow point, like Mini-TOSLINK did.
It turns out to be cheaper and easier to aggregate multiple serial links than to run a wide parallel bus. Practically every modern high-speed interconnect uses multiple aggregated serial links -- USB, Ethernet, SATA, PCIe, DDR3+, DVI+HDMI+DP+LVDS, etc, etc.
> why can’t we just switch to optical cables already
Too expensive. A big part of USB's dominance in the mid-90's was that it was way cheaper than the alternatives of the time.
Thunderbolt was originally designed for optical interconnects.
> Just use the 3.5 mm audio plug
It's physically too large for current and upcoming designs.
I miss TOSLINK.
You need 2 power and 2 ground pins per side to carry the large currents needed. Those all get wired to the same (thick) cable (though usually split in two to make the cable more flexible). That's 8/24 pins, 2 wires total.
You have two pins (per side) to signal which mode the cable is being used for and which way around it's plugged in. Because there are so many alternate modes, each with its own signaling standard, you can't pack those in-band, particularly in the case of analog audio, which doesn't have a digital data frame at all. Two are connected across from one connector to the other, so there are only three new wires for the 4 pins. So that's 12/24 pins, 5 wires total.
You have two pins per side for USB 2.0. Those get wired to one side only, but are connected together in the device (not the cable). The opposite side pins are unconnected. So two connected and two unconnected new pins per side, and two new wires. 16/24 pins, 7 wires total. Some cables have active transceivers in them, and to ensure that the far-side transceiver is powered, there's an extra wire carrying its supply. This brings us to 8 wires.
The remaining 8 pins are mapped to four high-speed differential pairs. The important thing about these is that they ONLY support high-speed low-voltage differential data and are protocol-agnostic. To get the enormous bandwidth the standard needs they have to split them up into four different serial lanes. So it's not a single serial bus. It's 5 serial buses, a handful of power and configuration pins. This brings us up to the full 24 pins (22 of which are in use) and 16 wires (or 18 if you opt for the thinner, doubled-up power and ground wires). It's very much not a serial bus.
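As a quick sanity check on the running tally, here's the bookkeeping in code form (the groupings paraphrase the breakdown above; a full-featured cable is assumed):

```python
# (pins across both connector ends, wires in a full-featured cable)
groups = {
    "power + ground":         (8, 2),  # 4 pins per side, bundled into 2 thick wires
    "CC / configuration":     (4, 3),  # 2 per side; one pair is cross-connected
    "USB 2.0 D+/D-":          (4, 2),  # wired on one side only; far-side pins float
    "VCONN (e-marker power)": (0, 1),  # extra wire feeding the cable's own chip
    "high-speed diff pairs":  (8, 8),  # 4 lanes x 2 conductors each
}

pins = sum(p for p, _ in groups.values())
wires = sum(w for _, w in groups.values())
pins_in_use = pins - 2  # the two floating USB 2.0 pins
print(pins, pins_in_use, wires)  # 24 22 16, matching the totals above
```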
As for optical cables, that would work, but then you need to transfer all the control signals in-band and either stick to a single signaling standard (and have to do high-speed conversion for everything else) or somehow enclose various protocols in a single frame (like ethernet). This would require a fundamental redesign of peripherals, rather than a small change (a mux and control IC and new port). Additionally, optical transceivers are very expensive and power-hungry at high speeds, to the point where even for extremely high link speeds copper is still being used for ethernet. TOSLINK succeeded because at the very low data rates it has, the optical link hardware is cheap and doesn't use much power.
My monitor at home (XR382CQK) supports USB C and that's ok, it charges, does video, and acts as a hub via one cable. The problem is the world hasn't had time to embrace USB C. I'd gladly trade my old setup of needing two cables (one Magsafe + one Thunderbolt) for not needing a dongle literally everywhere else I use my laptop with an external display.
I was just last week forced to use a USB-C-to-DisplayPort cable, instead of a straightforward DP to DP cable to get a 4K monitor to work. Using the latter, the computer thought everything was fine but the screen flickered like a CRT, horizontal splits and all, and went black every 3s.
There are so many layers involved I don’t even want to think about it, and am just happy it worked...
Reminds me of when HDMI first started shipping and people found they had to power up their new TV and games console/BR player in a certain sequence to get HDCP to play nice...
It's probably something unrelated to the hubs. Who knows though :) Many moving pieces.