
The impossible dream of USB-C - okket
https://marco.org/2017/10/14/impossible-dream-of-usb-c
======
cjcampbell
Since I got over the hurdle of buying cables and adapters, I've been mostly
happy with USB-C. The one lingering concern is actually mechanical.

Many of the cables/dongles are long and skinny relative to the tiny metal
connector, placing a fair amount of torque on the port. A number of the cables
I've used are poorly fitting as well, exacerbating the mechanical strain.

A good example of this problem is the YubiKey 4C. The connector protrudes about
half a millimetre from the port, and it has enough play to wiggle up/down/left/right.
Having had to replace the logic board and I/O boards on my 2016 MBP 15 within
3 months due to port damage, I get a bit nervous when I notice fit problems
like this.

Aside from outright damaging the ports, poorly built cables seem to wreak
havoc on the mechanism that grips cables on Apple machines. Prior to the board
replacement, the Apple power cables would gradually come loose under their own
weight. Fortunately, the Geniuses did agree that this was a problem and
replaced the top case along with the guts of my computer.

~~~
mchahn
I've broken several motherboards and many cables over the years. It drives me
crazy. The USB-C is the worst. I'm on my third cable now. At least I finally
figured out to make sure I get USB-C chargers with removable cables. Now I
only have to replace the cable, which I've done twice in the last month.

The Dell XPS 13, which I'm getting in a week or so, has both a proprietary
power jack and supports USB-C charging. I'm very happy that I won't be
breaking many more USB-C cables.

~~~
KitDuncan
Dell's proprietary charger is the worst. They have a dedicated wire in the
cable for ensuring the use of original chargers, and it appears to be
especially prone to breakage. If it breaks, your XPS won't charge anymore, but
it still gets power. I managed to repair my charger, but it was a real
clusterfuck.

~~~
Namidairo
My oldish XPS and charger are a tad temperamental now, in that it detects the
charger as non-genuine on occasion.

------
tammer
I think these are actually the most minor problems with the USB-C vision. The
VAST majority of consumers will never encounter a thunderbolt device, and the
ones that do can be expected to understand the cable difference.

What is far more of an issue for the standard is the established network
effect. USB-A has become so commonplace that it's built into our furniture, and
micro-USB is a legally defined requirement for device charging in some areas.
These foundational forces will prove to be a massive deterrent to USB-C
uptake, and far more of a risk to the vision than multiple cable types that
are clearly labeled!

But the bright side of this, I think, is Apple's adoption of the Qi charging
standard. It's clear that an evolutionary jump to wireless charging will entail
new technology, so I see Qi taking up the "one way to charge everything"
vision.

~~~
Something1234
> micro-USB is a legally defined requirement for device charging in some areas

Where? Wouldn't this outlaw iPhones?

~~~
SEMW
Contra sibling comments, it technically isn't a legal requirement. It's
entirely voluntary, a memorandum of understanding between manufacturers,
facilitated by the EU.

Obviously 'voluntary' here deserves some scare quotes, since the EU had made it
very clear that it was willing to swing its sledgehammer and formally regulate
if the manufacturers couldn't agree on a standard amongst themselves. Which
would have been less good for everyone (including the EU), e.g. since formal
legal regulations are a lot harder to change with the times. So they were
pretty incentivized to work something out, and happily they did.

[https://en.wikipedia.org/wiki/Common_external_power_supply](https://en.wikipedia.org/wiki/Common_external_power_supply)

------
ChuckMcM
A committee goes into a room to design a single vehicle that will do all jobs,
it will carry heavy loads, it will commute to work, it will fly between
airports, it will work on the water and below the water. Such is the challenge
facing these guys.

Is it surprising that it doesn't work well? Not really. The compromises are
pretty extensive. And between users demanding thin phones and rugged
connectors, it's really, really hard to do that with anything other than
perhaps titanium.

I have a bunch of computers, my oldest was built in 1968 the latest was built
last year. Invariably the port complexity goes up with the older computers,
and connector reliability also goes up. I've got AUI cables for Ethernet that
work as well today as they did in 1984 when they were all the rage. I've got
50 pin SCSI-1 and 36 pin Centronics printer cables that are reliable and
functional. I've got computers with three of the four USB 2.0 connections on
them unusable, either due to fusing issues or mechanical strain. I've got a phone
handset (my developer's version of a Nexus 1) with a micro USB connection that
won't hold the connector in the plug.

I guess the bottom line is that we can make really reliable and ugly
connectors.

~~~
jamhan
Seen "The Pentagon Papers"? [1]

[1]
[https://www.youtube.com/watch?v=aXQ2lO3ieBA](https://www.youtube.com/watch?v=aXQ2lO3ieBA)

~~~
sukruh
You meant "Pentagon Wars".

------
cbhl
USB-A had all of these compatibility concerns too, when it first came out. I
remember having a Gateway All-in-One with only two USB-A ports in '98. You
had to worry about USB 1 versus USB 2, and some devices wouldn't get enough
power through a hub, and so forth.

I'm really looking forward to seeing where USB-C is in two or three years.

~~~
colejohnson66
I’m too young to have experienced the transition, but were there cables that
could fry your computer? [https://arstechnica.com/gadgets/2016/02/google-
engineer-find...](https://arstechnica.com/gadgets/2016/02/google-engineer-
finds-usb-type-c-cable-thats-so-bad-it-fried-his-chromebook-pixel/)

~~~
cbhl
Oh, in those days, computers would fry themselves -- literally -- from
overheating. CPU vendors raced to break the 1 GHz barrier, but thermal
throttling wasn't as mature. If I pushed my computer too hard for too long
(say, with a really tough ray trace) then it would just shut off uncleanly
from being too hot.

The joke was you could fry an egg on your CPU because it got so hot:

[https://www.theregister.co.uk/2002/03/01/how_to_fry_an_egg/](https://www.theregister.co.uk/2002/03/01/how_to_fry_an_egg/)
[http://www.phys.ncku.edu.tw/~htsu/humor/fry_egg.html](http://www.phys.ncku.edu.tw/~htsu/humor/fry_egg.html)

~~~
narrowtux
no

------
yoavm
I just _love_ the fact I can now charge my laptop and my phone using the same
charger - and both charge super fast. I travel often and going with one
charger is a life saver. Being able to charge my laptop from my old power bank
to get two more hours is also super cool.

I remember discovering that some of my Micro-USB cables could only charge and
couldn't transmit any data. I honestly don't understand why people talk about
the issues with USB-C as if it were a new thing. Seems to me like most of the
problems are theoretical and most end-users are actually happy with the new
standard.

~~~
marcosscriven
I know it's extremely unlikely, but I do wish Apple would move its phones and
tablets to USB-C too.

~~~
iwintermute
I can't understand why they won't do it.

~~~
narrowtux
I think 2 reasons:

1. They already have a lot of accessories for Lightning, and they would kinda
lose that sweet 30% MFi cut on 3rd-party Lightning accessories

2. A lot of people who don't know why USB-C is great will probably bitch
about how Apple is changing the port on the iPhone _again_, like they did when
they switched from 30-pin to Lightning. Could be a PR nightmare

~~~
iwintermute
No PR nightmare is bigger than the loss of the headphone jack - I imagine a lot
of people had great and expensive headphones that are no longer usable
directly. Apple seems to have survived that.

------
zecken
Strongly disagree with the article, it seems the author is confused about the
internal hardware. The actual pin connections within all USB Type-C cables are
the same -- no matter if they are shipped with a product that uses
Thunderbolt, Displayport, or any other standard. The only variance cable to
cable (and this exists on existing micro cables as well) is on things like
shielding, AWG, and ferrites which would determine signal & power integrity
over the length. I.e., don't use a very long & skinny cable for delivering lots
of power or high-speed data unless the cable was bought for that specific
purpose - which is something consumers already have to do today if they aren't
using professional installers.

~~~
Dylan16807
Most of the rant is about different device capabilities, and how cables being
more or less the same makes it almost worse instead of better.

And that high variance in cable quality is a bigger deal than it needs to be.
You can't tell by looking at a cable what bandwidth and amperage it's supposed
to support. Even if it has thick wires you can't tell if it has a chip to
enable 5A charging.

Also there's a non-quality factor. USB-C cables with no high speed wire pairs
are valid and are quite common. You don't need more than USB 2.0 in a charging
cable.

~~~
aidenn0
It would be nice if we had some standardized iconography for the cables
(xAmps, thunderbolt/non-thunderbolt, &c.). There are really only a few useful
permutations of cables.

Devices are a bit more complicated, but I'm hoping that the Hub situation will
sort itself out (early in the days of USB 1.1, I remember plenty of devices
that wouldn't work with hubs, or only with certain hubs).

~~~
matt_kantor
There is standardized iconography[0][1]. It's enforced under trademark law by
the USB Implementers Forum. In addition to the old "basic" (USB 1.0/1.1), "hi-
speed" (USB 2.0), and "on the go" icons there's now "superspeed" (USB 3.0),
"superspeed+" (USB 3.1), and icons for the power delivery spec. The new icons
are just as inscrutable as the old ones, but they exist and the USB
Implementers Forum requires correct labeling for all cables and devices.

The various proprietary extensions might be less consistent with labeling, but
at least every Thunderbolt-compatible cable I've seen has a little lightning
bolt icon in addition to the standard USB ones.

[0]:
[http://www.usb.org/developers/logo_license/Trademark_License...](http://www.usb.org/developers/logo_license/Trademark_License_Agreement_Licensed_Mark_Requirements_062017.pdf)

[1]: [http://www.usb.org/developers/logo_license/USB-
IF_Logo_Usage...](http://www.usb.org/developers/logo_license/USB-
IF_Logo_Usage_Guidelines_FINAL_071317.pdf)

~~~
Dylan16807
The icons exist, but the fraction of type C plugs that have icons on them
seems to be extremely low.

------
cletus
I'll go further than this. USB-C isn't misguided, problematic or questionable.
It's insanity.

Take a given cable and there are a number of dimensions you need to consider:

\- Supported power standards (none, USB, various other wattages and standards)

\- Supported bandwidth. I think some don't even support data transfer (which
might include the USB-C cable that comes with the Apple Macbook charger)

\- Supported modes (USB, Thunderbolt, DisplayPort?)

There's no colour scheme that can adequately handle all the possible
variations.

But there are other problems: it makes no sense to have all 4 ports on a 15"
Macbook Pro (for example) support power. You only need one port to charge
with. I think I read that not all the ports on the 13" Macbook Pro are the
same (in terms of capabilities).

This just strikes me as a (hollow) victory for principle ("one port/cable
to rule them all") over pragmatism. It always reminds me of the quote: a
foolish consistency is the hobgoblin of little minds.

This whole thing is expensive, unnecessary, user-unfriendly and confusing (to
the average user).

~~~
crote
Insanity seems to be the right word for USB-C. Let's imagine you buy a new
monitor, which has a usb-c port. Fortunately, your device also has a usb-c
port so it will just work, right? Well, that monitor could support at least
the following protocols over that cable:

\- Displayport, obviously.

\- HDMI, because that might be useful in some cases.

\- MHL Alternate Mode for USB 3.1, which is not-quite-HDMI.

\- USB itself, like the external mini video cards created by DisplayLink

\- PCI-E, as the monitor could contain any regular video card. Unlikely, but
technically possible.

Is the monitor going to work when plugged into your laptop? What about your
phone? Will it charge either of those? I'm not sure how this has managed to
go this horribly wrong, but somehow "supports usb-c" has become completely
meaningless. Oh, and it can support analog audio as well!

And if anyone thinks I'm exaggerating, please tell me which devices do and
which do not work with this "usb-c to vga" adapter, because to me it is
literally impossible to tell:
[http://www.belkin.com/uk/p/P-F2CU037/](http://www.belkin.com/uk/p/P-F2CU037/)

~~~
andrepd
In other words, they're still different ports (for all practical purposes),
they just all have the same connector shape (which makes it harder, not
easier, for the reasons you mention).

~~~
kbenson
Put that way, I'm not entirely sure it's worse. Coming from a world where the
connector helped you determine what you could do, it _seems_ worse, but that's
because we've all internalized the cost of not having the right cable and
having to buy different cables for everything. Having a cable that works for
most things (if you were lucky enough to buy a good cable, and I'll freely
admit _that_ is a clusterfuck of planning) leaves you with just having to know
if your ports will work together. That seems solvable, even if it's not
currently solved. I think we're a step closer to the dream of a simplified
connector ecosystem, even if we aren't quite there yet.

~~~
TheSpiceIsLife
I don't have any USB-C devices, yet.

Is it possible to purchase USB-C cables that work for everything? Or are (some
of) the different supported protocols incompatible?

~~~
cesarb
Yes. What defines the protocol is the endpoints, not the cable. Just avoid the
"USB 2.0" USB-C cables, which do not connect all the pins (only the ones
needed for USB 2.0). Of course, higher speeds and higher charging currents
need a higher grade of cable, but that higher-quality cable will work with
lower speeds and lower currents. Take a look at the Wikipedia article:
[https://en.wikipedia.org/wiki/USB-C](https://en.wikipedia.org/wiki/USB-C)

------
maltalex
TL;DR

\- USB-C is a connector and not all ports/cables/hubs support all of its
features and modes. So not everything that can be connected using a USB-C will
work the same way, or even at all.

\- Due to their potential bandwidth demands, computers can’t have very many
USB-C ports

\- USB-C will be phased out and replaced before settling down

~~~
ghaff
One of the folks involved in the original USB standards work once told me that
the lack of reversibility (or at least a more obvious indicator of
orientation) was the main thing they got wrong with the initial standard.

~~~
HocusPocusLocus
There are huge indicators. People just jam it in till it works.

I.e., no user testing.

~~~
on_and_off
Unfortunately some devices don't respect the indicator :(

Most people don't know about it anyway, so making the C port reversible was a
good decision.

~~~
digi_owl
The port itself is nice; it's the cube of data rate, power rate, and port type
that is the problem.

You can have A ports that deliver 20V and 3.1 data rates, and C ports that
deliver only 5V and 2.0 data rates, and both are valid according to spec.

------
chx
I wrote up the fun that is USB C at
[https://superuser.com/a/1200112/41259](https://superuser.com/a/1200112/41259)
.

How big a mess is USB C? Check this note from Plugable: "We have had several
confirmed cases where lowering the Power Output of the internal Wi-Fi adapter
to 75% in Dell XPS models has helped with USB-C disconnect behavior."

Also, while you can get a small USB C to dual DisplayPort MST hub
[http://a.co/hyXGdBA](http://a.co/hyXGdBA) and also you can get a small USB C
to DisplayPort adapter with USB C power passthrough
[http://a.co/1KE2imb](http://a.co/1KE2imb) you can't get an USB C MST hub with
power passthrough in a single, small, relatively cheap device. You need an
expensive, quite huge device like the ThinkPad USB C dock to get this
functionality. It seems as if every USB C accessory maker took a blood oath on
hardwiring a HDMI converter to the second port of their MST hub. (Yes, taking
a USB C to DP converter and an ordinary MST hub is a cheap solution, of
course, but it's not elegant at all especially because both insist on using
cords, creating a mess.)

Also, there have been laptop chargers with 65W or 90W on the laptop side and
one or perhaps even two USB charging ports - if you're real lucky, 2.4A each.
Now that both your laptop and phone charge via USB C, you'd think you could get
an AC adapter with _two_ USB C ports and perhaps a few USB A thrown in for good
measure. Dream on. The highest-wattage adapter I found with two USB C ports is
55W, and it doesn't even charge a single laptop - it's all 5V. There's a
well-known ;) brand called LVSun which sells an 80W charger that can do a non
USB C laptop + USB C phone (or a USB C laptop + a non USB C phone), plus it has
a few USB A ports as well. Still, it's not two USB C ports.

------
tw04
So... USB-C is bad because cable vendors haven't come up with a good way to
easily identify what features a given cable supports? Which could be done
extremely easily by things like color coding... exactly what we had to do with
USB Type-A ports.

I'll never understand articles like this - let's not standardize on a form
factor because not every single application of that form factor has the same
requirements???

~~~
Steltek
Hang on, it's not like type-A where it's just a single parameter
(1.1/2.0/3.0). A USB-C cable has tons of potential features and a port has
possibly more features still.

\- Protocol: 3.0 or 3.1

\- Power: QC, PD, or none of the above; how many watts

\- Alt-modes: HDMI, DP, TB3

\- Chipset features: UAS

Once you've assigned all of these to a color, the higher end cables are going
to look like a pride flag or something. The solution is really not that simple
at all and it's far from guaranteed that once adoption picks up that you can
just assume your device/cable combo "obviously" supports what you want.

On top of all of this, the construction of USB-C cables is far more complex
than your run-of-the-mill cable. They're active devices with apparently
absurdly tight tolerances. Did everyone forget about Benson Leung's spreadsheet
of killer cables? Even if the industry can figure out user-friendly branding,
it's still a roll of the dice whether you'll fry your laptop.

Lastly, USB-C seems like a mandatory weakening of security. In a few years
time, you won't have any other choice but USB-C and now suddenly any random
charger (or cable!) could be the easiest rootkit ever deployed. The cute pen
testing exercise of dropping USB thumbdrives with backdoored Word docs is
going to get a lot more serious. I wonder how long until we have "hardware
firewalls" that attempt to rein in USB-C devices. We're past the "USB condom"
at this point.

~~~
cesarb
Hang on, I thought there were only two main axes for a USB-C to USB-C cable:
maximum protocol (3.1 Gen 2, 3.1 Gen 1, 2.0) and maximum current (3A, 5A).
Also, wouldn't most USB-C cables be passive, and the choice of alternate mode
be left to the endpoints? I don't think a passive cable cares whether the
wires are being used to carry DisplayPort, HDMI, or something else.

~~~
kuschku
The maximum current isn’t at 5A anymore, as many USB-C cables now support 100W
USB power delivery

~~~
cesarb
Isn't 100W USB power delivery 5A at 20V?

~~~
kuschku
Good point, yeah. For some reason I had assumed they’d stick with the same 5V
as always, but you’re right, they use 20V for PD.

------
Decade
The core problem is the large number of options.

When USB-C was announced and I saw the pin-outs, my immediate thought was,
“This is going to be a support nightmare.” I should have written that down, so
I could point at it as proof of my genius. :P But I didn’t know what a garbage
fire USB-PD is.

I was thinking of Alternate Mode. Alternate Mode is what enables Thunderbolt 3
on USB-C. We already had an alternate mode for USB: MHL. Which divides HDMI
ports between plain HDMI ports and HDMI ports that support MHL. The Alternate
Mode multiplies the number of ways displays can connect to video sources via
USB.

Not to mention what everybody else has commented on.

All these incompatible options are using the same connector. That’s obviously
confusing to users and frustrating to support personnel.

~~~
digi_owl
That, and allowing OEMs to decide port, data rates, and power delivery spec
individually.

In particular as we have a rudimentary power spec in the 3.1 "data" spec,
involving resistors in the cables(?!), and a separate power delivery spec that
goes above and beyond the 3.1 power stuff to define anything up to 20V at
multiple As.

Frankly, to me there seems to have been an initial C port design that was a
"simple" mechanically reversible implementation of the existing 3.0 wiring. And
then someone decided that rather than simply having the pins mechanically
doubled (either in plug or in port), they could make each an individual wire
and run even more stuff through the port. And thus we get Alternate Mode, which
can carry just about anything the OEMs dream up...

------
flyGuyOnTheSly
>Much of USB-C’s awesome capability comes from Thunderbolt and other Alternate
Modes. But due to their potential bandwidth demands, computers can’t have very
many USB-C ports.

Why is this the case?

I read that thunderbolt v3 can handle upwards of 40Gbit/s.

Ok... but what is limiting the number of USB-C Thunderbolt-enabled ports that
a given computer can have?

Surely each of those ports isn't being maxed out to 100% of its bandwidth
24/7?

Why can't the bandwidth just be limited as needed like in a cheap router for
example?

(I know nothing about hardware, FYI)

~~~
Viper007Bond
PCIe lanes usually. CPUs only give you so many.

------
bluedino
The state of adapters is atrocious. Apple couldn’t even release a good one to
go with their flagship products.

Take the USB-C to HDMI adapter. The USB port is useless for connecting to
external drives. When I plug it into my MacBook one of three things happens:

1 - nothing

2 - it works at USB 2.0 speeds

3 - it erratically works for a minute or two then locks up my laptop

Meanwhile, the plain USB-C to USB adapter works perfectly.

------
Roritharr
Two years ago I saw this coming and got pummeled for saying so:
[https://news.ycombinator.com/item?id=9645013#9645650](https://news.ycombinator.com/item?id=9645013#9645650)

Ironically, since then I've just made Thunderbolt 3 a requirement for all
company laptops, and the whole company, Mac and Windows alike, gets to use the
same dongles, chargers and even docking stations, which I count as a win.

~~~
mleiphamellis
If you don’t mind sharing, I am very eager to know which cross platform dock
you settled on!

~~~
Roritharr
CalDigit TS3

------
lytedev
I thought the entire point was to have _cables_ that work everywhere, which is
my preference. I think the ports themselves can and should do different things
- even my USB-A ports on my laptop have little icons beside them indicating
their purpose or capabilities. The point is to not have to keep many dozen
different cables on hand for everything.

~~~
lmm
That part doesn't work either though. There are at least four different types
of cable (3.1-speed, 3.2-speed, high wattage, and thunderbolt)

~~~
colejohnson66
Regarding the 3.1-speed: Is there any rational explanation for "USB 3.0 5Gbps"
being renamed to "USB 3.1 Gen 1 5Gbps" and "USB 3.1 10Gbps" being renamed to
"USB 3.1 Gen 2 10Gbps"?

~~~
wmf
That was probably done to protect the companies selling "obsolete" 3.0 devices
by giving them a "free upgrade" to 3.1.

------
amluto
> And, of course, there’s usually no way to tell at a glance whether a given
> cable, charger, battery, or device supports USB-C PD or at what wattages.

Forget wattage -- voltage is also a big deal. Want to charge a Dell laptop?
You need 20V. Charger support for PD at 20V is somewhat related to wattage,
but it's easy to find 40-odd watt chargers that can't supply 20V. And there is
usually nothing in the specs that tells you this.

~~~
colejohnson66
Is there any reason the spec couldn't say all cables deliver 5V and the
receiving end converts that to what it needs? Because from my understanding of
electronics, if you double the voltage, you halve the amperage. So supply
5V@20A and let the receiving end convert that to 20V@5A.

~~~
SomeHacker44
Less current usually means less heat generated in the wires and so you can use
smaller copper wires. Higher voltage requires better insulation on the wires.
Altogether I think it's a good idea to use 20V at 5A for 100W rather than 20A
at 5V. You will get much more heat and voltage loss at 20A for the same size
wire.
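A toy calculation makes the tradeoff concrete. Assuming a made-up round-trip wire resistance of 0.1 ohms (illustrative only, not a spec value), the I²R heat in the cable at 20A is 16x what it is at 5A for the same 100W delivered:

```python
# Resistive loss in a cable is P_loss = I^2 * R: it grows with the square of
# the current, independent of the voltage. R_CABLE is an assumed round-trip
# wire resistance chosen purely for illustration.
R_CABLE = 0.1  # ohms

def cable_loss_watts(current_amps: float, resistance_ohms: float = R_CABLE) -> float:
    """Power dissipated as heat in the cable itself."""
    return current_amps ** 2 * resistance_ohms

# Both options deliver 100 W to the device, but the wire heating differs 16x:
print(cable_loss_watts(20.0))  # 5 V @ 20 A: 40.0 W wasted in the cable
print(cable_loss_watts(5.0))   # 20 V @ 5 A: 2.5 W wasted in the cable
```

This is why USB-PD raises the voltage rather than the current for its higher power profiles.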

------
perilunar
This may be a stupid question, but why does a _serial_ bus require 24 pins?
That’s only one less than the old DB-25 _parallel_ cables. (I know it’s really
only 12 mirrored, but still). USB 1-2 got away with 4 pins.

Also, why can’t we just switch to optical cables already? We’d only need +V,
GND and the fibre. Just use the 3.5 mm audio plug (round, fits any way up)
with a hollow point, like Mini-TOSLINK did.

~~~
ianhowson
> why does a serial bus require 24 pins

It turns out to be cheaper and easier to aggregate multiple serial links than
to run a wide parallel bus. Practically every modern high-speed interconnect
uses multiple aggregated serial links -- USB, Ethernet, SATA, PCIe, DDR3+,
DVI+HDMI+DP+LVDS, etc, etc.

> why can’t we just switch to optical cables already

Too expensive. A big part of USB's dominance in the mid-90's was that it was
way cheaper than the alternatives of the time.

Thunderbolt _was_ originally designed for optical interconnects.

> Just use the 3.5 mm audio plug

It's physically too large for current and upcoming designs.

I miss TOSLINK.

------
plaidfuji
New MacBook Pro user here: the four USB-C (+ stereo mini audio jack) port
configuration doesn't bother me as much as I thought it would. It's nice that
I can plug the power cable in on either side of the laptop. One almost
inevitably needs some kind of display dongle unless you have both HDMI and VGA
on board (and VGA is an ugly port, let's be honest), although I'll say the
lack of native HDMI is a pain point for me. But also, if I'm using the laptop
without power, I now have 4 ports available, which is more than enough.

~~~
BoorishBears
Another new MacBook Pro user here: The two ports the 13" model came with are
not enough. And the official Apple USB C to HDMI/USB adapter charges slower
through the passthrough than a direct connection, meaning if I'm low on
battery I need to dedicate a USB port to charging.

My monitor at home (XR382CQK) supports USB C and that's ok, it charges, does
video, and acts as a hub via one cable. The problem is the world hasn't had
time to embrace USB C. I'd gladly trade my old setup of needing two cables
(one Magsafe + one Thunderbolt) for not needing a dongle literally everywhere
else I use my laptop with an external display.

------
tschellenbach
With all the hubs, I still can't get my 2017 MacBook Pro 15-inch to reliably
run an external monitor (it lags every now and then). I still haven't figured
out which of the components is actually causing it. Crazy how complex it is.

~~~
ScottBurson
I don't see how lag could be caused by a cable or hub. It's not like a remote
desktop or VNC connection, where what flows over the cable is commands to
change a stored image; there is no stored image -- every pixel is re-sent for
every frame. Any lag must therefore be internal to the laptop. If you have an
image on the monitor, and it isn't scrambled and the colors are right, the
cable and hub are working.

~~~
ricardobeat
You’d be surprised.

I was just last week forced to use a USB-C-to-DisplayPort cable, instead of a
straightforward DP to DP cable to get a 4K monitor to work. Using the latter,
the computer thought everything was fine but the screen flickered like a CRT,
horizontal splits and all, and went black every 3s.

There are so many layers involved I don’t even want to think about it, and am
just happy it worked...

------
exabrial
A lot of the problems seem to be implementation... Apple's quality and value
have done nothing but nose-dive while they focus on "impossibly useful" things
like the Touch Bar. How removing ports is a "feature" on a "professional" line
of laptops still baffles me.

------
protomyth
_But due to their potential bandwidth demands, computers can’t have very many
USB-C ports_

Isn't this more a function of Intel product differentiation and closing off
their bus to others? I get the feeling the killer server ARM is going to be a
bandwidth monster.

~~~
revelation
Mostly it's the author's ignorance of the standard. Just because you can
connect a display to a USB-C port doesn't mean it's taking up USB bandwidth or
requiring the CPU to do more work than previously. It's just more ports
integrated into one.

~~~
andreasley
If there's a port, there has to be sufficient bandwidth available, regardless
if anything is connected and what that might be. USB 3.1 Gen 2 supports
10Gbps, so 4 ports means 40Gbps. If it's a MacBook Pro, it has to support
Thunderbolt 3 on all ports. That's 160Gbps for 4 ports, or 16 PCIe lanes. That
happens to be exactly the number of PCIe lanes the Intel i7-7920HQ CPU has.
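That lane budget can be sanity-checked with back-of-the-envelope arithmetic. (The PCIe 3.0 x4 uplink per port is the typical Thunderbolt 3 controller configuration; note the headline 40Gbps per port also counts tunneled DisplayPort traffic, so the pure PCIe payload comes out below 160Gbps.)

```python
# Rough PCIe lane budget for a laptop exposing four Thunderbolt 3 ports.
PORTS = 4
LANES_PER_TB3_PORT = 4           # each TB3 controller typically uses PCIe 3.0 x4
PCIE3_LANE_GBPS = 8 * 128 / 130  # 8 GT/s minus 128b/130b encoding overhead

lanes_needed = PORTS * LANES_PER_TB3_PORT
pcie_payload_gbps = lanes_needed * PCIE3_LANE_GBPS

print(lanes_needed)              # 16 -- the CPU's entire lane budget
print(round(pcie_payload_gbps))  # 126 -- Gbps of actual PCIe payload
```

Sixteen lanes is all the i7-7920HQ provides, leaving nothing for a discrete GPU or NVMe storage without going through the chipset.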

------
flatfilefan
The nice thing about standards is that you have so many to choose from.

Andrew S. Tanenbaum Computer Networks, 2nd ed., p. 254.

------
tobias__
USB-C feels like the dynamically typed programming analog of the cable world

------
chrisBob
I am the only one at work with the new MBP and I get frequent questions about
it and the ports. I usually tell them that I firmly believe USB-C is the way
of the future, but alas I live in the present.

------
Ellipsis753
I like USB-C and I buy into the dream a lot. Additionally, the fact that I can
plug it in either way around to charge my phone is amazing!

The USB-C cable is obviously very capable; it's just a bit of a mess with
interoperability. However, given that USB-C requires licensing, I feel like
they can probably fix a lot of these issues afterwards by changing what they
do and don't allow. (In a way, kind of similar to software patches, but for the
licence itself.)

For example, I assume they could start charging more to use older USB A/B,
drop some supported USB-C features that are kind of messy and start charging
more for devices that lazily only sort of implement bits of the specification.

The old USB shape has been around for 20 years and counting; if the new USB-C
shape is around similarly long, we have plenty of time to get these issues
worked out. (This is already going so much better than IPv6 adoption!)

------
pasbesoin
If they were going to engage in this, they should have made clear and
authoritative capabilities markings on the equipment a required part of the
standard.

And, as others mentioned, I also worry about torque and stress on the port,
particularly in mobile scenarios.

------
eugenv
At least it's reversible, so that's a step forward.

------
crispinb
It's so confusing, to ensure compatibility a consumer might be tempted to just
buy everything from one manufacturer. This would be very upsetting for Apple,
I imagine. Proprietariness by stealth?

------
erikb
Not impossible, just unfinished. I didn't read all the points because it reads
like a bullet list rather than running text. But it seems to me that I haven't
seen a single problem that can't be overcome by fine-tuning the USB-C system.
E.g. some ports have Thunderbolt, some don't. Well, it looks like in the future
all of them will be expected to have Thunderbolt or a similar standard.

All I'm taking from it is "It's not fully usable yet", so I'll continue to use
normal USB for a while longer.

------
Friedduck
After a month or so of owning a brand-new 15” MBP my regret is that I didn’t
buy one of the holdover 2015s. The charger is absurdly large and I have a
growing collection of dongles that need to accompany me depending on the task.

I’m currently using a Rube Goldberg two-dongle solution to get photos off of
an SD card.

Moreover I still can’t find what I need for my home office in USB-C.

I won’t even get into how much _less_ functional the touchbar is than the hard
keys they replace. Terrible user experience.

------
myrandomcomment
Just went from a MacBook Pro to a MacBook (I fly 120k miles a year) and got
the Apple 4K (LG) monitor. Disks hang off the monitor at USB 2 speeds, as does
Ethernet. The damn thing does not wake from sleep with a USB keyboard. I had
an LG 21:9 34" display with Thunderbolt drives and network. The 4K monitor has
a great picture, but 21" is crap. Apple, just put Thunderbolt in the MacBook,
please. USB 3 can die. Happy with the physical interface; the rest is crap.

------
ngcc_hk
Whilst it raises good issues, Thunderbolt is shown on both computer and cable.

------
glasz
the most brilliant thing about this: apple sacrificed magsafe for the
clusterfuck that usb-c is.

------
mjcohen
I have an Asus C302CA chromebook and got a USB-A hub for it
([https://www.amazon.com/gp/product/B01M59IOAW/ref=oh_aui_sear...](https://www.amazon.com/gp/product/B01M59IOAW/ref=oh_aui_search_detailpage?ie=UTF8&psc=1)).
It only works for storage on one of the two USB-C ports. I was about to return
it when I thought of trying the other port.

------
mtgx
The USB guys should never have allowed so many standards to live within "one"
standard. It may be better for the OEMs, especially as they both save money on
ports and get to mislead customers into thinking they may get the fast
version of USB-C when they really won't. But so many standards acting as one
is very confusing to consumers.

------
osteele
Perhaps a similar simplification can be applied to NEMA connectors
[https://en.wikipedia.org/wiki/NEMA_connector](https://en.wikipedia.org/wiki/NEMA_connector),
to reduce the number of extension cords and add thrills to our daily lives.

------
ksec
I think it is clear the future is _mostly_ wireless. Why would we _need_ a
wired connection?

The only reasons we need wires are power and high-speed connections.

Power: It is highly unlikely that in the next 5 years we still won't have true
wireless power transfer (i.e. not sitting on a power mat) with enough energy
to power a laptop, monitor or desktop.

High speed: That is anything from monitor cables to massive external SSD
storage to external GPUs, and... I can't think of anything else. (Suggestions
welcome.) Basically, what Thunderbolt was designed for: external PCIe and
DisplayPort.

Sometime this year or next we are going to finalize what I hope will be the
wireless standards for decades to come: 802.11ax and 802.11ay.

802.11ax is the successor of 802.11ac: basically, the industry takes what it
learned from LTE and LTE-A and moves the best of those techniques across to
WiFi. It works in both 2.4GHz and 5GHz. We can expect real-world, uncontended
speeds of 1Gbps in 2x2 and 2Gbps in 4x4, and both are very conservative
numbers.

802.11ay is the successor of the failed 60GHz 802.11ad. Real-world speed:
10Gbps+.

Bluetooth 5 is an improved version of 4.1. Bluetooth 4.1 is arguably the first
Bluetooth that is really acceptable for general usage, and 5 improves on it
further.

Assuming no patent nonsense, we can expect to have WiFi chips in the things we
currently connect via cables. You will only need a cable for power and
high-speed connections. And coincidentally, Thunderbolt is going to become an
open standard next year.

What we need to fix/improve is power delivery and the USB 3.1 branding. Maybe
a future revision of USB will do that. For now, sitting in a transition period
between wired and wireless will always be a little painful in the short term.

------
HocusPocusLocus
Sounds like he's complaining about a labeling issue on cables and ports.

The different USB-C capabilities are a feature, but the lack of labels is a
huge bug.

Who wants to fix it? Make a Web 2.0 website with photo-realistic 3D renders of
your solution, then take it on the road.

~~~
toyg
Two words: washing labels.

Just get a rough industry agreement on which features should be represented
and what their symbols should be, then put them on connectors (which are
pretty long, so there is plenty of usable space) and on packaging and
listings.

~~~
mchahn
> Two words: washing labels.

Great idea. However, I once had to write software that took all the possible
washing symbols as input. There were 30 or 40 possible symbols. No human could
possibly know all of them.

~~~
toyg
Yeah I know, I even downloaded an iOS app to read them ("Laundry Day"); but in
practice I noticed most people just learn the one or two they care about and
ignore the rest - because if you don't own a tumble dryer, you don't really
care about the symbol for that. So if you care about Thunderbolt 2, you would
look only for that icon, ignoring the rest; power might have a scaled one like
washing temperature, and so on.

As usual though, the problem is that the industry has to come together to
agree these symbols, be it formally through the USB consortium or informally
in some other venue.

------
clebio
This is the sort of well-written, detailed reporting that I believe to be
true, but which ought to have citations for each claim. Of course, cataloging
and adding those sources would mean the article takes a lot longer to write
and publish.

~~~
flatfilefan
If you ask me, he should actually set up a proper crowdsourced database of all
those cables and cases. ... 3. Profit!

------
markcerqueira
It's not perfect (it may even be terrible), but I can charge my computer, my
phone, and my Nintendo Switch with it, which is a huge improvement over the
last generation of cable mayhem.

------
flying_sheep
To solve the problems the author is facing, we would need to invent yet
another port and cable standard, which is far worse than keeping the
USB-C-like port and only buying a new cable.

------
rocky1138
Sorry, but what exactly is wrong with USB A and micro USB? I've never
understood why USB type-C was a thing. You can get 3.1 speeds on the old
connector.

~~~
aidenn0
1) Obviously USB-C is smaller than full-size A jacks.

2) Micro-USB type A never really caught on; I'm not sure there was ever a
super-speed version of it.

3) Super-Speed micro type-B is also much larger than USB-C

4) USB-C can optionally carry PCIe and/or Display Port

5) USB-C is reversible along all axes; either end goes into the computer
flipped in either direction, same on device.

------
ebrandell
Dongle life. I hate to think that USB-C will become obsolete so quickly, but I
hate even more that I'm required to use it right now...

------
marichards
This reminds me of abstract classes

------
pier25
USB-C is the JavaScript of cables and ports.

------
woah
You could write a blog post called “the impossible dream of Ethernet” and
agonize about the fact that there are many different network protocols.

~~~
nayuki
What parts of Ethernet/networking cause confusion and incompatibility? Please
name them specifically.

~~~
rocqua
- PoE

- Hubs vs. switches (old issue)

- Cat5, Cat5e, Cat6, Cat7

- 100Mbps vs. 1Gbps (there are still commercial 100Mbps products for sale)

- The horror that used to be crossover cables, before auto-detection.

~~~
nayuki
Power over Ethernet: I haven't studied this one, so I'll assume it has some
subtleties involved.

Hubs vs. switches: Hubs are awful (have used one), burn them with prejudice.
Collisions bring everything down to 1 Mb/s. And hubs are disallowed in gigabit
anyway, due to no CSMA/CD.

Category cables: USB has different cables too, and at least Ethernet didn't
change the number of conductor wires. But the biggest difference is that
Ethernet cables are often in walls or inaccessible places, whereas USB cables
are user-replaceable.

100 Mb/s vs. 1 Gb/s: One problem I noticed is that a link might start at
1Gb/s, but after many weeks a momentary error (e.g. power blip) makes the link
drop to 100Mb/s or even 10Mb/s and does not automatically recover to the
higher speed without manually replugging. IIRC USB doesn't drop down to a
lower speed in this manner.

Crossover cables: IIRC if you are using purely switches (no hubs), you will
never use crossover cables. Other than that, pervasive MDI/MDI-X has silently
solved the problem. Though I wish the standard didn't create this problem in
the first place.

I do appreciate the fact that Ethernet treats nodes as peers. Whereas USB is
based on a tree rooted at the master and the master must poll devices to ask
them to transfer data. This tidy idea gets into trouble when, for example, a
phone can be either a host (OTG) or a client. Also it is impossible to
directly share USB devices between two hosts (e.g. a webcam accessible to two
computers simultaneously).
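The host-polled tree model described above can be sketched in a few lines of
Python. This is a toy illustration of the topology, not a real USB stack; the
`Host`/`Device` classes and their method names are invented for the example:

```python
# Toy model of USB's host-polled tree: devices never initiate transfers;
# the single host (the root of the tree) must ask each device for its data.

class Device:
    def __init__(self, name):
        self.name = name
        self.outbox = []          # data waits here until the host polls us

    def queue(self, data):
        self.outbox.append(data)  # a device can only buffer, never send

class Host:
    def __init__(self):
        self.devices = []         # the host is the sole root of the tree

    def attach(self, device):
        self.devices.append(device)

    def poll(self):
        # Only the host initiates transfers, by asking each device in turn.
        received = {}
        for dev in self.devices:
            received[dev.name] = list(dev.outbox)
            dev.outbox.clear()
        return received

host = Host()
cam = Device("webcam")
host.attach(cam)
cam.queue("frame-1")              # sits in the device's buffer...
print(host.poll())                # ...until the host polls: {'webcam': ['frame-1']}
```

The two-hosts problem falls out of the structure: each device hangs off
exactly one root, so a second computer has no place in the tree from which to
poll it.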

~~~
rocqua
Ethernet is definitely a standard with fewer issues than USB, but I'd say that
is down to a narrower scope. That has allowed us to converge to essentially a
working standard of 1Gbps over Cat5e.

------
sneak
At least it fails open, to the common baseline. “can plug in, play audio, sync
at usb2.0 speeds, and charge slowly”.

Compare and contrast 30 pin dock/lightning.

------
randyrand
"There’s usually no way to tell whether a given USB-C device requires
Thunderbolt, either — you just need to plug it in and see if it works."

Or how about something called Google? Who are these people, plugging random
devices into their computers that they know nothing about?

~~~
jotm
Don't even bother... if you're not designing for morons, you're not selling
anything these days, apparently.

~~~
throw5427
[https://en.wikipedia.org/wiki/The_Design_of_Everyday_Things](https://en.wikipedia.org/wiki/The_Design_of_Everyday_Things)

[https://en.wikipedia.org/wiki/Don%27t_Make_Me_Think](https://en.wikipedia.org/wiki/Don%27t_Make_Me_Think)

