USB-C is still a mess (androidauthority.com)
216 points by vo2maxer on June 6, 2020 | hide | past | favorite | 204 comments



Heh, just this week I was struggling to get a device to work with my laptop, until after 90+ minutes I realized there were different types of USB-C, and the one I had (the Apple charger cable) only did charging.

I needed one that transmitted video, so I then went to order a $10 USB-C cable, only to realize those also only did charging. Finally found what I needed, but I thought USB-C was finally "universal"; turns out much of that is just marketing...

Edit: Turns out the one I just bought ($40 USB-C Apple thunderbolt cable) doesn't even fit into the device because the edges are too thick. Ridiculous


I think it's one of those things where it sounds good as an idea but breaks down in implementation.

"Let's design a port/interconnect standard that does everything" is great in concept, but one manufacturer or another is going to leave out bits in the cable to keep costs down or not implement something in their device-side firmware. You end up with the situation we have now where there's insane levels of fragmentation, and you can never be quite sure whether two things are going to work together or not.


Clear branding would help with a lot of it. E.g. mark dumb cables that only do charging as USB-PWR while full feature cables are USB 50 or USB 100, for example, depending on data rate.


Which only covers a couple of the dozen different use cases now covered by USB. Which is why Thunderbolt and the like would require yet another marking.

All this nullifies the point of having a single port+cable.

If you're going to mark it, why not just modify the cable to have different keys: one for charging, one for higher speed, etc. Then at least you know right away it's not going to work.

But for that matter, once you have broken plug & port compatibility you might as well have just used different ports. Because that is what you have; they just look similar enough to cause confusion.

The whole thing is just a false set of choices brought on in large part by the same industry (mobile phones) which couldn't be bothered to make its parts compatible with actual standards. It's doubtful you will ever see any of those manufacturers build a fully compliant USB part either, since they have regularly proven unable to do it even with the simpler standards. This despite charging top dollar for parts built with the cheapest possible design.


Did micro USB have different keyings for different power levels? No, we just plugged in our device and it charged at the rate the charger could support, and if you needed fast you found a fast charger.

Broadly speaking faster charging and faster speeds are just... faster. The beauty of USB is that you can plug into a billion different chargers around the world, anywhere, and while the charge rates might vary, you can charge up anywhere. Keying breaks that.


There was only one power spec for USB cables before Type-C. Anything beyond that was a proprietary variant that, yep, wasn't always obvious upon inspection.

Not defending USB type C here.


Sorting by speed, plus an icon for higher power cables, covers almost all real world uses. It would not have to be complex, and would not nullify the point of a single port/cable.

Even just marking speed would mean a cable never unexpectedly fails to do its job.


Worth mentioning that USB Type C is the plug and it is indeed universal. The protocols are a whole different matter and finding a good cable is a challenge.


Or for that matter even knowing which ports on a device are which. I have a tablet that has two usb-c ports. Only one will charge the device.


They should've just color coded the connectors to capabilities... Like with USB 1/2/3 as black-white/blue/red

Now they had a new form so they could've just used black for charging, blue for some data and red for display. I guess that ship sailed already though.


That's true; we swapped physically visible complexity for invisible software complexity: one set of wires and one connector, but 24 different sets of features depending on who knows what.

I don't see mainstream users liking this at all.


Fortunately, time will likely sort it out. We had similar issues with the original USB at the beginning (when you actually needed an extra PCI card to support it). Now USB connections "just work". Hopefully the same will happen with USB-C.


It’s been 5 years since the 2015 single-USB-C MacBook came out. Patience is wearing thin. Also, I don’t recall the original USB having widespread problems with non-compliant cables that can fry your device or burn down your house. And of course there still aren’t any true USB-C hubs available.

All this would be even worse without the tireless efforts of Benson Leung and his merry band of USB-C avengers.


Yes, and unlikely to change unless the USB4 spec mandates anything. And so far that doesn't seem to be happening.

i.e., there is no real solution in sight, and USB-C will remain the same for the next 3 to 5 years. And yet proponents are still defending it. It baffles me. Why can't we just admit the mistake and walk back to a simpler solution?


There was only a very short time with hardware issues on USB. What really plagued it in the beginning was that Plug and Play (the big selling point for USB 1.0) was really mostly Plug and Pray.


In tech support we called it Plug and Die

Thankfully it more or less sorted itself out, but I still secretly curse 'driver first' installations, so counterintuitive for users


Not quite - I use a wired Apple keyboard (otherwise a great keyboard; it feels way better than the Magic Keyboard, and the USB port on the side connects to my mouse) that doesn't work on my MacBook unless I use the extension cord; on another MacBook it works without it. It might have something to do with the NVRAM battery being weak, but I didn't have time to explore further. [1]

In theory we should be able to get rid of cables, but that is also a messed up standard, I’d rather connect my keyboard that extends to the mouse than switch devices with Bluetooth.

Sounds like we need another standard: https://xkcd.com/927/

[1]: https://discussions.apple.com/thread/5676379


>In theory we should be able to get rid of cables

In theory, but no thanks. I don't want wireless peripherals.


What's the downside of a wireless keyboard and mouse that just work?


It doesn't just work. If I get a wireless keyboard and mouse that "just works" in theory but in practice does not "just work" always, then that's a load of hassle that far outweighs any advantages it may have.

In essence, the drawback is the lack of trust, because the promises of wireless that "just works" have been repeatedly made and broken. Fool me once, shame on you, fool me twice... I won't allow myself to be fooled again by such promises.

So if you do have a system that just works, then I'd refuse it on principle (because I simply won't believe your claims, no matter what you say) until at least multiple years have passed with it being widely used resulting in a general consensus that it really does just work in all cases, without there being e.g. a 1% minority talking about all the many edge cases they encountered where it turns out that it does not actually "just work".


>I won't allow myself to be fooled again by such promises

I get the sentiment, but it's kind of silly in practice. Things improve almost constantly.

Just trying to say that I have been using a wireless mouse and keyboard for years and they have never given me any trouble; in fact, even less trouble than wired peripherals have given me over the years. Maybe you should bend your rule and try out some modern gear if your previous experience was not with a modern device.


Maybe you're lucky. The relevant radio band is crowded enough where my wireless peripherals live that I get annoyed by them at least once a day.


They don't "just work". Whether it's a Logitech or an Apple mouse, my cat lying in front of the pad breaks the connection. (Not enough to disconnect, just adds jitter.)

When playing games, wireless keyboard/mouse introduce enough latency that twitchy platformers or competitive shooters are impacted. (Unless you buy super expensive game-oriented device)

Bluetooth devices are terrible... by design because of Bluetooth and its problems. Special dongles don't function well when plugged in on the back of monitors and sometimes need manual "replug" to register after a reboot.

Then there's battery replacement / charging.


I hope this doesn't get treated as an advertisement since I'm not affiliated -- but do try Logitech G900 (also branded as Chaos Spectrum) or G903 mice.

My wife played competitive Overwatch for a few months some years ago and she said the mouse feels exactly as a wired device. I play Quake every now and then and can confirm the same. Best mouse we ever used, we have 4 at home.

It has the added benefit that it disables its wireless circuitry when you plug it in -- it both recharges itself and becomes a wired mouse.

(Yes, both mice can be viewed as expensive. But we took the plunge to invest in reliable periphery and have only been happy with our investment.)


Wireless is subject to interference and introduces new security concerns. They run on batteries which you then have to replace. They tend to cost more. That's a lot of trouble to avoid a two foot USB cable.


"In theory" all of those problems can be solved, though. I want to know why someone doesn't want wireless even in theory.


Latency -- You'll never get a wireless mouse faster than fiber-optic, since the computer screen would interfere with optical wireless transmission ;)

I jest. The real theoretical issue is pairing (portability). I can't think of a wireless solution that provides all three of:

1. No dongle. I don't want an extra thing to keep track of.

2. Fast and convenient pairing ("plug" and play). Connecting to a different computer should not be a hassle.

3. Secure. Device must only pair with computers of my choosing.

Dongles, WPS buttons, and bluetooth number confirmation are existing solutions that provide any two. A physical plug combines "connect" and "authorize" steps, neatly sidestepping the issue.

The closest thing I could think of to solve this would be a standard, as ubiquitous as USB, for wireless connection: each device has a code that serves as protocol negotiation, identification, and authorization. So you open up a "connect a device" dialog on the computer, type in your device's code, and the computer automatically discovers the device and pairs with it. But even this is a compromise on both usability (another thing to remember...) and/or security (it could be printed on the device, like a serial number, but then...).
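Sketching the idea, purely hypothetically (the code format, the key derivation, and the whole protocol are invented here for illustration; no such standard exists):

```python
# Toy sketch of the hypothetical "type in a device code to pair" standard
# described above. The printed code doubles as identity and shared secret.
# Everything here (names, formats, use of SHA-256/HMAC) is invented.
import hashlib
import hmac

def derive_key(device_code: str) -> bytes:
    # A real design would use a proper KDF; a bare hash is just for the sketch.
    return hashlib.sha256(device_code.encode()).digest()

def pair(host_entered_code: str, device_code: str) -> bool:
    """Host proves knowledge of the code; the device only pairs on a match."""
    challenge = b"hello-from-device"  # would be random per-attempt in practice
    host_tag = hmac.new(derive_key(host_entered_code), challenge, hashlib.sha256).digest()
    dev_tag = hmac.new(derive_key(device_code), challenge, hashlib.sha256).digest()
    return hmac.compare_digest(host_tag, dev_tag)

print(pair("KB-1234-XYZZY", "KB-1234-XYZZY"))  # codes match, pairing allowed
print(pair("wrong-code", "KB-1234-XYZZY"))     # mismatch, pairing refused
```

Note this still has exactly the usability/security trade-off mentioned above: if the code is printed on the device, anyone with physical access can pair; if it isn't, it's one more thing to remember.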

And, of course there's the myriad practical issues mentioned elsewhere in the thread. I think if they were (truly) solved, I would buy wireless devices for my home setup, to do away with cable management, and keep cheap wired ones for use with other people's computers. I don't anticipate this happening for at least a decade or two.


I may be misunderstanding your point, but couldn't the USB-based pairing that Apple uses for their keyboards and mice solve functionally all of these issues?

There's no dongle (they're bluetooth) and pairing is dead simple (plug it in once and it's available everywhere). I don't think I've ever seen a security analysis of their peripherals, but the potential for a mitm attack on the keyboard seems like it would be equal to or lower than a wired version.

This assumes, of course, that you're using Apple products across the board but it is a solution nonetheless.


D'oh! Yes, if I understand correctly (I don't use Apple products[0], so this is the first I'm hearing of it), that would solve the issue, since USB cables are common enough that I wouldn't need to bring one with me just for pairing. I'll still be sticking to wired for practical reasons, for the time being, but maybe those will be fixed sooner than I expect :)

[0] I swore off of walled gardens after an unsavory experience regarding EOL of the 1st gen iPod touch — Apple effectively bricked my working hardware by taking all apps for it off the app store.


It’s not practical when you have to do it several times a day, and having a cable lying around and having to connect it sort of defeats the purpose of a wireless keyboard/mouse.

Same applies to Bluetooth speakers, by the way; pairing is just horribly annoying. I wish they’d replace the volume buttons with a dedicated pairing button instead of having a different way to pair on each device (do I hold the volume button or that other one? Was it 3 or 15 seconds?).

I must admit having several computers is a bit of an edge case, but still.


For a keyboard, wireless buys me nothing but introduces a number of new potential issues.

For a mouse, wireless buys you some convenience and freedom of movement. However (and this is slightly niche so it really doesn’t apply in general) I use a trackball, so the wireless advantages are 100% negated.


Being able to move around (several monitors) and easy use of a standing desk are some advantages. I have a workstation under my desk and a laptop that I frequently hook up to my monitors.


Philosophically, wireless is just going to be more complicated than wired. Say I'm sitting in front of 4 computers with 1 keyboard. Without interacting with the system at all, which computer is connected to the keyboard? With wireless it's difficult/impossible to tell, whereas it's fairly clear with wired.

KISS is the philosophical principle at play. https://en.wikipedia.org/wiki/KISS_principle


In theory there is no difference between theory and practice but in practice there is.


But we're not talking about practice. The difference between theory and practice is not a reason to dislike the theory version!


It is! In engineering, as discrete from science, the point of theory is to know what is going to happen in practice. If theory can't tell you, then it's useless for practical purposes. If the theory is ignoring important aspects of reality that make it useless, then what's the point of theory?

Eg In theory we can move faster than light, we just need a source of infinite energy. In practice, "infinite" is an impossibly large amount of energy, so any theories that allow for faster than light travel are only interesting for entertainment/theoretical purposes.

For wireless connections, the gulf between theory and practice is just too large, so theory remains theoretical.


The theory version doesn't actually exist.


It's more stuff that can go wrong. I understand wanting a wireless mouse for a weird setup where you don't want a wire from the mouse or /can't/ have a wire from the mouse to the machine. But otherwise I don't see the need, just like I don't see a need for wireless charging or wireless HDMI or a wireless southbridge.


Wireless hdmi: Though it’s technically not the same, think of chromecast, I think it’s nice to be able to stream stuff on the tv without the need to physically connect it.

Since power cables are still the easiest to find in a normal household (compared to any cables that carry data), I’d argue that anything except for these can have at least a little bit of value if it’s wireless and done in a convenient way.


The problem is unnecessarily occupying wireless spectrum for devices that can use a cable.

You can make a wireless monitor. I expect it to work just fine in a home environment. I expect it to fail in all kinds of crazy ways in an office with dozens of these monitors side by side among other devices also using wireless communications like phones.


It stopping working in the middle of a live presentation or in the middle of a game. My mouse is excellent, I love it, but I keep a magnetic USB cable next to the mouse pad for the times when I get a warning 10 seconds before it shuts down.


What if it was powered wirelessly, with a range as big as the cable?


This is not the world we live in. It would also need to work with many such devices in proximity, such as office cubicles or university lecture halls.


The Logitech G903 mouse has a special pad that also charges it wirelessly.


Maybe I am a stupid layman, but isn't that mouse pad basically a low-power induction cooker? Am I superstitious for not wanting to keep my hand on an induction cooker for hours per day?


As far as I know, yeah, it is. No clue how strong it is but I'd be worried a bit as well.

We both use the G900 anyway. It needs charging maybe twice a week; depending on usage it can even stretch to once every two weeks.


Potential interference, crowded spectrum, battery required, extra weight due to the presence of battery, and if the encryption is weak then you have a signal broadcasting all that you type in the vicinity which could be a security nightmare.


I didn't realize people had such low expectations for "just works". Getting a reliable signal for such a short range and low bandwidth is obviously possible, as is using proper encryption. Batteries can be light, with the weight compensated for, and charge wirelessly.

The reason I said "just works" was to avoid an exhaustive list of issues that have already been solved in other products, or could be solved with some effort.


The need for battery charging or replacement. That alone makes me shy away from them.


Batteries.


Just wondering: by Apple keyboard, is that the same as the old scissor-switch keyboard? And by Magic Keyboard, are you referring to the new keyboard on the MacBook Pro 16"?


I think so. From the page on apple keyboards [1] I’m referring to A1243 and A1843.

[1]: https://en.m.wikipedia.org/wiki/Apple_Keyboard


The solution to this is a "self test" every time a device is plugged in.

Each device should run a full test suite of the cable and the device at the other end, and if any fail, it should refuse to work.

Part of the test suite should be checking that the device at the other end of the cable is also running the test suite.

Everything should be tested. For example, if USB can transmit video over certain pins, the test suite should involve sending a frame of video, even if your device is a USB stick and doesn't need video. That way nobody can leave out bits of the spec.

USB is insanely fast, so thousands of tests should be doable in under a second, and in fact if the other device can't pass the tests quick enough that should be reason for failure.
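As a sketch, the proposal might look something like this. Everything here is hypothetical: the capability list, the `probe` interface, and `FakeLink` are invented for illustration; no real USB stack exposes anything like it.

```python
# Hypothetical "self test on plug-in" from the proposal above: probe every
# capability in the spec, and refuse the link entirely if any probe fails,
# even capabilities this device doesn't need.

CAPABILITIES = ["usb2_data", "usb3_data", "power_delivery", "displayport_alt_mode"]

def run_self_test(link):
    """Probe every capability over the cable/peer, not just the ones we use."""
    return {cap: link.probe(cap) for cap in CAPABILITIES}

class FakeLink:
    """Stand-in for a cable + peer that only supports charging and USB 2.0."""
    supported = {"usb2_data", "power_delivery"}
    def probe(self, cap):
        return cap in self.supported  # e.g. would really send a test video frame

results = run_self_test(FakeLink())
# Under the proposal, ANY failure means the connection is refused, so a
# charge-only or partial cable "works for nothing".
link_ok = all(results.values())
print(results)
print("link accepted" if link_ok else "link refused")
```

The downside, as the replies point out, is exactly this all-or-nothing behavior: a perfectly good USB 2.0 cable would be rejected for failing the video probe.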


Who would buy, e.g., a laptop that refuses to work with most existing cables / devices? What do you gain from having a flash drive fail to work because you can't send video over it?

I can see the value of having this test suite for you to run personally, when you want to test. Or having someone certify capabilities and publish the test results for a particular piece of hardware.

But I can't imagine anyone (especially a non-tech-savvy person) using it by default and without an escape hatch - it should "just work".


I think the point is that if every device did this from the start, there wouldn't be a market for anything but cables that work for everything (because a cable that doesn't work for everything then works for nothing), so all cables would just work, for everything.

That sounds pretty unrealistic to me, though.


Looked at the other way, who would buy a faulty cable that doesn't meet this self-test specification? That would be a feature imo. Cable makers will need to all start meeting the specification very quickly or nobody will buy their cable.


It always seemed like the drivers (or something) should be able to report what the cable supports, or whether a signal is being detected.

I had a ton of octopus cable swag and took a bunch on a trip without testing them, only to find out none of them charged my iPhone! Frustrating to say the least!


I bought a bunch of Micro USB cables for the same reason. Most of the ones I had on hand didn't have data lines as they were only for charging, which can be fun to troubleshoot as devices connected to the cables just don't show up.


Those are a special kind of evil. I always made sure to properly mark them to not spend time debugging non-working USB connections.


Some of them will let you trick the phone into charging with a full 2 to 3 amps, even though you've plugged it into a laptop or something that would normally show "charging slowly".

Obviously you're relying on the laptop circuitry to not burn out, but laptops are typically designed with very robust USB circuitry because people often short them out by putting in broken cables.
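The trick described here is likely the USB Battery Charging mechanism: a dedicated charging port shorts D+ to D-, and a device that sees that short assumes it may draw high current, regardless of what the host can actually supply. A charge-only cable with its data wires shorted at one end produces the same signal. A simplified sketch (real detection happens in analog hardware, and current limits vary by spec revision and vendor):

```python
# Simplified port classification in the spirit of USB Battery Charging 1.2.
# A cable whose data wires are removed and D+/D- shorted at the device end
# makes ANY port look like a dedicated charger. Current values are the
# nominal BC 1.2 / USB 2.0 figures; vendors often go higher.

def classify_port(dplus_dminus_shorted, host_enumerates):
    if dplus_dminus_shorted:
        return ("DCP", 1.5)   # dedicated charging port: draw up to 1.5 A
    if host_enumerates:
        return ("SDP", 0.5)   # standard downstream port: 500 mA for USB 2.0
    return ("unknown", 0.1)  # unconfigured: stay at 100 mA

# A plain laptop port, seen through a D+/D- shorting charge-only cable:
port, amps = classify_port(dplus_dminus_shorted=True, host_enumerates=False)
print(port, amps)  # phone draws fast-charge current from an ordinary port
```

Which is exactly why this relies on the laptop's port circuitry tolerating the extra draw, as noted above.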


You might want to keep them around, because sometimes you want to ONLY charge a device, without allowing the possibility of a data connection.


Sure but I always flag them with red electrical tape so their true nature becomes obvious


That's strange; I've never encountered a USB-C cable that did not support data at all. My understanding (and what the article suggests) is that charging and data work fine in general, just not as fast as the device can support. Basically they will charge or transfer data at regular USB or micro-USB speeds.


Ya, I found the answer on Apple's site:

https://support.apple.com/en-us/HT208368

> Compared with the Apple USB-C Charge Cable: The Apple USB-C Charge Cable is longer (2m) and also supports charging, but data-transfer speed is limited to 480Mbps (USB 2.0) and it doesn't support video. The Apple Thunderbolt 3 (USB-C) cable has a Thunderbolt logo on the sleeve of each connector. Either cable can be used with the Apple USB-C Power Adapter.


I definitely have encountered this. One was even a device intended for development via USB. For some strange reason, the USB-C cable shipped with the product only supported power, requiring the customer to buy another cable for data transfer.


Just to clarify, are you talking about a USB-C to USB-A cable, or a USB-C to USB-C cable? I think the USB-C to USB-A cables are just temporary while we get to USB-C everything, and so are not implemented that well. With USB-C to USB-C cables, I've never had a problem with data or charging.


I’ve amassed quite a collection of charging-only Type-C to Type-C cables over the years. People usually don’t realize they have such cables because they typically come alongside a charger, so people only ever attempt to charge devices with them.

Grab a USB-C to USB-C cable that came with a phone or similar device and was intended for use with a USB-PD charger and give it a test. You might get USB 2.0 speeds if you’re lucky, but you’re unlikely to see anything beyond that.


Nope, there are plenty of power-only USB-C to USB-C cables.

Even more fun, many of them don't properly list what power they can handle, and fry your device or catch fire if you put too much through them!


> Even more fun, many of them don't properly list what power they can handle, and fry your device or catch fire if you put too much through them!

That doesn't make sense. Any cable will be capable of many more volts than USB will ever put through it. How is anything going to get fried?

I suppose a cable could get too warm, but lying about capacity only takes you from 3 to 5 amps. That little bit extra should never be enough to cause a fire. And if it can't even handle 3, then the problem was not that it was lying about capacity. It also costs extra money to lie about being a 5 amp cable since that requires a chip.
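For scale: the heat dissipated in a cable's wires grows with the square of the current (P = I²R). A quick back-of-the-envelope, using an assumed wire resistance (an illustrative round number, not a figure from the USB spec):

```python
# Resistive heating in a cable: P = I^2 * R.
# R is an assumed example value for a cheap cable's power wires.

def cable_heat_watts(amps, wire_resistance_ohms):
    return amps ** 2 * wire_resistance_ohms

R = 0.1  # ohms, assumed round-trip resistance of the power wires
heat_3a = cable_heat_watts(3.0, R)  # ~0.9 W at 3 A
heat_5a = cable_heat_watts(5.0, R)  # ~2.5 W at 5 A
print(heat_3a, heat_5a, heat_5a / heat_3a)
```

So going from 3 A to 5 A nearly triples the heat (25/9 ≈ 2.8x) rather than raising it by 5/3; a sound cable absorbs that, but a badly undersized one may not.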


The only case I know of where a USB-C cable fried something, it was horribly broken. https://www.engadget.com/2016-02-03-benson-leung-chromebook-...


Volts don't melt power cables, amps do. Regular cables do 2.5 to 3A, the 5A cables are relatively rare and more expensive.


Yes, I know.

I mentioned volts for frying, and I mentioned amps for fire.

And I don't think either failure can be caused by a cable failing to properly list what it can handle. Do you?


It's crappy devices all around trying to implement an insanely complex spec.

The Nintendo Switch can get fried in dock mode (and Nintendo usually has pretty top notch QA/abuse testing outside of joysticks)

Here's a report of an A to C cable on fire from Anker (another pretty well regarded manufacturer) https://www.reddit.com/r/Android/comments/7j3k38/anker_usbc_...


Are you talking about the thing where sending 9 volts on a data pin fries the switch? I don't blame the spec or Nintendo for that one.

That cable is more of a complexity problem, but it wasn't because it misrepresented capabilities or anything. They put in a chip which didn't reset the connection when you unplugged one end. I don't know if that's really a spec problem, though.


The cable that comes with the Mac chargers is only a USB 2.0 + USB-C Power Delivery cable and doesn’t support USB 3.0/3.1 data rates or Thunderbolt.

USB Type-C cables that support Power Delivery can have 2 ground pins, 4 VBUS (power) pins and 1 CC (configuration channel) pin.


The first MacBook model with USB-C shipped with a USB 2.0 cable for its charger that only supported charging. I don't know if this is still true, though (I haven't checked the one that came with my newest one).


Do you have a source? I looked for a while and couldn't find anything except ones that say it supports data, just usb 2.0 speeds.


Found it here: https://support.apple.com/en-us/HT208368

> Compared with the Apple USB-C Charge Cable: The Apple USB-C Charge Cable is longer (2m) and also supports charging, but data-transfer speed is limited to 480Mbps (USB 2.0) and it doesn't support video. The Apple Thunderbolt 3 (USB-C) cable has a Thunderbolt logo on the sleeve of each connector. Either cable can be used with the Apple USB-C Power Adapter.


Right, but it literally says there it supports data transfer. So it's not charging only.


I haven't seen one either but I can easily believe it, as there are proprietary magnetic USB-C adapters that do not support data (they only have 5 or 6 pins). I can imagine a "normal" cable behaving the same way.


Two pins for power, two pins for data, a pin or two for cable detection, what's the problem with that number of pins?


2 data pins is USB 1.x/2.x; USB 3.x requires 6 data pins at a minimum.

USB Type-C cables that only support charging will have only 5-8 pins: 1-2 GND, 2-4 VBUS (power) and 1-2 CC (configuration channel) pins.

The standard config for charge-only cables (e.g. the Nintendo Switch charger) is 2-2-1.
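Those wire counts can be turned into a rough classifier. The counts below are illustrative, taken from this thread's discussion rather than from the Type-C spec, and the category names are made up:

```python
# Rough map from the wires present in a Type-C cable to what it can do.
# Counts are illustrative examples based on the parent comments.

CABLE_WIRES = {
    "charge_only (e.g. Switch charger)": {"GND": 2, "VBUS": 2, "CC": 1, "D+/D-": 0, "SuperSpeed": 0},
    "usb2_with_pd":                      {"GND": 2, "VBUS": 2, "CC": 1, "D+/D-": 2, "SuperSpeed": 0},
    "full_featured_usb3":                {"GND": 4, "VBUS": 4, "CC": 1, "D+/D-": 2, "SuperSpeed": 8},
}

def supports_data(wires):
    # USB 2.0 needs only the single D+/D- pair; 3.x also needs SuperSpeed pairs.
    if wires["SuperSpeed"] >= 8:
        return "USB 3.x"
    if wires["D+/D-"] >= 2:
        return "USB 2.0 only"
    return "charging only"

for name, wires in CABLE_WIRES.items():
    print(f"{name}: {supports_data(wires)}")
```

None of this is visible from the outside of the cable, which is the whole complaint: identical-looking plugs, very different wire bundles inside.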


> 2 data pins is USB 1.X/2.X USB 3.X requires 6 data pins at a minimum.

USB-C cables don't have to support 3.X to support data. You only need the two pins.

> The standard config for charge only cables (e.g. the Nintendo Switch charger) is 2-2-1.

That's a shame. It's really not that much effort to support 2.0 data.


I specifically bought a charge-only USB-C cable so I wouldn't have to worry about plugging my phone into an untrusted port to charge.


I have a somewhat similar problem, where my NVMe drive enclosure works with the USB-C port on one side of my laptop but not with the other, even though both ports should be identical spec-wise. Nothing makes sense any more.


Does this same thing happen in different operating systems?


I have a MIDI piano that takes USB 2 Type B. I bought a B-to-C cable and it works fine with my laptop and desktop. I decided I needed an extension cable (for connecting a microphone) and I opted for USB-C.

The piano's cable doesn't work with the extension, unless you connect it the right way 'round. USB-C shouldn't need a right way 'round...


Generalized extension cables aren't allowed by the USB spec, for several reasons including this one.


And yes, when you do get a USB-C cable that does data, it could still be one that only does USB 2.0 speed, or USB 3.1 speed but not USB 3.2 2x2 speed. And with USB4 there will be an additional layer with Thunderbolt 4.

And all these were known since Day 1 in 2016.


A while back I went down the USB-C charging rabbit hole with the simple goal of finding a 3rd party charger that could charge all my USB-C devices (13" Macbook, Motorola phone, Nintendo Switch) as fast as the charger that came in the box.

I found it tremendously difficult to make a purchase decision; the device manufacturers and 3rd party vendors provide very little information and everyone wants their own trademark: USB-PD, Quick Charge, PowerIQ, iSmart, VoltIQ, Turbopower... it just goes on and on.

However, I decided to hope for the best and bought the following two well-reviewed USB-PD chargers:

https://www.amazon.com/dp/B07PLR7T1M/ (RAVPower 61w wall charger) https://www.amazon.com/dp/B075WQQG7C/ (Necktek 45w auto charger)

They work out really well and met my requirements, with the exception that the Necktek doesn't provide the full 60 watts the MacBook can use, but I knew that going in.


It’s especially fun with the Switch, where third-party chargers reportedly will break the device. Although AFAIK that only applies to docked mode.


I was really annoyed when I found out that only some USB C chargers can be used to support docking mode in the Switch.

So stupid.


That's on Nintendo, they use their own protocol for video instead of following the spec.


Why not just use the MacBook charger for everything? I've had no issues using a variety of MacBook usb-c chargers for lots of devices.


The RAVPower is 50% smaller than a 13-inch MacBook Pro's AC adapter while providing the same wattage. Also, the RAVPower has two ports, a USB-C and a USB Type A so you can charge two devices at a time.

The RAVPower is also cheaper than buying a 2nd Apple AC adapter when you want to leave an adapter at your desk and have a 2nd one in your bag to use elsewhere.


What extra88 said. My Macbook charger is fixed at my desk and I wanted a travel charger that was compact and also had a USB-A port for other low-power devices I need to charge when I travel like my Fitbit and Kindle.


All these names invented by phone manufacturers have nothing to do with USB-C capabilities and everything to do with their branding.


It's worse. The phone manufacturers have one set of names, and the 3rd-party charger manufacturers like Anker and RAVPower have their own. My experience is that USB-PD actually works as advertised, but it's really difficult to tell if a device actually supports USB-PD.


There has to be a post-mortem on this.

My take is that the standard is too lenient and the cables aren't color coded.

Lenient standards mean that "usb-c" could support 1W or 10W and your phone plays footsy to figure out how high it can go. Pick a standard, pick a level, don't exceed the standard even if you can.
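That "playing footsy" is USB Power Delivery negotiation: the charger advertises a menu of voltage/current offers and the device requests one. A toy model of the exchange (the offer list is a made-up example, not any real charger's advertisement, and real PD runs over the CC wire with far more message types):

```python
# Toy model of USB-PD negotiation: the source advertises offers of
# (volts, max amps); the sink requests the best one it can accept.
# The offer list here is an invented example.

charger_offers = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 2.25)]  # a "45 W" charger
phone_max_volts = 9.0  # assumed sink limit for this example

def negotiate(offers, max_volts):
    """Pick the highest-wattage offer the sink can accept."""
    usable = [(v, a) for v, a in offers if v <= max_volts]
    return max(usable, key=lambda va: va[0] * va[1])

volts, amps = negotiate(charger_offers, phone_max_volts)
print(f"negotiated {volts} V @ {amps} A = {volts * amps:.0f} W")
```

The leniency complaint is that every step here is optional: a compliant "USB-C" source can offer nothing but 5 V at a trickle, and you can't tell from the port.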

Color coding? Yup. Colors. A human must be able to visually determine what specs a cable matches and doesn't match. Before, different standards had different ports, so color didn't matter. But usb c wants to be the same port for multiple standards. Fine. Color code it. It can even be a small dot by the plug, as long as it doesn't wear off.


> Color coding? Yup. Colors. A human must be able to visually determine what specs a cable matches and doesn't match.

Shapes (of, e.g., logos) are much better than colors as a universal label for this, though colors as an additional marker are fine. Colors alone are a problem because of common defects of color perception.


yeah a 2d logo on the connector or a cable pattern is better, good call.


How about keyed connectors and sockets? Why have incompatible cables with identical form factors?


We’re talking about a standard that was invented primarily for laptops. That is, it was invented to replace the wide variety of single purpose ports with a few universal ones. Keying the connectors defeats that purpose since you now need different ports again.

I know people love to complain endlessly about Apple taking away all of the single purpose ports on MacBooks and replacing them with USB-C. The problem the article is describing is a problem of counterfeit and mislabeled or unlabeled cables as well as improperly designed devices. For most users this isn't a problem and the ability to charge the laptop and plug in any device in any port is very convenient.


In theory, that would be convenient. In practice, you get stuff like this:

https://apple.stackexchange.com/questions/363337/how-to-find...

TL;DR: if your mac has high cpu load, try using the other side port for charging.

Note that this is irrespective of the mess the article talks about, which just exacerbates the problem.

In USB-C land, some (Endpoint-to-endpoint) connections are definitely a lot more equal than others.


Changing a cable that can charge my phone (but slowly) into a cable that can't physically fit into my phone... probably isn't an improvement?


The connectors could be keyed such that backwards-compatible functionality also has backwards-compatible connectors.


Can you explain how this would be different from the current situation? When both cables and devices are usually backwards-compatible, what do you even key?

The main thing I can think of to key out is charging-only cables, but those aren't even supposed to exist as far as I know. Cables are all supposed to have at least the USB 2.0 wires.


USB 3.0 B-Type and USB 3.0 Micro B will allow USB 1.1/2.0 cables to connect to the respective ports, but at lower speeds.

Last two on https://www.cablestogo.com/learning/connector-guides/usb


So you want keying that prevents you from using a too-good cable, like with micro? Cables either have to match the port speed, or be worse?


It's less about being prevented from using a cable thats "too good", and more about knowing what a given cable supports without having to plug it in. If a cable and both devices had this hypothetical USB-C+Thunderbolt keyed connector, then it would be clear by visual inspection that all 3 components necessary support that level of connection. Instead we have the confusing situation that we're in where the only real answer is to try it.

Even at $70, the cable is likely the cheapest component of the three and I would rather know that I've got something that works rather than take two trips to the store.

Yes, on the off chance that you're teleported somewhere and end up with a laptop, 4k monitor, and the slow cable, you won't be able to get 4k because of the cable, but imo that trade-off would have been worth it compared to the current confusion.


But USB-C is universally compatible with capability detection and least-common-denominator functionality, so you are back to one connector with no actual keying if you do that.


They aren't incompatible, that's rather the point with USB-C. Any combination of compliant cable and socket works with the capabilities of the least capable component in each dimension. Everything just works, though some combinations work better for certain tasks.


I’d rather be able to charge my devices slowly than not at all. To be honest, I really don’t care about fast charging. I’m carrying around battery packs either way.


Exactly, the main complaint with using the wrong cable is the charging is only at regular usb speeds, or the data transfers at usb 2.0 rates. That's way better than nothing, and usually doesn't matter too much.


Agreed. If a cable plugs in it should "work", and all usb c cables "work". I can use a crappy usb c cable to charge my laptop. The issue is that how "well" the cable works varies wildly and is not visually determinable. I can use this cable to charge my laptop...but how long will it take? That's the issue


> How about keyed connectors and sockets?

The only cable I can think of that had keying for capabilities was DVI, and that turned out awfully. So many times you had cables that could work, but the keying rejected them.

Keying is much worse than simply having cables of different speeds.

> Why have incompatible cables with identical form factors?

They're not incompatible.


Everything (recently) old is new again.


My mind is blown.


I vote shapes: wider = bigger pipe, the analogy works.

Plus I would delight in saying, "Yes, but higher power means bigger electrons..." to non-engineers.


white with a blue stripe or blue with a white stripe?


This article misses the biggest issue with usb-c cables, which a lot of people don't even realize. A lot of cheaper devices can only be charged with a usb-c to usb-a cable, and not a regular usb-c to usb-c cable (C2C). Basically, they don't support C2C but people don't test or check this until they travel and realize the cable they brought doesn't work on these devices.

The cause is that the devices themselves are incorrectly implemented. This ranges from toothbrushes, to flashlights (where 99% of usb-c flashlights can't do C2C), to shavers, and small electronics/appliances. The problem is there isn't a standard way to know about this (or a name for this problem), so it's hard to tell what supports it or not. Generally, devices made by well known brands support C2C, but not cheaper Chinese-made devices, but there are exceptions in both cases.


You literally need two 5.1k resistors attached to your USB-C connector to get 5V@3A out of any PD-compliant charger.

And yet, even the Raspberry Pi foundation messed this up when they designed the first revision of Raspberry Pi 4.

It's easy to say "USB-C is a mess", but we should really be shaming the individual companies who mess this up. The spec is public, parsable by any competent EE, and there's tons of vendor support for standards-compliant solutions.
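To illustrate what those two resistors buy you: a sketch of the sink-side detection they enable. A compliant sink pulls each CC pin down through a 5.1k Rd resistor, and the voltage the source's pull-up produces on CC tells the sink how much current is on offer. The threshold values below are approximate illustrations, not exact spec numbers:

```python
# Sketch of sink-side Type-C current-advertisement detection.
# A sink terminates each CC pin with a 5.1k Rd resistor; the source's
# pull-up (Rp) then sets a CC voltage the sink reads to learn the offer.
# Thresholds are approximate, for illustration only.

def advertised_current(cc_volts: float) -> str:
    """Map the voltage seen on a CC pin (with Rd fitted) to the source's offer."""
    if cc_volts < 0.2:
        return "no source attached"
    if cc_volts < 0.66:
        return "default USB power (500/900 mA)"
    if cc_volts < 1.23:
        return "1.5 A at 5 V"
    return "3.0 A at 5 V"
```

The Pi 4's original sin, reportedly, was sharing one resistor across both CC pins instead of giving each its own, which e-marked cables misdetected, so they refused to power the board.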


My Sony noise cancelling headphones (WH-1000XM3) will only charge with an A2C cable, and not with a C2C cable. I had to purchase an extra charging cable in case I lost the one that came with it, despite owning multiple C2C cables already that should work fine.

If you're not going to make it compatible, why even put a USB-C port on it in the first place?


I've charged my 1000XM3s with:

- MacBook charger (USB-C / USB-C)

- Samsung phone charger (USB-C / USB-A)

- MacBook Thunderbolt dock cable (USB-C / USB-C)

- Dell XPS 13 charger (USB-C to Power)

- Razer Blade Stealth 13 charger (USB-C to Power)

- Anker Powerbank (USB-C to USB-C)

and some others. They all worked.

And this list pretty much shows why USB-C is so amazing even if it might confuse some people on the edges. I don't need to lug around a bunch of proprietary chargers and cables.


Woah, this is complete news to me! I have tried with my Macbook chargers (90W and 60W), an Apple USB-C cable, and an Anker USB-C cable and nothing will charge them. They are running the latest firmware too. The only thing that works is the original A-to-C cable, and a cheap one on Amazon that I bought as a backup....

Guess I need to do some more testing — thank you!!


I have the same headphones but do not have the same issue. I regularly charge using a C2C cable.


Same here. I’ve had no problems using my Apple USB-C charging hardware with my headphones.


I use my MacBook Pro charger with my xm3s regularly.

Charges super fast.


That’s not at all true, I charge mine with a c2c cable from my MacBook charger every single day.


They work with c2c charging. Your source is probably not playing nicely with power delivery. For instance, Apple's original usb-c chargers lacked certain standard profiles and don't work with non-Apple products.


The XM3s are well known to support C2C charging. It's possible you have a defective cable or headphone if it's not working for you.


I suspect it's your charger. Mine charge fine with C2C.


It’s a mess, but one that’s still an improvement on what came before.

I can now charge my phone, laptop, gaming system, VR headset, and USB battery pack with the same cable. Sooooo much nicer for trips.


Agree it’s a mess. I also agree it’s a big improvement. I plug in usb c to my monitor, and I get Ethernet, display, usb, and power. That’s all I ever wanted and I don’t think about it anymore.


Agreed. I don't mind that sometimes it takes longer to charge my device. The only real problem I would say is that charging only cables are not easily distinguished from charging + data cables, but that's a problem for all previous generations of small USB.


I think the world needs a gizmo with two USB-C terminals, one for each end of a USB-C cable, and LEDs that light for each feature supported by the cable.

For data, there could be a range of LED to display the throughput of the cable.

Because honestly, I have some cables and no idea what some of them can and cannot do.
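In software terms such a gizmo is mostly continuity checks on groups of conductors, mapped to features. A hypothetical sketch (the wire-group names follow the USB-C pinout; the measurement hardware is assumed):

```python
# Hypothetical feature report for a USB-C cable tester: given which
# conductor groups show end-to-end continuity, report what the cable
# can plausibly do. Wire-group names follow the USB-C pinout; the
# actual continuity-measurement hardware is assumed.

def cable_features(wires: set[str]) -> list[str]:
    features = []
    if {"VBUS", "GND", "CC"} <= wires:
        features.append("charging")
    if {"D+", "D-"} <= wires:
        features.append("USB 2.0 data (480 Mbps)")
    if {"TX1", "RX1", "TX2", "RX2"} <= wires:
        features.append("SuperSpeed data / alt modes")
    if "SBU" in wires:
        features.append("sideband (e.g. DP aux in alt mode)")
    return features
```

Continuity alone can't tell you the rated speed or current, which also depend on wire gauge and the e-marker chip, so the LEDs would be a lower bound rather than a guarantee.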


I have seen a couple of those, but nothing very user-friendly. E.g. https://bit-trade-one.co.jp/adusbcim/


That's brilliant. Thank you! I would have never found that :-)

I wonder if software could do it, using two free ports on my laptop, a MacBook Pro.


Indeed it would be useful. This kind of cable tester device has existed for Ethernet for decades now.


I saw such a thing recently, I think on Hackaday. But I can't find it now. It was pretty much what you say: a sort of pass-through connector for USB cables which would light up an LED when the wire made continuity.


USB-C is a great system. Power delivery is incredible and works really well.

Having different capabilities in host, device, and cable is, of course, completely natural. I wouldn't want all my cables to be thick and expensive thunderbolt cables.


Thick? I never would have even thought to complain about them. Sure you’re not thinking of firewire? Also doesn’t thunderbolt literally use the same form as usb-c these days?

Edit: yup, thunderbolt 3 uses usb-c connectors.


Thunderbolt cables are super thick. And short too.

With my laptop I had a USB-C dock and then upgraded to a TB dock to get 4k video out. The USB-C cable was decently thick and about 1.5 meters. The TB cable was about 0.5 meters long and super thick. Apparently the length is based on physical constraints with transmitting data at very high rates. And obviously the more individual wires you have in the cable, the thicker it is. And you need extra layers of shielding etc.


FYI, you can easily buy 2m tb3 cables


FYI, Thunderbolt 3 does not support full speed above 0.5m in length in passive cables, so instead of 40Gb/s you’re getting 20Gb/s and reduced capabilities.


Yeah, they just cost like $70+ and are rather thick, which makes them very sensitive to any kind of carrying around. Regularly breaking a $70 cable in a bag isn't very affordable :/


My last thunderbolt cable (something like 20 cm long) cost ~$15 fwiw; had to buy one to get my e-gpu to work with my laptop.

To be fair, I completely missed that the thickness being discussed was the cable itself, which TBH I don't even notice.


The super thin USB-C 2.0 cables are just ideal for charging. The thicker ones with the USB 3+ signaling are great for high data rates. Just pick the one you need for the job.


The high power levels also need thicker cables, a 100W USB-C cable needs to carry 5 amps or so.


There needs to be a standard for labeling cables though. Like putting the data speed and PD wattage on the cable end perhaps.


Charging accepts combinations of voltage and current (amps), and the cable limits just the current (3A or 5A), so a wattage label is irrelevant if the voltage and current are missing.


PD profiles are grouped by wattage and consumers shouldn't have to do math to see how fast their phones will charge. Surely they could be classed by the fastest PD profile they support.
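The math in question is just volts times amps over the standard PD fixed voltage levels, capped by what the cable can carry. A minimal sketch, assuming the standard-PD fixed supplies of 5/9/15/20 V and a 3 A or 5 A (e-marked) cable limit:

```python
# Illustrative USB Power Delivery math: deliverable power is the best
# voltage both ends support times the current the cable allows.
# Standard PD fixed supplies: 5/9/15/20 V; cables carry 3 A, or 5 A
# if e-marked. Values here are a sketch, not a full power-rules table.

PD_VOLTAGES = [5, 9, 15, 20]

def max_watts(charger_volts: list[int], cable_amps: float) -> float:
    """Best-case power for a charger's supported voltages over a given cable."""
    return max(v * min(cable_amps, 5.0) for v in charger_volts)
```

So a "100 W" charger over a plain 3 A cable tops out at 20 V x 3 A = 60 W, and only a 5 A e-marked cable unlocks the full 100 W, which is exactly the kind of thing a wattage-class label on the cable end would make visible.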


USB-C on a USB device (UFP-only) system is wonderful. It's quite simple--two resistors--and that gets you 5V at 900mA and USB 2.0 High Speed (480Mbps). Or, if you'd like 5V at 1.5A or 3.0A, you generally need a very small and simple chip to handle the CC monitoring.

That's it.

The problem starts occurring when people want to do a zillion things on the port--charge their phone, charge a device, communicate with a device, serve as a device, transfer at Gbps rates, etc.

Remember OTG--oh, yeah, nobody does because it worked like crap. USB-C actually did what OTG was trying to do.

And, the problem mostly isn't USB-C, it's manufacturers:

> All the new port has done is push some components out of the laptop and onto the other end of the cable.

Exactly. And it's the reason I STILL can't get a useful USB-C hub.

Samsung could have avoided the problem with PPS with <$10 of components, instead they pushed those out. Samsung doesn't even supply the "good" charger by default with their device. Don't complain about USB-C because the manufacturers are doing weird things with it to cheapen their bill of materials.

> USB-C audio is basically dead as Bluetooth takes over.

Until latency matters. Then suddenly Bluetooth is a pile of suck. If you're watching a video (the device can delay the video to match latency) or listening to music, fine.

If you're watching something live or you are filming something live, then the latency is maddening.

And that doesn't even start to deal with people who want some sort of audio app. There is a reason why "rhythm" games simply don't exist on mobile.


OTG worked well enough, when I was able to get the right cable (or device - there are some USB flash drives with micro-USB ports that lean on USB OTG to work).

USB-C docking stations are finally hitting the market. They're pricey, but there are some with multiple USB-A ports, USB-C ports (including data), HDMI, DisplayPort, Gig Ethernet, and 3.5mm audio jacks. There are also finally Thunderbolt docks which have thunderbolt ports that connect via USB-C. (They are even more expensive though.)

> There is a reason why "rhythm" games simply don't exist on mobile.

Rhythm games' heyday aka DDR and Guitar Hero may have passed, but there are plenty. https://my-best.net/16982 There are tons of mobile audio apps as well. Korg has a bunch of them. https://www.korg.com/us/products/software/


> Korg has a bunch of them. https://www.korg.com/us/products/software/

No Android and no bluetooth for exactly the reason I mention.

Korg actually has a DAW-lite for the Nintendo Switch but not Android precisely because you can manage audio latency on the Switch but can't on Android.


> the problem mostly isn't USB-C, it's manufacturers

So can't we complain about the joint results of the manufacturers AND the USB consortium and thus label USB a failure (or a success if someday they overcome this)? I think this is turning into a no-true-Scotsman discussion.


As for the standard, you say this like there is some pure, platonic communication standard that is somehow magically better somewhere out there.

Everything that exists in USB-C is there because someone really wanted it there.

Would you prefer to have 14 incompatible charging standards? That's what you had before USB PD. Now, my 45W USB-C GaN charger is the size of a deck of cards and can pretty much fast charge anything I plug it into (laptop, phone, tablet, etc.).

And you can complain about video, but I've actually had HDMI cables that don't work because I can't tell whether they support HDMI 1.2/1.4/2.0/2.0b/2.1, ethernet, 50Hz, etc.--so it's not like HDMI is magically better. And for years I've been unable to figure out why full-size Displayport cables all seem to magically work while 90% of mini-Displayport cables fail horribly.

And manufacturers, and not the USB Consortium, are absolutely to blame for the fact that they used to include 4 USB A ports but somehow can only cough up 2 USB-C ports on even the most expensive laptops. And somehow they screw even that up such that one port overheats your computer.


USB-C should’ve been the premium cable- high power charging and fast data transfer guaranteed. USB-A should’ve been used as the cheap cable for simple connectivity and lower power charging. Devices would naturally sort themselves out over time depending on their price bracket and the volume savings over time of USB-C production.

Instead, the Forum destroyed the entire purpose of USB-C from the getgo by allowing the “universal” cable to be disintegrated for cost-cutting purposes. Now it’s no longer universal. How were they this dumb?


I half agree.

USB-C should have had clear marking standards for TWO cables (at introduction).

CHEAP: dirt cheap, USB 2.0 baseline, minimal PD (compatible with the best USB 2.0 ports), plain cable.

Full Feature, USB-C premium (with a single color band and trademarked logo in color signifying a 'tested run' of product).

Newer versions should have then embellished on that logo.


So this is something I don't get about cable production. Why are companies selling cheap cables with just some wires connected, but not slightly more expensive ones with all wires? For example if there's a $10 charging-only cable, why isn't there a $15 fully-connected one - instead you have to go with more expensive brand usually at let's say $40+.

Most of the cable is already there, production is happening, distribution is happening... so why is it not used for more products?


Firstly because there's no market for it. The bulk of all USB-C devices out there are never connected to anything but a charger. Ever. When was the last time you cabled your phone to a PC or a USB client device? It works, sure. But statistically, no one does it anymore. So the market for "slightly more expensive everything connected" cables is tiny compared to the market for cables we actually want.

Secondly because the product sucks. A USB1/2 cable had four conductors. A proper USB-C cable requires NINE (with significantly higher signal quality requirements on the USB-3 data line to boot). They are thicker, stiffer, harder to connect, harder to stuff into a purse, and generally a pain to work with compared with the charging cables we actually want.

The root cause is this: USB3 was, in hindsight, a mistake. They squished a completely new electronic bus into an old connector for no good reason. USB-C compounded that mistake by defining a superset connector that does "Everything A USB Device Has Ever Been Asked To Do", and then selling itself into a market (phones) where no one wanted that junk.

And this is the result.


Because between the $2 cheap charging cable and the $40 do-all cable there are many variations. The $2 cable can be limited to USB 2 speeds (480 Mbps) and 3 amp charging, but when you go for better you can pick and combine more charging current (5A), better data speed (5 Gbps or 10 Gbps), or both. Current requires thicker wires while data speed requires better electrical quality, so putting everything together is more expensive. Still, you can find good cables at $15, just not in the Apple store or on Amazon; their costs and margins make the $10 cable from China retail for over $25, even $40.


Remember there are companies making USB cables with the data lines omitted. If you can save a few cents without the customer immediately noticing, it will be done.


If it bothers you that USB-C is still a mess, but you want USB-C to succeed, I suggest putting your money where your mouth is and only supporting manufacturers that follow the rules.


Good luck figuring out who follows the rules.


There's no shortage of reviews for products that confirm or deny spec compliance. If you can't find one easily, maybe it's worth waiting to purchase that product.


Reviews generally just say "works for me" or "didn't work for me" and you have no idea if the problem was at one end of the cable or the other, or the cable, or the reviewer.


I think these articles usually ignore the fact that in most cases USB-C is a strict improvement over the previous state of having a bunch of proprietary ports with proprietary cables and chargers.

When it fails, even in most cases, it fails in a way that you still get some result - your device still charges, your data still transfers. More slowly, but it still works even if you need to grab a suboptimal cable.

In all the other cases, you're pretty much in the same (or better) situation as you'd be with specialized proprietary port - you need a special cable (e.g. Thunderbolt 3 for fast transfers or 4k@60Hz image) which then ends up being compatible with slower and less capable devices anyway.


Messier yet is iPad Pro using USB-C while iPhone 11 Pro ships with a lightning cable that's USB-C on the other end!


So what happens when you connect the two together? Can you charge the iphone from the ipad, or transfer files?


I don’t believe you can transfer files, but you can definitely charge an iPhone with an iPad if you use a USB-C to Lightning cable.


But how do you charge an iPad using an iPhone?


Some older usb-c devices would start charging based on order of plugging in. i.e. plug the phone side in first and the phone will start charging your iPad. Seems the defaults are smarter now and phones always prefer to receive power.


Charge yes, transfer probably not.


eh, USB-C is a mess, but it's got a messy problem.

I've found it to be a solid improvement. I would still be delighted to swap my thicket of USB cables, and a bizarre assortment of sockets over the past fifteen years, for a world of USB-A and USB-C.


I think articles like these miss the fact that even when USB-C "fails", in the vast majority of use cases it fails in a way that you still get your result - just not optimally.

Charging my phone with the wrong USB-C cable will still charge it when I need it, although very slowly. But I will end up with a usable phone. Not having a proprietary iPhone cable means my iPhone is a dead brick even if I have another device with USB port next to it.


There were only ever 4 sockets. Rectangular USB-A for host devices and the weird modified square/rectangle for client devices (printers, docks). Micro-USB for tiny client devices, which pretty much completely replaced mini-USB.

One of the factors in the USB-C switch was that what were traditionally client devices, like phones, could now also be hosts. So now USB-C devices need resistors connected to a CC line to identify as host (power source) or client (power sink).

Also, there were good reasons for having multiple sockets. USB-C might be rated for 10k or whatever insertions, but it is quite a bit weaker than USB-A. The ports wear out and are not as resistant to stress since they are smaller and shorter. I'm pretty sure the USB-C port on my laptop died from just plugging & unplugging a dock for 2 years. Even if it wasn't completely dead, it no longer makes a secure connection. Of course USB-A wears out too, but not as much.


There were 10 receptacles/9 plugs[1] before USB-C. Of those 19 total, the only ones I've never seen in the wild are the USB Mini-A receptacle and the USB Micro-AB receptacle.

Receptacles: USB A, USB 3.0 A SS, USB B, USB 3.0 B SS, USB Mini-A, USB Mini-AB, USB Mini-B, USB Micro-AB, USB Micro-B, USB 3.0 Micro-B SS

Plugs: USB A, USB 3.0 A SS, USB B, USB 3.0 B SS, USB Mini-A, USB Mini-B, USB Micro-A, USB Micro-B, USB 3.0 Micro-B

[1] https://en.wikipedia.org/wiki/USB_hardware#Host_and_device_i...


By there were only ever four sockets, I meant in practical terms, not theoretically. Also, Samsung (and a very small number of other androids) had micro USB 3 for a year or two, but it worked with normal micro USB cables and they went back to micro USB shortly afterwards.


> USB-C might be rated for 10k or whatever insertions, but it is quite a bit weaker than USB-A.

Is it? My sense is that USB-A ports just don't see many insertion cycles. If you go to say, a library PC where the USB-A ports are seeing a dozen insertions per day, they lose any grip or friction after a couple years.


When I've thought about connecting all my devices (laptop, cellphone, Nintendo Switch) and then researching a bit about it, I've deleted that thought out of fear.

USB-C is a total mess; I might burn my phone if the voltage is higher than required. The only thing I know is that my laptop won't be affected, because I've plugged it into the phone charger a couple of times by mistake.

Also, beware of the Nintendo Switch, it seems that the charger spec was modified a bit by Nintendo and you might damage your console if the original parts aren't used.


Nothing personal against you, but I think you're going off misleading information. It's not helpful to say "I'm afraid [blah] might happen" when [blah] is simply not happening in any real way, as much as plugging a kitchen appliance into an electrical socket might cause it to explode.

If you do a bit more research, you'll find the problem is basically gone, and you're talking about a very specific situation that required multiple manufacturer errors.


IIRC the only problem with the Switch has been third party docks.

I randomly interchange like five different devices with the same cables/ports (Macbook, Switch, Pixel 3, Quest, Battery pack) and have encountered no issues so far. A huge improvement compared to what I used to have to deal with.


Ironically, just before usb-c started taking over, all of my devices used the same cable for charging. Headphones, walkie talkies, portable batteries, headlight for my bike, Kindles, and my phones. All used the same cable. Was nice.

Now, our phones and Switches use usb-c. Everything else is still the older cables.


Yeah, micro USB was okay for that for lower power devices. The major two flaws were that it couldn’t handle bigger stuff like laptops, and the cable end itself was prone to failure (I’ve had way fewer problems with USB-C cables failing).


> I might burn my phone if the voltage is higher than required

Very unlikely. The standard might be a mess but the device signals what voltage it needs and the charger delivers. You don't need to live in fear that plugging your phone charger into your laptop is going to destroy it.

Your Switch can be charged by any of your USB-C chargers. The only issue is docking, it is not related to USB-C charging.


Assuming you trust that the negotiation is implemented correctly...


If you have a hostile charger all bets are off, but the cheap USB-C chargers just do 5V/3A (or less current) and the USB-PD chargers all use one of a few controller chips.


> Assuming you trust that the negotiation is implemented correctly...

Well, if you use a proprietary plug and charger, you need to assume the same thing. A cheap broken iPhone charger with proprietary dedicated Lightning port will fry it the same way as a cheap broken USB-C charger will fry a USB-C phone.


Same thing with microUSB though, Qualcomm QuickCharge chargers (for Android phones) can already fry your phone with 12 V


I have a Nintendo Switch (2nd gen). It can charge from its own charger, the Pixel 3a charger with Pixel 3a USB-C cable, and the 3x USB A wall-wart from IKEA with the IKEA USB-A to USB-C cable.

The Switch charger works fine with a DJI Ronin S and the Pixel 3a.

Don't have any experience with laptop USB-C yet as I'm holding on to my 2015 MBP until its last breath...

As for the Switch itself, the most annoying thing is that it requires an active HDMI adapter chip (there are a couple working ones on Amazon that use the same chipset as the official dock) instead of supporting HDMI/DP alternate mode.


Looks like the good ol' EU has to step in again and tell Apple, Huawei, Google and co to stop their bullshit, just like they did before micro-USB was forced to become the norm.


The author is saying linear regulators are better for high current, which is questionable. He even references Huawei, but Huawei chargers don't even use linears. OnePlus makes sense because their Warp Charge is ridiculous: they're doing 5V/6A, which isn't even possible using the USB PD protocol, and they're using a type A to C cable, so they're clearly doing something proprietary there.

If I had to guess why OnePlus can't step down higher voltages, it's probably specifically designed for 5V, so it only supports the 5V/3A USB PD profile (hence the ~15W with a PD charger). No clue why they didn't allow 5V/5A for 25W over USB PD, since that is possible. I don't think the author mentions anywhere in his article which 60W USB PD charger was used, so it's hard to say it is a valid test. 60W normally indicates 20V/3A, so it's possible the charger could not provide 5V/5A even if the OnePlus 8 Pro did support that.

Anyway, huawei's own PD charger can charge faster over the PD protocol than this guys "60W PD Charger" according to https://www.firstxw.com/view/230511.html.


I think USB4 will fix some of this confusion because it will require Thunderbolt cables, which do everything.

Hopefully they require some sort of labeling to identify them.


There's an excellent sub for usb-c hardware: https://www.reddit.com/r/UsbCHardware/


Is power charging speed a parameter in the spec? I thought it was simply max current delivery, but if the charger can't handle that then it sounds like a charge spec and not a USB issue? Where's the line?


It's part of the USB-PD specification. Some manufacturers still add proprietary extensions though, but in most cases the devices can still fast charge in other chargers.


Let's put (2021) in the title please. This is not going to be resolved, and that's intentional: USB-C is universal in form factor, not in functionality.


Can the latest usb be "defanged" to only provide power and no data, as a security measure? Or are they too smart for that, now?


You can remove USB communication between the hosts with a cable that doesn’t have any of the USB data lines, but as far as I know the lines for power delivery are required and there is still communication between the PD controllers.

You can use a C to A cable (without the data lines, or plugged into yet another adapter that removes the data lines) but that will only work for phones or other low-voltage devices as you can’t get more than 5V out of it.


Thanks for the info.


The USB-C connector may have solved USB-A's problem of which side is up, but as a consumer I don't even want to deal with that - the 3.5mm audio connector doesn't have an 'up' or 'down', and given the smarts that USB-C requires in devices, it seems a barrel connector like the audio jack could be used for power and USB 2.0-speed data.


The article mentions lack of ports being a problem. I certainly would be okay with giving up my audio jack for a phone with two USB C ports.

But also, this is a problem on laptops. The newest Dell XPS 13" actually decreased the number of USB C ports from 3 to 2. So now you use one port for charging, and your laptop only has a single USB port.


> The newest Dell XPS 13" actually decreased the number of USB C ports from 3 to 2. So now you use one port for charging, and your laptop only has a single USB port.

Frankly, the consumer is the idiot for mindlessly buying and not complaining about reduced functionality. Same with the newer tablets not having headphone jacks.



