For most of computing history, I was generally under the impression that cables were "dumb" -- each pin connects to a wire which connects to a pin on the other end. There's fancy bundling and twisting and whatever involved, but it's still ultimately just conductive wires.
When did cables start getting chips, and why? Did Apple start it or somebody else? Is it solely to try to prevent third-party manufacturers? Is it for the cable to announce to the port that it supports certain specifications of power or data? (But why would that require a chip instead of just some kind of "dumb" extra pin that has some basic electrical property that can be read?) Is it something else?
I mean I understand why certain dongles have chips, because they're connecting between sets of pins that aren't 1-1 or even in the same data format at all. Or why the same might be true for USB-C to Lightning.
But for cables to go from "dumb" to "smart" seems like it kind of breaks all expectations of what a cable even does, and therefore how a consumer will even know what to purchase -- which, of course, has famously been a HUGE issue with USB-C.
Would it be better for us to go back to dumb cables without chips? Or are there good reasons why this is the future, where at some point we'll expect all computer cables to have them?
USB-C _does_ require e-marked cables in certain cases where passing large amounts of power over a thin cable could be dangerous. I don't believe these chips are proprietary or expensive - just marker silicon that replaced the old resistor system, which was somewhat brittle and often wrong.
For Lightning, yeah, there's basically no reason except for the 'Made For iPhone' program, and it isn't even effective at that since Chinese cable makers just clone the keys of the official stuff.
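The resistor-vs-e-marker point can be made concrete. In USB-C, a plain pull-up resistor on the CC pin is enough to advertise up to 3 A; anything beyond that (5 A / 100 W) needs the e-marker chip talking a real protocol. Here's a rough Python sketch of how a sink might classify the CC voltage - the thresholds are my approximation of the spec's bands, not authoritative values:

```python
def advertised_current_ma(cc_volts):
    """Classify a Type-C source's current advertisement from the voltage
    a sink sees on its CC pin (source pull-up Rp against the sink's Rd).
    Thresholds are illustrative approximations of the spec's bands."""
    if cc_volts < 0.2:
        return None   # nothing attached (or just a ground connection)
    if cc_volts < 0.66:
        return 500    # "Default USB" power
    if cc_volts < 1.23:
        return 1500   # 1.5 A advertisement
    return 3000       # 3.0 A advertisement

# Resistors top out at 3 A: a 5 A / 100 W capable cable can't be
# signalled this way, which is exactly where the e-marker chip comes in.
```

Note how little information a resistor can carry - one of a handful of fixed levels - which is why the brittle old resistor schemes got replaced by chips once cables needed to say more about themselves.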
40 Gbit DAC is even more annoying because unless the cable is a bit more active, it's essentially four 10 Gbit DACs in one sheath.
Second, yes, some setups do require active cables with DAC, which does not stand for "cable" like the other comment says: DAC is direct attach copper (as opposed to optical). Without an active cable you're limited to 5 meters, although you can risk it and get a 7-meter passive cable. Usually, 7 m is already active, and 10 m+ is guaranteed to be active. And the reason you can even do 10G over this copper isn't Cat6 - have you seen that cable? It's twinax, more like the cable from your TV antenna. That's not viable for a phone.
What you are talking about is 10GBASE-T. That does not need active cabling. But again, you can't compare that to a tiny, thin Lightning cable.
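SFP-style DACs are also a good example of a cable announcing its own capabilities: the connector end carries a small EEPROM (per SFF-8472, readable over I2C) describing vendor, type, and line rate, which is how a switch knows what it's plugged into. A sketch of pulling two fields out of such a page - the byte offsets follow my reading of the spec, and the dump itself is invented:

```python
def parse_sfp_id(eeprom: bytes):
    """Pull a few identity fields out of an SFF-8472-style A0h page.
    Offsets per my reading of the spec; real modules have many more."""
    return {
        # byte 12: nominal signaling rate in units of 100 MBd
        "nominal_rate_mbd": eeprom[12] * 100,
        # bytes 20..35: space-padded ASCII vendor name
        "vendor": eeprom[20:36].decode("ascii").strip(),
    }

# A fabricated 64-byte dump standing in for a real 10G DAC's EEPROM:
dump = bytearray(64)
dump[12] = 103                        # 10.3 GBd line rate
dump[20:36] = b"EXAMPLE CORP    "     # padded ASCII vendor name
info = parse_sfp_id(bytes(dump))
```

The point: even a "passive" DAC is already a little bit smart, because without that identity data the host can't tell a 10G-rated cable from a cheaper one.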
If you're considering downvoting them: there are news articles about when Apple caved and allowed 3rd-party cables.
I have more questions than anything else.
Cables with electronics in them are useful for protections like this, and to facilitate safe interoperability of multiple voltage and current sources and sinks, like on USB Type-C. The question becomes: is the protocol implemented in a simple, open, consumer-friendly way, or is it implemented with other nasty antifeatures?
You could call this a “safety in depth” approach.
Though likely there's something in the phone _as well_ as the cable.
Guarantee there was a consumer study where they compared reactions between these scenarios.
The world is unfortunately not very efficient.
Many people in IT think that data is just 1s and 0s. But in reality we live in a physical world, and some form of power is used to transmit those 1s and 0s. This power may be electrical, either a defined voltage generating some current, or a defined current at some voltage (20 mA current loops are heavily used in industrial equipment). If you have a slow communication link, EMI and other environmental things like temperature are not much of an issue. But the faster you want to communicate, the more you have to cope with those things. The electrical contacts in the connectors don't help. So people started to put the transceivers into the cables. Those transceivers, with additional electronics like ringing-suppression circuitry and other "magic", are then best fitted to the desired cables for the desired environments. Then people started to make those cables smart, e.g. to detect the capabilities of the cable: is it really possible to talk at 10 Gbps, or is it a cheaper type where 100 Mbps is the max? The smarter those cables get, the more creative people get about putting other things in them, like "hey, the cable could become a converter as well".
What shielding also doesn't help with is contact problems caused by corrosion or dirt. We are talking about a consumer product, not wiring carefully designed and laid out by some system integrator. Consumers do all kinds of bad things to connectors and will complain if it doesn't work.
I know, I know, USB cables don't need a chip inside, so why does Lightning? Because USB cables are a useless random mess. Lightning cables are not - they run at the guaranteed speed, they charge at the guaranteed wattage, and they have no issues with shorts despite having contacts exposed and reversible, unlike USB. And USB-C? Well, there are active cables available for that, which tackle the issues described with a chip in the cable - just like Lightning does.
Considering the main use case (almost the exclusive use case) for Lightning is charging and nothing else, I wonder why they have to be so complicated? Shouldn't charging cables work with only Vcc/Gnd wired and a dumb 5V/2A power source? Why is my phone doing some complex detection/negotiation of cables/chargers when I plug it in? Why is a charger considered an "accessory"?
Tell that to the people who want to connect their professional microphone to make TikToks. Tell that to the people who want to connect a USB stick to offload their videos. Tell that to the people who provide different accessories.
Lightning evolved from the 30-pin Dock connector. The 30-pin Dock connector was an attempt to combine charging, USB, and some analog stuff into one single connector. At that time USB provided only 5V and a few hundred mA. Apple was one of the first to improve USB charging by allowing more power. But for the sake of compatibility, you cannot just do this; you need to negotiate it. It's like how every USB connection up to USB 3 starts in USB 1.1 mode to negotiate capabilities.
So, Lightning was not developed out of thin air. It was developed with a history, like many, many other communication protocols and standards. Even though Apple is known for cutting off old ties easily, if you look really deep into the details, you will always find compatibility with old, outdated technologies. It's like how you still find the good old 8051 core in many things, even dead-simple, cheap charge controllers.
You could argue that you don't need any of those features, but Apple disagrees, and that's why Lightning is not just a 5V DC jack.
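For a flavor of how crude that legacy negotiation was: before USB-C PD, Apple chargers advertised their current capability with fixed DC voltages on the D+/D- data lines. The voltage pairs below are the ones commonly reported in charger teardowns - treat the exact mapping as illustrative rather than authoritative:

```python
def classify_charger(d_plus, d_minus, tol=0.3):
    """Guess a legacy charger's advertised current from the DC bias it
    puts on the USB data lines. Voltage pairs as commonly reported for
    Apple chargers in teardowns; illustrative only."""
    def near(v, target):
        return abs(v - target) < tol
    if near(d_plus, 2.0) and near(d_minus, 2.0):
        return "500 mA"
    if near(d_plus, 2.0) and near(d_minus, 2.7):
        return "1 A"
    if near(d_plus, 2.7) and near(d_minus, 2.0):
        return "2.1 A"
    if near(d_plus, 2.7) and near(d_minus, 2.7):
        return "2.4 A"
    return "unknown - fall back to 500 mA"
```

This is the kind of pre-existing hack Lightning had to stay compatible with: no chip in the charger speaking a protocol, just a pair of resistor dividers the phone measures at plug-in.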
Yes, I understand the reasoning behind it, but I don't understand why it's not seen as a much more important feature to be able to charge from anything, anywhere, so long as the +5V and GND pins are working, regardless of what power source and cable it is.
That there is complexity when the lightning cable and connector is used as a data transfer cable I understand. I don't understand why the charger mode isn't dumb as rocks.
I have a lot of cheap chargers, and they give out all sorts of crazy stuff. One drops the voltage to like 3.5 volts with 1 volt of ripple under load. Another shuts down the port completely if you draw >.6A.
It is really hard to tell how much current one can draw safely.
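One way a device can cope with such chargers without any negotiation at all is to probe: step the load up and back off as soon as the 5 V rail starts to sag. A toy sketch of that idea - the helper functions, thresholds, and charger model are all invented for illustration, not any vendor's actual algorithm:

```python
def probe_max_current(read_bus_voltage, set_load_ma,
                      v_min=4.5, step_ma=100, limit_ma=2400):
    """Step the sink's load up until the supply voltage sags below
    v_min, then retreat to the last good level. read_bus_voltage()
    and set_load_ma() are hypothetical hardware hooks."""
    draw = 0
    while draw + step_ma <= limit_ma:
        set_load_ma(draw + step_ma)
        if read_bus_voltage() < v_min:  # supply is sagging: too much draw
            set_load_ma(draw)           # back off to the last safe level
            break
        draw += step_ma
    return draw

# A toy model of a weak charger with 0.5 ohm of effective source impedance:
def make_fake_charger(r_internal=0.5):
    state = {"ma": 0}
    def set_load(ma): state["ma"] = ma
    def read_v(): return 5.0 - r_internal * state["ma"] / 1000.0
    return read_v, set_load

read_v, set_load = make_fake_charger()
safe_ma = probe_max_current(read_v, set_load)  # settles around 1 A here
```

Against the charger from the earlier comment that drops to 3.5 V with ripple under load, a probe like this would settle on a very small draw rather than brown the phone out.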
That doesn’t mean you can remove the microphone.
Optimising the common case in this instance will remove all other possibilities.
This is also tied to fast charging (and whether the device will support it)
That’s at least my understanding from the document.
My theory is that it started as the former and then they realised it would seem less suspicious and more friendly to also use it for the latter.
It's far more likely that the purpose of MFi is to make sure that any officially licensed accessories aren't pieces of shit that Apple will cop the blame for. Most users aren't going to blame the cable manufacturer when their phone won't charge properly, since the vast majority of people have no idea about the complexities of modern charging protocols, so they're going to mistakenly think the phone is at fault and blame Apple.
That makes me even more angry about the ridiculous price of these accessories
With the advent of computing, and connecting teletypes to computers, it was often used to trigger an NMI. Multics and the rainbow books cited it as a way to make sure you were talking to the “real” computer and not some program impersonating your computer (since it wasn't an actual character, there was no way for a normal program to even see it, much less generate it). The terminal controller/channel controller could see it, though, and notify the monitor (what we call the “kernel” these days).
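A toy simulation of why break is out-of-band: in a standard 8N1 UART frame, every byte ends with a stop bit of 1, so the line always returns high within a frame; a break is the line held low for longer than a whole frame, a pattern no byte can produce. (The 8N1 framing is standard; the 12-bit-time break length below is just an example.)

```python
def uart_frame(byte):
    """10-bit 8N1 frame: start bit (0), 8 data bits LSB-first, stop bit (1)."""
    bits = [(byte >> i) & 1 for i in range(8)]
    return [0] + bits + [1]

# Every possible byte frame ends with a stop bit of 1, so the line
# always goes high within 10 bit-times -- even for 0x00.
assert all(uart_frame(b)[-1] == 1 for b in range(256))

# A "break" is the line held low for longer than a whole frame,
# e.g. 12 bit-times of continuous space:
BREAK = [0] * 12

# No data byte can produce even the first 10 bit-times of that pattern,
# which is why a normal program writing bytes can never generate a break.
assert all(uart_frame(b) != BREAK[:10] for b in range(256))
```

That structural impossibility is exactly what made break usable as a trusted-path signal: only the hardware-level controller could see or create it.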
What you're implying is historical is still current.
I haven’t designed an embedded device or serial protocol that used break since, well, ever. That’s why I was tickled to see this.
Edit: just saw your uname — my use of the term “tickled” was coincidental
Apple seems to like using 1-wire buses; here's another application of one: http://www.righto.com/2013/06/teardown-and-exploration-of-ma...
It's a testament to the noteworthiness of both Apple and this article that here we are, talking about it at all - about efforts cobbled together from other parts.
> To enable full current, 0x74 request must be issued by Tristar and processed by HiFive. For SecureROM/iBoot that's enough
I wonder if that means, in a pinch, you could turn off the device to use an uncertified charger that iOS would block.
When traveling overseas a cheap charger died, partially frying my lightning port. The phone completely refused to charge and I grew more and more desperate as the battery slowly drained over the next day before eventually dying. Now trying again to charge the dead phone had a surprising result: it actually charged enough to boot up! Ridiculously: it then stopped charging once booted and drained again in about 2 minutes.
The solution I developed was to plug it in _then_ power it off and it would charge (slowly) while completely powered down. Removing power for even a moment would boot the phone and stop charging.
I'd absolutely guess that, in a pinch, you could charge your modern iPhone with a non-working cable if you did the same procedure: plug in, power off, let charge while off.
Or you charge it wirelessly (Qi charging).
I kinda like USB-C.
I now half expect Apple's future mobile devices to not have any ports and just use wireless charging.
Apple AirPods are already pretty amazing. I haven't used any plug-in headphones in 6+ months. Not even my beloved Shures, which sound fantastic, but hot damn I hate cables.
Wonder where my pocket lint will accumulate if that happens.
Apple could make a better cable, but it's not worth it to do so.
What's more likely is that the quality doesn't matter. Lots of people only care about the official Apple brand, so Apple can get away with selling an inferior product for more money.
But here's the thing: Apple can't go selling premium cables and including cheapy ones in their phone boxes. The Apple brand has to be viewed as premium all the time, so they put a mediocre cable both in the box and sold on their website.
Whereas Anker has distinguished itself as a company that makes quality cables... by selling quality cables.
Edit: I think a lot of people just naturally don't pay much attention to what they're doing to their cables. I used to work at a company where I was pretty much the only one whose MBP power cable wasn't visibly fraying at the ends, and I'm still using the same one 5 years later.
I like to open the one on my building, and have a peek to see if there's something interesting. I like repairing (often just resetting) discarded electronics and donating them to people who need them.
Anyway, I find loads of broken Apple cables, but very few plain USB cables.
I am guessing usage patterns will explain the difference.
Not using the phone while charging negates this, as does having the cable flat on a table when using the MacBook (as opposed to using and charging with the MacBook on your lap, where the cable drops down immediately).
This, combined with poor quality products explains the difference in my opinion.
The only cable that frayed on me was a MagSafe 2, next to the computer connector. The cable is ridiculously thin. It was the adapter for my 15-inch MBP. In comparison, I'm still using the original adapter of my 2008 MBP, and apart from being scratched and dirty, it's still as good as new, cable and all.
What I think happens often is that the cable ends up under some kind of rotational stress. I've noticed I have a tendency to always turn the phone the same way, so the cable tends to twist around its axis. It's something I see especially often on corded office phones. Also, cables are often bent at very tight angles close to the connector, so there being next to no strain relief ends up tearing the housing.
I've also seen many people pull on the cable instead of the connector to unplug it. That can't be too good for longevity either.
That said, I travel plenty, and I shove cables into my bag like anyone else. I can't think of a single cable for a single product (Apple or non-Apple) that I've destroyed. Some peoples' cables look like they get slammed in a car door twice a day every day.
This is me, and I replace these regularly.
I'm pretty sure it's from heat damage from the power. Not mechanical strain. The heat slowly weakens the rubber material to the point of failure.
I have heard that somewhere post-iPhone 8 there is an "updated" version of the Lightning cable that is supposed to be more durable. But I have no way to fact-check this.
Then again, I use my cables for charging, and I never use my devices while charging, so the bending/pulling that destroys most cables never occurs with mine.
One thing that does destroy cables though is a 2 year old kid with a fetish for sucking on the lightning end of the cables. My youngest kid did that for 6 months or so, and every cable he sucked on has a visibly corroded 3rd pin.
My partner is always losing cables and ordering the cheapest ones on Amazon, and the thing I hate most about them is that they always make the whole room smell like I dunked my head in a bucket of kerosene. She ordered a chair that had the same issue (Wirecutter approved!) and a jump rope, and who knows what else. Am I just imagining it? I seriously miss the days when I could expect the products I buy to not smell like you'll get cancer just from looking at them wrong.
Most people barely ever use data over a Lightning cable any more (pretty much only for dongles). Charging's pretty much the only thing most people need.
What cheap cables have you seen that don't charge? This is the first I've heard of it.
I work with steel, though, so I'm rough as fuck.
They work fine, as do the slightly more expensive ones from the discount stores around these parts.
What does "diagonal direction" mean here?
edit: funny to see all that we all replied at the same time. Just to add to the conversation, "diagonal reading" is not just a Russian idiom, it also exists in French :)
I guess Russian and French people start top left and stop bottom right, while Germans start top center and end bottom center ... (at least that's what I do).
It seems necessary to point out here that if you fry your iOS device's circuitry experimenting with this stuff, when they capture the device to analyze what happened, you will likely end up paying for those repairs.
For example: Since there's a process for altering on-connector data, exercising that process could break a device in a way that disqualifies it for warranty repair.
And some, like the “chimp” cables mentioned later, do actually care which direction you plug them in :)
And is it a criminal violation, or just civil? And if civil, if the violation happens overseas, not much can be done in practice, right? I don't know much about American IP law.
Second of all, it is reversible (?)
It's inferior to USB-C in at least two ways:
1. The part that wears out is inside the phone in Lightning and inside the cable in USB-C
2. Lightning is for the most part USB 2.0 capable (though there is a newer USB 3.0 version - unsure how many devices/accessories support that).
However, where USB-C is really indisputably better than Lightning is that it has 3x the number of pins (24 in USB-C vs 8 in Lightning), which will always mean that USB-C will be able to outperform Lightning significantly.
I had a Macbook where the power cord got stepped on, the cable head turned 45º or so and continued to function with no port damage whatsoever.
But cheap USB-C connectors are crap. And USB-C connectors that are mounted on the motherboard of the device are a sin (Apple don’t do this - well done Apple!)
Lightning can do USB 3.0 / 5Gbps, and I could see it possibly supporting 10Gbps with USB 3.2 Gen 2x1. But the problem is that the first one operates in the 2.5GHz range, which is known to cause interference with WiFi, and USB 3.2 operates at close to 5GHz.
I think on iPhone the problem may be bigger due to how closely packed they are. ( But I could be wrong )
That may indeed be Apple's priority, but it's not mine.
Can't speak to USB-C, but this is definitely a problem with Lightning. I've had two phones simply stop charging, presumably because the internal connector had worn out. Battery was fine, but there was no way to charge it.
You can try with one of those dental picks that looks like a tiny pine tree. But, the last time I cleaned mine I needed a metal pick and a magnifying glass. That stuff really gets compacted. Be careful of the internal pins!
Socket wear is definitely a design failure. Hoping it holds up until I end up replacing my current leaves-the-house phone.
The socket is clean, lint was the first thing I checked when it started happening. The uneven wear on the socket sheath is clearly visible under a magnifier. (This is a 6S.)
Has this been a problem in practice? We should have enough data now (over 7 years since release) to know whether this is the case.
Of course this was also back when we made such a deal of turnaround time we used to call a taxi courier to deliver warranty units back to corporate customers with the higher end service plans. I'm not sure that you can get this kind of service any more with any vendor without using some small local VAR. I remember vehemently apologizing to electrical utility linemen that we didn't have a part in stock and so they wouldn't get their T50 back until we got the replacement in the next morning.
Perhaps part of this is compensated for by improved longevity in newer cables, but cycle life is rarely the problem (except perhaps on the iffier miniUSB), instead it's dropped/kicked/dragged devices. In general I am pretty frustrated with the decrease in serviceability of devices but this one is especially irritating, since like most people I've had a lot of devices where the charging port failed before everything else. USB-C feels like it represents a step back in a lot of ways because it's basically put the kibosh on magnetic charging interfaces (even more so than patents).
This may of course be that there is too great a variance in the design and manufacture of the male USB-C plugs, leading to insecure connections… but either way, it’s worse than any other USB standard (even micro) I’ve used, and much worse than Lightning in this regard, in my experience.
The ports on the new one seem more stable, though. Maybe they've iterated and improved it - think about how much time they had to optimise USB A.
Mine too. The cables no longer clip in, and fall out without good resistance. This appears to be a problem unique to the MacBook though - the cables are fine and my other devices that should experience more wear on the ports are also fine.
I've slowly cycled through the good ports and I've only got one good one left now!
How is this mitigated in usb-c? As in how do they prevent the inside of usb-c from wearing out?
USB-C is sort of like "inverted lightning". The inside bit on the host looks very much like a lightning cable does.
USB-C is reversible and easy to plug in:
they're both easy to the point where comparing is inane
lightning is much better when you are plugging it in in the dark(for example).
Personally I’m glad Apple picked a robust, reversible connector and stuck with it.
I like the Lightning connector, but it too has its silly hacks. The HDMI dongles are an amazing level of silliness, down to a slimmed-down iOS image being copied to the dongle and run there to enable an H.264 decoder to run on the dongle itself. It works, kinda sorta.
There's been some discussion here on HN: https://news.ycombinator.com/item?id=20554462
This is strange because... one of the big complaints about Apple's Mac computers is they only have USB-C. They ship Macs and iPads with USB-C.
Apple was also a significant contributing member to the USB-C spec/working group.
The point is GP stated any use would be unlike Apple. I quote: “We'll probably never see USB-C from Apple”.
That’s what people are downvoting: a plainly disprovable assertion.
Your ‘overuse’ crack doesn’t jibe well with the suggestion of NIH syndrome; USB-C is an industry standard and all vendors should be and are adopting it. Lightning was invented before USB-C and solved issues that standards of the day (2010), including the various USB connectors, did not for another 5 to 6 years.
HN users have a tendency to downvote things that are easily disproven.
The context is the stuff that uses lightning. Taking it so literally, to the point of downvoting, is definitely not in the spirit of "strongest plausible interpretation".
So, sure, the ipad pro has USB-C, and that's worth pointing out as a minor note. But the reason iPhone doesn't use it is in very large part a combination of stubbornness from being first and NIH.
> Your ‘overuse’ crack doesn’t jibe well with the suggestion of NIH syndrome; USB-C is an industry standard and all vendors should be and are adopting it.
All laptops should have multiple USB-C ports. But getting rid of every single type A port is putting too much emphasis on C.
> Lightning was invented before USB-C and solved issues that standards of the day (2010), including the various USB connectors, did not for another 5 to 6 years.
Are you comparing the invention date of lightning to the widespread adoption of USB-C? Because USB-C was announced in 2013 and almost certainly qualifies as 'invented' in 2012 if not earlier. And there were devices sporting it in 2014, only two years after lightning devices.
So the iPad Pro's USB-C port in its current and previous generations immediately refutes GP.
> Taking it so literally, to the point of downvoting, is definitely not in the spirit of "strongest plausible interpretation".
Discussion about downvotes is also in contradiction with the HN guidelines. I'd prefer we didn't start quoting the guidelines, though; that only detracts further.
Making an assertion in contradiction with the truth isn’t an opinion, it’s just flat out false. People can react how best they see fit, and some have chosen to downvote, possibly because they think a false assertion doesn’t meaningfully contribute; others like me prefer to leave a comment.
But complaining about downvotes isn’t a useful contribution either. I just felt, again, like commenting.
Point is GP was wrong. Value judgements about how many ports Apple puts in its laptops are entirely superfluous to that point.
In my experience many of the people who have an eternally tight clutch on their pearls where Apple is concerned barely actually know anything about the company's product lineup.
Just gonna ignore that I already addressed that?
Yeah, there is one exception in the product stack after many years.
But they're still sticking super hard to lightning, making everything more complicated than it needs to be.
> But complaining about downvotes isn’t a useful contribution either. I just felt, again, like commenting.
That was just an afterthought in my post, to show how extremely people were interpreting the post, despite there being a very valid point if you read it at 80% intensity instead of 100% intensity. The point of my post was valid discussion of Apple.