
(Presumably) Apple engineer comment on Lightning AV adapter design - dmishe
https://www.panic.com/blog/2013/03/the-lightning-digital-av-adapter-surprise/#comment-16841
======
noonespecial
There’s no shell in the image, there’s no utilities (analogous to what we used
to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon
designed to accept incoming data...(and the fact that the firmware is stored
in RAM rather than ROM)...

But there could be. In fact, it _could_ do almost anything, as it's a powerful
little computer in its own right, and it sounds an awful lot like it boots the
kernel and then runs a daemon in a most familiar fashion. I can think of a few
delightful applications off hand. There will be some epic hacks. Is there like
a "Bunnie" signal?

~~~
nirvana
A little thought experiment... some clever dudes manage to reverse engineer
enough to "jailbreak" this, and then they put a little OS image on it and
start hacking away... then a bunch of people say "hey, this is really cool!
It's like Apple made a little Raspberry Pi for us!" but then even more say
"but it's so limited, it doesn't have USB out, etc, etc."

And then dozens more just go on and on about how Apple "crippled" the device
by not giving it USB and how this "proves" Apple just wants "control" and why
did they have to jailbreak it anyway?

In the process of getting their panties all bundled up, they never realize
they're bitching about an _adapter_ not being a general computing platform.

They're also proving Apple right-- it's engineered to solve a specific problem
and provide specific functionality. Even if it were jailbroken from the
factory, people would be complaining and demanding that it does other
things... than what it was designed to do.

~~~
lobster_johnson
You're extrapolating everything into a very silly future-tense strawman that
won't ever exist. _Nobody_ will be _complaining_ that Apple didn't put USB in
a proprietary SoC intended for video transfer.

------
buro9
It's a strange day when we move from cables that work perfectly well to cables
that need to be software upgradeable to work as well.

~~~
coldtea
> _It's a strange day when we move from cables that work perfectly well to
> cables that need to be software upgradeable to work as well._

Yes, it's a strange day when we move from "cables that work perfectly well BUT
the device with the ports they connect to has to be replaced whenever a new
technology comes along" to "cables that need to be software upgradeable but are
far more future proof and capable all the while freeing the device from those
concerns".

~~~
buro9
That is a problem entirely of our own making.

The stereo beside me has components that span 30 years and uses the same
cabling for all components.

I realise someone will come back with "Oh, but DRM, encryption, needing more
information about the source and destination"... but I'd counter by pointing
out that telecoms cabling works perfectly well for transporting all kinds of
things across it. You can put the things that you need into protocols, without
having to create hardware problems that require constantly replaced,
software-upgradeable cabling.

~~~
scott_karana
While I sympathize, this isn't just about A/V standards.

This allows for absolutely ANYTHING, barring bandwidth concerns.

Ubiquitous body computers with skin access ports? We can make an interface for
that.

"Quantum broadcast" antenna technology? We can make a dongle for that.

~~~
buro9
Which is _precisely_ why I gave the example of cabling for telecommunications.

~~~
tnicks
I find your example lacking in perspective. In telecom, even under TDM
standards like SDH, SONET, etc., there was a plethora of incompatible
connectors. You could have RJ45, RJ12, BNC (literally dozens of variations). In
optics the same is true, with SC, ST, LC, FC, etc. The last decade's shift
towards Ethernet everywhere is a result of "mass" consumer desire (i.e. you
guys on HN building things that use Ethernet, a protocol originally designed
for a limited number of workstations in a small environment), and that has put
the RJ45 at the top for a lot of devices. Recall that 15 years ago AS/400, ATM,
and FDDI all had their own connectors and standards. All of this (and much
more) just to send some bits over a physical connection. Just crack open a
Grays catalog and go to town.

For the record, each of these has a purpose and reason. Some were a function
of the materials present at the time, others due to specific environmental
concerns. I would not class any in the realm of "lock-in". Due to the
capital-intensive nature of the industry, they each were good business
decisions at the time. It wasn't as easy as buying a $40 plug.

------
rossjudson
His first sentence is "AirPlay is not involved", and then he proceeds to tell
you that all of the parts of AirPlay that are responsible for the drop in
quality _are_ involved. Specifically, a compressed video stream _is_ created
and that _is_ sent to the SoC for decompression. That
compression/decompression cycle is responsible for the quality degradation in
the video, and (presumably) the limitations on the output resolution.
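
To make that concrete, here's a toy sketch of why even one lossy round trip
permanently costs quality. This is not Apple's pipeline; coarse quantization
merely stands in for the quantizer inside a real codec like H.264:

    # Toy model of generational loss: one lossy encode/decode cycle.
    # Coarse quantization stands in for a real codec's quantizer;
    # an illustration of the principle, not Apple's actual pipeline.

    def lossy_encode(samples, step=16):
        # Snap 8-bit values onto a coarser grid, discarding detail.
        return [round(s / step) for s in samples]

    def lossy_decode(codes, step=16):
        # Reconstruct, clamping to the 8-bit range.
        return [min(c * step, 255) for c in codes]

    original  = [3, 200, 47, 131, 90, 12, 250, 68]  # pretend pixel values
    roundtrip = lossy_decode(lossy_encode(original))

    for before, after in zip(original, roundtrip):
        print(f"{before:3d} -> {after:3d}  (error {after - before:+d})")
    # The errors never come back: whatever the adapter's decoder feeds
    # to HDMI has already lost this information for good.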

"We didn't do this to screw the customer". Sure you did. You have offloaded
the cost of a parallel connection, necessary for HDMI, to the adapter. Almost
every competing system comes with HDMI built-in. The 30-pin connector could
drive HDMI. Lightning can't without having another whole computer in the
adapter.

What makes this actively hostile to the customer is Apple's proprietary
adapter design. If there was a real, competitive, standardized market for the
various adapters, on balance it would be a good deal. But there isn't.

MagSafe is nice, but if you have any of a number of common problems with your
Air's power supply/cable, you are in for an $80 charge to buy a new one. Even
if the only thing wrong with it is a pin spring that's worth pennies.

Because Lightning is coupled to proprietary decoders and Apple patents, it's
customer-hostile.

And, as we've seen here, the result has _poor quality_. If the quality were
high, then we might be able to overlook it. We shouldn't.

~~~
Cushman
> "We didn't do this to screw the customer". Sure you did. You have offloaded
> the cost of a parallel connection, necessary for HDMI, to the adapter.
> Almost every competing system comes with HDMI built-in. The 30-pin connector
> could drive HDMI. Lightning can't without having another whole computer in
> the adapter.

So every competing product charges every single user, most of whom will never
use it, for HDMI output. Apple charges (more, to be fair) only those users who
need the feature.
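
To put hypothetical numbers on it (none of these figures come from the thread;
they're purely illustrative):

    # Hypothetical numbers to make the trade-off concrete: building HDMI
    # into every unit vs. selling an adapter only to users who want TV-out.
    # Every figure below is an assumption for illustration.

    units_sold     = 100_000_000   # made-up fleet size
    hdmi_bom_cost  = 2.00          # assumed per-unit cost of built-in HDMI
    adapter_price  = 49.00         # price of the Lightning AV adapter
    users_who_care = 0.02          # assume 2% ever use TV-out

    built_in_total = units_sold * hdmi_bom_cost
    adapter_total  = units_sold * users_who_care * adapter_price

    print(f"Everyone pays for HDMI: ${built_in_total / 1e6:,.0f}M over all buyers")
    print(f"Only users who care pay: ${adapter_total / 1e6:,.0f}M, $0 for the rest")
    # Under these assumptions, the adapter route moves ~$98M of spending
    # onto the 2% who want TV-out, instead of ~$2 of BOM cost onto every
    # single buyer.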

How this gets interpreted as "screwing the customer" _baffles_.

~~~
rossjudson
You are easily baffled, then. Apple doesn't just change the connector design.
It also performs authentication/identification/verification on the objects it
is connecting to.

If you're Apple and you want to make lots of money from your very popular
products, one of the ways you can do that is to make them incompatible with
the competitive markets for standardized accessories. You can come up with
your own connectors, and you can use patents, litigation, and trade secrets to
ensure that you have no competition, and that your users have no choice but to
buy high profit accessories from a single source.

I trust you are unbaffled!

~~~
Cushman
If I'm Apple, I don't need to come up with some kind of trick to make lots of
money. I just need to keep releasing best-in-class hardware at competitive
prices by leaving out expensive features 99% of my customers will never want.

I love my iPhone 5, not because of patents or lawsuits, but because it's
really, really well built. I definitely couldn't have gotten it cheaper from
anyone else. How many overpriced accessories have I bought? Zero. Because I
don't need that crap-- and Apple doesn't make me pay for stuff I don't
need.[0]

[0] Not counting the Mac Pro, that is.

~~~
JimmaDaRustla
My GF's iPhone 5 bent like an aluminum pop can...and it is the most expensive
phone on the market!? I'm confused...

------
dmishe
I think the main takeaway is that yes, lightning is future proof and can adapt
to any output you want it to.

I do wonder what the reason for the encode/decode cycle was in the first
place, though.

~~~
revelation
He doesn't say, but probably because they just don't have the bandwidth.

And that's where all this future-proof talk falls apart. New interfaces will
only have _more_ bandwidth; that's the whole point. Meanwhile, Lightning still
has the same bandwidth, and there's still a considerable penalty in serializing
any other interface.

If they can't even do 1080p lossless right now, they are in much deeper
trouble for the future.
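
For a sense of scale, here's a back-of-envelope calculation. The video math is
standard; treating Lightning as a USB 2.0-class link is this thread's
speculation, not a published spec:

    # Back-of-envelope: raw 1080p60 video vs. a USB 2.0-class serial link.
    # Assumption: Lightning currently runs at roughly USB 2.0 rates.

    width, height  = 1920, 1080
    bits_per_pixel = 24           # 8 bits each for R, G, B
    fps            = 60

    raw_bps = width * height * bits_per_pixel * fps
    print(f"Raw 1080p60 : {raw_bps / 1e9:.2f} Gbps")     # ~2.99 Gbps

    link_bps = 480e6              # USB 2.0 signaling rate; throughput is lower
    print(f"USB 2.0 link: {link_bps / 1e9:.2f} Gbps")

    print(f"Minimum compression: {raw_bps / link_bps:.1f}:1")
    # Over 6:1 before protocol overhead -- hence a lossy codec like H.264
    # and a decoder SoC at the far end of the cable.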

~~~
drewcrawford
> If they can't even do 1080p lossless right now, they are in much deeper
> trouble for the future.

Let me try and shed some light on this mystery. Consider this a rumor. Also, I
am not an electrical engineer, so I may be talking out of my ass.

As best as I can tell, Lightning is not (yet, anyway) a real protocol like
Firewire, USB, etc. What it is so far is USB with a different connector and
some negotiating chips.

So when you connect your Lightning-to-USB cable to your iPhone, your iPhone
squawks and says "What is this?" and the cable says "I'm USB". The iPhone then
sets its pinout to USB mode and off you go.
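
A hypothetical sketch of that negotiation, just to make the idea concrete; the
real Lightning protocol is undocumented, and every name, mode, and pin
assignment below is invented:

    # Hypothetical sketch of the pin-negotiation idea described above.
    # The actual Lightning handshake is undocumented; all names, modes,
    # and pin assignments here are made up.

    class Cable:
        def identify(self):
            return "usb2"                 # the cable's answer: "I'm USB"

    class Phone:
        # Imaginary pin assignments per accessory mode.
        PINOUTS = {
            "usb2": {"pins": ("D+", "D-"),       "rate_mbps": 480},
            "av":   {"pins": ("LANE0", "LANE1"), "rate_mbps": 2000},
        }

        def attach(self, cable):
            mode = cable.identify()       # the phone asks: "What is this?"
            pinout = self.PINOUTS.get(mode)
            if pinout is None:
                raise ValueError(f"unknown accessory mode: {mode}")
            print(f"switching pins to {mode} mode: {pinout}")
            return pinout

    Phone().attach(Cable())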

What's so great about this? Why not just use USB? I think the secret sauce is
that Apple wants to surreptitiously invent new pinouts with faster data rates,
without consulting any standards bodies and having to rally industry support
on TVs, computers, et al. They just build some new controller to push
Lightning at 2Gbps+, put it in new iPhones, and build active cables that spit
out HDMI or UHDTV or USB3 or whatever it is the kids are doing these days. The
active cables degrade gracefully for hardware running at the slower data rate.

What I think you are seeing right now is the dry run with the off-the-shelf
controller. Get manufacturing ramped up. You need this to plug your phone into
a computer anyway. But it's just the tick. Wait for the tock.

Now the only thing that puzzles me is why they settled on 8 pins, when USB3 is
9. Obviously they wanted an even number so you could plug it in every which
way, but you would think stepping up to 10 pins would let them use off-the-
shelf USB3 controllers instead of USB2. Maybe they can do USB3 on 8-pins
somehow, or maybe the tock will be ready fast enough that it's not worth it.

Further reading: <http://brockerhoff.net/blog/2012/09/23/boom-pins/>

~~~
dmishe
Yep, and the comment about them using already-existing H.264 support in
hardware supports, I think, the dry-run, fastest-to-deadline theory.

Re pins: I'm not an EE either, but I don't get why people are so concerned
about the number of pins. As I asked earlier in this thread, is there a
relation between the number of pins and bandwidth (well, with physics involved,
there is _some_)?

~~~
drewcrawford
Still not an EE, but as I understand it, if you are creating your own
controllers and you also control how the cable is shielded, pin count doesn't
matter much except for power.

 _Traditionally_ it's been easier for most hardware makers to double the pin
counts than to build controllers that run at twice the clock rate and also
make everybody use fancy cables. But _traditionally_ not everybody has PA Semi
across the hall to build chips for you and years of experience selling $50
cables to consumers via direct retail. So I'm betting that the usual economics
of the consumer data interface market don't apply to Apple.

~~~
josephlord
Has this been true since the late '80s/early '90s, when serial ports left
parallel ports behind?

With serial techniques such as differential encoding, parallel transmission off
the circuit board has been obsolete for a long time. The potential skew between
the pins is too great and synchronisation too complex.
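
Some rough numbers (mine, not the commenter's) on why skew becomes fatal as
parallel buses speed up:

    # Rough numbers on why skew kills fast parallel buses.
    # Signals in copper travel at roughly 15 cm/ns (about half the
    # speed of light), so small trace-length mismatches cost picoseconds.

    SPEED_CM_PER_NS = 15.0
    mismatch_cm = 0.3                     # a mere 3 mm difference between pins

    skew_ps = mismatch_cm / SPEED_CM_PER_NS * 1000
    print(f"Skew from a 3 mm mismatch: {skew_ps:.0f} ps")

    for rate_gbps in (0.1, 1.0, 5.0):
        bit_ps = 1000 / rate_gbps         # duration of one bit in picoseconds
        print(f"{rate_gbps:4.1f} Gbps: bit period {bit_ps:6.0f} ps, "
              f"skew eats {skew_ps / bit_ps:.1%} of each bit")
    # At 100 Mbps the mismatch is noise; at 5 Gbps it consumes a tenth of
    # every bit window, and real cables are far worse matched than 3 mm.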

------
lttlrck
It all seems like a fairly logical progression from the company that brought
us FireWire. If you don't like it, vote with your wallet instead of whining.
Personally I believe this is a smart move and we'll see far more exciting
things running on this interface in the future. Plus myriad related patents.

------
DenisM
I think we can all agree that pushing HDMI circuitry off the iPhone reduces
the cost of the device, increases the cost of the dongle, and drops signal
quality. We can also speculate that signal quality will increase over time
through a combination of better encoding, pass-through for already-encoded
signals, and raw speed upgrades to the Lightning protocol itself.

What is puzzling is the timing of the release. There is an obvious drop in
quality, and no obvious reason to save cost in the iPhone 5. They could have
let the technology mature to quality parity and only then released it, so what
gives? My theory is that Apple came under serious price pressure, as cheap
smartphones are now the fastest-growing segment of the market, and so they are
preparing to ship a very cheap version of the iPhone. Given their position as
a high-margin company on the one hand and pressure from low-margin competition
on the other, they felt the need to pinch every penny. And so the iPhone 5
ended up being the test bed for the new wave of technology with a much lower
cost of entry, but more expensive accessories. Unfortunately, what we get in
the interim is both expensive devices and expensive peripherals.

------
sylvinus
I am now patiently waiting for this cable to be jailbroken!

------
runjake
While the explanation seems plausible, I still have my doubts this is
legitimate.

From the wealth of information this anonymous person provided, they could be
tracked down to a handful of employees at Apple.

And if the current-day Apple is anything like the Apple of a year ago, this
person is knowingly exposing themselves to termination.

Edit: If you re-read the comment, the author claims they work at Apple on the
technology in question.

~~~
smackfu
"Knowledge of the design" isn't the same as "the designers." "How the HDMI
adapter works" doesn't seem like it needs to be top secret internal info.

------
glasshead969
This takes a page from Intel's Thunderbolt, which uses active cables too.
Initially Thunderbolt cables were costly for this same reason, but as chips
become cheaper and smaller, cost will not be an issue while you still have the
advantage of an adaptive interface.

------
batgaijin
Lightning is an Intel feature that Apple is trying to support, right? I mean I
can't tell if they firmly support this or if, since Intel is their only chip
supplier, they are kind of bound to supporting this.

~~~
lunixbochs
You're thinking about Thunderbolt?

~~~
glitch
Yes, he/she is thinking about Thunderbolt.

------
supervillain
So it's a winmodem for HDMI.

------
djanogo
Sounds like "Wireplay" for devices which don't support "Airplay".

------
cake
Man, a computer in a cable seems like such a wasteful and complex design.

What's the point of doing so? I don't get it.

~~~
fredsted
Sorry, but did you read the comment?

~~~
XorNot
It still doesn't make any sense. "Here's a world standard which is being built
into literally every display on the planet, and is signal compatible with many
others" (i.e. HDMI/Displayport/DVI).

And for some reason they're worried about some new standard being incompatible
and not being able to upconvert?

The exact same logic would apply if the devices had native display connectors,
and then eventually needed some new connector for it. If you can't get a raw
framebuffer or HDMI over your current link, you're never going to get a new
faster/bigger protocol over it either.

~~~
glasshead969
This isn't just about video; the design helps keep the pin count down and
removes the need to have dedicated combinations of pins to support a certain
standard which may not be used anymore. For example, the old 30-pin connector
had support for FireWire, which is no longer used, but the connector still
needed to have those pins.

I think they know about the issue of video quality and are working to fix
that. The beauty of this design is that they can improve the hardware/software
to push data efficiently, because the device just needs to output a data
stream; it's up to the cable to handle it. As Moore's law kicks in, cables will
need ever smaller chips.

They no longer have to keep changing the pin configuration to add support for
newer connectors. The cables will handle that in software.

~~~
XorNot
Except all it does is create a new problem for them, without solving an old
one. If you're putting an SoC in the cable then it has to be powerful enough to
talk to the new standard. So you need a custom cable and connector no matter
what.

If you need a custom cable no matter what, then why not support a common
standard - which is a standard and thus common to many devices - and then
adapt it as necessary?

Instead you have this situation: you've got a proprietary standard which can't
handle a common standard well in the first place (evidenced by the fact they
made a compromise to make it work). It'd make sense if there were a lack of
pins or something, but there isn't - between USB and HDMI/DP there's every type
of signalling you would need to support newer standards within the realms of
the existing hardware.

It's Apple ecosystem lock-in and that's it, but in this case it's a worse
outcome.

They're not going to be reprogramming iOS devices for faster bitrates - that
means an IC change in the device. It also means an IC change in the cable. And
at the end of the day still means...less capability than using standards would
have.

~~~
glasshead969
>>They're not going to be reprogramming iOS devices for faster bitrates - that
means an IC change in the device. It also means an IC change in the cable. And
at the end of the day still means...less capability than using standards would
have.

I am not sure the assumption that it would need a new chip is true. It may be
a firmware issue that can be fixed in a software update.

>> It'd make sense if there were a lack of pins or something, but there isn't
- between USB and HDMI/DP there's every type of signalling you would need to
support newer standards within the realms of the existing hardware.

Within the realm of existing hardware is the key point. The 30-pin connector
was used for 10 years. In those 10 years there has been huge change in the
standards used and adopted. I would assume Apple has similar plans for
Lightning.

Also, USB and HDMI are not the perfect standards you seem to imply. The
docking port on iDevices is used for many other things, including things which
haven't been invented yet.

Standards like Micro USB 2 are non-starters for Apple. With only 5 pins (+5V,
ground, 2 digital data pins, and a sense pin), most of the dock connector
functions wouldn't work – only charging and syncing would. Micro USB 3 is
capable but larger than Lightning. Also, implementing Micro USB 3 would require
a chip on the host and handling the USB protocol on the processor, using
precious PCB space. And implementing HDMI over Micro USB needs a special
converter chip.

Also, many devices break the USB spec's allowed power limits to charge their
devices. The iPad with Retina display, despite going over the limits, takes
forever to charge. With Lightning, the device can multiplex all 8 pins to
charge.

I agree with your point about controlling the peripheral market. But this
isn't as much of a compromise as what would need to be done in the case of
Micro USB. MHL does the same thing for Micro USB, where a separate controller
chip in the cable repurposes the signals for HDMI while USB is off.

------
nirvana
Ideology has trumped engineering, and as hackers, you shouldn't tolerate it.

Frankly, all of this has been obvious all along to any competent engineer,
since the moment Apple introduced Lightning. They described it as a serial bus
and talked about how it gave more flexibility. If you think about it for 2
seconds, it's obviously better to run protocols over a serial bus than to run
30 lines of individual signals, with dedicated lines for analog and dedicated
lines for digital, in a world where people want HDMI adapters for a connector
that originally had FireWire signals on it, from a time before HDMI was even
common.

But this is Apple, so the REAL reality distortion field kicked in-- a tsunami
of press acting as if Apple was ripping people off with a $30 adapter,
hundreds of mindless conspiracy theories from Apple bashers on Hacker News
about how this is to have more control over people and how this once again
proves that "open" (defined as google, not actually anything to do with
openness) is better than "closed" (defined as Apple, you know the company with
the most popular open source operating system in the world?).

It's one thing to not know enough engineering for this to have been obvious to
you, it's quite another to spread lies and _engineering ignorance_ as a result
of your ideological hatred of Apple. And the latter is basically all I saw on
HN about this format. (Which is one of the reasons I write HN off as worthless
for months at a time.)

~~~
rossjudson
What you're saying is true from an engineering standpoint (serial vs
parallel), but has to be placed in the customer's context.

In this specific case the quality is bad, operation is unreliable, and the
price is high. Consumer devices accept HDMI as input. Serial to parallel video
(Lightning to HDMI) is tough without some heavy-duty hardware -- hence the
exorbitant cost of these adapters.

The SoC design introduces a massive amount of complexity. This has yielded
unreliable operation. And it introduces that complexity at a point of physical
vulnerability -- people don't treat adapters like tiny fragile computers. They
treat them like, well, adapters.

End-to-end serial communications would be nice, but that's not the world we
live in.

Lightning isn't _that_ much smaller than HDMI or Micro-HDMI. Reversibility is
a very minor feature, and not worth the price being paid.

And that's not a $30 adapter. It's a $50 adapter. Did you think it was $30?
That was the old one -- parallel to parallel.

~~~
TorbjornLunde
I strongly disagree that reversibility is a small feature. Plugging chargers
into new iOS devices is effortless, in the same way headphone jacks are.

Non-symmetrical connectors are an affront to usability.

~~~
jws
I appreciate reversibility once per day.

In 2074 days of owning an iPhone and 1065 days of owning an iPad I have never
used or wanted an HDMI output.

I'd say they made the right tradeoff.

~~~
mikeash
I agree. I don't even have a device with the connector (yet?), but it seems
like a major advantage.

Who are all these people popping out of the woodwork wanting a wired
connection from their phone to their TV? I'm sure some people do this
sometimes, but so many? Why would you even do that? Perhaps this is
uncharitable, but it makes me think that most of the people complaining here
have never done it, never will, probably never even thought about doing it
before, but are now _outraged_ at the thought that the connector is not 100%
perfect for this one uncommon use-case.

~~~
woobar
I have a dock-to-HDMI adapter. I use it in hotels to watch movies on the big
screen TV. Beats the crap on cable TV every time.

~~~
mikeash
Thanks, that's the first use case that I could actually see using myself.
Don't think it's quite enough to get me to go buy a cable, but I can see it
being handy for that.

