
x86 finds its way into the iPhone - isp
https://lcq2.github.io/x86_iphone/
======
userbinator
The author may have let his preconceptions get in the way of reasoning a bit
too much --- x86, or at least x86 compiler output, is easy to recognise in
hex/ASCII mostly because you'll see things like function prologue/epilogue
sequences (55 8B EC, 8B E5 5D C3) and NOP (90) or INT3 (CC) padding
everywhere. ARM, MIPS, and Z80 (the other 3 I can recognise by sight) all have
their distinct "textures" too.
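
To make that concrete, here's a toy sketch (my own, not from the article) of the kind of pattern-counting a human eye does when skimming a hex dump:

```python
# Toy heuristic for the x86 "texture": count common 32-bit function
# prologue/epilogue sequences and NOP/INT3 padding runs in a blob.
X86_MARKERS = [
    b"\x55\x8b\xec",      # push ebp / mov ebp, esp  (prologue)
    b"\x8b\xe5\x5d\xc3",  # mov esp, ebp / pop ebp / ret  (epilogue)
    b"\x90\x90\x90\x90",  # run of NOP padding
    b"\xcc\xcc\xcc\xcc",  # run of INT3 padding
]

def looks_like_x86(blob: bytes, threshold: int = 3) -> bool:
    """True if the blob contains several of the telltale byte patterns."""
    hits = sum(blob.count(marker) for marker in X86_MARKERS)
    return hits >= threshold
```

Real firmware would need smarter scoring (alignment, marker density per kilobyte), but even a crude count like this separates compiler-generated x86 from random data surprisingly often.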

 _this is awesome…_

I'll be the first to comment on the apparently misplaced bounds check(!) in
the fragment of code above it; it reads a parameter from the stack, compares
it with 8A, then makes it an index into some array of 8-byte elements and
reads the two values from memory _before_ deciding whether it's valid or not
--- and seems to put -1 into eax if it's not.

Not really a problem if this is running in realmode (or "unreal mode") with no
memory protection (it will just read 8 bytes from somewhere in the address
space, and probably ignore them), but it could crash if it was in protmode
(which the lgdt in the preceding fragment suggests) set up with restrictive
segment limits, and the memory address was not valid.

Then again, the check could be completely superfluous if that function would
never be called with an out-of-bounds value...
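
For readers who don't sight-read x86, here is the suspect pattern modeled in Python (the 0x8A comparison and the -1 error value are from the disassembly as described above; the table contents are made up):

```python
# Model of the misplaced bounds check: the table read happens BEFORE the
# index is validated. In real mode the stray read just returns garbage; in
# protected mode with tight segment limits it can fault -- here, Python's
# IndexError plays the role of that fault.
TABLE = [(i, i * 2) for i in range(0x8B)]  # 0x8B hypothetical 8-byte entries

def lookup(idx: int) -> int:
    lo, hi = TABLE[idx]  # read the two values first...
    if idx > 0x8A:       # ...then decide whether idx was valid
        return -1        # eax = -1 on the error path
    return (hi << 32) | lo
```

With a table exactly 0x8B entries long, an out-of-range read "faults" before the check ever runs, which is exactly the hazard described above; and if callers never pass a bad index, the check never fires at all.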

------
chrisfosterelli
Can someone explain why it's a big deal they are using an x86 chip? It seems
ARM is the standard in the mobile world, but I'm not sure what the motivations
might have been for the change or if this has drawbacks that make it so
surprising.

~~~
monocasa
There isn't a big deal. Hell, AArch32 has something like 1,400 instructions; it's just as complicated as an embedded x86. And as someone who's ported a kernel to both, it has just as many weird parts of the architecture built up over decades (ARM is about as old as 32-bit x86).

The motivation for the change is that Intel's making the baseband instead of
Qualcomm now, so it's not an ARM/Hexagon like it once was.

The author is just bent around a decade plus out of date view of chip
architecture.

~~~
jonstokes
This ^^. The "OMG... HORROR" in this article about the mere fact of x86 usage
is deeply silly. The stuff about C64 and so on at the end... there's just no
redeeming this mess. The mods should nuke it.

~~~
monocasa
So, I just want to throw out there that your arstechnica uarch articles are
one of the things that pushed me into computer engineering rather than
straight CS. And Inside The Machine I consider to be up there with Patterson &
Hennessy. Thanks for all of that! : )

~~~
jonstokes
Thanks!

~~~
evancox100
Hey, same here! Your articles are what inspired me to go into chip design
/computer engineering. Still have my signed copy of Inside the Machine on my
bookshelf :)

Thanks a ton!

------
gip
I was working as a postdoc at Intel around 2005 when they decided to sell their XScale business (the ARM CPU line they got from DEC). My manager said that embedded x86 would make its way into smart phones. It was a long, long shot back then. I admire the folks at Intel for their persistence!

------
dfox
One thing that strikes me as highly weird is x86 code in ARM ELFs. It is
usually the other way: code for some random embedded custom almost-RISC in
ELFs claiming to be for i386 :)

~~~
lcq2
they're not standard ELF files; more likely they're using the ELF format just to have a list of "load address-size-data" entries assembled with some custom linker script, and they did not bother to change it, probably because of integrity or sanity checks along the assembly line

would have been much more fun if they'd switched to the PE format though, like they did with EFI/UEFI :D
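
If all you want is that "load address-size-data" list, the ELF program headers are enough. A minimal 32-bit little-endian parser sketch (field offsets straight from the ELF spec; nothing here is specific to Apple's or Intel's firmware):

```python
import struct

PT_LOAD = 1  # loadable segment

def load_segments(elf: bytes):
    """Yield (vaddr, data) for each PT_LOAD entry of a 32-bit LE ELF."""
    assert elf[:4] == b"\x7fELF", "not an ELF file"
    (e_phoff,) = struct.unpack_from("<I", elf, 28)          # program header table offset
    e_phentsize, e_phnum = struct.unpack_from("<HH", elf, 42)
    for i in range(e_phnum):
        off = e_phoff + i * e_phentsize
        # first five Elf32_Phdr fields: type, offset, vaddr, paddr, filesz
        p_type, p_offset, p_vaddr, _paddr, p_filesz = struct.unpack_from("<5I", elf, off)
        if p_type == PT_LOAD:
            yield p_vaddr, elf[p_offset : p_offset + p_filesz]
```

`readelf -l` shows the same table; the point is how little of the ELF format you actually need when it's being used as a dumb container.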

------
bogomipz
>"What I discovered sent a shiver of horror down my spine", "I couldn't
believe my eyes", "Holy __shit "

This seems like a lot of histrionics for an article whose summary reads:

>"Conclusions Nothing really, I just found this funny and wanted to share."

------
tebruno99
In the closed box that this chip lives in, as long as it meets the power usage spec it doesn't really matter what architecture it is. Not sure what the big deal here is other than some personal bias the author has against x86.

------
ksec
This is the first time Intel is the sole baseband modem supplier for Apple's
new iPhone.

This is also the first time Intel has manufactured their baseband modem in their own 14nm fab.

This is also the first time any Intel modem has supported CDMA / TD-SCDMA.

This is also the first time Intel has put x86 into their baseband modem. All previous modems, in the 8 years since Infineon's wireless business was sold to Intel, have been ARM based.

So yes, lots of hope for this new modem. And fingers crossed Intel doesn't mess this up.

------
yalok
I wonder if the Dual SIM Dual Standby feature has anything to do with this,
even as one of the minor reasons to switch to x86. Even though standby mode
itself is usually the least demanding, and so it will just mean doubling of
memory...

Seems very unlikely, but from a product perspective it's one (and maybe the only one) of the new features related to the baseband.

~~~
jsjohnst
Doubtful, as Qualcomm baseband chipsets have been in dual sim phones for ages.
It’s more likely two reasons:

1) Intel is somehow doing CDMA now; that was a major reason previous generations were split between Qualcomm and Intel

2) The major reason Intel has a seat at the table is Apple and Qualcomm's very public fight. Intel and Apple don't have an entirely happy relationship, but it's a far better relationship than Apple and Qualcomm's

------
sigmaprimus
Could this be a preemptive move by Apple to produce more parts in the US and avoid the new tariffs being imposed? How many years of R&D go into a new phone before production these days? I hope this change results in new low-price SoC board PCs running x86 cores entering the market soon to compete with RPis.

~~~
klodolph
I suspect the new baseband processors are powerful enough that you can use one part for everything, rather than a different baseband processor depending on the network, which makes things cheaper. Tariffs might be a factor, but Qualcomm works with GlobalFoundries, which has fabs both in the US and elsewhere.

------
grawprog
ARM scares me far more than Intel or AMD. The way they license their chip designs is what's created the fragmented mobile ecosystem we have today. I remember watching an interview a while back with someone from the company about how their revolutionary virtualization technology could be used to run any OS on their chips... then the spokesperson laughed and said this would never happen, and that it would instead be used by licensees to lock down their processors even further.

I'm really not a big fan of ARM or the company in general. Intel does some shady things, but ARM is a whole different beast altogether. It's designed with arbitrary software lockouts in mind, and their licensing scheme is not conducive to open development.

------
bowyakka
So while I am sure the author checked this, it bears mentioning that disassembling CISC is more of a black art than RISC. You can feed almost any binary file into a disassembler and get x86 code out, even if that code is invalid.

For example, here is a "program", except it's really a meme GIF off my phone.
[https://imgur.com/gallery/hoDKeC9](https://imgur.com/gallery/hoDKeC9)

~~~
umanwizard
Are you experienced with reading x86 assembly?

It's crystal clear that the gif from your phone is gibberish (or extremely
obfuscated), whereas the code from the article is normal-looking.

~~~
bowyakka
I am. I'm not faulting the original author, just pointing out that you can get disassemblers to come up with x86 from almost anything.

From the comment right under the picture I posted:

> yeah, pretty typical function prolog, what's the question ?

Except we know it is not.

I'm mostly saying: be careful pushing any old binary blob through Capstone without considering what it might produce. I see this at $DAYJOB, where people "disassemble" VAX from things that are just data.

------
devy
Since Qualcomm and Intel are the baseband/modem manufacturers and Apple has little or nothing to do with developing these x86 baseband modules, this should mostly be Intel's/Qualcomm's responsibility to tighten up the security, no? It's like how we can't fault Boeing for a plane crash if it's a CFM56 engine failure, right?

~~~
aij
I don't think the CFM56 is a good analogy here. It does not appear to have
been designed to fit the bolt pattern of a Cessna 172.

The main thing the x86 instruction set has going for it, is backwards
compatibility. (Including the fact that there are a lot of highly optimized
CPU designs around that instruction set.)

------
nickpsecurity
They've been in embedded with Atom processors for a while. VIA/Centaur beat them to it with the C3, etc. I'd have assumed Intel would be in mobile eventually if they weren't already. I wonder why x86 is that surprising.

Also, the early Nokia 9000 Communicator had an x86 CPU; I think it was a 386. So mobile is returning to x86 rather than moving to it for the first time.

------
Mauricio_
Until 2015 my phone was a Motorola Razr i, which used an Intel CPU with the x86 architecture, so it's not that uncommon. The PS4 also uses x86 for its main CPU today; it's not mobile, but comparing this chip to a Z80 is exaggerating a bit.

------
jsjohnst
Anyone know if it’s the Intel XMM 7560? Very likely based on specs mentioned
at the keynote, but won’t know for sure until the tear down next week I guess.

~~~
lcq2
yes that's the model "ICE7560_XMM7560_RFDEV_UB_FLASHLESS" :)

~~~
jsjohnst
Reading the spec sheet on it has me feeling in awe for the antenna designer.
This chipset claims to be able to simultaneously tune in on 850 / 900 / 1500 /
1700 / 1800 / 1900 / 2200 / 2800 / 3500 / 5000 MHz. Having gone through the
“black art” of antenna design just for a few of those frequencies before, I
can’t imagine trying to cover all of them well, but I also know if anyone does
it well, it’s XX’s team (not outing the person as I don’t know if it’s well
known who leads that team at Apple).

------
pq0ak2nnd
Can someone recommend a good resource to analyze entropy in a file as the
author discussed?

~~~
zeeboo
The easy way: compress it with whatever you want (xz? gzip?) and compare the
sizes. If it doesn't get significantly smaller, it's probably encrypted or
already compressed.
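
Both the direct measurement and the compression trick fit in a few lines of Python (stdlib only; the 0.9 shrink threshold is a rule of thumb I picked, not anything standard):

```python
import math
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: ~8.0 suggests encrypted/compressed, low values structure."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def compresses_well(data: bytes, ratio: float = 0.9) -> bool:
    """The compression trick: high-entropy data barely shrinks."""
    return len(zlib.compress(data, 9)) < ratio * len(data)
```

Tools like `binwalk -E` plot a sliding-window version of the first function across a whole firmware image, which is how you spot the boundaries between encrypted/compressed regions and plain code or data.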

------
mankash666
Much ado about nothing: the Intel-furnished modem on the iPhone uses x86
instructions!!!

~~~
danmg
Well, it's a battery-powered phone. Unless it's not really the full x86 ISA, decoding x86 instructions and converting them to the processor's internal microcode is not energy efficient. That means there may be a noticeable battery life difference between the GSM and CDMA versions of the new iPhone.

~~~
mankash666
Wait, what? Where does this article suggest conversion of x86 to ARM assembly prior to execution? Stop making things up. Unless proven otherwise, it's an x86 executable running on x86 Intel cores within the modem or related compute.

~~~
monocasa
I don't read anything of the sort in his comment.

~~~
mankash666
Then you don't read right. "Decoding x86 instructions and converting them to
the processor's internal microcode ..."

~~~
danmg
Modern x86, for the past 25 years or so, doesn't implement the instructions directly. There's an internal, intermediate, proprietary reduced instruction set. So outwardly it's a CISC processor, with variable-length instructions and a lot of different addressing modes, but internally it's not.

~~~
mankash666
That's speculation about how an embedded x86 works. Just because their desktop-focused, non-power-optimized products do something doesn't necessarily mean their embedded line-up does the same.

FYI, Apple is a hard-driving customer; they do not compromise on power, performance & cost/pricing.

~~~
danmg
Well, you're up against basic physics here: decoding the full x86 ISA requires a bigger die, more power, more heat.

~~~
tracker1
But a 486DX with more integrated cache could be incredibly efficient on a modern manufacturing process. A modern i5-8250U stays under 15W... and we're talking about something less than 1/1000th of that complexity.

------
kolderman
But what core? Intel Atom?

And does this mean it can run Doom?

~~~
Symmetry
Certainly not as big as an Atom; that would be an application core. Possibly a Quark[1], or possibly something custom, cut down even further.

[1][https://en.wikipedia.org/wiki/Intel_Quark](https://en.wikipedia.org/wiki/Intel_Quark)

------
johnvega
I did a lot of x86 assembly coding in college, so this article was interesting. Actually, it was fun to read.

