
64-bit ARM chips in iPhone 5S serve up taste of Intel-free future for Apple
http://www.zdnet.com/2015-64-bit-arm-chips-in-iphone-5s-serve-up-taste-of-intel-free-future-for-apple-7000020524/
======
twoodfin
Disagree. Intel looks to be getting better at low power faster than ARM and
partners are getting better at perf/watt.

The ARM universe has a flexibility advantage, but Apple might not care, since
Intel will surely make them the SoC they want if it means a spot in the iPad.

I don't think the outcome here is ordained, nor would Apple want it to be:
Competition for supplying them fast, efficient CPUs is exactly what they
didn't have in the PowerPC days, and it nearly killed them.

~~~
joezydeco
_Intel looks to be getting better at low power faster than ARM and partners
are getting better at perf/watt._

Citation, please?

~~~
twoodfin
This is the kind of thing I have in mind:

[http://www.anandtech.com/show/7117/haswell-ult-investigation](http://www.anandtech.com/show/7117/haswell-ult-investigation)

Even their "ultrabook"-class Haswell parts are becoming power competitive with
ARM tablet chips. The Y series should get even closer to parity. Broadwell is,
in theory, right around the corner, with Intel claiming another 30% power
improvement in the 14nm process.

[http://www.anandtech.com/show/7318/intel-demos-14nm-broadwel...](http://www.anandtech.com/show/7318/intel-demos-14nm-broadwell-up-to-30-lower-power-than-haswell)

Intel will have a significant process advantage over the ARM partners for at
least another couple of years, and possibly longer. I would be surprised if
they failed to overcome the engineering challenges required to exploit that
process advantage in mobile.

My point isn't that I'm sure who's going to win, but rather that Apple is not
inextricably tied to ARM.

------
revelation
Just because Apple says the processor is "desktop-class" does not make it so.
If it consumes 5W, you are only going to get 5W worth of performance. Any
half-recent Intel processor will easily blow this out of the water.

~~~
mikeash
Of course it's desktop-class! If your desktop was built in 2005....

~~~
lttlrck
What was wrong with the desktop in 2005?

~~~
wmf
Nothing... when running 2005 software. But due to Gates's Law you probably
don't want to run 2013 software on 2005 hardware.

~~~
x0054
Name one software package (not a game), released in 2012-13, that would not
work on a midrange 2005 desktop.

~~~
karl_gluck
"Work" is pretty broad. Let's define it as "have a productive experience"
because any non-period matched programs have issues other than pure
compatibility. For example, bring many old DOS programs up on modern hardware
and they will "work" but be entirely unusable unless you do something to slow
them down.

So, 2013 programs that don't "work" on 2005 hardware? Off the top of my head:
SolidWorks 2013, VMware running Ubuntu 12 on top of Windows, Avid, Visual
Studio 2013.

------
randyrand
It doesn't seem like the author has a very clear grasp of what 64-bit word
sizes actually allow over 32-bit. Here's a hint - not much. Sure, eventually
more RAM, and more registers (which is very important but not really _inherent_
to 64-bit as far as I know). It seems like the author thinks 64-bit magically
makes things run faster or easier to develop for - neither of which is true
except in a very small minority of circumstances.

I'll leave with this: Nintendo came out with a 64 bit game console ~15 years
ago at half the cost of the iPhone. This is not ground breaking stuff.

~~~
wtallis
The "more registers" bit doesn't _require_ going 64-bit, but it does require
changing the instruction format, at which point it would be foolish not to
take the opportunity to go 64-bit.

------
zaroth
Place your iPhone 8S next to your Thunderbolt display, and you have full blown
OSX appear on the screen, and you use keyboard/mouse as usual.

OSX and iOS will be running concurrently on the same hardware, when it gets
fast enough. Of course as soon as you "undock" your phone (which doesn't
actually involve a cable, or any physical connection), the OSX "VM" is
instantly suspended.

By the way, you could do the same thing with a MacBook Air type device which
is nothing more than monitor, keyboard, trackpad, and battery. Phone stays in
your pocket. Open the lid, and boom, there your desktop OS is. Except it's
actually being served over AirPlay from your "phone". Maybe an early iteration
would require connecting the phone to the shell with Thunderbolt.

Although I think it's fundamentally two different OSs, I think the line will
blur to the point where it's a meaningless distinction. For example, apps
which support both modalities will install with a single click in both places.

~~~
Sami_Lehtinen
Sounds silly to have two separate operating systems. Surely they're going to
converge them?

What about Chrome OS and Android?

~~~
mortenjorck
At least Microsoft has shown the market how not to do it.

Long-term, a converged, unified OS makes sense, but a lot more work is going
to have to go into making a truly context-responsive UI system.

------
danbruc
Although the about box says the author has more than 20 years of experience in
the IT industry, at no point does the article convince me that the author
understands the difference between a 32-bit and a 64-bit processor.

~~~
mikeash
The amount of misinformation on the 5S's 64-bitness is hilarious. I've yet to
see a single person get it right aside from people I know personally.

~~~
jsf
Care to share? I haven't heard a good explanation of what it's good for.

~~~
mikeash
fleitz's and eddieplan9's replies are generally accurate, so I won't repeat
them.

The revised instruction set is the main thing. ARM's 32-bit instruction set is
a bit odd, and the better 64-bit one will give a decent speed boost.

The ability to expand the address space beyond 4GB is nice, although not yet
killer. The device still doesn't have enough RAM to need it, surely, but it
still lets you do interesting things like memory map large files or play other
fun address space games.

In addition to that, 64 bits also lets Apple make various improvements on
their end, such as adopting tagged pointers (already seen on 64-bit Mac),
which can be a big performance and memory win for certain kinds of code.

What 64-bit doesn't do:

1\. Make code vastly faster. The performance increase will be somewhere in the
neighborhood of 10-20%, not the huge gain some people seem to think. Some code
could end up being slightly slower due to not doing anything that benefits
from the new architecture while using up more memory and cache due to larger
pointers, although this probably won't happen often.

2\. Allow qualitatively different things to be done with the devices. For
example, the linked article implies that fingerprint recognition and VPNs are
both made possible by the 5S's 64-bitness, neither of which is even remotely
true. Fingerprinting might be a bit slower without it, but not much. VPNs have
been on iPhones for years, and 64 bits makes nearly no difference there, as
CPU load isn't high for network-limited crypto anyway.

3\. Make code easier to port or more compatible. For any halfway decent
halfway modern code, the 32/64 divide is not a big deal at all. Maintaining a
code base that works in both 32 and 64 is trivial.

4\. Signal anything about future iOS/Mac convergence at Apple. A 64-bit CPU in
an iPhone was inevitable. It's pretty much required once you want to go beyond
4GB of RAM, and extrapolating forward, iPhones will hit that limit soon. The
only surprise (and let's not downplay it, it was a pretty big surprise) is
that it happened _now_ , and there's no real difference in terms of
hypothetical Apple plans between a 64-bit ARM in 2013 and a 64-bit ARM in
2015.

~~~
vardump
64 bits has little to do with the amount of addressable memory. In fact, 32-bit
and 64-bit processors can address just as much physical memory. For example,
the 32-bit ARMv7 Cortex-A15 can address up to 1TB of memory.

The only difference is the amount of physical RAM you can map at once: only the
virtual address space is limited on 32-bit processors. But even then, you can
change the virtual-to-physical mappings at runtime. It's only inconvenient.

~~~
mikeash
In theory true, but in a practical sense, roughly no real-world programs are
going to use more RAM than they can actually address with a raw pointer, which
means that roughly no real-world programs are going to take advantage of more
than 4GB of RAM on a 32-bit CPU.

------
ryanobjc
Apple did a bang-up job accelerating their deployment of ARMv8 and the A7.

The question is: what now, Intel? On one hand, Intel made a huge misstep and is
missing out; on the other hand, at the time, "low-cost ARM CPUs" probably
didn't seem like a good market for a company that has typically sold premium
high-end CPUs.

One thing is for certain, Apple is unwilling to let Intel or anyone else
dictate features via CPU roadmap.

Did anyone else notice that the A7 has 1 billion transistors? That's on the
scale of a desktop/mobile CPU - the latest Intels have about 1.4B or so.
That's a big deal. And don't give me the CPU-vs-GPU argument, because Intel
ships with a GPU on die.

------
grecy
I don't buy that Apple will merge iOS and OS X.

At the last All Things D that Steve appeared at (9? 10?), he talked at length
about this, and said they'd thought about it and worked on it for a very long
time. In the end, they figured that phones and Mac Pros are too different to
share the same OS, and it just doesn't make sense.

~~~
chiph
Agreed. The tablet-like features that got added into the recent versions of OS
X are something I don't particularly care for. If I'm on a desktop/laptop, I
want a desktop-like experience. If I'm on a mobile device, the paradigm is
sufficiently different that it justifies having a different OS.

Good example: the default scroll direction got changed in OS X to operate like
a touch device. Hated it with a passion. First thing I changed back.

~~~
Zr40
Some people like moving the scroll bar (which seems to be your preference).
Other people like moving the document (which is my preference).

There's no right or wrong choice here. The current default logically makes
sense -- your cursor is located at a document, and when you're moving your
fingers upward, the document also moves upward.

But like any changed default, there are people who liked the previous setting.
And there are of course people who want to use (or have to use) other OSes
where this setting can't easily be changed. Those people can change the OS X
setting at will.

------
minikites
>Because both Mac OS X and iOS are now 64-bit, it will be easier to port the
more demanding (but fewer in number) Mac apps to a single, converged future-
state Apple device OS.

It seems like Microsoft has shown that strategy to be a dud. Touch interfaces
are awkward with a mouse and mouse interfaces are too fiddly to touch.

~~~
dljsjr
If your business logic (Models/Controllers) are strongly decoupled from your
Views, you still maintain a pretty large amount of code that becomes easier to
port.

~~~
mikeash
It really doesn't, though. The 32/64-bit divide is not all that consequential
for this.

~~~
dljsjr
In 99.9% of cases, true. I was just trying to point out that the comment about
UIs was invalid.

If your code drops down to ASM or anything like that, then it becomes an
issue. But that's not most use cases.

------
drill_sarge
I'll have nightmares if the future is hundreds of ARM cores in workstation-
class systems. I'd rather have a couple of big x86_64 cores which are "just
there and work" without having to deal with all kinds of scary things.

Also, I think it's funny how people fall for this 64-bit marketing BS. It's
like back in the days of the AMD Athlon 64, when people thought everything was
faster just because of 64 bits.

I am interested in the new Atom generation too, which seems really promising.

------
ChuckMcM
This is an interesting article for the theme, but not the content :-). Intel
and ARM are on a collision course. What is interesting to me is that the
previous two trains on these tracks were Intel and the bespoke computer
makers.

Some folks here will recall the great microprocessor wars for the general
purpose computer. Starting with the Z80, 8080, 6502, and 6809, moving to the
80286/386, 68000/68020, and SPARC/MIPS/PA-RISC/PowerPC.

Intel pretty much won that war by coming up from the bottom, powering 'toy'
computers (IBM PC) running a 'toy' operating system 'MS-DOS' with 'toy'
peripherals (640 x 480 graphics). They came up and ate workstations, and then
they ate servers.

And here is the salient fact for me, they could do that because while the
market for workstations might be 1 million/year, the market for PCs was 1
million a _month_ (or more at its peak). So while a PC couldn't do what a
workstation could early on, the PC market was generating a ton of cash which
was going into improving the PC, much more cash and investment than was going
into improving the Workstation. One by one the Workstation vendors dropped out
of the market or replaced their offering with a "high end" PC.

ARM has set up the same scenario today, although this time the 'toys' are
portable devices: clever and useful, but not nearly as powerful as PCs, so
hardly a threat. Yet for every million PCs sold there are 5-10 million phones
sold. And all that money and all that investment goes into making the chips
that power these phones better and better, while the PC king sits there trying
to make a more powerful PC when what people are buying are more powerful
_phones._

At IDF (in 2010, if memory serves) Intel noted that they were going to move
into the embedded space more aggressively, but they didn't actually follow
through too well, focusing on multi-chip solutions which didn't work well in
space-constrained phones. Meanwhile, ARM was getting design win after design
win with their SoC partners.

So last year Intel started going all out on making a lower-power version of
the x86 architecture which could compete with ARM, and at this year's IDF they
took aim at the SoC business. All in an effort to pull enough oxygen out of the
ecosystem to slow ARM down. After all, if _today_ you could choose between two
chips, one ARM and one x86-64, with the same cost and power curve, you'd
probably go with the x86-64 for the assurance that software support would be
easier. But once software support isn't an issue it gets to be more of a horse
race, and ARM notably has much better licensing terms than Intel has had since
being nearly usurped by AMD the first time they tried licensing.

If ARM can get to 64 bits and I/O channel bandwidth faster than Intel can get
to ARM's power envelope and cost, it will be a very interesting fight indeed.

~~~
Touche
Intel getting to ARM's power envelope and cost won't help them though. They
already did the Razr i which I think shows people won't buy something just
because it's Intel. Once Intel scales down to match ARM it has lost all of the
things that make it "Intel", and so what's the point? If they compete directly
with ARM they're going to lose because they _have_ more to lose.

~~~
sciwiz
"They already did the Razr i which I think shows people won't buy something
just because it's Intel."

Pretty sure it's because it was made by Motorola and for sale only in Europe,
where they have no mindshare.

------
jrockway
Switching from Intel to ARM for their "real" computers sounds like a decision
that Apple's current leadership would make.

------
vicaya
It'll be an interesting fight with Intel's Silvermont Atom, which finally gets
out-of-order execution.

Intel claims that dual-core Silvermont chips can outperform quad-core ARM
chips by a factor of 1.6 when power draw is similar. Intel also says
Silvermont can draw 2.4 times less power when delivering similar performance
as the quad-core, ARM-based competition.

------
Aloha
I think this is spot on. If you're buying 2 million units of X, it's nearly
always cheaper per unit than buying 1 million. That said, I don't expect to
see a total platform convergence. I see a future where you have iOS and OSX,
and where you could, for example, run iOS apps on OSX. I don't see them
following the same route Microsoft did with Windows 8.

------
kabdib
Current ARM memory systems are terrible compared to what you find on even
medium scale x86 chips. There's going to have to be a LOT of investment here
before you'll see ARM approach the scale of large (or even medium) iron.

