
How ARM got so successful without the public really noticing - room271
http://www.theguardian.com/technology/2015/nov/29/arm-cambridge-britain-tech-company-iphone
======
kogepathic
> Hands up who knows which company designed the iPhone 6s’ 64-bit chip

Apple did. It's been widely acknowledged that the processor inside the iPhone
is a custom design which implements the ARMv8 ISA. See:
[https://en.wikipedia.org/wiki/Apple_A9](https://en.wikipedia.org/wiki/Apple_A9)

> Furber and his co-designer, Sophie Wilson, had found research from the
> Berkeley campus of the University of California into a new type of
> processor: one that simplified the set of instructions it would follow, in
> order to enable a sleeker, more efficient design. This style of processing
> was called “reduced instruction set computing”, or Risc, and the Berkeley
> Risc designs had been put together by just two people, David Patterson and
> Carlo Sequin.

ARM were hardly the first people to bring a RISC CPU to market: MIPS, SPARC,
and POWER are RISC architectures which pre-dated ARM.

> By 2007, Intel had abandoned the excesses of the company’s Pentium line

Pentium is a brand name, which still exists today. What the author is
referring to is Intel's "NetBurst" architecture.

> powering the servers at companies like PayPal.

Citation needed. The CEO of AppliedMicro said PayPal "has deployed and
validated" services based on their X-Gene CPU, but readily admits to having
shipped only 10,000 units, which is peanuts to datacentre guys. Source:
[http://www.theregister.co.uk/2015/04/29/applied_micro_q4_2015_results/](http://www.theregister.co.uk/2015/04/29/applied_micro_q4_2015_results/)

Overall, I'm happy for ARM's continued success, and I hope that they and the
rest of the semiconductor industry will continue to innovate and bring the
performance per watt up.

That being said, this Guardian article is crap. It's factually incorrect in
places, missing sources for claims, and generally doesn't do anything to
actually explain why people supposedly don't know who ARM is.

tl;dr - Some guys in Cambridge, who are fabless, made a CPU from a RISC
design. They license the designs to partners, who can add whatever they want
around the CPU (or redesign the CPU entirely, as long as they conform to the
instruction set, as Apple does). The reason they're "not well known" is that
they don't sell CPUs themselves; companies like Apple, Qualcomm, MediaTek,
etc. do.

~~~
sohcahtoa
> ARM were hardly the first people to bring a RISC CPU to market: MIPS, SPARC,
> and POWER are RISC architectures which pre-dated ARM.

Minor point: ARM-1-based devices were kinda, sorta, brought to market around
'85-'86. That's neck-and-neck with MIPS I. Real ARM-2 based machines in '87
were three years ahead of the RS/6000.

Of course, lots of people were developing RISC architectures in the early 80s,
in parallel.

~~~
tacos
Even Intel, of course. Windows NT started development exclusively on RISC: the
Intel i860 ("N-Ten"). Shipped supporting MIPS, DEC Alpha and PowerPC.

~~~
prodmerc
Not to mention Intel had quite a successful line of ARM CPUs back in the day -
[https://en.wikipedia.org/wiki/XScale](https://en.wikipedia.org/wiki/XScale)

Which makes articles like this one ridiculous:
[http://www.digitaltrends.com/computing/intel-arm-processors-why-how-who/](http://www.digitaltrends.com/computing/intel-arm-processors-why-how-who/)

"the unthinkable happened. Intel revealed it was going to build ARM
processors."

Do your damn research, people :-)

~~~
daemin
Well, from memory Intel sold off the general-purpose ARM CPU division to
Marvell back in 2006 or so, though it kept the ARM chips that were specialised
for storage or networking, IIRC. So technically Intel has always made ARM
chips in one form or another, just not any that slot into a mobile phone.

~~~
prodmerc
I really wanna know the logic behind that decision. They probably thought ARM
was not worth it? Despite it being used successfully in Pocket PCs?

~~~
daemin
Based on what another commenter said, it seems Intel bought the ARM license
from DEC, and given that they sold off only the consumer ARM chips - keeping
the task-specific ones - it makes sense in a way. It was so they wouldn't have
any internal competition or divided focus away from their low-power x86 chips
- the Atoms.

------
leonroy
Another related British company which has become very successful without
anyone noticing is Imagination Technologies (formerly Videologic).

They had a successful graphics card business during the 90s, but with the
introduction of 3D graphics accelerators their product (PowerVR), whilst semi-
successful, began to struggle against 3dfx and nVidia's TNT and GeForce ranges
of graphics cards.

Eventually they pulled out of the standalone graphics market and pottered
around producing speakers and radios. They had a few successes, like winning
the Sega Dreamcast bid from 3dfx, but I believe it wasn't until phones needed
powerful yet low-power graphics chips that PowerVR's very efficient rendering
technology pushed Imagination back to the fore. Their graphics chips nowadays
can be found powering most smartphones and tablets.

Very interesting company and a very interesting past.

~~~
dogma1138
I wouldn't call Imagination Technologies a successful company: their IP ships
in billions of SoCs, but their yearly revenue is below 200M and they operate
at break-even or a loss. IT seems to be a case study of a company whose IP is
a cornerstone of many consumer devices but which somehow never managed to
capitalize on it. ARM is a very good example of how to spin off your IP and
make it successful even when your actual business (Acorn) is going bust.

~~~
orf
> I wouldn't call Imagination Technologies a successful company their IP is
> being shipped in billion of SOC

I would consider any company whose tech ships in a billion devices successful,
regardless of whether they have managed to capitalize on it.

~~~
BHSPitMonkey
I think "company" was the operative word there. They are certainly successful
technologists, but perhaps not a success as a company.

------
andmarios
The article is interesting and I think that the title and the closing sentence
aren't doing it justice.

It is the default for any company that produces non-end-user electronics to
stay in the background. Most people don't know who built their RAM chips, or
the capacitors on their motherboards, or even who built their motherboard. How
many people even on HN know the brand of their laptop's HDD or RAM?

Intel is the exception, in that they recognized early the value of a brand
name and did everything they could to make theirs known. It is hard for newer
generations to understand, but at the dawn of the PC era no one cared much
about the CPU brand. Intel gave discounts to manufacturers who used the “intel
inside” sticker along with many other promotional actions in order for their
product to become important in the eyes of the consumer.

------
mst
> The computer powered by the chip that came out of the design was called the
> Acorn Risc Machine – or ARM.

NO NO NO NO NO.

The computer was the Acorn Archimedes.

The first one I had contained an ARM26.

Gah.

~~~
gjvc
Correct. This was England's last hope in the computer market.

~~~
m-i-l
Or the INMOS transputer[0] based Abaq (which later became the Atari Transputer
Workstation[1]).

[0]
[https://en.wikipedia.org/wiki/INMOS_transputer](https://en.wikipedia.org/wiki/INMOS_transputer)
[1]
[https://en.wikipedia.org/wiki/Atari_Transputer_Workstation](https://en.wikipedia.org/wiki/Atari_Transputer_Workstation)

------
ddingus
I'm a fan of ARM for its power/performance characteristics, and for the
market competition it's bringing to Intel (which needs it).

One concern I have is that ARM devices are generally more locked down than
Intel ones. Look at Microsoft and its ARM tablet: locked down, Win 8 App
Store only. Android is largely locked down, unless you jailbreak, or get a
development/standalone PC board or system.

The more prominent ARM devices do not offer the flexibility and openness your
average Intel PC does.

Legacy software is a big part of what I see as keeping that door open. This
latest round of Intel PCs is notable for the lack of documentation and/or
options to configure them to boot other things. [1] Not that it can't be done,
but it's painfully obvious the only reason it can be done is that the vendor
left the option there.

They don't have to, and I get the feeling nobody really wants to. They just
feel it makes sense right now for legacy software reasons. At some point, that
equation will change...

[1] - Case in point: a recent HP consumer-grade laptop. It came with Windows 8
on it, which at the time didn't make any sense for the use case it was
purchased for. There was no meaningful documentation on how to enter the BIOS
to set "LEGACY" mode. I literally poked around on the keys until I found it.
Scary. The idea that "they didn't even have to provide that" hit home right
then.

------
RockyMcNuts
With Intel skipping a 'tick' and fabs like Samsung and TSMC catching up in
process, ARM-based CPUs are competitive in performance with the low end of
Intel's desktop lines up to Core i5 - and use less power.

There seems to be a window for ARM to make inroads into servers, in addition
to low-end laptops like Chromebooks.

The trend of mobile devices replacing desktops for more and more tasks (> 50%
of emails read on mobile, only 10% of people desktop-only) bodes poorly for
Wintel.

One could foresee an era where one can add a keyboard and big screen to a
mobile device like an iPad Pro, and most folks no longer use desktops at all.
Phones migrate up to replace desktops, PCs migrate up to the cloud.

[http://www.huffingtonpost.com/druce-vertes-cfa/the-end-of-the-pc_1_b_8613602.html](http://www.huffingtonpost.com/druce-vertes-cfa/the-end-of-the-pc_1_b_8613602.html)

~~~
dogma1138
ARM CPUs aren't really comparable to anything above Intel's Atom line in
terms of benchmark performance, and real-world applications still give x86-
based CPUs a huge advantage.
[http://www.computingcompendium.com/p/arm-vs-intel-benchmarks.html](http://www.computingcompendium.com/p/arm-vs-intel-benchmarks.html)
[http://www.phoronix.com/scan.php?page=article&item=nvidia_tegrak1_preview&num=1](http://www.phoronix.com/scan.php?page=article&item=nvidia_tegrak1_preview&num=1)

A lot of the "performance" of most of these SoCs actually comes from the
GPU/DSP portion of the SoC rather than the CPU proper. So far it seems that
Intel's bet was correct: it will take them about as much time to reduce x86
(Skylake/Core M, Bay Trail/Cherry Trail/Willow Trail) power consumption to ARM
SoC levels as it would take ARM SoCs to come close to x86 performance, and
Intel will still win in the end.

Keep in mind that in 2007-2008 Intel was very seriously considering licensing
ARM and other technologies to compete in the ultra-portable and mobile
markets, and we are all better off that it didn't do so.

Intel now has offerings which are better than or comparable to ARM-based SoCs,
without fragmenting the ARM ecosystem even further, and with another avenue of
technologies and intellectual property to keep the competition going and to
offer an alternative to ARM/RISC, with both the ability to streamline the
transition and a fallback option, which is always needed.

An all-ARM ecosystem is just as bad as a monopolized x86 one (or even worse,
since unlike x86 it doesn't guarantee compatibility). We should also be quite
thankful to ARM and its users, as they have been the driving force behind much
of Intel's work lately; AMD has not really offered competition since the old
Athlon 64 days.

The new Core m7 offers 2 cores at a boost clock of 3.1GHz and a 1GHz GPU, and
beats every ARM SoC out there in terms of performance, especially in the
places where it actually counts, while having a TDP of 4.5W, which is
comparable to high-end SoCs like Apple's A9X/A8X and considerably lower than
NVIDIA's high-end offering (although that one blows everything out of the
water when it comes to graphical performance, as it has desktop GPU cores in
it).

~~~
RockyMcNuts
Those blog benchmarks are from 2013 and 2014.

Current benchmarks linked in the blog post show ARM competitive with Core i5
(single-core), although not i7.

[http://arstechnica.com/apple/2015/11/ipad-pro-review-mac-like-speed-with-all-the-virtues-and-limitations-of-ios/4/#h2](http://arstechnica.com/apple/2015/11/ipad-pro-review-mac-like-speed-with-all-the-virtues-and-limitations-of-ios/4/#h2)

The blog post also links to published reports saying Intel has 1,000 people
working on a SoC for the iPhone.

[http://venturebeat.com/2015/10/16/intel-has-1000-people-working-on-chips-for-the-iphone/](http://venturebeat.com/2015/10/16/intel-has-1000-people-working-on-chips-for-the-iphone/)

~~~
dogma1138
Ars isn't loading for me. Yes, the A9X is a monster chip, especially as far as
graphics and synthetic benchmarks go, but try real-world applications (quite
hard to near impossible on Apple devices due to the closed ecosystem),
including web server performance, video compression, throttling, and many more
aspects, and you'll see that's not the case. It's very competitive with Core M
devices for sure, but sorry, I'm not sold on it being competitive against
Core i devices on any level besides maybe graphical performance so far.

As for Intel making Apple SoCs: and? Intel is selling modems to Apple, and it
wants to fab SoCs for them as well. Since Apple is fabless and Intel has the
most advanced semiconductor production line at the moment (it nailed 14nm and
is pushing for 10nm next year for launch in 2017), I don't see this being
indicative of anything. Intel tested the water with ARM almost 8 years ago and
decided to continue with x86, and we've seen quite interesting things coming
from them: Core M, Xeon-D, and even things like Intel Edison, which showed
that x86 can be scaled to pretty much every use at this time while still being
very interesting and competitive in terms of price, performance, and power
consumption.

We've been hearing this gloom and doom about "Wintel" for almost 2 decades
now: Linux is going to beat them, no wait, Apple's PowerPC, no, ARM... sigh,
it's getting old. Intel isn't going anywhere soon, it has enough resources to
push through, and its technology, and where they can take it, does not so far
seem to be obsolete on any scale.

~~~
josh2600
"Nailed 14nm". Lol.

I guess maybe you missed all the news about Intel fubar'ing its 14nm process
and having to order more 16nm capacity from TSMC to deal with the pain?

[http://techbuyersguru.com/hotdealsblog/tbg-commentary-intels-14nm-woes-and-the-future-of-cpu-tech/](http://techbuyersguru.com/hotdealsblog/tbg-commentary-intels-14nm-woes-and-the-future-of-cpu-tech/)

Intel might've once been unstoppable, but they're not perfect any more.

------
ksec
PayPal using ARM servers? I didn't know that. I would actually love for
someone to explain to me why an ARM server is an attractive option. When Intel
launched the recent Xeon-D, I thought there wasn't really any incentive left
to go to an ARM CPU server.

------
ChuckMcM
As with most things I think the success of ARM has less to do with its chips
or architecture and more to do with its business model and the competition.

For decades the combined power of Intel's volume and Window's ubiquity kept a
huge amount of resources dedicated to that platform. SPARC, M68K, NS32, VAX,
PA-RISC, even Itanium were crushed under the unrelenting focus by third
parties on building tools, software, and systems around x86 and later AMD64
architecture chips.

What is fascinating is that Intel got into that position by being open: there
were no fewer than 12 licensees for its 8086 design, and people had supplanted
"expensive, proprietary lock-in" type architectures with more open and cheaper
chips. It was with the emergence of the PC market, and the great Chip
Recession of 1984, that Intel decided that if it was going to stay a chip
maker, it had to be the _best_ source of its dominant computer chips. I was at
Intel at the time, and it shifted from partnering to competing with the same
people who had licensed its chips, with the intent of "reclaiming" the market
for CPU chips for itself.

You have to realize that at the time the bottom had fallen out of the market,
and things like EPROMs and DRAM (both of which Intel made) were being sold on
the grey market at below-market prices as stocks from bankrupt computer
companies made it into the wild. Further, competitors like Oki Semiconductor
were making better versions of the same chips (lower power, faster clock
rates). Intel still had a manufacturing advantage, but it could not survive if
it couldn't make the margins on its chips hold. It dumped all of its
unproductive lines, wrapped patents and licenses around all of its core chips,
and then embarked on a long-term strategy to kill anyone who wouldn't buy
their chips from Intel at the prices that Intel demanded.

We can see they were remarkably successful at that, and a series of CEOs have
presided over a manufacturing powerhouse that was funded by an unassailable
capture of not only software developers but system OEMs as well. They fended
off a number of anti-trust lawsuits, and delicately wove their way between
former partners like Compaq who were now laying on the ground, mortally
wounded.

ARM was playing in the embedded space, dominated by the 8051 (an Intel chip),
where Intel played the licensing card (just like ARM), licensing its
architecture to others who would make their own versions of the chips. As a
licensing play, they ensured their partners would never move "up market" into
the desktop space and threaten the cash cow that was x86.

The relentless pace of putting more transistors into less space drove an
interesting problem for ARM. When you get a process shrink you can do one of
two things: you can cut your costs (more dies per wafer), or you can keep your
costs about the same and increase features (more transistors per die). And the
truth is you always did a bit of both. But the challenge with chips is that
their macro-scale parts (the pin pads, for example) really couldn't shrink. So
you became "pad limited". The ratio of the area dedicated to the pads (which
you connected external wires to) and the transistors could not drop below the
point where most of your wafer was "pad". If it did, your costs flipped and
your expensive manufacturing process was producing wafers of mostly pads, not
utilizing its capabilities. At the Microprocessor Forum in 2001 the keynote
suggested that spending anything more than 10% of your silicon budget on pads
was too much: 90+% of your die had to be functional logic or the shrink just
didn't make sense.
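To make the pad-limited arithmetic concrete, here's a back-of-the-envelope
sketch in Python (all the numbers are illustrative assumptions, not figures
from the keynote): the pad ring stays roughly fixed in area while the logic
halves with each full node, so the pad share of the die grows until the
shrink stops paying for itself.

```python
# Pad-limiting, roughly: bond pads must stay large enough to attach
# wires to, so the pad ring's area is ~fixed, while logic area shrinks
# with each process node. The pad share of the die therefore grows.

def pad_fraction(logic_area_mm2, pad_area_mm2=4.0):
    """Fraction of total die area consumed by the (fixed) pad ring."""
    return pad_area_mm2 / (logic_area_mm2 + pad_area_mm2)

# The same logic design ported across shrinking nodes: each full node
# roughly halves the logic area, but the pads cannot shrink with it.
logic = 40.0  # mm^2 at the starting node (made-up number)
for node in ["180nm", "130nm", "90nm", "65nm"]:
    frac = pad_fraction(logic)
    print(f"{node}: logic {logic:5.1f} mm^2, pads are {frac:.0%} of die")
    logic /= 2  # approximate area scaling per full node

# Once pads pass ~10% of the die, the shrink stops paying off unless
# you spend the freed-up area on more logic: bigger cores, caches,
# or (as ARM licensees did) more on-die peripherals.
```

This is the economic pressure the paragraph above describes: the cheapest way
to keep the pad fraction down is to fill the die with more features.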

The effect of that was that chips ARM designed really had to do more stuff or
they were not going to be cost effective on any silicon process with small
feature sizes. And the simplest choice is to add more "big processor" features
or additional peripherals.

So we had an explosion of "system on chip" products with all sorts of
peripherals, which continues to this day. And the process feature size keeps
getting smaller, and the stuff added keeps growing. The ARM core was so small
it could accommodate more peripherals on the same die; that made it cost
effective, and that made it a good choice for phones, which needed long
battery life but low cost. The age of phones put everything except the radios
on chips (radios, being like modems, different for every country, were not
cost effective to add to the chip until software-defined radio (SDR) became a
thing). And the success as a phone platform pushed the need for tools, and the
need for tools got more of the computer ecosystem focussed on building things
for the ARM instruction set.

At that point step two became inevitable. Phones got better and better and
more computer-like; they needed more and more of the things that "desktop"
type computers need. You have a supplier (ARM) which is not trying to protect
an entrenched business, basically doing all it can to widen its markets. And a
company like Apple, which wasn't trying to protect its desktop/laptop market
share, pushing the architecture as far as it can. More tools, more focus, more
investment from others to support it, and like a fire that starts as a glowing
ember near a convenient source of tinder, the blaze grows until the effects of
the fire are creating its own wind and allowing it to grow bigger and
stronger. Even after Intel woke up to the fact that the forest around their
x86 architecture was on fire, I don't think they had enough time to put it
out.

So here we are, with ARM chips that are comparable in software support and
feature set to Intel's low-end desktop CPUs, but without the Intel "tax" (the
extra margin Intel could demand as the only player), and immune to Intel's
ability to attack by patents or license shenanigans. Intel is in full-on
defense, paying tablet vendors like Lenovo to use their chips in ARM tablets,
supporting the cost of building out their own IoT infrastructure with Galileo,
and doing all they can to keep ARM out of their castle, the data center. Like
DEC and its VAX line, or Sun and its SPARC line, they are doomed.

Looking at the performance of the iPad Pro, it is pretty clear you can build a
Chromebook or a laptop that would meet the needs of the mass market with an
ARM architecture machine. And because ARM licensees can add features
_anywhere_ in the architecture, including places like the frontside bus[1],
which is tightly controlled space in x86 land, you will be able to provide
features faster than x86 OEMs can convince Intel they need them. And that will
change things in a pretty profound (and I think positive) way. Not the least
of which might be having the opportunity to buy a laptop that isn't pre-
backdoored by the chip manufacturer with its SMM.

[1] Literally if you buy a bus analyzer (a sophisticated logic analyzer) from
Agilent or Tektronix and hook it to the Intel frontside bus, it won't display
the signals until you enter the NDA # you got from Intel! That is pretty
tightly controlled.

------
randomsearch
Can anyone explain why ARM have not been bought by Intel, MS, Apple or Google
yet?

~~~
cstross
The success of ARM's strategy is based on them being an impartial supplier of
designs that facilitate low power consumption and some degree of
interoperability at the assembler level.

If anyone buys it, they kill the goose that lays the golden egg. All those
corporations run their own vertically integrated stack to some extent -- if
you were Apple, would you be willing to trust an ARM design licensed from MS
or Google? (Or Intel, when the whole point of Apple's CPU strategy going back
22-25 years is to be mostly CPU-independent after getting burned twice in a
row by MC680x0 and then PowerPC)? If you were Samsung, would you license an
ARM design from Apple-owned ARM? And so on.

~~~
Andaith
Interesting, I'd have assumed the opposite.

If you're Apple, and you're licensing ARM designs, can you risk Google, or
Microsoft, or Oracle buying ARM? Can you risk them changing the licensing
terms, or stopping licensing altogether?

How come none of them wants to be the first to buy, securing their rights to
ARM, and instead they all enjoy the risk of a competitor buying ARM?

~~~
stephencanon
You pay lawyers to negotiate contracts carefully to protect against that risk,
you don't need to buy the company.

------
qyv
[https://www.youtube.com/watch?v=1jOJl8gRPyQ](https://www.youtube.com/watch?v=1jOJl8gRPyQ)

Steve Furber on the creation of ARM, quite interesting.

------
ThePhysicist
I think "without anyone really noticing" should really mean "without anyone in
the general public really noticing". Competitors and people working in the
chip industry have probably been well aware of the ARM architecture and its
rise for a long time. In addition, some of the factors responsible for the
huge success of this architecture have probably been historical accidents, in
the sense that a different architecture might have been capable of taking
ARM's place, but the adoption of ARM by some large tech companies and the
explosion of the smartphone market made things move really fast in favor of
that platform.

In general I think that the press and the general public only become aware of
these things long after they have reached a dominant market position and are
employed almost universally. It's a safe bet that even now many companies that
are almost unknown to the general public are working on technologies and
products that will change entire markets in the future.

~~~
diqu
> _I think "without anyone really noticing" should really mean "without anyone
> in the general public really noticing"._

I think that's implied when a mainstream news outlet writes about it, and not
some tech magazine.

> _In general I think that the press and the general public only becomes aware
> of these things long after they have reached a dominant market position and
> are employed almost universally._

which seems evident, here. ARM becomes dominant -> Guardian writes about it ->
the general public learns about it.

~~~
bpicolo
I mean, I don't expect the general public to care at all about CPU
architecture...

~~~
0942v8653
It does matter to them, though, because if the computing market turns towards
ARM tablets (tablets seem to be what Microsoft is pushing) and they get one
without realizing it, they won't be able to run any of the stuff they used to
be able to on Windows. Windows has a long history of backwards compatibility,
and if ARM becomes dominant, that history will end up being pretty much
pointless. CPU architecture isn't a huge difference when chips are compatible,
but this time it is a big deal.

~~~
TeMPOraL
Still, it's not something the GenPop will know or care about. From their POV,
it's a product issue - this tablet does or does not run some software. It's
easy to paper over 99% of those differences by... writing new software. I
mean, people don't complain that iPad doesn't run vanilla MS Office.

~~~
diqu
Which boils down to the old "consumers view computers as black boxes"
argument. The question is whether we should care about that or not, and
whether the pain of not being able to run GenPop's favorite software grows too
big. Being locked into the respective device's app store hides that pain to a
degree. And whenever it becomes noticeable, GenPop would rather consider
switching to a different vendor than consider the tech in question. As long as
people put band-aids around the architecture, I'm not sure why any consumer
would care about the underlying hardware in question.

~~~
digi_owl
That's because the computer as a mix of hardware and software is something
fairly new. The home/personal computer didn't really happen until the 80s.
Before then every device had a defined purpose, and if there was any software
it was living as "firmware" inside the hardware.

Damn it, I still recall when people got hot and bothered about getting updates
for their Nokia Series 40 phones over the air. Before that, you only got it
done if the phone was obviously broken, and to do so you brought it to the
service desk of a nearby store or some such.

------
runn1ng
I actually didn't know ARM is from England. For whatever reason, I thought the
company resides somewhere in Asia.

~~~
rzzzt
I recommend watching "Micro Men", a drama/documentary depicting the early days
of Acorn and its battle with Sinclair to produce the BBC's official
microcomputer.

[https://en.wikipedia.org/wiki/Micro_Men](https://en.wikipedia.org/wiki/Micro_Men)

~~~
matt2000
[https://www.youtube.com/watch?v=hco_Av2DJ8o](https://www.youtube.com/watch?v=hco_Av2DJ8o)
for anyone interested in checking it out.

