
The reluctant debut of the A14 processor - ingve
https://sixcolors.com/post/2020/09/the-reluctant-debut-of-the-a14-processor/
======
RupertWiser
Totally get this from a business perspective, but speaking for myself, I was
actually far more interested in seeing the new chip in the iPad. I have no
doubt that the iPhone 12 will be another confident black rectangle. Having
USB-C and the stronger processor work their way down the price points was a
far more exciting announcement for me. I don't really ever find myself
yearning for a stronger CPU on my iPhone XR, but if Apple is serious about
replacing PCs with iPads, it seems like having strong chips in their cheaper
models is a great position to fight from.

~~~
nicoburns
It's the mac chips I'm really interested in seeing. That's the unknown
quantity.

~~~
josephg
Likewise. I'm still rocking a 2016 13" MacBook Pro with a dual-core CPU and
integrated graphics. I'm looking at picking up this year's iPhone, and it
might be faster in every way than my laptop. (But with less RAM and storage.)

If true, I'm not really sure what to do about that. I doubt I can offload rust
compilation to an iphone.

~~~
nicoburns
Yeah, I have the 2015. You can actually compile Rust on an iPhone (though I
think you'd have to sync the code there manually), but for Rust you really
want one of the AMD processors with 8+ cores. I guess the new MacBooks might
be able to compete. We'll have to see.

------
jiggawatts
Did anyone else notice that the AI section is now bigger than the traditional
CPU cores?

What applications benefit from this? The only use-case I've seen for AI in
Apple's mobile devices is the face unlock.

~~~
Skinney
Apple does all machine learning on-device. So stuff like handwriting
recognition, Siri, automatic improvement of photos, etc. makes heavy use of
it.

~~~
jononor
All inference on-device you mean? I presume that the vast majority of their
models are trained in the cloud. Apart from some fine-tuning for
personalization maybe.

~~~
manojlds
While they did say machine learning, all the examples are of inference, not
training. So you don't have to ask whether they meant inference; it's very
clear what they mean.

~~~
lostmsu
Outside of marketing speak, it is not really machine learning if the machine
is not learning.

~~~
junipertea
I wouldn't expect them to advertise "this phone can run for 10 hours and
train a model", even though that is clearly also possible.

------
ksec
It is worth pointing out that the A13 had a die size of 100mm², much larger
than usual, especially on a leading-edge smartphone node.

So the A14 is basically an A13 5nm die shrink with a better GPU and double the
NPU. What I am interested in is the die size, which I am expecting to be
sub-80mm².

------
Guthur
I'm not trying to be some sort of Luddite, but at this stage what more can a
phone really do? If it's 10% faster than chips that are already doing a fine
job right now, it just feels flat.

I really love new tech, and especially microprocessors, but I have trouble
seeing how a phone breaks out of its limiting form factor in any meaningful
way right now that would make use of yet more processing power.

~~~
simonh
People have been saying that for 10 years. It makes calls, it’s got location,
internet, a camera. Done right?

Since then we’ve got on-device powered AI voice assistants, Face ID, neural
engine powered photo enhancement, health monitoring, custom silicon enhanced
environment mapping AR. Goodness knows what else is coming.

The advent of custom tuned accelerators such as neural engines in silicon is
opening up a huge new field of applications and capabilities. That’s
especially true when they are linked to these new arrays of environment and
bio-sensors. The X series iPhones are doing a list of things simply impossible
on previous devices.

You may not care about some or all of these things, that’s fine they’re not
all for everyone, but each one of them is important to some people.

~~~
socialdemocrat
But how much does any of that really matter? I really believe the future is in
things like AI voice assistants, but IMHO stuff like that isn't really good
enough yet to make a big difference.

I think what Guthur may be reflecting on is that tons of new phone models get
pushed into the market every single year that do very little new compared to
previous models.

So what if we get 50% more performance? Sounds impressive, but then you
realize it is only used to create smiley animations and other frivolous
teenager nonsense.

I don't think I have really seen a point in phone upgrades since iPhone 4 or
so. That is about 9 years ago.

Same deal with iPads: not much new that anyone needs. There are a couple of
things, though, which I do think have mattered: the Retina display and the
Apple Pencil.

But honestly the priorities are quite different from what I would want. I love
my Apple products, but I really wish they were a bit more modular or
serviceable. By that I mean that e.g. replacing the battery should be much
easier, and memory should be easier to expand or replace.

~~~
thefounder
>> I don't think I have really seen a point in phone upgrades since iPhone 4
or so.

The apps stop working/updating after a while. So if you don't have a reason to
upgrade, Apple gives you one: planned obsolescence.

~~~
colejohnson66
Why is Apple guilty of planned obsolescence when they support their devices
for sometimes 3x as long (5-6 years vs. 2) as their competition (Android) does?

~~~
thefounder
>> Why is Apple guilty of planned obsolescence

Because they do it for obvious reasons (i.e. money).

>> when they support their devices sometimes 3x (5-6 years vs 2) what their
competition (Android) is doing?

I guess they set a sensible lifespan considering the market they target (i.e.
high end).

Either way the result is the same: after a while you end up with a useless
brick instead of an old but usable computer.

~~~
simonh
You're contradicting yourself. How can they be supporting their devices for
longer (much longer) than their competitors because their premium customers
expect it, but also be guilty of dropping support for the same products early
to make more money? You're not making any sense.

~~~
thefounder
So you think it's all right to have your computer become a useless brick after
5 years because the other vendors make it a brick after just 2 years?

It's also worth noting that at least on Android you can still
side-load/install apps outside the Play Store, while on Apple your device
becomes a totally useless museum brick.

The number of competitors has little to do with planned obsolescence, but:

"Planned obsolescence tends to work best when a producer has at least an
oligopoly"

[https://en.m.wikipedia.org/wiki/Planned_obsolescence](https://en.m.wikipedia.org/wiki/Planned_obsolescence)

~~~
colejohnson66
So our quibble is over “useless brick.” How does Apple dropping support for a
device make it a “useless brick”? As far as I’m aware, you can still use an
original iPhone from 2007 today. It won’t be fast, but it will more or less
work.

Planned obsolescence is _not_ related to advancements in technologies. Would
you classify the Commodore 64 as having been designed with planned
obsolescence in mind considering it ran on a 6502 when the competition at the
time was transitioning to 8086? Of course not. So why is Apple deciding to put
more advanced technologies into their devices planned obsolescence?

If you’re referencing the App Store, though, I can see where you’re coming
from. But Apple releasing new technologies and not putting them on devices
that can’t handle them (due to computing power) and developers using them
(which prevents them from running on said old devices) is not planned
obsolescence either. The _developer_ chose to make it incompatible with the
older devices, not Apple.

~~~
thefounder
I'm not trying to use new tech on old devices. I just want to update the apps
using the "old tech"/SDK (which is the only one that works), but the App Store
doesn't accept old tech/code regardless of the device you try to target.

To answer your question:

Today I can still deploy code on a Commodore 64 but not on my old iPhone. How
is that possible?

[https://developer.apple.com/news/?id=03042020b](https://developer.apple.com/news/?id=03042020b)

>> Starting April 30, 2020, all iPhone apps submitted to the App Store must be
built with the iOS 13 SDK or later.

~~~
simonh
The same can be said for old consoles, or old non-smartphones that were also
computers, had browsers, and in some cases even had CPUs more powerful than
contemporary iPhones, but didn't have any way to install apps at all.

There's no law requiring vendors to provide a way to install apps on devices
just because they contain CPUs.

You probably own a dozen devices containing CPUs, running software, for which
there is no practical way for you to install anything. Apple provides a
certain set of features in their devices; if you don't think that's a good
deal, there are other vendors you can go to.

------
YetAnotherNick
Apple states that:

> two high performance A13 cores are 20% faster with 30% lower power
> consumption than the Apple A12's, and the four high efficiency cores are 20%
> faster with 40% lower power consumption than the A12's. A13's eight-core
> Neural Engine dedicated neural network hardware is 20% faster and consumes
> 15% lower power than the A12's

> A14 up to 40% faster than A12, with 30% faster graphics performance than
> A12, and machine-learning performance is up to 10 times faster.
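
Taken at face value, those two quotes also pin down the A14's implied gain
over the A13. A quick back-of-the-envelope check (hypothetical arithmetic,
assuming both percentages are measured against the same A12 baseline and
compose multiplicatively):

```python
# Apple's stated CPU gains, both measured relative to the A12.
a13_over_a12 = 1.20  # A13: 20% faster than the A12
a14_over_a12 = 1.40  # A14: up to 40% faster than the A12

# Implied A14-over-A13 uplift, assuming the ratios compose.
a14_over_a13 = a14_over_a12 / a13_over_a12
print(f"Implied A14 vs A13: {(a14_over_a13 - 1) * 100:.0f}% faster")  # ~17%
```

A roughly 17% single-generation uplift is respectable, but less
headline-friendly than "40% faster."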

~~~
stepsrabbit
Is there any reason for them to compare the A14 to the A12 instead of the A13?
Other than making the numbers look more impressive, of course.

~~~
twoodfin
They’re comparing it to the SoC in the previous generation of the iPad Air.

------
GeekyBear
TSMC says that moving from their 7nm to their 5nm process node offers either a
15% speed improvement or about a 30% reduction in power consumption.

The thing that I found interesting in the A14 is that this is the first time I
can remember a chip designer using a die shrink mainly to cut power usage
instead of to increase performance.

However, after Anandtech's recent deep dive into Tiger Lake performance
testing, you can see why cutting power use in the A14 cores might be a winning
move.

>Here we present the 15W vs 28W configuration figures for the single-threaded
workloads, which do see a jump in performance by going to the higher TDP
configuration, meaning [Tiger Lake] is thermally constrained at 15W even in ST
workloads.

>Comparing it against Apple’s A13, things aren’t looking so rosy as the Intel
CPU barely outmatches it even though it uses several times more power, which
doesn’t bode well for Intel once Apple releases its “Apple Silicon” Macbooks.

[https://www.anandtech.com/show/16084/intel-tiger-lake-review-deep-dive-core-11th-gen/8](https://www.anandtech.com/show/16084/intel-tiger-lake-review-deep-dive-core-11th-gen/8)

~~~
Gibbon1
> The thing that I found interesting in the A14 is that this is the first time
> I can remember a chip designer using a die shrink mainly to cut power usage
> instead of increase performance.

This has been a thing in RF chip design for the 30 years I've been paying
attention. A transceiver I used 20 years ago consumed 220 mW in receive mode.
One I used 10 years ago, 50 mW. The current one uses 10 mW.

~~~
GeekyBear
On the level of a mobile device SoC, however?

I can't come up with a single other example where cutting power was
prioritized.

Things have tended to go the other way, with die shrinks that favor
performance, often with ridiculous amounts of overclocking piled on top of
that.

>Brian and I have long been hinting at the sort of ridiculous
frequency/voltage combinations mobile SoC vendors have been shipping at for
nothing more than marketing purposes. I remember ARM telling me the ideal
target for a Cortex A15 core in a smartphone was 1.2GHz. Samsung’s Exynos 5410
stuck four Cortex A15s in a phone with a max clock of 1.6GHz. The 5420
increases that to 1.7GHz. The problem with frequency scaling alone is that it
typically comes at the price of higher voltage. There’s a quadratic
relationship between voltage and power consumption, so it’s quite possibly one
of the worst ways to get more performance.

[https://www.anandtech.com/show/7335/the-iphone-5s-review/2](https://www.anandtech.com/show/7335/the-iphone-5s-review/2)
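
The "quadratic relationship" in that quote follows from the standard
dynamic-power approximation for CMOS logic, P ≈ C·V²·f. A toy calculation
(illustrative numbers only, not measurements of any real SoC):

```python
def dynamic_power(capacitance, voltage, frequency):
    """Simplified CMOS dynamic-power model: P = C * V^2 * f.

    Ignores leakage and short-circuit power.
    """
    return capacitance * voltage ** 2 * frequency

# Normalized baseline: C = V = f = 1.
base = dynamic_power(1.0, 1.0, 1.0)

# A 30% clock bump that (hypothetically) needs 15% more voltage:
overclocked = dynamic_power(1.0, 1.15, 1.30)

print(f"+30% clock, +15% voltage -> {overclocked / base:.2f}x power")  # 1.72x
```

Under those assumed numbers, a 30% frequency gain costs roughly 72% more
power, which is why chasing clocks is "quite possibly one of the worst ways to
get more performance."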

------
dep_b
It's a bit strange in the same year they're going to present the first laptops
sporting the new ARM chips. I mean, the performance increase can be explained
by the clock increase the smaller process allows alone. So either they put all
of their engineering capacity towards a blisteringly fast desktop/laptop CPU
or they really have a problem in their SoC engineering department.

Also, they reserved all of TSMC's 5nm capacity for the A14. Was that a
last-ditch effort to make it _look_ faster after all?

~~~
my123
The upcoming Snapdragon and Exynos chips are on Samsung's 5nm process, so they
should be relatively comparable. I wouldn't worry on that front.

> So either they put all of their engineering capacity towards a blistering
> fast desktop / laptop CPU

TSMC promises up to a 30% improvement in power consumption or up to 15% more
performance over the current N7 node with the 5nm process. That roughly
matches Apple's claims this generation. So Apple might have put the
engineering effort into desktops & laptops this time.

~~~
simonh
Transitioning to a new node can be tricky, so in the past they have been very
conservative in their changes to the design when doing so. Better to do the
transition with as few new potential surprises as possible.

This is also true for Qualcomm so I’d expect to see similar, mainly process
node driven changes in their new chips as well.

~~~
jkestner
Aka Intel's Tick-Tock cadence of releases.

