
Intel’s 11th Gen Core Tiger Lake SoC Detailed - rbanffy
https://www.anandtech.com/show/15971/intels-11th-gen-core-tiger-lake-soc-detailed-superfin-willow-cove-and-xelp/2
======
xyst
I won't be buying Intel CPUs for years to come. Haven't upgraded since 2016.
Years of "incremental" improvements while still advertising $1000+ CPU prices
have deterred me for quite some time. Intel fanboys might eat it up, though.

Given their history of significant delays in the current roadmap, how does
this compare to their competition, which is quickly advancing on 10nm?

Also what is this internal naming schema?

> The 14nm process node has been Intel’s most profitable manufacturing node to
> date, and continuous intranode enhancements over the years (14+, 14++,
> 14+++, 14++++*)

This reads like a satire article.

~~~
013a
Intel's CPUs today are _still_ king when it comes to single-thread workloads,
like gaming. The main difference in the past two years has been that AMD has
taken the lead in multi-thread performance, and that's scared Intel so much
that they've significantly improved multi-thread performance and price across
the board.

The difference is quite stark, in both directions [1]: AMD destroys Intel in
anything that benefits from multiple cores at the same product tier (Ryzen 9
3900X vs. i9-10900K), in some workloads by as much as 50%. But Intel wins when
gaming, usually by around 5-10%.

The more measured position is: if you're just looking for the best value brand
right now, AMD is the way to go, even if you are gaming. The 10900K _is
better_ than the 3900X for gaming, but it's also a ~$120 up-sell: at similar
_pricing_ tiers, the single-thread performance is likely quite similar. But if
you just want the best at any price, that's still Intel: more cores won't save
every workload.

[1] https://www.pcworld.com/article/3543993/intel-10th-gen-review-core-i9-10900k.html
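
The "more cores won't save every workload" point is essentially Amdahl's law. A minimal illustrative sketch (the serial fractions below are made-up example numbers, not taken from the benchmarks above):

```python
# Amdahl's law: the speedup of a workload on n cores is capped by its
# serial fraction s:  speedup(n) = 1 / (s + (1 - s) / n).

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on parallel speedup for a given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Hypothetical workloads: a well-parallelized encode (5% serial) scales
# nicely; a mostly-serial game loop (60% serial) barely benefits.
for serial in (0.05, 0.60):
    print(f"serial={serial:.0%}: 10 cores -> {amdahl_speedup(serial, 10):.2f}x")
```

With a 60% serial fraction, even infinite cores cap out below 1.7x, which is why single-thread speed still decides gaming benchmarks.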

~~~
kev009
By the time you subtract all the sub-prime performance from their
vulnerabilities, this isn't actually true on properly patched operating
systems. They took a sub-prime loan on validation, sacrificing security in the
process, and the result is chips marketed and sold as something they are not.
There really should be a government-level lawsuit and restructuring at Intel
right now.

~~~
Filligree
It's worse, isn't it?

From what I've heard, Intel likes to run their benchmarks with many (most?)
mitigations disabled... but Windows ships with all of them enabled.
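
For what it's worth, on Linux you can see which mitigations your kernel actually has in effect. A minimal sketch (assumes the standard sysfs interface; it simply returns nothing on other OSes):

```python
# Linux-specific sketch: list which CPU-vulnerability mitigations the
# running kernel reports. On other OSes the sysfs directory does not
# exist, so we return an empty mapping instead of failing.
from pathlib import Path

def mitigation_report() -> dict:
    """Map each vulnerability name to the kernel's reported mitigation status."""
    vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
    if not vuln_dir.is_dir():
        return {}  # not Linux, or a pre-mitigation-era kernel
    return {p.name: p.read_text().strip() for p in sorted(vuln_dir.iterdir())}

for name, status in mitigation_report().items():
    print(f"{name}: {status}")
```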

------
wolfgke
Analysis by SemiAccurate on SuperFin:
https://semiaccurate.com/2020/08/13/intel-talks-about-10nm-superfin-and-packaging/

------
x87678r
I have an Intel i5 2500 that I paid $209 for in April 2011. I'm looking at new
i9 desktop chips and they're like 50% faster. 10 years? WTF? The laptop chips
are barely faster.

https://cpu.userbenchmark.com/Compare/Intel-Core-i9-9900-vs-Intel-Core-i5-2500/m816115vsm517

~~~
hmexx
Things have indeed slowed down, but you're cherry-picking numbers: +50%
single-thread but +200% multi-thread!

~~~
x87678r
Agreed, but I don't think I do much that would keep more than 4 cores maxed
out.

------
tyingq
Aside from the pressure from AMD, I think there's another force at play here.

There are quite a lot of people now that just use a web browser on their
desktop or laptop. For them, a 5 year old (or older?) CPU is good enough that
they don't know they are missing anything. Especially if they have enough
memory and an SSD.

~~~
dahfizz
> There are quite a lot of people now that just use a web browser on their
> desktop or laptop.

As opposed to the people who used to use their computer for email and
spreadsheets?

I would argue that the resource needs of a web browser have absolutely been
increasing over time. The 4GB of RAM in that old laptop just does not work for
the modern web. And considering how many web sites now require the client to
run tons and tons of JS just to work, I can see a need to refresh your CPU
regularly.

~~~
stagger87
4GB of RAM on a machine (regardless of OS) is absolutely fine for browsing the
"modern web".

~~~
mucinoab
Windows alone uses like 2GB; open some tabs and you are already out of RAM. I
think 8 is the new minimum.

~~~
stagger87
I have 30 tabs open in Chrome, real tabs, on YouTube, PDFs, and code/API docs,
and I just cycled through them all to make sure they are loaded, and Chrome is
at 600MB.

The phrase "open some tabs and you are already out of RAM" is a bit of an
exaggeration.
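
As an aside, numbers like these can be reproduced on Linux by totalling resident set size per process name. A rough sketch (Linux-only, reads /proc, totals nothing elsewhere; 'chrome' is just an example name):

```python
# Rough Linux-only sketch: sum the resident set size (VmRSS) of every
# process whose name matches. On systems without /proc the glob yields
# nothing and the total is simply 0.
from pathlib import Path

def total_rss_mb(process_name: str) -> float:
    """Sum resident memory (MB) of all processes whose name contains process_name."""
    total_kb = 0
    for status_file in Path("/proc").glob("[0-9]*/status"):
        try:
            lines = status_file.read_text().splitlines()
        except OSError:
            continue  # process exited while we were scanning
        # First line of /proc/PID/status is "Name:\t<comm>"
        if lines and process_name.lower() in lines[0].lower():
            for line in lines:
                if line.startswith("VmRSS:"):
                    total_kb += int(line.split()[1])  # value is in kB
    return total_kb / 1024

print(f"chrome processes: {total_rss_mb('chrome'):.0f} MB resident")
```

Note that RSS double-counts shared pages across a multi-process browser, so this overstates the true footprint somewhat.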

------
Wandfarbe
When I read comments here, I wonder how and why people are either for or
against Intel.

If I need to upgrade, I'll check the current situation and buy the best CPU
for my needs and price.

If that means I need tons of cores because of my workload, I might choose a
Threadripper. Do I need to compile often, and I know a specific Intel CPU will
save me a ton of time waiting but costs more? Fine.

Do I wanna play games? Let's see what I can get in the ~$150 range.

/shrug

~~~
derefr
When you're a business, you can't make turn-on-a-dime decisions like that. You
commit to business-equipment purchase/leasing/upgrade agreements with a
particular OEM, who in turn usually commits (for economy-of-scale reasons) to
one CPU supplier or the other. So if you're stuck with an OEM who's in turn
stuck with "the wrong" CPU supplier (right now Intel), that can make you cross
about the potential performance being left on the table.

But even for the individual, unless you're buying sealed non-upgradable
appliance devices, you've still gotta consider the fact that in building a PC,
you're choosing a _motherboard socket_, and thus _potentially_ making it
cheap to upgrade to later-gen CPUs that stay compatible with that socket.

In that mindset, it makes sense to be happy with the CPU maker you're "stuck
with" for a while when they provide good, high-ROI upgrade options; and to be
angry with them when they don't.

~~~
DoofusOfDeath
>in building a PC, you're choosing a motherboard socket, and thus potentially
making it cheap to upgrade to later-gen CPUs that stay compatible with that
socket.

I'm curious how common it is for people to upgrade just the CPU or
motherboard.

I tend to upgrade / replace my personal PC's every 4-6 years. At that pace,
it's always seemed worthwhile to upgrade both motherboard and CPU at the same
time, because of platform improvements. But maybe I'm an outlier.

~~~
derefr
One oft-stated reasoning is that people who currently don't have much money
(e.g. college students) can build a PC with a motherboard socket targeted by
both low-end and high-end CPUs; and then select, for now, the (cheap) low-end
CPU. Then, a few years later, when they're working and have more money, they
can replace it with the high-end version.

I'm not sure how common this actually is in practice, but it seems logical.

~~~
leetcrew
it's great if you decide to upgrade your CPU a couple years later and your
motherboard is still compatible, but imo this is not a good reason to choose a
particular platform in the first place. you never know when the CPU
manufacturer is going to drop support for that chipset. even if the new CPU
does support the old chipset, you might be sacrificing some new features or
leaving some performance on the table. plus if I'm going to buy a high-end
cpu, I'd like to pair it with a high quality power delivery also, which you
probably aren't going to find in a college student's budget build.

~~~
Symmetry
AMD is pretty good about sticking with motherboard sockets and maintaining
firmware compatibility. The motherboard I got with my 1st generation Ryzen is
handling my 3rd generation chip just fine, though admittedly I'm missing out
on PCIe 4.

------
Wowfunhappy
Does anyone know where Intel's codenames come from? They're... supremely
weird.

"Ivy Bridge" and "Skylake" evoke images of fantastical locations—chasms
traversable only by green bridges woven of ivy, or serene bodies of water that
float among the clouds. "Willow Cove" is similarly suggestive, if somewhat
more mundane.

But, Tiger Lake? Coffee Lake? SuperFin? Is there a formula here?

~~~
culturestate
_> "We have a formal process for what types of things are lakes vs. bays vs.
peaks," Tripp told us. However, he didn't elaborate on what that process was.
"Different components are named after different geographic areas."_

https://www.tomshardware.com/news/decoding-intel-code-names

~~~
johnsoft
Just about every product since 2015 has been a lake. I'd guess "lake" is the
code word for "just overclock last year's chip and call it a day"

~~~
wtallis
The CPUs have been lakes. Their memory and storage division uses -stream,
-pass and -dale. Server chipsets are -burg, Ethernet controllers are -ville.
All of these suffixes are still in active use for new or upcoming products.

------
nimish
Intel advertised a bunch of fancy transistors and process improvements well
after it was clear 10nm as a whole was a bust (COAG, Single Diffusion Break,
probably more).

Color me skeptical.

~~~
Symmetry
Charlie, the arch Intel skeptic, was convinced and if he thinks they managed
to fix their problems with 10nm I'm willing to give them the benefit of the
doubt.

https://semiaccurate.com/2020/08/13/intel-talks-about-10nm-superfin-and-packaging/

~~~
nimish
I'll believe it when I can finally buy 10nm at retail in volume.

------
MangoCoffee
https://asia.nikkei.com/Business/China-tech/China-hires-over-100-TSMC-engineers-in-push-for-chip-leadership

"As a company, TSMC competes to our fullest within the law, but we do not
slander our competitors and we respect the intellectual property rights of
others. Similarly, we expect our suppliers and other companies to respect
TSMC's intellectual property rights and will take appropriate protective
actions."

I remember a Taiwanese news report saying TSMC asked their suppliers not to
sell TSMC's custom machines to others.

A while back on Hacker News, someone said TSMC had no choice but to build a
fab in the U.S. because TSMC needs suppliers like Applied Materials. But it
doesn't seem that way: everyone can have the same suppliers, yet Intel and
Samsung are struggling while TSMC charges ahead.

https://wccftech.com/samsung-lost-5nm-snapdragon-x60-snapdragon-875-orders-to-tsmc/

------
dbcooper
Is Intel deliberately making its branding and specs as confusing as possible
now?

~~~
zrm
What do you mean "now"?

------
a012
So this CPU is about to compete with AMD Ryzen 4000 laptop series?

~~~
rwmj
It will be competing against the AMD Ryzen 4000, and most likely losing that
competition.

~~~
bryanlarsen
I'd take the other side of that bet. According to the article and other
related ones, both the Xe graphics architecture and the 10++ nodes are
substantial bumps.

But it'll be a close race, which is a piece of awesomeness we've never had in
laptops before.

The Ryzen 4000 is of 2019 vintage. Slightly losing against Intel's 2021 design
is no shame. Comparing Tiger Lake against the Ryzen 5000 would be a fairer
comparison.

And then come the questions of volume -- if Intel can't manufacture in volume
they can win on paper and lose on the ground.

~~~
AgloeDreams
AMD has little stopping them from moving to 5nm or 6nm TSMC next year,
possibly sooner than Intel can ship this. AMD could do a small run to win on
both paper and ground.

The real factor here is the third competitor, Apple, walking in with what is
expected to be a 5nm 'no cost limits' CPU that they will ship in every machine
based on the A14 architecture. Because they own the design and just have TSMC
do the fabrication, it's likely that Apple will intentionally put what would
be a $500-$1000+ monster core-count CPU spec in every base product, stealing
the war and possibly deflating the market for the victors here.

~~~
ben-schaaf
I don't see that happening. Apple needs to make money, and they do so by
having huge margins. They can certainly increase the CPU budget by taking out
the Intel cut (though really they've just swapped that for the TSMC cut), but
they're not going to chuck a $1000+ CPU in a ~$1200 laptop.

This may be my lack of knowledge in CPU manufacturing, but from what I
understand larger dies get you more performance, not more efficiency. So
considering their inability to build adequate cooling solutions and fondness
for reducing thickness, even if they chucked a 32-core behemoth in there it
wouldn't be running very fast.

On top of that, since AMD are using the same nodes, I don't really see how
they wouldn't be able to compete. Especially considering Apple probably won't
be selling their CPUs separately, leaving out large parts of the market.

~~~
AgloeDreams
So by mega-core-count I mean more than 10, and I'm including their big.LITTLE
architecture (so maybe 6 high-performance plus 8 efficiency cores; remember
that Apple can use all cores grouped however it likes, asymmetrically).

The chip design and margin advantages here are massive. Apple sells phones at
$399 with the A13 (7nm) that outperforms (by any metric we currently have)
some of Intel's >$300 notebook processors. Building on iPhone cores as a base
means your margins are subsidized by one of the largest-selling products and
most massive R&D budgets in the world. Then you have the efficiency of all of
this: sure, larger dies won't get better efficiency, but Apple's chips use
massively modular power operations. They can bring cores online and offline
dynamically, and you can chop the GPU, core by core, when you don't need it.
They could (important: _could_) ship chips that use 4 smartphone-power-level
cores in normal operation but can flash on 64-core performance for a second
and lose it just as fast. That's how the iPhone's A13 works right now.
Notebook (unsustained) performance in a smartphone with no fan. It's why the
iPad Pro outperforms the 13-inch MacBook Pro at video rendering. It's the
advantage of ARM chip design in 2020, and I imagine they will play to every
last one of their strengths.

All the real money is in pre-built systems. AMD sells a ton of units, but
nothing like Qualcomm, or even Intel with their notebook stronghold. I'm sure
there's space, just that I believe the market will decrease in size, and if
Apple is over there frying everyone, it's gonna be hard to advertise your hard
work beating Intel. I'm not saying all of this will happen, or even will
happen next year; it's just that the forces here are somewhat massive
advantages.

------
Ijumfs
Can I get this particular Intel rig without the Intel ME backdoors courtesy of
the Talpiot program?

No? OK then, I'll stick with Ryzen.

~~~
IntelMiner
...Which itself has similar "backdoors"

~~~
Ijumfs
Not according to security researchers.

~~~
akvadrako
This is nonsense. AMD has its equivalent of the ME, and it's under a lot less
scrutiny. If it didn't have "backdoors" it would be a miracle.

------
endarkenment
Sweet title.

