
Intel 2020s Process Technology Roadmap: 10nm+++, 3nm, 2nm, and 1.4nm for 2029 - rbanffy
https://fuse.wikichip.org/news/3127/intel-2020s-process-technology-roadmap-10nm-3nm-2nm-and-1-4nm-for-2029/
======
merricksb
Other discussion:
[https://news.ycombinator.com/item?id=21770622](https://news.ycombinator.com/item?id=21770622)

------
jonplackett
Meanwhile, in reality, Intel re-releases 22nm chips.

[https://www.tomshardware.com/news/intel-resuscitates-22nm-ha...](https://www.tomshardware.com/news/intel-resuscitates-22nm-haswell-pentium-processor)

~~~
agumonkey
Revive your brand with wild slides while selling high-profit old products.
Interesting strategy.

------
steve19
"WikiChip is happy to see Intel remains committed to furthering Moore’s Law
for the foreseeable future with an ambitious roadmap."

What is WikiChip? Some sort of Intel mouthpiece? How can anyone believe an
Intel roadmap at this point? At best it's fiction, at worst securities fraud
(by the Matt Levine definition).

~~~
quaquaqua1
Literally. What even is 10nm+++?

The only thing that matters anyway is performance in a suite of benchmarks.

~~~
RealStickman
Don't forget power draw.

~~~
steve19
With Intel it is always important to ask whether the power draw includes or
excludes the fridge hiding under the counter*

* [https://www.tomshardware.com/news/intel-28-core-cpu-5ghz,372...](https://www.tomshardware.com/news/intel-28-core-cpu-5ghz,37244.html)

------
wruza
14, 10, 7, etc. do not matter anymore [1], because there is no metric that
corresponds to those numbers directly. It’s all marketing nanometers. I’m
all Intel, but how does one estimate BS like “1.4nm for 2029”? By dividing
14nm by the years passed since 2019?

[1] [https://en.wikichip.org/wiki/10_nm_lithography_process](https://en.wikichip.org/wiki/10_nm_lithography_process)

------
yedpodtrzitko
"Did I ever tell you the definition of insanity?"

------
jacek
Isn't that just marketing and wishful thinking? Not that long ago they were
"predicting" a 10nm process for 2016 [1].

[1] [https://mr-uploads.s3.amazonaws.com/uploads/2016/02/Series-6...](https://mr-uploads.s3.amazonaws.com/uploads/2016/02/Series-65-A4.png)

------
coleifer
Thank you, based-AMD.

------
Merrill
Once you can process high-definition video, there is little need for further
performance in mass-market processors. Progress thereafter would mainly be in
lowering power consumption, reducing cost, and integrating more functions
on-chip to lower parts counts.

Therefore, the business drivers must be mainly server applications in the data
center.

~~~
jobigoud
> Once you can process high-definition video there is little need for further
> performance in mass-market processors.

In the grand scheme of things, high-definition video is an extremely low bar
for consuming and archiving content. It's totally flat, distorted, has a fixed
viewpoint, a tiny field of view, super low dynamic range, etc. It doesn't
remotely match what you would see if you were there. It's closer to looking at
a photograph than to being there. We should strive to do better than that.
Decoding high-def video is mainly done on the GPU nowadays anyway.

~~~
Merrill
Current video is a pretty good match for the needs of advertisers, who are the
major source of funding for video. There is relatively little market for video
that is not either free or heavily subsidized by advertisers.

TV manufacturers are counting on 8K, OLED, and "smart TV" functions to keep
volumes flat by encouraging replacements of what is a durable consumer good.
LCD TVs seem to be quite long-lived.

~~~
PaulHoule
Production costs go up a lot when you increase the resolution because then you
can see all the details they don't want you to see.

