
Intel to outsource 14nm chip production due to tight supply - skrause
https://www.digitimes.com/news/a20180910PD210.html
======
ksec
Outsourcing _chipset_ production to TSMC, not chips/CPUs. I would be surprised
if they did that. I wonder whether this was planned, because TSMC just doesn't
have spare 12nm capacity sitting there, at least not in the quantities Intel
requires. They tried moving those chipsets back to their own 22nm process and
it is obviously not working out well for them.

If not, it is another sign of Intel management failure.

------
lousken
Can't connect to this server due to its very old and horrible HTTPS
implementation.

alternative link [https://outline.com/RstMtX](https://outline.com/RstMtX)

~~~
theandrewbailey
Holy crap! Except for the certificate, this site seems to violate every
recommendation!

[https://www.ssllabs.com/ssltest/analyze.html?d=www.digitimes...](https://www.ssllabs.com/ssltest/analyze.html?d=www.digitimes.com)

~~~
jolopy
HTTP server signature: Microsoft-IIS/6.0. Poor admin of theirs; they must feel
like they're talking to a wall.

~~~
lousken
I emailed them about it; here's the response:

> Thanks for your email. We are aware of issues with our HTTPS and currently
> trying to upgrade our systems and the security certificates. Unfortunately,
> it will take us a while until all the testing is done, but hope to migrate
> to new environment by the end of this year.

------
dogma1138
TSMC doesn't have a 14nm process; its "16nm" process is a 20nm-class node and
considerably inferior to Intel's 14nm. The same goes for the 12nm, which is a
BEOL improvement of 16nm with no density gains.

The only process Intel might use if they outsource to TSMC is the
high-performance 7nm node once it's in full swing, and that is insanely
expensive and not yet available for mass production.

For non-CPUs it doesn't really matter, and I doubt the PCH, network chipsets,
and modems are even built on 14nm. But there is no TSMC process Intel could
use to tape out its CPUs, especially the high-core-count ones, while keeping
the frequencies it's used to, not even 7nm.

TBH this sounds more like some BS than a valid rumor.

------
mtgx
Wasn't it just a year ago when people were making big claims about Intel using
its "manufacturing edge" to compete against ARM and whatnot, and about them
making other companies' chips and stealing customers from TSMC and the others?

------
makecheck
I’m a little surprised that they’re continuing to push shrinking transistors
so hard. While it’s impressive, it introduces massive new problems to solve
across the entire design and manufacturing process, every time. There are
other ways to achieve their goals of reduced area, etc.

One example is to save area by making the chip _do less_, and an
_outstanding_ place to start would have been banishing all of this DRM crap
that no _customer_ has _ever_ asked for. It adds complexity, no doubt putting
so much crap on the chip that they need to save area by shrinking the whole
thing on an accelerated schedule.

~~~
mikhailt
If there are other ways, you should patent them, because they'll be worth
trillions.

They're not shrinking it just for the fun of it; they're advancing the
technology to accommodate the explosive growth in computing every year.

The major problem with that growth is energy consumption; we can't sustain it
without reducing it.

Shrinking the process node lets them double transistor density without
doubling the power requirements, and they were on a path of doing that every
few years, a.k.a. Moore's law.
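
That doubling cadence is just exponential arithmetic; a minimal sketch (the
clean doubling and the 2-year period are textbook assumptions, not Intel's
actual roadmap):

```python
# Back-of-the-envelope Moore's-law arithmetic. The 2-year doubling
# period is an illustrative assumption, not Intel's real cadence.
def density_scale(years, doubling_period=2.0):
    """Relative transistor density after `years`, assuming a clean
    doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Three doublings in six years: 8x the transistors in the same area,
# ideally at similar power if scaling held (it increasingly doesn't).
print(density_scale(6))   # 8.0
```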

Also, mobile devices are outpacing desktops, which means the market is
demanding lighter, more power-efficient devices.

It takes several years and tens of billions of dollars to develop a new fab,
so Intel has to predict and work on future technologies continuously as best
it can; it can't risk losing its business to other fabs that are doing the
same thing.

DRM is the least of the problems; the biggest part of Intel CPUs now is the
integrated graphics.

~~~
makecheck
I did give an example of another way: reducing the features of the chip. DRM
is just one case.

I am aware of the reasons for shrinking. I’m not saying they should _never_
pursue a new process but they’re attempting to achieve it on a timeline that
isn’t realistic.

Also, power-inefficient design is something that is hard for Intel to change
because too much depends on their architecture being the way it is.

So between not wanting to get rid of stupid features and not being able to
sell a backward-incompatible chip with power-efficient architecture changes,
they’re left trying to shrink before everybody else.

------
toong
I'm wondering how much of their roadmap is affected by all those Spectre
revelations. There must be a massive re-engineering effort going on.
Speculation mechanisms were mostly implemented in silicon, right?

~~~
gwbas1c
Doubtful; this problem has to do with transistor size, which has nothing to
do with speculative execution.

