
New variety in the chip market, and trouble for Intel (2017) - tosh
https://www.economist.com/business/2017/02/25/the-rise-of-artificial-intelligence-is-creating-new-variety-in-the-chip-market-and-trouble-for-intel
======
okket
(2017), see also previous discussion from a year ago:

[https://news.ycombinator.com/item?id=13736763](https://news.ycombinator.com/item?id=13736763)
(55 comments)

------
cm2187
It's interesting that AMD isn't even mentioned in the article.

~~~
wtallis
The article was published a week before AMD's Ryzen processors started
shipping. At that point, it was uncertain whether AMD would be able to
regain relevance or pose any credible threat to Intel.

~~~
IronBacon
Nice catch, I didn't notice the date. It's also strange that cryptocurrencies
aren't mentioned in the article at all; maybe it's simply a lack of research
by the journalist.

If you look around the hardware forums, the blame for current GPU prices
falls on strong demand from cryptocurrency mining operations, but yeah, that
could also be hearsay repeating itself.

~~~
distances
The article predates the Bitcoin spike at the end of 2017. Furthermore,
aren't cryptocurrencies more of a niche hobby than an actual growing field in
the industry?

~~~
zrobotics
They're a growing niche right now, and are impacting GPU availability quite
strongly, but AMD/Nvidia are both quite rightly worried. If Ethereum drops, or
someone cracks ASIC mining (nobody mines Bitcoin with GPUs anymore, but some
currencies were designed with ASIC-resistant algorithms; see the toy sketch
below), then demand could plummet overnight and literal tons of cards would
show up used. Sure, ML has been a growing field, and I don't see that changing
overnight, but basing a long-term business and billion-dollar fabs on crypto
would be insanity.

Although I wouldn't call mining a niche hobby; there are some fairly sizeable
commercial operators.
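
To make "ASIC-resistant" concrete: the usual trick is a memory-hard hash,
where each digest depends on pseudo-random reads into a large buffer, so raw
hashing silicon gains little without matching DRAM bandwidth. A minimal
Python sketch of the idea - purely illustrative, not any real coin's
algorithm:

    import hashlib

    def memory_hard_pow(header: bytes, nonce: int, mem_mib: int = 8) -> bytes:
        """Toy memory-hard hash in the spirit of scrypt/Ethash (illustrative only)."""
        n_slots = (mem_mib * 1024 * 1024) // 32  # buffer of 32-byte slots
        # Phase 1: fill the buffer with a sequential hash chain.
        buf = [hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()]
        for _ in range(n_slots - 1):
            buf.append(hashlib.sha256(buf[-1]).digest())
        # Phase 2: pseudo-random reads keep the whole buffer resident,
        # so fast hashing alone (an ASIC's strength) is not enough.
        mix = buf[-1]
        for _ in range(64):
            idx = int.from_bytes(mix[:8], "big") % n_slots
            mix = hashlib.sha256(mix + buf[idx]).digest()
        return mix

    # A miner searches for a nonce whose digest falls below a difficulty target.
    digest = memory_hard_pow(b"block-header", nonce=42)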

~~~
whatsstolat
Just had a great idea for NVidia:

Pay-per-usage graphics cards.

The card is cheaper, but you pay a monthly fee based on usage.

It captures extra revenue from the crypto miners, and it ensures a revenue
base if cards become cheap again.

Just because it's physical doesn't mean you can't go the way of Azure, AWS, etc.
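
Back-of-the-envelope, the pricing could look something like this (Python;
every number here is made up for illustration):

    def monthly_bill(hours_used: float,
                     base_fee: float = 15.0,       # hypothetical flat subscription
                     rate_per_hour: float = 0.10,  # hypothetical metered rate
                     cap: float = 80.0) -> float:
        """Toy pay-per-use GPU pricing: flat base plus metered usage, capped."""
        return min(base_fee + hours_used * rate_per_hour, cap)

    print(monthly_bill(40))   # casual gamer: $19.00
    print(monthly_bill(720))  # 24/7 mining rig: hits the $80.00 cap

The cap matters: without one, a rig mining around the clock would be better
off buying the card outright.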

------
Areading314
Yes, they will have to make do with their huge profit margins, huge free cash
flow, dominant market position, and strong balance sheet. I hope they'll be
OK!

~~~
ChuckMcM
It is not a good article, for a couple of reasons, and disappointing for the
Economist. While Intel is being challenged both in the mainstream (by ARM) and
by applications like AI, the driver of Nvidia's profits for the last few years
has been cryptocurrency mining.

While AI is great and all, it doesn't consume chips in the sheer numbers the
blockchain craze does. If history is any guide, once the blockchain thing
burns out or blows over, the number of GPUs available on the aftermarket is
going to be pretty incredible.

~~~
MarkMMullin
I agree on the assessment of the article, however not with the conclusion that
blockchain is driving this market alone - its certainly there, but I don't
know any ML practitioner who answers 'how many gpu's' with anything other than
'how many can I have/afford ?' We're starting to get more and more projects
reporting their processing time in GPU/years - the singularity is bull, but ML
ain't going away:-) That said, if people start dumping their mining rigs, I'm
on ebay looking for them :-D
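
For a sense of scale, a GPU-year is just device count times wall-clock time
(trivial arithmetic; the run below is hypothetical):

    def gpu_years(num_gpus: int, days: float) -> float:
        """Convert a training run into GPU-years of compute."""
        return num_gpus * days / 365.0

    print(gpu_years(256, 30))  # one month on 256 GPUs ~= 21 GPU-years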

~~~
Mirioron
I always figured that the singularity idea might work in theory, but in
practice it would likely take an ungodly amount of time.

------
voltagex_
I don't really understand what these new chips accelerate - are they useful
for anything else once the current machine learning fad wears off?

~~~
IshKebab
ML actually works for real applications like speech recognition, face
recognition, image captioning, etc. While it might be over-hyped, it's not a
fad that will "wear off". Google aren't going to stop using deep learning for
speech recognition and go back to the old hidden Markov model techniques that
barely worked.

~~~
voltagex_
Thanks - that makes sense. I guess it's best phrased the other way around:
these chips are more useful than, say, a physics accelerator or a
cryptocurrency mining ASIC.

------
zenovision
Intel is too bureaucratic, slow, and margin-oriented for the modern world
(just like IBM). They tried to produce 10nm processors using an outdated
technology, and even now they are struggling to make it work. Competitors, on
the other hand, are using the newer and more expensive EUV technology, because
they don't need super-huge margins. TSMC is already working on its 5nm
technology while Intel is still trying to fix its 10nm...

~~~
fermienrico
10nm isn't the same across the industry; there is no standard for defining a
process node. Please watch Mark Bohr's presentation on YouTube. The way you
make your claims is kind of off-putting: sheer confidence without a proper
understanding of the fundamentals.

~~~
kingosticks
I think you've missed the point. Strip out the product names (which is all
those node numbers are, as you say) and the comment is still entirely
correct: they are struggling with their outdated manufacturing method and are
now losing ground.

~~~
icegreentea2
Intel is struggling with their manufacturing method, but they aren't wedded
to it. Everyone is struggling with EUV.

ASML - the guys who make the lithography machines that all the fab houses
use - said that they shipped only 10 EUV machines in 2017. Intel's public
statement on EUV is that they will use it when it's ready.

TSMC and GF both manufacture their newest large-scale production nodes using
the same basic technique as Intel (hilarious numbers of exposures and layers
at 193nm; see the sketch below for why).
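
To see why the exposure count balloons: the Rayleigh criterion,
CD = k1 * lambda / NA, puts a floor on the feature size a single exposure can
print. A rough Python sketch with illustrative k1/NA values:

    # Rayleigh criterion: minimum printable feature ~ k1 * wavelength / NA
    def min_feature_nm(wavelength_nm: float, na: float, k1: float = 0.38) -> float:
        return k1 * wavelength_nm / na

    print(min_feature_nm(193.0, 1.35))  # 193nm immersion: ~54nm, finer pitches need multi-patterning
    print(min_feature_nm(13.5, 0.33))   # EUV: ~16nm in a single exposure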

Everyone except Intel has said that they only plan to start rolling out
EUV-built chips in the latter half of this year into next year.

No manufacturer has gained ground with EUV in their products yet.

~~~
kingosticks
Which sounds like 'everyone except Intel' has gained ground. You don't plan to
go into production with something next year if you have not made it work.

~~~
icegreentea2
I mean, you can announce your plans all you want. Intel did this with their
10nm stuff and now has significant egg on their face for their inability to
deliver. Maybe Intel is just playing a tighter PR game given their recent
mess-up.

But really, EUV is a whole new ball game, and it's entirely possible for
Intel to lose whatever is left of their traditional advantages in the
transition.

I just don't think it's wise to read too much into everyone's PR and
marketing at this time.

