
Nvidia deeply unhappy with TSMC, claims 22nm essentially worthless - evo_9
http://www.extremetech.com/computing/123529-nvidia-deeply-unhappy-with-tsmc-claims-22nm-essentially-worthless
======
ak217
Very interesting. Those really are pretty stunning words for a business
relationship as big as this. If I understand correctly, Nvidia has nowhere
else to go since TSMC is so far ahead of other independent foundries (I'm not
sure how independent GF really is - is it conceivable for Nvidia to work with
GF?) and it can't very well go to Intel given the amount of competition they
are engaged in. Maybe Samsung?

Also, I'm not clear on the arguments presented in the slides. The transistor
normalized price curves for 20nm and 14nm do eventually cross over the
preceding curves, unlike the 40nm curve. And even putting the same transistor
count on a smaller process node can result in lower power consumption once the
leakage problems are mitigated, so the value is increased.

Edit: For historical background, here's a fascinating conversation from 5
years ago between Morris Chang (founder and chairman, TSMC) and Jen-Hsun Huang
(CEO, Nvidia):

<http://www.youtube.com/watch?v=u-x7PdnvCyI>

~~~
ajross
_it can't very well go to Intel given the amount of competition they are
engaged in_

That's not the reason. Intel's fabs are at capacity producing chips with much
higher margins than GPUs. Check the price per die area on a GeForce card vs. the
CPU that goes on the same motherboard, and then remember that the video card
is assembled and stuffed (with a ton of DRAM) and board tested, whereas the CPU
just has to be packaged and shipped. The economics simply wouldn't allow it
even if Intel wanted to play.

~~~
drewcrawford
> remember that the video card is assembled and stuffed (with a ton of DRAM)
> and board tested where the CPU just has to be packaged and shipped

I used to design CPU tests for a major x86 CPU manufacturer. 100% of
manufactured chips are tested, sometimes for many hours each.

~~~
ajross
Maybe that was unclear. Obviously all ICs are tested. But the integrated board
requires an extra step that the packaged CPU does not.

------
ajross
I'm sure there's a lot of interesting dirt here. But some of the analysis is a
little weird. The cost-per-transistor metric isn't the right one, for a start.
Imagine a world where you could get all the 1975-era 5v 74HCxx chips you
wanted for free, with zero manufacturing cost. Would you choose to build a
smartphone out of them? Of course not. The newer transistors are better
(faster, lower power) and the products built out of them are more valuable.

So yes, production costs have increased. And that may have (is having, I
guess) effects on the speed at which new technologies are adopted (i.e. the
crossover into "worth it" is delayed).

But isn't that just a way of saying that semiconductors are finally becoming a
mature technology? That's not really such a shock, nor does it justify the
poo-flinging at TSMC.

So the real question in my mind is whether TSMC is having problems that the
other foundries (Samsung or Global Foundries, say) are not. Given that Intel
has been sampling 22nm parts already, it seems like the real news here is that
TSMC sucks and the bit about production costs is just ammunition.

~~~
wmf
Thanks to the end of Dennard scaling, newer transistors are only marginally
better. If NVidia doesn't care about those marginal improvements in
performance and power, it seems legitimate for them to complain about price.
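
For reference, here's the textbook Dennard-scaling arithmetic that made
shrinks an automatic win (the node factor below is illustrative, not a
measured number):

    # Classical Dennard scaling: shrink linear dimensions and supply
    # voltage by 1/k; per-transistor power then falls by 1/k^2 exactly
    # as density rises by k^2, so total chip power stays flat.
    k = 1.4  # roughly one full node shrink (e.g. 40nm -> 28nm)

    capacitance = 1 / k  # C scales with feature size
    voltage = 1 / k      # V used to scale with feature size too
    frequency = k        # gate delay drops, so clocks rise
    power = capacitance * voltage ** 2 * frequency  # P ~ C * V^2 * f
    density = k ** 2

    print("per-transistor power: x%.2f" % power)             # ~0.51
    print("density:              x%.2f" % density)           # ~1.96
    print("chip power:           x%.2f" % (power * density)) # ~1.00

    # Post-Dennard, voltage barely scales (leakage), so a shrink no
    # longer buys the old automatic power/performance win.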

If an entire industry's financial projections are based on exponential
improvement then becoming a mature technology is apocalyptic, especially if it
happens earlier than predicted.

------
pvarangot
Hmm, the other day I was talking about exactly this with a coworker: that
transistor cost is not actually getting cheaper once you factor in R&D and the
increasingly complicated manufacturing process.

I think Intel will be able to bundle a "good enough for playing AAA games in
1080p with maxed out graphics" video processor into the main CPU in about
one or two years, maybe less. Those CPUs will be more expensive than the tinier
(in transistor count) ones. This will happen around when they hit 14nm. NVIDIA
will go out of business or become a niche company dedicated to vector
processing for scientific computation or heavy workstations, and maybe a small
player in the ARM market with their Tegra architecture. Maybe AMD will go the
same road, making something similar to Intel but using their ATI technology.

Bonus 1: too bad, no more Singularity mumbo-jumbo in less than two years.

Bonus 2: I am one of the few that believes ARM will never be relevant in the
desktop/laptop/console/gaming segment, because of what I think Intel will come
up with.

~~~
InclinedPlane
I don't buy it. The transistor counts of modern GPUs far exceed those of even
multi-core CPUs. It's not rational to say "oh, well, that's all just trivial
engineering, it'll go away". Especially as GPU performance continues to be
taxed to a far higher degree than the CPU typically is. Modern popular AAA PC
games like Battlefield 3 or Starcraft 2 will eat all of the GPU you throw at
them and then some.

And then you see what game devs are working on for the future such as this:
<http://www.youtube.com/watch?v=Gf26ZhHz6uM> or this:
<http://www.youtube.com/watch?v=Q4MCqM6Jq_0> or this:
<http://www.youtube.com/watch?v=EhwZ7Sb0PHA> and the notion of "good enough"
quickly flies out the window.

~~~
pvarangot
Hmmm, you are right... maybe I shouldn't have said "maxed out" but "mostly
acceptable". I think that "maxed out" will be a niche market, especially since
the computers able to really max out games are _already_ niche, and PCs in
general are losing market share. I'm not implying it's a trivial engineering
problem, just a problem Intel will be able to solve in one or two years,
delivering processors for a small box you plug into your TV so you can happily
game without investing money and real estate in a PC.

~~~
InclinedPlane
Niche? <http://media.pcgamer.com/files/2012/03/PCGA-2010.jpg>

~~~
sciurus
Many people play games on computers, true. I think pvarangot's contention is
that most of them don't need high-end graphics cards to do so.

~~~
InclinedPlane
I should have mentioned, that chart is for sales in 2011. Some of that is low-
end graphics gaming, but people aren't spending $18.6 billion on Bejeweled
(maybe on Farmville though). If you take just the segment of the market that is
"high-end graphics gaming" then it's still a multi-billion-dollar-a-year
industry. Why would a self-sustaining industry of that size just spontaneously
go away?

------
AceJohnny2
This presentation is more evidence that Moore's Law is dead. Nvidia and AMD
can blame their partners, but the reality is that they're hitting increasingly
costly technological hurdles.

Of course Nvidia can't just throw its hands up and say "welp, we've hit the
wall" when it has someone to blame.

It's fascinating to watch those predictions about the end of Moore's Law come
true.

~~~
DarkShikari
Intel seems pretty confident that they will be able to get to 11nm in the next
5 years or so. Past that, shrinking transistors by purely "conventional"
methods will mostly hit a wall, but everyone saw that coming.

There's been a lot of work on nanoelectronics. For example, some scientists
recently announced at a semiconductor conference that they had found a way to
bypass the diffraction limit completely, making 2nm features with a 650nm
laser. Similarly, I know an engineer who worked on a massively-parallel
electron-beam-based etching device, which could also be applied similarly at
very small node sizes. There are a billion other technical challenges, but
there's been a lot of progress in making features at the molecular level.

If anything, this is better than expected: EUV has had some serious problems
over the past few years, yet its delay hasn't stopped the next few nodes.

Moore's law doesn't require that shrinkage continue, though, as there are
other technologies available. Memristors in particular may allow scaling
_faster_ than Moore's law would otherwise predict.

~~~
jacquesm
And then there is the third dimension to consider:

<http://spectrum.ieee.org/semiconductors/devices/3d-chips-grow-up>

That comes with a very large set of engineering challenges but if those are
overcome you can expect another big jump.

~~~
marshray
You still have the problem of dumping the additional heat. So AFAICT, 3D will
only provide some incremental advancement.

~~~
ippisl
There's an intermediate step to 3D - 2.5D. That's when you lay all your dies
side by side on another chip (an interposer) instead of a PCB. It makes it
relatively easy to manage heat, and you can get much more bandwidth at much
lower power (much shorter wire distances). And it's already in use for
high-end FPGAs.

~~~
marshray
You mean a multi-chip module (MCM)?
<https://en.wikipedia.org/wiki/Multi-Chip_Module> Those have been around
forever.

Or something different?

~~~
jacquesm
He's talking about this:

<http://www.eetimes.com/electronics-blogs/other/4210170/Xilinx-multi-FPGA-provides-mega-boost-re-capacity--performance--and-power-efficiency->

~~~
marshray
Cool! Thanks.

I wonder if the price will ever come down to the point where it's used in
consumer devices.

------
ghshephard
Somewhat off topic, but does anybody know how to disable ExtremeTech's
craptastic mobile interface? I've got twice the resolution of my laptop on my
iPad, and the mobile site looks like it's been optimized for my circa-2002
Palm Treo 650.

~~~
mrschwabe
+1, can't stand how ExtremeTech's mobile site hijacks an otherwise fine rich
web experience. On Opera for iPad (which is great for browsing HN stories btw,
because of its awesome tabs), ExtremeTech defaults to the mobile site and it's
just a blank white page.

------
katane
I feel like this is a tech-problem that will have a tech-solution. No idea how
that fits into an "International Trade Partner Conference".

That said, you can't outsource your complete manufacturing process only to
come back later and wish for "virtual IDM" partnerships. That's all of the
benefits and none of the risk.

~~~
ak217
Rhetoric aside, Nvidia is the designer that keeps breaking transistor count
records and is a dominant OEM supplier. That makes them both a technology and
business leader among TSMC's customers, so it makes sense for TSMC to devote
resources to them: the experience TSMC gains from working with Nvidia can't be
gained elsewhere and is applicable to its other customers down the road.

~~~
ajross
Pretty sure most of the (logic) transistor count records are held by FPGAs,
no?

~~~
pmjordan
Not at the clock speeds that Nvidia chips run at.

------
Maro
What do they mean by "rough justice" in this context?

~~~
wmf
Reading between the lines, it sounds like NVidia thinks that TSMC is taking
all their profit and they'd like to get a fair share.

------
6ren
If cost/transistor continues to not improve, industry-wide, it spells the end
of Moore's Law, at least for mainstream applications where _cost is a primary
factor_.

However... where size itself (and power consumption) are the primary factors,
there will be demand. Which means that SoC GPUs will adopt new process nodes
more vigorously than video card GPUs.
<http://www.embedded.com/electronics-news/4304118/Imagination-outstrips-all-other-GPU-IP-suppliers>
Sounds like disruption.

------
vtail
I'm a bit surprised by the wafer pricing chart. Sure, the individual wafer cost
goes up; but if you have a design with the same transistor count, the size of
your chip is going _down_, which means that you'll have _more_ chips per wafer
(assuming comparable yields), which means that individual chip cost will go
down.

(40/28)^2 ~ 2, so a 28nm wafer can cost nearly twice as much as a 40nm wafer
before the per-chip cost gets worse - assuming, again, comparable yields.
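
A quick back-of-the-envelope in Python (the die size and wafer prices below
are made-up placeholders, and it ignores edge loss and defect yield):

    import math

    WAFER_AREA = math.pi * (300 / 2.0) ** 2  # 300mm wafer, in mm^2

    die_40 = 400.0                       # hypothetical 40nm die, mm^2
    die_28 = die_40 / (40.0 / 28) ** 2   # same design shrunk: ~half the area

    price_40, price_28 = 5000.0, 9000.0  # hypothetical wafer prices, $

    for node, die, price in [("40nm", die_40, price_40),
                             ("28nm", die_28, price_28)]:
        chips = WAFER_AREA / die  # naive gross die per wafer
        print("%s: %.0f chips/wafer, $%.2f/chip" % (node, chips, price / chips))

    # Even with the 28nm wafer priced at 1.8x, the 28nm chip comes out
    # cheaper, so the complaint only holds if wafer prices rise faster
    # than the area shrinks.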

~~~
jrk
This is precisely the analysis given in the per-transistor pricing crossover
chart. That shows that the old rate of wafer price increase still allowed for
significant per-transistor cost advantage fairly quickly. The whole point is
that the rate of wafer price increase has _itself_ increased to the point
where these crossovers are barely happening anymore.
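
To make that concrete, here's a toy model of the crossover (every number is
invented purely for illustration): the new node packs 2x the transistors per
wafer but starts at a wafer-price premium that erodes as the process matures.

    OLD_WAFER, OLD_TRANSISTORS = 5000.0, 1e12  # baseline node (invented)
    old_cost = OLD_WAFER / OLD_TRANSISTORS     # $ per transistor

    def quarters_to_crossover(premium, erosion=0.05):
        """Quarters until the new node is cheaper per transistor."""
        for q in range(40):  # give up after 10 years
            new_cost = OLD_WAFER * premium / (2 * OLD_TRANSISTORS)
            if new_cost < old_cost:
                return q
            premium *= 1 - erosion  # premium erodes each quarter
        return None

    print(quarters_to_crossover(2.4))                # -> 4, about a year
    print(quarters_to_crossover(3.0, erosion=0.02))  # -> 21, over 5 years

The steeper the starting premium and the slower it erodes, the later the
crossover - which is the shape of the complaint in the slides.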

------
lien
Nvidia doesn't have a choice no matter how unhappy they are with TSMC. TSMC is
the biggest fab that every fabless manufacturer uses; every single fabless
semi has no choice but TSMC. If TSMC can't deliver on 22nm technology, no
other fab will be able to. 22nm is extremely difficult and most SKUs fail.
Hence Intel wins.

------
stewie2
Why do companies want to be design-only?

~~~
sparky
These days, it's not really up to a company whether it wants to be fabless or
an IDM. Apart from the several billion dollars needed to build a competitive
fab, you'll also need to contend with the 20+-year head start existing IDMs
and foundries have on you in terms of experience and expertise. As far as I
can tell, there are too many moving parts in a fab for a new company to simply
poach a few key people and be competitive quickly. I can't think of anyone
apart from Apple or a nation-state that has enough money to wait around while
their new division learns how to build chips.

That aside, assuming a fab better than TSMC's showed up on your doorstep
today, you're not out of the woods in terms of being dependent on suppliers.
On the contrary, designing fab tools is also a very capital-intensive business
and tends towards an oligopoly, so the TSMCs and Intels of the world are just
as dependent on tool vendors as NVIDIA is on TSMC. The same holds for the very
specialized materials and consumables fabs use, like the glass to make mask
sets.

~~~
stewie2
who is the best company that makes fab tools?

~~~
sparky
Most companies only make a small fraction of the tools you need for a full
fab; for examples of heavy hitters in key areas like lithography, chemical
mechanical polishing, and ion implant, see ASML, Nikon, Canon, Applied
Materials. For any given tool, there are usually 1 to 3 major vendors.

