
Nvidia Announces Titan Xp Graphics Card - hoov
https://techcrunch.com/2017/04/06/nvidias-new-titan-xp-top-end-graphics-card-also-offers-mac-support/
======
jsheard
Oh dear, what were they thinking with that name? First they released the
Maxwell-based _GTX Titan X_ , then replaced it with the Pascal-based _Nvidia
Titan X_ which nearly everyone called the _Titan XP_ to disambiguate the
confusingly similar names, and now Nvidia goes and uses that universally
accepted nickname as an actual product name for a different product.

~~~
nsxwolf
They should have gone with the great Apple/Nintendo naming convention of "The
New Titan X"

~~~
frankchn
I would have preferred the Microsoft naming convention of "Titan XP SP2"

~~~
mfukar
Surely it would be "Titan XB 1"

~~~
gambiting
No, just "Titan One"

------
hatsunearu
Damn, the 2016 Titan X was confusing because the 2015 Titan X was named the
same, so people nicknamed the 2016 one as Titan XP and they fucking went ahead
and one-upped the confusion by making an actual Titan Xp.

Bravo Nvidia LOL

I had to check if my hacker news app wasn't updated with the newest post list
since April Fools.

~~~
sundvor
Well, at least it's somewhat better than the iPad.

"Here's the new iPad. It's newer than the old one, but good luck trying to
tell the difference. So to clear things up, we've named it the 'iPad'. You can
thank us later."

------
sxp
For comparison, the 1080Ti is ~11.3TFLOPS + 11GB RAM @ $700 vs the Titan Xp
at ~12.1TFLOPS + 12GB RAM @ $1200. ~7% more performance for ~71% more money.
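As a back-of-the-envelope sketch (using only the figures quoted above; prices are launch MSRPs):

```python
# Rough perf-per-dollar comparison using the figures quoted in this thread.
cards = {
    "GTX 1080 Ti": {"tflops": 11.3, "vram_gb": 11, "price_usd": 700},
    "Titan Xp":    {"tflops": 12.1, "vram_gb": 12, "price_usd": 1200},
}

for name, c in cards.items():
    print(f"{name}: {c['tflops'] / c['price_usd'] * 1000:.1f} GFLOPS per dollar")

perf_gain = cards["Titan Xp"]["tflops"] / cards["GTX 1080 Ti"]["tflops"] - 1
price_gain = cards["Titan Xp"]["price_usd"] / cards["GTX 1080 Ti"]["price_usd"] - 1
print(f"~{perf_gain:.0%} more performance for ~{price_gain:.0%} more money")
```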

[https://en.wikipedia.org/wiki/GeForce_10_series#GeForce_10_....](https://en.wikipedia.org/wiki/GeForce_10_series#GeForce_10_.2810xx.29_series)

~~~
tanderson92
I have been evaluating buying a 1080Ti recently; it appears to also have the
highest cores per $. Is the 1080Ti really then the most efficient card for
general purpose HPC work on Linux (numerics)? The 1080 is a step down but also
competitively priced. Curious about your thoughts, as I couldn't find a guide
on this stuff on the web.

edit: The Tesla K20 is also in competition in my view (despite the much higher
cost) due to its focus on higher double-precision performance.

~~~
VA3FXP
We do a lot of work on video encoding. We have had a K80, Titan X(Maxwell),
Titan X(Pascal), 1080, 1080Ti, and others (including render-farms based on
GTX980's).

General thoughts: Don't expect to get _any_ information out of Nvidia unless
you are running everything on their hardware compatibility lists (i.e. in a
server case). Do not mix & match consumer-rated gear with 'professional' gear
(i.e. if you put the K80 in a system with a GTX 1080, the Nvidia drivers
restrict the number of available processing cores to 2 per device).

Air-flow: The Teslas run HOT, even with a blower attached and/or installed in
the recommended case.

NVENC: the Pascal-based cards are dramatically faster, and produce better
output, than the Kepler-based cards.

For anybody else doing video encoding work: grab an Nvidia TK1/Jetson dev kit.
This little board is a MONSTER and can handle everything we throw at it
without breaking a sweat.

~~~
slizard
> Do not mix & match consumer-rated gear with 'professional' gear. (i.e. If
> you put the K80 in a system with a GTX1080, then the Nvidia drivers restrict
> the number of available processing cores to 2 per device)

Huh? Not sure what exactly you mean by "number of processing cores"?

I use two development boxes on a regular basis with Teslas side-by-side with
GeForce cards and they all work just fine.

~~~
jamesfmilne
The NVENC SDK limits the number of separate H264 video streams you can encode
simultaneously to 2 if you have _any_ Geforce hardware in your system.

------
rz2k
>Currently Mac users are limited to Maxwell GPUs from the company’s 9-series
cards, but next week we’ll be able to finally experience Pascal, albeit a
$1200 Pascal model, on the Mac.

>We have reached out to Nvidia for a statement about compatibility down the
line with lesser 10-series cards, and I’m happy to report that Nvidia states
that all Pascal-based GPUs will be Mac-enabled via upcoming drivers. This
means that you will be able to use a GTX 1080, for instance, on a Mac system
via an eGPU setup, or with a Hackintosh build.

[https://9to5mac.com/2017/04/06/nvidia-titan-xp-beta-pascal-drivers-mac/](https://9to5mac.com/2017/04/06/nvidia-titan-xp-beta-pascal-drivers-mac/)

~~~
josephg
> for instance, on a Mac system via an eGPU setup

This is one of my biggest feature requests for Apple. I want a tiny little
laptop with an integrated GPU when I'm on the road. But when I'm home I also
want to run simulations, play games on a big screen & do VR, and for that I
want a desktop-class GPU. And I want that GPU to be upgradable - CPU speed
isn't improving anywhere near as fast as GPU speed, so it makes sense to keep
the rest of my system across multiple GPU generations.

The laptops are already there. The RAM fiasco aside, the current laptops are
fine little machines. And with Thunderbolt 3 they should have no problem
supporting external GPUs.

All that's missing is an official Apple eGPU enclosure and software support!
People on the internet have already gotten them working by injecting kexts
into the kernel, but first-party support would make the whole thing way
better, and way more stable. C'mon Apple! We're so close! Take my money!

~~~
mhermher
They'd probably lean towards selling the eGPU with a GPU included rather than
just the shell; that seems more in line with the company. They don't want
people using untested hardware, I'm guessing - it's bad for brand image or
whatever. I figure that's the same reason they don't just sell OSX licenses
on their own: they want to know the hardware and software will work well
together. Even then, and even with a markup, I think Mac users would be
pretty happy to at least have that option.

------
bitL
Is there something in the Titan Xp I would benefit from for ML/DL/AI compared
to the 1080Ti (except for the extra 1GB)? I am considering getting an 8-core
Ryzen with 1-2x 1080Ti and am wondering if the Titan Xp has something that
would render the 1080Ti obsolete for training models?

~~~
jsheard
In terms of architecture and features the 1080Ti and Titan Xp are the same;
the only differences are that the Titan Xp is slightly faster and has 1GB of
extra VRAM.

If your workload can be efficiently split between multiple cards then a $1400
pair of 1080Tis will _vastly_ outperform a $1200 Titan Xp - ~17% more money
for nearly double the throughput.
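In numbers (assuming near-linear scaling across two cards, which only holds for workloads that split cleanly):

```python
# Perf-per-dollar of a 1080 Ti pair vs. a single Titan Xp,
# assuming the workload parallelizes cleanly across two cards.
pair_price, pair_tflops = 2 * 700, 2 * 11.3   # two 1080 Tis
titan_price, titan_tflops = 1200, 12.1        # one Titan Xp

print(f"Pair:  {pair_tflops / pair_price * 1000:.1f} GFLOPS/$")
print(f"Titan: {titan_tflops / titan_price * 1000:.1f} GFLOPS/$")
print(f"{pair_tflops / titan_tflops:.2f}x the throughput "
      f"for {pair_price / titan_price:.2f}x the price")
```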

~~~
bitL
Great! That's what I wanted to know! Many thanks!

I was more curious if Titans had some lower precision data type or better
dataset packing than vanilla Pascals, or something similar that would help
with ML.

~~~
Nexxxeh
Another poster said that the drivers will prevent you from properly running
more than two 1080Tis in the same machine (if I interpreted it correctly).

In case you are just checking your own comment thread, do check the other
threads, as there are interesting performance comparisons being discussed.

------
filipncs
Regarding the new Nvidia-provided Mac driver, does this have any influence on
Vulkan or modern OpenGL support? Or would that require changes in macOS
itself (which would presumably never happen)?

------
bryanlarsen
A few more details: [http://www.tomshardware.com/news/nvidia-titan-xp-graphics-card-gp102,34079.html](http://www.tomshardware.com/news/nvidia-titan-xp-graphics-card-gp102,34079.html)

------
alkoumpa
Not sure why anyone would spend double the money when you can get about the
same performance using a GTX 1080 Ti. The performance gain looks marginal IMO
-- optimizing the code on a GTX 1080 Ti (CUDA and/or shader assembly) would
probably yield very satisfactory results and definitely better perf/buck.

~~~
Nexxxeh
"double precision performance" and "industrial-grade drivers" are two reasons.
I think it depends on what exactly you are doing.

------
superfx
Why not more memory? Esp. with the 1080Ti at 11GB and half the price, it
would've made sense to push this to at least 16GB or even 24GB to distinguish
it.

~~~
strictnein
The problem is that they also need to distinguish it from the higher memory
and much more expensive Quadro line. For instance, the new Quadro P6000 comes
with 24GB and it'll run you $5500.

[https://www.newegg.com/Product/Product.aspx?Item=N82E1681413...](https://www.newegg.com/Product/Product.aspx?Item=N82E16814133636)

If you're doing memory intensive stuff, NVIDIA wants you to spend a whole lot
more.

~~~
superfx
Yeah, I can see that, but they're running into the other problem now where
12GB is just not that much more than 11GB, and certainly not worth the 100%
price increase. At 16GB they would at least be offering ~45% more memory.
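The gap as ratios (VRAM figures from the thread; the 16GB/24GB variants are hypothetical):

```python
# How much more VRAM than a 1080 Ti (11GB) each configuration offers.
ti_gb = 11
print(f"Titan Xp (12GB): {12 / ti_gb - 1:.0%} more VRAM")
for hypothetical_gb in (16, 24):
    print(f"Hypothetical {hypothetical_gb}GB: {hypothetical_gb / ti_gb - 1:.0%} more VRAM")
```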

------
samcat116
Finally we get macOS Pascal drivers. Super pumped.

~~~
ClassyJacket
But what Mac are you going to put it in...?

~~~
lukealization
Hackintosh?

~~~
samcat116
Yup. Got a 1050ti sitting in my machine right now. That was the only thing
holding me back.

------
Zaheer
How does this compare to AMD's latest card?

~~~
dgritsko
It will more than likely demolish it in terms of raw performance; however,
AMD's most recent cards are aimed at being more budget-friendly. An RX 480
will only set you back about $200-250 (compared with this $1200 beast of a
card). You'll get more "bang for your buck" by going with an AMD card as
opposed to a top-of-the-line model such as this one. That may change later
this year when AMD releases their Vega architecture, as it's rumored to aim
more at the high-end market (which is currently dominated by Nvidia).

~~~
sorenjan
Are there any signs that machine learning libraries and other GPGPU
applications will start using cross-platform OpenCL instead of proprietary
CUDA anytime soon? It's a bit of a shame that so many allow themselves to be
locked to one vendor, although it's been a while since I used either of them.

~~~
xiphias
AMD should just do it for TensorFlow. They would get a lot of benefit if they
could show higher performance per dollar at least on Linux, and it would take
just a small team to implement it.

------
superrad
Is there any point (for games) in having that much memory when you can really
only address about 9 GB a frame at 60Hz (Titan Xp is 550GB/s)?

I mean, it's certainly better than the Titan X (Maxwell), which could only
address less than half its memory while running at 60Hz.

It just seems like an effort to inflate the price of the product without
adding much value.

~~~
jo909
Sorry, I might not understand what you are saying.

You're in the game and look at a house, then you turn around and look at a
tree, so you need the geometry and texture of the tree, but no longer of the
house. Then you look down and a chicken walks into frame, so now you need
that; you kill the chicken and suddenly need the dying-chicken animation, etc.

You almost never need all the data for a single frame. That would be way too
much work for the render pipeline anyway.

~~~
stonemetal
He is saying a video game running at 60 Hz has approx. 16 ms per frame. 550
GB/s * 16 ms ~ 9 GB. So if you are running at full bandwidth for an entire
frame you can access about 9 GB of RAM.
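Spelled out (550 GB/s is the Titan Xp's quoted memory bandwidth; 60 Hz gives ~16.7 ms per frame):

```python
# Upper bound on memory touchable in one frame at full bandwidth.
bandwidth_gb_s = 550      # Titan Xp memory bandwidth, per the thread
frame_time_s = 1 / 60     # ~16.7 ms per frame at 60 Hz

touchable_gb = bandwidth_gb_s * frame_time_s
print(f"~{touchable_gb:.1f} GB readable per frame at full bandwidth")
```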

~~~
Mahn
What the responder was getting at is that _despite_ being able to access only
9GB per frame, it may still be useful to keep more than 9GB of data in there
for other purposes, say if you have data that isn't being read/written every
frame but is still used for rendering. So it doesn't necessarily follow that
memory beyond that which can be addressed per frame is useless.

------
killjoywashere
Dear Apple: please give us a Cheesegrater Mac Pro with room for 4 of these.

~~~
intoverflow2
I think it's time to face facts: Apple doesn't care about you, me or any
other people who need this power.

(I'll believe the new Mac Pro when I see it, but it's likely to have AMD
cards.)

------
aceperry
Anyone know what the driver situation is going to be like for linux?

~~~
Vexs
The mac drivers give some hope, but I doubt it will be any better than what we
have already.

~~~
Florin_Andrei
Shouldn't it be supported by existing Pascal drivers already?

I have the original Titan X (Pascal), and TensorFlow works great with some
old driver version; I've even forgotten which.

------
popopobobobo
When will the price of this drop to below $300? $1200 for a graphics card is
a little too much for me.

~~~
strictnein
The GTX 1070 is a very powerful card and can be purchased for ~$370.

The GTX 1060 is also a good card if you're looking to game at 1080p (which
most people still are). It can be found for around $200.

~~~
ornitorrincos
The 970 is a good card to play at 1080p (admittedly I don't play a great
variety), but it's good enough, especially considering the price.

