
More Extreme in Every Way: The New Titan Is Here – Nvidia  TITAN Xp - bcaulfield
https://blogs.nvidia.com/blog/2017/04/06/titan-xp/
======
utexaspunk
It's funny I remember this magazine cover as a kid:

[http://s7.computerhistory.org/is/image/CHM/500004286-03-01?$...](http://s7.computerhistory.org/is/image/CHM/500004286-03-01?$re-medium$)

The machine pictured (the Intel Touchstone Delta) performed about 30 GFLOPS and
cost upwards of $10M. Now a computer roughly 400x as powerful fits on a card
and can be yours for $1.2k.
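A quick back-of-envelope check of that "400x" claim, taking the Titan Xp's advertised single-precision throughput of roughly 12.1 TFLOPS (the exact figure is from Nvidia's launch specs; the cost ratio uses the comment's own numbers):

```python
# Touchstone Delta: ~30 GFLOPS (1991). Titan Xp: ~12,100 GFLOPS FP32 (2017).
delta_gflops = 30
titan_xp_gflops = 12_100

speedup = titan_xp_gflops / delta_gflops        # ~403x faster
cost_ratio = 10_000_000 / 1_200                 # ~8,300x cheaper

print(f"speedup: {speedup:.0f}x, cost ratio: {cost_ratio:.0f}x")
```

So the 400x figure checks out, and the price dropped by nearly four orders of magnitude on top of that.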

~~~
samch
Wow - that brought back memories. Here's a link to the full article:
[https://books.google.com/books?id=ogEAAAAAMBAJ&lpg=PP1&pg=PA...](https://books.google.com/books?id=ogEAAAAAMBAJ&lpg=PP1&pg=PA50#v=twopage&q&f=false)

------
douglasfshearer
Someone updated the 10-series Wikipedia page [1] making comparison to the
previous card easier.

Compared to the Titan X Pascal it has more cores, a higher clock, and more
memory bandwidth.

[1] -
[https://en.wikipedia.org/wiki/GeForce_10_series#GeForce_10_....](https://en.wikipedia.org/wiki/GeForce_10_series#GeForce_10_.2810xx.29_series)

------
mark212
Finally! MacOS drivers for Pascal

~~~
gbrown_
Nice to see this but surely this is only of use for Hackintosh builds? Or is
Thunderbolt connectivity supported for external GPUs?

~~~
rz2k
The very old Mac Pro, and external GPUs:

>Currently Mac users are limited to Maxwell GPUs from the company’s 9-series
cards, but next week we’ll be able to finally experience Pascal, albeit a
$1200 Pascal model, on the Mac.

>We have reached out to Nvidia for a statement about compatibility down the
line with lesser 10-series cards, and I’m happy to report that Nvidia states
that all Pascal-based GPUs will be Mac-enabled via upcoming drivers. This
means that you will be able to use a GTX 1080, for instance, on a Mac system
via an eGPU setup, or with a Hackintosh build.

[https://9to5mac.com/2017/04/06/nvidia-titan-xp-beta-pascal-d...](https://9to5mac.com/2017/04/06/nvidia-titan-xp-beta-pascal-drivers-mac/)

~~~
dawnerd
That's awesome. I was thinking about putting a Hackintosh build together with a
spare 1070 I have lying around.

------
gbrown_
Also, the previous Pascal Titan appears to have been dropped from the 10-series
webpage [1], although it's not hard to find [2]. So it looks like this replaces
the old card entirely. I haven't followed the consumer GPU market too closely
as of late; is this normal for Nvidia? Or have they perhaps gotten yields up to
a better level?

[1]
[http://www.geforce.co.uk/hardware/10series/](http://www.geforce.co.uk/hardware/10series/)

[2]
[http://www.geforce.co.uk/hardware/10series/titan-x/](http://www.geforce.co.uk/hardware/10series/titan-x/)

~~~
modeless
The 1080 Ti was marginally faster than the previous Titan, while being
significantly cheaper. This is a small spec bump to put the Titan back on top,
so that people with money to burn still have an excuse to give Nvidia $1200
rather than $700.

If you ask me the 1080 Ti is still a better choice, but there are always a few
people out there who just have to have the very fastest thing and Nvidia is
happy to take their money.

~~~
beautifulfreak
I keep an eye on the Folding@home spreadsheet of GPUs and their
price/performance ratios, which is updated regularly as prices change. My 2012
Mac Mini going full blast does about 5k Points Per Day (PPD), while the Titan X
Pascal does 1200k, but performance per watt is the key metric.
[https://docs.google.com/spreadsheets/d/1v5gXral3BcFOoXs5n1M6...](https://docs.google.com/spreadsheets/d/1v5gXral3BcFOoXs5n1M6l_Uo3pZpQYogn6gVlxRPnz0/edit#gid=0)
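To make the performance-per-watt point concrete, here is a rough comparison using the PPD figures above. The power draws are illustrative assumptions (a whole-system estimate for the Mac Mini, the card's 250 W TDP for the Titan X Pascal), not measured values:

```python
# Performance per watt from the comment's PPD numbers.
# Wattages are assumptions for illustration, not benchmarks.
systems = {
    "2012 Mac Mini":  {"ppd": 5_000,     "watts": 85},   # assumed system draw
    "Titan X Pascal": {"ppd": 1_200_000, "watts": 250},  # card TDP
}

for name, s in systems.items():
    print(f"{name}: {s['ppd'] / s['watts']:.0f} PPD/W")
```

Even with generous assumptions for the Mac Mini, the GPU comes out nearly two orders of magnitude ahead per watt.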

------
chrissnell
I wonder when the Linux drivers will be out. I'm running dual 1070s powering a
single Dell 5K on my Arch Linux desktop machine and it's amazing.

~~~
Mk-0
Serious question: what benefit do you get out of this? I had an SLI setup with
Linux for a while as well, before realizing that the second GPU was never being
tapped into at all. It's recognized and getting power, but it's wholly
under-utilized (if used at all).

From everything I've found, Linux doesn't benefit from SLI nearly as much as
Windows does.

~~~
semi-extrinsic
I've used several multi-GPU (up to 8x K40) Linux machines, and if your
application can provide enough GPU work (say ML/AI, or physics simulations,
etc.) they scale perfectly. For gaming, I don't know, but it could be that the
GP is doing non-gaming work.

------
loxias
Does anyone have any idea, or even an educated guess, as to if and when we
might see a price drop on other, now "outdated" (not really) GPUs? I was just
starting to plan a desktop build with plenty of CUDA cores, and now that this
has come out I'm wondering if I should wait a few weeks or a month to see if
prices fall.

------
nik736
Does this also mean we will finally get mac drivers for the new cards? (Hello
Hackintosh Community)

