
Nvidia GeForce GTX 1080 Ti - bcaulfield
http://www.anandtech.com/show/11172/nvidia-unveils-geforce-gtx-1080-ti-next-week-699
======
modeless
Hmm, so why would anyone buy a Titan X now? For 1 GB extra (slower) RAM? Did
they hobble it in any other way?

Edit: According to Anandtech, "NVIDIA has been surprisingly candid in
admitting that unless compute customers need the last 1GB of VRAM offered by
the Titan, they’re likely going to buy the GTX 1080 Ti instead."

I would suggest changing the article link to Anandtech, as it's really a much
more informative and better written article. Jen-Hsun Huang seems to have
bottled his cringe-inducing presentation style and fed it to whoever wrote
that PR piece on the Nvidia site. [http://www.anandtech.com/show/11172/nvidia-
unveils-geforce-g...](http://www.anandtech.com/show/11172/nvidia-unveils-
geforce-gtx-1080-ti-next-week-699)

~~~
jacquesc
I believe gamers who spend that kind of money on a graphics card do it more
for cultural than functional reasons. It's like buying a car model upgrade
that has little practical value aside from showing off that you could afford
it.

It's weird, but that's human nature I guess.

~~~
K0SM0S
No, monitor technology is currently evolving fast, so GPUs are struggling to
meet the highest interface requirements, notably 3440x1440@144Hz and 4K@60Hz
(and don't even think about more frames at 4K).

I have a GTX 1070, mostly for deep learning, but I have to admit that piece of
hardware was pleasing the casual gamer in me as well. It's only about 20%
behind a GTX 1080, a $500 GPU, for gaming. Twice as much as I had ever spent
on leisure as a gamer.

Well, on my 4K display I can push most pre-2014 games to 60 fps, but that's
it. After that, for most AAA games, I have to resort to 1440p + some AA. And
if I were really into gaming I'd have bought an ultrawide 1440p/144Hz monitor,
no doubt about it, knowing full well I'd need a Titan XP or at least a 1080 to
see those framerates on newer titles.

It's a space that's moving fast, both on the interface side (monitors, VR...)
and the processing side (Nvidia makes a 20-30% leap each gen, kind of insane
compared to Intel CPU gens, for instance). We're only beginning to touch
low-level APIs once again after a two-decade hiatus on the PC (DirectX made it
easy to create 3D apps, but at the cost of much performance/overhead, which
DX12 is trying to address).

I'm jaded about CPUs; honestly, Ryzen is most welcome economically, but
nothing to be enthused about as a nerd/engineer. GPUs and display technology
are where it's at these days. Can't wait for proper AR too (HoloLens-like
technology, still a few gens away from a commercially viable product).

~~~
Foxhuls
I'm sorry, but while I agree GPU and monitor technology is improving quickly,
I think you're giving some incorrect info. Ultrawide 1440p monitors cap out at
100Hz right now, not 144Hz. The current cards are pushing the 100Hz ultrawides
pretty much to their max, and the 1080 Ti will pick up whatever slack is left.
4K is possible past 60Hz, and a single 1080 is doing fairly well at 4K as is.
1080s in SLI can easily push a 4K monitor past 60fps, and Asus is releasing a
4K 144Hz monitor around Q3 of this year because of that. The tech is moving
fast, but you're exaggerating quite a bit on how hard it is to push these new
displays.

~~~
ptrptr
I have to disagree; there are a couple of consumer-grade 1440p 144Hz monitors
(FreeSync): the ASUS MG279Q and Acer Predator XF270HU, and even a 165Hz one,
the Acer Predator XB271HU (G-Sync), all IPS. There are also VA ultrawide
panels like the BenQ XR3501 and Acer Predator Z35 with 3440x1440px and 144Hz.
Technology in this area has been progressing at a decent pace recently.

~~~
Foxhuls
None of the ultrawide 1440p monitors have 144Hz. Almost all of the models you
listed are 16:9 monitors, and the one ultrawide is not 1440p. I agree that the
tech is progressing, but this generation of GPUs is pushing them quite well.

~~~
treebeard901
Why is your focus on ultrawide monitors? I'd bet most gamers do not use them,
for various reasons. You may have one, but it is not the standard by which to
gauge the viability of the GPU market. For example, Acer has a 240Hz gaming
monitor out now.

~~~
bcrescimanno
Two reasons: first, the earlier post specifically called out that, "...ultra
wide 1440p monitors cap out at 100hz right now, not 144hz."

Second, it's all about pixel count. While 4K is still miles ahead of even the
3440x1440 ultrawide resolution, that resolution represents nearly as big a
jump in pixel count as the jump from 1080p to 1440p.

The goal is 4K at a minimum of 60fps. A consistent 100Hz or even 144Hz on an
ultrawide 3440x1440 monitor represents a solid step in that direction.
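
A quick back-of-the-envelope check of those pixel counts (plain Python; the
resolutions are just the ones named above):

    # Pixel counts behind the comparison above
    res = {"1080p": 1920 * 1080, "1440p": 2560 * 1440,
           "ultrawide": 3440 * 1440, "4K": 3840 * 2160}
    print(res["1440p"] - res["1080p"])      # ~1.61M extra pixels, 1080p -> 1440p
    print(res["ultrawide"] - res["1440p"])  # ~1.27M extra pixels, 1440p -> ultrawide
    print(res["4K"] / res["ultrawide"])     # 4K is still ~1.67x the ultrawide count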

------
kalleboo
When looking at the advancements in GPUs these days, where are the
improvements coming from? Are they just throwing more silicon at the problem?
Is it process improvements? Smarter design? All of the above? Can we expect
these advancements (and cost reductions) to continue, or how far away are we
from the wall that CPUs have hit?

~~~
ClassyJacket
Yes, all of the above.

IANAEE (I am not an electrical engineer).

Something nice about GPUs is that graphics rasterization is a very parallel
problem. If you can handle the thermals, power consumption, memory bandwidth,
etc. that go with it, you can pretty much just add more of the parts that do
the processing (in this case, CUDA cores - 3,584 of them) and get a more
powerful GPU. A 1080 Ti has 12 billion transistors. The 1080 had 7.2 billion.
The 980 had 5.2 billion.
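
A toy sketch of what "very parallel" means here (plain Python, with worker
processes standing in for cores; a real GPU schedules this very differently):

    # Every pixel's color is computed independently of every other pixel,
    # so adding more workers ("cores") speeds up shading almost linearly.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 1920, 1080

    def shade(pixel):
        # Hypothetical per-pixel work: a gradient standing in for a shader.
        x, y = pixel
        return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)

    if __name__ == "__main__":
        pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
        with Pool() as pool:  # more worker processes ~ more "cores"
            frame = pool.map(shade, pixels)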

CPUs handle more branching problems, so it's harder to get improvements by
just adding cores.

The process is also smaller than the pre-1080 generation's: 16nm, down from
28nm, which lowers power consumption and therefore heat. That's a huge help,
and it's part of what allows them to just add more transistors in the first
place. But in this area, GPUs will run into the same problems CPUs do: it's
hard to keep shrinking transistors.

~~~
usaphp
> IANAEE (I am not an electrical engineer).

Why not just write "I am not an electrical engineer"? Defeats the purpose if
you still have to explain it.

~~~
madez
I appreciate they wrote it the way they did. I didn't know what the
abbreviation would have meant, and now I do. Also, I now have a clue what
"IANA" at the start of an abbreviation I have never seen before might mean.

~~~
cr0sh
I was informed by a long-time user (Grumpy Mike) of the Arduino forums, when I
first started posting there, that one should -always- explain an acronym or
similar construct at first use before using it elsewhere in the rest of a
comment.

It makes sense: not all readers may know what the acronym means at first, so
spelling it out is a great courtesy (it also removes ambiguity where an
acronym is similar or identical to one from another domain, perhaps the domain
of the forum or medium you are communicating on).

------
sundvor
With that price (I had expected it to be higher), it looks like they're
positioning themselves against Vega already. We all win!

I'm locked in with a G-Sync monitor, and can't wait to replace my 2x 980s with
this one. I'll still wait for Vega to come out, hopefully putting downward
pressure on the pricing.

More details here: [http://www.pcgamer.com/nvidias-geforce-gtx-1080-ti-is-
finall...](http://www.pcgamer.com/nvidias-geforce-gtx-1080-ti-is-finally-
here/)

~~~
hermitdev
I'm looking forward to replacing (or supplementing) my 4x 690s (2 cards, 2
GPUs/card).

I'm not looking for 4K support, but I am looking for dual-monitor support:
gaming on one monitor while streaming video on the other.

~~~
sundvor
I read you. My take is that a single 1080 Ti should suffice for your needs;
690 to 980 was a bit of a sidegrade, whereas the 1080 Ti is roughly 2x the
980, with many added benefits starting with the huge increase in RAM.

You'll probably wonder if your PC is still there, with noise levels likely to
plummet. My 980 SLI setup certainly makes a racket under gaming load.

~~~
hermitdev
This is one of those times I loathe Intel's "new" (several years old now)
naming scheme. I have a 6-core i7 Extreme, but it was one of the first or
second gens, so it's hard to compare it against the current gen. I also have
64GB of RAM, and seldom get near 50% usage without VMs running. IO seems
reasonable, and yet streaming video stutters a lot with, say, Civ6 on the
primary monitor and streaming live TV on a secondary. I didn't use to have a
problem with Civ5 & streaming live TV (i.e. the TV stream was clean and Civ5
performed as crappily as Civ5 does), and when I look at Task Manager, the CPU
is not fully tasked, so I presume it may be a GPU issue (anyone know of a Task
Manager-like app for GPUs?).

Since I'm on the fast insider ring, I've tried both with and without Game Mode
turned on (which supposedly now works), and both cause the TV stream to
stutter. I don't even recall having this issue before they announced Game
Mode, but I can't be sure when I first had the problem, because I was also
having physical internet issues in the same time frame (thanks, Comcast).

------
valine
I do a lot of 3D rendering with Blender Cycles, so I'm really happy to see the
bump in CUDA cores. The GTX 1080 didn't offer any significant performance
boost over the 980 when it comes to rendering in Blender. Right now, for
Cycles, you get the most performance per dollar by buying multiple 1060s
(Blender doesn't rely on SLI when rendering with multiple GPUs). I've got my
fingers crossed that this card will offer some tangible performance boosts.

~~~
abledon
Hey I'm experimenting with blender too,

Question: Ignoring sli, do you have different GTX versions plugged into your
Motherboard for rendering? Like, 2 1060s and a 980Ti ?

Have you tried background task rendering via commandline, targeting the GPU as
workers, but also using the CPU?

e.g. iF you have 2 graphics cards. Run 3 commandline blender render jobs (2GPU
+ 1 CPU).

~~~
valine
I haven't used multiple GPUs of different versions, but everything I've read
suggests it should work. If one GPU is significantly slower than the other,
you might run into situations where the slower GPU gets stuck working on the
last tile while the faster GPU sits idle, making your total render time go up.
I imagine this would be pretty rare, and a little effort to adjust the tile
size would mitigate any issues.

As far as rendering from the command line with both the CPU and GPU, it's
definitely possible, albeit a little hackish. Rendering from the command line
uses the CPU by default. To render using the GPU, you have to pass a Python
script as an argument that changes the render device using the Blender Python
API. Blender supports multiple GPUs out of the box, so there is no reason to
split them up into separate jobs (even different-model GPUs that don't support
SLI). You'd only need one job for the CPU and one for the GPU(s). The tricky
part is making sure the CPU and GPU work on different things. For animations,
you'd probably want to change the render step option. Setting it to 2 would
make Blender render every other frame, so the CPU would work on the
even-numbered frames and the GPU would work on the odd frames. For
single-frame renders, you could set a different Cycles seed value for each
device and then mix the two generated images together. Both the seed value and
the step option can be set in the Python script, which means it's pretty easy
to automate the entire process. It's definitely not trivial to get working, so
at some point you need to decide if the 0.1x speed bump from adding the CPU is
worth the effort. Any new Nvidia GPU is going to be worlds faster than
whatever CPU you might be using.
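
A minimal sketch of the kind of script described above, assuming Blender
2.78-era `bpy` property names (verify against your version):

    # render_gpu.py - pass to Blender with:
    #   blender -b scene.blend -P render_gpu.py -a
    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'

    # Switch the render device from the CPU default to the GPU(s).
    scene.cycles.device = 'GPU'

    # For animations: render every other frame, so a parallel CPU job
    # (started with an offset frame) can take the remaining frames.
    scene.frame_step = 2

    # For single-frame renders: give this device its own seed, then mix
    # the differently seeded images in the compositor afterwards.
    scene.cycles.seed = 1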

See here for instructions on stacking cycles renders with different seed
values:

[http://blender.stackexchange.com/questions/5017/stacking-
cyc...](http://blender.stackexchange.com/questions/5017/stacking-cycles-
renders-in-the-compositor)

~~~
trendia
> Rendering from the command line uses the cpu by default.

I don't believe this is still true. When I render from the command line, it
will use the GPU if my user preferences are set to GPU. (confirmed by render
timings)

~~~
valine
Oh nice, I'm really happy to hear that, because passing in a Python script is
a pain. I don't remember seeing it in the release notes. Maybe the developers
didn't consider it a big enough change to warrant writing down.

------
gigatexal
FP32 still gimped? I think I'll wait for the AMD cards and OpenCL, even though
the CUDA ecosystem is a lot more polished.

~~~
neilmovva
fp32 is almost never "gimped," since that's the basis of performance for most
video games and 3D applications. Perhaps you meant fp64? That sees more use in
industrial apps (e.g. oil and gas exploration), so it's often slower on
consumer cards, and historically it was artificially limited. For recent GPUs,
it's not quite fair to call fp64 gimped, since the fp64-equipped cards are
completely different designs; the lack of fp64 on this card is a design
choice, not an artificial restriction for product segmentation.

~~~
wtallis
FP16 is also becoming a feature of interest, especially for machine learning.

~~~
gcp
The card reportedly has a 4x speed INT8 mode.

~~~
pandascore
Is it worth buying this GPU instead of the Pascal Titan X (price aside),
FP-performance-wise, given that the 1080 is for gamers and the Pascal Titan X
is meant for ML/DL?

------
wst_
Is 4K tech mature enough to use every day for desktop work and software
development? I read, not so long ago, that Windows still has issues and plenty
of apps look just ugly. I also read that there might be issues keeping your
old HD panel alongside a new 4K panel; some people have experienced problems
with such a setup. Unfortunately, I've had no opportunity to test 4K so far. I
also have no idea how it looks on Linux systems.

~~~
strictnein
I'm a software dev and use a 32" 4k monitor (3840x2160) at home. This is a
16:9 monitor (from BenQ). Unless you have 20/15 vision, you'll likely need to
scale it. Windows 10 has built in scaling at 25% increments, and you can also
do custom scaling. I scaled it to 115% and it works pretty well.

I also have a secondary monitor hooked up, but it's a 2560x1440 monitor turned
vertically. It works fine, but I forget if font scaling is applied across the
board or just to each monitor.

The ideal 4k desktop monitor size is likely at least 36-38", but I'm not sure
if those are economical yet or not.

At work I have a Dell 34" ultrawide (3440x1440). Not quite 4K, but no scaling
is needed, and it's a great monitor. The one downside is that it only has the
vertical height of a 2560x1440 monitor, so I kind of miss my home monitor's
extra height at times, but I can easily have three or four files side by side
in my IDE.

I also have an Ubuntu system hooked up to the same 4K monitor at home, and it
works, but you'll possibly need to adjust font sizes in your apps. I haven't
spent a ton of time with that system recently, though.

For all of these monitors you want a video card with DisplayPort 1.2. You do
not want to use HDMI, because you will likely end up at 30Hz, and that is a
horrid experience. HDMI 2.0 supports higher refresh rates, but having both an
HDMI 2.0 port and an HDMI 2.0 monitor is still pretty rare.
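
A rough back-of-the-envelope check of why HDMI 1.4 forces 30Hz at 4K (raw
pixel data only; real links add blanking and encoding overhead, so these are
underestimates):

    # Uncompressed bandwidth needed for 4K at a given refresh rate
    width, height, bpp = 3840, 2160, 24

    def gbps(refresh_hz):
        return width * height * bpp * refresh_hz / 1e9

    print(f"4K@30Hz: ~{gbps(30):.1f} Gbps")  # ~6.0: fits HDMI 1.4 (~8.2 Gbps of data)
    print(f"4K@60Hz: ~{gbps(60):.1f} Gbps")  # ~11.9: needs DP 1.2 (~17.3) or HDMI 2.0 (~14.4)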

Anyways, probably too much info, but it was either do this or work on an
annoying bug :)

~~~
aaronscott
Just wanted to provide a counterpoint: I also have a 32" 4K display but find
the default text size easy on the eyes without scaling. I do sit pretty close
to the monitor, though, about half an arm's length away (elbow to fingertips).

I move my head around a lot more to focus on different parts of the screen.
But I like that experience.

My vision isn't great; I'm nearsighted with 20/200 vision. I can use the
screen comfortably at the default scale with glasses, or with a small bump in
text size without them.

I'm using OSX, with Atom & iTerm2 mostly. So that may have different font
rendering than Windows.

------
eng_monkey
Given that this new model will push down prices of existing models, what would
be the cheapest currently available NVIDIA card that includes a DisplayPort
output and supports a resolution of 3440x1440? This is just for work (editing
documents, programming, etc.), not gaming.

~~~
random28345
Why not get a modern (Haswell or later) Intel CPU and a motherboard with
DisplayPort output? You don't need a discrete GPU unless you're MLing or
gaming, and you can splurge a bit and get something with DDR4 and/or NVMe.
Having faster IO is the best way to make "productivity" software like
spreadsheets and editors faster.

== gratuitous shilling below ==

For Christmas I got a new CPU/motherboard combo with a Samsung 960 EVO SSD,
and it's ludicrously fast, decently cheap, and a noticeable improvement over
my old (heh) SATA SSD.

[http://www.samsung.com/semiconductor/minisite/ssd/product/co...](http://www.samsung.com/semiconductor/minisite/ssd/product/consumer/960evo.html)

~~~
eng_monkey
Thanks for the suggestion. It is something I had considered: replacing the
motherboard and the CPU with the new AMD Ryzen. But if I remember correctly,
it is difficult to find motherboards that include video output (especially
DisplayPort). So I would have to, as you mention, go with Intel, which seems a
missed opportunity given the new AMD processors.

~~~
random28345
Intel still has a good edge on single-core performance. And productivity
software is about pushing lots of instructions through a single core.

------
strictnein
And I just installed my new EVGA GeForce GTX 1080 SC2 8GB yesterday after
work, so excellent timing on my part. $649 for that card.

Guess I'll be boxing it back up and returning it, which I dislike doing.

~~~
davman
The Tis have been strongly rumoured for months. I've been waiting for this
announcement so I don't end up in your situation.

~~~
strictnein
Yeah, I just thought EVGA's recent release of their FTW2 and SC2 with the
advanced temp monitoring stuff was a signal that it was going to be a while
longer. Guess I read those tea leaves wrong.

~~~
bcrescimanno
I very nearly fell into the same trap, especially since I have a 1070 and EVGA
was offering a nice upgrade program to the new ICX coolers.

Hope your return goes smoothly.

------
certifiedloud
Price drop for GeForce 1080 also announced: $500.

~~~
kayoone
With the AMD Ryzen CPU release tomorrow, you can now pair a latest-generation
8C/16T CPU with a 1080 for $1000, a combination that a week ago was about
$1700. The Ryzen boards should be cheaper too, bringing the total system price
for a configuration like this down a lot. Pretty amazing.

~~~
ameen
I'd wait for AMD's Vega. It's rumored to exceed the 1080's performance at a
lower price. It would also work better in conjunction with Ryzen boards.

With widespread FreeSync availability (and better monitors), buying an Nvidia
card is almost shooting yourself in the foot.

~~~
dagw
_buying an Nvidia card is almost shooting yourself in the foot._

If all you want it for is gaming. If you also want to play around with machine
learning or other GPGPU applications, then getting anything other than Nvidia
is a bad idea, since that is what everyone uses and supports at the moment.

~~~
TwoBit
So like .01% of users.

~~~
theinternetman
It's not just machine learning; 3D rendering is moving over to the GPU as
well, and there is next to no support for anything other than CUDA in that
space today.

Not everything starts and ends with gaming in the high-end computer space.

~~~
sdwisely
Octane is the only real example of this I can think of. Nearly every other
tool I use is OpenCL.

------
arcaster
Wow, that price point is far better than what I was expecting. Still
expensive, but it's below $800.

------
lightedman
Still not good enough to move me from my $299 ($399 w $100 rebate) 8GB R9
390X. Last nVidia card (still operational and in use) was the 650Ti.

------
smrtinsert
Glad I went cheap with my 670 about a year ago. All the buzz about eventually
upgrading to the 1080 family was clearly on point.

------
pmoriarty
How much better do graphics cards have to get before they can handle full-
motion, 360 degree, 4k per eye VR streams?

~~~
Dylan16807
Full-motion, 360-degree doesn't change the card requirements, and you can use
two cards if necessary, so the only real barrier is 4K rendering.

Apparently a 1060 can already render a game like Overwatch in 4K, so I think
we're already there. You just need to find someone who will take the 4K
screens out of some phones and put them in a headset.

~~~
fra
"You just need to find someone that will take 4k screens out of some phones
and put them in a headset."

Ha! With what cable? 4K * 2 * 90hz blows way past display port (even DP1.3).
If it was just a matter of gluing two android phones together it would be on
shelves already...
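
A rough check of that arithmetic (uncompressed pixel data, ignoring blanking
overhead):

    # Two 4K panels at 90Hz vs. the ~25.9 Gbps of payload DP 1.3 can carry
    bits_per_frame = 3840 * 2160 * 24        # one 4K eye, 24-bit color
    vr_gbps = bits_per_frame * 2 * 90 / 1e9  # two eyes at 90Hz
    print(f"~{vr_gbps:.1f} Gbps needed")     # ~35.8 Gbps: too much for one cable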

~~~
Dylan16807
You could run two cables. That's not the hard part.

And scratch the rest of my comment: I just found out there already are early
4K VR headsets.

[https://vrworld.com/2016/11/10/hands-on-review-
pimax-4k-vr-h...](https://vrworld.com/2016/11/10/hands-on-review-pimax-4k-vr-
headset/)

[http://360rumors.blogspot.com/2017/01/ces-2017-hands-on-
with...](http://360rumors.blogspot.com/2017/01/ces-2017-hands-on-with-
pimax-8k-vr.html)

This particular device uses HDMI 1.4 and has a terrible framerate, but it's
close to doing the right thing.

~~~
fra
You're wrong; bandwidth on the different interfaces (PC<->HMD, HMD<->panels,
...) is very much one of the hard parts.

~~~
Dylan16807
It may be hard in an objective sense, but it already exists. It's not the hard
part of making a high-resolution VR headset, because you can buy chips that do
it for you.

A single DisplayPort 1.3 cable, supported by all recent GPUs, can push 4K at
over 120Hz. Screens that use such data rates already exist too.

------
JOnAgain
How close are we to being able to render a high-resolution monitor/TV inside
VR? Not necessarily a 4K screen, but, like, a good 2560x1440 display? How
close are we to a high-resolution virtual cinema experience?

... or are we there? I really have no idea how resolutions in VR work.

~~~
namlem
VR headsets don't have that level of resolution yet. There will still be
visible pixels for years to come. I believe it was either Carmack or Gabe
Newell who said you'd need nearly 50k vertical pixels for true photorealism,
though I may be mistaken.

~~~
theWatcher37
An AMD study said you'd need a 16K display for 4K-like quality over the full
human FOV.

------
TenOhms
How hard is it for games to ignore SLI entirely and simply use a second or
third GPU to offload things like post-processing effects?

Or even better: if you had three 1080 Tis and three monitors, could the
application/game just assign one GPU per monitor without having to resort to
SLI (which has such a bad reputation)? This would make things so simple: want
an extra monitor or two? Just add a GPU to power them. I can't imagine that
the coding for something like this would be anywhere near as complicated as
SLI/Crossfire.

~~~
sand500
I think you can do that using some virtualization software. It really depends
on the graphics card drivers and game engines exposing that ability to the
actual game programmers.

------
icpmacdo
How far off are we from 8K gaming?

~~~
Orangeair
Quite far away. Even the Titan X Pascal has trouble maintaining a consistent
60 fps in all games at 4K, and 8K has four times the pixels.

[http://www.gamersnexus.net/hwreviews/2659-nvidia-gtx-
titan-x...](http://www.gamersnexus.net/hwreviews/2659-nvidia-gtx-titan-x-
pascal-review-and-benchmark/page-4)

See the Mirror's Edge and The Division benchmarks. Averages are (just) over
60, but there are dips below.

Personally, I think the best resolution for current-gen cards is 3440x1440.
Should be rock solid 60+ fps in all games, and gives the benefit of being
ultrawide.

~~~
ClassyJacket
I strongly disagree that being ultrawide is an advantage. 16:9 is already
wider than optimal for gaming. The human field of view is actually about 4:3.
While there is admittedly usually more interesting content to the sides than
the vertical edges, I've found 16:10 to be preferable for immersion and would
never dream of going _wider_. If anything, I'd go closer to a square if they
still made them.

I'd rather have 2560x1600 than 3440x1440, even though it's fewer pixels.

IMAX film format has a more immersive aspect ratio too, which is taller still
at about 16:11.

~~~
devonkim
I have an ultrawide not necessarily to perceive all my content at once, but so
that I no longer need dual monitors to put two documents next to each other
without compromising the width of each too much. Even so, I came from a 27"
2560x1440 monitor, and the edges are still of value to me in peripheral vision
in games. Add in that most 34" ultrawide screens now seem to have a curve to
them, which makes visibility at the edges easier as well. Not having to set up
an extra monitor and suffer a bezel in the middle is very much worth the
trouble for me, because otherwise I'd need 3x monitors, and at that point it
gets insane, with multiple monitors in portrait and such.

~~~
digler999
Try a 4K TV. I bought a card with a 4K HDMI 2.0 output (for 60 fps), plugged
it into my 46" TV, and never looked back. The only drawback is needing to use
the remote to turn it on and off: the monitor shuts off with DPMS, but when
the machine is off it still searches for a signal. Doesn't bother me at all.

------
TwoBit
So where will AMD's Vega land amongst the 1080s?

~~~
jsheard
AMD demonstrated Vega running Doom 4 with an FPS counter a while ago, and it
was running about 10% faster than a stock GTX 1080 (non-Ti), or on par with an
overclocked GTX 1080.

Doom 4 performs unusually well on AMD cards compared to other games, so on
average Vega is probably similar to a stock GTX 1080. Which leaves them with
no answer to the 1080 Ti :(

~~~
penagwin
I think Doom 4 performs so well because id Software usually has one of the
most optimized engines (for everyone), and I believe AMD cards have more raw
power than Nvidia's, but it usually goes unused (I think due to unpopular
APIs). Thus Nvidia is generally known to perform better in real-world cases.

------
mtarnovan
I'm curious how this would perform for mining Ethereum/ZCash vs, say, the
RX480.

~~~
ntelson1s
Just speculation here, but last I looked into it, video cards were worthless
when it comes to ROI compared to the specialized mining hardware used
nowadays, which is continuing to advance.

~~~
patrickk
This wasn't true for Ethereum, at least last year when I was mining it with
(used) R9 280xs. Depends more on the cost of electricity in your country.

EDIT: also, AMD graphics cards had a significantly better ROI than Nvidia
cards.

~~~
arcaster
However, an oclHashcat benchmark is something I'm definitely looking forward
to.

------
asafira
Anyone else finding that the livestream (or I guess now the recording of the
presentation) is choppy and skips back and forth between different segments of
the presentation?

