
Nvidia Announces GeForce RTX 2060 - zdw
https://www.anandtech.com/show/13796/nvidia-announces-geforce-rtx-2060-349-dollars-january-15th
======
CobrastanJorji
The bigger announcement here is that Nvidia is enabling support for FreeSync
and certifying that things work correctly for certain FreeSync monitors.
That's a big deal. Nvidia syncing previously required their proprietary
"G-Sync," which is substantially more expensive and is usually only present on
monitors explicitly aimed at gaming.

It's too early to tell whether this means G-Sync is dead and FreeSync is just
the new standard, but it's a great day to own a FreeSync monitor.

~~~
markdog12
And it looks like you'll be able to enable it for your monitor even if it's
not one of the expensive certified ones.

------
wlesieutre
Have to wait for benchmarks to say anything definitive, but from the specs
table it looks like performance is in the ballpark of the 1070. Same TFLOPS,
faster memory clock but not as much memory, TDP is 10W higher.
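
A back-of-envelope calculation illustrates the point; the figures here are
assumptions taken from the commonly quoted spec sheets (~6.5 TFLOPS for both
cards, 150 W vs 160 W TDP), not benchmark results:

```python
# Back-of-envelope performance-per-watt comparison. The TFLOPS and TDP
# figures are assumed spec-sheet numbers, not measured benchmarks.
cards = {
    "GTX 1070": {"tflops": 6.5, "tdp_w": 150},
    "RTX 2060": {"tflops": 6.5, "tdp_w": 160},
}

for name, spec in cards.items():
    gflops_per_watt = spec["tflops"] * 1000 / spec["tdp_w"]
    print(f"{name}: {gflops_per_watt:.1f} GFLOPS/W")
```

At the same theoretical throughput with a 10 W higher TDP, the 2060 comes out
slightly behind on paper in performance per watt.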

At $350 that sounds like a tough sell over the 1070, which is finally down to
$300 after rebate. Unless you're really excited about the raytracing stuff.

I'm underwhelmed for 2.5 years of progress since the 1070 (mid 2016). Just
sticking it out with my 970 until it dies, which will hopefully be until
there's something new worth upgrading to.

Maybe AMD has something more impressive up their sleeves in the mainstream
price range.

~~~
Jnr
I'm also sticking with my Nvidia GTX 970 for now, but it would be nice to
upgrade at some point. I was hopeful about AMD's offering, but after trying a
Ryzen 2400G APU in my Linux media box, I'm not going to touch new AMD hardware
with a ten-foot pole any time soon. Their Linux drivers are a nightmare: the
display freezes all the time, requiring a power cycle. Nvidia works nicely,
and I don't really mind the bigger price tag.

~~~
dave7
It's probably done enough damage to their reputation in your estimation not to
matter, but please note these Linux problems are specific to the Raven Ridge
APUs [1]. That's not the case with a discrete AMD GPU, whose open source
drivers are by many accounts absolutely superb these days.

It's a shame. I want a 2400G to replace my ageing A10 Trinity media box, and
will probably give it a try fairly soon, expecting that the latest kernels
have this solved by now...

[1] [https://www.phoronix.com/forums/forum/linux-graphics-x-org-d...](https://www.phoronix.com/forums/forum/linux-graphics-x-org-drivers/open-source-amd-linux/1045801-is-there-any-kernel-on-which-raven-ridge-cpus-are-stable)

~~~
ohithereyou
Thanks for this information. I was planning a GPU passthrough setup using
this: a 2400G APU for the Linux display and an Nvidia GTX 970 for passthrough.
This now gives me pause.

------
YegoBear
Usually the new XX60's are great because you get the previous XX70's
performance at the lower XX60 price. This time you're getting the previous
XX70 performance for the previous XX70 price...more than 2 years later. Weak!

~~~
JohnJamesRambo
Hey, it worked for Apple!

~~~
eanzenberg
I think you're being sarcastic, but it actually has worked, according to their
revenue. They've hit their targets except in the weakened China market, due to
the trade war and volatility there. Because the economy is booming from the
tax cuts and the increased amount of disposable income, the price increases on
Apple products haven't seemed to affect them adversely.

~~~
JohnJamesRambo
Have you checked the stock price lately? $223 to $148 in a few months is no
one’s definition of good. Going from top company by market cap to getting
passed by many of your tech rivals isn't good either. It implies that blindly
raising prices on your products to increase profits only works for so long.

~~~
syspec
This last quarter was their all-time highest quarter in the US. All-time
highest.

------
TwoNineA
I refuse to buy Nvidia cards until they stop blocking drivers in VMs with
passed-through video cards. Yes, I know you can bypass the dreaded code 43;
it's more a matter of principle.

~~~
withinrafael
Interesting, I'd never heard of this until now. A KVM maintainer seems to
confirm this behavior too [1], albeit 3 years ago. Some newer anecdotes are
around, but it isn't clear if this is still happening [2][3]. (I used to
virtualize the GPU via RemoteFX in Hyper-V many years ago and didn't run into
this then. But now I'll have to give it another shot!)

[1]
[https://www.reddit.com/r/linux/comments/2twq7q/nvidia_appare...](https://www.reddit.com/r/linux/comments/2twq7q/nvidia_apparently_pulls_some_shady_shit_to_do/)

[2]
[https://www.reddit.com/r/VFIO/comments/5sh41p/any_other_reas...](https://www.reddit.com/r/VFIO/comments/5sh41p/any_other_reasons_for_nvidia_driver_code_43/)

[3]
[https://www.reddit.com/r/HyperV/comments/5u2ge2/nvidia_consu...](https://www.reddit.com/r/HyperV/comments/5u2ge2/nvidia_consumer_card_passthrough/)

~~~
exrook
It's definitely still happening; however, there are workarounds that bypass
the driver's checks by changing hypervisor settings. One particularly
insidious "feature" is that the Quadro series of cards, which are ostensibly
meant for this use case, will disable some features if a VM is detected (in my
usage, DisplayPort 1.2 support for 4K@60Hz was not enabled until I enabled the
workarounds).
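
For anyone curious, the commonly cited hypervisor-settings workaround on KVM
is a small tweak to the libvirt domain XML. This is a sketch assuming a
QEMU/KVM + libvirt setup; reports vary on whether it still suffices with
current drivers:

```xml
<!-- Fragment of a libvirt domain definition (goes inside <domain>).
     Masks the hypervisor signatures the guest driver checks for. -->
<features>
  <hyperv>
    <!-- Replace the default Hyper-V vendor-id string the guest sees;
         any string up to 12 characters works -->
    <vendor_id state='on' value='whatever4321'/>
  </hyperv>
  <kvm>
    <!-- Hide KVM's CPUID signature from the guest -->
    <hidden state='on'/>
  </kvm>
</features>
```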

~~~
the_pwner224
The workarounds you mentioned didn't work at all for me (and reportedly for
some other people too), on my GTX 680 or 1050m. Just a warning for those who
may be considering buying an Nvidia card and were given hope by exrook's
comment...

------
nickflood
Nvidia is sticking to the strategy of releasing the same GPU with RTX bolted
on, under a lower model number, and charging the same price for it... The 2060
($350) is very close to the 1070 ($300), the 2070 to the 1080, and the 2080
($800) to the 1080 Ti ($750). Edit: not the exact same, obviously, but very
similar in terms of benchmarks.

Doesn't seem like progress to me. With RTX being underpowered on current-gen
cards, there's no real benefit in being an early adopter by buying this
generation of graphics cards if you count the money you spend at all. Just
find a used 1080 Ti, I say.

I think Nvidia is trying to pull an Apple and move the "premium" price tier up
$500, while curiously nobody in the press is calling them out on it.

~~~
TomVDB
Would you have been ok with them calling this a 2080 while keeping a $350
price tag?

Today, I see that a new 1070 Ti goes for $400+. This new RTX 2060 has the same
performance for $350.

In other words: if you ignore the name of the product, the new product line
gives you better performance and more features (RTX, tensor cores) for the
same price as yesterday's product.

Is your only beef with the naming of the product? Why does the name matter?

~~~
nickflood
The problem, in my opinion, is that in the past they would effectively shift
performance down one SKU each generation, and at the top you could get a 30%
performance upgrade for the same kind of money; the 980 Ti to the 1080 Ti, for
example. Now you get roughly the same performance for roughly the same money.

It doesn't matter what they call it if the performance per dollar is what it
is. I compare names because names implied a set price point up through the
10XX generation.

If you've got no use for RTX and tensor cores, those cards are very similar.
RTX is available in only one game 3 months after launch, and what if I just
don't want to be an early adopter? Tensor cores: yeah, if you're going to earn
money or otherwise have your life improved by the additional hardware in
Turing, sure, it's a no-brainer. But I feel that's not the case for many
gamers.

Disclaimer: I'm not saying I'm entitled to progress. I'm pointing out that it
has slowed. I feel this is similar to the lack of competition Intel was
facing. At least I'm thankful to the Nvidia gods for making the
higher-performance card available at all, I guess...

~~~
SketchySeaBeast
> The problem, in my opinion, is that in the past they would effectively
> shift performance down one SKU each generation, and at the top you could
> get a 30% performance upgrade for the same kind of money; the 980 Ti to the
> 1080 Ti, for example. Now you get roughly the same performance for roughly
> the same money.

That's the exact problem with this generation. Before, the top end (980 Ti)
became midrange (1070) for substantially less money. Now they've just pushed
the names up a bracket, kept prices (roughly) the same, and justified it by
inventing a new "top end".

~~~
nickflood
Yep, you get me. And this saddens me doubly, because we saw the same kind of
"upgrade" from $700 to $1000 in flagship phones not long ago, and everybody
was just as okay with it there.

Maybe it's the tariffs, or inflation, or some other thing that totally
justifies this kind of sudden jump, but somehow not many press outlets
acknowledge that this jump exists at all; for them it's business as usual, it
seems.

My only hope is that consumers will react with falling sales. I'm sticking to
my 1080 Ti for sure until the next gen.

------
kristofferR
I really hope AMD can offer some real competition to Nvidia soon; right now
NVIDIA is fleecing customers because they have no real competition. $350 for a
mid-tier GPU is shameful.

~~~
brianwawok
Nvidia stock has lost half its value since its peak. If they're fleecing, they
aren't doing a very good job of it.

~~~
mattnewton
Personally, I think the price explosion had two groups of people driving it.
The first group, which I am in, believes Nvidia, with its proprietary CUDA,
will continue to grow its datacenter presence rapidly and get some big
contracts for fielded ML hardware. The second was looking for crypto exposure.
The Ethereum bubble has mostly burst, and it isn’t profitable (and probably
hasn’t been for a long time in most places) to buy Nvidia cards for crypto
mining. That and the trade war with China have taken half of the stock’s
overheated value off. It’s still up quite a bit from ~$30 in 2016, and I think
that reflects its real edge in ML hardware as well as some modest, healthy
growth in consumer gaming. Previously, I assumed all the growth was from
investors thinking like I was about their AI prospects, and missed that I had
picked a winner based on the crypto craze. I think a lot of smart money
realized that and got out.

Disclaimer: was very long nvidia, sold a lot in the 200s and hold a lot less
these days.

------
pizza234
I'm not very convinced.

Nvidia deserves credit for breaking new ground with the RTX technologies, but
it seems that ray tracing, at least, has a huge performance impact; therefore,
I wonder how this card will fare, given its (relatively) low position in the
lineup.

Moreover, with the RTX series, Nvidia has "pulled an Intel": they raised the
TDP for each model so that they look good on generic (non-RTX) benchmarks,
but they actually have a negligible performance-per-watt gain compared to the
previous generation.

GTX x60 cards traditionally had fixed power requirements (a single 6-pin
connector); this no longer applies.

My suspicion is that the GTX 11x0 cards (1160, 1150) haven't appeared yet
because Nvidia hasn't figured out a way to show a performance-per-watt
improvement.

~~~
suresk
> My suspicion is that the GTX 11x0 cards (1160, 1150) haven't appeared yet
> because Nvidia hasn't figured out a way to show a performance-per-watt
> improvement.

My guess is slightly different: they know the RTX technology isn't really
there yet, and models that were just faster versions of the current generation
would probably kill any RTX sales.

Putting RTX on a 2060 is odd to me. The 2080 can't really run ray tracing at
usable performance levels; I don't even want to know what it would look like
on a 2060. There might be more rounds of software improvements coming, but are
those enough to make it playable two tiers down?

------
ricardobeat
If only NVIDIA would drop their expensive G-Sync and use FreeSync instead.
That would eliminate the strongest reason right now to buy AMD (way larger
selection of quality monitors).

~~~
SketchySeaBeast
Apparently you're a wizard, but don't let that go to your head.

[https://www.pcgamer.com/nvidia-brings-g-sync-support-to-free...](https://www.pcgamer.com/nvidia-brings-g-sync-support-to-freesync-monitors/)

------
swalsh
How would this compare to my 980ti? It's been working pretty well for VR, but
I've been wondering if maybe there's some value in upgrading soon.

~~~
onli
This will be a little faster than the 980 Ti: the 2060 is at the level of a
GTX 1070 Ti, while the 980 Ti is a bit slower than a GTX 1070. But I'd wait a
generation longer; the jump is not worth it (unless you can get a great price
when selling the 980 Ti).

------
oldboyFX
I wonder if low sales numbers for the new generation of Nvidia cards will
negatively impact the current and future development of the ray tracing tech.

~~~
blt
Doubtful, since the ideas behind ray tracing are very mature. Advances in
real-time ray tracing will take the form "X advancement in offline ray tracing
from 30 years ago can now be done in real time."

------
sneakernets
It's still ridiculously huge.

~~~
moftz
It's not much different from any other Nvidia card: dual fans, dual slot; it's
probably smaller than some of the monster Radeon cards AMD used to make. They
make smaller versions of some cards, but what you save in volume you lose in
cooling.

~~~
sneakernets
That is true. To be honest, I'm still salty about some older OEM mobos I had
to upgrade, which decided to crowd the PCI-e slots with capacitors. All that
did was make the slots impossible to use!

------
dschuetz
I'm still waiting for the AMD implementation of hardware real time raytracing.
I don't trust Nvidia anymore concerning hardware quality, and the price is
just ridiculous.

