
Nvidia GeForce GTX 1660 Ti 6GB Review - babak_ap
https://www.anandtech.com/show/13973/nvidia-gtx-1660-ti-review-feat-evga-xc-gaming
======
LeoPanthera
I'm very confused by Nvidia's numbering scheme. It used to be generation-
model, so my 1080 is generation 10, and "80" is an arbitrary model number
where higher is better. It's the logical successor to the 9-80.

The 20-x cards I suppose are OK, a big jump to signify a big change in
architecture.

But now we have... 16-60? Why 16? Is this the successor to the 1060? And it's
a "Ti", but there isn't a non-Ti 1660?

I'm confused.

~~~
coherentpony
So, perhaps it's me, but why do folks care about the number on the part? It's
all marketing. Why does it matter? What matters is performance, right?

~~~
brucemoose
Software is versioned in numerical order which is pretty intuitive. At a
glance it's clear which versions are minor updates, and which are major.

Cars are versioned by year/model, which again makes it pretty clear to
understand minor/major updates. Sometimes significant updates are introduced
in a model year, but generally the core features remain the same and it could
still be considered an upgrade to that model.

Without a clear and intuitive versioning scheme it can be confusing and time
consuming to make sense of a product line. And that gets frustrating if it
keeps changing.

~~~
alasdair_
>Cars are versioned by year/model, which again makes it pretty clear to
understand minor/major updates. Sometimes significant updates are introduced
in a model year, but generally the core features remain the same and it could
still be considered an upgrade to that model.

Tesla managed to break this trend massively, which poses a problem for things
like insurance. The feature set of (say) the January 2014 Model S is very
different from that of the December 2014 Model S, even though they technically
share the same "year".

~~~
mirashii
Tesla didn't break this any more than any other car manufacturer. The model
year hasn't been tied to the manufacture year for a long time. See
https://www.autotrader.com/car-news/why-doesnt-cars-model-year-match-calendar-year-262514
for instance.

~~~
hakfoo
I think the difference is more "model year no longer reliably indicates
feature set", not "model year no longer indicates date of manufacture."

This may make finding and pricing spare parts difficult, or categorizing
safety and performance metrics.

------
r1ch
The NVENC hardware encoder in Turing is actually comparable to x264 at fast /
veryfast. This card will be quite interesting to Twitch streamers, as it opens
up the possibility of streaming without CPU impact for those on a limited
budget.
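
For anyone wondering what that looks like in practice, a minimal sketch of
pushing a stream to Twitch with the hardware encoder (this assumes an ffmpeg
build with NVENC support; the capture sources, bitrates, and stream key are
placeholders, not a recommended configuration):

```shell
# Capture the desktop (video) and PulseAudio (audio), and encode video on
# the GPU's dedicated NVENC silicon (h264_nvenc) instead of CPU libx264.
# The ingest URL, stream key, and bitrates below are placeholders.
ffmpeg -f x11grab -framerate 60 -i :0.0 \
       -f pulse -i default \
       -c:v h264_nvenc -b:v 6000k -maxrate 6000k -bufsize 12000k \
       -c:a aac -b:a 160k \
       -f flv rtmp://live.twitch.tv/app/YOUR_STREAM_KEY
```

The part the comment is about is `-c:v h264_nvenc`: encoding happens on
dedicated hardware, so the game keeps its CPU headroom.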

~~~
akanet
A lot of people are comparing RTX's NVENC to x264 medium. Favorably, I might
add:
[https://www.youtube.com/watch?v=-fi9o2NyPaY](https://www.youtube.com/watch?v=-fi9o2NyPaY)

~~~
the8472
I think some charts with SSIM measurements at various bitrates, resolutions,
etc. would be far more informative than a YouTube video.
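
For reference, a hedged sketch of the kind of measurement being asked for:
the core SSIM formula over whole grayscale frames with NumPy. Real codec
comparisons compute SSIM in small windows per frame and average over the whole
clip at each bitrate; the constants below follow the standard definition for
8-bit content, and the "noisy encode" is synthetic.

```python
import numpy as np

def ssim_global(x, y, L=255.0):
    """Global (single-window) SSIM between two grayscale frames.

    Real benchmarking tools compute windowed SSIM per frame and average
    across the clip at each bitrate; this is just the core formula.
    """
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1 = (0.01 * L) ** 2  # standard SSIM stability constants
    c2 = (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2)
    )

# Identical frames score 1.0; a noise-degraded "encode" scores lower.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (64, 64))
noisy = np.clip(frame + rng.normal(0, 20, frame.shape), 0, 255)
print(ssim_global(frame, frame))  # 1.0
print(ssim_global(frame, noisy))  # < 1.0
```

Run that at several bitrates for each encoder and you get exactly the kind of
chart being asked for.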

------
Justsignedup
The question is: With prices returning to sanity, is it better bang for your
buck to get a 1660 or a 2060? It seems like a $100 difference, but the 2060
still seems like an amazing value today.

~~~
smacktoward
It comes down to whether you think RTX/raytracing will go from being a novelty
to being a must-have within the next few years. If it doesn't, the 1660 is the
right choice: you'll save some money now and can always get RTX in your next
card if it makes sense then. On the other hand, if it _does_, getting the 2060
today will spare you having to buy a new card sooner than you normally would
just to get RTX support.

~~~
swinglock
If ray tracing becomes widespread in games, my bet is you'll need a new card
to run it with good enough visual fidelity and performance anyway. This is
just the first generation.

I think of RTX like the first Oculus Rift. You shouldn't buy it to be
future-proof; it will be anything but. You buy it only because you're an
enthusiast who wants to play with the latest tech early and plans to upgrade
before it breaks or becomes obsolete anyway.

~~~
andybak
Being fair - the Rift hasn't been superseded and there's not really anything
imminent that would make it obsolete. So someone who bought it on release has
had a pretty good run.

Obviously - this is partly because VR uptake has been slower than some
expected, but I use a Rift daily and it still does its job admirably.

~~~
swinglock
I'm talking about the first version that consumers could buy, which a
colleague brought to work in 2013, I think. It had a low resolution.

~~~
andybak
The first consumer Rift is the same one currently being sold.

You might mean the DK1, which wasn't generally available to non-developers,
but I don't think there were any stringent checks in place over who counted
as a developer.

------
sandeatr
Why do they benchmark at 1080p? Can't every card under the sun run games at
60 fps at such a low resolution?

And no 4K? Wtf? Shouldn't that be the standard by now?

~~~
vbezhenar
Definitely not every card under the sun. I have a 1060 and it can't run WoW
(a game from 2004) at 60 fps at 1080p unless I lower some settings.

~~~
skohan
That sounds like a CPU bottleneck, or else very poor optimization. I cannot
imagine WoW is that demanding on a GPU.

~~~
cbg0
It is when you're in a raid and there are dozens of spells going off all
around you. They have made some tweaks recently, but it's still not
incredibly well optimized.

~~~
skohan
Sounds like a particle system batching and/or overdraw issue. It could
probably be solved by more intelligent batching and culling, or by throttling
particle count/resolution when the scene gets too busy.
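
As a rough illustration of the throttling idea (the budget number and the
proportional scaling policy here are hypothetical; a real engine would budget
by GPU frame time rather than raw particle counts):

```python
def throttle_particles(requested_counts, budget):
    """Scale each emitter's particle count so the total stays under budget.

    requested_counts: particles each active spell effect wants to spawn.
    budget: max total particles allowed in the scene (hypothetical number).
    When a raid stacks dozens of effects, every emitter is scaled down
    proportionally instead of tanking the frame rate.
    """
    total = sum(requested_counts)
    if total <= budget:
        return list(requested_counts)
    scale = budget / total
    return [max(1, int(n * scale)) for n in requested_counts]

# Quiet scene: nothing gets throttled.
print(throttle_particles([200, 300], 10_000))  # [200, 300]
# Raid scene: 40 effects at 500 particles each are scaled to fit the budget.
print(throttle_particles([500] * 40, 10_000))  # 40 emitters at 250 each
```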

~~~
kennyadam
You've solved it!

------
beerlord
Looks like 1440p gaming at a reasonable budget is finally here. Now we just
need some good (non-curved) Nvidia-approved adaptive-sync monitors at 32"
1440p IPS.

------
walrus01
For the price, I'm kind of skeptical that it will outperform a previous-
generation GeForce 1060, which can be purchased for $259 to $299 in various
places.

I just got a 1060 that has 3x DisplayPort 1.4 outputs + 1x HDMI 2.0 output;
it can drive four 4K displays at 60 Hz.

The 1060 is probably somewhat more power hungry and hot under load.

~~~
jplayer01
I'm sorry, what is there to be skeptical of? There are plenty of reviews with
hard data and measurements. This isn't one of those things where something
stops being true just because you refuse to look.

https://www.guru3d.com/articles-pages/msi-geforce-gtx-1660-ti-ventus-xs-review,14.html

Now, whether it's worth another $279 to upgrade from the 1060, well, probably
not. It _is_ very much worth considering if you have a card older than the
1000 series. It performs on par with a 1070, gets within spitting distance
of the Vega 56 at 1080p/1440p in a bunch of games, and it's the most power-
efficient card out there.

~~~
walrus01
Yes, what I meant is that for a person who bought a 1060 two months ago, it's
not worth spending the same $279 again to upgrade.

~~~
jplayer01
Oh definitely. The 1060 is plenty performant for modern games (except for
pathological cases like Anthem) and somebody who owns one shouldn't be in any
hurry to get the 1660. The 2060 is a better candidate for that, but pricey.

~~~
chx
Which makes it particularly sad the Galax SNPR 1060 eGPU didn't see a wider
release. I have one, it's great.

