
Nvidia GeForce RTX 3080, RTX 3070 Leaked Specs: Up to 20GB GDDR6 RAM - verst
https://www.tweaktown.com/news/70053/nvidia-geforce-rtx-3080-3070-leaked-specs-up-20gb-gddr6-ram/index.html
======
Brave-Steak
At this point all I care about is the price. The price of GPUs has ballooned
in an insane way recently and it’s impossible to get a decent deal that’s
significantly faster than what I have now (and I haven’t upgraded in years).
The used market seems to have been decimated, with even last gen cards going
for crazy high prices. I suspect the 30x0 line will be even more expensive
than the current one, and I hate it.

~~~
ahelwer
Yeah, I really don't want to pay more than around $120 for a decent GPU, which
gets me a used GTX 1060 or 780 Ti. Both of those play new games at high
settings at 1440p; what more could you want? Shelling out $700 for a GPU to
get good VR or RTX tech demos just isn't worth it.

~~~
TylerE
It is if you use a high refresh rate. My 1080 Ti is barely adequate for
1440p165... and even at that I'm usually turning settings down to average
100–120 fps.

------
marcus_holmes
I worked in a life assurance company in 1996. The networked shared hard drive
that the entire 15-ish person IT department worked from was 4GB. We considered
that to be a decent size, and rarely ran into space issues. I keep wondering
where this ends up - are we going to have 4TB graphics cards and 1PB hard
drives in 2040?

~~~
viraptor
There are two economic limits that I see.

1. You're only going to see so much detail, so making textures bigger will
stop making sense fairly soon. We're already at the point where art design /
mapping / movement / realistic lighting could improve things much more than
extra resolution. Games can already run at up to 4K, and virtually nobody
even has the hardware for it.

2. There's only so much bandwidth on the storage-to-graphics-memory and
graphics-memory-to-GPU links. If you can't realistically process the
textures/geometry you want to display, there's no point making them.

1PB drives will come eventually though, for long-term storage. We already have
petabytes of data being processed in some companies. It's just a question of
the $/B ratio, whether the R/W speed is reasonable for that size, and how
reliable you want that storage to be.
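
To put the R/W-speed point in rough numbers (a back-of-envelope sketch,
assuming ~250 MB/s sustained throughput for a single spinning drive):

    # How long does one full pass over a 1 PB drive take at an assumed
    # ~250 MB/s sustained sequential throughput?
    capacity_bytes = 1e15            # 1 PB (decimal)
    throughput_bps = 250e6           # 250 MB/s, assumption for a single HDD
    seconds = capacity_bytes / throughput_bps
    print(seconds / 86400)           # ~46 days for a single full read or write

At that rate a full scrub or rebuild takes well over a month, which is why the
speed question matters as much as the $/B one.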

~~~
Fredej
For gaming I think it's somewhat true that we may reach a visual limit at
some point.

However for other applications it's not the same thing. I work in medical 3D
imaging - we are constantly memory constrained.

------
james_s_tayler
But can it play Control with full Ray Tracing on Ultra detail in 4k at 60 FPS?

If yes, I'm getting one.

~~~
polyterative
without internal upscaling maybe

------
that_lurker
Is there really a need for 20gb of ram in a GPU?

~~~
ChuckNorris89
Machine learning, data science and real-time professional simulations.

You need to have your models in video RAM to be able to _"play"_ with them in
real time, so with 20 gigs they can work with larger models.
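
Rough numbers (a sketch with assumed sizes, not a real profile): weights alone
scale with parameter count times bytes per parameter, and activations,
gradients and optimizer state multiply that further.

    # Back-of-envelope VRAM needed just to hold model weights.
    def weights_gb(num_params, bytes_per_param=4):   # 4 bytes = fp32, assumption
        return num_params * bytes_per_param / 1e9

    print(weights_gb(2.5e9))   # ~10 GB for 2.5B fp32 parameters, before
                               # activations, gradients or optimizer state

More VRAM directly raises the ceiling on the model you can keep resident and
poke at interactively.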

~~~
samplatt
>Machine learning, data science and real-time professional simulations.

With photogrammetry (my field) we need to process anywhere between 1GB and
500GB of DSLR photos in one project and triangulate points between all these
photos. Lots of RAM helps. Lots of VRAM helps JUST as much.

Crypto mining takes advantage of large VRAM too.

------
gautamcgoel
The big news to me is that the top card features a 320 bit memory bus, which
should offer much better memory bandwidth than the previous generation (256
bits). Assuming linear scaling, that's a 1.25× improvement over the previous
generation.

~~~
ksec
>The big news to me is that the top card features a 320 bit memory bus, which
should offer much better memory bandwidth than the previous generation (256
bits).

The RTX 2080 is 352 bits, and the RTX Titan is 384. Nvidia has always shipped
wide memory buses; the 256-bit cards were AMD's GDDR Radeons. Arguably bus
width is meaningless without looking at the actual memory type, or the total
bandwidth from memory.
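
For reference, peak bandwidth is roughly bus width times the memory's
effective data rate, so the same width can mean very different numbers
depending on the memory used (a sketch with assumed GDDR6 speeds):

    # Peak memory bandwidth ≈ bus width in bytes × effective data rate (GT/s).
    def bandwidth_gbs(bus_bits, data_rate_gtps):
        return bus_bits / 8 * data_rate_gtps

    print(bandwidth_gbs(256, 14))   # 448 GB/s: 256-bit GDDR6 at 14 Gbps
    print(bandwidth_gbs(320, 14))   # 560 GB/s: wider bus, same memory speed
    print(bandwidth_gbs(256, 16))   # 512 GB/s: narrower bus, faster memory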

~~~
RealStickman
Actually the RTX 2080 has only 256 bit: [https://www.techpowerup.com/gpu-specs/geforce-rtx-2080.c3224](https://www.techpowerup.com/gpu-specs/geforce-rtx-2080.c3224)

The 2080 Ti has 352 bit: [https://www.techpowerup.com/gpu-specs/geforce-rtx-2080-ti.c3...](https://www.techpowerup.com/gpu-specs/geforce-rtx-2080-ti.c3305)

Interestingly, Nvidia decided to use a bigger die (relative to other dies in
the family) for the 3080 than for the 2080. TU104 (2080) vs GA103 (3080).
Probably to prepare for big Navi.

------
wdroz
4k@144fps would be a dream, but even with my 2080 Ti, I can't run AAA titles
at 1920x1080@144fps without severely cutting down on graphics settings.

I doubt that the 3xxx series will be strong enough to sustain a consistent
4k@144fps.

~~~
romanovcode
Maybe it's the CPU that's throttling?

I cannot believe that a 2080 Ti can't run fullHD@144fps. I have a 2080 Ti and
I get a constant 4k@60fps in AAA titles.

~~~
1_player
I’m not informed on the PCI specs but I bet the bottleneck is simply bandwidth
(or CPU), the GPU is capable enough.

I have a 2080ti as an Thunderbolt 3 eGPU, which has an even smaller bandwidth
than PCI Express so the effect is accentuated. I lose about 10% perf on 4K
compared to a desktop computer, and 40% at 1080p. I don’t recall the exact
numbers but a benchmark would do 80fps on 4K and 120fps on 1080p, even though
it’s 1/4th the amount of pixels. This is because most of the time is spent
transferring data on the bus.
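
One way to see why the hit is bigger at 1080p (a toy model with a made-up
fixed per-frame bus cost; my real numbers will differ):

    # Toy model: the eGPU link adds a roughly fixed cost per frame, so the
    # relative loss grows as the base frame rate rises.
    def egpu_fps(desktop_fps, bus_overhead_ms=2.0):   # 2 ms is an assumption
        frame_ms = 1000 / desktop_fps + bus_overhead_ms
        return 1000 / frame_ms

    print(egpu_fps(90))    # ~76 fps, ~15% loss at 4K-ish frame rates
    print(egpu_fps(200))   # ~143 fps, ~28% loss at 1080p-ish frame rates

The per-frame overhead is similar in both cases, but it eats a much larger
share of the frame budget when frames are short, which is why 1080p suffers
more than 4K.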

