
Nvidia Unveils GeForce RTX 30 Series GPUs - mkaic
https://blogs.nvidia.com/blog/2020/09/01/nvidia-ceo-geforce-rtx-30-series-gpus/
======
modeless
I love a small feature he mentioned in the video: Monitors with built-in
latency measurement. You plug your mouse into the USB ports on the monitor and
it tells you how long it takes the image to change after you move the mouse.
Brilliant idea.

It's long overdue to have widely available metrics for latency in consumer
tech. Many hardware/software setups have absurdly long latency for no good
reason other than that it's difficult to measure. People underestimate the
subconscious effects it has on your usage of technology.

I couldn't be happier that phone and monitor manufacturers are finally
starting to compete on refresh rates >60 Hz. It's far more important than wide
gamut or 8K.

~~~
jacquesm
That's because our operating systems treat user input as a soft real-time
problem when it should actually be a hard real-time problem. That's why your
window manager can sometimes be unresponsive. I've used a hard real-time OS
for a desktop for some years, and the difference from consumer stuff was
extreme. You get used to the computer _instantly_ doing what you tell it to
do. None of these half second or longer delays between command and result.
It's like magic.

~~~
acdha
This is one of the things I still miss about BeOS. On late 90s hardware, the
OS wasn't quite hard real-time like QNX, but the worst-case latency was much
better than Windows 10 or MacOS on the latest hardware, even in the presence
of heavy resource contention — I remember simultaneously surfing the web,
downloading video from a camcorder with a Firewire interface which did not
buffer, and compiling Mozilla. The UI latency didn't change at all, and the DV
transfer didn't drop a packet — everything else got throttled back as
necessary, of course, but that meant that, say, GCC took longer to run rather
than impacting what I was doing in the foreground.

~~~
k__
Is it possible to do this with Linux, e.g. by giving the UI a very high priority?
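
For instance, would something along these lines be the right direction? Just a
rough sketch (untested); it assumes you already know the compositor's PID and
that you're running as root, since SCHED_FIFO needs CAP_SYS_NICE:

    # sketch: give an already-running compositor/window manager a higher priority
    # (the PID here is a hypothetical command-line argument; needs root/CAP_SYS_NICE)
    import os
    import sys

    pid = int(sys.argv[1])

    # option 1: a plain niceness boost (milder, no real-time policy)
    os.setpriority(os.PRIO_PROCESS, pid, -10)

    # option 2: SCHED_FIFO real-time scheduling, priority 50 out of 1-99
    os.sched_setscheduler(pid, os.SCHED_FIFO, os.sched_param(50))
    print("policy is now:", os.sched_getscheduler(pid))

Whether the rest of the desktop stack actually honours that end to end is the
part I'm not sure about.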

~~~
mensetmanusman
My 5-year-old uses our first iPad, which is over 10 years old now. The keyboard
UI from then is still faster than my iPad Pro's keyboard UI, because Steve Jobs
made that a priority in the stack :)

~~~
ksec
I think Steve Jobs had the same latency-sensitive condition as some of us do:
getting irritated when things aren't as fast as they're supposed to be.

~~~
Lio
I have an old Panasonic “smart” TV with a daft animated but glacially slow UI.

You can tell the execs who approved it were probably impressed watching
someone else demo it, but never used it themselves.

Jobs would have spotted it was rubbish immediately.

~~~
jonplackett
OK so this is a partial tangent - but I have a washing machine that annoys the
hell out of me for this exact same reason (I know washing machine UI is not a
hot topic usually but it damn well should be!)

It has a dial on the front to choose settings. Potentially a great idea to
skip quickly to the one I want - BUT the damn thing can only detect 1 step of
change about every half second. So if you fast-turn it 4 clicks, it still only
moves one step. So you have to stand there, slowly turning it, click, click,
click, click....

The dial is the biggest design feature on there. Massive. Right in the middle.
Bright lights all around it. But they couldn't even be bothered to make it
solve the one problem it was there to solve.

~~~
shadyMrPatch
I'm getting angry even reading about it.

It's the kind of thing that would have made you reject it, if only you'd
thought to test that specific bit of it before buying.

~~~
jonplackett
I knew Hacker News was the right place to share this story. Only here would
people understand the pain I'm in!

~~~
Shared404
I am almost a day late... But yes.

Honestly, I wish that they would make appliances with physical controls as
opposed to digital. At least those are easier to fix/mod on your own.

~~~
jonplackett
I always think that about modern SLR cameras. They took a series of brilliant,
instantly accessible physical dials (around the lens for aperture, on the top
for exposure) and replaced them with menus and buttons on a tiny screen. WHY?
How is that progress?

I think if someone did a Kickstarter for a very simple digital camera with
physical buttons, it would do very well.

------
ZeroCool2u
Wow, I wasn't planning on upgrading from a 1080 (non-Ti) but the 3080 is so
good and priced so well, I probably will. I just ordered a 240 Hz 1440p IPS
monitor and I wasn't planning to hit 240 Hz, but this makes it so easy I might
as well.

My day job is primarily ML as well, so I might just go for the 3090. 24 GB of
memory is a game changer for what I can do locally. I really just wish Nvidia
would get its shit together with Linux drivers. Ubuntu has done some great
work making things easier and just work, but having them directly in the
kernel would be so much nicer.

One thing I'm curious about is the RTX IO feature. The slide said it supports
DirectStorage for Windows, but is there an equivalent to this for Linux? I'm
hoping someone with a little more insight or an Nvidia employee may have some
more information.

~~~
me_me_me
Don't buy it yet; usually OEM versions are more optimised and better value
overall.

But I must say the 3070 at $500 is really tempting, even for someone who is not
looking to upgrade.

~~~
twblalock
Now that the reference cards don't have the blower coolers they are a lot more
tempting. I'd still wait for reviews to see how loud they are though.

Up to now, OEM/AIC/AIB cards have had quieter cooling systems, but they also make
you pay for the silly graphics, the RGB, etc. I don't think the overclocking
ability is really that much better. A quiet reference design with a non-blower
cooler and a good warranty is what I've always wanted.

~~~
fomine3
Yeah, NVIDIA is doing a blower-style design again; the RTX 2000 FE lacked the blower design.

------
jjcm
A small thing that I haven't seen mentioned yet, but these cards have native
hardware decoding for AV1. With chrome just launching support for AVIF this
last week it seems like more and more platforms are getting out of the box
support for it. Nvidia is also working with a lot of partners[1] on it, it
seems. I'm currently working on a social video platform, and having a unified
next-gen codec that works everywhere for both images and video would be SO
helpful. Hopefully this trend progresses - would love to be able to do some
extremely high quality streaming.

[1] [https://www.nvidia.com/en-us/geforce/news/rtx-30-series-
av1-...](https://www.nvidia.com/en-us/geforce/news/rtx-30-series-
av1-decoding/)

~~~
Polylactic_acid
I'm most interested in when we'll get hardware encoding for AV1. Encoding is
painfully slow right now unless you're on the lowest settings.

~~~
cheschire
There’s not a ton of money in that since most people who would need fast
encoding are streaming, and those folks already use capture cards or
secondary PCs to record.

~~~
Polylactic_acid
It would be super useful if it could be done on mobile SoCs. Realtime AV1
encoding on mobile phones would be huge. But even if you don't care about
speed, AV1 encoding is still painfully slow. Encoding an hour of video can
take over a day on a modern CPU.
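
To give a feel for the knobs involved today, a software encode currently looks
something like this (a sketch assuming an ffmpeg build with libaom-av1; the file
names are placeholders, and cpu-used is the speed/quality trade-off - even at
the fast end it's slow):

    # sketch: software AV1 encode via ffmpeg's libaom-av1 encoder
    import subprocess

    def encode_av1(src: str, dst: str, crf: int = 30, cpu_used: int = 4) -> None:
        # cpu-used goes from 0 (slowest/best) to 8 (fastest/worst); row-mt helps threading
        subprocess.run([
            "ffmpeg", "-i", src,
            "-c:v", "libaom-av1",
            "-crf", str(crf), "-b:v", "0",   # constant-quality mode
            "-cpu-used", str(cpu_used),
            "-row-mt", "1",
            dst,
        ], check=True)

    encode_av1("input.mp4", "output.mkv")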

~~~
ed25519FUUU
I’m surprised there’s no solutions using mobile phones, which usually have a
SoC with hardware encoding for h264 at 4k.

------
johnwheeler
It's interesting the effect new graphics have on how old graphics are seen. I
remember when Resident Evil came out on the first Playstation, at the time, I
considered certain game elements like fire and water indistinguishable from
reality. Now it looks like pixelated garbage and I ask myself how I could have
ever thought that.

~~~
jamroom
Oh man - the same here with me - when I first got Morrowind there was an
option you could enable in ATI (now AMD) cards at the time that made water
look "real". I was able to enable that and was blown away - I was like "well
that's it - doesn't get any more real than this". Having loaded Morrowind
recently I could not believe how bad it looked lol. Makes me wonder - what are
we "not" seeing right now that will make us think this way 10-20 years from
now?

~~~
formerly_proven
Hm. Many people here share the sentiment of perceiving older games (or
current games) as being close to photorealistic. Personally it never felt that
way to me. All games have obvious problems where they don't behave/look
anywhere close to reality. If you remove the interactive element and only use
somewhat massaged screenshots, then sure, certain elements are basically
photorealistic and have been for a while. For example, landscapes look pretty
darn realistic. Anything alive or even most man-made artefacts, not so much.

Apart from graphics most stuff in games is pretty rough. Animations are
generally bad and a long way from reaching the uncanny valley of "getting close";
they're still in the "abstract representation of concept" detail level. AI is
dumb (largely for [perceived] player-acceptance reasons). Sound is generally
poor; some games still don't use 3D sound. Physics are "abstract
representation" level again, some games still have fly-through walls and
vibrating items. etc. etc.

~~~
KaoruAoiShiho
Yeah... these comments are kinda weird. Hate to say it but maybe they should
go outside more lol, computer graphics always looked pretty bad to me, even
the clips in today's Nvidia presentation look really unnatural. That's not to
take anything away from the technology and the massive advances it represents,
just compared to reality it's still really far from fooling a human brain.

~~~
wiz21c
Interestingly, many of us feel that there's a huge gap between the past and
now. But how many of us can actually verbalize what will change in the
rendering of pictures in, say, 3 years?

For example, I can clearly see that ray tracing produces better results. But
it's a bit harder to tell how much better it is, to find the words that describe
how much better it is. Of course, one can say that, for example, photon tracing
is more physically accurate. But still, what words can we use to describe how
real (or not real) a still image is? ("more realistic" doesn't count :-))

~~~
bonoboTP
Game graphics look sterile, too clean, sharp, plasticky. The real world is
messier with less clear separation between objects, things blend together
more, subtle shadows, surfaces are less uniform, there is more clutter, dirt,
wrinkles, details.

~~~
nightski
Is that an artifact of the graphics capability? Or the art style? I think the
two often get confused and many games are specifically designed for a
sterile/clean look.

~~~
bonoboTP
Not sure. It's hard to analyze as it's more of a visceral impression. It could
be in part the way natural environments tend to structure themselves over
time, like how we throw our stuff around in natural ways.

But also, the design often adapts to the capabilities. Games like GTA3 used to
have thick fog just to hide the fact that they couldn't render stuff far away
in time. You can say that's an artistic choice to give a smoggy big city
atmosphere, but clearly it was a practical choice as well. Even today, game
designers like to make things dark, blurry and rainy, so that the un-realism
becomes less obvious.

------
fluffything
* RTX 3070, $499, faster than the RTX 2080 Ti

* RTX 3080, $699, 2x faster than the RTX 2080

* RTX 3090 (Titan), $1,500, 1.5x faster than the RTX Titan, 8K resolution @ 60 FPS with RTX on in Control.

\---

I hope that if somebody bought an RTX 2080 or similar in the last 2-4 weeks,
that they bought it over Amazon, and can return it.

~~~
JackMcMack
As always, wait for the benchmarks before deciding to buy (or return). My
guess is the performance improvements are the biggest for raytracing, which I
personally don't care for. And let's not forget the huge power draw, requiring
a new power connector and a 3 slot cooler.

The 8N manufacturing process is presumably Samsung's, which will probably be
beaten by TSMC 7nm.

I'm holding out for RDNA2.

~~~
fluffything
> As always, wait for the benchmarks before deciding to buy (or return).

Why would you keep a $1,200 RTX 2080 Ti or a $2,500 Titan when you can get at
least the same perf for 50-75% of the price with the new products, and much
better RTX perf?

This assumes that with RTX off the new gen won't be slower than the old one,
but I think that's a fair assumption, and if it isn't, you can always return
the 30xx card in 2 weeks and buy a used 2080 Ti or Titan on eBay for
pennies, once the market overflows with people upgrading to the 30xx series.

People were still asking $900 for a used 2080 Ti on eBay this morning, and
the 3080's $700 price just destroys that offer. Many owners are going to try to
dump these cards for as much as they can get in the next two weeks. I wouldn't
buy a used 2080 Ti for more than $250 today. In one month, these cards are
going to sell used for $100-200. If AMD can only match the 2080 Ti perf, they
are going to have a very hard time pricing against the used 2080 Ti market.

> And let's not forget the huge power draw, requiring a new power connector
> and a 3 slot cooler.

That's only for the 3090 IIUC. All other cards were announced as being
actually much smaller than the 20xx series ones.

~~~
JackMcMack
True, the new generation offers better price/performance vs the previous
generation.

Do note that the 3080 is still 2 weeks away (Sept 17), the 3090 Sept 24, and the
3070 in October.

I would consider getting a non-Founders Edition though; in the past other
brands have had better/quieter cooling for pretty much identical pricing.

Edit: While the 3090 (350W) is the only one requiring a 3-slot cooler, all 3
use the new 12-pin power connector. The 3080 Founders Edition power draw is
still 320W, vs 220W for the 2080.

~~~
binaryblitz
The 12-pin connector has an adapter for two 8-pin cables. This is a non-issue
unless the connector just bothers you for some weird reason.

------
en4bz
All this on Samsung 8nm (~61 MT/mm2). They didn't even feel the need to use
TSMC 7nm (~100 MT/mm2). Probably to keep the price down and to reserve capacity
at TSMC for the A100.

This is like the anime hero/villain (depending on your perspective) equivalent
of fighting at half power.

~~~
ss248
>They didn't even feel the need to use TSMC 7nm

They just couldn't get enough wafers. They tried to force TSMC to lower the
prices and it backfired.

~~~
pdimitar
How did it backfire?

------
dang
We changed from [https://www.nvidia.com/en-us/geforce/special-
event/?nvid=nv-...](https://www.nvidia.com/en-us/geforce/special-
event/?nvid=nv-int-gfhm-55882#cid=_nv-int-gfhm_en-us), which is a video of
(what looks like) the same material.

Some of the comments in this thread are about things people saw in the video
(spatulas?), so keep that context in mind.

~~~
mkaic
Yeah I posted the live announcement video and only found the static launch
page a few minutes after that. Apologies :)

~~~
dang
No worries!

------
metalliqaz
The 20xx series was very obviously the "tech development" release and was a
terrible value. It was first gen ray tracing and thus actually unequipped to
fulfill its promises. The 30xx series looks to be much better and is probably
finally worth the upgrade from 9xx and 10xx equipment.

~~~
Zenst
Generally with new feature leaps, it is the second release that meets the
expectations set by the initial feature release.

------
Ninjinka
The pricing is insane. $499 for a card that beats the 2080 Ti?

~~~
mkaic
And to think I was just about to buy a $400 2060 Super. I don't think I'll be
doing that anymore.

~~~
easytiger
I recently got a 2070 Super OC. It really struggles with, say, COD:MW at near-4K
res (3440x1440). Indeed, the campaign with RT on doesn't get more than ~50 fps.
Kinda disappointed really. I get about 100 fps in multiplayer with everything
turned off/low.

But the cost of a 2080 Ti was ridiculous, especially considering the open secret
of its impending obsolescence.

AMD really need to up their marketing budget. From the zeitgeist I've no idea
where their lineup sits comparatively. No wonder they only have 20% market
share.

~~~
hellotomyrars
AMD still don't offer a compelling product for the high-end, which Nvidia is
taking full advantage of. AMD cards are the most economical on the market and
have their own advantages but they're niche (Linux support is much better for
example).

AMD got on top of Intel by creating hardware that delivers. Just a few years
ago AMD CPUs were economical, but the performance was abysmal, especially in
single-threaded applications by comparison. They didn't make a product for the
high-end.

I am eagerly awaiting what the next series of AMD cards are going to be able
to do. They're talking a big game for sure. But Nvidia has a big software
advantage as well as a hardware advantage over AMD, and that's likely to be a
sticking point for me personally on my next purchase. Nvidia spends a lot of
resources on working closely with developers and providing them support they
need to take better advantage of the hardware with nvidia-specific features.
AMD doesn't seem to do the same, and has had much higher profile issues with
their drivers in my experience.

All that said, I hope AMD can provide a product to truly compete at the high-
end with Nvidia, to hopefully drive prices down as GPU prices have gone up
dramatically on the high end.

------
mkaic
Here's the static launch page if anyone's interested:

[https://www.nvidia.com/en-us/geforce/graphics-
cards/30-serie...](https://www.nvidia.com/en-us/geforce/graphics-
cards/30-series/?nvid=nv-int-cwmfg-49069#cid=_nv-int-cwmfg_en-us)

~~~
ckastner
Thanks! Most importantly: "Available on September 17th"

~~~
tgb
That's for the 3080, the 3090 is Sep 24th, and 3070 is "October", from lower
down on that same page.

------
trollied
It's going to be very interesting to see what AMD counter with. I don't think
anyone was expecting pricing this aggressive.

Competition is great!

~~~
tmpz22
GPUs are still overpriced because of Nvidia's monopoly on high-end cards. The
fact that we're so normalized to this after the 2xxx series is kind of sad.
It's like praising Apple for a phone that's "only" $999.

~~~
ChuckNorris89
Without competition from AMD, Intel had normalized quad core chips for $800 on
the high end for over 10 years.

~~~
gruez
>Intel had normalized quad core chips for $800 on the high end for over 10
years.

What are you talking about? The i7-3770K (top of the line 4-core chip in 2012)
sold for ~$330.

~~~
nolok
I assume you meant desktop only, because the top-of-the-line 4-core chip of 2012
was the Xeon E3 1290v2, which sold for almost $900.

------
rectangleboy
It was behind the exorbitant number of spatulas the whole time!

~~~
carabiner
Why does he have so many?

~~~
magicalhippo
Clearly he went to the great sale they had at Spatula City.

[https://www.youtube.com/watch?v=0NCmDvrECS8](https://www.youtube.com/watch?v=0NCmDvrECS8)

------
0xfaded
I have some CUDA code that I need to run as fast as possible, so if I was
going to blow $1500, my use case would imply that I should go with two 3080s.

However, I also play around with deep learning stuff, expect to do so more in
the future, but don't currently follow it so closely.

Would someone care to ponder what difference they think a 24GB GPU vs a
10GB GPU will make as a tool for deep learning dev over the next 3 years?

For what it's worth, I'm a computer vision guy, but I did have a play with
DeepSpeech earlier this year.
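
For scale, my own very rough back-of-envelope (happy to be corrected - it
assumes fp32 weights plus Adam optimizer state and ignores activations, batch
size and framework overhead, so real capacity is lower):

    # rough sketch: how many trainable parameters fit in a given VRAM budget
    # assumption: 4 B weights + 4 B gradients + 8 B Adam m/v = 16 B per parameter
    GIB = 1024 ** 3
    BYTES_PER_PARAM = 4 + 4 + 8

    for vram_gib in (10, 24):
        params = vram_gib * GIB / BYTES_PER_PARAM
        print(f"{vram_gib} GB: ~{params / 1e6:.0f}M trainable parameters")
    # roughly 670M for 10 GB vs 1,610M for 24 GB, before activations eat into it

So 24GB vs 10GB is roughly the difference between fitting a ~600M-parameter
model and a ~1.5B-parameter one for training, before tricks like mixed
precision or gradient checkpointing.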

~~~
hwillis
I'm fairly lay but imo GPT-3 demonstrates pretty soundly that huge models are
no magic bullet- it's got >2x as many parameters as the human brain has
neurons and it can't do long division. Dogs and other animals get by just fine
having less than 1% as many neurons as humans.

Even a billion parameters is a _huge_ model, and a factor of 2.4x increase is
not going to make a tremendous difference in your performance. In particular
the data-heavy nature of vision stuff means that you'll be bottlenecked by
training more than memory, AFAIK (again, lay).

~~~
i-am-curious
This is a misleading comparison. You are comparing a massive model with huge
models. What you should be comparing are big models vs medium models that a
single consumer GPU will fit. And - you don't need to take my word for it,
there's tons of papers - the bigger models definitely perform better.

------
paulpan
This might be an unpopular opinion, but is the new RTX 3070 at $499 and RTX
3080 at $699 really such good value?

Sure, it's being marketed as 70-90% better performance at the same price, but
the value proposition mainly holds against the Turing generation (20XX
series) rather than the Pascal generation (10XX series). For example, with the
3070, Nvidia moved the price anchor for the XX70 series from $379 to $499, so
consumers are essentially paying for an XX80 card - and should expect XX80
performance. Nonetheless still impressive for the 70-90% performance gain,
just perhaps not as much when price-adjusted.

On a separate note, I'm curious about how much of the performance gains is
attributable to the node shrink (12nm to 8nm) vs. micro architecture vs.
software optimization (e.g. DLSS 2.0 vs. 1.0).

~~~
Zenst
> really such good value?

Depends upon how you measure value.

If you can get something better for the same money or a little more, then sure
they are good value.

Even if you want something not as powerful or new, these will just send all
the older card values down, so for that - these are great value.

Now if you were a Titan-buying type, the RTX 3090 is definitely good value.

As for the node shrink and how much of a factor it is: you'd need to factor in
clock speeds and memory bandwidth, then compare exactly comparable features.
It's a muddy path, more so with the bigger gotcha that the 12nm node was TSMC
and this 8nm node is Samsung. So it's hard to say, but over the months reviews
will have a good stab at it. For me, GamersNexus on YT would be the review I'd
be looking for in the weeks ahead.

------
metalliqaz
8K gaming is the ultimate gimmick. Even in the ideal conditions they set up
for that demo, I'm doubtful those gamers have the ability to detect a
significant improvement over 4K.

~~~
t0mbstone
It's all about VR headsets

~~~
fluffything
For VR, 60 FPS is not enough, but maybe in the next generation with another 2x
leap we'll have 120 FPS at 8K.

~~~
Polylactic_acid
We are at the stage where nothing is enough for either FPS or resolution. I
use an original Vive and it's painfully blurry, and that's 1080p per eye and a
very low FOV. For a high-FOV lens we will probably need 16K per eye at 144 Hz
before people will consider it good enough, like 4K is for monitors now. We
don't even have the cables to push that much data currently, let alone render
it.

I expect that to reach this we will probably have to start using foveated
rendering.
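
To put a rough number on the cable problem (assuming "16K" means 15360x8640 per
eye, 24-bit colour, and no compression or foveation - all simplifications):

    # rough sketch: raw uncompressed bandwidth for dual 16K panels at 144 Hz
    width, height = 15360, 8640
    eyes, hz, bits_per_pixel = 2, 144, 24

    gbit_per_s = width * height * eyes * hz * bits_per_pixel / 1e9
    print(f"~{gbit_per_s:.0f} Gbit/s raw")  # ~917 Gbit/s

    # for comparison, DisplayPort 2.0 carries roughly 80 Gbit/s and HDMI 2.1
    # roughly 48 Gbit/s, so compression and/or foveation would be mandatory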

~~~
fluffything
So the 3090 they announced is shown at 8K on 320 Hz monitors in the
presentation. They are still quite far from 2x16K at 144 Hz.

I was surprised by the difference 320 Hz monitors make, according to them.

------
gallerdude
Just compare how much Nvidia is pushing the limits as the industry leader in
GPUs, to how Intel seems to be reluctantly playing catch-up as the industry
leader in CPUs. Leadership matters.

~~~
jjoonathan
Nvidia isn't a fab. AMD isn't a fab. TSMC is the fab that beat Intel. TSMC's
customers, like AMD and NVidia, are beneficiaries.

So far. As customers of TSMC, in the long term it behooves them for TSMC to
have competition.

Further, "leadership matters" is a somewhat ironic complaint given that Intel
ran face-first into a brick wall precisely because they were leading. TSMC
placed conservative bets on the next node and Intel placed risky bets because
they needed the extra risk/reward to maintain leadership. Intel's bets failed
(in particular, cobalt wires and COAG). They chose "leadership or bust" and
went "bust," at least for now.

~~~
i-am-curious
This is such a simpleton take.

The whole reason AMD are able to crank out 128-core CPUs is the CCX
architecture - the one people laughed at. No TSMC there. Not to mention other
innovations like Infinity Fabric.

In Ampere, for instance, there are so many innovations, like PAM signalling, 2x
shader instructions per clock, DLSS, RTX Voice.

TSMC beat Intel, sure, but that is not the main reason why Nvidia and even
AMD are leading the industry. In fact, Ampere is on Samsung 8N.

~~~
CydeWeys
I see you're new here ...

Just fyi, steer clear of the ad hominems. You can disagree with someone
without calling them a simpleton.

------
lachlan-sneff
The 3080 ($700) apparently has 238 teraflops of "tensor compute." We're
frighteningly close to a petaflop of compute for less than a thousand USD.

~~~
arcanus
That is FP16 (rounding up to FP32). It is not sufficient for most HPC compute.
Good for ML/AI, at least.

~~~
alkonaut
For all but the AI/DL crowd the move away from high precision to high-power
low precision is a bit sad.

------
ttul
Jensen Huang has a really baller stove.

~~~
ealexhudson
That stove is an Aga and in the centre of his Aga is one of these RTX 30 GPUs.
Keeps the whole house warm.

~~~
trumpeta
Then why does he need the leather jacket?

~~~
nomel
It's actually a full body oven mitt.

------
Macha
If their performance claims are accurate, AMD has a huge hurdle ahead of it,
as the rumour mill only had them drawing even with the 2080 Ti with Big Navi.

~~~
zucker42
The rumours I've heard have them beating the 2080 Ti, but not enough to be
competitive with the top Nvidia cards if these performance claims are
accurate. Plus I'd guess Nvidia will launch a 3080 Ti at ~$1000 sometime around
the release of Big Navi.

~~~
fluffything
Or they'll just move the 3070 and 3080 to 7nm TSMC to lower the price. Or
both.

~~~
zucker42
I've heard they don't have enough production reserved at TSMC to make consumer
GPUs on TSMC 7nm and they are only going to use TSMC for Quadro. Plus Samsung
is cheaper than TSMC.

------
valine
It’s interesting that the 3090 is a replacement for the Titan. The $1500 price
tag is a bit higher than expected but considering the Titan cost $2500 it
doesn’t seem too unreasonable.

~~~
mkaic
In terms of just nomenclature, I think I like the consistency of having all of
the current lineup actually be called GeForce 30XX.

~~~
Macha
The supers/tis/titans will come to clutter it up next year or so.

------
rnantes
Interesting that this card has 8K-capable HDMI 2.1 but not DisplayPort 2.0. I
wonder when we'll start seeing DP 2.0 support in products; VESA said late 2020 in
the press release.

~~~
DoofusOfDeath
Apologies if this question is super naive, but is there a good reason for
ongoing development of _both_ HDMI and DP? At least for my use cases (home
entertainment, and work computers) both seem roughly equivalent.

The devices I've bought recently have tended to support HDMI more than DP. So
I got the impression that HDMI was "winning" and DP would fade away.

But now it seems like vendors are moving towards video-over-USB-C cables. And
the "Alternate Mode protocol support matrix for USB-C cables and adapters"
table in this article [0] seems to indicate that USB-C _cables_ have broader
support for DP than HDMI. Which makes me wonder if vendors will converge on
DP-protocol-over-USB-C-cable?

This makes me nostalgic for the relative simplicity of DVI.

[0] [https://en.wikipedia.org/wiki/USB-C](https://en.wikipedia.org/wiki/USB-C)

~~~
EricE
HDMI is all about DRM. USB-C is just one of many reasons DP hangs on. The
ability to chain monitors is huge for digital signage and other display uses
too. Let's hope both continue to be developed, since for computing DisplayPort
is far more useful and free of at least some of the DRM hell of HDMI.

------
npmaile
As much as I like to hate Nvidia for all of the right reasons, this is pretty
big and might make me compromise my morals until AMD comes out with something
that can compete.

------
pixxel
I find the naming conventions for computer parts utterly confusing. I’m
looking to step away from Apple and do my first (AMD) PC build. Need to find a
good overview to read through.

~~~
stu2b50
3080

30 <- represents the generation; previously it was 20, e.g. RTX 2080

80 <- represents power within the generation; an 80 is near the top

Higher generation means newer. Higher number means more powerful within that
generation. To compare across generations, you need benchmarks.

~~~
Sohcahtoa82
nVidia has the most sane naming convention.

Sure beats AMD. Is an RX Vega better than an RX 570? What about an RX 5700?

------
solatic
Still no proper Wayland support?

I mean, I get that the primary market runs Windows. But some people like to
dual-boot.

~~~
smabie
Why do I need Wayland? Xorg works just fine. Actually better than fine, since
you can actually run a wide variety of window managers without having to
resort to XWayland.

~~~
Lichtso
Xorg is a security nightmare. Also, Wayland itself works well (even better,
performance-wise); it just takes time for all the applications to catch up and
enable it as a backend.

------
system2
All these crazy graphics cards, and they still haven't figured out high-density
VR displays. I want amazing VR with super clarity. Then I'd invest whatever money
they want in a graphics card. LCD gaming just doesn't cut it.

~~~
Kapura
It's not that they can't figure out high-density VR displays, it's that
they're prohibitively expensive to produce. Display miniaturisation is not a
problem domain that a lot of tech is focused on, so progress is necessarily
slower than in the more profitable areas.

------
fortran77
The pricing seems very good. Our company writes a lot of CUDA code, mostly for
real-time audio processing. It's amazing how much performance you can get with
a desktop PC these days. These really are supercomputers on a card.

------
gallerdude
These look great! It’s amazing how much better hardware gets annually. The
only thing I was hoping for that wasn’t mentioned was hardware-accelerated VP9
encoding, but we can’t get everything we want in life.

~~~
daneel_w
If we imagine that VP9 hardware encoding by Nvidia would hold to the same
standard as their H.265 hardware encoding, then we can stop holding our
breath, as we have not missed out on anything of any value whatsoever.

For H.265 their encoder is fast, yes, but the quality per bitrate is complete
rubbish: it requires a higher bitrate than a good H.264 encode yet still contrives
the gruesome trick of looking far worse, which entirely defeats the point of
H.265.

~~~
gallerdude
Oh interesting, I didn't know the Nvidia encoder was regarded as trash. What
tools can one use to evaluate the visual quality of a video file? My proxy for
quality has always been bitrate, but I know that bitrate is just chasing
visual quality anyways...

~~~
fomine3
SSIM is popular, but now VMAF is considered a good index.
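
In practice ffmpeg can compute both against a reference file, something like
this (a sketch; it assumes ffmpeg is on PATH, and the equivalent libvmaf filter
additionally needs a build with libvmaf enabled - file names are placeholders):

    # sketch: compare an encode against its source with ffmpeg's ssim filter
    import subprocess

    def ssim(distorted: str, reference: str) -> None:
        # first input is the encode under test, second is the reference;
        # ffmpeg prints the SSIM summary to stderr
        subprocess.run([
            "ffmpeg", "-i", distorted, "-i", reference,
            "-lavfi", "ssim", "-f", "null", "-",
        ], check=True)

    ssim("encode.mp4", "source.mp4")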

------
zmmmmm
Question: what is the sentiment around how this may affect pricing of the non-
consumer cards? There was already a disparity, but the gap is now indecent -
something like a T4 looks to be worse than a 3090 and more than double the
price on basic specs (understanding there are some elements designed
specifically for server environments). Is there any potential nVidia will have
to shift on these prices or do we have to wait for a next generation of those
before we can expect more affordable data center GPUs?

------
shantara
It's worrying to see the GPU cooling system dumping its heat directly onto the
CPU, RAM and motherboard components. That's the only thing that left me
skeptical after watching the presentation.

~~~
jiofih
What do you mean? The fan design is pretty standard. Either way it won’t
matter - hot air should be extracted from the case anyway, so it's not gonna
meaningfully change the temp of anything it hits.

~~~
shantara
The back fan pushes the air through a hole in the GPU, past the thermal tubes,
and carries the hot air above the GPU to the upper part of the motherboard.
That is not in any way a standard design where the hot air goes directly
outside the PC case.

[https://youtu.be/ALEXVtnNEwA?t=3283](https://youtu.be/ALEXVtnNEwA?t=3283)

~~~
jiofih
The non-standard part is pushing hot air out of the back of the GPU itself -
“dumping the heat directly onto the CPU, RAM and motherboard components” to be
extracted by the system fan is what every existing design does...

------
zapnuk
Very impressive. I predict that they'll be (more or less) sold out until 2021
at the very least.

~~~
phaus
I'm glad the 3080 is launching first, so I can try getting one of those, and if
they are sold out I have time to think about whether I want to drop more than
twice as much on a 3090 instead.

------
antpls
Not sure if it was mentioned so far, but the reference to photons instead of
electrons at the end of the presentation could point to a future photonic GPU?
[https://www.anandtech.com/show/16010/hot-chips-2020-live-
blo...](https://www.anandtech.com/show/16010/hot-chips-2020-live-blog-silicon-
photonics-for-ai-600pm-pt)

~~~
zamadatix
I think it's more likely a reference to the fact that long-distance high-speed
communication is done via photons, not electrons.

------
agigao
NVIDIA, can you please make my GTX 1070 not tear the screen when I'm
scrolling a web page in a browser on GNU/Linux?

~~~
cyberdrunk
Try switching to Wayland.

------
djsumdog
I got a 2080 Ti last year and I have yet to really run into a situation where
I couldn't play a modern game, at 4K, with almost everything set to max (but I
also don't game over 60 fps).

I'm sure this will really push those 4K/120Hz displays, but I doubt the
average/casual gamer will really care about this series for a few years.

~~~
zamadatix
From a fellow 2080 Ti owner: this kind of statement that you can max things out
without issue is usually true with the latest flagship, but it falls apart once
the next generation comes out and provides a higher max performance envelope
for settings to target.

Expect the games demoed in the event that are due to come out soon (Cyberpunk,
CoD, etc.) to not play as well when maxed out as those that were released while
the 2080 Ti was the flagship.

------
dougmwne
I am curious how quickly Nvidia will add these to their GeForce Now streaming
servers. As of right now, they only stream in 1080p and it seems this could
allow them to stream 4k for about the same hardware cost. I'm personally not
in the market for a gaming desktop, but happily subscribe to GPU as a service.

~~~
xx_alpha_xx
GFN has been a bear, very hit or miss. Current generation of hardware is
either 1080 or 2060. If they start adding 30xx (which they have hinted at in
the past), that would be great.

------
Tuganin
Can anyone think of cases where a GPU/processor unveiled by the maker wasn't
actually what they said it was once it was run in real use cases?

It has always felt to me like something similar to the car gas-emissions scandal
is just waiting to happen in this industry.

~~~
proverbialbunny
The FX series was pretty bad, and around the same time ATI's (now AMD's) X cards
(e.g. the X700) would burn out within a year if you gamed on them - not fail
outright, but the FPS would slowly drop over time. This was due to a lack of a
fan inside the case.

------
mmanfrin
The 3080 is '2x faster than the RTX 2080', which was roughly on par with a
1080 Ti (it had advantages, of course, RTX among them).

3 Generations newer with only a 2x speedup feels like a much smaller leap than
the prior generations.

~~~
ebg13
I missed a step here. How is the 3080 3 generations newer than either the 2080
or the 1080Ti?

~~~
mmanfrin
1080ti -> 20xx -> 20xx ti -> 30xx

There were also the 16xx cards.

I'm being downvoted for this.

10xx

16xx

20xx

30xx

That's 3 gens difference.

~~~
ebg13
Ah. It seems unreasonable to me to consider the 2080 and 2080 Ti as different
generations when they were released at the same time. Also, I think we should
factor in that the 3080 at $700 costs _half_ what I paid for a 2080 Ti in
December (the ASUS ROG Strix, a premium 3-fan model, was going for like $1400
before it was discontinued). By Nvidia prices, the 3090 is the true price
successor to the 2080 Ti.

~~~
mmanfrin
You're right, I probably shouldn't be distinguishing the Ti series (although it
is usually a 'tock' generation, like the just-announced gen, as Ti variants
were not announced).

But that does leave the 16xx generation which was released wholly on its own,
in its own year.

~~~
phonypc
16xx series is just lower end 20xx.

------
fomine3
Comparing the 3080 and 3070, the 3080 has about 146% of the 3070's TFLOPS, and
the VRAM capacity and speed differ as well. Meanwhile the 2080 had about 135% of
the 2070's TFLOPS, at the same MSRPs as the 3080/3070. The 3080 looks very
competitive.

------
shmerl
I'm waiting for RDNA 2 cards from AMD.

~~~
Nursie
I'm waiting for AMD to show their hand, but this is a very strong first strike
from nvidia.

~~~
shmerl
Sure, if the bold 2x performance increase claim is to be believed. I'd wait
for benchmarks to validate that.

Plus, for me Nvidia is simply DOA on Linux, due to them refusing to upstream
their driver and hindering Nouveau from reclocking properly. So even if AMD
doesn't outdo them, I still won't touch Nvidia.

~~~
Nursie
Never had an issue using the proprietary driver on linux, myself.

I know, I know, it would be nice to have a proper FOSS driver, and better for
integration, updates etc. But it does work fine, IMHO.

~~~
shmerl
Depends; it works for the cases Nvidia cares about. But what you call integration
means all the other cases :) And there it simply falls apart, or takes decades to
be fixed.

~~~
Nursie
In Linux, as long as I can get XFCE going, all the screens at the right res and
scaling levels, and the CUDA drivers working, I'm generally happy :)

I'm probably pretty easy to please :)

------
homerhomer
I'm guessing that this puts PCs ahead of unreleased PS5 and the new Xbox. I'll
wait for the budget model to be released here in two years. Good stuff.

------
berryjerry
I find it nuts that those streamers called the game "smooth as butter" at 60
fps. Even if it was 8K, there is no way 60 fps could feel smooth.

~~~
smabie
Yeah, having made the jump to 144 Hz G-Sync, fixed 60 fps looks hilariously bad.

Though my 13-year-old self eking out 20-30 fps playing BF2142 would probably
disagree.

------
d33lio
I just want to see an OCL-Hashcat bench of the 3090 :)

------
iforgotpassword
This looks reassuring. After the first couple of rumors/teasers, especially
regarding power consumption, I feared that NVIDIA had mostly just sat on their
hands and would just release something that's mostly a bigger version (more
execution units, RAM) of the current gen. I think they did that once some
years ago. Seems they actually did improve on the technical level too for
30xx. :-)

------
Ninjinka
The 3080 requires a 750W PSU, while the 3070 only requires 650W. Given I have
a 650W, that might tip the scale for me.

~~~
gambiting
These numbers never meant anything. I run a 1050 Ti on a 200W PSU - Nvidia
recommends 450W minimum. Add the TDP of your GPU and CPU, plus about 100W for
accessories, and that's what you actually need. Nvidia recommends a much more
powerful PSU than needed just in case.
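
As a rough worked example (the 320W figure is the 3080 Founders Edition number
quoted elsewhere in this thread; the 105W CPU TDP and 100W accessory budget are
just placeholder assumptions):

    # rough sketch of the rule of thumb above; all figures are assumptions
    gpu_tdp = 320        # RTX 3080 FE board power (W)
    cpu_tdp = 105        # hypothetical desktop CPU TDP (W)
    accessories = 100    # drives, fans, RAM, USB devices, etc. (W)

    estimated_draw = gpu_tdp + cpu_tdp + accessories
    print(f"estimated sustained draw: ~{estimated_draw} W")          # ~525 W
    print(f"PSU with ~20% headroom: ~{estimated_draw * 1.2:.0f} W")  # ~630 W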

~~~
eMSF
I feel like this used to be more true in the past. These days CPUs can exceed
their official TDP by quite a large margin, and while in theory they should
only do so temporarily, many motherboards default to unlimited boost clocks.
(Then again, perhaps you're never going to fully utilize both CPU and GPU at
the same time...)

~~~
gambiting
That is correct. Nowadays a 75W TDP Intel CPU can use as much as 200W for
short bursts. That wasn't the case in the past. However, it should still be
possible to find out the maximum draw value for many motherboards and pick a
PSU accordingly.

------
xvilka
Any news about that open-source thing they promised to unveil this year? Or
did they lie, as usual?

------
tobyhinloopen
Well, my 2080 Ti still runs Factorio without issues, so I suppose I don't need
to upgrade.

~~~
ralusek
Ya, but we'll know, and more importantly, you'll know. Go ahead and upgrade.

------
Falell
What's the expected delay between reference card release and OEM card release?

~~~
coolspot
2-4 weeks

------
ponker
Jensen’s stovetop is the billionaire’s equivalent of a tricked-out RGB setup.

------
KingOfCoders
I wonder about the ML performance compared to my current setup of 2080 Tis.

------
bitxbit
As someone who runs data models at home in addition to 3D rendering, 3090 is a
must buy for me. I imagine it will be sold out within minutes and supply will
be an issue for months.

------
cV6WB
Would this be able to drive a Pro Display XDR? (From Windows.)

------
ablekh
Very cool. I'm wondering whether RTX 30 Series cards are compatible
(interfaces, form factor, etc.) with the Quadro RTX 8000. Thoughts?

~~~
ablekh
Replying to myself. Still not sure about the form factor, but, unfortunately,
it looks like the RTX 30 Series cards - well, at least the RTX 3090 - have a
different power connector compared to the 2080 Ti (12 pins versus 8+6 pins).

------
bogwog
10,496 cores on the 3090. That's just insane.

------
debaserab2
How many studios are even going to produce art assets at the level of fidelity
that 8K provides? These installs are going to be huge.

~~~
Sohcahtoa82
Higher resolutions IMO are not about the models themselves, but the edges of
them.

A character model can look decent in 1080p, but the edges of the model in
front of the background will be jaggy. Various anti-aliasing techniques can
only do so much.

Besides, I know I'm more interested in higher frame rates. I'd rather do 1440p
@ 144 Hz than 4K @ 60 Hz.

And I don't think there are even any 8K monitors out yet.

------
hank_z
I am trying to build a deep learning workstation. How likely do you think it is
that Nvidia will roll out an RTX 3090 Ti or RTX 3080 Ti?

------
Thaxll
The problem with AMD is their drivers; nowadays drivers are 50% of what makes a
good graphics card.

------
aclavelle
I bought a 2080 like 2 weeks ago. Luckily EVGA has the step up program.

------
adamch
I wonder if the pricing will drop once AMD releases their ray-tracing PC GPUs.

------
tkuraku
When are the Quadro cards based on Ampere likely to be released? Any ideas?

------
sudosysgen
So it seems that the 3090 will be priced at $1,499. This is kind of insane.

EDIT: For people comparing this to the Titan RTX, no. This is GA102, not GA100.
It's the cut-down version of Ampere. GA100 will come out, and it will be even
more expensive.

~~~
zamalek
Virtually nobody needs a 3090, much less a gamer (let's be honest, though,
many will buy one regardless). For the people that actually do need that
horsepower, it's unbelievably cheap _for what you are getting._ You could have
easily paid twice that a year ago for less.

~~~
zimpenfish
> Virtually nobody needs a 3090, much less a gamer

I'm tempted to get one just to avoid having to think about upgrading a
graphics card for another 10 years. Plus I can do some ML messing about as
well for resume-driven development.

~~~
selectodude
Cheaper and better to get three $500 GPUs every three years.

~~~
zimpenfish
Best I can get for $500 now (~£377) is an 8GB RTX 2060 - 5x fewer cores, 1/3
the RAM, 2/3 the memory bandwidth of the RTX 3090, which is £1399. Plus I
really don't want to upgrade my PC again for at least 5 years - I've just done
that and been reminded of why I hate it.

~~~
sudosysgen
Maybe you should spend 500 pounds and get a 2070 super?

~~~
zimpenfish
But then I'm getting even closer to the RTX 3090 price!

------
Havoc
Guess I'm not keeping my 2070 super for long then

~~~
tobyhinloopen
To be fair, most games run great on any RTX card. What are you playing that
would benefit from an upgrade?

~~~
smabie
If you want 144 Hz, even an RTX 2080 Ti can't hit it at 1440p on max with most
games. I have a 2080S and I'm pretty disappointed with the performance,
especially with ray tracing on.

For example, RDR2 runs at around 50-60fps on ultra on my rig. Very
disappointing.

------
nirav72
Did they leave out the price of the 3090 in the article, or did I somehow miss
it? All I see is the 3070 and 3080 prices.

~~~
sedatk
It’s $1500

------
LoSboccacc
So are they definitely dropping the $250 demographic, using the RTX excuse to
forever raise the entry-level price?

------
VikingCoder
That is an alarming number of spatulas.

------
dsign
That looks good!!! Can we upload already?

------
jordache
dude has a lot of spatulas at his house!

------
baybal2
It looks really huge.

------
polishdude20
He sure likes his silicone spatulas.

------
tus88
But can it play Crysis?

------
randyrand
I have a GTX 970 that I bought for $379 in 2015. Does Nvidia still make flagship
GPUs at this price point?

~~~
jhloa2
Unfortunately not, since Nvidia has had such a monopoly on the high-end GPU
market.

Interestingly enough, it looks like $379 in 2015 dollars is worth about $415 in
today's dollars, which makes the $500 for the 3070 seem slightly less shitty. I
didn't expect the cumulative inflation to be 9% since then...

