Hacker News

But it doesn't look like cheap upscaling.

It's that simple: try the FidelityFX options if you want to see what bad upscaling looks like.

DLSS will let people run games they otherwise wouldn't be able to afford to play, end of argument.



It does look like cheap, blurry upscaling. I was trying to be respectful of the opinion of people who love it, but you can't be respectful of mine. You are saying "doesn't look cheap" and "can't afford" in the same comment - think about that. You don't need to be rich to buy a 3070/3080 - smartphones are less affordable nowadays.


No it doesn't

https://www.youtube.com/watch?v=9ggro8CyZK4&t=931s

It looks amazing, possibly more detail than native


What a "proof", muhaha. Let's play this game together:

https://www.reddit.com/r/cyberpunkgame/comments/cxqm1u/every...

https://www.reddit.com/r/pcgaming/comments/aqmffc/is_dlss_ju...

https://www.gog.com/forum/cyberpunk_2077/cyberpunk_is_really...

I have three times more "proofs"! :)

It's ridiculous that you can't just accept my "ok, let's disagree". Looks like it touches some really important part of your soul ;)


Author here.

DLSS is not the end-solution to rendering, well, nothing is, but it's an AMAZING piece of technology. All these temporal and dynamic resolution techniques are here to stay as they -improve- the look of games no matter the HW.

What do I mean? Obviously dynamic resolution and temporal reprojection are worse than say, a fixed 8k rendering at 240hz! Yes, true! But, that's not the correct math.

The more correct math would be: on a given piece of hardware, say a 3080, would you rather spend the power to render every single pixel exactly, or would you rather "skip" some pixels and have smart ways to recover them for a fraction of the cost, almost equal to the real deal, so that you now have extra power to spend somewhere else?

Of course if you just do less work with DLSS or similar technologies, you're losing something - that's bad. But that's never the equation. The real equation is that no matter how powerful the HW, the HW is a fixed resource. So if you spend power to do X, you cannot do Y, and you have to choose whether X is more valuable than Y.

Makes sense?

Now, all that said, it's also true that sometimes you max out everything in a game; you cannot have more of anything, because that's literally all the game has to render. At that point, sure, it's reasonable to spend power even on things that are not great bang-for-the-buck, because you literally cannot do anything else anyway! So for the very top PC HW, you end up doing silly things, like rendering at native 4k because you cannot use that power in any other way.

But that's in a way "bad": it's a silly thing we have to do because there is no other option! If we had the option, though, it would be much better even on a 3080 to render at, say, 2k or 1080p upscaled to 4k via DLSS, and use the remaining power to have, say, 2-3 times the detail in textures or geometry, or more shadow-casting lights, etc.
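The budget argument above is easy to put in numbers. A back-of-the-envelope sketch (illustrative only; real savings depend on the engine, since not all per-frame cost scales with pixel count):

```python
# Pixel budget: rendering at a lower internal resolution and upscaling
# to 4K frees a large share of the per-pixel shading work, which can
# then be spent on geometry, lights, textures, etc.

def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)       # 8,294,400 pixels per frame
internal_1440p = pixels(2560, 1440)  # "2k"-ish internal resolution
internal_1080p = pixels(1920, 1080)

for name, p in [("1440p -> 4K", internal_1440p),
                ("1080p -> 4K", internal_1080p)]:
    saved = 1 - p / native_4k
    print(f"{name}: shades {p / native_4k:.0%} of the pixels, "
          f"freeing ~{saved:.0%} of the per-pixel budget")
# 1440p -> 4K: shades 44% of the pixels, freeing ~56% of the per-pixel budget
# 1080p -> 4K: shades 25% of the pixels, freeing ~75% of the per-pixel budget
```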


> almost equal to the real deal

Here is our disagreement. To my eyes DLSS is unacceptably blurry (even in "Quality" mode). You can look at it and say it's OK, but that's only one opinion. You can spend one minute on Google and find quite contrary opinions. Your whole long comment is based on the idea that upscaling is an optimization - you are forgetting that upscaling is a tradeoff.

My main complaint about the DLSS "hype" marketing is exactly this: do not promise "incredible quality" when under the hood it's just pitiful upscaling. Some HW is not good enough, some games are not optimized enough - that's fair, and it's okay; there are things to sacrifice, there are workarounds. Just don't lie.


Respectfully, I don't care about "your eyes" - all people are different and that's ok. Nor do I care -specifically- about DLSS; obviously technologies are always evolving, and it's not that DLSS is the best the concept can ever be. Also, its implementation varies among games.

What I meant to say is that we live at a time where rendering every single pixel all the time is simply a waste of resources that can be better spent somewhere else.

And you're still saying it's "blurry" - but that's not the point. Certainly temporal reprojection will -always- be blurrier than not using it. But you're not considering what you're -gaining- from that blur. The real question is: would it be better to have, say, a world with 1 million objects at 4k, a bit blurry, or a perfectly sharp image at 2k with 100k objects?

Temporal reprojection saves time that then can be invested in other things.
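For readers unfamiliar with why temporal reprojection trades blur for extra effective samples: at its core is an exponential blend of each new (jittered, noisy) frame with the reprojected history. A toy 1-D sketch (real implementations reproject the history buffer with motion vectors and clamp it against the pixel neighborhood; the names and constants here are illustrative, not from any engine):

```python
# Toy temporal accumulation: each frame, the history buffer is lerped
# toward the newly shaded sample. A small alpha averages noise across
# many frames (extra effective samples for nearly free), which is also
# exactly why the result is softer than a single sharp frame.

def accumulate(samples, alpha=0.1):
    history = samples[0]
    for s in samples[1:]:
        # history receives only a small fraction of each new sample
        history = alpha * s + (1 - alpha) * history
    return history

# A pixel whose true value is 0.5, shaded with +/-0.4 alternating noise:
noisy = [0.5 + (0.4 if i % 2 == 0 else -0.4) for i in range(60)]
print(round(accumulate(noisy), 3))  # settles near 0.5, not 0.1 or 0.9
```

The same blend that cancels the noise also averages sub-pixel detail across frames, which is the "blur" being debated above.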

Lastly: CP2077 ALWAYS uses temporal reprojection. ALWAYS. If you disable DLSS, it uses its own TAA instead. If you disable TAA (which cannot be done in the settings menu, but there are hacks to force it), it STILL USES temporal techniques for most of its rendering before the final image.


> all people are different and that's ok

I'm happy to see at least one person on HN can agree with this.

After re-reading your comments and mine, I concluded that my real issue is the lack of settings in CP2077. I do respect the opinion of people who want to use TAA and DLSS; I'm just upset that I cannot pick something I like, can't decide what to sacrifice and what to prioritize.

Your article is quite interesting and I’m grateful for it. Please keep writing things like this.


To help you understand me better: I feel the urge to vomit when I remember how the TAA+DLSS image looks. I just can't force myself to use it; it's like torture. An FPS drop or aliasing are much smaller problems.


You can literally buy a perfectly functional car for the price of a 3080


And 2 of them for a phone!


A 3070/3080 is only one piece of the puzzle, unlike a smartphone.



