
My response here is mostly that many of the visual choices made in the games I'm criticizing were made completely independently of the game design team. This is far from ideal, but it's a consequence of how the production pipeline is set up in these blockbuster titles. I do vehemently disagree that the aesthetic traits I'm looking for are at odds with gameplay and design decisions, though.

I could go on for a while about how Zelda BOTW manages to integrate the design and aesthetic choices, but I'm going to get mugged if I try to use that game as an example again. Maybe I'll use it as part of a different write-up on visual design.


Zelda BOTW has climb-anywhere gameplay, though, so it doesn't need landscape contrast to support gameplay purposes. Contrast that to something like Horizon Zero Dawn where you need the harsh ledge contrast on rock faces to distinguish between walls you can climb on and those you cannot.

Similarly, combat in BOTW happens up close, so you don't actually need to distinguish things at range. This is not generally true, especially for the other titles you're comparing BOTW against. BOTW also has problematic cases such as this: https://static.gamespot.com/uploads/scale_super/1552/1552458... That rock face is horrible: totally washed out in a highly unrealistic way, to say nothing of the ridiculous saturation of the torch light.


You don't want to encode brightness perceptually until you're showing it to the user. You need a linear space to actually do lighting calculations, which means physical luminance values. Within that space, you can use whatever scale you wish, with consequences for banding and quantization artifacts if you decimate the bit depth. Tone mapping then includes the conversion to a perceptual (roughly logarithmic) space via the gamma curve.
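To make that split concrete, here's a minimal sketch of my own (not from any particular engine) using the standard sRGB transfer function: lighting math happens on linear values, and the perceptual encoding is applied only on output.

```python
# Sketch: do the math in linear light; apply the perceptual (sRGB gamma)
# encoding only when writing the value out for display.

def linear_to_srgb(l: float) -> float:
    """Encode a linear-light value in [0, 1] to perceptual sRGB."""
    if l <= 0.0031308:
        return 12.92 * l
    return 1.055 * l ** (1 / 2.4) - 0.055

def srgb_to_linear(s: float) -> float:
    """Decode an sRGB value back to linear light."""
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4

# Mixing two lights must happen in linear space:
a, b = srgb_to_linear(0.5), srgb_to_linear(0.8)
mixed = linear_to_srgb((a + b) / 2)   # correct: averaged physical luminance
naive = (0.5 + 0.8) / 2              # wrong: averaged perceptual values
```

The two averages disagree noticeably, which is exactly why you keep the pipeline linear until the end.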

I explained it in another comment but basically the "20 bit" value is based on an idealized digital image sensor rather than a game rendering pipeline (which operates in floating point). It has admittedly proven somewhat confusing.


It's worth noting explicitly that floating point numbers let you have a linear scale but logarithmic-ish storage at the same time. A 12-bit float could represent a 1:1000000 range, let you perform normal linear math, and also be just as banding-free as a 20 bit integer.
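A quick way to see that property (my own sketch, using Python's built-in 64-bit floats since there's no 12-bit float type at hand): the gap to the next representable float (the "ulp") grows with magnitude, so the relative quantization step stays roughly constant across the whole range. The same holds for any float width.

```python
import math

# Floats space their representable values logarithmically: the ulp grows
# with magnitude, so the *relative* step is roughly constant everywhere.
for value in [0.001, 1.0, 1000.0, 1_000_000.0]:
    step = math.ulp(value)
    print(f"{value:>12}: ulp = {step:.3e}, relative = {step / value:.2e}")

# A linear integer format has a fixed step instead, so its relative error
# explodes at the dark end of a 1:1000000 range:
print("20-bit integer, relative step at brightness 1:  ", 1 / 1)
print("20-bit integer, relative step at brightness 1e6:", 1 / 1e6)
```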


The Switch is perfectly capable of both deferred shading and HDR, which are independent, unrelated techniques. You can make an argument about whether those are the best implementation trade-offs for that hardware, taking performance and desired visuals into account. But there's nothing inherently preventing you from doing either on the Switch, and it's a fairly capable GPU in its own right.

P.S. Unreal Engine reintroduced forward (non-deferred) rendering as an option last year because it's more efficient in VR.


True, but deferred shading costs extra performance in most cases; that's why it's less used on lower-end systems or in VR, where you need 120 fps. I did not claim HDR or deferred was impossible on the Nintendo Switch.


I liked this enough that it merited an addition to the write-up. It's not perfect, but I'm seeing some great GTA 5 shots that are very nicely balanced overall. Doubly interesting given the substantial age of the game.


This was certainly not meant to be a crucial piece of information, but sure, let's get into it.

Much of the post comes from a general assumption that the goal of computer graphics is primarily to replicate how a camera sees the world around it. Thus I think it's easiest to start from the idea of real-world light entering a digital image sensor. Light in this setting is not continuous! Each subpixel in an image sensor acts as a photon counter: one photon hits the sensor, the count ticks up by one. There's no question of perceiving values between 1 and 2 because they don't even exist; either the sensor counted one photon or two. If you were to literally build a digital camera that can process the entire world, you'd need 20 bits to count up to a million photons without losing any along the way. So if you were to build the hypothetical rendering pipeline that works on "real world" data about the scene, that 20 bit value would be the input.
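Checking the arithmetic on that figure:

```python
import math

# Smallest integer width whose counter can reach one million photons
# without overflowing.
max_photons = 1_000_000
bits = math.ceil(math.log2(max_photons + 1))
print(bits)   # 20: 2**19 = 524288 is too small, 2**20 = 1048576 suffices
```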

As a practical matter, nearly all modern games store lighting levels internally in floating point, in arbitrary units chosen by the developers. Lighting pipelines are not integer based, but they're linear and not perceptual. The conversion to perceptual 8 bit (gamma curve) happens as part of the tone map stage. Doing things in floating point physical units is a better idea than the photon counter anyway, but the line you're confused about was really written with idealized cameras in mind.
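As a sketch of that final stage (my own illustration; Reinhard is picked only because it's the simplest well-known operator, not because any particular game uses it):

```python
def tonemap_to_8bit(luminance: float) -> int:
    """Map a linear HDR luminance (arbitrary float units) to an 8-bit
    perceptual output value. Reinhard is used purely as a simple example
    operator; real pipelines use fancier curves."""
    compressed = luminance / (1.0 + luminance)   # Reinhard: [0, inf) -> [0, 1)
    perceptual = compressed ** (1.0 / 2.2)       # simple gamma encode
    return round(perceptual * 255)

# Many orders of magnitude of linear input collapse into the 8-bit range:
for lum in [0.001, 0.1, 1.0, 10.0, 1000.0]:
    print(lum, "->", tonemap_to_8bit(lum))
```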

(Technically an image sensor is an analog device and the voltage increases with each photon detection by an increment that is subject to noise of all sorts and pre-amplification before hitting ADC. Don't jump me on the photon counter thing.)


> Technically an image sensor is an analog device and the voltage increases with each photon detection by an increment that is subject to noise of all sorts and pre-amplification before hitting ADC. Don't jump me on the photon counter thing.

But it's rather important, and not even for those reasons. This isn't meant as jumping on you! (And I really loved your article; like you said, it's not a crucial point at all.)

The 1:1000000 or 1:2^20 contrast ratio only corresponds to exactly 20 bits if the 1 on the low end of this ratio corresponds to exactly one discrete unit of light (a photon). Whether it's off by a factor of 0.5, 1.618, or whatever is what the whole argument is about.

First, the sensor counts not photons but a value related to photons per second (because of exposure time). If the 1 on the low end of the range corresponds to some exact minimum number of photons, it's really "one discrete unit of light per <exposure time>", which makes the whole thing analog from the start.

Second, those sensors most probably aren't able to count individual photons anyway [0]. The human eye, after about 30 minutes to adjust optimally to darkness, can sort of perceive individual photons, or small bunches of maybe 2 or 3, kind of. Those barely perceptible specks of light in utter darkness aren't the sort of resolution issue we're worrying about in the dark end of these types of scenes. And as soon as you make a light source that can be described as "emitting single photons" in a certain context, you get uncertainty effects and all that quantum jazz (show me a photon/path-tracing renderer that gets the slit experiment correct [1] and you can have your integer photon counters :) ).

So the sensor output values can (and should), for all intents and purposes, be treated as analog.

The number of bits you represent it with just puts an upper bound on your signal-to-noise ratio (as per Shannon entropy). But since we're dealing with 2D images, the distribution of this noise over the spatial resolution (either as a result of sensor noise at the input or explicit noise-shaping dithering at the end of the pipeline) also comes into play when considering the quality of the output.
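To illustrate the dithering point (a toy sketch of my own, nothing to do with any real sensor): a constant signal sitting between two quantizer levels comes out biased without dither, but averages out correctly with it, trading banding for noise spread over many pixels.

```python
import random

# A constant signal between two levels of a coarse (8-level) quantizer.
# Plain rounding always snaps it to the same wrong level; adding dither
# before rounding makes the *average* output converge to the true value.
random.seed(0)
LEVELS = 8
true_value = 0.40   # sits between levels 2/7 and 3/7

def quantize(v: float) -> float:
    q = round(v * (LEVELS - 1))
    return min(LEVELS - 1, max(0, q)) / (LEVELS - 1)

plain = [quantize(true_value) for _ in range(10_000)]
dithered = [quantize(true_value + random.uniform(-0.5, 0.5) / (LEVELS - 1))
            for _ in range(10_000)]

print("plain average:   ", sum(plain) / len(plain))        # stuck at one level
print("dithered average:", sum(dithered) / len(dithered))  # near 0.40
```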

If the signal-to-noise ratio of a sensor output happens to allow for 20 bits of precision, for a sensor that happens to have a 1:2^20 brightness range, that's coincidental. Sure, it correlates, because higher-end sensors tend to perform better in both range and SNR. But I don't believe that the 1 on the very low end of a discrete range represents precisely one photon per <power-of-two times minimum exposure time>.

[0] Correct me if I'm wrong about this, btw. There might be specialized scientific equipment that can, but I doubt even high-end cameras bother with single-photon accuracy. I mostly know about digital signal processing, not the state of the art in camera hardware. Yet even if they can detect individual photons, that's going to be a probabilistic, per-unit-of-time measurement, so the rest of my argument holds.

[1] these probably exist, but aren't used for games or photorealistic rendering purposes

