
Rendering does not matter anymore? - z3phyr
http://c0de517e.blogspot.com/2019/03/rendering-doesnt-matter-anymore.html
======
nikanj
Nothing seems to matter anymore when it comes to performance. Simple 2D
platformers are laggier on my 2018 MacBook than they were on the C64, Outlook
lags for seconds at a time when I'm typing a plain-text email, etc.

~~~
pcwalton
The Commodore 64 ran at 320x200 at paletted 2bpp (16kB framebuffer size). My
MacBook Pro runs at 2880x1800 24bpp (15.5MB framebuffer size). So one frame on
your MacBook Pro has to render the equivalent of 972 C64 frames. That makes a
big difference. And remember, what matters here is _memory bandwidth_, which
is not subject to Moore's Law the way that CPU speed is.
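
Back-of-the-envelope, if you want to check that arithmetic (a quick Python
sketch; the resolutions and bit depths above are the only inputs):

    # Framebuffer sizes implied by the figures above.
    c64_bytes = 320 * 200 * 2 // 8     # 2 bpp -> 16,000 bytes (~16 kB)
    mbp_bytes = 2880 * 1800 * 24 // 8  # 24 bpp -> 15,552,000 bytes (~15.5 MB)
    print(mbp_bytes // c64_bytes)      # 972 C64 frames per MacBook frame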

What's more, we care about power efficiency nowadays in a way that the C64 did
not. So while we could make things run as fast as possible, now we have to
trade that off against power constraints. The C64 ran at about 22 W while
sitting at the ready prompt [1]; my MBP is using about 11 W typing this
comment in Firefox.
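
Putting the two together makes the efficiency gap even starker. A rough
sketch; the refresh rates (50 Hz for a PAL C64, 60 Hz for the MacBook) are my
assumptions:

    # Framebuffer bytes pushed per second per watt, using the wattages above.
    # Refresh rates are assumed: 50 Hz (PAL C64) and 60 Hz (MacBook).
    c64_per_watt = 16_000 * 50 / 22       # ~36 kB/s per watt
    mbp_per_watt = 15_552_000 * 60 / 11   # ~85 MB/s per watt
    print(mbp_per_watt / c64_per_watt)    # ~2,300x more work per joule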

That's not even getting into the fact that we care about high-quality vector
text (and now UI) rendering, secure multiplexing (e.g. having mutually
untrusting Web pages share the display with each other), which tends to add
extra blitting here and there, internationalization, etc. Rendering is expensive
because your needs are expensive.

[1]:
[https://www.lemon64.com/forum/viewtopic.php?t=48631&sid=fe03...](https://www.lemon64.com/forum/viewtopic.php?t=48631&sid=fe030019ad4bea3f84607dfa34336009)

~~~
tmd83
I'm not a rendering expert, but those numbers really gloss over a lot of
facts.

If you are comparing power, you have to remember how much progress
semiconductors have made. Both the processing needs and the display density
have increased, so it's in no way a straight comparison.

But that's not even the main point. Most applications carry a tremendous
amount of memory bloat, and if memory bandwidth is a limiting factor, 15.5MB
is not the problem here; the bloat is. Browsers are surely the worst
offenders, to such a degree that it would be funny if it weren't so tragic.
Now you can also say browsers have a lot to do, lots of complexity, and I
agree. But it's complexity that we ourselves have added.

And what browsers and websites themselves (with bloated JS tracking, ads, and
pointless functionality) have done is lower the standard to such a degree
that there is no longer any concept of leanness.

I know Gmail uses more RAM than older desktop clients while providing, on the
client side, a fraction of the functionality. On the server side there's at
least a performance concern for a lot of people, but on the client side the
attitude is "it's not my money, why not waste it." I'm not saying individual
developers are thinking about wastage, but we have made an ecosystem where
bloat and wastage are the norm. I would be truly hard pressed to find any
client application in any genre that is actually faster than it was five or
even ten years ago and doesn't stumble more than before in general usage.

~~~
pcwalton
Browsers are not "memory bloated". Go look at about:memory and see what the
memory actually goes to. It's mostly the JS heap, images, media, etc. You need
those things to be in memory because pages demand that those things be in
memory.

Browsers have probably had more effort put into their memory usage than any
other type of desktop or mobile app. It's certainly far more effort than any
of those apps in the '90s received.

~~~
tmd83
And yet here we are. And why wouldn't I count those parts of the browser?
Excluding the JS heap, especially, from any measure is absurd.

I am primarily using Chrome these days. I just looked at its own report. One
of my Gmail account tabs is using 891MB of memory. I am not saying it's
entirely the fault of the browser rather than the whole web ecosystem. What
you said is actually true: this is the result after millions of dollars'
worth of effort spent trimming the fat. But that only turns it from absurd
into actually pitiful. I am not faulting individual developers or just
browsers, of course. It runs from the spec, to the browser, to the web
developers; but if people don't accept that the web as a whole, for all the
fabulous advantages/access it provides (which I don't discount), is out of
control, this will never be fixed. There are fundamental problems here, and
no amount of incremental improvement will probably fix them. I guess I will
rant for a few more years to myself before accepting the status quo.

I don't restart my browser or machine for a fairly long time, and those few
long-running tabs keep increasing in memory. I also just opened up Firefox
and loaded up that same Gmail account fresh. I think it started off at 350MB,
and 5 minutes later it's over 400MB. Maybe it happened after I switched fully
to a web-based client, but I don't think I ever had a desktop mail client
that took 400MB of memory.
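
If anyone wants to reproduce that kind of measurement, here is a rough Python
sketch using the third-party psutil library. The process name and the use of
RSS as a proxy for "memory used" are my assumptions, and RSS double-counts
pages shared between browser processes:

    import time
    import psutil  # third-party: pip install psutil

    def browser_rss_mb(name="firefox"):
        # Sum resident set size across all processes whose name matches.
        # Process name is an assumption; it may differ per platform.
        total = 0
        for p in psutil.process_iter(["name", "memory_info"]):
            if p.info["name"] and name in p.info["name"].lower() \
                    and p.info["memory_info"]:
                total += p.info["memory_info"].rss
        return total / (1024 * 1024)

    # Sample once a minute for five minutes to watch for growth.
    for _ in range(5):
        print(f"{browser_rss_mb():.0f} MB")
        time.sleep(60)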

I am a server-side programmer who isn't really some grand expert in
performance (nowhere close), and I work with fat, bloated Java, but I still
know that 400MB gets you a long way, or it should.

------
dragontamer
Games like "Doom" are aiming for 60FPS / 4k resolution for example. There's a
hardcore graphics / rendering community that still wants to see their systems
pushed to the limit.
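
For a sense of scale, here is the raw pixel throughput that target implies (a
quick sketch; I'm assuming "4k" means 3840x2160):

    # Throughput implied by a 4k / 60 FPS target (3840x2160 assumed).
    pixels_per_frame = 3840 * 2160             # ~8.3 million pixels
    pixels_per_second = pixels_per_frame * 60  # ~498 million pixels/s
    frame_budget_ms = 1000 / 60                # ~16.7 ms per frame
    print(pixels_per_second, round(frame_budget_ms, 1))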

But I've never been part of that community. I've always played Nintendo games,
pixel-art games, casual games, etc. etc. For me, gaming was NEVER about
pushing the technical boundaries of computers, it was simply about having a
good time... hopefully with my friends. (Some games I play: Factorio, Cuphead,
Rocket League, Magicka, Touhou, Overcooked, Smash Bros, Puyo Puyo Tetris)

I dunno, the blog post is kinda weird to me. Rendering engines NEVER mattered
to me personally, but I've always known people who were obsessed with those
figures. As far as I can tell, those graphics-geeks still exist... and I still
see games being made for them. (And I assume those people still play games
like Doom Eternal to push 4k / 60FPS).

Heck, NVidia's big "RTX" event is still being picked up positively by some
groups for more realistic shadows or whatnot. Someone out there cares about
improving graphics these days.

-------------

It really depends on the specific game community. Assassin's Creed games are
about historical fiction. The story is pure fiction, but it attracts a band of
history buffs who ABSOLUTELY get pissed at minor historical inaccuracies.

In the case of Assassin's Creed, the art must be well-researched and
historically accurate. This leads to challenges, such as properly rendering
the stained glass windows in various churches. Rendering matters for sure, but
modeling these historical objects is also important.

------
ChrisSD
As the title is slightly unclear, I'll try to sum up the article:

Rendering (in videogames) matters less than it used to, in the sense that
it's no longer genre-defining. We've reached a point of diminishing returns
trying to (e.g.) make everything photorealistic. However, there are still rendering
problems that can pay dividends, especially in "our times of enormous asset
pressure". But...

> We have to think hard about what is useful to the end product.

------
mntmoss
I believe that rendering has entered an era similar to what audio synthesis
encountered after sampling synths became pre-eminent - you can do more-or-less
anything given enough time, so the main focus of R&D turns towards providing
better access, which in synthesis usually means presets. Rather than develop a
custom program from scratch, it's more common to have a source library and
then modify that into final material.

------
petermcneeley
Compare these offline renderings and ask again if rendering matters:

Peter Rabbit:
[https://www.youtube.com/watch?v=3ittn4f0Em4](https://www.youtube.com/watch?v=3ittn4f0Em4)

Another "rabbit" movie:
[https://www.youtube.com/watch?v=w3gQ117IKkM](https://www.youtube.com/watch?v=w3gQ117IKkM)

~~~
wolfgke
This "rabbit" movie is probably liked more by the HN audience:
[https://www.youtube.com/watch?v=aqz-KE-bpKQ](https://www.youtube.com/watch?v=aqz-KE-bpKQ)

~~~
vernie
Can't fucking escape it.

