[flagged] Unreal Engine 5 is meant to ridicule web developers (theolognion.com)



How does Unreal Engine 5 handle global state management?

It is the most complicated NP-hard problem that ever graced web development, and it is literally stifling our progress on building a drop-down box that updates all the other drop-down boxes on the page.

Like, obviously your health bar is a separate component. But when your health bar updates, how does the other health bar over your character's head also reflect that change? It just drives me nuts; is there, like, a variable or something that they both subscribe to?

And if UE5 did solve this, why didn't they even mention this technical feat in their demo? Web developers have held entire conferences on this problem since it was discovered, but for UE5 to not even bring up this accomplishment makes me wonder. #conspiracy


Generally things like that in games re-render from scratch on every frame. Since it happens every frame, you don't need state management to push updates around. Everything just grabs the latest value anyway.
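
In TypeScript-flavored pseudocode, the pattern looks roughly like this (purely illustrative; UE5 is C++ and its real HUD code looks nothing like this):

    // Immediate-mode sketch: every frame, each widget just reads the
    // current value and redraws. No subscriptions, no change events.
    interface GameState { health: number; maxHealth: number; }

    function drawBar(x: number, y: number, fraction: number): void {
      // ...issue draw calls to the GPU here...
    }

    function renderFrame(state: GameState): void {
      const hp = state.health / state.maxHealth;
      drawBar(20, 20, hp);    // HUD health bar
      drawBar(400, 150, hp);  // the bar above the character's head
    }

    // The game loop calls renderFrame ~60 times a second, so both bars
    // always agree without any state management pushing updates around.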

The DOM, on the other hand, is stateful and very slow to mutate, which is the core problem of high-level web dev and really not something developers have any control over.


> Since it happens every frame, you don't need state management to push updates around. Everything just grabs the latest value anyway.

This is not true, or at least it's deceptive. If you think of "global state" in a game, or more specifically in a rendered scene, as "all of the objects in the scene", then the main difference is that the web lacks acceleration structures. Most of the objects in a scene are never touched, because an AABB hierarchy or some other structure lets you cull large numbers of objects without ever actually visiting them. If objects in the scene change, you definitely need to update those acceleration structures; fortunately for games, those updates are quick and there are typically not that many of them.
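
To make that concrete, a toy version of AABB culling might look like this (illustrative only; real engines use hierarchies like BVHs or octrees, not a flat list):

    // Axis-aligned bounding box (AABB) culling: skip whole objects whose
    // boxes don't overlap the camera's view rectangle.
    interface AABB { minX: number; minY: number; maxX: number; maxY: number; }
    interface SceneObject { bounds: AABB; draw(): void; }

    function overlaps(a: AABB, b: AABB): boolean {
      return a.minX <= b.maxX && a.maxX >= b.minX &&
             a.minY <= b.maxY && a.maxY >= b.minY;
    }

    function renderVisible(objects: SceneObject[], view: AABB): void {
      for (const obj of objects) {
        // Objects that fail this cheap test are culled without ever
        // being "touched" beyond the bounds check.
        if (overlaps(obj.bounds, view)) obj.draw();
      }
    }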


On the web, the performance bottleneck is not figuring out what changed; it's doing any rendering at all. Rendering health bars in games is cheap because it happens on the GPU and because it doesn't have the sprawling layout ramifications ("reflow") that UI changes in the DOM do.


So, our rendering engine basically sucks? I guess browsers would eat up tons of battery if they rendered any faster.


Our rendering engine basically serves too many masters. It wasn't designed for what we're doing with it now, but it still has to support all the things people have done with it in the past.

This is why systems like Flutter, despite all their downsides, can be really snappy. They skip the DOM and all its baggage and just use their own "app-first" layout system.


An example to prove your point: Flipboard determined that our current rendering engine was too slow for their needs and implemented their own rendering engine on top of ours while retaining the rest of the browser infrastructure. [0]

[0] https://engineering.flipboard.com/2015/02/mobile-web


Yep. And as mentioned in the write-up, it brings lots of complications when it comes to accessibility. The same goes for SEO. It sounds like it was the right solution for them, but it isn't for everybody.


Is it entirely out of the question for browser makers to maintain the DOM abstraction and upgrade the underlying engine to operate like an “app-first” layout system?

Take CSS translate3d: it nudges the browser into handling those transitions on the GPU.
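
For example (a sketch; exact layer-promotion heuristics vary by browser, and the '.box' element is made up):

    const el = document.querySelector('.box') as HTMLElement;
    let x = 0;
    function step() {
      x += 2;
      // el.style.left = x + 'px';  // would dirty layout on every frame
      el.style.transform = `translate3d(${x}px, 0, 0)`;  // usually compositor-only
      requestAnimationFrame(step);
    }
    requestAnimationFrame(step);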

At the risk of sounding even more ignorant: could the DOM be pushed onto the GPU layer, where things happen so fast that it doesn't matter at all how often relayouts/reflows happen? Obviously it's using the GPU to render, no doubt, but what is it about website rendering that's just that much slower than a 2D video game?

I guess I'm describing React's circumvention of the browser to some degree. Ideally, we'd want something like React gone from userland and implemented at the browser level.


> could the DOM be pushed onto the GPU layer

Some of it already is. The actual drawing-to-an-image happens on the GPU, so maybe it was misleading of me to point that out in my last comment, but that's not the expensive part. Reflow is the browser deciding: "here's everything on the page, here are all their CSS styles, and this one CSS style changed, or this new element got added; how does that affect everything else?"

Thing is, you're welcome to pull everything out of that reflow process! Through transforms and absolute positioning, or the canvas, or WebGL, or whatever. And things will be a lot faster! But now you're taking on responsibility for a lot of stuff the browser used to do for you: many of the benefits, in fact, that cause so many people to build on the web in the first place. The web has the most powerful, flexible, and expansive UI layout system in the world. It's just that all of that comes with a cost.

The ideal scenario would be to have a "slimmed down" reflow mode that only does "app things" and doesn't get tripped up on "document things". However, that would be fundamentally incompatible with some parts of the web as we know it, and would fragment the platform. Here's an example of one of those weird pieces of baggage: https://wilsonpage.co.uk/preventing-layout-thrashing/. The browser doesn't know what you're going to do immediately after mutating the DOM, so it has to assume the worst in some cases.
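
The read/write interleaving from that article, in miniature (illustrative; the '.box' elements are assumed):

    const boxes = Array.from(document.querySelectorAll<HTMLElement>('.box'));

    // Thrashing: each offsetWidth read forces a synchronous reflow,
    // because the previous iteration's write dirtied the layout.
    boxes.forEach(box => {
      box.style.width = box.offsetWidth / 2 + 'px';
    });

    // Batched: all reads first, then all writes -- one reflow total.
    const widths = boxes.map(box => box.offsetWidth);
    boxes.forEach((box, i) => { box.style.width = widths[i] / 2 + 'px'; });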


Or we could go back to resource-efficient and lightning-quick native apps, like the good old days.


Just today I wanted to inspect a JSON file a few MB in size to analyze some API data. I have a $7k custom-made developer desktop with everything maxed out. I tried pasting the file into the usual web JSON inspectors like jsoneditoronline.org, and it basically hung every browser. I had to spend some time searching for a proper native tool that barely got the job done for what I needed.

Is it so hard? What gives?


What tool did you end up using?



vi can easily deal with hundreds of MB, if not multiple GB, on your custom-made machine.

That's why you use a binary transport like gRPC.


> "That's why you use a binary transport like gRPC."

Unpopular opinion, but it's true. Binary protocols are both bandwidth-efficient and processor-efficient. Back in the day, when a 25 MHz processor and 8 MB of RAM were the spec for a powerful UNIX workstation with a GUI, stuff like XML and JSON was simply not an option.
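
A back-of-the-envelope illustration in TypeScript (hand-rolled binary layout, not the actual gRPC/protobuf wire format):

    // JSON: field names, quotes, braces, and numbers as ASCII digit strings.
    const json = JSON.stringify({ id: 123456789, price: 19.99 });  // ~30 bytes, parsed char by char

    // Binary: fixed-width fields at known offsets, no names, no text parsing.
    const buf = new DataView(new ArrayBuffer(12));
    buf.setUint32(0, 123456789);  // 4 bytes
    buf.setFloat64(4, 19.99);     // 8 bytes -> 12 bytes total

    // Decoding is two fixed-offset reads instead of a character-by-character scan.
    const id = buf.getUint32(0);
    const price = buf.getFloat64(4);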


Ha! This was my first thought when I saw the demo. "Am I in the wrong industry? Why the hell does a megabyte of JavaScript take a second to start up? They're doing millions of triangles, and even a simple line chart hangs Chrome if it has more than 1,000 points. WTF!"

The DOM is too complicated. We really need a much simpler primitives pipeline optimized for performance.

We need a binary JavaScript. Just parsing it takes a long time.

The other demo that’s really impressive is Microsoft Flight Simulator 2020.


Why is this flagged? I'm a web developer and think this was hilarious!


Most readers here don't have any sense of humour and can't take a joke! %:-O)

HN and fun don't mix well :)


Yes, I know it's parody, but I don't really understand how it's web developers' fault that browsers aren't as powerful as native applications. The web is an old, haphazardly planned standard, and nobody really had any idea where it was going. So there's a reason it's all so cluttered, and why I still have to check my website on various devices and browsers before deploying it to my users. If using advanced tools like SPA frameworks is the only way to retain my sanity, I'm going to use them. (And yes, the author also parodied the churn of web technologies; I'm not going to deny that.)

Perhaps there should be a fast alternative to the current web, where the biggest goal is to serve pages with lightning speed. It could be architected from the ground up with built-in SPA support that would load pages in chunks while offering a smooth browsing experience. It could automatically adjust to the device and network speed of the user, and have all the legacy cruft of the old web removed. So basically like AMP, but without Google's greedy fingers.

Yet the chances of that happening are probably zero; where's the money in it? A secondary web just for power users? Hmm, yeah, let's see who's going to allocate resources to maintaining that.


Fell for it like a dumbass, mainly because I agree 101% with the complaints listed in the article.


This is mildly funny, but a bit of a cheap shot. I'd be more interested in "why is web dev so far behind video games in terms of performance". Either the author doesn't know and is being cynical/sarcastic/dismissive to sound smart, or they do know and could teach the rest of us a thing or two but chose to belittle web dev instead.

Edit: I just realised this is a parody website. Fair enough then, this is their editorial line!


I mean, I would still agree with what you said knowing it's a parody.


I laughed harder and harder as I kept reading. Fun!


He got me for a sec there; I almost didn't notice how similar the name of this site (The Olognion) is to The Onion...


Never seen this parody site before. Droll.


What a lazy, unfunny parody.



