As someone who knows very little about this space, is this just a hack for fun that nobody would actually use, or does it herald the possibility of an entirely new set of games that run in the browser?
Not from the official source, but we (Wonder Interactive Inc.) have a custom version of the engine that compiles to wasm. We are looking to upstream to Epic, or at least have a plugin that doesn't require custom source.
I love the idea you folks are trying to get this upstreamed, would benefit everybody.
IIUC the reason HTML5 was dropped in the first place was for the big render re-work that went on during 4.27 + 5.0. So now seems like a good time to add modern WebGPU support back.
Do have to ask, isn't your business model for "theimmersiveweb" built around this WebGPU capability? Providing that tech to others or making your site a marketplace for web games?
Same as all open source: other people can help maintain it, build/extend the functionality, ensure compatibility with other parts of the engine, etc.
> is this just a hack for fun that nobody would actually use, or does it herald the possibility of an entirely new set of games that run in the browser?
99.99% just a fun hack. Running an engine like this in the browser is massively overkill in the sense that most of the code & features are going to be dead weight you don't actually want. You can't run any UE5 game this way that actually benefits from UE5's advancements over UE4 or Unity or whatever; the assets are too big. You're not exactly going to stream 10GB+ of models & textures on-demand. You could build such a thing, but the engagement you're asking from the user at that point certainly justifies just being a native app and providing a much better experience.
That said, Unreal Engine is also pushing into the non-AAA gaming domain, and it's possible a smaller, easily run indie game ends up using the engine and still sees value in a browser-based deployment. Something on the scale of Vampire Survivors ( https://poncle.itch.io/vampire-survivors ), although that's using Unity but in theory there's no reason you couldn't do the same with UE5.
Unreal previously had built in support for asm.js + WebGL, so it's pretty reasonable to expect that you can compile UE5 games to WASM. The question is whether they will run, and it looks like the people responsible for this demo have ported the renderer, which is a big part of the equation. You would also need to port things like input, sound, filesystem and networking, but since they're loading textures and models, filesystem is probably ported too.
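Not from this project, but as a rough sketch of what the browser-side shell for a wasm-compiled engine looks like: fetch the compiled module and instantiate it with whatever JS glue (input, audio, filesystem) the ported subsystems expect. The import names here are hypothetical stand-ins, not anything from the actual port.

```javascript
// Minimal sketch of a wasm loader shell, not the actual UE port.
// `glue` is a hypothetical bag of JS functions the engine's ported
// subsystems would import (input, sound, filesystem, networking).
async function loadEngine(wasmBytes, glue) {
  const { instance } = await WebAssembly.instantiate(wasmBytes, {
    env: glue,
  });
  return instance;
}

// In a real deployment you'd prefer
// WebAssembly.instantiateStreaming(fetch("engine.wasm"), imports)
// so compilation overlaps with the download.
```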
The browser is still pretty bad for deploying large applications like a UE5 game, so your guess is as good as mine when it comes to 'will people be able to actually ship this way'.
I remember their 'citadel' demo. I recall it disappeared rather quickly. All that seems to have survived is a youtube video of it. https://www.youtube.com/watch?v=c2uNDlP4RiE
> The browser is still pretty bad for deploying large applications like a UE5 game
Please wait, downloading 80gb of content....don't refresh page or clear browser cache...ever
I do appreciate the achievement here, but it's not something I personally would want.
Most of the download isn't code but assets, and those can be streamed.
Realistically even the toughest AAA games can be tuned down to a few GBs of code and the assets required for the initial area. I remember on UE4 you could compile the entire executable with some small game in waaaay less than a GB without particular difficulty. You can use this approach and stream the rest later.
This is especially doable if you keep the game state and menus/UI in HTML + JS and avoid that part of UE entirely, so you only launch the engine for your actual map and game states.
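The split described there (HTML/JS menus driving a wasm game core) is really just a message boundary. A toy sketch, with a hypothetical `loadMap` export standing in for whatever entry point the engine would expose:

```javascript
// Hypothetical glue: the menu lives in plain HTML/JS and only calls
// into the engine once the wasm module has finished loading; commands
// issued earlier are queued.
function makeLauncher() {
  let engine = null;
  const pending = [];
  return {
    onEngineReady(exports) {
      engine = exports;
      // flush anything the menu requested while the engine downloaded
      pending.splice(0).forEach((name) => engine.loadMap(name));
    },
    loadMap(name) {
      if (engine) engine.loadMap(name);
      else pending.push(name); // engine still downloading
    },
  };
}
```

The menu UI stays responsive (and tiny) while the multi-GB engine payload streams in the background.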
Streaming is a much better option: latest hardware instead of 2011 WebGL and 2015 WebGPU, and a guarantee it works on the client, without having to worry about blacklisted GPUs and drivers leaving an empty black box for the user to see.
This work was apparently done by Wonder Interactive, you can see a previous post about their Unreal browser porting work here: https://theimmersiveweb.com/blog
I get the excitement around it but most UE5 games are giant... I get that pixel streaming is expensive for stupid car configurators, but downloading 8GB just to look at your new car... I don't know.
That's actually the main thing we've worked on; WebGPU support is small potatoes compared to it, but I'm happy it's getting some attention here. We have an asset streaming system that enables you to stream in large, or near-infinite-size, worlds at runtime; of course you still have to be smart about memory usage. We also have a server-side streaming solution that can be used as a fallback for constrained devices.
Yeah, the elephant in the room with big games on the web has always been and continues to be data storage. Even if the user sits there and waits for the whole thing to download, there's still no reliable way to make sure it stays downloaded until the user decides they want to uninstall it; all caches and storage APIs are subject to being purged at any time.
We have the ability to stream in content from our CDN, so even devices with little to no storage capacity can run games of near infinite size. The main issue in today's browsers is the 4GB RAM limit, although many games can run within that limit. We have MEMORY64 support as well, removing that RAM limit, but it will probably be a few months before some browsers enable MEMORY64 without having to use a flag.
Streaming solves some of the problems, but the browser purging caches behind your back means you'd presumably have to serve the same data to the same user many, many times in the course of a playthrough. Even if you get a good deal on bandwidth, is that economical? And how wide of a connection does the user need to keep up with streaming high quality assets?
Games are already pushing 100GB when you download them up-front, with redundant streaming it's not hard to imagine that piling up to over a TB of bandwidth for one user.
It can be: we have our data cached as close to the user as possible, in over 300 locations. It does add some latency, because instead of a 3ms latency to fetch and decompress assets, it now might be 17ms. However, because we also have a memory cache, this can reduce that latency significantly, and we use that for as many small and recently accessed assets as possible. Our virtual FS is multithreaded and works in tandem with Unreal's async loading threads, so we are able to fetch multiple assets at a time, reducing wait time. We also have the ability to know what assets are commonly fetched in a certain period and to fetch those ahead of time.
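A memory cache like the one described (small, recently accessed assets kept hot in front of the CDN) can be sketched as a simple LRU map. This is purely illustrative, not Wonder Interactive's implementation, and capacity here is counted in entries rather than bytes:

```javascript
// Toy LRU cache for decoded assets, keyed by asset path.
// Exploits the fact that a JS Map iterates in insertion order.
class AssetCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key); // re-insert to mark as most recently used
    this.map.set(key, value);
    return value;
  }
  put(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // evict the least recently used entry (first key in order)
      this.map.delete(this.map.keys().next().value);
    }
  }
}
```

On a miss you'd fall through to the browser's Cache API or a CDN fetch and `put` the result back.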
Of course, if you combine bad networks, lack of storage capacity and large projects, you can be sitting around a while, or may not have the best experience. Keep in mind, though, that browsers don't usually evict data from the cache unless you've used up the storage quota, the system is under storage pressure, or the origin has not been accessed in a while, according to them.
With severe streaming pop-in and LOD issues, though. UE5 games are already being designed around the assumption of having an SSD to pull assets from and have major visible issues when just using a hard drive instead, which is still a good order of magnitude faster than most internet connections.
I know this is super early, but I wonder how useful this is considering how bloated UE5 projects tend to be. I'd be interested in seeing what a slimmed down demo looks like.
Games with rich assets are big, that is true. But I don't see why this causes a problem for delivering a game over the web. It seems like the key requirements here are 1. Ask permission first and 2. Strongly cache the downloaded data. This is just like downloading a desktop game.
On the other hand, the strong sandbox of the web is highly desirable for games, which are often closed source and made with low security standards. Native sandboxing may be better for performance, but it won't be as reliable; that's a tradeoff it's nice to have the option of making.
The fact that this game can potentially be archived and playable on any device in the future is also very nice.
> It seems like the key requirements here are 1. Ask permission first and 2. Strongly cache the downloaded data. This is just like downloading a desktop game.
There's no sure-fire way to cache large amounts of data in a browser, all you can do is pray that the browser decides to keep it. The more data you load, the less likely that is.
It's a problem that has to be solved at the standards level, adding some kind of "persistent data" permission, but it's been a known issue for years and there's been no progress.
In practice there are limits, it would be very rude to let random websites fill your entire disk until you manually clear the storage. This page tests how much you can store in practice, the current versions of Chrome and Firefox both seem to barf after storing just 10MB: https://arty.name/localstorage.html . What I meant by a persistent data permission is adding an explicit permission prompt which allows an app to bypass that limit and store any amount of data, if the user allows it, which you can't currently do.
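For what it's worth, browsers do already ship a related (if weaker) mechanism: the StorageManager API lets a page request persistence and inspect its quota. If granted, `persist()` exempts the origin's data from automatic eviction, but it doesn't raise the quota, and the browser is free to deny it silently, which is why it doesn't fully solve the problem described above. A sketch, written against a passed-in storage object so it can be exercised outside a browser:

```javascript
// Sketch: request persistent storage and check quota via the
// StorageManager API. In a browser, pass navigator.storage; the
// parameter just keeps the sketch testable elsewhere.
async function checkStorage(storage) {
  const persisted = await storage.persist();  // may prompt the user
  const { usage, quota } = await storage.estimate();
  return { persisted, usage, quota };
}

// Browser usage: checkStorage(navigator.storage)
```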
The problem with maintaining unofficial language bindings of any kind is that it's an extraordinarily tedious, unrewarding job. It's hard to imagine many programming tasks which are less fun.
I've done it a few times, and I've learned to steer well clear of either creating or relying on such bindings if at all possible.
Epic is working on a somewhat Haskell-esque scripting language for UE5. It's called Verse and is currently available as a beta in the Unreal Editor for Fortnite, which is a simplified UE5 for creating community content for Fortnite. I haven't tried it yet, but the talk they held at GDC a few years back was promising.
I wonder why so many prefer reinventing the wheel (see any sort of Xscript out there) instead of leveraging some pre-existing and proven language. I don't mind if it's something other than C#, but I want great IDE support and, most importantly, memory safety + strong typing.