Show HN: Unity like game editor running in pure WASM (raverie-us.github.io)
644 points by TrevorSundberg on Sept 26, 2023 | 159 comments
In the wake of all the Unity nonsense, just wanted to toss the Raverie engine into this mix :)

We’re building off a previous engine that we worked on for DigiPen Institute of Technology called the Zero Engine, with a component-based architecture similar to Unity's. Our engine had a unique feature called Spaces: separate worlds/levels that you can instantiate and run at the same time, which became super useful for creating UI overlays using only game objects, running multiple simulations, etc. The lighting and rendering engine is scriptable, and the default deferred rendering implementation is based on the Unreal physically based rendering (PBR) approach. The physics engine was built from the ground up to handle both 2D and 3D physics together. The scripting language was also built in-house to be a type-safe language that binds to C++ objects and facilitates auto-complete (try it in the editor!)

This particular fork by Raverie builds both the engine and editor to WebAssembly using only clang, without Emscripten. We love Emscripten, and in fact borrowed a tiny bit of exception code that we’d love to see upstreamed into LLVM; however, we wanted to create a pure WASM binary without Emscripten bindings. We love WASI too, though we already had our own in-memory virtual file system, hence we don’t use the WASI imports. All WASM imports and exports needed to run the engine are defined here: https://github.com/raverie-us/raverie-engine/blob/main/Code/...
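For a rough flavor of what declaring that import/export surface looks like (the function and module names below are made up for illustration, not our actual imports from the header linked above), clang lets you express WASM imports and exports directly in C, compiled with something along the lines of `clang --target=wasm32 -nostdlib -Wl,--no-entry`:

```c
#include <stdint.h>

#ifdef __wasm__
/* Hypothetical import: a function the host (browser JS) must provide.
   import_module/import_name control the WASM import entry's names. */
__attribute__((import_module("env"), import_name("host_log")))
void host_log(const char *message, uint32_t length);

/* export_name makes the function visible to the host by this name. */
__attribute__((export_name("engine_frame")))
#endif
uint32_t engine_frame(uint32_t frame_count);

/* Exported entry point: the host looks this up on the instance
   and calls it once per frame, passing state back and forth. */
uint32_t engine_frame(uint32_t frame_count) {
    return frame_count + 1;
}
```

The same source compiles natively too (the attributes are guarded), which keeps the engine testable outside the browser.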

The abstraction means that, in the future, porting to other platforms that can support a WASM runtime should be trivial. It’s our dream to be able to export a build of your game to any platform, all from inside the browser. Our near-term roadmap includes getting the sound engine integrated with WebAudio, getting the script debugger working (it currently freezes), porting our networking engine to WebRTC and WebSockets, and getting saving/loading from a database instead of browser local storage.

Our end goal is to use this engine to create an online Flash-like hub for games that people can share and remix, akin to Scratch or Tinkercad.

https://github.com/raverie-us/raverie-engine




Wow you guys wrote Zero Engine? I used it in summer camps at DigiPen a long time ago, I really enjoyed using it and programming in Zilch (I wrote an AI that played a galaga clone!). This Unity fiasco had me wondering how Zero was doing. Seems like it's going to have a resurgence! Super exciting!


That's so cool! Zilch was my baby, that makes me so happy :D


Zero/Zilch was great, and I wish DigiPen would've stuck with it! I heard that these days, they don't even have students make games from scratch anymore, and everyone just uses Unity or Unreal instead—any idea if that's true? it would be a bummer if it were, because striving to make a cool engine (especially once Zero was there as something to aspire to!) was like the best and most useful part of my time at DigiPen.

glad to see the spirit of Zero still going strong through Raverie, though!


I teach GAM 300/350 at DigiPen. We still require them to create custom engines in their sophomore year, but in their junior year (and beyond, in some cases), they're all using commercial engines. One of the main things we want to focus on is giving them a chance to work on a truly cross-discipline team, where every member has a chance to thrive at their work, regardless of their degree program. If you're a student of game design or art, it can be very stressful to be on a team with a custom engine, because the bar is so high for the programmers to provide the features you need to get your work done and accomplish your goals. If the engine and its editor and other support tools are all already completed when the project begins, these stresses and barriers are removed. Meanwhile, the programming students get a chance to work in an existing engine, which is applicable to the career trajectories of the vast majority of them, who will graduate and go on to work for companies that are using an existing engine (possibly even the one they used in GAM class). If they use Unreal, they're often working in C++, interfacing directly with the engine code. These are valuable experiences as well.

I have an imperfect perspective of the students' attitudes, but it seems to me that they tend to agree with my point of view: at the beginning of the year, I said, "Raise your hand if you wish you'd be working on a custom engine this year," expecting to see a few dozen hands go up. Surprisingly, only about one and a half hands were raised! Working on a custom game engine is an amazing and unique experience, but it's not the end-all-be-all of game programming.


Hey Doug, it’s Avi Eisner - long time no see! Dan told me you’re working on your own game engine - would love to see it some time and catch up!


Hi Avi! Great username. I assume the Dan you're talking about is my brother-in-law, because I don't recall telling Danny about my engine. It's true, technically: I was working on a game engine in JavaScript and then later TypeScript and then later JavaScript again, and since I haven't literally deleted the repo, I guess you could say I'm still working on it, but it's pretty far from the top of my priority stack right now. Still, I'd be happy to chat about it sometime, even if I can't really show anything running in it! I'll send you a message and we can catch up.


Thanks, yep - I had lunch with Dan last week. I understand, I have a half-dozen coding projects in various stages of (in-)completion.


I second that!


Trev, I think the last time we had breakfast together, I had already caught you up on the latest developments of my engine which is now gathering e-cobwebs over in its Github high-density urban housing development repo. Doesn't mean we shouldn't have breakfast again anyway!


I think it's great that sophomores still have to make their own engine from scratch.

making two engines, one in C for GAM150 and another in C++ for GAM200, was an extremely formative experience for me. when I was attending DigiPen, it was the beginning of the ECS craze, so, naturally, I implemented terrible, extremely naive ECSes in C and C++. they were woefully inefficient and byzantine to the point of ridiculousness (look up Game School Simulator 2015 on the DigiPen student games site—its framerate is terrible, despite rendering consisting of just a few dozen sprites), but I learned so much from making them—both good and bad. I especially enjoyed GAM200 when we got artists who had any spare time to work on a GAM project, because working to integrate them into the workflow was an amazing experience.

I understand that DigiPen wants to provide students with as much career opportunity as possible, and that means Unity and Unreal. or rather, it did mean Unity and Unreal, until very recently—now, it's kinda just Unreal. and Unreal isn't going away anytime soon, of course—but that's what we would've said about Unity just a short while ago.

even though it's certainly the pragmatic choice as far as getting a job after graduating goes, personally I firmly believe it's a mistake to forgo having students make their own engines after sophomore year. at least when I was there, there was borderline zero education about how to even go about making a game engine, despite requiring teams to make one, so there was that—but there was also a vibrantly competitive culture of one-upmanship among students, and it was fun to see what other teams were working on, and strive to do better and cooler stuff yourself. (Josh Fisher, if you're somehow reading this for some reason, your shit was always way better than mine and I was always super jealous.)

if students stop learning about how video games work at any lower level than "just use an off-the-shelf general-purpose game engine", then who's going to make game engines going forward? some of the smartest classmates I had were guys who put their heart and soul into their engine projects, and they all went on to find success one way or another as far as I know.

perhaps more importantly—and I only say this now with a decade of hindsight and experience since then—you don't need to make a general-purpose game engine in order to make a video game. our GAM 150 game did not need an ECS written in C to function. when I look at the code now, it's completely unreadable—everything is strewn about in random files, tenuously connected, such that it's a real archeological task to even figure out how the (extremely simple) core logic of my GAM 150 game even works. like I said, I really appreciate having had the experience of making terrible garbage tech, such that I could go on to learn how to do better. but it would've been even better if we had better instruction from professors, explaining that you don't need any of these fancy features to make a fully-functioning video game, especially on the scale of what's expected of a GAM 150/200 team.

I worry that going forward, the new generations of video game programmers are going to be too entrenched in thinking about things in terms of how e.g. Unreal does them, rather than what the solution to a given problem entails at the minimum level. sure, these people will be able to go on to find work in the industry at Unreal shops, but then what happens in the (admittedly extremely unlikely) event that something happens to Unreal as it did to Unity?

I don't know what DigiPen has been like in the past decade, but after I dropped out and talked to students from other game schools, it sounded like they had always, even back then, been pretty much Unity-centric—what happens to their former students now that Unity's in the situation it's in? do they feel cheated, like the only way that they know how to make games, the way they were taught, has now been somewhat invalidated? if they jump ship to another engine like Unreal or Godot, will they adapt, or will they still be trying to fit an Unreal peg into a Unity-shaped hole?

when Zero Engine was heavily in development while I was at DigiPen, it was hugely inspiring, because Zero was a great product, and the Zero team were available to chat if you needed advice about engine development. it's kind of a bummer to hear that nobody's taking their GAM 200/250 engine into GAM 300/350...


>if students stop learning about how video games work at any lower level than "just use an off-the-shelf general-purpose game engine", then who's going to make game engines going forward?

could this be a specialization or "minor", perhaps?

IDK, it's tough. I want more students to be learning this stuff in a proper environment, but the reality is that so many of the real tricks and sauce are either a) buried deep, deep inside some "public" repos, or more likely b) tribal knowledge at a AAA studio. Maybe DigiPen is different as a gaming-focused school, but I never learned about ECS or object pooling or profiler usage or especially cache coherency in college. And I'd struggle to group all these important engine concepts into a course, since the knowledge is simultaneously disparate but related. each topic could be its own thesis if you wanted it to be.
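Object pooling at least fits on a page. A toy fixed-size pool sketch (the sizes, names, and the `Bullet` type are illustrative, not from any particular engine): a free list threaded through a static array, so per-object acquire/release never touches malloc/free.

```c
#include <stddef.h>

#define POOL_CAPACITY 64 /* illustrative size */

typedef struct Bullet {
    float x, y, vx, vy;
    struct Bullet *next_free; /* intrusive free-list link */
} Bullet;

static Bullet pool[POOL_CAPACITY];
static Bullet *free_head = NULL;

/* Thread the whole array onto the free list once at startup. */
void pool_init(void) {
    free_head = NULL;
    for (int i = POOL_CAPACITY - 1; i >= 0; --i) {
        pool[i].next_free = free_head;
        free_head = &pool[i];
    }
}

/* O(1) acquire: pop from the free list; NULL when exhausted. */
Bullet *bullet_acquire(void) {
    if (!free_head) return NULL;
    Bullet *b = free_head;
    free_head = b->next_free;
    return b;
}

/* O(1) release: push back onto the free list for reuse. */
void bullet_release(Bullet *b) {
    b->next_free = free_head;
    free_head = b;
}
```

When the pool runs dry you either cap spawns (common for bullets) or grow via a second block; either way, steady-state gameplay allocates nothing.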

>what happens to their former students now that Unity's in the situation it's in? do they feel cheated, like the only way that they know how to make games, the way they were taught, has now been somewhat invalidated?

custom engine or not, I do adamantly feel like a proper game dev program should teach enough CS chops for this to not be a problem. using a high-level engine is useful, but the rest falls on fundamentals: data structures and algorithms to know how to group data together and the tradeoffs; systems-level programming to understand memory management (yes, even in Unity; intuiting the cost of allocating and deleting game objects only helps); any one of various domains (network, graphics, databases, UI) to get different perspectives on how data can be structured and processed; etc.


> Maybe Digipen is different as a gaming focused school, but I never learned about ECS or object pooling or profiler usage or especially cache coherency in college.

I could write maybe a short pamphlet's worth of things I wish professors had taught me while I was at DigiPen that would've made all the difference in the world to me. the existence of cache coherency is definitely one of these topics, but ECS definitely is not. it would just be a few other general ideas, like:

“hey, y'know how you just learned about malloc() and free() in CS 120 when you learned C? well, it's not a good idea to be calling those all the time in your game. instead, have something like this:

    #include <assert.h>
    #include <stddef.h>

    #define MEMORY_NEEDED (1024 * 1024 * 4) /* adjust as needed */
    static unsigned char all_the_memory_you_need[MEMORY_NEEDED];
    static size_t end_of_game_memory  = 0;
    static size_t end_of_level_memory = 0;
    static size_t end_of_frame_memory = 0;

    void *alloc_for_game(size_t bytes) {
        assert(end_of_game_memory + bytes <= MEMORY_NEEDED);
        void *ptr = &all_the_memory_you_need[end_of_game_memory];
        end_of_game_memory += bytes;
        return ptr;
    }

    void *alloc_for_level(size_t bytes) {
        assert(end_of_level_memory + bytes <= MEMORY_NEEDED);
        void *ptr = &all_the_memory_you_need[end_of_level_memory];
        end_of_level_memory += bytes;
        return ptr;
    }

    void *alloc_for_frame(size_t bytes) {
        assert(end_of_frame_memory + bytes <= MEMORY_NEEDED);
        void *ptr = &all_the_memory_you_need[end_of_frame_memory];
        end_of_frame_memory += bytes;
        return ptr;
    }

    void reset_frame(void) { end_of_frame_memory = end_of_level_memory; }
    void reset_level(void) { end_of_level_memory = end_of_game_memory; reset_frame(); }

    /* call reset_level() after your alloc_for_game() calls, and reset_frame()
       after your alloc_for_level() calls, before using the later lifetimes */
    /* likely not quite perfect sample implementation but you get the gist */
check it out: now instead of malloc() and free()ing everything everywhere all the time and worrying about memory leaks, we have "lifetimes" now, one that's per-frame, and one that's per-level (if your game needs levels). you just use alloc_for_game() to allocate stuff that your whole game needs at the start of the program's execution, and then you reset_level() and alloc_for_level() when you change levels, and reset_frame() at the end of your frame and alloc_for_frame() during it, and bam, now you don't need to worry about memory leaks, you don't need garbage collection, and your temporary bump allocator is reset and ready for the next frame—and the only cost was resetting a single variable to zero in a couple key places. neat, huh?”
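The same idea boils down even further to a single arena plus a per-frame watermark. A self-contained toy version (the sizes and names here are mine, not from the snippet above): one bump pointer, a snapshot at the top of each frame, a rollback at the bottom.

```c
#include <assert.h>
#include <stddef.h>

#define ARENA_SIZE (64 * 1024) /* toy size */
static unsigned char arena[ARENA_SIZE];
static size_t used = 0;        /* bump pointer */
static size_t frame_mark = 0;  /* watermark taken at frame start */

/* Bump allocation: grab the next `bytes` bytes, no free() per object. */
void *arena_alloc(size_t bytes) {
    assert(used + bytes <= ARENA_SIZE);
    void *ptr = &arena[used];
    used += bytes;
    return ptr;
}

/* Per-frame lifetime: snapshot at the top of the frame, roll back at
   the bottom -- every frame-temporary allocation is "freed" at once.
   Anything allocated before the first begin_frame() is persistent. */
void begin_frame(void) { frame_mark = used; }
void end_frame(void)   { used = frame_mark; }
```

A typical loop then reads: allocate persistent state, then per frame `begin_frame(); ... arena_alloc(...) for temporaries ...; end_frame();` — the "free" is one assignment.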

stuff like that—stuff that's super basic and easy to understand once it's explained to you, but that most people wouldn't arrive at on their own, especially when you're new to game programming and just chasing cargo cults like ECS without sufficiently understanding how stuff really works. for example, the ECS I wrote in C was terrible in part because I only understood the high-level ideas at the time—the whole thing was (hysterically, looking back now) implemented with linked lists, as I was approaching systems design in terms of API, instead of actual functionality.
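for contrast, the cache-friendly version of that idea is just flat arrays walked in a tight loop—a toy sketch (the component types and names are made up for illustration; this is nowhere near a full ECS):

```c
#include <assert.h>
#include <stddef.h>

#define MAX_ENTITIES 1024

/* Contiguous component storage: one flat array per component type,
   so the per-frame loop walks memory linearly (cache friendly),
   instead of chasing linked-list nodes scattered across the heap. */
typedef struct { float x, y; }   Position;
typedef struct { float dx, dy; } Velocity;

static Position positions[MAX_ENTITIES];
static Velocity velocities[MAX_ENTITIES];
static size_t   entity_count = 0;

/* Create an entity by appending to the packed arrays; its id is its index. */
size_t spawn(float x, float y, float dx, float dy) {
    assert(entity_count < MAX_ENTITIES);
    positions[entity_count]  = (Position){ x, y };
    velocities[entity_count] = (Velocity){ dx, dy };
    return entity_count++;
}

/* The "system": one tight loop over packed, homogeneous data. */
void integrate(float dt) {
    for (size_t i = 0; i < entity_count; ++i) {
        positions[i].x += velocities[i].dx * dt;
        positions[i].y += velocities[i].dy * dt;
    }
}
```

the point isn't the API, it's the memory layout: the loop body touches two sequential arrays and nothing else.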

once you've taken a CS course to learn the basics of C, and a math course to learn the basics of matrix math and how it pertains to game programming (both of which were excellent at DigiPen, by the way!), you're just a few pointers like these away from being able to completely—and relatively competently—implement your own 2D "game engine", given that you're using external libraries for e.g. rendering, input, and sound playback.

but then, of course, the resulting student games whose screenshots and video clips you gather to use for promotional material for your school wouldn't be as flashy, compared to contemporaries who only teach e.g. Unity...


>the ECS I wrote in C was terrible in part because I only understood the high-level ideas at the time—the whole thing was (hysterically, when looking back now) implemented with linked lists, as I was approaching systems design in terms of API, instead of actual functionality.

well that sounds awful. I wasn't thinking too deeply about which topics I'd throw onto a curriculum, but I'd hope that they wouldn't treat ECS like some pattern to memorize, like the design patterns in my SWE course (literally called "software engineering"; quite confusing in retrospect). It should be treated as a way to apply and understand data-oriented design, so you can make informed decisions on how to implement things.

I'm especially a fan of always identifying any and all shortcomings of an approach too, because there's no better way to understand an approach than to reason about its weaknesses. ECS is nice, but you shouldn't try to shove it into stuff that requires tight, complex coupling (e.g. a player controller), nor into random, infrequent events (UI input). And of course, if you are making a small game that can almost entirely fit in RAM anyway (well, in theory; desktop OSes wouldn't allow this, obviously), these patterns are overkill and a half

>of course, the resulting student games whose screenshots and video clips you gather to use for promotional material for your school wouldn't be as flashy

well that is reflective of the modern game industry, haha. I imagine even in a place like DigiPen, where you are working with motivated game artists, there are still technical constraints to consider with the art team (especially students). And building tools to help with that would take as long as the small game itself.

I did ponder if it'd be a good idea to have a CS game dev course teach some basic 2D/3D art, and vice versa for an artist learning some CS101-style stuff. But the CS curriculum was already jam-packed as is, at least at my alma mater. I believe the average course load required 170 units, and CS (like other engineering degrees) was topping out at 190.


I heard the same thing and I really hope what they're doing is for the best. It would have fundamentally changed my path if I didn't get that core low level experience of building game engines.


do you know what engine was in use or in development around 2002-2003? the engine was somewhat baked but flexible, and had entry points where you could drop C++ snippets in, I believe with a GUI


Was it a 2D game engine? I recall before using Zero I used an engine called ProjectFUN, which is similar to what you described. That was around 2013 though, not 2002.


2D, yes


Just to contribute: yet another instance of "cool slideshow" on my (most definitely low end) laptop with integrated graphics.

Then tried it on my aging Mate 20 Pro (not too new, not tooo old), and it just seemed to get stuck at "Downloading runtime". Got a funny feeling my flaky 4G wasn't actually to blame, inspected the tab from my laptop (chrome://inspect) via USB, and was greeted with

  Uncaught (in promise) Error: Needs OES_texture_float_linear to function
      at M (worker-3ec07b2e.js:1:3338)
      at J (worker-3ec07b2e.js:1:3403)
It is what it is. It's a good philosophical question about whether it's worth coding up an error screen, since it's all just to tell people they can't play with the toy. Hehe.



Hmm, I think that polyfill might only be for WebGL1 (we use WebGL2). When I go to the WebGL1 report I see that my card supports both:

OES_texture_float OES_texture_float_linear

But on WebGL2, it only shows OES_texture_float_linear; I'm guessing OES_texture_float doesn't exist on WebGL2.


How low end is it? I'm using my 8-year-old Surface Pro 4 on Starbucks wifi and it's blazing fast.


I like the concept. My GPU and browser weep with lag. It's a very impressive slideshow.

I finally managed to highlight the starting sphere after several tries. At first I wasn't sure if it was loaded.

Of course, I'm only using a 1.8 GHz Celeron with onboard GPU and 8GB Ram in Firefox, so... not exactly high-end stuff.


I'd be curious if it runs any better under Chrome. Seems a lot of people are hitting Firefox issues. I'll be sure to test Firefox regularly though.


Because of these questions, I have now downloaded and tried:

Firefox Developer (119.0b1), Chrome (117.0.5938.92), Opera (102.0.4880.56), Edge (117.0.2045.43).

Turning off all other programs (without crashing Windows), I ranked them by trying to drag the selection square ~2 inches x 2 inches (on a 1920 x 1080 laptop with the above-noted characteristics) without doing anything else. I did the drag back and forth TL-BR, BL-TR 5-6 times to get a feel for general timing (can't claim I used a stopwatch):

Edge: ~1 sec response before square changes

Chrome: ~1.5 sec response before square changes

Opera: ~1.5 sec response before square changes

Firefox: ~2 sec response before square changes

From my super low-end perspective, Edge is surprisingly responsive (maybe optimized for Windows?)

Edit: From another perspective, looking at the Dev Tools (what I actually care about), Edge is pretty hilarious: it has 72 errors on its default landing page (including "this is not a TrustedScriptURL"). Opera only has 2 "Uncaught (in promise)", and both Chrome and Firefox have 0.

There's something comforting about the idea that Microsoft still does not trust Microsoft. ("Left hand meet right hand. Begone fool, we have no hands!") The more things change...


It was completely unusable on my M1 MacBook on Chrome, I suspect it's hitting some degenerate webgl/driver edge case.


Strange, it was super responsive on my M1 MacBook Pro under Firefox.


Interesting. M1 Pro Macbook Pro (14" 2021) w/ Firefox, massive lag - to the point rotating the view is unbearable.

For the dev: FF 117.0.1 on macOS 13.4.


Same here on M2 MacBook Air.


AFAIK, Firefox has all-around awful WebGL and WebGPU performance.

I know from personal experience that anything 3D in Firefox is a slideshow, even with a 4090


It's running quite well here on Firefox in Fedora 38 with an AMD Radeon 6700 XT.


I have an AMD 6700xt.

It was laggy in Windows and under X11 on Linux, but under Wayland I get 60-80fps and it's very usable.

Not sure what the cause is

EDIT: Under Chromium (compiled from AUR) the performance is roughly the same


Works fine for me running Firefox in Windows with a 3090. Also already have Unreal open on another screen. So lots of my GPU memory is already being eaten up.


In Firefox on a 12th-gen Framework i5-1240P, Fedora 38, integrated graphics at 2256x1504, it works and looks fine for me.


Hey all, if you’re willing to try again we just put out a fix that dramatically helps performance on some machines. Since this was a port to WASM from a native game engine, it turns out the issue was in our frame rate limiting code which wasn’t playing well with browser timing APIs:

https://raverie-us.github.io/raverie-engine/
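For the curious, the core of a native frame limiter is just computing how long to wait out the rest of the frame budget (a generic sketch, not our actual code). The browser wrinkle is that instead of sleeping for that wait, you have to hand it back to setTimeout/requestAnimationFrame, which is where our native timing assumptions broke down:

```c
/* Frame limiting: measure how long the frame took, then wait out the
   remainder of the target period. Clamp at zero when the frame ran
   long, so a slow frame is never penalized twice. */
double frame_wait_ms(double elapsed_ms, double target_ms) {
    double wait = target_ms - elapsed_ms;
    return wait > 0.0 ? wait : 0.0;
}

/* e.g. 60 fps -> ~16.667 ms per frame */
double target_ms_for_fps(double fps) {
    return 1000.0 / fps;
}
```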


I'm on an M2 Pro and had the exact same experience, also on Firefox.


So M2 = Celeron?


jesus they still make celerons?


> Our engine had a unique feature called Spaces: separate worlds/levels that you can instantiate

This is, as far as I can tell, exactly what the Godot engine does[1]. That engine also has a browser (wasm) target.

[1]: https://godotengine.org/


Not only does it have a wasm target, it also has a web version: https://editor.godotengine.org/

I suppose it's not any kind of priority to the project. Tested it a bit and it does seem to work :), but it's slower than running it natively.


That's because the editor is built in Godot as well, as far as I recall


I don't see how it's different from scenes in Unity either


Worlds/Spaces in Godot at least are more to do with physics/lighting than scenes/levels. So two objects in the same scene can occupy different "spaces" (for physics), "scenarios" (for rendering), or "worlds" (which encapsulate a space and scenario, for both physics and rendering). Objects in these different contexts will appear in the same scene but will be separated in terms of those contexts and won't e.g. collide with each other or be affected by forces from one space if in a different space, or be lit by lights from a different scenario.


Interesting. What are the use cases for being able to do this?


The main use case is for subviewports - nodes which contain a scene tree but aren't rendered with the main window, instead you grab a texture from the viewport and can display that how you like, as a mesh texture or as a sprite/texture rect, input to a shader, etc (similar to render textures in unity). You can apply a World resource to the subviewport (the default is, I believe, for them to have their own worlds), and that ensures that they are treated entirely separately from the main viewport's children and will only collide with things and be lit by objects in the subviewport. But, that may not be desired - you may want to use a subviewport to render certain objects and then apply post-processing only to those objects, but otherwise treat them like objects in the main scene - in that case you can have the subviewport share the same World as the main viewport, causing objects inside and outside of the subviewport to interact.

Similarly, there are often gameplay reasons you may want to separate physics/lighting for particular objects, and maybe even have several "spaces" in which objects can interact that you want to swap out at runtime. Stealth games where shadows are important to the gameplay, or games that have an "inner" and an "outer" in terms of physics (maybe a car in the world, and then objects inside the car in their own inner world that maybe obey slightly different physics parameters, for example).


I see. Thank you for the explanation!


Everything in Godot is a scene, from actual scenes to groups of objects, etc.

It makes composing and managing objects and compositions much easier than just with the tree view.

Like a Unity prefab, but much better


Do you know anything about any WASM developments that will enable pure WASM interaction with the browser's Web APIs at no or low cost, without the JS layer? Sometimes I look at https://github.com/WebAssembly/proposals and it's very confusing. There are the type imports proposal (years away), the almost-complete GC proposal (which is apparently only for GC'd languages, not for anything browser<->wasm), the component model (which looks and sounds like something not for the browser use case), JS String Builtins (which will provide faster JS strings, but not DOM), and ECMAScript module integration (which will turn WASM modules into ES modules, but Web APIs aren't ES modules, so no luck). Sometimes I read contributor interactions and it looks as if providing such functionality isn't their priority or even in their plans, and for the majority, WASI + the component model for the cloud, crypto, and similar use cases are more important.


I really yearn for this myself. I would love to see a “pure wasm” browser where the pages are just wasm and the “browser” just provides a bunch of platform imports like WASI. I agree with you though, seems like it’s not a priority for them. One of these days…


Reference types technically already allow for calling directly into the browser APIs as reference types allow wasm to pass JavaScript objects around. So if you import all the relevant web apis, you don't really need any (almost any?) intermediate JS anymore, because you can just forward the JS objects between the individual API calls.

The only real need for JS would be to initialize the wasm module in the first place.


>you don't really need any (almost any?) intermediate JS anymore

I don't think that's true. You still need JS glue code that would return these externrefs. A WebAssembly module loaded through ESM integration can't, as far as the current proposals go, access Web APIs on its own. As far as I understand, there are also a number of performance issues related to the wasm spec itself that limit possible optimizations, making wasm second-class to JS in terms of Web API access performance.


The problem with pure wasm is that we get a closed-source web with unblockable ads.


that ship has sailed. Minified and optionally obfuscated JS is already highly annoying to decipher. The wasm spec is surprisingly high-level for what should be a low-level VM, is planned to get much more high-level (approaching JVM levels), and isn't any less readable than C++ compiled to asm.js.


Still infinitely better than WASM blobs where everything is painted on a canvas, which is un-adblockable. The experience will also be hellish, because every website will ship a 10 MB blob of half-baked things that your browser already does a million-times better out of the box. And for what? Because people think it will be faster (it won't).


The experience will probably be hellish anyway because of the Web Environment Integrity API. JavaScript compiled to asm.js could produce painted-to-canvas websites too, and it won't be any less viable, especially considering the growth of CPU scores. WebAssembly is prevalent in the canvas use cases because DOM manipulation with it is slower and clunky (due to the poor browser integration), and because people want to reuse native-language codebases.

If the canvas website really was a threat we would have seen much more of it by now already, but it hasn't happened for a number of reasons -- we only see them in the dedicated desktop-app-like use cases like this one. Anything scrollable won't be viable because it would be inherently less responsive than the real deal. Fonts look different, most designers actually prefer for people to be able to select text, and be able to use browser functionality that looks native for the different OSs. Canvas websites are also more expensive in terms of blob loading and ability to cache pages. So it's not really a threat.

Most adblocking happens by means of network/URL blocking, and I don't see any potential difference between WASM and JS in this regard (unless the APIs are enshittified, but JS isn't impervious to that either). If ads can't be blocked by network, and the website proxies ad contents through its own domain (as far as I know that's not prevalent, maybe I'm ignorant -- they need to track users and clicks themselves anyway), then you could get hard-to-block websites with JS+DOM too -- it's possible to disrupt DOM-based blocking by frequent randomization/obfuscation to the degree that one would need some kind of advanced AI to even hope to find the ad DOM elements. Next to WEI, web bundles, and the Topics API push, webassembly's supposed higher indecipherability (which isn't true: Chrome "disassembles" WASM into the WAT form and it's pretty readable, more readable than asm.js) doesn't even register.

>And for what?

To not deal with JavaScript? To be able to develop for the web in the language you want, without floats being the fundamental numeric type and other BS? Is that too fantastical of a desire?


One day I'd love to compile part of Chromium's rendering into a WASM module. In my imaginary world, people just make html+js+css pages using the "firefox.wasm", "chromium.wasm", whatever rendering engine of their preference, and browsers just become wasm players. They can even expose APIs to support local fonts and whatever else browsers need today. I realize that's far-fetched given the sheer amount of rendering optimization Chromium has done just to work on all the hardware, graphics APIs, and platforms they support. But this is honestly one of the reasons I wanted to port our engine to pure WASM. I wanted to see what a minimal API would look like for a full game editor (not just engine!) that a browser would have to expose with basic graphics and audio. It's a lot smaller than I expected, and side note... Emscripten did a great job of hiding all those details! WASI has made huge strides in getting OS level APIs exposed to WASM, but it's mostly headless stuff, not really a focus on GUIs or audio yet. I'm sure proposals exist, but the speed of getting these mainstream is slow because I'm sure they want to get it right. I'm hoping this engine can serve as an example for what we need from WASI to start making this possible.

And if you're curious, this is why I linked this header in the post; it has the minimal surface of functions we needed: https://github.com/raverie-us/raverie-engine/blob/main/Code/...

And I'm sure this fictional browser could find a way to make ad-blocking plugins that analyze network traffic and decompile wasms to scan for whatever using the latest AI magic :P


I was sceptical WASM was ever going to be mainstream, but after reading your post I'm convinced it's the future. This sounds like an advertiser's dream, and ads already rule the internet, so the conclusion is obvious.


I believe that's at least part of what the GC proposal is moving towards.


Cool project.

Is there any way to scale the UI for high dpi displays?

On a 2:1 display, the fonts look aliased, if I set the browser zoom to 50%, the UI looks crisp, but everything is a bit small to be useful.


To OP:

The fix is setting canvas width to window.innerWidth * window.devicePixelRatio, and height to window.innerHeight * window.devicePixelRatio. Then use CSS to maximize the canvas on the screen.
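That calculation can be sketched as a pure function (names here are illustrative, not from the engine): the backing buffer is sized in device pixels, while CSS stretches the element to fill the window.

```javascript
// Hi-DPI canvas sizing sketch: backing-store size = CSS size * devicePixelRatio.
function backingSize(cssWidth, cssHeight, devicePixelRatio) {
  return {
    width: Math.round(cssWidth * devicePixelRatio),
    height: Math.round(cssHeight * devicePixelRatio),
  };
}

// In a browser this would be applied roughly as:
//   const { width, height } =
//     backingSize(window.innerWidth, window.innerHeight, window.devicePixelRatio);
//   canvas.width = width;
//   canvas.height = height;
//   canvas.style.width = "100vw";   // CSS keeps the element at window size
//   canvas.style.height = "100vh";
console.log(backingSize(800, 600, 2)); // → { width: 1600, height: 1200 }
```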


Thanks, I’ll make the fix!


Fixed: https://github.com/raverie-us/raverie-engine/commit/aa7e1b6c...

Had to take into account the mouse coordinates too since they were not scaled.
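The mouse half of the fix works the same way: browser mouse events arrive in CSS pixels, so they have to be scaled by the same devicePixelRatio before reaching the engine. A minimal sketch (function name is illustrative):

```javascript
// Convert CSS-pixel mouse coordinates to backing-buffer coordinates so
// clicks line up with the rendered scene on hi-DPI displays.
function toBufferCoords(clientX, clientY, devicePixelRatio) {
  return { x: clientX * devicePixelRatio, y: clientY * devicePixelRatio };
}

console.log(toBufferCoords(100, 50, 2)); // → { x: 200, y: 100 }
```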


I fail to understand why not use the native browser rendering engine, or at least SVG, for the UI. It must be cheaper in resources (memory, network, CPU), easier to make accessible, and easier to render.


Anecdotally, in the very early days Figma rendered its UI entirely in canvas, but we eventually switched over to React for the UI. It ended up being far easier for development cycles and for accessibility/integration with existing browser features.


I always thought it was so cool Figma just rendered to a canvas! Oddly enough I ran into this at my previous job doing browser isolation when we got a bug report that Figma wasn’t displaying properly in our browser.

Makes sense to switch the rest of the UI to react though, just for how many people know it.


I thought you guys recently rewrote everything to be in WebGL instead?


Go inspect the source and see for yourself, the UI is normal HTML.


I think their UI is react but the actual drawing canvas is WebGL.


The main reason was just because this engine was entirely built for native platforms originally in C++ (not for the Web). This is a port, hence why the UI is running in WASM/WebGL.


Trevor! I loved your CS elective at DigiPen on designing programming languages with grammars.

Many of the students you taught are working on Minecraft now. I'm using so many fundamentals taught by you and my time at DigiPen when designing the Minecraft scripting API today.

Glad to see you're still rocking it.


Awwww I'm so glad it's been helpful! One of these days I'll get back into teaching ;) I'd love to see the scripting interface you're working on too, I spent so much time making Minecraft mods back in the day and I always wanted some kind of dynamic scripting.


This is super cool, it's great to see a pure WASM offering in this space. It does chug quite a bit on my M1 Mac Pro though!


Yeah I got a question, will this run on my NES? It's a refurbished front-loader, mostly original parts, but I did add some more RAM (I hot glued it to the top), and also I painted the case red and attached a few army men to it.


Why does a WASM > NES translator actually sound fun to write...


This is hitting my gpu super hard (Radeon Pro 560X, near-4k) with just the default ball scene. Frame rates are dipping very low just orbiting around. Does anyone know if there could be a WebGL optimization issue?


I'm getting a really nice smooth experience on my Intel HD Graphics 620 (7th gen intel integrated graphics, old and slow). Also just looking at the default ball scene, and I was really impressed how smooth it was on my system. I'm using chrome.


I suspect this is possibly down to the use of readPixels and how slow that is on various devices. The engine seems to run in a worker and render to an offscreen canvas then transfer the image data in JS to the main thread before drawing it to a canvas there.


The transfer of image data only happens when we yield inside the engine which only occurs when you hit a breakpoint in script. Otherwise we're just rendering to the OffscreenCanvas and letting the normal flow blit it to the screen (not doing any copying).

I just did some profiling on Firefox and I feel like the profiling result doesn't quite make sense but it's saying that the majority of time is spent calling Performance.now() from the clock calls in C++. I'm wondering if that's because we're calling it too many times and maybe we should just call it once per frame.
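One way to test that hypothesis: cache the timestamp once per frame and have the WASM clock import return the cached value. A rough sketch, with illustrative names rather than the engine's actual imports:

```javascript
// Cache performance.now() once per frame so a hot C++ clock import
// doesn't cross the JS boundary on every single call.
let frameNow = 0;

function beginFrame(now) {
  // Called once per requestAnimationFrame tick with the rAF timestamp.
  frameNow = now;
}

function importClockNow() {
  // The hot WASM import: just returns the cached per-frame value.
  return frameNow;
}
```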


FWIW my poor performance is in Chrome on Apple Silicon not Firefox.


Hey all, if you’re willing to try again we just put out a fix that dramatically helps performance on some machines. Since this was a port to WASM from a native game engine, it turns out the issue was in our frame rate limiting code which wasn’t playing well with browser timing APIs: https://raverie-us.github.io/raverie-engine/
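For anyone curious what "frame limiting not playing well with browser timing" can look like: a native engine often spin-waits on a clock to cap FPS, which burns a whole core inside a worker. A browser-friendly alternative is to skip requestAnimationFrame ticks until enough time has elapsed -- a hedged sketch, not the engine's actual fix:

```javascript
// Frame limiter that drops rAF ticks instead of busy-waiting.
// All names here are illustrative.
function makeFrameLimiter(targetFps) {
  const minDelta = 1000 / targetFps; // ms between rendered frames
  let last = -Infinity;
  return function shouldRender(now) {
    if (now - last < minDelta) return false; // too soon, skip this tick
    last = now;
    return true;
  };
}

// Usage inside a rAF loop:
//   const shouldRender = makeFrameLimiter(60);
//   function tick(now) {
//     if (shouldRender(now)) engineUpdateAndRender();
//     requestAnimationFrame(tick);
//   }
```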


It almost sounds like it’s not using your dedicated graphics card. Do other WebGL demos run alright for you?


Yes I can run about 200 jellyfish in this demo with similar GPU load, resolution, and frame rate. https://akirodic.com/p/jellyfish/

Maybe not the most helpful comparison haha. Anyways, great work, this is very impressive


Same experience here. I'm able to run other webgl projects fine, including the jellyfish project you linked, but this engine blows up my gpu. Strange


That's WebGL. This engine seems to use WebGL2.


Holy wow, opened this on my phone and it was like unity editor was running on mobile with a smooth 3D viewport!

Gotta give this a shot on desktop later.


The mobile version needs a lot of work! But we'd love to support the full editor on phones eventually :P


The Spaces feature reminds me of how Movie Clips were used in the old days of Macromedia Flash, where I'd put many different often-independently-running clips overlayed on top of each other to encapsulate different behavior. Of course Spaces sounds even more flexible. Very neat.

I'm not seeing documentation linked in the readme or within the github project. Are there how-tos and tutorials anywhere?


Not yet, the port is still a work in progress. For the old Zero docs you can go here:

https://github.com/zeroengineteam/ZeroDocs


Yet another reminder that WebGL is terrible in Firefox. Looking forward to webgpu being standardised and widely used instead.


The current status seems to be that Chrome is also way ahead of Firefox with WebGPU. They just don't have the manpower anymore, so I would not get my expectations up high; just use Chrome for games and co, and be happy if WebGPU comes to mobile FF at all. Or petition Mozilla into rehiring some engineers..

And with this concrete project I cannot compare, because on my mobile nothing loads, neither on FF nor Chrome.


The WGPU people have a new, faster version coming out and Firefox integration is in progress.[1]

[1] https://github.com/gfx-rs/wgpu/pull/3626#issuecomment-173417...


So there is hope? Sounds good.


Hey all, if you’re willing to try again we just put out a fix that dramatically helps performance on some machines. Since this was a port to WASM from a native game engine, it turns out the issue was in our frame rate limiting code which wasn’t playing well with browser timing APIs: https://raverie-us.github.io/raverie-engine/


Didn't see that till now, but yes, that is so much better.


Same experience here, getting really low fps just moving around in an empty scene.


Check out what I wrote earlier about how WAForth dynamically generates and links WASM code! By just reading the WASM documentation I didn't realize it was possible to call back into JavaScript to dynamically create and link in WASM code on the fly, but WAForth opened my eyes to that, by compiling each FORTH word definition into a tiny little module and linking them all together. No (practical) limit on the number of modules that you can use in the same WASM app. It would be cool for a game editor app to have a visual programming language that compiles directly to WASM that way, so you can edit code at runtime.

https://news.ycombinator.com/item?id=34374057
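The mechanism WAForth relies on is surprisingly small: hand JavaScript some bytes and it will compile and link a fresh module for you at runtime. A minimal self-contained sketch -- the module below is hand-encoded in the WASM binary format and just exports an add function:

```javascript
// Dynamically create and link a WASM module from raw bytes, the same
// trick WAForth uses per FORTH word (shown here with one tiny module).
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

// Synchronous compile + instantiate; a real editor would likely use the
// async WebAssembly.instantiate to avoid blocking the main thread.
const module = new WebAssembly.Module(bytes);
const instance = new WebAssembly.Instance(module);
const add = instance.exports.add;

console.log(add(2, 3)); // → 5
```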


YES! Absolutely yes to this. Right now the built in language was made well before WASM was even a thing, but at some point in the future I'd like to make it target WASM directly or even potentially use AssemblyScript or something like it. I certainly also want to have the engine support importing WASM modules as plugins, for when you need serious near-native performance rather than a scripting language.

Tiny modules also sound awesome. I'll give the article a read, thanks for this!


Tried on a couple of devices with Firefox, the Downloading runtime bit takes ages.

On Firefox on my poor old Nokia Android phone it's way too slow to try; on rotating, the UI size doesn't catch up for a few seconds and everything stretches.

Edit: I got Failed to create WebGL context: WebGL creation failed: (Feature_Failure_EGL_Create) Exhausted GL driver options. (FEATURE_FAILURE_WEBGL_EXAUSTED_DRIVERS) TypeError: a is null.

The Ubuntu install is in QEMU but should have 3D support. As others mentioned, there should be some error surfacing, not just the splash screen showing forever.

On Firefox on Ubuntu, I'm still waiting after a couple of minutes for it to start, so might try again later.


The summer workshops for HS students at DigiPen helped launch me into a programming career :)

No experience with Zero, but when I did the workshop in 2008 we used the .NET-based engine. I forget the name of it and the backup I have is in a pile of unlabeled burnt CDs.


it was ProjectFUN, same name as the summer workshop program—I wish I still had a copy of it, or at least the game I made with it that same summer!


It's sitting on some old drive somewhere. I'll probably do some digging at some point and put it on archive.org, if only for preservation.


> Our end goal is to use this engine to create an online Flash-like hub for games that people can share and remix, akin to Scratch or Tinkercad.

AMAZING.

Hackable games are such a great in-road for new developers, and for anyone who takes an interest in software because of gaming.


Really want this to be a great learning tool!


This looks amazing, and I really hope it gains some traction.

That being said, one of the biggest annoyances I had trying to switch to Godot from Unity is that their Scene view doesn't reflect the Game view at runtime (you can see the hierarchy, but there's no visuals, no gizmos, no debug raycasting or colliders)[1]. It seems this engine has the same missing feature?

[1]https://github.com/godotengine/godot-proposals/issues/7213


If you hit space bar to bring up the command selector and hit Edit In Game, you should get a view that shows you all the debug draw, gizmos, etc. Hopefully this answers what you're referring to.


oh wow that's perfect! In fact, this does something a little extra I wish Unity could do - because I can play the edit scene and game scene side by side, I can modify experimental properties at runtime, then copy+paste them into the edit scene on the fly (in Unity it's a lot of copy+paste into notepad, stopping the game and hoping you didn't forget anything and pasting back in).

Thanks Trevor!


It errored on first go with a blank screen:

Error: Failed to allocate type
  at x.allocate (worker-3ec07b2e.js:1:3516)
  at ImportGlGenTexture (worker-3ec07b2e.js:1:7645)
  at RaverieEditor-60af6467.wasm:0x1d459ee
  at RaverieEditor-60af6467.wasm:0xedf9e7
  at RaverieEditor-60af6467.wasm:0xedf56b
  at RaverieEditor-60af6467.wasm:0xdffefa
  at RaverieEditor-60af6467.wasm:0xf213e3

Reloading fixed it, maybe a race condition between setting up the app and it being ready to run.


Hmm, it looks like our call to `gl.createTexture()` returned null. I'm actually not sure what we'd do in that case other than fail with an error dialog.
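A failing-with-a-dialog path could be as simple as wrapping the call. A sketch with an illustrative name, not the engine's actual code:

```javascript
// Turn a null return from gl.createTexture() (driver limits, lost
// context) into a catchable error the editor could surface as a dialog.
function createTextureOrThrow(gl) {
  const texture = gl.createTexture();
  if (texture === null) {
    throw new Error("gl.createTexture() returned null - out of GPU resources or context lost");
  }
  return texture;
}
```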


Very slow load in Firefox – stuck on “Downloading Runtime”. No error in console, and the .wasm file is stuck in the loading stage in the Network tab – a few minutes first time, 20 seconds now (with cache cleared). No such problems in Chromium. Weird!

Overall though it's a cool project. Personally I would prefer if it was a desktop app as well (with better integrations and less overhead – and certainly with hi-res support!). Congrats on the launch!


Worked fine for me in Firefox.


Probably some very flaky issue then. My current ISP isn't very reliable, although everything else seems to work fine right now.


I haven’t done a lot of testing with Firefox, I’ll take a look!


The UI looks very polished!

I have an old 3D modeler I want to modernize. Is it possible to reuse the UI framework? Is it imgui-based? I don't see any sign of it.

Is it easy to separate out? Is the rendering backend swappable? (I hope to use WebGPU.)


I found the UI code and looked at the font handling part. It doesn't seem to support complex shaping; the font has to be monospaced, I guess, with a small set of glyphs.


Strange, on my M2 MacBook Pro this still absolutely crawls. I thought it was my personal (intel) MacBook Pro, but it's still extremely slow on my M2. Looks amazing, but definitely requires a massive performance improvement before it's useful. Keep at it! :)

edit: chrome btw


Having worked extensively with TDI Explore and Wavefront software on SGI machines, seeing this on my browser still blows my mind.


Why invent your own scripting language instead of using one of the infinity other existing languages?


This project was made in 2012 when there weren’t that many languages, especially not ones that bind to C/C++ (and were type safe). Lua was the primary embeddable language back then, and C# wasn’t even open source yet :)


This looks great! Some feedback - I clicked on the link and tried to find out if it's on GitHub. I looked in Help > About but it didn't seem to have much. Then I clicked on Help > Documentation and it brought me to a 404


Ah sorry about that, this is still a big work in progress. A lot of the old links have died so we're porting over things like documentation still. I'll add a back-link to the github!


I almost skipped this because I thought it was Unity, not a Unity-like editor.


Very cool! I can't seem to get the script editor open in the demo though.


While impressive, the whole UI feels much slower than using PlayCanvas.


heh, I got a kick out of there being a Project > Exit (bound to alt-f4!) which didn't actually exit anything but it thankfully did stop the tab from taking over my machine, so ... win-win?


Still a lot of holdovers from being a native app :P

Good find though!


So I take it that this would be the highest performing + most customizable engine for web deployment due to its deep WASM nature? What about rendering, does it use WebGPU?


I'm sure there are higher performing engines out there, but it certainly should be easily deployed to any site. For rendering, this engine currently just uses WebGL. It was built on OpenGL originally (with a swappable renderer backend so we could target other 3d APIs). Eventually we may support WebGPU when support grows among the other browsers.


Poking the code it looks like it's based on an OpenGL renderer translating to WebGL2.


I'd guess Three.js or Babylon.js would be, since they're web-first, but regardless, super cool project!


This is amazing. I might have missed it, but is there a plan to put the dev issues onto github, or is there another way I can get involved in dev/doc?


I'll for sure list issues there. I also need to move the docs from here:

https://github.com/zeroengineteam/ZeroDocs/blob/master/getti...


How are Raverie spaces different from Unity scenes?


To be honest I haven't spent much time in Unity. Back in the day you either had to parent objects to the camera to make UI, or use a completely different UI system. It sounds like now Unity scenes can be overlaid on top of each other, is that correct?


Yes, multiple Unity scenes can be loaded at the same time overlaying each other. Physics can optionally be local to a single scene too.

You also don't need to parent objects to the camera for UI. Not sure how it used to be in the past.


It lags a lot on a 14-inch MacBook M1 Pro in Chrome.


Hey all, if you’re willing to try again we just put out a fix that dramatically helps performance on some machines. Since this was a port to WASM from a native game engine, it turns out the issue was in our frame rate limiting code which wasn’t playing well with browser timing APIs: https://raverie-us.github.io/raverie-engine/


Crashed my machine. I didn't know they could set web workers to realtime priority.

Had to reboot.

Lame.


Mess around - find out. Rule #1 in the industry: do not anger Linux wizards on a Friday. They will spend every single waking hour until Monday morning nuking your business plan and reverse engineering your product, and make no mistake: these are a large group of some of the most talented programmers on the planet. They have skills, skills which are extremely dangerous to people like you.


Performance is definitely an issue here, this is on an M1 MacBook with Chrome and it's quite choppy.


If you’re willing to try again we just put out a fix that dramatically helps performance on some machines. Since this was a port to WASM from a native game engine, it turns out the issue was in our frame rate limiting code which wasn’t playing well with browser timing APIs: https://raverie-us.github.io/raverie-engine/


This is awesome! What would you say are the primary differences between Raverie and PlayCanvas.com?


I haven’t tried PlayCanvas yet, but it looks great. I think the primary difference is that this engine is a learning tool primarily, and doesn’t have any intention to compete with bigger name engines.


Looks great, and it even works on a mobile phone. But it is a little bit slow in Chrome on my Mac.


This is amazing. It's working fine with firefox on my old machine with Win10.


Tinkercad and scratch were never tied to a browser! Why go with wasm?


I’m fairly certain the native version of Scratch is just a browser with Scratch running in a web view. I’d also mention that WASM isn’t tied to a browser either (despite being called WebAssembly, it’s now widely used outside of browsers). You can in fact make native executables now from a WASM binary.


That's certainly an innovation. Scratch didn't use to involve browser-play at all.


Seems great, would never use a non-native app to develop games.


Great. Pretty incredible what can be done with WASM.


It's been sitting on "downloading runtime" for a very long time.

That makes me think this must be a challenge to host. What are the pain points to hosting it?


The runtime is currently about 37MB, and when served with gzip it’s around 11MB compressed. Once it’s downloaded it should be cached and quickly startup. I think a lot of the challenge is going to be spent getting the binary to be smaller, or potentially breaking it up into separate individually loadable parts.


How do I add WASD controls to the camera?


That's something you'll have to code up yourself, however to get you started this script checks for pressing the W key and moves forward:

https://pastebin.com/8kMTCu3Y


Hugely helpful, thanks! I really did give it a few minutes and look for docs/examples before asking. I promise!


Also hopefully these docs can be of use. They're a bit of a mess as they were never properly ported to GitHub wiki: https://github.com/zeroengineteam/ZeroDocs/blob/master/getti...


What’s DigiPen’s stature nowadays?


Working great so far on Firefox!


DPI is not correct on MacOS


Thanks for this info! I had a feeling that we weren't handling DPI scaling correctly, but don't have a Retina screen to test it on.


Just letting you know we put in a fix for DPI!

https://raverie-us.github.io/raverie-engine/



