Why NASA Switched from Unity to Blend4Web (gamedev.net)
150 points by felhr on Aug 15, 2015 | 46 comments

It's very simple: Unity was never designed with a target platform as radical as HTML5/JS/WebGL in mind. It's stuck with an older version of the Mono runtime for its scripting language, and they've pulled off some crazy tricks to get it to work on WebGL: Mono IL code is cross-compiled to C++, which then gets cross-compiled again to JS/WebGL using Emscripten. That it works at all is nothing short of a miracle. However, compatibility and performance will be really hard to get up to a decent standard, especially when compared to engines that were built from the ground up for WebGL, such as Blend4Web.
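To make the double cross-compilation concrete, here's a hedged sketch (not actual Unity or Emscripten output) of the *kind* of code that pipeline ends in: asm.js-style JavaScript, where all data lives in one typed-array "heap" and every value is coerced to an integer so the JS engine can compile it ahead of time. The module and function names are made up for illustration.

```javascript
// Illustrative sketch of asm.js-style output: typed, heap-based JS.
// Not real Emscripten output; names are hypothetical.
function makeModule(stdlib, foreign, heap) {
  "use asm";
  var HEAP32 = new stdlib.Int32Array(heap);
  function addScores(ptr, n) {
    ptr = ptr | 0;                       // parameter type coercions
    n = n | 0;
    var sum = 0, i = 0;
    for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
      // read 32-bit ints out of linear memory, like compiled C would
      sum = (sum + (HEAP32[(ptr + (i << 2)) >> 2] | 0)) | 0;
    }
    return sum | 0;
  }
  return { addScores: addScores };
}

var heap = new ArrayBuffer(0x10000);     // 64 KiB of linear "memory"
var mod = makeModule(globalThis, null, heap);
new Int32Array(heap).set([3, 4, 5], 0);  // write three ints at offset 0
console.log(mod.addScores(0, 3));        // → 12
```

A whole game engine compiled this way produces megabytes of such code, which is exactly where the memory and compile-time problems discussed below come from.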

Long story short: if WebGL is your primary target platform, Unity is not exactly the first place you should look.

Quite true.

If one main goal is WebGL then the best option is to adopt a WebGL native engine, instead of trying to cram a native experience into the browser.

It depends on the functionality you need. Unity does a lot more than lightweight WebGL engines do.

Of course, if a lightweight WebGL engine is enough - and it might well be the case for this NASA app, which doesn't need sophisticated AI, physics, scripting, effects, etc. - then that is the better route.

There are also middle grounds between a native WebGL engine and a full native engine like Unity. There are lightweight native engines, like Cube 2, and there are engines even lighter than that. There isn't a strict dichotomy between web-native and compiled game engines. In fact, some native WebGL game engines use compiled portions: PlayCanvas, for example, uses ammo.js, a physics engine compiled from native code.


Someone calling Unity "sophisticated" makes me laugh. It's a poorly implemented bugfest.

I doubt WebGL will be a serious competitor to any platform 3D engine for at least another decade, probably two.

I think you're conflating low level rendering API (WebGL) with Game Engine (built on top of low level rendering APIs such as WebGL, OpenGL ES, OpenGL, Direct3D).

WebGL does have to do some validation, but most implementations are built on top of ANGLE, which for the most part is _NOT_ an emulator. It does have slow paths for things that exist in WebGL, such as the TRIANGLE_FAN drawing mode, that don't exist in D3D11. For an interesting read, might I suggest: https://books.google.com/books?id=6crECQAAQBAJ&lpg=PP1&pg=PA...
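To illustrate that slow path: since D3D11 has no triangle-fan primitive, a translation layer in the style of ANGLE has to rewrite fan geometry into an indexed triangle list before drawing. This is my own minimal sketch of that rewrite, not ANGLE's actual code or API.

```javascript
// Hedged sketch: convert TRIANGLE_FAN vertex ordering into the index
// buffer of an equivalent TRIANGLES draw. Each fan triangle shares
// vertex 0: (0, i, i+1). Function name is hypothetical, not ANGLE's.
function fanToTriangleList(vertexCount) {
  var indices = [];
  for (var i = 1; i < vertexCount - 1; i++) {
    indices.push(0, i, i + 1);
  }
  return indices;
}

console.log(fanToTriangleList(5)); // returns [0,1,2, 0,2,3, 0,3,4]
```

Having to generate and upload an extra index buffer per draw is why that mode is slower on a D3D11 backend than on one where fans are native.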

I exported my home from SweetHome3D into Blender, then into HTML5 using Blend4Web, and it looks surprisingly good both on my new $100 smartphone and on my $600 notebook. I just added another light source to make the shadows softer. I'm reading a book about Blender because I want to add walkability to the home, but I have no idea how. :-)

Sounds like you really love your home :-)

I had hoped to start building my home in 2014, but the revolution and the war with Russia delayed it. I bought the plan of the "Starboard Solar home" built in Boulder, Colorado 30 years ago. It uses a Trombe wall to heat and/or cool the home. I will also use G.R.E.B. technology to build a "Passivhaus". Unfortunately, I spent all my savings during the revolution, so now I am waiting until I have at least $15K saved to build the enclosure. Meanwhile I am spending my time making the building plan more detailed, and SweetHome3D is a perfect tool for that, except that it is hard to share the 3D model with others. Export to HTML5 solved my problem: I can just display the home levels on my cheap 5.5" smartphone or share them using Google Drive.

IMHO, export of 3D models to HTML5 is superb technology and will be used everywhere. For example, a clothing shop could use it to display clothes on a model of the customer, in motion (walking, dancing, running, etc.). A car dealer could use it to provide a virtual test drive. A construction shop could let customers practice with tools and materials. Anybody willing to start a SaaS startup in that area? I work for an outsourcing firm, so I cannot start a software startup myself (I would be fired immediately, and I don't have enough savings), but I can be hired. I have more than 25 years of experience in programming and more than 30 successfully finished projects (and 2 failed ones). I have participated in a few startups (e.g. Bazaarvoice), and I have seen my code spread, e.g. into the BashFAQ.

Just saw this comment now. Interesting to hear your story. Do you have a link for your home HTML5 model?

In most cases I believe time estimates for technological advance to be far too short, but in this case I believe the estimate to be far too long. We've already got decent 3d engines running in the browser - it's not theoretical. With the current pace of web tech advancement and the advent of web assembly, it could easily happen within five years.

In terms of presentation, you're dead on. In terms of tooling and the development environment, I'm not sure that's going to happen.

WebGL is very low level, so you should be able to get good performance. But the problem, I think, is that the specifications/standards move too fast. If you start building a game with the latest technology today, those technologies will be deprecated by the time you release the game :P

This really isn't the case on the web. There are no "take backs-ies".

The article puts too much blame on Unity and not enough blame on browser vendors.

This is hinted at in the article:

> Browsers are the programs which eat all of a computer's free memory, and the half-finished Unity WebGL build often causes crashes and closes browser tabs (especially in Chrome).

The main problem is that Chrome deprecated NPAPI before it was capable of running content like Unity well.

If you look at Unity's forums, you can see lots of posts talking about Chrome-specific issues, for example



As those threads say, many Unity games run well in other browsers, but since Chrome is dominant in the market (over 50%), poor Chrome performance makes Unity look bad. But this isn't Unity's fault.

Chrome decided to deprecate NPAPI now, before it has a good-enough solution for running large asm.js codebases. For example, on the Massive asm.js benchmark, Chrome is well behind both Firefox and Edge,


I want to be clear that I fully intend to spend my weekend with my family, and not arguing this with you. So, I'll avoid rehashing any of our past discussions regarding technical concerns with asm.js. However, as the person responsible for Chrome's NPAPI deprecation, I just want to set the record straight regarding Unity.

Unity previously had a high quality NaCl port (long before asm.js ever existed) and I fully expected that they would continue to support it. Plus, long prior to NPAPI deprecation the Unity NPAPI plugin was on our security blocklist due to a rash of vulnerabilities. So, I don't know why Unity chose to drop support for NaCl, but it occurred well after they became aware of our plans to remove NPAPI. And if you want a public record, you'll notice that Unity 4.3 was released (without NaCl support) months after our NPAPI announcement:


> However, as the person responsible for Chrome's NPAPI deprecation, I just want to set the record straight regarding Unity.

Why? You have regressed a feature that your users want. I realize that plugins aren't sexy, but Unity is something that both developers and users want. Entire development studios are based around creating online games in Unity that have active user bases who enjoy those games.

For better or for worse you have removed the ability for people to enjoy using a part of the web. But hey, an idealized vision of software perfection won out over what users wanted, so you got that going for you.

I'm a bit bitter that those in charge of browsers have decided en masse that being pragmatic is worthless in comparison to building idealized castles in the sky. This applies just as much to Mozilla as it does to Chrome. Mozilla's response to why they won't implement Pepper is just as "pie in the sky, who cares what users want" as Chrome's removal of NPAPI.

Plugins are something that users want. Ignoring how Adobe has managed to completely drive Flash's overall quality and performance into the ground, plugins are useful. They may not be useful to Developers who are doing Serious Developer Stuff at Serious Companies, but for a lot of users, my mother, my grandmother, my nieces and nephews, they are very useful.

On the other side, browser developers have content developers as customers, and those content developers prefer alternative toolchains. Flash and Unity have far better toolchains than WebGL does. Debugging problems is far easier when you don't have to go through multiple translation layers, and fewer layers in the stack means fewer places for something to go wrong. If you are writing against Flash, then Flash may have a problem. If you are writing against three different browsers' implementations of WebGL and/or asm.js, you now have three different sets of bugs and implementation quirks. (Not that Flash works the same across all browsers, but at least it is from one vendor!)

But all of that is thrown away in the name of "web standards".

Users care about what works. As developers we have a responsibility to make users happy, our own sense of happiness is sort of not the primary concern.

> This just as much applies to Mozilla as it does Chrome. Mozilla's response to why they won't implement Pepper is just as "pie in the sky, who cares what users want" as Chrome removing NPAPI is.

This seems like an unfair comparison. "Remove existing functionality that we no longer want to support" and "support a large technology with a single implementation and no specification" are not even in the same ballpark.

Here's some feedback from a game developer working on a game that runs in a browser plugin (not Unity) and also has a standalone client:

The 1.5-year notice you gave before deprecating NPAPI in Chrome was way too short. 5 to 10 years would have been realistic to allow existing games to reach their natural end of life. Neither NaCl nor HTML5 is an appropriate porting target for an existing, complex PC game that's already been running for several years; they are only realistic target platforms for starting new projects. What we do now is simply move our users out of the browser into the standalone client. Thankfully our users don't mind that much, since most are 'hardened' PC players who aren't put off by the scary popup dialogs browsers show when downloading a native installer. But it can hardly be in a browser vendor's interest to move gamers off their platform, I would think.

And for NaCl vs HTML5:

- NaCl was never allowed in the wild, only through the Chrome app store

- PNaCl simply came too late and had very long compilation times at first start; this has become better but is still a lot slower than FF's asm.js AOT compile pass

- performance differences between PNaCl and JS on Chrome are pretty much negligible, especially for a 3D game where WebGL overhead is much more important

- emscripten puts much more effort into supporting standard APIs for games than NaCl does (e.g. SDL, glfw, OpenAL, ...)

- from personal experience, bugs in Chrome on the HTML5/JS side are fixed much more quickly than bugs on the NaCl side, very likely because the HTML5 side has much more resources available

From my experience, the PNaCl implementation in Chrome doesn't provide any real-world advantages over the HTML5 route, and both platforms are a 'hard' porting target. Of all existing platforms (iOS, Android, OSX, game consoles), the web is definitely the hardest to port an existing PC game to (starting from scratch is much easier, though).

Not sure what I did to deserve that hostility. I respect you and your work, and I agree that deprecating NPAPI is a good thing.

I don't know anything about why Unity stopped supporting NaCl.

But the fact remains that Chrome has left Unity no good option. Chrome's asm.js performance on Unity (and large asm.js codebases in general) is lagging. NPAPI is gone (again, a good thing by itself). NaCl exists, but would require Unity to maintain support for a platform only for a single browser.

It's not hostility. It's just that you're a very passionate Mozilla employee and a very passionate creator of asm.js. And from past experience I expect that you'll be willing to continue a debate longer than I can. Plus, you already know that I have reservations about asm.js, and it's just not worth rehashing that debate (accepting that WebAssembly appears to be heading in a direction that addresses most concerns, so I should offer well earned kudos on that front).

Getting back to your point, pragmatically speaking, maintaining a port is a distinction with less difference than implied. First, any reasonably complicated NPAPI plugin needs to handle a range of browser-specific quirks, meaning that significant porting is unavoidable even if it's bundled into one binary. Second, Unity supports ActiveX in IE, so they're clearly already comfortable maintaining a browser-specific port that doesn't offer any of the security or platform portability benefits of NaCl.

Outside of that, there's just the one-sided nature of the argument. As a creator of asm.js you are obviously a huge proponent. However, you must appreciate that not everyone perceives it the way you do, and not everyone agrees with the path you chose for adding support into Firefox. So, the technical burdens on another browser are just not the same, and in the specific case of Unity in Chrome have to be balanced against the fact there was already a tremendous past investment in a high-quality implementation.

So, no, I completely disagree with the framing that "Chrome has left Unity no good option". Independent of any work in Chrome on better supporting asm.js, the fact is that Unity had at least one other good option. So, I'm sure Unity had their reasons for the path they chose, but it was and continues to be a very explicit choice.

I think I see where you're coming from. Overall, I think it's hard for us to estimate how much effort it would take for Unity to support another platform. Perhaps I was overly pessimistic when I said supporting a platform just for a single browser is too burdensome; perhaps you were overly optimistic in saying that maintaining their existing port would have been a good option. Only Unity knows the answer. So I admit you might be right on that point.

But the more important thing is that I disagree on the effort it takes to optimize asm.js. As you can see in those forum posts, out of the 4 major browsers, mainly Chrome is presenting a problem to Unity developers. Yes, it takes some work to optimize huge compiled codebases well, but Google has talented engineers and massive resources. I refuse to believe that Chrome cannot match the results of the other 3 browsers - it would be insulting to the v8 devs to assume so. Since the ability is there but the results are not, I can only guess that the Chrome developers' focus is elsewhere.

And of course there are many other important things to optimize on the web. But Unity and other high-quality 3D games are very important too. Chrome is holding back this part of the web, right now - I hope not for long.

I think you're reading something different than what I intended. I don't work on V8, and what I know about their priorities is no different from what they've stated publicly. That is, the V8 team wants to improve asm.js performance by improving overall JS performance, and not special case behavior just for asm.js. I know that's not the route you took, but that was my point about "not everyone agrees with the path you chose" with Firefox. And regardless of what either of our personal views are, it's hard to objectively argue that the V8 team is wrong in focusing on areas that positively impact the overwhelming majority of their users.

Yes, and I think it's fine that v8 took that route - it's actually the route that I recommended that Firefox take. It is also the route that JavaScriptCore is taking. So I completely agree with you that the v8 team's approach is reasonable.

But they haven't optimized it well enough yet. The main issues Unity (and other big asm.js codebases) are seeing are OOMs and crashes in Chrome. The v8 and JavaScriptCore approach can avoid those by not compiling everything at once, by using more efficient in-memory data structures, and so forth. This helps overall JS performance as well. It's just engineering work that needs to be prioritized and done.

(AOT, as in Firefox and Edge, does have an advantage in startup speed that I think v8 and JavaScriptCore will have a hard time approaching, but that is not a blocking issue the way that crashes and OOMs are.)

> As those threads say, many Unity games run well in other browsers, but since Chrome is dominant in the market (over 50%), poor Chrome performance makes Unity look bad. But this isn't Unity's fault.

If your product runs poorly on the dominant market platform, then that _is_ your fault. Especially as Chrome provides source, a reasonable community centered around that source, and strong developer tools to allow third parties to achieve superior performance on their browser.

It's understandable, but the Unity developers made a guess as to the future state of the market and that guess was wrong.

Unity's WebGL support is still officially marked as experimental, so it can't be expected to be a perfectly smooth experience. If Chrome uses more memory than other browsers, that is definitely not Unity's fault, and having access to the source code won't change anything. Chrome is behind its competitors right now when it comes to memory usage with WebGL, but that may change even in the near future.

I agree. In the earlier days of Chrome I was not a fan because of its performance with the Flash player. Now it's kind of the same story with Unity... "this plugin is heavy, well, let's cut it". We should protect Unity and stop using Chrome, I say.

Just a quick note: NASA is not a unified entity marching in lockstep. This was one org-unit's decision made for their own work. I work with a lab at NASA GRC that uses Unity engine pretty extensively for scientific visualization.

We have a product which used to run in the Unity Web Player. After the Chrome NPAPI deprecation we lost about 50% of our users. WebGL wasn't done and still isn't really working. Our experience is the same as described in the article: memory usage and multiple millions of lines of compressed JavaScript cause most of our users' computers to freeze or lag out. Any user with less than 4 GB of RAM and on Chrome will have a pretty bad experience. Firefox is far ahead here and runs much better.

Edit: Direct link to one of our games using WebGL and WebSockets: http://www.kogama.com/games/play/68818/?webgl=1&da=0

That's pretty cool. So if that's not the Unity Web Player, what is it? Good draw distance. Crazy stuff in there!

Edit: The loading experience was a bit clunky for me, btw. I'm assuming it's not the Unity plugin you're using here, but I think Unity would have done better with loading!

I didn't express that very well. It's the Unity WebGL export, using WebSockets for the network component.

Good. Anything that moves away from Unity is good. I dislike Unity because they refuse to make their web player for Linux, so several Newgrounds games don't work for Linux users.

The Blend4Web Mars Rover demo here works perfectly for me in Linux btw, and guess what, the linked old Unity version says "unsupported platform", so, thank you NASA for thinking about more users :).

Unity's goal is to improve their WebGL builds such that the experience is basically what you got with the Web Player, except that no plugin is needed. This means you will get the Unity Web Player experience on Linux, thanks to WebGL.

Blend4Web homepage with nice demos: https://github.com/TriumphLLC/Blend4Web

What about Unreal 4? I read that they support HTML5 pretty decently by using Emscripten, but I don't know what their current status is on that front.

It's a push to call the Unreal 4 support decent. Unity at least has it as part of their product feature set; Unreal seems to do it only as a demonstration piece.

The problem with taking an engine like Unity or Unreal and compiling it to JavaScript is that 1) they end up with very large blobs of JavaScript (Unity is around 5 MB of JS for the player alone), and 2) it is impossible to optimize, debug or inspect any of that code in the target setting. Neither of these engines has ever demonstrated code running in mobile browsers, for good reason: they can't.

Meanwhile, engines designed specifically for the web, for example, PlayCanvas [https://playcanvas.com] let you create content that works on every device, down to the likes of the iPhone 4S and comes in much smaller download sizes. e.g.

http://tanx.playcanvas.com (~1MB)

http://mmx.playcanvas.com (~3MB)

http://swooop.playcanvas.com (~10MB)

All of which work in mobile browsers. The future of Web 3D is not the compilation of desktop game engines. It's engines designed to be web (and mobile) first.

If you can compile it, it runs horribly slowly. Sometimes the compiles work... sometimes not. Some things work, others don't, but obviously that's not documented anywhere; it's just trial and error.

Don't bother. It's light-years worse than Unity, and that's saying something.

It really would be interesting to know if anyone has taken it beyond the (amazing) demo phase.

To be fair, the previous Unity version he links to is obviously a beta design. It actually says "beta" at the top. For me, the Unity version loads and runs smoothly.

The Blend4Web version seems more polished because it is more polished. They've toned down the lighting, completely changed the interface, and reduced the map area: you're confined to quite a small space. In the Unity version you can drive further; only the ground texture needed work. Zoom out, though, and you'll see why the ground texture is blurrier than in the Blend4Web version.

I'm sure Blend4Web is great, but what isn't great is that Blender is required!

But isn't Unity required when using the Unity plugin? Blender is at least open source!

Yeah, true. I was responding more to his criticisms of that version based on the quality of the product...

Bezuhoff says: "Honestly, it looks like an unfinished game. The scene loads slowly (especially the terrain), functionality is primitive – you can only drive, the overall picture is of horrible quality."

I think that's a bit unfair considering it only looks like an abandoned game because it is an abandoned game.

I wonder how big the deliverable is compared with Unity, UE4 or other similar solutions. I am a long-time Blender user, but I have never used this in any of its previous versions. It seems to support a lot of things from Blender - materials, lights, and others - so I will have to see how much can be used directly in the transition. I am not very clear on the GPL: if I use it for my company but choose to release the source, do I still have to pay for a commercial license, or is the open-source license adequate? I know the commercial license comes with a lot of support material.

Is NASA doing this stuff in-house? I've never thought of NASA as experts on software beyond control systems. Have they brought in some gaming people to help with media relations?

Was it really necessary to keep it in the browser? Why not just give users a normal Unity build to download instead of redoing everything to keep it in the browser?

Hmm, about networking: couldn't WebSockets be used for client-server comms?

WebSockets work with Unity WebGL. I'm not sure if there is an official implementation, but we use them in a multiplayer game; specifically, we run Photon Game Server with WebGL and WebSockets.
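For anyone curious what that looks like on the browser side, a WebSocket transport usually just means serializing game messages (JSON here, though binary frames are common too) and sending them over the socket. The URL and message schema below are entirely hypothetical, just a minimal sketch of the pattern:

```javascript
// Hedged sketch of a browser-side WebSocket client for a web game.
// Only the encode/decode helpers are real logic; the URL and message
// fields are made-up examples, not any specific game's protocol.
function encodeMove(playerId, x, y) {
  return JSON.stringify({ t: "move", id: playerId, x: x, y: y });
}

function decodeMessage(data) {
  return JSON.parse(data);
}

// In a browser you would wire these to a live socket, e.g.:
//   var ws = new WebSocket("wss://game.example.com/play");  // hypothetical URL
//   ws.onmessage = function (e) { handleMessage(decodeMessage(e.data)); };
//   ws.onopen = function () { ws.send(encodeMove(42, 1.5, -3)); };

// Round-trip the encoding locally to show the message shape:
var msg = decodeMessage(encodeMove(42, 1.5, -3));
console.log(msg.t, msg.x); // → move 1.5
```

Note that browser WebSockets are TCP-based, so unlike UDP on native platforms you get ordered, reliable delivery whether you want it or not; that's one of the constraints a Unity WebGL network layer has to live with.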
