Carmack: Doom 3's engine ready for open-sourcing, awaiting 'OK' from legal (engadget.com)
220 points by barredo 1788 days ago | 90 comments



I love how Carmack and id, since its inception, have not only defined the cutting edge of game technology but also repeatedly and utterly disrupted industry business models (pioneering shareware distribution, the 'free demo' model, engine licensing), all while making an absolute killing doing so. Most inspirational hackers out there, imho.


He didn't pioneer shareware distribution or the free demo model. This is a good write-up on the history of shareware:

http://www.loonygames.com/content/1.24/feat/

(Has this been posted to HN before? Good stuff in there.)


When I think of shareware I think of id Software, Apogee, McAfee and PKZip. It might not be official but to me that is as close to pioneering as it gets.


Can you elaborate on why you think of Apogee? When I think of Apogee I think "Gee, I wish my Apogee product worked on Windows"


Different Apogee. Presumably you are thinking of Apogee Electronics, he is talking about Apogee Games, which more or less* became 3D Realms (of Duke Nukem infamy), and who back in the day published a lot of the classic shareware games from other developers as well, including id's early stuff like Commander Keen and Wolf 3D.

* "more or less" because the business structure is more complicated than that and there was still an Apogee Software alongside 3D Realms, but for all practical purposes, what was Apogee Games became 3D Realms and then died due to the Duke Nukem Forever project.


Ahh yes I was thinking of Apogee Electronics, thank you that clears things up.


Depends on the meaning of "pioneer" I guess. Certainly Doom was the first real shareware hit that matched sales numbers with commercial titles.


Yes, exactly.

Whilst I have been so-so on id games since Quake 3 (actually Quake 3 was excellent, but it was just so overshadowed by the jaw-droppingly awesome Unreal Tournament), I have huge amounts of respect for Carmack as a programmer and also for the way id generally does business.

I remember scanning through the source code for the original Quake some years ago (not really reading it in depth), and I couldn't help but assume my download was borked and I was somehow missing 90% of it: the core of the engine is tiny. I get the same reaction when I look at the Linux kernel.

I've written business systems that are orders of magnitude larger in terms of code than probably all the id games put together, but they have nowhere near the awesomeness and variety in terms of functionality.

I wonder whether this is just how he programs from the get-go, or whether he writes something much larger and clunkier at first and is just ruthlessly disciplined about rewriting things to make them smaller and more efficient.


I think Quake's code represents a design methodology you almost never see in the wild. Essentially, Quake's code operates on the same principles that Apple's physical interface designs do (or perhaps singular programming language designs like Scheme or C).

Carmack, as dictator of id code (and thus id design) had the ability to say "we are not adding the following feature to our game because I don't like the compromises it will force in our technology and, thus, in other features we want to add down the road." It's basically as simple as that.

In almost all other organizations, someone holding the pot of money will trump the programming overlord at that point and say, "I don't care, XML is the buzzword this second, and all objects in the game need to be serialized to XML as web services - that's what the market is clamoring for right now" (or whatever the buzzword du jour is). And over time those political technical decisions add up to so much cruft that adding any new thing becomes a technical, jenga-style nightmare.

Not so with id's engines. I actually worked on beta versions of both Quake 2 and Doom 3 code bases while those games were in development, and one striking thing about getting updates from Carmack was his ruthless willingness to eliminate working, finished features / code if he didn't think the complexity or dependencies they introduced were worth whatever feature they added in the overall technical context.

Quake is what you get when a genius programmer who has mastery of his knowledge domains leads. Institutionally, there aren't too many other circumstances where similarly talented and sure-footed and, well, fundamentally, _right_ programmers get to specify requirements like that. I think the comparison to the Linux kernel is apt.


Interesting, I think you are right. When I first looked at the Linux source I assumed it would also be a monolithic, jenga-style mess, since it was developed by so many different people with different motivations, but I guess it must be a testament to Torvalds's management/leadership skills.

I suppose I am probably jaded because I work developing bespoke corporate software with relatively few users compared to Quake. Therefore each user has huge individual sway over the functionality/design, even if they don't really get the nuances of building usable software. For example, they will request buttons in nonsensical places because it helps their personal workflow, but then somebody else will come along, use it, and complain that there are too many buttons and it's too complicated: "can we please make this more like the iPad?"

I wonder if there are any pieces of bespoke corporate software that aren't in fact giant piles of bloated dogshit?

All the books on this subject recommend doing thorough requirements analysis, but that doesn't really work when you have to deal with people who can't articulate what they want and have no idea what might suddenly become important in the future. Plus you have to deal with the corporate politics of people whose roles and places in the hierarchy mean they want and need diametrically opposed things. People will also often have good ideas but won't tell you them until it's too late, because they are worried about contradicting their boss.

It would certainly be refreshing to work with a Carmack but I doubt that I would have the focus or brainpower required to keep up!


This is very interesting. I (yesterday, coincidentally) finished reading Masters of Doom, the book about id by David Kushner. While generally a good read, it had a very fanboyish tone towards both the Johns. It really portrayed Carmack as absolutely superhuman when it came to coding. Given the tone of the rest of the book I'd assumed he was exaggerating — it sounds like he wasn't after all, about that at least.

I don't know C, so I can't read the source he's released to judge for myself, let alone appreciate it.

Thanks for sharing.


This is amazing. How come id can consistently release top quality engines as open source and other games companies cannot?


Most companies probably can, they just don't care enough to do so, seeing it as money spent on a last-gen engine that results in no new revenue streams.

I think id does it mostly because Carmack personally pushes for it. However it arguably makes some sense for them, relative to other game companies, because they've been particularly identified with the "genius hacks to make stuff work" aura, and every source release produces a new wave of discoveries/press about crazy things id did in the guts of one of their engines, which solidifies that reputation. Plus it's probably good for Carmack getting historical credit to put the techniques out in the open once they're no longer most-recent-gen.


Carmack also stated that these open source releases help them in the long run. E.g. they were able to port Wolfenstein 3D and Doom (the old one) to iOS much quicker by basing it on modern versions of those engines, created by the community.


Yes, and it also helps them keep the games alive. If you have the game data (CD, DVD, Steam, whatever), you can play every id FPS game up to Doom 3 on almost any platform you want, today. And you will probably be able to do it 20 years from now. I think it's fantastic.


Yes, this and the ability to build games on an open-source engine gets them a lot of goodwill, building a brand people love.


Yeah, there's a lot of goodwill generated by them, too. I wouldn't be surprised if this also helps them with recruitment. I mean, this helps create more talented game devs by giving them stuff to hack on.


> every source release produces a new wave of discoveries/press about crazy things id did in the guts of one of their engines

wow, that sounds interesting! any links?


Check out "Graphics Programming Black Book" [1] by Michael Abrash [2].

If you have not read his work, you are in for a treat. He has a wonderful writing style that is both engaging and illuminating.

There are a number of chapters [3] in the book that specifically discuss technology and development related to Quake.

[1] PDF: http://www.gamedev.net/page/resources/_/technical/graphics-p...

[2] http://en.wikipedia.org/wiki/Michael_Abrash

[3]

Chapter 64 - Quake's Visible-Surface Determination

Chapter 66 - Quake's Hidden-Surface Removal

Chapter 67 - Sorted Spans in Action

Chapter 68 - Quake's Lighting Model

Chapter 69 - Surface Caching and Quake's Triangle Models

Chapter 70 - Quake: A Post-Mortem and a Glimpse into the Future


Here's an example:

Carmack's Square Root implementation, which runs four times faster than the assembly language instruction.

http://www.beyond3d.com/content/articles/8/

http://www.codemaestro.com/reviews/9


Bear in mind that this is inverse square root (a misnomer; it's really 1/sqrt, or the reciprocal), not just the square root. Square roots are decently fast as provided by the processor, but not when combined with the division needed for the reciprocal; combining the two is why FastInvSqrt is so fast.
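For reference, here is the widely circulated form of that trick from the released Quake III source, lightly modernized as a sketch (the magic constant and the single Newton-Raphson step are from the published version; the `memcpy` bit-reinterpretation is my substitution for the original pointer cast, which is undefined behavior in modern C++):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <cstring>

// Fast inverse square root (approximates 1/sqrt(x) for positive x).
// Reinterprets the float's bits as an integer, uses a magic constant
// to form a rough initial estimate, then refines with one
// Newton-Raphson iteration.
float FastInvSqrt(float x) {
    float xhalf = 0.5f * x;
    std::uint32_t i;
    std::memcpy(&i, &x, sizeof(i));  // reinterpret bits without UB
    i = 0x5f3759df - (i >> 1);       // magic-constant initial guess
    std::memcpy(&x, &i, sizeof(x));
    x = x * (1.5f - xhalf * x * x);  // one Newton-Raphson step
    return x;
}
```

With the single refinement step the maximum relative error is roughly 0.17%, which was plenty for normalizing lighting vectors.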


It was faster before 2000, but is it still faster now?

On my computer it is indeed four times faster, but the marvellous method is on its own again four times slower than SSE, using the rsqrtps instruction, with the advantage of a way lower maximum error.

To take the square root of an array of 1024 floats:

Naive sqrtf() CPU cycles used: 48160, error: 0.000000

Vectorized SSE CPU cycles used: 2970, error: 0.000392

Marvellous Carmack method CPU cycles used: 11330, error: 0.002186


No, it's not faster now, and it's probably no longer really used (I don't know if smartphones, tablets, various consoles, etc. offer access to any equivalent of rsqrtps), but it was brilliant at its time (well, still is, it's just not very useful anymore).

Remember, Quake was released in 1996, SSE came out in 1999.


Hmm, that seems a bit misleading. The SSE implementation will of course be much faster than a non-vectorized implementation... it's working on 4 floats at a time.

A vectorized version of the marvelous method would be a fairer comparison. If your data is not very vectorizable and you don't need high precision, it's still fairly marvelous.


It is your comment that is (mildly) misleading.

The SSE version is 16x faster than the x87 FPU version and almost 4x faster than the marvelous method. Further, the marvelous method is a loop and is much more resource-hungry than an instruction that goes away for 5-10 cycles (reciprocal throughput is now 1 cycle) but leaves you with most of your execution resources to do other useful work.

Vectorizing the marvelous method is very likely to be painful, as it treats a value as both a float and an int - and while the xmm registers are dual-purpose, operating on one successively as an int and a float causes some extra latency. The use of bit shifts and float operations requires these cross-domain transfers...
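For comparison, a minimal sketch of the SSE approach being described, assuming an x86 target with SSE available (the function name and the optional refinement step are mine, not from any benchmark code posted here):

```cpp
#include <cassert>
#include <cmath>
#include <xmmintrin.h>  // SSE intrinsics (x86 only)

// Approximate 1/sqrt for four floats at once with rsqrtps, refined
// by one Newton-Raphson step to cut the ~0.04% hardware estimate
// error down by several orders of magnitude.
void InvSqrt4(const float* in, float* out) {
    __m128 x = _mm_loadu_ps(in);
    __m128 e = _mm_rsqrt_ps(x);  // hardware reciprocal-sqrt estimate
    // Newton-Raphson: e' = e * (1.5 - 0.5 * x * e * e)
    __m128 half = _mm_set1_ps(0.5f);
    __m128 three_half = _mm_set1_ps(1.5f);
    __m128 e2 = _mm_mul_ps(e, e);
    __m128 r = _mm_mul_ps(_mm_mul_ps(half, x), e2);
    e = _mm_mul_ps(e, _mm_sub_ps(three_half, r));
    _mm_storeu_ps(out, e);
}
```

Note this stays entirely in the float domain - no int/float reinterpretation - which is exactly why it avoids the cross-domain penalty described above.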


Would you mind posting your bench framework for this?


I didn't make the code, nor do I want to take credit for it. I only changed a few lines to make it compile. The code explicitly says one can redistribute it as one wants, so here it is:

http://pastebin.com/EnJi86hF

Would you mind posting your results here? I'm quite interested if this has the same pattern everywhere.


Here's what I get:

Input array size is 1024

Naive sqrtf() CPU cycles used: 54514, error: 0.000000

Vectorized SSE CPU cycles used: 900, error: inf

Marvelous CPU cycles used: 21514, error: inf

I'm not sure how to interpret the errors. I had to make a few changes myself for this to compile on my machine.



Scorched earth. If id release an engine at a certain tech level, they put serious competition in the way of other would-be licensors of engines at or about that tech level. That stops smaller software houses from breaking through that tech level to compete at the higher level.

I don't know if it's a conscious policy or not, but it's interesting nonetheless.


I see what you're saying, but I don't think so.

They usually dual-license the old engines, distributing one version under the GPL and another under a commercial (albeit relatively cheap compared to the newer engines) license.

The GPL'd version can only be used to create games that are also GPL licensed (so stuff like Alien Arena, I think).

I suppose this could allow someone to start developing using the GPL'd engine and then switch to the commercial engine later once they had something ready to release.

I like to think it is more to do with Carmack being curious as to what people will do with his creations.


I've found out thanks to other comments here that id don't plan to license their top-end engine outside their own publisher any more. I think that means this argument is moot.


I disagree: The GPL allows those smaller software houses to use the id engine without giving away their game resources. After all, id doesn't give away Q3 just because it gives away the Q3 engine.

As the game resources are what the players mostly associate with the game, you can still make and sell your own game based on a GPL'd engine without the GPL really cutting into your profits.


That's exactly my point. You can make and sell your own game, but you can't compete with id by making and licensing your own engine so easily.


Simple: because they want to and others do not.

Also because they started the good habit a long time ago and it has definitely shown its benefits. (AFAIK, the iPhone port of the Doom engine depends on the open-sourced branch of the Doom engine.)


You also have to remember that a lot of game companies do not write their own engines. They either license id's engines or the Unreal Engine.


That's an interesting point. I wouldn't immediately think of game developers as using someone else's framework to build an application with, but a lot of them really are, just like those of us building business applications with someone else's framework.


Sorry for the missing reference, as I am typing on my phone: id does not license their engine anymore, and Carmack seems happy about it.

Also, compared to UE3, id Tech 4 was not used in many games.


Interesting. Here's a ref, for anyone else: http://www.gamasutra.com/view/news/29886/id_Tech_5_Rage_Engi...

It's actually slightly different from that: "if you're going to make a game with id Tech 5 then it needs to be published by Bethesda"


This is largely due to John Carmack and his philosophy regarding OSS [1].

[1] http://en.wikipedia.org/wiki/John_carmack#Free_software


Because Doom 3 was released 7 years ago and the engine was probably in major development what, 8-9 years ago?

They've since moved on and their most recent game, RAGE, uses the fifth version of their engine, not the fourth.

They're basically open sourcing the engine that they're no longer hyping/selling, while they continue to use their modern engine for business.

How can they? I mean, I guess other companies just let their engines die instead of open-sourcing them.


Also, it's not always up to the games company; their publisher might not allow them to release the engine.


How is Doom 3 a top quality engine? I would say CryEngine 2 or Frostbite 2 are.


Top quality != latest tech. id Tech 4 is a top quality 7-year-old engine.


I think the success of Minecraft proves that the latest engine technology is not critical to the success of a game.


Minecraft allows for fully modifiable 3D environments which no other popular games have. So it is the latest technology, much unlike Doom 3.


Right I totally agree and it's extremely fun. I guess I was thinking that the speed and graphics aren't as visually snazzy but the fans (like me) don't care about that.


Would you say the same thing if it were the year 2030? Does Pacman run with a top notch engine? It was certainly optimized well for its time.


I for one am looking forward to reading through the unified lighting model code.


I wanna see how they did their shadow volumes. As far as I know, they did it without geometry shaders, which would mean that they had to do the shadow volume extrusion on the CPU. Or maybe there's some cleverness involved.


They probably used the CPU -- I remember reading somewhere that Carmack also wanted to implement a beam-tree (3D BSP) for faster culling of shadow surfaces, but I am not sure. I did some experiments with shadows myself 10 years ago, on that era of CPUs and GPUs, and the bottleneck for me was drawing the volumes into the stencil buffer.
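To make the CPU-side cost concrete, here is a hypothetical sketch of the per-vertex step such an engine would need: pushing a silhouette-edge vertex away from a point light to build the shadow volume. The names and structure are mine for illustration, not id's actual code:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Extrude a vertex 'dist' units further along the ray from a point
// light through the vertex. Done per silhouette vertex, per light,
// per frame when shadow volumes are computed on the CPU.
Vec3 ExtrudeFromLight(const Vec3& v, const Vec3& light, float dist) {
    Vec3 d = { v.x - light.x, v.y - light.y, v.z - light.z };
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    float s = dist / len;
    return { v.x + d.x * s, v.y + d.y * s, v.z + d.z * s };
}
```

Each silhouette edge then yields a quad between the original edge and its extruded copy, and rasterizing those quads into the stencil buffer is exactly the fill-rate cost described above as the bottleneck.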


I think this is also the first C++ version of their engine, isn't it? And with that, the first C++ code released by id.

I wonder how much C++ is involved in the code.


I worked on Quake4 as a programmer for a bit over a year, though little of my work ended up in the final game.

The Doom3 engine is mostly C++. If I recall correctly, the renderer still looked more like a blend of C with some C++ slowly drifting in (which makes sense, given how good of a C programmer Carmack is), but the rest of the game code is very heavily C++, for some definition of C++, along with a scripting layer.

Having worked extensively with the Quake 2 engine previously, I really didn't like the Doom 3 engine, myself. It was radically less... pragmatic. Earlier id engines, to me, were masterpieces of concision. They weren't amazing feats of black box abstraction by any measure, and they implicitly assumed you needed a lot of domain knowledge to understand what they were doing, but they did exactly what they needed to do with a minimum of fuss. Features were included because they were used, generally with the smallest and simplest code and tool footprint possible.

Not so with Doom 3. A huge amount of code appears to have been written because someone prolific thought it would be lots of fun to write, many features end up going unused or in the way, and ultimately (because it's more typically software-engineered in style), every feature requires about 30 times as much code to be written properly and has several extra unused layers of abstraction crufting things up. The game code (in C++) in particular bears an unfortunate resemblance to Windows MFC-style C++, with a giant 12-layer-deep inheritance graph and everything inheriting from a few enormous base objects.

All of that to my eyes, anyway. But then, I really cut my teeth on Quake 1 and 2, and found that their concision and brevity really helped them stay out of the way when adding new, innovative features.

The funny thing about the code for Doom 3 is that, looking through it all, you'd get the impression that Doom 3 was a radically more complicated game than what you ultimately experience when you play it.

The best way I could put it is this: Quake 1 and 2 and 3 were well made games that ended up being enormously fruitful for white box reuse. You couldn't do much with them without understanding fair bits of their internals, but they didn't actively resist being understood. Doom3 was built to be used in a more black box fashion (like the Unreal Engine), with all that entails from an architecture perspective. I'm sure others might disagree with me from a code ideology perspective, but it is hard to ignore how fruitful the Quake engines were for licensing, and how much Doom 3 sank like a stone.

The code bases make really fascinating studies in contrast about the consequences of different software engineering values, at the very least.

TL;DR: Quake 1-3 remind me of Linux code; Doom 3 reminds me of MFC.


I have read most of the Q3 code and really like it. And as you said, there's not much (or any) overhead.

What you write about D3 (i.e. idTech4) makes me a bit sad. How did that happen? Less influence from Carmack? (Because he still seems like the pragmatic guy to me.) Too often, when C++ is involved, I see much too complicated code. Though I still believe you can end up with simpler code in C++. (Well, that goes into the usual C vs. C++ ranting discussion.)

Do you have any knowledge about idTech5 (or maybe even idTech6) which you can share?


I think by Doom3 id had reached a point where they needed a team of programmers, rather than Carmack + programmers orbiting Carmack. And so Doom3 represents their transition to dealing with all the software engineering / communication issues all of us on teams always have to deal with.

I don't have any knowledge of idTech5, although I would guess it goes further down the lineage of Doom3. id still has to find solutions to the problem of programmers working on teams, after all - no amount of Carmack genius is going to magically fix that all too banal (and inevitable) organizational challenge.

I had my fill of the industry by the mid-2000s and have been working on indie/experimental/violent educational games since, so that's roughly where my big iron engine knowledge fades...


Just look at the Doom 3 SDK. It is C++ -- they use their own sort-of inheritance / type-check implementation as well.
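The pattern looks roughly like this - a generic sketch of hand-rolled runtime type checking, with my own names rather than id's actual macros and classes:

```cpp
#include <cassert>

// Each class carries a static type descriptor chained to its
// superclass's descriptor, so type checks can walk the chain
// instead of relying on compiler RTTI (dynamic_cast).
struct TypeInfo {
    const char* name;
    const TypeInfo* super;
};

class Entity {
public:
    static const TypeInfo Type;
    virtual const TypeInfo* GetType() const { return &Type; }
    // True if this object's class is t or derives from t.
    bool IsType(const TypeInfo& t) const {
        for (const TypeInfo* p = GetType(); p; p = p->super)
            if (p == &t) return true;
        return false;
    }
    virtual ~Entity() = default;
};
const TypeInfo Entity::Type = { "Entity", nullptr };

class Actor : public Entity {
public:
    static const TypeInfo Type;
    const TypeInfo* GetType() const override { return &Type; }
};
const TypeInfo Actor::Type = { "Actor", &Entity::Type };
```

If I recall correctly, the SDK's real version wraps the boilerplate in declaration macros and ties into the event system, but the superclass-chain walk is the core idea.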


So, what platform are we gonna port this to? :)


WebGL?


Accidentally downvoted you -- fat iPad fingers, sorry! I'm not opposed to the idea, though it'd probably be a port to OpenGL ES (iPhone?) before WebGL.


WebGL is pretty much OpenGL ES as far as I know...


Yeah, that's what I'm saying. OpenGL ES + C (iPhone, Android, WP7?) is a likely earlier step than OpenGL ES + JavaScript (WebGL).

Do you disagree?


Nope, that makes sense to me too, I had originally misinterpreted your post as saying the graphics would need to be re-targeted and you were talking about everything else. :)


OpenGL ES + C won't fly on WP7. There's no OpenGL ES support (XNA is the exposed 3d API) and you can't write C as you're limited to the managed-only subset of .NET.


You could use LLVM to compile to .Net. But not sure how easy it is to wrap the OpenGL calls.


Not even worth thinking about... DX and OGL are mutually unintelligible :(


Actually, that's not true. WebGL on Windows is implemented using D3D (ANGLE converts OGL ES calls to D3D9) in most cases, and WINE has done D3D on top of OGL for years. They're very different, but translating between the two really isn't as big a deal as you'd think. (Just keep in mind that XNA isn't D3D -- it may backend to it, but it's a very different API.)


ANGLE[0] goes some way to solving that.

[0] http://code.google.com/p/angleproject/




JavaScript please!


Emscripten... oh wow if emscripten could handle this that could be an amazing tech demo for browsers.


Platform... not language.


Platform: the modern web browser.


Then it can join MESS and MAME:

http://ascii.textfiles.com/archives/3375


If anyone's interested in reading a book about the history of id Software, Carmack, and Romero I recommend Masters of Doom (http://www.amazon.com/Masters-Doom-Created-Transformed-Cultu...). I finished it last week and it was a great read.


I guess this will get _someone_ using id Tech 4... it doesn't look like it ever got much love from a licensing standpoint.

http://en.wikipedia.org/wiki/Id_Tech_4#Games_using_id_Tech_4


This is great news! That engine is, I think, the same one that powers the most excellent ETQW and the not-so-bad Quake 4.

Those run on Linux and that is awesome.


I'm pretty excited to play The Dark Mod for free (http://www.thedarkmod.com/)


It's unlikely that you'll be able to play TDM without a copy of Doom 3 for the foreseeable future. TDM reuses a number of Doom 3 assets and those are not included with the open source release of the engine.


(If you don't know, the Dark Mod is a Doom3 total conversion inspired by the Thief series, one of my all-time favorite games).

Me too! I just heard about the Dark Mod a while back and tried to get it working. I didn't want to go buy Doom 3, because I don't like the game and I couldn't be sure whether or not TDM would work. So I tried with a pirated version and it didn't work, so I'm stuck waiting for the open source release of the D3 engine (and the subsequent "porting" of TDM).

(For some reason, PC games have a terrible track record of not working for me. I often try a pirated version of a game, and if it works and I like it, I might buy it. If the pirated version doesn't work, I won't bother, because experience tells me the legal one probably isn't any better, and I hate begging for refunds from Steam support.)


You can buy Doom 3 on Steam for 20 bucks. Or if you wait a few weeks there's usually Holiday Sales on Steam, you'll probably be able to buy it for 5 or 10 bucks.

http://store.steampowered.com/app/9050/


Can't wait to see the awesome things fans will do with the engine - assuming it even gets through id's legal dept.


Meanwhile you can go take a look at all the nice things that have been done with the Quake engines. Nexuiz, Alien Arena and many other open source games are based on the Quake engine.

It's fairly certain that id's legal team will approve the release, as they have done with all their previous game engines. The only problem might be if D3 uses some 3rd-party libraries.


I believe one of the major issues will be Doom 3's use of the "zfail" algorithm, which Creative claim a patent on. Generally, Creative license this patent to game developers in exchange for the game supporting OpenAL (a net win for Creative as it means their sound cards can make the game run faster and more people will buy them), but I'm not sure what the terms on a source release would be. Take a look at [1] for more info.

[1]: http://www.doomworld.com/vb/doomworld-news/45868-doom-3-code...


Carmack has said somewhere (I can't retrieve the source right now) that an alternative, slower method will be used in the GPL'd codebase, effectively working around Creative's patent claims.


Oh this is just silly. Thankfully the patented algorithm is ridiculously simple, so someone can write an open source patch to use the better stencil algorithm. No doubt whatever derivatives will spring from the D3 source will use the single pass method.

While being at it, it shouldn't be too hard to use geometry shaders for the shadow volume extraction in the D3 engine. Hmm, maybe I should do it :)


I suspect this will be released, I doubt they'd get this far in announcing it to the public unless the remaining stuff was crossing Ts and dotting Is, but it is notable that there's a pretty big difference between this release and previous releases of their engine source.

Previously id was an independently owned company. Now they are owned by ZeniMax, a larger entity.


All the others have, so I trust him here too.


Yeah, but all the others were released when id was an independent entity - it's now owned by ZeniMax Media.


Is it odd that it is a picture of a tweet instead of a clickable thing?


woah hello HN!



