Doom 3 BFG Technical Documentation (fabiensanglard.net)
53 points by basil 1540 days ago | 9 comments

Much of the article is about memory bandwidth vs. CPU computation. In Doom 3 BFG, as opposed to the original release, a lot of things are recomputed instead of stored in memory.

This raises the question: was it really faster to store that stuff (e.g. skinned meshes, shadow volumes) back in 2004 when Doom 3 came out? Has the hardware really changed that much in the 8 or so years between the original and the BFG releases? Or was this another case of premature optimization and doing things the way they were always done before?

In addition to memory bandwidth vs. CPU throughput, the number of cores in your average consumer CPU has gone up from 1-2 to 4 (or "8 cores" with hyperthreading). Reading between the lines, this was a motivator for the changes. Sharing memory between cores is expensive, and some of the changes were clearly made to reduce sharing between the CPUs.

So if any of the Doom 3 coders are reading this, can you give some insight on these changes? :)

I don't know the specifics behind the Doom 3 developers' decisions, but I know that when it comes to CPUs, performance has really skyrocketed.

A low-end desktop Sandy Bridge i3 CPU running a single-threaded task on one core is almost 5 times faster than a single-core Venice Athlon 64 that was released in 2005, if memory serves. When you factor in the second core and HyperThreading (if it helps the workload), it wouldn't be unreasonable to claim that performance has increased by an order of magnitude or more.

On the other hand, memory speed (both latency and bandwidth) hasn't changed much in that timeframe. That means that relative to the speed of the CPU, it takes much longer to access memory now than a decade ago, so stuff that made sense to cache in memory back then will probably be faster to recompute on demand nowadays.

Maximum theoretical bandwidth may not have increased at the same pace, but modern memory controllers are a lot more efficient. I'd wager that the CPUs are kept adequately fed even without the extremely fast RAM that GPUs have.

When he mentioned Doom 3 coming out 10 years ago, I couldn't believe it.

But I looked, and holy crap: released 2004. I don't know why, but that makes me feel particularly old.

At first I thought they had written an actual tech spec of the BFG weapon!

Heh, me too. Now, after having read the entire technical note, I wish that was the case.

That document exists too:


The background of that website nicely reflects the year it was written (1999).
