
Graphics Programming Black Book by Michael Abrash (2001) - emptybits
https://github.com/jagregory/abrash-black-book
======
Shish2k
"Is performance still an issue in this era of cheap 486 computers and super-
fast Pentium computers? You bet. How many programs that you use really run so
fast that you wouldn't be happier if they ran faster? We're so used to slow
software that when a compile-and-link sequence that took two minutes on a PC
takes just ten seconds on a 486 computer, we're ecstatic—when in truth we
should be settling for nothing less than instantaneous response."

Not sure if I want to laugh or cry at how 100% relevant this still is 20 years
later...

~~~
anaphor
I've had people tell me all of the extra bloat/slowness in modern software is
because it has more features, and because accessibility / localization is a
thing more often. I'm not sure I actually believe that's true. Maybe
accessibility and localization contribute to bloat/slowness, but that doesn't
mean it couldn't be much, much more efficient and keep the same features.

~~~
Accujack
Present day programs aren't optimized because they don't have to be. No one
needs to gain a competitive advantage by squeezing the last bit of performance
out of hardware because for a long time we've had steadily increasing
performance levels and backward compatibility. Applications that need more
performance than current hardware can deliver badly enough to justify
optimizing to that level are very rare.

That's why Windows is the size it is nowadays... it's optimized for delivering
the same user experience on all hardware, compatibility, language flexibility,
manageability and user friendliness (don't laugh) instead of performance. It
essentially performs the same function that Windows 3.1 did, but it does it on
a far greater variety of hardware running a far larger assortment of software.

It's bigger and slower than it could be, but there are always trade-offs, and
Microsoft hasn't really had any competition in operating systems for
decades... Linux and variants have their place, but it's not on the desktop
for most people, at least at the moment.

~~~
codesushi42
I disagree. Software nowadays is atrocious because it's not optimized. It's
just that average users have become more accepting of long load times and
bloat.

Here's an example. In the mid to late 90s, gamers would complain a lot about
load times for CD ROM games. For some gamers, the load times were a deal
breaker. This was one reason Nintendo could get away with using cartridges on
the N64.

Fast forward to the present, and load times for many mainstream games are FAR
worse than on the 1-2x speed CD based consoles of the 90s. And a lot of this is
due to sloppy, unoptimized code, e.g. loading assets. And yet no one bats an
eye.

Another example is the web. There are so many garbage, unoptimized sites that
load on my smartphone as if I were on dial up (I'm looking at you, Gawker).
Yes, those pages contain a lot more multimedia content nowadays, but they
still load at the same rate as a typical page in the 90s would have on dial
up.

And with Moore's Law now ending (ended?), optimization may come back in vogue.

~~~
bitwize
The N64 lost its console generation and solidified Sony as the market leader,
in part _because_ it used cartridges while everyone else used CD-ROMs. Way to
choose an example that disproves the very point you were trying to make.

As for load times today, most modern games are implemented as scripts and
assets for one of the big engines, usually Unreal or Unity. It's hard enough
to manage all the assets that go into a modern game; an optimized loader would
just make wrangling them all even more difficult.

~~~
codesushi42
_The N64 lost its console generation and solidified Sony as the market leader,
in part because it used cartridges while everyone else used CD-ROMs. Way to
choose an example that disproves the very point you were trying to make._

Yes, I realize that. But the point was that this poor decision was not enough
to kill it, and the N64 was still a commercial success. A lot of gamers even
preferred cartridges back then because they didn't require load times.

_As for load times today, most modern games are implemented as scripts and
assets for one of the big engines, usually Unreal or Unity. It's hard enough
to manage all the assets that go into a modern game; an optimized loader would
just make wrangling them all even more difficult._

That's not a real reason to not optimize.

------
tverbeure
This takes me back to the early nineties, when these articles were published
in monthly installments in Dr. Dobb's Journal and, later, PC Techniques.

Incredibly, these 2 magazines were stocked monthly at my local newsstand in a
sleepy suburb of Antwerp, Belgium, and every month around publication time,
I’d bike there daily to check if the next issue had arrived.

I learned a lot of good stuff in college, but I don’t remember anything as
exhilarating as this series, which eventually would lead to a career in the
computer graphics industry.

~~~
floki999
I had the same experience (Belgium and Italy) - I would add in the same
category the old Game Developer magazine, which had a regular series on game
physics and math. Still have a bunch of them. Those were the days :-)

------
ben7799
This is such an awesome book. I read this back in the day and loved every
minute of it.

I think there is a lot to learn from it still, even if you're highly unlikely
to be writing the same kind of graphics code.

The book has a great balance between:

- Micro-optimization: fiddling with ASM and data structures, memory alignment
and such, custom math functions that lose precision in a tradeoff for great
acceleration due to hardware features

- Algorithmic optimization: trying to do things in ways that are
mathematically faster (computational complexity)

I work in enterprise software, not games or anything touching hardware. It is
depressing that algorithmic complexity is ignored so often these days: if you
pose it as an interview question it is often greeted with blank stares, and
there seem to be college undergrad curriculums that don't even touch on it.

I'm in that camp that feels like software has gotten so inefficient in a lot
of cases that the user experience is no faster than 20 years ago. We have
acceleration & optimization for certain things but every day applications are
no faster than they ever were and when you have to use a web application stuff
is often a lot slower than a native app was 10-20 years ago.

There are still a lot of great college texts on computational complexity even
though trade publications ignore this stuff with a vengeance.

~~~
robmaister
From my experience (graduated in 2016), most interviewing is centered around
algorithmic complexity, or at least regurgitating logarithmic-complexity algos.

Potential hires still in or just out of school should have no problem
answering those questions, but a few years out and most people forget those
skills since most of the time the answer is to use an existing implementation
or find a way to avoid the problem entirely. All of the people I know with a
4-year CS degree learned all about that stuff in their data structures/intro
to algo classes.

I work in games and have had to implement a few data structures on my own
(mainly specialized trees and graphs). I've seen them help performance a ton,
and I've also had to scrap one or two of them because the naive implementation
was faster. Nowadays a lot of indirection means your processor is spending
most of its time waiting on memory reads, while flat arrays can be loaded
into CPU caches a lot more efficiently.

------
dang
2017:
[https://news.ycombinator.com/item?id=14897512](https://news.ycombinator.com/item?id=14897512)

2014:
[https://news.ycombinator.com/item?id=8803883](https://news.ycombinator.com/item?id=8803883)

2014:
[https://news.ycombinator.com/item?id=7149973](https://news.ycombinator.com/item?id=7149973)

2013 (with cameo by Michael):
[https://news.ycombinator.com/item?id=6659279](https://news.ycombinator.com/item?id=6659279)

2010:
[https://news.ycombinator.com/item?id=1301086](https://news.ycombinator.com/item?id=1301086)

2008:
[https://news.ycombinator.com/item?id=119494](https://news.ycombinator.com/item?id=119494)

------
robmaister
I work in games doing mainly graphics work - it's amazing how many of these
concepts still exist and have been recycled in interesting ways. Well worth
the read if you're in my line of work.

For example, the concept of "sorted spans" in Quake is conceptually the same
as how "light culling" is done in deferred and forward+ rendering pipelines.
The first I'd heard of the technique was how Battlefield 3 used the PS3's SPU
to do light culling for 64x64 blocks of pixels at a time.

------
jefftime
This book is a great resource for thinking about how to optimize your code.
And while VGA programming is not as relevant as it was, I still found it
really fascinating to read about. Plus, the chapters on Quake are really
interesting to read

------
phtrivier
Questions for experts: are parts of this "timeless" and still relevant today,
or is it mostly historical? (Chapter titles like "Pushing the 286 and 386"
are a bit scary :D)

~~~
Crinus
Some parts are timeless, like this introduction in the first chapter about
optimization:

---

Understanding High Performance

Before we can create high-performance code, we must understand what high
performance is. The objective (not always attained) in creating high-
performance software is to make the software able to carry out its appointed
tasks so rapidly that it responds instantaneously, as far as the user is
concerned. In other words, high-performance code should ideally run so fast
that any further improvement in the code would be pointless.

Notice that the above definition most emphatically does not say anything about
making the software as fast as possible. It also does not say anything about
using assembly language, or an optimizing compiler, or, for that matter, a
compiler at all. It also doesn't say anything about how the code was designed
and written. What it does say is that high-performance code shouldn't get in
the user's way—and that's all.

That's an important distinction, because all too many programmers think that
assembly language, or the right compiler, or a particular high-level language,
or a certain design approach is the answer to creating high-performance code.
They're not, any more than choosing a certain set of tools is the key to
building a house. You do indeed need tools to build a house, but any of many
sets of tools will do. You also need a blueprint, an understanding of
everything that goes into a house, and the ability to use the tools.

Likewise, high-performance programming requires a clear understanding of the
purpose of the software being built, an overall program design, algorithms for
implementing particular tasks, an understanding of what the computer can do
and of what all relevant software is doing—and solid programming skills,
preferably using an optimizing compiler or assembly language. The optimization
at the end is just the finishing touch, however.

~~~
ryandrake
Think about the software you work on and maintain day to day: how much of it
_runs so fast that any further improvement in the code would be pointless_?
Truly we have strayed far from the light...

~~~
Crinus
Well, I work on a AAA game engine that needs to run on consoles, so... :-P

(Though I work mostly on tools nowadays, even there optimization is
important. From my experience people won't tell you that a tool is slow, but
they'll _really_ like it if you make it faster. That's why I always dismiss
comments like "people like Electron/otherslowstuff, otherwise they wouldn't
use it": way more often than not, people won't tell you about something being
slow and would rather get used to it, unless it _REALLY_ affects them in a
major way.)

~~~
ryandrake
Sorry, when I said "you" I meant the average "HN you", who tend to have 4
abstraction layers and 5 third-party frameworks between their users and the
metal :)

------
l4r5
I still keep the original version. I bought this book with very little money
in 2001. Back then, not many books covered algorithms.

From today's perspective I can hardly remember what it was like hunting for
code snippets in books, without Google, GitHub, Stack Overflow, or Ctrl+F.

------
iconjack
The best part of the book is the part about The Kennedy Portfolio, in Chapter
9. "Reader John Kennedy regularly passes along intriguing assembly programming
tricks, many of which I've never seen mentioned anywhere else."

------
masterwok
I wrote a game of life screensaver for Windows years ago using the
optimizations recommended by this book in C# if anyone is interested.
[https://github.com/masterwok/Game-of-Life-Screensaver](https://github.com/masterwok/Game-of-Life-Screensaver)

------
kelvin0
This is great, but I can't seem to find a generated version? A PDF for example
would be nice?

------
winrid
A while ago I bought the paperback version of this book since I hate reading
on the computer. It's huge, was $100 on Amazon, and someone wrote "Trash" on
the side before they realized they could sell it.

And to think now I have a Kindle... >.<

------
floki999
Thank you for making this available. I had the original book which
unfortunately went missing between moves. A great book!

------
Iv
I read "Graphic Programming Black Block" and for a moment thought that the
demoscene had an awesome revival!

------
kunkelast
I read it a long time ago, good book!

------
tobr
> Markdown source (2001)

This looked like a peculiar anachronism, as Markdown was created in 2004. But
apparently this isn't the original _source_, but rather a scraped HTML[1]
version converted to Markdown[2] in 2013.

1: [https://github.com/jagregory/abrash-black-book/commit/b946ff9e70a7dc0c313f107833935e53c71a7935](https://github.com/jagregory/abrash-black-book/commit/b946ff9e70a7dc0c313f107833935e53c71a7935)

2: [https://github.com/jagregory/abrash-black-book/commit/5e10794a24a45e84a709e64577912999fe5bc557](https://github.com/jagregory/abrash-black-book/commit/5e10794a24a45e84a709e64577912999fe5bc557)

~~~
jagregory
Repo owner here. You are correct, not sure why this wasn’t made clearer by OP.

Book released 1997, made available online in 2001, converted to
Markdown/ePub/etc by me in 2013.

~~~
app4soft
Could you also add PDF version? (converted from Markdown)

~~~
big_chungus
You can add the following target to the Makefile (note that the recipe lines
must start with a tab, and `--to pdf` is dropped since `-t latex` already
selects the intermediate format):

    pdf:
            rm -f out/black-book.pdf
            pandoc -t latex --pdf-engine=xelatex -o out/black-book.pdf --toc $(FILES)

Or here's a link to the copy I generated:
[https://u.teknik.io/pAxm9.pdf](https://u.teknik.io/pAxm9.pdf)

------
kranner
To save others the disappointment of expecting a 2019 update, this is a
cleaned-up copy of the classic text.

~~~
taneq
I don't think I'll ever be disappointed to see this work linked. :) In some
ways I feel like modern graphics programming has little to do with traditional
optimisation. Rather than coding tight inner loops and wizardly algorithms,
it's all about managing cache lines and pipelining data flows into your
massively parallel desktop supercomputer. Which is awesome, but compared to
the older stuff it's like a high speed rail network compared to a motorbike.

~~~
markus_zhang
Just curious: is it OK to say that most of the book is irrelevant today?

~~~
chongli
It's not irrelevant unless you intend to focus on modern, triple-A graphics
engines. Lots of indie game developers don't care about that stuff! There are
even people working on new games using engines from the 90s, such as the Build
engine [1]. Ion Fury [2] is one example.

I think the more you focus on modern graphics engines, the more difficult it
is to stand out from the crowd. You end up in a rat race where you need a huge
team of artists to create all of the assets for your photorealistic game.

On the other hand, with an old engine (or a new engine using traditional
rendering techniques), you can make something distinctive and stylish with a
smaller team. After all, they say the enemy of art is the absence of
limitations.

[1]
[https://en.wikipedia.org/wiki/Build_(game_engine)](https://en.wikipedia.org/wiki/Build_\(game_engine\))

[2] [http://www.ionfury.com/](http://www.ionfury.com/)

~~~
taneq
Just to build on this, modern triple-A games are incredibly asset-based. It
doesn't matter how good a coder you are, unless you have an art _department_
producing every single tiny little detail of your in-game assets, they're
going to look terrible.

This is part of the reason lo-fi games are making a comeback in the indie
scene: they can actually look good with only one or two artists working on
them (or even, sometimes, with programmer art).

------
markus_zhang
I'm actually wondering who is Wendy Tucker...

