
The Poor Man's Voxel Engine - et1337
http://et1337.com/2015/02/18/the-poor-mans-voxel-engine/
======
stcredzero
_Long story short, the garbage collector doesn't like to clean up large
objects. I end up writing a custom "allocator" that hands out 3D arrays from a
common pool. Later, I realize most of the arrays are 90% empty, so I break
each chunk into 10x10x10 "sub-chunks" to further reduce memory pressure._
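The pool idea being quoted can be sketched roughly like this. This is a hypothetical C++ analogue (the original was C#/XNA, and every name here is made up): fixed-size sub-chunks are recycled through a free list so large arrays are never repeatedly allocated and freed.

```cpp
#include <array>
#include <cstdint>
#include <memory>
#include <vector>

// Hypothetical sketch of the "allocator" idea: recycle fixed-size
// 10x10x10 sub-chunks from a free list instead of letting the
// allocator (or a GC) churn through large 3D arrays.
struct SubChunk {
    std::array<uint8_t, 10 * 10 * 10> cells{}; // zeroed = empty
};

class SubChunkPool {
public:
    std::unique_ptr<SubChunk> acquire() {
        if (free_.empty())
            return std::make_unique<SubChunk>(); // pool empty: allocate fresh
        auto chunk = std::move(free_.back());
        free_.pop_back();
        chunk->cells.fill(0);                    // hand back a clean sub-chunk
        return chunk;
    }
    void release(std::unique_ptr<SubChunk> chunk) {
        free_.push_back(std::move(chunk));       // recycle, don't deallocate
    }
    std::size_t available() const { return free_.size(); }
private:
    std::vector<std::unique_ptr<SubChunk>> free_;
};
```

The payoff is that steady-state gameplay does no heap allocation at all once the pool has warmed up; "allocation" becomes a pop from a vector.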

 _This episode is one of many which explain my present-day distaste for
memory-managed languages in game development._

If you push any programming environment far enough, you will always end up
"writing a buffer pool" or some other such memory management optimization task
like this. So are memory-managed languages all just "bad?" No. That's way too
simplistic. Memory-managed languages are just one particular set of tradeoffs.
You can think of it as a kind of "technical debt." Is debt always bad? No.
Sometimes it's the smart thing to do.

So where should the blame lie? Either with the person who chose the tool, or
just chalk it up to the "unforeseeable" and switch to a better tool.

(And this is why building modular, well-factored systems is often the smart
thing to do.)

(EDIT: It occurs to me that this story parallels the development of many
industries. Things start out "hacky" and cobbled, but then standard interfaces
are established so that components become "modular." Then, when people in that
industry learn more about precisely what people need and what works best,
modularization is dropped to build a leaner and better-performing widget.
Yes, modularity is sometimes the smart move -- but like all things, it is very
dependent on _context_!)

~~~
pekk
I think you're substantially agreeing with the author you've quoted, at least
as far as the quote. Expressing distaste for memory-managed languages in game
development and being able to cite a specific scenario for that is not the
same as saying that memory-managed languages are all just bad. I think you,
the author and anybody with a modicum of technical maturity can agree that
"fast" memory-managed languages are not at an absolute global optimum because
they make specific tradeoffs that not everyone will always want to make.
Granting this, if we have even one reason to still keep around a memory-
managed language (and that's not hard to come up with) then we have a case for
keeping multiple languages around and being able to interface among them.

Not everybody realizes this; especially among people who argue about
programming languages on web forums, a large number are still married to the
idea that one language has to do everything, and that if any language has any
weakness, we should leave it alone as "bad" because its flaws mean it is not
the Miss Universe of programming languages. It apparently takes some technical
maturity to realize that significant flaws are often part of complete packages
of tradeoffs that make something a great choice for some contexts even while
it is not the best choice for others.

~~~
m_mueller
Is there any cross platform language that does it like Objective C w/ ARC? I
find that one a pretty good balance between being able to go nitty gritty with
optimisations while not having to care about memory management details for the
standard use cases. IMO it's also much easier to learn than C++. Could Rust be
the answer?

~~~
yoklov
C++ std::shared_ptr does the same thing as ARC (reference counting based on
scope/lifetime). Rust probably has something similar in its standard library.

FWIW I wouldn't use reference counting in game development if I could avoid
it, because it has pretty bad performance characteristics as well (the only
things it really has over GC are that it's deterministic and that, except in
Objective-C, you can choose where and where not to use it).
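The deterministic-destruction point can be illustrated with std::shared_ptr. This is a minimal sketch (the `Texture` type is hypothetical): the object dies the instant the last owner leaves scope, with no GC pause, but every copy of the pointer pays an atomic refcount update.

```cpp
#include <cassert>
#include <memory>

// std::shared_ptr frees its target deterministically, the moment the
// last reference goes out of scope -- no GC pause, but every copy of
// the pointer costs an atomic refcount update.
struct Texture {
    explicit Texture(bool* alive) : alive_(alive) { *alive_ = true; }
    ~Texture() { *alive_ = false; }
    bool* alive_;
};

bool demo() {
    bool alive = false;
    {
        auto a = std::make_shared<Texture>(&alive);
        auto b = a;                    // second owner: refcount is now 2
        assert(a.use_count() == 2);
    }                                  // both owners gone -> destroyed here
    return alive;                      // false: freed at scope exit, not "later"
}
```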

~~~
maccard
Most games don't do any heap allocations at all during runtime (or at least
very, very few), so you shouldn't/don't end up with garbage collector pauses
during normal gameplay. If you refcount your main structures, they can be
auto-cleaned up/removed when changing levels, loading scenes, or swapping
chunks out on the fly.

------
louthy
Really good read, and takes me back to my younger days of feeling around in
the dark trying to work out 'the one true way'. A couple of things went
through my mind that I don't see mentioned, or that are kinda brushed over:

* You mention you originally rendered everything, then later moved to breaking into chunks?

* How do you decide what's 'in view', do you do any culling?

* You don't mention culling polygons that are pointing away from the camera?

Sorry if I missed those, but my thoughts were:

Use an octree [1] to register the scene blocks, you can then recursively
intersect the viewport with the tree to find out what's in-view. There's
perhaps even some cunning way you could 'rasterise' the blocks in the viewport
using Bresenham's algorithm (tracing the edges of the viewport) [2].
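A rough sketch of that recursive octree query (hypothetical names throughout; an axis-aligned box stands in for the real view frustum to keep it short — a real engine would test against the six frustum planes):

```cpp
#include <vector>

// Recursively intersect a view volume with an octree and collect only
// the blocks whose subtree overlaps it. Non-overlapping subtrees are
// culled wholesale without visiting their children.
struct AABB {
    float min[3], max[3];
    bool overlaps(const AABB& o) const {
        for (int i = 0; i < 3; ++i)
            if (max[i] < o.min[i] || o.max[i] < min[i]) return false;
        return true;
    }
};

struct OctreeNode {
    AABB bounds;
    std::vector<int> blockIds;        // blocks stored at this node
    std::vector<OctreeNode> children; // 0 or 8 children
};

void query(const OctreeNode& node, const AABB& view, std::vector<int>& out) {
    if (!node.bounds.overlaps(view)) return; // whole subtree culled
    out.insert(out.end(), node.blockIds.begin(), node.blockIds.end());
    for (const auto& child : node.children)
        query(child, view, out);
}
```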

In terms of the polygons facing away from the camera, you can get the
dot-product of the normal of the polygon and the vector from the camera. If
it's negative then it's pointing away, and if it's positive it's pointing
toward the camera (or vice versa, I can't remember). In a tight loop that can
be damn quick, and saves rendering something that can't be seen.
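A minimal sketch of that dot-product test (hypothetical names; assumes outward-facing normals and uses the camera-to-face vector, which pins down the sign convention left open above):

```cpp
// Back-face test: dot the polygon's normal with the vector from the
// camera to a point on the polygon. With outward-facing normals, a
// negative dot product means the face points toward the camera; a
// positive one means it faces away and can be skipped.
struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// true if the face can be culled (it points away from the camera)
bool backFacing(const Vec3& normal, const Vec3& cameraPos, const Vec3& pointOnFace) {
    Vec3 toFace{pointOnFace.x - cameraPos.x,
                pointOnFace.y - cameraPos.y,
                pointOnFace.z - cameraPos.z};
    return dot(normal, toFace) > 0.0f;
}
```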

Apologies if this is all obvious, and you're already doing it, or if it's all
handled automatically. It's been over 10 years since I've had to do any of
this stuff, so just dredging it from the depths of my memory ;)

[1] [http://en.wikipedia.org/wiki/Octree](http://en.wikipedia.org/wiki/Octree)

[2]
[http://en.wikipedia.org/wiki/Bresenham%27s_line_algorithm](http://en.wikipedia.org/wiki/Bresenham%27s_line_algorithm)

~~~
et1337
You pretty much nailed it. I store data in a bastardized octree, more or less.
I break the world into chunks to facilitate culling. I check the bounding box
of each chunk against the view frustum. There are generally under 100 chunks
in view at any one time, so the culling doesn't have to get fancy. And yes,
the GPU handles back face culling automatically these days. :) I do use
Bresenham's for raycasting.

~~~
louthy
Good stuff and good luck with the project :)

------
nkozyra
I read this yesterday and came away extremely impressed with your self-
analysis, documentation and the game's progress itself.

Is this entirely a one-man operation?

~~~
et1337
Glad you enjoyed it. More or less, yes. I've brought on a few contractors at
times to help with audio, animation, and a few other things.

~~~
sitkack
You should write a post on how to delegate.

~~~
Intermernet
Or get someone else to write one for him :-)

------
frik
He is not alone. I have followed several projects that try to create
Minecraft-like Voxel engines. Thanks for the article.

Minecraft was originally inspired by Infiniminer:
[http://thesiteformerlyknownas.zachtronicsindustries.com/?p=7...](http://thesiteformerlyknownas.zachtronicsindustries.com/?p=713)

It was coded in C# (.NET 2) and the XNA 3.0 runtime. The code was not
obfuscated, and someone published it. As Infiniminer was a multiplayer game,
hacks and bots destroyed the game community, as did several knock-off clients.
That everyone had to download .NET 2 and XNA 3 didn't help either. So
Minecraft, with its original Java browser applet, took the audience by storm.
The rest is history, and Notch just bought the most expensive house in L.A.

The history of DirectX support outside C/C++ is a sad story of deprecated APIs:

* Visual Basic 6 with DirectX 7 support: _Direct3D retained mode_ (COM-based scene graph API), _Direct3D immediate mode_ and DirectDraw (2D)

* Visual Basic 6 with DirectX 8 _Direct3D immediate mode_ (different API), no DirectDraw (2D)

* C# with managed Direct3D (Microsoft.DirectX.Direct3D), supports only DirectX 9: [http://www.riemers.net/eng/Tutorials/DirectX/Csharp/Series1/...](http://www.riemers.net/eng/Tutorials/DirectX/Csharp/Series1/tut1.php)

* C# with XNA 1-4 (DreamSpark/MSDNAA license), supports only DirectX 9: _Released in December 2006, XNA is intended to push the ease of game programming to the extreme. XNA is a new wrapper around native DirectX. As development on a new version of Managed DirectX has been cancelled, XNA can be thought of as the new version of Managed DirectX. Although the code is not 100% the same, it is VERY similar. No windows event handling, built-in update and drawing loops and XBOX360 compatibility are just some of the reasons why XNA will become the future of DirectX game programming. XNA is built on top of DirectX 9_ \-- [http://www.riemers.net/eng/Tutorials/xnacsharp.php](http://www.riemers.net/eng/Tutorials/xnacsharp.php)

Windows comes with OpenGL 1.1 (from 1996), and one has to init that context
just to load a third-party OpenGL 4 context:
[http://www.gamedev.net/page/resources/_/technical/opengl/mov...](http://www.gamedev.net/page/resources/_/technical/opengl/moving-beyond-opengl-11-for-windows-r1929)

Internet Explorer 11 supports WebGL 0.9 (almost no extensions, current would
be WebGL 2): [http://webglstats.com/](http://webglstats.com/)

~~~
dreen
> [http://webglstats.com](http://webglstats.com)

Neat. What happened in the summer of 2013?

~~~
ferongr
It's probably Chrome enabling the software renderer for users with
unsupported GPU configurations. Note that the software renderer is extremely
slow (it barely manages to run, and unsupported systems are more likely than
not to have a weak CPU that further lowers performance). We're talking
performance at a level barely enough to animate the cube at
[https://get.webgl.org/](https://get.webgl.org/) . I don't know the value of
that software renderer, other than making Chrome look better in the
statistics.

------
_random_
XNA is the single most important technology that effectively enabled the
indie movement. Shame they don't want to upgrade it.

[http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/3725445-xna-5](http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/3725445-xna-5)

[https://twitter.com/hashtag/becauseofxna](https://twitter.com/hashtag/becauseofxna)

~~~
cpeterso
More than Unity?

~~~
onemore360
More than Flash, Steam, or mobile app stores?

------
AceJohnny2
> The voxel format is simply a 3D array represented as an XML string of ASCII
> 1s and 0s.

Oh _ow_. But otherwise, thanks for this wonderful tale of discovery and
progress. I'm impressed that you stuck to it despite all such issues. My damn
perfectionism would have had me dump everything in a fit of angst at the first
sign of trouble.

------
gavanwoolery
In the first voxel engine I worked on, I also attempted RLE (for network
performance). Doing it in 3 dimensions is not trivial, as you show. I decided
for some reason that doing it in a single pass was the "smart" way to do it.
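A single-pass RLE over a flattened voxel array might look like this (a hypothetical C++ sketch, not anyone's actual engine code). Note that it only compresses runs along one axis of the linearized data, which hints at why true 3D RLE is so much trickier:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Walk the linearized 3D voxel data once, emitting (value, count) pairs.
std::vector<std::pair<uint8_t, uint32_t>> rleEncode(const std::vector<uint8_t>& voxels) {
    std::vector<std::pair<uint8_t, uint32_t>> runs;
    for (uint8_t v : voxels) {
        if (!runs.empty() && runs.back().first == v)
            ++runs.back().second;      // extend the current run
        else
            runs.push_back({v, 1});    // start a new run
    }
    return runs;
}

// Expand the (value, count) pairs back into the flat voxel array.
std::vector<uint8_t> rleDecode(const std::vector<std::pair<uint8_t, uint32_t>>& runs) {
    std::vector<uint8_t> out;
    for (auto [v, n] : runs)
        out.insert(out.end(), n, v);
    return out;
}
```

Since most chunks are mostly air, long runs of zeros dominate and even this naive version compresses well; the hard part is exploiting coherence across the other two axes too.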

Important thing is that you are having fun and learning. Your first attempt(s)
will usually not serve much more purpose than this.

Anyhow, Lemma looks promising (and pretty amazing for your first game to hit
shelves) - I was aware of the game, but it's interesting to see how it came
about. :)

------
explorigin
FTA: "At this point, I'm saving and loading levels via .NET's XML
serialization. Apparently XML is still a good idea in 2010. The voxel format
is simply a 3D array represented as an XML string of ASCII 1s and 0s. Every
time I load a level, I have to re-optimize the entire scene. I solve this by
storing the boxes themselves in the level data as a base64 encoded int array.
Much better."

Sarcasm...please tell me this is Sarcasm!

~~~
gknoy
I think this is rather a stream-of-consciousness style blog of the progress he
made -- note that later on he said, "I eventually got a job in industry, and
learned that everything I thought I knew was wrong".

------
divs1210
Brilliantly written. Loved all the shockingly bad decisions. :D

------
jokoon
Looks like an advertisement for C/C++.

I agree that some language features are nice to have, but you won't convince
me that C/C++ is bad because it's old, or because it's too down-to-earth. In
short: you can't replace simple tools like a hammer and screwdriver. You don't
always need them, but they're not replaceable, so I'd rather learn a
down-to-earth language and suffer the consequences of that. Having to manage
memory is the bread and butter of any programmer. I hate working around the
gimmicks of new languages and APIs just because the language pretends it's
doing something magical; there's always something that will come back and bite
you.

And as for the Minecraft argument: Minecraft is a huge PITA when it comes to
memory, and I've seen servers just burn because of memory management. Notch
buying a large house won't convince me to use Java or any other
memory-managed language. The argument "that guy got rich, so it's an okay
language" is not okay with me...

I'm very conservative when it comes to technology and programming languages. I
prefer using old tools, because if they're old, that means that they stayed.

~~~
CmonDev
_" simple hammer and screwdriver"_

To complete the analogy, I would say both also have a blade attached to the
handle, so that you need to use them v-e-e-ry carefully and only if truly
needed.

~~~
jokoon
Every programmer needs to learn how to do his job. If a programmer doesn't
like it, he can use something else, but he might not achieve the same result.
What kind of risk are you talking about? You won't kill anybody. If there's a
bug, just fix it. C and KISS are just not taught enough.

I find it pretty weird to see people refusing to use simple tools because
they're "too difficult". That's why I think Linus Torvalds is right about many
things. I just can't like the many diverse and abstract concepts of
programming that pop up now and then.

~~~
moron4hire
People don't dislike learning C or C++ because the languages are supposedly
hard to learn. They dislike them because the tools used to build and
distribute their work are archaic.

------
jere
I read this, all these tribulations over years, and then just chuckle as I
remember all the people who love to call Notch incompetent.

------
hedgehog
I wrote a voxel renderer in 2008 and found some good resources from Ken
Silverman (of the Build engine used in Duke3D etc). There used to be a forum
here:

[http://www.jonof.id.au/forum/](http://www.jonof.id.au/forum/)

I don't know if any of that stuff got mirrored but digging around for "Voxlap"
shows some related results.

------
millstone
> The CLR's floating point performance is absolutely abysmal on Xbox 360

Can anyone speculate as to what went wrong here? I thought the Xbox 360 would
have very good FP performance.

~~~
disky
XNA's Vector3 doesn't use the SIMD instructions.

Further, every

a += b; (where these are vectors)

calls a function, which calls a constructor.

You can tell by the significant speed-ups you get from inlining by hand.
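What that hand-inlining looks like, sketched in C++ terms (the actual issue described here was in the Xbox 360 CLR's handling of XNA's C# Vector3 struct; a native compiler would normally inline the operator version on its own, so this is only an illustration of the workaround's shape):

```cpp
// Two ways to add vectors: through an operator that builds a temporary,
// and the hand-inlined component-wise version that the 360's CLR
// reportedly needed to avoid per-operation call/constructor overhead.
struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const {   // "slow path": call + temporary
        return Vec3{x + o.x, y + o.y, z + o.z};
    }
};

void addInlined(Vec3& a, const Vec3& b) {   // the hand-inlined version
    a.x += b.x;
    a.y += b.y;
    a.z += b.z;
}
```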

------
swayvil
I relate. I so relate. We need some kind of support group.

------
galapago
Is Lemma going to target other platforms than Windows?

~~~
et1337
I'd love to, but no plans for it currently. I did contribute some code to
MonoGame[1] a while back which opens some exciting cross-platform
possibilities. It requires a lot more development before it will support Lemma
though. So, "maybe". :)

[1] [http://www.monogame.net/](http://www.monogame.net/)

------
guiomie
Interesting article.

I checked your repo; maybe use NuGet for packages such as Newtonsoft.Json.

------
dave_chenell
great writeup, thanks for sharing the progression

