
I'm not doing any tectonics. Sorry if I was unclear: I meant it would be an interesting challenge, but in my game it doesn't make sense to worry about, because you don't interact with the world at a scale where you'd notice how the mountains are laid out.

At a more fundamental level though, doing evolutionary-type stuff in a procedural world is a problem I'm wrestling with. I'm running an infinite world that's generated one chunk at a time, like Minecraft, so you can't, say, "start by generating all the rivers, and then...". Obviously this means everything has to be deterministic, but also it means that whatever macro-sized calculations I do need to be re-done for each chunk, or else cached somehow, etc.
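For concreteness, a minimal sketch of the kind of per-chunk determinism I mean, assuming each chunk's RNG is derived by hashing the world seed together with the chunk coordinates (the names and numbers are just illustrative, not my actual code):

    import hashlib
    import random

    CHUNK_SIZE = 16  # blocks per chunk side, illustrative

    def chunk_rng(world_seed: int, cx: int, cz: int) -> random.Random:
        """Deterministic RNG for chunk (cx, cz): any chunk can be (re)generated
        in isolation, in any order, and always come out identical."""
        digest = hashlib.sha256(f"{world_seed}:{cx}:{cz}".encode()).digest()
        return random.Random(int.from_bytes(digest[:8], "big"))

    def generate_chunk(world_seed: int, cx: int, cz: int):
        rng = chunk_rng(world_seed, cx, cz)
        # Purely local features are the easy case; the hard part is anything
        # whose extent crosses chunk boundaries (rivers, ranges, ecosystems).
        return [[rng.randint(60, 70) for _ in range(CHUNK_SIZE)]
                for _ in range(CHUNK_SIZE)]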

I've looked for articles on such things but not had much luck. Maybe others here have approaches they've tried?



I think you might be able to solve this with a certain level of laziness, at the cost of sometimes needing to generate more than just a single chunk.

I.e. once you encounter a river, you then generate the whole river, from source to sink (either a sea, or some other already-existing but as-yet-undiscovered river?).

You would then need to account for three states of your world: a) blank canvas, b) pre-generated but as-yet-unexplored terrain, c) already-explored terrain.

The world in state b) would then be the part that's both cached and available for re-calculation :-)
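A rough sketch of what I mean, in Python for concreteness; everything here (the source probability, the routing, the length cap) is made up just to show where the cache and the three states sit:

    import random

    CHUNK = 16               # blocks per chunk side (illustrative)
    MAX_RIVER_CHUNKS = 20    # by construction, no river runs further than this

    river_cache = {}         # (cx, cz) -> list of river cells; state b) lives here
    explored = set()         # chunks the player has actually visited; state c)
    traced_sources = set()   # sources whose full course has already been generated

    def trace_river(world_seed, x, z):
        """Stand-in for real routing: wander 'downstream' from a source, yielding
        the cells the river occupies until it reaches its sink."""
        rng = random.Random(hash((world_seed, x, z)))
        for _ in range(rng.randint(50, MAX_RIVER_CHUNKS * CHUNK)):
            yield (x, z)
            x += rng.choice((-1, 0, 1))     # meander sideways
            z += 1                          # fixed "downhill" direction

    def rivers_in_chunk(world_seed, cx, cz):
        """Lazily generate every river that could touch chunk (cx, cz)."""
        if (cx, cz) not in explored:
            # Any source close enough upstream could send a river through here,
            # so find and trace them all; the chunks they cross enter state b).
            for ux in range(cx - MAX_RIVER_CHUNKS, cx + MAX_RIVER_CHUNKS + 1):
                for uz in range(cz - MAX_RIVER_CHUNKS, cz + 1):
                    rng = random.Random(hash((world_seed, ux, uz)))
                    if rng.random() < 0.05 and (ux, uz) not in traced_sources:
                        traced_sources.add((ux, uz))
                        sx = ux * CHUNK + rng.randrange(CHUNK)
                        sz = uz * CHUNK + rng.randrange(CHUNK)
                        for (x, z) in trace_river(world_seed, sx, sz):
                            river_cache.setdefault((x // CHUNK, z // CHUNK), []).append((x, z))
            explored.add((cx, cz))          # state a)/b) -> state c)
        return river_cache.get((cx, cz), [])

The cap on river length is what keeps the lazy scan bounded; without some upper limit on how far a river can run, you'd never know which distant sources might reach the chunk you're standing in.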

If I were you I would read up on pen & paper sandbox designs, e.g. Stars Without Number [1], which relies on lots of random tables to generate a sector of space with different factions/tech levels/ecosystems, combined with rules for faction interaction. This gives it a nice "I am just a small ship in a big universe" feel :) You can generate an example sector at [2]

[1] http://www.drivethrurpg.com/product/86467/Stars-Without-Numb... [2] http://swn.emichron.com/
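The mechanism itself is simple enough to sketch without the book; the tables below are made up for illustration, not the actual SWN ones:

    import random

    # Made-up stand-ins for the book's tables.
    TECH_LEVELS = ["TL0 neolithic", "TL2 industrial", "TL4 spacefaring", "TL5 pretech"]
    ATMOSPHERES = ["breathable", "thin", "corrosive", "airless"]
    FACTION_TAGS = ["theocracy", "trade cartel", "rebel cell", "machine cult"]

    def generate_system(sector_seed, hx, hy):
        """Roll once on each table, deterministically per hex."""
        rng = random.Random(hash((sector_seed, hx, hy)))
        return {
            "tech": rng.choice(TECH_LEVELS),
            "atmosphere": rng.choice(ATMOSPHERES),
            "factions": rng.sample(FACTION_TAGS, k=rng.randint(1, 2)),
        }

    # An 8x10 hex sector, one system per hex.
    sector = {(hx, hy): generate_system(42, hx, hy)
              for hx in range(8) for hy in range(10)}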


Perhaps you could try a multi-resolution chunk system - LoD for chunks.

For large-scale features like rivers, generate those at coarse resolution on very large chunks. When generating the detailed chunks, take input from the coarser-resolution levels that overlap the same spatial region, and add fine-scale detail.

A couple of levels of detail might suffice in practice, but if you want to go crazy you can more or less make as many as you like: as long as each level is downscaled by at least some constant factor relative to the more detailed level below it, you'll only have O(log(N)) overhead for O(N) chunks, which is small even for very large N.

Compare mipmapping, which is more or less the opposite of this process.
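A minimal two-level sketch of this, assuming each coarse cell spans a fixed number of fine chunks and the fine pass just reads the one overlapping coarse cell; more levels would repeat the same pattern, with each level reading the level above it:

    import random

    CHUNK = 16          # fine chunk size in blocks
    COARSE_FACTOR = 8   # one coarse cell spans 8x8 fine chunks

    def coarse_cell(world_seed, ccx, ccz):
        """Coarse pass: cheap, deterministic macro features for a big region."""
        rng = random.Random(hash((world_seed, 1, ccx, ccz)))   # 1 = coarse level tag
        return {"base_height": rng.randint(40, 120),
                "has_river": rng.random() < 0.3}

    def fine_chunk(world_seed, cx, cz):
        """Fine pass: read the overlapping coarse cell and add local detail."""
        coarse = coarse_cell(world_seed, cx // COARSE_FACTOR, cz // COARSE_FACTOR)
        rng = random.Random(hash((world_seed, 0, cx, cz)))      # 0 = fine level tag
        base = coarse["base_height"]
        return [[base + rng.randint(-3, 3) for _ in range(CHUNK)]
                for _ in range(CHUNK)]

In practice the fine pass would probably interpolate between the few nearest coarse cells rather than reading just one, so heights don't jump at coarse-cell borders.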



