This is, however, storing 31 bits of generation counter per bit of state to make the ABA/linearization problem unlikely. Except it's using signed ints and doesn't account for overflow, so I think the behavior is going to get a bit less intended once the generation count hits 0x3FFFFFFF rather than 0x7FFFFFFF, since the state bit shifts the counter up into the sign bit; but I'm no expert on integer overflow in Java.
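If the counter really does live in the upper 31 bits of a signed int (an assumption on my part; the packing below is my own, not the repo's code), the breakpoint is easy to demonstrate: once the generation passes 0x3FFFFFFF, the packed value goes negative and signed comparisons on it break.

```java
// Hypothetical packing: generation in bits 1..31, cell state in bit 0.
public class PackedOverflow {
    static int pack(int generation, int state) {
        return (generation << 1) | state;
    }

    static int generationOf(int packed) {
        return packed >>> 1; // unsigned shift recovers the counter
    }

    public static void main(String[] args) {
        int last = pack(0x3FFFFFFF, 1); // = 0x7FFFFFFF, Integer.MAX_VALUE
        int next = pack(0x40000000, 1); // sign bit now set: negative as an int
        System.out.println(last);        // 2147483647
        System.out.println(next);        // -2147483647
        System.out.println(next > last); // false: signed comparison breaks
        System.out.println(generationOf(next) > generationOf(last)); // true
    }
}
```

Comparing the unpacked counters (or the packed ints with unsigned semantics) would survive one more bit, but unbounded wraparound still needs separate handling.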
The claim that this is unsynchronized feels dubious, however! (Lock-free, sure! Unsynchronized? Not so fast!) It looks like it doesn't have synchronization primitives, but it _does_ - it has atomic reads and writes of ints. (Thanks, Java!) So it never has to worry about reading partially updated data from a cell. If you were to try this in C, you'd potentially get piles of garbage as your thread is preempted partway through reading an int32 without a proper atomic intrinsic.
I agree that the current implementation won't survive integer overflow, Java or not. That's fixable, though. The algorithm's description would, in theory, still use unbounded integers.
I disagree on atomic reads/writes. If the algorithm can deal with totally wrong values, why would partially wrong values make any difference? Atomic reads/writes certainly help with error probabilities, but it's not clear to me to what extent. A C implementation would be just as robust; Java doesn't add any magic here. You'd need a totally different CPU with no guarantee of atomic 4-byte updates. Do those exist?
Would the queues from https://www.liblfds.org/ qualify?
> Being lock-free implies using Compare-And-Swap - a very powerful synchronization primitive.
Just look inside https://github.com/liblfds/liblfds7.1.1/blob/master/liblfds7...
It's full of synchronization stuff. By "unsynchronized" I mean NOT using that.
1) A parallel variant of the Game of Life where updates happen in some random order instead of simultaneously across the grid
2) An unsynchronized parallel implementation of the Game of Life which uses some fancy error correction to keep consistent (the title claims deterministic behaviour), or
3) Edit: Reading comprehension fail, there is no point 3. (Was: An attempt at (2) which doesn't yet work (the readme states "Due to the highly asynchronous nature of this implementation, the simulation's results may differ from the classic version for the same set of initial conditions." which implies non-determinism).)
It seems like a cool idea but I'm not sure which cool idea it is. :)
The "state" array stores the state of the cells in the life simulation in the lowest bit of each int in the array. The rest of the bits are used to store the count of the current generation for that cell.
Each thread can then examine cells independently and determine whether there's enough information in the cell's neighbours (taking generations into account) to update the cell's state (and increment the generation).
The trick is that even though the threads could be reading and writing the same cells at the same time, they will only ever write the same thing and so it doesn't matter.
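To make that concrete, here's a hypothetical single-threaded sketch (my own names and layout, not the repo's actual code). It's simplified: it only advances a cell when every neighbour is still at the same generation, which drops the real implementation's trick of also making use of already-advanced neighbours.

```java
// Hypothetical sketch of the packed-cell update. Bit 0 of each int is the
// cell's alive/dead state; bits 1..31 hold the generation that state is for.
public class LifeSketch {
    final int w, h;
    final int[] cells; // one packed int per cell, all start at generation 0

    LifeSketch(int w, int h) { this.w = w; this.h = h; cells = new int[w * h]; }

    static int gen(int packed)          { return packed >>> 1; }
    static int state(int packed)        { return packed & 1; }
    static int pack(int gen, int state) { return (gen << 1) | state; }

    // Try to advance one cell by a generation on a toroidal grid.
    // Simplified rule: only fires when every neighbour is still at this
    // cell's generation. Returns true if the cell was updated.
    boolean step(int x, int y) {
        int me = cells[y * w + x];
        int g = gen(me), alive = state(me), live = 0;
        for (int dy = -1; dy <= 1; dy++) {
            for (int dx = -1; dx <= 1; dx++) {
                if (dx == 0 && dy == 0) continue;
                int n = cells[((y + dy + h) % h) * w + (x + dx + w) % w];
                if (gen(n) != g) return false; // not enough information yet
                live += state(n);
            }
        }
        int next = (live == 3 || (alive == 1 && live == 2)) ? 1 : 0;
        // Any thread that reaches this point computes the identical value,
        // so a racing plain write is benign: both threads write the same int.
        cells[y * w + x] = pack(g + 1, next);
        return true;
    }
}
```

In the real thing, many threads would hammer something like step() on randomly chosen cells; the safety argument is exactly the one above - two threads that both pass the neighbour check write the same packed int, so the race doesn't matter.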
> Entropy is a programming language about giving up control. All data decays as the program runs: each value alters slightly every time it's used, becoming less precise. An Entropy programmer needs to abandon the pursuit of precision which most programming demands -- often working against years of habit -- in order to program effectively. Any output from an Entropy program will be approximate, and the more the data is accessed, the more random it will become. The programmer has, at best, a short window to get his/her idea across before the program corrodes. The program itself is not altered -- so the next time it runs, it is restored to its original condition, only to decay again through its next run.
EDIT: I guess it reminds me of the Entropy language because deterministic behaviour from chaos is about as opposite in spirit to giving up and embracing decay and entropy as you can get.
Which he has combined with the classic Eliza program to create (imo hilarious) Drunk Eliza: