
Imagine a beach. The beach is made of particles of different sizes, i.e. lengths. Now apply physics to this beach in the form of a line of "processors" that each grab one particle out of the array and release it onto the future beach, taking time equal to the length of the particle. There can be emptiness between particles.
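Roughly, in Python (my own sketch; the names and numbers are just there to make the analogy concrete):

    import heapq

    def simulate_beach(beach, num_processors):
        # Each processor grabs the next particle off the old beach and releases
        # it onto the future beach after length-many time units; None entries
        # are the emptiness between particles.
        processors = [(0, p) for p in range(num_processors)]  # (free_at, id)
        heapq.heapify(processors)
        future_beach = []
        for particle in beach:
            free_at, pid = heapq.heappop(processors)
            if particle is None:
                heapq.heappush(processors, (free_at, pid))  # emptiness costs nothing
                continue
            finish = free_at + particle      # processing time == particle length
            future_beach.append((finish, particle))
            heapq.heappush(processors, (finish, pid))
        # particles land on the future beach in the order they finish
        return [length for _, length in sorted(future_beach)]

    print(simulate_beach([5, 1, None, 3, 2, 8, 1], num_processors=2))

Short particles finish early and drift toward the front of the future beach, which is where the self-sorting below comes from.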

The first property of this is "entropy": the beach self-sorts by length, because processing time depends on particle size, so a particle sitting next to a differently sized one ends up separated from it on the future beach.

The second property is the "light-speed" of this physical simulation. It is the maximum particle length multiplied by the number of processors.
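As a worked example of that definition (arbitrary numbers):

    max_length = 8        # longest particle on the beach
    num_processors = 2
    light_speed = max_length * num_processors   # = 16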

The third property is causality: the range of the furthest-out particle reached within one light-speed length, multiplied by the light-speed again. Obviously causality can be reduced by reducing the light-speed.
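Reading that literally, causality works out to light-speed squared (again, arbitrary numbers):

    light_speed = 8 * 2                           # max particle length * processors
    causality_range = light_speed * light_speed   # = 256
    # halving the light-speed quarters the causality range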

The fourth property is non-interaction. If you add traversable, unmoving "free" particles into the scheduler queues, this warps "time": large particles are forced to jump over them, while small particles take the free particle's size in time to traverse it; being non-interacting, these particles are invisible to the huge ones.
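One way to read that rule, as a sketch (the size cutoff and names are mine, purely hypothetical):

    def traversal_cost(particle_length, free_particle_length):
        # Hypothetical rule: a small particle spends the free particle's size
        # in time walking through it; a huge particle jumps over it for free,
        # since the free particle is invisible to it.
        if particle_length <= free_particle_length:
            return free_particle_length
        return 0

    print(traversal_cost(1, 4))   # small particle: 4 time units
    print(traversal_cost(8, 4))   # huge particle: 0 time units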

The fifth property is determinism. The whole physics simulation is precise, and from a deterministic input we can derive a deterministic output, as long as causality remains unviolated.

Now for scheduling, the data-physics are rather simplistic, as the processor front, or "time", does not repeatedly interact with the same beach. We can also determine ahead of time which processor interacts with which particle, assuming the light-speed is not altered. Also note that a big particle being processed equals a reduced light-speed.
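That pre-computation is just a dry run of the same machinery (a sketch, assuming all lengths are known up front):

    import heapq

    def assign_processors(beach, num_processors):
        # Pre-compute which processor grabs which particle; valid as long as
        # the light-speed (the particle lengths and processor count) is not
        # altered mid-run.
        processors = [(0, p) for p in range(num_processors)]
        heapq.heapify(processors)
        assignment = []
        for index, length in enumerate(beach):
            free_at, pid = heapq.heappop(processors)
            assignment.append((index, pid))
            heapq.heappush(processors, (free_at + length, pid))
        return assignment

    print(assign_processors([5, 1, 3, 2, 8, 1], num_processors=2))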

Now what is optimal scheduling? Optimizing for throughput, as in shoving huge particles onto dedicated processors, surrounded by huge fields of non-interacting particles? Fair scheduling, as in having every particle take roughly average-particle-size time to go through? Preventing a causality buildup beyond the light-speed?
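For the throughput variant, the split itself is trivial (threshold is arbitrary; the interesting part is what it does to fairness and causality):

    def throughput_split(beach, size_threshold):
        # Shove huge particles toward dedicated processors, keep the rest together.
        huge = [p for p in beach if p >= size_threshold]
        small = [p for p in beach if p < size_threshold]
        return huge, small

    huge, small = throughput_split([5, 1, 3, 2, 8, 1], size_threshold=5)
    print(huge)    # [5, 8]  -> dedicated processors
    print(small)   # [1, 3, 2, 1]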

PS: I once wrote a little visualization of these data-physics and showed it around in algorithms class; nobody besides the prof ever cared for it. Nice to meet people who are fascinated by mind games.

PS^2: You can use similar properties to compress data, via Conway's Game of Life, into deterministic lock-step simulations. I never saw that used in the wild, as you need either pre-computed simulations to be fast, or some sort of relational operator over a pre-set of simple simulations.
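The lock-step part is just that each Game of Life generation is a pure function of the previous one, so a seed plus a step count can stand in for the expanded state. A minimal step in Python (a sketch, not a codec):

    from collections import Counter

    def life_step(live_cells):
        # One deterministic Game of Life generation over a set of (x, y) cells.
        neighbour_counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live_cells
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        return {
            cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)
        }

    state = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}   # a glider
    for _ in range(4):                                 # run the seed forward in lock-step
        state = life_step(state)
    print(sorted(state))                               # the same glider, shifted by (1, 1)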

PS^3: Sorry, got carried away. This is fascinating. Should upload the simulation to GitHub. Excuse any typos.




> Should upload the simulation to GitHub.

Sounds interesting. I'd fork it!



