

Entropy - justplay
http://andrew-hoyer.com/experiments/entropy/

======
maikklein
I see an evil DRM scheme for games: you buy a game, the textures decay over
time, and you have to buy the game again to reset the decay.

~~~
jfoutz
Back in the day, AutoCAD did that with point coordinates. No dongle, each save
got progressively worse.

~~~
keyle
Did you mean 3D Studio on DOS?

~~~
jfoutz
AutoDesk not AutoCAD! gah. I think you're right.

There was a collection of DRM stories I read a few years ago. People
(businesses, architects, professionals who should know better) calling
tech support over corrupted files was memorable (but apparently not memorable
enough). I imagine there was some grim joy in informing people that they were
not using a legitimate copy, and that was why their program was failing.

------
happywolf
This reminds me of the polymorphic viruses of the MS-DOS/Windows era (very
likely they still exist, but I haven't looked in a long time), which would
dynamically generate their decryption/encryption routines and change their
code appearance to thwart signature-based antivirus scanners. In that spirit,
maybe it would be fun to add self-modifying code so that 1) precision is
reduced and 2) performance is also randomly reduced. The longer you run this
baby, the slower and less precise it would be. ;)

~~~
iancarroll
> very likely they still exist

Sadly. They're one of the more complex malware types.

------
jpalomaki
Computers are precise, but sometimes I wonder: could they be made faster if
we relaxed this precision requirement?

For example, in 3D game graphics it might be acceptable to sacrifice some
precision for performance. It could be interesting to apply something like
"Entropy" to the graphics part of some existing game. Flappy Bird, maybe?
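
The precision-for-speed trade in game graphics has a famous concrete example:
the "fast inverse square root" trick from Quake III. A sketch of it in
JavaScript using typed arrays (the function name here is mine; the magic
constant is the well-known one):

```javascript
// Fast inverse square root: trades accuracy for speed by reinterpreting
// a float's bits as an integer to get a cheap initial guess, then doing
// a single Newton-Raphson refinement. Accurate to roughly 0.2%.
const buf = new ArrayBuffer(4);
const f32 = new Float32Array(buf);
const u32 = new Uint32Array(buf);

function fastInvSqrt(x) {
  f32[0] = x;
  u32[0] = 0x5f3759df - (u32[0] >>> 1); // magic-constant initial guess
  let y = f32[0];
  y = y * (1.5 - 0.5 * x * y * y);      // one Newton-Raphson step
  return y;
}

console.log(fastInvSqrt(4));   // ≈ 0.5, within about 0.2%
console.log(1 / Math.sqrt(4)); // 0.5 exactly
```

On modern hardware `Math.sqrt` is fast enough that the trick rarely pays off
anymore, but it is exactly the kind of deliberate imprecision being described.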

~~~
jacquesm
The speed/precision trade-off in games is well known and exploited to the max
by those that care.

------
ingenter
It would be interesting to write a program that works properly while using
only such "decaying" variables.

There are actually applications for such programs - a software replacement
for ECC memory.
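
A minimal sketch of the "software ECC" idea: keep three copies of every value
and majority-vote on each read. All names here are hypothetical, and the
corruption is simulated by hand rather than by the language:

```javascript
// A "voted" memory cell: stores three copies of a value and repairs
// itself by majority vote on every read. A software stand-in for ECC
// memory that can mask any single corrupted copy.
function votedCell(initial) {
  const copies = [initial, initial, initial];
  return {
    read() {
      // Majority vote: pick the value at least two copies agree on.
      const [a, b, c] = copies;
      const winner = (a === b || a === c) ? a : b;
      copies.fill(winner); // scrub: overwrite any corrupted copy
      return winner;
    },
    corrupt(i, v) { copies[i] = v; } // simulate a decayed copy
  };
}

const cell = votedCell(42);
cell.corrupt(1, 99);      // one copy decays...
console.log(cell.read()); // → 42 (the vote masks a single corruption)
```

A real implementation would vote at every use, not just on explicit reads,
which is why such schemes cost so much space and time.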

~~~
vanderZwan
I thought you were about to go full thermodynamics on the data and suggest
building an information-equivalent of the Stirling engine.

------
michael_nielsen
Reminds me of Netflix's "Chaos Monkey". Both Entropy and the Chaos Monkey
force you to deal with failure as an eventual certainty, not something to
ignore and hope it doesn't happen.

~~~
ams6110
AFAIK Chaos Monkey inserts random but non-permanent failures into the system,
not really entropy. Entropy would be the gradual loss of their movie data,
something I'm sure they work quite hard to prevent.

------
bodski
Wow, this is really appropriate for the day I've had: repeatedly failing to
get a brand new install of Bitcoin Core to index and sync with the
blockchain. I finally dusted off my Memtest86 and voila, a bad RAM stick!

I suppose crypto would fare pretty badly inside this language, or has anyone
thought about this kind of thing?

~~~
moyix
Well, Shannon & Moore showed [1] that you can make arbitrarily reliable
circuits out of unreliable relays, so I expect you could find a redundant
enough encoding for any program that would allow it to operate at high
precision, even in this language. It would almost certainly be larger and
slower than a more accurate equivalent though!

[1]
[http://mriedel.ece.umn.edu/wiki/images/3/30/Moore_Shannon_Re...](http://mriedel.ece.umn.edu/wiki/images/3/30/Moore_Shannon_Reliable_Circuits_Using_Less_Reliable_Relays.pdf)
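
The Moore-Shannon result can be illustrated numerically: if each "relay" is
wrong independently with probability p, a majority vote over n copies fails
with the tail of a binomial distribution, which shrinks as n grows. A small
sketch (the independent-failure model is an assumption for illustration):

```javascript
// Probability that a majority vote over n independent components gives
// the wrong answer, when each component is wrong with probability p:
// the binomial tail P(more than n/2 components wrong). Illustrates how
// redundancy buys reliability from unreliable parts.
function binomial(n, k) {
  let r = 1;
  for (let i = 1; i <= k; i++) r = (r * (n - i + 1)) / i;
  return r;
}

function majorityFailure(n, p) {
  let prob = 0;
  for (let k = Math.floor(n / 2) + 1; k <= n; k++) {
    prob += binomial(n, k) * p ** k * (1 - p) ** (n - k);
  }
  return prob;
}

// With p = 0.1, each extra layer of redundancy cuts the error rate:
console.log(majorityFailure(1, 0.1)); // 0.1
console.log(majorityFailure(3, 0.1)); // ≈ 0.028
console.log(majorityFailure(5, 0.1)); // ≈ 0.00856
```

Driving the failure rate arbitrarily low just takes a larger n, which is the
size/speed penalty mentioned above.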

------
bmh100
What is interesting about this is how it forces one to deal with a lack of
precision. Computers are inherently imprecise in many situations, leading to
non-commutative multiplication. I wonder if a language like this would be
helpful for programming with approximate processors. They are supposed to be
much faster at a cost of precision.

~~~
zwegner
Pedantic aside: I guess you mean non-associative multiplication? That is,
a×(b×c) versus (a×b)×c. Since rounding happens at different times, they can
give different results. Floating point multiplication is indeed non-
commutative as well, but that has to do with NaN handling, not imprecision.
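
The non-associativity is easy to demonstrate in JavaScript. Overflow and
underflow give unambiguous cases; rounding alone also produces differences,
just smaller ones:

```javascript
// Floating-point multiplication is not associative: the grouping changes
// which intermediate result overflows, underflows, or rounds.

// Overflow: the left grouping blows past Number.MAX_VALUE (~1.8e308).
console.log((1e308 * 4) * 0.25); // Infinity
console.log(1e308 * (4 * 0.25)); // 1e308

// Underflow: 1e-616 is below the smallest subnormal, so the left
// grouping rounds it to exactly 0 before the final multiply.
console.log((1e-308 * 1e-308) * 1e308); // 0
console.log(1e-308 * (1e-308 * 1e308)); // ≈ 1e-308
```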

~~~
bmh100
Yes, good catch! I did indeed mean non-associative multiplication based on the
timing of the rounding.

------
laluser
I love this idea, it's quite interesting. On the flip side, I made a change
in some software at work the other day to replace the use of randomness in a
unit test; the random values were being used to populate some data. This
language would make it incredibly difficult to recreate bugs, or even to
test software at all.
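
One common fix for that reproducibility problem: replace `Math.random()` in
tests with a small seeded generator, so a failing run can be replayed from
its logged seed. A sketch using the well-known mulberry32 algorithm:

```javascript
// mulberry32: a tiny seedable PRNG. Unlike Math.random(), the same seed
// always produces the same sequence, so a failing test run can be
// reproduced exactly by logging its seed.
function mulberry32(seed) {
  let s = seed | 0;
  return function () {
    s = (s + 0x6d2b79f5) | 0;
    let t = Math.imul(s ^ (s >>> 15), 1 | s);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296; // uniform in [0, 1)
  };
}

const runA = mulberry32(1234);
const runB = mulberry32(1234);
console.log(runA() === runB()); // true -- identical sequences from one seed
```

With Entropy-style decay the trick wouldn't help, of course, since the decay
itself is a second source of nondeterminism outside the test's control.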

~~~
bmh100
The situation you are suggesting might require dumps of as much state as can
be measured and a method for recreating the state of the application from that
dump. That might be a very large set of data if every variable can be altered.
Even execution paths and past variable values would need to be measured over
time.

------
cm127
Seems like a Monte Carlo simulation for adding noise to any JavaScript
variable. It'd be cool if you could apply filters to the noise.
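
Filtering the noise before injecting it can be sketched directly: generate
white noise, run it through a one-pole low-pass filter (an exponential
moving average), and add the smoothed result to a variable. All names here
are made up for illustration:

```javascript
// White noise passed through a one-pole low-pass filter (exponential
// moving average) before being injected into a value. alpha near 0 means
// heavy smoothing (slow, correlated drift); alpha = 1 means no filtering.
function lowPassNoise(alpha) {
  let state = 0;
  return function () {
    const white = Math.random() * 2 - 1;     // raw noise in [-1, 1)
    state = state + alpha * (white - state); // EMA smoothing
    return state;
  };
}

const noise = lowPassNoise(0.1);
let value = 100;
for (let i = 0; i < 50; i++) {
  value += noise() * 0.01; // inject small, smoothed noise each step
}
console.log(value); // drifts slightly around 100
```

Swapping the filter changes the character of the decay: low-pass gives slow
drift, high-pass would give jittery flicker.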

