
That comes from the second law of thermodynamics. Setting a bit to a known value reduces the computer's entropy, so the computer must dissipate enough heat to increase the entropy of the rest of the Universe by at least the same amount.

Notice that this only applies to setting the bit to a known value. In theory, some operations (the reversible ones) can be done without dissipating any heat.
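
For concreteness, here is a minimal sketch in Python of the bound this implies (usually called Landauer's principle), assuming room temperature of 300 K: setting one bit to a known value must dissipate at least k_B * T * ln(2) of heat.

    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed room temperature, K

    # Minimum heat dissipated to set one bit to a known value
    landauer_limit = K_B * T * math.log(2)
    print(f"{landauer_limit:.3e} J per bit")   # about 2.9e-21 J

That is many orders of magnitude below what any real hardware dissipates per bit, which is why the bound only matters as a theoretical floor.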




But how do they know how much energy a bit requires?

-----


As I said, that's by the second law of thermodynamics...

There is a nice <a href="http://en.wikipedia.org/wiki/Maxwell%27s_demon">thought experiment</a> linking thermodynamics to information theory. And guess what: experiments agree with the theory.

After you have the theoretical framework in place, it is just a matter of applying <a href="http://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula">Boltzmann's entropy formula</a>, and you are done.
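
To make that last step concrete, here is a rough sketch (Python; the constants and the 300 K temperature are my assumptions) of applying Boltzmann's formula S = k_B ln W: a bit in an unknown state has W = 2 microstates, after erasure W = 1, so its entropy drops by k_B ln 2 and the surroundings must gain at least that much.

    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K

    def boltzmann_entropy(num_states):
        # S = k_B * ln(W) for W equally likely microstates
        return K_B * math.log(num_states)

    delta_s = boltzmann_entropy(2) - boltzmann_entropy(1)   # entropy lost by the bit
    T = 300.0   # assumed temperature, K
    print(f"dS = {delta_s:.3e} J/K, minimum heat = {T * delta_s:.3e} J")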

-----


We know the minimum amount of energy required because we know exactly how much information a bit carries.
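
By way of illustration (a quick sketch, not from the parent comment): the Shannon entropy of a bit with two equally likely values is exactly 1 bit, or ln 2 nats, which is where the ln 2 in the minimum-energy bound comes from.

    import math

    def shannon_entropy_bits(probabilities):
        # H = -sum(p * log2(p)) in bits
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    h = shannon_entropy_bits([0.5, 0.5])   # an unknown bit
    print(h)                  # 1.0 bit
    print(h * math.log(2))    # the same amount in nats: ln(2) ~ 0.693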

-----


That is technology-dependent: mercury delay-line memory took more energy per bit than modern DRAM does.

-----



