
Koomey's law
https://en.wikipedia.org/wiki/Koomey%27s_law
======
beat
I find the idea of power limits fascinating. As computing becomes more
distributed, more detached, and more mobile, power, not computation, will
become the boundary.

It's not just the CPU... it's the screen and the antenna. A screen needs to
produce a certain number of lumens to be readable. That sets a theoretical
floor. An antenna needs to transmit a clean signal over a certain range. That
takes power, too. And the size of the device limits the size of the battery:
how many joules of energy can we store in that physical space?

We may be hitting limits for mobile devices harder and sooner than we think.
Of course, IoT devices can do better by not supporting a screen, but
communication is still a problem.

~~~
oh_sigh
Modern devices are actually pretty close (within a factor of 10) to the
theoretical energy requirement for producing light.

Computation, on the other hand, is still nowhere near the theoretical
limit[1]. We literally use more than 10,000,000x the theoretical minimum
energy to erase a bit of data.

[1]:
[https://en.wikipedia.org/wiki/Landauer%27s_principle](https://en.wikipedia.org/wiki/Landauer%27s_principle)
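
As a rough sanity check on that factor (a minimal sketch; the ~30 fJ per bit
operation is my own ballpark assumption for modern logic, not a figure from
the article):

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer_j = k_B * T * math.log(2)  # ~2.87e-21 J per bit

# Assumed energy per bit operation in modern logic (tens of fJ);
# a ballpark, not a measured figure.
modern_j = 3e-14

print(f"Landauer limit:   {landauer_j:.2e} J/bit")
print(f"Modern (assumed): {modern_j:.2e} J/bit")
print(f"Ratio: {modern_j / landauer_j:.1e}x")  # ~1e7, matching the claim
```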

~~~
tedsanders
Nothing against your specific comment, but in general I wish that the Landauer
limit had less cultural importance and salience in discussions of computation
& energy. It's a simple result, but the assumptions leading to it are quite
subtle. First, it relies on the widely misinterpreted second law of
thermodynamics and second, it relies on the subtle notion of 'erasure'. In
loose analogy, it reminds me of quantum mechanics - "what's the definition of
a measurement?" "It's anything that causes the system to act like it was
measured." "What's the definition of information erasure?" "It's anything that
causes the system to obey the Landauer limit." In any case, this limit is so
far away that it's practically pure theory (though experiments purport to
demonstrate it), and since the limit can be exceeded in theory, it doesn't
seem like a useful principle to guide us.

------
amelius
Unfortunately, thanks to software bloat, we perform more than double the
number of computations needed.

~~~
seanp2k2
I look at things like "how long does it take to enter 100 values into a
spreadsheet?" Back in the DOS days, it was as fast as you could type. With
Google Sheets, there is sometimes lag. We have many orders of magnitude more
computing power, but how much have we improved the actual experience and
productivity?

~~~
AnimalMuppet
But that's network lag, not computational inefficiency. Compare either a
spreadsheet program running on a mainframe around 1990 to Google Sheets, or a
DOS spreadsheet to a spreadsheet running locally on Windows today.

~~~
SilasX
There are many causes of lag. The parent was referring to the case of "I'm
typing a bunch of data into a bunch of cells". Network connectivity should
_not_ (necessarily) cause lag in that case: it should let the client side
update without interruption, and then pass data over to the server as it is
able.

If network syncing is interfering with user text input, something is wrong
with that design.

Edit: The historical analogy would be as if you were entering data (offline)
into your Lotus cells, but your computer kept freezing up so the modem could
reconnect. Somehow they kept those decoupled back then!
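
For what it's worth, here is a minimal sketch of that decoupling (hypothetical
names; real spreadsheet clients are far more involved): edits apply to local
state immediately, while a background task drains a queue to the server
whenever it can.

```python
import asyncio

class Spreadsheet:
    """Local-first editing: typing never waits on the network."""

    def __init__(self):
        self.cells = {}                 # local state, updated instantly
        self.pending = asyncio.Queue()  # edits waiting to be synced

    def set_cell(self, ref, value):
        self.cells[ref] = value         # apply locally, no await
        self.pending.put_nowait((ref, value))

    async def sync_worker(self):
        # Runs independently; a slow server delays syncing, not typing.
        while True:
            ref, value = await self.pending.get()
            await self.push_to_server(ref, value)

    async def push_to_server(self, ref, value):
        await asyncio.sleep(0.5)        # stand-in for a slow network call
```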

------
sfeng
I found this linked topic even more interesting:
[https://en.wikipedia.org/wiki/Landauer%27s_principle](https://en.wikipedia.org/wiki/Landauer%27s_principle)

~~~
205guy
Indeed, the idea of information (bits) as entropy is fascinating to me too.
And not just Shannon entropy; the different notions are all related somehow.
See
[https://en.wikipedia.org/wiki/Bekenstein_bound](https://en.wikipedia.org/wiki/Bekenstein_bound)

And then there is the fact that life encodes and persists information. There
was a recent HN thread that got me interested:

[https://news.ycombinator.com/item?id=13496133](https://news.ycombinator.com/item?id=13496133)

I did read Gleick's "The Information" but was disappointed that it didn't dig
very deep into the concept. I got further by following links on Wikipedia.
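
To make the Bekenstein bound concrete, here's a quick back-of-the-envelope
computation (a sketch; the 1 kg / 5 cm example object is my own choice). The
bound caps the information in a region by its radius and total energy:
I <= 2*pi*R*E / (hbar * c * ln 2).

```python
import math

# Bekenstein bound: maximum bits storable in a sphere of radius R
# containing total energy E (including rest mass, E = m * c^2).
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_bits(radius_m, mass_kg):
    energy = mass_kg * c**2
    return 2 * math.pi * radius_m * energy / (hbar * c * math.log(2))

# Example: 1 kg packed into a 5 cm sphere -- on the order of 1e42 bits.
print(f"{bekenstein_bits(0.05, 1.0):.2e} bits")
```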

------
abecedarius
The part at the end about the Margolus-Levitin theorem makes no sense to me --
that theorem gives a speed limit, while Koomey's trend is about energy use.
Can anyone fill in the citation-needed?

------
jwtadvice
This is incredible. I followed this link down to the Margolus-Levitin
theorem, which puts a physical limit on how much computation (entropy?) you
can produce with a single joule of energy (about 6 x 10^33 operations per
second per joule).

It seems that at some point the unit for powering computers would then have to
be matter, since there's such an incredible amount of energy stored per
newton.
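
For reference, the Margolus-Levitin bound comes from ops/sec <= 2E / (pi *
hbar); a quick check of the 6 x 10^33 figure (a minimal sketch):

```python
import math

# Margolus-Levitin: a system with average energy E can pass through
# at most 2E / (pi * hbar) orthogonal states per second.
hbar = 1.054571817e-34  # reduced Planck constant, J*s

E = 1.0  # one joule
ops_per_sec = 2 * E / (math.pi * hbar)
print(f"{ops_per_sec:.1e} ops/s per joule")  # ~6.0e33
```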

~~~
andrepd
>It seems that at some point the unit for powering computers would then have
to be matter, since there's such an incredible amount of energy stored per
newton.

I'm not sure what you're trying to say, but that sentence doesn't make sense
(a newton is a unit of force, not of energy or mass).

------
jostmey
Has this law still held over the past few years, now that Intel has been
unable to transition to 10nm?

~~~
gwern
A version, perhaps, at least for GPUs - you can tell because we've gotten much
faster new chips like the Pascal GPUs, but no one has needed to run new power
lines to their rooms in order to use them.

~~~
googsh0tz
That's only because it's cheaper to build a whole new facility.

~~~
gwern
Uh... people aren't 'building whole new facilities' for their gaming desktops.
The new GPUs also have the same rated power consumption.

------
mrfusion
It brings up reversible computing as a way of not requiring any energy.

What would a reversible computer look like? Could I run Firefox on it?
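
Not Firefox, but here's a minimal sketch of the basic building block
(illustrative Python, not a physical implementation). A reversible computer
is built from bijective gates such as the Toffoli (controlled-controlled-NOT);
because no input information is ever destroyed, no Landauer erasure cost is
incurred. Ordinary logic like AND can be embedded by carrying extra bits
along:

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flips c iff a and b are both 1.
    It's a bijection on 3 bits, hence reversible and its own inverse."""
    return a, b, c ^ (a & b)

# AND embedded reversibly: with c=0, the third output bit is a AND b.
# The inputs a and b are carried through instead of being erased.
def reversible_and(a, b):
    return toffoli(a, b, 0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", reversible_and(a, b))

# Applying the gate twice returns the original bits (self-inverse):
assert toffoli(*toffoli(1, 1, 0)) == (1, 1, 0)
```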

------
mkagenius
It ends in 2048 (maybe before) -- so why call it a "law"? Probably it's
easier to remember with a name, I guess.
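
The ~2048 figure follows from simple arithmetic (a sketch; it assumes the
~1.57-year doubling period and that circa-2011 logic sat roughly 10^7x above
the Landauer limit):

```python
import math

doubling_years = 1.57  # Koomey's doubling period for computations/joule
gap = 1e7              # assumed headroom above the Landauer limit (~2011)

doublings = math.log2(gap)          # ~23.3 doublings to close the gap
years = doublings * doubling_years  # ~36.5 years
print(f"Limit reached around {2011 + years:.0f}")  # ~2048
```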

~~~
popey456963
The article ends by saying "By the theorem, Koomey's law has the potential to
be valid for about 125 years". One imagines it could continue long after even
that, however, if further modes of computation are discovered.

~~~
tgb
Reversible computation (which is what that 125-year number refers to) has to
be done very slowly. There might be some applications where that would still
be relevant, but it's pretty clear this trend won't last even until the 2048
mark, let alone for another 125 years and beyond.

The Feynman Lectures on Computation have some good sections on reversible
computation that I would recommend.

------
rsuelzer
Yet my phone still won't hold a charge for more than three hours.

------
quakeguy
A doubling period of pi/2 years - I'm trying to make sense of that fact.

~~~
mistercow
It's a coincidence.

~~~
quakeguy
Of course, it must be. My theories explaining this were mind-boggling, at
least.

------
hartator
I don't think it's fair game if we're going to coin a new "law" every time
progress stalls in our industry.

------
btreesOfSpring
It looks like Koomey's law will outlive the year 2038 problem[0].

0:
[https://en.wikipedia.org/wiki/Year_2038_problem](https://en.wikipedia.org/wiki/Year_2038_problem)

~~~
libeclipse
What do you mean by this comment? I don't see how the two are linked.

~~~
btreesOfSpring
They aren't directly linked. I just like the idea of cataloging all of the
time-bounded limits in CS. Reading the wiki entry, and having forgotten the
exact date when Unix time comes to an end (though knowing it is close), I
figured others might enjoy seeing the 2038 date too. Based upon the voting, I
was wrong.

------
dbcurtis
Good news for robots.

