
Lindy effect - bushido
https://en.wikipedia.org/wiki/Lindy_effect
======
thomasahle
I once studied the expected remaining length of a game of chess, as a function
of moves played: [https://chess.stackexchange.com/questions/2506/what-is-
the-a...](https://chess.stackexchange.com/questions/2506/what-is-the-average-
length-of-a-game-of-chess/4899#4899) What was interesting is that for the
first 20 moves (40 half-moves) the expected remaining length decreases at a
near-linear rate, but then it levels off at about 25 remaining moves, and
after move 45 every move played _increases_ the expected remaining length.

At the time it surprised me, but of course it is natural to expect long games
to be long.

~~~
sevenfive
The threshold at 45 probably corresponds to endgames where the kings have to
walk around the board to take care of pawns.

~~~
stouset
Not only that, but endgames frequently involve long periods of positional
maneuvering that can take dozens of moves before one side realizes an edge, or
before it becomes clear that it's heading toward a draw.

~~~
buzzybee
The longest possible chess game:

[https://www.chess.com/blog/kurtgodden/the-longest-
possible-c...](https://www.chess.com/blog/kurtgodden/the-longest-possible-
chess-game)

Chess AIs, perhaps needless to say, are very good at computing at the depth
necessary to win drawn-out endgames.

------
connoredel
The key insight is that you are unlikely to be experiencing the thing at a
special time in its life. This is the Copernican principle (which J. Richard
Gott applies in the version of the argument Wikipedia mentions), which says,
roughly: "we (on Earth) are unlikely to occupy a special place in the solar
system -- it's much more likely that some other object is the center."

Gott says you can be 95% confident that you're experiencing the thing in the
middle 95% of its life. Let's say x is its life so far. If x is 2.5% of its
eventual life (one extreme of the middle 95%), then the thing still has 39x to
go. If you're at 97.5% (the other extreme), then the thing only has x/39 left.
So the 95% confidence interval is between x/39 and 39x.
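The arithmetic above can be sketched in a few lines (the function name and the 10-year example are mine, purely illustrative):

```python
# Gott's delta-t argument: assume the observed age x falls in the middle
# `confidence` fraction of the thing's total life T. If x is a fraction f
# of T, the remaining lifetime is T - x = x * (1 - f) / f.
def gott_interval(x, confidence=0.95):
    tail = (1 - confidence) / 2       # probability mass cut off at each end
    low = x * tail / (1 - tail)       # f = 0.975 -> x/39 remaining
    high = x * (1 - tail) / tail      # f = 0.025 -> 39x remaining
    return low, high

low, high = gott_interval(10)         # something observed to be 10 years old
print(low, high)                      # roughly 0.26 .. 390 more years
```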

Of course, 5% of the time you actually are experiencing something at the very
beginning or very end of its life (outside the middle 95%), which is a unique
thing. But that's why it's a confidence interval < 100% :)

I much prefer this form of the principle to "the expected life is equal to
2X, always."

Side note: I took J. Richard Gott's class in college called The Universe.
Maybe not the best use of a credit in hindsight, but we studied some really
interesting things like this.

~~~
tgb
And the real fun is when you apply this to humanity itself:
[https://en.m.wikipedia.org/wiki/Doomsday_argument](https://en.m.wikipedia.org/wiki/Doomsday_argument)

~~~
simonh
Lots of interesting stuff in there. The problem I have with naive versions of
this is that they assume that, as random people, we don't live at a special
time in human history; but if you look at human history so far, the current
era is both extremely short and spectacularly atypical in almost every
conceivable way. It is also a period of still very rapid change. It's hard to
get my head around what that means for estimating future trends or outcomes.

~~~
indigochill
Funny thing with exponential curves: no matter where on the curve you are,
everyone behind you seems mind-numbingly slow and everyone ahead seems mind-
bogglingly fast.

~~~
grkvlt
This always confused me - people talk about an exponential explosion, but the
rate of change of e^x is e^x, so there is no actual 'knee' in the curve with a
huge speedup afterwards...

~~~
kenbellows
or, if you prefer, _every_ point is a 'knee' in the curve with a huge speedup
afterward

~~~
simonh
Except in reality, when samples are infrequent and there is significant
'noise' in the data, it can be far from clear what shape the graph is. This
is particularly true in the early stages of a trend's development.
How do you know you're in the early stages? You don't, but it's a big mistake
to think that trends which eventually turn out to be exponential must
therefore be obviously so at all times.

------
dmurray
There's a similar effect when waiting for a bus or other public transport. At
first, the expected time you'll have to wait decreases as time goes
by: if there's a bus every 10 minutes, after waiting 8 minutes you expect one
to arrive in 1 minute, compared to 5 when you started waiting. Stand there
longer without a bus arriving, however, and the Lindy effect starts to apply.
After 15 minutes without a bus, most likely the bus broke down, but you can
expect another within 5 minutes. After 30 minutes, well, maybe the drivers are
on strike today or this bus route got cancelled or you misremembered the
frequency of the bus - either way, expect to keep waiting.
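This story can be checked with a toy simulation (the 2% disruption rate and the wait distributions are made-up numbers, chosen only to show the dip-then-rise):

```python
import random

random.seed(0)

# Most days a bus arrives within 10 minutes; on a made-up 2% of days
# there is a disruption and the wait is 20-120 minutes instead.
waits = []
for _ in range(200_000):
    if random.random() < 0.98:
        waits.append(random.uniform(0, 10))     # normal service
    else:
        waits.append(random.uniform(20, 120))   # breakdown / strike

def expected_remaining(t):
    """Mean additional wait, given we have already waited t minutes."""
    survivors = [w - t for w in waits if w > t]
    return sum(survivors) / len(survivors)

# The expected remaining wait first shrinks, then grows once the
# "disruption" explanation starts to dominate.
for t in [0, 5, 9, 15]:
    print(f"waited {t:2d} min -> expect about {expected_remaining(t):.1f} more")
```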

Anyone know of a term for this kind of behaviour? I've never seen it named,
though I do recall an article that made the HN front page that demonstrated
this effect with the New York subway.

~~~
gwern
The Hope Function?
[https://www.gwern.net/docs/statistics/1994-falk](https://www.gwern.net/docs/statistics/1994-falk)

~~~
dmurray
Yes, this is it exactly! Even the framing of it as waiting for a bus is given
as one of the examples, "Problem 2, The Standard Wait Problem" but the other
examples also have similar behaviour.

------
franciscop
I was thinking about what this had to do with HN and then it hit me:
JavaScript is going to live forever; and C will outlast it.

We can also see why it's really difficult to compete with early [surviving]
frameworks, since they will last for a really long time.

~~~
PKop
Bitcoin.

The longer it continues increasing adoption and users (like any network
effect) the more useful it will become to more people. Also, the higher the
market cap, the more people will be invested in its success. The longer it
continues to work as designed, the more people will trust it.

~~~
simonh
Apple, Microsoft and Intel have dominated the personal computer industry for
almost all of its existence, and their relative positions within that market
have been remarkably stable. Therefore as every year passes with that still
true, the expected future lifetime of that triopoly increases.

~~~
franciscop
I am not sure companies follow this as they can go bankrupt and be gone in a
matter of months.

------
srean
Given Wikipedia's standards I am a little surprised that the article is light
on the math. One tool to measure this with is the hazard rate.

Say at age x my probability density of dying at that instant is _f(x)_. Now
we condition on the obvious fact that I must have lived at least x before
dying (counting from 0). The distribution function _F(x)_ is the probability
that I die before x, so the conditional probability density of dying at this
instant is

    
    
        h(x) = f(x) / (1- F(x))
    

If this quantity is constant (in other words independent of _x_ ) then I am
Peter Pan. I don't age. I will die by some random accident that has no
preference over time.

If _h(x)_ is an increasing function of _x_ then I am more human: I age.

If it's a decreasing function of _x_ I am probably the Joker, "... makes me
stronger". Whenever _h(x)_ is a decreasing function of _x_ one encounters the
Lindy effect.

The Pareto distribution is called out in the article, but _anything that has
a fatter tail than the exponential distribution_ will suffice. The lifetime
of a database query, a search engine request, etc. likely all fall under this
category. In such cases it is on us engineers to try to make those latencies
have an increasing hazard function.
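The contrast is easy to compute directly; a small sketch of h(x) for the memoryless exponential versus a Pareto tail (parameters chosen arbitrarily):

```python
import math

# Hazard rate h(x) = f(x) / (1 - F(x)).

def hazard_exponential(x, lam=1.0):
    # f(x) = lam * exp(-lam x) and 1 - F(x) = exp(-lam x),
    # so the ratio is just lam: constant hazard, Peter Pan.
    return (lam * math.exp(-lam * x)) / math.exp(-lam * x)

def hazard_pareto(x, alpha=2.0, xm=1.0):
    # For x >= xm: f(x) = alpha * xm^alpha / x^(alpha + 1) and the
    # survival function 1 - F(x) = (xm / x)^alpha, so h(x) = alpha / x:
    # decreasing in x, i.e. the Lindy regime.
    f = alpha * xm**alpha / x**(alpha + 1)
    survival = (xm / x)**alpha
    return f / survival

for x in [1, 2, 5, 10]:
    print(x, hazard_exponential(x), hazard_pareto(x))
```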

~~~
silverdrake11
Don't forget you can edit Wikipedia!

------
firebones
I teach this as part of an internal developer class; one important thing to
note is that it applies to certain classes of non-perishable items. Not the
books themselves, but the ideas the books contain. We talk about how things
like the presentation framework du jour (e.g., many JavaScript frameworks)
change rapidly, while tech deeper in the stack (middleware, operating
systems, etc.) turns over less frequently. And we ask why some tech survives.

The lesson here is that things that last have developed certain adaptations to
make them last. It's _always_ worth studying why some oft-repudiated or
outdated tech won't die; it is almost always because it possesses some key
attribute that is essential. If you're proposing a new framework, or
promoting a new idea, it is essential you understand why these crufty old
incumbents are still around, and see whether your new framework or idea
embodies those old adaptations.

I've learned a lot about how flashy surface features (which compete well
against new tech at the surface level) can be inferior to tech that embodies
what the incumbents did well.

------
sna1l
Nassim Taleb loves the Lindy effect -- [https://medium.com/incerto/an-expert-
called-lindy-fdb30f146e...](https://medium.com/incerto/an-expert-called-lindy-
fdb30f146eaf)

------
40acres
What's that term for when you learn of a new concept or word and then
immediately see it referenced soon after? In rereading Zero to One by Peter
Thiel I came across the Lindy effect, and now I've come across the wiki page
on HN.

~~~
whytaka
Baader-Meinhof.

~~~
wastedhours
[https://www.damninteresting.com/the-baader-meinhof-
phenomeno...](https://www.damninteresting.com/the-baader-meinhof-phenomenon/)

------
paulvs
I love it when I find an article about something that has crossed the edge of
my mind, but which I've never given proper thought to.

------
mdonahoe
I once heard a similar description of the life expectancies of cancer
patients, new motorcyclists, and hard drives. There is a high initial
mortality rate, but once you get past the hump your expectancy increases with
every day, up to a limit.

Does anyone know if there's a different name for those distributions?

~~~
cromd
"Bathtub mortality" perhaps?
[https://en.wikipedia.org/wiki/Bathtub_curve](https://en.wikipedia.org/wiki/Bathtub_curve)

~~~
firebones
Weibull distribution [1]. That's why we used to run our mail-order PCs
non-stop for a week to burn them in: if there were faulty components, we
wanted them to fail within the 30-day return policy.

[1]
[https://en.wikipedia.org/wiki/Weibull_distribution](https://en.wikipedia.org/wiki/Weibull_distribution)
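The Weibull family covers all three regimes in one formula; its shape parameter k decides whether the hazard falls (burn-in), stays flat, or rises (wear-out). A quick sketch with arbitrary parameters:

```python
# Weibull hazard: h(x) = (k / lam) * (x / lam)**(k - 1)
#   k < 1: decreasing hazard -> infant mortality / burn-in regime
#   k = 1: constant hazard   -> memoryless (exponential)
#   k > 1: increasing hazard -> wear-out
def weibull_hazard(x, k, lam=1.0):
    return (k / lam) * (x / lam) ** (k - 1)

for k in [0.5, 1.0, 2.0]:
    print(k, [round(weibull_hazard(x, k), 3) for x in (0.5, 1.0, 2.0)])
```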

~~~
jventura
I'm wondering whether one should generalize that idea to other things we buy,
to make sure that everything we buy has no faulty components. For instance,
if we buy a new car, should we stress it to check whether something breaks
while still within the legal guarantee period? Of course, one should not
stress it so much that it wears out prematurely (as at the end of the bathtub
curve).

Do you know of any literature, articles, etc. about this? I'm asking more
from a consumer's point of view.

~~~
cr0sh
> For instance, if we buy a new car, should we stress it to check whether
> something breaks while still within the legal guarantee period? Of course,
> one should not stress it so much that it wears out prematurely (as at the
> end of the bathtub curve).

If it is a brand new car (and depending on the kind of car, too) - you
probably don't want to do this, at least immediately.

This is because the engine has not been run much since it left the factory.
There is an engine "break in" period (you can read about it in your owner's
manual) during which you need to follow the instructions properly, or you can
actually cause damage to the engine and decrease its life dramatically.

Essentially, it involves not running the engine under extreme loads or speeds
for so many miles (500 - 1000 I think is normal). After that, you might also
do an oil change and some other early maintenance. For certain sports cars or
other high-performance vehicles, it can be even more strict.

It basically has to do with the frictional components wearing into each
other, with the lubrication carrying away (and the filter capturing) the
small bits of metal scraped off. Despite the fine tolerances engines,
transmissions, etc. are machined to, the parts aren't exactly matched, and
the break-in period allows for this (then you change the fluids and filters
to remove the contaminants). Running the engine hard during this period puts
higher friction and stress on the system, which causes more metal than normal
to be removed (essentially wearing the engine in more than needed, if that
makes sense).

This is also basically the same thing you have to do when you get an engine
overhauled or otherwise modified (i.e. new pistons, rings, porting, honing,
etc.).

------
hn_throwaway_99
Anyone living in Austin, Texas who has seen the disaster of the Mopac Highway
"Improvement" Project sees this effect first hand. Construction started in
late 2013 and was originally scheduled to finish in September 2015; actual
completion has been "3 to 6 months away" ever since.

~~~
domoritz
I present to you
[https://en.wikipedia.org/wiki/Berlin_Brandenburg_Airport](https://en.wikipedia.org/wiki/Berlin_Brandenburg_Airport):
scheduled to open in 2010, the current estimate is 2019. The stories around
its delay are sometimes comical.

------
shanusmagnus
I'm trying to figure out whether this can live peacefully with the Red Queen
Hypothesis:

[https://en.wikipedia.org/wiki/Red_Queen_hypothesis](https://en.wikipedia.org/wiki/Red_Queen_hypothesis)

Basically, from Van Valen's data, species have a constant chance of going
extinct, regardless of how long they've been around. His hypothesis is that
even though speciation appears to be a discrete event, these species are
constantly jockeying with each other for survival in a dynamic environment --
they have to run faster and faster to stay in the same place, and are always
subject to falling out of the race, as it were.

The devil's in the details, of course, but it's intriguing to jam these two
ideas together.

------
juskrey
This is my main rule of thumb when choosing technology stacks for my clients
(I may choose new, experimental ones only for hobbies/experiments). And it's
basically the reason I code in Lisp (Clojure).

------
no_gravity
It probably holds true for startups. The longer the startup has existed, the
more has been built: code, userbase... So one would expect it to last longer.

I wonder if it also holds true for profitable companies. That would mean that
- all other things equal - a company 10 years old is worth as much as a
company that is one year old but makes 10x the profits.

~~~
erikpukinskis
I would expect a bimodal distribution here. The way most startups write code,
per-developer velocity decreases as the lines of code increase. You can
throw more developers at it, up to a point, but eventually the codebase
becomes worthless and all the value is in the development team's heads.

There is kind of an escape velocity then... Some teams will be able to
refactor or replace their way out of this, back to a point where more
developer hours equals more features.

Other companies will fail to achieve escape velocity, and the dev team will
expand to use all available resources while the application becomes slowly
worse relative to competitors.

That process _can_ be stretched out, and money made while the velocity nears
zero, but the company will eventually die.

Whether your codebase and development team is an asset or a liability depends
on whether you have that escape velocity.

------
jcfrei
I had this same idea for quite a while, and it's nice to see it has a proper
name. I always wondered whether it applies to jobs -- i.e. jobs like chef or
bartender will be around for a long, long time, while programmer probably
won't.

------
fvdessen
I wonder if this applies to human relationships as well; friendship, love,
employment, etc.

------
Confusion
Now imagine what the Lindy effect implies when technology becomes available
that undoes the damage caused by the mechanisms called 'aging'.

