
Skin in the Game as a Required Heuristic for Acting Under Uncertainty - breck
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2298292
======
lhnz
I wrote a blog article about requiring skin-in-the-game the other day. My food
for thought is:

Small transactional costs stop things from existing at the fetal/growth stage.
Evolution destroys them long before their long-term benefits can demonstrate
themselves, because it favors short-term adaptive traits over long-term ones.
If you require skin-in-the-game transactional costs, then your transactions
will happen far less often, and in the short term you will be out-competed for
resources by those who protect themselves with top-down regulations.

It is often difficult for third parties to evaluate the authenticity of a cost
or regulation, which is why institutional costs exist - they factor out
hundreds of checks of hundreds of transactions into a single check of the
institution that regulates those transactions (and that check itself is often
proxied by the social proof of the institution.) This lowers transaction
costs, which drives high growth.
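
The factoring-out argument above is really just counting; a toy illustration
with made-up numbers (100 buyers, 100 sellers are assumptions, not anything
from the thread):

```python
# Without an intermediary: every buyer vets every seller directly,
# so the number of checks grows multiplicatively.
buyers, sellers = 100, 100
direct_checks = buyers * sellers          # 10,000 pairwise checks

# With a trusted institution: each buyer checks the institution once,
# and the institution checks each seller once, so checks grow additively.
institutional_checks = buyers + sellers   # 200 checks
```

The institution turns an O(n*m) vetting problem into an O(n+m) one, which is
the "lowered transaction cost" in the comment.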

The fetal stage is really a good analogy here. The womb exists for good
reason; scaffolds exist for good reason; decorum on first dates exists for
good reason. Without these things, early transactional costs defeat growth.

In short, skin-in-the-game gets selected out of artificial systems because it
is bad for early growth. However, this does not mean it should be; in well-
designed/natural systems the entity first exists in a protective bubble during
its gestation period and then slowly has its skin placed in the game.

~~~
neilk
I don't think your analogies apply. We do not demand that a fetus bear the
cost of its bad decisions, because it is incapable of making decisions. (But
we might start exposing a child to consequences of its bad actions, in
proportion, quite early.)

Similarly, a building under construction is incapable of advising someone to
buy stocks. And first dates do not usually result in land wars in Afghanistan.

Can you give an example of a process or profession which can be said to
willfully expose others to risk, which _does_ deserve this sort of protection
when it's just starting out? It seems to me that such responsibilities are
nearly the definition of maturity.

~~~
eclipxe
>And first dates do not usually result in land wars in Afghanistan.

Keyword: usually.

~~~
mdt
Goddamn butterflies

------
tlb
It doesn't mention any negatives of requiring skin in the game. The most
obvious is adverse selection: risk-averse people won't enter a profession
where mistakes are severely punished, so you get only risk-seekers.
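
tlb's adverse-selection point can be made concrete with a toy model (all the
numbers and the entry rule below are hypothetical, chosen only to illustrate
the filtering effect):

```python
import random

def entrants(penalty, num_agents=10_000, seed=2):
    """Toy adverse-selection model: agents enter a profession only if the
    penalty for mistakes, discounted by their risk tolerance, looks
    acceptable next to the payoff. Hypothetical sketch, not a real model."""
    rng = random.Random(seed)
    pool = [rng.uniform(0, 1) for _ in range(num_agents)]  # risk tolerance
    payoff = 1.0
    # An agent enters if the tolerance-weighted penalty is below the payoff.
    return [t for t in pool if penalty * (1 - t) < payoff]

mild = entrants(penalty=1.0)
harsh = entrants(penalty=10.0)
# Under harsh penalties far fewer agents enter, and those who do are
# disproportionately the most risk-tolerant - exactly tlb's selection effect.
```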

~~~
nostrademons
It's kinda interesting that the modern corporation was explicitly designed to
_shield_ the decision-makers from risk. Without it, many of the large
industrial & trade breakthroughs of the Enlightenment wouldn't have been
possible, as nobody would've been willing to take the personal risk of
venturing to the Indies or setting up a cotton mill when the downside of
failure was perpetual indentured servitude.

I don't think it's bad to require some skin in the game, but require too much
and many risks that are quite beneficial to society as a whole just won't
happen.

~~~
lmkg
Partially disagree. The original necessity of corporations was to _distribute_
risk, not to _remove_ it. There was the same total risk, but borne across a
larger number of people. The financial risk was unchanged, but the
consequences were made less catastrophic. Equivalently, a corporation
aggregates enough resources for the risk to become rational.

By contrast, one of the functions of modern corporations is that culpability
does not pass-through to the decision-makers. This is an actual removal of
risk, not re-distribution, as there are consequences that a corporation is not
capable of facing.

~~~
pash
I was thinking about this the other day. Just as with joint-stock ownership,
legally limiting financial liability redistributes risk (by transferring it to
creditors), but doesn't reduce it on net.

Creditors tend to be established firms and bankrupt companies tend to be new,
small enterprises, so it seems that limited liability promotes socially
beneficial risk-taking by offloading bad outcomes on those who are best able
to absorb them. ... "From each according to his ability, ..."

------
pizza_boy
It's not immediately obvious there's anything new or interesting in there: how
is this heuristic different to a classic Hansonian bet [1]?

One potential advantage is that having "skin in the game" has a more positive
connotation than betting on outcomes, to the general public at least.
Regardless, Hanson at least deserves a mention.

From a stylistic point of view I'm not a big fan of the appeals to authority
(e.g., "the ancients were fully aware" ) either.

From a startup perspective it's worth mentioning that mentorship or advice is
also generally more confusing and less useful when the mentor lacks "skin in
the game". Hence mentor "whiplash".

[1] [http://www.overcomingbias.com/2013/07/bets-argue.html](http://www.overcomingbias.com/2013/07/bets-argue.html)

~~~
tedsanders
Good points. However, I don't think that "the ancients were fully aware" is an
appeal to authority. I think the idea is ancient people tended to use things
that were successful; therefore if the ancients used something, we should
boost our confidence in it (relative to the baseline).

~~~
pizza_boy
You're partially correct. I'd say that we tend to adopt the successful
cultural adaptations of our ancestors and discard the others.

In other words, when comparing an ancient society to one descended from it,
I'd expect the "successful" adaptations of the ancestor culture to be
disproportionately present in the descendant. The converse need not be true.

In Taleb's example of the builder, those ancient heuristics became our tort
law (and will be a part of our reputation networks in the future). He doesn't
mention all the cultural adaptations we've since dropped...

So it's obvious at best, and an appeal to authority at worst.

------
graeme
The comments here seem to miss Taleb's central idea that harm is non-binary.
You need to consider not just whether something occurs, but how much impact it
has.

I don't care whether a bank earns a financial gain or a loss. I care how big
the gain or loss is. One hundred years of small profits can be wiped out in
one bad quarter.

I do not believe Taleb is proposing punishing all risk taking. He's advocated
aggressively taking risks where the downside is known.

The problem we face is that many are now in a position to get a limited,
positive upside if things go well, but they face no downside. And the
potential harm in such situations has no upper bound.
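
The asymmetry described above - capped upside for the agent, unbounded
downside borne by others - is easy to simulate. The numbers below (95% chance
of a small gain, 5% chance of a large loss, a 50% bonus on gains) are entirely
made up for illustration:

```python
import random

def simulate(trials=100_000, bonus_rate=0.5, seed=0):
    """Toy model: an agent pockets a share of gains but bears no losses.
    Hypothetical payoff structure, not calibrated to any real institution."""
    rng = random.Random(seed)
    agent_total = 0.0
    social_total = 0.0
    for _ in range(trials):
        # 95% chance of a +1 gain, 5% chance of a -30 blowup.
        outcome = 1.0 if rng.random() < 0.95 else -30.0
        social_total += outcome
        agent_total += bonus_rate * max(outcome, 0.0)  # no clawback on losses
    return agent_total / trials, social_total / trials

agent_ev, social_ev = simulate()
# Analytically: agent expects 0.5 * 1.0 * 0.95 = +0.475 per trial,
# while society expects 0.95 * 1.0 - 0.05 * 30.0 = -0.55 per trial.
```

The agent's expected payoff is positive even though the activity destroys
value overall, which is the perverse incentive the comment describes.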

Skin in the game ought to be proportional to harm. Many comments mention
medicine as a profession that should not have skin in the game.

Rubbish. Doctors are very much liable if things go wrong. But it depends on
how wrong. We don't hold doctors accountable for small errors, or random
errors.

We hold them accountable for big errors. The bigger the error, the worse the
punishment.

~~~
khawkins
There's an issue in convincing people that harm is completely continuous. If
there is a perception that there is some culpability in performing some
action, regardless of how small, then it is easy to exaggerate it to a
significant quantity in their mind. This would make society increasingly
hostile and litigious for even small gestures like offering to watch their
stuff while they visit the restroom.

------
verdatel
Such a simple heuristic, but I think with far-reaching consequences. Last
week, NNT posted a link to a news article that talked about a new law in an
Indian state requiring headmasters to taste the food served in the school
cafeteria. This was initiated after a bad case of contaminated food and the
deaths resulting from it.

------
pfraze
I gave the paper a quick skim. My thoughts:

As they point out in the paper, there are some cases where risk is
intentionally removed, such as bankruptcy protection for businesses to shield
entrepreneurs. I think that can have positive effects.

I'm not fully convinced that "academic economists, quantitative modellers, and
policy wonks"..."have no disincentive and are never penalized by their
errors." I believe their reputation is at stake. The authors may mean they
have no financial risk, but their reputation is tied to their future earnings.
I'm not sure that's a bad thing as long as it's considered.

Overall I'd say there's nothing surprising in this paper, nor any specific
proposals (as I hoped) but mostly a philosophical argument made in reaction to
the current financial and political environment. I do agree with their
premise. As a political view, I think "Skin in the Game" would make a pretty
good slogan.

It would be interesting to implement distributed trust networks around this
concept. If there are clear failure/success signals, you could have nodes post
a bond which is returned on success and paid out on failure. Automated damages
collection, I suppose.
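
The bond mechanism could be sketched as a toy escrow (a hypothetical design
guessing at the intent: the bond goes back to the actor on a success signal
and is transferred to the harmed party on a failure signal; class and method
names are made up):

```python
class BondEscrow:
    """Toy escrow: a node posts a bond before acting on others' behalf.
    On a success signal the bond is returned to the actor; on a failure
    signal it is paid out to the harmed party (automated damages
    collection). A design sketch, not a real protocol."""

    def __init__(self):
        self.bonds = {}  # node -> (amount, beneficiary)

    def post(self, node, amount, beneficiary):
        self.bonds[node] = (amount, beneficiary)

    def settle(self, node, succeeded):
        amount, beneficiary = self.bonds.pop(node)
        # Success: bond returns to the actor.
        # Failure: bond compensates the counterparty who relied on them.
        return {node: amount} if succeeded else {beneficiary: amount}

escrow = BondEscrow()
escrow.post("advisor", 100, "client")
payout = escrow.settle("advisor", succeeded=False)
# payout == {"client": 100}
```

The hard part in practice is the "clear failure/success signals" premise;
everything else is bookkeeping.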

~~~
graeme
His point is that there's rarely a reputational hit for being wrong in those
professions. There should be, but there isn't.

Examples:

Thomas Friedman supported the Iraq war.
[http://www.youtube.com/watch?v=ZwFaSpca_3Q](http://www.youtube.com/watch?v=ZwFaSpca_3Q)

Joseph Stiglitz and Peter Orszag predicted Fannie Mae and Freddie Mac faced
near-zero risk.

Stiglitz later claimed credit for predicting the financial crisis(!) and
Orszag had a prominent position in Obama's administration.

[http://www.pierrelemieux.org/stiglitzrisk.pdf](http://www.pierrelemieux.org/stiglitzrisk.pdf)

What commentator can you think of that's lost his job for a bad prediction?

Pundits have perverse incentives. They can point to correct predictions to
boost their careers, and they are penalized very little for wrong ones.

They have an incentive to make many predictions, and retrospectively choose to
highlight only those that panned out.
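
That last point - make many predictions, then advertise only the winners -
can be simulated with pundits who have no skill at all (the pundit counts and
coin-flip setup below are assumptions for illustration):

```python
import random

def best_advertised_record(num_pundits=1000, predictions_each=20, seed=1):
    """Many zero-skill pundits guess coin flips; each advertises only
    their own hit rate. The field's best record looks like genuine skill."""
    rng = random.Random(seed)
    records = []
    for _ in range(num_pundits):
        hits = sum(rng.random() < 0.5 for _ in range(predictions_each))
        records.append(hits / predictions_each)
    return max(records)

best = best_advertised_record()
# With 1000 no-skill pundits making 20 guesses each, the best advertised
# hit rate is typically 75% or better by chance alone.
```

With no penalty for the losers, the visible track records are a selection of
lucky streaks, which is exactly the incentive problem in the comment.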

~~~
TheCowboy
To be fair, Fannie Mae changed its practices, engaging in the risky "financial
innovations" that caused problems for a lot of financial institutions, after
Stiglitz's original study of their exposure to risk.

It is not as if Stiglitz was in control of the institution or encouraged that
they engage in these more risky behaviors.

It is more like a doctor performing a check-up on a patient - a mere snapshot
in time - and saying they're in good health. And then the patient decides it's
okay to start smoking, eating fast food for every meal, and giving up
exercise. Are you going to blame the doctor for not warning that this could
happen?

I would say the stronger lesson is that continued checks and oversight are
critical to the health of any institution or business. Though I do also agree
and think people who make sloppy predictions should also be held accountable
for their behavior.

------
steveinator
Interesting read. I liked the reference to Ralph Nader... Those who vote in
favor of war should be required to enlist in a draft.

Though I don't know how seriously I can take this when the guy uses the word
'wonks' in reference to those he disagrees with.

Also I think that the lack of 'skin in the game' encourages risk, which
counteracts a natural tendency for large groups to be more conservative (look
at big business vs small). If you made decision makers have skin in the game,
you'd have even less innovation and new ideas in large institutions. This is
taking into account factors beyond his thesis, but leads to interesting
thoughts when it comes to practical application of his ideas.

~~~
md224
I could be wrong, but I believe the term "policy wonk" is not necessarily
negative... I assumed it simply referred to one who is well-versed in policy
debates and data.

For example:
[http://www.washingtonpost.com/blogs/wonkblog/](http://www.washingtonpost.com/blogs/wonkblog/)

~~~
graeme
This is correct.

source: extensive reading of political blogs + newspapers circa 2006-2008.

------
jdmitch
_the skin in the game heuristic relates directly to the virtue of being such
that the system will not only survive uncertainty, randomness, and volatility
but will actually benefit from it._

Doesn't Taleb mean that the 'skin in the game' heuristic will prevent
uncertainty, randomness and volatility in the system rather than bring
benefit? Within examples such as the 07-08 financial crisis, which he must be
referring to indirectly, 'skin in the game' would have meant less risk-taking,
therefore reducing volatility. But I can't see how the system would have
actually benefited from volatility.

~~~
bdcs
Taleb defines a system as "anti-fragile" if it benefits from volatility. His
previous works build up a mathematical framework for distinguishing between
things which are harmed by volatility, things which are neutral to it, and
things which benefit from it - these are fragile, robust, and anti-fragile,
respectively.

~~~
jfasi
What's so mathematical about it? It has lots of pretty graphs and charts, but
as far as I could tell, it's just philosophy masquerading as mathematics.

------
michaelfeathers
I haven't read the paper yet, but I remember reading somewhere that a
fundamental principle of economics is that if you introduce costs into a
system, you should bear them.

------
MBlume
So what happens to the authors of this paper if we take their advice and it
goes badly?

~~~
zzleeper
Nothing of course! It's one of the wonders of being an economist ;)

(sarcasm aside, I do think there should be stronger professional repercussions
in economics for advocating idiotic policies. It's one thing to found a
startup and fail, and then start again; it's another to cause a deep recession
in a country and then say "oh well, let's try again")

------
stretchwithme
Nothing educates you like an actual personal loss. If it's always other
people's money, people will figure out how to game it. And those okay with
gaming it will soon crowd out the rest, as they can work for less because they
get paid extra for gaming it.

If the banks writing bum mortgages had had to take the loss instead of selling
all the losers to Fannie and Freddie, many bum loans would never have been
made.

------
mathattack
Does this really solve the problem?

The execs at Lehman and Bear had skin in the game. Sure, some of the senior-
most folks got away with small fortunes, but everyone lost money. Some people
lost their life savings.

Conceptually this makes sense (similar to requiring all derivatives trades to
have margin requirements to limit leverage) but does it really fix what went
wrong?

------
stofu
My favorite example of skin in the game is riding a bicycle.

Compared to driving a car with a ton of metal and five airbags around you,
riding a bike means that you literally have skin in the game.

On the other hand driving a car you basically have no skin in the game. You
jump a red light or drive too fast and as long as the cops don't catch you
it's alright.

------
VMG
In some sense, the same ideas he has always written about, but again expressed
in a new way. I like it.

------
bayesianhorse
Investors need to act as if they could stomach a total loss on the investment;
founders need to act as if they can't survive failure of the venture.

Now imagine that these two roles usually exist in the same person, to varying
degree.

------
zilupe
This is why I believe more in the devops approach. When ops and dev are
separate, you get both developers acting in an environment with little useful
feedback and ops always able to blame developers.

------
jka
I'm re-watching The Wire at the moment, which I'm enjoying greatly, and one of
the best things about it is watching the lengths that the characters go to in
order to avoid being personally linked with criminal operations under their
control. They benefit when the organization benefits, but aim to avoid the
downsides when their staff's activities are exposed.

As long as there are areas with very high reward (investment banking, drug
dealing, human trafficking, startups), and enterprising individuals with the
connections to organize them (either through experience and leadership
progression, or by using opportunities to undermine former leaders / acquire
resources), people will rise to these opportunities.

Once an operation reaches a certain size however, many leaders become
disconnected - intentionally or otherwise - from the day-to-day business of
their organization.

Given the typical attitudes on HN, I'd expect that most founders here would be
genuinely mortified to find out that their software had caused real-life
problems - most would take a lot of care to ensure a good user experience and
correct results; these make good business sense too. However, I think many
would eventually aim to become distanced from the business too - the dream of
reliable passive income.

Meanwhile, we imagine - and it's not hard to imagine - that many bank
executives, criminal leaders, and others actively enjoy their lives of
distance and immunity while feeling little remorse for the damage they do, and
certainly not taking on any intrinsic risk for it.

We expect that the rule of law will deal with these problems when they occur -
that's what optimistic films and positive fiction tell us - but the reality is
that a good-enough combination of wealth, influence, leverage over others, and
wits can let people stay at arms-length from (but in control of) nefarious
affairs even if they _are_ aware that they are causing harm.

In some ways I think this expresses itself even in the trend for honest
businesses to avoid liability where possible -- if we take liability for what
we do, we also have to do the best we can and take genuine care of our
customers.

Somehow a fear of litigation, genuine exploitation of litigation (cf.
ambulance-chasers), and fear of decreased business efficiency have reached the
level where companies _do_ prefer to distance themselves. Often this is via
lacklustre or even laughable attempts, such as oft-ignored in-store signage,
disclaimers, or automated customer interactions.

Retreating from each other for fear of risk isn't healthy as a general trend,
and neither is our inability to reach and re-integrate those who expressly
_intend_ to maintain their distance to avoid risk to themselves.

