
On the Folly of Rewarding A, While Hoping for B (1975) [pdf] - Someone
http://web.mit.edu/curhan/www/docs/Articles/15341_Readings/Motivation/Kerr_Folly_of_rewarding_A_while_hoping_for_B.pdf
======
impendia
I find this article... strange. Near the beginning, it talks about widespread
mutiny in Vietnam, and attributes it to the reward system: "What did the man
at the bottom want? To go home. And when did he get to go home? When his tour
of duty was over! This was the case whether or not the war was won."

So, in other words, the war was lost because management did not properly align
incentives for soldiers with desired outcomes?

I find the following article more persuasive:

https://libcom.org/history/1961-1973-gi-resistance-in-the-vietnam-war

It identifies three important factors for the same phenomenon:

-- First, that new recruits discovered immediately that their recruiters had
lied to them: "Guarantees of special training and choice assignments were
simply swept away."

-- Secondly, for the many Black GI's, a growing consciousness of their own
oppression within American society.

-- And, finally, the sheer futility and meaninglessness of the war itself: "a
seemingly endless ground war against an often invisible enemy, with the mass
of people often openly hostile, in support of a government both unpopular and
corrupt."

In my opinion, the article overstates the importance of management. I work as
a tenured professor at a research university -- another profession discussed
in the article. Frankly, there are few rewards or disincentives for behavior
of _any_ sort -- at least not of the sort that come from my bosses. And yet I
do a good job anyway, for the simple reason that I believe my job is worth
doing well.

~~~
jacobr1
> I work as a tenured professor at a research university

Isn't the reward tenure itself? Getting it requires you to put in enough
effort for your department to approve you. Behavior that would prevent
approval is implicitly discouraged (certain views held, academic or not, or
certain pedagogical approaches), as are failures to meet requirements
(publishing, funding brought in, student feedback, the favor of your peers),
depending on the department and field.

> And yet I do a good job anyway, for the simple reason that I believe my job
> is worth doing well.

Once tenured, you've already been selected for the traits that are desired in
the role and proven your ability to execute them well.

~~~
stepstop
> Once tenured, you've already been selected for the traits that are desired
> in the role and proven your ability to execute them well.

I agree that the OP has a bit of self-selection bias. Similar to the “why
don’t you go to college and get a real job? I did that, so can you” argument
that internalizes the success while ignoring external factors.

~~~
geofft
I don't think OP is saying "I got tenure, why can't you go through the same
process" or anything, I think they're saying "I got tenure, and having been
tenured I continue to do good work because of intrinsic motivation and not
management-driven incentives, I think we should extend this level of autonomy
to everyone and get rid of the barriers to having a tenure-style job."

------
motohagiography
More recently, this concept was linked from Lesswrong/SlateStarCodex,
https://www.lesswrong.com/posts/n86TNJS8k2Shd84kn/the-asshole-filter ,
which I would recommend skipping in favor of summaries, given that it wanders
and you can only read the same word over again so many times.

The basic dynamic is: if you advertise a rule and then don't enforce it, you
in effect reward people who transgress the rule and directly discourage people
who obey it - and then wonder why you are "surrounded by assholes," when what
you've done is set up a filter that only assholes manage to permeate. I think
of this as a variant of the Hock Principle: "Simple, clear purpose and
principles give rise to complex and intelligent behavior. Complex rules and
regulations give rise to simple and stupid behavior."

More than a post-hoc explanation, it can predict a pattern for how large
organizations and bureaucracies have clusters of terrible people at the top,
how an individual may develop a pattern of being subject to abusive
relationships, how some social programs will just sustain the problem they are
mandated to mitigate, how corruption spreads in institutions, how some people
always seem to end up in terrible jobs, and even how discussion forums can
turn toxic.

It comes down to the maxim, "what you reward, you get." If you do not set
boundaries and then _enforce_ the boundaries you set, the _only_ people who
will pass through them will necessarily be the kind of people who don't
respect you or your boundaries. Set boundaries for yourself and how you relate
to the world and others, and be absolutely aware that when you choose to
relate without them, you are rewarding people for not respecting you. Do the
same in your companies, products, services, and teams. If you have ever worked
with someone who was an asshole and wondered why and how they were able to
succeed, it is because the people around the person let it work for them.

~~~
abnry
I would add to your statement

>If you advertise a rule and don't enforce it, you in effect reward people who
transgress the rule... and then wonder why you are surrounded by jerks

that transgressing the rule needs to provide some sort of competitive
advantage for the selection to work. Some people may transgress a dress code
but I am doubtful that sort of thing will actively help them in the workplace.

It may be that a culture lacking enforcement of minor rules results in more
people feeling free to transgress rules where doing so actively benefits them,
but that is a second-order effect.

~~~
tosers4
> It may be that a culture lacking enforcement of minor rules results in more
> people feeling free to transgress rules where doing so actively benefits
> them, but that is a second-order effect.

Not just that, but for example, it will leave a bad taste in the abiding
workers' mouths: "I come here every day and respect the dress code, and that
lazy bastard doesn't?"

So, don't make rules that take much effort to enforce, or leave more room for
interpretation on small things -- "common sense".

I don't know the answer, but a lot of minor things add up to one big bad
thing.

~~~
TeMPOraL
> _Not just that, but for example, it will leave a bad taste in the abiding
> workers' mouths._

Or, in other words, there is a competitive advantage here: obeying the rule is
a hassle. Following a dress code is more annoying than wearing whatever one is
comfortable in. The abiding workers are rightfully angry, because they're
paying a price, however minor, for apparently no reason.

------
RcouF1uZ4gsC
Along the same lines, one of my biggest pet peeves is rewarding Metric C,
chosen not because it actually correlates with your goals, but because it is
easy to measure and yields a concrete number. An example would be using lines
of code written per day as a measure of developer productivity.

~~~
artsyca
Not to mention once metrics become targets, they cease being useful as metrics
because they're apt to be gamed.

It seems you simply can't win when you need to guard against mindlessly
selfish behaviour.

~~~
rightbyte
Protip: instead of saying which metrics you use, you could use an opaque
evaluation system that seems arbitrary to the subjects.

~~~
artsyca
Sure! And have them all report each other for thought crimes and maintain
secret files on everyone.

Either that or let AI sort it all out.

------
renewiltord
I enjoyed this read. It reminds me of a few things I've observed recently:

* If you give a little, you will be subject to increased disapproval and if you don't care at all, you'll be subject to distant tut-tutting. Examples: the tech industry faces widespread condemnation for its gender ratio while hedge funds face little - only 1 in every 9 senior hedgies is female[0]. I'm not complaining about the gender ratio condemnation, merely commenting on the difference. Another example is that Bernie Sanders's speech was interrupted by Black Lives Matter protestors. Donald Trump's wasn't.[1]

* Internet commenters frequently remark on how they would pay for good news, yet they prefer reading free low-quality news sources over paid high-quality news sources. Similarly with clickbait vs. informative writing. They share the former, they comment on the former, and they enjoy the former. Effectively, they only pay the former.

* Voters and plans (brought up in the article) are my favourite. You cannot attack someone without a plan but every plan has flaws. Therefore, offering a plan is disincentivized while offering mere ideas is incentivized by people who claim the opposite. I'm glad this was example 1 in the article.

* Internet commenters are frequently unforgiving of errors that get corrected. They reward hiding errors and punish honest corrections. No apology is sufficient. Often this manifests as upvotes or likes for those who stick brutishly to their positions, while those who are honestly convinced are merely abandoned to indifference. Strictly, this is not only Internet commenters. The animosity the Red Cross earned when it had to stop giving doughnuts away for free is a lesson for future players: do not give away anything.

* YC's ill-fated attempt at crowd-sourcing a funding target was another amusing result. They intended to support something that they wouldn't have heard of, but they rewarded popularity, which resulted in their having to fund something they'd heard of a million times.

0: https://docs.preqin.com/reports/Preqin-Special-Report-Women-in-Alternative-Assets-October-2017.pdf

1: https://time.com/3989917/black-lives-matter-protest-bernie-sanders-seattle/

~~~
bonoboTP
A related blog post calls this the Copenhagen interpretation of ethics (your
ethical burden increases by interacting with the problem in any way, even
positively): https://blog.jaibot.com/the-copenhagen-interpretation-of-ethics/

There are also other analogies:

At work, if you touch something, you are now eternally responsible for
maintaining it. Be it code or process or equipment. Changed the toner in the
printer? Great, you can now do it all the time. "But you already know how to
do it, it will just take a minute for you". The best tactic at work is to be
really competent at the main job, to push all your energy into furthering your
skills, experience, and recognition in that, and to dodge all the crap work by
appearing badly suited to it. You want your colleagues to say "ah, he's a
good guy, really competent, but this type of work is really not his kind, he'd
forget it, mess it up or something". Basically the goofy distracted professor
meme: great at focusing on the specialty we pay him for, but he cannot be
tasked with these everyday mundane shit tasks.

On a larger scale, big tribal feuds are often among groups with small
differences, like Christian denominations and wars over dogmatic debates on
the details.

~~~
fennecfoxen
> On a larger scale, big tribal feuds are often among groups with small
> differences, like Christian denominations and wars over dogmatic debates on
> the details.

Did you have a big Christian war in mind which was specifically about dogmatic
differences, and not primarily about monarchs exerting their power over some
group or another?

~~~
mrkstu
I think the modern version is Islamic: you see many Sunni nations more willing
to work with Israel than with Shia Iran.

Christianity is not an explicit part of any modern state outside of the
Vatican that I'm aware of (with a small technical exception for Britain's
monarchy.)

~~~
osullivj
Mount Athos has special autonomous status within the Greek Republic; the
Orthodox equivalent of Vatican City for the Catholic world.

------
_dwt
I stumbled on this paper very early on in my career and it significantly
shifted my attitudes toward work, politics, and society. In short: people
follow incentives.

Of course, the suggestion of "altering the reward system" breaks down a bit in
the presence of Goodhart's law, but it's still very common to see (especially
in large or unusually bureaucratic organizations) policies which create
incentives completely at odds with the stated goal or metric. One of my
favorite examples is the "budget game" many companies play, where each
department/division/etc. puts forward a proposed set of projects and budget;
the corporate planners then allocate the overall corporate budget. Departments
obviously "sand-bag" and ask for more than they think they need (knowing they
won't get everything they ask for); there's often a "use it or lose it" policy
too, which leads to what I can only describe as an orgy of spending at the end
of the fiscal year. This can be great fun for vendors, consultants, and so on!

~~~
HappyDreamer
> people follow incentives

Maybe the less they care about the company's overall goals and vision, the
more they follow KPIs and bonus incentives. When there seems to be no overall
purpose and meaning anyway (except for making rich people richer?), then why
not game the system?

But there can also be intrinsic motivation: just "unselfishly" doing what
seems good for the organization, if one likes its goal and reasons for
existing.

 _Edit:_ You wrote: "significantly shifted my attitudes toward work, politics"

I wonder, how did this change how you thereafter did things at the workplace?
(And politics somehow?)

~~~
_dwt
It just made me more cognizant that people respond to incentives rather than
intentions. It's not enough to propose new "corporate values" or state that a
policy proposal must be passed because it "addresses" some problem - show me
how the proposed change incentivizes the desired behavior. Are there perverse
incentives? Principal-agent problems? And so on.

Make no mistake, I think people are "intrinsically motivated" too, at least to
a point. I'd like to believe I am. But it's hard to deny that people (myself
included) do things for some reason, some gain - even if that gain is
something intangible like "a feeling that I'm doing good" or "perceived status
in my organization".

~~~
HappyDreamer
> _people respond to incentives rather than intentions_

I like that way of phrasing it. I suppose it's good to keep in mind in ...
all? parts of life. E.g. also if raising children?

> _Are there perverse incentives? Principal-agent problems?_

If I set up incentives some day in the future, I'd talk with the people
involved and describe the overall goals and intentions. Then all of us
together can try to figure out problems with the incentives.

(Of course, in some rare? cases some people might not want to do that in an
honest way.)

This was interesting to read, "Principal-agent problems" was a new phrase to
me:
https://en.wikipedia.org/wiki/Principal%E2%80%93agent_problem

> _I think people are "intrinsically motivated" too [...] do things for some
> reason [...] intangible like "a feeling that I'm doing good"_

Yes, totally agree

------
artsyca
Totally happening in the corporate world: rewarding an employee mentality
while hoping for a non-employee mentality.

------
xdavidliu
There's a strange point that is made on page 771. Toward the bottom of the
page, it says:

 _Such a conclusion would be wrong.²_

The "conclusion" being that doctors want to minimize both false positives and
false negatives.

The footnote is:

 _²In one study (4) of 14,867 films for signs of tuberculosis, 1,216 positive
readings turned out to be clinically negative; only 24 negative readings
proved clinically active, a ratio of 50 to 1._

Can someone explain to me why the ratio matters? Surely the ratio would also
depend on the actual incidence of tuberculosis in the population, and would
not be determined solely by doctors' choices, right?
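
To make the point concrete, here is a small sketch (the error rates below are
hypothetical, not the study's) showing that the very same test produces very
different FP:FN ratios as the incidence changes:

```python
# Sketch of why an observed FP:FN ratio depends on the base rate, not only on
# how aggressively readers call positives. All error rates are made up for
# illustration; they are not the numbers from the cited study.
def fp_fn_ratio(n, prevalence, sensitivity, specificity):
    """Expected false-positive : false-negative ratio for a screening test."""
    sick = n * prevalence
    healthy = n - sick
    false_negatives = sick * (1 - sensitivity)      # sick, read as negative
    false_positives = healthy * (1 - specificity)   # healthy, read as positive
    return false_positives / false_negatives

# Identical test characteristics, different prevalence -> different ratios.
print(fp_fn_ratio(14867, prevalence=0.01, sensitivity=0.90, specificity=0.92))  # ~79.2
print(fp_fn_ratio(14867, prevalence=0.10, sensitivity=0.90, specificity=0.92))  # ~7.2
```

So the 50:1 figure reflects both how doctors bias their readings and how rare
the disease is in the screened population.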

~~~
antishatter
The ratio matters because the second type of error is significantly worse,
i.e. it is much worse to have tuberculosis and be told you do not have
tuberculosis. They are looking at type 1 vs. type 2 error.

I think they are saying three things: 1) real tests aren't actually trying to
minimize both types of error, which is the opposite of the hypothesis "It
might be natural to conclude that physicians seek to minimize both types of
error"; 2) illustrating the difference between the types of error with a very
dangerous disease (to give the reader an idea of the risk); 3) demonstrating
the trade-offs that occur: type 1 error is so much less risky that they are OK
with 50x as much of it.

~~~
jellicle
Cynically, if doctors are financially rewarded for performing treatments,
doctors may prefer a test with significant false positives over one with
minimal false positives. (If your knee jerks and you start saying things like
'doctors never allow financial incentives to change their medical decisions!',
let me respond by saying there are many studies that say that they do, and
also you can replace 'doctors' with 'health-adjacent corporations' if you
like.)

~~~
kazinator
If nobody at all has tuberculosis, then even a test heavily biased toward
negative readings will still produce nothing but false positives.

In that situation, only two kinds of test will avoid false positives: (1) a
perfectly reliable test, or else (2) a broken test that never reports a
positive.

------
asdfman123
Surely making everyone come back into the office will boost productivity.

