
Coherence Penalty for Humans - fagnerbrack
http://www.michaelnygard.com/blog/2018/01/coherence-penalty-for-humans/
======
kornakiewicz
A manager went to the Master Programmer and showed him the requirements
document for a new application. The manager asked the Master: "How long will
it take to design this system if I assign five programmers to it?"

"It will take one year," said the Master promptly.

"But we need this system immediately or even sooner! How long will it take if
I assign ten programmers to it?"

The Master Programmer frowned. "In that case, it will take two years."

"And what if I assign a hundred programmers to it?"

The Master Programmer shrugged. "Then the design will never be completed," he
said.
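The koan's arithmetic is roughly what the linked article models with Gunther's Universal Scalability Law, where a coherence term grows with the square of the number of people who must stay in sync. A minimal sketch (the sigma and kappa values below are invented for illustration, not measurements):

```python
def usl_throughput(n, sigma=0.1, kappa=0.02):
    """Universal Scalability Law: relative throughput of n workers.

    sigma: contention penalty (serialized work).
    kappa: coherence penalty (pairwise coordination).
    Both defaults are illustrative assumptions.
    """
    return n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

for n in (1, 5, 10, 100):
    print(n, round(usl_throughput(n), 2))
```

With these assumed parameters, throughput peaks and then falls: ten programmers deliver less than five, and a hundred deliver less than one working alone, which is the Master's point.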

~~~
dexen
A great koan, thank you.

There's also a related, from-the-trenches, story:

It is a very humbling experience to make a multimillion-dollar mistake, but it
is also very memorable. I vividly recall the night we decided how to organize
the actual writing of external specifications for OS/360. The manager of
architecture, the manager of control program implementation, and I were
threshing out the plan, schedule, and division of responsibilities.

The architecture manager had 10 good men. He asserted that they could write
the specifications and do it right. It would take ten months, three more than
the schedule allowed.

The control program manager had 150 men. He asserted that they could prepare
the specifications, with the architecture team coordinating; it would be
well-done and practical, and he could do it on schedule. Furthermore, if the
architecture team did it, his 150 men would sit twiddling their thumbs for ten
months.

To this the architecture manager responded that if I gave the control program
team the responsibility, the result would not in fact be on time, but would
also be three months late, and of much lower quality. I did, and it was. He
was right on both counts. Moreover, the lack of conceptual integrity made the
system far more costly to build and change, and I would estimate that it added
a year to debugging time.

-- Frederick Brooks Jr., "The Mythical Man-Month"

~~~
fjsolwmv
That sounds specious - grass is greener fallacy. The evidence was that the CP
team failed at the job. But where is the evidence that the architecture team
would succeed?

The general suspicion of TMMM is that it extrapolates from failure but assumes
that some untried alternative would have been better.

~~~
coldtea
> _That sounds specious - grass is greener fallacy. The evidence was that the
> CP team failed at the job. But where is the evidence that the architecture
> team would succeed?_

Anybody who has worked for a couple of decades on large software projects
doesn't need any, because they have seen this play out time and again. Brooks
doubly so -- he literally wrote the book on software development timelines.

Sure, technically you're right.

But it's not like every piece of knowledge needs to come packaged in fancy
LaTeX, with confidence intervals and control groups. Sometimes experience
alone is enough to assure us that rain is wet and that a team of 150, when
assigned such a task, will invariably fail in this way compared to a team of
10.

~~~
njarboe
Seems like a good way to go would be 15 isolated teams of 10. Some teams might
make the schedule. Pick the best project at that point. Big bonuses for the
people that get it done and extra for the final winners. Seems like a huge
waste of effort, but it looks like that is the way to go. This would be
similar to what happens when a company just buys a small startup that has
created what they need.

------
js8
I like the article. I think, in general, (project) management could learn a
lot from computer science. People working on operating systems and the like
figured out solutions to many of the same problems, such as scheduling.

Another similar penalty with humans that is often ignored is caching of skills
into working memory.

For example, consider a process, like filing an expense report, that is really
simple. Because it's simple, it might be tempting not to have a specialized
person do it and instead have every person do it on an as-needed basis. But
then, if it's done only rarely, people will have to relearn it each time, or
ask somebody how to do it, spending a lot more time on it due to what are
pretty much cache misses. It would be more efficient to have a specialist do
it, because they would do it every day and have all the process corner cases
in working memory.

A similar caching problem happens when, say, a programmer multitasks on
several different things at a time. In that case, the cache (working memory)
is thrashed every time a task is switched.
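The specialist-vs-generalist tradeoff can be put in numbers with a toy cost model (all figures below are invented for illustration): once the relearning overhead dominates the task itself, the specialist who keeps the procedure "cached" wins.

```python
def generalist_cost(tasks, do_time=0.5, relearn_time=2.0):
    # Every task pays the full relearning overhead ("cache miss").
    return tasks * (relearn_time + do_time)

def specialist_cost(tasks, do_time=0.5):
    # The procedure stays in working memory ("cache hit"); no relearning.
    return tasks * do_time

tasks = 100  # e.g. expense reports per month, an assumed workload
print(generalist_cost(tasks))  # 250.0 hours
print(specialist_cost(tasks))  # 50.0 hours
```

This obviously omits queueing delays at the specialist and the human costs of extreme division of labour.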

~~~
dasmoth
_For example, consider a process, like filing an expense report, that is
really simple. Because it's simple, it might be tempting not to have a
specialized person do it and instead have every person do it on an as-needed
basis. But then, if it's done only rarely, people will have to relearn it each
time, or ask somebody how to do it, spending a lot more time on it due to what
are pretty much cache misses. It would be more efficient to have a specialist
do it, because they would do it every day and have all the process corner
cases in working memory._

This is true, but it's important to remember that people aren't machines.
Going too far down the division-of-labour route risks losing sight of what
you're actually trying to achieve. Certainly in my case, that makes it a lot
harder to perform my best, and it reduces the chance of spotting different
ways to slice-and-dice the problem.

