
Memory capacity and commercial compiler development - johndcook
http://shape-of-code.coding-guidelines.com/2011/10/08/memory-capacity-and-commercial-compiler-development/
======
shaggyfrog
> When I started out in the compiler business in the 80s many commercial
> compilers were originally written by one person. A very good person who
> dedicated himself (I have never heard of a woman doing this)

Grace Hopper?

~~~
derleth
Did any of Grace Hopper's work get released commercially? She was an admiral
in the Navy, after all.

~~~
shaggyfrog
Is that really the point, though? kragen's sibling post makes a better case.

------
cwzwarich
I am not sure I totally get the author's point. Many commercial compilers had
implemented and shipped many of the optimizations found in GCC well before GCC
did, although these compilers were often for a different class of machine than
the average desktop Linux PC.

Is he saying that GCC implemented them just as they became viable on lower end
hardware?

~~~
derleth
> Is he saying that GCC implemented them just as they became viable on lower
> end hardware?

To begin with, gcc didn't target low-end hardware; low-end hardware of 1985
(the beginning of the GNU Project) wasn't capable of running serious OSes, so
Stallman ignored it in favor of workstations built around Motorola 68000 CPUs
(I believe).

But that's beside the point: He's saying that once lower-end systems began to
be as capable of the optimizations as high-end systems were previously, the
expectations of compilers for low-end systems began to match the expectations
of compilers for high-end systems. He's also implicitly making the point that,
once CPU speed is above a certain minimum, the thing that most limits a
computer is a shortage of RAM.

A compiler for an MS-DOS system with a few hundred K of RAM can't do advanced
register scheduling (not that the 8088 has a lot of registers to begin with)
or code transformations, but that's OK, because nobody expects it to. All
people expect is for their Borland Pascal or Pacific C or whatever compiler to
emit machine code that correctly implements their algorithm and, if it's a bit
slow, they'll grab Ralf Brown's Interrupt List and other reference material
and start hacking assembly.

In Linux, in 2011, people expect gcc to not only correctly unroll loops, but
to transform recursive functions into an iterative form and _then_ unroll the
resulting loop. (gcc can, actually, do this.) They expect gcc to use MMX
opcodes in somewhat-ingenious ways to speed up string handling. (As an aside,
most of the more interesting opcodes of the x86 ISA are another thing the solo
MS-DOS hackers didn't have to worry much about until near the end of the
MS-DOS era.) They expect, in short, a compiler that would have been at home
on a very high-end system as recently as the early 1990s and is therefore too
complex for any single human to write unaided. (Well, perhaps Knuth could, but
Knuth doesn't scale.)

------
kenjackson
The author is generally right. But it's not unique to compilers. Many things
were often written by just one person in the early days, but rarely so now. PC
and console games, for example, were often written by one person. Now of
course there are huge studios that do it (although mobile games can now be
written by one person).

Programs have seen an explosion in feature set. For compilers this feature set
includes optimization, and additionally things like AutoComplete/IntelliSense,
debug information, precompiled headers, etc.

With that said optimization probably dominates, in terms of headcount, what
most compiler devs work on.

------
mathattack
Interesting. Seems like a rare instance of increased computing power hurting
individual programmer productivity.

Then again, increased computing power in general enables more complex, more
specialized software. I have heard that Safari is the product of a very small
team, but that's the exception.

~~~
ars
It hasn't hurt productivity - it's just that more productivity is now
expected.

The individual productivity is probably higher than before, but the need (or
expectation) is even higher than that.

~~~
mathattack
You said what I meant to say, but much more concisely, or dare I say
productively? :-)

------
beagle3
And yet, Mike Pall singlehandedly wrote LuaJIT 2, which is by far the best
compiler ever written for a dynamic language. No other comes close at this
time.

------
derleth
It's also no longer the case that a single person can build a commercially
competitive automobile or airplane. Even ignoring all the relevant
regulations, most people buying those things expect features like anti-lock
braking and navigation systems and wings that are not made out of canvas
stretched across cunningly-bent pieces of wood.

On the other hand, a modern kit car would make the young Karl Benz green with
envy. A modern kit airplane would make the Wright brothers... well, they were
pretty emotionally reserved, but I think I can safely say they would be
intensely interested in every single aspect.

I think the main part he leaves out is that not only do we have better, more
complex tools, we have better, more complex tools to build tools. The
continuing expansion of expectations is only natural given our continuing
expansion of capacity.

