Except that even Algol 68 is more feature-rich than Go, assuming a powerful 60's computer like the Burroughs B5000. :)
But I do appreciate that every line of Go code written is one less line of C code.
Go thinks fast compile time is a feature. How fast is your Algol 68 compiler on a ten million line code base?
2. How many of those people would have much smaller codebases if they used a more expressive language?
3. Of the people who have 10 million line codebases, and who would still have 10 million line codebases in a more expressive language, how many could actually compile their programs faster if they used a language that allowed recompiling only the parts of the code that changed?
Very few people have Google's problems, and I'm not even completely convinced that Go is the best way to solve Google's problems.
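The partial-compilation idea in point 3 can be sketched in a few lines: key each source unit by a content hash and "recompile" only the units whose hash changed since the last build. This is only an illustration of the principle (real build systems like make, Bazel, or go build's cache add dependency tracking on top); the names here are invented.

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// buildCache maps a source unit's name to the hash of its contents
// as of the last build.
type buildCache map[string][32]byte

// compileChanged returns the units whose contents changed since the
// last build and updates the cache. A real build system would also
// recompile dependents of the changed units.
func compileChanged(cache buildCache, sources map[string]string) []string {
	var dirty []string
	for name, src := range sources {
		h := sha256.Sum256([]byte(src))
		if cache[name] != h {
			dirty = append(dirty, name)
			cache[name] = h
		}
	}
	return dirty
}

func main() {
	cache := buildCache{}
	sources := map[string]string{"a.go": "package a", "b.go": "package b"}

	// Cold build: everything is new, so both units are compiled.
	fmt.Println(len(compileChanged(cache, sources))) // prints 2

	// Edit one file; only that unit needs recompiling.
	sources["a.go"] = "package a // edited"
	fmt.Println(len(compileChanged(cache, sources))) // prints 1
}
```

With a scheme like this, build time scales with the size of the change rather than the size of the codebase, which is the point being made above.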
But if you want numbers, Turbo Pascal 5.5 was doing 34,000 lines/minute on MS-DOS back in 1989.
Of course faster hardware is part of the answer, but I suspect that it doesn't cover nearly all the ground between Turbo Pascal's compile time and Go's.
Besides, I am comparing an 8086 with 640 KB against an i7 with 16 GB and three levels of cache.
Go being fast isn't that much of an achievement in 2016.
If Go compiles faster than Turbo Pascal with an 8086 CPU and 640 KB RAM, that is an achievement.
My point is that Go's compilation speed isn't anything extraordinary; I can keep giving examples of other compiled languages that had equally fast compilers in the mid-90's.
Oberon, Go's granddaddy, could bootstrap itself and build a full working graphical workstation OS in less than a minute, if I remember correctly.