
Go 1.3 Linker Overhaul - signa11
https://docs.google.com/document/d/1xN-g6qjjWflecSP08LNgh2uFsKjWb-rR9KA11ip_DIE/edit
======
acqq
Interestingly, they want to do the opposite of what Microsoft did with their
C++ compiler and linker. To enable "whole program optimization", Microsoft's
compiler writers moved intermediate-code analysis, optimization, and code
generation into the linker, which lets it, for example, inline calls to small
functions that aren't declared inline and that come from a different module,
whenever that looks beneficial. So their linker can now do both classical
linking (what the Go authors have as their goal now) and the "linker does the
code optimization, including cross-module optimization, and code generation."
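A minimal sketch of the cross-module inlining described above (all names here are hypothetical; in a real program the helper would live in a separate module, which is exactly what makes this a *link-time* decision rather than a compile-time one):

```go
package main

import "fmt"

// add stands in for a small function defined in some other module.
// It is not declared inline; a whole-program-optimizing linker (like
// MSVC's LTCG) can still replace calls to it with its body, because at
// link time it sees the intermediate code for every module at once.
func add(a, b int) int {
	return a + b
}

func main() {
	// After link-time inlining, this call can be reduced to the
	// constant 7, with no function-call overhead at all.
	fmt.Println(add(3, 4))
}
```

A linker that only does "classical linking" (the direction the Go design doc proposes) sees already-generated machine code for `add` and can only patch the call's address, not remove the call.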

Did Go ever do cross-module optimizations in its linker? Have the Go authors
really concluded there's no need for that, now that they want to fix the new
object file format to contain only "generated code"?

~~~
twotwotwo
Go's gc toolchain (as opposed to gccgo) tends to emphasize compile speed over
producing optimal executables, so they may be flat-out deciding to leave some
potential whole-program optimizations on the table for link speed--pure
speculation, though. (And I don't know re: your question.)

~~~
coldtea
> _tends to emphasize compile speed over producing optimal executables_

So they opted to handle the easy problem.

~~~
jesstaa
Compile time is a big deal when you have millions of lines of code that
constantly change and very expensive devs spending large amounts of time
waiting. The infrastructure Google uses to distribute and track builds across
datacentres and around the world, just to make the compile times of large
applications bearable, isn't exactly trivial.

Optimal executables aren't that important most of the time. Hardware is
cheaper than dev time, and the gain from 'optimal' isn't worth much against
the compile-time trade-off.

~~~
coldtea
> _Compile time is a big deal when you have millions of lines of code that
> constantly change and very expensive devs that are spending large amounts of
> time waiting._

1) Sure. Most of us don't. So why should we care, and/or trade other things
for compile-time improvements?

2) That's only a problem when you don't have a module loading system and have
to build everything every time.

~~~
girvo
1) Because the creators of the language decided to. Don't like it? Feel free
to fork it.

~~~
coldtea
> _Cause the creators of the language decided to_

Which is a no-op answer. That could be the answer to any kind of engineering
decision, including the most idiotic ones.

We're concerned with what's best here, not merely with what has been decided.

> _Don't like it, feel free to fork it_

Another BS non answer.

~~~
girvo
The language authors obviously disagree with it being an "idiotic" engineering
choice. Your choices of tradeoffs don't match theirs, and engineering is never
as simple as "I'm right, you're wrong", as you well know.

Yeah, I was being facetious with my original reply, and frankly I get your
point. I was pointing out, however, that you _can_ fork it to make it work the
way you'd prefer. If you truly don't want to (or can't) do that, then open a
ticket or send something to the mailing list.

If this really matters to you, do something about it that can make an actual
difference.

------
tomcam
Been a while since I looked at ELF, but it would sure be nice if the new
object format were, say, a well-defined ELF subset, so it could make use of
the many ELF tools out there already.

~~~
zellyn
The article mentions that possibility.

