
C++11 and The Long-Term Viability Of GCC Questioned - protomyth
http://www.phoronix.com/scan.php?page=news_item&px=MTI4NDU
======
hosay123
Certain "journalists" are seriously jumping the gun if they assume GCC is just
going to shrivel up and die. In addition to mainline, it is used privately in
a hundred different branches, several of which are commercially supported
(e.g. CodeSourcery, Google Native Client), and many the public will never know
about.

There is a vast body of consumers who care more about maturity and stability
than a few convenient front-end tricks, which the majority of C++11 ultimately
amounts to. As for whether GCC is somehow "too far gone" to catch up: that's
just nonsense; the core technology has barely changed in 10 years. Things will
improve when commercial needs arise and as eager PhDs hunt for final-year
projects.

While GCC is ancient, there is immense diversity in its supported
configurations, and AFAIK its code generator still maintains an edge over Free
alternatives. As for the purported speed improvements elsewhere, anecdotal
evidence from my MacBook suggests LLVM is slower at building smaller projects
(predicted long ago by smart types who knew LLVM was only fast because it
lacked features).

It's worth noting that GCC already supports significantly more C++11 than
either MSVC11 or ICC, although this article makes no such comparison. Like the
majority of articles from Phoronix, this one is blinded by the new hotness
(LLVM) and a generally superficial appreciation of the subject matter.

~~~
bitwize
GCC already "shriveled up and died" once -- the egcs project got so far ahead
of regular GCC in development that the FSF basically just said,
"congratulations, you are the new maintainers and your project is now the
official GCC mainline".

GCC is stagnant, but the dynamic pace of development on the Clang front means
that either the GCC developers will adapt, or the community will adapt GCC for
them.

~~~
BlackAura
There's a bit of a difference, though.

For one, there wasn't any viable competition at the time. Your options were
GCC, or a commercial compiler. If you actually needed access to the source
code of the compiler for any reason, GCC was your only option.

So, when it became clear that the FSF and the community wanted to take GCC in
different directions, the obvious choice was to fork GCC. The fork developed
far more rapidly than the original, draining developer time (and users, for
that matter) from GCC mainline. Basically, at the point where the FSF anointed
egcs as the new GCC, the original GCC project was already dead.

Things are a bit different nowadays.

There are already disagreements between the FSF and what people want to do
with GCC, especially around the tool support the article talked about: for
political reasons, the FSF is against reorganizing GCC so that parts of it can
be used as a library, and originally opposed plugins for the same reason. They
have been resisting attempts to clean up or modernize the codebase (it is
happening, but very slowly, because it takes the steering committee a long
time to... erm... steer).

There's an alternative: Clang / LLVM, which is built around exactly that kind
of tool support, has less niche stuff to get in the way, and has a cleaner,
simpler codebase with no legacy baggage. A good few GCC developers have jumped
behind it instead of trying to work against the FSF, and Clang isn't having
the kind of problems attracting new developers that GCC is.

At this point, it does look like GCC is in trouble. The problem is that there
might not be anything that anyone can do about it. As with all large projects
with a long history, GCC has a hell of a lot of inertia, and it would take
massive effort to shift the direction of the whole project, and make the
changes that need to be made, especially considering that they can't just stop
and spend a couple of years refactoring (which would definitely kill GCC).

~~~
gillianseed
Actually, the reasons against plugins and against GCC being used as 'a
library' went beyond mere politics; there was also a practical consideration,
since it affected the relevance and existence of GCC.

Back then there was an obvious threat of GCC becoming little more than a
backend/frontend solution for proprietary interests, which could have led to
the open source part eventually withering and dying.

Steve Jobs, when at NeXT, illustrated this problem: he tried to combine
NeXT's proprietary ObjC frontend with GCC's backend and sell it under the
reasoning that the 'end user would do the linking'. He couldn't legally do so,
which is how GCC ended up with ObjC support it otherwise wouldn't have had, at
least not then.

Nowadays open source is established and its benefits are well proven, so these
choices may look overly paranoid when viewed in the open source landscape of
today, but back then, and for a long time in GCC's development, they certainly
weren't.

The fact that a free, open source compiler toolchain like GCC established
itself as such a 'de facto' solution has obviously had an incredible impact on
the acceptance of open source in general. Not to mention that Linux, the open
source BSDs, and tons of other open source projects (and likely lots of
proprietary projects as well) would never have gotten anywhere near where they
are today had it not been for the existence of GCC, a top-notch compiler
toolchain free of charge and free to distribute.

------
_delirium
One intervening factor, imo, is that adding support for C++11 is a complex and
unpleasant enough undertaking that few volunteers are able to do it, and fewer
of those are interested in doing so. I would guess most language/compiler
hackers would rather hack on something other than the mess that is C++
compilation, if they have a choice (and if they're volunteers, they do have a
choice). So it ultimately boils down to what companies want to do, since many
enterprises are heavily invested in C++. If someone pays their employees to
contribute C++11 support to GCC, it'll get done; otherwise, it's less likely.
That's not much different from the Clang situation, where C++11 is only really
getting implemented because Apple cares enough about getting C++11 support
into Xcode to pay for it to happen.

~~~
alperakgun
Don't underestimate the capabilities of `volunteers', or FLOSS developers.
Nearly 30 years of Apple software was thrown away again and again, with Apple
jumping to something new without much backward compatibility. So the idea that
they are _enabling_ sophisticated development by paying for it is a temporary
illusion.

~~~
vor_
Are you referring to the move to OS X? Apple ran the previous operating system
within the new operating system just to run those old apps, and Carbon was
provided as a transitional API for many years. I'm not sure what this has to
do with anything, really.

A backwards compatibility argument (whatever it may be) doesn't make sense
here, as Clang has been designed to be backwards compatible with GCC-based
projects.

~~~
cpeterso
Plus, being based on NeXTSTEP and the Mach kernel, OS X was not exactly a
"new" operating system.

------
asveikau
Last few times I've seen Phoronix articles my impression is honestly "this is
a troll website, right?"

But I was pretty surprised by the attitude that surfaced very early in the
linked thread on the GCC list, too. Outwardly hostile, as if to say, "well
damn, you losers sure better catch up with LLVM right this instant, otherwise
you're making lame excuses". I don't have any involvement in either project
(other than having used and appreciated both compilers), but this attitude
strikes me as fundamentally misunderstanding open source. Does LLVM have to
fail for GCC to succeed?

~~~
zanny
The argument isn't about one or the other failing. For the same reason COBOL
and Fortran are still used and iterated upon to this day, GCC will also never
"die". It is deeply ingrained in many toolchains used by many companies. The
article is vapor, because the moment a company wants some C++11 feature in
GCC, it faces a value proposition: refactor its code base to work with Clang
(which may not have feature parity with all the optimization / compiler flags
it relies on from GCC), or contribute the missing parts to GCC.

And a lot of companies will still go with the latter. GCC is fine, and won't
"die". The systemic problem for GCC versus LLVM/Clang is that the latter is
modular and has an intermediate representation that makes it much easier to
implement a compiler for a _new_ language. The result is that if you were to
write a new compiler, LLVM is so far and away the best choice that it isn't
even a contest. So it has momentum GCC doesn't, and that momentum means it
will see much more active development.

------
gonzo
The reason that gcc loses the original context is that rms wanted it that way.
The "middle" of gcc is muddy expressly to make it difficult to "plug-in" a
proprietary optimizer.

The forced transition to GPLv3 isn't helping. OS X and FreeBSD have both
largely transitioned to LLVM. Linux is a holdout, but how long will it be able
to prop up gcc on its own?

~~~
DannyBee
I wrote portions of this middle end. In fact, the "middle end" is the least
muddy part of GCC.

What muddiness GCC does have is mainly because GNU coding standards and
portability rules prevented people from doing things right when they had the
time to do it.

------
filereaper
I have to agree with Diego Novillo's comment about Clang's parser. Projects
like the clang_complete vim plugin come to mind. Ctags/Cscope don't really
work too well with overloaded functions, and having something that
"understands" the language rather than just matching patterns helps.

~~~
mpyne
KDevelop has a very sophisticated C++ parser (DUChain, I think they call it)
which even provides semantic information (e.g. you can syntax-highlight
pointers differently from objects, non-virtual method calls differently from
virtual ones, etc.)

Example overview: [http://zwabel.wordpress.com/2009/01/08/c-ide-evolution-
from-...](http://zwabel.wordpress.com/2009/01/08/c-ide-evolution-from-syntax-
highlighting-to-semantic-highlighting/)

~~~
vor_
But one of the points of Clang's toolability is that you don't have to write
your own parser. You can use the compiler's and see what the compiler will
see, avoiding inconsistencies as well as unnecessary work.

~~~
mpyne
DUChain actually might even predate Clang, from what I've been able to dig up
today, but there are a few other points working against Clang supplanting it:

1\. DUChain was designed to support multiple languages (since KDevelop does).
E.g. there is work going on now to offer first-class JavaScript support in
KDevelop, which libclang wouldn't help with.

2\. No one has done the work to verify API/ABI compatibility guarantees,
porting libclang into DUChain, etc.

There's interest though:

<https://bugs.kde.org/show_bug.cgi?id=253650>
<https://bugs.kde.org/show_bug.cgi?id=172622>

------
helmut_hed
LWN has great coverage on the modularity issue, from the time period when
Clang was just getting started: <http://lwn.net/Articles/301135/>

------
Millennium
> Richard Guenther responded by saying, "Note that we can't drive GCC into
> the garage and refactor it for two years."

Wasn't this essentially what the egcs project did? It's possible that GCC
might need another round of this; could doing that provide a potential avenue
out of the current troubles?

~~~
ajross
Not really. EGCS was producing real, usable releases while it was forked. It
wasn't a rewrite in any meaningful sense.

------
Rovanion
Why oh why is the drivel from Phoronix repeatedly at the top of HN? In the few
cases where I have known the people involved in what Larabel writes about,
they have always said that he blows things out of proportion and that little
of what is written is actually true.

------
pnathan
The last two articles from Phoronix that I've seen pee on the FSF/GPLv3
efforts. Do they have an axe to grind, or is this just my sample bias?

~~~
CJefferson
Articles which insult the FSF/GPLv3 seem to always get very highly voted on
ycombinator. I read Phoronix and I hadn't noticed an FSF bias.

On the other hand, the stories also all seem true. There really is a serious
problem with attachment to the FSF being a liability, and RMS really has held
GCC back; he is (in my opinion) the main reason clang/llvm has been able to
move ahead in some areas (in particular tool support), since for years RMS
explicitly rejected any patch which would make gcc "tool-friendly".

------
orionblastar
I learned C programming in 1987 using Turbo C. I recovered several small
programs I wrote for the C class and converted them to work on GCC under
Ubuntu.

Most of the code didn't need changes, except where it handled strings (the
char *string1 kind of string): gets() somehow caused a segmentation fault,
and when I switched to scanf() it worked without crashing. When I used the
math.h header and the sqrt() function, I had to pass the -lm switch to the
compiler to get the program to link.

I rewrote them in C++ using the g++ compiler and found that std::string works
better than the old C strings, and that cin and cout work better than gets
and scanf.

I really don't see much of a problem unless you are converting pre-ANSI C
code to the current standard or something. I think the scoping rules changed,
but back in 1987 we were taught to keep variables used in functions inside
those functions, etc., to keep the scope tight and not to access anything out
of scope. I guess that is why my programs converted so easily? They also
didn't need a lot of resources or libraries, being simple programs written
for the DOS command line using Turbo C.

Should I use Clang/LLVM instead? How about Visual C++ or Borland C++? Am I
doing something wrong by learning with GCC and g++, or should I try learning
under a different C++ compiler?

~~~
iso-8859-1
Why not just write code for the standard? Unless you are using your own stdlib
or something, it shouldn't matter much what compiler you use.

------
splicer
And what about ARM, which is used on the majority of phones these days? AFAIK,
LLVM/Clang is way behind GCC on ARM support.

~~~
Camillo
iOS is compiled with Clang.

------
Shorel
Great!!!

How long until RMS lets us drop the GNU/ part in what he calls Linux?

------
tkahn6
Something interesting to note is that the complexity and internal coupling in
GCC was an _intentional_ design decision by GNU to make it harder for non-free
software projects to make use of gcc.

Here's a pretty good talk about the philosophical motivations behind Clang and
the current work being done on it:

[http://channel9.msdn.com/Events/GoingNative/GoingNative-2012...](http://channel9.msdn.com/Events/GoingNative/GoingNative-2012/Clang-
Defending-C-from-Murphy-s-Million-Monkeys)

~~~
jmillikin
Although the Clang developers claim GCC was poorly adapted to non-compiler use
on purpose, the GCC developers seem to have been very eager to accept patches
that make these use cases work better. If they were truly opposed to people
using GCC as a module in a larger system, why would they go out of their way
to encourage modularization?

And please don't link to that RMS email as "evidence"; RMS has about as much
control over GNU projects' technical details as Charles Manson does over the
music industry.

~~~
shardling
>If they were truly opposed to people using GCC as a module in a larger
system, why would they go out of their way to encourage modularization?

The claim I've seen repeatedly is that this is precisely because they must now
compete with Clang/etc.

~~~
jmillikin
That implies that the previous lack of modularization was a practical
decision, not a political or philosophical one. Implementation of the GCC
plugin system started in 2008. Can you imagine the GCC team completely
changing their philosophical stance almost overnight (Clang's first release
was in mid-2007), just to get more users? It would be as if they'd relicensed
as proprietary software to better compete with VC++.

Previously GCC's primary competitor was ICC, and they were compared based on
the quality of their output. Now GCC's primary competitor is Clang, and they
compete on how user-friendly their interface is. Modularization is a user
interface improvement, just like better error messages or more comprehensive
warnings.

------
droithomme
C++11 is a theoretical language developed by a committee which doesn't really
exist in practice.

That gcc, like most other C++ compilers, does not support it means little.

~~~
pretoriusB
I don't think denial is the best way to cope with the news.

Last time I checked, C++11 has been an ISO standard since last August, with
MS and Clang ramping up their support for it.

~~~
jmillikin
But neither fully supports it yet, just as GCC doesn't. Even when they claim
to support every feature in the standard, that's just the beginning -- next
comes fixing all the bugs and corner cases that naturally arise from a spec as
complex as C++11.

~~~
jlarocco
And? Essentially the same thing happened with C++98, and in the long run it
didn't hurt adoption too much. To me, it seems like the compilers are doing a
better job this time around.

~~~
jmillikin
The point is that it's silly to criticize GCC for having incomplete C++11
support one year after the standard was finalized, and even sillier to point
to other compilers' less complete support as evidence that GCC is falling
behind.

