

The Go Language is faster than the Computer Language Benchmarks Game thinks - gbin
http://klaig.blogspot.be/2012/09/the-go-language-is-faster-than-computer.html

======
CJefferson
The title is misleading, and the article is hard to read.

The central complaint seems to be that the language shoot-out uses the 'go'
compiler. This doesn't seem surprising, seeing as this is what the go
website tells users to use.

There is, however, a gcc frontend for go, complete as of July and still not in
many distributions (this isn't a go problem; many distros haven't updated yet
to gcc 4.7.1).

So the real headline should be 'The Computer Language Benchmarks Game uses
Google's go compiler; gccgo is faster'

~~~
markokocic
That is one of the things I miss in the Shootout. Before, multiple
implementations of each language were allowed, and users could see how big the
difference between Lua and LuaJIT is, or how Python 2 compares to Python 3.

Now that there is only one implementation for each language allowed, some
important information is missing from the site.

~~~
malkia
Wait... Is luajit back in the benchmark shootout?

~~~
masklinn
no

------
masklinn
1\. Languages are not fast (or slow), implementations are. The language
semantics can help or hinder, but in and of itself a language doesn't run.

2\. This is old news. The Shootout was debatable before, but 12~18 months ago
it suddenly decided that only one implementation of each language would be
allowed (picked completely arbitrarily and according to no clear-cut rules,
and sometimes two implementations anyway, e.g. Python gets CPython 3 and Lua
gets Lua, but Ruby gets both MRI 1.9 and JRuby, Erlang gets HiPE and
Javascript gets V8) after — as far as I understand — a spat with the pypy team.

~~~
wheaties
JRuby really is a different language than Ruby. Not only is it built on a
different ecosystem of libraries (anything on the JVM vs anything for Ruby)
but there are language extensions that make it operate differently (Java
MBeans being one of them.)

~~~
masklinn
> JRuby really is a different language than Ruby.

No, JRuby is a JVM-based implementation of Ruby.

> Not only is it built on a different ecosystem of libraries (anything on the
> JVM vs anything for Ruby)

Irrelevant, you can make the same claim about any implementation of a language:
GHC has extensions to Haskell, why isn't JHC benched as well? IronPython can
use .Net libraries but not Python C-API libraries, why isn't it there?

> there are language extensions

Which are not used in the shootout and — again — are irrelevant anyway.

------
jbellis
A trivial benchmark that doesn't actually run long enough or do enough
allocation to exercise the GC isn't going to be representative of real-world
performance in any case.

~~~
igouy
A trivial assertion without supporting evidence for the claims.

------
anuraj
Go is unlikely to be faster than Java (the JVM is highly optimized) in the
near future. So rerun your benchmarks!

~~~
Tuna-Fish
In a sufficiently short test (where the JIT cannot show its chops) I'd expect
go to win.

The tests in the benchmark game are totally useless anyway. When given a load
that does not stress the L1i, modern CPUs perform pretty much completely
differently than they do on real loads with more than 32kB of code. Given such
a small code snippet, the CPU will never fail a (theoretically predictable)
branch, will never spill L1, and, on SNB, will in fact never even have to
decode an instruction after the uOP cache is primed.

~~~
batista
> _When given a load that does not stress the L1i, modern CPUs perform pretty
> much completely differently than they do on real loads with more than 32kB
> of code._

Even if that's the case, one would assume that would hold for ALL languages,
so you get a fair comparison at that...

~~~
lmm
In the same way that a bicycle race is a fair way to compare sprinters as long
as you have all of them ride bicycles. It's a fair benchmark, but it's not
benchmarking the case that's actually important.

~~~
batista
No, it's like comparing sprinters by having them run, and someone telling you:
"hey, the problem is not that sprinter X is ten times slower. The problem is
what you want to do in the real world. Do you want to go from A to B? Why not
take the bus, etc".

[I'm not saying this about the L1i cache case specifically, which is a good
point, but rather about the prevalent response to any benchmark in Go-land, a
defensive attitude which I have not witnessed in any other language community.
Usually Python, JS, Ruby, Rust etc guys get on to fixing such microbenchmark
behavior or explain why it's as it is. Go guys just propose you forget about
it and rewrite your code in another way].

I don't care about a specific real world case of getting from A to B, or how
it can be done faster in another way. I care about measuring sprinters. The
case that's important here is benchmarking itself. How fast each and every
instruction of a similar type executes in a language.

E.g. if I do:

for i in range(10): print i

and

for i := 0; i < 10; i++ { fmt.Print(i) }

I don't care if this kind of code is not representative of an actual program,
I don't care if the code inside the loop might take more time in most cases, I
don't care if I can write some particular program using some other structure.

I only want to know why this takes X time in Python and c*X time in Go, and if
the c factor can be improved.

Suggestions about "real world programs" and "write this another way and then
measure" in this regard are counter-productive, because they focus not on raw
benchmarking the language but on specific cases.

~~~
Tuna-Fish
I'd like to point out that I'm not from Go-land, and in fact I find the
language quite disappointing. However, the fact that the benchmarks are rather
meaningless is a real problem, because languages use resources differently,
and thus resource exhaustion affects them differently.

A notable example is C++-style templates (typically every codepath is expanded
independently) vs java-style generics (one codepath with checks for types).
C++-style absolutely dominates in small benchmarks, because it provides the
best possible code. However, if you use a lot of them on different types, it's
very easy for them to totally ruin your performance due to resource
exhaustion. Some of the very largest performance gains I've ever gotten from a
small optimization (on the order of 10x real improvement for a spot change to
a few dozen lines, with no change to algorithms) involved removing C++
templates, storing tags, and adding a few case statements based on them.

> I'm not saying this about the L1i cache case specifically, which is a good
> point

Note that L1i is not the only, or even the biggest offender. BTB slots are
another very critical one. Predicted branches are essentially free on modern
OoO cores, and business logic often contains inhuman amounts of what's
essentially if trees. Have enough branches in your main loop that the BTB
overflows, and thanks to LRU replacement, all those ifs turn from costing half
a cycle to costing ~10 cycles or so. Properly optimizing for that case is very
different from optimizing for some microbenchmark with 100 bytes of code.

And again, this is not praise for or an indictment of some language. I'm
not saying that C++ is slow, it was just an easy example of a case where
things change when the codebase gets bigger. What I am saying is that the
performance shootout is essentially useless, and promotes the kind of
optimizations that either don't really help, or even hurt speed in the large.
If performance actually matters to you, you are much better off measuring
things like competing mature XML parsers or the like. While there are a lot of
things wrong with using that to measure the speed of your favorite language,
it's still _less wrong_ than using a five-line microbenchmark.

~~~
igouy
>>If performance actually matters to you, you are much better off measuring
things like competing mature XML parsers or the like.<<

Please tell us where we can see that comparison!

>>less wrong than using a five-line microbenchmark<<

The trouble with that hyperbole is that someone's already mentioned their
five-line microbenchmark in this discussion --
<http://news.ycombinator.com/item?id=4544096>

The alternatives to the benchmarks game that people come up with are usually
worse, not better.

------
markokrajnc
I tested a simple Go program with a for loop counting from 1 to several
million, doing simple integer calculations, and did the same with Java. Java
was 4-5x faster than Go.

It looks like the HotSpot VM is pretty well optimized and that Go still needs
a lot of compiler optimizations...

I know that this test isn't representative or complete, but it is a good
smoke test for a first comparison. C and Go should be faster than Java in pure
integer numerics (or at least as fast as Java)...

~~~
supersillyus
Why should C and Go (AOT compiled) be faster than Java for pure integer
numerics? For long-running simple tight loops of numerics, I'd assume HotSpot
would be faster.

~~~
igouy
Don't assume -- measure.

------
SeanDav
OT:

I am a bit of a fan of dark backgrounds, but you've got to be careful with the
colours for links etc. Dark grey on darker grey is not very visible.

Additionally, the article is not clearly laid out, and if it didn't have "Go"
in the title it would not have had a chance at a high front-page spot on HN.

Oh well such is life at the moment.

~~~
gbin
Indeed, fixed the horrible link color. I also agree, Go is really hyped at the
moment.

------
halostatue
Personally, I'm surprised that anyone takes the shootout seriously. In my
experience, benchmarks of this sort are about as useful and trustworthy as GPU
benchmark results provided by a GPU vendor.

After having spent a little while looking over the site, I've got to give
igouy props for fixing many of the worst problems that existed on the site a
few years back (some of the micro-benchmarks that used to exist were
questionable, at best; some of the implementations were even worse). It is a
far better presented, qualified, and curated site than it used to be.

I still don't think it's a _useful_ site and that people would be better off
ignoring it, but with the changes made, I don't think it's a _harmful_ site
anymore.

~~~
igouy
>>useful<<

"#1. To show working programs written in less familiar programming languages"

>>trustworthy<<

Are you saying that you believe the measurements to be falsified?

~~~
halostatue
Benchmarking is, in general, useless without context. I believe that something
like the CLBG fails "useful" in the sense that the curated programs are
themselves of limited utility. Showing different benchmark implementations
illustrates _some_ things about different programming languages, but IMO these
differences are of limited utility when learning languages.

Do I believe that your measurements are falsified? No. But generalized
benchmarks (that is, without meaningful context to the problem that you're
trying to solve) are truthy, at best—sort of like statistics presented out of
context (such as the 47% figure floating around in political circles).

As I said, I think the way that the CLBG is presented now is much better than
it was presented as the Shootout. I just don't think that it's useful toward
real software development.

~~~
igouy
I'm sorry about the reply I made a few days ago -- it doesn't seem to address
what you wrote.

>>limited utility<<

I don't think there's anything on the website that suggests the benchmarks
game provides some sort of perfect, definitive and ultimate statement about
anything at all.

On the contrary -- "Here you'll find provisional facts about the performance
of programs written in ≈24 different programming languages for a dozen simple
tasks."

>>useful toward real software development<<

That would depend on how well informed the "real" software developers are,
and there seem to be plenty of programmers with strange ideas about languages
they haven't used.

------
Rickasaurus
One datapoint is not meaningful here. Give it a go with the entire suite.

------
Uchikoma
As long as someone in our industry writes "smoking scala, clojure, java and
lisp" we have not progressed from being children, playing with toys in our
sandbox with a mine-is-bigger-than-yours attitude. Sad.

~~~
ryeguy
Oh please. It's a simple informal choice of words. To many people, programming
languages are toys too, because programming is a hobby. It doesn't have to be
all uptight super business speak all the time.

~~~
silentOpen
It doesn't have to be tribal, either.

~~~
silentOpen
Is it really so hard to understand that the language shootout promotes the
same sort of (misinformed) tribalism-by-language that the blog post author
both dislikes (otherwise, why concern yourself with relatively pointless
microbenchmarks of single implementations?) and reinforces (apparently
"smoking" other languages is good)? What happened to objective assessment?

~~~
igouy
Is it really so hard to understand that it hasn't been called _the language
shootout_ for over 5 years - and you are promoting "the same sort of
(misinformed) tribalism-by-language" by talking about it in those terms?

My guess is that your mistaken view of what the benchmarks game promotes
reflects a lack of familiarity with what the website says about itself on the
homepage and the Help page and the Conclusions page and ...

~~~
silentOpen
_Is it really so hard to understand that it hasn't been called the language
shootout for over 5 years - and you are promoting "the same sort of
(misinformed) tribalism-by-language" by talking about it in those terms?_

The "benchmarks game" is largely the same as when it was called the "language
shootout" now with more caveats plastered about. Please explain to me which
language tribe I am promoting by calling the "benchmarks game" by its original
name? Is it the tribe of skeptics? Perhaps use an HTTP 301 to redirect
<http://shootout.alioth.debian.org/> to
<http://microbenchmarks.alioth.debian.org/> and expunge "shootout" from all
names?

 _My guess is that your mistaken view of what the benchmarks game promotes
reflects a lack of familiarity with what the website says about itself on the
homepage and the Help page and the Conclusions page and ..._

Instead of guessing what my "view" "reflects", consider what the "benchmarks
game" has promoted in TFA:

Title: "The Go Language is faster than the computer language benchmarks game
thinks it is"

Well, this isn't what the "benchmarks game" says it measures anywhere (in
fact, the site explicitly decries this interpretation) and yet this is what it
indirectly promotes in various language communities (and linkbait tastes
good).

Conclusion: "So Go should be between C and Clean, smoking scala, clojure, java
and lisp."

Again, this isn't what the site says it measures (it measures some aspects of
some implementations, not languages) and yet this is how it is portrayed by
third parties. I understand that it is difficult to give people a lot of
'easy' data and simultaneously educate them about the reasonable limits of
inference based on those data. I understand that the benchmarks game has taken
great pains to try to educate its readers and inform their conclusions. I
don't know if it will ever be enough -- people (as we can see) like easy
answers and aren't willing to 0.) do the work to get more complete data sets
and 1.) temper their findings with objective reality instead of drawing
misinformed conclusions.

Do you see how the easily-digestible plots (indirectly) promote tribalism over
rationality? Do you see how a plot is worth a thousand words?

~~~
igouy
Talking about the benchmarks game as a shootout promotes "the same sort of
(misinformed) tribalism-by-language" -- I didn't say you were promoting a
particular language tribe.

>>Perhaps use an HTTP 301 to redirect<<

Perhaps that was understood 5 years ago, and there are obstacles to that
approach. (Not that I think that's actually why people like to talk about _the
shootout_.)

>>promoted in TFA<<

Here you are -- <http://news.ycombinator.com/item?id=4544096>

He's talking about this --
<https://groups.google.com/d/topic/golang-nuts/KqA1Jlpu2nM/discussion>

He came up with it all by himself, the same way people make factorial
performance comparisons.

>>(and linkbait tastes good)<<

This kind of traffic-less blog post isn't a benefit -- stackoverflow is by far
the main source of link traffic, and much more traffic comes from direct
search.

>>Do you see how the easily-digestible plots (indirectly) promote tribalism
over rationality?<<

Do you see how you can intervene and explain what the plots actually say?

------
batista
> _The Go Language is faster than the Computer Language Benchmarks Game thinks_

That's what Go advocates always say, but, for a static language with tight
memory structures, it's not that fast at all.

Now, he does compile with gccgo, which is faster, but using the go tool I
frequently find that for lots of use cases it's about 50-100% faster than
Python, or 2-5 times slower than Java.

Not very impressive -- it probably needs a better compiler with more
optimizations and a better GC.

When you point that out to the golang mailing list, it's mostly met with
la-la-hands-in-the-ears denial, and suggestions to rewrite your program in a
smarter way and micro-optimize it. But I don't want to find the optimal
algorithm for something -- I want to compare speed, and for that same-ish
algorithms in same-ish languages are fine.

(Though Russ Cox said the current HEAD has a lot of optimizations and is quite
a bit faster, so things might be improving there.)

~~~
tptacek
Not meaning this snarkily, but: if you write a program in Java and it's 2x
faster than a Go program written in the same style, why not just write in
Java?

~~~
batista
Well, and I mean this kinda snarkily, who said people don't?

I, for one, found that for some kinds of work I do, Go doesn't give that much
of a performance advantage, so I'm exploring other languages.

That said, even if Java gives you 2x, that doesn't mean that you shouldn't
complain about Go.

E.g. you might like Go's semantics and syntax, or it might have other benefits
for you. That you like those things about Go doesn't mean you don't get to say
"shame about the performance".

~~~
tptacek
I guess I'm just saying, Go's syntax and semantics aren't enough of a win for
you to deal with the performance hit you take compared to Java. So use Java.
It's not a moral issue.

~~~
batista
Well, I hoped to use Go to avoid both:

a) writing in C for performance (and having to setup glib et al to make it
comfortable)

and:

b) carrying the JVM with me to run something.

(And, no, I won't consider AOT compilers for Java. I want to deal with
reasonably mature and widespread technologies and tooling).

So, those are my constraints. Given those, maybe C++ is the ticket. D is not
mainstream enough, and Rust I've played with and like very much, but it is
still in flux.

~~~
tptacek
I wrote three paragraphs of response here, as is my wont, and realized before
hitting "reply" that that's the wrong tack.

Let's be specific. What are you building or considering, where your current
options are C and the JVM? How does Go fall down on that work load?

I've got a fair bit of professional experience in both C (~10 years shipping
code continuously in it) and Java, and I've spent the last couple weeks in Go
--- not long enough to be a true believer, but enough to have some practical
experience.

