
An Updated Analysis for the “Giving Up on Julia” Blog Post - ChrisRackauckas
https://tk3369.wordpress.com/2018/02/04/an-updated-analysis-for-the-giving-up-on-julia-blog-post/
======
goerz
As someone coming from a background of running Fortran programs with runtimes
of multiple days, and having looked into Julia at some point last year, I
found the original blog post's performance complaints quite obviously bogus.
Also, 1-based indexing is more than a bonus for numerical languages!

IMO, C/C++ are terrible languages for numerical computing. In a way, Python is
terrible too, except that it's great at gluing together existing numerical
code, and doing high-level modeling, data analysis and visualization. These
are quite important, of course, but very distinct from the actual numerics
(the "two-language problem" Julia aims to bridge).

All that being said, Julia's JIT-slowness can be a little annoying during
development and testing, when you run Julia scripts on small example data, and
have to wait 20 extra seconds every time you run the script! It would be
really nice if Julia cached more of its compilation, so that any script being
run repeatedly and invoking the same code paths with the same types would only
cause compilation the first time.

~~~
ced
> when you run Julia scripts on small example data

That's not the favored way of working with languages that have a REPL (or
better --- Jupyter notebooks). Ideally, you develop interactively and reload
new code as you go. I only rarely have to restart Julia, so I don't have to
pay the JIT tax very often. Check out Revise.jl.
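
For anyone unfamiliar with that workflow, here is a minimal sketch (the file
and function names are hypothetical placeholders):

```julia
# Hypothetical REPL session: "mymodel.jl" and run_model are placeholders.
using Revise             # load Revise before your own code
includet("mymodel.jl")   # track the file; saved edits are re-evaluated automatically

run_model(small_data)    # first call pays the JIT cost...
# ...edit mymodel.jl in your editor, save, then simply call again:
run_model(small_data)    # only the changed methods get recompiled
```

Because the session stays alive, compiled code for unchanged methods is reused
across runs instead of being thrown away with the process.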

------
ScottPJones
Julia has improved quite a lot since the "Giving Up on Julia" blog was
written, and even back two years ago when he wrote it, he'd missed some
packages that would have solved his issues. I'm very glad that Tom has taken
the time to address this. I still feel that Julia has a great future, and
things will accelerate as soon as v1.0 is released.

------
xiaodai
> It is important to note that the benchmark codes are not written for
> absolute maximal performance (the fastest code to compute
> recursion_fibonacci(20) is the constant literal 6765). Instead, the
> benchmarks are written to test the performance of identical algorithms and
> code patterns implemented in each language.

This is quite interesting. So potentially the code for the other languages is
not the best either. It would then complete the argument to note that these
"code patterns" are more often than not the most "natural" ones, i.e. they
translate well from pseudo-code.

When I was trying Julia I kept trying to vectorize code like I do in R, and it
took me a while to get comfortable with writing loops again. Vectorized code
in R can be quite fast, but there is a big caveat: you need a lot more memory
to do the same task (until ALTREP makes it into base R). E.g. say I want to
find the maximum of `abs(a^2)` in R where `a` is a vector; I would do
`max(abs(a^2))`, which creates a temporary vector. In Julia I can do
`maximum(x->abs(x^2), a)` and achieve similar (if not better) performance
using much less RAM. This is quite important if you deal with large datasets
(e.g. the Fannie Mae data at 1.8 billion rows).
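
To make the memory point concrete, a small self-contained sketch (the sample
vector is made up for illustration):

```julia
# Minimal sketch: the lambda form avoids materializing a temporary array.
a = [-3.0, 1.5, 2.0]

# R-style broadcast: abs.(a.^2) allocates a temporary vector of the same
# length as `a` before the reduction runs.
m1 = maximum(abs.(a.^2))

# Streaming form: each element is squared and abs'd on the fly,
# using O(1) extra memory regardless of length(a).
m2 = maximum(x -> abs(x^2), a)

m1 == m2  # both give 9.0
```

On a three-element vector the difference is invisible, but on a billion-row
column the temporary alone would cost gigabytes.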

~~~
ChrisRackauckas
The code here isn't the best in any language. A purely recursive Fibonacci is
a terribly inefficient way to do the calculation because it recomputes every
`fib(n-1)` several times, making the complexity exponential. But that's what
makes it a great benchmark of recursion. People then try replacing the code
with an O(n) version and say "hey, we beat Julia!", but that completely misses
the point: (1) yes, you can do the O(n) version in Julia too and it will be
faster, but also (2) this is a test of recursion, with "recursion" right in
the name.
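
A sketch of the two versions being contrasted (the exact benchmark code may
differ slightly; this is just the standard formulation):

```julia
# Exponential-time recursive version: exercises call overhead and recursion,
# which is exactly what the benchmark is measuring.
fib(n) = n < 2 ? n : fib(n - 1) + fib(n - 2)

# The O(n) iterative rewrite people substitute, which defeats the purpose
# of a *recursion* benchmark even though it is much faster.
function fib_iter(n)
    a, b = 0, 1
    for _ in 1:n
        a, b = b, a + b
    end
    return a
end

fib(20) == fib_iter(20) == 6765
```

Both return 6765 for n = 20 (the constant the article jokes about); only the
recursive one tests what the benchmark claims to test.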

------
TeroFrondelius
I liked especially this statement: "If you have a large computation project
that takes 30 minutes to process then the 0.5 second startup time means
nothing. This should be the more common case for most business applications,
let alone the numeric/scientific computing community."

It's important to optimize the performance-critical code, and in my use cases
scripts usually run for minutes or even hours.

------
DanGPhoton
I recently passed a happy milestone with my ongoing transition from Matlab to
Julia in that now I'm more likely to get an error in Matlab for using Julia
syntax than the other way around. I am probably using Julia for about 50% of
my work.

~~~
xiaodai
Similar experience with R and Julia. I keep putting `function` first.

------
piever
Nice to see this "debunking". I was especially bothered by its analysis of
performance, which focused so much on startup time (which, btw, has vastly
improved since).

It's hard to remember whether some of the concerns raised were valid at the
time, but as of now none of the criticisms seem to hold. To give a concrete
example: Julia's testing framework (Base.Test) is now very user-friendly and
widely used: with one command at the console you can generate a package that
is already set up to be tested remotely on different architectures. As a
consequence, package coverage is generally reasonably high.
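
For readers who haven't seen it, a minimal sketch of that framework (the
`square` function here is a made-up example; note that from Julia 0.7 onward
`Base.Test` lives in the `Test` standard library):

```julia
using Test

# Hypothetical function under test; the ::Number restriction means
# calling it with a String has no matching method.
square(x::Number) = x * x

@testset "square" begin
    @test square(3) == 9
    @test square(-2) == 4
    @test_throws MethodError square("a")
end
```

Running this prints a pass/fail summary per testset, and the same file plugs
straight into the CI setup that package generation scaffolds for you.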

~~~
eigenspace
I think it's hard to argue that the original article was valid when its
opening argument was, as far as I can tell, a deliberate attempt to
misrepresent Julia's performance.

------
eigenspace
It’s been painful having that older blog post show up as one of the top items
when people search for Julia.

I know the first time I got interested in Julia I was turned off by that
article, and it wasn't till I came back later and started thinking about
trying Julia again that I realized the article was bunk.

------
lekand
It's great to find this piece. After a year-plus of Julia usage I am really
excited about its future, and cannot help but feel that the old blog post
misrepresents what Julia has become.

------
PetrKrysl13
Well done! In any case, Julia today is a very different language from the one
that the original post discussed.

------
baggepinnen
Nice writeup. I never found that the old blog post correlated well with my own
experience; not then, even less now.

------
celrod
Great article providing updates on the language. Let's see this momentum
continue.

------
lekand
Nice!

