
NNSA, Nvidia to create an open-source Fortran compiler front-end for LLVM - ingve
https://www.llnl.gov/news/nnsa-national-labs-team-nvidia-develop-open-source-fortran-compiler-technology
======
brudgers
The sponsoring organization is NNSA rather than NASA. NNSA is the National
Nuclear Security Administration, hence the association with Lawrence
Livermore Labs.

~~~
gh02t
I work for NNSA, actually. There are a ton of codes in the nuclear engineering
sector written in Fortran; it's the lingua franca of the industry, though C++
and Python are taking over for new projects. One of the biggest codes in the
industry (MCNP) is an enormous chunk of Fortran with a lineage going back over
60 years.

It'd be nice to see an LLVM frontend for Fortran; Intel has a stranglehold
with ifort due to their excellent vectorization. Hopefully this project can
provoke some competition.

~~~
thearn4
NASA engineer here. Title typo aside, that actually sounds a lot like our
legacy codebase. Though Python has been making a lot of inroads, thanks
partly to f2py.
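For the curious, f2py's basic workflow is that it takes a Fortran routine and generates a compiled Python extension module around it. A minimal sketch (the routine and module names here are invented for illustration):

```fortran
! saxpy_demo.f90 -- a toy routine to expose to Python via f2py
subroutine saxpy(n, a, x, y)
  implicit none
  integer, intent(in) :: n
  real, intent(in)    :: a
  real, intent(in)    :: x(n)
  real, intent(inout) :: y(n)
  y = y + a * x        ! whole-array update, no explicit loop
end subroutine saxpy
```

Building it with something like `f2py -c -m saxpy_demo saxpy_demo.f90` should then let you `import saxpy_demo` in Python and call the routine directly on NumPy arrays.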

~~~
ThomPete
Sure enough, not only is the NNSA guy here to correct a mistake and add some
valuable context, the NASA guy is also here adding even more.

Man, this community and its diverse group of amazing people. I don't think
there is a single place on the internet as valuable as this community.

Sometimes I just have to pinch myself...

~~~
dang
We fixed the typo rather late, but in this case fortuitously. :)

------
Kristine1975
Here's the announcement on the LLVM developer mailing list:
[http://lists.llvm.org/pipermail/llvm-dev/2015-November/092404.html](http://lists.llvm.org/pipermail/llvm-dev/2015-November/092404.html)

The resulting discussion contains a mini-FAQ:
[http://lists.llvm.org/pipermail/llvm-dev/2015-November/092438.html](http://lists.llvm.org/pipermail/llvm-dev/2015-November/092438.html)

(both via the LLVM Weekly newsletter: [http://blog.llvm.org/2015/11/llvm-weekly-98-nov-16th-2015.html](http://blog.llvm.org/2015/11/llvm-weekly-98-nov-16th-2015.html))

------
s-macke
If this is true I can compile scientific software to JavaScript with
emscripten, including the Fortran libraries BLAS and LAPACK. I like to have
one-click live demos of such software; speed and memory don't matter for
demos.

~~~
harveywi
You can already compile BLAS/LAPACK/etc. to JavaScript, but you have to jump
through some hoops.

Proof of concept: [https://github.com/harveywi/arpack-js](https://github.com/harveywi/arpack-js)

~~~
s-macke
Yes, by using f2c. For BLAS and LAPACK that's a big effort, and it doesn't
work out of the box. The cblas libraries are too old and don't contain the
newer functions.

------
Jerry2
Here is the link to a thread on GCC's mailing list discussing the move by
NNSA:

[https://news.ycombinator.com/item?id=10578354](https://news.ycombinator.com/item?id=10578354)

LLVM has huge momentum now, and it's negatively affecting GCC. GCC has been
in trouble before, but it got out of it by being the best free compiler
available. I don't know how much longer they'll be able to hold on without
making some changes.

~~~
kevinchen
A choice quote from that thread:

> Between 2000 and 2004, [g95 fortran] front-end was coupled to the rest of
> the infrastructure of the GNU Compiler Collection. This was not trivial
> (just as it will not be trivial to couple the PGI front-end to the LLVM
> infrastructure).

They really don't get that LLVM's modularity is a strength...

------
cprayingmantis
Makes sense; both Livermore and ORNL have a lot of older Fortran code that
they still use.

~~~
bsharitt
It's not just to use old Fortran code. Lots of it is still being written in
the scientific computing world.

~~~
truncate
I'm not at all familiar with Fortran. I was under the impression that it's an
old language living on in legacy code. I believe there are other good
alternatives, so why still use Fortran for new code?

~~~
tjl
It's still faster than the alternatives. Things like SciPy/NumPy, as well as
R, are more alternatives to Matlab. Julia isn't fast enough yet. C/C++ don't
have built-in matrix/vector support. The latter is what I ran into when I was
coding for my Master's thesis, so I just gave up and went with Fortran. For
my PhD, I used Matlab, as speed wasn't an issue.

There are also issues with wanting to use old Fortran code in new projects;
if the new alternatives don't have a good interface to Fortran, it's a bit of
a pain to integrate.
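To make the built-in matrix/vector point concrete, here's a small sketch of what standard Fortran gives you out of the box (array intrinsics only, nothing library-specific):

```fortran
program array_demo
  implicit none
  real :: a(3,3), b(3,3), c(3,3), v(3)

  a = 1.0                      ! whole-array assignment
  b = 2.0
  c = matmul(a, b)             ! built-in matrix multiply
  v = a(:,1) + 3.0 * b(:,2)    ! elementwise ops on array sections
  print *, sum(c), dot_product(v, v), maxval(c)
end program array_demo
```

Each of those lines would be a hand-rolled loop or an external-library call in C/C++, which goes a long way toward explaining why numerical codes stay in Fortran.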

~~~
Xcelerate
At least for me, Julia's fast enough in most cases. I switched from C/C++ to
Julia for all of my scientific work, and it runs fine, even on a supercomputer
like Titan (barring something that requires thousands of nodes at once — MPI
is not yet integrated into Julia). You just have to be careful in the tight
inner loops and check the assembly that it outputs using code_native().

Granted, there's plenty of fast libraries written in C/Fortran that don't have
a Julia equivalent yet, and depending on the overhead of ccall(), you may wish
to stick with writing the rest of your code in the language of the library.

~~~
tjl
Good to know. I've been looking at it, but not seriously; it's been more a
matter of keeping an eye on it. I was hoping that Fortress would go
somewhere, but it died.

My other problem is code generation. I can't generate Julia code yet.

~~~
ihnorton
> My other problem is code generation. I can't generate Julia code yet.

Perhaps I am misunderstanding your meaning here, but Julia has a range of
metaprogramming functionality including both Lisp-like macros, as well as
"generated functions" that allow custom code specialization based on input
type signatures ([https://medium.com/@acidflask/smoothing-data-with-julia-s-generated-functions-c80e240e05f3](https://medium.com/@acidflask/smoothing-data-with-julia-s-generated-functions-c80e240e05f3)).
These capabilities have been used for DSLs (e.g.
[https://github.com/JuliaOpt/JuMP.jl](https://github.com/JuliaOpt/JuMP.jl))
and parser generation (e.g.
[https://github.com/abeschneider/PEGParser.jl](https://github.com/abeschneider/PEGParser.jl)).

~~~
tjl
No, I'm talking about generating Julia code from elsewhere. I'll derive some
math in another program (e.g., Maple, Mathematica, SymPy) and output code that
represents that math. I can't do that with Julia right now.

~~~
ihnorton
I see. Indeed, Julia's ecosystem is still quite young. (possibly of interest:
a project building a Julia-integrated CAS called Nemo, at
[http://nemocas.org/](http://nemocas.org/))

------
sunnyps
OK. You can stop dreaming now, come back to reality please.

Sometimes this community is so cringeworthy.

~~~
ThomPete
Sorry that my happiness for this community makes you cringe :)

~~~
sunnyps
It's not your happiness that makes me cringe but the borderline cultish
comment that there's no "single place on the internet as valuable as this
community". Just reinforces the stereotype that this community is full of
itself. It would be great if we could all be a bit more grounded.

~~~
ThomPete
What on earth are you talking about? I have literally never expressed my
happiness over this community before. But just because I mention something
once, after having been here for I don't know how many years, I am suddenly
cultish?

It sounds more like a cultish anti-sentiment if anything.

~~~
themartorana
Some people just look for parades to rain on.

------
tedks
GCC has had an open-source Fortran compiler front-end since always.

The only possible reason for Nvidia to do this is because they want to have
proprietary compiler addons someday.

Remember that this was a predictable outcome when you're inevitably burned by
this.

~~~
cwyers
Or, alternatively, Nvidia is doing this because their current CUDA compiler
works with LLVM. And a possible reason that they built their CUDA
infrastructure on LLVM is because the people making GCC privileged ideology
over technical merit and made it much harder for tools (even open source or
libre ones) to integrate with the compiler. Remember that this was a
predictable outcome when you're inevitably burned by this.

~~~
tedks
A rewrite always seems to me to result in a cleaner codebase faster than
incremental refactoring does. That said, GCC today is a lot easier to
interoperate with than the GCC of pre-LLVM times.

But, no amount of technical merit can save you when you don't have a free
compiler anymore. Don't say Stallman didn't warn you, because he did, just
like he warned you about all the other crap.

Remember, when you're getting burned by this, that you chose to make snarky
comments on the Internet instead of realizing what the moral path forward was
and doing the ethical thing.

