
Fortran is still a thing - zeveb
https://wordsandbuttons.online/fortran_is_still_a_thing.html
======
Animats
FORTRAN has multidimensional arrays, and the compilers know about them and
optimize aggressively. Most other languages don't. C people just do not get
multidimensional arrays. They think an array of arrays is good enough. I tried
to convince the Go people to put in multidimensional arrays, with no success.

Years ago, I had an amusing fight with the C++ committee over this.

In C++, you can overload the "[]" operator:

    
    
       int& operator[] (int i) 
    

But you can't do that for multiple arguments.

    
    
       int& operator[] (int i, int j)
    

Why? An obscure feature of C and C++ is that the comma means different things
in function argument lists and subscript lists. In

    
    
       foo(a,b)
    

is a function call with two arguments.

    
    
       foo[a,b]
    

is an invocation of the comma operator, which evaluates and discards the first
operand, so it is equivalent to foo[b]. The syntax charts reflect this. So I
proposed that the syntax for comma usage be
the same in both "()" and "[]" forms. If you see "foo[a,b]" in a C/C++
program, it's almost certainly an error. I searched through large volumes of
code and was unable to find a single use of the comma operator inside
brackets.

But no, there's a "save the comma operator" faction.

C people just do not get multidimensional arrays.

~~~
cf498
> C people just do not get multidimensional arrays. They think an array of
> arrays is good enough.

I have to say, I don't get what you are trying to say. Do you have a source
that explains it?

What is the difference between a multidimensional array and an array of
arrays? And what does a FORTRAN compiler optimize here? It is a piece of
memory written and organized in a certain way. What is there to organize? You
basically have a pointer to a memory region with an offset for the position at
row X and column Y.

What am I missing? Are you only talking about the syntax for the final
address? What has that to do with the compiler? Do you mean the parser?

~~~
alfalfasprout
An array of arrays is not necessarily contiguous in C(++). Indeed, if
allocating on the heap, you end up with a bunch of discontiguous memory that's
not necessarily correctly aligned.

A good tensor implementation accounts for strides that are SIMD-compatible
(e.g., each dimension is a multiple of the SIMD register width).

~~~
int_19h
An array of arrays is necessarily contiguous in C - this is implied by the
type. An array of _pointers to arrays_ will, of course, not be contiguous -
and is the only way to get a dynamically sized heap-allocated 2D array in C
(VLAs give you stack-allocated arrays, with all the size limits that entails).

In C++, this all is best handled by a library class.

~~~
cygx
Heap-allocated dynamically-sized NxM matrix in C99:

    
    
        double (*mat)[M] = calloc(N * M, sizeof (double));

~~~
int_19h
Ah, good point. I always forget that VLA types in C99 are actually _types_,
and so you can use them in these contexts as well.

It's a shame they killed VLAs as a mandatory language feature. They didn't
make C into Fortran (which I think was the hope, between them, complex
numbers, and "restrict"?), but they did make some things a great deal more
pleasant to write.

------
wycy
I'm actively working on multiple Fortran projects, including one that's being
written from scratch just in the last year in modern Fortran 2018.

Modern Fortran looks completely different from the old school FORTRAN77 most
people probably imagine. Gone are the days of fixed-format, where the columns
actually mattered, and gone are the days of unreadable ALL CAPITAL LETTERS,
and gone are the days of GOTO.

I developed a Fortran implementation of Python's argparse[0] recently to use
for another project. The code is nothing like the monstrous spaghetti code of
days past---although I do still obsessively line up my code in columns, this
isn't important.

[0] [https://gitlab.com/bwearley/f-clap](https://gitlab.com/bwearley/f-clap)

~~~
tejassanap
Do you know of any learning opportunities such as internships where one can
learn and apply Fortran?

~~~
m_mueller
Look for jobs at weather agencies and in climate research, ocean model
research, and computational physics.

~~~
int_19h
I've noticed that Fortran comes up a lot when talking to people who do fluid
dynamics simulations.

Curiously, it often does so in context of Python. One specific example I can
remember was a library written in Fortran, with unit tests written in Python.

~~~
m_mueller
Fortran and Python go together very well (they complement each other in just
the right ways, and there are all the bindings / data-structure compatibility
with NumPy that you need). Glue code / UI in Python, numerics in NumPy +
homegrown Fortran: that's how I'd implement a numerical model from scratch
today.

~~~
septc
Just out of curiosity, what do you use for calling Fortran from Python (e.g.
f2py, ctypes)? Do you have any suggestion about how to combine them together
(e.g., for parallel calculations)?

~~~
m_mueller
So I haven't yet had the chance to do it myself, but yes, I'd look at f2py
first and at how to integrate it with NumPy.

------
cbkeller
I've done a bit of HPC work, and Fortran is very much "still a thing" there.
MPI[1] is pretty much the only game in town for between-node parallelism, and
MPI comes with interfaces for C, C++ and Fortran. The only other language
that's even been run at petascale is Julia, and as far as I can tell that's
still using Julia's `ccall()` under the hood to interact with the MPI C
libraries [e.g., 2].

Certainly legacy code is part of the picture, but not always as directly as
one might think. Probably the biggest factor is that Fortran compilers tend to
be _very good_ -- partially a chicken-and-egg issue (e.g., Intel puts effort
into ifort because its HPC customers want to use Fortran), but I think there's
also at some level a tradeoff between language convenience and ease of
optimizing into machine code. To give one concrete example, until C99 added
the `restrict` keyword, it fundamentally wasn't possible for the compiler to
optimize C code as heavily as it could optimize Fortran in certain common
situations, because of pointer aliasing issues.

It's probably also worth noting that modern Fortran is a long way from f77.

[1]
[https://en.wikipedia.org/wiki/Message_Passing_Interface](https://en.wikipedia.org/wiki/Message_Passing_Interface)

[2]
[https://github.com/JuliaParallel/MPI.jl](https://github.com/JuliaParallel/MPI.jl)

~~~
m_mueller
The chicken-and-egg thing also applies to GPUs btw., Nvidia & PGI have
supported GPU computing on Fortran for ~8 years, since the early days of CUDA.

~~~
cbkeller
That's a good point. Hierarchical parallelism is becoming increasingly
important, so having one language that can be used both within-node and
between-node is very convenient, and could add to the lock-in factor.

~~~
m_mueller
Good point and this is btw. exactly where Nvidia is heading. There will be a
point in the future where you just program kernels and/or map/reduce functions
and/or library functions and then call them to execute on a GPU cluster,
passing in a configuration for network topology, node-level topology (how many
GPUs, how are they connected) and chip-level topology (grid+block size).

The address space will be shared across the whole cluster, supported by an
interconnect that's so fast that most researchers can just stop caring about
communication / data locality (see how DGX-2 works).

~~~
wahern
> The address space will be shared across the whole cluster, supported by an
> interconnect that's so fast that most researchers can just stop caring about
> communication / data locality

There will always be people who will care because locality will always matter
(thanks, physics). Improvements in technology may make it easier and cheaper
to solve today's problems, but as technology improves we simply begin to
tackle new, more difficult problems.

Today's chips provide more performance than whole clusters from 20 years ago
and can perform yesterday's jobs on a single chip. But that doesn't mean
clusters stopped being a thing.

See also The Myth of RAM,
[http://www.ilikebigbits.com/2014_04_21_myth_of_ram_1.html](http://www.ilikebigbits.com/2014_04_21_myth_of_ram_1.html)

~~~
m_mueller
I do think there's a paradigm shift coming. It's a combination of the ongoing
shift from latency-oriented to throughput-oriented design with the
capabilities shown in new interconnects, especially NVLink/NVSwitch. This
already lets DGX-2 cover a fair amount of what would otherwise have to be
programmed for midsized clusters - if it can be made to scale one more order
of magnitude (i.e. ~10 DGX) I think there's not much left that wouldn't fit
there but _would_ fit something like Titan. There's not that much that is so
embarrassingly parallel that the communication overhead doesn't constrain it,
and if it doesn't, you _again_ don't care much about data locality, as it
becomes trivial (e.g. a compute-intensive map function).

------
marcelocb
I happen to be one of those "real engineers", working in an aerospace company.
When I have to re-do some calculations done 30 years ago (yeah, we need to
have traceability of calculations for that long and more...), I grab the F77
source code, build it with gfortran or Intel Fortran, and believe me, it
builds and runs. From Windows workstations to Linux clusters. Easy.

Then I get the Python code from 6 months ago, and spend hours figuring out the
right Python version and library compatibilities ...

And when I read the Fortran code, made by (physical stuff) engineers, it is
ugly but simple - there are one or maybe two ways to do something. Not the
1000000's of ways of doing the same thing in Python. And I don't have to learn
50 new libraries to understand the code.

By the way, Python has been my language of choice for most stuff in the last
15 years ...

~~~
angus-g
How do you handle answer changes owing to different environments/compiler
versions?

~~~
marcelocb
Almost every time there will be regression test cases to verify the results,
and the new builds will never perfectly reproduce the numerical results. In
general it is not an issue, but it has to be handled case by case.

A bigger issue is that modern compilers will raise many warnings, and simple
inspection will find many bugs, like array or loop boundaries errors, and when
you fix the code you get a different result.

Then it has to be handled case by case as well, since you have to check if
that affects safety of flight.

------
milancurcic
I'm working on a Fortran book with Manning Pubs [1]. You can browse some of
the projects that accompany the book here [2]. At Cloudrun [3], we're running
some 900K lines of parallel Fortran in the cloud for on-demand custom weather
prediction. Yes, it's very much still a thing; it's just that the application
domains aren't as visible in the mainstream media nowadays.

[1] [https://www.manning.com/books/modern-fortran](https://www.manning.com/books/modern-fortran)
[2] [https://github.com/modern-fortran](https://github.com/modern-fortran)
[3] [https://cloudrun.co](https://cloudrun.co)

------
chubot
You can think of R as a wrapper around Fortran, in much the same way that
Python is a wrapper around C.

Not only does building R itself require a Fortran compiler, common packages do
as well. If you look inside your R package tarballs, particularly of solvers,
you will see Fortran.

(Scientific computing libraries in Python also wrap Fortran, but CPython
doesn't depend on it. Base R depends on Fortran.)

~~~
baldfat
And Fortran is still probably faster than C would be in most cases of R's use
of Fortran.
[http://beza1e1.tuxen.de/articles/faster_than_C.html](http://beza1e1.tuxen.de/articles/faster_than_C.html)

There is just a ton of well-written Fortran for number crunching, and there is
zero reason not to use it. You wouldn't gain speed, and you would lose the
decades of stability this Fortran code has given the scientific community.

I am guessing all of us old-timers remember the pain of Fortran in the past.

------
tempodox
> _...scientists and engineers; not computer scientists and software
> engineers, but the real ones._

That's a good one. And well-deserved.

~~~
waylandsmithers
I make this joke myself all the time and yet it does sting to see someone else
write it.

I do generally agree with the point though, and sometimes I feel that I and
people I work with become hung up on things like code elegance and testability
rather than actually addressing the task at hand.

~~~
santoshalper
In the vast majority of scenarios, code maintainability and ability to be
tested are the task at hand.

------
code_duck
It sure is. I know someone who has a very good job writing Fortran for
supercomputers with 20k+ CPUs, powering defense-related research.

Also, I started /r/fortran as a joke and then it was taken over by actual
FORTRAN programmers.

~~~
mindcrime
_Also, I started /r/fortran as a joke and then it was taken over by actual
FORTRAN programmers._

I had the same experience with /r/cobol. Programming languages never die...

~~~
quasse
Someone should do the same for MUMPS[1] if some Epic employee hasn't already
done so.

[1] [https://en.wikipedia.org/wiki/MUMPS](https://en.wikipedia.org/wiki/MUMPS)

------
wglb
Fortran II was my first programming language, and a key element of my first
summer job and of my first full-time job. There were a lot of things about it
that eventually made me dissatisfied enough that I studied compilers to see
why there were such goofy restrictions, dabbling in XPL. I haven't been back,
but germane to this article, 40 years later I visited the professor with whom
I had the summer job, and he pulled out the code that I had written, which was
still in production. It had been changed a bit, but it was still very
recognizable. And it survived because, while it was a small part of a very
large numerical analysis project, it worked.

The article makes me wonder if I had a problem of this sort again, would I
choose Fortran. I'm thinking not, but then again, there are hints that the
language has changed significantly over the last 50 years.

I have used many other languages during that time, and am fond of several of
them (bliss 36, lisp, various assembler) and dislike others (RPG III, COBOL,
perl). Python would be in the middle.

------
electricslpnsld
Tons of linear algebra and optimization code is still actively developed in
Fortran. The most accurate solver I've found for small quadratic programs is a
Fortran code maintained by a grumpy German math professor.

~~~
psmirnov
Link please?

~~~
shaklee3
BLAS is still Fortran.

[https://en.m.wikipedia.org/wiki/Basic_Linear_Algebra_Subprog...](https://en.m.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms)

------
Shivetya
Quite a few languages people declare "dead" are doing just fine. We tend to
associate languages like FORTRAN, COBOL, and RPG with the term legacy, which
implies old and obsolete. Yet these are behind systems we use every day
without being aware of it. Plus, these languages serve their purpose very
well: FORTRAN is excellent for scientific solutions, while COBOL and RPG excel
at business math and the like.

Solutions to problems are not dependent on one language. Existing systems can
have modern new customer-facing solutions implemented while still exploiting
the existing code base. In fact, there are times when it's desirable to keep
what is known to work and just provide new means to access the code and data
behind it.

~~~
sundarurfriend
> RPG

I had never heard of this one!

[https://en.wikipedia.org/wiki/IBM_RPG](https://en.wikipedia.org/wiki/IBM_RPG)
says:

> [D]eveloped by IBM in 1959 as the Report Program Generator - a tool to
> replicate punched card processing on the IBM 1401 then updated to RPG II for
> the IBM System/3 in the late 1960s, and since evolved into an HLL equivalent
> to COBOL and PL/I.

------
z3phyr
People who proselytize programming languages are missing the big picture. It's
always about making computers do stuff, and the programming language of choice
is just an implementation detail.

Having said that, we do need to look into other kinds of computers, and I see
none of the proselytizing there.

~~~
coldtea
> _People who proselytize programming languages are missing the big picture.
> It's always about making computers do stuff, and the programming language of
> choice is just an implementation detail._

What if you're missing the big picture?

Many developers don't care about the stuff we make computers do, but are
passionate about the tools and implementation details of doing it.

If I work on some accounting enterprise software I could not care less for
accounting and the associated business goals. But I could very much like
working with this over that tool and finding nice ways to solve various
problems towards the (still boring to me) end goal.

Unless we're working on our personal/hobby/passion projects, the "stuff we
make computers do" is usually irrelevant to us -- business owners care about
it. But, as programmers, we presumably do care about programming and
programming tools and concepts.

~~~
z3phyr
>Many developers don't care about the stuff we make computers do, but are
passionate about the tools and implementation details of doing it

This is a positive characteristic of a hacker. Very much desirable.

Do note that I was specifically talking about proselytizing in a negative
sense. One can love one's tools and still not take part in a dick-measuring
contest over which tool is better, or actively belittle someone else who loves
their own tools.

There can be a constructive discussion about programming languages (it does
happen among the PL theorists and hackers), but in general PL theorists are a
rare sight.

In a general forum, sadly, many developers do judge a person by what
programming language they use, disregarding what they have created. This goes
against the foundation of hacker ethics.

------
mfsch
As someone working with scientific codes in (relatively) modern Fortran, I
agree that Fortran is still very strong in its niche. I don’t know any other
language that makes it as easy to write fast numerical codes, and it’s
relatively hard to accidentally mess up the performance.

That said, I still find it very much feels outdated as soon as you veer off
its core features. Handling strings is a major pain, writing short functions
feels overly verbose, and proper testing is difficult. The language is full of
arcane details that require reading up on best practices even for simple
things such as defining the precision of your numbers.

So while Fortran still is strong in some areas, I do see it as an outdated
language, just one that doesn’t really have a good replacement yet. Hopefully
Julia can replace it for most use cases, or maybe Rust if they figure out
their HPC story.

~~~
vmchale
> or maybe Rust if they figure out their HPC story.

Does rustc currently have the ability to implement the right optimizations?
One of the big things that FORTRAN can do is prevent aliasing. Can you prevent
aliasing in LLVM?

~~~
steveklabnik
Yes, through the “noalias” attribute. Rustc uses it, well, when it’s not buggy
at least. It’s currently turned off pending an upstream fix; we’ll have it
back on when it’s fixed.

~~~
wocram
Is that the last thing blocking "Fortran parity" in Rust? Is there a working
group HPC would fall under?

Last I checked there was no fast-math on stable, and #[repr(align)] was
recently added.

~~~
steveklabnik
I don't know enough about fortran to say, to be honest.

> there was no fast-math on stable

We don't do global flags to tweak things like this, as a rule. You can do it
on individual operators, or use wrapper types, to get these behaviors.

------
syntaxing
Fortran is still highly relevant in the science community. I worked with a
couple of senior scientists from a national lab, and they mainly knew only
Fortran and C/C++. Almost all their legacy software was in Fortran, so
everyone who joined the group had to learn it too. Not coincidentally, they
mainly taught Fortran and C++ during my physics minor (the same programming
courses the majors took). I know they have introduced Python to the curriculum
these past few years, though.

------
quotemstr
I find that the industry in general has a bias against gradual improvements of
existing projects. Several times throughout my career, I've worked on mainline
supposedly-legacy products while other teams work on greenfield rewrites, and
the result is invariably that I'm able to improve the left-for-dead legacy
project faster than the rewrite group can make something new --- and end up
with an objectively better product.

The difficulty of continuity is overstated. It's very rare that something is
so far gone --- even Fortran --- that it can't be improved by insensible
degrees into something just as modern and usable as something started from
scratch.

------
xiii1408
I disagree.

Why is Fortran still a thing? _technical debt_

Everyone I know in astrophysics or engineering who can use something else
does -- Python, C++, and Julia are the big winners.

The problem is that there are these huge monolithic codebases written in
Fortran that are difficult to stop using. People end up using Fortran because
they need a hydrodynamics code, or a Navier-Stokes solver, or a chemical
kinetics code, but they can't justify building one from scratch -- that's not
how you get a Ph.D., and Ph.D. students and postdocs are the people doing the
brunt of this work.

There's also the added disincentive that even if you do re-implement a solver
in a modern, maintainable way, then your results might disagree with previous
published work--not necessarily because your results are wrong, but because
the published work probably has bugs that no one has been able to discover or
reason about because the code is so complex.

~~~
yiyus
> Everyone I know in astrophysics or engineering who can use something else
> does -- Python, C++, and Julia are the big winners.

I know many other languages, but for engineering work I have started some
projects in Fortran simply because it is a very good fit. Python is not fast
enough, C++ is a huge mess, and Julia has only just been released; it is not
mature enough yet.

You would be surprised by how many people are writing solvers from scratch. It
is not the most common job in research, but it still is an active field.

And your point about results mismatch is true; this is a very real problem,
but it is independent of the programming language used. To give a concrete
example, many people have reimplemented almost identical material subroutines
for FEM packages and obtained different results, and this happens to people
using Fortran and to people using C++ alike.

------
dahart
What is modern Fortran good at? It seems like Fortran's niche has been
scientific computing, specifically high performance linear algebra. (I wonder
if that makes Fortran good for neural networks.)

In college, I remember attending a great talk by a visiting Stanford compiler
prof. who talked about being able to produce faster C++ code by transpiling to
Fortran first than by optimizing the C++ or IR directly. Not sure if that's
still true, but at the time it seemed Fortran was a more restrictive language,
which permitted stronger optimization.

~~~
electricslpnsld
> Not sure if that's still true, but at the time it seemed Fortran was a more
> restrictive language, which permitted stronger optimization.

In Fortran, arguments are assumed not to alias. In C (and, via compiler
extensions, in C++) you manually use restrict to give the same information to
the compiler, but it is a trickier process than in Fortran. Rust inherits this
benefit from Fortran, but last I checked they had all the related
optimizations disabled (and most numerical-computing-friendly features in Rust
tend to be on the back burner).

~~~
johncolanduoni
That was due to a bug in LLVM when noalias was used in conjunction with LLVM’s
stack unwinding facilities. It’s since been re-enabled now that the upstream
bug has been fixed: [https://github.com/rust-
lang/rust/pull/50744](https://github.com/rust-lang/rust/pull/50744)

~~~
cesarb
> It’s since been re-enabled

It's since been disabled again, due to a new bug (which was also reproduced in
C, on both LLVM and GCC): [https://github.com/rust-
lang/rust/issues/54878](https://github.com/rust-lang/rust/issues/54878)

------
lkirk
I think people also forget that there's a decent amount of Fortran in numpy.
I'm pretty sure they are the originators of f2py as well.

~~~
nine_k
+1. It is easy to see if you decide to build it from source.

Python is great as glue. Fortran is great at numeric things (better than C:
guaranteed no aliasing allows for more optimizations).

------
Annatar
"Many believe that it’s complex and outdated. Programming in Fortran is
perceived like riding a horse-driven carriage to work. Fortran?"

That's incredibly ignorant, and shows very clearly how much we suck ass as an
industry, and why we're such a bad industry to work in. I'm currently working
on LAPACK, and that's written in F90. Why? Speed!

To add insult to injury, Fortran is really nice to program in, especially
modern Fortran, especially when targeting GPUs.

~~~
zoomablemind
Speed and trust. Trust in the algorithms that were devised and implemented
throughout its history.

Older codes often had a parameter to control the number of significant digits
to maintain. Perhaps a limitation in those days, but it also allowed one to
control error tolerance. Indeed, engineering it is.

------
geoalchimista
One interesting bit of trivia: in modern Fortran you can do "list comprehension":

    
    
        integer :: i
        integer, parameter :: n = 20
        integer, dimension(n) :: index = [(i, i = 1, n, 1)] ! list comp
    

And this works just like range(1, n+1) in Python (array index starts from 1 by
default in Fortran).

The list comprehension in Fortran can even be nested (because it is
essentially a do-loop), according to this Rosetta Code example:
[https://rosettacode.org/wiki/List_comprehensions#Fortran](https://rosettacode.org/wiki/List_comprehensions#Fortran)

------
photon-torpedo
The article claims that "modern Fortran already has [...] generics", but AFAIK
it's still not possible to write e.g. a type-generic sort function (unless you
restrict yourself to types all inheriting from a common class, a bit like very
early Java). So there's no standard library of basic algorithms and data
structures, and you can't even write one without resorting to metaprogramming
(likely in a language other than Fortran). From what I have seen of Fortran
2018, this will not improve any time soon.

~~~
jabl
You can use "class(*)" and the "select type" construct, and yes, polymorphism.

So, to compare to C++ or Java, it's a bit like inheritance and RTTI. It does
not generate type-specialized functions at compile time like C++ templates do,
which is what's needed for high performance.

~~~
thecleaner
Is there a use case for polymorphism in scientific computing? Just curious.

~~~
jabl
Scientific computing doesn't (only) mean small tight loops doing linear
algebra. Scientific applications can be millions of LOC, with all the
attendant problems of managing complexity at scale. Polymorphism can well be a
useful tool to have in the toolbox. Just because it's not an appropriate tool
when writing high-performance kernel code doesn't mean it's not useful
elsewhere.

------
leeman2016
Of course real programmers code in FORTRAN.

[http://textfiles.com/humor/rp.txt](http://textfiles.com/humor/rp.txt)

> "If you can't do it in FORTRAN, do it in assembly language. If you can't do
> it in assembly language, it isn't worth doing."

------
gnufx
For something quantitative: the most time on general-purpose HPC systems is
likely to be taken by materials science software written in Fortran. There are
numbers somewhere for Archer, the UK "tier 1" system, for instance.

~~~
adw
Specifically, if we're talking about UK solid-state physics/materials science,
some combination of CASTEP, ONETEP, SIESTA, ABINIT and maybe VASP.

I'd expect there to be a bunch of GROMACS/CHARMM time for the protein-folders
too.

~~~
gnufx
VASP is likely to be top. I've not seen much use of abinit. Gromacs, charmm,
and siesta don't use Fortran. See also cp2k, elk, quantum espresso, nwchem,
and dl_poly, amongst others, which do, and are common. Gaussian is
unfortunately common, and I think is Fortran. That's on top of important
Fortran codes in ocean and meteorological modelling, and engineering.
[https://www.archer.ac.uk/status/codes/](https://www.archer.ac.uk/status/codes/)
has a lot of "other", partly because they don't log what actually runs with
whatever is in the Cray compute node kernel which would do a similar job to
audit, for instance.

~~~
adw
SIESTA is absolutely Fortran. I wrote Fortran which is part of it. :)

------
z3phyr
I heard my old physics professor say "In FORTRAN GOD is real"

~~~
celrod
Actually, almost all Fortran code says "implicit none".

~~~
zoomablemind
> Actually, almost all Fortran code says "implicit none".

... which would render GOD as of undeclared type. Compilers are such
'agnostics'.

------
ktpsns
There are certainly many more Fortran users today than 30 years ago. It is not
only widespread in science and engineering education but consequently, of
course, also in research codes. However, there is a gradual shift toward
scientific Python when high performance is not needed. But as the author
notes, Fortran is the prominent language for high-performance programming
because it is so simple and has linear algebra built in at its heart.

------
saagarjha
> It still excels in the good old structured programming. It has features that
> mainstream C-like languages lack. For instance, it can exit or continue a
> loop from a nested loop.

You can do this in C with gotos. The other things that are mentioned are
similarly easy to implement–they just don't have a particularly nice syntax.

~~~
wahern
Fortran has computed gotos (though they are marked obsolescent). Most
mainstream C compilers support computed gotos as an extension, Visual Studio
being one of the oddballs.

Computed gotos can _really_ speed up state machines by allowing you to replace
conditional branching with unconditional indirect branching. I've never seen
switch-based solutions come close, notwithstanding that switch-statement
optimizations are supposedly part of the bread and butter of optimizing C
compilers.

Note that GCC's documentation on computed gotos uses an example that indexes a
static array of label pointers. AFAIU the example does this because it speeds
up linking and because it's more analogous to Fortran code. But it's easier
and faster to simply store and use the label pointers directly (in place of an
integer for indexing the array), and the additional linking cost is irrelevant
compared to the amount of symbol linking required for even a simple C++
program.

~~~
kolinko
Fun fact - Ethereum's Virtual Machine also allows for computed (and fully
dynamic) gotos.

~~~
wahern
That makes sense. Gotos make code generation so much easier, and computed
gotos are even more powerful. Lua 5.2 added goto precisely to make it easier
to machine-generate Lua code. WebAssembly lacks gotos, and that's a serious
impediment -- compiling to WebAssembly requires a special "relooper" algorithm
which, while conceptually simple for most code, doesn't work well for the kind
of critical code for which goto and especially computed gotos are useful. The
relooper doesn't always work, so sometimes the only way to compile to
WebAssembly is to add indirection, such as compiling to an intermediate state
machine.

I think WebAssembly originally required the relooper because it was originally
designed as a purely stack-based VM with properties that made it trivial to
verify the bytecode (similar to BPF). Then things got much more complicated
for performance reasons, and I feel like they probably could have added goto
support at that point with little marginal complexity. It's clearly
possible -- Ethereum did it, BPF permits forward jumps, and eBPF permits both
forward and backward jumps.

------
nyc111
He gives this example:

    
    
      real, dimension(5) :: a = [ 2, 4, 6, 8, 10 ]
      integer, dimension(2) :: i = [ 2, 4 ]
    
      print *, a(i)   ! prints 4. 8.
    

In APL the same is:

    
    
        a← 2 4 6 8 10
        a[2 4]
      4 8

~~~
ww2
I guess the speed of APL is comparable to Python rather than Fortran?

~~~
Volt
If by Python you mean numpy, then yes.

------
ancarda
>There were a few caveats though. The applicant had to be a US citizen at
least 18 years of age, and the code to optimize had to be in Fortran.

Why would you need to be a US citizen to partake in NASA's competition?

~~~
TomMarius
Because nationalism (offering opportunities to aliens is bad!)

~~~
sverige
I think it has more to do with national security; i.e., not sharing our tech
with other nation-states that might use it against us. That used to be more of
a thing a generation ago. For example, it used to be a crime to export certain
computer technology to communist China. Now a lot of our stuff is built there.

~~~
TomMarius
That itself is a result of nationalism, even explicitly.

~~~
sverige
But not for the reason you stated.

~~~
TomMarius
The reason is nationalism. What I wrote in the parentheses was meant as a
tongue-in-cheek paraphrase of some isolation supporter; I thought that was
obvious, because I would never seriously call a person an "alien" (as in
intruder). The reason here is "nationalism" because that includes the whole
history (which is in turn the reason for it).

Sorry for the confusion.

------
hnuser355
FORTRAN is definitely still used all over the place in scientific computing...
If you go to a CS department that does some engineering work and work on
numerical methods, you might have to write some.

------
r-s
At university I worked for a professor translating FORTRAN 77 code to C. Tools
existed for this, but they did not produce human-readable C code.

There is still a ton of Fortran code in academia.

~~~
C1sc0cat
Why? And did you get equivalent performance afterwards?

~~~
jjgreen
Most languages can access C without much hassle (Ruby, Python, ...). F77
predates this, and while you can combine C and F77 for a particular compiler,
it is a painful and unrewarding task (and the result is compiler-specific). It
is often quicker to run the F77 through f2c so you have an entirely C codebase
(even if f2c output is not really readable).

Typically f2c code will be slower than F77.

~~~
TheRealKing
Anything that can access C can also access Fortran these days. Look at the
three newest Fortran standards, over the past two decades.

------
Myrmornis
Both BLAS and LAPACK are written in Fortran, aren't they? So that's all the
core numerical routines for numpy/scipy and R, and I assume many other things.

~~~
gnufx
Sort of. The bits of BLAS that make it performant aren't written in Fortran --
either assembler, or the equivalent SIMD intrinsics. (See OpenBLAS and BLIS
source.)

------
cloudkj
It recently popped up in a place I didn't expect. In checking out some of the
deep learning work around neural style transfer, I found that many of the
examples make reference to the L-BFGS optimizer. Peeling a layer back, it
appears
that a commonly used implementation of L-BFGS is in scipy, which is a wrapper
around a Fortran implementation.

------
rootbear
Obligatory quote when Fortran comes up:

"I don't know what the programming language of the year 2000 will look like,
but I know it will be called FORTRAN." \-- C.A.R. Hoare, ca. 1982

~~~
UncleSlacky
Also "I can write Fortran in any language".

------
ausbah
If Fortran is built for scientific computing, how does it compare to modern
languages used for scientific computing in the same areas, like Julia, R, and
Python?

~~~
mayankkaizen
As other comments mentioned, R is a 'wrapper' around Fortran. Numpy uses a
decent amount of Fortran, and so on.

------
snarfy
I mostly do business software and was curious about doing modeling of physical
plasmas. Sure enough, everything is Fortran.

------
pvaldes
Nobody calls Fortran from Lisp? Several libraries are available, and it would
be interesting to hear something about such a mix.

------
ryanmercer
So is COBOL and several other old languages. Lots of legacy stuff that still
gets used and updated.

------
mhd
So is there a Ratfor2018?

------
youeseh
> Its users are scientists and engineers; not computer scientists and software
> engineers, but the real ones.

If you're going to belittle a portion of your audience then at least try to be
funny.

------
rodrigosetti
I stopped reading when it implied software engineers are not real engineers.

~~~
kilo_bravo_3
Real engineers are liable if their output fails.

What is one of the first statements made in every software license ever
written in the history of computing?

Civil Engineers: Bridge falls down, PE's stamp is on designs, design faulty,
is liable = Real engineer.

Software "Engineer": Program fails, money lost. "LOL must be bugz, btw: THERE
IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER
PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER
EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO
THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM
PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR
CORRECTION." = not an engineer.

Can you imagine if an aerospace engineer submitted the designs for a new
turbine engine with an asterisk at the bottom of the blueprints saying "this
engine design is provided as-is without warranty of any kind either expressed
or implied, including, but not limited to whether or not it will blow up in
flight"?

The programmers for the Space Shuttle program may have been actual engineers,
because I'm willing to bet that they submitted documents guaranteeing the
performance and operation of the code they wrote.

Yep:

>NASA knows how good the software has to be. Before every flight, Ted Keller,
the senior technical manager of the on-board shuttle group, flies to Florida
where he signs a document certifying that the software will not endanger the
shuttle. If Keller can’t go, a formal line of succession dictates who can sign
in his place.

[https://www.fastcompany.com/28121/they-write-right-
stuff](https://www.fastcompany.com/28121/they-write-right-stuff)

So maybe some programmers can be engineers, if they put their money where
their mouths are and don't go "lol bugz, didn't buy support contract" if
something goes wrong.

~~~
C1sc0cat
If a bridge falls down you blame the contractor :-)

This happened at Dar Al Hadassah when I worked there.

~~~
gknoy
You can certainly blame the builder, but the "real" engineers I've talked to
all spoke of traditions like a _ring on their pinky_ so that when they are
signing their name to a design, they are reminded (as the ring touches paper)
that their signature affects real people's lives, and that both their
professional reputation and personal liability might be on the line if they
screw it up.

If an engineer designs my bridge, and it collapses when more than one car is
on it, he can't laugh at me for not giving him a spec that included that
requirement. Contrast this with software, where if a database stores plaintext
passwords, the programmers are probably protected if there is sufficient
documentation that the bosses/customer insisted on it.

I'm grateful to have the flexibility of creating effectively magical things
that can change easily to meet changing requirements, but I also feel like
there will come a time when we will need to have some similar degree of
engineering rigor mandated for software.

~~~
grovesNL
It's beyond traditions in some regions as well. For example, in Canada, the
"engineer" title is protected, so anyone using that title must be a registered
member of the provincial association (wherever they practice).

