I had the same reaction: Fortran? Still used? Omg! But after three years, I learned to like it. In its modern form, it is simple, fully vectorized, and fast. I like its syntax more than Python's (an explicit "end" is better than significant whitespace). And did I mention it is fast?
So — you have a modern, vector-math-oriented language comparable to MATLAB or NumPy, but much, much faster. And with excellent, state-of-the-art libraries. What's not to like?
Here's my gripe with Fortran: as a computer scientist who works with combustion chemists, astrophysicists, and nuclear physicists, it's disheartening to see the number of domain-specific optimizations that these scientists hand-code over and over and over, ad nauseam, in hundreds of code bases all around the world, simply because they think their Fortran compilers are good enough.

They don't realize that it would be much more productive if their communities standardized around individual DSL compilers capable of doing higher-level transformations specific to their individual domains, so they don't have to do complicated transformations by hand (and often mess them up). It's fine if those DSL compilers generate Fortran out the back end so they can interoperate with existing optimized Fortran libraries, but holy hell people, we are living in the 21st century: can we please stop programming with primitive data types (without any unit type checking, to boot!) and calling MPI send/recv directly?

Scientists have better things to do with their time than managing bits directly; raise the level of abstraction by building community-specific DSLs, standardize, and let's get on with our lives.
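To make the "unit type checking" complaint concrete, here is a hypothetical, minimal Python sketch of what even a trivial unit-aware type buys over bare floats. Nothing here comes from a real DSL; the `Quantity` class and its string units are invented for illustration (libraries such as Pint do this properly, with full dimensional analysis):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    """A hypothetical, minimal unit-carrying number (illustration only)."""
    value: float
    unit: str  # e.g. "m", "s", "m/s"

    def __add__(self, other):
        # With bare floats, adding meters to seconds compiles and runs silently.
        # A unit-aware type turns that silent bug into an immediate error.
        if self.unit != other.unit:
            raise TypeError(f"cannot add {self.unit} to {other.unit}")
        return Quantity(self.value + other.value, self.unit)

distance = Quantity(3.0, "m")
time = Quantity(2.0, "s")

print((distance + Quantity(1.0, "m")).value)  # 4.0
# distance + time  # would raise TypeError: cannot add m to s
```

A real DSL compiler could check this statically and still emit plain Fortran arrays of primitives out the back end, so nothing is lost at runtime.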
However, I'm not so sure about the speed argument anymore. And in any case for most applications (which are rarely big simulations) it's not worth the sacrifice of readability and modern tools IMO.
Even at CERN, people are rewriting some tools in Python (probably wrapping some C++ code, but still... it's not Fortran).
It may be that his use case (something with cloud or other weather-modeling functions) was just particularly well suited to Fortran.
In my opinion, Fortran is easier and safer to use for scientists with little background in coding because it was built to do math/simulations really well AND it's readable almost like pseudocode (compared with C or C++). Also, modern Fortran is object-oriented, which makes building reusable tools and large packages pretty easy.
Having said all of this, I think a lot of the physics/nuclear/aerospace/finance community is probably transitioning to more modern languages and starting to employ people with actual CS backgrounds to bring in more up-to-date coding practices.
The focus on speed of execution also downplays two other important aspects of Python: speed of writing code and the ability to visualize results easily. As a theorist, these two aspects outweigh any argument in favor of Fortran for me. The only serious competition to Python, for me, is Mathematica.
When you add in the problems from legacy C compatibility, you have a language that no one person can ever hope to grasp in its entirety. I mean, it's so complex now it pushed Scott Meyers into semi-retirement.
I wish they had just dumped the C legacy stuff and started over some time in the '90s. Actually, the language D is precisely what I wish C++ had become. I just wish D were more widely adopted.
I urge colleagues and students not to fall for Fortran prejudices based on crusty old Fortran 77 code. Python is fine and dandy, but it's not for HPC. If you suggest that students stop using Fortran/C and use Python, they may have been using the wrong tool in the first place, and that has nothing to do with the language.
If you want to get a real-world impression, look at the other Rust implementations that roughly correspond to the Fortran code. They are almost twice as slow. This gives you some real-world insight into how much performance you can achieve using Fortran instead of Rust while spending the same time writing code.
Overall, good points!
No. Although the Rust program was initially presented to me as a "port of the fastest C SIMD variant", the programmer made additional optimizations not found in the C program:
- Moving the loop from outside into "bodies_advance(..)" (SSE pipelining(?))
- Bundling intermediate variables/arrays into "struct NBodySim" (caching)
- Fitting array sizes within "struct NBodySim" to the number of bodies (caching)
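For readers without the benchmark code at hand, the structure those points describe can be sketched as follows. This is a hypothetical, slow pure-Python illustration of the shape of the change (all state bundled in one object, the step loop moved inside the advance routine), not the actual Rust or C benchmark code:

```python
import math

NBODIES = 2  # fixed size, mirroring "fit array sizes to the number of bodies"

class NBodySim:
    """Bundles all simulation state in one object (the 'struct NBodySim' idea)."""

    def __init__(self, positions, velocities, masses):
        self.pos = [list(p) for p in positions]
        self.vel = [list(v) for v in velocities]
        self.mass = list(masses)

    def advance(self, dt, n_steps):
        # The step loop lives *inside* advance, mirroring the Rust change of
        # moving the outer loop into bodies_advance(..): per-call overhead is
        # paid once, and the hot state stays together across all steps.
        for _ in range(n_steps):
            for i in range(NBODIES):
                for j in range(i + 1, NBODIES):
                    dx = [self.pos[i][k] - self.pos[j][k] for k in range(3)]
                    d2 = sum(c * c for c in dx)
                    mag = dt / (d2 * math.sqrt(d2))
                    for k in range(3):
                        # Equal and opposite updates: total momentum is conserved.
                        self.vel[i][k] -= dx[k] * self.mass[j] * mag
                        self.vel[j][k] += dx[k] * self.mass[i] * mag
            for i in range(NBODIES):
                for k in range(3):
                    self.pos[i][k] += dt * self.vel[i][k]

sim = NBodySim(positions=[[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]],
               velocities=[[0.0, 0.5, 0.0], [0.0, -0.5, 0.0]],
               masses=[1.0, 1.0])
sim.advance(dt=0.01, n_steps=100)
```

In a compiled language, keeping the whole state in one fixed-size struct is what lets the compiler keep it in cache (and registers); in Python the structure is the same, only the payoff is missing.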
An example where it got harder for me to find code is the numerical approximation of the CDF of bivariate/trivariate normal distributions. This is of course just my example, but I'm sure there are a lot of similar operations that are really hard to rewrite because the math is so complex.
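For concreteness, here is a stdlib-only Python sketch of one way such a routine can be written, using Plackett's identity Phi2(h, k, rho) = Phi(h)*Phi(k) + integral from 0 to rho of phi2(h, k; r) dr, with a plain Simpson rule for the integral. This is a toy under those assumptions; production code (e.g. Alan Genz's Fortran routines, which SciPy wraps) handles accuracy and edge cases far more carefully, which is exactly why it is hard to rewrite:

```python
import math

def phi(x):
    """Univariate standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bvn_density(h, k, r):
    """Bivariate standard normal density at (h, k) with correlation r."""
    s = 1.0 - r * r
    z = (h * h - 2.0 * r * h * k + k * k) / (2.0 * s)
    return math.exp(-z) / (2.0 * math.pi * math.sqrt(s))

def bvn_cdf(h, k, rho, n=200):
    """P(X <= h, Y <= k) for a standard bivariate normal, |rho| < 1.

    Plackett's identity: integrate the density over the correlation
    parameter from 0 to rho; composite Simpson rule, n must be even.
    """
    step = rho / n
    total = bvn_density(h, k, 0.0) + bvn_density(h, k, rho)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * bvn_density(h, k, i * step)
    return phi(h) * phi(k) + total * step / 3.0

# Closed-form check: Phi2(0, 0, rho) = 1/4 + asin(rho) / (2*pi),
# so Phi2(0, 0, 0.5) should be exactly 1/3.
print(bvn_cdf(0.0, 0.0, 0.5))
```

Even this simple version hides choices (quadrature rule, number of nodes, behavior as |rho| approaches 1) that real libraries get right only through decades of accumulated numerical care.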
The reason this kind of code is easy to translate is that there is a direct mapping of language features (code that is difficult to translate is code that uses unique language constructs). The only real difficulty of mapping (non-object-oriented, non-distributed) Fortran into a higher-level language is keeping the speed.
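As a hypothetical illustration of that direct mapping (not taken from any actual port), a whole-array Fortran statement and its NumPy counterpart line up almost token for token:

```python
import numpy as np

# Fortran 90:  a(:) = b(:) + s * c(:)
# NumPy:       a    = b    + s * c     (same element-wise semantics, no explicit loop)
b = np.array([1.0, 2.0, 3.0])
c = np.array([10.0, 20.0, 30.0])
s = 0.5
a = b + s * c
print(a)  # [ 6. 12. 18.]
```

Keeping the speed then comes down to whether the whole-array expression is dispatched to compiled loops, which NumPy does; it is scalar-at-a-time Fortran loops that translate directly but run slowly in an interpreted language.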
I was looking at Distributions.jl when I was searching for it, but didn't find it there.
Update: some of my coding style is still based on what I picked up from looking at that NAG code :-)
I believe the real value of Fortran is in these libraries that handle numerical edge cases well. It would be much easier to move them to a newer language if they contained unit tests, but they usually don't, so sometimes it's just not worth the risk.
In my first computer lecture, the lecturer asked why Fortran does not die. It should. That was in 1979.
Up to a few years ago, I often found myself in discussions with colleagues on the benefits of Fortran vs. less horrible languages such as C. And I always thought that we in academia (physics) are so backwards compared to CS people, who buried Fortran a long time ago. There are even colleagues who use Fortran 77.
Fortunately, things are changing fast now, with more and more people using Python. I learned it myself about a month ago and urged my students to stop using Fortran and/or C. I see zero reasons to torture a new student with Fortran. I am using Jupyter notebooks with my students now and things are so much better: easier/faster debugging, appealing code and interface, higher-level features, visualization tools, etc.
In the context of numerics, C is in no way a "less horrible language" than Fortran. The necessity to use pointers in C for multidimensional arrays, and the large amount of "undefined behavior" in the C standard makes it almost impossible to write bug-free number-crunching code. In particular when the person writing the code is a scientist/engineer, not a professional programmer or computer scientist, they are guaranteed to shoot themselves in the foot with C or C++.
Despite the enormous amount of snark directed Fortran's way, practitioners in HPC still use it, and Python cannot seriously be discussed in the same context.
As of 2019, the speed argument is mostly academic, though, IMO. Few people run cosmological simulations on a daily basis and really need to care about this.
It's only free for students and educational use, not for any commercial or non-educational research use.