Fortran is back on the top 20 TIOBE index list (zdnet.com)
36 points by milancurcic on April 6, 2021 | 43 comments



Title is clickbait; suggest it be changed to something like "Fortran is getting more popular".


An even better title would be "We do not know if Fortran is getting more popular or not, because our only data source is TIOBE."


I edited the title again to not confuse popularity with TIOBE ranking.


Even better, thanks.


Done, thanks!


I was surprised to find Fortran has continued to get regular upgrades every 5-10 years.

Some interesting ones:

1995: "Initialization of pointers to NULL()"

2003: "Object-oriented programming support: type extension and inheritance, polymorphism, dynamic type allocation, and type-bound procedures, providing complete support for abstract data types"

2008: "Coarray Fortran—a parallel execution model"

2008: "The DO CONCURRENT construct—for loop iterations with no interdependencies"

2018: "Further Interoperability with C"


If you're doing nearly anything in physics, you have to know and use Fortran. (And, in fact, modern Fortran is not a bad language -- it is something like Matlab or NumPy, but much faster.)


I find TIOBE's ranking to be the least helpful and most volatile of any of the language popularity measurements out there. A couple years ago, the Julia language team did a pretty good deep dive on just why that is.

https://juliacomputing.com/blog/2019/09/tiobe-blog/


The article didn't say the reason for Fortran's renewed popularity. Anyone know why? The only thing I can think of is that it's due to either AI/ML or robotic-y needs.


I can't say for sure the reason, but I can say with confidence that the use of Fortran never really went out of style; many of the uses were just confined to classified systems and edge-type or disconnected systems. Heavy number crunching is a big deal in applications like satellite imagery processing, signals surveillance, targeting, and navigation. Fortran never really stopped being king there, but a lot of the development wasn't exactly taking place on engineers' personal laptops or Internet-connected systems at all, so queries for how to do stuff weren't being captured by TIOBE metrics.

The U.S. government has been making a huge push in the past decade to downgrade as much code as possible that doesn't need to be classified and push as much development as possible into fully unclassified environments to mitigate the cost of needing a fully cleared workforce. And network connectivity is generally increasing for systems that were not networked in the past.

Trends like these could easily account for Fortran developers doing more work in the open where they can just use Google to get answers instead of checking proprietary and closed information sources.


Any sort of language popularity tally is at least somewhat suspect. People who do this sort of thing systematically try to use a variety of sources and have some sort of formal methodology. But, at the end of the day, you're dependent on public data sources. If development is taking place behind closed doors, questions aren't being asked on Stack Overflow or Google, not many books are being purchased, etc., you're going to undercount relative to, say, JavaScript.


May I just express my gratitude for this community. This answer is well articulated, and answers the question I had, in a form that is close to my heart.


As I understand it, the TIOBE index is calculated based on search engine rankings. What I believe helped Fortran's ranking was our hard work on its web presence [1] (started April 2020), modern tooling [2] (started December 2019), and community building [3] (started May 2020).

[1] https://fortran-lang.org/

[2] https://github.com/fortran-lang/

[3] https://fortran-lang.discourse.group/


Right, I was wondering about the methodology. I think the yearly Stack Overflow surveys are probably a bit more reliable in terms of what people actually use.

The latest one does not even include either Fortran or Groovy, both of which allegedly and randomly got a lot more popular according to the article. I don't think either of those things actually happened.

https://insights.stackoverflow.com/survey/2020#most-popular-...

Fortran's competitor Julia is included. That might just be bias of the Stack Overflow user base, of course. But I think Stack Overflow is a pretty widely used tool for Fortran developers too. At least there are plenty of questions tagged with it (11K, which is more than Julia's 8K).

There recently was some HN post about Fortran, so I imagine that might have triggered a few searches.


There have been some great recent efforts to bring more "modern amenities" to the Fortran coding experience. For instance, the Fortran Package Manager [1] aims to make it easier to bootstrap new applications leaning on established, extant code without spending all your time manually setting up build targets and library-linking details.

I've also seen a confluence with the AI/ML communities. Some interesting applications of these tools to accelerate models used in weather, climate, chemistry, physics, and astrophysics require complex software interactions, usually running simultaneously with some multi-million-line Fortran code base. There have been many interesting and clever marriages of classic HPC Fortran applications and novel, embedded AI/ML parameterizations.

[1]: https://fpm.fortran-lang.org/
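For anyone curious, a typical fpm session looks roughly like this (a sketch based on the fpm documentation; "hello_fpm" is just a made-up project name):

    fpm new hello_fpm    # scaffold a project with the standard fpm layout
    cd hello_fpm
    fpm build            # fetch dependencies and compile
    fpm run              # build (if needed) and run the main application
    fpm test             # build and run the test targets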


Maybe Intel making its Fortran compilers for Windows and Linux freely available has helped. Fortran Discourse https://fortran-lang.discourse.group/ is a recent, active discussion site, and Fortran-lang https://fortran-lang.org/ is a recent information hub. On GitHub the Fortran Programming Language group https://github.com/fortran-lang is active, creating a Fortran standard library and package manager.

That said, although I'm a Fortran fan and programmer I doubt its relative popularity has truly jumped so much in 1 year.


I do know that FORTRAN is considered the fastest scientific computing language, beating out C++.

I'm told that is because the standard libraries are so optimized.

Also, there is a gigantic infrastructure of literature, algorithms, and support for the language. It is also relatively easy to use for folks who don't code for a living.

I wouldn't know, from experience. The last time I had anything to do with FORTRAN was in 1987, and I don't miss it one bit.


I took over a project that calculates the inbreeding coefficient [1] on a platform for animal breeders. The product is used by breeding organizations for endangered livestock (an alpaca club in Ireland, a horse breeder in Belgium, etc.). My first reaction was: WTF, who decided on this technology? It was developed with some university professors specializing in veterinary genealogy, who weren't aware that this isn't the usual choice. When I asked them, they were dead set on using it because, apparently, in their domain it is considered the go-to language for number crunching. I don't know if this is anecdotal or if there is a pattern, but the feeling/takeaway I got was that some older academics would bring these things back because it's what they think is the right tool for the job. It was odd, because most younger people I know who develop but don't consider themselves "developers" by trade would almost always resort to Python.

[1] https://en.wikipedia.org/wiki/Coefficient_of_inbreeding


If the number-crunching requirements are moderate (e.g., they can be done efficiently in NumPy, or can run in under a minute in the CPython interpreter with no real-time requirements), it is certainly very questionable to go with Fortran. If that isn't true, e.g. for some heavy physics calculations, Fortran is still a fine choice if you get the right people to maintain it. I'd choose it over C/C++, but would first have another look at Julia (esp. regarding its tooling maturity).


> The article didn't say the reason for Fortran's renewed popularity. Anyone know why?

As a single data point, I'm more and more drawn towards Fortran lately. I never used it before this year, having done all of my work in various non-Fortran languages, mostly C, C++, Matlab/Octave, Lua, Python+NumPy, and Julia. I'm quite fed up with the limitations of all of these languages, and it seems that Fortran actually has its shit together, at least for purely numerical computation (which is all I do and all I need).


What are some examples of limitations you've run into with various non-Fortran languages?


None of them seems "tailored" to the job of numerical computation (mostly linear algebra algorithms) the way Fortran is.

C: it is mostly alright, especially since C99 with VLAs and complex numbers. Yet the aliasing rules are a bit annoying, and multi-dimensional arrays, while possible, do not really feel natural.

C++: an unholy clusterfuck... I almost went crazy trying to use it properly. With extreme discipline maybe you can get some work done with it. But not me.

Octave: very beautiful language with a natural, concise notation for math. Excellent out-of-the-box support for sparse matrices. Shame that loops are so slow; it needs a serious effort on a JIT.

Lua: my favorite general-purpose language. The LuaJIT interpreter may be one of the fastest and easiest alternatives to Fortran. I wrote a matrix product using an explicit triple loop, and it multiplied huge matrices only slightly slower than an optimized BLAS. Extremely impressive! I'm very sad that the project is sort of "abandoned".

Python+NumPy: my least favorite, but the one that I use the most... What can I say: this is a general-purpose language where you can sort of do math. But it is not a language tailored for math. The base language natively has strings and dictionaries (for which I have no use), but it doesn't have native multidimensional arrays of floats. I do not understand why the numeric computing community is shifting so much to Python; it makes no sense in my eyes. Also, loops are insultingly slow.

Julia: may be the best of the bunch, but launching the interpreter is excruciatingly slow, and it does not suit my usage. In the time Julia takes to compile a three-line program to plot "sin(x)", you can run the Fortran compiler one hundred times on the equivalent program (or simply call gnuplot). All in all, it pretty much keeps the promise of "Matlab with fast loops", which is just what I need. But I feel that it's still a bit far from being there.


Have you tried the recently released v1.6.0 of Julia? Precompilation is dramatically faster. Also, for serious numerical work, startup time is eclipsed by run time, so it tends not to be relevant. I did most of my physics simulations in Fortran, which is a great language for the task, but if I were writing a new simulation code today I would use Julia. It is the only language in the petaflop club that is actually better to program in than Fortran. (+1 for gnuplot.)


> Have you tried the recently released v1.6.0 of Julia? Precompilation is dramatically faster.

Yes, I tried it upon its release and, while much faster, the startup time is still dramatically sub-par, especially when loading packages. This means that when I call Julia scripts from makefiles, etc., a considerable amount of running time is spent re-compiling the same packages over and over. I agree that my usage pattern is not at all representative, but it still seems that Julia is not a good "Unix citizen", since it tries to force you to do everything inside its own REPL instead of the native one. This is indeed my main point of friction with Julia. If it started instantaneously it would be essentially perfect. This is not a matter of dividing the startup time by 2, but by at least 200.


I understand. I haven’t tried this myself, but others have had luck, for this use case, in using sysimages: https://julialang.github.io/PackageCompiler.jl/dev/sysimages...
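In case it helps, the basic recipe looks roughly like this (a sketch based on the PackageCompiler.jl docs; the package choice and file names are just for illustration):

    using PackageCompiler

    # Bake the packages you keep reloading into a custom system image,
    # so their compiled code is available at startup. "warmup.jl" is a
    # hypothetical script that exercises your typical workload.
    create_sysimage([:Plots];
                    sysimage_path = "plots_sysimage.so",
                    precompile_execution_file = "warmup.jl")

    # Then launch with: julia --sysimage plots_sysimage.so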


Thanks! I was told about this on a previous HN thread, but I have yet to try it. Will do as soon as I have some time. It looks really promising.


Can you elaborate on why it "is actually better to program in than Fortran"?


The type system and dispatch paradigm provide a powerful method of code and project organization, and facilitate a kind of code reuse that you don't get with Fortran: the ability to pull other people's packages into my project and use their types and methods. That's one thing.

EDIT: Also, having a REPL, and a nice one, is a big deal.
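A tiny sketch of what I mean (the shape types here are made up for illustration):

    # One generic function; Julia selects the method by the argument's type.
    abstract type Shape end
    struct Circle <: Shape; r::Float64; end
    struct Square <: Shape; side::Float64; end

    area(c::Circle) = pi * c.r^2
    area(s::Square) = s.side^2

    # Generic code keeps working when someone else's package adds a new
    # Shape subtype with its own `area` method:
    total_area(shapes) = sum(area, shapes)

    total_area([Circle(1.0), Square(2.0)])   # 7.141592653589793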


Can you expand a bit on the need for "types" in numerical computation? They sort of seem like overkill to me. The only types I use are float and double, but I could really do with just doubles.

OK, admittedly I also need complex numbers (two floats) and AD numbers (however they are represented, typically by two floats as well). But that's it. They are numbers after all, associative and commutative. The "dispatch paradigm" that these types require seems really simple, as implemented by e.g. the type-generic macros in C's tgmath.h. I'm wholly unconvinced--or, more honestly, wholly ignorant--of the interest of the really complicated and powerful type system with multiple dispatch, broadcasting, and whatnot that Julia offers. Fortran really seems enough for me. [And I do not care at all about the REPL, but that is a separate issue.]


Some other nice things that come from having a type system for numerical work:

- Forward-mode automatic differentiation. Having a type system that allows `Dual` numbers to pass through your algorithm, simulation, or whatever means you can calculate derivatives, gradients, Jacobians, and Hessians efficiently (for small problems) and accurately without having to change any of your code (see the sketch after this list). There are so many times where I say "hey, I wonder what the sensitivity of my simulation output to this input parameter is", and it's really nice to be able to answer that question with one line of code.

- Unitful numbers. It’s really nice to be able to pass numbers with units through a simulation (little to no performance penalty!) to make sure everything checks out in that respect.

- Uncertainty. Both Measurements.jl and MonteCarloMeasurements.jl provide numbers that propagate (linear and nonlinear, respectively) uncertainty as they pass through calculations. Want to see how uncertainty in a parameter propagates through a calculation? Just change that one parameter to an uncertain number, let it run through your algorithm as-is, and it will spit out an answer with uncertainty bounds on the other side.

These are just a few examples of the stuff I use it for in my everyday work. Having a full type system for numerical work is one of those things that seems silly before you use it, but once you do, you wonder how you got by without it before.
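To make the first bullet concrete, here is a minimal sketch with ForwardDiff.jl (the "simulation" is a toy stand-in for real code):

    using ForwardDiff

    # A toy model with one input parameter k; note there is nothing
    # AD-specific in this function.
    simulate(k) = sum(exp(-k * t) for t in 0.0:0.1:10.0)

    # Dual numbers flow through the unmodified code above, giving the
    # sensitivity of the output to k in one line:
    ForwardDiff.derivative(simulate, 0.5)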

EDIT: BTW, these are just examples of numerical types. Sending specialized array types through your code is also a thing. For example, if you have a `Diagonal` type matrix and you send it to an eigenvalue solve, it just pulls the elements from the diagonal without wasting any time trying to calculate anything. Or there are things like ComponentArrays.jl (full disclosure: I wrote this library) that let you pass arbitrarily deep structured information through a differential equation or optimization solver, for much cleaner and more readable code than indexing into a plain vector like you would usually have to do. And you can even put your weird numerical types inside the weird array types and just send it all on through.
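The `Diagonal` point in code (this is just the LinearAlgebra standard library, nothing exotic):

    using LinearAlgebra

    D = Diagonal([1.0, 4.0, 9.0])

    # eigvals dispatches on the Diagonal type and simply reads off the
    # diagonal; no dense O(n^3) eigendecomposition is performed.
    eigvals(D)   # [1.0, 4.0, 9.0]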


I think I have a pretty good answer to your question in this article: https://lwn.net/Articles/834571/

It’s not that other types are “needed”, but that they let you do some pretty powerful things with surprising ease. And the nice thing is that in Julia, you can ignore them. You can just compute with floats or doubles as if the type system doesn’t exist. But it’s there in case you would like, for example, to apply the differential equation solver that you just wrote to quaternion-valued functions with no extra work.
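For example, something like this rough, untested sketch, assuming Quaternions.jl and OrdinaryDiffEq.jl (the rotation rate is arbitrary):

    using OrdinaryDiffEq, Quaternions

    # Attitude kinematics: dq/dt = (1/2) * q * ω, where ω is a pure
    # quaternion angular velocity. Nothing here is quaternion-specific
    # from the solver's point of view.
    ω = Quaternion(0.0, 0.0, 0.0, 0.5)
    f(q, p, t) = 0.5 * q * ω

    q0 = Quaternion(1.0, 0.0, 0.0, 0.0)    # identity rotation
    prob = ODEProblem(f, q0, (0.0, 10.0))
    sol = solve(prob, Tsit5())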


This isn't just theoretical, either; here's an actual user who opened an issue where their MWE was using quaternions:

https://github.com/SciML/BoundaryValueDiffEq.jl/issues/52

This is how I found out it worked in the differential equation solver: users were using it. The issue was unrelated (they didn't define enough boundary conditions), so it's quite cool that it was useful to someone. It turns out quaternions have use cases in 3D rotations:

https://en.wikipedia.org/wiki/Gimbal_lock

which is where this all comes in. Anyways, it's always cool to learn from users what your own library supports! That's really a Julia treat.


Thanks for this article, it's really well written and engaging.

But here, I cannot resist:

> You can just compute with floats or doubles as if the type system doesn’t exist.

Except when you can't! I want to plot a stupid array of floats. Yet it takes 10 seconds because it is juggling useless types around. A complex type system may be a nice thing to have, if you really want it, but it is definitely not "free", and it always involves serious compromises that make other things impossible or very cumbersome. I would like an option like --type=float in the interpreter that assumed that all numbers are of that type and ran extremely fast.


I’m glad you liked the article. Obviously the 10 seconds or so of precompilation time is a big issue for you. I guess I have no further suggestions about that, other than to suggest again looking at creating sysimages (learning how to do this is on my to-do list).


> I’m glad you liked the article.

I was particularly "seduced" by the idea of using a standard ODE solver directly on quaternions. Having worked on smoothing 3D camera trajectories last year, I would definitely have loved to know that at the time!


> Fortran's renewed popularity

There's no way to know if a language is becoming more popular. You can only observe if a measure of language popularity is rising, falling, or staying the same. We're almost certainly observing the "measured" part as opposed to the "actual" part.


Really well-debugged and proven numerical processing libraries like LINPACK?


Fortran = physics, and perhaps either civilian or military engineering.


I think it's popular in the high performance computing crowd.


Makes sense... because in the part (at least) where it remains a "formula translator", Fortran cannot be beaten by any other language in terms of the efficiency of the generated machine code.


It has been a few years, but less than 10 years ago I worked on a small project where we used the Intel FORTRAN Compiler: https://software.intel.com/content/www/us/en/develop/tools/o...

It's well maintained and highly optimized. Before then, I had not used FORTRAN since 1983 (and it was old even then).


What is going on with Java, down 5.49%, though?

https://www.tiobe.com/tiobe-index/


The article notes that Objective-C has fallen out of the TIOBE top 20 altogether. Stack Overflow's "Trends" feature lets you search for the number of questions by tag. Here are the "Trends" results for Objective-C vs. Fortran. As you can see, by this definition of popularity, Objective-C remains at least twice as popular as Fortran. This might only tell you that Fortran programmers are less likely to use Stack Overflow than Objective-C programmers are. Nevertheless, I think it's a useful data point.

https://insights.stackoverflow.com/trends?tags=objective-c%2...



