There is no single best language. Instead, what you get from any sufficiently good programming language is a series of tradeoffs. Haskell, for example, provides the benefits of pure functions (lazy evaluation, composable memory transactions, the ability to reason about code), but in so doing it takes away your ability to arbitrarily mutate state and perform IO. That's a tradeoff. Python provides a clean, easily readable syntax that leverages whitespace. But you can't define multi-line lambdas. That's a tradeoff. Most of the languages on the author's list use garbage collection: yet another tradeoff.
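(To make "composable memory transactions" concrete, here is a minimal STM sketch, assuming GHC's standard stm package; `transfer` is an illustrative name, not anything from the article.)

```haskell
import Control.Concurrent.STM (STM, TVar, atomically, modifyTVar', newTVarIO, readTVarIO)

-- Two writes composed into a single transaction: no other thread can
-- ever observe the money missing from both accounts at once.
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to amount = do
  modifyTVar' from (subtract amount)
  modifyTVar' to (+ amount)

main :: IO ()
main = do
  a <- newTVarIO 100
  b <- newTVarIO 0
  atomically (transfer a b 10)
  readTVarIO a >>= print  -- 90
```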
Importantly, it is nigh on impossible to understand the pros and cons of these tradeoffs unless you learn the language first. So get a dart board out and pick a language at random. Learn it well enough to find things you don't like about it. Then learn another language that fixes the problems you had with the first and figure out what you don't like about that language. And so on.
Also, be aware that this process will never, ever end. Because there is no single best language.
> it takes away your ability to arbitrarily mutate state and perform IO
No. Haskell requires that you declare where IO is allowed (via the type). Once you assert that you admit mutable state (ST) or arbitrary IO (IO), you are free to use them.
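A minimal sketch of what that declaration looks like in practice (the function names here are illustrative):

```haskell
import Control.Monad.ST (runST)
import Data.STRef (modifySTRef', newSTRef, readSTRef)

-- No IO in the type, so no IO in the body: the compiler enforces purity.
double :: Int -> Int
double x = x * 2

-- IO declared in the type; inside it, effects are unrestricted.
greet :: IO ()
greet = putStrLn "hello" >> print (double 21)

-- ST admits local mutable state; runST seals it back into a pure result.
sumTo :: Int -> Int
sumTo n = runST $ do
  acc <- newSTRef 0
  mapM_ (\i -> modifySTRef' acc (+ i)) [1 .. n]
  readSTRef acc
```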
In fairness, Haskell makes it more awkward to deal in mutable state and arbitrary IO, much as avoiding goto in a more mainstream language makes arbitrary control flow more awkward. Debugging is annoying when I can't stick a println or puts in any arbitrary function. (The solution, of course, is to write small functions and print their results from the call site. If your Haskell function is over 20 lines long, you're probably doing it wrong.)
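(That said, GHC does ship an escape hatch for exactly this: Debug.Trace works even inside pure code, at the cost of breaking referential transparency, so it's strictly a debugging hack. A sketch:)

```haskell
import Debug.Trace (trace)

-- trace :: String -> a -> a  -- prints the string, then returns the value
scale :: Int -> Int
scale x = trace ("scale called with " ++ show x) (x * 2)
```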
Of course, Haskell also means that if your program compiles, 90% of the time it runs fine. I love that about it. (Unit tests are still required for the things the compiler can't catch, but it's a major win regardless.)
> it is nigh on impossible to understand the pros and cons of these tradeoffs unless you learn the language first
That's because there's virtually no easily accessible objective material decision-makers can use to select between them. Because of the high financial stakes involved, every company and consultant has an interest in promoting their own language memes in businesses, government, universities and schools.
I would say it's better to pick a task, then look at what programming languages are commonly used for that task. Want to write a web game to post on a game aggregator? Sign up for Kongregate and Newgrounds and find out what they support; Flash, Java, and Unity seem to be your choices. Rather write a game to post yourself for geek cred? Pygame and HTML5 are much cooler options. Would rather create a web-based app? Ruby and JavaScript may be the way to go. Want to make an app for your phone? Objective-C or Java are probably your best options, depending on your phone. Programming is a means to an end, and while it's enjoyable by itself, the best way to narrow things down is to pick a goal.
Coq is also a good choice. Coming from Haskell, though, I think Agda/Idris have a shallower learning curve. Coq also sits more happily on the theorem-proving side of things than the programming side.
The computer is a machine. It understands one language. Everything else is sugar on top. The language you should choose is the one that exists at the right level of abstraction for the problem you're trying to solve. That's a very academic way of saying, "choose the right tool for the job."
Personally, I'm of the opinion that you can get really far knowing C and Common Lisp. They represent two fundamentally different models of computation, and all of the other languages (save the Smalltalk offshoots and obscure ones I haven't had the pleasure of experimenting with yet) are essentially subsets of functionality from one or both of these two.
It's too easy to get caught up in the "feature creep" of all the languages available today. Some will claim "immutability by default" is a feature. It's not: any sufficiently sophisticated evaluator of a language can stratify a program into a process capable of emulating a machine that treats all memory as immutable by default. It's important to remember that these languages are often created to solve a specific problem and then evolve over time to become more general. It is best, in my experience, to go straight to the source and work down to what you need.
> Personally I'm of the opinion that you can get really far knowing C and Common Lisp.
Yup. I advocate a fast, low-level language like C, plus a quick-to-get-started language with lots of libraries, like Python.
At the end of the day, unless pure algorithms are the deliverable (and they may well be), it is really about the libraries and ecosystem. In the majority of cases, it is simplest to reuse code someone already wrote. Who cares if a language compiles to assembler, is super functional, and does all kinds of fancy tricks? Can it read from Postgres or talk to RabbitMQ? Does it have an easy way to parse JSON or generate UUIDs, or do I now have to spend time writing DB drivers and JSON parsers before even starting on the real problem?
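When the library does exist, the answer is one import away. A minimal sketch in Haskell, assuming the widely used aeson package (the User type is illustrative):

```haskell
{-# LANGUAGE DeriveGeneric #-}

import Data.Aeson (FromJSON, decode)
import qualified Data.ByteString.Lazy.Char8 as BL
import GHC.Generics (Generic)

data User = User { name :: String, age :: Int }
  deriving (Show, Generic)

instance FromJSON User  -- derived generically; no hand-written parser

main :: IO ()
main = print (decode (BL.pack "{\"name\":\"Ada\",\"age\":36}") :: Maybe User)
```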
I agree that it's enough to know one low-level (or medium-level) language and one high-level language well. For me, I decided a while ago that those two languages would be Java and JavaScript. It was an easy decision, since I use both at work, and they have healthy, growing ecosystems and active communities.
For the first few years of my programming life (for research), these were exactly my only tools, and indeed they are perfect, except when you need to get something done fast. Then Python wins... Now I know far more programming languages, ideas, and abstractions. (BTW, Prolog should also be a leg of that stool, since it offers things that can blow your mind... sometimes :)
Prolog (and by extension, I assume, logic programming) is an interesting way to solve problems. However, it is rather simple to implement an evaluator in Common Lisp that can read and execute Prolog-like programs.
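For a sense of how small the core is (sketched here in Haskell rather than Lisp, since the idea carries over): unification plus a search loop is most of a toy Prolog, and the unifier fits in a screenful. No occurs check, so this is strictly illustrative:

```haskell
import Control.Monad (foldM)

data Term = Var String | Atom String | Comp String [Term]

type Subst = [(String, Term)]

-- Chase variable bindings until we hit a non-variable or a free variable.
walk :: Subst -> Term -> Term
walk s (Var v) = maybe (Var v) (walk s) (lookup v s)
walk _ t       = t

-- Unify two terms under a substitution, extending it on success.
unify :: Term -> Term -> Subst -> Maybe Subst
unify a b s = case (walk s a, walk s b) of
  (Var v, Var w) | v == w   -> Just s
  (Var v, t)                -> Just ((v, t) : s)
  (t, Var v)                -> Just ((v, t) : s)
  (Atom x, Atom y) | x == y -> Just s
  (Comp f as, Comp g bs)
    | f == g, length as == length bs ->
        foldM (\s' (x, y) -> unify x y s') s (zip as bs)
  _                         -> Nothing
```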
The graphics in this are not very helpful, are inconsistent, and are in some cases wrong. Your 3-D size/CPU/memory chart was impossible to read; there was no origin.
Secondly, the intended uses and paradigms are wrong. Lisp can do object-oriented programming with CLOS, and I'm pretty sure C++ was designed to be a general-purpose language. In the intended uses, you have so many categories that are one-off or very similar to others: what's the difference between web, web application, and client-side?
Also, the graphics were not sharp and looked hazy. The language choices were not consistent either; you have Clojure on some of the charts but not others. And in general, benchmarks don't mean anything without context.
This article has very little insight and quite bad presentation.
“Is optimization of speed premature while productivity languishes?”; “Are semantic elegance and lexical simplicity just means by which to further our productive capacity?”
The flowery language is confusing me. Is there any real content here? I've never heard anyone talk about programming languages like this. Writing like this just makes it harder to understand, and it seems like putting on airs.
Not only bias, but obviously pure ignorance too, at least with respect to Lisp. In his table, Lisp is marked as unsuitable for both metaprogramming and multiple dispatch, which have always been touted as Lisp's major features.
The article is full of wrong information, wrong logic, and wrong conclusions. I feel embarrassed for the author, who must have spent a lot of time to produce something that is an intellectual net loss for anyone who reads it.
I write Python & Objective-C most of the day, used to work for several years in a PHP/JS shop, and have to code in Ruby and/or C from time to time. I recently felt the urge to learn a new language distinctly different from what I know, so that I can learn new things. I considered Erlang, Common Lisp, Clojure, Go, Haskell, Scala, and Elixir. I plan to use it for server-side web development and, if possible, for command-line scripting (where I currently often use Python). After much back and forth, I ended up having to decide between Clojure, Erlang, and Scala. I went with Scala because it seemed to offer the best of both worlds: it has a good actor library in Akka, it has macro support, it is functional, it offers good OOP support (to ease the transition for me), and as a bonus it also lets me develop for Android, something I may want to tackle in the future. So far I'm really happy. The only downsides are that the REPL starts up pretty slowly (I'm used to Python, which starts up way faster), and package installation (especially of Java packages) confuses me at times. I still may move to Clojure or Common Lisp from here, but right now I'm happy with my choice.
I wonder if your setup is faulty. My Scala REPL takes a couple of seconds to start. Admittedly longer than Python's, but not enough to worry about it. I'm on an old clunker laptop too.
Also, you may be interested in this course on functional programming in Scala, taught by Martin Odersky himself. It started last week: https://www.coursera.org/course/progfun
I suspect that, like a lot of people who came late to programming, my choice was initially dictated by simple pragmatic concerns:
What can I learn quickly?
...that has a fairly simple, forgiving syntax
...that has a ton of online help and resources
...that I can get a prototype product out with simply
...etc.

I think it's a not-atypical set of concerns.
For me, because of all the online help and community, and the ease of getting something useful working quickly, I landed on Ruby, at first using Rails.
My analysis may have been simpler than the OP's, but as I learned more I was able to understand more of what I needed, and where my choices were good and where they were not. I was able to pick up other tools and languages as my interest was piqued.
So no one choice is right for everyone, but this approach worked for me. More than anything, the key is to drop the analysis and learn SOMETHING... once you start, it's easier to pick up more.
The RedMonk chart measures language popularity by activity on GitHub and Stack Overflow. Top business languages COBOL, Transact-SQL, and PL/SQL aren't included at all among the 84 languages listed, so the popularity measurement criteria are flawed.
The RedMonk measurement also seems to be gamed: the second cluster in the chart includes some languages whose backers are heavily promoting their popularity, including by using Stack Overflow as their primary documentation channel and creating numerous shell projects on GitHub.
Seriously, how many people here write programs to solve k-means or n-body problems? Who fucking cares how many lines of code it takes to write k-means, and who fucking cares how long it takes to execute or how much memory it consumes?
Most of us write software that moves bits from an API or DB to a screen. If we're lucky we translate a couple of those values to a color or something.
Why are benchmarks so focused on code no one writes? How many successful YC companies aren't CRUD apps?
Type the problem you have into Google, and write your stuff in the language that has the most concise Stack Overflow answers on how to solve that problem.
> Why are benchmarks so focused on code no one writes?
Mostly because, when you're doing something that cares about environmental overhead and compiler efficiency, it tends to involve the kinds of algorithms that benchmarks test. In other words, benchmarks are designed to provide exactly the information that is useful to the people who need benchmarks. You're absolutely right that this isn't the case for web, phone, and application programmers, and it's true that those people shouldn't pay attention to benchmarks, but there are people out there who care quite a bit.
For example, I write code for humanoid robots. If my code takes more than about half a millisecond to respond to sensor input, my robot will fall over and possibly hurt itself (thousands of dollars of damage) or me (...no thanks). That means the language and environment I use actually matter - I don't get to have a garbage collector or a high-level runtime VM, to start with. And which algorithms am I running? k-means, n-body, matrix multiplication and inversion, and constraint satisfaction, a.k.a. sudoku. As I said, the benchmarks are designed to suit the people who need them.
If you are in the games industry, you solve n-body problems all the time. If it's a multiplayer game, you work with distributed n-body problems. Even if all you built was a CRUD app, if it truly became successful, memory footprint and CPU usage would matter. The 300x slowdown from Ruby will sting. But for everybody else, you're right. None of this is relevant, in the same way that a 5000 hp car engine isn't useful to anybody but Batman.
Personally, I am very fortunate not to have to think about CRUD apps all the time. Working in games, all the fun things like sort speed, pathfinding, etc. are real-world problems :)
Zero, but other languages like Haxe, ActionScript, JavaScript, and the like are becoming more relevant as higher-quality browser games become more prominent.
So, because most of us do one kind of thing, everything else should be ignored?
Not that long ago my main programming tasks were:
* plotting Julia sets of various kinds and analyzing their properties
* high-precision computations (>150 significant digits) of invariant manifolds
A lot of people I know are world-class in... N-body stuff.
I somewhat agree. Choosing a programming language involves not only some of the factors listed here (speed, memory, paradigm) but also community, resources, development time, and the available frameworks, tooling, and libraries.
Those are hard to quantify, though, and Stack Overflow posts and GitHub repos are an insufficient metric.
Fully agree. I like the Twitter/Facebook model: write the whole thing in a god-awful 'slow' language that's quick to get code out the door, then rewrite it in much 'faster' languages once the feature set solidifies.
k-means has enough problems that you (almost) never implement it on your own anyway. If you do, it's generally because you need something now and it's faster than learning how to marshal your data for whatever API you can find on GitHub or SourceForge.
Lots of people have to dig into clustering/mining/stats on the data we gather to figure out how to pull in money, and from whom.
This is a bit exaggerated, right? When you buy a car, you aren't racing it, but the time it takes to get to 60 mph still matters, even if you're looking for fuel efficiency or something else.
Not really. Programming is very different from driving in a straight line.
As an admittedly fairly poor analogy: in programming you generally still have to assemble the car first. Silly things like laying out controls, moving data to and from the DB, etc.
Even with a really easy-to-assemble car, assembly is still going to take much more time than you spend driving from A to B, so ease of assembly is much more important. Add on top that you're going to be constantly asked to disassemble and reassemble it to change the wheel size, etc., because accounting forgot to say that you'd be driving off-road.
I'm addressing the point above: "Language Benchmark is probably the most useless and counter-productive place to start talking about language and platform comparison."
This is the point that I completely disagree with. When somebody gives me a new programming language (Julia, for example), I want to know the numbers just so I have a baseline understanding. Afterwards, sure, I want to know all the bells and whistles the language gives me over C.
The graphs seem to imply that, functionality-wise, C# is the best choice...
I don't know much about languages, but if that's true, I wonder why it's not taught more often in schools; most classes I've seen are either for Java or C++, and then assembly and MATLAB.
From an educational perspective, C# is kind of messy. It has had a lot of things bolted on over the versions. Don't get me wrong, most of those things are useful, and it is nice to have them in the language, but they do make the language more complicated and less consistent.
Since the goal of a school-level course is to introduce students to programming, rather than to give them knowledge immediately useful in the industry (which a student may or may not go on to work in), C# is probably not a good introductory language. Though, in this respect, it is no worse than either Java or C++ (and I suspect those end up being preferred because of tradition, and because VS licenses are expensive).
Personally, I think something like Python would make a better introductory language in schools -- it's small, clean, and high-level, while at the same time practical and widely used.
The case could be made that universities teaching Java are already catering too much to industry. For my part, I would vastly prefer working with a Lisp, an ML, or another esoteric language while in school. There's no shortage of time afterwards to work with more pragmatic languages and frameworks.
Edit: not meant to imply that Lisp or OCaml are not "pragmatic", just that the likelihood of working with them as a professional developer is substantially lower than with Java/C# or Ruby/Python.
C# does fit many needs and molds and could be used as a general-purpose language of instruction. However, decisions are seldom made based on how 'good' a language is.
Java and C++ have plenty of free, multi-platform tooling and resources. C# requires lots of (hard-to-get or costly) Microsoft products and tooling like Visual Studio.
C/C++ has the additional advantage that students are exposed to memory-management issues.
Not to mention that C, C++, and Java are the top three languages on the TIOBE index and have been for the last decade. C# is newer than all three and has risen in popularity over the last decade.
MATLAB is a much higher-level language, usually targeted at (non-computer) engineering disciplines, because it is easier to learn and provides lots of tools and libraries for simulation/computation/statistics.
C#, regardless of its merits, serves more Microsoft's commercial purposes than anything else, so it requires factoring in whether Microsoft's interests are aligned with yours.
How does a non-programmer write a guide to programming languages that has any value? Full of errors and misinformation gleaned from far too many hours reading websites and far too few hours writing code.
> there is no single best language
Except Haskell ;)