Importantly, it is nigh on impossible to understand the pros and cons of these tradeoffs unless you learn the language first. So get a dart board out and pick a language at random. Learn it well enough to find things you don't like about it. Then learn another language that fixes the problems you had with the first and figure out what you don't like about that language. And so on.
Also, be aware that this process will never, ever end. Because there is no single best language.
Except Haskell ;)
No. Haskell requires that you declare where IO is allowed (via the type). Once you declare that you admit mutable state (ST) or arbitrary IO (IO), you are free to use them.
Of course, Haskell also means that if your program compiles, 90% of the time it runs fine. I love that about it. (Unit tests are still required for the things the compiler can't catch, but it's a major win regardless.)
You might not know about Debug.Trace? http://www.haskell.org/ghc/docs/latest/html/libraries/base/D...
That's because there's virtually no easily accessible objective material decision-makers can use to select between them. Because of the high financial stakes involved, every company and consultant has an interest in promoting their own language memes in businesses, government, universities and schools.
Why not just try something a bit more mainstream like Coq?
The computer is a machine. It understands one language. Everything else is sugar on top. The language you should choose is the one that exists at the right level of abstraction for the problem you're trying to solve. That's a very academic way of saying, "choose the right tool for the job."
Personally I'm of the opinion that you can get really far knowing C and Common Lisp. They represent two fundamentally different models of computation, and all of the other languages (save the Smalltalk offshoots and obscure ones I haven't had the pleasure of experimenting with yet) are essentially subsets of functionality from one or both of these two languages.
It's too easy to get caught up in the "feature creep" of all of the languages available today. Some will claim "immutability by default" is a feature. It's not. Any sufficiently sophisticated evaluator of a language can stratify a program into a process capable of emulating a machine that treats all memory as "immutable by default." It's important to remember that these languages are often created to solve a specific problem and then evolve over time to become more general. It is best, in my experience, to just go straight to the source and work down to what you need.
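To make that claim concrete, here is a minimal sketch (in Python, a mutable-by-default language, chosen purely as an illustration) of treating state as immutable by convention: updates return new values rather than modifying anything in place.

```python
def set_cell(board, index, value):
    """Return a new board with one cell changed; the input is never mutated."""
    return board[:index] + (value,) + board[index + 1:]

# Tuples are immutable, so every "update" produces a fresh value.
board = ("x", ".", ".")
updated = set_cell(board, 2, "o")

print(board)    # ('x', '.', '.') -- the original is untouched
print(updated)  # ('x', '.', 'o')
```

Nothing in the language enforces this discipline; the point is that the evaluation style is available in almost any language, whether or not it is advertised as a "feature."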
Yup. I advocate a fast, low-level language like C, plus a quick-to-get-started language with lots of libraries, like Python.
At the end of the day, unless pure algorithms are the deliverables (and they may very well be), it is really about the libraries and ecosystem. In the majority of cases, it is simplest to reuse code someone already wrote. Who cares if this language compiles to assembler, is super functional, and does all kinds of fancy tricks: can it read from Postgres or talk to RabbitMQ? Does it have an easy way to parse JSON or generate UUIDs, or do I have to spend time writing DB drivers and JSON parsers before even starting on the real problem?
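For a sense of what "the ecosystem is already there" looks like in practice, here is a sketch using only Python's standard library (the queue name and record shape are made up for the example):

```python
import json
import uuid

# Parsing JSON and generating a UUID are one-liners when the
# standard library already provides them.
record = json.loads('{"queue": "tasks", "priority": 5}')
record["id"] = str(uuid.uuid4())

print(record["queue"])    # tasks
print(len(record["id"]))  # 36 -- the 8-4-4-4-12 hex format
```

No drivers or parsers to write first; the time goes into the real problem instead.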
Secondly, the intended uses and paradigms were wrong. Lisp can do object-oriented programming with CLOS, and I'm pretty sure C++ was designed to be a general-purpose language. Among the intended uses, there are so many categories that are one-off or very similar to others: what's the difference between web, web application, and client-side?
Also, the graphics were not sharp and looked hazy. And the language choices were not consistent: you have Clojure on some of them but not others. And in general, benchmarks don't mean anything without context.
This article has very little insight and has quite bad presentation.
“Is optimization of speed premature while productivity languishes?”; “Are semantic elegance and lexical simplicity just means by which to further our productive capacity?”
The flowery language is confusing me. Is there any real content here? I've never heard anyone talk about programming languages like this. Writing like this just makes it harder to understand and seems like putting on airs.
The article is full of wrong information, wrong logic, and wrong conclusions. I feel embarrassed for the author, who must have spent a lot of time to produce something that is an intellectual net loss for anyone who reads it.
I wonder if your setup is faulty. My Scala REPL takes a couple of seconds to start. Admittedly longer than Python's, but not enough to worry about it. I'm on an old clunker laptop too.
Also, you may be interested in this course on Functional Programming in Scala run by Martin Odersky himself. It started last week. https://www.coursera.org/course/progfun
What can I learn quickly? ...that has a fairly simple, forgiving syntax ...that has a ton of online help and resources ...that I can get a prototype product out with simply ...etc. I think it's a not-atypical set of concerns.
For me, because of all the online help and community and the ease of getting something useful working quickly, I landed on Ruby, at first using Rails.
My analysis may have been simpler than the OP but as I learned more I was able to understand more of what I needed, where my choices were good, where not. I was able to pick up other tools and languages as my interest was piqued.
So no one choice is right for everyone, but this approach worked for me. More than anything, the key is to drop the analysis and learn SOMETHING ...once you start, it's easier to pick up more.
The Redmonk measurement also seems to be gamed, because the second cluster in the chart includes some languages with backers who are heavily promoting the popularity of the language, including using Stack Overflow as their primary documentation effort and creating numerous shell projects on GitHub.
Most of us write software that moves bits from an API or DB to a screen. If we're lucky we translate a couple of those values to a color or something.
Why are benchmarks so focused on code no one writes? How many successful YC companies aren't CRUD apps?
Type the problem you have into Google, then write your stuff in the language that has the most concise Stack Overflow answers on how to solve that problem.
Mostly because, when you're doing something that cares about environmental overhead and compiler efficiency, it tends to involve the kinds of algorithms that benchmarks test with. In other words, benchmarks are designed to provide exactly the information that is useful to the people that need benchmarks. You're absolutely right that isn't the case for web, phone, and applications programmers, and it's true that those people shouldn't pay attention to benchmarks, but there are people out there that care quite a bit.
For example, I write code for humanoid robots. If my code takes more than about half a millisecond to respond to sensor input, my robot will fall over and possibly hurt itself (thousands of dollars of damage) or me (...no thanks). That means that the language and environment I use actually matters - I don't get to have a garbage collector or a high-level runtime VM, to start with. And what algorithms am I running? K-means, n-body, matrix multiplication and inversion, and constraint satisfaction AKA sudoku. As I said, the benchmarks are designed to suit the people that need them.
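One of the kernels named above can show what benchmark code actually looks like. This is a sketch, not the commenter's robot code: a naive triple-loop matrix multiply in Python, timed the way a micro-benchmark would time it.

```python
import time

def matmul(a, b):
    """Naive triple-loop matrix multiply -- the kind of kernel benchmarks time."""
    n, m, p = len(a), len(b), len(b[0])
    c = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            aik = a[i][k]
            for j in range(p):
                c[i][j] += aik * b[k][j]
    return c

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]

start = time.perf_counter()
c = matmul(a, b)
elapsed = time.perf_counter() - start

print(c)  # [[19.0, 22.0], [43.0, 50.0]]
```

Scaled up to realistic matrix sizes, this loop nest is exactly where language overhead (interpretation, bounds checks, GC pauses) becomes visible, which is why languages diverge so sharply on it.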
Personally, I am very fortunate not to work on CRUD apps all the time. Working with games, all the fun things like sort speed, pathfinding, etc. are real-world problems :)
Not that long ago my main programming tasks were:
* plotting some kinds of Julia sets and analyzing their properties
* high precision computations (>150 significant digits) of invariant manifolds
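For readers wondering what ">150 significant digits" looks like in practice: this is an illustrative sketch (not the commenter's actual toolchain) using Python's standard decimal module at 160 digits of working precision.

```python
from decimal import Decimal, getcontext

# Work at 160 significant digits -- far beyond the ~16 of hardware floats.
getcontext().prec = 160

two = Decimal(2)
root = two.sqrt()

# Squaring the result should recover 2 to within the working precision.
residual = root * root - two

print(str(root)[:12])  # 1.4142135623
assert abs(residual) < Decimal(10) ** -150
```

Serious invariant-manifold work would use a dedicated arbitrary-precision library, but the point stands: precision requirements like this immediately constrain which languages and ecosystems are practical.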
A lot of people I know are world-class in... N-body stuff.
It is hard to quantify those, though, and Stack Overflow posts and GitHub repos are an insufficient metric.
Lots of people have to dig into clustering/mining/stats of the data we gather to figure out how to pull money and from whom.
As an admittedly fairly poor analogy, in programming you generally still have to assemble the car first. Silly things like laying out controls, moving data from/to the db, etc.
Even with a really easy-to-assemble car, the assembly is still going to take much more time than the driving from A to B, so it's much more important. Add on top that you're going to be constantly asked to disassemble and reassemble it to change the wheel size, etc., because accounting forgot to say that you'd be driving off-road.
This is the point I completely disagree with. When somebody gives me a new programming language (Julia, for example), I'm going to want to know the numbers just so I have them as a baseline understanding. After that, sure, I want to know all the bells and whistles the language gives me over C.
Since the goal of a school-level course is to introduce students to programming, rather than give them knowledge immediately useful in the industry (which the students may or may not go on to work in), C# is probably not a good introductory language. Though, in this respect, it is no worse than either Java or C++ (and I suspect those end up being preferred because of tradition, and because VS licenses are expensive).
Personally, I think something like Python would make a better introductory language in schools -- it's small, clean and high level, while at the same time practical and widely used.
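To illustrate the "small, clean and high level" claim, here is the kind of first-week exercise that stays readable in Python (the text and task are invented for the example):

```python
# Count how often each word appears in a sentence -- no types,
# imports, or boilerplate required for a beginner's first program.
text = "the quick brown fox jumps over the lazy dog the end"

counts = {}
for word in text.split():
    counts[word] = counts.get(word, 0) + 1

print(counts["the"])  # 3
```

The same exercise in Java or C++ buries the idea under class declarations and type ceremony, which is exactly the argument for Python as a first language.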
edit: not meant to imply that Lisp or OCaml are not "pragmatic", just that the likelihood of working with them as a professional developer is substantially lower than Java / C# or Ruby / Python.
Java, C++ have plenty of free, multi-platform tooling and resources. C# requires lots of (hard to get or costly) Microsoft products and tooling like Visual Studio.
C/C++ have the additional advantage that students are exposed to memory management issues.
Not to mention that C, C++, and Java are the top 3 languages on the TIOBE index and have been for the last decade. C# is newer than all three and has risen in popularity over the last decade.
Matlab is a much higher-level language, usually targeted at (non-computer) engineering disciplines, because it is easier to learn and provides lots of tools and libraries for simulation/computation/statistics.