Ask HN: The future of programming languages?
44 points by ique on Nov 10, 2010 | 54 comments
I've been thinking a lot about programming languages lately and I wonder what you guys think about what the future of programming will be.

Considering the number of "new" languages (and dialects) popping up (Clojure) and some languages finding new popularity (JS), I wonder how they'll be used.

It feels to me like languages are getting more problem-specific, e.g. Haskell, Clojure and other functional languages get the most attention for research and heavy calculation, while other languages get specialized for web usage.

Will the future of programming be like science; where people get extremely specialized on a small set of problems, or will programmers learn multiple languages and use the best one for the problem at hand?

Will game development ever diversify or will it go from C++ to C# to ...

How do you view a future world filled with hundreds or thousands of fantastic programming languages?

Topics that I think will be important:

  - Functional Programming
  - Logic Programming
  - Managing asynchronicity
  - Managing concurrency / Managing state
  - Pruning bad directions in OO (see the first two)
  - Type Systems
  - Virtual Machines
Languages/technologies that I would look at more closely given these are Haskell, Scheme, Prolog, Clojure, Qi, LLVM. I love JS, but as far as languages go, I would not look there for new ideas. At best I see it becoming a fantastic compile target.

I also don't see Haskell and Clojure being particularly specialized. They are very general purpose and suitable for tackling any kind of programming problem, simple to complex. JS, on the other hand, is a language with a very specific focus.

I don't see the importance of being able to read and understand C/C++ diminishing anytime soon as those languages are intimately tied to our operating systems.

EDIT 2: I added Type Systems above. I think Haskell has shown the power of an expressive type system. However, it has its problems. I look forward to seeing the distinction between languages with strong type systems and those without being abolished. Languages should support turning the type system on and off - see Qi. Type systems should also allow the typing of a much richer set of values - Qi's sequent calculus types are eye-opening in this regard.

EDIT: I'm opinionated about this, but the constant announcements of new languages that simply continue the traditional stateful OO paradigms (perhaps tacking on a couple of syntactic niceties or a crippled static type system) seem like complete dead ends.

I very much hope you turn out to be right. Unfortunately I'm afraid the momentum of the web client platform will be so strong, the impedance mismatch between client and server such a pain, and the progress of JS as a better compiler target so slow (the pressure to interoperate with existing JS libraries may also be a factor here) - that the dominant trend will be towards using JS as a source language everywhere.

So the turn to the web will set back the state of the art in programming by a decade, as the turn to microcomputers did.

JS doesn't solve the client/server impedance mismatch at all. The JS community is struggling with designing libraries that work as well on the server as on the client.

As Ryan (of Node.js) has said, I see JS going the way of PHP. That's great, and it will attract certain kinds of coders and certain kinds of projects.

But setting programming back by a decade? Personally I find JS a much better foundation for learning FP principles than PHP - there's enough in there to guide people to the topics I've outlined above. In fact my interest in these topics arose from being a JS coder for 5 years!

JS is the gateway drug to the new future.

"The JS community is struggling with designing libraries that work as well on the server as on the client."

Disagree. YUI3 was a library written for the web, but it was so well designed that it took one YUI engineer hacking around for a few days to get it fully running in Node.js. Now, some awesome stuff is happening and it's an area of focus for the YUI team. http://express.davglass.com/

Mustache.js and Underscore.js are other examples of popular JS libraries that work great server-side right off the bat.

The main one left out is jQuery, and that is a library written primarily for DOM manipulation, which isn't really the point server-side (most of the time). Once JSDOM is a bit more mature, I'm certain you'll see jQuery become more popular server-side.

You can look at Agda, Idris or Epigram for some ideas about what's possible if you go beyond Haskell's type system: dependent types. It allows for even more expressive code. Instead of tests, you write your types, and your program is an executable proof that your type holds. Quite mind-boggling, and very awesome. Of course, these languages are currently only suited for academic use, but I suspect they will inspire other languages.
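For a flavor of what dependent types buy you, here is a hedged Lean 4 sketch of the canonical example, a length-indexed vector; it's illustrative and not drawn from Agda, Idris or Epigram themselves:

```lean
-- A length-indexed vector: the list's length is part of its type,
-- so the type checker tracks it statically.
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : α → Vec α n → Vec α (n + 1)

-- head only accepts vectors typed Vec α (n + 1), so calling it on an
-- empty vector is a compile-time type error; no nil case is needed
-- and no runtime bounds check is performed.
def Vec.head : Vec α (n + 1) → α
  | .cons x _ => x
```

The program's well-typedness is itself the proof that head is never applied to an empty vector.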

Both your list and your comments look good, generally, but I have to wonder about your mention of logic programming.

I think logic programming has a lot to offer the world, in theory, but in practice it seems to be pretty much dead. True, there is probably more going on with Prolog now than, say, a decade ago. But widespread use is not happening, nor is inclusion of logic-programming features into other languages. Further, I don't see this changing in any truly significant way in the near future.

Apparently, you do see it changing. Would you care to comment on that?

I will make some heretical statements. I think unit-testing sucks. I think current type systems are overrated.

Logic programming + meta-programming as well as Logic programming powered RTEPLs (Read-Typecheck-Eval-Print-Loop) can give us optional rich, strong guarantees without adversely affecting runtime performance. This is a space that needs more exploration.
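To make that concrete, the operation at the heart of any logic engine is unification. A minimal, illustrative Python sketch (the variable/term encoding - uppercase strings as variables, tuples as compound terms - is my own convention):

```python
# Minimal unification: variables are strings starting with an uppercase
# letter; compound terms are tuples; anything else is an atom.
def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings until we hit a non-variable or an unbound var.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None  # clash: terms cannot be made equal

s = unify(("point", "X", 2), ("point", 1, "Y"), {})
print(s)  # {'X': 1, 'Y': 2}
```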

Two visible things to come out of logic programming are Erlang and constraint programming. Both are very significant, in their own niches.

I don't think Prolog is dead yet, either.

Erlang is not a logic programming language. They started with a logic programming language and eventually removed every feature that made it a logic programming language.

Correct. What's left in Erlang from Prolog is the pattern-matching syntax for function definition, where you have multiple clauses that are tested sequentially to see whether the runtime args match the number and types of the parameters in each clause (and values, if guards are used). No unification/backtracking involved.

Data parallelism is something you missed, especially if you think Haskell will get bigger.

Decades ago, lots of programming languages were created that hardly anybody knows now, including some that would definitely be considered very specific, research-oriented, etc. This isn't really a new thing. See for example Jean Sammet's _Programming Languages: History and Fundamentals_.

Edit: Bergin & Gibson's _History of Programming Languages, Volume 2_ is pretty good, too, but that one covers more recent languages: Prolog, C, Forth, Lisp, Icon, etc.

It's not a bad idea to read about old experimental designs, particularly those that didn't work out because of limited hardware. I think there's a lot of potential in the APL and concurrent / constraint logic programming families.

Thanks for the references. I just wanted to take this opportunity to thank you in general for the high quality of your posts.

Thanks. My academic background is in historical research, so I take references pretty seriously. :)

A lot of research has not made its way into any language yet. In particular, the data models and optimizers of current languages are woefully inadequate. Most mainstream programming languages are still in their infancy, being nothing more than glorified assembly.

The future will be more about declarative programming - write down a mental model of the program, and the language/compiler will do the rest. A simple restricted example is SQL - you write what data you want and the optimizer figures out the best "program" for the query, even using genetic optimization in the case of PostgreSQL. Another example is data binding - you write down what data connects to what part of the GUI, and the framework figures out what to update and when. The problem with SQL, data binding etc. is that they are not tightly integrated into a general-purpose language, and do not have clear theoretical underpinnings.
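A minimal illustration of that declarative style, using Python's stdlib sqlite3 (the table and data are invented for the example):

```python
import sqlite3

# Declarative style: state *what* data you want; the engine's optimizer
# decides *how* to get it (scan strategy, join order, etc.).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, amount INTEGER)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("alice", 30), ("bob", 20), ("alice", 50)])

# The query says nothing about loops, indexes, or lookup strategy.
rows = db.execute("""SELECT customer, SUM(amount)
                     FROM orders GROUP BY customer
                     ORDER BY customer""").fetchall()
print(rows)  # [('alice', 80), ('bob', 20)]
```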

A good language will have simple, compact theories and abstractions as general as possible to reduce the mental baggage necessary for programming - instead of remembering hundreds of special cases, you should only work with a couple of general constructs.

Optimizers for these languages will have to be far more advanced - remember state between compilations to reduce the impact of whole program optimization, have advanced specialization and type checking capabilities using abstract interpretation etc.

Data models will have to grow too - they will have to be high-level and low-level at the same time to cope with the onslaught of data. The semantic web provides a fairly universal data model with RDF/OWL, but this again could be simplified and abstracted. A data model should also have the capability to specify the physical layout of the data down to the bits, but also at the higher level, such as distribution between disks and machines. Ah, finishing now to avoid tl;dr.

I for one wish that implementations of future languages provide the following features: a) some extension mechanism for the language, like CLOS etc., where hooks are provided for executing code; b) code walkers; c) documentation about the internals. I don't mind the language implementation being a tad slower because of simplicity, but the core has to be grokkable and extensible by the end programmer.

Existing languages suffice for most easy problems. For the hard problems that I've been tackling lately, I've wished I could overcome some logical impedance between what I am doing and the language in a sane way. A few examples where additional flexibility would help: a) OpenGL is a state machine. Being able to take the graph of my program and write assertions that critical setup functions are called before other GL functions would help detect invalid logical states. b) Before/after functions (which exist in Lisp) would be nice. EDIT: c) Being able to say -> for all objects in my program that match this criteria, do something. Essentially -> for x in criteria(primitives(program)) do foo.

EDIT: A common thread to all the times I feel trapped as a programmer is when I have a knowledge of the meaning of my program which I want to express, or perhaps a question about its existing implementation which I would like answered. Many languages lack the introspective power to help me as a programmer to tackle these situations. Others simply make it inconvenient to do so.
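Some of that introspective power can be approximated today. A hedged Python sketch of the "for all objects matching this criteria, do something" idea, using the runtime's own object tracking (the Primitive class and predicate are hypothetical):

```python
import gc

class Primitive:
    def __init__(self, name):
        self.name = name

a, b = Primitive("triangle"), Primitive("quad")

# "for x in criteria(primitives(program)) do foo": ask the runtime for
# every live object it tracks, then act on those matching a predicate.
def for_all(criteria, action):
    for obj in gc.get_objects():
        if isinstance(obj, Primitive) and criteria(obj):
            action(obj)

found = []
for_all(lambda p: p.name.startswith("t"), lambda p: found.append(p.name))
print(found)  # ['triangle']
```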

I think VMs are the future of application programming, especially as manufacturers become more amenable to using "unconventional" architectures like ARM to host full-featured computers. LLVM, the JVM and the CLR all have benefits- I don't see any reason to believe there will be a convergence in the near future.

Environments like .Net and the JVM stack will slowly allow tighter and finer-grained interoperability of the languages they host. We currently have class-level blending of languages, and in the future we will probably see method-level blending. The best DSL for the job.

Those VM designs are pretty heavily skewed towards class-based OO languages, though. Most VMs have some kind of behavior (polymorphic method lookup, unification, message-passing) that they do very efficiently because of the language they target, but others are omitted. For example, it isn't possible to do real tail-call optimization on the JVM without a trampoline, AFAIK. If it had been created for (say) ML, TCO would be a given, but OO stuff would be an afterthought.
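The trampoline workaround mentioned above looks roughly like this (sketched in Python, itself a runtime without TCO):

```python
# A trampoline: the standard workaround on VMs without tail-call
# optimization. Each "tail call" returns a thunk instead of recursing,
# and a driver loop bounces until a non-callable final value appears,
# so the stack never grows.
def trampoline(f, *args):
    result = f(*args)
    while callable(result):
        result = result()
    return result

def countdown(n, acc=0):
    if n == 0:
        return acc
    return lambda: countdown(n - 1, acc + n)  # thunk, not a direct call

print(trampoline(countdown, 100000))  # 5000050000, no stack overflow
```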

The Erlang VM is another interesting platform, albeit one heavily skewed towards fault tolerance, concurrency, message passing, etc.

LLVM is not skewed that way.

Very true.

I think (hope?) that the next wave in hacking will be understanding programming paradigms better as hackers - the specifics of language choice will probably matter a little less. I like the idea of languages that support multiple paradigms internally (e.g., you can embed logic programming into your functional programming language). A while back I spent some time reading Concepts, Techniques, and Models of Computer Programming and I wish I had found this book earlier in my career. I probably wasn't ready for it though.

This aspect of software development was largely missing from my formal educational experience in programming language paradigms.

What I am trying to do now is develop a better "taste" for what is "easy" using one programming paradigm compared to another. I'd ultimately like to have a better problem to paradigm mapping internalized. I've toyed with the idea of putting together a seminar or undergrad course to do flesh this out.

The elephant in the room is more market-based - what programming languages will someone pay you to use in the future? We already have a number of interesting programming languages. But when I do a job search these days, I see a small number of large buckets: the .NET/CLR C# world, Java in the enterprise, Ruby (really the Rails framework, but even so), and a strong side of the data storage backend of your choice (RDBMSes or NoSQL or sexps - kidding on that, pg keeps our own forum in files full of sexps).

I'm surprised you included Rails in that list. While there are quite a few Rails jobs, it pales in comparison to what's available for .NET and J2EE. Even PHP blows it away by pure number of jobs.

You know, you're right. Probably a sign of spending more time on HN and peeking around my local community of devs than a reflection of a larger trend.

A quick look around found that Java, C++, C#, JavaScript, and Perl (?) are big in job listings at Dice.com right now. http://duartes.org/gustavo/blog/post/programming-language-jo...

And the always friendly tiobe index at http://www.tiobe.com/index.php/content/paperinfo/tpci/index.... is useful when thinking about current programming language trends.

Unfortunately those search queries are worse than "ballpark" figures as they're mostly based on keyword matching, not the amount of code the developer is writing in that language for that job.

There are certain search queries that will always hit high. Of course, JavaScript is going to be in almost any job posting that's web-related, even if it's not true hardcore JS coding. Perl is used heavily for development automation, so it's also going to have a big showing, even if it isn't the core language. You'll also tend to see lines like "Previous scripting experience with Perl, Python, Ruby a plus" for Java postings.

If it could be monetized, doing more sophisticated data mining into job postings for actual popularity trends would be awesome, especially with a decent granularity. It wouldn't be difficult to train a supervised ML algorithm with a set of keyword-tagged job postings with weights as to how significant a certain set of skills would be used at a job for a certain job posting.

It's overrepresented in the startup/emerging company job market.

I think that eventually some programming tasks will become highly specialized, yes, and use custom and very specific languages depending on the field and the problem at hand and may end up with their own degree programs, etc. There's already hints of this sort of thing right now. It's very hard to transplant a programmer from (for example) web development to 3D game engine development - it has nothing to do with language, though. There's just a huge set of knowledge required for each that has very little overlap - even if the languages used happened to currently be the same in some cases.

It doesn't make sense to forever expend the effort required to force every problem into just a handful of languages' structures - even if it is theoretically possible to do so.

I think that things like OMeta (http://tinlizzie.org/ometa/) are an important piece and the other work being done by Viewpoints Research (http://www.viewpointsresearch.org) could help.

Aside from programming paradigms, my Frankensteinian view of the future is as follows: JavaScript VMs become so good that other languages start being built on top of them. The future is V8/Rhino... instead of the JVM for new languages.

I can certainly see Clojure in JS pretty soon. This might be wishful thinking on my part (see below for why).

"Will game development ever diversify or will it go from C++ to C# to ..."

Yes, WebGL is in my view a game changer (pun intended). I'm pretty biased, considering I currently use it a lot. But I'm replacing scientific applications in C++ with WebGL versions online.

Finally, I don't see a slowdown in new languages popping up any time soon. I think it's important that they can run on some generic VMs to allow for multi-language apps to be possible.

I really hope that game and other performance-dependent development moves to Go or something like it, rather than being stuck in C/C++ land.

The advantages of modern syntax design coupled with a fast native compiler would be a potent and exciting mix.

The good news is that computers seem to be fast enough to allow game developers to work in less than speed-optimal languages, so long as they are willing to accept some stylization. Consider Minecraft, easily the hottest indie title in years, and it's written in straight-up Java. (As far as pure numerical computation is concerned, Java is actually a pretty good match for C++ these days, but 3d graphics remain a fairly serious bottleneck.)

If we're shooting for the stars, though, I'd personally prefer something more ambitious than Go, like GOAL[1].


Game developers don't only use it for speed. Mainly for portability. With C/C++ they can make (portions of their) code run on Xbox/PS3/Wii/DS/iPhone/whatever.

Very true.

Syntax doesn't really matter in terms of performance - what matters is compiler optimizations. Theoretically someone, if they had a mind to, could optimize a Python compiler to run just as fast as C (or at least as fast as a non-statically-typed language can run).

With that in mind, perhaps some sort of modification to the JVM (or a competing VM that isn't controlled by Oracle) that compiles aggressively to raw machine code would be an ideal future - the advantages of whatever language you want to use + speed and portability would be awesome.

LuaJIT (http://luajit.org/) performs quite well. While I can't say it'd be impossible, optimizing Python similarly would be much harder - Lua's tiny implementation and tendency to do everything in terms of a small and orthogonal group of concepts means it has far less that needs to be optimized. Python is pretty hairy in comparison.

Lua seems to have a similar advantage over JavaScript, as well. LuaJIT beats JavaScript's V8 by a wide margin (http://shootout.alioth.debian.org/u64/benchmark.php?test=all...). It's not like JavaScript implementers lack resources, either. LuaJIT is the work of one person.

> Syntax doesn't really matter in terms of performance - what matters is compiler optimizations.

Yes and no. Slight semantic differences can close off optimizations. There was a great discussion on LtU that pulled in a lot of JIT developers. Mike Pall (LuaJIT) tossed out some optimizations the TraceMonkey developers could exploit, but Brendan Eich pointed out that JS can't use them:


Huh, cool - I never really looked into what goes into doing compiler optimizations.

What sort of gains would we be looking at with syntactic language differences taken into account?

For instance, suppose I wrote some program in C, compiled, and ran it with the latest GCC/LLVM. If I wrote a program in Python (or Lua) and compiled it down to bare bytecode with a comparably optimized compiler - no interpreter or JIT happenings - would there be a large number of optimizations that just couldn't happen for one language or the other?

You have to think about what invariants the compiler is capable of recognizing in your code, and can prove are valid: "If I guarantee X, it can do optimization Y behind the curtain, but will still run as if things were compiled normally."

For example, if all variables are immutable by default, they can be inlined at point of use, skipping a lookup. Functions whose arguments are all known can potentially be run once at compile time (partial evaluation). Collections can potentially be handled in parallel if each cell's processing is independent. Etc. This sort of thing is why languages with strong invariants (such as Haskell or Erlang) can do really interesting optimizations.
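A tiny taste of compile-time evaluation that even CPython performs: expressions made only of constants are folded before the code ever runs. An illustrative check using the stdlib dis module:

```python
import dis

# 60 * 60 * 24 is made entirely of constants, so CPython's peephole
# optimizer folds it at compile time; the multiplies never execute.
def seconds_per_day():
    return 60 * 60 * 24

# The compiled bytecode loads one precomputed constant -- there is no
# multiply opcode in it (opcode names vary by CPython version).
instrs = [i.opname for i in dis.get_instructions(seconds_per_day)]
assert "BINARY_MULTIPLY" not in instrs and "BINARY_OP" not in instrs
print(seconds_per_day())  # 86400
```

Languages with stronger invariants can push this much further, folding whole function calls (partial evaluation) rather than just literal arithmetic.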

On the other hand, if the language semantics require that everything is polymorphic and has to be looked up at runtime, that adds extra overhead, and it's not always provable what those values will be at compile-time. A JIT-compiler can compile at runtime, when the information is available, but since they're usually not able to pause execution for long, they can't do extensive analysis. JIT compilers can also make optimizations not statically available because they can revert to the non JIT'd code and recompile differently, whereas static compilation is permanent. (Method lookups can also be cached, of course.)

Incidentally, normal Lua (i.e., not LuaJIT)'s compiler doesn't do much analysis - it's tuned for vacuuming up huge dumps of structured data, rather than trying to generate optimal bytecode. Lua usually still runs significantly faster than Python or Javascript, but that has more to do with the clean language semantics and high-quality implementation.

Also, a good comment by Mike Pall (the LuaJIT implementer): http://www.reddit.com/r/programming/comments/badl2/luajit_2_...

Well, for one thing, Python guarantees that objects can be dynamically modified (methods can be rebound, new methods can be added, methods can be removed.. same goes for data members. Python can do this because the methods are stored in a dictionary which gets looked up by method name at runtime). This will never be as fast as a static function call in C or even an indirect vtable-based call (indirect pointer access vs hash table lookup).
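A self-contained Python sketch of the rebinding being described:

```python
class Greeter:
    def hello(self):
        return "hi"

g = Greeter()
print(g.hello())  # hi

# Methods live in the class's dict and are looked up by name at call
# time, so they can be rebound after instances already exist.
Greeter.hello = lambda self: "bonjour"
print(g.hello())  # bonjour

# That flexibility is exactly what forces a dictionary lookup on every
# call, versus a static or vtable-dispatched call in C or C++.
```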

That is a single reason why Python can never be compiled to run as fast as C. I'm sure there are plenty of other features that help or hinder performance optimisations. It's all about tradeoffs. (Of course, a sufficiently advanced language may allow you to choose these tradeoffs at a finer grain than the language level.)

Regarding "3D graphics remaining a serious bottleneck": any serious 3D graphics is done on the GPU. C++ does not have that much of an advantage over Python or Java, as long as the 3D computations are done on the GPU.

For game development, I think it will diversify with the increasing ease of developing and selling indie games that do well. There are lots of languages getting perfectly good game development libraries that can definitely be used for the (often simple looking) indie game market.

However, I don't think the big players in the industry will be able to follow. To deliver games of great scale (WoW) or intense graphics/effects, you're going to need something that can deliver the absolute best performance.

The world is already filled with hundreds (and probably thousands) of programming languages, most of them fantastic in one way or the other.

Some people will prefer to stick to one language, some will dabble in others, but... most of the time, if the language is not radically different from everything you knew before, learning a new one isn't that hard.

I don't see this pattern changing in the foreseeable future. I mean, some areas always had their niche programming languages (heck, most areas do, anyway).

The language of the future is Lisp. Always has been, always will be. (Which is a shame, because APL and its children J and K would be a better foundation.)

> J and K would be a better foundation

A better foundation for...what? Many important programming domains don't fit naturally into the array-oriented model.

The array model fits the relational model way, way better than the object-oriented model does (no "O/R impedance mismatch"). I think most domains fit just as well (if not better) with arrays.

And specifically, the APL/K/J focus on data makes it a better foundation for optimization, parallelization and reasoning about program behaviour.
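An illustrative sketch of that data-first, columnar view in plain Python (the "relation" here is invented for the example):

```python
# APL/K-style thinking: a relation is a set of parallel column vectors,
# and a "query" is vector arithmetic plus a boolean mask -- no per-row
# objects, which is what makes whole-array optimization tractable.
customer = ["alice", "bob", "alice"]
amount   = [30, 20, 50]

mask  = [c == "alice" for c in customer]        # boolean selection vector
total = sum(a for a, m in zip(amount, mask) if m)
print(total)  # 80
```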

>And specifically, the APL/K/J focus on data makes it a better foundation for optimization, parallelization and reasoning about program behaviour.

I'll give you the first two points (though some heavyweight optimizations become necessary to eliminate big wasteful temporaries), but certainly not the last. At least not in general: recursion is a terrible pain with arrays! Even stateful object-oriented languages deal with inductive structures better than array languages.

What does tree processing in APL look like? I'm sure it's possible, and I'm also sure it's hell. The K/Q approach of nested vectors is a first step towards making recursion tolerable in an array language, but it's still a kludge.

As Sir A. C. Clarke said (or at least to paraphrase), predicting the future is at best a crap shoot. That said, I know what I'd like to see. As suggested elsewhere, much of what we see in 'new' languages is syntactic sugar bolted onto existing approaches; new on the outside, same old on the inside. I'd like to suggest that that is not necessarily a bad thing. Consider what 'syntactic sugar' is: a tweak or rephrasing of an older approach to accomplish some task in the language in question. What if these were recognized as less sugar and more substance? The purpose of a computer language is to communicate with the computer first, and with innocent bystanders second. Syntactic sugar preserves the first and enhances the second - a win-win, if you will. I'd like to see a language designed from the bottom up with that approach in mind. See if we can't come up with something that more clearly bridges the gap between programmer and machine. By now we know the variety of things that should be built in; now let's concentrate on the interface...

They tried that. What came out was Perl. Lots and lots of layers of syntactic sugar layered on syntactic sugar layered on more sugar, so you now have lots of sweet ways to do the same thing. But between all the sugar it's becoming harder and harder to see the real substance it was all about. The computer doesn't have any problem crunching through the sugar (it doesn't have any teeth to worry about), but as a programmer it doesn't become easier to recognize the vegetables if your cauliflower is sometimes covered in marshmallow, other times drenched in syrup, and the third time someone made a half-hearted attempt to caramelize it. (Of course nobody serves the cauliflower as just plain cauliflower anymore.) So yes, you cook for the computer first; innocent bystanders don't need to know what went into your program.

In the real world, the innocent bystanders matter, code is written as much for other people to understand as it is for computers to execute. I think that instead of plastering over all your content with sweeties, languages should be designed in such a way that the sugar is not necessary by choosing the right fundamental concepts, so programmers can understand what's going on. Learn to cook with the right ingredients, and learn when to add spices. And when not to.

For another example of a syntactic sugar friendly design, have a look at C macros.

I think we'll see greater use of type inference in statically typed languages. I hope that we get rid of the "kinda strong" type systems (like C++'s). While I appreciate the convenience of such type systems, I think greater type safety is worth the trade-off.

Ok, I thought it was interesting enough to do my own take on it:


I don't know what the future of programming languages will be, but I hope functional programming will take over the world, more and more people will try to prove their algorithms correct (with Coq, for example), and cool type systems like System F or dependent types will be used more.

The trend of niche languages (e.g. Objective-C for iPhone) isn't actually a new trend. For example, the high-performance computing (HPC) community has used Fortran nearly exclusively for decades. The number of niches is increasing, though, so I expect further diversification of languages. E.g. game development is not C++/C# only. Most games include at least a scripting language (Lua, Python, JavaScript), and many feature some online league, which may be implemented with typical web languages (Java, PHP, Python, Ruby, ...).

In terms of programming language features, I expect the next hot topics to be:

Dependent types - though the question of whether a type system should be Turing-complete is not answered yet.

Optional types for dynamically typed languages. Common Lisp has had this for centuries, of course, but now Python has introduced the syntax, and Clojure etc. also support this.
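In Python's case the annotations are pure metadata at runtime; a minimal sketch (enforcement would come from an external checker such as mypy, not the interpreter):

```python
# Optional/gradual typing: annotations are attached to the function but
# ignored during execution -- wrong types are not rejected at runtime.
def scale(v: list[float], k: float) -> list[float]:
    return [x * k for x in v]

print(scale([1.0, 2.0], 3.0))       # [3.0, 6.0]
print(scale.__annotations__["k"])   # <class 'float'>
```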

Various mixes of concurrency-related concepts. There are lots of ideas currently (see X10, Clojure, Go, D, Haskell, ...), but no sweet spot has been found yet.

Arc was criticized when it came out for being basically a few macros on top of Scheme. I don't think that's a valid criticism. I don't know if Arc is the hundred-year language, but whenever the hundred-year language does come out, it might easily be some carefully-chosen macros on top of Scheme, macros that help write other useful macros.

I mean, arc hasn't really 'come out,' has it? It's still being worked upon... the front page of that site specifically says the only reason there have been releases is to improve the language.

Add fexprs and a system F-based type system to lisp/scheme, and I might just go there...

The language(s) of the future will be fifth-generation languages. Current languages are not able to support more than 100M lines of code in one project or more than 10-20 years of continuous development.

In future, we will have something like that:

#... We wrote a large project using HTML7, but HTML8 was just released ...

$ spm update


12826 source files will be updated, 2324 new source files will be added, 343 source files will be deleted. Proceed? [Y/n] y



#... OK, let's continue our development ...


I have a demo. I hate developing in any 3/4GL language now. :-)
