

The next big language - newgame
http://eugenkiss.com/blog/2010/the-next-big-language/

======
8ren
New languages are carried on the backs of new platforms. eg: unix & c; web &
javascript.

To predict the next big language, predict the next big platform. The upcoming
platforms are: smart phones & tablets; cloud computing & many-core. The cloud,
being connected services, is largely language-agnostic, and many-core might
end up being implemented by borrowing whatever works in the cloud. The needs
of the above seem well-met by established languages, leaving little
opportunity for new languages to emerge.

What about the next big platform after the above? It's probably more than 10
years off, but Moore's law says smaller devices will come. If disruptive,
they'll be attractive to new audiences, with different needs - perhaps along
the lines of cochlear neural implants (already big business) or garage
genetic-engineering. What languages do those guys happen to be using? They
will be carried to success.

~~~
lkrubner
The next big platform is multiple cores/multiple CPUs. The next big language
is functional and helps deal with consistency across time and across CPUs.
Therefore, the next big language is Clojure. Rich Hickey said it best:

"If somebody hands you something mutable—let's say it has methods to get this,
get that, and get the other attribute—can you walk through those and know
you've seen a consistent object? The answer is you can't, and that's a problem
of time. Because if there were no other actors in the world, and if time
wasn't passing between when you looked at the first, second, and third
attribute, you would have no problems. But because nothing is captured of the
aggregate value at a point in time, you have to spend time to look at the
pieces. And while that time is elapsing, someone else could be changing it. So
you won't necessarily see something consistent.

For example, take a mutable Date class that has year, month, and day. To me,
changing a date is like trying to change 42 into 43. That's not something we
should be doing, but we think we can, because the architecture of classes is
such that we could make a Date object that has mutable year, month, and day.
Say it was March 31, 2009, and somebody wanted to make it February 12, 2009.
If they changed the month first there would be, at some point in time,
February 31, 2009, which is not a valid date. That's not actually a problem of
shared state as much as it is a problem of time. The problem is we've taken a
date, which should be just as immutable as 42 is, and we've turned it into
something with multiple independent pieces. And then we don't have a model for
the differences in time of the person who wants to read that state and the
person who wants to change it."

<http://www.artima.com/articles/hickey_on_time.html>
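Hickey's "February 31" scenario can be sketched in a few lines. This is a minimal, hypothetical mutable date class invented purely for illustration (real date libraries are typically immutable or validate on assignment); it shows the invalid intermediate state a concurrent reader could observe mid-update:

```python
import calendar

# A hypothetical mutable Date class, invented here to illustrate Hickey's
# point; real date libraries are typically immutable or validate on write.
class MutableDate:
    def __init__(self, year, month, day):
        self.year, self.month, self.day = year, month, day

    def is_valid(self):
        return (1 <= self.month <= 12
                and 1 <= self.day <= calendar.monthrange(self.year, self.month)[1])

d = MutableDate(2009, 3, 31)  # March 31, 2009
d.month = 2                   # mid-update: "February 31, 2009"
assert not d.is_valid()       # a concurrent reader here sees an invalid date
d.day = 12                    # only now is it February 12, 2009
assert d.is_valid()
```

An immutable design constructs the new date in a single step instead, so no observer can ever see the half-updated value.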

~~~
Chris_Newton
> The next big platform is multiple cores/multiple CPUs. The next big language
> is functional and helps deal with consistency across time and across CPUs.

So the popular opinion in the Internet echo chamber keeps telling me, but
somehow I don't buy it.

If you do, please try to answer this simple question: what single application
of widespread importance benefits on a game-changing scale from running on
multiple cores?

It's not office productivity/business automation applications like word
processors, spreadsheets, and accounting packages. They could run just fine on
a typical desktop PC years ago. Sure, it's useful to run multiple applications
simultaneously, but the OS can handle the scaling in that case.

It's not mass information distribution/web applications. The bottlenecks there
are typically caused by limited communications bandwidth or database issues.
While concurrency is obviously a big factor internally in databases, most of
us don't actually write database engines.

It's not games. Most AAA titles today still don't scale up in that way, and
one mid-range graphics card with its specialist processor would blow away a
top-end quad-Xeon workstation when it comes to real-time rendering. Again,
there is some degree of concurrency here, but many intensive graphics
rendering problems are embarrassingly parallel in several ways, so again this
isn't much of a challenge even for today's mainstream programming languages
and design techniques.

I suspect the most likely mainstream beneficiaries of better multi-core/multi-
CPU support would be things where there really is heavy calculation going on
behind the scenes and it's not always uniform: multimedia processing, CAD,
etc.

However, what about the alternative directions the industry might take? The
Internet age has emphasized some basic realities of software development that
as an industry we weren't good at recognising before.

For one thing, many useful tools are not million-lines-of-code monsters but
relatively simple programs with far fewer lines of code. It's knowing what
those lines should do that counts. That means rapid development matters, and
that in turn requires flexible designs and easy prototyping.

For another thing, data matters far more than any particular piece of
software. Protecting that data matters much more in a connected world with
fast and widespread communications, so security is more important than ever,
and we need software that doesn't crash, suffer data loss bugs, and so on.

So I'm going to go out on a limb here and suggest that multi-core/multi-CPU is
_not_ in fact going to be the dominant factor in the success of near-future
languages. I think flexibility and robustness are going to be far more
important.

It may turn out that the attributes of a more declarative programming style
support these other factors as well. It may be that functional programming
becomes the default for many projects as a consequence. But I don't think any
future rise of functional programming will be driven by a compelling advantage
to do with implementing modest concurrency on multi-core systems. That just
isn't where the real bottlenecks are (in most cases).

~~~
vilya
> If you do, please try to answer this simple question: what single
> application of widespread importance benefits on a game-changing scale from
> running on multiple cores?

Computer vision and machine learning both benefit a lot from multiple cores.
They seem to be really big growth areas at the moment and have the potential
to dramatically change the way we interact with computers. It's already
happening: recommendation engines on e-commerce sites are a great example of
machine learning in practice. I believe we're going to see this sort of thing
appearing in more and more places.

Web browsers already take advantage of multiple cores, by the way. The Rust
language is being developed by Mozilla because (one of the three reasons from
the project FAQ) of dissatisfaction with the concurrency support in existing
languages.

I think there's a large opportunity cost to dismissing concurrency &
parallelism at this moment.

~~~
Chris_Newton
> I think there's a large opportunity cost to dismissing concurrency &
> parallelism at this moment.

I'm not dismissing the idea, nor claiming that it is not valuable for any
application. Clearly that parallelism would have value to a significant number
of projects, which perhaps don't make best use of the host hardware today. I'm
just trying to keep the multi-core idea in perspective, relative to other ways
our programming languages might improve.

Better multi-core support can get you a constant factor speed-up in
computationally expensive work, but Amdahl's Law tends to spoil even that. On
the other hand, a language with a type system that allows you to prevent
entire classes of programmer error could lead to a step change in security or
robustness. A language expressive enough to capture the developer's intent in
ways that today's programming models do not could lead to entirely new
techniques for keeping designs flexible and supporting new rapid development
processes. It could also create opportunities for optimisers that bring the
performance of more expressive languages to a level where they compete with
the lower-level languages used in the same field today for speed reasons. I
suspect that across the field of programming as a whole, such improvements
would be far more widely applicable.
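The Amdahl's Law point above can be made concrete with a few lines. This sketch assumes a fraction p of the work parallelizes perfectly across n cores:

```python
# Amdahl's Law: overall speedup when a fraction p of the work
# parallelizes perfectly across n cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelizable, 16 cores yield far less
# than 16x, and the limit as n grows is 1 / (1 - p) = 10x, no matter
# how many cores you add.
print(round(speedup(0.9, 16), 2))  # 6.4
```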

------
binarymax
JavaScript is going to get much bigger. My money is on that especially with
node.js and html5 in the game. I also noticed F# was not even mentioned,
purposefully or mistakenly?

~~~
fauigerzigerk
JavaScript is definitely a candidate, but I don't think it's going to be
node.js. Doing all IO asynchronously strikes me as very premature
optimization. I prefer the way Go or Erlang deal with these scaling issues.

~~~
beagle3
Funny you should consider async i/o to be premature optimization. To me it
looks like the only reasonable way to build a non-trivial system.

 _Threads_, on the other hand, are often the wrong optimisation --
processes are the right one.

~~~
fauigerzigerk
The distinction between threads and processes is not something that
necessarily influences the logic of our code. If I want to say a() b() c() I
can do that with a thread or a process, and it doesn't matter what a, b or c
actually do. However, with the "IO must be async" stipulation, I have to know
whether any of those functions is an IO call, because if it is, I have to
write it in a completely different way, even if the logic I need is
synchronous. I don't like that kind of interference with my logic.
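The style difference being described can be sketched in Python standing in for any language; `fetch` and `fetch_async` are made-up IO calls, not a real API. The logic is the same sequential a-then-b-then-c in both, but the callback version must be inverted into nested continuations:

```python
# Hypothetical blocking IO call (stand-in, not a real API).
def fetch(url):
    return "data from " + url

# Synchronous style: the sequential logic reads exactly as written.
def sync_flow():
    a = fetch("a")
    b = fetch("b")
    c = fetch("c")
    return [a, b, c]

# Hypothetical non-blocking IO call that hands the result to a callback.
def fetch_async(url, callback):
    callback("data from " + url)

# Callback style: the same sequential logic, turned inside out into
# nested continuations even though nothing about it is concurrent.
def async_flow(done):
    def got_a(a):
        def got_b(b):
            def got_c(c):
                done([a, b, c])
            fetch_async("c", got_c)
        fetch_async("b", got_b)
    fetch_async("a", got_a)

results = []
async_flow(results.append)
assert sync_flow() == results[0]
```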

~~~
beagle3
Well, then with very high probability your logic is broken in the face of
errors and malicious users.

Node's async structure is just taking the async model of the web one step
further, into processing the same request asynchronously if you need to wait
for something else.

You might be master of doing this logic right, but I've audited tens (perhaps
hundreds) of systems, and all the synchronous ones got it wrong.

(Disclaimer: I've never worked with Node. But I've been writing async servers
in C and Python since 1999).

------
csomar
My sense tells me "JavaScript". The rise of the Web (especially Web
applications), HTML5, tablets and mobile applications, server-side JS... I
think JavaScript can make a big move. Learn one language and use it in
anything you'll ever dream of.

With its flexibility, extend the language to support any feature you'd like.
Make it object-oriented, prototypal, procedural, your own...

------
hedgie
Don't we pick our tools to fit the job? The next big language will be
determined by the needs of the largest computer user groups. The particular
needs of these groups will determine the languages that eventually reach
critical mass.

Using 5 universal criteria that characterize good programming languages is the
wrong approach. For example, in safety critical systems his metric of
conciseness is worthless. Ada may sometimes take more lines of code to
accomplish a specific task than C but the requirements of safety critical
platforms demand the extra effort and cost. The metric is worthless because it
isn't picked to coincide with the needs of the users.

No one cares that the strong typing or other features in Ada can slow down
coding and annoy programmers because this practice eliminates many bugs in the
final product. The needs of the end-user for bug free code outweigh their
needs to ship quickly and please developers. It is a tool used for a specific
job that it handles quite well.

Evaluating a programming language with 5 universal metrics ignores that they
are tools used to solve the problems of a particular group of users. If you
want to figure out what the next big language will be ask what platform will
have the most users in the near future and what the needs of those users are.
The languages that meet them best are the top choices.

------
ericb
I don't really understand his justification for excluding Erlang. It IS a
general purpose language, and I can't see it as any more niche than some of
the obscure languages he includes (Clay?).

~~~
hassy
No curly braces, no hashtables, awkward strings, doesn't look object-oriented.

(This coming from someone whose favorite language is Erlang.)

~~~
ericb
All true, but that isn't his justification in the article. He skips Erlang by
damning it as confined to a niche, like PHP.

Edit: Reia would be worth discussing if those are major obstacles.

------
edsrzf
I find it interesting that he says Go's syntax is ugly, but that Clay
programs look so good and readable. The two languages have a very similar
basic syntax, and some of Go's differences are an improvement in my book: no
semicolons, fewer parentheses.

------
michaelchisari
I had an idea the other day about being able to write client-side browser apps
in Google Go. I found it very intriguing.

My votes are for Go and JavaScript. One has an 800 lb gorilla behind it, and
the other has a ton of momentum, and I agree with the other poster that the
maturation of server-side JavaScript and the ability to use one language on
both ends is very appealing.

------
rbarooah
Strange that nobody's mentioned Scala yet. Clearly it has its detractors, but
it seems like a credible candidate to displace Java. It also has more
traction than a number of these languages, and more importantly there's a
huge number of Java-based developers and organizations who are under pressure
to improve productivity and can adopt it incrementally.

It seems to me to have fewer barriers to becoming a major language than most
of the other contenders.

~~~
gclaramunt
Indeed, especially with the pretty high-profile companies using it: LinkedIn,
Twitter, Foursquare.

------
johnnytee
I think server side JavaScript is the future. One language to rule the web's
front end and back end.

~~~
jpr
That would be consistent with the tendency of the majority of programmers to
select the crummiest possible language because some platform that
necessitates its use fluked into popularity.

------
ohhmaagawd
I'd love to see Mirah hit it big (statically typed Ruby on JVM). Dunno if it
will tho: <http://www.mirah.org/>

------
twymer
I was a little surprised to see D listed as one of the main contenders. I
guess Clojure and Go were obvious suggestions. Go has the backing of Google
and aims to be everything we want (the fun/ease of Python, the speed of C++),
and Clojure is the ancient, popular Lisp with a modern twist.

I guess I would have expected JavaScript and Groovy to be included. Though I
might be biased: I worked with a client using Groovy, and in that short time
saw that there was a lot of use of it in the (local) industry.

~~~
stcredzero
_I was a little surprised with D being listed as one of the main contenders. I
guess Clojure and Go were obvious suggestions._

Near the end he references some mailing-list debate, then comments that Go
emphasizes simplicity while D emphasizes features, therefore D will win,
because to him, language features rock. To me, this reveals that much of his
"reasoning" is driven by that prejudice.

Nothing to see here. Move on!

------
melling
Imagine that you want to rewrite OpenOffice from scratch. What language would
you use? That should be the next big language.

~~~
arethuza
If you were going to do that I think choice of language would be the least of
your worries!

Having said that, I would probably go with a set of low level
modules/services, probably written in C or C++, held together with JavaScript.

------
mmphosis
I think the next big language will actually be a small language.

------
woadwarrior01
The Clay mention makes me really happy. Years ago, I was part of the small
team that was working on its first incarnation, though it's a completely
different language now and I'm sure it's been through half a dozen rewrites
since then.

------
skybrian
I think a new language could win by being more portable. A compiler that
targets Intel and ARM is no longer interesting. But a nice language with a
compiler that can create libraries usable in JavaScript apps, iPhone apps,
Android apps, and App Engine apps would be very interesting.

------
tszming
Seems many people here have high expectations for JavaScript, cool! :)

Also, be sure to check out JavaScript's kid brother, CoffeeScript:

<http://jashkenas.github.com/coffee-script/>

------
rbanffy
"compile-time correctness"?!

All a compiler can say is that it compiles. It has no idea if the program does
what you want.

~~~
evilthinker
Still, that is more than one can say about most dynamic languages.

~~~
rbanffy
And how much does this "more" really mean?

All it says is that the program pieces fit together. The small benefit of
knowing my potentially wrong program got the types of whatever is being passed
around right gives me very little comfort when I think about the flexibility I
lose by using statically typed languages.
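As a minimal illustration of what "the pieces fit together" does and doesn't buy you: in a dynamic language the mismatch below surfaces only when that code path actually runs, whereas a statically typed language would reject the call at compile time. Neither, as rbanffy says, knows whether the sum is the number you actually wanted.

```python
def total(prices):
    return sum(prices)

# The mistake: strings where numbers were intended. A dynamic language
# defers the error to runtime, and only on the path that executes it.
try:
    total(["4.99", "2.50"])
    caught = False
except TypeError:
    caught = True

assert caught  # failed, but only once this line actually ran
assert abs(total([4.99, 2.50]) - 7.49) < 1e-9
```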

------
robryan
Don't want to get into a language merits argument, but how can you consider
the most widely used language on the web niche?

~~~
frou_dh
PHP? Computing is more than web pages.

I'm rooting for Clojure. Rich Hickey's presentations detailing his rationales
make sense and are quite inspiring.

~~~
rbanffy
And Clojure makes a lot of sense in many-core scenarios.

------
aufreak3
No mention of Arc here? It's the language behind HN after all :)

------
r0sebush
Erlang.

