
Another go at the Next Big Language - iand
http://dave.cheney.net/2012/09/03/another-go-at-the-next-big-language
======
InclinedPlane
Here's the core problem with next generation languages. Languages that come
out of academia focus too much on syntax and computer science level
functionality, and it's extremely rare for a language of that sort to make it
in the real world. The languages we use today either come from big companies
with the ability to promote anything long enough to get traction on any
language that is at least "good", or they come from the "streets": from
extremely small teams who create incredibly flawed languages that are
eminently practical and go on to rule the world. Perl, Ruby, JavaScript, PHP.

It's "worse is better" again in spades. Ivory tower language designers try to
come up with perfection when what we really need is to improve on the basics.

The next big language is probably not going to be something like Haskell (as
nice as all that functional purity is) it'll be something that builds
profiling and unit testing and better source control support right into the
language, compiler, and tools.

Edit: if you look at where the average developer is spending most of their
time and especially where the majority of the pain is it's typically in things
like testing, debugging, performance profiling and optimization, and
deployments. And if you look out there in the field you'll see lots and lots
of awesome tools and systems helping people tackle those problems. But it's
exceedingly rare to see a new language which approaches those problems or
tries to codify those tools into first class language features.

~~~
qznc
The main problem of academic languages is that they improve one or two aspects
and neglect the rest. Real world languages must improve one or two aspects
without hurting the rest too badly.

The rest means for example: debugging, IDE, multi-platform, standard library,
performance, deployment

~~~
seanmcdirmid
Some language research does focus on debugging, IDEs, and the standard
library. I don't bother with multi-platform, performance, and deployment, but
I know others who do. Really, we are just individuals; we are not out to create
the NBL, we are out to push things forward and create well-thought-out ideas
that could be included in the next NBL, and we realize that most of our ideas
will fail to make it big time. But perhaps some of them will survive and have
an impact (such is the depressing life of an academic programming language
design researcher).

------
rogerbinns
My kingdom for someone who can figure out how to solve error handling. My code
consists of some reasonably straightforward sequence of actions with a random
smattering of error handling significantly distracting from that.

That error handling code is tedious to write, very time consuming to test (and
often virtually impossible) and usually not run very often. Exceptions at
least let you put the handling code somewhere other than the normal sequence
(although you may still have some finallys), continuations are quite nice, and
the Go/C model pollutes the code but puts the error handling right next to the
error detection.

None of these really solve the problem though. How can I have the least amount
of error handling code possible, how can I test it, and how can I be sure it
is correct, and all while spending my mental efforts on the code that actually
does useful things?

~~~
lkrubner
Lisp had a condition system that was very clever. You still had to write the
code, but you were able to separate the problem, the handling, and the
restart.

This is the problem that comes up in many languages and which needs to be
dealt with:

"Because each function is a black box, function boundaries are an excellent
place to deal with errors. Each function--low, for example--has a job to do.
Its direct caller--medium in this case--is counting on it to do its job.
However, an error that prevents it from doing its job puts all its callers at
risk: medium called low because it needs the work done that low does; if that
work doesn't get done, medium is in trouble. But this means that medium's
caller, high, is also in trouble--and so on up the call stack to the very top
of the program. On the other hand, because each function is a black box, if
any of the functions in the call stack can somehow do their job despite
underlying errors, then none of the functions above it needs to know there was
a problem--all those functions care about is that the function they called
somehow did the work expected of it.

In most languages, errors are handled by returning from a failing function and
giving the caller the choice of either recovering or failing itself. Some
languages use the normal function return mechanism, while languages with
exceptions return control by throwing or raising an exception. Exceptions are
a vast improvement over using normal function returns, but both schemes suffer
from a common flaw: while searching for a function that can recover, the stack
unwinds, which means code that might recover has to do so without the context
of what the lower-level code was trying to do when the error actually
occurred.

Consider the hypothetical call chain of high, medium, low. If low fails and
medium can't recover, the ball is in high's court. For high to handle the
error, it must either do its job without any help from medium or somehow
change things so calling medium will work and call it again. The first option
is theoretically clean but implies a lot of extra code--a whole extra
implementation of whatever it was medium was supposed to do. And the further
the stack unwinds, the more work that needs to be redone. The second option--
patching things up and retrying--is tricky; for high to be able to change the
state of the world so a second call into medium won't end up causing an error
in low, it'd need an unseemly knowledge of the inner workings of both medium
and low, contrary to the notion that each function is a black box.

Common Lisp's error handling system gives you a way out of this conundrum by
letting you separate the code that actually recovers from an error from the
code that decides how to recover. Thus, you can put recovery code in low-level
functions without committing to actually using any particular recovery
strategy, leaving that decision to code in high-level functions.

To get a sense of how this works, let's suppose you're writing an application
that reads some sort of textual log file, such as a Web server's log.
Somewhere in your application you'll have a function to parse the individual
log entries. Let's assume you'll write a function, parse-log-entry, that will
be passed a string containing the text of a single log entry and that is
supposed to return a log-entry object representing the entry. This function
will be called from a function, parse-log-file, that reads a complete log file
and returns a list of objects representing all the entries in the file.

To keep things simple, the parse-log-entry function will not be required to
parse incorrectly formatted entries. It will, however, be able to detect when
its input is malformed. But what should it do when it detects bad input? In C
you'd return a special value to indicate there was a problem. In Java or
Python you'd throw or raise an exception. In Common Lisp, you signal a
condition.

A condition is an object whose class indicates the general nature of the
condition and whose instance data carries information about the details of the
particular circumstances that led to the condition being signaled. In this
hypothetical log analysis program, you might define a condition class,
malformed-log-entry-error, that parse-log-entry will signal if it's given data
it can't parse."

[http://www.gigamonkeys.com/book/beyond-exception-handling-
co...](http://www.gigamonkeys.com/book/beyond-exception-handling-conditions-
and-restarts.html)

Read the whole thing. It is a very clever system, and I wish something like it
were available in other languages. (The lack of it in Clojure was one of the
few things I agreed with Loper about in his anti-Clojure rant:
<http://www.loper-os.org/?p=42>)
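
Go has no condition system, but the core idea in the quote (low-level code offers recovery points, high-level code picks the policy, without unwinding the stack) can be roughly imitated with a callback passed down the call chain. All names below are invented for illustration:

```go
package main

import (
	"fmt"
	"strings"
)

// Strategy is chosen by the high-level caller; the low-level
// parser only offers the recovery points.
type Strategy int

const (
	Skip           Strategy = iota // drop the bad entry and continue
	UsePlaceholder                 // substitute a placeholder entry
)

type Entry struct{ Text string }

// parseEntry "signals" a malformed entry by asking the handler
// what to do, instead of returning an error up the stack.
func parseEntry(line string, onMalformed func(line string) Strategy) (Entry, bool) {
	if !strings.HasPrefix(line, "LOG:") { // toy validity check
		switch onMalformed(line) {
		case UsePlaceholder:
			return Entry{Text: "<malformed>"}, true
		default:
			return Entry{}, false
		}
	}
	return Entry{Text: strings.TrimPrefix(line, "LOG:")}, true
}

func parseLog(lines []string, onMalformed func(string) Strategy) []Entry {
	var out []Entry
	for _, l := range lines {
		if e, ok := parseEntry(l, onMalformed); ok {
			out = append(out, e)
		}
	}
	return out
}

func main() {
	lines := []string{"LOG:a", "garbage", "LOG:b"}
	// High-level policy: skip bad entries...
	fmt.Println(len(parseLog(lines, func(string) Strategy { return Skip }))) // 2
	// ...or keep placeholders, without touching the parser.
	fmt.Println(len(parseLog(lines, func(string) Strategy { return UsePlaceholder }))) // 3
}
```

This captures only the policy/mechanism split; real restarts also give you an interactive debugger and dynamic scoping for free, which a plain callback cannot.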

~~~
krichman
I read this article last week, lost it, and was looking for it yesterday. You
just saved me a long hour of guessing at terminology.

This seems like a large step in the right direction for exception handling,
but I think it still has the problem that the programmer writing the function
that can throw needs to enumerate a number of cases to make it effective, and
the programmer calling that function needs to have documentation ready
describing all the possible restarts.

~~~
6ren
Google's web history, with its toolbar, allows you to search the pages you've
visited before (not just their titles, as in browser history). That is, you
can search the subset of web that _you've_ seen
[http://support.google.com/accounts/bin/answer.py?hl=en&a...](http://support.google.com/accounts/bin/answer.py?hl=en&answer=65396&topic=14150&ctx=topic)

NB: Google will then have all your base, and people on HN have recommended
turning off Google web history altogether (let alone the toolbar!). I mention
it because it is also a killer solution to the common problem you mention.

~~~
technolem
Don't most browser history systems support this on their own?

~~~
dbaupp
I think this allows one to do a full-text search, essentially. I.e. search for
content on the page, rather than just URL/title.

------
nyan_sandwich
I was really excited about Go. Designed by some gurus, seemed to get
everything right, google app engine supported it.

Then I tried to build something.

Java-like verbosity. Meh, I can deal with it.

[]byte and string aren't the same. Whatever, a few extra lines and thought
cycles
here and there, no big deal.

Overly complex library functions. Let me explain this one. In Lua, markdown
(discount) is a single function. In Go, there was a bunch of extra stuff that
just seemed like noise. Likewise for crypto, Base64, stringwriters, bunch of
other stuff.

A bunch of little annoying stuff like that adds up. Eventually I just said
fuck it. Maybe I'm not hardcore enough or something, but now I'm back with
LuaJIT.

Goroutines are cool, tho. Lua's synchronous threads aren't quite the same.
Also google's app engine datastore is sweet. Nice and simple, no screwing
around with SQL. If I had to do systems stuff, I'd reach for Go.

LuaJIT is faster anyways.
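
The goroutine model being praised differs from Lua's coroutines in that goroutines are scheduled concurrently rather than yielding cooperatively on one thread. A minimal sketch:

```go
package main

import "fmt"

func main() {
	ch := make(chan int)
	// Each goroutine runs concurrently and sends its result
	// on the channel when done.
	for i := 1; i <= 3; i++ {
		go func(n int) { ch <- n * n }(i)
	}
	sum := 0
	for i := 0; i < 3; i++ {
		sum += <-ch // receiving blocks until a result arrives
	}
	fmt.Println(sum) // 1 + 4 + 9 = 14
}
```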

~~~
kleiba
* []byte and string aren't the same.

...and how could they be?

~~~
arnsholt
Despite Unicode being something like 20 years old, people still expect strings
to be byte arrays and other assorted lunacy, like reading from a file not
requiring stuff like encodings. "How hard can it be" to just figure out which
encoding the file has, after all...

------
cageface
I still think that the ability to generate self-contained, static binaries is
potentially a huge advantage for Go vs the scripting languages. Instead of
installing an interpreter and a bunch of libraries and fiddling with search
paths you just copy a single file and go (as it were).

Think about what it takes to get something like a PHP photo gallery going vs
what it would take with Go, for instance.

------
jhuni
If you are looking for a new language you are not looking for the right thing.
We already have the language of mathematics and the homoiconic programming
language Lisp. What we need isn't a new language, it's a new platform which
uses Lisp all the way down. Unfortunately, I don't see that happening anytime
soon.

> Rule #1: C-like syntax

Just what we need! Another programming language with C syntax! It's not like we
don't already have thousands of those, none of them better than the others. I
think this new language should be renamed from the next big language to just
another C-based language.

> Personally I had hopes for Clojure, but I realise that the same people who
> think that knowing what a Monad is makes them mathematicians also think
> they’re being hip and edgy by pointing out that Lisp has a lot of
> parentheses.

A more accurate statement would be that Lisp code has a lot of links (pointers
between data structures). Lisp code is a linked data structure; it doesn't
have any parentheses. However, Lisp code is sometimes presented as
S-expressions, which do have parentheses.

~~~
batista
> _If you are looking for a new language you are not looking for the right
> thing. We already have the language of mathematics and the homoiconic
> programming language Lisp. What we need isn't a new language, its a new
> platform which uses Lisp all the way down. Unfortunately, I don't see that
> happening anytime soon._

Just because you (presumably) discovered the hammer of Lisp doesn't mean
everything has to be Lisp-like.

Even more so given that the supposed superiority of Lisp is mostly anecdotal
-- no actual studies on developer productivity, product robustness, etc.: all
anecdotes.

Where is your PROOF for what you say, computer SCIENTIST?

If anything, empirical data favor languages with C like syntax. More programs
we CAN'T DO without have been written in those (from OSs, to office
applications, to embedded systems that power almost everything, to servers of
all kinds) than in Lisps. In fact, the ratio is incredibly small for
world-changing programs written in Lisp (Emacs, which is partly C, and then
what?).

~~~
fauigerzigerk
I would love to see a scientific method to measure developer productivity that
isn't ridiculously flawed in completely obvious ways. So far I haven't seen
one.

What I care about is primarily what makes me productive. I don't care if it
makes anyone else productive but there is a good chance that it might.

Call it proof by induction based on an admittedly shaky prior ;-)

~~~
batista
> _I would love to see a scientific method to measure developer productivity
> that isn't ridiculously flawed in completely obvious ways. So far I haven't
> seen one._

Well, just use the good old empirical method then. Of all the programs out
there that people and businesses need to have, in what languages was the
majority written? Do proponents of older, supposedly superior languages have
an equal body of work to show for it?

> _What I care about is primarily what makes me productive. I don't care if it
> makes anyone else productive but there is a good chance that it might._

I'm fine with that, what I tried to counter-argue was the statement "What we
need isn't a new language, its a new platform which uses Lisp all the way
down.".

~~~
fauigerzigerk
_Of all the programs out there that people and businesses need to have, in
what languages was the majority written? Do proponents of older, supposedly
superior languages have an equal body of work to show for it?_

The obvious flaw of this approach is that the majority of people might have
written their software in a less than optimal language for reasons unrelated
to productivity.

Proponents of niche languages, almost by definition, never have a body of work
to show that is equal to that of the mainstream languages. If they did, they
would _be_ the mainstream.

Or to put it more succinctly: The majority can be wrong.

~~~
batista
> _The obvious flaw of this approach is that the majority of people might have
> written their software in a less than optimal language for reasons unrelated
> to productivity. Proponents of niche languages, almost by definition, never
> have a body of work to show that is equal to that of the mainstream
> languages._

I'm not expecting equal bodies of work. Just show something.

Forget the enterprise, big companies and such. How about lone wolf
programmers? Where are the Lisp gurus "beating the averages" and producing
some killer stuff? 3-4 apps would suffice. For Lisp I can see very few things,
almost statistical noise. Heck, even Erlang has Riak.

One way to see it is: "of course Lisp doesn't have a large body of A-list
programs written in it, since it has less programmers". This is your reading
of the situation.

Another way, though, is:

"there is a reason Lisp doesn't have as much A-list programs written in it,
and it's not adoption. The reason goes deeper and it also explains adoption".

One explanation: Lisp was too high level for the machines of a past era to run
efficiently. That explains why it didn't catch on in the past. It means that
despite being conceptually better, it was a bad language for the problems most
people were trying to solve (squeeze the last trace of CPU and memory juice
from very constrained hardware).

And now? Now other languages have the most essential of the high level
features it used to have, so other factors weigh more in using them over Lisp
(e.g available programmers, libraries, etc). Which means again that despite
being conceptually better, it is a bad language for the things people do now
(front end web stuff needs JS, enterprise needs Java/.NET and Oracle/MS
support, embedded needs C, web apps need Node/RoR/Django/PHP, etc).

Not a single niche where Lisp is the best option.

Consider eg that: Productivity = LanguageProductivity +
EnvironmentProductivity.

And let's take the scientific computing field. Even if Lisp, the language, has
70 productivity points over 50 for Python, the Environment for Python has 80
points (NumPy, SciPy, Sage, etc.) over 20 for Lisp.

So, Lisp = 70 + 20, Python = 50 + 80, hence Python wins.

(The numbers are out of my ass, but you can make a similar thought experiment,
and people who make it come to similar conclusions when they pick their tools.
Even PG, if he had to build something today, would have picked RoR, not CL.)

Lisp guys tend to argue that Lisp has "language productivity" of 100, but I
don't think so. And even if it has, it's not 2-10 times the productivity of
something like Python the language. Maybe 20-30% better.

In the grand scheme of things, macros don't matter that much.

~~~
fauigerzigerk
There certainly are other factors than a language's productivity. I don't
dispute that at all, or I wouldn't be writing so much code in C++.

But I think language popularity has very little to do with productivity or any
other rational factor. Languages mostly piggyback on platforms that emerge
rapidly at some point in history.

C came with Unix. JavaScript and Java came with Web browsers. SQL came with
relational databases. Objective-C became widely used (if not popular) with the
iPhone. PHP came preinstalled with shared web hosting. C# comes with Windows.
VBA comes with Office, etc.

Developers mostly just choose platforms not languages. Whoever makes the
platform decides on the language and it will be "popular" regardless of how
atrocious it may be.

And by the way, PG is building something today and he's building it in a Lisp
dialect (<http://paulgraham.com/arc.html>)

------
ecolak
It's a pity that we are still talking about things like syntax in the context
of a next big language. Programs are still full of bugs, especially concurrent
programs, lots of time and effort is still spent on testing. At this day and
age, the NBL should be a language that helps and guides a programmer to write
correct concurrent programs with good performance. It should prevent
programmers from making mistakes as much as possible. I think the NBL will be
in the same school as Erlang and Scala.

~~~
dsymonds
Syntax is very important for a programming language. It's in your face all the
time. A serious downside to many programming languages is their awful or
inconsistent syntax, and that results in code that is hard to read and
comprehend even before trying to understand what the code is doing.

~~~
ecolak
It sure is important, but who is to say C-style syntax is better than
Lisp-style syntax or vice versa? Obviously the NBL should have a sensible and
consistent syntax, but it has to have a lot more than that to be the NBL.
C-style syntax is definitely NOT a must-have.

------
kombine
"Two years later however, D was still stuck in 2007. This was in part due to
the infighting between the standard library camps who had failed to learn from
the mistakes of the Java class library and were busily adding bloat and
verbosity to Phobos and Mango."

This was never the case for Phobos. D still lacks a lot, but Phobos is a
pretty lean library. It's built around the concepts introduced by Alexander
Stepanov and the STL, and refines them to a great extent. Go, on the contrary,
has no support for generic programming, so I can only imagine how full of
ad-hoc algorithms the typical Go code is. So yes, for me it's definitely not
the NBL but just a niche language. D still has a lot of issues, but as a
language it's better than Go.

------
pshc
The NBL requires a Next Generation Editor. We can't escape from this local
maximum of code expressiveness, readability, amenability to change, richness
of types, etc. without a truly better programming environment.

Working on it...

------
PuerkitoBio
I believe Go is actually BSD-Licensed, not MIT. Though from what I understand
these are very similar (permissive) licenses.

~~~
andrewflnr
They're almost interchangeable. If I weren't reading carefully I probably
wouldn't notice the difference.

<http://opensource.org/licenses/MIT>
<http://opensource.org/licenses/BSD-2-Clause>

~~~
qznc
From <http://producingoss.com/en/license-choosing.html> :

There is perhaps one reason to prefer the revised BSD license to the MIT/X
license, which is that the BSD includes this clause:

Neither the name of the <ORGANIZATION> nor the names of its contributors may
be used to endorse or promote products derived from this software without
specific prior written permission.

It's not clear that without such a clause, a recipient of the software would
have had the right to use the licensor's name anyway, but the clause removes
any possible doubt. For organizations worried about trademark control,
therefore, the revised BSD license may be slightly preferable to MIT/X. In
general, however, a liberal copyright license does not imply that recipients
have any right to use or dilute your trademarks — copyright law and trademark
law are two different beasts.

------
logn
I have to agree with the commenters on that article that JavaScript is the
next big language. With HTML5 it's pretty amazing what you can do with JS.
It's reached the point of being nearly as powerful as any thick client
technology yet with ubiquitous browser and OS support.

It performs fairly well too:
<http://shootout.alioth.debian.org/u32/javascript.php> . I'm not sure why one
test is 100x slower, but the rest are < 10x slower than Java.

~~~
james4k
I don't know, I'd say JavaScript is pretty close to reaching its peak at this
point, if it hasn't already.

~~~
davidw
It's got tons of room to grow outside the browser where it's never been used
all that much.

------
ankurdhama
This comment is for all those people who have written against Lisp in this
thread.

Lisp was designed to represent, design, play with, and experiment with
advanced and complex algorithms. Now, when was the last time you wrote code
that was algorithmic or was about some new cool algorithm? Try to remember;
let me tell you the answer: most probably never in your life (and there is no
hope for the future either). What you people do is stitch APIs together, or
more precisely, the iOS API, the Android API, some web framework API, the DOM
API, API API API... look at your code and see that it is just a bunch of API
calls thrown in a mix. This is absolutely fine, because what you guys build is
"Application" (not "Technology"): grab some data from here and there, store it
in some shiny DB, and when the user wants it, show it to him in a shiny new
UI. That's all you do, and for this purpose go ahead and use whatever language
those APIs are in, no problem at all, but please don't rant about something
that you don't understand.

------
tosh
Dart almost exactly matches what Yegge wrote about a few years back.

------
jongraehl
> I have no hesitation in recommending Go to displace any programming task
> which would previously have been targeted towards Java.

Java is faster and has more libraries.

------
spullara
What I want is somewhere in the middle of Java, Scala and JavaScript. I have
no idea what that looks like exactly though. Maybe Kotlin? Dart?

~~~
joneil
I haven't used much Scala, but Haxe (haxe.org) sits somewhere between
Javascript and Java. I enjoy it, ymmv.

------
zedzedzed
Yeah, Go is in the competition, but Rust is in the competition too.

------
batista
[while checking Yegge's list the NBL should have:]

> _The final tally is 11 affirmative, 7 negative_

Yes, but among the negative are items of far more importance than among the
affirmative.

~~~
eckyptang
I disagree. The negative items are things that people are _used to_ but not
necessarily the best solution to the problem:

Perl regex: it covers most of the Perl RE features that are important. The RE
engine is the only real weak point I see in Go at the moment, but that will
change in time, and it's "good enough" now.

Strings and streams as collections: a string is a collection of runes - you
can get the rune at an index fine. The latter is just wrong - a stream is not
byte-addressable by nature.

Iterators: is "for" not good enough for you? You can build an iterator
interface over any type if you desire, but it's really not needed. If it makes
you comfortable...
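
Building the kind of iterator interface described above is straightforward; this is one possible shape, not a standard library API:

```go
package main

import "fmt"

// Iterator is one possible shape for such an interface;
// it is not part of the standard library.
type Iterator interface {
	Next() (int, bool) // next value, and whether one was available
}

// sliceIter walks an []int from front to back.
type sliceIter struct {
	data []int
	pos  int
}

func (it *sliceIter) Next() (int, bool) {
	if it.pos >= len(it.data) {
		return 0, false
	}
	v := it.data[it.pos]
	it.pos++
	return v, true
}

func main() {
	var it Iterator = &sliceIter{data: []int{1, 2, 3}}
	sum := 0
	for v, ok := it.Next(); ok; v, ok = it.Next() {
		sum += v
	}
	fmt.Println(sum) // 6
}
```

Without generics (as Go stood at the time of this thread), each element type needs its own interface or an interface{}-based one, which is part of why people kept asking for them.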

Generics: These are really not needed. This is a common misconception. I've
come from a heavy OO C# background (DDD, 1000 class models etc) into Go and
I've not missed them for a second. In fact I think I've probably been freed
from a million bad design decisions and many refactor sessions.

Standard OOP: I'm not even going down that route. The hacks you have to do to
get proper composite models or some level of dynamicity using "standard OO"
are horrible. I wish it would go away. CLOS is the nearest thing to something
usable - not what C++, Java, C#, and PHP force upon everyone.

And there is a cross-platform GUI for Go - it's called a web browser.

The only nicety I'd like to see is dynamic loading, as it'd allow composite,
modular applications to be built without recompilation, but that has its own
pitfalls.

~~~
tsahyt
> An there is a cross platform GUI for Go - it's called a web browser.

A web browser is _NOT_ a replacement for a proper GUI library. Writing web
GUIs is rather hard in comparison, because that was not the web's original
purpose. They're rather slow in comparison for the same reason. The list goes
on like this. Everything we have done to HTML/CSS/JS over the last decade in
order to "support" those features better has been workarounds, things to cover
up the fact that we're still dealing with a markup language here, one which is
supposed to add semantic meaning to a text or some other piece of information
in order to _display_ it.

GUI libraries, on the other hand, were designed to do GUIs. They're designed
not just to display information but to handle user input and manage the
widgets and components of user interfaces. Most importantly, though, they're
designed to be consistent. Consistency is one of the most important aspects of
user interface design and ultimately of user experience.

For example, look at Apple: the whole OS X experience and all the apps it comes
with are consistent in look and feel. They've got the same polish on top and
the same usability features that come with it. _That_ is what takes most of
the burden from users. You learn how to use it _once_ and then you're able to
apply that knowledge across many other applications. That is how things start
to feel "intuitive". That is what all the UX designers are ultimately aiming
for.

HTML/CSS/JS enable us to build wildly different GUIs that do their jobs well
but are, for the most part, not consistent. They have enabled designers to go
on a rampage, each of them trying to improve on certain aspects. That's good,
that's progress, but it also comes at a great cost, because things stop
feeling intuitive and the paradigms of interaction you once learned are
rendered invalid. Every site these days is trying to be different to stand out
from the crowd, and eventually it's all a bloody mess concerning overall UX.

The only real use case for web GUIs is GUIs on the web, not native apps.
Introducing the same mess we've got on the web nowadays to native desktop apps
is not the right way to do it. A proper GUI library is the way to go, both for
the developer and for the user.

~~~
eckyptang
Perhaps I simplified a bit too much. I agree with you entirely.

The browser provides an API and a canvas on which you can build a user
interface, much as X does for Linux, GDI/WPF for Windows, and Quartz for OS X.
It's a medium, not the solution.

I also agree about a lot of sites on the internet being a bloody UX mess.

