Why Dart is not the language of the future. (perl.org)
260 points by vital101 on Oct 11, 2011 | 139 comments

There are a lot of unwarranted conclusions drawn here. I think this article is FUD (and I don't really care for Dart).

> Dart programs may be statically checked. The static checker will report some violations of the type rules, but such violations do not abort compilation or preclude execution.

> In other words the "static checker" is a lint-type development aid, not a language feature.

My reading is that the language is optionally typed (similar to SBCL). The type checker will warn you when you have a type declaration that appears to be incorrect, and will optimize based on these declarations, but will not keep you from running code that violates a type check. This is a remarkably good solution for dynamic languages.
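For what it's worth, Python's gradual typing works the same way and makes a decent analogy (this is Python, not Dart): a checker like mypy flags the violation, but nothing stops the program from running.

```python
# Hypothetical example: the annotation is checked by an external tool
# such as mypy, but violating it does not abort execution.

def double(x: int) -> int:
    return x * 2

# A static checker would flag this call, yet it still runs:
# for a str argument, * 2 is sequence repetition.
print(double("ab"))  # prints "abab"
```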

> Here's a strange thing: the one and only true value is the boolean literal boolean true. Everything else is false. That means that code in if (1) { ... } will never execute, because 1 is a number, not a boolean, and there is no implicit conversion to boolean. You'll need to write if (1==1) instead.

This makes as much sense as 'everything' being true. You would write if (true) {...}, but I can't imagine why you would do that.

> There is no implicit type conversion between numeric, string or boolean types.

Ok. This is a good thing. Auto conversion is bad.

> The distinction between string and numbers allows to re-use the addition operator + both for addition and concatenation. However, without strong typing, this will almost certainly prove to be a bad idea. From the specs, it looks like "2" + 2 will be a concatenation, and 2 + "2" a run-time exception (in the absence of implicit conversion from string to number), but experience infirms this: string concatenation happens in both cases (although with a warning in the second one).

This seems to directly contradict the first point. Which is it? It should do a dynamic type check and generate a run time exception in both cases. I don't see anything to convince me that "2" + 2 will be a concatenation... in the spec.
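For comparison, this is a sketch (in Python, which is dynamically but strongly typed) of the behavior the comment above expects: a runtime TypeError in both directions, with no implicit conversion.

```python
# Neither direction is implicitly converted; both raise TypeError.
for expr in ('"2" + 2', '2 + "2"'):
    try:
        eval(expr)
    except TypeError as e:
        print(expr, "->", e)
```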

>Thus, isolates are a heavyweight thread control model very much like Perl 5's ithreads. That means that they are good for data isolation, but heavy to use and hungry in memory, because spawning a new isolate will imply cloning all the objects and data structures of the running libraries.

There is absolutely nothing that implies this. There is a tiny section in the spec about isolates. They could easily be lightweight like Erlang processes.

> This makes as much sense as 'everything' being true. You would write if (true) {...}, but I can't imagine why you would do that.

I think his real point is that there's no 'truthy', no evaluation to true. Later in the article he says that

  if (a) a.foo(); 
isn't possible because of that. A better example would've been (a != null) instead of (1 == 1), imho.

> This seems to directly contradict the first point. Which is it? It should do a dynamic type check and generate a run time exception in both cases. I don't see anything to convince me that "2" + 2 will be a concatenation... in the spec.

Not in the spec itself, right. But the current documentation seems to imply exactly what he states. See the definition of '+' here: http://www.dartlang.org/docs/api/String.html#String::String

"String +(Object other) Converts other to a string and creates a new string by concatenating this string with the converted other. "

> A better example would've been (a != null)

This is a good practice, even better would be checking the type or class.

> "String +(Object other) Converts other to a string and creates a new string by concatenating this string with the converted other. "

This is a bad idea, now I have a gripe with dart.

Considering this is a new language, why allow nullable references in the first place? What you call good practice I actually consider a curse.

I agree that it would be better to not compare a null reference at all. A better option would be to return multiple values or have a tuple type and pattern matching.

Like I said, I'm not particularly a fan of Dart.

Don't (typed) union types work better for errors (with pattern matching)? In Haskell, I think, a nullable or possibly-error type is frequently returned from functions which can fail,

i.e.

    data Maybe a    = Nothing | Just a
    data Either a b = Left a  | Right b

instead of (err, result) like in Go.

It was hinted in one of the other Dart threads here (or maybe on reddit) that if enough people got on the mailing list and made a good, reasoned case for it that null could still go away.

> > "String +(Object other) Converts other to a string and creates a new string by concatenating this string with the converted other. "

> This is a bad idea, now I have a gripe with dart.

I don't think this is right, and seems to be bad wording on part of the documentation. I don't have any access to an interpreter right now, besides the Google AppSpot in-browser console, but adding a String Object and a Date Object to a new variable does not alter either object or do a forced alter on the existing objects, but rather creates a new string object.

Though, I can't tell if either the date object, or the first string object in this example are altered, it appears that they aren't changed during concatenation: http://try-dart-lang.appspot.com/s/KSAX

That's what the documentation says, imo.

The gripe comes from overloading the + operator, as far as I can tell. That's at least the point of the author of the blog post: having + for both addition and concatenation, with poor type support.

The blog claims that "1" + 1 = "11" (exactly what the docs state here) and claims the same for 1 + "1".

Dart will not actually optimize based on type annotations; it will optimize based on watching what the actual internal types of objects are, and generating specialized code for the common cases. From an interview with Lars Bak:


This sounds exactly like Strongtalk's optional type system.

> This makes as much sense as 'everything' being true. You would write if (true) {...}, but I can't imagine why you would do that.

As I understand it, it's not that you might write if(1){} literally. It's that you might write a method to return <something> on success or null on failure, and ordinarily that would be reusable in a conditional. In Dart, it's not - unless you're careful. This is not the sort of thing we have grown up thinking that we need to be careful about.

Combined with the author's example of a==a potentially evaluating to either false or true depending on the class in a really unexpected way, I'm just seeing potential for confusion and I can't see any immediate reason why they would have thought that this would be a good idea. Yes, the spec mentions JS's problem with an autoboxed false, but they've just moved that problem around, they haven't solved it.

> It's that you might write a method to return <something> on success or null on failure, and ordinarily that would be reusable in a conditional.

Doing this is like writing a pun into your code. Check the type of it, or check if it is equal to null (or if it is not equal to null).

You can even go farther than that -- return true on success, and anything else on failure, and you get the semantics this person wants. If the results should be saved, that should generally be done outside the if condition anyway.

Returning true on success is less useful because in general it's the details of a successful result that you want to operate on. It's less common (but not by any means rare, I admit) for the "happy path" through code to be concerned with details of the error.

> If the results should be saved, that should generally be done outside the if condition anyway.

That's as may be. This is one of those areas where the language is guiding you towards a specific code pattern rather than explicitly preventing something, and I just don't think in this case the guidance is helpful or warranted. I think it's a side-effect of a workaround for a bug in the compilation target, and that's not a good thing to expose at the language spec level.

An example of when a==a is false is when a is NaN.

Yeah, but that's literally the only time it makes sense - when you've got a first-class "undefined" or "unknown" type - and if you're going down the path of letting user classes participate in that protocol, I suspect that allowing them to override == in a slightly funky way isn't enough.

  > You would write if (true) {...}, but I can't imagine why you would do that.
In haste to make your point, you overlook that Boolean expressions are used in loops too. Combined with a "break;", I'm sure many here have used a "while (true)".

He was just responding directly to the article's points. The author lamented that now he would "need to write If (1==1) instead"

We all knew, or should have known, the point the article author was making. Who gives a shit if he wrote if(true) or while(true) in the piece; the same problem applies regardless of whether "if(true)" is something you want to see in code or not.

>> There is no implicit type conversion between numeric, string or boolean types.

>Ok. This is a good thing. Auto conversion is bad.

One of the things I really dislike about developing in Ruby is that I can't do

value = 5 "The value is " + 5

I get bitten by this at least 10 times a day while developing, usually either when dumping to the log, or writing into html. I can fix the problem really quickly now, but every time I get an error:

"r" + 5 "TypeError: can't convert Fixnum into String"

I think, yes, yes you can easily convert a Fixnum into a String.

"r" + 5.to_s

It just means I have to sprinkle .to_s everywhere in my code. Ridiculous!
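Python has the same restriction, for what it's worth; a small sketch of the options (hypothetical `value`):

```python
value = 5

# "The value is " + value  # would raise TypeError, like Ruby's Fixnum case

# Explicit conversion works but is noisy:
print("The value is " + str(value))

# Interpolation is the idiomatic fix, mirroring Ruby's "#{value}":
print(f"The value is {value}")
```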

please can you explain what

    value = 5 "The value is " + 5
is supposed to do? at the moment i can't work out why it would be frustrating to be unable to type something that looks like a syntax error... (i'm assuming it has some meaning in a language i don't know, which i guess would mean php).

edit: what does "say" mean in the reply below? print? evaluate? and does "later" mean in another statement? or replacing what was before?

The problem is that his two lines of code got joined by HN's formatter (which needs two newlines for a real line break). Presumably his desired message was:

value = 5

"The value is " + 5

He means that he can't say “value = 5” and then later say “"the value is " + 5”.

I see that as a feature. It looks like you're saying "I want to add numbers and letters together and Ruby complains."

If you really want that value in your string, why not use "The value is #{value}"

Maybe they're afraid the translation team will be overly grateful?

Appending doesn't work for translation: not all languages will have the object at the end of the sentence. Better to stick with format strings.
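A Python sketch of why named format placeholders help here (the German rendering is purely illustrative): the translator can move the placeholder, which plain concatenation can't express.

```python
english = "{user} uploaded {count} files"
# A translation may need the arguments in a different order or position:
german = "{count} Dateien wurden von {user} hochgeladen"  # illustrative only

args = {"user": "alice", "count": 3}
print(english.format(**args))  # alice uploaded 3 files
print(german.format(**args))   # 3 Dateien wurden von alice hochgeladen
```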

That's what I meant...

Just use inline implicit conversion: "r #{5+3/2}"

It's a lot more typing. :)

If one really wanted that kind of behavior (and not getting errors when trying to add strings and number) you could make it so:

    class String
      alias_method :orig_plus, :+
      def +(o)
        warn "You are adding a #{o.class} to a string." unless o.is_a? String
        orig_plus(o.to_s)
      end
    end
for example.

Whether this leads to other issues is left as an exercise for the reader and whomever ends up maintaining the reader's code.

or in actual use something like

    "I have #{me.num_of_children}\n"

much better than auto conversion. In Python,

    "I have " + str(me.num_of_sons) + " and " + str(me.num_of_daughters) + "\n"

does get annoying.

He is right about isolates: see https://groups.google.com/a/dartlang.org/group/misc/browse_t... However it is wrong to compare them to Perl threads, because a new Dart isolate has a freshly initialized static environment, and is not a clone of the isolate that spawned it. Dart is designed to make static initialisation fast, so spawning should not be too slow.

I think one of the reasons JS succeeded is precisely a major complaint of experienced developers: globals [1]. But this, (perhaps as with PHP) is what enabled novice developers —some of whom eventually became experienced developers—to learn the language easily.

We all started with some bad practices, and as we matured our code did too. But the loose, forgiving standards are precisely what enabled broad use in the first place.

When a language imposes more constraints on the programmer (it may be for their own good!), it limits adoption and growth.

1: From the summary of Javascript: The Good Parts, "... Douglas Crockford identifies the abundance of good ideas that make JavaScript an outstanding object-oriented programming language-ideas such as functions, loose typing, dynamic objects, and an expressive object literal notation. Unfortunately, these good ideas are mixed in with bad and downright awful ideas, like a programming model based on global variables."

I think there are very few who would say that globals shouldn't be allowed altogether. The problem is that JS relies a bit too much on globals all being crammed into one namespace.

I'm an advocate of the python model (although I should maybe call it the Modula-3 model): only allow module-level globals. You get globals if you need them, but no worries about code mangling your variables.

This is what I like about CoffeeScript too. It wraps the module in a closure, so globals are opt-in. By default, the variable will be scoped to the generated module, but you can simply attach it to the global namespace (window.foo=x in the case of a browser-side CS).

Actually, Gilad Bracha, one of the designers of Dart does say exactly that. He's also working on another language that has no static (ie, global) state. See http://newspeaklanguage.org/

I think you're on to something there. Doesn't processing also encourage the use of globals as a way to make it more accessible to artists who are just learning how to program?

Not quite. Ruby also has easy globals, meaning that you can start out learning it by writing tiny scripts and not worrying about creating your own classes, modules, multiple files, etc.

The difference is that JavaScript has a design flaw in that if you forget to declare a variable before you use it, it becomes globally scoped. This causes local variables inside functions, for example, to become accidentally global leading to bugs.

The worst of both worlds: Dart fails to provide the advantages of static languages, without compensating with the flexibility of dynamic languages.

This was my impression exactly when I first read about Dart. This is the language that Google wants to replace Javascript? Please. Javascript may have some strange design quirks (or outright flaws, depending on who you talk to), but at least it has the sense to have a meaningful boolean context. I understand that you can't try to revolutionize everything if you're targeting a wide audience, but this just looks like decaffeinated Java.

Meaningful boolean context? I understand that the original author is a Perl programmer which probably means he has wildly different views of the world, but seriously?

"Meaningful boolean context"s have given us things like the abundant "if (!!foo)" in JavaScript. And then he goes on to criticize absence of auto-conversions, which are the other plague of dynamic programming languages (see "===" & friends).

I don't see how anyone could actually want that kind of litter in his programming language.

This article conflates "dynamic typing" with "weak typing" all over the place. First he whines that type declarations aren't mandatory (at least within the scope of what a type inferencer can resolve), claims the runtime won't be able to determine type information (which V8 can already do)...

...then he goes on to complain that there's no automatic type coercion! This is probably one of the worst misfeatures of JavaScript (and Perl and PHP), but after a short tirade about "weak typing", he's longing for automatic type coercion? Sorry dude, "3 dog night" + 2 should not equal 5

Very interesting article. After playing with the examples and looking at the docs for a bit, it really felt "old school" - but I assumed that that was just a facade to lull any corporate Java types into a false sense of familiarity before busting out the modern goodness: much like JavaScript's own "Surprise! I was really SmallTalk all along!" magic trick.

> "Surprise! I was really SmallTalk all along!"

Self, actually. Dart is closer to Smalltalk in Java's clothing.

Dart provides and enforces a bunch of idioms that you don't need in JavaScript in the first place! Adding classes to a language that... doesn't need classes.

If anybody but Google released Dart HN wouldn't give two shits.

The comments under the "Feeble Typing" section make it clear the author has never seen or read Gilad talk about optional typing. For one example, read http://bracha.org/pluggable-types.pdf

Yes, please. Tell me how this language, before it even launches, is finished, or has been tested in the real world, is already doomed to fail.

Oh and it's posted on a Perl blog. Completely objective I'm sure.

Is it just me, or does every Dart criticism follow the pattern:

1) I have a hobby horse

2) Dart gets released

3) Flog hobby horse

Go had a bunch of similar criticisms, mostly for not being "innovative" enough and favoring a small, solid feature set.

I don't know why you speak in the past tense, but I consider Go still not "innovative" enough and also broken by design. We'll talk again in about two years, when they finally add generics to the language, in a broken way nonetheless, as generics have to be baked in right from the start to pull them off. And it badly needs generics.

But you see, Go got all this attention just because it is a language released by Google. And in the meantime version 2.0 of a real systems programming language that is innovative and kick-ass is largely going unnoticed.

> And in the meantime version 2.0 of a real systems programming language that is innovative and kick-ass is largely going unnoticed.

Which one?

I'd love an answer to that - I just got given the task to evaluate Go as a possible development tool for a future project.

Edit: I'll take a stab in the dark and say D?

FWIW I actually agree about the generics, both that Go needs them and that bolting them on afterwards typically doesn't work as well as having them from the start. But in general I wouldn't call it "broken" and view being "not innovative enough" as a feature for a systems language.

One of the reasons Go still doesn't have generics is reluctance to 'bolt them on', and that the current language works surprisingly well without them.

Of course that won't stop people who have not written any Go code from claiming Go is worthless without them.

That's a cheap shot. Here's the opinion of Andrei Alexandrescu:


Yes, precisely the opinion of somebody that has not written anything in Go.

Not to mention somebody quite biased given that Go has been way more successful in being used to build actual systems than Alexandrescu's D2.

> And it badly needs generics.

Everyone I know who has actually used Go strongly disagrees with this claim, they are very happy with Go's interface system and most of them don't notice the lack of generics at all.

Go is not like other languages, and 'features' are not interchangeable across languages.

> But you see, Go got all this attention just because it is a language released by Google.

Yea, that it was created by Ken Thompson and Rob Pike had absolutely nothing to do with it. /a

How is having classes, interfaces, generics, optional types, operator overloading and function types a small, solid feature set? Scheme and Self are languages with small feature sets.

I was talking about go, which doesn't have generics, optional typing or operator overloading, doesn't really have "classes" (arguable), and a fairly tight type system which covers structs, interfaces and functions in a minimal way.

I don't really have an informed opinion on dart yet, I'm just noting that 90% of the complaints I'm seeing consist of people with a hobby horse that predated the Dart announcement.

> I don't really have an informed opinion on dart yet, I'm just noting that 90% of the complaints I'm seeing consist of people with a hobby horse that predated the Dart announcement.

It's a good thing then that the author of this article falls in your 10% group: someone who actually knows a thing or two about language implementations, and not a hobby horse rider.

He may know a thing or two about language implementations, but his complaints are predictable: he wants it to look more like perl and less like Java. Additionally, his opinions on the implications of their concurrency model are entirely based on perl's implementation and IMO not necessarily correct in another context (message passing was probably chosen specifically to avoid needing heavyweight threads). I'm not a web programmer but those complaints seem superficial to me, I'd probably spend a little time using it before coming to that conclusion.

Those points just mean Dart isn't revolutionary, which is pretty much a requirement for a language to replace Javascript.

Syntactically, Dart looks a lot like ActionScript 3, which is basically Javascript with classes. ECMA decided not to adopt Adobe's AS extensions into ECMAScript 4 because they felt it was trying to turn Javascript into Java. They wanted to maintain the "feeling" of coding Javascript. (Basically the paradigm pioneered by jQuery)

While it's true to say "ECMA decided not to adopt", it omits the important information that most of the opposition came from Yahoo (AIUI Douglas Crockford was instrumental in this) and Microsoft, who might have their own motivations other than "preserving the feeling of coding javascript." And Google too, which is kind of a WTF given Dart.

I often wonder what the disadvantage is of moving to an existing, established, understood and already tooled language (such as the ActionScript family) if one is seeking to migrate away from a legacy language such as JavaScript.

Now I am using ActionScript to make a point, since I have not yet programmed in it to be able to address how much Dart does or does not mimic the AS syntax.

Is ActionScript "type-optional" as with Dart?

And I am so glad they did. I remember feeling unsettled by JS's lack of formal classes... and then I was enlightened.

I'm in the same boat. Back when I was a heavy AS3 developer, JS looked dumb and gimped. But once I drank the JS Kool-Aid, I saw the value in how JS does things. Not everything has to look and feel like Java. Now whenever I see frameworks that force classes onto JS, I have to roll my eyes.

No, it means that dart includes almost as much WTFery as JS does. Things like not being able to say if (a), because unless a is the boolean true, it's considered false. So we need to say if (a==a), except that there's no guarantee == will return a boolean, so it's possible (a==a) may evaluate as false!

If dart wants to replace javascript it has to be better, and by a long shot. at the moment I'm just not seeing that it is

Having a == a be able to generate false is quite common, it's for instance the case in C, C++, Python and Ruby (off the top of my head), when a is the floating-point entity NaN. Surely not all of these languages are ill-defined?
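To make that concrete (Python shown; 0.0/0.0 itself raises ZeroDivisionError there, so the NaN is built directly):

```python
import math

nan = float("nan")
print(nan == nan)       # False, as IEEE 754 requires
print(math.isnan(nan))  # True: the sanctioned way to detect NaN
```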

The issue isn't about the == operator returning false. It's about the == operator's return value not being typed.

Evaluating "a == a" could return the string "potatoe" which would evaluate to false (and make no sense).
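A Python sketch of an == that returns a non-boolean (the class and its return value are made up). Note the contrast: Python coerces the result via truthiness, so a non-empty string counts as true, whereas under Dart's rule anything that isn't the boolean true counts as false.

```python
class Weird:
    def __eq__(self, other):
        return "potatoe"  # == returns a string, not a bool

a = Weird()
result = (a == a)
print(result)        # prints "potatoe"
print(bool(result))  # True in Python (non-empty string is truthy);
                     # under Dart's rule this condition would be false
```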

I don't see the big deal with this. If your == operator is buggy, it's buggy, no matter whether you can only return a boolean or some random other type.

(I personally found most of the criticism in the article to be in the same vein: inflammatory, superficial comments after a quick reading of the spec.)

Generally any blog written a day after the spec released is going to be pretty superficial. It isn't like they have time to be able to construct any real criticisms. Personally, if not being able to say if(a) is the worst criticism of the v.1 spec you can find, then it must be a pretty decent spec.

Actually, NaN != NaN is not a language design decision; it is required for compliance with the IEEE 754 standard.

That's still language design... Personally, I would make NaN == NaN in a language I would design, and provide a Float.equals function that would follow the IEEE standard.

You wouldn't have to write if(a==a) in the first place in those languages, however. if(a) would suffice in C and C++ for an integer type or a pointer, and possibly for an object in C++. An integer value would also work in Python. I can't say anything about Ruby.

The fact that a==a can return false isn't really the main criticism here. You could override the == operator in C++ to do that too. The article is claiming that having a literal boolean true be the only "true" value is an issue. I think I agree with this just because this code evaluating to false is out of line with every other language out there, and I haven't seen a case (yet?) that it's an improvement for any other practical purposes:

  a = 1;
  if(a) {
    print("A is true!");
  }

Actually what you say makes perfect sense.

Say Google unveiled a Scala, Haskell or even Go like language.

Sure, the hip crowd would be pleased. But would it ever gain traction with the web dev masses?

Hell, those languages have not that much traction even outside web development.

Now, if Google had unveiled something like Ruby or Python, that would have also pleased the HN crowd (maybe a little less than some extravagant functional language with crazy type tricks). And it could possibly gain traction too. But that too would hardly be revolutionary.

Scala is gaining a lot of acceptance even without the backing of a major sponsor like Google. I daresay that if Google were behind Scala, we wouldn't be having this conversation.

One also has to remember that Google itself intends to use this language a lot, so it's much more likely that they approximate something they already use. And they're doing a lot of Java, where Dart doesn't seem too exotic. In some other thread, Dart was compared to GWT, and I think that's getting pretty close. For teams that would've used Java/GWT before, doing it in Dart instead isn't a huge jump. Probably not even as huge as Scala, where at least you'd be able to remain within the secure embrace of the JVM.

(Not a huge jump for Android programmers, either. In light of the Oracle lawsuit avalanche, this might be interesting…)

What is it about Go that would prevent traction where Dart will not. Not having classes?

Other things too. The use of pointers for one (although limited).

Like 80% of these things are true for Scheme/Lisp/Dylan. Canonical true value? Lack of implicit type conversions? These are good design decisions, not warts.

The question is not what they share; the question is what dart lacks. The two most important things are (a) dart has statements and expressions, and (b) no macros.

Then dart does not have a nice concept of true/false which most Lisps have.

In Scheme/Lisp/Dylan/Racket/Clojure/Your favorite lisp dialect everything except #f is true -- Dart is the other way around.

Right, it's the opposite convention from Lisp. I'm trying to distinguish from say Perl, which doesn't have a canonical true or false value but identifies a set of things that "seem like they should be false" like 0 and "".

This is not a bad article despite the stupid title.

Take-up is all that matters in the future; the quality of a programming language has rarely had any influence on take-up.

We matter in the future. If we want to use a sane language in our future, we should probably help promising ones gain take-up and prevent lamer ones from doing so; the article is doing the latter.

So what does influence take-up? And does Dart have it?

I dunno, something like:

  * The backing of the largest web company in the world.

  * Inclusion in the second-most popular browser in the world.
Yep. That ought to do it.

VBScript had both (actually the browser was the most popular), and look how far it went.

One can also argue that VBScript is not tooling-friendly. I am talking a little out of school, since I have never had to use VB.net; it could be wholly different. But I have had my fair share of VBScript interactions and those were not fun (Microsoft-isms aside).

True, but times were different. Microsoft was being evil in the browser wars, being a monopolist and pissing off the hacker community: just the people you need for a new tech like this to take off.

VBScript was one of the most popular server-side scripting languages in the late 90s. If they had a JavaScript translator, MS developers would have used it.

It only worked in IE and on Windows.

> Which means that the expression (a==a) might be, in some pathological cases, false.

Most languages (including Perl) have at least one case where a==a is not true: (0.0/0.0) == (0.0/0.0) is false, as is any NaN == NaN expression.

That's less a language thing than it is an IEEE 754 float thing, as I'm sure you're aware. It's justified there because it's a genuine propagation of an "undefined" result. I'm not sure that is the sort of protocol user code should be able to opt into by accident.

  > if (1) { ... } will never execute, because 1 is a
  > number, not a boolean,
This hasn't stopped Ruby from being popular.

In Ruby it's the other way around: 1 is truthy and the block will execute. Everything in Ruby is truthy other than the null reference and Boolean false.
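For contrast, Python sits between the two conventions: it has truthiness like Ruby, but 0, "" and empty containers are falsey.

```python
# Survey of which values Python treats as true in a boolean context:
for v in (1, 0, "x", "", [], None, True, False):
    print(repr(v), "->", bool(v))
```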

I can understand forbidding non-Boolean values in Boolean contexts - it's not something I'd want myself but I think I get why a reasonable person might. Dart's semantics, though, are really strange: they follow no precedent I'm aware of, and don't seem to offer anything useful in themselves. Does anyone else have some insight here?

Dart's boolean context semantics are like taking Smalltalk's strongly typed ones and putting them through something... I don't know what they were thinking. The basic idea is sound (the Smalltalk part), but what they then did around the 'edges' is just downright off.

I guess you mean the opposite hasn't prevented Ruby from being popular. All numeric values are truthy, including 0.

A lot of Ruby idiomatic methods will return either a value or nil, which is falsey, so things like:

  x = meep
  if x
    # do stuff
  end
will run as most would expect.

  > I guess you mean the opposite hasn't
  > prevented Ruby from being popular.
I was stating something more like "the fact that there is a hard split between numbers and booleans, and everything -- barring null -- by default evaluates to true." Since everything is an object instance in Ruby, asserting a variable is basically only evaluating whether or not it is null.

null or false, actually.

Slip of the tongue. ;) But the easiest way to think about it is something like:

  type(x) == Boolean ? x == True : x != nil
Disclaimer: That may not be valid Ruby, as I only have a passing knowledge of some bits of Ruby.

If we're going to be posting judgmentally titled articles, then why don't we post "Why Perl is the language of the past"? I suspect the main reason languages like Perl and FORTRAN are still used is legacy and tradition. But I acknowledge that is my opinion, not fact, so I wouldn't post such a title. Seriously, we need to stop the pomposity. What about "My concerns about Dart 0.1", or "Early criticism of Dart 0.1"?

I will not use dart because it comes from google. Won't let them own everything about the web.

I will still not use dart!

May I point to the irony on posting why a language is not "the language of the future" on perl.org?

[UPDATE: yes, please, down vote freely, because humor is dangerous].

I personally don't find it ironic. Perl may not be the most popular language ever, but it definitely has its place. I do find it ironic that the 0.1 specification of Perl is worse than the 0.1 specification of Dart.

I do find it ironic that the 0.1 specification of Perl is worse than the 0.1 specification of Dart.

Why is it surprising that a language designed to occupy a niche between shell and C in 1987 had a spec faster and looser than a language designed from (let's say) 2009 through 2011, based on an existing, popular, and well-understood language with well-publicized flaws?

But, keep in mind that Perl 0.1 wasn't intended to replace a Javascript-like language. It was created as a replacement for Unix shell, and tools like sed, awk etc.

> Perl may not be the most popular language ever, but it definitely has its place.

Sure, but is it? Does it have any chance at all of being "the language of the future"? Or, is it a very old, mature language that has reached a plateau or is in slow decline?

Despite all your downvotes, I thought the same when reading (perl.org) behind the headline ;)

I also find it ironic that this link to perl.org took 15 seconds to respond.


It doesn't befit the HN community to be ridiculing the Perl community.

First, the Perl hackers were the original web start-ups. Many of the problems you don't have to worry about were either solved by them or were solved by people who wanted to fix things wrong with the Perl toolchain. This makes Perl a huge success in historical terms and, even if it is on the wane today, it is still used by many huge organisations.

The Perl 6 project has been a worthwhile and fascinating project to follow for the past 10 years. While many see it purely as a failure there is a lot to praise and learn from in the experiment. Here's what I have learned from watching it for most of the early years:

* Designing a new language by committee is very hard

* Keeping a community focused on creating a Minimal Viable Product can be very hard

* One bone-headed programmer can wreck a project

* Very clever people can make very stupid decisions, especially when operating in a group

* It is better to announce nothing than to announce and then not ship soon afterwards. The internet is like a child with ADHD

Having said all of that Perl 6 is working now and has some fascinating, innovative features if (to many people's taste) wrapped in some ugly and frenetic syntax.

Go read the history of Perl and Perl 6 before you just join the chorus of boos.

There is a lot of great wisdom trapped in Dan Sugalski's blog. I've tried to recommend his articles before but they've never gained any traction.


Of particular relevance here is his Parrot Retrospective:


Especially the last two items. One being the list in the wrap-up: good advice. The other being the one about Leo—the man who single-handedly derailed Perl 6. One guy is pretty much the reason we all laugh at Perl now. I watched it happen: he refused to build an MVP, and he kept throwing away working code to re-architect it in bigger and more convoluted ways. There's a big lesson there for every start-up.

If you do take a look be sure to read Dan's well-written blogs about language design and implementation, especially those tagged "What The Heck Is:"


I've always held a lot of respect and sadness for Dan Sugalski. The guy was not only a brilliant architect and hacker but he had the knack for always being right. It's sad that he didn't stand his ground and was sadly pushed out of the project. I get the impression he had some tough times after that.

If any of you guys see his CV come up, you shouldn't think twice about giving him a senior role.

I remember reading Dan's blog religiously when I was a college student interested in language design and compiler implementation. That's where I learned about CPS and register VMs.

Anyway, I've seen him recently pop up in a professional context. I can't find anything on the Internet about where he's working now, so on the off chance he's deliberately protecting his privacy, I won't say where. But knowing what he's working on, I think he's doing fine.


Cool, thanks for those links. I'm going to give them a read.

After a quick read of a few articles....

Hmmm. This is good stuff. I want to sit down with this and read it on my Kindle. Maybe I should ping him and see if he wants to turn this into a Leanpub book...

I think that's a great point about working towards a minimal viable product. I was shocked to read recently about all the wasted time spent doing things like implementing versions of Python and Javascript on Parrot instead of working to make its whole reason for existence, Perl 6, happen.


I've just read that article and I feel there's a subtle re-writing of history going on there. I stopped following Parrot shortly after Dan Sugalski left (somewhere around 2005?).

I wasn't there before the original Parrot announcement but I did start lurking on the mailing list around 2001-2002.

I recall Parrot always being touted as an open VM for all dynamic languages (this is before dynamic languages started to appear on the Java VM). You have to put it in the context of the rumoured .Net VM that Microsoft was about to bring out (and then did, shortly after the Parrot project started to gain traction); those of us interested in open software were keen to have our own.

Sadly Parrot stumbled and the rest is history. Mono wrote the obituary.

They had to.

• The Perl 6 spec was still being argued about (plus I recall Larry had been seriously unwell).

• The intention was to create an open VM for all dynamic languages.

• Perl 5, Python, Ruby and Javascript were always targets for Parrot.

My frustration was that, instead of putting everything into creating usable Perl 5, Python and Ruby implementations (Javascript was pretty much browser-only at the time), the Parrot team faffed about with toy language after toy language.

I may have imagined this as I can't find reference to it online.

oops! my asterisked footnote after unwell (and linked to the last line) has been interpreted as italics.

IS Perl 6 "working now"? Last I heard it was still not really running acceptably fast, and major pieces of the language are still under construction.

I honestly stopped following it a while back and turned my attentions to watching Go instead :)

I think it is working now (for some definition of 'working').

There are two Perl 6 compilers under active development. Rakudo is rather feature complete, but not very fast. Niecza is much faster, but still catching up on the features.

While I don't use them in production yet, they are very much fun to use.

It doesn't befit the HN community to be ridiculing the Perl community.


As soon as you use the phrase "the language of the future", the mammalian-mating-ritual-inspired-games have begun!

Seriously. Different languages can and should talk about their appropriateness to different contexts. But "the future" isn't a particular context. It's more of a "I deserve more mindshare now!" statement. And while your special language might indeed deserve just that, you'll have to survive the charge of other long-antlered types to prove it. Good luck in your quest, but ridicule is part of the charge you'll have to survive after you enter this arena - stop frickin' whining, just stop...

I think it's completely valid for a bunch of super-smart language geeks who have spent 10 years learning from their own mistakes to make comment on a new language, especially one that due to the backing of the biggest web company in the world will gain traction whether or not it is any good.

I wouldn't worry too much about traction. See Google Wave.

Ad hominem, anyone? While "Perl" and language criticism in one headline might be worth a "teehee" moment, I fail to see any arguments in the article that seem to stem from the pedigree of the author. The only time he mentions Perl is when he's talking about threading, where there's some kind of feature parity – and he's not too happy about it.

Never mind that Perl hackers do have a certain penchant for language hacking (understandably…), starting with a plethora of object systems (Moose getting quite popular) and ending in extreme absurdity[1].

[1]: http://www.csse.monash.edu.au/~damian/papers/HTML/Perligata....

Hey, Perl hackers may not be business types working on serious software, but some of them are great hackers nonetheless ... it's also a good thing you haven't seen Damian Conway in a presentation talking about programming in a Klingon dialect. Your head might have exploded from the sheer absurdity of it - but me, I was standing there thinking about a time when programming was fun for me, and I loved every second of it.

Also, calling an ad-hominem while insulting an entire community? Really?

Several of his arguments stem from his perl heritage. He gives a nod to moose's roles when complaining about Dart's weak OO (I actually agree with this one), advocates implicit conversions in stupid Perl-like ways, whines about + overloading in ways only a Perl user would, and relates Isolates to Perl's threads in ways that don't make sense based on my reading of the language spec (Perl's threads are way worse than what dart's seem to be!). Perl is directly visible in over half of his points! Some of his arguments are actually valid, but some obviously stem from his mind being corrupted from too much Perl use.

> Ad hominem, anyone? While "Perl" and language criticism in one headline might be worth a "teehee" moment

And my comment was just that: a "teehee" moment.

Oh, and "ad hominem" means "against a person". I only pointed to the irony of future-language advice coming from the Perl camp, never said anything against the validity of said advice or the person giving it.

Note that I didn't reply to your original post. While one might argue whether this really is irony or just a tired "haha, Perl" joke (or about the applicability of ad hominem by association), I was referring to the more immediate "envy" issue. Not that I don't regret posting, as I don't think digressing in this direction is worth a discussion here – although that would've taken place without my contribution.

"Perl, right, yafeelme?" is basically the "What's the deal with airline food?" of the programming world…

That's a reactionary comment. We'll see if the post was right in a year or so.

Well, sure, the post can be 100% right.

It's the source that's the "ironic" part.

I'm not sure it's ironic - nobody uses Dart yet.

To me Dart feels a bit like the Embrace/Extend/Extinguish behavior of Microsoft. There's nothing I've seen so far that really makes me excited about Dart, and it kind of feels like Google is really just creating this new thing for the sake of being different. It seems like they want to unilaterally make decisions on the future of the web. We all know how well that works.

The way I understand it, Dart is a higher-level abstraction of Javascript. People thought that Java was crap too when it came out and said C/C++ was still king. Look how that went.

Perl has been out fucking forever and still isn't in the browser, so stop whining. Javascript whipped your ass in this arena, Perl.

"People thought that Java was crap too when it came out and said C/C++ was still king"

I was there in 1995 and I have no recollection of anyone saying that. People were generally excited by its possibilities. The worry was about a) embedding a VM into a browser back when memory was limited and b) Java was controlled by one company and wasn't an open standard.

Javascript won the 'browser war' because a) it was already there so it didn't need a plugin b) it was 'good enough' for its time c) it had Java in the title d) it had the sense to become an ECMA standard.

Perl never had any plans to be in the browser it was always a Unix command-line tool.

What is interesting is that Java started as a language for making UIs on iPad-like devices, the first web server was written in Objective-C, and Perl gained traction as the way CGI programs were written. Now, Objective-C is used for programming iPads, and Java moved from UI to server and displaced Perl for corporate web apps. Meanwhile Javascript has quietly solidified its role in the browser and may finally make the leap to being a major server tech.

I was there in 1995, too. People were excited about the possibilities in Java and applets, because that was the only purpose in the beginning as far as most were concerned. Then people started writing UI apps and other standalone apps in Java, and it was deemed "slow" and people made fun of it. People seem to forget that these days, because it wasn't long before it took off and became popular.

"Perl never had any plans to be in the browser it was always a Unix command-line tool."

Perl is not (just) a "Unix command-line tool". It is an interpreted language. It was a preferred language for old school webapps in the mid-to-late 90s, and continued in use a lot through the 2000s. Now for the majority, it has been replaced, except for code that is still running on old servers.

So why are Perl programmers ranting against Dart? Javascript is somewhat of a sore spot, even though the average Perl programmer would not admit to it and may not even realize it. A client-heavy web application that doesn't try to do things quickly by saving keystrokes is about as far from Perl as one can get. Dart is "Javascript part 2" to the Perl programmer, and subconsciously eats away at the Perl programmer's ego. Even though they can't explain why, they feel as though they must destroy it.
