

Dart Bashing - gbaygon
http://dartinside.com/2011/dart-bashing/

======
wbhart
Douglas Crockford details a pile of problems with Javascript that have plagued
it since the year zero. Dart needs to fix all these issues. Moreover,
Google really pitched it as a better language than Javascript.

So far the Dart implementors don't even implement their spec correctly and
some of the completely broken things in Javascript have been retained. And
some new ones have been added too.

I'm absolutely stupefied that their spec claims integers are not limited to 32
or 64 bits but only limited by the size of memory on the machine. Not only is
that not true (they've just used the completely broken Javascript double
precision floats), but it isn't even sensible. You cannot make bignums
anywhere near as efficient as machine words.

They've even thrown away some of the really brilliant things in Javascript.
It's a total disaster. The spec is full of typos too. This is not a hackish
attempt by some inexperienced computer science graduate student; it is a
language developed by one of the richest corporations on earth.

Regarding the fact that Dart has null pointers, someone on Reddit commented
"what were they smoking".

Someone commented on HN that Google does not hire programming language theory
experts. I wish I could vote that up a hundred times.

~~~
Peaker
You can make bignums pretty close to the efficiency of machine words -- when
storing values that fit in machine words. You do that by simply using tagged
machine words.

A few extra instructions protecting the bignum's use do not have a significant
cost.

In [http://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf](http://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf) Tim Sweeney says:

    
    
      Factoid: C# exposes more than 10 integer-like data types,
      none of which are those defined by (Pythagoras, 500BC).
    
      In the future, can we get integers right
    
      Neat Trick:
      In a machine word (size 2^n), encode an integer ±2^(n-1)
      or a pointer to a variable-precision integer
      § Thus “small” integers carry no storage cost
      § Additional access cost is ~5 CPU instructions
      But:
      § A natural number bounded so as to index into an active array is
      guaranteed to fit within the machine word size (the array is the
      proof of this!) and thus requires no special encoding.
      § Since ~80% of integers can be dependently-typed to access
      into an array, the amortized cost is ~1 CPU instruction
      per integer operation.
    
      This could be a viable tradeoff.
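The "neat trick" from the slide can be sketched in a few lines. This is a hypothetical illustration, not any real VM's scheme: a 64-bit word holds either a small integer (low tag bit set) or, conceptually, a pointer to a heap-allocated bignum. The names `box`/`unbox` and the 63-bit small-int range are my own choices for the sketch.

```python
# Illustrative sketch of tagged machine words (not from any real VM):
# a 64-bit "word" stores either a small integer with its low bit set
# to 1, or a "pointer" to variable-precision storage (simulated here
# by a tuple, since Python has no raw pointers).

SMALL_MIN = -(1 << 62)           # smallest integer that fits after tagging
SMALL_MAX = (1 << 62) - 1        # largest integer that fits after tagging
WORD_MASK = (1 << 64) - 1

def box(n):
    """Return a tagged 64-bit word for n, or a 'bignum' fallback."""
    if SMALL_MIN <= n <= SMALL_MAX:
        return ((n << 1) | 1) & WORD_MASK   # shift in the tag bit
    return ("bignum", n)         # stand-in for a pointer to heap storage

def unbox(w):
    """Recover the integer from a tagged word or 'bignum' reference."""
    if isinstance(w, tuple):     # the "pointer" (untagged) case
        return w[1]
    v = w >> 1                   # drop the tag bit
    if w & (1 << 63):            # sign-extend the 64-bit word
        v -= 1 << 63
    return v
```

This is where the "~5 CPU instructions" overhead comes from: every arithmetic operation must test the tag bit and branch to the bignum path when it is clear.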

~~~
wbhart
In fact I've implemented precisely that first tradeoff in a library and it
isn't too bad. It's still much slower than machine words though.

In Dart this becomes more constrained as it uses dynamic typing. Apparently
the types specified by the programmer will be thrown away when it is compiled
down to VM instructions. They are only used to specify/check interfaces.

What this means is that those tags become quite large in the implementation.
You can try any Common Lisp or R6RS Scheme implementation without type
inference to see how fast or slow this is in practice.

I don't completely understand the second trick. Where are the bignums?

~~~
Peaker
He's basically saying: If the integer is tagged in its static type to be
within the range of some array -- then that is proof that it is not a "bignum"
and can be stored as a machine word.

~~~
wbhart
Oh yeah. That's an optimisation more languages should use. But he clearly
doesn't do much mathematics if he thinks most integers are array indices. His
point is doubtlessly right in his problem domain though and maybe even for a
large sector of the programming community.

------
gruseom
Don't know if this is Dart bashing, but... Java with actors? An emphatic no
thank you. I fear these guys are committing the perennial mistake of designing
a language for someone other than themselves - as always, for someone more
_average_ than themselves [†] - and are following Java's example in this
respect as well. Personally, I don't want to program in PL/I.

The Go guys seem to have gone the opposite route, designing the C successor
they themselves most want. That's far more likely to please others in the long
run.

[†] Edit: someone misunderstood me here. I'm not advocating this supercilious
attitude, I'm saying it's a mistake that leads to worse languages.

------
calebmpeterson
_I’ve said this numerous times, and will do again, GUI is the next killer-app
for actor programming._

As a full-time GUI developer (Java/SWT) this is so true it hurts. I would love
to see the actor model and/or functional reactive programming become
mainstream in this space. The observer pattern alone is insufficient...

~~~
scotth
I'm having a little trouble imagining how the actor model would apply to GUI
programming. Could you explain?

~~~
antrix
I guess the issue is that making thread-safe GUIs is incredibly hard. So all
GUI frameworks require you to update the GUI state only in the 'GUI thread'.
Which means we use hacks like 'runOnMainThread', etc. to update the GUI.

A saner way is to implement an event bus and send update events to it, letting
the main thread update itself from the bus. Some frameworks do use this model,
but really, it would be much nicer if the language itself could do this.
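The event-bus pattern described above can be sketched with a plain thread-safe queue. This is a minimal illustration, not any framework's API: background "workers" only post events, and a single consumer loop (standing in for the GUI thread) is the only place state is mutated.

```python
# Minimal sketch of a single-consumer event bus. The event name
# "set_label" and the label_text list standing in for widget state
# are illustrative; a real GUI framework would drain this queue
# from its own main loop.
import queue
import threading

bus = queue.Queue()
label_text = []                  # stands in for GUI widget state

def worker(n):
    # Background work never touches the "GUI"; it only posts events.
    bus.put(("set_label", f"result {n}"))

threads = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
bus.put(None)                    # sentinel: no more events

# The "main thread": the only code that mutates state.
while (event := bus.get()) is not None:
    kind, payload = event
    if kind == "set_label":
        label_text.append(payload)
```

Because `queue.Queue` is thread-safe, the workers need no locks and no `runOnMainThread`-style hacks; ordering of updates is whatever order events arrive on the bus.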

------
chrisbuc
Quite right too. Too much of what I've read on here and in the dart forums is
"I thought it would be more like..." and "google have just created something
to aid their development toolchain."

I view Dart as JavaScript grown up. It's not that much different from Java et
al, but it's different enough.

Only time will tell if it will replace javascript, but I'll be doing my bit to
help.

~~~
arctangent
I will likely adopt Dart to replace all my client-side coding needs (but not
just yet).

I was a little bit annoyed by the braces/semicolons on first glance but spent
a few hours today reading the documentation and language reference etc.

It does seem like Dart is a "grown up" language, as you say. It's so packed
full of goodies that it seems preferable to JavaScript in every way, except
for the fact that JS is already here and has a large user base.

People who've invested a lot of time in learning how to make JS work might
have a preference for sticking with what they know, but I think the general
reaction today has been to place too much onus on the Dart team to say why
it's better than JS.

It would be interesting to hear opinions from John Resig or Douglas Crockford.
I'm hoping they're giving the matter some thought before weighing in, rather
than just being indifferent.

------
sciurus
This was written on October 8th, before much information was available about
Dart.

~~~
Semiapies
Every post on this blog appears to be this guy fanboying about Dart with next
to no information on it.

Something smells, here.

------
languagehacker
Leave it to the guy who finds any excuse possible to talk about how great
Erlang is to create a perceived controversy for a new programming language so
he can defend it -- and, tangentially, Erlang.

Dart actually looks pretty good, and since it can be compiled into JavaScript,
there's no real risk in giving it a shot other than the obvious lack of user
libraries, at this point. If Dart solves problems JavaScript has without
introducing its own laundry list of wonky inconsistencies, then it's
definitely worth developing with and participating in.

------
wavephorm
I think it's more general 'Google-bashing'. Developers have every right to be
extremely cautious about adopting any new developer tools from Google.

Anyone who's used GAE has been badly burned, if not deceived, by Google. Many
unpopular or costly APIs were revoked unceremoniously. It is not easy to
understand Google's intentions because they never seem entirely genuine or
devoted to anything. This fly-by-night attitude is not acceptable for a tools
developer. Developers want to trust people who are dedicated to their work.
And I just never get that attitude from Google.

~~~
sambeau
From what I have seen, that is not the case. You can't just dismiss the many
informed opinions you see on here as 'Google bashing'.

I've seen many really good reasons for thinking that the web programming
community hoped for something lighter and more innovative from Google.

There are many good reasons why most of us avoid web-programming in Java and
C#, especially those of us in smaller shops where one person might have to
tackle the full stack.

