I had the same thoughts as I jumped over from reddit to here. The discussion on reddit (http://programming.reddit.com/info/280e9/comments/c028101) does once again raise the age-old topic that people seem so fond of (at least over there): bashing Arc for 'not bringing anything new to the table'. As you point out, this particular article doesn't really need to spend a few days on news.yc, since it's kind of dated and seems an awful lot like a copy-over from whatever gets posted on reddit. Still, I would like to hear more about that criticism.
I know Paul has offered that Arc is terser - to the point of having the terseness standardized. I really do see the value in that, and I don't share the "just write CL macros" sentiment that seems so pervasive lately. In fact, I'm excited to perhaps someday be able to use Arc. However, I WOULD like to know what Paul (and other commentators, though again there are more over at reddit, so far as that goes) thinks about the vitriolic remarks. Are they even worth paying attention to? There always seems to be wanton backlash against anything that gets posted to sufficiently large news sites (Slashdot, reddit, etc.). Yet it also seems like a legitimate matter to consider, though it could just be that the public at large doesn't see the entire picture, not having access to the language. So, thoughts on this phenomenon, anyone?
I think it would be very interesting to see Pg address some of the criticisms of Arc.
I did notice the comments on the original reddit posting, several of which seemed well considered. Lisp seems to generate so many branches that it hardly seems worthwhile to study the specifics of each one. Can anything really be that new?
What I would be really interested in is hearing who the next McCarthy might be. CL is interesting, but wordiness does not seem to be its primary problem. If Pg disagrees I'd be quite interested to hear his opinion.
Perhaps people have other thoughts about the primary shortcomings of Lisp?
If the problem with Arc is that it's not different enough from existing Lisp dialects like CL or Scheme, all I can say is that being different is not my goal. I'm trying to make it good, not original. So it will only be as original as it has to be to be the best language for writing programs (as opposed to pleasing managers, or writing papers about, or seeming comfortingly familiar).
Do you believe that the best language for writing programs would be universal, or might there be one best for you (syntax you like and find convenient) and a different best for me (syntax I like and find convenient)?
My uneducated guess is that much of Lisp dialect proliferation amounts to a confusion between syntax and logic. I would distinguish as follows: 'syntax' is the arbitrary elements used to describe program logic; 'logic' is the underlying computational process that drives the language.
As a consequence, I would suggest that some language designers may be confusing syntactic and logical elements when they claim their language is 'better' than other variants. Where the syntax varies according to preference and not because of an improvement in logic, I would say that this is a contextual 'better' and not a mathematical (and hence universal) 'better.'
Anyways, I wouldn't pay the reddit critique much heed, as much of it mirrors the endless debates over vim vs. emacs. Which is to say, I don't find it very helpful and don't know why Pg would either.
I think there might be several optimal languages for different domains. Not 100% sure yet whether that's also true for different users. I know I'm not planning to protect users from themselves. If there is a genuine need for that, then different languages might be better for different users even in the same problem domain.
Wanted to clarify my earlier discussion of syntax with the following example.
Suppose we have three users:
(AS) Andy Smith
(BS) Andy Smith's brother Bobby Smith who can telepathically communicate with his computer
(CS) An alien computer scientist
Because the process of programming involves the translation from natural language to machine language (logic), there will necessarily be various syntactical preferences depending on the natural language of the programmer and the input method.
For this example we will assume AS and BS have the same basic cognitive structure and natural language. The only difference is that BS is telepathically linked to his computer, whereas AS must use a keyboard. Presumably there is now no barrier for BS to use a purer translation of natural language to machine language. Because there is no time cost for typing, BS can represent lambda as 'lambda' instead of as 'l' or simply parens. AS, on the other hand, will likely choose abbreviations, since to him they ease the translation from natural language to machine language. But he will probably use only English-derived abbreviations of no fewer than three characters, as anything else would be outside his cognitive context.
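To make that concrete, here is the same anonymous function at three levels of abbreviation. The Scheme line is standard; the Arc lines are only my reading of the snippets pg has published, so treat them as a sketch rather than settled syntax:

    (lambda (x) (+ x 1))   ; Scheme: the full word, as BS might prefer
    (fn (x) (+ x 1))       ; Arc: an English-derived abbreviation, AS's style
    [+ _ 1]                ; Arc: bracket shorthand with an implicit argument

All three denote the same machine logic; only the surface representation changes.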
Because our alien programmer (CS) thinks in a completely different language, the preferred syntax will also be different. Perhaps CS will want to represent concepts (such as lambda) as hexadecimal numbers. This can hardly be said to be better or worse, as the machine logic does not change; only the representation/syntax changes.
So I think if I had to rewrite my earlier statement I would get rid of the word 'preferences,' which implies subjectivity. Rather, I would suggest there are different optimal syntaxes depending on the point of origin (or point of translation from natural language to machine language) but likely only one optimal representation as machine logic (per problem).
Seems perfectly sound to me. redditers will have to find some new (possibly - or even probably - artificial) thing to rag on, I guess. Again, I'm not sure why, but it seems to be a favorite activity of online communities to criticize anything they can. It feels like there should be an official term for this phenomenon (that is, other than "trolling"; more like a name for WHY it happens), but alas my research fails me. Help, anyone?
As an aside about Arc, but not to harass you:
If you don't mind my asking silly questions: I've noticed that in the snippets you've provided, the syntax you seem to use most is the [_] notation, even though you had some additional ideas kicking around originally (e.g., x:y for (x 'y), allowing infix math, letting users define syntax). Are you still planning to include these (and possibly more) syntactical forms, or are some already in? What about parentheses inference (not as a matter of hating parens, just as something from your initial writings that hasn't seemed to surface in code snippets)? Any new thoughts on such things?
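For anyone who hasn't followed the writings, the forms in question look roughly like this. Only the first line appears in published snippets; the rest are my guesses at the old proposals, so the details may well be wrong:

    (map [+ _ 1] '(1 2 3))   ; bracket notation: [+ _ 1] abbreviates (fn (_) (+ _ 1)), giving (2 3 4)
    x:y                      ; proposed sugar for (x 'y), per the initial writings
    (a + b * c)              ; proposed optional infix math; exact precedence rules never shown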
PG, hats off to you. This is exactly the attitude I wish I had while I was doing research, even though the "must-produce-papers" part of academia teaches you otherwise. That's how you can produce useful stuff, anyway.
What amazes me is that nobody's pointed out the obvious counter-observation: Lisp is not an acceptable LISP. Not for any value of Lisp. There's nothing magical about this, nothing partisan. If Lisp were acceptable, then we'd all be using it.
You've all read about the Road to Lisp. I was on it for a little over a year. It's a great road, very enlightening, blah blah blah, but what they fail to mention is that Lisp isn't at the end of it.
He points out a few things:
1. Too many different implementations
2. No decent, updated spec
3. CLOS (he gets pretty detailed here, and there are some counter-arguments and clarifications in the comments)
4. Macros, and how they don't quite deliver on the promise because most modern tools aren't equipped to handle them (see the sketch after this list)
5. Type system (He doesn't really get into this one)
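On point 4, a classic illustration - my own CL example, not one from the post - is an anaphoric macro, which introduces a binding that nothing at the call site reveals to a generic tool:

    ;; aif, in the style of On Lisp: binds the result of the test form to IT.
    (defmacro aif (test then &optional else)
      `(let ((it ,test))
         (if it ,then ,else)))

    (defparameter *table* (make-hash-table))
    (setf (gethash 'key *table*) 42)

    ;; Nothing here tells an editor that IT is bound, so completion,
    ;; renaming, and highlighting all tend to miss it.
    (aif (gethash 'key *table*)
         (print it)        ; prints 42
         (print 'missing))

A tool that understands defun can't, in general, understand what an arbitrary macro does without expanding it.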
I'm a Lisp newb, so while I understand most of the arguments conceptually I'm in no position to really evaluate their validity. But it's an interesting discussion.
Finding the right balance between brevity and hygiene isn't easy. Sounds like the implicit variables / macros issue is resolving on the hygiene side. I look forward to hearing how that works out.
Short operators, explicitly local variables, a preference for more operators over overloading, and a built-in hashtable type? This sounds alarmingly like Perl. ;)
Just because something is posted over at reddit doesn't mean it has to come here...
;)