And in the other column, Arc had significant things going against it. It was a toy implementation by somebody who really had more important things to do with his time. It didn't bring anything substantial to the table -- to a first approximation it felt like a basic Scheme with an idiosyncratic function naming scheme. It didn't have any killer features. It didn't have a significant standard library. Compare that starting point to e.g. Clojure or Go. It's like night and day.
The question isn't why Arc didn't take off. It's why anyone would expect it to take off in the first place.
And though language design (as opposed to implementation) may not seem "substantial," it does matter. Scheme itself was initially an exercise in language design -- a cleaning up of MacLisp. It didn't only become "substantial" when people started writing complicated compilers for it.
I agree that Scheme didn't only become substantial once the implementations became mature. But that's because it had a few fresh big ideas (lexical scoping/closures, CPS), not because it was new. Likewise it didn't become popular (as much as it ever became popular) just due to existing, but largely thanks to being used as the teaching language for a legendary intro course at a prestigious university.
As for the "hype" comment, that wasn't intended as a negative attribute. It's very hard for a language to succeed purely on its own merits. Tens of man-years of effort had gone into Go before it was launched. It might still have failed had there not been such powerful marketing hooks in place (created by Bell Labs legends, having the appearance of being backed by Google). Arc had a good marketing hook as well. That would have made bootstrapping an initial community much easier than e.g. for Clojure.
And just to be clear this isn't intended as a personal criticism! There's just one YC, but hundreds of new languages and implementations appear every year, slowly advancing the state of the art.
I played around with Arc quite a bit and the biggest problem was that it did not have a standard library with enough stuff in it. In contrast (and talking about a totally different type of language) one of the reasons Go has been successful is the excellent set of standard packages.
If you start your language with sufficient standard library kindling then others can build on that and write more and more libraries. But start with insufficient libraries and it's hard to take off.
So, the initial releases of Arc were constrained by things that PG had already written. If you wanted to do the same things he had done, it was just fine.
As a concrete example, on my old UseTheSource site (which was written in Arc) I wanted to grab a URL from a third-party site from inside the code. There wasn't an easy way to do that, so I wrote an external script in Perl and did the following in Arc:
(def hn-valid (user)
  (if (uvar user hn)                                 ; only check users with the hn field set
      (let v (system (+ "perl check-hn.pl " user))   ; shell out to the Perl script
        (if v (= ((profile user) 'hn) t)))))         ; on success, record it on the profile
A similar thing happened with UseTheSource's Twitter integration where I could use Perl's Net::Twitter::Lite to trivially integrate.
My claim was:
NOT (has large initial library) => NOT (gains widespread use)
It is NOT the case that from my statement it's possible to derive:
(has large initial library) => (gains widespread use)
My colleague Steve presented a talk at RubyConf India titled 'Why Clojure is my favourite Ruby' which might appeal to such people. http://www.youtube.com/watch?v=PCdEbUBk6a0
I'm already using something else. Why should I use the Java eco-system?
There are other valid choices, of course, and the JVM is obviously not the optimal choice for every project, but I think it's fallacious to rag on the JVM just because its most popular language isn't to your taste.
In short: Java bad. JVM very good.
I object to the word "fan", but I understand your meaning. Regardless, I love Clojure and like Java just fine.
I recently realized why the arc webserver needed to track IPs to ignore almost from day 1: every time somebody requests a page, memory usage goes up. This includes crawlers, drive-by users, everybody. The standard way arc encourages webapps to be programmed (http://paulgraham.com/arcchallenge.html) is by leaking memory. Every load of the front-page creates 30 closures just for the 'flag' links. Ignoring IPs is just a band-aid to cover up this huge sink.
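The pattern, roughly (a simplified sketch with hypothetical names, loosely modeled on what srv.arc does; the real code differs in its details):

(= fns* (table))               ; global table of pending closures

(def flink (f)
  (let id (uniq)
    (= (fns* id) f)            ; closure kept alive server-side, never freed
    (string "x?fnid=" id)))    ; the link only carries the table key

Every render of a closure-backed link adds an entry to that table, so thirty 'flag' links per front-page view mean thirty new entries that nothing ever removes.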
I've been programming in arc for 3 years and am still active on the arc forum. I love that arc is a small and simple language without lots of bells and whistles. I really couldn't care less that it's 'not taken off'. But there's a difference between toys that encourage exploratory play and painting yourself into a corner design-wise. I now think of continuation-based webservers as an evolutionary dead end.
Another ill-effect of the IP-ignoring is that every newcomer to arc who tries to run it behind a proxy server immediately runs into a gotcha: the arc webserver now thinks all requests are coming from the same IP address and so throttles everybody. http://www.arclanguage.org/item?id=11199
If you spend any time with it, it literally begs to be tinkered with. And the experience of programming in a language while referring to its compiler in another window is something everybody should experience.
Using closures to store state is like using lists as data structures: it's a rapid prototyping technique. You're in no way painting yourself into a corner, and the current news software is proof of that. You have a very old version of it. In the years since we released the version you have, we've gradually squeezed most of the closures out of the more common things, and we do have a proxy in front of the Arc server. That's how we manage to serve 1.7 million pages a day off one core.
I was conflating continuation-based and closure-based webservers because both allow straight-line code instead of explicitly managing url handlers.
I'd never seen anybody say this is just an early-prototyping technique. But a search turned up Patrick Collison concurring: http://www.quora.com/Whats-the-best-continuation-based-web-f...
It was too harsh to say it paints us into a corner; it is possible to replace fnid links with new defops. I went back and looked at the frontpage when not logged-in, and saw that there are 0 fnid links in that common case. (One possible way to gradually use a second server would be to just serve the frontpage off it. You'd need to move to a centralized store for user/profile/story objects, but the fnids table could continue to live on one server.)
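A hypothetical sketch of that fnid-to-defop replacement (get-user, item, and flag-item are stand-ins for whatever the news code actually calls these):

(defop flag req
  (let id (int (arg req "id"))             ; the state travels in the url
    (flag-item (get-user req) (item id))))

The link becomes a plain (string "flag?id=" i!id), and one stateless defop replaces an unbounded number of per-render closures.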
It also turns out that there's at least one company trying to scale continuation-based servers (http://bc.tech.coop/blog/040404.html). So it was overly harsh to call it a dead end.
"The problem: because closures can’t be stored in databases, you really have to use a hash table on your web daemon."
This is not true. Closures can be stored anywhere you want.
I don't consider Arc to have died, incidentally. You used it to say that, and I'm using it to reply. If I ever retired from YC I'd probably start working on it more actively again.
FYI... I still use arc for certain projects and still have hope that one day it will get the attention it deserves. I know you've taken a lot of criticism about Arc, but I'll suggest there are quite a few of us that do appreciate the work you have done so far.
Lispers are always tempted to blame Lisp's niche status on being too advanced, but it always sounds a bit like a smug non-answer to the job interview classic "what's your biggest weakness?" One friend who likes the syntax plus the existence of reader macros does not prove the syntax is not a big issue.
That said I think his conclusion is pretty sound, though you'd be hard-pressed to find much evidence of Unix-friendliness in C# or Java.
Deterministically, I mean - heteroiconic languages do provide for metaprogramming, but it is inherently less robust than metaprogramming in a homoiconic language: http://lists.warhead.org.uk/pipermail/iwe/2005-July/000130.h...
Take Perl, for example. You can't have robust metaprogramming on a language with an undecidable grammar. Read through the above link and you'll see the difference.
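To make the homoiconic side concrete, here's a toy Arc macro (my-unless is a made-up name, since Arc ships its own unless): the body arrives as an ordinary list, so transforming it is plain list manipulation, with no grammar to second-guess.

(mac my-unless (test . body)
  `(if (no ,test) (do ,@body)))   ; code in, code out: just lists

(let x 5
  (my-unless (is x 0)
    (prn "x is nonzero")))        ; prints, since x isn't 0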
In the languages I've mentioned it is at the same level. Here are some examples - http://news.ycombinator.com/item?id=3125375
> Take Perl, for example. You can't have robust metaprogramming on a language with an undecidable grammar
I'm referring to Perl6 here, which is (intended to be) self-hosted via Perl6 grammars and also comes with hygienic macros.
While perl5 won't get this good, you'll be surprised what metaprogramming can be done with it.
NB. For an example of macro-like things, here is a list of CPAN modules that make use of Devel::Declare - https://metacpan.org/requires/distribution/Devel-Declare?sor...
> Read through the above link and you'll see the difference.
I have and I remember reading it back in 2005 :) This doesn't affect the list of languages I mentioned earlier.
When I go to the Arc site [http://arclanguage.org/], it seems pretty obvious to me why Arc has not "taken off": A drab HTML 2.0 site with a tiny font describing how it's unfinished, and the only way to install is through another Lisp version, and no indication that there is a community of developers behind it.
It's almost like they don't want anyone to use it. To be fair, it seems Paul Graham does not care about popularity. But not caring about popularity means it's dead in the water.
It's not like lots of people don't want a Lisp these days. Clojure has become very popular, after all; it has been able to hit the sweet spot in terms of modernity and lispiness; good, practical technology with a solid community.
There are a few exceptions, but for the most part, your language is going to be successful to the degree that it can interoperate with these environments. Common Lisp, Racket, Haskell, OCaml -- all of these languages have FFIs, but they're "begrudging" FFIs. They don't really care about your "legacy" systems or the titanic number of C and Java libraries that already exist.
Clojure and Scala do care about that, as does Lua. And Ruby and Python (and Perl) care as well.
I see beautiful new languages every day. I can tell whether they're going to live or die primarily on the basis of their attitude toward the dominant library environments in modern programming.
If you ask about C interop or creating standalone binaries on a Common Lisp board, you'll get an answer (because you can do this). But you'll also get this whole, "Oh, but why would you want to infect our beautiful language/runtime with the fallen world of imperative code, UNIX conventions, etc. Free your mind!"
That. That right there. That's the problem.
Not all "hackers" are language geeks; plenty of smart people won't necessarily invest their time in learning exotic new programming languages, since learning a language for its own sake isn't interesting to everyone.
Other considerations for language choice will certainly include documentation and the accessibility of that documentation. Looking up a few pieces of code to do familiar tasks is much easier than reading a grand language design document as a way to get a feel for a language.
Having usage in some large commercial setting is also a consideration. For example, knowing that Google uses Python extensively can give one some confidence that Python is unlikely to suddenly die out one day, because Google wouldn't let that happen.
C++ took the low road by co-opting C. Scala and Clojure took the low road in co-opting Java. Languages like Arc, Smalltalk, Haskell and Eiffel took the high road and as a result they don't have as many users.
Having said that, I don't want to imply that language designers are doing a bad thing when they take the "low road." They get adopters and they bring powerful tools into the world. It's just that when they do that they have to sacrifice coherence a bit.
It implies that Java failed in the marketplace. It did not. It has taken and kept huge market share. Sadly, it is the language most employers are hiring for. I would rather poke myself in the eye with a pencil than program in Java, but it has most certainly succeeded in the enterprise.
Paul Graham was building a web company in the mid-1990s, so rewriting a bunch of libraries was not an impediment. First, they had top-notch programmers who were willing to work with FFIs, write tools they were used to in other languages, and understand technology deeply enough to write good libraries. Second, the state-of-the-art for libraries in the 1990s was, if nothing else, less Big. Third, they didn't need to sell their language choice to a boss; they just needed to sell the product to the market and investors, neither of which cares what language you use as long as it works.
Arc is prettier than Clojure but, in my experience, Clojure is more than attractive enough... and it has the familiar JVM libraries.
This way you could spend zero time writing your own urlencode function (and screwing up a corner case) or searching through Maven to find functions to do very simple things, only to later discover that other people on your team had imported similar functions from four other projects.
Some of which were dangerous and unwieldy, actually. Then proven dangerous and unwieldy. And then most of the things that made up idiomatic PHP were removed from the language or discouraged, and nobody writes that way anymore.
Of course the ultimate in brevity is to have the program already written for you, and merely to call it. And this brings us to what I think will be an increasingly important feature of programming languages: library functions.
In fact, read the whole section 6, on Libraries.
Where there is disagreement is that most CTOs are afraid to adopt "weird" languages out of library FUD, which PG asserts doesn't matter. On that, I think PG is right.
Lisp variants can be seen as a scripting language equivalent -- with efficient compilation. It was (and still is, to a large degree) an obvious win compared to Perl/Ruby/etc.
Academic stigma and/or culture clash with scripting users is what I'd guess. Or maybe it just is lack of hype?
But I don't expect to see an explanation. It is a larger mystery than dark energy, to me at least -- why didn't Lisp take over the world 10-15 years ago, when the scripting languages started taking off?!
Edit: pjmlp, I talked about the last 10-15 years and compared with scripting languages, so hardware/AI winter/IDE are earlier/later problems. (Today, a lack of libraries might be the worst problem, except for Clojure(?).)
- Lack of a proper IDE
- The AI Winter
- Mainstream hardware wasn't powerful enough
- Most blue collar developers don't understand FP concepts
Just some possible reasons off the top of my head.
HAHAHA! The vast majority of FP advocates are either unemployed or work as math teachers.
Like it or hate it, FP is absolutely nothing more than a pseudo-programming paradigm (largely emulating some concepts from abstract math and using notation somewhat similar to math notation) that attracts people who can't wrap their heads around OOP, rich frameworks and other associated stuff. Sorry folks, computers are neither abstract nor stateless. And the same holds true for software, which often deals with real-world stuff, which again, is neither abstract nor stateless. Virtually everything that can be done in a functional language can also be done in a procedural or OOP language. The opposite, on the other hand, is totally untrue. It's really funny to see how FP advocates struggle even with some extremely basic things. Using languages/platforms like C/C++, Java, .Net - there's always an increase in performance compared to any functional language (yeah, including Scala, F#, Clojure, OCaml and so on).
The "elegant code" argument is one of the most ridiculous things FP advocates come up with, since it's almost always synonymous with crappy, cryptic code that no one wants to read besides its authors (maybe not even them after a few weeks or months) :D
So I'm afraid that the FP advocates are far worse than real blue collar workers.
It's true. But you know what computers are first and foremost? They're deterministic.
And you know what's one of the biggest problems programmers face in the Real-World [TM] when the shit hits the fan (and most devs' job is to fix shit that just hit the fan)? It's being able to recreate the state and then deterministically reproduce the shit that hit the fan, so as to prevent it from hitting the fan again.
Why do we see market makers employing 90 OCaml programmers and raving about it?
Why do we see investment banks moving from Java to Clojure and slashing their codebases by a factor of ten in the process? And then explaining how much easier their lives became in the face of changing requirements (e.g. new laws/regulations coming in)?
Do you really think that a codebase ten times smaller is "harder to read"? Do you really think that making it easier to reproduce the state is not a goal worthy to achieve?
I realize you feel insecure in your Java/C# + ORM + XML + SQL hell but don't worry: there's always going to be lots of real-world jobs for code monkeys like you ; )
That's like saying that computers have mass and are made of matter.
"And you know what's one the biggest problem programmers do face in the Real-World [TM] when the shit hits the fan (and most devs' jobs is to fix shit that just hit the fan)? It's being able to recreate the state and to then be able to deterministically reproduce the shit that did hit the fan. As to prevent it from hitting the fan again."
That's false. Programmers don't have to recreate the exact same state; actually, in many cases it's not necessary to recreate the error at all. There are more tools than you can imagine for identifying errors, from logging to memory dumpers and analyzers/profilers...
"Why do we seen market makers using 90 Ocaml programmers and raving about it?
Why do we see investment banks moving from Java to Clojure and slashing their codebase by doing so by a factor of ten?"
Well, I'm afraid that happens in your imagination only. I also happen to be a trader. Almost NO ONE uses functional languages (fewer than 0.01%) for financial trading. The main languages are C/C++ (especially for high-frequency trading) and, of course, Java and also .Net.
"I realize you feel insecure in your Java/C# + ORM + XML + SQL hell but don't worry: there's always going to be lots of real-world jobs for code monkeys like you ; )"
You're pretty delusional about how secure or insecure I feel (haha!) and how much of a "code monkey" I am. LOL! You don't even know me, but you already pretend that you know me. Unfortunately for you (and all those like you), this is a typical characteristic of FP advocates: you live in an illusory world and have a totally distorted view of software engineering and, of course, of the people who make real-world software. Anyway, it's always funny to see the reactions of FP advocates when they're left without any objective, verifiable, real-world arguments. :D
Lisp examples also always cheat at things like quines... the program will return itself instead of writing itself to stdout.
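To be fair, an honest one is possible. Here's an untested sketch of an Arc quine that writes itself to stdout; note it only round-trips character-for-character if the printer abbreviates (quote x) as 'x, and would print (quote ...) literally otherwise:

((fn (x) (write (list x (list 'quote x)))) '(fn (x) (write (list x (list 'quote x)))))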
There is something to be said for being able to just run 'python foo.py'
Whereas in many Lisp variants there is no clear entry point, and often not really an "interpreter".
The people who care about this distinction are the sort of people who give job interviews where they ask you to implement a balanced binary search tree on the whiteboard and then grill you on the order of growth of each function that you wrote. :-)
Develop a stubborn willingness to ignore the at-times unsatisfactory performance of immutable data structures.