Hacker News
Arc: Where are we going? (arclanguage.org)
69 points by nreece on Oct 25, 2008 | 27 comments



Sounds like the community is not very happy, for various reasons (some valid).

If I were an arclanguage community member reading this and seeing pg's responses, I would not feel better.

The whole situation sounds like a big mess especially considering lisp's history of "community fragmentation and trivially incompatible implementations."

I sincerely hope things get better.


What you call fragmentation in the Lisp community I actually consider a sign of health. It was a sign of the vitality of the Lisp community in the 70s and 80s that people kept forking off new dialects. That was actually something I was hoping to encourage more of with Arc.

CL was in my opinion (and in the opinion of a lot of Lisp hackers) a disaster in that respect. Before CL, Lisp had evolved rapidly by spawning new dialects. Once CL was established as the standard, this evolution practically stopped.


I've thought about this some, and think it depends on how you define "health". What you describe sounds very healthy in terms of a good place to be for those wanting to hack on cool, new things, to push boundaries, to explore, and so on. However, I am less certain that it makes for a healthy community in terms of one where there are relatively fewer producers of the language, and many consumers, and those consumers want something that "just works" that they can depend on. In that sort of situation, you run into the problem where people get annoyed with the "which one!?" problem described here:

http://journal.dedasys.com/articles/2006/02/18/maximizers-sa...

My gut feeling is that there's a 'right' balance that's hard to find. If you just want to explore and be creative and push the limits, that kind of fragmentation is probably good, or at least not harmful. If you want to create something that people, say, base their businesses on, I think that it is not as positive. People get nervous about picking the 'right' one; beginners are especially confused, because they may not even have acquired the knowledge to decide what is 'right'. Lots of hacking goes into implementations, rather than libraries, and libraries may not work on all implementations, further exacerbating the 'which one' problem, which now involves looking at which libs run where.

Another part of the "problem" is that if you're aiming for "fun and experimentation", you are in a somewhat different (enviable?) position than most programming language creators: you've written about and popularized the language already, and are somewhat famous yourself, guaranteeing Arc a wide audience, something that is probably likely to attract more 'consumer' types than 'creators'. They're the ones who will be grumpy about not having something stable.


"hack on cool, new things, to push boundaries, to explore, and so on"

That sounds pretty good to me, actually.

I don't see why aiming for "fun and experimentation" would put me in a different position from other language designers. It's not like anyone does this for a living. I bet most of them started out working on their language in the spirit of intellectual adventure, then gradually got dragged into being motivated more by duty and eminence, till eventually, like poor Guido, their time was all taken up thinking about character sets. That sounds to me like something one ought to resist. Doesn't it to you?

Amusingly enough, you can see the pressure in action all over this thread, e.g.

http://news.ycombinator.com/edit?id=343663

I'm pretty good at resisting though ;)


I don't see why you are so dismissive of Unicode handling as it relates to programming language design. To deal with string encoding correctly, your programming language needs to transparently deal with different types of strings, or a single string that has two ways to iterate over it. That's an interesting design problem.

In particular, you do not want everything in Arc to be an object. But the typical way other programming languages deal with Unicode nicely is to have different classes that implement a common string interface. So how can Arc approach the string encoding problem without making everything an object? I don't know, but I do think it's an important question.
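To make the "two ways to iterate" design problem concrete, here is a minimal sketch in Python (used purely for illustration; Arc would need its own mechanism): the same text can be walked by code point or by encoded byte, and the two views disagree on length and indexing.

```python
# The same string viewed two ways: by code point and by UTF-8 byte.
text = "naïve"

code_points = list(text)                 # iterate by code point
utf8_bytes = list(text.encode("utf-8"))  # iterate by encoded byte

print(len(code_points))  # 5 code points
print(len(utf8_bytes))   # 6 bytes: 'ï' encodes to two bytes in UTF-8
```

A language that exposes only one of these views either breaks on non-ASCII text or forces every program to pay for full Unicode awareness, which is exactly the tension being described.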


Then it becomes, as is so often the case, a question of proper marketing, and managing expectations. You seem to have acquired a group of users/interested onlookers expecting something they're not getting. Or at least a few loud ones:-)

I'm actually fascinated by these 'social science' (i.e. not really a science) aspects of programming languages. If you think about it, programming languages are all about people, how they interact with computers, and how they interact with one another to solve complex problems.


Amen. Lisp history is like Chinese history. The 'warring states' period, though lamented as chaotic, was the most innovative time for both. Harmonious unity sounds good but yields stagnation. So three cheers for Arc and Clojure, and Estonia, the Czech Republic, the Liga Nord, the Bloc Québécois and the state of Jefferson.


I predict that Paul will end up abandoning Arc and restarting his quest for the perfect programming language anew. It seems to happen with other successful languages - Python wasn't Guido's first attempt at a programming language, and Ruby wasn't Matz's first attempt at a language. It's hard to change fundamentals once you have already publicized a language and you have working code, and I think some of the fundamentals of Arc will definitely need to be changed. In particular, the distinctions between the different types of iterable objects - strings, arrays, lists - and the inability to design your own objects that are usable in the same ways as the built-in objects: removing these limitations is a key aspect of the more recent programming languages. I could easily be wrong, but I still suspect the 100-year language quest needs a reboot.

Good luck, and I'll keep trying Arc 3 or whatever future stuff there is. I would love a successful language with real macro capability.


There wouldn't be any need to make a new language. I can change anything I want in this one. There's very little working code out there; that is in fact probably the main complaint about Arc; and one reason that's so is that I don't want it to be hard to change:

http://www.paulgraham.com/core.html

I'm not deeply committed to the current approach to iteration. But I don't want to switch to something more complicated till I can prove it's a net win. My standard of proof is whether the proposed language change will make the source of a real program shorter. So if you have an idea for a new iteration abstraction that would make some piece of code in news.arc shorter, please show me the source before and after. (You don't have to implement the new abstraction; just show me what code using it would look like.)


I think the non-object-oriented philosophy in Arc is so deeply ingrained it'll be hard to change without starting over.

But really the question is whether any object-oriented philosophy is necessary. My take is, let's say you write a function in Arc that operates on lists. It's recursive using car and cdr. Now ideally, you could also run that function on anything else that is list-like. But in Arc, car only works on lists. Maybe you can fix this without making everything an object, that'd be fine by me.
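One way to get a generic car/cdr "without making everything an object" is dispatch on the argument's type. A hedged sketch in Python (functools.singledispatch here stands in for whatever mechanism Arc might actually use; the function names mirror the Lisp ones for illustration):

```python
from functools import singledispatch

@singledispatch
def car(xs):
    raise TypeError(f"car: unsupported type {type(xs).__name__}")

@singledispatch
def cdr(xs):
    raise TypeError(f"cdr: unsupported type {type(xs).__name__}")

@car.register(list)
def _(xs): return xs[0]

@cdr.register(list)
def _(xs): return xs[1:]

@car.register(str)
def _(s): return s[0]

@cdr.register(str)
def _(s): return s[1:]

# The same recursive function now runs on anything car/cdr know about.
def count(xs):
    return 0 if not xs else 1 + count(cdr(xs))

print(count([1, 2, 3]))  # 3
print(count("abcd"))     # 4
```

The dispatch table is open: registering handlers for a new list-like type makes every existing car/cdr-based function work on it, with no class hierarchy involved.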

In practice modern programming languages are tending towards a default dynamic-array-like datastructure that supports both random access and amortized-O(1) append. This is so convenient for hacking programs together, I can't believe the 100-year language won't have this as a very basic part of the system. But Arc doesn't do this. In news.arc you have a very Lisp-list-centric way of programming. But I think for many problems less restrictive datatypes are the most convenient. Hash tables and dynamic arrays need to be just as easy as Lisp lists. This is the Python way, the Ruby way, and I think the way of the 100-year language. But I don't know how to make iteration on any of these look equivalent without making the core of the language be more object oriented.
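The claim above, that hash tables and dynamic arrays should be just as easy as Lisp lists, is roughly what Python and Ruby deliver: both are core types with literal syntax and the complexity guarantees mentioned. A trivial sketch:

```python
scores = {"alice": 3, "bob": 5}  # hash table literal
queue = [1, 2, 3]                # dynamic-array literal

queue.append(4)                  # amortized O(1) append
print(queue[2])                  # O(1) random access -> 3
print(scores["bob"])             # O(1) lookup -> 5
```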


To be honest, I don't understand your point fully. Are you able to expand on your points a bit more?

Are you saying you believe you need some OO elements of polymorphism on car and cdr? Not sure you need these - there are loop constructs that let you say "for each, do X". With macros you could make this do almost any type of iteration.

I might be missing your point though, as I'm not sure how Objects help in this scenario.

Why can't you iterate on hash tables in Arc? ...or another related question, what do you mean by iterate in the context of a hash table?

Similarly... Isn't a dynamic array just a list? Or is it specifically the performance of this that you're concerned about - surely this is just an optimisation step rather than a hole in the language?... Either way, I'll agree that they're convenient for hacking programs together :)


Your point is that it is possible to iterate over any sort of data structure. This is true. The problem is that you have to use different sorts of iteration for different data structures.

Yes, it is possible to use macros to write code that works over both lists and other iterable things. But that isn't the most natural usage; Arc encourages you to write recursive functions using car and cdr. And if you have a lot of code using car and cdr, you have to rewrite that if you want to convert data from lists to another data type. Objects could possibly give you a polymorphic car and cdr, similar to Clojure, so you just don't have to worry about "what sort of iterable object" a function is going to work on.

As far as dynamic arrays versus lists, you can consider it "just an optimization step" as long as you can swap in a dynamic array for a list later on without changing your code. But that isn't possible in Arc because you will have to replace car/cdr with other iteration constructs. I think it's a little unfair to consider the difference between Lisp lists and dynamic arrays just a matter of optimization - there are plenty of things that you can't even play around with when random access is O(n). E.g. randomly shuffling the lines of a large text file.
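The shuffling example depends on O(1) random access: a Fisher-Yates shuffle does one swap per element, which is linear over an array but would degrade to quadratic if every index lookup had to walk a linked list from the head. A sketch in Python, where the backing store is a dynamic array:

```python
import random

def fisher_yates(items):
    """In-place uniform shuffle; each items[j] lookup must be O(1)."""
    for i in range(len(items) - 1, 0, -1):
        j = random.randrange(i + 1)
        items[i], items[j] = items[j], items[i]
    return items

lines = [f"line {n}" for n in range(10)]
shuffled = fisher_yates(lines[:])
print(sorted(shuffled) == sorted(lines))  # same lines, new order
```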


Lua's tables seem like a pretty elegant solution to the dynamic-array-like datastructure, and they're also usable as a relational map. Also, Lua is not object-oriented, but it's trivial to build on the core semantics to make it OO, when doing so would be a practical approach to the problem at hand. (As far as the iteration concern, everything in Lua besides unboxed primitives is a (pseudo-hash) table.)

> But I don't know how to make iteration on any of these look equivalent without making the core of the language be more object oriented.

Haskell's typeclasses would work, as well, and in practice seem to generalize far better. Instead of restricting the iterator used to a specific type, restrict it to any type that supports the interface of iteration. (It's kind of like duck-typing that is verified at compile-time.) It sounds like Clojure has a similar mechanism.
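A rough Python analogue of writing against "any type that supports the interface of iteration" rather than one concrete type (the typing annotations here are illustrative, not anything Arc or Haskell prescribes):

```python
from typing import Iterable, Optional, TypeVar

T = TypeVar("T")

def first_or_none(xs: Iterable[T]) -> Optional[T]:
    """Works on any iterable: list, tuple, set, dict, generator..."""
    for x in xs:
        return x
    return None

print(first_or_none([10, 20]))  # 10
print(first_or_none({"a": 1}))  # a (iterating a dict yields its keys)
print(first_or_none(()))        # None
```

The function never names a concrete collection type, so any new type that implements iteration works with it unchanged, which is the generalization benefit being claimed for typeclasses.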


Clojure seems to tackle this problem by having different "iterable" data types (lists, hashes, vectors), but having them implement a common interface that allows many functions to operate on all of them.


I think Clojure does this well; this solution pretty much requires that every collection is a full-fledged object, though, which is why I can't see it going into Arc. Maybe you could special-case lists and hash tables.


One thing that seems to be a net win in scientific programming as well as graphics programming is a method to view code that renders sub- and superscripts and Unicode characters (as in Fortress). This doesn't reduce the number of nodes required, but it does improve readability substantially. Since reading code is often harder than writing it, and since scientific and graphics code is ordinarily extremely hard to read, an improvement to readability helps.

Adding very powerful and general capabilities to syntax, as Fortress does, also reduces the number of nodes required, but I've yet to determine whether the added complexity is worth it.


> There wouldn't be any need to make a new language. I can change anything I want in this one.

> I'm not deeply committed to the current approach to iteration. But I don't want to switch to something more complicated till I can prove it's a net win.

Paul, this doesn't sound right. Prove that a given change is a net win? Since you obviously can change anything you want at any time, why not go ahead and add the features that seem like they might be worth trying out? See what your users do with them. If the changes/additions turn out to be duds, just rip them out in the next version and let a subset of users whine about it. After all, users will whine either way -- may as well take the path that benefits Arc the most, right?


> My standard of proof is whether the proposed language change will make the source of a real program shorter

Short != Clear. Brevity is a false economy; I think you are pushing for the wrong goal.


I didn't say that making programs shorter was a sufficient condition for implementing a proposed change.


This whole attitude of now, now, now, seems very against the hacker ethos. What happened to procrastination and laziness?


"This whole attitude of now, now, now, seems very against the hacker ethos."

But perhaps in tune with the entrepreneur ethos?


Yes, I think that is a good point.


What happened to procrastination and laziness?

To say nothing of impatience and hubris.


I think the best answer in the thread is the one by "cchooper", which starts with "Personally, I think the only thing". Search for it; it shows an interesting point of view.



I like that, the 100 year language is an idea, not an implementation.


I personally feel that a language core should be the work of a single person.

Even two people should not work together on a language core. The amount of communication required when two people work on a single language is so large that it hinders and messes up the development process.

The core set of axioms that form a language are almost as fundamental as the laws of nature to that language. Just imagine the theory of relativity being developed by several scientists together... almost impossible.



