Doug McIlroy: McCarthy Presents Lisp (paulgraham.com)
129 points by pg on Sept 25, 2009 | 61 comments

I just finished reading the LISP 1.5 Programmer's Manual (still readily available from Amazon), published in 1962. The book is mind-blowing, because it mixes discussions of higher-order functions with instructions on how to properly lay out programs on punch cards.
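The "higher-order functions" the manual discusses are the same idea we pass around today: functions that take or return other functions. A tiny Python sketch (my own illustration, not taken from the manual):

```python
# Higher-order functions: functions that accept or return functions.
# LISP 1.5's MAPLIST is essentially today's map.

def compose(f, g):
    """Return a new function that applies g, then f."""
    return lambda x: f(g(x))

square = lambda x: x * x
increment = lambda x: x + 1

# Passing a function as an argument, 1962-style:
print(list(map(square, [1, 2, 3])))    # [1, 4, 9]

# Returning a new function built from two others:
print(compose(square, increment)(4))   # (4 + 1)^2 = 25
```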

The "high-level language" which shipped on every microcomputer platform of the 1970s and 1980s was Basic. I have often wondered why. Compared to Lisp, it is inelegant and inexpressive. Perhaps the people who hacked the larger computers of the era did not interact with microcomputer hobbyists. Perhaps it was too difficult to get Lisp fast enough to be usable on the old hardware, although if it was possible on big iron in the late 1950s, it should also have been possible on microprocessors in the 1970s.

An entire generation of programmers grew up using GOTO <line number> to call subroutines and execute loops. That same generation could have grown up using lambdas. How sad.

There was a tradition of using Basic from the days of printer terminals. A language with s-expressions doesn't work well with printer terminals, and a language with line numbers does.

Lisp would also probably have overloaded the hardware of early microcomputers.

And then of course there is the fact that Basic seemed accessible, and Lisp seemed frighteningly advanced. Even Lisp hackers seem to have thought so in the 1970s.

What surprises me is the unpopularity of FORTH. The interpreter is simpler than BASIC's, and the language spans many more levels, from bit groveling to macros and function composition.
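To illustrate how little a Forth-style interpreter needs, here is a toy sketch in Python (my own rough illustration; a real Forth adds a return stack, defining words with `: ... ;`, and immediate words for compile-time "macros", but the core loop stays about this small):

```python
# Toy Forth-style interpreter: a data stack, a dictionary of words,
# and a token loop. Anything not in the dictionary is read as a number.

def forth(source, stack=None):
    stack = stack if stack is not None else []
    words = {
        "+":    lambda s: s.append(s.pop() + s.pop()),
        "*":    lambda s: s.append(s.pop() * s.pop()),
        "dup":  lambda s: s.append(s[-1]),
        "swap": lambda s: s.extend([s.pop(), s.pop()]),
        "drop": lambda s: s.pop(),
    }
    for token in source.split():
        if token in words:
            words[token](stack)      # execute a known word
        else:
            stack.append(int(token))  # push a literal number
    return stack

print(forth("2 3 + dup *"))   # (2 + 3) squared -> [25]
```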

LISP 1.5 Programmer's Manual (and others) in PDF: http://www.softwarepreservation.org/projects/LISP/book

Nice link! There are PDFs of old APL texts, too.

"An entire generation of programmers grew up using GOTO <line number> to call subroutines and execute loops."

It was not that bad. In the end, it led to Windows.

Oh wait...

Now, seriously, we used GOSUB for subroutines.

An excerpt from The Dream Machine:

"As a Lisp programmer continued to link simpler functions into more complex ones, he or she would eventually reach a point where the whole program was a function - which, of course, would also be just another list. So to execute that program, the programmer would simply give a command for the list to evaluate itself in the context of all the definitions that had gone before. And in a truly spectacular exercise in self-reference, it would do precisely that. In effect, such a list provided the purest possible embodiment of John von Neumann's original conception of a stored program: it was both data and executable code, at one and the same time.

In mathematics, the technical name for this sort of thing is recursive function theory, which was why McCarthy called his first public description of Lisp 'Recursive Functions of Symbolic Expressions and Their Computation by Machine.' Today ranked as one of the most influential documents in the history of computer languages, that paper established that a language could have a rigorous mathematical foundation. And it signified that John McCarthy had finally come up with a framework that was precise enough, rigorous enough, and compelling enough to satisfy even him."

His paper can be found here: http://www-formal.stanford.edu/jmc/recursive.pdf

I am just amazed how often his name appears in this book (the first time he is mentioned, Waldrop talks about how strange people said he was). But in my book he will always be a legend and a visionary.

I read that paper around six months ago. I found it a little dry at first, and then absolutely enlightening. It's amazing what beautiful and complex things can be built with such a small handful of functions.

I always wondered why Lisp is dead in the water. I got replies from people who never professionally coded anything in Lisp or Scheme (yeah...) - the argument was along the lines that in the past, hardware always dictated lower-level languages, and legions of people were taught the ways of everyday code that was only a few (if any) levels above the hardware. Thus, all the primitives of the age (ASM, C, Pascal, whatnot).

However, I don't think that is the main story here. It is surely a big part, but not the main one. I mean, there were LISP systems in use at the big boys' houses where money/hardware was no object. Later on, Moore's law brought the supercomputers of the past to our desktops, and we got C++, Java and Python; even Ada was formed and deployed.

So, what's the deal? My primary focus in life has been computer graphics, and my first language was C. I was born in 1980, and my early serious computing endeavors are inherently tied to the Amiga platform (SAS C and the like), so I didn't have exposure to other language paradigms at all until later.

Later on, through a marvelous turn of events, I was exposed to Scheme, and at first I didn't "get it" - but then there was this epiphany moment or two where I could see almost everything I needed in a computer language for my needs, yet it was so simple.

Not to bore you with more details - I always wanted to get into game development. A career in TV and commercials direction happened in the meantime, and now I am back to making my game development ideal a reality. My weapon of choice is D/Tango, since it really best suits my mindset. However, after seeing a PPT/PDF from Naughty Dog earlier this year about how they have utilized an embedded PLT Scheme, I came to wonder once again about maybe using Scheme as a scripting language after all.

tl;dr: Can anyone, preferably a pro Lisp or Scheme developer, give a rationale for why it isn't used more often? Nothing I've read explains why it isn't. Oh, and about the Python argument that gets thrown around - I like Python as much as the next guy, but let's not fool ourselves.


I'm not a pro Lisper, but I think reasons offered for this include:

* The AI winter (with Lisp, at the time, being seen as "the AI language")

* The heavy-cost and closed nature of the toolchain, and lack of standardization between toolchains, when industry was searching for a language to rally behind several decades ago

* The continued lack of standardization on all the "included batteries" when the open source movement was looking for languages to adopt for scripting a decade ago

* The deep semantic rabbit-hole a Lisp codebase can turn into when any macro-like functionality is used, meaning that a programmer has to basically learn every project as its own language (that is, DSL) before they can get to work—further meaning that projects grow a sense of individual stewardship/artistic direction, and become harder to work on by hiring committees and one-off contractors

* The continued outside impression of Lisps as slow, pure-functional and academic (even though most today are none of those)

* The unwillingness for most Lisps to embrace the Unix philosophy (meaning that it's easier to write a web browser in e-lisp than to loosely couple to one already written in some other language. Clojure is basically the only Lisp that escapes this, cleanly interoperating with the APIs of other languages... on the JVM, which itself throws Unix design principles out the window.)

I'm sure there are more well-founded answers, but all of those came to mind on my road to accepting Lisp.

Your point about "embrac[ing] the Unix philosophy" is a good one. Probably not the biggest factor for mainstream adoption, but significant nonetheless. While there are some Lisps now that do this well (particularly those that compile to C, such as Chicken Scheme), it has a big historical precedent.

I've been wondering how Lisp would look if it were redesigned from the ground up to include Unix's philosophy, and I'm beginning to think that Erlang would be a good candidate. (With a healthy dose of Prolog as well.)

Check out Lisp Flavored Erlang for a lisp developed on the Erlang VM. http://github.com/rvirding/lfe

Thanks! (erlog in his other projects looks good too.)

I think you presented the list very well.

Several points have been touched on already, so I will mention one: DSLs. I agree that the perception of them as a barrier to entry in a project may exist. I think it is as incorrect as tying Lisp to AI, albeit for a very different reason. Indeed, a new member of a team has to learn the DSL if one is used in the project. However, new members always have to learn the conventions and patterns used in the project they are joining. A DSL makes some of them explicit and more understandable.

Strong arguments there; I cannot even question them. Probably the two that stand out are DSLs and the impression of Lisp as slow and academic in nature.

I'll go read up on the AI winter again; it's a shame I wasn't around to witness that era. On a related note, I just got an email a few days ago for the "Third Conference on Artificial General Intelligence, AGI-10 (in Lugano, Switzerland, March 5-8, 2010)".

Can anyone, preferably a pro Lisp or Scheme developer give a rationale why it isn't used more often?

You will need to bite bullets, take responsibility, and own projects and budgets if you want to use Lisp at "work". I am doing my current project in Common Lisp, all the way down. It wasn't always like that; I started as project manager for a tiny but growing PHP codebase. The application domain - a large web application platform - is both very broad and error-prone. There are both budget and time constraints, and corporate buyers are on our ass daily.

Between conference calls and discussions with the outsourcing company that was doing our project, I decided to prototype it on the side in CL and see how far I could go. Honestly, I prototyped it just to see if I understood the application domain. The programmers were coming to me on an hourly basis asking questions, and it came down to the level where I had to spoon-feed them fully specified PHP classes, well-documented API designs and database models. These guys were just typing out my sweat and getting paid for it.

I bit the bullet, took full responsibility, looked my boss in the eye and said it was now MY project.

> I am doing my current project in Common Lisp, all the way down.

Right on. What Common Lisp Operating System did you find suitable for what you are doing?

That's a question you should ask every non-C programmer. But yeah, I run Movitz.


Chew it.

[Edit: this, of course, is a wise-ass reply to something I considered a wise-ass question. I develop on Win32 and deploy on Linux]

Sorry if my question upset you. I was being serious. You mentioned Turtles all the way down so I figured that you were using a Lisp OS.

What did you mean by Turtles all the way down if you actually aren't using a Lisp OS?

What is "Chew it"?

s/"all the way down"/"all around"/ and I didn't say "turtles" anywhere.

"Chew it", that's my most well-behaved curse word. It's the second person imperative version of "darn-it!" and "blimey!"


Sorry for the confusion:)

It's not that dead. You used it to make this comment.

Touché and haw haw, but you very well know what I meant.

No, actually, I'm not sure which of two usual misconceptions you have: (a) that Lisp = Common Lisp, or (b) that Lisp is rarely used.

Usually when people think Lisp is dead, they mean (b), but you've been here long enough to have noticed the volume of stories about Clojure on HN, so the probability of (a) and (b) seem about equal.

As for a) I was referring to Lisp as a family of computer languages - not Common Lisp or Scheme per se, so that might have come out unclear. Maybe because my native language is not English.

However, as for b) - of course I have seen the trend on HN about Lisp, so what better place to look for answers than here? But it's a safe haven, and that's it. Even if everyone on HN wrote nothing but code in one of the Lisp dialects, would that be statistically significant at all? Try to find questions for lisp, scheme, cl and variations on Stack Overflow, for example.

I know you wrote successful software and raved about Lisp - and it's cool and all, but that was back then. I also see people talking about Clojure in volumes, but I haven't seen any real-world usage of statistical significance.

However, when watching the presentation at Google from Dan Weinreb - http://www.youtube.com/watch?v=xquJvmHF3S8 - or when looking at the presentation from Naughty Dog (for their new engine): http://www.gameenginebook.com/coursemat.html you really have to think about it and say to yourself: 'Hell, why isn't this used more?'. That is all.

Clojure crossed Common Lisp on Google Trends in October 2008, then remained stable until 1.0. From then on, it kept growing at a slow but steady pace.

I guess we'll see another boom once the major libraries have stabilized as well...


The trend for Clojure is encouraging, however when comparing it with the trend for Common Lisp, one has to keep in mind that the latter is often called simply Lisp. A slight decline in the trend for Common Lisp may be related to an increase in such usage (if there is one, which I am not sure about).

Not to mention the fact that even non-lisps are getting progressively more lispy over time.

Yes, and this, I think, has so far been the largest contribution of lisp. Not too bad!

And if Clojure becomes a widely used language, so much better. :-) That's still a big "if", of course.

What you mean is that "dead" is a perception that exists only in your mind.

If you look at concerns other than language, you could easily say that "writing quality software is dead", since most programmers write low-quality programs that barely work at all. That doesn't mean you shouldn't try to do a good job, even though quality in the industry is "dead".

Don't base what you do on the actions of an ignorant majority. If you like Lisp, use Lisp. You won't be the only one.

The difference is that the industry desires quality; it just doesn't know how to achieve it. The industry does not desire Lisps (or, for that matter any language that doesn't resemble either C or BASIC.)

Mantras like "YAGNI" and "worse is better" are not indicative of wanting quality.

But my point is: ignore the industry -- it's a failure. Do what you want to do instead -- you can't do worse than the industry as a whole.

There is nothing really 'dead' about Lisp, whether you're speaking about Common Lisp or all Lisps. But it is true that most programmers would not turn to it when considering new projects.

I essentially see two problems with the state of Lisps. First off, the one with major library support, Common Lisp, is oftentimes lacking in good library support. You'll find 10 libraries to do anything, but 5 of them will be undocumented, 3 of them will be dead projects, and 2 of them will be incomplete.

The second problem is community. Again, I speak of Common Lisp, because it's the most fully-featured Lisp out there right now. Where is the community of Common Lisp? Who do I go to ask for feature requests in the 'next version'? Do I get some form of support? (I mean, c'mon, just compare it to Python!)

I think that Clojure has solved the second problem, more or less, with its BDFL. However, I don't think it has the breadth of features that CL has, and being on the JVM brings some problems of its own.

Clojure is getting there, but I think that if a group of, say, 10 people decided to 'popularize' Common Lisp, they could. Just create a distribution of Lisp (say, using sbcl or anything else cross-platform) and ship it with a few helper libraries - since Lisp has macros and reader macros, it can essentially be transformed into the clean language it sometimes isn't.

So how's CL different from Python when it comes to getting support?

When I google for a bug/question in Python, I invariably find remarks about the same Python I am using (CPython, the effective standard and single implementation, which works on all platforms equally well and which all users have rallied around).

When I google the same for whichever of the multitude of Common Lisp implementations exists out there, I have to filter down to which one, as well as which platform.

That is a tremendous difference, especially for newcomers.

As for "actual" support, given Python's focus on a single implementation (which I understand is not really possible with CL), when you need to hire some random consulting service, you don't have to do the same filtering of prospective people.

My opinion - a lot of it comes down to implementation. There are a couple of really good open source implementations, but for the most part a lot of Lisp folks prefer the commercial ones.

Anecdotally, Lisp came along so long ago that it has survived several different phases of computer software sales evolution. The phase that saw dynamic programming development tools be sold for significant sums of money altered the Lisp implementation landscape significantly, and today we have commercial implementations like Lispworks and Allegro, which while excellent, are expensive. This pulls enough talent away from open source implementations, of which there are several, that they all remain somewhat warty.

So while I use the heck out of SBCL, I must admit that I find myself occasionally envious of the libraries and attention, and especially the smoothness of integration with new technology (as opposed to new paradigms), that other dynamic languages such as Python and Ruby seem to enjoy.

Actually, there are users who were looking for alternatives to expensive or dying CL implementations. These were, and are, financing work on SBCL and now also CCL. For example, the port of CCL and its development environment to Windows is paid for. Recently at the ECLM (the European Common Lisp Meeting here in Hamburg), Dan Weinreb of ITA Software described what enhancements and robustifications of CCL (Clozure CL) were paid for by ITA for their reservation-system project. I can tell you that it was quite an impressive list.

Currently the state of implementations like SBCL and CCL is really good. There are lots of further improvements possible and necessary as the computing landscape evolves. But from a Lisp user perspective, the free implementations have never been better than today. Users should be aware of that and take advantage of that. More work on graphics, development tools, etc. is welcome, though.

> commercial implementations like Lispworks and Allegro

I understand that Clozure CL and SBCL are also good. Maybe not THAT good, but good. The thing that bothers me with that argument is: if CL, or Lisp in general, had appeal to a critical mass of programmers, it would get built - no matter whether standards or commercial implementations were the obstacle. They would come and build a new one, no?

As for the libraries - there surely is a way to make bindings to other libraries. For example, we have that kind of problem in D, but we can easily bind to C libraries, so it is not that much of a hassle.

My deep thought on all of that is what bothers me the most. Here is my theory. Is it possible that the real reason is that there are basically two types of programmers:

1. Recipe / Cargo Cult programmers that are programmers by career inertia

2. Programmers that think more about the nature of the problem and hack their mindset to the code. You know, the type that thinks more than they code - code is just a splurt at the end of the process.

If that is true, and if Lisp appeals only to the second bunch (not to all of them, of course), it might be a hypothetically disastrous indicator of the ratio in the programmer population.

Or maybe I'm just thinking too much about it all.

Hmmm... well, there are probably more than two types, but I think there are at least the two types you mentioned. The only person I really feel comfortable speaking for is myself. I choose Lisp (Common Lisp in particular, and SBCL at that) for a number of reasons, but mostly:

1. because I seem to avoid so much code that I could see myself having to write in other languages, and

2. because it gives me all this great dynamic functionality while still being extremely fast, especially compared to other languages.

Once I realized all the crazy things you can do with it while still running fast, it's been difficult to imagine using anything else where I have a choice.

"The phase that saw dynamic programming development tools be sold for significant sums of money altered the Lisp implementation landscape significantly, and today we have commercial implementations like Lispworks and Allegro, which while excellent, are expensive. This pulls enough talent away from open source implementations, of which there are several, that they all remain somewhat warty."

This assumes lisp = Common Lisp.

Clojure for example is entirely Open Source and a delight to code in.

Good point. Bad assumption on my part.

Clojure is not 'warty'? Can it even dump an image?

> My primary focus in life has been computer graphics, and my first language was C. I was born in 1980

Ah, you've come a bit too late. From the late '80s through the early '90s, Lisp machines were popular in the CG industry; modeling, animation and even rendering software were written in pure Lisp. After Lisp machines were overtaken by Unix workstations, those tools were ported to Common Lisp, and as far as I know they kept some share until the late '90s, used in quite a few popular titles on consoles such as the PlayStation or Nintendo 64.

Why have they gone? I think the reason is a compound of many factors, and I've only seen the transition from one corner of the industry; other people have different opinions. One thing I suspect is that the Lisp camp didn't have enough resources to keep up with the steep increase in demanded quality and quantity as graphics hardware quickly evolved.

Back in the mid-'90s I used 2k- to 5k-poly models, made in the modeler written in Lisp, to make a demo game that ran on big SGI machines; such poly counts became the norm in the late stage of the PlayStation, and when the PlayStation 2 appeared the poly count increased by an order of magnitude, a trend which still keeps going. The architecture of the Lisp modeling tool back then wasn't suitable for "next generation" graphics; it needed to be rewritten, but they had a hard time doing so. Meanwhile, bigger players entered the industry, using legions of developers to pump out authoring tools and middleware.

I believe Lisp can boost productivity, but it tends to work better in a small team tackling one hard problem. When tons of features and optimizations done by lots of developers are required, I guess the power of the language probably matters relatively less, and the amount of circulating money matters more. (Naughty Dog was probably an exception. The founder was a Lisp guy, an incredibly good one.)

I suspect that it may be a general trend. When a problem is very hard and only a small group of enthusiasts is working on it, they choose the tools that are most effective for themselves. Like explorers who go into the wild where nobody has ever traveled. Lisp may shine in such circumstances.

Once the wilderness is roughly mapped, small towns are built, and dirt roads are created, much broader development is required. Lots of developers come to the front and start expanding the envelope. Once the field enters that stage, the choice of language reflects the proportions of general language share; many use C/C++, for example, and Lisp becomes a minority. (I don't intend to despise the developers in that stage; there are still hard problems, and they are still doing incredibly cool things. It's just that the earliest stage, where even what's hard is not really clear and only extremely ambitious people are active, may have different demographics in terms of the choice of language. I guess Viaweb was also one such case.)

There was a time when most uses of a high-level language in domains like graphics were replaced with C++. C++ quickly became the dominant language in graphics and simulation, and it still is. One also saw a lot of special hardware for which it was difficult to come up with higher-level-language implementations (the PlayStation 3, with its Cell processor, is such an example). One would have needed experts to port languages like Lisp to all kinds of new hardware - the expertise and the demand were just not there. So with every new piece of exotic hardware, the trend to just use C/C++ with some simple scripting component accelerated.

In some other areas there was the trend to standardize on Java. Though Java failed (mostly) over time in graphics and simulation.

Still some people use Lisp in graphics related domains. For example CoCreate uses Common Lisp for its CAD package. Recently Lisp has seen increasing use by hobbyists and enthusiasts to write games (using SDL etc.).

Yup. I think there was a demand (for a better language), but supplying the necessary toolchain in Lisp required too many resources. In the PlayStation 2 era, Naughty Dog got an advantage from their custom compiler - that's what I heard in their GDC talk. But not everybody could afford those resources. I used Scheme in a part of the toolchain in production but never managed to push the code into the actual shipped game.

I am still trying to use Lisp in these domains. A small piece of Scheme code is running on the server side of one of the metaverse applications currently in beta.

Naughty Dog used Allegro Common Lisp for their development environment - to implement some kind of Scheme dialect for the PlayStation 2. Then they were bought by Sony, and for the PlayStation 3 they used C++. As I understand it, they wanted to share code and technology with other parts of Sony working on games - and those were not using Lisp but C++, like most of the industry. From reading some of their latest presentations, I get the impression that this did not work out the way they had thought. So Naughty Dog is back using Lisp, this time using Scheme as part of the toolchain.

Xach recently wrote about the use of Lisp in Games:


But there is more, I remember for example that there is a relatively large online game that uses Common Lisp: neuroarena.com - a tutorial video is at http://www.youtube.com/watch?v=XFzP6Shxbbs .

Thanks for the pointers. Encouraging.

The thing is, as TY pointed out, Nichimen - I read that essay from pg a while ago about how Lisp gave them a competitive advantage... and if you look at Nichimen (a spinoff from Symbolics), their path was almost the same. There were SubDs before in our industry (Catmull-Clark, LightWave MetaNURBS that fori made), but when Nichimen put SubDs into Mirai (and later Nendo) it caused a revolution in 3D, especially modeling. Prior to that we were churning out NURBS for detailed characters and objects, some ventured into poly-by-poly modeling, and when Mirai came along (even if not a lot of people used it) it was suddenly a box-modeling revolution. The stars were aligned for them, though: they had a system built on top of Franz Allegro, Bay Raitt was there to give them input (the guy who modeled Gollum - look at the workflow that changed the minds of modelers back then: http://www.youtube.com/watch?v=ubgvomRTW80 ), and there was a stale air around other software packages.

This was still the era when PA/Maya was coming out, SI|XSI was on the horizon, and packages cost $15K+. Bad management ruined Nichimen.

There is a common motif among Nichimen, Naughty Dog, even Viaweb - they all had a system that let them react and develop new stuff at levels and speeds no one could have competed against. And if you look at the talk from Dan Weinreb, he is basically saying the same thing about them: http://www.youtube.com/watch?v=xquJvmHF3S8

My memory is fading, so I'm not sure Mirai brought SubD first... did it? I was on the Final Fantasy 7 project, where we used Nichimen almost exclusively for real-time models; that was 1995-1997. Then I went onto the Final Fantasy the Movie project, which adopted Maya (a beta back then). We started off with NURBS but soon decided to shift to poly models, sometime in 1998, IIRC. I don't quite remember whether we jumped right to SubD or first used layered shapedrive (a low-poly cage morphs a high-poly model).

I do remember I cursed a lot about Maya's scripting interface and C++ API, and wished to go back to the old days when I could invoke a REPL in Nichimen and hack almost everything in it. I ended up making a Maya plugin that allowed Scheme to be the scripting interface instead of MEL, and after that I used Scheme almost exclusively for scripting Maya, but it wasn't as close as the feel in Nichimen, where you could open the hood and mess around in its guts.

> My memory is fading so I'm not sure Mirai brought SubD first... did it?

I'm sure it wasn't the first one. LW had MetaNURBS before that, and Catmull had a paper before that too - which is funny, since Pixar's RAT incorporated SubDs only later.

What Nichimen did was show the world a new way of doing things; that's how we got things like connect poly shape in Maya and other tools which didn't yet have proper SubD support.

Funny you mentioned working on FF the Movie (I still have a nice rejection letter for being too young :) ). I still reference that Kilauea parallel renderer paper now and then; it was a real feat. I also remember reading somewhere how you guys had only a handful of shaders in your pipeline - the first time I heard of someone actually making and using ubershaders in production.

Nichimen's Mirai tech was bought by Softimage, and later some of the features made it into XSI (poly bridge and other cool stuff).

Maya is still IMO a great platform, you can extend it to whatever you wish.

Mirai used to be one of the high-end 3D animation and modeling tools. If my memory does not deceive me, it was written in a Lisp dialect.

Wikipedia (http://en.wikipedia.org/wiki/Mirai_%28software%29) tells me that it

  "traces its lineage to the S-Geometry software from Symbolics"

Unfortunately, it seems that it has slipped into oblivion since 2004. The last high profile project it was used in was one of the Lord of The Rings films...

Yeah, Nichimen was a spinoff from Symbolics, later turned into IZware and doomed by management. However, Mirai and Nendo were software packages that revolutionized 3D workflow when they came out. They brought into the mainstream what we now call box modeling. Bay Raitt (the Gollum modeler; he is at Valve now, I think) had a lot to do with it. It was not the first software to offer such a workflow, but it was the most prominent one.

You may want to consider using Lua for game scripting. It's a lot more strongly influenced by Scheme than e.g. Python, and has been designed from day one to be embedded as a scripting language for existing C/C++ projects. It's quite mature, widely used in the game industry, and has been in production use for over fifteen years.


It's easy to get people to line up behind ASP.Net. That's the Microsoft way, like it or lump it.

It's easy to get people to line up behind PHP. That's the popular choice.

It's easy to get people to line up behind Java. That's the non-Microsoft business option. It's the popular business choice.

It's not easy to get people who are dissatisfied with someone else's standards - who are all individually looking for the best mix of features they like and want, with minimal amounts of the things they dislike, and who are raised in a society of quiet agreement and noisy disagreement - to line up behind any particular standard.

I am not a Lisp user at the moment, but I worked on a project before with NAG IRIS Explorer, which is a scientific data visualization tool. The scripting language in the system is Scheme-based, from around 1993. (I think it is a dialect of Scheme that is close to GNU Guile.)

I think Scheme/Lisp is used inside a lot of big systems, such as AutoCAD, Emacs, IRIS Explorer and probably more. One of the pioneers of computer graphics, Ken Perlin, wrote a shading language for his own renderer in Scheme. But unless you use his renderer, you would never know he put a Scheme interpreter inside to process his shading language.

It occurred to me recently that 1959 is the 50th anniversary... - did you/he mean 2009 is the 50th anniversary? The talk presumably was in 1959.

*It turned out you could do it: I programmed it for the IBM 704.*

Anyone else find this comment motivating? It kind of sums up a moment of reflection I guess I somehow aspire to. Find an insanely difficult challenge that you deem important enough to devote a significant amount of time and energy to, and then one day: "It turns out you could do it: I programmed it for the ${preferred architecture}." Thanks, Doug^H^H^H^H hold on - didn't Steve Russell implement McCarthy's original Lisp on the IBM 704, not Doug McIlroy? What am I missing that pulled me away from my moment of zen here?

I took him to mean that he had programmed a subroutine that could call itself, but only as a curiosity, and that it was McCarthy's talk that made it clear for the first time what was important about recursion.

Speaking of Steve Russell, does anybody know what became of him? It's extraordinary that he wrote both the first Lisp interpreter and the first computer game.

And continuations, apparently. As of 2002, he was working in embedded systems for Nohau Corporation: http://www.nytimes.com/2002/02/28/technology/a-long-time-ago...

That's a really interesting article about the guys who created Spacewar. I hadn't realized, for example, that there were six of them. Anyway, you should post it in its own right.

Ok, I see that now. Thanks.

Not sure about Steve Russell, but I agree that is rather extraordinary.

