
> My primary focus in life has been computer graphics, and my first language was C. I was born in 1980

Ah, you've come a bit too late. From the late 80s through the early 90s, Lisp machines were popular in the CG industry; modeling, animation, and even rendering software were written in pure Lisp. After Lisp machines were displaced by Unix workstations, those tools were ported to Common Lisp, and as far as I know they kept some share until the late 90s, used in quite a few popular titles on consoles such as the PlayStation and Nintendo 64.

Why did they go away? I think the reason is a compound of many factors, and I've only seen the transition from one corner of the industry; other people will have different opinions. One thing I suspect is that the Lisp camp didn't have the resources to keep up with the steep increase in demanded quality and quantity as graphics hardware quickly evolved.

Back in the mid-90s I used 2k to 5k-poly models, made in a modeler written in Lisp, to make a demo game that ran on big SGI machines; such poly counts became the norm late in the PlayStation's life, and when the PlayStation 2 appeared the poly count increased by an order of magnitude, a trend that still keeps going. The architecture of the Lisp modeling tool back then wasn't suitable for "next generation" graphics; it needed to be rewritten, but they had a hard time doing so. Meanwhile, bigger players entered the industry, using legions of developers to pump out authoring tools and middleware.

I believe Lisp can boost productivity, but it tends to work better in a small team tackling one hard problem. When tons of features and optimizations done by lots of developers are required, I guess the power of the language matters relatively less, and the amount of circulating money matters more. (Naughty Dog was probably an exception. The founder was a Lisp guy, an incredibly good one.)

I suspect this may be a general trend. When a problem is very hard and only a small group of enthusiasts is working on it, they choose the tools that are most effective for themselves, like explorers going into wilderness where nobody has traveled. Lisp may shine in such circumstances.

Once the wilderness is roughly mapped, small towns are built, and dirt roads are laid, much broader development is required. Lots of developers come to the front and start expanding the envelope. Once the field enters that stage, the choice of language reflects the general distribution of language share; many use C/C++, for example, and Lisp becomes a minority. (I don't mean to disparage the developers in that stage; there are still hard problems and they are still doing incredibly cool things. It's just that the earliest stage, where even what's hard is not yet clear and only extremely ambitious people are active, may have different demographics in terms of language choice. I guess Viaweb was also one such case.)




There was a time when most uses of a high-level language in domains like graphics were replaced with C++. C++ quickly became the dominant language in graphics and simulation. It still is. One also saw a lot of special hardware for which it was difficult to come up with higher-level language implementations (the PlayStation 3 is such an example, with its Cell processor). One would have needed experts to port languages like Lisp to all kinds of new hardware - the expertise and the demand were just not there. So with every new piece of exotic hardware, the trend of just using C/C++ with some simple scripting component accelerated.

In some other areas there was a trend to standardize on Java, though Java (mostly) failed over time in graphics and simulation.

Still, some people use Lisp in graphics-related domains. For example, CoCreate uses Common Lisp for its CAD package. Recently Lisp has seen increasing use by hobbyists and enthusiasts writing games (using SDL etc.).


Yup. I think there was demand (for a better language), but supplying the necessary toolchain in Lisp required too many resources. In the PlayStation 2 era, Naughty Dog got an advantage from their custom compiler, or so I heard in their GDC talk. But not everybody could afford those resources. I used Scheme in part of a production toolchain but never managed to push the code into an actual shipped game.

I am still trying to use Lisp in these domains. A small piece of Scheme code is running on the server side of one of the metaverse applications currently in beta.


Naughty Dog used Allegro Common Lisp as their development environment, to implement a Scheme-like dialect (GOAL) for the PlayStation 2. Then they were bought by Sony, and for the PlayStation 3 they used C++. As I understand it, they wanted to share code and technology with other parts of Sony working on games - and those were not using Lisp but C++, like most of the industry. From reading some of their latest presentations I get the impression that this did not work out the way they had thought. So Naughty Dog is back to using Lisp, this time with Scheme as part of the toolchain.

Xach recently wrote about the use of Lisp in games:

http://xach.livejournal.com/229485.html

But there is more. I remember, for example, a relatively large online game that uses Common Lisp: neuroarena.com - a tutorial video is at http://www.youtube.com/watch?v=XFzP6Shxbbs .


Thanks for the pointers. Encouraging.


The thing is, as TY pointed out with Nichimen - I read that essay from pg a while ago about how Lisp gave them a competitive advantage... and if you look at Nichimen (a spinoff from Symbolics), their path was almost the same. There were SubDs in our industry before (Catmull-Clark, the LightWave MetaNURBS that fori made), but when Nichimen put SubDs into Mirai (and later Nendo) it caused a revolution in 3D, especially in modeling. Prior to that we were churning out NURBS for detailed characters and objects, and some ventured into poly-by-poly modeling, but when Mirai came along (even if not a lot of people used it) it was suddenly a box-modeling revolution. The stars were aligned for them, though: they had a system built on top of Franz Allegro, Bay Raitt was there to give them input (the guy who modeled Gollum - look at the workflow that changed modelers' minds back then: http://www.youtube.com/watch?v=ubgvomRTW80 ), and there was a stale air around the other software packages.

This was still the era when PA/Maya was coming out, SI|XSI was on the horizon, and packages cost $15K+. Bad management ruined Nichimen.

There is a common motif among Nichimen, Naughty Dog, even Viaweb - they all had a system that let them react and develop new stuff at a level and speed no one could have competed against. And if you look at the talk from Dan Weinreb, he is basically saying the same thing about them: http://www.youtube.com/watch?v=xquJvmHF3S8


My memory is fading so I'm not sure whether Mirai brought SubD first... did it? I was on the Final Fantasy 7 project, where we used Nichimen almost exclusively for real-time models; that was 1995-1997. Then I moved to the Final Fantasy movie project, which adopted Maya (a beta back then). We started off with NURBS but soon decided to shift to poly models, sometime in 1998, iirc. I don't quite remember whether we jumped right to SubD or first used a layered shape drive (a low-poly cage morphing a high-poly model).
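(In case "shape drive" is unfamiliar: the idea is to bind each dense-mesh vertex to the low-poly cage once, in the rest pose, and then let the animated cage drag the dense mesh along. Below is a crude rigid-offset sketch of that binding with invented names; real tools used far smarter bindings than nearest-vertex-plus-offset.)

  // Hypothetical sketch: a low-poly cage drives a high-poly mesh.
  // Bind once in the rest pose, then replay the cage's motion each frame.
  #include <cstddef>
  #include <vector>

  struct V3 { float x, y, z; };
  static V3 sub(V3 a, V3 b) { return V3{a.x - b.x, a.y - b.y, a.z - b.z}; }
  static V3 add(V3 a, V3 b) { return V3{a.x + b.x, a.y + b.y, a.z + b.z}; }
  static float dist2(V3 a, V3 b) { V3 d = sub(a, b); return d.x*d.x + d.y*d.y + d.z*d.z; }

  struct Binding { std::size_t cageIdx; V3 offset; };  // nearest cage vertex + rest offset

  // Bind each dense vertex to its nearest cage vertex (rest pose of both meshes).
  std::vector<Binding> bindToCage(const std::vector<V3>& dense, const std::vector<V3>& cage) {
      std::vector<Binding> b(dense.size());
      for (std::size_t i = 0; i < dense.size(); ++i) {
          std::size_t best = 0;
          for (std::size_t j = 1; j < cage.size(); ++j)
              if (dist2(dense[i], cage[j]) < dist2(dense[i], cage[best])) best = j;
          b[i] = Binding{best, sub(dense[i], cage[best])};
      }
      return b;
  }

  // Each frame: move every dense vertex rigidly along with its cage vertex.
  void deform(std::vector<V3>& dense, const std::vector<V3>& cage,
              const std::vector<Binding>& b) {
      for (std::size_t i = 0; i < dense.size(); ++i)
          dense[i] = add(cage[b[i].cageIdx], b[i].offset);
  }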

I do remember cursing a lot at Maya's scripting interface and C++ API, and wishing to go back to the old days when I could invoke a REPL in Nichimen and hack almost everything in it. I ended up making a Maya plugin that allowed Scheme to be the scripting interface instead of MEL, and after that I used Scheme almost exclusively for scripting Maya, but it never came close to the feel of Nichimen, where you could open the hood and mess around with its guts.
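(For the curious, the skeleton of such a bridge is small. This is a hypothetical sketch, not the plugin described above: the MPxCommand/MFnPlugin entry points are the real Maya C++ API, but scheme_init and scheme_eval_string are placeholders for whatever embedded interpreter you link in, and the command name is made up.)

  // schemeEvalCmd.cpp - hypothetical sketch of a Scheme bridge for Maya.
  #include <maya/MArgList.h>
  #include <maya/MFnPlugin.h>
  #include <maya/MGlobal.h>
  #include <maya/MPxCommand.h>
  #include <maya/MString.h>

  extern void        scheme_init();                      // assumed: start the interpreter
  extern const char* scheme_eval_string(const char* s);  // assumed: eval, return printed result

  class SchemeEvalCmd : public MPxCommand {
  public:
      static void* creator() { return new SchemeEvalCmd; }
      virtual MStatus doIt(const MArgList& args) {
          if (args.length() != 1) {
              MGlobal::displayError("schemeEval: expected one string argument");
              return MS::kFailure;
          }
          // Hand the form to the interpreter; surface its printed result
          // as the command's return value.
          MString form = args.asString(0);
          setResult(MString(scheme_eval_string(form.asChar())));
          return MS::kSuccess;
      }
  };

  MStatus initializePlugin(MObject obj) {
      MFnPlugin plugin(obj, "example", "1.0", "Any");
      scheme_init();  // one interpreter per Maya session
      return plugin.registerCommand("schemeEval", SchemeEvalCmd::creator);
  }

  MStatus uninitializePlugin(MObject obj) {
      MFnPlugin plugin(obj);
      return plugin.deregisterCommand("schemeEval");
  }

Once loaded, calling schemeEval "(+ 1 2)" from the script editor would hand the form to the interpreter and return its printed result.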


> My memory is fading so I'm not sure whether Mirai brought SubD first... did it?

I'm sure it wasn't the first. LW had MetaNURBS before that, and Catmull had the paper long before that too - which is funny, since Pixar's RAT incorporated SubDs only later.
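(For anyone who hasn't seen the actual rules: the refinement step from that Catmull-Clark paper fits in three formulas. This is just the textbook formulation for a closed mesh, quoted for context, nothing specific to LW or Mirai.)

  f  = \frac{1}{k} \sum_{i=1}^{k} v_i        % face point: centroid of the face's k vertices
  e  = \tfrac{1}{4} (v_1 + v_2 + f_1 + f_2)  % edge point: the two endpoints plus the two adjacent face points
  v' = \frac{Q + 2R + (n-3)\,v}{n}           % moved vertex of valence n; Q averages the adjacent face
                                             % points, R the midpoints of the incident edges

New quads then connect each moved vertex to its surrounding edge and face points, which is why one step of the scheme turns any mesh into an all-quad mesh.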

What Nichimen did was show the world a new way of doing things; that's how we got things like Connect Poly Shape in Maya and other tools that didn't yet have proper SubD support.

Funny you mention working on FF the movie (I still have a nice rejection letter for being too young :) ). I still reference that Kilauea parallel renderer paper now and then. It was a real feat. I also remember reading somewhere that you guys had only a handful of shaders in your pipeline - the first time I'd heard of someone actually making and using ubershaders in production.

Nichimen's Mirai tech was bought by Softimage, and later some of its features made it into XSI (poly bridge and other cool stuff).

Maya is still IMO a great platform; you can extend it however you wish.


Mirai used to be one of the high-end 3D animation and modeling tools. If my memory does not deceive me, it was written in a Lisp dialect.

Wikipedia (http://en.wikipedia.org/wiki/Mirai_%28software%29) tells me that it

  "traces its lineage to the S-Geometry software from Symbolics"

Unfortunately, it seems to have slipped into oblivion since 2004. The last high-profile project it was used in was one of The Lord of the Rings films...


Yeah, Nichimen was a spinoff from Symbolics, later turned into IZware and doomed by management. However, Mirai and Nendo were packages that revolutionized the 3D workflow when they came out. They brought into the mainstream what we now call box modeling. Bay Raitt (the Gollum modeler; he's at Valve now, I think) had a lot to do with it. It was not the first software to offer such a workflow, but it was the most prominent one.



