Ah, you've come a bit too late. In the late '80s through early '90s, Lisp machines were popular in the CG industry; modeling, animation, and even rendering software were written in pure Lisp. After Lisp machines were supplanted by Unix workstations, those tools were ported to Common Lisp and, as far as I know, kept some share until the late '90s, used in quite a few popular titles on consoles such as the PlayStation and the Nintendo 64.
Why did they go away? I think the reason is a compound of many factors, and I've only seen the transition from one corner of the industry; other people will have different opinions. One thing I suspect is that the Lisp camp didn't have enough resources to keep up with the steep increase in demanded quality and quantity as graphics hardware quickly evolved.
Back in the mid-'90s I used 2k- to 5k-poly models, made in a modeler written in Lisp, to make a demo game that ran on big SGI machines; such poly counts became the norm in the late stage of the PlayStation, and when the PlayStation 2 appeared the poly count increased by an order of magnitude, a trend that still continues. The architecture of the Lisp modeling tool back then wasn't suitable for "next generation" graphics; it needed to be rewritten, but they had a hard time doing so. Meanwhile, bigger players entered the industry, using legions of developers to pump out authoring tools and middleware.
I believe Lisp can boost productivity, but it tends to work better for a small team tackling one hard problem. When tons of features and optimizations done by lots of developers are required, I guess the power of the language probably matters relatively less, and the amount of circulating money matters more. (Naughty Dog was probably an exception. The founder was a Lisp guy, an incredibly good one.)
I suspect this may be a general trend. When a problem is very hard and only a small group of enthusiasts is working on it, they choose the tools that are most effective for themselves, like explorers heading into wilderness where nobody has ever traveled. Lisp may shine in such circumstances.
Once the wilderness is roughly mapped, small towns are built, and dirt roads are laid, then much broader development is required. Lots of developers come to the frontier and start pushing the envelope. Once a field enters that stage, the choice of language roughly reflects each language's general share; many use C/C++, for example, and Lisp becomes a minority.
(I don't mean to disparage the developers in that stage; there are still hard problems, and they are still doing incredibly cool things. It's just that the earliest stage, when even what's hard isn't really clear and only extremely ambitious people are active, may have different demographics in terms of language choice. I guess Viaweb was also one such case.)
In some other areas there was a trend to standardize on Java, though Java (mostly) failed over time in graphics and simulation.
Some people still use Lisp in graphics-related domains. For example, CoCreate uses Common Lisp for its CAD package. Recently Lisp has seen increasing use by hobbyists and enthusiasts writing games (using SDL, etc.).
I am still trying to use Lisp in these domains. A small piece of Scheme code is running on the server side of one of the metaverse applications currently in beta.
Xach recently wrote about the use of Lisp in Games:
But there is more: I remember, for example, that there is a relatively large online game that uses Common Lisp, neuroarena.com; a tutorial video is at http://www.youtube.com/watch?v=XFzP6Shxbbs .
This was still the era when PA/Maya was coming out, SI|XSI was on the horizon, and packages cost $15K+. Bad management ruined Nichimen.
There is a common motif among Nichimen, Naughty Dog, and even Viaweb: they all had a system that let them react and develop new stuff at a level and speed no one could have competed against. And if you look at the talk by Dan Weinreb, he is basically saying the same thing about them: http://www.youtube.com/watch?v=xquJvmHF3S8
I do remember cursing a lot about Maya's scripting interface and C++ API, and wishing I could go back to the old days when I could invoke a REPL in Nichimen and hack almost everything in it. I ended up making a Maya plugin that allowed Scheme to be used as the scripting interface instead of MEL, and after that I used Scheme almost exclusively for scripting Maya, but it never came close to the feel of Nichimen, where you could open the hood and mess around with its guts.
I'm sure it wasn't the first one. LW had MetaNURBS before that, and Catmull had published the paper before that as well, which is funny, since Pixar's RAT incorporated SubDs only later.
What Nichimen did was show the world a new way of doing things; that's how we got things like connect poly shape in Maya and in other tools that didn't yet have proper SubD support.
Funny you mention working on FF the movie (I still have a nice rejection letter for being too young :) ); I still reference that Kilauea parallel renderer paper now and then. It was a real feat. I also remember reading somewhere that you guys had only a handful of shaders in your pipeline; that was the first time I'd heard of someone actually making and using ubershaders in production.
Nichimen Mirai's tech was bought by Softimage, and later some of its features made it into XSI (poly bridge and other cool stuff).
Maya is still, IMO, a great platform; you can extend it to do whatever you wish.
Wikipedia (http://en.wikipedia.org/wiki/Mirai_%28software%29) tells me that it
"traces its lineage to the S-Geometry software from Symbolics"