
A startup's "Why we use Lisp" story - zachbeane
http://article.gmane.org/gmane.lisp.lispworks.general/9675
======
mojuba
Lisp is a beautiful language, but I think its biggest problem is that its
proponents fail to explain its merits. I'm sorry, but this post would probably
have made more sense 15 years ago - definitely not now.

> (a) Very fast development that is enabled by CL (e.g., _everything_ from
> hash tables to string-operators to memory management is automatically
> included - there is nothing that is not included).

Name a modern mainstream language that doesn't have these things.

> (b) Excellent programming environments - e.g., parentheses-savvy editor.

You haven't seen Xcode, Delphi or MS Visual Studio, where, for example, you
can jump to the definition of a symbol with "one click" and do interactive
step-by-step debugging with variable watches, disassembly, stack traces, etc. -
I shouldn't really have to name all the things that are possible in a typical
modern IDE. And I don't know of any text editor that isn't paren-savvy.

> (c) Excellent compiler, especially with declarations, enables very fast
> code.

A compiler which doesn't "enable very fast code" has no place under the sun
nowadays.
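
For reference, the "declarations" that quote refers to look like this in
Common Lisp. A minimal, untested sketch (the dot-product example is mine, not
from the thread):

    
    
        ;; With type and optimize declarations, a good CL compiler can
        ;; compile this loop down to straight-line float code.
        (defun dot (xs ys)
          (declare (type (simple-array double-float (*)) xs ys)
                   (optimize (speed 3) (safety 0)))
          (loop for x across xs
                for y across ys
                sum (* x y) of-type double-float))
    

Try (disassemble 'dot) with and without the declarations to see the difference.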

> (d) Excellent system stability with no random crashes at all.

Very exciting, although GC-based languages (i.e. those that usually lack raw
pointers) shouldn't crash at all - and if they do, that's a shame. The
stability and robustness of your compiler and runtime system shouldn't really
be counted as a merit: if a system doesn't meet stability standards, it
shouldn't be released.

> (e) Macros and all that.

Finally getting to the point, and you say "and all that"? Btw, "all that"
includes the unification of code and data - something no other language
provides, let's say, idiomatically. This is an amazing feature, and in fact
Lisp macros are Lisp macros thanks to just that: the unification of code and
data and the symbolic nature of the language.
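
To make that concrete - a minimal sketch of code-as-data (unless* is a made-up
name, to avoid shadowing cl:unless):

    
    
        ;; A form is just a list, so a program can inspect it like any data:
        (defparameter *code* '(+ 1 2)) ; a list: a symbol and two numbers
        (first *code*)                 ; => +
        (eval *code*)                  ; => 3
    
        ;; ...and a macro is an ordinary function from code (lists) to code:
        (defmacro unless* (test &body body)
          `(if ,test nil (progn ,@body)))
    
        (macroexpand-1 '(unless* (> x 0) (print "non-positive")))
        ;; => (IF (> X 0) NIL (PROGN (PRINT "non-positive")))
    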

Memory footprint: megabytes do matter, because of the CPU cache. A 30,000 LOC
program should take a few megabytes at most and fit entirely in a modern CPU's
cache. Compared to a 50 MB program, the performance gain can be enormous.

~~~
philwelch
"You haven't seen XCode, Delphi or MS Visual Studio, where, for example, you
can jump to the definition of a symbol with "one click""

That's an old feature. It used to be called "ctags" and even console-based
text editors support it.

"Finally getting to the point and you say "and all that"? Btw, "all that"
includes unification of code and data - something no other language provides,
let's say, idiomatically. This is an amazing feature, and in fact Lisp macros
are Lisp macros thanks to just that - unification of code and data and
symbolic nature of the language."

Lisp's problem seems to be that, until you know how to use macros and
code/data unification, you can't be easily convinced of their merits. It takes
a considerable commitment to learn Lisp before you can reach that level,
though.
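
For anyone on the fence, here is a taste - a small, hypothetical with-timing
macro (the name and details are mine, purely illustrative) that wraps any body
of code, something most languages would need compiler support for:

    
    
        (defmacro with-timing (&body body)
          "Run BODY, printing elapsed wall-clock seconds."
          (let ((start (gensym "START")))
            `(let ((,start (get-internal-real-time)))
               (multiple-value-prog1 (progn ,@body)
                 (format t "~&elapsed: ~,3F s~%"
                         (/ (- (get-internal-real-time) ,start)
                            internal-time-units-per-second))))))
    
        ;; Usage: (with-timing (sleep 0.5)) prints "elapsed: 0.500 s"
    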

~~~
lg
Yeah, SLIME does this, and without ctags - I believe most Lisps keep track of
the location of definitions in source.

But what's a definition? SLIME does this for defun/defvar etc., but not for
things defined with, say, Hunchentoot's define-easy-handler. Can you tell your
IDE to take you to that location in a single keystroke? Maybe SLIME supports
adding definition types, I have no idea. Or maybe it throws up its hands
because of the potential naming conflicts.

All this makes you wonder whether Lisp is too powerful for any IDE to keep up.
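
For reference, this is roughly what such a definition looks like (adapted from
the Hunchentoot documentation) - one form both defines a function and
registers a URI dispatcher, which is why a plain defun-finder misses it:

    
    
        ;; Adapted from the Hunchentoot docs: one form, two effects.
        (hunchentoot:define-easy-handler (say-yo :uri "/yo") (name)
          (setf (hunchentoot:content-type*) "text/plain")
          (format nil "Hey~@[ ~A~]!" name))
    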

~~~
lispm
LispWorks does that. You can add your own ways of recording source locations
for your definition forms.

[http://www.lispworks.com/documentation/lw60/LW/html/lw-60.ht...](http://www.lispworks.com/documentation/lw60/LW/html/lw-60.htm)
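
Roughly like this - a sketch based on the dspec chapter linked above; treat
the exact lambda list of dspec:define-form-parser as approximate:

    
    
        ;; Sketch: teach the IDE that define-easy-handler names the thing
        ;; in its first argument, so "find source" can jump to it.
        (dspec:define-form-parser define-easy-handler (description &rest args)
          (declare (ignore args))
          `(define-easy-handler ,(if (consp description)
                                     (first description)
                                     description)))
    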

------
gaius
This is the key point:

 _I started programming in LISP way back in 1971 on a Univac 1108 mainframe
and also implemented a 68000-based Lisp system (~50K lines of real-time
assembly) for mobile-robotics use in 1983 - and so know my way around the
language._

All the Lisp (or Smalltalk, or...) success stories I've read hinge on someone
with an enormous amount of experience with the language. I'd argue that someone
with that much experience could get the job done in (almost) any language. I'm
surprised that such a person would put the success down to language choice
rather than deep knowledge of the problem domain.

~~~
olavk
That is close to saying that all languages are equivalent given the same
amount of experience. I don't buy that. For a specific task some languages may
be better suited / more productive than other languages, even given equivalent
levels of experience.

I don't believe in a global linear scale of language power ("the blub
theory"), but I do believe that some languages may be better than others for a
specific task given specific constraints.

E.g. if I have equivalent levels of experience in C++ and Python, I'm pretty
sure I can write a small webapp quicker in Python. OTOH if the languages are
very similar, like Python and Ruby, the level of experience is much more
important than the relative strengths and weaknesses of the languages.

Of course, it is seldom that you get that kind of fair comparison between two
languages - usually everyone has a favorite language they know better than any
other.

~~~
coliveira
Somewhere I read that differences between languages could account for at most
a 30% improvement in development speed. That is an epsilon compared to the
differences between programmers.

Another issue is that, if you are talking about small-to-medium-sized
applications, then there clearly is a difference between languages. For
example, it is pretty clear that writing a script is easier in Perl than in C,
or that writing a medium-sized expert system is easier in LISP than in Pascal.

However, if you consider large-scale applications (100k+ LOC), then I don't
believe there is any difference between writing in C, C++, Java, or Common
LISP, as long as the programmer(s) have deep experience with the language
used.

Just notice that the language is not going to solve the large-scale problem by
itself. As long as the language has tools for creating abstractions, the code
will have about the same complexity no matter what. Whether that complexity is
encapsulated in simple concepts (structs and functions) or higher-level
concepts (closures and continuations) depends on the taste of the developers
and the language used.
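
To illustrate the contrast - a minimal counter, once as a closure and once as
a struct plus a function (illustrative names, not from the thread):

    
    
        ;; "Higher level": a closure bundles state and behavior in one value.
        (defun make-counter ()
          (let ((n 0))
            (lambda () (incf n))))
    
        ;; Usage: (defparameter *c* (make-counter))
        ;;        (funcall *c*) => 1, (funcall *c*) => 2
    
        ;; "Simple": roughly the same thing as a struct plus a function.
        (defstruct ctr (n 0))
        (defun ctr-next (c) (incf (ctr-n c)))
    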

~~~
loup-vaillant
30% is an understatement. Think of the productivity gain when you go from
assembly to C. Now, consider the fact (I do mean _fact_ ) that the jump from
assembly to C and the jump from C to LISP are comparable.

The choice of abstraction _does_ matter. If you use weak ones, your
productivity takes a serious hit: your program will be bigger, more complex,
and have more errors (squared).

C++ abstractions, for instance, are incredibly weak. Take the function
abstraction, which isn't even complete: you have no way to write anonymous
functions the way you write literal integers[1]. Higher level concepts, as you
call them, aren't more complicated than the "simple" ones. Often, they are
just less familiar and more consistent.

[1]: Anonymous functions should actually be called "literal functions":

    
    
        (fun x -> 7 * x + 42) -- a literal function
        357                   -- a literal integer
        2 + 3                 -- expression which yields an integer
        f . g                 -- expression which yields a function
    

Nothing "high order" about that. This is just acknowledging that functions are
mathematical objects like any other.
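
The same table in Common Lisp, for what it's worth (f and g stand for any two
functions):

    
    
        (lambda (x) (+ (* 7 x) 42))   ; a literal function
        357                           ; a literal integer
        (+ 2 3)                       ; expression which yields an integer
        (lambda (x) (f (g x)))        ; expression which yields a function
    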

~~~
drunkpotato
Funny, I went to a talk today on statistical methods for opinion analysis, and
in the annotated corpus presented, the only opinion word that was used more
often in a subjective frame than an objective frame is the word "fact".

------
zck
> (d) Excellent system stability with no random crashes at all.

This is interesting, considering that one of the main reasons Reddit switched
from Lisp to Python was that it crashed often.

~~~
lispm
that was another Lisp implementation

------
skilldrick
I don't understand why a Lisp hacker wouldn't match parens properly when
ending a parenthetical statement with a smiley:

    
    
      (commentless, of course :)
    

Hasn't he seen xkcd: <http://xkcd.com/541/> ?

~~~
mbrubeck
If you do it the double-chinned XKCD way, it messes up your auto-balancing
text editor.

~~~
loup-vaillant
…which you obviously use to write your e-mails.

~~~
BrandonM
Apparently you don't know anyone who spends 90% of their working time in
emacs.

~~~
loup-vaillant
I do. :-(whoops, I don't : he uses the _other_ editor).

------
vii
Seems to me this discussion is missing one of the main points of the original
post: a massive plug for an unfairly under-appreciated book:

    
    
      > "Let Over Lambda" 
      > (which is really quite scary to read - I can't say that I understand 
      > 100% of it - maybe 60% and I am very happy with that level of 
      > comprehension) -- you end up with an enormously powerful set of 
      > programming tools unlike anything else out there.
    

I really like this book too and recommend it.

<http://letoverlambda.com/index.cl/toc>

------
zandorg
A friend of mine keeps asking me to dump Lisp and "Get modern" with C#, and I
try to explain why I prefer Lisp, but he won't accept it.

It was Paul Graham's essay that encouraged me to try Lisp in 2005.

~~~
gridspy
I'm just dreading the day that Microsoft decides that C# isn't modern enough
and they want to sell a whole bunch of seats of Visual Studio (insert new name
here).

~~~
icey
They release new versions of Visual Studio every few years, and C# is updated
almost as regularly. C# / .NET 4.0 will be released this year (as will Visual
Studio 2010).

The differences between C# 1.0 and 4.0 are _enormous_. Microsoft isn't shy
about making changes to the language.

------
idlewords
"Excellent system stability with no random crashes at all"

This holds for pretty much any language you care to use.

~~~
aharrison
People keep saying that, and I can understand where they're coming from, but
it isn't the case. Just yesterday I managed to segfault Python 2.6.something.
I have no idea why (I really should have figured it out), but my hypothesis is
that I freaked out the parser. It was vanilla Python code; it should never
segfault. I have had the same experience with the JVM: I once segfaulted the
parser by feeding it a static string that was hundreds of lines long.
Curiously, Eclipse "compiled" it just fine.

Sometimes environments have bugs. If you need an underlying runtime that
simply will not crash, using something tried and true may be at the top of
your priority list. That might be overkill for most projects, but I can see a
guy wanting something so robust that he knows it's his fault when it breaks.

~~~
idlewords
So based on your experience (as a comp sci student, I'm assuming, based on
your other comment), Python and the JVM are both 'crashy', but you don't
happen to remember exactly why?

Reminds me of a bunch of compiler bugs I found back in the day. Strangely, the
longer I program, the fewer of these compiler bugs I seem to find.

~~~
aharrison
I am going to defend myself. Then I am going to blow my argument out of the
water.

Backstory: I am a fifth year comp sci student. I have worked part-time as an
SDE for most of my college career. The python crash I referenced earlier was
in class, the java problem at work. I have written an optimizing compiler
before (C subset to LLVM to SPARC MIPS), so I hope that you will at least
agree that I am not a complete moron in this area.

Java story, more detail: to make a really long story short, I wanted to move a
really, really long stored procedure (~2k LOC) into a JDBC query. Don't ask
why; it was a very scary legacy issue. So I copied and pasted the stored
procedure into one long const string in Eclipse, and it compiled just fine.
Then I ran our Ant script against it, and it exploded with a stack overflow
(IIRC). The solution I found was simply to replace each newline with \n" +
followed by an actual newline - i.e., breaking the literal into a
concatenation of shorter strings. No more stack overflow on compilation. From
this I could only assume that I had fubared the compiler (which would have
been a reasonable way to fubar it, seeing as I was trying to create an
absolutely massive const string directly).

Now we could quibble over whether this even counts as a crash in the sense we
were talking about, but the underlying premise is: you never want to have to
work around your tools. As computer scientists we do it a lot, but it is never
fun, and it's worse when something goes down in production because the tool
crapped out. Theoretically, a VM or OS should never fail and bring the entire
system down. A compiler should never outright crash.

Now to debunk my own defense: I just sat down for about an hour and did
everything I could to reproduce the bug in JDK 1.5.6 (the original JDK I broke
it on). Well, go figure, I can't get it to break in the hypothetical way I
wanted to. I might one day do exactly what I did with the stored procedure,
but setting that _particular_ environment up again would take quite some time.

In conclusion, you can assume I was an idiot because I can't show you the
code. I would in your shoes. :)

P.S. This all assumes you are using tools given to you by the platform itself.
Using JNI to dereference a NULL doesn't count. :D

~~~
pvg
That's not a 'random crash.' The compiler ran out of stack space trying to
compile your file. Stack space is not infinite; a typical Java VM comes with
some default, which can also be reconfigured. After running out, the compiler
told you what the error condition was and exited.

~~~
lispm
Why not extend the stack at runtime?

LispWorks:

    
    
        CL-USER 101 > (defun foo (n) (unless (zerop n) (cons n (foo (1- n)))))
        FOO
    
        CL-USER 102 > (foo 1000)
    
        Stack overflow (stack size 15998).
          1 (continue) Extend stack by 50%.
          2 Extend stack by 300%.
          3 (abort) Return to level 0.
          4 Return to top loop level 0.
    
        Type :b for backtrace or :c <option number> to proceed.
        Type :bug-form "<subject>" for a bug report template or :? for other options.
    
         CL-USER 103 : 1 > 
    

Now use restart 1 or 2.

------
motters
Apparently iRobot also uses Lisp in an embedded context.

~~~
jjwiseman
iRobot used to do embedded Lisp (back when they called themselves IS
Robotics), but I haven't heard anything indicating that they still do. See
<http://lemonodor.com/archives/2004/08/l_mars_venus.html>

They had L, their Lisp dialect, and Mars, the macro layer for doing robotics
in L.

The paper "L - A Common Lisp for Embedded Systems" is available at
<http://www.cs.cmu.edu/~chuck/pubpg/luv95.pdf>

~~~
motters
The last I heard, Rod Brooks said all the PackBot software was written in Lisp.

