Woo: Fast HTTP Server in Common Lisp (github.com)
100 points by momo-reina on Dec 27, 2014 | 40 comments

What is interesting here is to compare this one man's effort, built on "ancient tools" well researched by two generations of old-school programmers, with the amateurish hype and nonsense around Node (the author publicly stated on his blog that he had no experience with server-side development on a POSIX system), and with Go, a minimalist approach carefully crafted by rigorously trained professionals. It is not just about KBs and milliseconds.

Btw, this "Hello world" example mostly tells you how efficiently the memory allocation and I/O layers are implemented (which is why no one has beaten nginx yet).

A much more interesting comparison would be some "real-world scenario": say, implement an HTTP lookup over a public data set imported into persistent local storage such as PostgreSQL, and then compare not just throughput but also resource usage.

> Btw, this "Hello world" example mostly tells you how efficiently the memory allocation and I/O layers are implemented (which is why no one has beaten nginx yet).

Actually, at least one other guy beat nginx just the other day -- I don't remember the name; it was some Japanese-made server.

I think you might be thinking of Kazu Yamamoto and Mighttpd2? (Written in Haskell, incidentally.) I think that was the web server that beat out nginx on a largish number of cores (42?). For the life of me I can't find the original paper with the comparison graph, otherwise I'd link to it...

Anyway, a lot of the performance work that went into Mighttpd2 is automatically shared by all Haskell applications. (Most of it went into optimizing the I/O manager in the GHC runtime, which handles I/O for all Haskell programs. It's sort of like libev/libuv, only at the runtime level instead of as a library.)

I don't think that was the "Hey, look! Haskell beats nginx!" one, but very interesting nonetheless! :)

It doesn't really matter how well the Lisp version is implemented, because in the end people will choose the language with modern libraries and support.

You can be as edgy as you want and use Lisp, but real-world scenarios take economic factors into account. And surprise, surprise: no company gives a shit about the beauty of a solution.

It's not really written in Common Lisp.

"built on top of libev"

The Lisp part is a thin FFI wrapper on top of lots of C code.

Not that an FFI wrapper around C code isn't useful, but it's the C code that does all the heavy lifting there.

And libev is a C wrapper around epoll and other system calls. There's not much code needed to use epoll directly; that's what John Fremlin's TPD2 web server (the previous "world's fastest Common Lisp web server") did:


What libev provides is a very portable wrapper around all the different non-blocking I/O system calls across the different Unixes. That's the hard part.

Also note that HTTP parsing is done completely in Common Lisp, not in C. Woo uses fast-http, which is by the same author.


Indeed, Node.js is a thin FFI wrapper on top of lots of C (and C++) code that does all the heavy lifting, but it is still useful. I think Woo is useful in the same way.

The title implies "the heavy lifting" is written in Lisp.

Every Node user knows the networking library Node.js uses isn't written in JavaScript. And most also know that for Node to be useful, scripts should do as little as possible and delegate any serious operation to C/C++.

> The title implies "the heavy lifting" is written in Lisp.

Woo uses fast-http[0], quri[1], and http-body[2], all written in pure Common Lisp and very, very fast.

[0]: https://github.com/fukamachi/fast-http

[1]: https://github.com/fukamachi/quri

[2]: https://github.com/fukamachi/http-body

It makes me really happy to see projects like this for Common Lisp. I'm still relatively new to the Lisp world, but I have fallen in love with the language and the development process.

I see so many 'blazing fast' HTTP servers, in every language imaginable. Does anyone write a slow HTTP server any more?

The real challenge is in writing a useful, usable server which still stays fast under load. In contrast, you have to be writing terrible coding horrors for your home-grown static-file web server NOT to run at wire speed :)

(No offence to the writers of this particular server, I haven't looked at the code)

The reason people care about speed here is the reason people care about speed in all infrastructure projects: every millisecond of the timeslice spent doing work inside the transport is a millisecond you're not accomplishing application logic. Efficient http code lowers distributed system latency and saves power.

When I see someone doing benchmarks like this against localhost, I get skeptical. The client and server contending for the same machine's resources might very well skew the results.

Lately I've begun to study information security in more detail. Isn't the "code is data" philosophy of Common Lisp a nightmare for a web server?

Classically, "code is data" is used upfront, in creating macros that manipulate Lisp code as data, before sending the result to the interpreter or compiler like normal functions and data. Ideally you develop your own Domain Specific Language that provides power in expression and execution for what you're trying to do.

As implied in other answers in this subthread, nowhere in this paradigm does random input data get treated as something that can safely be executed. READ on Lisp Machines is mentioned because reader macros like #. allow data in the form being read to be evaluated, which in this domain is an obvious big no-no.
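To make the #. point concrete, here is a small sketch in standard Common Lisp (the attack payload name is made up) showing read-time evaluation and the standard way to disable it:

```lisp
;; By default *READ-EVAL* is T, so READ will happily execute code
;; embedded in its input via the #. reader macro:
(read-from-string "#.(+ 1 2)")
;; => 3 -- the form was evaluated at read time, not merely parsed.

;; Binding *READ-EVAL* to NIL makes the reader signal a READER-ERROR
;; on #. instead, which is the defense when reading untrusted data:
(handler-case
    (let ((*read-eval* nil))
      (read-from-string "#.(nuke-the-disk)"))  ; hypothetical payload
  (reader-error () :rejected))
;; => :REJECTED
```

Any server that passes network input to READ without binding *READ-EVAL* to NIL is handing eval to the attacker.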

As early as 1983, I think earlier, it was recognized that things like eval servers were a bad idea if accessible by the outside world.

So, I am really just a beginner. To me it seems attackers might find a way to break out of a particular data-processing step and execute arbitrary Lisp code.

If you say, this can't be so, you may well be right...

As I understand it, unless you're using a Lisp reader macro that calls eval, or your code calls eval, and you allow arbitrary input into that code while running (vs. loading/compiling), it's just not going to happen.

No. Interpreting untrusted data as code can be pretty dangerous. Lisp is usually about interpreting trusted data as code, which is no problem.

The web server isn't calling "eval," is it?

As a fun aside: The Symbolics network code called READ on untrusted data on more than one occasion.

Have you seen that code?

I just searched through sys:network; and sys:ip-tcp; in Genera 8.3 and the only use of READ was in an Eval server.

I distinctly recall, c. 1983, that 'read' was called as part of the NFILE protocol.

The reason I noticed was that I had created a package that did not use 'lisp:', and therefore 'nil' was no longer 'lisp:nil', which broke the NFILE client.

Maybe this got fixed sometime between then and 8.3. (Or maybe NFILE was ripped out altogether?)

I also recall that Jeff Schiller found some remote vulnerabilities. I don't know if they involved 'read', but they certainly could have. Again, this was long before 8.3.

Are you sure? There is a READ command in NFILE, but there are no obvious calls to READ.

Symbols could have been created or looked up by other means...

I read it in a tweet somewhere.

Then it must be true.



I don't think it is competing with nginx. It seems to be more of a Node.js/Go competitor.

How did you generate those pretty graphs?

When doing a speed benchmark, I consider the memory profile to be as relevant as the number of requests per second.

Why not use libuv right away?

See https://github.com/orthecreedence/wookie, a fast app server that uses libuv (very much in the same vein as Woo).

One of the greatest barriers to Lisp for me is the lack of free IDEs and better tooling. Having cut my teeth programming in VB, I generally find any language that doesn't come with a GUI builder and a half-decent IDE with IntelliSense out of the box to be dismal (scripting languages being the exception). I really hope someone improves Lisp's tooling. I strongly believe in the mantra that you should not do something the computer can do better, and smart IntelliSense-style autocompletion is much better than having to figure out the intricacies of emacs/vim/<insert your favourite editor here>. I really like the syntax of Lisp; it would be really sad if I couldn't be productive with it. :(

Yes, there is a lack of non-commercial, purpose-built Lisp IDEs that give a great out-of-the-box experience.

However, don't dismiss SLIME because of Emacs. It contains all the nice IDE functions: symbol completion, attaching to (and editing!) long-running processes, function tracing, stack traces, find-references, etc.

Access to a REPL in a running system image is an amazing and productive experience.

Download Allegro CL if you want a GUI builder.

SLIME doesn't have a GUI builder, but it is pretty good tooling: out of the box it has autocompletion for the symbols available in your image, xref (who calls this function, who sets this variable, etc.), a menu for managing threads, an object inspector, and much more.

For Clojure there's a plugin (albeit a very big one) for IntelliJ IDEA called Cursive that I've heard a lot of nice things about: https://cursiveclojure.com/

Have you tried Allegro CL or LispWorks? They come with IDEs.
