Lisp as an alternative to Java (2000) (norvig.com)
156 points by azhenley 7 days ago | hide | past | favorite | 93 comments





I think the lesson to learn from Java was that we, developers, were evaluating languages on the wrong parameters.

Java is too verbose! (But that can reduce the need for so many comments, and comments can often mislead as software changes. Also, Java 8 was 2014 and it changed the game massively.)

Java runs slowly because of the JVM! (But the JVM evolves and improves, giving free upgrades to all Java programs over time. Also, I have a multi-GHz processor now.)

Java takes longer to write! (But the first time you write a program is never the hard part; it's the continuous evolution of the software that matters.)

Norvig's code is beautiful and very good Lisp. But it's so dense that it requires more lines of explanatory comments than actual functionality. The code's functionality isn't self-evident. And sure, you can write Java code that has the same flaws, but I find it's easier to write readable Java code than it is to write readable Lisp.

I think Python's success is further evidence of this perspective.


> Norvig's code is beautiful and very good Lisp. But it's so dense that it requires more lines of explanatory comments than actual functionality. The code's functionality isn't self-evident. And sure, you can write Java code that has the same flaws, but I find it's easier to write readable Java code than it is to write readable Lisp.

I mostly agree with this in general, but in my opinion, this kind of stuff gets exacerbated with programmers like Peter Norvig.

I was a really early MOOC student, and I was taking Udacity's first programming courses. I was a senior in college and already had plenty of coursework and multiple jobs/internships under my belt when I took their "Introduction to Programming" course. It was extremely easy for me.

The next programming course in their curriculum was "Design of Computer Programs," which is a course by Peter Norvig. When it was released, it had the difficulty set at "Beginner." While people were still going through the first iteration of the course, it got changed to "Advanced."

The problem, in my opinion, was the code. Even though it was Python, it was always extremely dense and required an insane amount of thought in order to comprehend what was going on. His variable and function names were also often very short and not descriptive enough.

I was able to get through the course, but it was a lot harder than I'd like to admit. As a mildly more seasoned programmer (I'm turning 30 soon), I would not think highly of a coworker's abilities if they wrote code like this. You definitely need to be smart to write the code the way Norvig does, but it takes a certain skill to write code that other people can understand, and I don't think he has that skill, regardless of the programming language.

All that is to say is that it's probably not just Lisp's fault, at least in this case.


> You definitely need to be smart to write the code the way Norvig does, but it takes a certain skill to write code that other people can understand

Just as functionality is relative to a purpose, readability is relative to an audience; Edmund Spenser is difficult for most of us to read today, and Dan Brown is incomprehensible if you speak only Chinese. The code of Norvig's that I've read (if we include the comments) is very readable to me, but I'm not a senior in college. (Without the comments I think I would have a terrible time.)

If I'm writing code with people like Norvig as an audience, I'll write it differently than if my audience consists of relative novices.


which means if you're working with other people in a business, a whole class of programmers won't be able to work well with your code. I think this makes the OP's point for them.

Yes, that's always the case. Whenever you're writing, your writing is more accessible to some audiences than to others, according to how familiar the vocabulary and conceptual framework you're using are to them. So it's important to know what audience you're writing for so you can tailor your style to be maximally accessible to them.

It's tempting to think that you can organize kinds of writing on a one-dimensional continuum of levels from "more recondite" to "less recondite", with people being able to read anything at or (more easily) below their level, but it's totally false. Yesterday I went to check out some power usage statistics in the datasheet for a microcontroller I wanted to use for a project. To my dismay, it said, and I quote, "电流消耗是多种参数和因素的综合指标,这些参数和因素包括工作电压、环境温度、I/O引脚的负载、产品的软件配置、工作频率、I/O 脚的翻转速率、程序在存储器中的位置以及执行的代码等。" ("Current consumption is a composite indicator of many parameters and factors, including operating voltage, ambient temperature, the load on the I/O pins, the product's software configuration, operating frequency, the toggle rate of the I/O pins, the location of the program in memory, the code being executed, and so on.")

I mean, that's surely understandable to many more people than what I'm writing here. But I'm not one of them. (Fortunately, in this case the data tables mostly answered my questions.)


As usual, I think it depends on the situation. Certain parts of Java's verbosity are just a net loss. However, some of those pain points have been addressed, and they're working on addressing others.

For example:

    public class Point {
        private final int x;
        private final int y;

        public Point(int x, int y) {
            this.x = x;
            this.y = y;
        }

        public int getX() {
            return x;
        }

        public int getY() {
            return y;
        }
    }
I don't think this level of verbosity helps readability (at least not for anyone familiar with the language). Luckily, there's Lombok (and I think a recent or upcoming Java version is adding support for record syntax):

    @lombok.Value
    public class Point {
        int x;
        int y;
    }

You can go even further with Java 16, which came out last month.

    record Point(int x, int y) { }
No Lombok needed. That is the same code, but it also generates equals, hashCode, and toString methods.

https://openjdk.java.net/jeps/395
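For the curious, a minimal sketch of what that one-line record buys you (the class and field names are just this example's own; requires Java 16+):

```java
// Records (JEP 395, final in Java 16) auto-generate a constructor,
// accessors, equals, hashCode, and toString from the header alone.
record Point(int x, int y) { }

public class RecordDemo {
    public static void main(String[] args) {
        Point a = new Point(1, 2);
        Point b = new Point(1, 2);

        System.out.println(a.equals(b));  // true: value-based equality
        System.out.println(a.x());        // 1: accessor is x(), not getX()
        System.out.println(a);            // Point[x=1, y=2]
    }
}
```

Note that the generated accessors follow the `x()` convention rather than the JavaBeans `getX()` style, which can matter for frameworks that reflect over getters.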


I don't think this is a good example, because it's easy to read in any case. The readability of the easy parts is not what matters - it's the readability of the hard parts, since that's where you'll spend 95+% of your attention.

My point is that adding that level of verbosity for something that should be simple makes it really hard to find the important parts of the code.

Boilerplate obfuscates.


Take a step up the enlightenment ladder and put a type dynamic language on top of java. Lombok is silly and makes your object’s serialization capabilities a dice roll. Keep annotations to a minimum and bring in a type dynamic language (groovy, kotlin, etc)

I find static languages much easier for large code bases. Also, can you code dynamic features with java's reflection API?

https://docs.oracle.com/javase/tutorial/reflect/index.html
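As a tiny sketch of the kind of dynamic behavior plain reflection gives you (standard library only; the method name being a runtime value is the illustrative part):

```java
import java.lang.reflect.Method;

// Look up and invoke a method chosen at runtime by name: the kind of
// dynamic dispatch you'd otherwise reach for a dynamic language to get.
public class ReflectDemo {
    public static void main(String[] args) throws Exception {
        String methodName = "toUpperCase";  // could come from config or user input
        Method m = String.class.getMethod(methodName);
        Object result = m.invoke("hello");
        System.out.println(result);  // HELLO
    }
}
```

The trade-off is the usual one: reflective calls skip compile-time checking and are slower than direct invocation, which is part of why statically typed code bases tend to use this sparingly.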


Kotlin is statically typed. It has full type inference, though (AFAIK).

Kotlin is statically typed. Perhaps you mean Clojure.

I've used plenty of dynamic languages before, but I don't see how that would help in this case.

The point is that I wanted statically typed records, which a dynamic language won't give me. Java has added syntax for defining records now, which is great, but I'm not (yet) able to use the latest version.

> makes your object’s serialization capabilities a dice roll

Not sure how? The effect of @Value is documented.


Yes, groovy + @CompileStatic is comprehensively better than java with little to no performance loss.

Exactly.... all the benefits of dynamic, with all the benefits of static compilation.

I have no problem whatsoever with groovy, but I heard “all the parts Java programmers hate about JavaScript with all the part JavaScript developers hate about Java”, and I had to reread your reply to realize you are not writing something like this :D


C# solves this well with auto-properties. Java records will be a nice addition.

Records were introduced in Java 14 [1]. An immutable DTO, with minimal boilerplate, that still behaves like a class. Example:

record Point(int x, int y) {}

[1] https://blogs.oracle.com/javamagazine/records-come-to-java


They were a preview feature until Java 16, though, so they probably haven't seen wide use yet.

https://openjdk.java.net/jeps/395

Java 16 was only released within the last few weeks! (Gradle still has some trouble compiling it or running on it; Gradle 7 should be dropping soon with explicit Java 16 support.)


> Norvig's code is beautiful and very good Lisp. But it's so dense that it requires more lines of explanatory comments than actual functionality.

I've thought the same. Just look at the comment for the `print-translations` function. Like yeah, the code is short, and I am sure it didn't take Peter long to write it: he's one of the best out there after all. But if this were real software, how long until the comment becomes out-of-sync with the implementation, due to negligence or simple laziness? Is it not safer, and a better use of time, to write code that itself reads more like that comment, which I'd guess would be the case with Java, and even more so with Python?


> but I find it's easier to write readable Java code than it is to write readable Lisp.

I think that depends a bit on the style. Pattern-heavy, "clean" Java is a mess of boilerplate that makes it very hard to find where things actually happen, let alone get an "idea at a glance" of what the code is about. Concise code might be harder to get started with, but in the end tends to be easier (for me, at least) to grok.

In any case, the main difference here, beyond the languages, is Norvig being a genius.


There's more to Java than the language: It's a complete ideology.

The boilerplate, the insistence on explicit pattern naming, and the comically long identifiers are all attempts at making boring enterprise code easy to read and modify by mass-produced, offshored "enterprise coders", UML certs and all.

Try putting someone of Norvig's caliber on a team of these "enterprise coders" and watch him run circles around them. Of course, none of them could ever hope to keep up. But good luck hiring a lot of Norvigs.


The nice thing about concise, highly performant, and most importantly, highly documented code is that it gives you the best of all worlds: it lets the computers do what you need as fast as possible, while also letting programmers get their learn on and actually deepen their understanding of the language they work with because the code comes with its own textbook.

You need all three, though: take any of those aspects away, and you end up with something that's going to get ripped out when the next programmer needs to maintain the codebase, either because it's wastefully verbose, needlessly underperformant, or because no one can understand what the hell all these "clever tricks" (usually based on an understanding of computer science rather than software development; the two haven't been the same thing for decades now) do.


Agree with this. My central belief about code is that you read it more than you write it, so legibility and simplicity are usually the most important features. That's one of the reasons I really liked Java. These days I feel that Kotlin achieves similar goals, arguably a little better than Java does.

I still think we just don't appreciate how much of Java's success came down to actual marketing and corporate sponsorship.

If curious, related past threads (some with comments by norvig! https://news.ycombinator.com/posts?id=norvig):

Lisp as an Alternative to Java – Faster, More Productive - https://news.ycombinator.com/item?id=21899854 - Dec 2019 (1 comment)

Lisp as an Alternative to Java (1999) - https://news.ycombinator.com/item?id=12197131 - July 2016 (103 comments)

Lisp as an Alternative to Java (2000) [pdf] - https://news.ycombinator.com/item?id=9045574 - Feb 2015 (21 comments)

Lisp as an alternative to Java (2000) [pdf] - https://news.ycombinator.com/item?id=8446368 - Oct 2014 (55 comments)

Lisp as an Alternative to Java - https://news.ycombinator.com/item?id=3447101 - Jan 2012 (37 comments)

Python as an Alternative to Lisp or Java, Peter Norvig revisited - https://news.ycombinator.com/item?id=2032743 - Dec 2010 (95 comments)

Ask PG: Lisp vs Python (2010) - https://news.ycombinator.com/item?id=1803351 - Oct 2010 (192 comments)

Lisp as an Alternative to Java [PDF] - https://news.ycombinator.com/item?id=61320 - Sept 2007 (9 comments)


Worth noting that the last link is to a different article: it is in fact the original study that Norvig's post was based on.

Oh yes! I remember ;)

Will add 'related' to my comment above.


I have been using Clojure as a solo developer for about two years now, and there is definitely a productivity increase. It always feels like the amount of time required to do the next thing is incrementally decreasing (like O(log n)?).

While all that is fun, Clojure is still a very enterprise ecosystem, where participants don't share as much elementary code as in the Python ecosystem. Participants are also quite experienced, so I have to sit down, read through, and make architectural decisions for the entirety of the project. When we add this thinking time into the measurement, the time taken for a Clojure 'project' (not a piece of code) is definitely more than for Python or Java (IMO).


I have no hands-on experience with Clojure but it always appears to me that the language manages to get even complex computations done in relatively little code. This is because Clojure offers powerful abstractions, and this contributes to getting things done rather quickly once you've found the right way of representing data.

However, it also makes me wonder if this advantage for writing code might later on turn into a shortcoming for reading, i.e., understanding code - either someone else's or your own code six months later.

Complex computations that are highly compressed through the use of powerful abstractions seem to lean towards puzzle solving when you're trying to understand code that you're not already familiar with. Am I wrong?

You mention Python and Java at the end of your post, and I think Python at least definitely has a mindset where readability is valued as a means for greater accessibility / maintainability.


> However, it also makes me wonder if this advantage for writing code might later on turn into a shortcoming for reading, i.e., understanding code - either someone else's or your own code six months later.

Well, you're always at the mercy of whoever wrote the code, but I will just say that in my experience (as an intermediate Clojure developer, I learned it in 2018 and have been doing it professionally ever since) reading other people's Clojure code is quite easy, since basically every line of code builds on the same core abstractions (mainly the seq abstraction) and uses the same handful of functions from Clojure core (the standard library). And obviously it's mostly pure functions and immutable data, so you get the benefit of being able to isolate the code and test it out in the REPL.

The main advantage of Clojure's particular data-oriented style is that there are no classes and associated methods to learn. Clojure basically defaults to using data literals for pretty much everything and the same custom is respected by most of the popular libraries (even many of the Java and JS wrappers). That's also part of why the code is pretty simple to read and why you tend to use the same few functions and macros for absolutely everything you do: you're literally just manipulating the same few kinds of data structures all the time.


For my part, I am rather disillusioned with data-oriented programming. I admit I haven't spent a lot of time using Clojure professionally, but, in a recent experience of having to learn a large pre-existing codebase, I found that the difference between, "everything is a map," and, "the application's entire data model is a big indistinguishable ball of mud," seems to be commenting discipline. And commenting discipline is always terrible.

Officially, by the book, you're supposed to use data access functions to give everything distinguishable names and keep it clean. What I ran into is that some nice language features for writing code quickly and tersely, such as map destructuring, actively discourage you from doing that. And without that, the difference between a map full of data and a class is that a class has a single file you can read to find out what's in it, while a map may have been built up in a completely ad-hoc manner.

I think the code maintenance story may have actually been a little bit better back when I was using lisp, because lists. It's actively painful to use raw functions like cdadr to unpack your data structures. Whereas assoc-in is a fun toy and encapsulating it so you don't get to use it as much would be a bummer.


> "the application's entire data model is a big indistinguishable ball of mud," seems to be commenting discipline. And commenting discipline is always terrible.

From Rich Hickey's "History of Clojure"[1]:

Not all is rosy. Users perennially report dissatisfaction with the error reporting from the compiler and various macros. They are confused by Java’s stack traces. They face challenges maintaining and understanding, e.g., the data requirements of a codebase when a discipline around documentation has not been maintained.

Hickey is signaling here that "commenting discipline" is a must for working with Clojure.

[1] https://download.clojure.org/papers/clojure-hopl-iv-final.pd...


It's like the difference between having to use something like MongoDB vs. Postgres: in MongoDB you have great flexibility but must have very high commenting discipline; in Postgres, less so.

When doing Python I find myself using namedtuple all over the place - I neither want nor need the ceremony of a class, don't like the laxity of a map, and want to be able to see what the fields are at a glance.

Not really. Having more powerful ways to express yourself doesn't make it harder to read code. There isn't some cosmic "power corrupts" system of karma at work. Power just makes the code more powerful.

The issues I've had with Clojure are the small community leading to questionable documentation, and supporting libraries feeling a little underdone once you're off the beaten track. There is also the radically different style of programming (which is also the biggest plus). But the power of the abstractions isn't a problem; it just means there is less to read. If anything, reading the source code of libraries becomes more feasible, because often libraries are more about instantiating an idea than about writing lots of code.


> Having more powerful ways to express yourself doesn't make it harder to read code

Well, it depends. For example, reading my older Haskell code is definitely harder than reading my Java code. Also, only a handful of teams are lucky enough to have only good programmers. There is always someone who sees some great advanced concept and applies it without the necessary know-how about the dangers/context of that feature. I think Haskell, Clojure, and Scala are all somewhat prone to this.


Given the choice between undocumented Java code and undocumented Clojure code... I'd rather take my chances with Java.

I've had similar issues with immature libraries, to the point where nowadays I will usually end up using the Java libraries and writing my own wrapper if one isn't Clojurey enough for my liking. Much as I dislike Java as a language, I don't think anyone would dispute that it has pretty excellent library support, so I figure there's no reason I shouldn't exploit that fact from Clojure.

I think this is a danger with any powerful language. In order to aid readability, abstractions need to be chosen, or designed, to be intuitive to readers and to align with their understanding of the domain. Sometimes people can be overly determined to decrease the verbosity of their code, and after spending enough time immersed in it, almost any detectable pattern can start to feel "intuitive." Using these patterns to compress the code can feel like a process of discovery and innovation to the person doing the writing, but if the abstractions are not intuitive for readers, it has the same effect on readability as gzipping a text file. Patterns are found, verbosity is decreased, but readers are not aided by the abstractions and must mentally decompress the code in order to understand it.

In my own day-to-day work, I see this issue with Scala programmers (myself included) who suffer from a tendency to see any kind of struggle with code as a valuable learning process. All of us got to where we are, slinging around monads in a "hard" language, because we have an appetite to expand our mental repertoire and a tendency to lean into difficulty. Selectively applied, this is a wonderful attitude to have towards learning programming. It is a counterproductive attitude to have towards your own codebase, though. In your own codebase, you have to flip your assumptions on their head and assume that if code is difficult to read, then more work should have been put into writing it.


> However, it also makes me wonder if this advantage for writing code might later on turn into a shortcoming for reading, i.e., understanding code - either someone else's or your own code six months later.

I understand your concern if you're used to old-style PHP or Perl, which many people call "write-only" because it's incredibly fast to create code in, but a nightmare to maintain.

That said, I think Clojure is a bit different, since the entirety of the language is designed to make it easier to actually build abstractions. It's trivial to break things into smaller functions and compose them, it's easy to compose stream-based computations, and even a fairly compositional concurrent system isn't too hard once you've gotten used to core.async.

Very often when I write Clojure, I do a "quick and dirty" version of whatever I'm trying to do (giant functions, everything in one file, single-letter variable names, etc), just to get something working, and yet I still find it fairly straightforward to read, and also fairly straightforward to make non-gross later when I have more time. Most of the time refactoring really is as simple as "cut and paste".

EDIT, Cont'd:

There are definitely exceptions to this. Occasionally people new to the language will find out about macros and try and build a bunch of custom constructs that are incredibly hard to debug. There's definitely an art to figuring out the best time to use a macro (one that I still haven't mastered, if I'm being honest), because abusing that feature definitely can lead to problems of maintainability.


Clojure code is more dense, but once you stop trying to read as many LOC per minute as you would in other languages, you are fine. Also, the increased use of pure functions and immutable data structures makes it easier to reason about code.

I have always heard that Haskell has fewer LOC than something like C++. I was quite skeptical when someone told me that if we count words instead, they are pretty comparable, but based on a few projects, it is absolutely true. I haven't tested this for Clojure, but I wouldn't be surprised if it held here as well (maybe somewhat less so, due to not having type names?).

You mean the steep learning curve and the lack of "starter pack" type frameworks (for lack of a better description) are holding you back from being more productive with Clojure?

I’ve been doing Clojure professionally for over 5 years now and it’s definitely a known issue that Clojure is very expert-friendly.


Getting the hang of Clojure as a language is quite easy; my colleague and I took only two weeks to jump into things. Even today, writing a specific piece of code is many times faster than we can manage in Python.

However, the lack of general 'all-purpose' libraries for numerous use cases means we have to implement that API or library in a general way (if only for our use case) and then integrate it into the system.

This approach has worked quite well; creating our own template code (as everything is functional) helps reuse across many projects. However, the initial investment of time/labour is something we would like to avoid for the sake of finishing the work fast.


Are you counting Java libs as part of that ecosystem? It seems odd to describe the JVM as lacking all purpose libraries.

Java libs are definitely helpful; the time I mentioned includes wrapping object-oriented code into functions. This is easy if all that is required is one or two methods from the Java side; however, for more than immediate usage, it feels like building a library just because I need to use it.

In the Python ecosystem, that labour has already been done by someone else, preferably someone who didn't start writing a library just to deliver code within a week.

One good thing about writing libraries in Clojure is the immense stability and composability it offers. We can use the same code many years later without fearing breakage, and a 'primitive' function can easily be squeezed into numerous other functions, which means reusability is quite high.

However, it is still an impediment for shops with fewer than 5 devs.


So the issue is that you want some sort of bindings rather than directly invoking the Java API. I'm curious: isn't that the sort of thing macros are supposedly good at? Couldn't the bindings be auto-generated to a large extent?

Being expert-friendly is not a problem per se for a language. What matters is the ratio between sufficient wage and productivity increase as compared to other languages. By "sufficient wage" I mean paying your experts enough so that they become less rare.

I wouldn't mind paying someone 3 times as much if their expertise in the given language means they can be 10x more productive than an average programmer in an average context. (I personally have measured a 30x increase in productivity by switching from Ruby to Clojure and have observed the same phenomenon with other people.)


My experience has been the same. I do enjoy Clojure very much, but I've spent hours and even days just searching for examples of how to do X or Y, and most of the sources are not beginner-friendly at all.

If it weren't for my huge desire to use Datomic, I think I'd switch to some other language. On the plus side, I've learned a lot.


As an advanced Clojure programmer I never look for examples on the web, which is not the case for other languages.

The reason behind this is that since Clojure is data-oriented, API interfaces are clear, minimal and self-documenting. With object orientation or anything that relies on datatypes, I always end up browsing docs looking for what's possible to do with the given list of methods. Never in Clojure.


Yep, I think readily usable examples are so bountiful on the Python side because people tend to use it for smaller self-contained things (more scripting, data ETL code, CLI utilities). Plus the focus on Jupyter notebooks and on teaching programming, often as a first programming language.

There are a bunch of grown-up, largish open source Clojure apps up on GitHub, though, and pretty good discussion forums where people bounce ideas off each other: Slack, Zulip, the official forum, r/clojure, etc. And consultancies, if you have a budget.


Would be keen to see a similar comparison after 21 years of language and VM/runtime evolution (depending on your Lisp; I guess the runtime part doesn't apply to Chicken Scheme).

We programmers are very picky sometimes. I wonder if Lisp would have taken off better if it had a parenthesis-free syntax. It could have been easier to sell to the uninitiated that way.

I just found out that there exists Wisp [1], which basically trades the parentheses for indentation.

https://www.draketo.de/software/wisp


The Racket people are attempting to do that in a new way with the Honu language, to truly have Lisp powers with an alternate syntax. It's probably going to replace racket as the main language on the racket runtime:

https://www.google.com/url?sa=t&source=web&rct=j&url=https:/...


I hope not. My mind works well with the syntax of Lisp languages.

BTW, at Racketfest two weekends ago, I heard no mention at all of the alternative syntax.


I also find myself at home with Lisp syntax. However, it would be very interesting to have a well-supported mainstream syntax, for the sake of making the runtime more accessible. Incidentally, this was the main stated goal: be more accessible to students and beginners. This could bring people to the Racket environment, which would definitely be a win for OSS projects hosted on Racket.

Some folks at Apple made the same argument in favor of giving Dylan a non-Lisp syntax. In fact, I was one of them.

In hindsight, I think we were wrong. The infix syntax didn't serve its purpose of growing Dylan's user base, and it made the surface language clumsier to work with. I strongly prefer its original s-expression syntax.


Indeed, but in this case the racket lisp syntax is also here to stay, and I expect many experienced programmers to keep using it. So, we'll see... let's hope for the best!

We all thought the s-expression syntax for Dylan was here to stay, too.

> It's probably going to replace racket as the main language on the racket runtime

Doesn't seem so. If that were the case, the repo's last commit wouldn't be from two years ago:

https://github.com/racket/honu


Apparently, Matthew Flatt made finishing the switch to Chez Scheme a priority, and I think work on Honu is picking up now that that's done. But it's not in the implementation stage yet.


My assertion is... just no. The parentheses are a complete red herring. You think they are bad and somehow annoying, and yet JSON gets by with requiring quotes around the keys in the official spec, which is far more annoying. (And, as an aside, one of the few things I give YAML credit for nixing.)

Further, while there is a bit of truth to needing a few extra parens for a lot of math-based code, for code that is heavy in calls to other code it is primarily a switch from curly { to regular (, plus putting the function name after the paren. That really is it. Oh, and a comical reduction in commas.


Show someone structural editing and in-editor form evaluation (REPL-driven development) on day 1, and they'll hate anything that doesn't have parens.

Scopes has very good "naked" syntax too https://scopes.readthedocs.io/en/latest/dataformat/

I especially love the fact that comments are indentation aware. It's nice that you can freely mix naked and parenthesis syntax too.


Thanks. I'll cons this on to my list of parenthesis-free Lisps. The others were Lisp 2, CGOL, and Dylan. None of these was ever used much (AFAIK Lisp 2 was never even implemented).

Lisp doesn't really have its own syntax, as it borrows the syntax it uses (and needs) to enter data. Its parser is just the read function. The special characters, e.g. open parenthesis, quote, hash, etc. are (or can be implemented as) read macros. Read macros can be used to add syntactic sugar if you want to, but there's really no need.

And finally, the bad old days of indenting Lisp programs by hand and counting parentheses are over. The editors used for Lisp (predominantly Emacs and variants) autoindent code; you stop noticing the parentheses, and it looks much like Python, except more powerful.


In a similar idea, you can also make them less visible, so indentation strikes more: https://github.com/tarsius/paren-face/

Yes, there also exists readable Lisp: https://readable.sourceforge.io/

Feels like we're 5-10 years away from "Lisp as an alternative to Go".

I'm waiting for "Lisp as an alternative to Rust." :)

Why not get the best (?) of both worlds with the macro-lisp crate: https://github.com/JunSuzukiJapan/macro-lisp

A small snippet from the project's examples shows minimal boilerplate between Rust and a native-looking Lisp experience:

    lisp!(defun factorial ((n i32)) i32
        (if (<= n 1) 1
            (* n (factorial (- n 1)))));


Orange crab bad!

As a large, mature Go shop we have several “macros” that are now quite load-bearing. They’re based on Bazel rules and text/template.

Get support for these workflows (which are increasingly common) into the upstream toolchain, combine with the “AST” package and pretty printer, and Go is well on its way to being a Lisp.


There are several abandoned Clojure/Go projects out there, and I can’t be the only one thinking about how Common Lisp's displaced arrays, multiple values, and exported package symbols resemble core Go features. The catch is using type declarations to drive codegen, when so many types can be passed by value or reference. And single dispatch for methods, when CL tends to choose CLOS or nothing.

But my north star is to write REMOVE-IF-NOT and have it expand into the dozen lines of noise it takes to bubble errors and fill a new slice.


Is there a modern equivalent to the mentioned research test program that allows browsing submissions?

I would love to have a known problem set with browseable submissions in different languages. To see if I can get close to the optimal solution with my own preferred languages. But also to learn what an idiomatic solution in another language looks like.

The research problem from the article is interesting because it encompasses IO, some data structures and an algorithm for generating output.

edit: aside from single-binary problem sets like this, it would be nice to also have a more open category for "distributed systems" or "concurrency-related" problems to solve with modern coroutines/fibers/greenthreads or even AMQP-style message queues.


I think this fits what you are looking for:

http://www.rosettacode.org/wiki/Rosetta_Code

There is a listing of 'tasks' (explore > tasks in the topnav) and each one features submissions in many different languages.


One of the biggest productivity boosts, even more so than the language, is the ecosystem and libraries. Having libraries that handle JSON, interface to AWS S3, Postgres/MySQL, OAuth, etc. is a huge productivity boost versus having to roll your own.

Many times in software development, more work is spent on glue code and wiring stuff together than on actual algorithmic development. And speaking of algorithmic development, I would bet that just about any common algorithm, and likely many esoteric ones, has an easily available, high-quality implementation that you can plug into your code without too much hassle.


Just confirming these libraries are there for CL.

(https://github.com/CodyReichert/awesome-cl/ for a start)


Well, some of them are. With rather high probability, the libraries I would need are not.

Such as? Maybe we'll be surprised to find a couple.

True that. I wonder why there is so much fuss about programming languages as such and not much about the ecosystem, especially in web backend development. In my anecdotal observation, we spend most of the time in glue code and exploring documentation for various services/libraries. That's why middleware tools like data- and application-integration solutions sell like hot cakes in enterprises.

Because many devs have it wrong.

It should be "I want to use ecosystem X => language Y", while many do "I have language Y => what can I do?".

Hence why language wars based on grammar/semantics are pointless.


That's where Clojure(Script) shines.

Yep, I vastly prefer Clojure over JS/Python as a programming language but I'll pick JS/TS/Python first any day for real apps for the reasons you mention.

So where's all the error handling? File IO in Java would be rife with exception handling and not without reason.

(with-open-file ...) is roughly equivalent to the Java try-with-resources pattern. `(loop for num = (read-line stream nil) while num ...)` means, roughly, "loop through the lines of 'stream', exiting the loop on EOF". If you're assuming the input files are well-formed, CL is just much more concise, with a roughly equivalent level of robustness to what a Java programmer would write for this.
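Spelled out (a sketch, assuming a file of one integer per line; "input.txt" is a placeholder name):

```lisp
(with-open-file (stream "input.txt")        ; closed on exit, like try-with-resources
  (loop for line = (read-line stream nil)   ; the NIL argument returns NIL at EOF
        while line                          ; ...which ends the loop instead of signalling
        collect (parse-integer line)))
```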

> If you're assuming the input files are well-formed

Seems like a big if. Lots of failure cases like file not found or bad permissions are handled with Java's checked exceptions. The minimum Java app would have to cover those cases in some way.

Does CL just silently ignore these types of issues when you "code as you would professionally"? Seems like a lot of code is missing but maybe I'm just used to the verbosity.


In Common Lisp implementations these conditions are usually reported by default in some way.

  CL-USER > (with-open-file (s "~/foo.lisp")
              (read s))

  Error: The file #P"/Users/joswig/foo.lisp" does not exist.
    1 (continue) Try opening "~/foo.lisp" again.
    2 (abort) Return to top loop level 0.

  Type :b for backtrace or :c <option number> to proceed.
  Type :bug-form "<subject>" for a bug report template or :? for other options.

An interactive program might bring up such a dialog automatically.

When one does actual application-specific error handling, it could (depending on how much comfort is wanted) be much more complex, since Common Lisp has more features to handle and repair errors.


> Does CL just silently ignore these types of issues when you "code as you would professionally"?

Nah, it just compresses the default cases well, and it doesn't force you to check for errors (like with checked exceptions)[0]. See the already mentioned `with-open-file'[1] macro, which is essentially a wrapper around `open'[2] that implements "try with resources" pattern. The documentation of `open' shows all the interesting flags you can use to predefine some behavior (that's IIRC the equivalent of Java's wrappers around file readers/writers) - like what to do if a file exists or doesn't.

In case there is an error, either when opening or reading data, the relevant function will "signal a condition", which is Lisp for "throw an exception", except it's better :) [3]. Per documentation, it'll typically be a condition of type `file-error'[4] or `error'. CL standard doesn't specify subtypes of `file-error' further, but individual CL implementations can and do subclass `file-error' to allow for more granular identification and capture.

So, in short, the way you'd use this professionally would be to use `with-open-file', and use appropriate condition handlers and restarts to deal with any problems that occur.
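As a rough sketch of what that might look like (hypothetical filename and recovery policy):

```lisp
(handler-case
    (with-open-file (s "config.txt")   ; signals FILE-ERROR if it can't be opened
      (read-line s))
  (file-error (e)
    ;; e.g. file missing or unreadable: report it and fall back to a default
    (format *error-output* "Skipping config: ~a~%" e)
    nil))
```

In real code you might reach for `handler-bind' plus restarts instead of `handler-case', so the handler can repair the situation (retry with another path, create the file, ...) without unwinding the stack.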

--

[0] - Which I consider to be unfortunate. Checked exceptions got a bad rap, but they were a good idea. However, given the extremely fluid nature of Lisps in general (and CL in particular), fully checking for unhandled exceptions would likely run afoul of the halting problem...

[1] - http://clhs.lisp.se/Body/m_w_open.htm#with-open-file

[2] - http://clhs.lisp.se/Body/f_open.htm

[3] - It's a long topic but, TL;DR: signalling a condition does not automatically unwind the stack - it looks up the call stack for a handler, which then executes on top of existing stack, and can pick a way to recover from the problem - including unwinding the stack partially, and invoking recovery code defined there.

[4] - http://clhs.lisp.se/Body/e_file_e.htm#file-error


Yeah, I'd be interested to see a modern comparison, I'm thinking Java may have evolved somewhat to make up some of the distance to Lisp of the 2000s :D


