Hacker News
Whoever does not understand Lisp is doomed to reinvent it (2007) (lambda-the-ultimate.org)
175 points by rlander on May 14, 2016 | 229 comments



I don't get it. If LISP is such an EPIC language, why are there no major pushes from the LISP community, or at least a showcase piece of software that proves it? I have been looking around the internet for what makes LISP cool: its ideas about code being data, powerful macros, and a hell of a lot of bragging about being able to define your own syntactic sugar, but I can't find a single MUST HAVE piece of code that makes me say "I have to learn this". It's the ML age now and developers around the world are looking for such gems, but I don't see LISP picking up momentum (I did notice Erlang picking up momentum, up to the point that people who felt crappy about its syntax invented Elixir).


Because language goodness really doesn't matter that much. It's like people arguing about which car is best. Sure, a BMW can do more than a Kia, in more style. But in terms of what influences the course of human events, what usually matters is, can your car get up to the speed of traffic and continue running until you get to your destination. Everything else is just niceties.

Even the advantages in comfort don't really change much, because your expectations just adjust to whatever level of luxury and effort you are used to.

The truth is, functions, variables, and loops are an extraordinary form of magic. Being able to use those forms of magic well is an infinitely deep space of learning and growth. Maybe your language feature helps you solve a problem better, but compared to the opportunities available to you in terms of just becoming a better programmer, it's a comparatively small opportunity.

For example, knowing when to put away your primary language and use a special purpose one (and when not to) is an ability that can improve your productivity 1000x. I don't think there are any language features that can come close to it.

Compounding this, every time you use a unique language feature, you incur readability costs and portability costs. The more control structures you put into your toolbox, the more choices you have to consider every time you read and write code. That cuts into whatever wins these fancy control structures provide.


> but can't find a single MUST HAVE piece of code that makes me say "I have to learn this".

Because Common Lisp isn't a better language anymore.

When Lisp was big in the early 80's, assembly, COBOL, FORTRAN, C and some Pascal pretty much ruled the day. Compared to any of those, Lisp is deeply magical.

So, the real question is why did such a magical language lose to the upstarts that all appeared in the late 80's and early 90's: Perl, Python, Tcl, Lua, etc.

Answer: files, strings, and hash tables. All of those languages make mangling text pathetically easy. Perl is obviously the poster-child for one-liners, but all of those make that task pretty darn easy.

Lisp makes those tasks really annoying.

Just take a look at the Common Lisp Cookbook for strings: http://cl-cookbook.sourceforge.net/strings.html

Why should any beginner willingly subject themselves to that kind of verbosity?
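
To make the comparison concrete, here is the kind of "pathetically easy" text mangling the parent comment means, sketched in Python (one of the scripting languages named above). The log excerpt is made up for illustration; the point is that splitting, counting, and summarizing text takes a couple of lines, with no ceremony:

```python
from collections import Counter

# A hypothetical log excerpt standing in for a real file.
text = """GET /index.html 200
GET /about.html 404
GET /index.html 200"""

# Tally the last field of each line: the classic one-liner-style text
# mangling that Perl/Python make trivial and 1990s Common Lisp made verbose.
codes = Counter(line.split()[-1] for line in text.splitlines())
print(codes.most_common(1))  # [('200', 2)]
```

Reading from a real file instead would only change the first line (`open(path).read()`); the mangling itself stays a one-liner.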


> Answer: files, strings, and hash tables. All of those languages make mangling text pathetically easy. Perl is obviously the poster-child for one-liners, but all of those make that task pretty darn easy.

Fair point about strings, not sure about files and hash tables. But you've mentioned a bunch of scripting languages, and CL was never competing with these. Why it lost to Java and later to Python and Ruby in webdev is a different question. But string handling - you need to notice that as an industry, we shot ourselves seriously in the foot with Unix and proliferation of unstructured text. The unstructured text-oriented philosophy is frankly insane, and lisps are older than it - so for better or worse, they didn't follow.

People now are slowly rediscovering why adding structure to your text is better than having each program implement its own incompatible, bug-ridden regex-based parser. And surprise surprise, structured data is something lisps excel at since before most of us were born.


> But you've mentioned a bunch of scripting languages, and CL was never competing with these.

Except that it WAS. Programmers have a limited amount of mindspace. So, the question as a programmer in the late 1980's/early 1990's is how I'm going to allocate that space:

1) Systems language--C is pretty much your only option at that point unless you are in assembly (God help you). Anything bashing hardware (graphics, audio, networking, etc.) goes straight for C.

2) Many programmers are also system administrators in this time frame. So, I'm going to use a language that mangles text well--Perl wins here. People transition to other languages only once they get fed up with Perl.

3) Operating system specific languages. Unix pushes you toward C. Windows was C or Visual Basic (that ecosystem was huge at one point). Mac was C or Pascal.

So, in this time frame: Lisp is painful for system programming on small systems (fails 1). Lisp manipulates strings badly (loses on 2). Lisp is not tied to an architecture that would drive adoption (loses on 3).

So, as a programmer, Lisp is, at best, at slot 3 or 4 in terms of programming languages for daily use. Is it any wonder Lisp never got real traction?

You also have to consider how knowledge got disseminated in this time frame. The web doesn't exist. Email is actually scarce unless you are at a university or a big technology company. Netnews is something that soaks up enormous amounts of disk space so sysadmins hate carrying it in useful ways.

It is hard to appreciate, in this era of instantaneous, ubiquitous information, just exactly how much the pink camel Programming Perl book with its cookbook section singlehandedly drove adoption of Perl. I had 3 copies of that book that I used until they disintegrated. Can you think of ANY other programming language book for which that is the case? (Actually, I can, but I'm the odd duck. My workbook copy of The Little Lisper got the same level of abuse to disintegration).

> The unstructured text-oriented philosophy is frankly insane

It is NOT insane when memory and disks are small--which defines most of the time when Lisp was popular. Structured text requires lots of extra overhead that simply wasn't going to fly when RAM, disk, and bandwidth were very restrictive.


Very true. Now that you mention it, I wonder why, given the awesome power of extensibility, this hasn't been improved? Shouldn't it be trivial to write a DSL that makes string handling almost, if not as good as, Perl et al?


It is, which is probably why nobody bothered to put it in a library before. But you have CL-PPCRE, which is arguably the most performant and one of the most solid implementations of Perl-compatible regular expressions out there.


The only string-related task I've found Common Lisp excelling at is its print formatting syntax. That stuff is really nice.


> If LISP is such an EPIC language why are there no major pushes from LISP community,

The main reason is that the LISP community is just not paying attention to the modern world.

Back in its days, LISP was revolutionary and ahead of its time. It gathered a bunch of followers who could reasonably be justified in thinking their language of choice had something unique about it.

And then, they made a major mistake: they stopped paying attention.

The tech world moved on and passed them. And they ignored it, confident that nothing new or interesting could ever be invented after Lisp.

It's the ultimate hubris: thinking that programming language science can ever reach an end.


> And then, they made a major mistake: they stopped paying attention.

Nah, the funding just dried up after the Cold War ended and symbolic AI turned out to be a failure.

> Back in its days, LISP was revolutionary and ahead of its time. It gathered a bunch of followers who could reasonably be justified in thinking their language of choice had something unique about it.

That "bunch of followers" actually invented half of the programming you use today, and there were quite a lot of them during the Cold War. The history of computing is not what C/Java books would like you to believe.

> The tech world moved on and passed them. And they ignored it, confident that nothing new or interesting could ever be invented after Lisp.

Maybe a bit, but for a good reason too. The tech world made a huge step backwards with the introduction of Unix and C, and we're only slowly recovering and beginning to pay attention to the things people did before.

> It's the ultimate hubris: thinking that programming language science can ever reach an end.

I agree. But it's also hubris to believe it can not take steps in the wrong direction and enter blind alleys.


> The tech world made a huge step backwards with introduction of Unix and C, and we're only slowly recovering and beginning to pay attention to the things people did before.

When I discovered UNIX via Xenix, I already had ZX Spectrum (48K, 128+2A, 128+3), SAM Coupé, Amiga 500, Windows 3.x experience, so the world of UNIX OS architecture seemed quite interesting.

When I got to learn C, that experience already had showed me there were other means to make use of all computer capabilities.

So the time spent in UNIX land meant I was initially one of those renegades trying to use C++ instead of C like everyone else.

Eventually the introduction to Smalltalk and Native Oberon spiked my interest in delving into their history, as I was specializing in compiler design.

That showed me a world much more interesting than what AT&T has ever produced.

It was a saner mentality regarding what computer safety and developer productivity should actually be like, but it sadly failed in the market, because UNIX was initially available for free.

I would dare to say that the OS architectures and development stacks from Apple, Google and Microsoft are much closer to those ideas than whatever exists in the pure UNIX world.


You're doing some major revisionism with this post and claiming a lot of completely unsubstantiated (and in my opinion, completely made up) facts.

(https://news.ycombinator.com/item?id=11699585)


I don't think that's true: I think the language / libraries did evolve; however, the things they stopped (or never started) paying attention to are PR and marketing. Common Lisp[0] has a new site now, so they are feeling the need, but it needs a lot more: modern videos with modern girls & guys making stuff with it and presenting it.

Of course I long for some of the revolutionary stuff the Lisp community was once known for; particularly a better unified GUI library (I find GUI programming painful because of all the libraries which each do something else 'right'; no one is doing real revolutionary stuff there) and a better concurrency library. With the knowledge within the community it 'feels' possible to make something for this modern age, i.e. run the same code on AWS Lambda or on a GreenArrays GA144 without changing it.

[0] http://lisp-lang.org/


> they stopped paying attention

I don't think that you've been paying attention to recent Common Lisp libraries or recent lisps. Lisp has never stopped evolving and continues to take ideas from other languages.


Just as other languages continue to take ideas from Lisp!

C.A.R. (Tony) Hoare said of Algol 60 that it was such a good language that it was not only an improvement on its predecessors, but an improvement on nearly all its successors as well. The spirit of that statement rings true for me[1], and I feel that the same could equally be said of many Lisps, especially when I think of things like read/write, CLOS, streams, conditions, (delimited) continuations, hygienic macros, etc. Common Lisp has been virtually static for over 30 years[2] not because it's "behind the times", but because it was so far ahead of the times that everything else is still playing catch-up!

[1]: I feel as though Algol 60 was, with few exceptions, the best language available until around the mid-to-late 80s.

[2]: Not to mention that it really hasn't been totally static: now we've got ASDF, QuickLisp, {U,C}FFI, CLIM, SLIME, etc. And loads of libraries.


Algol 60 didn't have structured data types, only arrays and numbers. You can't manipulate strings in Algol 60 other than by calling procedures written in another language. It has very strange and problematic procedure call semantics.

Algol 60 was largely superseded in the mid-1960s.


Mind naming a single point where the "tech world" outpaced the Lisp heritage?


Clojure is running enterprise tools around the world. We use it exclusively at our company, partly because we are small and the time:productivity ratio with it is really good for a few developers.

But much larger companies are using it also, nearly exclusively.

As far as macros go, the most popular Clojure library for routing web requests on a server is Compojure, and it wraps a lot of low-level stuff up in a tight lisp macro set that makes it trivial to functionally compose your http requests. It's pretty nifty and is used extensively in industry.

Also this web site you are interacting with right this moment as you read this is built on lisp.


> Clojure is running enterprise tools around the world.

That's a really weak argument. Java is used orders of magnitude more in enterprise, and you don't see anyone proclaiming that as the pinnacle of language design.


The original question was "why no one uses lisp"; you are arguing something else.


The OP asked for examples of lisp being used anywhere that is relevant, and I gave some.


>Also this web site you are interacting with right this moment as you read this is built on lisp.

And didn't they at one time have to reset the server every few weeks because HN would use up all the RAM or something? That's not something other forums built in lesser languages have to do.

I mean, it obviously works, but would anyone say it works particularly well, even as a Lisp implementation of a web forum?


Well, other languages cheat in a way: PHP restarts the runtime frequently, and many other scripting environments do as well.

Also, as a side note, one would be quite surprised how frequently well-known companies solve memory problems by restarting processes frequently.


The RAM issue (if there was one) has nothing to do with the language in this case. If we are talking about lisp, then I assume we are referring to any true lisp, in which case the underlying hardware scenarios widely vary.

I don't know how Arc does it, but Clojure's resource usage is no different from any of the millions of Java web servers running; it's just a much better language than Java to run on that platform.


I also don't get it. I've literally never seen a job ad in my area a single time that needs lisp or lisp-based languages, and the only time I ever see it mentioned is by insanely devoted people who love spending dozens of hours learning a bizarre language that nobody else uses.

It reminds me a bit of Esperanto: it might have some nice ideas but it will never take off. I've yet to hear a succinct, practical argument as to why I should take the dozens of hours to learn lisp or any of its derivatives as someone working as a non-academic.


> practical argument

If you want to increase your productivity by orders of magnitude, if you want to be able to complete projects on your own that large teams would otherwise struggle with, you must learn Lisp or any other meta-language suitable for DSL construction. Luckily, there are now quite a few of them, so if you don't like Lisp in particular there are a number of alternatives available.


>If you want to increase your productivity in orders of magnitude, if you want to be able to complete projects on your own that otherwise large teams would struggle to do

I've seen this claim a few times in this thread. What makes this possible? If you can increase productivity so drastically, why isn't everyone using it all the time? It seems utterly irrational not to.


> What makes this possible?

Take a look at what kind of power DSLs can give you:

http://www.moserware.com/2008/04/towards-moores-law-software...

If you use this methodology consistently, all of your code (including the DSL compilers themselves) is just as readable as this. Imagine, all your code reads as just an English specification of a problem you're solving, with some tables and diagrams where needed.

> why isn't everyone using it all the time?

I do not have an answer to this question.

Yet, given that I never managed to convince anyone who was not already doing this stuff that macro-based DSLs are the way to go, apparently there is a huge cultural problem and a huge stigma attached to macros (likely because of the C preprocessor and some really bad Lisp examples - think of stuff like the LOOP macro).

I personally use this approach exclusively, and I cannot understand those who prefer to write 1000x more code.


> I personally use this approach exclusively,

In my experience, most people who are so in love with macros/DSL's work on solo projects.

Are you working on a team? If yes, ask your coworkers what they think about your macros and it should be more obvious to you why macros/DSL's should be used much more sparingly than what you recommend.

> I cannot understand those who prefer to write 1000x more code.

You're setting up a false dichotomy. It's possible to not use macros and still keep the amount of code down to a reasonable level.


> In my experience, most people who are so in love with macros/DSL's work on solo projects.

Yes, but not because it's hard to cooperate when you use DSLs (it's in fact far easier to work in a team this way). It's more because the scale of the projects shrinks to such an extent that you don't need a team. One person can do with macros the equivalent of what a team of many people would do in a much longer time without macros.

> If yes, ask your coworkers what they think about your macros and it should be more obvious to you why macros/DSL's should be used much more sparingly than what you recommend.

Yes, I work in a team, and most often worked in teams. Yes, DSLs were always the greatest performance boost available. No, your beliefs about DSLs being somehow incompatible with team work are entirely wrong and unfounded.

> It's possible to not use macros and still keep the amount of code down to a reasonable level.

Mind naming a single viable alternative?


How would you suggest a typical run-of-the-mill object-oriented programmer go about learning such a methodology?

Most of the links in that article are dead.


> How would you suggest a typical run-of-the-mill object-oriented programmer go about learning such a methodology?

That's exactly what's wrong with this thoroughly broken industry. They should not have learned this awful religion in the first place. Now it's much harder to unlearn and re-learn from scratch than learn it without ever being exposed to the wrong ways.

This methodology is far, far easier than anything OOP (the latter cannot even be defined comprehensively), and if you learn it first you can become productive much quicker than if you go the typical OO way.

And the most important feature of this methodology is that it's far more accessible. You don't need to be smart to use it. No need to memorise dozens of "design patterns" and all that. No need to invent complex twisted designs, trying to cram as many patterns as possible into a trivial task. DSL-based approach is very straightforward and mechanical, and can be executed by practically anyone who managed to learn how to read and write (i.e., around 95% of the population).


> DSL-based approach is very straightforward and mechanical, and can be executed by practically anyone who managed to learn how to read and write (i.e., around 95% of the population).

I mostly agree with your comments, but here I think you're underestimating the difficulty of creating a good DSL way too much.

It's like saying that anyone who managed to learn to read and write can create Esperanto.

Also, being able to read and write does not guarantee the ability of clearly expressing one's thoughts. Without this crucial skill inventing a DSL is an exercise in frustration, both for the original author and later for the poor users of such a DSL.

And one more thing: you can create (internal) DSLs in almost all existing languages, by (ab)using their syntax and semantics. Smalltalk and Ruby are quite happy with their many DSLs, despite being OO languages.

As for GP:

> How would you suggest a typical run-of-the-mill object-oriented programmer go about learning such a methodology?

See Avail: https://klibert.pl/output/avail-and-articulate-programming.h...

Then learn Scala, Ruby, Elixir, Racket, Smalltalk, F#, Elm, Haskell. They all make use of DSLs and contain many examples of well-designed eDSLs. After that, go read a bit on language design and compiler construction and you're done.


> you're underestimating the difficulty of creating a good DSL way too much.

I'm doing this for living, and I can't stop wondering at how laughably easy it is.

The method is very simple:

* describe the problem in plain English (maybe with some diagrams)

* iterate it a few times until you have a syntax you think is unambiguous enough

* strip this syntax of all the sugar you just introduced in order to define an AST

* find DSLs in your toolbox that are potentially close to the one you're building and cherry-pick the necessary language components from them

* write a sequence of very simple transforms that would lower your source AST into a combination of the parts of the ASTs of the target DSLs of your choice

* Done. An efficient DSL compiler is ready, with a language designed as closely to your current view of the problem domain as possible.

* If you later find that your DSL is inadequate and your understanding of the domain was insufficient, then just start over; this entire process is so cheap and simple that it does not really matter.
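
The pipeline above can be sketched in miniature. This is Python rather than Lisp macros, and everything in it (the "pricing rule" DSL, the names `discount`, `over`, `lower`) is invented for the example; it only shows the shape of the method: an AST stripped of sugar, then one simple transform that lowers it to executable code.

```python
# Step 3's AST for a toy rule: ("discount", percent, ("over", threshold)),
# i.e. "give a `percent`% discount on prices over `threshold`".
rule_ast = ("discount", 10, ("over", 100))

def lower(ast):
    """One very simple transform: lower the rule AST to a plain function."""
    op, percent, (cond, threshold) = ast
    assert op == "discount" and cond == "over"
    def apply_rule(price):
        # Knock off `percent`% when the price exceeds `threshold`.
        return price * (100 - percent) / 100 if price > threshold else price
    return apply_rule

rule = lower(rule_ast)
print(rule(200))  # 180.0
print(rule(50))   # 50
```

A real macro-based version would do the lowering at compile time instead of building closures at runtime, but the sequence of steps - syntax, desugared AST, small transforms down to an existing target - is the same.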

> ability of clearly expressing one's thoughts.

This is crucial indeed, but not having such a skill is functional illiteracy. Current estimates of the functionally illiterate population in first-world countries range from 5 to 10%. The remaining numbers are still much higher than the percentage of people who are capable of understanding OOP.

> you can create (internal) DSLs in almost all existing languages, by (ab)using their syntax and semantics.

But in order to do this you need to be smart (or smart-assed). It's a hack, and hacks are bad.

OTOH, macro-based DSL implementation is purely mechanical and does not involve any thinking at all.


BTW, are you /u/combinatorylogic on Reddit? Just guessing because of the (potential) serendipity.


awesome!

(and it sounds totally intuitive and obvious, not like a trick, but like a proper methodology, a proper approach; I've never had this feeling before)

thank you for the info! will try to dig in


This sounds way too good to be true


Yet it is something I know firsthand. I also know how hard it is to convince people to try this approach. There is a lot of lies people believe in: that compiler construction is hard, that macros are hard to debug, that DSLs are only suitable for large and "smart" projects, and so on.


A language is more the choice of a coder and the environment they use. What you need to realize is the history of Lisp, and face the fact that new languages are merely re-implementations of paradigms which are present in some other language. Lisp started as a set of axioms which were then implemented. You should read pg's [1] description of Lisp.

>If you have seven primitive operators (quote, atom, eq, car, cdr, cons, and cond) then you can define in terms of them another function, eval, that acts as a Lisp interpreter. And so any language that contains these seven operators is a dialect of Lisp, whether it was meant to be or not.

So Lisp is hard to understand and use, but what happens when you try to modify/make languages which handle the situations that Lisp can? You end up with Lisp. And as a Lisp dev, when you have seen this cycle, you really do not feel the need to defend every attack against Lisp.
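
pg's seven-operator claim is concrete enough to sketch. Below is a minimal interpreter in Python (not Lisp) over those primitives, with s-expressions spelled as nested Python lists; it's a deliberately incomplete toy (pg's full eval also needs lambda and label, and here bare atoms just evaluate to themselves instead of being looked up as variables):

```python
def mini_eval(e):
    """Evaluate an s-expression built from pg's seven primitive operators."""
    if not isinstance(e, list):  # bare atom: evaluates to itself in this sketch
        return e
    op = e[0]
    if op == "quote":
        return e[1]                           # return the argument unevaluated
    if op == "atom":
        return "t" if not isinstance(mini_eval(e[1]), list) else []
    if op == "eq":
        a, b = mini_eval(e[1]), mini_eval(e[2])
        return "t" if a == b and not isinstance(a, list) else []
    if op == "car":
        return mini_eval(e[1])[0]             # first element
    if op == "cdr":
        return mini_eval(e[1])[1:]            # rest of the list
    if op == "cons":
        return [mini_eval(e[1])] + mini_eval(e[2])
    if op == "cond":
        for test, branch in e[1:]:            # evaluate clauses in order
            if mini_eval(test) == "t":
                return mini_eval(branch)
    raise ValueError(f"unknown operator: {op}")

print(mini_eval(["car", ["quote", ["a", "b", "c"]]]))                     # a
print(mini_eval(["cond", [["atom", ["quote", "x"]], ["quote", "yes"]]]))  # yes
```

Even this stub shows the point: the evaluator is itself just list manipulation, which is why "any language that contains these seven operators is a dialect of Lisp".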

PS: I am not a Lisp dev, but I am trying hard to learn Lisp.

[1]:http://www.paulgraham.com/ilc03.html


Check out the VR script by John Carmack:

https://www.youtube.com/watch?v=ydyztGZnbNs

It's in Racket (a Scheme dialect). There is no language which can make you more efficient by itself. But some languages open your mind. Just read SICP and find out what makes a language EPIC. Many big projects are in PHP, but surely no one would claim that this makes PHP "the EPIC" or even great.


"LISP is such an EPIC language..."

IMO, the true advantage of lisp nowadays is that of an abstract syntax tree algebra ripped clean of cruft (well, that's Scheme, but...). It's much more useful as a way of thinking than as a language. "Structure and Interpretation of Computer Programs" is great because if one goes through the entire book, and especially implements the Scheme interpreter in e.g. C++ or whatever living language, it gives you this fantastic general normalized thought model of programming languages which makes it much easier to learn new languages and concepts. That's my experience, anyway.

So, to me, Lisp is useful similarly to how, say, relational theory is useful in software design (having a relational model is wonderful for complex systems even without an SQL database as part of the system - if there is more than one 'thing', screw objects and rather use explicit tables and maps).


A nice example of Common Lisp is Maxima http://maxima.sourceforge.net/ a Computer Algebra System


> LISP is such an EPIC language

The point isn't that LISP is a great language. I think it is, but that's mostly personal opinion. The important part is to understand the concepts that LISP introduced and/or popularized.


You have surely heard that success has many parents and failure's an orphan. The proverb is of course cynical, but there is more truth to it than it seems. Success often needs many parents, while failure may come from a single factor among many. Or simply because there were not enough parents to make it succeed, in which case, you could say only half-jokingly that the failure happened for no reason.


A small few: http://lisp-lang.org/success/.

EDIT: Also, https://www.youtube.com/watch?v=8X69_42Mj-g for a recent example of Common Lisp pushing the limits of science and computing in chemistry.




"I don't get it. If LISP is such an EPIC language why are there no major pushes from LISP community, or just showcase a piece of software that proves it."

Irony. Using HN to say this. [0]

[0] Arc ~ https://en.wikipedia.org/wiki/Arc_(programming_language)


Sorry, but a simple site like HN is not a good showcase of any language. I think they mean more ambitious and popular software.


The funny thing about this is that there is no language that simply allows you to make great things, because making great things really does not have anything to do with the language you use.

The only types of areas where this is even remotely the case are things where the rest of the field consistently undervalued a certain concept/category of problems. That's why there is really no language that can do concurrency as easily as Erlang.

The only way to figure out whether a language is good for you or not is to use it for whatever you can think of. Trying to be convinced by an Internet forum that a language is good because someone else made something in it is pointless. People have made whatever you can think of in almost every language older than a few years. Their successes say nothing about whether or not the language is good or not or whether or not it was a painful process to make it.


Well said!


"a simple site like HN is not a good showcase of any language"

HN the site, looks deceptively simple. It solves a number of problems simultaneously [0] yet seems to work at scale [1] without bugs. [2]

The success of Arc, a Lisp derivative, is in the everyday usage of the site and the hard problems it solves. [3] The original comment stands.

[0] http://paulgraham.com/hackernews.html

[1] "1.6 million page views and 200,000 unique visitors on a given weekday" 2013 ~ http://techcrunch.com/2013/05/18/the-evolution-of-hacker-new...

[2] In fact almost 10 years ago I reported a bug, pg actually responded it was a server reset. https://flickr.com/photos/bootload/406590838/

[3] Submitting high quality links, with high quality discussion while increasing users. That is a hard problem.


It's too awesome to lend itself to such showcasing


Talking about lisp in 2016 is mostly just debating static vs. latent/dynamic typing. The lisp guys were talking up their advantages in the 80s/90s era of fortran and C and really bad c++, and I guess the really shitty original JVM. It's a discussion from a dead age.

latent/dynamic typing and also macros work very poorly when the codebase is large and there are many people involved, or when we're talking about decade-plus code-base lifespans. It's that simple. If you're one or three noticeably smart dudes building a system from start to ultimate finish (financial exit in four years?), why not go with LISP or something like it. If you're a team of one or two, I advise you to go with lisp or perl or python or erlang or whatever.

But that's not the systems anybody builds or maintains much anymore. We make things that a rotating cast of 100 might touch over 30 years. We need static typing.


I disagree on several points:

> latent/dynamic typing and also macros work very poorly when the codebase is large and there are many people involved, or when we're talking about decade plus code-base lifespans.

Lisp is one of the few languages that can say it actually doesn't age; Common Lisp code that was written 20+ years ago is often used today without a single change. You can't say that about most of the popular languages.

RE many people on the team - I see a lot of talk about how macros can be unreadable and all, but frankly, IMO that's totally backwards. Readable code is not about using a subset of language that you can find in "X for Dummies" book. Readability is about structuring your code to express intent, to be logically consistent, and about all the other things that transcend the syntax of the language. Macros are an ultimate tool for increasing readability, because you can keep recursively eliminating boilerplate, cruft and repetitions, bringing your code closer and closer to the intent it's meant to communicate.

> But that's not the systems anybody builds or maintains much anymore. We make things that a rotating cast of 100 might touch over 30 years. We need static typing.

Static typing is cool and all (I like it), but RE systems - no, it was in Lisp age people actually cared about buildings systems that would live for decades. Today, people build temporary systems that get thrown away or rewritten every couple of years at most.


> Lisp is one of the few languages that can say it actually doesn't age; Common Lisp code that was written 20+ years ago is often used today without a single change. You can't say that about most of the popular languages.

Examples?

This is certainly false for most languages in use today: C, C++, Java, even C#: code written in these languages 15-20-30 years ago can still be compiled and run fine today.

I'm not sure what this proves, though.

> I see a lot of talk about how macros can be unreadable and all, but frankly, IMO that's totally backwards.

Why?

Macros are basically syntax defined for a specific task. Why is it so hard to see that this can lead to an explosion of unreadable code if left unchecked? Wouldn't you be concerned if you had to work on a huge code base where most of the code is written using macros?

I would run away, personally.

> it was in the Lisp age that people actually cared about building systems that would live for decades.

We still care about this today. Even more than in "Lisp age" because we know how long code will be around. Which is one of the reasons why we have been moving at an accelerated pace toward statically typed languages.


> Examples?

Half of the libraries in the Lisp ecosystem? They were done once, polished over years, and pretty much did not age with time.

> Why is it so hard to see that this can lead to an explosion of unreadable code if left unchecked? Wouldn't you be concerned if you had to work on a huge code base where most of the code is written using macros?

Because again, readable code is not about using the same small subset of programming language constructs and design patterns everyone knows. It's about clear communication. Macros done right let you express your ideas more clearly, and hide/remove unnecessary boilerplate that makes code hard to read. Think about e.g. Java or C++ codebases, where 50%+ code is scaffolding and otherwise irrelevant to what the program is meant to do / communicate. Lisp macros let you hide all that.
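As a sketch of what that boilerplate elimination can look like, here is a hypothetical `with-retries` macro (the macro and `fetch-url` are illustrative assumptions, not from any particular library):

```
;; Hypothetical macro that wraps retry boilerplate which would
;; otherwise be repeated at every fallible call site.
(defmacro with-retries ((n) &body body)
  (let ((i (gensym)) (e (gensym)))
    `(loop for ,i from 1 to ,n
           do (handler-case (return (progn ,@body))
                (error (,e)
                  ;; re-signal only on the final attempt
                  (when (= ,i ,n) (error ,e)))))))

;; Call sites now state only the intent:
(with-retries (3)
  (fetch-url "http://example.com"))
```

The retry mechanics live in one place; every caller reads as a plain statement of what it does.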

Now if you do macros wrong, then of course code will be unreadable. But the same is true when you design your API wrong using functions or classes.

Moreover, whining about macros being hard reminds me of whining about the ternary operator in the C++/Java/PHP world, where many people say not to use it because "juniors don't understand it". The solution isn't to ban ternary operators - it's for the juniors and the whiners to get their shit together and spend 5 minutes learning about it.

> Even more than in "Lisp age" because we know how long code will be around. Which is one of the reasons why we have been moving at an accelerated pace toward statically typed languages.

Do we? All I see is throwaway code. Especially on the Web, everything is ephemeral, and nobody honestly expects stuff to last more than a few years (most startups are actually based on this assumption).


> Why is it so hard to see that this can lead to an explosion of unreadable code if left unchecked?

Anything can lead to an explosion of unreadable code if left unchecked. Any language feature, with no exceptions. Variables, loops, functions, types - you name it. Anything can go wrong if used with a bit of imagination.

Macros, on the other hand, are uniquely capable of eliminating this complexity: of removing degrees of freedom for what can go wrong. Nothing else can do that.

> Wouldn't you be concerned if you had to work on a huge code base where most of the code is written using macros?

I'd be extremely happy to be able to work on such a well designed project.

> I would run away, personally.

It only means that you don't know how to use macros. Nothing else.


> I'd be extremely happy to be able to work on such a well designed project.

You really think the project is well designed just because they use macros, without even looking at the code or knowing the engineers?

Now I really think you're not for real and just messing with us.


I think it can at least be a sign of a good design. If it got that far and is still kicking, chances are pretty high.

Any comments on any other points I made?


I completely agree and I wish we could have the best of both worlds. I love Clojure and I love the compiler support with static typing in other languages.

Clojure attempted to add static typing with the core.typed project, but high-profile exits [0] from that framework have made it clear that this is a problem that can only properly be solved at the language level.

If I could have a lisp with static types, I might not use anything else again.

[0] https://circleci.com/blog/why-were-no-longer-using-core-type...


Common Lisp has static types.


But not good parametric polymorphism. Common Lisp's static types are "too static", and often unsafe.


Ok for parametric polymorphism: the identity function in OCaml has type 'a -> 'a, whereas in SBCL it is "(function (T) T)". On the other hand, "(lambda (x) (if x 0 1))" is a function from T (anything) to the type BIT, not to some arbitrary integer type. Moreover:

    (lambda (x)
      (unless (minusp (if x 0 1)) 
        (error "Oops")))
... has type "(function (T) NIL)", meaning that the function does not return a value. That means that types are propagated so that the test can demonstrably always fail. A type in SBCL defines what is returned when execution terminates normally. So for example the type of `(lambda (x) (loop))` is "(function (T) NIL)", because it never returns (yes, I know you cannot always tell if it halts or not). The bottom type NIL should not be confused with NULL, the singleton type for the NIL value.

In OCaml, exceptions are not visible by the type system and you can write:

    let f x = raise (Failure "NO")
... and still have the type 'a -> 'b

So the kinds of analysis that make sense in a language, as well as their soundness, are relative to the properties you want to check. Would you say that OCaml's type system is unsound because it allows you to run code that can raise exceptions at runtime? I would love to see more precise type checking in OCaml, for example, and it probably already exists (I'm interested, if anybody has an example). But it probably makes little sense over there.

The same goes for parametric polymorphism in Lisp. The most in-depth approach to bring parametric polymorphism in CL is LIL (https://common-lisp.net/~frideau/lil-ilc2012/lil-ilc2012.htm...), but since it is dynamic, people who view dynamic typing as a deficiency might see that as a restriction.

You also claim that the type system is "unsafe". On the contrary, types being checked at runtime is a safe approach (preventing buffer overflows, etc.) and plays well with the fact that everything can be redefined at runtime (maybe you don't like this aspect). In SBCL, type declarations are assertions. With the default optimization levels, that means that if they cannot be proved, they are checked at runtime (with the few caveats listed in the SBCL manual). So if your input variable X is typed as NUMBER and the function terminates normally, then you know that X effectively was of type NUMBER (that's a guarantee instead of an assumption). That result can be used in the following calls so that checking the type of X is not necessary anymore (if it is not modified in the meantime). The only case where types are trusted blindly instead of being checked is when you set the safety level to zero. This can be changed locally, not necessarily as a global switch.
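A minimal sketch of the behaviour described, at SBCL's default optimization settings (the hypothetical `halve` function is my own; exact error reporting varies by SBCL version):

```
(defun halve (x)
  (declare (type integer x))  ; an assertion, not a blind promise
  (/ x 2))

(halve 10)     ; => 5
(halve "ten")  ; signals a TYPE-ERROR at runtime instead of
               ; trusting the declaration blindly
```

Only at `(safety 0)` would the declaration be trusted without the runtime check.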


Run time type checks in my experience mean a few things:

1) You can compile or run the program fine even if there are type errors.

2) An existing type error may never be caught if that particular block of execution never runs. Then you can have unexpected surprises later when you finally do something to trigger that block.

3) They are slower than static type checking due to the runtime costs of checking types. Statically-typed compilers can do enormous optimizations once they know exactly what the types are.


I am fine with statically checking types, or any kind of proof. Are your floating-point calculations always precise enough? Can you detect whether any unsanitized user input ever reaches security-critical code? Does this multi-threaded code ever deadlock? There are plenty of things that can be done to guarantee properties you care about, and the reason you do not always use them is budget (time, money). Do you always write contracts around your stuff? Is it a good contract, not one that just repeats what your code does? If you don't, please understand that I don't always use static types for everything and in all cases. Also, my code is not purely functional either.

If I need to do smart things in Lisp ahead-of-time, I may want to use a DSL and prove whatever I want to prove on it, and generate correct-by-construction code that does not check things at runtime. I could use ACL2, too. Please note also that I can work in a different language when it makes sense.

3) Statically typed compilers include Lisp ones. My code is sometimes underlined in orange like yours, because I made some dumb typo or because types do not match. Likewise, optimizations are done too, in particular inside functions, where things are more static than in the global environment.

What about efficiency? Look at the postal system: you have to wrap letters and objects in envelopes or packages (except postcards, which are like fixnums), put a label on them with a lot of information... what a waste of time and space! Yet, each object has a type now and can be dispatched reliably and efficiently. If you put your letter in the wrong box, you will have enough information to recover and perform your job. Once letters are all filtered out from packages, you can avoid checking that they are letters and gain some efficiency. You can build a new kind of service (drone delivery?) with a special label and dedicated rules and integrate it within a running system without restarting the world. If anything goes wrong, you can have a generic error-handling mechanism that does not crash everything. Static types are more like pipes: clean water here, used water there, and they never mix in a wrong way thanks to pipe calculus. So while I agree that static analysis can be great for doing crazy optimizations, there are use cases where dynamic typing shines, and generally people replicate it using tagged objects anyway: think about game entities which are "typed" dynamically, or frameworks where you can load custom scripts (and those are generally not type-checked, unlike Lisp).

2) You plan for failure. Even in a statically-typed language, you'll have runtime bugs. Only in a mission-critical system are errors fatal. If you place restarts or error handlers accordingly, you'll have the opportunity to fix your stuff. Ask Erlang people about reliable systems.

1) Worse, you compile your code, the compiler complains but you can still run the produced code! Then you can test your error-handling code.

Seriously, this is not a problem in practice. Coding and testing are interleaved because the environment is right here awaiting orders. If you look at the end result, I am doing the job of the static type checker by fuzzing input in a way that makes sense for that particular function. I also have a global view of the system and more context to decide what will happen at runtime, or not. When I fail (I am not proud of it, like most static analyzers), the runtime is here to catch it.


And Racket. And Shen.



Irken looks interesting, thanks for the link.


Javascript though.


In both cases of Javascript and Objective-C, programmers didn't deliberately choose "dynamic" as an explicit engineering design.

Even though the Apple System 9/OS X platform has been around a long time, the Apple ecosystem got really popular with the iPhone and iOS in 2007. From 2007 to 2014, the official SDK for that was Objective-C. Obj-C is dynamic, and as a consequence, people happened to program in a dynamic language. ("When in Rome, do as the Romans...")

Same situation for web browsers. The only "sdk" for adding interactivity and actions to web pages was Javascript -- which happened to be "dynamic".

In other words, "dynamic" was something programmers had to live with rather than something they chose for architectural superiority.

In both cases, the desire for static typing eventually became apparent, hence Swift and TypeScript.


People are constantly complaining about using it. And many of them are trying to make it safer, it's just a slow process since it requires standardization and adoption at a large scale. Think about strict mode, ES6, Google Closure compiler, Flow, Typescript, etc.


The piece of code is called Open Genera and it is available from pirates.

Some sources and excellent documentation are on the bitkeeper.

There is also the MIT Scheme, as seen on TV in the Wizards Lectures (1986 Abelson & Sussman SICP lectures).

BTW, the guys who designed and developed early Lisps, up to the 80s, were bright people heavily influenced by discoveries of the "modern science" of the time: genome and protein structures, the basics of cell biology, signaling pathways, early genome sequencing techniques, etc. Their designs were based on the right principles. Erlang is another example.

Another obviously successful project is called (surprise, surprise!) GNU Emacs. Unsurprisingly, it is a continuation of early Emacsen, such as Zmacs.

Recently, Eitaro Fukamachi (a true hero, in my opinion) has bootstrapped a few remarkable projects, which should mark the beginning of the Common Lisp renaissance, but these projects also went unnoticed by packers. Why Clack when we have PHP or Java Server Faces?

As for Erlang syntax - the pattern-matching on receive with variable binding for simple, terms-based protocols is, again, too good for mediocrity to grasp:

   {ok, Val} | {error, Why}
etc.


You're doing some major revisionism with this post and claiming a lot of completely unsubstantiated (and in my opinion, completely made up) facts.

I don't think Gosling, Stallman or Armstrong ever mentioned any basis in genetic science to justify their language. And seriously, think about it... It's completely absurd.

I'm also not sure why you think that returning error codes is "too good for mediocrity to grasp", because in the 21st century, it's considered a very bad practice that leads to extremely unsafe coding practices, which is one of the many reasons why it's been deprecated in most modern languages invented in this century (except Go ;-)).


Sussman cites a desire to mimic biology in his essay "Building Robust Systems"

https://groups.csail.mit.edu/mac/users/gjs/6.945/readings/ro...


It is actually a long story, and it goes back to the old dogma that lists and structures made out of conses are good-enough, almost universal building blocks for code and data. Type-tagging was obviously inspired by DNA, etc.

As for protocols, there is, for some obvious reasons, still nothing better than asynchronous, out-of-order message passing over packet-based networks, with a fixed-size header (upon which one could pattern-match in a single pass) and adjustable-size payloads, which can adapt to a physical channel by adjusting to its frame size; so the stdlib and OTP are good enough, well balanced, and just work.

It is also not a coincidence that Erlang is pure-functional - so is the world of proteins and enzymes.

There is a bit more universality in chunked linear structures and stateless asynchronous message passing of type-tagged binary data with feedback-loop co-regulation than it seems.


All the things you mention are library-level stuff. It's also no surprise that Erlang is actually a Lisp with Prolog-ish syntax - just take a look at the intermediate language that sits between Erlang and BEAM bytecode ;). Yeah, the one you have to learn about if you want to do parse transforms.


> Type-tagging was obviously inspired by DNA

Source?

> It is also not a coincidence that Erlang is pure-functional - so is the world of proteins and enzimes.

That world is mostly driven by mutations, which are the exact opposite of functional purity.

I really have no idea where you get all these broken metaphors.


Good polysemous use of the word 'mutations.'


Want to see a real killer example? Look no further than the Nanopass framework. It outhaskells Haskell and outmls MLs. It solves the expression problem they struggle with. It makes compiler construction laughably trivial.


Actually, the expression problem has been solved in Haskell. (Or rather, there are multiple good attempts at solutions.)

Is nanopass typed? I'd like to see some good examples of typed Racket.


It is a sort of a typed DSL hosted on top of an untyped language.


Thanks!


Computer science isn't a contest. There isn't any reason for a push. Among scientists and good engineers, the more intelligent tech rises to the top. Everyone else uses PHP.

There are practical limitations. If everyone else is using C, it will be more difficult to introduce LISP.

Of course, making a push for Lisp usage would gain users, and therefore tutorials, making it more practical, but probably only among those same scientists and engineers, due to the nature of Lisp itself and how you think when using it.


> I don't get it.

1) The purpose of HN posts on Lisp is to fellate pg, nothing more.

2) One of the big reasons Lisp has failed commercially is that expert programmers create their own personal libraries, which cannot be taken to the next employer for legal reasons. So all that effort is wasted.


Mate I like reading about lisp and I couldn't give 2 craps about pg. HN has gone beyond yc and pg quite a while ago.


Why is #2 only true for Lisp? I've seen it true for every job I've ever had.


In Lisp things that would be syntax in other languages are libraries instead, so the impact is bigger.


(2007)

lisp (imho) pushed the limits of what was possible for a language to do. It pushed in many directions. eval, macros, gc, the comments suggest many things. As computers got faster, it became worthwhile to incorporate more and more of those expensive ideas. Heck, go is billed as a systems language and it's GC'ed! That would be crazy 20 years ago. Perhaps still a little crazy today, but far more feasible.

Lisp is like the Simpsons. Lisp did it. (Maybe not first, but Lisp did it.) Of course later languages are going to pull some of that wonderful functionality. Other stuff, like reader macros, gets left behind. It's even possible to do pretty explicit typing in lisp, but it never felt as natural as an ML/Miranda/Haskell kind of typing.

The C# observation is amusing: the original .NET garbage collector was written in Lisp. [1]

[1] https://blogs.msdn.microsoft.com/patrick_dussud/2006/11/21/h...


> Heck, go is billed as a systems language and it's GC'ed! That would be crazy 20 years ago.

Modula-3 was a systems language with GC that was released in the late 80s. It could be disabled for applications that couldn't benefit from it, but in the long run it helped Modula-3 be a much safer language for systems programming than anything with manual memory management. After all, most "systems programs" aren't OS kernels anyway.

Even if I were to write an OS in Modula-3, the first thing I'd do is write a real-time GC as an `unsafe` module (possibly mixed with some assembly), then write the rest of the OS such that it can make use of that. Hell, modern machines have so many cores that it might not be crazy to dedicate a core to a concurrent GC. There are also GCs that allow programmers to set a cap on latency, and the GC always ensures that it returns control before the cap is reached [1].

Given the considerable amount of research that has gone into writing operating systems in managed languages (Lisp, Oberon, Haskell, C#, ...), it doesn't seem particularly crazy to me today. Besides, there are plenty of other reasons to dis Go :p

[1]: http://www.cesura17.net/~will/Professional/Research/Papers/b...


> Heck, go is billed as a systems language and it's GC'ed! That would be crazy 20 years ago. Perhaps still a little crazy today, but far more feasible.

Not at all, I was introduced to Native Oberon around 20 years ago. It was a great OS used at ETHZ and many European universities, especially the System 3 version with its Gadgets framework.

Around the same time DEC/Compaq had SPIN OS implemented in Modula-3, with POSIX API and a very interesting distributed objects framework.

Xerox PARC had the Mesa/Cedar workstation in the early 80's.

There were a few OSes implemented with Algol 68 variants that had GC at OS level in the early 70's.

The craziness is that in the last 20 years, thanks to the rise of FOSS and its UNIX/C culture, younger generations seem not to bother with everything else that existed.


> Heck, go is billed as a systems language and it's GC'ed! That would be crazy 20 years ago.

When you allow the term to mean anything, it's only natural that anything qualifies.

The computing world uses the term to describe things you build operating systems and kernels and hardware interfaces with.

Google redefined it to mean something you build web apps with ala NodeJS. By that definition, PHP is a systems language too.

Basically Go as a systems language is a false claim.


Just to clarify my comment in case it comes off overly negative:

I don't think go is a bad language as such. I just think representing it as an alternative to C, a systems language, is misleading.


20 years ago, perhaps. But 30 years ago, you could buy workstations that were programmed in Lisp all the way down to the metal: https://en.wikipedia.org/wiki/Symbolics


> go is billed as a systems language

When people say "system" they mean "back end" servers and tools. They don't mean kernel programming.


There was some confusion about it, but in general people backpedalled once they realized the language is not a systems language, and so they redefined what "systems" means.

http://techcrunch.com/2009/11/10/google-go-language/

--- Go attempts to combine the development speed of working in a dynamic language like Python with the performance and safety of a compiled language like C or C++ [...] the compiled code runs close to the speed of C.

We’re hoping Go turns out to be a great language for systems programming ---

They pit it against C and C++ and if Rob Pike doesn't know what "systems" means then well, nobody does ;-)

It doesn't matter really, Go is a popular, productive, useful language. There is nothing wrong if it "pivoted" or updated its description a few times in the past. It doesn't make it bad or deficient, but there is also no point in denying it or trying to rewrite history.


> It doesn't matter really, Go is a popular, productive, useful language.

Yes it does; definitions matter, or you end up with marketing buzzwords like "serverless". Was calling Go "a systems language" a marketing ploy? That's the question.


Not true. Ruby is not a systems language. I would say a systems language is characterized by static compilation, static typing and fast performance with low latency. Of course, someone will quickly suggest a counterexample like this comment.


I've most often seen "systems software" used to designate software whose primary users are other programmers or systems as opposed to end users (e.g. databases, message queues, monitoring tools, etc.).

Ruby is typically used for writing business web applications and while those applications run on the back end, their primary users are business/end users.

Of course, there doesn't seem to be a clear consensus on how those terms are used and the line is often blurry between the categories.


Accidental downvote, sorry. Counterexample: Holy-C isn't typechecked.


> Ruby is not a systems language

Ruby can be used for everything go is used for. It will just be slower.


I can imagine a webserver that drives a little robot via bluetooth. Low consequence for failure. It's easy to imagine the webserver part of the functionality causing a long pause, and a "stop" message to the robot being late.

golang would do a fantastic job at this; I think it would be far more enjoyable than C. Latency almost never matters. But sometimes it matters.


Many people consider Go (and Rust) as a reasonable choice for developing a toy kernel, and some projects have turned out to be somewhat serious.


Go and Rust aren't in the same domain. Any time you have a GC then you need to talk about pinning, non-determinism and a whole host of other issues.

FWIW last time I looked Go doesn't even specify heap vs stack which would be a non-starter for me.


"When possible, the Go compilers will allocate variables that are local to a function in that function's stack frame. However, if the compiler cannot prove that the variable is not referenced after the function returns, then the compiler must allocate the variable on the garbage-collected heap to avoid dangling pointer errors. Also, if a local variable is very large, it might make more sense to store it on the heap rather than the stack.

In the current compilers, if a variable has its address taken, that variable is a candidate for allocation on the heap. However, a basic escape analysis recognizes some cases when such variables will not live past the return from the function and can reside on the stack."

https://golang.org/doc/faq#stack_or_heap


> Perhaps still a little crazy today

Why?


RAII is an exceptionally well-done memory-management model that some languages, like C++ and Rust, use to handle resources, and it works remarkably well. Hell, even reference counting in Objective-C is pretty good. Garbage collection adds a lot of overhead but gives you a good amount of safety and speed in development; you certainly wouldn't want to use it in a scenario that requires real-time guarantees, or in many types of embedded systems.

Honestly I wish more things embraced RAII. It seems like it would be a better thing to do, in the long run, but that's just my opinion.


Depends on how you build your application. Small focused processes work great. If you're not careful and all the functionality gets stuck in one huge executable, it's possible to have long pauses. If that one huge executable winds up interacting with a serial device, like an old printer or dumb terminal, well, buffers are small, data gets lost, you have to resync, and it sucks.

FWIW, my experience with that stuff is with java perhaps a decade ago. It can all be made to work with smaller heaps, better partitioning, but you wind up introducing more, but bigger buffers.

It almost never matters. Right up until you have to read the status from some old microscope, or something equally obscure.


Java came out 20 years ago, though.


I only programmed in LISP for a semester of grad school for an AI class. My impression: it's a programming language with a unique perspective. Did I find it to be the programming language of the gods? No.

Sometimes I find myself with a problem where I find myself wishing I was programming in LISP. Does this happen every day, every week, every month? No. More like once every couple of years.


-- This is prolly the most arrogant opinion I have --

I think if every programmer started using LISP we would have a massive unemployment crisis, since many software engineers would lose their jobs, as most are forced (due to the nature of capitalism) to sell snake-oil solutions to problems already solved 50-60 years ago.

--

I am not a Lisp programmer, so no accusations of smugness, please. I just ended up using Lisp concepts daily when programming terrible languages, since that is what the market wants :) # Resume Driven Development


Lisp has some good traits, but it isn't the 'ultimate' language. There are a number of issues with Lisp; for example, I've heard debugging Lisp macros can be quite hard, and I'd also say optional typing is a weakness. Even Sussman, one of the creators of Scheme, classes Haskell as the best programming language we currently have, or in his words "the most advanced of the primitive languages."

http://limist.com/coding/talk-notes-we-really-dont-know-how-...

That's not to say Lisp doesn't have merit, because it does, it's easy to pick up, very flexible, and in the right hands can lead to impressive results, but we've still got plenty of work to do to improve the state of the art.


I disagree with your assessment, and I guess with Sussman's appraisal as well.

I believe that you are mistaken in your belief that Lisp isn't the most advanced language. The submitted thread contains many comments that state that Lisp is basically a pure implementation of the untyped lambda calculus. I don't think this is correct -- Lisp is itself a unique model of computing, and it generally unifies Turing's and Church's computational models in a relatively unbiased manner IMO. For that reason, I regard Lisp as the most flexible of all "multi-paradigm" languages. The Lisp machine's software was basically exclusively implemented in Lisp; the fact that developers could comfortably accomplish that testifies to Lisp's flexibility.

In Lisp, entire linguistic constructs can be dynamically created to iteratively solve a problem. That's basically a bunch of buzzwords, but I think the statement holds significance. I've read comments elsewhere regarding Lisp as the "Maxwell's equations" of PLT. I agree with that. If you're not familiar with the meta-circular evaluator, I encourage you to read up on it, as it justifies this statement.

That being said, Haskell's type system is really fantastic. I'd love an implementation of a Lisp with a Hindley Milner-esque type system.

Basically, I think Lisp is the most advanced language because it captures what a language is perfectly. This leaves it free to evolve and embrace new ideas, and has contributed to its continual success and long life.


> "Lisp is itself a unique model of computing, and it generally unifies Turing's and Church's computational model in a relatively unbiased manner IMO. For that reason, I regard Lisp as the most flexible languages of all "multi-paradigmed" languages."

I agree that Lisp is a flexible language, what I disagree with is that this automatically makes it the 'ultimate' language. By ultimate language I mean the best language we can ever create.

The reason I can say that is I see Lisp's flexibility as a double-edged sword. On one hand you can adapt Lisp to match a task, effectively creating a DSL for the particular application you're working on. On the other hand, the flexibility means that you cannot automatically assume as much about an application structure compared to some other languages.

This point about application structure is probably easiest to explain through the lens of type systems. Some of the benefits that type systems can give to other languages do not necessarily apply to Lisp. Do any Lisps have type systems? Sure they do, but they're always optional. It is this that causes an issue. Consider if you were writing some application that used a Lisp type system, and now you want to pull in an external library that performs a set of functions that it would be non-trivial to implement. There are no guarantees that this external library has any types you can query, regardless of how you designed your own code.

Whilst I appreciate that some Lisp coders are comfortable with 'rolling their own' when it comes to reusable libraries, I'm sure you can think of libraries that would be a lot of work to put together. For example, a library on natural-language processing.

Lisp has its advantages, but I think we still have room to improve when it comes to programming languages. Even if Lisp can express everything that can be calculated, there are other factors to consider when looking at the merits of programming languages.


> I'd love an implementation of a Lisp with a Hindley Milner-esque type system.

I am not certain of whether it is more Hindley-Milner or more "esque", but you may find something of interest in Shen [0].

[0] http://shenlanguage.org/

Edit: I submitted this reply an hour or so after opening the tab, during which interval several others mentioned Shen (which is good, as it deserves a bit of exposure).


Shen (in particular its type system) is an executable Sequent Calculus system [0] written with a small portable lisp. This derives from a propositional logic system, not H-M over typed lambda calculus.

[0] - http://www.shenlanguage.org/learn-shen/types/types_sequent_c...


Thank you for clarifying. Can you comment on the relative expressive power of Shen's type system versus e.g. Haskell? I understand that the former can accept as well-typed terms the other can't; in what way is this a consequence of modeling Shen's type system using the Sequent Calculus versus Hindley-Milner deduction?


The page describing Shen uses the terms "proprietary" and "closed source" as if they're good things when describing the "Shen Professional" dialect. (Compiler? Environment? It's a bit opaque.)

I'm certainly no Free Software zealot, but that's been an almost unfailing recipe for getting a language and its ideas ignored for 20+ years now.


Shen was opensourced some time ago, under a BSD-style license: https://github.com/Shen-Language/shen-sources/blob/master/BS...


And the creators appear to be reserving some features for the "commercial" version, and also attempting to heavily promote that version in a way which—I think, based on a few decades' experience—will actively turn away the people who might otherwise be interested in using it.


http://shenlanguage.org

Lisp can have typing just as strong as Haskell.


I wonder if Sussman has come across Shen [1]?

It is a small Lisp based on Klambda, comprising just 43 instructions, that has been ported to Common Lisp (SBCL), Haskell, Ruby and Emacs Lisp (just this month). It has also been ported to Python, Clojure, Javascript, Java, and the JVM, though those ports are not yet up to Shen 19 certification.

Shen has pattern matching, an integrated fully functional Prolog, static type checking and even optional lazy evaluation. It has a very strong type system.

Shen commercial has Griffin, an optimized compiler, SML (Shen Markup Language) to generate HTML, and concurrency.

Popularity has not been my reason for selecting a language. I have been using J, an APL-derived, language for years, and I also like APL, K, and now Q! It is amusing to see developers rediscover APL concepts after 50 or more years!

[1] http://shenlanguage.org/

[2] jsoftware.com


How well does the idea of commercial languages fit development needs? I mean, have there been any success stories for commercial languages?


I am enjoying using J for fun. What kind of work do you use it for?


Anything I can think of using it for. I have used it at several jobs, for fun, and now I'm trying to learn how to build standalone J/Qt cross platform apps. The J runtime runs on Android and iOS too, but my needs are Windows, Mac, Web and Linux for now.


Good luck man. It's a really interesting language. Error messages could use a little more context, but it's super fun to do /r/dailyprogrammer puzzles with it.


Thanks.

Error messages are concise but to the point, and the code is short enough that no scrolling is usually needed. One page view!


True. I still have trouble when I get a domain error because my thinking about verbs / interpretation order / context / rank isn't up to scratch. I'd love to get something a little more verbose, like "expected ~" or similar, to help me zero in on what's not working.


> I've heard debugging Lisp macros can be quite hard

It's much easier now that there is slime-macrostep[0]. Hmm. Maybe I should write another debugging lisp post on it.

[0] http://kvardek-du.kerno.org/2016/02/slime-macrostep.html


I'd be interested in reading that post. Thanks for the link as well.


Debugging lisp macros is like debugging anything else. You have functions that take an input and produce an output. The input is data, the output is data, and there are unlikely to be side effects. Use the same tools to debug macros that you would use to debug other functions. We have debuggers, tracers, and the repl to help us debug macros, since macros are just normal lisp functions (that run at a different time than "normal" code).
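To make that concrete, here is a minimal sketch; the `square` macro is made up for illustration, while `macroexpand-1` is standard Common Lisp:

```lisp
;; A deliberately buggy macro: it evaluates its argument twice.
(defmacro square (x)
  `(* ,x ,x))

;; Debug it like any function from data to data: feed it a form
;; at the REPL and inspect the output form.
(macroexpand-1 '(square (incf n)))
;; => (* (INCF N) (INCF N))  ; the double evaluation is now plainly visible
```

The input form and the expansion are both ordinary lists, so the whole debugging loop happens at the REPL with no special tooling required.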


I agree. As sklogic showed, you can actually get the remaining attributes of LISP by building a Haskell or whatever on top of it. Doesn't have to be Common LISP or even a full LISP. Just the syntax, macros, eval, and basic compiler. One can do the rest in the other language as a DSL.

And then another language as a DSL. And another. :)


What are some examples of problems that are being snake-oiled that were already solved half a century ago?


Most of these are from the late 70s/early 80s, so not quite half a century.

Debugging – look at Smalltalk if you don't believe me.

Security – Intel i432 didn't sell well, but now we're sure wishing it had. The entire system was built on an object-capability model from the hardware up. The OS and userspace were fully written in Ada.

Persistence – System/38 and AS/400 — everything has a persistent address. All pointers are 128-bit (yes... this was designed in the late 70s). Everything is an object and the OS has a built-in database. You can persist anything effortlessly.

String processing – Regular expressions are useful; SNOBOL is more readable, more powerful, more efficient, and existed in 1962. Awk is a joke.

Declarative programming – I feel like Greenspun's 10th Rule applies at least as much to Prolog as it does to Lisp. The sheer number of cases where embedding a little unification engine dramatically simplifies code is pretty baffling.

GUIs – we've almost caught up to the ease of use of the Dynamic Windows system on Symbolics Lisp machines.

Concurrency – it amuses me that Go gets so much attention lately when it's just a re-skin of Occam from 1983.

Probably more. I feel like i432 and AS/400 in particular were very, very ahead of their time.


Do you really think a very high level processor was the way to go? It seems like we're just going lower and lower in the stack in the more advanced applications.

I mean, on the i432 (from what I learned about it) you couldn't even free memory; you were stuck with its specific instructions, GC, and other elements. It could be an interesting version of a secure execution enclave (like what Intel is trying to create today), but I'm not sure it would be useful or performant for general processing.


Tagged memory was definitely a way to go. Would have eliminated most of the security issues we had so far.

Luckily, it's slowly making a comeback, see the RISC-V efforts for example.


And Intel's MPX.

We already had all of that in the early 60s with OSes based on Lisp and Algol dialects.

Apparently we need daily memory-corruption CVEs as an industry wake-up call.


> The sheer number of cases where embedding a little unification engine dramatically simplifies code is pretty baffling.

Would love to learn more about this, having done Prolog back in uni. Know of any good articles or posts showcasing "embedding a little unification engine" in modern/otherwise-sub-par programming?


https://rwmj.wordpress.com/2015/12/06/inspection-now-with-ad... is pretty interesting. LuaJIT uses logic programming extensively in its optimization pipeline IIRC, but I don't know of any articles on it (just Mike Pall on newsgroups).


Even Datalog can be extremely useful: http://www.cs.cmu.edu/~aldrich/courses/654/tools/bierhoff-bd...

Also take a look at this, quite a convincing example of how useful Prolog can be in a compiler pipeline:

https://pdfs.semanticscholar.org/3667/96ef0c3e9a8b4acb70f838...


A lot of what is currently happening in the JavaScript world is just people discovering that the "copy all code into one file" approach doesn't work; so they reinvented modules, and now, after the "kik" debacle, they are reinventing namespaces, dependency management, etc.

Just look at the javascript world, they have enough examples.


This. JavaScript's engineering culture is a dumpster fire of bad ideas that the rest of the software community abandoned 40 years ago.


It is a language in a constantly changing environment. The web today has different requirements than it did 20 years ago. It's not right to criticize the reimplementation of things in different systems. That has to happen...

Paradigms shift, and so too do implementations.


Abstract concepts that are necessary for software to be useful or practical remain applicable despite the allegedly changing environment. If they weren't, js/node wouldn't be reinventing these concepts, like package and dependency management. I think it's good to revisit occasionally to see if there's a better way, but my experience with node has always been an amalgam of cognitive arrogance and assumptions about how "those guys" (any other software ecosystem) built a needlessly complex, bloated or slow solution to a common problem. In the end, these solutions turn out equally complex or lack major features, such as security or stability.

I'll point to one major counterpoint before anyone else does though: JSON turned out to be pretty handy. I am glad we didn't stick with XML, although that's what browsers expected in the early 2000's (Ajax/xmlhttprequest).


> I'll point to one major counterpoint before anyone else does though: JSON turned out to be pretty handy. I am glad we didn't stick with XML, although that's what browsers expected in the early 2000's (Ajax/xmlhttprequest).

You didn't look back far enough. S-expressions were pretty handy and performed more or less the same task as JSON (besides being used for things like representing code). Using XML as a data serialization format was a mistake, and I'm glad we're finally recovering from it, even though it took a new generation reinventing some wheels to get there.
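For illustration, here is a hypothetical package record (made-up field names) in both notations; the s-expression form predates JSON by decades:

```lisp
;; JSON:   {"name": "left-pad", "version": "1.1.3", "deps": ["assert"]}
;; The same record as an s-expression property list:
(:name "left-pad" :version "1.1.3" :deps ("assert"))
```

Both are nested lists of atoms; the main difference is the bracket style and, in Lisp's case, that the same notation also encodes the programs that process the data.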


That doesn't explain the left-pad debacle, and the apologists defending that anti-pattern. That's a people problem, not a technology problem.


Left pad was both.

The ultra small single function library approach prevalent in npm is caused by fundamental limitations of javascript in the browser.

The political issues that surfaced it all was a people problem.


While the impetus for a left-pad like function is due to a lack of a standard library, that's not what went wrong. Brushing the problem aside as merely a "political issue" shows that you don't understand what went wrong either.

1) The whole culture of bringing in a mass of external dependencies for single specialized functions is, to be quite honest, rather strange and problematic on its own. Sure, there's no standard library, but no one thinks about making one. Instead everyone makes "frameworks" for turning DIVs into BUTTONs for the 86th time.

2) Bringing in literally an unknown number of external dependencies is dependency hell. How many different left-pads and left-pad-likes do you have? Do you even know?

3) Trivial functions as standalone dependencies are justified as a vetting process. Somehow pulling in everything separately will improve code quality? No. Fixing bugs improves code quality. Honestly, someone in this very forum justified left-pad as a standalone because it helped "find edge cases." What's the edge case in left pad? There are literally only two edges: -1 and (padtosize - len(num)). On one end you get an undefined, and on the other, your string is wrong. Even if you walk off the entire string, you get another undefined. It's a lesson 1 Intro to Programming problem.

4) Having a bunch of minimal functions makes choosing the right one impossible; or maybe it doesn't matter at all, in which case what's the point? Seriously. I was going to make a remark about "Why not just implement printf?", when a quick google found THREE printfs on npm (format, sprint, and qprintf)! And then of course there's left-pad if all you want is "%03d".

5) The actual source of the left-pad debacle is auto-updating external dependencies from the Internet. This was a horrifically Bad Idea before the left-pad debacle, and it's a horrifically Bad Idea after it. And I have no doubt that the JavaScript community did not learn this lesson; instead, it appears all that was learned was to be scared of trademark claims. Auto-updating dependencies is a Bad Idea because it's how you get bugs. Seriously, if the JavaScript community simply didn't do this, and instead followed the software engineering best practices in place since literally the dawn of external libraries, then the whole left-pad debacle would have been greatly mitigated.

The whole security injection clusterfuck happened because the kid that runs NPM got scared because "code was breaking". Of course, it's not his code; it's just random code on the Internet, because the JavaScript community doesn't employ widely implemented engineering best practices. NPM as an exploit delivery vector was always there, and it's still there if you don't do proper versioning and vetting of your external libraries. Passing the buck isn't acceptable.

Left-pad was, to the greater software engineering community, like the scene at the end of Lord of the Flies, when the adult finally shows up, looks around at the chaos and says, "What are you guys doing?"
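For what it's worth, here is a hedged sketch of such a function (not the actual npm left-pad source), showing how little surface area there is for "edge cases":

```javascript
// A minimal left-pad: prepend `ch` until `str` reaches `len` characters.
// If `str` is already at least `len` long, it is returned unchanged.
function leftPad(str, len, ch = ' ') {
  str = String(str);
  let pad = len - str.length;
  let out = str;
  while (pad-- > 0) out = ch + out;
  return out;
}

console.log(leftPad(7, 3, '0'));  // "007"
console.log(leftPad('abcd', 3));  // "abcd" (already wide enough)
```

Ten lines, two boundary conditions, and no dependencies: exactly the kind of thing a standard library would normally absorb.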


> Sure there's no standard library, but no one thinks about making one. Instead everyone makes "frameworks" for turning DIVs into BUTTONs for the 86th time.

Well, I know that I brought javascript into this, but there are some stdlibs being built: underscore.js is one nice example.

http://underscorejs.org/


Nice! Although, I did have to double check that it wasn't just setting "text_decoration: underline" on some span. ;)


The problem is not as simple as just a missing stdlib.

The problem is a standard lib that doesn't ship with the browser, compounded by the fact that JavaScript build pipelines can't easily do (and in fact don't do) dead code elimination. So shipping an entire stdlib, 60-70% of which you don't actually use, to the browser is a terrible idea.

Javascript really is a special snowflake here and refusing to recognize that is problematic to say the least.


It was fairly trivial to foresee the requirements. I was ranting about how stupid html is back in 1993. If things were done the Lisp way back then, we would not have seen any of this "changing environment" epic circus.


Alas, "worse is better" (or so they say), and the industry decided to reinvent everything every 5 years :(.


CRUD. My day job involves maintaining ASP WebForms sites which are a set of inputs to create compute contracts. Lots of copy-pasted code I'm trying to consolidate. It lacks use of generics, an ORM, or form models.


Not really 'problems', but some examples of things that did not have to be written over and over; we still do so because a) it is hard to reuse software, b) we have no source from others, c) snake oil: we want to make money, and just giving people something that has been proven to work and just needs a little tweaking doesn't make as much.

Besides 'graphics' and AI, we are not doing much differently from the 80s, at least. The software I write for mobile devices is in essence the same as the software I wrote for DOS/CPM/MSX. The crazy amount of time we spend on the frontend (animations, UI/UX etc.) is very different from that time, but the functionality is not. I actually wrote POS systems (for bars/restaurants) in the 80s (for a friend of the family), the 90s, the 2000s and this year, and the logic / db stuff is the same for all of them. The only change is the GUI, and even that not massively. Why didn't we write a POS in the 80s, call it 'POS-barres', give it away for free and NEVER again write POS software for bars/restaurants? 'But they have different wishes!' No they don't. I sold it (being the coder but also the (pre-)sales person); they don't have different wishes; every POS-barres is the same, plus/minus some hardware and more (over time) data gathering. For me this falls under snake oil: knowing my own ex-company and my competitors, we sold basically lies ('our software is better'), and luckily bar/restaurant owners have no clue what to say to that, because if you went into it you would be hard pressed to even see any difference, maybe besides hardware support years back.

This goes for many software systems. My first company wrote educational software sold to almost all schools in the Netherlands; this software is still sold, and it has run on the same logic since the 80s (written in (Turbo) Pascal, imported directly into Delphi as a unit in the 90s and onward); the GUI and of course the data in it were tweaked, but the logic did not change.

Sure, we can now add advanced data analysis, but you don't have to add that INTO the software; you can do it in the cloud and make a screen to show the analysis results. That doesn't change the software, and you could have implemented the 'showing of the results' in the 80s. Sure, we add AI, but that can mostly be added on instead of rewriting from scratch.


Pretty much everything in the enterprise. It simply should not exist. All that CRUD stuff (that should have been automated long ago, it does not deserve all the boilerplate), everything Java, everything that is going on now in the Web. It is all amazingly overengineered. Doing things in an idiomatic Lisp way would have resulted in 1/1000th the volume of code doing 10x more. And, of course, in 95% of the developers being redundant.


It goes both ways. Lisp has certain attributes that also make modern programmers go "Yeah, back then it sounded like a good idea".

Being dynamically typed comes to mind.


Lisp is both dynamically and statically typed:

    (defun foo (x y)
      (declare (type (integer 1 12) x)
               (type character y))
      (make-string x :initial-element y))
And the compiler will provide a compile-time error if someone tries to call (foo 1 2), but will accept (foo 1 #\a). Even cooler, it will provide a compile-time error if the developer tries to call foo with a first argument which is not provably an integer from 1 to 12.

In fact, you might even say that it's statically typed since values which have undeclared types are really of type T, but I don't think I'd go that far.


Sure, Lisp has many derivatives, some of which are statically typed.

The most popular Lisp today, Clojure, is dynamically typed, though.


Clojure is a kind of Javified Lisp; if you're looking for a canonical example, the GP provided an example of Common Lisp code, which gives you optional type declarations that are compile-time checked and are used by compilers to generate more optimized code.


Common Lisp is not a Lisp derivative: it is Lisp. Someday, of course, there may be a successor language, but Clojure is not it.

Clojure is a Lisp-like language with, I'm told, many good and/or interesting ideas. I don't know if it's actually the most popular Lisp-like language (I'd have thought Scheme would be), but it probably is more popular than Lisp. That's sad, but that's life.


I like your term "RDD". It sounds like the type of development strategy where you write your own implementations of hash trees and databases whenever you need them.


I didn't create the term RDD - that credit goes to

http://radar.oreilly.com/2014/10/resume-driven-development.h...

I do RDD since I want to watch the world burn and reap as much benefit from it before I drop dead.


There's one thing severely lacking in lisp: compile time type checking!

IMO Haskell has the best typing system I've seen in a language so far.

TypeScript is probably a close second (it kinda has to be as flexible and powerful as possible, to accommodate as much existing javascript code as possible).


Common Lisp supports compile time type checking: https://news.ycombinator.com/item?id=11529039

Have you seen Shen? http://shenlanguage.org/


> There's one thing severely lacking in lisp: compile time type checking!

Lisp does have compile-time type checking; see my comment: https://news.ycombinator.com/item?id=11699581


But (Common) Lisp does have compile-time type checking, and uses it to great success for both ensuring programming safety and generating optimized code comparable in efficiency to C!

That said, I agree that CL's type system is nowhere near as good as Haskell's.



It doesn't matter what you say about Lisp, there will always be people on HN who vigorously defend it to their last dying breath, it seems. The idea of a language not being (one(definition(after(another)))) seems foreign to these people, as if making everything a function call solves all problems (or maybe that is what the HN crowd really believes?).

If only programming problems could be solved by lots of function calls!! The truth is the difficulty lies in solving the problem, not in how many functions you define. If someone thinks you re-invented Lisp at some point, then either they do not understand the problem or they fail to understand the solution... or, of course, they do not care what you have written and instead think you have not used enough parentheses.


There are several things you could take away from Lisp. I don't think you caught the good parts yet though.

My idiot instructor for comparative programming languages set me back years in understanding the cool parts of Lisp. He focused on "List Processing", which seemed pointless to me because lots of languages have lists... (I work with him now, so I'm allowed to call him an idiot)

You seem to think it's about the function calls. Maybe you had a bad instructor too, and he focused on "functional programming", which is all the rage for the last decade. I dunno, maybe you came to that conclusion on your own.

The thing I now think is wonderful about Lisp (Scheme for me), is that you can write your program however you want. If you want to use switch statements, and your language doesn't have one (Python), just make your own:

    (switch foo
        (case bar -> (do whatever you want))
        (case hmm -> (do something else)))
Some languages have backtracking:

    (backtrack
        (keep trying)
        (different things)
        (until it works))
You just can't add those kinds of features to most languages because most languages are not programmable...


(Common) Lisp has functions, arguably the best object system in existence (ask the OMG group if you don't believe me), facilities for functional programming, and a proper compile-time programming facility (aka macros) which lets you seamlessly add any other feature as if it was already there.

You want a switch instruction? A quick macro, done. You want Prolog-level pattern matching? A few macros, done (or use the optima library). Want Perl-like convenient string mangling with regexps? A few reader macros + CL-PPCRE (arguably the fastest implementation of Perl-compatible regexes in existence) and again, you're done.

I think you're seriously projecting the limits of more constrained languages onto Lisp.
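As a sketch of the "quick macro" point (Common Lisp already ships with CASE; this hypothetical `switch` is purely illustrative):

```lisp
;; Expand (switch expr (key body...) ... (otherwise body...))
;; into a plain COND, evaluating expr exactly once via a gensym.
(defmacro switch (expr &body clauses)
  (let ((val (gensym "VAL")))
    `(let ((,val ,expr))
       (cond ,@(loop for (key . body) in clauses
                     collect (if (eq key 'otherwise)
                                 `(t ,@body)
                                 `((eql ,val ,key) ,@body)))))))

(switch (+ 1 2)
  (3 "three")
  (otherwise "something else"))
;; => "three"
```

A dozen lines, and the new control construct is indistinguishable from a built-in at every call site.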


> as if making everything a function call solves all problems

There are also special forms and macros.

> not how many functions you define

I've never seen anyone argue that this is what makes lisp good.


I know Haskell pretty well, having spent a cozy two or three years with it as my main squeeze for hobbyist programming. Now I'm doing more Prolog. I like stuff that makes me think differently. Anyhow...is there something I'm missing from Lisp? I've never Lisp'd, but people talk like it turns you into Neo in the matrix.


Metaprogramming in Lisp is easier than in other languages I've used (though I tend to use Scheme, not CL). Grokking apply/eval and the code-is-data nature of the lambda calculus is the key enlightenment. I stopped thinking of metaprogramming as some weird esoteric edge-case and started using it to solve problems. I mean I write tools to help me write code in all languages, to some extent, but in Scheme it is ubiquitous.

I went to Haskell from Lisps, and found quite a lot of similarity there, so if you're fluent Haskell, you might not find anything particularly revelatory. Though Haskell's more strict nature makes it much harder, in my experience, so I do it less often.

Eval/apply is a different model of code than imperative languages, which can be 'mindblowing' if you've only thought of languages as higher level assembly.

I do love Scheme, but I struggle to justify using it in practice. Which is telling I think.


I think Lisp is kind of magical if you buy into the iterative nature of computing.

Sure, it's functional in the sense that functions are cheap and easy to declare/use as values, but if you look at how things like macros end up working, it ends up being about having control of your runtime.

If you're into Haskell/Prolog, you probably are much more on the mathematical side of things... honestly I can't think of much you can express easily in Lisp (CL for example) that you cannot transform into an almost equivalent solution in Haskell.

It's good to try out a Lisp (and get up to macros or something), but my personal feeling is that Lisp is probably more of an advancement for people in the dynamic language universe (JS/Ruby) than it is for people in the ML-y languages.


The "Neo in the matrix" thing, I think, is the "lisp enlightenment experience" under another name.

Trying not to be some smug lisp weenie about it, I'm going to try to put my own experience into normal english. It happened for me somewhere around the 6-12 month mark of daily lisp study/coding.

You usually start out learning lisp like other languages: you're just trying to learn the syntax of various commands and the quirks of how everything fits together, on top of the concepts (if you've learned another language) that you probably already know. A big part of this is laying the foundation of a deep understanding of linked lists, cons cells, cars and cdrs. After that, you start learning about the indentation style available to you as well. And right now, it's just another language.

Then one day, you wake up, you look at code, and some switch you weren't even aware of has flicked in your brain, and it all looks different. Mature lispers read and structure code subconsciously, using both symbols and the indentation style to convey real information about the structure of a program, which is why you can really mess with their heads if you mess yours up too much. But there's a difference between being told you can do it this way and the practice that actually makes you do it, because it's not a conscious thing.

Anyway, the reason lispers "think they're Neo in the matrix" is that the mental effect this has on you is not unlike that scene towards the end of the matrix where Neo looks down the hallway and his perception of the thing he's been interacting with this entire time just fundamentally changes. It's no longer writing code, that is to say, strings of text. It's interacting with and perceiving the very structure of the programs themselves.

And it is a real psychological phenomenon. It makes you really happy, and it's why lispers talk about it. It makes you want to go and shake your neighbour and be all like "Oh my god, I get it, do you see this!"... and of course they don't. They see scribbles on the screen.

The funny thing is, they always say they're Neo in the matrix, which I understand is cooler, partly because that final scene captures what the phenomenon feels like, but it's arguably summed up in a more lackadaisical way by the character Cypher in one of his quotes from the same movie:

"You get used to it, though. Your brain does the translating. I don't even see the code. All I see is blonde, brunette, redhead. Hey uh, you want a drink?"


But what are the practical implications? What can Lisps do that nothing else can? Macros seem to be the biggest thing mentioned; pg says that was what let him write features so much faster than his competitors [1]. He also says he can't imagine not having different types of items in a list.

And, like Cypher, if you reach some higher enlightenment and then can't use it (practical constraints), it just makes you want to go back. E.g. a friend who recently picked up F# now laments so much about having to work in C#; it's now painful, whereas a couple of months ago C# was pretty nice.

1: Though I'd like to see which features really made much of a difference.


One benefit is absorbing other paradigms: you can do language design too, even if you're "just" a user. Take Clojure for recent examples: core.logic, core.async (which gives you Go-like channels). These are distributed as libraries.

For many non-Lisp languages, something as small as a new "for" loop often requires you to wait for the language implementors. In Lisp, you want pattern matching? Or some Erlang-like semantics? [1] Maybe it's possible! In an accessible way, because you can customize your language's user interface. You are not just a consumer.

[1] http://blog.paralleluniverse.co/2013/05/02/quasar-pulsar/


Warning: hairy high falutin' answer...

The "what can lisp do that other languages can't" is, I think, a bit of a red herring...or rather, depends on what problems you face and in what context we're discussing the issue.

The short answer is of course: nothing. And if most of the problems you face are defined by working with a particular technology (database tables/javascript), in a particular social context (say a corporate workplace), or a problem that isn't LISP-shaped, then the answer is probably "REALLY NOTHING... and possibly less".

But if your problem doesn't currently have a technology acceptably applied to it, or you think the current offerings fail in some way, or you're feeling the strain of trying to smoosh things together because in your head the expression is crystal clear but your language can't express it easily...

The longer answer:

Apart from a genuinely "linguistic" analysis of the language itself, I think "a good computer language" is one that minimises, in the abstract, some distance function between the programmer's mental models, the machine's mechanical implementation, the problem's domain, and the aesthetic simplification of the source code.

Now, no language can minimise that equation absolutely for all possible problems, not least because several of those things are often fixed, absolutely or practically, and perhaps you're working in a domain that you feel has already gotten it pretty damn good/close. Some languages trade off maximising certain aspects to the detriment of others, but it doesn't matter because the trade fits their "biological niche", as it were.

So what is LISP's (I'm talking Common Lisp here, from my own personal experience) biological niche, or what does it offer that others don't?

I think my answer is that it's a fantastic flexible generalist that allows you to minimise the aesthetics/complexity of your code across multiple domains.

- It lets you get pretty low level, like C, but gives you a cheap move to a high-level language like Python to trade off convenience against efficiency: you get high level, like Python/Perl, but can drop down to far more performant code while minimising the cognitive load by staying in the same language.

- SBCL, at least, mixes some of the benefits of strongly typed and highly dynamic languages (static compiler checks and fully dynamic code in the same place).

- Macros allow computation and expression on the language itself more easily than in other languages, letting you move your language closer to your problem scope and to the programmer's mind and aesthetic expression of source code.

- It deals quite nicely with functions, objects and imperative programming all in the one place, again moving the language closer to the programmer's mind/problem scope/aesthetic expression of source code.

- And there's the whole dynamic interaction with your program, rather than an explicit code/compile/run cycle.

So in short, use it where you need to care about those things seriously, which in my land is really:

- Cross-domain problems where the clash between the two tools/paradigms is too great

- Problems where you disagree with some of the trade-offs made by the tools/languages that have come to dominate a specific area

- Problems where there isn't a dominant/well-fitting tool currently (of course, for those problems, community/social/library issues aren't really an issue, because there isn't one/aren't any...)

- Problems where you honestly don't know what the solution is going to look like yet and no one, to your knowledge, has gotten there before you

I generally say that LISP is the second-best solution to a lot of solved problems, but perhaps the best solution to multiply-solved problems and unsolved problems, for now.

And that's what I think LISP gets you...the shortest distance between disparate optimisations.

Now...I personally accept that 95% of programming these days ISN'T that...


I've only played with Lisps, so I can't quite say I've had the 'enlightenment' experience, but realizing that "I know how to manipulate lists...and my program is just another list (of lists of lists of lists)" was a bit eye-opening.

The regularity of the language makes taking advantage of this far more reasonable (both a logical thing to do, and something that can be reasoned about) than in any other language I've seen. The closest equivalent elsewhere would be the eye-gougingly bad decision to read in your code as a string, perform string operations to munge it, and then 'eval' it. Or, arguably, reflection, since this is obviously metaprogramming -- but there it still ends up being something 'special', something very different from just writing a program, and as such is still frowned upon. Lisp is homoiconic, and realizing the ramifications of that made metaprogramming go from "avoid if at all possible" to just another tool, one that sometimes provides the most elegant solution and is reasonable to choose in those situations.
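To make the "my program is just another list" idea concrete outside Lisp, here is a hypothetical Python sketch where expressions are nested lists and a "macro" is nothing more than an ordinary function from list to list (the constant_fold name and the list representation are invented for illustration):

```python
# Expressions are plain nested lists: ["+", 1, ["*", 2, 3]]
# A "macro" is then just an ordinary function from list to list.

def constant_fold(expr):
    """Recursively replace ["+", ...] / ["*", ...] with their value
    when all arguments are literal numbers; leave the rest alone."""
    if not isinstance(expr, list):
        return expr
    op, *args = expr
    args = [constant_fold(a) for a in args]
    if op == "+" and all(isinstance(a, (int, float)) for a in args):
        return sum(args)
    if op == "*" and all(isinstance(a, (int, float)) for a in args):
        result = 1
        for a in args:
            result *= a
        return result
    return [op, *args]

print(constant_fold(["+", 1, ["*", 2, 3]]))    # → 7
print(constant_fold(["+", "x", ["*", 2, 3]]))  # → ['+', 'x', 6]
```

The transformation is just list manipulation; in a homoiconic language the same trick applies to every program you write, not only to a toy representation you set up yourself.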


And in a lisp you'd be able to mix Haskell and Prolog together, with dozens of other languages as well, seamlessly and efficiently (unlike the non-destructive Prologs people tend to embed in Haskell).


My feeling when using Lisp is that I am writing an AST. And since most languages rely on generating ASTs, it kind of makes sense to reinvent it.


I implore everyone here to read Naggum: http://xach.com/naggum/articles/

Also read Stanislav Datskovskiy: http://www.loper-os.org/

Also consider that the x86 hardware platform is terminally busted. For God's sake, why isn't GC in hardware already?


One of the comments nails it on the head (paraphrasing a little bit): Lisp is the executable implementation of the lambda calculus. So of course you will re-invent it. As any domain takes on a computational flavor, the closer it gets to having primitives that can be used to build a Turing machine, the closer it gets to being a Lisp. That, and having your computation representable as a data structure comes in handy pretty frequently, so you end up emulating Lisp even if you didn't intend to, because the data/computation duality is ever-present. None of this is an endorsement of Lisp per se; rather, there is a very deep and fundamental reason that it keeps being re-invented. Lisp is the manifestation of the primordial soup of computation.


Reminds me of Greenspun's Tenth Rule Of Programming:

Any sufficiently complicated C or Fortran program contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.


The reverse is also true: any sufficiently complicated Lisp program will contain an ad-hoc, bug-ridden and slow implementation of half of the mainstream programming languages and their standard libraries.


I think lisp adds mental overhead for figuring out the precedence of operations - while other languages have a default that people understand.

Perhaps if a Lisp editor were 3D instead of just parenthetically topographic (not just color-coded), it might help people understand what is happening.


It's perhaps a little confusing when first learning Lisp, but once you've got the hang of things, working out the precedence of operations is really easy.

In most programming languages with infix operators, the precedence often depends on the operators themselves. For example, with:

  5 + 3 * 10
...You'd need to know that * has higher precedence than + to figure out that the value of this expression is 35, not 80.

But with the following Lisp expression:

  (+ 5 (* 3 10))
...The syntax itself tells you the precedence, so there's no ambiguity. Even if you didn't know anything about the functions involved:

  (bar 5 (foo 3 10))
...You don't need to worry about whether foo has precedence over bar or the other way round -- there's only one way to evaluate the expression.


> I think lisp adds mental overhead for figuring out the precedence of operations - while other languages have a default that people understand.

Wait what? Lisp removes the mental overhead by not needing the concept of "operation precedence" - prefix notation is unambiguous. The discomfort people have when looking at prefix math is because everyone here spent at least a decade being taught math in infix notation.


Everything was said by Richard Gabriel long ago - Lisp is too good, and "packers" don't need or even appreciate refinement and excellence, just as most folks don't appreciate the beauty of DNA and its related machinery.

For them PHP or Java are tools for getting shit done. There are huge piles they have produced, like Hadoop or whatever it is.

The last great Lisp was ZetaLisp from Symbolics (just read the docs!). Common Lisp is already bloatware suffering from kitchen-sink syndrome; nevertheless, it is way better than anything else (a multi-paradigm, mostly functional, strongly typed (type-safe enough) meta-language that compiles to native code, with pattern matching, OO and other fancies as DSLs).

But who cares if one is satisfied with writing

   RepetitiveStupidityFactory myStupidityFactory = new RepetitiveStupidityFactory();
for living.


> RepetitiveStupidityFactory myStupidityFactory = new RepetitiveStupidityFactory();

Yeah because

    (let ((v (make-instance 'RepetitiveStupidityFactory))) ...)
is so obviously superior, right?


Obviously. Less repetitive, less verbose. And, most importantly, one could hide all the irrelevant details inside a macro, which is equivalent to extending the language with a new control structure.


The Lisp version is less verbose because it's dynamically typed (in other words, it's less verbose because it contains less information that would allow this code to be proven correct).

By the way, these days, we write things like

    val l = ArrayList<String>()
If verbosity is your number-one criterion, just skip all error checking and your code will become super concise.


I'd suggest the last great Lisp wasn't ZetaLisp, but (prefix-syntax) Dylan, which was designed by many of the same people. It's essentially a Scheme plus CLOS "all the way down," with a cleaned up subset of the core of Common Lisp. Find the book from 1991-1992, it's worth reading.


In that case it is PLOT3 by David Moon (Google it).


Nope. Moon apparently bought into the "people don't like Lisp because of the parentheses, so we should make a Lisp that looks more like ALGOL and people will like it" line like the rest of the Dylan core team.

IMO if it doesn't have homoiconicity, it's not a Lisp, and macros will be much more difficult.


> For them PHP or Java are tools for getting shit done.

... as opposed to Lisp?


Yes, Lisp is a distinct, different culture (think of Aryans versus tribes in ancient India). The difference is as it is between craftsmen (who are artists) and assembly-line workers.

Look, read Gabriel or Graham or early Norvig (his Lisp style and anti-design-patterns essay), not me. There is something behind their insights.


> The difference is as it is between craftsmen (who are artists) and assembly-line workers.

It's a bit harsh to call Lisp programmers assembly-line workers. I think the language has value and pioneered some language concepts that we find across a lot of languages today, but Lisp is very dated by 21st century standards.


> It's a bit harsh to call Lisp programmers assembly-line workers.

I think GP meant it the other way around.


I know.


Yea, understanding macros and writing your own DSLs doesn't make you some sort of aryan ubermensch programming craftsman. Macros really aren't that hard to get, and mostly not necessary when you have more coherent constructs at your disposal.


Yes, macros are not hard to get - they're far simpler than anything else imaginable.

No, they are necessary. There is no better way of doing things than macros.


> There is no better way of doing things than macros.

We have invented plenty of better ways since Lisp. Much better ones.


Mind naming a single one?

I do not believe such a thing exists. Macros are the ultimate solution.


There is much that can be done with regular, bog-standard functions. Yes, macros can be useful in certain cases, but it's often more straightforward to use the Lisp language as it comes. As always, it's important to use the right tool for the job.


> There is much that can be done with regular, bog-standard functions.

You cannot implement simple, modular, debuggable, composable DSLs with functions alone. And you cannot tackle complexity without DSLs.

> Yes, macros can be useful in certain cases

In most cases. I cannot think of anything beyond "Hello, world" that does not deserve a DSL implementation.


> You cannot implement simple, modular, debuggable, composable DSLs with functions alone.

We've been doing this for at least ten years. Example: [1]

For someone who's convinced that DSL's are the silver bullet that everybody should be using, you don't seem to have kept up with the state of the art of the field much.

[1] https://github.com/Kotlin/anko


No, this has nothing to do with anything simple. It is awfully overengineered. It is definitely not a way to implement DSLs.


These are functions; it doesn't get any simpler than that, and it doesn't require learning anything new once you know the language.

Macros are over-engineered in comparison, since they force the developer to learn a whole new section of the language with its own rules and its own compilation lifecycle.

This is why hardly anyone uses macros these days and why most languages are embracing the Groovy/Kotlin approach to writing DSLs.
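The Groovy/Kotlin style being described reduces to plain nested function calls. A rough Python approximation (the tag helper and element names are invented for illustration, not from any library):

```python
def tag(name):
    """Return a builder function that renders an HTML-ish node
    from child strings and keyword attributes."""
    def build(*children, **attrs):
        attr_str = "".join(f' {k}="{v}"' for k, v in attrs.items())
        body = "".join(children)
        return f"<{name}{attr_str}>{body}</{name}>"
    return build

div, span, button = tag("div"), tag("span"), tag("button")

# The "DSL" is just nested function calls -- no macros, no new rules.
page = div(
    span("Hello"),
    button("Click me", id="ok"),
)
print(page)  # → <div><span>Hello</span><button id="ok">Click me</button></div>
```

The trade-off both sides of this thread are arguing about: functions keep the ordinary evaluation rules and tooling, while macros can additionally rewrite syntax and control evaluation order, at the cost of a second compile-time layer to learn.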


I moved from Apache Groovy to Clojure for writing tests for my own software about 5 years ago. A single limited-range macro can cut out a lot of the repetitive syntax found in repetitive test cases. A one-line Clojure macro can easily do what takes 50 lines of Java defining a Groovy annotation.

The small amount of time it takes to learn Clojure and get used to its syntax is well worth it. Too bad not many workplaces are willing to similarly switch from Groovy to Clojure for writing tests. They even stick with the older Groovy version 1.8. Once something works, they won't upgrade.


> These are functions, it doesn't get any simpler than that

Simplicity of the building blocks does not affect complexity of the resulting mess of code.

Look, Brainfuck has only 8 operators. It cannot get much simpler than that. Once you learn all 8 operators, no new constructs require any learning. Does it help at all? No. You're stuck at a pathetic abstraction level.

Same with your functions. They're limiting your abstraction level, thereby making it impossible to make things simple. And your DSL example shows it in quite a dramatic way - it is a horrible mess and nothing but a horrible mess. It can be somewhat handy for the users, but the implementation is plain awful.


Show us what you think is a nice DSL then, because you keep waving your hands about DSL but you're not showing a single line of code to prove your point.


I did it many times elsewhere in this thread already.

For example, this one: http://www.moserware.com/2008/04/towards-moores-law-software...

Or Nanopass itself, not just as a tool for building DSLs, but as an example of a nice and clean DSL (of course, if it could bootstrap itself it would have been even better):

https://github.com/akeep/nanopass-framework

Or everything you can find in my github repositories (username: combinatorylogic).


Free monads would like to have a word with you about simple, modular, debuggable, composable DSLs. On the other hand macros are anything but composable or modular. Debuggable for some far off definition of debuggable, maybe.


Monadic interpreters? Composable? Are you kidding? That is a joke, not composability. You cannot take the type system of one DSL, modules from another, and only the expression syntax from a third, and then mix them all together into a new DSL.

And anyway, interpreters. Who in their right mind would ever consider an interpreter?

Also, you obviously do not realise that macro-based DSLs are by far more debuggable than any monadic interpreter would ever be.


Before we continue, post a link to a free monad DSL you have written; I do not intend to waste my time in a discussion with a zealot.

(I can send you examples of macros I've had to write, if you desire.)


@sklogic (can't reply directly): I know how to use macros, I've used them and I have come to the conclusion that there are much better tools.

Seeing you know what a logical fallacy is, here is one more for you: "if you don't like macros you don't know how to use them". Can you guess the name?

My question was not ad hominem. I wanted to see whether you have actually used the thing you are arguing against or instead just blindly praising something as the ultimate solution and thereby determine whether I should continue this discussion with you.


Why would you be interested in my pitiful attempts if even the best examples of the monadic DSLs people keep pointing to are all thoroughly horrible, convoluted and unmaintainable? Likewise, why should I look at your macros if you already admitted that you do not know how to use them?

Can you comment constructively about the composability issue without resorting to ad hominem?

Do you even understand the idea of the macro-based DSL design methodology?

I guess you can have something as horrible as the CL LOOP macro in mind, instead of any properly designed DSLs.


Drop me a mail, p<dot>kamenarsky, gmail.com.


(cannot answer there too, weird...)

No, I concluded that you have no idea how to implement macro-based DSLs from your remarks about composability and debugging. You cannot be so out of touch and still know something about macros.

And, no, you did not point to a viable alternative. Monadic interpreters are complex, convoluted and very inefficient. This is an objective fact, not a matter of taste.

Do not agree? Show me a definitive example of such a DSL and I will demonstrate how much simpler a macro-based version is.


P.S.: and the free monad part is not what I'm calling convoluted and unmaintainable. It's ok to assemble an AST this way (yet, a custom parser is better). The problem here is in the interpreter part.

Interpreters are inherently bad and there is almost nothing you can do about it. The only sane way of keeping an interpreter's complexity under control is to use a single, common, trivial VM interpreter, write it once, never touch it again, and lower all your monadic DSL ASTs down to it.

Which makes this solution quite similar to the macro-based one (it would actually reek of Greenspun's Tenth Rule), but still much less efficient.


I gave you my mail in the other reply; HN is really annoying for longer discussions.


> Mind naming a single one?

We no longer need macros to create intuitive DSLs.


This is not an answer to my question. What are you proposing instead of macros?


And the rest of us are doomed to keep re-implementing it.


The "Lisp" programming language covers a large territory, and often one needs to embed some sort of interpreter or language in a complex system. (Even our text editors contain sophisticated interpreters, because we desire to customize them to such an extent.) Furthermore, a Lisp model is one of the simplest ways to build a powerful interpreter.

Consider the problem of being locked in a room that has a computer and all the manuals you need to program it...in machine language. How would you build a system of any significance? Could you escape the evil genie that won't release you until you write a program that solves Sudoku puzzles?

I know that I would escape the genie. I'd build a simple macro system and key it into the machine in binary (I've had to patch a machine's boot loader more than once from a panel of switches). Then I'd build a simple assembler from the macro system. After that, I'd build a simple lisp-like system because that's about the simplest language one can implement that does anything interesting, including writing a sudoku solver.

So yes, I'd be reinventing Lisp. However, I don't want to spend my life programming in some shitty, pathetic system I wrote to escape from an evil genie. Why does it matter that Lisp is so easy to implement that undergrads often build Lisp systems as programming assignments? That doesn't make it some language of the future invented in the past; it's just a simple-to-implement language invented in the past.
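The "simplest language that does anything interesting" claim is easy to check: a recognizable Lisp core fits in a few dozen lines. A Python sketch in the spirit of Peter Norvig's well-known lis.py, trimmed down to numbers, if, and lambda (all names here are my own, for illustration):

```python
import operator as op

def tokenize(src):
    """Split source into tokens; parentheses are their own tokens."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Read one expression from the token list into nested Python lists."""
    tok = tokens.pop(0)
    if tok == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return lst
    try:
        return int(tok)
    except ValueError:
        return tok  # a symbol

GLOBAL = {"+": op.add, "-": op.sub, "*": op.mul, "<": op.lt}

def evaluate(x, env=GLOBAL):
    if isinstance(x, str):             # symbol lookup
        return env[x]
    if not isinstance(x, list):        # literal number
        return x
    if x[0] == "if":                   # (if test then else)
        _, test, then, alt = x
        return evaluate(then if evaluate(test, env) else alt, env)
    if x[0] == "lambda":               # (lambda (params) body)
        _, params, body = x
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    f, *args = [evaluate(e, env) for e in x]  # application
    return f(*args)

program = "((lambda (n) (if (< n 2) 1 (* n 2))) 5)"
print(evaluate(parse(tokenize(program))))  # → 10
```

That's a tokenizer, a parser, and an evaluator with closures in under forty lines, which is exactly the author's point: reinventing Lisp is easy; that alone doesn't make it the language you'd want to live in.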

A thread I've seen here before is "well Algol 68 was a great language and we are not using it because ...<insert some conspiratorial reason>..." Well, Algol 68 is a great language from the past, but it's great not because it's a great language to program in--it's great because it manifested some of the earliest thinking about what features ought to drive the design of programming languages.

Today, people aren't locked up and given terrible programming assignments with no real programming language to use (well...maybe I should say this doesn't happen all that often). So why not just embed Lua? Places where I would have used Lisp 40 years ago are now the places I'd use Python or Ruby.

But what about the meta-object protocol? What about programs that need to write other programs? What about DSLs? Well, that's another subject. But the short answer is: so what? These step up the power of the language one programs in, but don't essentially change the limits of what a program in your language is capable of doing. Yes, writing complex data structures in old FORTRAN is a big pain, but we aren't in that arena anymore. Writing complex data structures in C++ or Java or Go or Rust isn't a big pain anymore.


Okay, well, "What about homoiconicity?" you say. Well, I think it's a totally cool-sounding word that all hipster programmers need to have ready to pull out, but it isn't a game changer for Lisp. Or I should say: yes, it has kept Lisp in the game, because without it Lisp would be like FORTRAN IV, a language that didn't have enough useful stuff (fixed-size arrays of numbers were pretty much all there was to program with in early FORTRAN), but it doesn't make Lisp "better" than more modern languages.

Homoiconicity means that the abstract syntax of the language can be represented easily in the programming language itself. Just the fact that Python is written in text and can easily represent and manipulate text doesn't count. Because of homoiconicity, Lisp can do amazing things to itself; see [1] and [2]. Python programs aren't commonly producing another Python program to feed to the Python interpreter.

However, Python makes the AST available as a Python construct. Go does something similar. So yes, Python and Go could support the metaobject protocol if they wanted to. The important observation is that Python and Go programmers don't want to and don't need to. These languages already provide powerful enough abstractions to support 99% of a programmer's needs: objects, classes, interfaces, lambdas and so forth.
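Python's version of this is the standard ast module: parse source into a tree, rewrite it with a NodeTransformer, then compile and exec the result. It's considerably clumsier than quoting a list in Lisp, but the capability exists (the toy rewrite below is my own example):

```python
import ast

class SwapAddToMul(ast.NodeTransformer):
    """Rewrite every a + b in the tree into a * b."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # rewrite children first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

tree = ast.parse("result = 3 + 4")
tree = ast.fix_missing_locations(SwapAddToMul().visit(tree))

namespace = {}
exec(compile(tree, "<rewritten>", "exec"), namespace)
print(namespace["result"])  # → 12
```

Note how much ceremony is involved (a visitor class, location fix-up, an explicit compile step) compared to transforming a quoted s-expression, which is the commenter's point that Python programmers *can* do this but rarely *want* to.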

"Well, yeah, but without homoiconicity how do you construct a good DSL?" I presume that you mean: how can you implement something like CLOS? Really? I don't want to implement CLOS, and I also don't want to write the Boost library for C++. Both are complex exercises in programming to extend the capabilities of the underlying language. Why wallow in macros (again, see [2]) or templates (see the Boost source)? I understand that C++ templates solve an essential problem the language has: original compatibility with C and a desire to be able to operate as close to the metal as C. Likewise, Lisp originally lacked even a decent set of control constructs. High-powered macros (and call/cc for Scheme) allowed these languages to build their own extensions just as templates have for C++.

Other languages just start with a good set of abstractions: control abstractions (like Java's enhanced for loop), data abstractions like interfaces, concurrency abstractions like Go's channels.

"Yeah, well like man, how can you anticipate exactly what the program needs?". Well, I've found higher level programming with functional languages and object oriented languages to provide almost all my needs.

"Well, DSLs make programming so obvious. You end up programming so much closer to the problem domain?" Yes, that's true. I find DSLs and even macros helpful in configuration files, for example in a 3000-line Emacs configuration file. I use John Wiegley's use-package macro extensively to simplify my Emacs init.el. My problems with macros as a fundamental organizational abstraction for programming-in-the-large are worth a separate post (they have to do with hidden semantics and the difficulty of even informally reasoning about the correctness of programs written using macros).

[1] https://en.wikipedia.org/wiki/The_Art_of_the_Metaobject_Prot...

[2] http://letoverlambda.com


Cut me some slack!

