Why Lisp is now an acceptable scripting language (tunes.org)
149 points by wtbob on March 26, 2016 | 125 comments



Lisp is an acceptable language for a variety of things, not just "scripting". I work on many projects written in Guile Scheme, ranging from game engines to static site generators to dynamic web applications to package managers, and I handle one-off scripting tasks with it, too. There's a lot of mystique around Lisp and Scheme, and people tend to write it off as a relic of academia, but I use it for practical needs every day. Lisps enable a type of development via live coding that I have yet to see a non-Lisp match. I've used the REPL for everything from everyday calculator tasks to live coding video games, and I've used it for customizing an X11 window manager in the past, too.

But rather than listen to me confess my love for Lisp, I recommend that people just pick a Lisp that looks fun and take it for a spin. You might find a lot to like.


>> Lisps enable a type of development via live coding that I have yet to see a non-Lisp match.

How does it compare to using python for live coding?


If you want to compare Lisp live coding and you know Python, then let me put it like this: how much faster can you get things done using something like IPython versus a compiled static language like C(++/#) or Java? I don't want to start a flamewar against the other languages, of course. But if you favor an IPython REPL over the compile/build/test cycle, then Lisp gives you just about the same order of magnitude more power again.

I've done professional Python coding for 2.5 years, Ruby and JS for 7. I always believed that the REPL combined with those languages being dynamically typed can yield big advantages.

With a Lisp you can evaluate your code in real time in your editor. Your app will also reload your code live without losing its state. I've done that with backends, web frontends and mobile apps. It's pretty awesome, not just compared to everything else^^

Take it for a spin and enjoy the ride(;


To me, the live loading of code and the REPL are a two-edged sword, and I'm not sure which side is sharper.

It sounds great to say "live coding" and whatnot, but it turns out that you very quickly get the repl into an unknown state where you aren't sure what version of what has been eval'd, and what temporary cruft is lying around changing how your program behaves.

A number of times I would find out that a bug I thought I squashed was actually still there, just hidden by some temporary "what-if" I had eval'd. The only way to know is to clean the workspace and reload everything.

This was recognized as problematic by the Clojure community, and "Component" methodologies were developed, along with namespace refreshing. It would be far less usable if not for this.

However, in a live production system, a blanket reload of anything kind of defeats the purpose of keeping it running. I don't think I'd ever try loading code on a live production system unless all the alternatives were worse than possibly messing up the state of the system.

It's nice to have the repl for investigating what's going on in production, for inspection. But I avoid any code reloading in a live system.


What are the bread-and-butter (aka not sexy and cutting edge, but mature and stable) lisps used for web development?


Well, I run my own companies, so I like the cutting edge - and we are using Clojure.

A good friend of mine works for one of those big consultancies, however, and they are currently rewriting the backend of the world's second-biggest online retailer, and they are using Clojure, too^^

So either I'm getting too old for the sexy Startup business or Clojure is at the same time stable and fancy(;


SBCL (free), Allegro ($), probably some folks are still using clisp (which is a pretty cool Lisp, but almost defunct).


We built kindista.org with SBCL/Hunchentoot.


> With a Lisp you can evaluate your code in real time in your editor.

Technically there's nothing about Lisp that makes it uniquely suited for this; it's just that the people developing tooling for most other languages have a tragic lack of imagination. Smalltalk, Forth, Factor, and to a degree Lua all support the same style idiomatically, while Racket, although a Lisp, discourages it.


In Lisps, data is code and code is data. So imagine being able to edit your program while it's running, since it's just data.

How that compares to Python, I'm not sure. But I don't think you can modify Python code as easily as you can modify Lisp programs.


>you can edit your program while it's running, since it's just data.

When I was playing around with Franz Lisp some years ago, I saw this feature, and then soon after, I saw the Edit-and-Go feature in a new MS Visual Studio release. Think I read somewhere that it was ad{a|o}pted from Lisp.


Lisp invented live coding.

Something like IPython was how the REPL from Lisp Machines and Interlisp-D used to work.

Plus Lisp is a compiled language.


Well, if it were just the REPL, then Perl, Python, Ruby, etc. could be pointed to. It is more than that. Although I do have a deep love for Python via IPython and IPython notebooks.


> I've used it for customizing an X11 window manager in the past, too.

always glad to find another old Sawfish user. :)


Well, you can use LISP/Scheme as a procedural language with a little "code obfuscation" ...

  (define (return x) x)
  (define (! a b c) (not (b a c)))
  (define (-- a) (- a 1))
  (define (cout junk thing) (display thing))
  (define << 'nothing)
  
  ;ignore this   look at this
  ;...........   ...............................................
  (
                  define (__I_AM_HAPPY happiness) (
                    if(! happiness <= 0) (
  begin (             cout << "happy happy joy joy";                   
  ) (                 __I_AM_HAPPY (-- happiness);
  )                 ) 
  ;                 else (
  (                   return "yay! i'm done with 6.001 project";
                    )
                  )
  )
  
  ; got that? now we can do this:
  
  (__I_AM_HAPPY 10)


Wait, so now what determines whether a language is “procedural” is its surface syntax rather than its semantics?


Just take a little joke mate. It's fun.


What other kind of syntax is there? But yes, I've seen a lot of procedural code in every paradigm I can think of. Lots of cargo-cult programming results in it.


Sit down dear child and hear of languages long ago...

Back in the days of vacuum tubes and electromechanical switches, when punch tape ruled the world, there were two languages: the Algorithmic Language and the List Processor. The List Processor, or LISP to its friends, was built around s-expressions. The other, also known as ALGOL, was built around a strange and unholy hybrid of text and formulae that only an insane mathematician would love, for almost every statement was a special case.

ALGOL syntax went on to rule the world.

Seriously, kid. There are only two syntaxes, and procedural, structural, object-oriented, aspect, imperative, and all the rest are immaterial to the syntax, for they refer to how programs are organized, not what keys you press on the keyboard. Don't they teach programming languages anymore?


Have you heard of Prolog? (Just to mention a single language whose syntax doesn't fall in either category you mentioned. Not that there aren't others.)

Also, as someone who studies mathematics for fun, I disagree with your statement that a mathematician would love a system with lots of special cases. If anything, mathematicians are willing to pay a higher abstraction tax than most other people, just to unify special cases.


I've noticed that the kinds of things a mathematician might prize in an elegant proof are the same kinds of things a programmer might recognize as elegance in code as well.

I think the Curry-Howard Correspondence is actually an intuitive result rather than a surprising one, and that any experienced programmer who understands formal proofs as well as they understand computer programming would eventually arrive at the Correspondence on their own, or at least a suspicion of it.


thanks dad. by the way, you are wrong, but you probably knew that already

You've grossly oversimplified everything so nothing has useful meaning anymore.


Abstract syntax.


...trees? Fair point, but we don't program with abstract syntax trees, unless we are talking about genetic programming.


Or unless you're programming in Lisp. Lisp code is mostly its own AST.


You didn't specify in your question that you only wanted to know what other kinds of syntax there are that people program in.


Tell that to a compiler writer.


Compiler writers write compilers in programming languages, not in syntax trees. Lisp is arguably the only language where you write your code in "trees", but even then, it's still all lexical. A syntax tree is not syntax, it's a tree.


> it's still all lexical.

I have no idea what you mean. Are you talking about lexical scope? If anything, determining the lexical scope of a variable is much easier from a syntax tree than from the original program text.

> A syntax tree is not syntax, it's a tree.

Pretty much all the literature on programming languages identifies “syntax” with “abstract syntax (tree)”. It is typically understood that textual concrete syntax exists primarily as a convenience for the programmer, because it's easier to edit, but the very first thing a compiler does is recover the abstract syntax tree, and from then onwards proceed as if the concrete syntax had never existed (except perhaps for error reporting purposes).


Lexical means "as it appears 'written' on the 'page' of code". So lexical scope allows you to look and see what the binding is. Contrasted to dynamic scope where you must run the program to find the binding.

If you will, syntax is the vocabulary of the language. Also called "lexicon" and the syntactical tokens are sometimes called "lexemes". It's all about the text. Syntax is text. Or more technically, syntax is the format of the input.

Syntax trees are data structures that represent syntax, but they are not syntax. There are different types of syntax trees, too. The trees are built by the compiler, and never written by hand unless for exercise.

Syntax trees have already passed semantic analysis and contain context. They are much more than just syntax.


> If you will, syntax is the vocabulary of the language. Also called "lexicon" and the syntactical tokens are sometimes called "lexemes".

I'm pretty sure that most programming languages have their syntax defined in terms of two components: (0) a lexical specification [which lexemes are valid], (1) a grammar [how to arrange lexemes into well-formed programs]. And the latter tends to be richer and more complex than the former. So it isn't just, or even primarily about the individual lexemes.

> It's all about the text. Syntax is text. Or more technically, syntax is the format of the input.

Syntax is the (tree!) structure of utterings (declarations, statements, expressions, whatever). Textual representations of syntax just happen to be convenient to manipulate in a text editor.

> Syntax trees have already passed semantic analysis and contain context.

This is wrong. Syntax trees are the input for semantic analysis.


You must be fucking with me by now. Parsing IS context also, and it's the parsing of said syntax that _results_ in trees.

That's why we have syntax highlighting. Have you ever seen a tree in your syntax highlighting? I didn't think so. Do you have a tree when you have a syntax error? No, because it never gets that far. Syntax is independent and prior to your trees, the IR in a compiler doesn't HAVE to use trees, it can use whatever it wants. I see there is no point in continuing this, we are arguing over definitions. Not even sure what we are talking about anymore.


> You must be fucking with me by now.

I'm honestly not. Also, let's try to keep this civilized.

> and it's the parsing of said syntax that _results_ in trees.

When did I say otherwise? As you say, parsing concrete syntax indeed results in abstract syntax trees. Then those trees are used as input for semantic analysis (type checking, data flow analysis, etc.). Parsing is not semantic analysis.

> Have you ever seen a tree in your syntax highlighting?

The whole point to syntax highlighting, as well as good code formatting practices, is to make it easy to see the abstract syntax tree from the concrete program text.

> Do you have a tree when you have a syntax error?

If you're using a modern production compiler, yes. Compilers don't stop at the first syntax error. They add the error to a list, and then continue trying to build the syntax tree.

> Syntax is independent and prior to your trees,

I'd argue otherwise. When designing a programming language, abstract syntax comes first. You can define the semantics of a programming language purely in terms of its abstract syntax, and only later come up with a concrete syntax for it.

> the IR in a compiler doesn't HAVE to use trees

IR is what the compiler's frontend hands out to the backend. Syntax trees are an internal data structure used by the compiler's frontend.


"Syntax highlighting" is often a misnomer. More often than not, it's actually "lexis highlighting" (though there are some actual syntax highlighters, they're few and far between in my experience).

Take Vim or jEdit for instance (among the vast majority of others). When you want to make a color scheme, you do it by assigning colors to linguistic elements like "string literal", "character literal", "numeric literal", "boolean literal", "identifier", "comment", "keyword", etc. These elements are the sort of thing a lexical analyzer produces: a token type (in practice, lexers typically identify the associated lexeme as well). This is no coincidence, as most text editors with "syntax" highlighting use regular expressions to categorize and colorize the text. If you've studied any automata theory, programming language theory, and/or compiler construction, then you know that regular expressions are equal in power to finite automata and are therefore incapable of analyzing any syntax beyond the lexical for a great majority of programming languages. In fact, without a grammar of some sort beyond regular expressions, the "syntax" highlighting engine is provably incapable of actually understanding syntax.

I've heard that Scintilla's highlighter is fed parsing expression grammars to describe the syntaxes of various languages (using Lua LPeg). Now that's a syntax highlighter -- but (perhaps unfortunately) it's the exception, not the rule.

So that's why you don't (often) see trees in your "syntax" highlighting, but instead see a string of tokens.

catnaroek is correct: source code as stored in text files is merely a linear string of characters. That is not syntax, per se; it's the parser's job to determine the (ostensibly tree-like) syntactical structure according to a collection of syntactic rules. The first result of parsing is the concrete syntax tree (also called a parse tree) deduced from the source text. This can be, and usually is, transformed into an abstract syntax tree (AST)[1] prior to semantic analysis. This is the case for every language; it's only so apparent in the Lisps because they belong to the tiny set of languages that explicitly specify their AST[2] -- for most languages, each translator has its own, possibly unique notion of abstract syntax for the source language.

So, back to your original question: what kinds of syntax exist beside "surface" syntax? Abstract syntax is one kind. You can think of the notion of compilers' internal representation (IR) as another kind, too.

[1]: Even in parsers that produce the AST directly, the concrete syntax tree is still constructed, but it's implicit in the process's state rather than explicit in a data structure.

[2]: See, for example, "The Revised^n Report on the Algorithmic Language Scheme" (RnRS). It specifies not only the concrete syntax (called the external representation in the Report), but, to the extent to which it's reasonable to do so, it specifies the abstract syntax as well (the internal representation per the Report, or even just data in some cases). It's this specification of the abstract syntax that enables the syntactic macros for which the Lisps are so renowned -- syntactic macros are essentially an API for manipulating the AST at compile-time.
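
To make that last point concrete, here's a small Common Lisp sketch (my own example, not from the Report): READ turns the textual concrete syntax directly into the list structure that macros then operate on.

  ;; The reader hands back a plain list, i.e. the program's own "AST":
  (read-from-string "(if (> x 0) :positive :other)")
  ;; => (IF (> X 0) :POSITIVE :OTHER)

  ;; A macro receives that structure unevaluated and may rewrite it
  ;; before compilation continues (SWAP-BRANCHES is invented for this example):
  (defmacro swap-branches (form)
    (destructuring-bind (op test then else) form
      (list op test else then)))

  ;; (swap-branches (if (> 1 0) :positive :other)) => :OTHER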


Not that procedural programming is intrinsically a bad thing. Hierarchically decomposing tasks into simpler ones is a time-honored approach to dealing with complexity because it works. It's not “cargo-cult”.


That's not what procedural programming is; it has nothing to do with decomposing tasks. It has to do with a procedural style of telling the computer what to do.

Note that I didn't say procedural programming is cargo cult; I said cargo-cult programming often results in procedural-style programs.


Wikipedia disagrees with you:

> “Procedural programming is a programming paradigm, derived from structured programming, based upon the concept of the procedure call. Procedures, also known as routines, subroutines [emphasis mine], or functions (...)” [https://en.wikipedia.org/wiki/Procedural_programming]

Procedural programming is all about decomposing programs (descriptions of tasks) into subroutines (descriptions of subtasks).

Maybe you were thinking about imperative programming instead? Procedural programs are certainly imperative, but not all imperative programs are procedural.


yes you are right, that's what I meant. Sorry


François-René Rideau has done a lot to evolve ASDF, the Common Lisp build system, and a lot has been built since as well, like an implementation manager[0], an asset downloader[1] and an asset compiler[2].

[0]: https://github.com/roswell/roswell

[1]: https://github.com/eudoxia0/rock

[2]: https://github.com/eudoxia0/asdf-linguist




People who agree with the overall philosophy of this will be interested in avesh, an attempt to write a full shell in CL.

https://gitlab.com/ralt/avesh

I am also very interested in xonsh. I am increasingly of the opinion that something as useful as a package manager would be {bash,zsh}-compliant shells written in host languages that allow you to merge bashisms/Unixisms into your host language and vice versa.

I started looking into this, specifically with Haskell. I found chrisdone's Hell and, tangentially, turtle. Combining those would be insane!


There is/was also scsh, a Scheme shell for Unix.


Well, yes, I am aware of it. I had trouble getting scsh to work well. Outside of Guile, I am not even sure what Scheme implementations are viable. I have played with Chicken and Gambit very minimally.

IIRC, scsh uses Scheme 48 or a classic Scheme variant specific to it, and I do not know if there is anything "built" on top of that. Whereas at least the other, more esoteric Schemes (Chicken, Gambit, Gauche, etc.) are full environments people can code in, in theory.

Guile is fascinating to me because this concept, subsumed into Emacs (a la the Guile->Elisp super project), suggests a very interesting idea that I think moves beyond the IDE. Some people here will pooh-pooh that, but I find it very interesting, as I have moved away from tmux and screen and into Emacs as my combined editor AND terminal manager. I know, I know, where's the Unix philosophy.

But, I like this. Soon I will play with this stack on top of Guix, and well, that is more Lisp Machine that I should be allowed to have.


Have you considered Racket? I believe it's a sort of Scheme (experts, feel free to correct me) and a friend told me it has a small but good community. Also an IDE, many libraries, etc.


Can anybody explain the advantages of switching to a functional language when it is not a pure one? For example, how is it better than Javascript? Is the uniformity of the syntax (and perhaps implicit currying) the main benefit?


When the inputs are similar to the outputs (e.g. text, as with Linux CLI commands), functions can be very flexible and do things they were not originally designed to do (again, like Linux CLI commands). An assumption that's implicit in what I've said is data-flow programming. Data-flow programming is when programming is thought of as connecting black boxes with inputs and outputs together. This way of programming is inherently modular. This is to be contrasted with (i) iteration or recursion, which are inherently un-modular, and (ii) the von Neumann model of computation, where there is a separation between code and data, and programming is thought of as functions acting on the data (the problem with this model is that things need to be executed in a certain order, and sometimes the order isn't explicit in the code, which leads to bugs; contrast this with data-flow programming, where time is modelled explicitly).

In Lisp, the common interface between functions is the list rather than text. To encourage data-flow programming and discourage explicit recursion, map, filter, and fold are provided.
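
A small Common Lisp sketch of that data-flow style (SUM-OF-EVEN-SQUARES is a made-up example; MAPCAR, REMOVE-IF-NOT and REDUCE play the roles of map, filter and fold):

  (defun sum-of-even-squares (numbers)
    ;; square everything, keep the even results, fold them together with +
    (reduce #'+
            (remove-if-not #'evenp
                           (mapcar (lambda (n) (* n n)) numbers))
            :initial-value 0))

  ;; (sum-of-even-squares '(1 2 3 4)) => 20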


Lisp isn't a functional language. It's a family of languages, most of which support more than one programming paradigm.

Most languages that are identified as some kind of Lisp are strictly evaluated, do not support partial evaluation (explicit currying only), and support mutable variables and mutable aggregate objects.


Minor nit-pick: the terms here are backwards. Nearly all Lisps support explicit partial application, but not many support currying, which is implicit.
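
A quick Common Lisp illustration of the distinction (PARTIAL is a made-up helper, not a standard function): partial application is explicit, done with a closure, while currying would have to happen implicitly in the call syntax.

  (defun partial (fn &rest fixed-args)
    "Return a closure that applies FN to FIXED-ARGS plus any later arguments."
    (lambda (&rest more-args)
      (apply fn (append fixed-args more-args))))

  ;; Explicit partial application:
  ;; (funcall (partial #'+ 10) 5)  => 15
  ;; But no implicit currying:
  ;; (+ 10)                        => 10, not a function awaiting another argument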


Knowing Lisp since the days when it was the only FP language one had access to, I beg to differ.

I really disagree with the 21st-century fashion of declaring everything that isn't a Miranda-derived language to not be an FP language.


This may not be the answer you're looking for, but I think you might get the highest ratio of information-to-time by just trying it out. Second-hand opinions are hard. Practical Common Lisp is the best intro, and two chapters (maybe 40 min) in you'll know if it's worth going further.


Which dialect/s of Lisp have implicit currying as a language feature? Or do you have in mind other non-pure functional languages?


Lisp is not a functional programming language. Lisp is a multi-paradigm programming language. Functional programming, although awesome, is not the optimal choice for everything. Designers of Lisp understood this, so the language tries to get out of your way as much as possible. No programming language is perfect sadly, and Lisp too has its warts, but programming in Lisp compared to any other mainstream programming language, once you master it, is like walking up a hill after losing 200 pounds of weight.


This really depends upon which functional language you're referring to. Each one has a different set of features.

JavaScript is in some sense an impure functional language, however hastily designed in various places.

In the context of CL, the canonical example would be "macros". But CL has a bag of other features that makes it appealing.


I don't really think of Lisp as a functional language: it's really an imperative language with some functional features, and some OO features, and some aspect-oriented features and so forth.

The advantage that I see in switching to Lisp is that it's a nice, pleasant, powerful language in which to work. The standard is well-thought-out. The abstraction capabilities are wonderful.

Compared to JavaScript, there's no comparison:-) Just off the top of my head, Lisp has integers, ratios (e.g. 1/2), big integers (e.g. (expt 2 2048) → a number so large it breaks the HN page wrap) and complex numbers (e.g. (sqrt -3) → #C(0.0 1.7320508)).

Lisp's error-handling system of conditions and restarts is profoundly powerful: one part of the code can signal an error, another can determine how to fix it, and execution can resume without unwinding the stack (as opposed to exceptions, where by the time a higher level of code determines what to do, the function encountering the problem has already returned). Technically, one could implement this in JavaScript (it's all based on first-class functions), but since JavaScript doesn't have macros it'd be pretty hideous.
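
For readers who haven't seen restarts, here is a rough sketch (all names are invented, and VALID-RECORD-P / PARSE-RECORD-LINE stand in for real parsing code): the low-level function signals the condition and offers a restart; the high-level caller decides what to do, and parsing carries on without unwinding past the loop.

  (define-condition bad-record (error)
    ((line :initarg :line :reader bad-record-line)))

  (defun parse-record (line)
    (if (valid-record-p line)               ; assumed to exist
        (parse-record-line line)            ; assumed to exist
        (restart-case (error 'bad-record :line line)
          (skip-record () nil))))           ; offer a way to carry on

  (defun parse-file (lines)
    (handler-bind ((bad-record (lambda (c)
                                 (declare (ignore c))
                                 (invoke-restart 'skip-record))))
      (remove nil (mapcar #'parse-record lines))))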

Which raises the point of macros, which are enabled by the uniform syntax. They're a powerful tool for syntactic abstraction, just like functions are a powerful tool for procedural abstraction. A language without macros is almost as primitive as one without functions.
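
As a tiny example of that syntactic abstraction (WITH-TIMING is my own toy macro, not a standard one): a function couldn't do this, because the body must not be evaluated before the macro wraps it.

  (defmacro with-timing (&body body)
    (let ((start (gensym "START")))
      `(let ((,start (get-internal-real-time)))
         (multiple-value-prog1 (progn ,@body)
           (format t "~&elapsed: ~a ticks~%"
                   (- (get-internal-real-time) ,start))))))

  ;; (with-timing (sleep 1) (+ 1 2)) prints the elapsed ticks and returns 3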

There are also read macros, which enable one to customise or even replace the parser (called a reader in Lisp). Don't like Lisp? That's okay, you can write a reader which parses JavaScript and returns Lisp objects to be evaluated or compiled! Then there are compiler macros, which enable compile-time optimisation. And there are symbol macros, which can make what looks like a variable access actually perform any operation. Imagine being able to write 'foo = bar' in JavaScript, and having that assignment be logged. In Lisp, you can do that.
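
A classic read-macro sketch (adapted from the textbook example, so treat it as illustrative rather than anything from the article): teach the reader that [a b c] means (list a b c), Clojure-style.

  (set-macro-character #\[
    (lambda (stream char)
      (declare (ignore char))
      (cons 'list (read-delimited-list #\] stream t))))

  (set-macro-character #\] (get-macro-character #\)))

  ;; Now [1 2 (+ 1 2)] reads as (LIST 1 2 (+ 1 2)) and evaluates to (1 2 3)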

One big advantage is that all of these macro systems are … just Lisp. They're not special: you can do anything in a read macro that you can do in your code. You don't have to learn a special syntax or a limited subset of the language: it's all there if you need it.

Then there's the object system (which is enabled by macros). CLOS is the most powerful, most convenient object system with which I'm familiar. JavaScript methods belong to a prototype, which means that in foo.bar(baz) only foo determines what bar is; in Lisp, methods are multimethods, and may be specialised on any and all arguments: (bar foo baz) could do different things depending on the type of foo, the type of baz or the types of both.
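
A minimal multimethod sketch (the classes are invented for the example): DRAW picks a method based on the classes of both arguments, which single-dispatch OO can't express without double-dispatch tricks.

  (defclass circle () ())
  (defclass square () ())
  (defclass screen () ())
  (defclass printer () ())

  (defgeneric draw (shape device))

  (defmethod draw ((s circle) (d screen))  'circle-on-screen)
  (defmethod draw ((s circle) (d printer)) 'circle-on-printer)
  (defmethod draw ((s square) (d screen))  'square-on-screen)

  ;; (draw (make-instance 'circle) (make-instance 'printer)) => CIRCLE-ON-PRINTER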

Places are an amazing concept. In most languages there's no way to say 'foo(bar) = baz', but in Lisp one can do (setf (foo bar) baz) with the right setf functions, and it Just Works.
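
A sketch of what "the right setf functions" means (BOX and CELL are made up for the example): once (SETF CELL) is defined, the place just works.

  (defstruct box contents)

  (defun cell (b)
    (box-contents b))

  (defun (setf cell) (new-value b)
    (setf (box-contents b) new-value))

  ;; (defparameter *b* (make-box))
  ;; (setf (cell *b*) 42)
  ;; (cell *b*) => 42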

Lisp has optional type annotations, which can lead to extremely fast code.

Then there are little things like well-thought-out equality predicates, cleaner-looking code &c.

The advantage of JavaScript over Lisp is that because it is installed everywhere it has a much, much, much larger community, which means more libraries, wider support. But install base and community size are honestly the only real advantages I can think of.


>> The advantage of JavaScript over Lisp is .. more libraries, wider support.

Clojure solves that.


It doesn't, really. Not all that much is written in Clojure. Clojure has a nice FFI, and it turns out that lisps are pretty good at calling into java code, but other lisps also have FFIs. The FFI support is inconsistent between implementations, but of course the same could be said of JVM Clojure and Clojurescript.


> it turns out that lisps are pretty good at calling into java code

I think that gives the design of Clojure too little credit. I can imagine MUCH worse ways to do it than Clojure's well-designed Java interop.


I'm not so sure about implicit currying in CL. The uniform syntax that provides homogeneity between code and data is exactly the main benefit, because, for example, you can go ahead and add implicit currying (and much more) yourself. An example result of this benefit is, from memory, CLOS: a fully featured object system that is composed of extensions to the core language. In fact, I think it has many powerful features not present in archetypical OO languages. Multiple dispatch is one feature I recall off the top of my head.
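
To make "you can add it yourself" concrete, here is one rough way to layer curried definitions onto CL with a macro (DEFCURRIED is invented for this comment, and calls still need FUNCALL, so it isn't truly implicit):

  (defmacro defcurried (name args &body body)
    ;; build nested one-argument lambdas at macroexpansion time
    (labels ((nest (rest-args)
               (if rest-args
                   `(lambda (,(first rest-args)) ,(nest (rest rest-args)))
                   `(progn ,@body))))
      `(defun ,name (,(first args))
         ,(nest (rest args)))))

  ;; (defcurried add3 (a b c) (+ a b c))
  ;; (funcall (funcall (add3 1) 2) 3) => 6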


A disadvantage of intensional purity (limiting the language to guaranteed-pure constructs, which is enforceable by a compiler) is, ironically, that it gives the programmer less control over how extensionally pure functions (i.e., no observable effects, which is obviously undecidable) are evaluated. In ML (an impure functional language), you can evaluate extensionally pure functions strictly (the straightforward manner), lazily (which Haskell gives you out of the box, but in ML requires reference cells), memoizing previously computed results (which Haskell can't easily do, but in ML can be done with reference cells), etc.


Lisp is not a functional language and should never be used as if it is somehow "functional".

Lisp is a homoiconic meta-language, and this is the only thing that matters, not all that functional-schmunctional crap.


Back in 2012 when I was sure I wanted to master some dialect of Lisp but wasn't sure which one yet, I read about 80% of Practical Common Lisp before I noped the heck out of there. The whole reason I was drawn to Lisp was for its semantic and syntactic purity, simplicity, and consistency, and out of the three popular Lisps at the time, CL felt like it had these the least. Ultimately I ended up going with Clojure after a year of flirting with various Schemes.


I have tried lisps looking for the famed syntactic purity but I didn't really find it.

For example, following The Little Schemer books, they don't print Scheme as the actual ASCII you need to type; instead they use typography like bold, italics and Greek symbols that you have to replace with the uglier reality, to distinguish things like s-exprs from lists. It feels like they are embarrassed by the kludgy real syntax, so they hide it and push the difficulty onto the reader.

To give Clojure a whirl, I was prepared for Lispy paren extravagance, but when translating a small program to my best guess at idiomatic Clojure (I'm sure I failed) there were so many non-s-expr syntactic special cases for things like list comprehensions and common macros that the result felt like it was jealous of non-s-expr languages and didn't actually want to be a Lisp.

In truth, I believe a strong contrast between different syntactic elements helps readability, so I don't think the s-expr regularity I expected is something I actually want.


> ... they use typography like bold, italic and greek symbols that you have to replace with the uglier reality to distinguish between things like s-expr from lists.

My main complaint with Scheme is that it's hard to distinguish a "call" from a plain list. For example, in `(let ((foo 1) (bar 2)))`, `let` is a call, but `foo` and `bar` are just being assigned, not called. Clojure uses `(let [foo 1 bar 2])`, so you can tell that `let` is a call, but `foo` and `bar` are now unwrapped and placed inside square brackets to signify plain data, not a call.

> To give clojure a whirl, ... there were so many non-s-expr syntactic special cases for things like list comprehensions and common macros

Everything is still an s-expr. The square and curly braces still follow the same rules, but they just mean they are not "calls" like parens would normally signify.

For a list comprehension example: `(for [m messages] (str m))`, `for` and `str` are calls and `[m messages]` is just data. It could be unwrapped like `(for m messages (str m))`, but it's wrapped in square brackets as a visual indicator for what's important. But if it was wrapped in parens like in Scheme, it could be confused as a "call".

Clojure is very well-designed to solve visual problems of Scheme and other Lisps. I recommend looking at it again :D


From what I can tell, Scheme (or it may be a Racket thing) allows the use of square brackets - this aids in readability greatly. There is no type difference as there is in Clojure.


With the downside being inconsistent style across the community, making for more overhead when reading code (see also macro abuse). Scheme has bigger fragmentation problems anyhow.


I sympathize. Different concerns pull lisp in different directions.

CL is, and always has been, about getting stuff done and isn't above having some warts, especially historical warts.

Scheme was more academic and elegant, but not great for real world use until some implementations "fixed" that. Racket, for instance, isn't quite a proper Scheme any longer, but is definitely practical.

Clojure looks interesting, but I have absolutely no need or desire to use a JVM-based language.


Maybe give it another look sometime? You rarely have to interact with the Java ecosystem if you're not using Java libs, and the JVM != Java. There's also ClojureScript, which is the same, but runs in browsers, Node, etc.


I don't think purity and simplicity are virtues of Common Lisp. It's more of a pragmatic, industrial, old language, meant to unify the competing dialects back in the 1980s.


Old as in so old that it goes directly back to the first LISP, which was implemented as FORTRAN subroutines running on a vacuum-tube computer, the IBM 704, which was at least advanced enough to use magnetic-core memory.

Scheme rethought a number of things with the experience of a couple of decades of developing languages, and Common Lisp included one of its breaking changes, lexical scope (I think GNU Emacs Lisp and AutoCAD's AutoLISP are the only widely used dialects with dynamic scope, and they're obviously not primarily language implementations).
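
For anyone who hasn't seen the difference, a small Common Lisp sketch (my own example): a special variable's rebinding is visible to the functions it calls, while a lexical binding is fixed by where it appears in the source.

  (defvar *depth* 1)              ; special => dynamically scoped
  (defun report-depth () *depth*)

  (let ((*depth* 2))
    (report-depth))               ; => 2, the callee sees the dynamic rebinding

  (let ((depth 1))
    (flet ((report () depth))     ; closes over the binding it can "see"
      (let ((depth 2))
        (declare (ignorable depth))
        (report))))               ; => 1, the lexical binding is unaffected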


>semantic and syntactic purity, simplicity, and consistency

Who wants simplicity? People learning the language, so they have less to learn and a lower cognitive overhead when starting out. People teaching the language, so they have less work. Language implementors, so they have a much shorter period of time between prototype and release of a language implementation.

Who doesn't want simplicity? People who've already learnt the language and are proficient with it and use it regularly, having grown accustomed to the features it provides that can't be properly added afterwards as macros or libraries.


The Common Lisp standard is definitely neither pure, nor simple, nor consistent, but it is pragmatic, practical and powerful. Scheme is a fun learning toy, but I wouldn't use it for a production system.

As for Clojure? I'd like to get around to taking a serious look at it someday, but I do somewhat emotionally & irrationally blame it for stealing the momentum of the Lisp resurgence of the late 2000s. It probably has some good ideas, but it's also pretty radically different from Lisp, which is (under the parentheses) a fairly traditional language.


FWIW, PicoLisp[0] is one of those Lisp dialects that went way out of its way to be syntactically simple/pure/consistent. Whether this is a good idea, I don't know, but it is nevertheless an interesting example of a modern-made Lisp that actually considers having fewer data types (i.e. numbers, symbols and lists) to be a virtue.

[0] = https://en.wikipedia.org/wiki/PicoLisp


I have been interested in taking PicoLisp for a spin, but I've found the friction of installing on OSX a bit much. Do you know of an easier way to use PicoLisp, without re-hacking gcc back into the system?


I have no experience compiling anything on OSX, so I am not sure if I will be very helpful.

Looking at the download page, it looks like the OSX version is 32-bit only: "It should compile and run on 32-bit GNU/Linux, FreeBSD, Mac OS X (Darwin) and Cygwin/Win32 distributions, and on 64-bit GNU/Linux systems." Reading through the INSTALL file seems to confirm this. [0]

A while ago (2013), someone seems to have gotten a 64-bit version up and running on OSX through "make emu" (I don't know exactly what the emulation option is, so I can't say if it is relevant). [1]

[0] = https://github.com/evanrmurphy/PicoLisp/blob/master/INSTALL [1] = http://picolisp.software-lab.narkive.com/oCd1TRXz/pre-compil...


If someone wanted to get started into Lisp programming, what REPL and resources would you recommend? I began working through SICP a while back, but found it difficult to know what dialect of Lisp to use. As a complete newbie to Lisp, it was near impossible to debug why the code I wrote from SICP didn't work at all.


SICP is written for Scheme, not Common Lisp, so that might've been your issue. They're not at all the same, despite both being Lisps.

Just use Common Lisp. I use SBCL. Read Practical Common Lisp [1] &/ ANSI Common Lisp [2]. Skim the CLQR [3].

If you're just starting out, you can get away with just using the REPL. You might get that ^]]A crap when you up-arrow to previous expressions, so install rlwrap and run 'rlwrap sbcl' instead of just 'sbcl'.

Once you get tired of that and want to use an editor, get Emacs (on OS X: see [4], on Ubuntu: apt-get install emacs24). Install SLIME [5] through package-install to send expressions down from a .lisp file to the REPL (with C-x C-e).

If you get stuck, try looking for YouTube videos (or add an email address to your HN "about" section).

[1] http://www.gigamonkeys.com/book/

[2] http://www.paulgraham.com/acl.html

[3] http://clqr.boundp.org/download.html

[4] https://emacsformacosx.com/

[5] https://github.com/slime/slime


For SICP, you need a Scheme implementation (specifically, one that supports IEEE Scheme if you're using the second edition of the book -- R5RS, the most widely-implemented standard, is a strict superset of IEEE Scheme and works just fine with the code in the book, apart from the non-standard "picture language" stuff in chapter 2).

I highly recommend using Racket with the DrRacket IDE and Neil van Dyke's SICP package from the Planet package repository. It's the easiest way to get up and running with an editor and a REPL, and it's got support for the "picture language" in chapter 2.

You could also give it a shot with MIT Scheme -- I've got a feeling it's the implementation SICP (2e) was developed with. It's also got a built-in Emacs-like editor called Edwin. Launch the MIT Scheme REPL and type `(edwin)` to get at the editor.

GNU Emacs has an extension called Geiser (available in MELPA) that works well to integrate with Racket and Guile. Otherwise, the cmuscheme package that ships with Emacs gives you rudimentary REPL integration with just about any Scheme interpreter (and quite a few Schemes, such as Gambit, come with packages to better integrate with cmuscheme and scheme-mode).

If you're not an Emacs user and you're not willing to invest the hour or so it takes to get a handle on the basics, you can, of course, use any editor you want and use the `load` procedure (which, iirc, was omitted from IEEE, but most Schemes have it) at the REPL to import definitions from a file (or there's always good ol' copy+paste).

If you want to learn Common Lisp rather than Scheme, you can check out "Successful Lisp" by David Lamkins and/or "Practical Common Lisp" by Peter Seibel, both of which are freely readable on the Web. You'll also want to keep the ANSI Common Lisp HyperSpec by Kent Pitman handy as a reference (also free on the Web). For this, you can grab virtually any ANSI Common Lisp implementation (GNU clisp, SBCL, Clozure CL, ABCL, etc.). GNU Emacs has excellent support via the SLIME package (available from MELPA), or the other-editor with the `load` function will work here, too.

If you want to learn Emacs Lisp, GNU Emacs comes with an introductory programming book that uses Emacs Lisp, "An Introduction to Programming in Emacs Lisp" by Bob Chassell, as well as the Emacs Lisp Reference Manual. As you might guess, GNU Emacs has the best editor support for Emacs Lisp, though Vim at least has syntax highlighting for it.

If you want to learn Clojure, the only freely-available book I'm aware of is "Clojure for the Brave and True" by Daniel Higginbotham, though I'm sure there are others. MELPA has the CIDER package available for Clojure support in GNU Emacs. I'm not familiar enough with Clojure at this point to know what the equivalent to `load` is, though.

At this point, I've pretty much exhausted my usefulness on this matter. If you want to learn T, ISLISP, EuLisp, Le Lisp, Arc, or any other Lisp dialect for that matter, I can't help you.

After all that, I highly recommend starting with SICP. It's really a great introduction to programming, and it also covers quite a few concepts that will likely be new to you even if you're already a programmer (mostly in chapters 3-5). (Disclaimer: although I like most of the Lisp dialects, my favorite is absolutely Scheme, so I may be biased ;)). Pick an (IEEE or R5RS) Scheme interpreter, have a glance over its documentation, and you should be good to go!


I recently rewrote many of my shell scripts in Guile, a really fun experience. And tail-call recursion makes it pretty easy to reason about the code; no more looking up Bash cheat sheets every time I write a script.


[flagged]


2 things:

1.) that's more a statement of the hardware than the software, and they may be running on a tight budget (I have no idea b/c I can't see the site).

2.) If it has 10 upvotes, then that probably means that a significantly higher # of people are trying to view the site so they can decide whether or not to vote, which is not a trivial situation if you aren't prepared for it.


Would be fun to have a tracker of how many clicked the link from HN as well


Have you read the article? It's a guide to ASDF 3, a CL build system, and to features that help with scripting. There's nothing "no code consultant" about it. It's not even a philosophy/methodology piece.

The above comes across as outright dismissive.


No, I hadn't read the article. The site was down...


If you don't know the most common causes of a site going down, do you really have the right to ask people if they really have the right to tell people how to write code?


Not to mention they're not actually telling people how to write code.


"Weak programming"?


I rest my case.


"The site went down, but it's not my fault. Even though I'm the guy coding it". ok, 10x.


Please stop posting unsubstantive comments.


You clearly don't know what actually causes most site outages. Move along.


And you're clearly the arbiter that gets to tell people what they do and don't know. I guess pretending to be competent online helps you, in some way.


Just pointing out the facts. Some of us do clearly know more about this issue than you do, so not sure why you are continuing in this manner.


Because four times now you've said "I know more than you" and have provided nothing of substance to back your claims. I know this is HN and "what you say" means more than "what you do", but hipster bravado will only get you so far.


We've banned this account for repeatedly violating the HN guidelines.

If you don't want it to be banned, you're welcome to email hn@ycombinator.com.


Other people in this thread have pointed out the erroneous assumptions in your comment. In this particular example, you are showing your ignorance of the discussion and continuing to respond in a spoiled manner. Please don't contribute to HN this way.


[flagged]


I don't see anything wrong with that. Just because it doesn't use the fanciest and newest version of Bootstrap doesn't mean the site is bad. Sure, it uses an old version of HTML, but what the heck? I can read it just fine and Safari's reader works perfectly -- much better than on most of the sites that use modern CSS. I'd rather have a site that loads quickly than a site that loads tons of MBs of scripts, big images and fonts (and thus eats up all my mobile bandwidth) just to show a page that breaks Safari's reader and disables zoom so I can barely read it. I'm not saying that all of those fancier sites are like that, but a lot are. Especially when they're made by backend engineers who don't know much about UX design. I'd rather get a well-made HTML 4 site than a badly made HTML 5 site.


Looks like it was created with an older version of Scribble: http://docs.racket-lang.org/scribble/

I'm not sure why that would call the contents into question.


This is sickening-grade bikeshedding. Next time, spare us, and say it to your rubber duck instead.


I hope he doesn't look at HN HTML code.


Looked. So what? There's a difference between "ugly" and "invalid". Both cases are bad but the first one has much more impact on end users.

Edit to answer "how is it invalid": is this not enough? http://validator.w3.org/check?uri=http%3A%2F%2Fnews.ycombina...


This is the HN webpage; I meant the article page's doctype.

Taken from the W3C:

    <!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
	       "http://www.w3.org/TR/html4/loose.dtd">

article doctype:

    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
		"http://www.w3.org/TR/html4/loose.dtd">


How is it invalid?


deleted b/c brigading


Having an insight into one area of computing doesn't make you an expert in all areas of computing.


It certainly makes some people think it does, though.

They can't separate the content from the medium.


deleted b/c brigading


Consider if I was an expert in databases. Why would the appearance of a site dedicated to databases matter? It'd be completely unrelated to the topic I'd be sharing information on, and time I could spend improving the site could be time I spend improving my knowledge even further.

As for this Lisp scripting site, for all I care the author could've shared the information via a Pastebin site and I would've got the same value out of it.


And this happens regularly! I know of experts in programming languages[0], compression[1], graphics[2, 3], and computational geometry[3] who all have sites that both have excellent content, and look fairly awful.

[0]: http://piumarta.com/index.html

[1]: http://cbloom.com/

[2]: http://iquilezles.org/ (okay, this one doesn't look that bad in retrospect)

[3]: http://paulbourke.net/


[0] and [3] look great, unless you are opposed to simplicity. They sort of follow "brutalism" in web architecture.


Except that most pastebins would at least make the text more readable than it is there. Especially on mobiles.

Markdown with Strapdown inclusion, a pastebin, a CMS, bare Bootstrap... anything would be more readable than this.


I feel as though this is at least somewhat relevant: http://motherfuckingwebsite.com



They made it washed out with poor contrast and made the lines annoyingly short. Seems like the first site is more readable to me.


Everything we know from centuries of design experience, and everything science knows about the eyes and how we read, tells us they made it better than the first site.

There's not even a parallel universe where the first one is better -- and it's not a subjective issue either.


What papers say that low contrast is better than high contrast black on white? There is research to show that black on white is better than white on black, but I doubt there's any to show that poor contrast dark grey on light grey is better than black on white.

It is just a fashion, there's no science to say that low contrast is better. http://contrastrebellion.com/

Shorter lines I might grant you, though.


>What papers say that low contrast is better than high contrast black on white?

Not about low contrast being better -- only about (slightly) reduced contrast compared to the full-on black-on-white blaze.

Black on white is better for readability, but the brightness can induce fatigue with longer-term use (and people use computers all day long) and also hurt knowledge retention (according to education-related research).

A slightly off-white background is better there than shining a white backlight into your eyes for hours on end (and adjusting monitor brightness achieves the same effect, color-wise, as having a "less white" background in the first place, and is recommended by doctors).


Note that the website you linked doesn't have any sharp black on white.


[flagged]


This is the website of John Walker, cofounder of Autodesk: https://www.fourmilab.ch/

There are lots of good articles on there, but the site uses frames. Does that detract from the quality of the content? No.

Paul Graham's site: http://www.paulgraham.com/

Has a HTML 4.01 Transitional doctype and uses tables for layouts. Does that detract from the content? No.

Hell, you're on Hacker News. It doesn't even define a doctype and it also uses tables for layout.


[flagged]


> You are obviously trying to do some appeal to authority, which doesn't work on me. No one is an authority for me.

You're mistaken, I'm actually using these people as examples of their technical ability. People may have a website, but not be web devs. Judging people and the contents of their site on the way their site looks is just wrong.


> You are obviously trying to do some appeal to authority, which doesn't work on me. No one is an authority for me.

That attitude is a pretty good way to ensure you learn nothing from anyone. I hope for your sake you grow out of it!


>You are obviously trying to do some appeal to authority, which doesn't work on me.

No, he is obviously giving counter-examples of content in poor markup, and how that doesn't stop it from being great content.

Whoooosh...


I am not sure if this is a case of what this website calls "composition/division" or "genetic":

https://yourlogicalfallacyis.com/composition-division


Now that scripting languages are almost dead?


http://www.tiobe.com/tiobe_index -- 4 of the top 10 languages are scripting languages. Not bad for something that's almost dead.


What does that tell us? That there is a lot of legacy Perl and PHP code around that has to be maintained? I personally used to like Lisp (Scheme, actually) and Ruby, but I would never use them (or another dynamically typed language) again. Never! I'll take another look at Common Lisp as a (scripting) language when optional static typing is available, standardized, and commonly used. And no, Typed Clojure and (unfortunately also) Typed Racket aren't viable alternatives.



