
A difference between Haskell and Common Lisp - psibi
http://chrisdone.com/posts/haskell-lisp-philosophy-difference
======
lispm
Actually Common Lisp supports a gazillion different programming styles.

The version with small functions, similar to the Haskell version:

    
    
        (subseq
         (remove-if
          (complement #'numberp)
          (butlast list 3))
         0 5)
    

In the above Common Lisp code, we use four different functions, each of which
does one task (a rough Haskell rendering is sketched after the signatures):

    
    
        * subseq sequence start &optional end => subsequence
        * remove-if test sequence => result-sequence
        * complement function => complement-function
        * butlast list &optional n => result-list
    
    
    

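For comparison, a rough Haskell rendering of the same pipeline (a sketch only,
not from the article: `p` stands in for `numberp`, since Haskell lists are
homogeneous, and `dropLast` is a made-up helper because the Prelude has no
direct `butlast`; also, `take 5` happily returns fewer elements where `subseq`
would signal an error on a short sequence):

    -- drop the last three elements, keep the elements satisfying p, take the first five
    dropLast :: Int -> [a] -> [a]
    dropLast n xs = zipWith const xs (drop n xs)

    firstFive :: (a -> Bool) -> [a] -> [a]
    firstFive p = take 5 . filter p . dropLast 3
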
For different approaches see Common Lisp libraries like Series or Iterate...
Iterate is LOOP on steroids.

[http://series.sourceforge.net](http://series.sourceforge.net)

[https://common-lisp.net/project/iterate/](https://common-lisp.net/project/iterate/)

> This is known as composability, or the UNIX philosophy. In Lisp a procedure
> tends to accept many options which configure its behaviour.

Check out the UNIX man pages for tail, grep, ... to see how strange the above
quote about the 'Unix philosophy' is. In reality, Unix commands are programs
with an obscene number of configuration options, sometimes piping data around
as text, sometimes glued together by strange shell languages...

~~~
sdegutis
> Check out the UNIX man pages for tail, grep, ... to see how strange the
> above quote about the 'Unix philosophy' is.

The "UNIX philosophy" is good, it's just that UNIX doesn't really follow it.

The C and shell programs that make up UNIX commands are actually UNIX's own
little _programming language_, complete with ad-hoc data types.

Look at `cat` for example. It might as well be `str + str...`. Or look at
`grep`; it's basically `filter` or `reduce` at heart. But it has to do this
per line, and it has no metadata about the line; it literally just works on
string contents. The `head` and `tail` commands aren't much different from
`take` and `drop`. Heck, there's even a `sort` command.
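
A loose sketch of that mapping in Haskell, treating a "file" as a list of
lines; the helper names here are made up for illustration, and the real
commands of course have far more options:

    import Data.List (isInfixOf)

    -- cat a b    ~ linesA ++ linesB
    -- grep pat   ~ filter (pat `isInfixOf`)
    -- head -n 5  ~ take 5
    -- tail -n 5  ~ takeLast 5 (no Prelude equivalent; sketched below)
    grepLike :: String -> [String] -> [String]
    grepLike pat = filter (pat `isInfixOf`)

    takeLast :: Int -> [a] -> [a]
    takeLast n xs = drop (length xs - n) xs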

That's why shells that actually use a programming language (e.g. eshell) don't
seem so crazy to me. To be honest, I use eshell way more than I use bash, and
half the functions I use there are written in lisp rather than bash or C.

~~~
lispm
> The "UNIX philosophy" is good, it's just that UNIX doesn't really follow it.

It never did, beyond some examples in beginner books.

> Look at `cat` for example. It might as well be `str + str...`. Or look at
> `grep`, it's basically `filter` or `reduce` at heart.

Then look at the options of `cat`. On my Linux system cat has a -n option,
which numbers the output lines. If it were following the 'UNIX philosophy',
this option would not exist.

~~~
qwertyuiop924
UNIX may have been about small commands ("cat -v", anybody?), but its most
important property is COMPOSABILITY. You could use small programs to build
larger programs, allowing you to do things by gluing together code that you
would previously have to write new programs to do. It made this very easy, and
you could also do it from within C, so it wasn't an either/or situation. This
is something your namesake system could have learned to do. The Right Thing
isn't always the right thing.

~~~
pjmlp
That was already available in Xerox PARC systems, by making use of function
call composition in the Interlisp-D REPL, Builders in the Smalltalk
transcript, or the live debugger in Mesa/Cedar.

The UNIX composition is only a novelty for those who never saw other OSes
that surfaced at the same time. After all, UNIX just adopted the idea from
MULTICS.

~~~
dllthomas
_" The UNIX composition is only a novelty for those that never saw other OSes
that surfaced at the same time. After all UNIX just adopted the idea from
MULTICS."_

One could certainly chain programs through intermediate files. My
understanding is that pipelines were new in Unix, and Wikipedia agrees:
"The pipeline concept was invented by Douglas McIlroy and first described in
the man pages of Version 3 Unix."

~~~
pjmlp
Yes, the concept was popularized by UNIX, but anyone who spends time doing
computer archaeology will find similar patterns in other OSes, like the ones I
listed.

Don't forget that back then computing was developed in silos, with knowledge
only shared at conferences or when researchers switched
universities/companies.

So it was quite common that researchers across the globe would come up with
similar discoveries.

------
fnordsensei
Is this really a philosophical difference? Granted, I've only used Clojure as
far as Lisps go, but composability seems to be something that's emphasized.

Rather than

    
    
       (remove-if-not #'p xs :count 5 :start 3)
    

It seems to me like most Clojure users would do something like

    
    
       (->> xs (drop 3) (filter p) (take 5))
    

which is much closer to

    
    
       take 5 . filter p . drop 3

~~~
catnaroek
Minor nitpick. Technically, this:

    
    
        #(->> % (drop 3) (filter p) (take 5))
    

Is equivalent to:

    
    
        take 5 . filter p . drop 3

~~~
fnordsensei
Is

    
    
        take 5 . filter p . drop 3
    

An anonymous function call in Haskell? I don't know the language (although
it's on my to-learn sequence).

~~~
catnaroek
It's an anonymous function, which is not being called.
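
For illustration (not part of the original comment), applying it just means
handing it a list; here `even` stands in for `p`:

    ghci> (take 5 . filter even . drop 3) [1 .. 20]
    [4,6,8,10,12]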

------
agentultra
Common Lisp has these kitchen-sink functions and macros because it was a
standard developed by a committee whose goal was to incorporate several
popular implementations of Lisp that each had several decades' worth of
cruft. Lisp was a big enough business at the time that having several mutually
incompatible versions of Lisp was making knowledge sharing and business
difficult. And having a standard was important for many reasons.

 _update: spelling and..._

The real philosophical differences between Haskell and Lisp are basically
apples and oranges. Lisp's composition strategy is the recursive application
of symbolic expressions... the famous EVAL and APPLY. I don't know enough
about the theoretical underpinnings of Haskell to make any kind of apt
comparison but my intuition suggests that to do so would be moot. They're just
too different.

~~~
lispm
> standard developed by a committee

You know that Haskell was developed by a committee and was designed to bundle
the various research streams on lazy / statically typed / purely functional
programming languages?

From the Haskell history:

> ... to discuss an unfortunate situation in the functional programming
> community: there had come into being more than a dozen non-strict, purely
> functional programming languages, all similar in expressive power and
> semantic underpinnings. ... It was decided that a committee should be formed
> to design such a language...

[http://haskell.cs.yale.edu/wp-content/uploads/2011/02/history.pdf](http://haskell.cs.yale.edu/wp-content/uploads/2011/02/history.pdf)

Now about Common Lisp:

> it was a standard developed by a committee

That's basically nonsense.

The core of Common Lisp was designed by mostly five people (the 'gang of
five': Scott Fahlman, Guy Steele, David Moon, Daniel Weinreb, and Richard P.
Gabriel) in 1981/82. This group was supported by various Common Lisp
implementors - around 30 people. Steele wrote the book Common Lisp: The
Language and published it in 1984.

The ANSI committee to expand and clean up the language was set up years later.

> incorporate several popular implementations of Lisp that each had several
> decades worth of of cruft

That's also nonsense. Common Lisp is mostly based on a single language: Lisp
Machine Lisp (aka Zetalisp), which is an extended version of Maclisp. Groups
developing other successors to Maclisp joined: NIL, Spice Lisp and S1 Lisp.
Lots of things in Zetalisp were new in Lisp and it took quite some time to get
it into Common Lisp, which itself also brought new things into Lisp (like type
annotations, sequences, ...).

So Common Lisp is directly based on Lisp Machine Lisp, which was only a few
years old. NIL, S1 Lisp and Spice Lisp were brand new and morphed into Common
Lisp. Spice Lisp morphed into CMU Common Lisp, from which the now popular SBCL
was later forked.

The style of using keyword arguments in parameter lists was brand new in Lisp.
It had appeared in Lisp Machine Lisp just a few years before; LML got it from
an obscure Lisp dialect called MDL ('Muddle').

> Lisp was big enough business at the time that having several mutually
> incompatible versions of Lisp was making knowledge sharing and business
> difficult.

Lisp was emerging as an implementation/research language for the military.
DARPA called for a Lisp standard, because they were getting several Lisp
applications and each had its own version of Lisp...

~~~
agentultra
As always, thank you for the clarification. My 6-month-old is teething and
waking me every 40 minutes starting around 3 am. I'm mostly recalling from
hazy, caffeine-fueled memory. I like your posts -- so informative.

I admit to knowing little about Haskell's history. I had assumed it was an
outgrowth of ML -- itself having an interesting history intertwined with the
LCF theorem-proving system. Thanks for the clarification.

>> it was a standard developed by a committee

> That's basically nonsense.

In my much-abridged comment, yes, I understand it could sound that way. I was
referring to the entire process, but mainly the ANSI committee. I was also
under the apparently mistaken impression that there was more than one Lisp
involved.

    
    
      In April 1981, after a DARPA-sponsored meeting concerning the splintered Lisp community, Symbolics, the SPICE project, the NIL project, and the S-1 Lisp project joined together to define Common Lisp. Initially spearheaded by White and Gabriel, the driving force behind this grassroots effort was provided by Fahlman, Daniel Weinreb, David Moon, Steele, and Gabriel. Common Lisp was designed as a description of a family of languages. The primary influences on Common Lisp were Lisp Machine Lisp, MacLisp, NIL, S-1 Lisp, Spice Lisp, and Scheme. Common Lisp: The Language is a description of that design. Its semantics were intentionally underspecified in places where it was felt that a tight specification would overly constrain Common Lisp research and use.
    

Quite right.. it's all there in ye old Hyperspec. Thank you for pointing that
out.

Cheers.

~~~
lispm
> My 6 month old is teething and waking me every 40 minutes starting around
> 3am.

Congratulations!

------
nsfyn55
I'm not sure I agree at all with the premise of this article. Languages are a
syntax; they don't have "philosophies." They may have design elements that
reflect or support the philosophies of their designers (e.g. functions as
first-class citizens), but the author doesn't provide a critique of what they
believe these to be, instead looking at a few std lib functions that some
yahoo implemented.

Look at something like Java. Java has a core set of design elements. It was
built by a person with some philosophical leanings ("everything is a class",
"checked/unchecked exceptions are different things", "no first-class
functions"). Then it has standard libraries (I/O, collections,
threading/synchronization); each of these was built by a person with their own
set of biases and understandings. The original Collections implementation has
lots of mutable data structures; then later Josh Bloch decided that he didn't
like that anymore and started adopting immutability as a core design
philosophy. Immutability was not previously considered important when
evaluating a Java implementation. What you end up with is a mishmash of
different opinions that only gets more varied as you go.

Some 3rd party libraries like Guava didn't jibe with the philosophical
leanings of the language itself and looked for workarounds. They went as far
as to create their own implementation of functions as first-class citizens in
a language that was expressly designed to omit them. Some commonly used
Android libraries do this as well.

My point here is that "language" can mean lots of things. It can refer to the
language itself, its runtime, the syntax+runtime+community, what is idiomatic,
and so on and on. People and groups of people have the philosophies. I'd like
to see the author point to something a little more critical about the
differences between these two concepts. The premise is a little muddled.

------
jlarocco
Some of the Lisp and Haskell code examples aren't doing the same thing. For
example, these two do completely different things:

    
    
        (remove-if-not #'p xs :count 5 :start 3)
    
        take 5 . filter p . drop 3
    

I don't like Haskell, and it's not immediately obvious to me how to achieve
what the Common Lisp is doing, so I won't bother, but one way of writing the
Haskell in CL would be:

    
    
        (loop for x in (subseq xs 3)
              when (p x) collect x into result
              until (= (length result) 5)
              finally (return result))
    

Another option would be to use subseq and remove-if-not, etc, but without lazy
evaluation, the loop version will be more efficient. And though some Lisp
people dislike LOOP, I like that it's easy to read, if not always easy to
write ;-)

About the topic of the article, though, to me this seems like less a
philosophical difference than a result of Haskell not having easy to use
default and optional parameters. There's currying, but it's not a great
substitute, and it's a little awkward to use.

I like the Common Lisp way, even if it's crufty at times, because the keyword
arguments to functions like remove-if-not and sort are easier for me to use
than chaining a half dozen functions. I don't have to think about whether I
need to call filter before or after take or drop, etc. At the end of the day,
it's a personal preference, though.

Another advantage is that I don't have to "roll my own" for common idioms.

------
m0skit0
IMHO whoever wrote this article should change the title to "A philosophical
difference between Haskell and Common Lisp". There are a lot of LISPs and not
all of them follow Common Lisp's philosophy.

------
qwertyuiop924
And us schemers would express it like this:

(cut take 5 (filter <> (drop 3 <>)))

or, without the cut macro:

(λ (pred list) (take 5 (filter pred (drop 3 list))))

And yes, in most schemes, you can use λ as a synonym for lambda. Sometimes you
have to define it first, though. Anyways, that looks a lot like the Haskell to
you, doesn't it? It doesn't have the laziness, but other than that...

Oh! Oh! I almost forgot! you can also use the thrush combinator, if you use
the clojurian package:

(λ (list pred) (->> list (drop 3) (filter pred) (take 5)))

And yes, I think these are all the ways you can do this in Scheme. CHICKEN
specifically, with various macro packages.

Oh, wait! I forgot we have function composition, too, but you'd have to use
lambda or cut constantly to make the thing work because we don't have
currying. So these are all the ELEGANT ways to make this in chicken scheme.
With the cut SRFI. Or clojurian. And srfi-1, which is basically standard. and
it's better than the examples given, because pred and list aren't pre-
specified.

~~~
nickpsecurity
It's easier for people propping up one language over another language family
to ignore the similarly easy ways to do things in prominent members of that
language family. ;)

~~~
qwertyuiop924
Well, yes. Lisp and CL are not the same.

If anything, this has made me rethink possibly going over to common lisp,
which I was thinking of for the larger stdlib, because SBCL is very fast,
supports TCO, and you can disable case insensitivity.

~~~
nickpsecurity
Common LISP was made to smooth over the differences of the ancient LISPs and
standardize their situation a bit. If those don't matter to you, then the
alternatives are often better, SBCL being one of them. If I get back into
LISP, I think the Racket Scheme community looks like it's delivering the most
bang for the buck in terms of what the tools can do after investing time
learning them.

~~~
qwertyuiop924
Most schemes don't have a lot going for them in terms of libraries, which is
the primary reason I'd want to go over to Common. I really don't like Racket,
for mostly aesthetic reasons. It seems a bit overcomplicated, trying to stuff
a lot of cutting-edge research into its core, and forcing you to learn about 6
different models at once just to understand what its documentation is saying.
Its OO system is apparently lackluster, especially since most schemes fork
tinyCLOS for OO, and syntax-case is too complicated for its benefits.

I'll stick with CHICKEN for now. Its module system may be a bit awkward at
times (you can't cleanly bind all your dependent modules into one
executable), but it's much more minimal, and while its documentation may not
be the best, at least I can understand what it's saying.

------
jinfiesto
I'm not entirely sure that this is due to philosophical differences. The fact
that Haskell is lazily evaluated makes writing functions that do only one
thing much easier, since there is no performance hit for writing code like:

    
    
      take 5 . filter (not . p) . drop 3
    

In a strictly evaluated language, this would involve iterating over the list
three different times. (Kind of, but not really, since take 5 isn't going to
be that expensive.)

~~~
Veedrac
> there is no performance hit for writing code like:

> take 5 . filter (not . p) . drop 3

That doesn't seem to be the case in my testing. Writing this out as an
explicitly recursive function sped things up 3x on GHC -O3. (Dropping 10^8,
taking 10^7.)

FWIW, I also tested on Rust, Python (PyPy3) and Java (OpenJDK) with iterators,
iterators and streams, respectively. Python and Java were about as fast as the
manually recursive version, and Rust was two orders of magnitude faster than
that. And half of the time in Java was spent boxing, because Java can't reify
types away like Haskell.

It's true these are all using opt-in laziness, unlike Haskell's lazy-by-
default, so it's not a fair comparison. But I also see little reason to
believe that there's "no performance hit", given that that's exactly what I
saw.

------
catnaroek
The difference is not one of "philosophy", but rather of _using the right
types_. Haskell's so-called "list" type constructor is actually a type
constructor of _streams_. Unsurprisingly, streams are much better than lists
if you want to do stream processing.

~~~
wyager
>Haskell's so-called "list" type constructor is actually a type constructor of
streams.

Most papers consider streams to be lists _without_ a terminal constructor.
Haskell lists are a superset of streams. All streams have infinite lengths,
but only some lists have infinite length. (Again, this just comes down to
definition, but this is how most functional papers describe it.)
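
In code the distinction is just which constructors the type has (declarations
added here for illustration, not from the thread):

    -- A stream in this sense has no terminal (nil) constructor,
    -- so every fully defined value goes on forever.
    data Stream a = SCons a (Stream a)

    -- Haskell's built-in [a] is equivalent to this type, which also has the
    -- terminal constructor and so covers finite and infinite lists alike.
    data List a = Nil | LCons a (List a)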

~~~
catnaroek
In chapter 4 of his book, Okasaki defines a type of both finite and infinite
streams (which is really just the type of Haskell "lists"), and, in the
remainder of the book, he only uses finite ones.

A stream of infinite length, which you call "stream" without qualification, is
what I call "a function on the natural numbers". (Well, up to isomorphism.)

~~~
chas
Speaking of things that the Haskell type system lets you make explicit, the
isomorphism between streams and functions from the natural numbers means that
streams are "representable functors" in the jargon of category theory. [1]
Knowing that a data type is representable allows you to immediately build a
bunch of other interesting structures on the data type. [0]

[0] [http://covariant.me/notes/rep-functors.html](http://covariant.me/notes/rep-functors.html)

[1] [https://pamiz.wordpress.com/2014/02/13/the-functor-of-infinite-lists-is-representable-by-natural-numbers/](https://pamiz.wordpress.com/2014/02/13/the-functor-of-infinite-lists-is-representable-by-natural-numbers/)

~~~
catnaroek
Nitpick on both links. They say `alpha . beta = id = beta . alpha`. This is
wrong: `alpha . beta` composes to a _different_ `id` from `beta . alpha` (one
is the identity on streams, the other the identity on functions from the
naturals).

------
ZenoArrow
Perhaps it's just me, but I don't see that Haskell and Lisp are that similar,
other than...

1. They're both programming languages.

2. They both allow you to pass functions as arguments to other functions.

Am I missing something here? Why are the two linked? Is it because Lisp is
seen as the birthplace of functional languages (because of point 2)?

~~~
lmm
Yeah, people talk about "functional languages" as if they're a unified thing.
IMO the differences between the strongly typed ML/Haskell tradition and the
Lisp tradition are as big as the differences between either and "OO languages"
or "imperative languages", but that's not the way it's usually presented.

~~~
qwertyuiop924
That's nonsense. Lisp in the functional style and Haskell are two different
implementations of what are, at their essence, the same ideas. However, Lisp
is multi-paradigm, which complicates the comparison somewhat...

~~~
ZenoArrow
What are the essential features that they share?

~~~
qwertyuiop924
They're both primarily based on the lambda calculus, although Haskell is based
on a typed lambda calculus. They both emphasize purity over mutation. They're
both designed to make higher-orderisms idiomatic.

But the point isn't so much the features they share. I'd be the first to admit
that Lisp and Haskell are very different. But functional programming in both
is very much based on the same ideas. Claiming that they're as different as OO
vs imperative is like saying that washing dishes by hand is as different from
using a dishwasher as football is from baseball.

------
dgreensp
Title is misleading; this is about Common Lisp in particular.

~~~
Grue3
Well, Common Lisp is Lisp.

Scheme, Clojure etc. are lisps, with lower-case "l" ;)

~~~
qwertyuiop924
That's just wrong.

Clisp is the most popular lisp bearing the name, but the LISP, Lisp, or lisp
family includes Scheme and Clojure. If you just refer to Lisp, you may be
referring to Clisp, the Lisp family, or maybe even LISP 1.5, the Lisp
equivalent of V7 Unix.

~~~
lispm
Clisp is an implementation of Common Lisp. Common Lisp includes a core of the
original Lisp. Lisp programs from the 60s can either be run or ported to
Common Lisp with little or no effort.

Clojure is FULLY incompatible with any other Lisp dialect or Lisp derived
language. Porting code means 'rewrite'.

~~~
qwertyuiop924
Oh. Sorry. I was abbreviating Common Lisp. Whoops. Anyways, that's like saying
that Go, Plan 9 C, D, Java, and Cyclone aren't in the C family, because you
can't run ANSI C '99 on them and have it work. The Lisp family is diverse.

~~~
lispm
See C++, Objective-C and a bunch of other languages actually in the C family.

They all share C. Real basic C.

'The Lisp family is diverse.' You interpret it in a way that makes it
practically meaningless.

~~~
qwertyuiop924
No. The Lisp family is a real thing. Its members all share homoiconic syntax
and singly linked lists as a primary data structure, and most of them support
syntactic extension through macros.

While the validity of some of my examples may have been questionable, how is
Plan 9 C not in the C family? And just TRY to compile ANSI C on a P9C compiler
without significant modification.

~~~
lispm
Significant modification is something different than complete rewrite.

The Lisp languages share a common core language, code, ideas, literature,
community.

Derived languages like Logo, Dylan, Clojure, Racket, ... have their own core
language, their own code, their own ideas, their own literature and their own
community.

~~~
qwertyuiop924
Yes, but they also share a history and a set of syntax and ideas... well,
except Dylan, syntax-wise. But ultimately this is an argument about
terminology. And the majority of people would say that Scheme is a Lisp. A
Lisp. Not an independent language derived from Lisp. A LISP, in the LISP
family, because that is a thing. And if you want proof that that is what
most people think, you need only check Wikipedia:
[https://en.wikipedia.org/wiki/Category:Lisp_programming_lang...](https://en.wikipedia.org/wiki/Category:Lisp_programming_language_family)

~~~
lispm
That's what people thought twenty years ago. Today this does not matter anymore.

> Yes, but they also share a history and a set of syntax and ideas...

Clojure:

    
    
        user=> (cons 1 2)
        IllegalArgumentException Don't know how to create ISeq from: java.lang.Long  clojure.lang.RT.seqFrom (RT.java:542)
    
        user=> (cdr '(1 . 2))
        CompilerException java.lang.RuntimeException: Unable to resolve symbol: cdr in this context, compiling:(NO_SOURCE_PATH:2:1) 
    
    

What I say...

~~~
qwertyuiop924
... Well, maybe you could be right. I'm not sure. I do know that when most
people say "lisp," they think of scheme, clojure, etc. as in that category.

I like to think that I can admit I may be wrong, and in this case I think I
might be wrong.

------
iamcurious
I'm confused by the last example. There are no elements greater than 5 in the
list (1 2 3 4).

Also, I'm not sure why they used takeWhile instead of filter in the last
Haskell part.

~~~
Xophmeister
I assumed it to be a typo; i.e., it should say "less than".

`takeWhile` is different from `filter` in that it returns the leading elements
of the input list until some element fails the predicate, whereas `filter`
returns all elements that match the predicate. For example, if the predicate
is `(< 5)` and the input was `[9, 2, 3, 6]`, then `takeWhile` would return an
empty list, but `filter` would return `[2, 3]`.
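
The same example in GHCi, for illustration:

    ghci> takeWhile (< 5) [9, 2, 3, 6]
    []
    ghci> filter (< 5) [9, 2, 3, 6]
    [2,3]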

~~~
gkya
He should have used a better input, though, maybe 1, 2, 3, 4, 5. With the
given input it's a bit confusing; it's equivalent to filter even [1..4].
Also, the Lisp loop should have been checking for (>= i 5) to be equivalent
to the Haskell example.

------
hardwaresofton
The biggest difference between Haskell and Lisp is that Lisp is multi-
paradigm, while Haskell is not. Haskell is more opinionated, and makes a bunch
of decisions for you (that you can choose to work around/sugar/hack until
Haskell looks like something else/does what you want).

All the other things that Haskell comes with - strong typing, monads, lazy
evaluation - can be written into Common Lisp, but whether you need them is
often questionable.

~~~
reikonomusha
They can't be written in Lisp well, or at all. Strong typing with inference
can't be bolted on. Monads that take advantage of this typing can't be bolted
on. Changing Lisp to have lazy semantics across the board ain't gonna happen.

You might be able to write a Haskell interpreter from scratch that interacts
with the Lisp environment, but you can't feasibly transform Lisp itself.

~~~
emidln
Nobody told Dr. Tarver[1]. He built a series of lisps with tight integration
with the underlying Common Lisp platform, featuring a strong typing system
rooted in the propositions of sequent calculus[2][3]. Type declarations are
only on top-level forms, with all locals inferred. There seems to be work to
improve this in Shen Professional's new compiler. Shen offers tight
integration with the underlying platform (although it has been ported to a
number of runtimes (SBCL, CLISP, and V8 amongst others)[4]).

[1] -
[http://www.shenlanguage.org/history.html](http://www.shenlanguage.org/history.html)

[2] -
[https://en.wikipedia.org/wiki/Sequent_calculus](https://en.wikipedia.org/wiki/Sequent_calculus)

[3] - [http://www.shenlanguage.org/learn-shen/types/types_sequent_calculus.html](http://www.shenlanguage.org/learn-shen/types/types_sequent_calculus.html)

[4] -
[http://www.shenlanguage.org/download_form.html](http://www.shenlanguage.org/download_form.html)

~~~
reikonomusha
I guess I wasn't clear in my post.

Usually when people say "you can do X in Lisp", they mean that you can
integrate a feature into the language.

Don't have list comprehensions? Write a macro.

Don't have "thrush" syntax? Write a macro.

Don't have lazy evaluation? Write a... err. You can't write a macro[1]. You
can't write something which completely changes the entire calling semantics of
the language. What you can do is make a little sub-language which uses
different values and objects from Lisp, but you can't magically make the
standard library support lazy evaluation.

Same with types and type inference (a la Haskell).

Same with many other things.

Of course, _obviously_ , you can write a compiler that compiles to Common
Lisp, or C, or Scheme, or COBOL. But writing a compiler -- while Lisp is good
at that -- isn't really taking advantage of what it usually means for Lisp to
be extensible.

[1] For those who are ready to call out the many "lazy" libraries in Common
Lisp, perhaps even my own, you're missing the point. You can make a new
version of DEFUN and a new version of FUNCALL. But that doesn't make REMOVE-
IF, SUBSEQ, MAPCAR, LIST, CONS, etc. lazy. As such, many, if not most, of the
benefits that laziness brings, don't get applied to the language at large.

~~~
emidln
What are macros if not custom compiler extensions?

Sure, Shen is enough of a compiler extension to be its own language, but I can
call Common Lisp code from Shen and call Shen code from Common Lisp. Doing
either isn't very different from calling a native function (calling Shen from
Shen is similar to calling CL from Shen).

As far as [1] goes, I don't really see this as an issue with CL. CL has not
made the effort to rewrite its core in terms of generic functions. If this
were done, drastically changing CL's behavior for random new types would be
relatively painless. But the current CL standard chose not to use generic
functions everywhere for its library and core functionality. As such, it is
expected that if you drastically change the implementation, you end up with
separate functions. I think this is a huge draw of something like Clojure/JVM,
with its interfaces/protocols as a base layer, over a far more mature system
like CL/SBCL.

------
kazinator
> _In Lisp a procedure tends to accept many options which configure its
> behaviour. This is known as monolithism, or to make procedures like a
> kitchen-sink, or a Swiss-army knife._

This is the case in some areas of the Common Lisp language; it is not true of
Lisp as a family of dialects.

There are plenty of examples of Lisp functions or operators that just do one
thing: `cons`, `car`, the `lambda` macro operator.

    
    
       ;; CL
       (remove-if-not #'p xs :count 5 :start 3)
    
       ;; Haskell
       take 5 . filter p . drop 3
    
       ;; TXR Lisp: a dialect with ties to CL:
       (take 5 [keep-if p (drop 3 list)])
    
       ;; Compose the functions using opip macro
       ;; (result is a function object):
       (opip (drop 3) (keep-if p) (take 5))
    

TXR Lisp's library functions don't have :count, :start and whatnot. In
fact, there are no keyword parameters, only optionals. If you want to default
an optional on the left and specify a value for one on the right, you can pass
the colon keyword to explicitly take the default:

    
    
       (defun opt (a : (b 1) (c 2)) ;; two optional args
         )
    
       (opt 4 : 5)  ;; b takes 1, c takes 5.
    

The colon is just the symbol whose name is the empty string "", in the keyword
package. It makes for nice sugar and has a couple of uses in the language.
Note how in the defun it separates required args from optionals.

(Anyone else cringe at "UNIX philosophy"? How silly! This is the Unix
philosophy: let's reduce everything to a string in between processing stages
and parse it all over again, with the simplifying assumption that it always
has exactly the format we are looking for, without actually validating it.)

~~~
nutate
That's the JWZ perspective, but if you add strong typing you can basically
turn the "string" into "SomeType" and process records like that. If you look
at languages with a |> operator they often act like that.

------
tunesmith
A general question about function composition: when you're comfortable with
it, how do you think about it to yourself, or say it to yourself? What's your
shorthand mental model?

I was able to really intuitively get comfortable with unix piping a long time
ago, so instead of:

take 5 . filter p . drop 3

I'd be thinking, ok, take a thing, drop the first three, grep (or whatever),
now take the first five. It felt intuitive because each step solved a problem,
and then I'd move forward into the future and only have to think of one
additional solution (head -5), assuming I did the previous steps right.

Meanwhile, take 5 . filter p . drop 3 is in the opposite order, from right to
left.

Maybe I'm just saying that left association is easier to think about than
right association. Don't you feel that weird recursive bump in your head, the
increasing mental stack, when you are dealing with function composition and
right association?
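
(For what it's worth, Haskell can also be written in that left-to-right,
pipe-like order with `&` from Data.Function; a small sketch, with `even`
standing in for `p`:)

    import Data.Function ((&))

    -- the same pipeline spelled in shell-pipe order
    pipeline :: [Int] -> [Int]
    pipeline xs = xs & drop 3 & filter even & take 5
    -- equivalent to (take 5 . filter even . drop 3) xs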

~~~
klibertp
I like imagining functions as various pipes, the real world ones. Compose is
that short pipe that connects two others. I imagine that the data comes from
the right and needs to be output on the left of the function. I then can read
such line (take 5 . filter p . drop 3) twice. Once following the order of
creation (ie. we're laying our pipes first) and then the second time in
reverse, following the data and function applications. That did the trick for
me when I first learned about function composition coupled with
(auto)currying.

------
malisper
Lisp actually does have stream fusion, but it has it in the form of a
library![0]

[0]
[https://www.cs.cmu.edu/Groups/AI/html/cltl/clm/node347.html](https://www.cs.cmu.edu/Groups/AI/html/cltl/clm/node347.html)

------
vorg
Haskell has strong typing and lazy evaluation, which makes it easy for
functions to _take only one argument_ at a time. Although a function could
take a tuple parameter, it's usually rewritten to take each component of the
tuple as a separate parameter, which keeps the strong typing and built-in
currying simple, makes higher structures like monads possible, and gives the
language a syntax to suit this style. Lisp functions and macros OTOH _must be
variadic_ to enable the homoiconicity of the language. It's therefore much
more difficult for parameters to be typed, or to curry them.

These two styles make Haskell and Lisp mutually incompatible, unless they use
clunky add-ons like Template Haskell macros or Typed Clojure annotations. The
pure form of each language, however, is based on two mutually exclusive
foundations, i.e. strongly typed auto-curried parameters vs variadic untyped
macro parameters. The poster children of each language, i.e. monads and
macros, thus also don't mix well with each other.
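
A small illustration of curried versus tupled arguments (a sketch, not from
the comment):

    -- tupled: one argument that happens to be a pair
    addTupled :: (Int, Int) -> Int
    addTupled (x, y) = x + y

    -- curried: really Int -> (Int -> Int), one argument at a time
    addCurried :: Int -> Int -> Int
    addCurried x y = x + y

    -- partial application falls out of currying for free
    addThree :: Int -> Int
    addThree = addCurried 3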

~~~
pron
> higher structures like monads possible

What is it about the features that you mentioned that makes monads possible?
Lisp (or at least Scheme and Clojure, which I'm familiar with) makes monads
trivially possible -- just as they are in Haskell. They're not as useful
because those languages have other mechanisms that make monads largely
unnecessary, but they're no less easy to express.

~~~
lmm
Well you can't have implicitly resolved typeclasses without a static type
system. You could pass around explicit dictionaries with all your values or
some such, but most of the value of explicitly sequencing relatively minor
effects is only there if you have an extremely low-overhead way of doing so,
and a system that can verify the correctness of that sequencing at compile
time.

~~~
kd0amg
> You could pass around explicit dictionaries with all your values or some
> such, … if you have an extremely low-overhead way of doing so

A lot of that plumbing could be hidden from the user given the right dynamic
features.

[http://www.eighty-twenty.org/2015/01/25/monads-in-dynamically-typed-languages.html](http://www.eighty-twenty.org/2015/01/25/monads-in-dynamically-typed-languages.html)

~~~
lmm
It sounds like racket generics are at least partway on the road to a type
system. And those placeholder values seem a bit greenspunny - I'm not sure how
they'd interact with native racket features (e.g. macros). (I mean, you are
right, but in a degenerate sense you could implement typeclasses in any
dynamic language by having your program construct strings and writing a
Haskell compiler in that language that executed those strings at runtime. So
the more relevant question is whether monad techniques can be used effectively
in an idiomatic program).

~~~
kd0amg
The big difference is that generic dispatch is resolved dynamically.

I haven't yet found myself reaching for monads in Racket (or writing much non-
Redex Racket code) since tonyg wrote that post, so I don't know if there are
any odd interactions to watch out for.

------
rusabd
Having worked on one of the largest Common Lisp projects, I have to say this
is spot on. And monolithism isn't just visible at the level of functions; it's
visible at a higher level. Common Lisp nudges you to write monolithic
applications, and it's crucial to keep many details in one head (in the case
of a big project, many heads).

Also, I think Clojure is a Scheme, so the overall approach is different.

~~~
catnaroek
No, Clojure isn't Scheme. Scheme has hygienic macros and pattern matching on
syntax, which Racket's `syntax-parse` library in turn makes even better.

Clojure... well... has that ugly `defmacro` kludge.

EDIT: I was being kind of unfair. Clojure also has kickass collections in its
standard library, which Scheme doesn't have.

------
PuercoPop
Minor nitpick. The loop at the end could produce the range with 'for i from 1
upto 4' instead of a list literal, to be more similar to the Haskell code.

~~~
kiiski
That wouldn't be the same. The point was filtering a list (which could contain
anything, not just a sequence from 1 to 4).

The example seems a bit strange to me. Why does it have the check for i being
less than 5, when the problem is "get all elements _greater_ than 5, then just
the even ones of that set"? Was it supposed to be "remove all elements
greater than 5..."? (edit: although the code would then only work for sorted
lists; I think it'd be better to just loop over the whole list and have "(and
(< i 5) (evenp i))" as the condition for collecting)

------
grabcocque
You have to be clear about which Lisp. Clojure and Scheme are Lisps, and their
focus is very much towards simplicity and the use of combinators.

The complex, do-it-all-in-one huge macro is a Common Lispism, not a Lispism.

Secondly, Clojure, like Haskell, is focused on sequence abstractions, not
lists. Sequences can include collections, streams, observables, sockets and
many other kinds of process that can be modelled as an event stream.

~~~
lispm
> Clojure and Scheme are Lisps

Basically new languages with strong Lisp influence.

> very much towards simplicity

Scheme, maybe, until the early 80s. Later it grew to Common Lisp's size and
beyond. See Scheme R6RS, which is a complex language with tons of features and
is still underspecified.

> The complex, do-it-all-in-one-huge macro is a Common Lispism, not a Lispism.

Not really. Macros appeared in Lisp in the early 60s and many Lisp dialects
have used them in complex ways. For example, the 'famous' LOOP macro of Common
Lisp was actually developed in Interlisp in the early 70s (as a part of
'Conversational Lisp' / CLISP), redesigned in Maclisp/Zetalisp, and then
brought into Common Lisp. The original Common Lisp in 1984 did not even have
that macro in the language description; it was standardized several years
later, after a search for a better alternative failed.

~~~
qwertyuiop924
R6RS was basically DOA, with only 3 or 4 implementers (Guile, Ikarus, Ypsilon,
and (partially) Racket) actually going along with it. Most of the other
implementers, most notably CHICKEN's Felix Winkelman, refused. R7RS is now
split into a small core, and a large standard library for practical, as
opposed to teaching, use, whose development STILL isn't done.

~~~
lispm
The R7RS homepage says something different:

>This is the home page for R7RS, the Revised⁷ Report on the Algorithmic
Language Scheme. This version of Scheme has been divided into a small
language, suitable for educators, researchers, and users of embedded
languages; and a large language focused on the practical needs of mainstream
software development.

>The report on the small language was finalized on July 6, 2013. It is
available in PDF format and as LaTeX source code. There are errata.

>The development of the large language is still in progress. For details on
the development process of both languages, see the ​Scheme Reports home page
and the Working Groups home page.

~~~
qwertyuiop924
Yeah, that's what I said. Large still isn't done.

------
sklogic
The main difference is: it is trivial to build an efficient Haskell
implementation on top of Common Lisp, keeping interoperability with the rest
of the system. And it is impossible to do it the other way around, to build a
Lisp on top of Haskell.

~~~
ZenoArrow
[https://wiki.haskell.org/Haskell_Lisp](https://wiki.haskell.org/Haskell_Lisp)

~~~
sklogic
I'm talking about an _efficient_ implementation with full interoperability
with the host. Interpreters and standalone compilers do not count.

------
srott
Or you could use Shen and spend less time philosophizing

[http://www.shenlanguage.org/](http://www.shenlanguage.org/)

~~~
devty
Could you elaborate? What about shen makes you say this?

~~~
catnaroek
Shen is like C++ in that it's bolted on top of a less typeful language (except
this time it's Common Lisp, rather than C), and is "type safe" as long as you
don't deliberately use a number of escape hatches.

~~~
qwertyuiop924
That's wrong in just about every respect.

Shen is a PLATFORM INDEPENDENT (not clisp-based) language with an emphasis on
functional programming, a novel and very powerful type system based on sequent
calculus, and OPTIONAL type checking, IF you want it. It is platform
independent because it is built upon an incredibly simple lisp that you can
build an interpreter for on top of almost any platform, so long as you can
guarantee TCO.

It really is Lisp-flavored Haskell.

~~~
catnaroek
The point of type checking is protecting abstractions, and it being "optional"
reduces its value to zero.

~~~
qwertyuiop924
Wrong. Sometimes type checking is unneeded. In addition, it can make
prototyping easy: write the code, add the annotations later. Of course, you
need some discipline to make it work, but of what good practice isn't that
true?

And if you really hate it, just type (tc +) at the start of your code. Magic!

