

Lisp evangelists that miss the point about syntax - Confusion
http://confusionist.tumblr.com/post/369255746/lisp-evangelists-that-miss-the-point-about-syntax

======
fexl
As much as I like Lisp and even pure functional languages (such as my own
Fexl), I still like that old-school loopy, branchy, bracey, break-ish, drop-
down to the iron, procedural feel of straight-up C and even Perl.

You won't get a language war from me, but there's a huge difference in "feel"
between those two major language classes, and strangely, I kinda like both of
them. Sure I can do some impressive reasoning in the functional realm, but
even in the procedural realm I can do iron-clad reasoning and correctness-
preserving transforms that (I hope) would make Dijkstra proud.

~~~
brazzy
It's no coincidence they feel different. The Lisp family of languages was
based on theoretical mathematical considerations (Lambda calculus), with
implementation as an afterthought.

The Fortran family of languages was based on the hardware's capabilities, with
theoretical considerations like expressiveness as an afterthought.

We are fortunate to have reached an age where we have CPU cycles and RAM to
spare so that languages that were not designed with performance-related
restrictions in mind are becoming practical for a wide variety of
applications.

~~~
fexl
I concur. Fexl can whip combinators around so fast it makes my head spin. But
I still _like_ writing C, even if it's only to write a Fexl interpreter.

Consequently the whole purpose of Fexl is to be a thin functional "layer" on
top of C. That way as a C programmer I can escape to functions to avoid all
the gnarly horrors of memory management and such, and as a Fexl programmer I
can escape to C to embrace the hard-edged bits and bytes. That way if you
asked me whether I was writing the thing in C or in Fexl, I couldn't really
give you a straight answer. ;)

~~~
aaronblohowiak
Off-topic: can you do a Fexl vs. Gambit Scheme comparison? I am interested in
the interleaving of functional programming and C.

~~~
fexl
Sure, I'm reviewing the Gambit code a bit. I'm really busy, so I'll have to
follow up later.

Right off the bat I can tell you that Fexl's combinators aim to be written in
readable C.

I suppose the simplest combinator to show would be "I" (the identity
function). In Fexl I have a file named "type_I.c" which says:

    
    
      #include "node.h"
      #include "type_I.h"
    
      static struct node *reduce(struct node *parent)
          {
          return parent->RIGHT;
          }
    
      static struct type type = { reduce, clear_pair };
    
      struct node *make_I(void)
          {
          return make_pair(&type,0,0);
          }
    

(Quick note: all node values are reference counted. Cycles are impossible, and
unnecessary because of the wonderful Y combinator.)

Speaking of Y combinator, here is "type_Y.c":

    
    
      #include "node.h"
      #include "type_app.h"
      #include "type_Y.h"
    
      static struct node *reduce(struct node *parent)
          {
          return make_app(parent->RIGHT,parent);
          }
    
      static struct type type = { reduce, clear_pair };
    
      struct node *make_Y(void)
          {
          return make_pair(&type,0,0);
          }
    

Those are easy. The most gnarly one is the S (fusion) combinator ... which ...
oh well I might as well paste it in:

    
    
      #include "node.h"
      #include "type_app.h"
      #include "type_S.h"
    
      static struct node *reduce(struct node *parent)
          {
          if (0 == parent->LEFT->LEFT || 0 == parent->LEFT->LEFT->LEFT)
              {
              parent->type = parent->LEFT->type;
              return parent;
              }
    
          struct node *x = parent->LEFT->LEFT->RIGHT;
          struct node *y = parent->LEFT->RIGHT;
          struct node *z = parent->RIGHT;
    
          struct node *fun = make_app(x,z);
          struct node *arg = make_app(y,z);
    
          hold(fun);
          hold(arg);
    
          parent->type->clear(parent);
          parent->LEFT = fun;
          parent->RIGHT = arg;
    
          return fun->type->reduce(parent);
          }
    
      static struct type type = { reduce, clear_pair };
    
      struct node *make_S(void)
          {
          return make_pair(&type,0,0);
          }
    

OK now I'm pushing the limits of long-windedness, but I just wanted to feed
you some information right away while I work on some other things.

------
stcredzero
_So why not create syntax that catches 99% of the cases for which you intend
the language to be used_

This just indicates the author is clueless about the substantive use of a DSL.

The only way to implement this is to vastly reduce the scope of that "intend."
Basically equivalent to, "all you really need is Blub."

80% would be reasonable. But the last 20% is going to encompass some powerful
stuff.

~~~
Confusion
I'm not really sure why you bring up DSLs, unless you consider something like
Python or Perl a DSL as well. In that case: are you saying that, for instance,
Perl is only useful for 80% of the cases for which that language is intended
to be used?

~~~
stcredzero
Nope. You can "do it all" in any Turing-complete language. But you can often
do it more elegantly with a DSL. Once a language reaches a certain threshold
of power, making a DSL is indistinguishable from ordinary coding.

It's obvious the author hasn't a Clue about that.

~~~
Confusion
The point of the article was that creating a sufficiently powerful language,
that doesn't offer easy ways[1] of extending the syntax, does not warrant the
criticism that the language has 'too much syntax'. If anything, it
demonstrates the power of being able to create syntax. I don't understand how
your comment relates to that point.

[1] Of course you can "do it all" in any Turing-complete language, but
theoretical possibilities are often much less interesting than practical
impossibilities. In Python or Java, creating new syntax is so hard it might as
well not be possible.

~~~
stcredzero
Exercise for the student -- find the contradiction here:

 _The point of the article was that creating a sufficiently powerful language,
that doesn't offer easy ways[1] of extending the syntax, does not warrant the
criticism that the language has 'too much syntax'._

Evidently, you are as inexperienced with DSLs as the author.

 _Of course you can "do it all" in any Turing-complete language, but
theoretical possibilities are often much less interesting than practical
impossibilities. In Python or Java, creating new syntax is so hard it might as
well not be possible._

Insufficient knowledge is probably why you mistook the sense of the statement
you are replying to by 180 degrees.

~~~
Confusion

      Exercise for the student -- find the contradiction here:

You consider 'sufficiently powerful' and "doesn't offer easy ways of extending
the syntax" to be contradictory. Does that mean you consider only Lisps
'sufficiently powerful'?

    
    
      Insufficient knowledge is probably why you mistook the
      sense of the statement you are replying to by 180 degrees.
    

Do you mean extending the syntax of Python or Java is easy? Otherwise I'm not
sure what the knowledge is you think I'm lacking.

------
alec
"So why not create syntax that catches 99% of the cases for which you intend
the language to be used and abolish the ability to create further syntax?"

Because then I've got a syntax that works for 99% of the use cases you've
already thought of, and that doesn't overlap very well with the use cases I
have or that either of us will come up with in the future.

------
l0stman
You can't expect people to be reasonable when it comes to language flamewars.
Personally, I think weak advocacy is worse than no advocacy at all.

 _So why not create syntax that catches 99% of the cases for which you intend
the language to be used and abolish the ability to create further syntax?_

Although it's possible to create new syntax in Common Lisp via reader macros,
I don't think Lispers recommend extending the syntax. But when you miss a
language feature, instead of fighting against the language, you can implement
it. I don't remember exactly who said it in one of the SICP video lectures,
Harold Abelson or Gerald Sussman, but he said something like

 _Lisp is not good at solving a particular problem. What Lisp is good at is
extending the language to solve a class of problems._

The problem with syntax is that it's difficult to handle correctly with
macros. It's no wonder that most introductory books on C warn against the
pitfalls of using macros. Even the simple ones can bite you if you're not
careful. Consider:

    
    
      #define DOUBLE(x)	(2*x)	/* should be (2*(x)) */
      #define MAX(a, b)	((a) < (b) ? (b): (a))
    

Even the last one is not immune to multiple-evaluation problems if you try to
evaluate MAX(a++, b). In Lisp you can avoid this by using gensym and local
bindings. That said, C macros and Lisp macros are completely different beasts,
and I recommend reading ``On Lisp'' for advanced use of Lisp macros.

------
cabalamat
> _The ability to create syntax comes at a cost and the existence of so many
> other programming languages shows one thing very clearly: not everyone is
> willing to pay that cost._

The cost of Lisp is that the syntax is all the same. So a conditional looks
the same as an assignment statement, looks the same as a function call, looks
the same as a plus operator.

Lisp syntax is bad, because many humans find it easier to scan source code if
different constructs use different syntax (whether that is just because it is
what they are used to, I don't know). But Lisp syntax is also good, because it
allows macros.

You could get round this problem by having two levels of a language, one with
C-like syntax which compiles to Lisp-like syntax (which may itself be
compiled). So this:

    
    
      if (a>b) c := foo(d);
    

Might compile to this:

    
    
      (if (> a b) (:= c (foo d)))

~~~
jules
Sexps are better than strings because they are more structured. It would be
even better to represent code with separate data types for every construct
instead of the relatively unstructured sexps.

~~~
lispm
s-expressions are an external syntax. You can create any data structure you
want from them, given the right reader.

S-expressions are not INTERNAL to Lisp, only external. Thus s-expressions are
defined on characters. Comparing s-expressions to strings makes no sense.

The representation of code is a different issue. You could read an
s-expression and represent it internally as a string.

------
anamax
As I commented, get back to me when you've got a non-lisp syntax that doesn't
result in bugs from precedence/associativity errors.

If infix were actually "natural", there'd be only one set of
precedence/associativity rules and there wouldn't be bugs due to folks getting
it wrong.

With very few exceptions, folks use parentheses to try to protect themselves
from their ignorance of their language's precedence rules. (C++ has over 10
levels of precedence.) Even then, they occasionally get it wrong.

And, it's worse than that on teams because different folks have different
knowledge (and illusions) of the precedence rules.

------
kls
Barring the complaints about syntax and the assumptions made in the article, I
would still say the one reason to learn Lisp is that you will most likely
spend an entire career writing logic the "other way". I am not disparaging the
"other way", just stating that it is far more prevalent, and if you are
learning something for personal growth, Lisp will give you a more diverse
perspective than a language that is an evolution of the "other way".

------
brazzy
Common sense? In _my_ language flamewar?

~~~
tayssir
I concede that Lisp's notation probably isn't common-sensical, and that
learning it is a significant investment for many programmers, who are
hardwired to quickly understand the conventional notation.

However, many important developments in math and science ran counter to common
sense. In math, people with common-sense arguments resisted numbers which
seemed "irrational", "imaginary" and "negative." (Supposedly, there were times
when you could die over them.) Euclidean geometry seemed so obvious that
people were scared to discuss non-euclidean ones. In science, even Newton had
some self-ridicule about spooky action at a distance, like gravity:

"That one body may act upon another at a distance through a vacuum without the
mediation of anything else, by and through which their action and force may be
conveyed from one another, is to me so great an Absurdity that, I believe, no
Man who has in philosophic matters a competent Faculty of thinking could ever
fall into it."

[http://en.wikipedia.org/wiki/Newton%27s_law_of_universal_gra...](http://en.wikipedia.org/wiki/Newton%27s_law_of_universal_gravitation#Newton.27s_reservations)

~~~
brazzy
I did not mean to imply that Lisp's notation itself was _against_ common sense
- just that the article made a very good common-sense point about its
expressiveness coming at a price that's often not worth paying, relative to
the advantages it gives you over languages with less expressive but more goal-
oriented syntax.

