These are terrible answers to the question "Why Lisp?" (to which the most obvious answer would be "For what?").
1) Good ideas? Yes, it had them, but many modern languages implement them, so it is no longer a reason to use Lisp. Macros? There are numerous arguments on both sides about whether those are good or bad. Needless to say, not enough to justify using Lisp.
2) Your "extension to the language" is my "wtf is this developer doing here?" Consistent syntax is part of what makes a language intelligible to many different people. I don't want to be a human parser to understand code.
3) Thrust/weight...? OK, I think your point is that there is little syntax to understand. Unfortunately, that means that context needs to be parsed by the developer, because there are few "road signs" that tell me what is going on without reading the whole chunk of code in detail.
Finally, I'm not sure what this is in response to (other than it being the eternal question that burns in the hearts of all men) but if it is an argument for using Lisp on a project, it would be better directed at what Lisp has that makes it good for production work in a modern environment. However, in that way, I don't think it is much better than many of the other languages today.
> Macros? There are numerous arguments on both sides about whether those are good or bad.
Every argument I've ever heard about macros falls into one of two categories:
1. Concerns about overuse, which says nothing about correctly used macros - any tool can be overused or used incorrectly.
2. Complaints about macro systems from heteroiconic languages like C++ - these two concepts are similar in name only. Lisp macros have nothing to do with macros in any other language, and they provide freedom that is literally unavailable in any other language[1], as explained well in this description: http://lists.warhead.org.uk/pipermail/iwe/2005-July/000130.h...
[1] Yes, yes, Turing-completeness, so nothing is stopping you from writing your own parser from scratch. But the whole point is that Lisp doesn't require you to go through all of that - and secondarily, even if you did, parsing sexprs is so easy, you've finished before you've even started.
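For a sense of just how little work that is, here is a minimal sketch in Python (illustrative only, not any particular Lisp's reader; symbols stay strings and only integers are recognized as numbers):

```python
def parse_sexpr(src):
    """Parse a string of s-expressions into nested Python lists."""
    # Tokenize: parens become their own tokens, everything else splits on whitespace.
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()

    def read(i):
        if tokens[i] == "(":
            items, i = [], i + 1
            while tokens[i] != ")":
                item, i = read(i)
                items.append(item)
            return items, i + 1  # skip the closing paren
        tok = tokens[i]
        # Atoms: try to read an integer, otherwise keep the symbol as a string.
        try:
            return int(tok), i + 1
        except ValueError:
            return tok, i + 1

    expr, _ = read(0)
    return expr
```

That's the whole parser: a tokenizer that is two string replacements plus a split, and one recursive function. `parse_sexpr("(+ 1 (* 2 3))")` gives `["+", 1, ["*", 2, 3]]`.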
1. True. My concern after years of development both in and out of corporate environments is that any tool available will be misused if given the chance. The goal is to minimize the damage as much as possible. If I could be surrounded by the cream of the crop at all times, that'd be great, but I'm not and most developers aren't.
2. I'm aware of this as well. I developed in Lisp for two years. Lisp macros are not comparable to C macros. I also didn't say they were all bad, just that there are arguments to be had about their utility. For example, OOP in Lisp has been described many times as a "leaky abstraction". In my opinion, it just plain sucks. It gets the job done, but I always feel dirty afterwards. It shows the power of macros and Lisp, but it also shows the limitations of not changing your language to support a little syntactic sugar.
BTW, I know it's not apparent from my post, but I actually like Lisp, I just don't think it is the answer to any question that is being posed right now.
"they provide freedom that is literally unavailable in any other"
There have been lots of arguments on LtU about this. You can go very, very far with either laziness or lightweight closures. Laziness didn't go all the way with Haskell, whence Template Haskell; I don't know enough about Smalltalk to know if they've managed to get by with extremely (syntactically) lightweight closures.
Re: "for what" I would say that the best answer would be: for rapid development of technically complex programs with small, highly-skilled teams.
Re: 1) Many modern languages do implement some Lisp features, but none do so as well as Lisp. I have yet to see a language that mixes functional programming, an elegant and intuitive standard library, meta-programming, and high performance native code incremental compilation in one development environment. E.g. Javascript is full of bizarre quirks and bad semantics, and its REPLs aren't up to snuff yet, even though it has native-code compilers that are up to snuff. Nothing does meta-programming as well as Lisp. LINQ, Camlp4, etc, are ugly hacks compared to Lisp macros, which are baked into the compilation pipeline by design.
Re: 2) I think you are mistaken in assuming that programmers using other languages don't extend the language. They do, they just do it with external, ad-hoc tools. A great example is LLVM's tblgen utility, which generates native assembler backends from table descriptions. It has its own grammar, own parser, own set of bugs. You can do this stuff in Lisp within the language. When I was still learning the language, I put together a table-driven assembler for x86-64 in about two weeks (http://code.google.com/p/amd64-asm/source/browse/#svn%2Ftrun...). It uses macros to take instruction descriptions and generate encoding functions which are then compiled to native code by the Lisp compiler. It's simple, it uses the parser built into the language, you can't ever forget to run the external tool, and it's totally integrated into the development environment. One hotkey sequence expands the macro and shows you the result right in your editor. An error in a macro is debuggable with the integrated debugger just like any other code (compare to C++ templates, which are basically impossible to debug except through trial and error). Literally no other metaprogramming feature comes close to Emacs+SLIME+SBCL.
Re: 3) Lisp doesn't have "less context." It just uses words for context rather than syntax. In C, you see curly braces and you know you have a block. In Lisp, you see the word "progn" and you know you have a block. Lisp is, to a degree, a language for people who think verbally rather than symbolically. The signals are all there--it's just that the signals are words.
Lisp is the only language I've ever used (to be fair, I haven't really used Haskell) where I don't feel like the language/development environment is the bottleneck. It's the only one where I don't lose my train of thought trying to remember the idiosyncratic way someone ordered the parameters to an API call, or because I'm stuck writing a few dozen lines of boiler-plate. It and Python are the only languages I've used where API's have sensible defaults (Java is absolutely terrible for this) and I'd dare call the standard library "tastefully designed."
It's kind of funny you mention this. Without making a serious claim that Ruby has metaprogrammability comparable to Lisp --- again, this is I hope clearly not a serious argument --- I built essentially the exact same project as part of a programmable debugger a bunch of years ago, in Ruby. Here's a snippet:
The "advantage" (if you are crazy enough to call it that) Ruby has over Lisp is that your assembly opcodes aren't all sexprs (I guess you could implement reader macros to get past that). The hackery required to pervert Ruby syntax so that it accepts most basic x86 assembly is... well, whatever.
Incidentally, assemblers written in high level languages? Surprisingly useful.
(Ragweed itself is I believe also long since deprecated, though I used it recently as a starting point for an x64 Windows debugger, using JRuby and ffi).
Very neat. I haven't spent much time with Ruby, but in my estimation it's Lispier than Python and Matz is less actively hostile to such comparisons than GVR. :D
Ruby is much lispier than Python, but it has the same fundamental deficiencies as Javascript (which is even lispier in that the idioms in Javascript are so closure-dependent); it's idiosyncratic, not particularly orthogonal, and nowhere nearly as powerful as Lisp.
Don't take my comment as a recommendation for Ruby. I use it in preference to Python, but Lisp is a better language.
I'm currently looking quite seriously at Julia... it'll be interesting to see if they move it in a more general purpose direction, out of its current numerical programming niche. What's most interesting is that Julia has Lispy macros in an infix language. I'm not quite sure how they do it, but it seems to me that macros for infix languages were solved in 1973 with CGOL...
The fundamental fact about Lisp macrology is that macros are not parsers; they are semantic expanders. The parser is really the reader, which is very simple and can parse only a few types of things. You can support infix syntax naturally by replacing the reader with a Pratt parser, which adds support for expanding the parse table dynamically as well as for operator precedence. Nobody actually wants to write parsers for every little macro (I think this is a UX issue that other languages based on extensible parsing get wrong), but nobody says the parsers (parser-lets) have to be hand-written. There is nothing stopping you from automatically generating the parser-lets from the macro description itself. The destructuring lambda list for each macro should give you all the information you need to generate a parser-let for most common code shapes you might want to use. From there it's a simple task to turn everything into lists, vectors, what have you.
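As a rough illustration of how small the table-driven core of a Pratt parser is, here is a sketch in Python rather than Lisp (the operator set and names are made up; a real reader replacement would of course emit Lisp forms):

```python
# Precedence table; higher binds tighter. "Expanding the parse table
# dynamically" is just adding an entry to this dict at runtime.
OPS = {"+": 10, "-": 10, "*": 20, "/": 20, "**": 30}

def parse(tokens, min_bp=0):
    """Pratt-parse a token list into a prefix (Lisp-style) nested list."""
    tok = tokens.pop(0)
    left = parse(tokens, 0) if tok == "(" else tok
    if tok == "(":
        tokens.pop(0)  # consume the matching ")"
    while tokens and tokens[0] in OPS and OPS[tokens[0]] > min_bp:
        op = tokens.pop(0)
        # '**' is right-associative: recurse with a slightly lower bound.
        right = parse(tokens, OPS[op] - (1 if op == "**" else 0))
        left = [op, left, right]
    return left
```

For example, `parse("2 + 3 * 4".split())` yields the prefix form `["+", "2", ["*", "3", "4"]]`, and extending the "grammar" at runtime is a one-line table update like `OPS["^"] = 30`.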
There have been lots of infix macro implementations, but to my knowledge none of them makes it over the usability hurdle, so they don't get much used. I tried writing one for a very simple infix language (arithmetic expressions, function calls, and little else) and even there it turned out to be too much of a pain to be worth the trouble. The operations needed for generating code were so awkward that they quickly overwhelmed and obscured the code itself, much as packaging that is too unwieldy and expensive changes the economics of a product.
Only the Lisp notation seems to be lightweight enough to make full macros easy enough. It isn't just s-exprs; it's also quasiquoting (backquote-comma-splice). Those two together are what keeps the structure of the macro close enough to the structure of the code it's generating. Once you have a notation that is supple and lightweight enough, you begin to think in terms of it — and that leads to qualitative changes in your programming.
But it sounds like you have some neat thoughts on how to overcome the unwieldiness of infix macros. It would be interesting if someone figured out how to make them work well.
p.s. I'd never heard of CGOL (thanks!). Perhaps it invalidates everything I said.
"What's most interesting is that Julia has Lispy macros in an infix language. I'm not quite sure how they do it"
Yeah, Julia-macrology seems to be a funny mixture of operating with infix and prefix representations. You write the macro in infix (in Julia), and the generated output code in infix (or I guess you can generate Expr-type data structures directly), but if you want to inspect the piece of code that was the input to the macro, you need to understand its internal representation as an abstract syntax tree (lisp-like representation), and inspect that.
Now that actually seems like one of the points of hesitation. Lisp is actually notorious for not having a lot of accessible libraries, to paraphrase "The Bipolar Lisp Programmer". http://www.lambdassociates.org/blog/bipolar.htm
I mean, Lisp may have managed to get a bare standard library after many, many years, but the odds of finding algorithm X implemented in Lisp on the web are, in my experience, low compared to C++, Java, or Python.
Lisp sounds like perhaps the strongest possible tool to "hack your way through the wilderness". One can do amazing things "right out of the box" (as your example shows). Unfortunately, it also seems like it winds up mostly being only that, rather than being a tool for building large systems interfacing with many people and other systems (that might not be a bad thing, but it's worth mentioning if you mention libraries).
Are you maybe using 'standard library' and 'available libraries' interchangeably? Ruby and Java and C-oids for example have very limited standard libraries, but their respective ecosystems provide a great number of additional libraries. Python is the odd one because it basically comes with loads of additional libraries preinstalled. Lua on the other hand has a tiny standard library and a relatively small number of available libraries for it.
I would say that CL's or Scheme's standard libraries are actually quite comprehensive, but they don't have that many available libraries.
I was referring to the standard library built into the language (Common Lisp), which has been in there since like the early 1980's. Like C, that standard library is a basic set of tools--it's not a whole platform like Java's standard libraries. But it's very well designed and tasteful for what it is.
I assume you are talking about Common Lisp; Clojure clearly does not suffer any issues with library availability, since it can utilize and interact with any Java class. That being said, most of the CL implementations I know of have some sort of FFI that can be used to invoke functions written in at least one other language. Finding a C implementation or library is not so hard for most things; using the FFI is a bit painful sometimes, although grovelling tools can help.
Also, where it counts, there certainly are Common Lisp libraries -- XML, SQL databases, GUI, distributed/parallel programming, various mathematics, etc. all come to mind.
I don't think writing extensions to the language is necessarily bad, nor is that necessarily limited to a single developer goofing around with the code. Let's put it this way: if your program needs to do a bit of linear algebra, would you rather have an embedded language for that, or have a bunch of classes and functions flying around that perform those operations? There is a certain threshold beyond which an extension to the language makes code more maintainable, not less, even when a new member is added to the team.
I think the problem is that people assume that extending the language means that everyone writes their own "for loop." Lisp programmers generally have enough taste not to do that, and if they do it's something more like writing the "iterate" package, which you can control with team policy.
Generally, though, where macros come in is in defining things that would otherwise involve a lot of boilerplate code. E.g. SBCL uses a DSL built on macros to make it easier to write reusable backends.
Note to any potential readers: I was barely a vertebrate when I wrote this (edit: that must have been 6-8 years ago). I did so because I had selected TinyScheme for a project that I was doing at work and a bunch of people who had never heard of scheme or lisp wanted to know why. I briefly got tired of talking about it (which is rather unusual, to those who know me), and wrote this.
I still think that lisp is a great language to learn (although forth is even lighter-weight and about equally powerful), and like clojure a lot for practical work nowadays.
"Sadly, something essentially killed lisp innovation around the time Common Lisp got standardized"
I think it was timing, more than anything else.
The movement to create Common Lisp came right on the cusp of several important shifts in computing. The first was from specialized hardware to standardized platforms. Lisp bet heavily on the former as embodied in Lisp machines at the very time that ISA and x86 machines proliferated.
The second shift was from command lines to GUI's and the Common Lisp standard lacks facilities for windows and other GUI elements. There is no equivalent of Swing in Common Lisp.
At the time it was standardized, Common Lisp just missed the explosion of the commercial internet and the distribution advantages it afforded Java just a few years later.
Finally, the way in which the Lisp community tended (and still tends) to shun Microsoft Windows, probably played an important role in limiting its general adoption. After the demise of Lisp machines, the remaining segment of the Lisp community tended toward FOSS partisanship - somewhat unsurprising since RMS emerged from the Lisp community and it was closed source Lisp code which first led him to action.
In short, Lisp was both unlucky and placed bad bets relative to the current state of affairs. The ultimate effect of a bias toward FOSS remains undetermined.
It was not just "timing" that killed Lisp (if one can even say that Lisp is dead -- I use it in my day-to-day work, and I am not alone). While the world's Lisp hackers were working on lofty AI dreams, the world's Unix hackers were spreading C to every college, university, and business out there. Had an open, portable, low-end OS like Unix been written in Lisp, I suspect the programming world would be a much different place today.
The article suggested that Lisp innovation was dead, not Lisp itself. Written before Clojure was a viable option (per the author's comment in this discussion), that's a pretty easy position to defend.
Whether moving Lisp to the portable JVM and adding concurrency and other features it offers counts as innovation, is a matter for reasonable discussion.
It seems that you were not really there during the 80s.
First 'Lisp' can't bet. It's a programming language. As a community it is extremely diverse.
Let's get the history of Common Lisp right. Common Lisp was created to unite the successors of Maclisp (and related languages) and to make it feasible to create and deploy applications written in it. During the development of Common Lisp from 1982 on it was immediately implemented on all kinds of machines: especially on Unix, Windows and Macs. LUCID CL was a commercial system for Unix, LispWorks started on Unix, moved to Windows and the Macs, Allegro CL was on Unix and moved to Windows and later to the Mac. Golden CL was on Windows. Exper CL on the Mac. Macintosh Common Lisp on the Mac. KCL/AKCL/... in C. CLISP in C. Plus tons more.
When I worked in an AI Lab in the early 90s, developers were using: Macintosh Common Lisp on Macs and Allegro CL on SPARC/Solaris (used in 'planning and configuration'). We had a LispWorks license for SPARC/Solaris for an image processing project. Golden CL on Compaq/Windows. Lucid CL was used on SPARC/Solaris for a commercial project. Lisp Machines were not used anymore.
It just happened that at the end of the 80s / beginning of the 90s the development of applications which were typically developed in Lisp moved to C++. Reason: better deployment options on machines of that time and 'momentum'. Some then even moved to Java later. Few moved back when Common Lisp came back during the early 2000s.
Common Lisp's standard 'lacks facilities' for GUIs - just like most languages' standards do. Java has Swing, but it's not really successful. Then came SWT. ...
It's not that the Common Lisp community did not try to develop a GUI library standard. During the standardization process there were efforts in that direction. For example 'Common Windows'. But the funding died out and when new platforms appeared (Cocoa, GTK, ...) Common Lisp was out of fashion. The result is that the best GUI toolkits for Common Lisp are commercial: CAPI from LispWorks and Allegro's GUI Toolkit. CAPI runs natively on top of Windows, Cocoa and GTK+.
Common Lisp implementors support the development of Windows applications usually through the commercial offerings: Allegro CL and LispWorks. With some minor alternatives. It's just that these tools are expensive - as expensive as comparable options for other languages.
"During the development of Common Lisp from 1982 on it was immediately implemented on all kinds of machines: especially on Unix, Windows and Macs."
We are both sweeping across history in the same manner, since obviously there were no Macs or Windows machines in 1982, and thus no immediate development for such machines.
Or rather, we are both creating a sketch. Mine, I admit, is quick and contains convenient anthropomorphization.
Commercialization of Lisp was largely focused on Lisp machines. There was no TurboLisp for the upcoming generation of programmers. It never made its way into the mainstream before the internet revolution, and was, as you point out increasingly fragmented (and under financed) by the mid 1990's.
I said from 1982 on. In 1982 Lisp ran on a variety of architectures. Common Lisp systems were offered in 1984 maybe.
Commercialization of Common Lisp was huge on Unix from day one. Lucid was a commercial vendor. You might want to read Richard P. Gabriel's description of its business. Franz is a commercial vendor on Unix. Harlequin was one / now LispWorks. When Steve Jobs sold his first NeXTStation, a copy of Allegro CL was included. Every major Unix vendor in the 80s sold a Common Lisp implementation. SUN had the Symbolic Programming Environment (SPE). HP. IBM. Most of the major Lisp software of that day got ported to Unix (some also to Windows). Macsyma ran on Unix and Windows. KEE on Unix. Knowledge Craft. G2. etc etc.
I was excited about Lisp for a while, primarily based on all the claims of increased productivity.
I started going to Lisp and Clojure meetups while learning, and I was struck how much LESS productive the supposed gurus at the meetups were in Lisp than I am in Python.
Maybe that reflects on the language. Maybe these people were posers. But the claims I hear seemed inconsistent with what I saw.
How much does your problem domain leverage Python's available standard library? I'm pretty familiar with both, and the former is what I use to write stuff when I don't have a library available that does the work for me. The latter is what I use when I don't really have to do much work to leverage something someone else has already built.
Personally, my programming involves heavy use of the scientific computing libraries. But my comparison was for general programming that doesn't use these libraries, and which don't rely especially heavily on the standard library.
For personal use, sure. For anything else, language choice requires bottom-up campaigning or top-down mandates--Lisp never even enters the conversation, at least in any serious way. Language choice is also heavily dependent on mature third-party libraries, which are in abundance in many languages but not as much with Lisp dialects (unfortunately).
Ruby and JavaScript have adopted enough Lisp concepts and in a lot of organizations enjoy enough support that pushing for real Lisp is a touch quixotic.
I get the impression that Clojure is at least entering the conversation in some places. Some companies that need concurrency control capabilities that they find difficult to implement in Java are starting to use Clojure, or at least considering using Clojure to make certain Java business applications threadsafe. The fact that Clojure runs on the JVM and can be called from inside a Java application makes this easy to do, but in a bottom-up campaigning situation the hard part is convincing management to sign off on your use of the language as an "approved tool."
We use Clojure at twitter and I believe that google supports CL as part of the after-effects of buying ITA. At the time I wrote this, clojure didn't exist and ITA was not particularly well-known.
Yes, Clojure is the exception. From an institutional standpoint, its support derives entirely from the fact that it runs on the JVM (as you pointed out) and can talk to both the company's existing Java code and the vast pool of Java libraries out there.
The lesson here is that it was incredibly smart for adoption to target the JVM and CLR. The JVM in particular will almost certainly outlive Java itself as a primary development language.
> Ruby and JavaScript have adopted enough Lisp concepts and in a lot of organizations enjoy enough support that pushing for real Lisp is a touch quixotic.
Alternately, JS and Ruby may have served to loosen people's resistance to useful and interesting language features that are better implemented in, say, CL. Writing either makes me sad, in an "uncanny valley" sort of way.
My point is languages are chosen at organizations based on practical reasons like libraries, employee skill set, and potential employee skill set--not language features.
Language features are a practical reason as well. They are force multipliers. But to understand that one has to understand the features, which understandably doesn't happen at the organizational level, thus that factor doesn't enter the equation.
There were a lot of good ideas explored in Lisp - but they were not always first developed in Lisp (for example, OOP came from Simula and Smalltalk to Lisp and merged there with Lisp approaches). Many ideas developed in Lisp have not even been reused now. They are buried in computer science reports from the '60s through the '90s (which no one reads anymore) and code which has been lost (or is not accessible). Example: Lisp 1.5 from the 60s had a bitmap used for marking objects during GC - Ruby 2.0 just now implemented this thing. One could have read old Lisp implementations from the last 50 years - but this is not how people work. Few people study old stuff and harvest it.
The flexibility of Lisp to change the language in multiple dimensions is still valid. There are few languages which provide it in a seamless and built-in way. That's why some people still use Lisp. It's also a thing which limits Lisp: it is more difficult to understand and goes against the coding standards in 'enterprise software development'.
Any Lisp, but it turns out to reveal a bunch of Lisp warts. The most obvious one being that no Lisp out there ships with a built-in macro to do this. The subtle truth, though, is that most people who use Lisp start out wanting this and then either give up before they become proficient enough to implement it, or else get acclimated to prefix math and once they are proficient, they don't care anymore.
If you were to set out to do this, the first thing you'd have to accept is that the macro is going to look a little weird. For instance:
(infixmath 2 + 3 * 4 * (2 + 1))
You're going to want to put spaces between all the terms, every time, because you'd rather have separate atoms for numbers and functions. The next problem you're going to run into is that Lisp doesn't ship with any context-free grammar support akin to Parsec or Yacc. This is the corollary to the point above, that once Lispers become proficient enough to do this they don't want to anymore. Once you're fully enamored with prefix notation, you never find yourself wishing for the kinds of grammar that necessitate this kind of work. (This has a lot to do with why most Lispers hate loop and format too). In other words, you'll be writing your own recursive descent parser from scratch.
Once you've written your recursive descent parser to handle infix math, you'll face the next challenge: what to do with the parse tree. The simplest thing to do is just evaluate it. If you were really clever you might be building a Lisp prefix expression as you went and you can just let Lisp evaluate that directly with eval. A part of you will wonder if you should instead generate a compiled function, but then you have to capture the variables properly and it will be a lot more work, but it could become more efficient, since the Lisp compiler would have access to that information and could do better things with it...
Anyway, my point is that this is sort of a will-o-the-wisp or mermaid example, because it's intended to capture the imaginations of non-Lispers, but actual Lispers have never actually gotten serious enough about it to make a robust implementation and ship it with the system, because the mindset that values infix math is incompatible with the mindset that produces good Lisp programmers.
The macro will not necessarily look weird, nor will you necessarily need to put spaces between the terms; in Common Lisp you have reader macros, and you can make this as simple as using [] to delimit regions of infix notations.
As for why you do not see it done in practice, I would say it is mainly because it is not terribly useful in practice. Infix notation is just what most people are comfortable with, because they learned it in school; it has no real technical advantage, and the need for precedence rules is something of a disadvantage (though not a terribly major issue).
How many projects do you see promoting reader macros? The only package I ever used that came with one was CL-SQL, and it was really unpleasant to get everything working. Reader macros are another example of a feature that Lispers haul out when discussing the benefits of Lisp but never use in practice, because they're almost always more trouble than they're worth.
The reason you perceive this as not terribly useful in practice is my overall point: once you're good enough with Lisp to develop this, you think in terms of s-exps and don't see the point. The primary technical advantage of infix notation is that the book you're getting the formula from used it, so it's easier to see if you transcribed it correctly. A secondary advantage is that it's what you use in your head and on paper. If the advantages of Lisp syntax outweigh the advantages of a lifetime of schooling and compatibility with the world at large, you are, at last, a Lisper. :)
Actually I stopped seeing the point of infix notation long before I started programming in Lisp; infix notation stopped seeming like the cat's meow when I had to actually spend time thinking about how to parse it. I have yet to encounter a good argument for infix notation; all anyone seems to offer is, "Well it's what everyone knows and what we teach in schools!" I am not even sure that infix notation being taught in schools should be pointed to as a positive thing, given that "Please Excuse My Dear Aunt Sally" does nothing to improve anyone's understanding of math (when was the last time you saw PEMDAS written in a proof?). I'm sure there would be endless complaints from parents who were unable to help their children with their arithmetic homework if we suddenly switched to postfix notation in elementary school.
I don't think mindlessly transcribing formulas from textbooks is a strong argument for infix notation, and it is definitely not a technical advantage. I am not entirely sure that infix is what people naturally use internally, either, as it is not unusual to hear people say things like, "subtract 3 from x" or "divide the sum of the numbers by the count" -- those look more like something you would see in Lisp than infix notation to me.
This is one of those debates where there's no convincing anyone on either side. Convention is the only argument I have or need, and it's perfectly sufficient for me. It's insufficient for you and the rest of the Lisp world. This has been the state of affairs for the last fifty years and it will continue to be the state of affairs until the end of time.
I hope you mean prefix not postfix. Postfix (or Reverse Polish Notation http://en.wikipedia.org/wiki/Reverse_Polish_notation) is completely "unintuitive" for both mathematicians and laymen, and I wouldn't wanna see it in any school... maybe for speakers of RTL languages like Arabic it looks more natural, but... no
But seriously, now. Reverse Polish notation is plenty intuitive as long as you don't get too crazy with what you put on the stack.
For example,
(7**3 + 9*7) / 2
might be rendered:
7 3 ** 9 7 * + 2 /
That is, "Take 7 and 3 and exponentiate, take 9 and 7 and multiply, then add the two numbers, take the answer and 2 and divide." Parentheses or extra spacing could easily be used to increase readability (though parentheses are never required in RPN; they're purely aesthetic), if that's too much to swallow:
(((7 3 **) (9 7 *) +) 2 /)
...But really it's just a matter of what you're used to.
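The stack discipline described above can be sketched in a few lines of Python (illustrative; the operator set is limited to the arithmetic used here):

```python
def eval_rpn(expr):
    """Evaluate a whitespace-separated RPN string with a stack."""
    stack = []
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b,
           "**": lambda a, b: a ** b}
    for tok in expr.split():
        if tok in ops:
            b = stack.pop()  # the second operand is on top of the stack
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()
```

`eval_rpn("7 3 ** 9 7 * + 2 /")` gives 203.0, matching `(7**3 + 9*7) / 2`.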
P.S. You'll notice that with parentheses RPN looked remarkably Lisp-like. This isn't a coincidence. The core difference between prefix notation as used by Lisp and RPN is that operators in RPN may only take two arguments (it's more complicated in languages like Forth, but since we're talking about arithmetic here we can safely make this assertion). This is the tradeoff RPN makes: losing the ability to have n number of arguments per operator, for the ability to ditch those parentheses for good.
It should also be clear that one could easily have a Lisp-like syntax similar in appearance to the RPN with parentheses above (though with Lisp's ability to take more than two arguments), but just happening to put operators at the end. Conversely, one could have an RPN where operators are at the front, which is actually the concept behind Polish notation, from which RPN stems: http://en.wikipedia.org/wiki/Polish_notation
On a final note, since RPN is read from left-to-right exclusively, a speaker of an RTL language would actually be worse off. The perceived unintuitiveness is more because English is a subject-verb-object language, whereas RPN is subject-object-verb.
I think it's more about pushing your mind towards a more imperative and sequential way of thinking than about compactness (see my comment above)... but this shouldn't count as "anti-Lisp", since Lisps have always tried to be "multi-paradigm"...
For what it's worth, I'm not trying to be anti-Lisp with my remarks. I'm just trying to point out that this particular widely-cited example of a macro is false advertising.
Reader macros are there to extend s-expressions. Lisp developers who think in s-expressions use reader macros to implement new data structure syntax (sets, tables, JSON, ...), new code syntax (infix, SQL, Objective-C, ...), ...
I know that infix notation somehow tends to push your intuition towards an imperative and sequential way of thinking. For an expression like:
( ...big thing 1... ) + ( ...big thing 2...)
...people tend to think like "compute 'big thing 1' and then, to this, add 'big thing 2'".
...but then again, "purer" languages like Haskell don't use prefix (yes, it's available as an alternative, but people just don't use it). And mathematical language, which is not imperative and sometimes not even sequential, uses infix. And Lisp never made a fuss about being a "pure" functional language... it's more of a "multi-paradigm" language. I still like Haskell's way of making operators infix by default while letting you wrap one in parentheses to use it prefix, though Haskell will definitely never be my favorite language...
I don't see how prefix notation would ameliorate this problem. You'll still wind up with:
(+ ( ... big thing 1 ...)
( ... big thing 2 ...))
I don't think this syntax does a better job of conveying a lack of ordering between the two things. I'm inclined to argue that it's worse simply because Lisp is sequential and the math you did in school wasn't, so you have experience looking at the bigger picture with infix versus prefix. But both of us are bullshitting, because we haven't got any science to support one position or attack the other, only reasonable-seeming hypotheses.
Haskell is my favorite language, but I find I get a lot more use out of converting words to operators than converting operators to "words." For example, I find this:
x `elem` y
a lot more intuitive than:
elem x y
because the infix version suggests the natural English reading, that the set is the second argument and the item is the first, where in the prefix case I have to think about it for a second.
...true, it's bullshitting ...or it's an entirely subjective thing, depending on how you're used to thinking :)
...but it's that different feeling that "add 'big thing 1' with 'big thing 2'" vs. "compute 'big thing 1' and then add 'big thing 2' to it" ...and if you think of a programming language as an "interface" between your mind and the computer, and think of a programming language from a UI or UX design perspective, these subjective "how it feels" things really do matter. But indeed, Haskell wins on this one 100% by allowing you to just use whatever notation you like without having to write or use a reader macro or something like this. Anyway, back to real work now...
Define a function that transforms an infix form to a prefix form.
Write a reader macro that understands some special character to mean that the following form is an infix form and uses the function above to transform it.
Done.
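For illustration only, here is a hypothetical sketch of the first step in Python: a precedence-climbing parser that turns a flat infix token list into a prefix, Lisp-shaped nested list. The names and the operator table are my own; in a real Lisp this transformation would be written in Lisp and attached to a dispatch character with the reader-macro machinery (step two), so the reader rewrites the infix form before the compiler ever sees it:

```python
# Hypothetical sketch of the infix -> prefix transformation (step one above).
# PREC maps each binary operator to its precedence; higher binds tighter.
PREC = {"+": 1, "-": 1, "*": 2, "/": 2}

def infix_to_prefix(tokens):
    """Parse a flat infix token list into a prefix nested list."""
    def parse(pos, min_prec):
        # Precedence climbing: grab a left operand, then fold in operators
        # whose precedence is at or above the current minimum.
        lhs = tokens[pos]
        pos += 1
        while (pos < len(tokens) and tokens[pos] in PREC
               and PREC[tokens[pos]] >= min_prec):
            op = tokens[pos]
            # PREC[op] + 1 as the new minimum makes operators left-associative.
            rhs, pos = parse(pos + 1, PREC[op] + 1)
            lhs = [op, lhs, rhs]
        return lhs, pos

    tree, _ = parse(0, 1)
    return tree

print(infix_to_prefix("a + b * c".split()))  # ['+', 'a', ['*', 'b', 'c']]
```

The output nests exactly the way the corresponding s-expression would, which is why hooking such a function into the reader is the short exercise described here.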
In pretty much any Lisp by using macros. It is a 10 minute exercise in writing a simple parser, or you can just use one of the parser generators out there (plenty for Common Lisp, probably plenty for other Lisps).
I'm usually sceptical about answers to "Why Lisp?". "Practical Common Lisp" offers the confident (and subjective) "You'll get more done, faster, using it than you would using pretty much any other language."
If that's a hurdle for you, then you should not use Lisp. Lesson No. 1: extend the language. Lesson No. 2: develop taste for how best to extend the language.
Here are the five missing lines which made you not use Lisp:
(defun init-hash-table (data &key (test #'eql))
  (let ((hash-table (make-hash-table :test test)))
    (loop for (key value) in data
          do (setf (gethash key hash-table) value))
    hash-table))
'why should I have to reprogram my brain to fit the language'
That's wrong. Lisp is for those who want to reprogram the language to enhance the brain. If you feel uncomfortable with that mode, then Lisp is not for you.
That's wrong. Lisp is for those who want to reprogram the language to enhance the brain. If you feel uncomfortable with that mode, then Lisp is not for you.
I don't know if I would go so far as to say the poster is wrong for feeling that way, but you are correct: Lisp is fundamentally different, and one should really only embark on learning it if they want to learn how to do something fundamentally different. If one has the mindset of "why should I change?", then they are probably going to miss the profound ah-ha many of us got when learning a Lisp. They are not wrong for feeling that way, but a Lisp is definitely a bad choice for them and will only frustrate them.
they are probably going to miss the profound ah-ha many of us got when learning a Lisp
I think I've at least had some sort of "aha" from learning Lisp; it just doesn't necessarily translate into something that's worth the extra investment of using Lisp for the practical programming projects I've encountered thus far. (To be fair, part of that is the relative paucity of good libraries for Lisp implementations, compared to, for example, Python, with its "batteries included" philosophy.) But even when I'm using, say, Python, I am still able to make use of the "aha" in some ways.
Here are the five missing lines which made you not use Lisp
Yes, I know the init-hash-table function is easy to write. I didn't mean to imply that the function was complicated, or that writing it was a "hurdle", or that not having the function was what made me not use Lisp. I wasn't talking about one particular function; I was talking about the general fact that, for the programs that I've been writing, the advantages of Lisp did not, for me, outweigh the disadvantages. Your mileage may vary, of course.
Lisp is for those who want to reprogram the language to enhance the brain.
This is a valid point. I try to use Python this way as well, and so far I haven't run up against a specific case where its limitations compared to Lisp were enough to push me over the edge. If I were to change my mind at some point, it would be because of some way that I wanted to reprogram Python but couldn't.
It's fun to play with when I need a break from work or projects. In the real world work environment I need to write mundane code maintainable by coworkers and clients, so it's not there. In my hobby, C and assembly rule the tight memory spaces of embedded hardware, so Lisp exists in a shared dream-state with Smalltalk and Oberon...
I agree with this answer, but I suspect that this is also why Lisp isn't often a first choice. Meta-programming doesn't figure into language choice discussions as often as it probably should.
I distinctly recall wishing for macros in Java a few months ago while working on an Android project -- so much boilerplate for doing so little. I came to the realization that Java uses XML-augmented libraries so much simply because Java boilerplate is such a pain to write. It's often easier to build a mini-language on top of XML and use Java to parse and compile it than it is to write the equivalent Java code. I think that alone says something about the expressiveness of Java and the benefit of using a language with a more malleable AST.
JavaScript is enough of a Lisp, and it runs everywhere and is attached to a (more-or-less) standard UI environment that is getting increasingly powerful. And if you really like Lisp syntax, and believe in enforced immutability, there's always ClojureScript.
Another good one, although Scheme oriented (like SICP) is How To Design Programs: http://www.htdp.org/
The second edition is a work in progress, but it gets you going pretty quickly with graphical programs, so it's a bit less cerebral (and arguably more interesting for many beginning programmers): http://www.ccs.neu.edu/home/matthias/HtDP2e/
The only reason this site loves LISP is because PG said it was special. What a joke. Sorry to be terse but this whole place is either insane or plain stupid when it goes on about LISP.