>The Picolisp evaluator is not compatible with Lisp 1.5. It doesn't even have LAMBDA.
However, it keeps a lot of ideas from 1.5 that were later dropped by CL and others. Names don't matter: ideas do.
>You could have looked it up or tried it, before claiming it. I did it for you.
It wasn't entirely relevant to the present situation, until I mentioned that I thought cons was more performant. Thanks for trying it. I don't have an excuse, but thanks.
>Yeah, but claiming that the CL code was less efficient. Great move.
I didn't. I said that splicing unquote had to traverse the resultant list, making it slower than cons, which was pretty much irrelevant in this case. I then explained the real reason I used cons.
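To make that concrete, here's a minimal Common Lisp sketch of the two ways of building the same form (the macro names are made up for illustration, and how much copying ,@ really does depends on how the implementation expands backquote):

    ;; Two equivalent ways to build a (progn ...) form inside a macro.
    (defmacro with-splice (&body body)
      `(progn ,@body))      ; splicing unquote; may expand to APPEND and copy BODY

    (defmacro with-cons (&body body)
      (cons 'progn body))   ; one CONS cell in front of BODY, no traversal

    ;; Both expand (with-splice (f) (g)) and (with-cons (f) (g))
    ;; to the same (progn (f) (g)) form.

Which, as your test showed, is why there is no measurable difference here anyway.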
>I looked that up from the Chicken Scheme sources, to actually see what it does. It traverses inputs and outputs during macro execution. You could have mentioned that.
Quite honestly, I didn't see how it was relevant. We weren't discussing macro system internals until just now. It's not great, but it gets the job done, and that wasn't the point.
I'm starting to get really really frustrated here. You seem to miss every point I make, to the point that I'm very nearly wondering if it's deliberate.
> However, it keeps a lot of ideas from 1.5 that were later dropped by CL and others. Names don't matter: ideas do.
That's what I say: vague ideas don't matter much when forming language families. Code does. Books. Libraries. Communities.
What were those ideas that were dropped? Fexprs would be one. They were dropped when compilers came into use and fexprs were found not to be compilable. That happened in the 70s, before CL existed. Pitman published his paper on macros in 1980, which summarized the view of the Maclisp / LML developers. What else?
The Lisp 1.5 manual gives an extended example: the Wang algorithm.
What were the 'ideas' that were dropped, even though somehow old code still runs?
> We weren't discussing macro system internals until just now. It's not great, but it gets the job done, and that wasn't the point.
The point was that you claimed a 'slower compilation process' due to splicing backquote usage, while in fact the whole compilation of the example you gave was really the slower one, because you used a slower macro system which traverses code for renaming and re-renaming.
> You seem to miss every point I make
EVERY POINT? Are you really sure I miss EVERY POINT you make?
Personally I would only claim that you miss SOME of my points, not every one. In some cases I would say that we simply have different opinions, for example about what makes a language and its dialects.
But I would not claim that you miss all my points.
Well, maybe not EVERY point. It just often feels like you emphasize the parts of what I write that I focus on least.
>What were the 'ideas' that were dropped, even though somehow old code still runs?
Well, fexprs and dynamic scope by default are the big ones, but also the idea of functions as lists, which is why it doesn't have lambda.
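To illustrate the last one, here is a minimal Common Lisp sketch (the variable name is made up): in Lisp 1.5 a lambda expression was just a list that could be applied directly, while modern CL treats that list as data until it is explicitly coerced into a function. PicoLisp keeps the old behaviour, so the list itself is the function and no LAMBDA symbol is needed.

    ;; A "function as a list", in the Lisp 1.5 spirit.
    (defparameter *inc* '(lambda (x) (+ x 1)))

    ;; Modern CL won't apply the list directly; it has to be coerced first.
    (funcall (coerce *inc* 'function) 7)   ; => 8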
>The point was that you claimed a 'slower compilation process' due to splicing backquote usage, while in fact the whole compilation of the example you gave was really the slower one, because you used a slower macro system which traverses code for renaming and re-renaming.
I appreciate the irony, but as I've now said several times, that wasn't my justification for using cons. I even said that the hypothetical speed increase would be negligible, and unlikely to be noticed, before you showed that the speed increase wasn't even there. This is one of the things it seems like you missed.
>That's what I say: vague ideas don't matter much when forming language families. Code does. Books. Libraries. Communities.
That's not entirely true. Sure, code matters a bit, but Java definitely comes from the C family, and the code doesn't transfer at all. As for communities, see for yourself: Scheme was born from the MACLisp community, and retains strong ties to its modern equivalent, Common Lisp.