Urn is very much an experimental project for me: part of me wanted to see how much I could do with as little syntax as possible. Consequently Urn has comparatively few "special forms", but that means it generates less idiomatic Lua.
Fennel is much more an alternative syntax to Lua. Most Lua constructs have a corresponding Fennel form, meaning the generated code looks more "like Lua". Lua interop is also much nicer: Urn requires you to declare all variables, so you have to jump through hoops to use external libraries. With Fennel you can just go ahead like you would with Lua.
I've toyed around with ideas related to your project, such as: what if you had a Lisp that, instead of being based around the classic linked list, was based around the Lua table?
I've briefly tried to come up with formulations for how the code itself should be represented as a series of nested tables.
And perhaps have a series of nested tables for all variables in the runtime as well (including local ones), though Lua sort of does this now.
All that would ease introspection, the creation of macros, and other fun stuff. I hadn't started deciding on a syntax or anything either.
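To sketch the idea (in Python, with dicts standing in for Lua tables; the `op`/`args` node shape here is purely a hypothetical choice of mine, not anything Urn or Fennel actually does): once code is represented as nested tables, an evaluator can walk it directly, which is what makes introspection and macros cheap.

```python
# Hypothetical node shape: numbers are literals, strings are variable
# names, and dicts are operator applications {"op": ..., "args": [...]}.
def eval_node(node, env):
    if isinstance(node, (int, float)):
        return node                      # literal
    if isinstance(node, str):
        return env[node]                 # variable lookup in a bindings table
    args = [eval_node(a, env) for a in node["args"]]
    if node["op"] == "+":
        return args[0] + args[1]
    if node["op"] == "*":
        return args[0] * args[1]
    raise ValueError("unknown op: %s" % node["op"])

# (+ 1 (* x 3)) as nested "tables":
expr = {"op": "+", "args": [1, {"op": "*", "args": ["x", 3]}]}
print(eval_node(expr, {"x": 2}))  # 7
```

Since the program is ordinary data in this scheme, a macro is just a function from table to table; no separate reader representation is needed.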
Urn seems to be greater in scope and code size. Fennel is intentionally small: a single file that can be dropped into any project.
Fennel does not try to provide Lisp idioms or abstractions unless they have a direct mapping in Lua. For example, there are no linked lists (use tables), and no continuations (use coroutines). Operators are implemented as special forms rather than function literals so that they can be easily implemented efficiently.
Fennel also includes abstractions that might not make sense in a normal Lisp but help make interfacing with Lua code easier. For example, symbols that have a dot in them, like "table.insert", have the correct interpretation and can be used as normal symbols. Method calls are supported via the `(:)` form; `(: "hello" :find "llo")` compiles to `("hello"):find("llo")`.
Fennel takes this approach because Lua is almost already a Lisp, so I felt it was best to add the minimum abstraction necessary. Also, I prefer simple and transparent tools.
Urn may provide more comfortable and "Lispy" abstractions, however, because it takes more control of the environment.
It also seems to have pretty great error messages compared to Fennel, and a good website. There are definitely some features in Urn that aren't (yet) in Fennel, like built-in pattern matching, although Fennel does do destructuring.
MIT and especially books from MIT are held on a pedestal. If you can't understand them or don't feel like pursuing them, just find a different book.
Everyone to his own taste, but I think that is just short selling the book. I had no problems in starting to convert the lisp evaluator described in the book to C++ after only a few years of programming experience and I'm not an ace programmer. It was not straightforward, but it was not intimidating either. I did it for fun, not as a compulsory exercise.
Specifically, Chapter 4 (Metalinguistic Abstraction), starting at section 4.1.
Sure, you need to figure out that you need to tokenize strings into symbols, then do a recursive-descent parse stage or some such to turn the token list into hierarchical lists. But figuring out precisely that sort of thing is what I feel has made me a better programmer. The key was that I knew the solutions had to be simple, so I sought simple solutions.
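For the curious, that tokenize-then-recursive-descent step really is small. Here's a minimal sketch in Python (my own toy, not the book's code) that reads an s-expression string into nested lists:

```python
def tokenize(src):
    # Pad parens with spaces so a plain split yields one token per atom.
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    # Recursive descent: "(" opens a nested list, anything else is an atom.
    tok = tokens.pop(0)
    if tok == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # consume ")"
        return lst
    try:
        return int(tok)   # number atom
    except ValueError:
        return tok        # symbol atom

print(parse(tokenize("(+ 1 (* 2 3))")))  # ['+', 1, ['*', 2, 3]]
```

A real reader would also handle strings, floats, and error reporting, but the shape of the solution is exactly this simple.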
Yes, I agree, the book could have a verbatim explanation - "hey, after you know some Scheme, go to chapter 4, where we give you a damn good literate-programming-based exposition of a Scheme interpreter." - and since it doesn't, I do think your critique has pedagogical merit. I remember leafing through the book and marking stuff up, trying to figure it out, until I realized it was standing there in front of my eyes the whole time. My paper copy is quite dog-eared by now.
> Everyone to his own taste, but I think that is just short selling the book. I had no problems in starting to convert the lisp evaluator described in the book to C++ after only a few years of programming experience and I'm not an ace programmer.
@sillysaurus3 just made that book more accessible to everybody by giving them a guilt-free exit, based on experienced results, and your first instinct is to pull a quote out of context and explain why their personal experience is “wrong”?
The concept of 'guilt' when abandoning a book is completely foreign to me. I presume it is a facet of poor self-esteem? It is quite useless as a concept, and people need to find their own intellectual feet. Not all need to dig the same things.
If some weird book that some unknown dudes on the internet glorify feels useless to me, I just dump the book without a second thought. I don't have the time to become an expert in every subject, so I should only investigate those concepts that arouse a personal feeling of beauty. Doing brainy things "just because you imagine it makes you look smart" is counterproductive.
Thanks for the thoughtful response. I think sillysaurus was trying to ease anxiety for people (like sillysaurus) who may not get something out of an otherwise highly regarded book. Sillysaurus backed up the statement by saying they had successfully implemented DSLs many times, despite not getting anything from this book. I think this sentiment expressed by sillysaurus is valuable, and in a domain that is often elitist and somewhat hostile, I thought sillysaurus’ position was worth defending for the sake of anybody who might be approaching this with any apprehension. I was advocating for people that might need some encouragement, which is what I saw in sillysaurus’ comment, and that I thought you were tearing down. I appreciate you intended no ill will though.
It has a great quote from Hal Abelson in the foreword, too:
> Perhaps the whole distinction between program and programming language is a misleading idea, and future programmers will see themselves not as writing programs in particular, but as creating new languages for each new application.
However, the recorded lectures are quite illuminating:
also, this is more a 'computing philosophy' sort of course
On lisp: https://www.lurklurk.org/onlisp/onlisp.pdf
Mainly the key is to start tinkering, rather than jump from book to book. Just start with a very informal v1 and start hacking stuff together. When you run into a design dead-end, you'll know it because it will become increasingly hard to make forward progress.
And that's the tricky situation. When you run into a design dead end, what do you do?
Books! :) And that's where SICP might end up useful.
Or think about the problem really hard, and write down the answer. I've occasionally pulled that off.
Oh, I thought of one other suggestion: linkers and loaders. It's another classic. And unlike all the other references, there's no Lisp.
Linkers, Ian Taylor: https://www.docdroid.net/2L1IZ5y/linkers.pdf
Do you mean the book by John Levine? I had read it, it was good.
I collect single-file language implementations. None in my collection (so far) use Lua, but I'm always on the lookout.
I found a copy of the original public-domain code a while back and have been hacking on it here and there, mostly pulling in fixes from tinyscheme.
 http://htdp.org (seems to be down right now, but you can also find the near-complete draft for the second edition at http://www.ccs.neu.edu/home/matthias/HtDP2e/ )
sillysaurus3 is right about the MIT books being held on pedestals. It's not just them; it happens with lots of "elite" colleges, and there are usually much better books for learning. Where I disagree: I think SICP is a good book with plenty of value in it for people who already know Scheme. I loved reading all the sections on deriving interpreters. It was so clean-looking compared to what I did in imperative languages. It's just best to learn programming and Scheme elsewhere first.
My current parser fetish is Earley parsers, but it's a bit more complicated to get from the (possibly multiple) parse trees to an AST you can do something with; I'm still trying to grok how that all works. It probably doesn't help that I've been playing with a C++ grammar in my experiments.
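For reference, the recognizer half of Earley's algorithm fits on a page. Here's a bare-bones sketch with my own toy grammar and names (no nullable-rule handling, and it only answers yes/no; the hard part mentioned above, extracting parse trees or forests from the chart, is exactly what's left out):

```python
# Toy grammar: nonterminal -> list of productions (tuples of symbols).
# "digit" is the only terminal class. No empty productions: the classic
# predictor/completer needs extra care for those.
GRAMMAR = {
    "Sum":     [("Sum", "+", "Product"), ("Product",)],
    "Product": [("Product", "*", "Factor"), ("Factor",)],
    "Factor":  [("digit",)],
}

def matches(terminal, token):
    return token.isdigit() if terminal == "digit" else token == terminal

def recognize(tokens, start="Sum"):
    # chart[i]: set of states (head, body, dot, origin) after i tokens.
    n = len(tokens)
    chart = [set() for _ in range(n + 1)]
    chart[0] = {(start, body, 0, 0) for body in GRAMMAR[start]}
    for i in range(n + 1):
        changed = True
        while changed:
            changed = False
            for head, body, dot, origin in list(chart[i]):
                if dot < len(body):
                    sym = body[dot]
                    if sym in GRAMMAR:                       # predict
                        for prod in GRAMMAR[sym]:
                            state = (sym, prod, 0, i)
                            if state not in chart[i]:
                                chart[i].add(state)
                                changed = True
                    elif i < n and matches(sym, tokens[i]):  # scan
                        chart[i + 1].add((head, body, dot + 1, origin))
                else:                                        # complete
                    for h2, b2, d2, o2 in list(chart[origin]):
                        if d2 < len(b2) and b2[d2] == head:
                            state = (h2, b2, d2 + 1, o2)
                            if state not in chart[i]:
                                chart[i].add(state)
                                changed = True
    return any(h == start and d == len(b) and o == 0
               for h, b, d, o in chart[n])

print(recognize(list("1+2*3")))  # True
print(recognize(list("1+")))     # False
```

A real implementation keeps back-pointers on each completed state, and that back-pointer bookkeeping is where the multiple-trees/forest complexity comes in.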
Guess that's why it targets Lua.
I'm sure Lua has other sweet spots, but that's the one I'm most familiar with.
It's also embedded in Wikipedia. Complex templates can be written with Lua modules, rather than relying on wikitext.
 Lua Parser Expression Grammar
Not that I have anything against Lisp, but an emacs-ish text editor with Lua as an extension language would be really cool. I know the Zile project tried something like that, but I have no clue how far they have gotten.
Also, there is a huge body of code written in elisp that has been around for ages and got properly debugged along the way. Replacing that is not a very realistic task.
I doubt it will happen though, since more trivial and much less intrusive changes to Emacs get bikeshedded into oblivion.