Nice piece. Maybe because I haven't seen many articles lately with any sort of objective comparison of software approaches in general. Ok… I'll admit that any attempt I've made to find such pieces never makes it past all the SEO spam these days.
A little surprised to not see any games using Janet- since it seems to be both made for gaming and has a surprising number of ‘batteries included’ like a web server and graphics. Then again, I’ve only stumbled upon it pretty recently myself. From my minor hacking with it, it’s def worth a peek in the Lisp/games arena.
Great to see s7 getting some love. I used it as the Scheme implementation for Scheme for Max, an open-source extension to the Max/MSP computer music environment that puts a Scheme interpreter in Max, and I love it. It occupies a space somewhere between Guile, Clojure, and CL, while being very minimal, and is dead easy to embed. It is also much more liberally licensed (BSD) than Guile.
If you like CL macros with first-class environments, you will probably like s7.
It's also dead easy to use in WASM, which I am doing for a music pedagogy project. The project uses the icing approach, though I would say the cake is 3/4 icing. Notably, it was not difficult to make generic functions for calling JS functions from Scheme and vice versa, which has made things very smooth. A ganache perhaps. :-)
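To give a flavour of what that looks like on the Scheme side, here's a purely illustrative sketch: js-eval and set-status! are not part of s7, they just stand in for the kind of generic bridge function the embedding C/WASM layer would register (e.g. with s7_define_function).

    ;; Stub standing in for the real Scheme->JS bridge (hypothetical name),
    ;; defined here so the sketch runs anywhere without a browser.
    (define (js-eval code)
      (display "would hand to the JS host: ")
      (display code)
      (newline))

    ;; Scheme-side helper built on top of the bridge (also hypothetical).
    (define (set-status! msg)
      (js-eval (string-append "setStatus('" msg "')")))

    (set-status! "Lesson loaded")

The real thing just replaces the stub with a function that hands the string (or a structured call) over to JavaScript, and a mirror-image helper lets JS call back into Scheme.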
Another vote for s7. We successfully embedded it and SQLite as the (non graphics) engine for otherwise native apps on iOS and Android. Super fast. Great FFI. Reliable. Small. Huge benefits from shared code across our mobile apps, insanely fast unit tests and a clean set of tooling.
Ultimately we switched to Fennel as a more pragmatic Lisp for mobile. AFAIK we were alone in using s7 on mobile, whereas Lua especially is a common mobile extension language. This also partly reflected the state of R7RS Scheme compatibility: we were running Guile on the desktop for development while shipping s7, and subtle incompatibilities would bite us regularly (a fun one was argument evaluation order).
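For anyone curious, that particular gotcha looks like this (a minimal sketch using nothing beyond standard Scheme): the order in which operands are evaluated is unspecified, so two conforming implementations can legitimately disagree.

    ;; Operand evaluation order is unspecified in Scheme, so the side effects
    ;; below may run left-to-right on one implementation and in a different
    ;; order on another. Code that silently relies on one order can work on
    ;; the Guile dev build and then misbehave on the shipped s7, or vice versa.
    (define (note tag value)
      (display tag)
      (newline)
      value)

    (+ (note 'first 1) (note 'second 2))  ; may print first/second or second/first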
Thumbs up to both s7 and fennel for their great projects and community.
No secret. A little puzzle app company I’ve been running for a while: www.eggheadgames.com. No ads or gamification. Just classic paper magazine puzzles. Pay for more or subscribe for unlimited play.
As we’re small, rewriting the code over and over for cross-platform is costly. But as they’re games with words, we want a fully native experience for the highest quality app with great accessibility and full integration - as well as easier crash log debugging without intrusive 3rd party packages.
The classic solutions are either not native enough (Flutter), graphics-based (Unity), too expensive to maintain as a stack (React *), poorly integrated with the native mobile tooling (Go, various ports of Python, Ruby, Lisps apart from s7), or they end up as a separate core anyway but are harder to debug via crash logs (various JS, Kotlin or Swift cross-platform approaches).
Plus once we tried Lisp it was hard to go back! Succinct, fast, great editing. And yeah, it’s just fun to do something different. We also get a kick from being part of and contributing back to the Fennel community. It’s nice to mix some open source with commercial apps.
Man, I hear you. On my music pedagogy app, just being able to do the actual "business logic" in Scheme is soooo nice. And I can share my code with my Max-based work, which is awesome.
I encourage the readers of this comment thread to check out the Spritely Institute, especially its blog [0].
I don't want to spoil what it's all about, but I assure you it's a topic (and institute!) worth diving into. I've poured 10+ hours into just reading the blog and all the related links and projects :)
Except conventional CPUs aren't really glorified PDP-11s, except in the sense that they're von Neumann machines with untagged memory; and conventional CPUs started lapping the Lisp machines around the mid-80s, when 32-bit microprocessors became common and Lisp compiler technology evolved to better support them.
And even the "untagged memory" bit might not hold; with things like CHERI, a LispM-style architecture is due for a major comeback.
Hi, I think the hypothesis (I'm not sure) is that we're still using editors built by PDP-11-era developers, and tooling that's old and designed without introspection in mind, as showcased in this video. The video is really cool, but a little bit long.
Well.... I really liked the PDP-11; for years I had a PDP-11/45 in my living room, but eventually sold it to downsize to a couple of H-11 LSI-11/2 machines with dual 8" floppies.
Why? Well, the PDP-11, besides being the Unix protoplasm, is conceptually well designed. In the same way we tend to write routines that 'fit on a screen', the PDP-11's small directly addressable space encourages not-very-big modules: it encourages modularity.
Is it inadequate for today? Sure. Especially painful for Big Data. But, conceptually, there's a reason the PDP-11 was so successful, and continues to exist vestigially.
Ironically, we are now getting C Machines with hardware memory tagging, as there is no other way to fix C, and there is too much code around that will never be rewritten.
Lisp is unsuitable for modern CPUs because of memory hierarchy. Lisp operates primarily with lists, which can have pointers all over memory. This was not a problem on earlier CPUs because all memory was the same with similar random access time. Modern CPUs cannot access pointers in memory with the same speed, they need to follow locality rules for performance. This means that an algorithm using something like C or Fortran arrays will always be faster than a Lisp list based version.
Common Lisp allows one to use arrays and other structures, and some implementations even allow inlining assembly. So although lists are a major part of the ecosystem and of the code representation itself, they are not mandatory for implementing an algorithm, nor necessarily a performance drawback to using Lisp. By the same token, it's easy for people to accidentally use lists and pointers to implement algorithms just as inefficiently in other languages - Python comes to mind. A standard approach in Python is to lean on external libraries for high-performance computation, and this can be done just the same way in Lisp - but Lisp can also be used directly to write efficient low-level algorithms by making a conscious effort not to reach for the list- and pointer-based functions.
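As a small illustration of that trade-off (sketched in Scheme here; a Common Lisp version with specialized arrays and type declarations would make the same point): summing a list chases one pointer per element, while summing a vector walks a single contiguous block.

    ;; Build the same million numbers as a list (cons cells, potentially
    ;; scattered across the heap) and as a vector (one contiguous block).
    (define n 1000000)

    (define xs-list
      (let loop ((i 0) (acc '()))
        (if (= i n) acc (loop (+ i 1) (cons i acc)))))

    (define xs-vec (make-vector n 0))
    (let loop ((i 0))
      (when (< i n)
        (vector-set! xs-vec i i)
        (loop (+ i 1))))

    ;; Same algorithm, different memory layout: the list walk dereferences a
    ;; pointer per element, the vector walk is cache-friendly.
    (define (sum-list lst)
      (let loop ((l lst) (acc 0))
        (if (null? l) acc (loop (cdr l) (+ acc (car l))))))

    (define (sum-vector v)
      (let loop ((i 0) (acc 0))
        (if (= i (vector-length v))
            acc
            (loop (+ i 1) (+ acc (vector-ref v i))))))

    (display (sum-list xs-list)) (newline)
    (display (sum-vector xs-vec)) (newline)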
Modern CPUs can execute Lisp just fine, just as they execute languages like Java and JavaScript, which also use a lot of pointers. Like those, Lisp offers vectors, strings, hash tables, structures/records and objects.
> This was not a problem on earlier CPUs because all memory was the same with similar access time.
That's not true. Many early systems had small RAM sizes (for example, a DEC VAX 11/780 had a few megabytes and often served dozens of users) and virtual memory that was 10 (or more) times larger and much slower.
Lisp systems back then tried to deal with that with the first generational garbage collectors, often also compacting ones (related objects stayed close together in memory). Ephemeral GCs watched for changed memory in RAM.
Depending on the implementation, short lists can be optimized (just as short strings are in C++).
You may be right about C and Fortran. But tbh, in an era where even the tiniest thing runs an OS with an MMU, I really prefer GC when performance is not a problem.
If it is just a performance concern, then ASM will always beat C…
Like everything in engineering, it's about finding the right trade-off.
I'm excited to see all of the recent progress with Guile Scheme. Since the last time I looked, it's gone from being an interpreted language to having a full-on compiler. Now it can be compiled to WASM with Hoot. This is exciting.
I'm comfortable with Clojure, uLisp and Common Lisp...but I feel like Guile Scheme has cleared away much of CL's cruft, and I'd love to have a compiled Lisp at my disposal, especially as Guix and Shepherd take off.
Are there any good resources for learning how to use Guile Scheme effectively? Besides Little Lisper and SICP...
Janet seems to have fallen through the cracks. I went back to Fennel for scripting stuff, mostly because I can just go and get a lot more Lua libraries and run it everywhere (even on my iPad with a-Shell) with zero hassle.
Do you think that Fennel adds enough value on top of Lua to justify the extra compilation step?
I mean, it's valid to use it just for the fun of it; I'm just wondering. Lua is already more or less a Scheme clothed in a more conventional syntax. On top of that, the fact that Lua already has excellent metaprogramming capabilities kind of negates the typical advantage of using a Lisp-style syntax. So what is Fennel bringing to the table?
Every copy of GNU Emacs comes bundled with the text adventure Dunnet [0].
Dunnet was originally written by Ron Schnell in 1982 as a Maclisp program running under TOPS-20. [1] In 1992, he ported it to Emacs Lisp; however, the Emacs Lisp version is more than a simple port of the original. It extends the game with new rooms, items, and puzzles, and it removes MIT-centric content. For example, the "endgame" computer at the end of the game was originally named MIT-SALLY, was located at MIT, and was accessed via Chaosnet; the GNU Emacs version drops all those (dated) MIT references, although it does contain (obviously intentionally) equally dated, albeit more widely recognisable, content such as a VAX 11/780.
Playing "This thing all things devours" is one of the most profound gaming experiences I have had, and I happened to use Malyon. Why wouldn't I use the best text editor to play an inform game?