I envision something like a Raspberry Pi, but with the hardware designed to run Lisp, and taking inspiration from the original Lisp Machines.
Actually, some of the first ARM processors were supposed to run a Lisp operating system. When Apple did the research & development for the Apple Newton, they originally attempted to run a Lisp-like system on mobile devices like PDAs and tablets. This Lisp dialect was derived mostly from Scheme and Common Lisp - the language was eventually called Dylan. Apple had ARM systems in their lab which ran a Dylan OS. When Apple chose the ARM processor, they invested in ARM and used a chip that was able to support a garbage-collected language. Unfortunately these Lisp operating systems by Apple never reached the market; instead Apple shipped the Apple Newton with an OS kernel written in C++ and the software on top written and scripted in NewtonScript - a language with many aspects taken from Lisp, including garbage collection.
For a recent attempt see: https://github.com/froggey/Mezzano
Since everything was Lisp all the way down, and Lisp is a very dynamic language, all calls that comprise the OS are available as plain function calls visible to the interactive REPL, introspection, and modification. They were very developer-oriented and had lots of ways to have live displays of objects, to reuse or interact with them, and very deep debugger and documentation integration, since all code was at the same "level".
Since it's a high level language that does not expose low level memory (although certainly there are obscure implementation-specific Lisp operations for doing so used deep in the guts), corruption of memory at a low crashy level isn't generally a thing. Protections between applications/processes aren't as necessary and it can all remain plain running threads & functions interacting with each other & the OS.
There are namespacing facilities (though I'm not sure how far back in history they go) to discriminate your code's reach, as well as to import the public API from others' code, or in a different manner to bore into their private affairs generally for admin/debugging/tracing/modification. "Global" variables have thread-local, dynamically scoped bindings which are super useful for configuration, redirection, or setting other side-band broad context when calling into shared code.
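Common Lisp still works this way: special variables are dynamically scoped, and each thread sees its own active binding. A minimal sketch of the configuration/redirection pattern described above (the variable and function names are illustrative):

```lisp
;; A special (dynamically scoped) variable holding broad context.
(defvar *report-stream* *standard-output*)

(defun report (msg)
  ;; Sees whichever binding of *REPORT-STREAM* is active in the
  ;; calling thread at call time - no parameter threading needed.
  (format *report-stream* "~a~%" msg))

;; Redirect all reporting within this dynamic extent only; other
;; threads (and callers outside the LET) keep their own bindings.
(with-open-file (log "report.log" :direction :output
                     :if-exists :append :if-does-not-exist :create)
  (let ((*report-stream* log))
    (report "goes to the log file")))

(report "goes to standard output again")
```

The point is that shared code like `report` never has to know it is being redirected; the caller establishes the side-band context for everything downstream.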
Also it should be noted that these types of machines (at least from the Symbolics point of view) tended to be single-user workstations, though were still networked to allow remote (and simultaneous) access to its running world, with varying levels of per-connection context.
One of the biggest downsides of ye olde Lisp machines was garbage collection times. Some had facilities to define dedicated heap regions, but generally the GC had to walk the entire workstation heap across all "applications" in 1980s hardware, which wasn't great. But at least as a programmer, you could freely code and only worry about minimizing GC pressure when it became an issue, instead of starting from a required malloc/free perspective and constantly worry about leaks.
A stop-the-world global GC was generally avoided. By the mid-80s, Symbolics Genera had regions for data types (areas), areas for manual memory management, a generational GC, an incremental GC (the GC does some work while the programs are running), and an ephemeral GC efficiently tracking memory pages in RAM. For normal use, global stop-the-world GCs were then only needed in special situations, like saving a GCed image or running out of free memory. Even then, one often preferred to add more virtual memory, save the work, and reboot: when Lisp crashed with an out-of-memory error, one could add paging space and continue from there.
(Only half joking)
Actual Lisp operating systems also don't look like an Emacs running on hardware. The MIT Lisp operating system had Zmacs, but it was one application among others, and while Zmacs had modes (including a Dired), the typical Lisp OS application was not a Zmacs mode. None of these were based on Zmacs: the Listener (the REPL), the process overview, the font editor, the inspector, the terminal, the chat program, the documentation reader, ... On a Symbolics there were basically two other major programs reusing parts of Zmacs in their UI: the mail client and the documentation editor.
Thus when people think that a Lisp OS looks, feels or works like an Emacs on the metal or that GNU Emacs looks like the typical Lisp operating systems from the 80s (from MIT, Symbolics, LMI, TI or Xerox), then this is actually not the case.
See this for an example of how a 3D animator used a Symbolics Lisp Machine and parts of its graphics suite: https://youtu.be/8RSQ6gATnQU
Python without an operating system
Porting Python to run without an OS
My understanding is that the tooling available on Lisp machines was so good it made Emacs look rigid and dumb. I'd imagine that recreating such tooling would take more man-hours than building the Lisp machine and its kernel, but I don't know.
The software wasn't dumb, there was just a lot less of it. They didn't have IPv6, SSL, a web browser ...
One halfway house would be a Lisp unikernel.
But an operating system built around the idea could be quite interesting. From what I know, the systems running on the Lisp machines of the 1980s were nightmares from a modern security perspective. But the idea of having a closely integrated development-runtime-environment centered around Lisp is appealing, I think.
Except that building an OS from scratch is a lot of work, at least if you want to support a reasonable range of hardware. But maybe you do not even need that. Now that I think of it, one might build a desktop environment for a modern Unix-ish system that aims to reproduce the Lisp machine development environment of the olden days. This would also allow one to harness the existing Lisp implementations. Does anyone know of such a project?
Unfortunately, it seems to have been based around PicoLisp, which isn't exactly a popular Lisp implementation.
I expect that for any project like this to get wide traction in the Lisp community, it would have to be based around Common Lisp, as that's by far the most popular Lisp out there, with a huge existing ecosystem it could leverage -- an advantage akin to basing the Raspberry Pi around Java.
I sometimes try to imagine a world where these things took off, but the closest thing I've touched is Emacs and I'm pretty sure that's not what they had in mind.
Symbolics also was not a joint venture. Tom Knight was actually a co-founder of Symbolics.
In order to see how important and iconoclastic (never mind good) the Lisp Machine paradigm is, examine the trickle-down effect that it set in motion. There are popular IDEs today that are only _just beginning_ to integrate concepts from that paradigm. The Lisp Machines were so far ahead of their time, so visionary, that it will take decades if not centuries for the consensus to catch up to what they represented.
The marketplace sucks for things like healthcare, public transportation, infrastructure. But it works pretty well for many other things, and if there is one success story of the last six decades, computing would be it.
The marketplace also gave us the billion-dollar computer security industry as a direct result of practices that were 'selected' by said marketplace. We find ourselves operating in an emergent, cascading-effects, all-subsuming space - one we have absolutely no control over - that is now threatening to destroy us.
I would go as far as to say that you are completely out of touch with technology if you think for a moment that the marketplace has long-term memory or any capacity for strategy. This is validated by the constant re-invention/regurgitation of the same ideas. It seems that abiding by 'greed is good' short-termism is not really a good way to wield the Promethean fire.
Maybe you have not paid attention: but we are talking about Lisp Machines with special hardware and their specific operating system written in Lisp from the ground up. Thus it's not your idea of a high-level Lisp, but a Lisp which runs the graphics card interface, handles memory, and schedules processes, as much as it implements the network interface including the whole TCP/IP stack. It runs on top of a special processor for which the compiler generates machine code.
And yes, there are various emulators for various Lisp Machines - there are some for Intel + SPARC (Medley, the emulator for Xerox's Interlisp-D) and there is Symbolics' Open Genera (DEC Alpha, and nowadays Intel 64-bit and ARM 64-bit). Those were sold commercially. Then there are a handful of non-commercial emulators. But this is old software and there are only tiny user groups left. You can use those, but it's more like time travel to an alternative universe in its state of 20 years ago.
That's what Open Genera does. Symbolics wrote an emulator for the Ivory architecture, which allowed people to use much of the OS on a DEC Alpha under Unix.
> I maintain that 90% of those who irrationally praise that old system this much did never even work a single hour with one of them
Possibly true, though I have. I know both its pros and cons (and it conses a lot).
> They don't, but they just can't keep from creaming all over any forum where the LISP machine is mentioned at all. Darn I need my morning coffee, I'm cranky.
I think you have a point here.
I think it's useful to have alternative approaches preserved, and some people have been influenced by them and implemented somewhat similar stuff on current software (see for example McCLIM for Common Lisp, which is a portable reimplementation of some of the UI management and graphics substrate of the Symbolics Lisp Machine OS). It did not catch on, since much of that is very exotic and complex at the same time. But those people have probably never had the opportunity to format a disk in Lisp, read a Unix tape via a Lispm tape drive, or copy a file via anonymous FTP onto a Lispm. It's not the fault of the current generation - they haven't had the chance to actually work with such computers (since the whole thing was mostly dead after the early 90s), and all in all they were very rare (roughly 10000 Lisp Machines in various forms were built) and the whole experience was very expensive - thus only affordable to companies and well-funded research. This makes it more mythical and mysterious. Then also the commercial systems haven't been 'freed' yet - their source code is not available under an open source / free software license - only the old MIT code has been made available, but that is really ancient.
It's the same with the old Lispm keyboards - the layout is cool, but using it for typing? Who has ever typed on the old mechanical keyboards from the 80s and would still do that today? There are only a few...
Then of course you accuse the Lisp advocates of suffering from what you so clearly represent. If you had a clue, even a minimal one, you would never write 'just port the damn software'. This statement - and your previous comments - portray fundamental ignorance of what you are trying to argue against. May I suggest further education - Lisp Machine emulators are widely available - so as to stop making a fool of yourself.
The K-machine development was 'just porting the damn software' to a large extent.
Thank you for proving my point, which you didn't get.
a: They were not successful, but should have been, and now it's too late.
b: If anybody cared it wouldn't be too late for success.
a: No see, somebody DID care!
b: And still no success... qed
Lots of the features of the software only work because everything is in the same address space; working out how to do similar things across multiple address spaces for a modern operating system is much harder.
I have talked with RG on Skype but not about historical stuff.
TI modernized a lot of LMI's stuff (hardware and software). But TI management radically pulled the plug when the government funding for the various projects went away. DARPA financed their 32-bit Lisp microprocessor, but there was no one to fund the next round, and TI never even published an emulator or released the source code under some license. It just went away... there is an archive with code from them, but who would touch it without a license?
A student travelled to the east to hear Master Sussman discourse on the Lisp-nature. When Master Sussman began holding forth on low-level programming in Scheme, the student interrupted him.
"Master," he said, "How can you do such bit-level programming in a high-level language such as Lisp? Would you not want to use C or pure assembly language?"
Master Sussman replied, "A fisherman was walking along the beach, when he spotted an eagle. 'Brother eagle,' said he, 'how distant and unreachable is the sky!' The eagle said nothing, and flew away."
With that, the student was enlightened.
I don't think so, and, this being HN, I thought we all knew by now that just because the market favors something, that doesn't make it better in every aspect than other, virtually unknown things. The most easily identifiable examples of this are Windows vs Linux, GUI vs CLI, and point-and-click interfaces (touchscreen or mouse) vs keyboard.
The market favors what the masses favor, and the masses favor simple, intuitive interfaces with minimal learning curve. Versatility, efficiency, power and everything else that necessitates even the smallest of learning curves are all damned by the market. When's the last time we've seen an average joe read an instruction manual?
In any case, looking at Lisp OSes vs Unix OSes, there is a design dichotomy where both options have great advantages. Anyone correct me if I'm wrong, but I understand Lisp OSes chose to have the whole OS work in a single high-level language, which allows a very natural coupling between programs, basically destroying the distinction between whole programs and program functions. On the other hand, Unix OSes chose to have a very unassuming framework for programs that would best support a great diversity of programming languages so that they could best interoperate despite the fact that they could work via very different semantics. This framework, as we all know, is built around the semantics of files, which could be thought of as global variables; plain-text arguments as very unassuming (untyped and with no predefined arity) function call arguments; standard input, output, and error as lazily evaluated function arguments; and environment variables, which could be thought of as dynamically scoped variables.
I don't know exactly why Unix won over Lisp, but I could guess that it was because people favored language diversity or even just being able to stick to the language they already knew over having to learn Lisp.
I must say, though, I'm very interested in how Lisp OSes took advantage of this monolanguage property of theirs. I think I remember hearing that when an error (an exception) occurred in any application, the system could open the exact line that raised the error in an editor, allowing one to edit the source and load the modifications while the program was still running. That's just impossible in Unix by design and sounds super exciting. It would be totally useless for an average computer user of today, though. Maybe if the average computer user were a programmer they could see the value, but we all know that's never going to happen. In the beginning all users were programmers and administrators, but as the masses also became users, the administrators are now a minority and the programmers even more so. Such technology as Lisp OSes favors programmers, and that's their sin in the market.
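Common Lisp retains a descendant of that workflow: the condition system keeps the stack alive at the point of error, so in the debugger you can redefine the broken function from the editor and then resume via a restart. A minimal sketch (the function name `safe-div` is made up for illustration):

```lisp
(defun safe-div (a b)
  ;; RESTART-CASE registers ways to continue. If (/ a b) signals
  ;; DIVISION-BY-ZERO, the interactive debugger offers USE-VALUE,
  ;; and the suspended computation resumes with the supplied value.
  (restart-case (/ a b)
    (use-value (v)
      :report "Supply a value to return instead."
      v)))

;; Non-interactively, a handler can invoke the same restart while
;; the erring stack frame is still live:
(handler-bind ((division-by-zero
                 (lambda (c)
                   (declare (ignore c))
                   (invoke-restart 'use-value :undefined))))
  (safe-div 1 0))   ; returns :UNDEFINED instead of unwinding
```

The key difference from Unix-style exception handling is that the handler runs *before* the stack unwinds, which is what makes fix-and-continue possible at all.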
That was then, though. Now, the real obstacle for Lisp OSes, more so than their favoring a minority, is that we've built a shitload of stuff on top of Windows and Linux. How much work would it be to port Chrome or Firefox (both of which are like OSes on their own) to a Lisp OS? Not to mention the other loads of apps that people depend on for their work.
That reminds me, you know how much people bitch about not being able to work because they can't find buttons with specific drawings when offered LibreOffice instead of MS Office? There's lots more to the market than it choosing solely based on quality. The software market is really depressing at times.
> Nostalgia is a powerful agent of deception.
I wasn't alive when Lisp machines were a thing, so no nostalgia applies to me here.
It's another subject, but I think there's a bit of a Stallman-esque glorification of LMI when they never seemed to go anywhere nor were Lambda machines that appealing IMHO.
TI bought/licensed the technology from LMI and then developed a bunch of interesting machines, like: Lisp Machines with embedded Unix system running on its own 68020 and communicating via a 68010, the first commercial Lisp Microprocessor (a 32bit chip), compact Lisp workstations using that chip, a NuBUS board for the Mac II based on their Lisp chip, ...
Upshot: you can probably get away with it if you are careful to canonicalize pointers before using them.
Is Lisp relevant if the whole world moves to deep learning AI where self-programming is the norm? Maybe. But dedicated hardware?
Symbolic AI of the sort for which Lisp Machines were originally created is still extremely important as something is needed to provide “executive function” in intelligent systems.
Software is always a big concern and rather than requiring everything be written in Lisp, I'd want to also be able to run regular binaries written in C. This requires a way to safely embed the "impure" world.
The obvious way to do this is to extend the value domain with a hidden tag bit which is carried around everywhere, but which can't be changed or inspected by regular RISC-V instructions; in fact, almost all instructions would trap if any operand is tagged.
Memory space would be partitioned into a tagged space and an untagged one. Tagged values could only live in the tagged space and registers (this is important for precise GC). For regular user-space code, tagged values could only be created with a `cons` instruction, and accessed with `car`, `rplaca`, etc. instructions. Having dedicated instructions would allow a hardware read barrier for real-time (or incremental) GC. (Machine mode would be allowed unsafe access for parts of the GC and various other tasks.)
TL;DR: adding fast Lisp support to an existing RISC-V core is likely much easier than building a new dedicated architecture from scratch.
(set! projects (cons 'RISCV-LISP projects))
If you have a data cache, it's not hard to implement a cache line as, say, a 32-word line of 33-bit words, backed by 33 32-bit memory words.
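The count works out because the 32 tag bits of a line fill exactly one extra 32-bit word. A software sketch of that packing (an illustrative layout, not any particular controller's):

```lisp
;; Pack a cache line of 32 tagged words (33 bits each, modeled as
;; (tag . data) pairs) into 33 plain 32-bit words: one word whose
;; bit i is the tag of word i, followed by the 32 data words.
(defun pack-line (tagged-words)
  (let ((tags 0))
    (loop for (tag . data) in tagged-words
          for i from 0
          do (setf (ldb (byte 1 i) tags) tag)  ; stash tag as bit i
          collect data into datas
          finally (return (cons tags datas)))))

;; Reverse the transformation when a line is filled from memory.
(defun unpack-line (tags datas)
  (loop for data in datas
        for i from 0
        collect (cons (ldb (byte 1 i) tags) data)))
```

In hardware the same trick means a line fill is one extra memory word per line, rather than widening the whole memory system to 33 bits.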
Ah, back in those times it was already an argument.