Forth written "flat" with shallow data stack usage is assembly by another name, a wrapper for load-and-store; Forth that does everything with intricate stack manipulation is a source of endless puzzles since it quickly goes write-only.
But, like any good language with meta-expression capabilities, you can work your way up toward complex constructs in Forth, writing words that enforce whatever conceptual boundaries are needed, and thus check your work and use the system as a compiler. That's what separates Forth from a simple macro-assembler. But you have to write it in that direction, and design the system that makes sense for the task from the very beginning. That becomes an unacceptable trade-off very easily in our current world, where the majority of developers are relatively inexperienced consumers of innumerable dependencies.
Even Prolog, which is as far from traditional structured programming as one can go, usually has descriptive names for unbound variables.
Compared to this, Forth is very name-terse. You get "function names" at best, and nothing else. This really makes programs much harder to understand, as it requires you to keep many more things in mind while reading the code.
But as someone who really got into Forth this year, I can assure you that it does get better once you get more familiar with the language. I used to comment the stack effect on every line, and now I barely need those comments; I write words in one sitting without much effort.
Other developers are doing tacit programming in J or Haskell, pipe-forward operators in F# or OCaml, and threading macros in Lisp dialects, and they seem to do fine.
That was my experience learning Forth in the 80's. Initially I tried to add compositional features I was familiar with from APL and Lisp. I raged against the limitations of the cell and schemed to use a typed stack. Eventually I became one with PAD and was able to ALLOT peace to my code. At some point I ranted on how the idiom of counted strings could be generalized to all sorts of useful sin and had an epiphany. 2ROT on!
Well expressed. I often think of the stack as the syntax of the language, not the runtime implementation, and it is quite difficult to hire for this skill without spending a lot of money.
It's pretty hard to make that judgment. I'd guess at least 99% of programmers have never used Forth.
In other words, this is hard to do, and a difficult mental model to learn. It's the same kind of thing with LISP'y languages and functional languages (not the same model, but the same level of difficulty).
I built an RC2014 after CollapseOS was posted last year, and thoroughly enjoyed it.
I ended up adding a front panel, complete with switches and lights, to allow toggling in and executing code without a ROM, and also wrote an HTTP/1.0 server for CP/M (which was an enormous headache for lots of different reasons). Never did get around to running CollapseOS on it though.
> With a copy of this project, a capable and creative person should be able to manage to build and install Collapse OS without external resources (i.e. internet) on a machine of her design, built from scavenged parts with low-tech tools.
I'd be interested in trying to build my own proof-of-concept, but my hardware experience pretty much starts and stops with kids' electronics kits from 20 years ago.
Assuming I can scavenge a Z80, what would be a logical next step? What books/other resources should I be reading to learn more?
It gets a bit dry once it reaches the part documenting the individual instructions, but the overview before that point is well worth reading.
Once you've done that and you're trying to write code, this page is a good reference for the instructions: http://clrhome.org/table/
To build a scavenged machine, you'll need a Z80 CPU, a clock source (I think an RC oscillator would be the simplest working setup?), probably some sort of ROM, an SRAM, some sort of IO device (probably a Z80 SIO/2 for serial), and a 5V power supply. I think that should be all that is required. Then you need to load your code into the ROM at address 0.
Then you can connect the SIO to a USB serial cable (e.g. FTDI) and communicate with your Z80 using a modern PC.
I can't remember if the SIO/2 needs external address decoding logic, maybe you could cope without it if you are happy for it to use up every address?
But building an RC2014 would be easier and more likely to result in a working machine, and would give you almost all the knowledge required to build a scavenged one later if you still want to.
For help and support, the rc2014-z80 mailing list is quite active: https://groups.google.com/forum/#!forum/rc2014-z80
And I'd also happily receive RC2014 or Z80-related questions by email (available in profile), although I'm not much of an expert compared to many of the people who are active on the mailing list.
To be clear I don't necessarily share the view that this will likely happen, but for the sake of discussion I will assume it will.
A working MacBook will indeed be much easier to find in working condition at first, but not only is it much less durable (it isn't even particularly durable by current laptop standards), its parts are also much more numerous, specialized, and hard to find. MacBooks in particular have a very tightly controlled supply chain, but this applies to other laptops too.
Z80-style processors and peripherals are still in use in various industrial and home-appliance products, so they are still rather widespread anyway.
This means that while a laptop might be way more useful at the beginning of a collapse, it will probably stop being maintainable much earlier than a simpler computer will. In fact, if push comes to shove, it is plausible to actually build a Z80-compatible processor from discrete transistors.
> trying to build
What if there were a crowdfunding project to pay different people to actually do this?
They could find unclear things in the docs, or missing docs. Usability Testing, sort of, of Collapse OS?
This is maybe pretty important after all :-) More important than fly-to-Mars rockets?
In most common languages, there is a complicated base spec that covers many cases and defines a broad range of affordances, plus libraries upon libraries that expand on an already fleshed-out collection of tools, and so on.
Forths and Lisps give you the core of an environment, and let/expect you to build on the foundation to create your own implementation. Like someone else in this thread said, N programmers, N dialects. Or, more accurately, every Forth program is its own DSL for accomplishing its work.
> Part of the problem stems from our very dear friends in the artificial intelligence (AI) business. AI has a number of good approaches to formalizing human knowledge and problem solving behavior. However, AI does not provide a panacea in any area of its applicability. Some early promoters of AI to the commercial world raised expectation levels too high. These expectations had to do with the effectiveness and deliverability of expert-system-based applications.
People will put up with whatever bullshit as long as there is demand and it helps them get a job.
The thing is, UNIX was a massive success, and it happened to be written in C. Since then, every successful language has had to share a familiar syntax with the host system's language.
It was UNIX that killed the Lisp Machine (by being given away for free). Programming languages never got to play a role.
Even the dyed-in-the-wool Lisp enthusiasts headed by Richard Stallman were compelled to reproduce Unix, even though their stated goal was to have a system running Lisp.
One of the first GNU programs he released was indeed his "system running Lisp"; it was called Emacs.
For instance, MS-DOS had a larger installed base than Unix at the time, but ... enough said about that, right?
Imitation is the sincerest form of flattery, as the saying goes.
A hacker like RMS isn't going to pour years of coding into making a C compiler, and Unix utilities, in his spare time, if he thinks those technologies do not have merit.
There also arose a new generation of hackers, brought up on the new microcomputers, who didn't care for, know of, or even have access to legacy systems. As microcomputers showed signs of advancement, old hackers who had learned how to make things fit into small memories 15 years prior brandished their skills, which popularized tools like Pascal and C. Turbo Pascal for MS-DOS PCs fit a compiler and IDE into under forty kilobytes.
In the 1980's, people who wanted to use their Lisp techniques to deploy into the microcomputer market were faced with rewrites. A blatant example of this is CLIPS: an expert system written in C which retains the Lisp syntax of its predecessor. https://en.wikipedia.org/wiki/CLIPS . CLIPS was inspired by a Lisp-based system called OPS5. But that itself had also been rewritten in Bliss for speed: https://en.wikipedia.org/wiki/OPS5 .
How about both ... message from the past for the future
I think you are close to part of an answer, but it isn't because Forth and Lisp expect one to do more work than other languages. If anything, they expect one to do less. The problem is programmers feel lost because there is no way to differentiate the bedrock of the language from higher abstractions. C has operators and statements and keywords that tell you there is nothing "underneath" what you are looking at. With Forth, everything is words. With Lisp, everything is lists.
I wonder about that. A few weeks back I read about a coroutine implementation in C, using plain C and lots of intricate preprocessor definitions:
There are similar examples in just about any language out there. People use whatever tools the language ecosystem provides to change the language to fit some problems better. Some languages are easier to change and extend, some are harder, but that doesn't stop people from trying to do this anyway.
I think there's a level of familiarity with the language above which changing it is a natural thing to do. It can take years before you learn a "normal" language well enough to be able to do this, but with Forth, Scheme, Prolog, and the like, you're basically required to do this from the get-go. My intuition is that these languages simply target advanced, already experienced programmers, while completely ignoring the beginners. So it's more of the optimization for a different user-base, IMO. That would also explain how these languages are still alive, despite their communities being very small for the last 50 years.
Sure, if we ignore C.
(Submitted title was "Collapse OS was entirely rewritten in Forth, a 50 year old language".)
The project itself was discussed last year at https://news.ycombinator.com/item?id=21182628.
C has excellent portability and performance. The article agrees with the general consensus that C is also generally a better language for the programmer. So why use FORTH? What does it matter that it can do cute things with self-hosting? What does 'compactness' matter?
If the goal is to build a portable means of writing programs for Z80 and AVR, why not develop a C-like language, or an IR, or put work into developing a serious optimising C compiler targeting the Z80? I get the impression that's a relatively unexplored area for (rather niche) compiler research.
I won't claim to be a proficient Forth author, but I've used it to accomplish a couple of rather odd one-off projects, and it is amazing how much you can do, as long as you're not expecting graphics or networking or huge storage needs.
Compactness matters because when you're trying to bootstrap into a tiny (or hacked together custom build) environment, the tiny bootstrap footprint means you can be up and rolling that much faster.
C is all good, I've been writing it for a long time, but I'd much rather get a Forth core going in raw assembly than even a stripped down to brass tacks C compiler.
Using Forth gets pretty close to being the most compact representation all by itself, there's literally no extra tooling needed. No compiler or other translations, it's all just there in the words.
Once you're at that point you may as well just use Forth, especially since it's got a proven ability to work in these kinds of resource-constrained, self-hosting/self-bootstrapping environments.
Good point. Related to this, FORTH can be treated as a target platform for compilers, although I don't think there are many mature compilers that do this.
I wonder if CollapseOS will ever seriously target heavyweight platforms like x64 (and not just through emulation with libz80). I suppose that's out of scope, but it would open the door to JIT.
> you may as well just use Forth
Presumably it could be a little more compact if a less human-readable variation were used, no? FORTH uses DROP and THEN, which could be shortened at the cost of readability.
I say "concept" because you can run Forth code on an interpreter or any number of semi-compiled or compiled approaches that get closer and closer to assembly.
Forth makes bootstrapping and cross-compilation a straightforward exercise. C provides no help whatsoever until you've climbed to the top of a mountain of abstraction.
Because after the end of civilization, you may be inputting your first programs on punch cards or something similar, and your system's memory capacity might be measured in a few kilobytes, not gigabytes. Compactness is a huge deal in this context.
I often talk with a friend who's a historian, and that makes me realize how our relationship with time, as an industry, is extremely short-sighted.
The Internet Archive is an immensely valuable project, as are all the websites archiving old documentation, etc. But I don't think a lot of people realize the value of the things they're destroying every time they execute a delete statement in a DB or a filesystem.
This Collapse OS ambition to be able to "bootstrap" something useful on any kind of primitive hardware and survive the passage of time (or a catastrophic event) may have immense value in the future.
And it's not like we don't have alternatives. There is Datomic of course, but also juxt/crux and DataHike.
Meanwhile, you can't compile rustc on a 32-bit system because it runs out of address space...
I used this approach when I bootstrapped a Forth compiler on the TI 34010 graphics chip in a similar fashion as the author. It even had local variables so you didn't drive yourself mad thinking about the stack all the time.
My favorite commercial example of such a Forth was Mach 2 Forth on the early (pre-OSX) Macs. I don't know if any modern Forths do down-to-the-metal compiling or local variables, but I'd be interested to find out.
I believe most modern Forths will have these; Gforth does, for example.
> down-to-the-metal compiling
I think there are some that do this, but back in the day, lots of Forth programmers felt that threaded code was good enough for the 90% of the program where performance didn't matter, and Forths added an inline assembler for the areas where it did.
Think about seat belts. Do you ask the driver:
> "I'm really curious to hear why you think you'll crash the car?"
when s/he puts on the seat belt? S/he'd likely say "No I don't think that at all".
And it's not impossible that the Collapse OS author might have a similar reply. Still, the project can be time well spent, like a seat belt, in case of the unexpected.
Think about: 1) Likelihood-of-Global-Supply-Chain-Collapse x How-Bad. And 2) How-much-does-Collapse-OS-mitigate-the-bad-things. And compare that, with 3) time spent building C.OS.
In it, he writes: "... two important stages of collapse ... the second one is when, in a particular community, the last modern computer dies ... decades between the two ... Collapse OS won't be actually useful before you and I are both long dead"
making me wonder if one scenario he has in mind is the biggest countries in the world stopping trading with each other, so it won't be possible to get more rare-earth metals (needed for today's computers, right?). And then maybe downhill from there over the following 50 or 100 years? — But not _necessarily_ a nuclear winter or something that dramatic and sudden.
And ... He also writes:
> What made me turn to the "yup, we're fucked" camp was "Comment tout peut s'effondrer" [a book]
I think you'd find the answers in that book then? Seems the book got translated: "How Everything Can Collapse" [in our lifetime] by Pablo Servigne.
I haven't read either version, but piecing together the thesis from reviews, it seems to be a somewhat more evolved form of Malthusian catastrophe and Peak Oil(/Energy), with a dash of climate change alarmism and Piketty-style concern over inequality. And technology won't save us because... well, I haven't found anyone who can elucidate that concern. It seems to me that the commenter who wrote "this book seems to be only for those who were convinced beforehand" has it right.
 I don't like using "alarmism" here because it suggests that I don't think it's a problem (I do), but I can't think of a better succinct description of "if we don't fix this literally tomorrow, we're totally screwed."
I hope I’m not uncharitably interpreting your comment, but are you saying that technology will be some sort of panacea?
The most critical process for human success, population growth, is exponential. Pretty much all of it happened in the age of fossil fuels. We have 200 years (out of 200,000 years of human history, according to Wikipedia) of experience with global populations over 1 billion, and we are currently cruising at about 8 billion souls on the planet. All of those 200 years have been in the context of freely available and rapidly growing utilisation of fossil fuels to power the logistics networks enabling the growth.
History doesn't show us being adaptable; history shows that if something happens to the solid/liquid carbon supply, around 7/8 of us are expected to die. And we can statistically all but guarantee that something will, sooner or later, over a long enough period.
Our major reason to be hopeful is that our history isn't a guide, and that something other than oil really makes strides. Maybe nuclear, maybe renewables.
I disagree. Look at the pandemic response here vs. the Spanish flu, or the Bubonic plague.
We also have plenty of alternate energy sources to diversify our power infrastructure, and many nations are taking these steps. The past few decades have been a series of lessons on the importance of resilience over efficiency, and we're slowly learning this lesson.
The low-systemic-risk events we are living through now prove that the Emperor is naked. An event that produces moderate to serious systemic shocks would be our DOOM.
The system is on a hairline trigger and the control room is empty.
'Innovation will fix things eventually' is cold comfort for the generations of people living in the interim.
edit: I mean, look at what is happening now. All sorts of disruptions because of some sneezery (regardless of real or imagined danger, it's the policy that matters). Now imagine further disruptions by volcanic ash particles and gases in the atmosphere. So F-ed!
The Gutenberg press was used in the real world. There are lots of OSes being used in the real world for real work.
So why should someone in 500 years think this is more relevant, than for example Linux?
And if there is a real collapse, then I also do not really believe everyone will make a run for CollapseOS. There are other options: all the ones tinkerers and hackers already use today.
There is no intriguing backstory for it, like there is for CollapseOS, but it's a ~6-kiloword, practical Forth environment for Microchip PIC microcontrollers, which are a lot simpler than the Z80, btw...
The source code is trivial to understand too.
My father is still using it daily to replace Caterpillar machine electronics or build custom instruments for biological research projects.
We started with Mary(Forth) back then, when the first, very constrained PIC models came out, with an 8-deep stack and ~200 bytes of RAM. Later we used the https://rfc1149.net/devel/picforth.html compiler for those, which doesn't provide an interactive environment.
I made a MIDI "flute" with that, for example, fabricated by sawing a row of keys out of a keyboard; it used a pen housing as a blowpipe and a bent razor with a photo-gate as the blow-pressure detector...
There are more minimal Forth OSes, which might be more accessible than a Z80-based one.
I would think those are more convenient for learning how you can have video, keyboard, and disk IO, plus an interactive REPL and compiler, in less than 10 KB.
I remember playing a lot with https://wiki.c2.com/?EnthForth
But if you really want to see something mind-bending, then you should study Moore's ColorForth!
I found it completely unusable, BUT I've learnt an immense amount of stuff from it:
There are more usable variants of it, btw.
Also worth looking into Low Fat computing:
I think it's still relevant today.
I’ll be trying to compile Collapse OS, write Forth, and load Slackware from floppies in a few years, then. (Will systemd survive civilisation’s collapse, especially when it caused it, that is the question....)
Maybe I'm getting old, maybe I'm seeking more control, maybe the world has become needlessly complex, but there's a certain appeal to returning to more manageable days, when it was possible to fit a software and hardware system in your brain, and your brain was swimming with ideas on how to use a limited system rather than drowning in layers of complexity.
But on the other hand, real preppers scare the effing s out of me.
This is how I learned.
In my opinion, it's not that FORTH code is hard to read, but that FORTH gives the programmer so much freedom that every program becomes its own microcosm of DSLs.
: C(-; LICK SMILE NOSE WINK ;
"That being said, I don't consider it unreasonable to not believe that collapse is likely to happen by 2030, so please, don't feel attacked by my beliefs."
Triple negative? Quadruple negative? (If we count the second "don't".)
Compare something like:
"I consider it reasonable to believe that collapse is unlikely to happen by 2030, so please, don't feel attacked by my beliefs."
It is almost as if the grammatical structure reflects the life perspective of the author.
So let me translate for you.
He doesn't "consider it reasonable to believe that collapse is unlikely to happen by 2030"; in fact, given the importance of the matter, he believes it is better to assume the scenario in which his project turns out to be life-saving. But if collapse before 2030 doesn't seem likely to you, and you don't believe the evidence supporting that claim, he wouldn't call you silly (unreasonable) for it, so the two of you can work on the project together even if your forecasts differ; don't worry about that too much.
That seems like a perfectly valid thing for grammatical structure to do, although this one tripped me up admittedly.
That is the point. This style of communication is indirect and ambiguous. This is just negative followed by negative followed by negative, etc.
Just say what you mean. In the affirmative. Overuse of negatives is the functional equivalent of "spaghetti code" in written communication. Not easy to follow.
Anyway, some readers missed the point of the comment. It is not every day that one sees so many negatives in one sentence. Most got the point, however, and I thought the commenter who crafted a version of the sentence with even more negatives was hilarious.
“reasonable” expands to something more like “The reasons you have provided support your conclusion.”
“reasonable” can work in this case, but it doesn't state as clearly that the speaker disagrees with your conclusions.
In a more general sense the “not un-” pattern is a marker for something that is qualitatively similar to the corresponding simple positive attribute (e.g. “reasonable” or “popular”) but not to the extent of the category of things fitting that simple positive attribute. That is, category “reasonable” is a strict subset of category “not unreasonable”.
For example, something like this:
"I don't consider it unreasonable to believe that supply chains will survive to 2030, so please, don't feel attacked by my beliefs."
"That being said, I don't consider it unreasonable to not believe that collapse is unlikely to not happen by 2030"
- It sits very close to assembly
- It allows high-level programming like in C
- The concept of pushing/popping things onto/off the stack is a relatively straightforward programming model when done consistently
One of the old competitors to the likes of UEFI and uBoot is OpenFirmware (also known as OpenBoot), for which the primary UI is a Forth shell; OpenFirmware was the BIOS equivalent for Sun's SPARC workstations/servers (and still is for Oracle's/Fujitsu's SPARC servers, last I checked) and most POWER hardware (including "New World" PowerPC Macs), among others. About the most delightful pre-boot environment I've used; it's a shame it didn't catch on in the x86 or ARM space.
The big problem with Forth is software reliability: ad-hoc Forth code is hard to reason about in any generality. Languages like Factor show that dialects of Forth can be much better in this regard.
After getting into Forth I got into PostScript, which does have most of the Forth fun/freedom taken out of it. It's not usable for GUI-type programming is it?
This is (the now pedagogically famous) JonesForth. It's ~2000 lines of HEAVILY commented assembly, and at the end you have everything you need to start writing a fully functional Forth plus standard library (which is done in jonesforth.f in ~1800 lines of heavily commented Forth). That's not even as tiny as a Forth can be, and it gives you a lowish-level language that is as modifiable and extensible as any Lisp, or more so.
There is basically no faster way to get from bare metal to a comfortable, humane, and interactive (so you can write more stuff) computing environment than writing a Forth.
As I understand it, they tend to write assembly, and compress the resulting machine code, for decompression at runtime.
If you end up liking it and want 64bit, you can buy iForth for 100 EUR. It's what I use.
These are all amazing optimizing compilers.
If you care about OSS: Don't waste your time with GForth. If you want something simple that you can hack yourself, take a look at pforth.
It would be interesting to build a minimal useful CPU out of fluidics. https://en.wikipedia.org/wiki/Fluidics
I bet it would be easier to make up "sacred dances" that incorporated processing in the motions of the dancers.
And let's not forget that a human with an abacus can do a great deal of the math required for everyday life. :-)
A truly underappreciated language, imho.
I find it misleading at best to casually intimate that Lisp is some kind of ur-language which exemplifies simplicity and thus lies at the root of any design space. Fans of Lisp are overly eager to stake claim upon ideas which do not belong to their language.
I don't mean to bite your head off about it; this is just a trope I find tremendously frustrating.
These kinds of operations are behind the curtain in C but they're accessible to everybody in Forth and Lisp.
(I, on the other hand, have to be reminded that 50 years have passed.)
Edit: as someone below mentioned, there cannot be a 'Standard Forth'. You just write your own as you see fit.
Forth is a collection of ideas and philosophies toward programming as much as it is a language. Two stacks and a dictionary. Threaded code (direct, indirect, token, subroutine, etc...). Collapsed abstractions. No sealed black boxes. Tight factoring.
C has changed. Forth is ever-changing.
The stacks are likewise a description of semantics rather than implementation. Some Forths keep the top few stack elements in registers to reduce overhead. If the return stack isn't user-accessible (r> >r etc.) then, again, you're straying closer to a higher-level functional language than a Forth.
There are many kinds of threaded code. For example, subroutine-threaded code uses native subroutine call instructions in word bodies and does away with the inner interpreter. Threaded code is a natural consequence of disentangling conventional "activation records" into two stacks which grow and shrink independently. You could have a Forth without threaded code, I suppose.
Some Forths attempt to "seal off" or otherwise obscure some parts of their own internal workings from "user programs". This is more common among dialects which try to adhere to ANS specs, like GForth. This isn't totally anathema to Forthiness, but it tends to introduce additional complexity. If a word is useful for implementing a Forth kernel, why couldn't it be useful in implementing other functionality, too?
If you're building something higher-level which vaguely resembles Forth, it's probably better to describe it as a concatenative language. A dependent type system doesn't sound very Forthy, imo.
VFX Forth from MPE UK is a native code generator. It evaluates source code and emits inline code or calls depending on what you tell the compiler to do. It can expand everything inline if you tell it to, but your code would get much bigger.
This is the state of the art for Forth compilers today.
Homemade systems and older systems use threaded code.
I have no affiliation with MPE.
The focus of CollapseOS is, well, an OS that can be used after society has collapsed, something running on scavenged chips on hand-soldered boards (in the worst case scenario).
The original logic was the Z80 is still pretty prevalent, so it was thought to be a good choice to base the OS on.
Turns out that a Forth interpreter/compiler is incredibly easy to write (just a few hundred bytes of assembly, a few thousand at the upper end), so by using Forth they hugely expand the range of scavengeable chips.
Nice framing: it’s not an intellectual argument you can make to justify the benefits of the drawbacks (i.e., an infamous red flag for Stockholm syndrome), so you have to point to the experience of the thing itself (“you had to be there”).
When I first heard of Forth and started trying out some of my own code, I was surprised by the initial effort it took me to adjust to writing small words vs C style functions.
I then started building on the pieces I first wrote, and it took very little code to cover my needs.
So yes, there's a very different experience, and it does take adjustment for anyone only familiar with function-style code. And it is not just what I've described, there's a whole different thought pattern involved.