
Collapse OS - spraak
https://collapseos.org/
======
bcaa7f3a8bbc
I don't think just developing a Z80 operating system is enough. The whole
ecosystem needs to be preserved.

In agriculture, we have the doomsday seed vault [0] just for this purpose. If
we anticipate collapse of the current economic system or society, I think we
should build a doomsday computer vault that keeps everything we need to
rebuild the computing industry in an underground bunker, stored in a
controlled environment: 8080s, Z80s, m68ks, motherboards, logic
chips, I/O controllers, ROM/RAM, generic electronic parts, soldering irons,
oscilloscopes, logic analyzers, schematics, documentation, textbooks,
software. We also keep some complete, standalone computer systems such as
desktops and laptops, and all the parts needed to service them. We also
need to preserve the old semiconductor production lines around the world,
although probably not in the same bunker. Even if we fail to build better
systems, 8080s are already useful enough!

Meanwhile, in peacetime, we need to form a team of experts that makes a
roadmap for re-bootstrapping computing technology in the future using parts
from the bunker - a step-by-step plan that can be easily followed and
executed.

[0]
[https://en.wikipedia.org/wiki/Svalbard_Global_Seed_Vault](https://en.wikipedia.org/wiki/Svalbard_Global_Seed_Vault)

~~~
rapind
I'd like to see a log being kept in orbit. Like every 10 years it's rotated
(FIFO, oldest comes down, new one is launched) with a large delta so that it
might survive our current civilization in case of global / climate craziness.

~~~
Scapeghost
What about burying something on the moon, with monoliths and radio sources
marking its location?

~~~
Johnny555
Maybe we already did that in a previous civilization, but lost the ability to
recognize the beacons.

~~~
0xdeadb00f
_The Sentinel_ anyone?

------
AtlasBarfed
I think some sort of retro low-performance OS should start with NeXTSTEP.

Ryzen processors have 80MB of L3 cache! IIRC that may have been twice what was
needed to run the colorized versions of NeXT.

A lot of "man the old days were all we needed" recollections of Windows and
DOS forget how crappy those OSes were...

But NeXT? That was basically a modern OS, with a bit less anti-aliasing.
X Window systems STILL look horrible compared to it.

It fits in L3 cache. 5-10 NeXT hard disks fit in RAM!

It had a web 1.0 browser, TCP/IP networking, all the productivity apps. It had
DOOM.

Sure, Mach the microkernel was buggy and leaked memory, but kernels... there's
lots of kernels now.

I think it would be a great project to start with the NeXT OS and carefully
rebuild it with good security and multitasking.

It would be fun to run NeXTSTEP on bare metal with RAMdisks (spare a processor
to do the writes to SSD), and compare with other systems, see if it feels
ludicrously fast or not.

~~~
cable2600
NeXTSTEP basically became Mac OS X when Apple merged the classic Mac OS with it.

AROS is the next best OS to use, as it has a low memory footprint and can run
on x86 systems: [http://aros.sourceforge.net/](http://aros.sourceforge.net/)

If you want Windows, try ReactOS: [https://reactos.org/](https://reactos.org/)

For OS/2, try OSFree: [http://osfree.org/](http://osfree.org/)

For BeOS, try HaikuOS: [https://www.haiku-os.org/](https://www.haiku-os.org/)

All are low memory OSes.

~~~
godDLL
Couldn't get ReactOS to boot on any hardware that I own. HaikuOS boots with
GFX disabled, but is not anywhere near stable.

Will try the other two, I guess. Not getting my hopes up.

~~~
Crestwave
_> HaikuOS boots with GFX disabled, but is not anywhere near stable._

Can you elaborate? It's been fairly stable for me.

P.S. It's just Haiku, not HaikuOS

~~~
godDLL
I get frequent freezes, followed by a kernel panic. Sound, network, memory
management; every time it's different.

~~~
cable2600
Try running it in a virtual machine.

------
SwellJoe
I've been sort of half-assed working on a novel about a post-tech future (100+
years after apocalypse), where a handful of people retained some technology by
virtue of having stored information about it in a manner that survived, a
tradition of scavenging and trading, and a sort of religion based on
maintaining it in a working state. So, this is a fun read and thought
experiment, even if an 8-bit computer is probably not my highest priority when
thinking of "when the shit hits the fan, as seems more likely today than it
did five years ago, what do I want to have in my survival kit?"

One of the questions I keep coming back to for such a scenario, and still
haven't come up with a great answer for, is how does someone living in a world
without the ability to manufacture a computer still have computers that work
100+ years after the last one was made? Even manufacturing transistors without
modern methods is non-trivial. Will a Z80 last 100+ years? I mean, maybe, if
it's kept dry and not exposed to anything corrosive. I've got a Commodore 64
that's ~40 years old and still works...so, 100 years seems reachable, but
there have to be extenuating circumstances to get to that "post-tech" world
(though I guess in a post apocalyptic world, the value of computers would be
seen as minimal for a few years while survival is the only concern, so just
forgetting could be enough).

~~~
rwmj
My novel idea (feel free to use it!) is about someone taking a 1980s computer
back to the Second World War. It's used by the Allies to decrypt Ultra
intelligence and is treated like a kind of holy relic - only a tiny set of
"high priests" are allowed near it, fewer still can touch it, and because of
its importance they go to extraordinary lengths to ensure it can never be
damaged by anything from bombs to power glitches. Think a Commodore 64 in a
ridiculous white room.

But the book would be more about the consequences of this - do they eventually
take the thing apart and jump-start a silicon chip revolution in the 1950s, or
(more likely I think) does the government destroy the machine as the UK
government did to the Bletchley machines after WWII, and because there's no
ground-up computer theory does it set back computing for decades?

~~~
fernly
Yeah, no. I've handled parts of Whirlwind[1], a vacuum tube machine from just
post-WWII, and the gap from that to a C-64 or any other circa-1980 machine is
just too great. They were using discrete wiring, resistors and wires soldered
to the bases of the vacuum tubes. The Whirlwind was the first machine to use
core memory, and the 4K core memory unit is a box about the size of a phone
booth. I don't know if PCBs existed before 1950 but if they did, they were
certainly single-sided.

So now ask somebody really smart in that technology, like say Jay Forrester[2]
who had just finished inventing core memory, to analyze this magic beige
plastic box. He could probably recognize that the PCB provided connectivity
between parts, but what are the parts, these little flat plastic tiles? I
don't think it would be possible to work out from first principles what the
functional contents of a DRAM chip are, let alone the CPU. Even if they x-rayed
it, supposing they had x-ray tech with enough resolution to resolve a chip,
how could they figure out that those little blobs are transistors? Transistors
hadn't been invented!

I think they'd have to concede this is "sufficiently advanced" tech, in Arthur
C. Clarke's phrase, indistinguishable from magic.

[1]
[https://en.wikipedia.org/wiki/Whirlwind_I](https://en.wikipedia.org/wiki/Whirlwind_I)

[2]
[https://en.wikipedia.org/wiki/Jay_Wright_Forrester](https://en.wikipedia.org/wiki/Jay_Wright_Forrester)

~~~
rwmj
They would see the "© 1982" on a chip and although it would be incredibly
futuristic (35+ years in the future!), would at least know it was likely to be
created by humans. Whether they could work out how on earth you place such
incredibly tiny components onto a sliver of silicon is interesting. If the
person taking the computer back in time mentioned the word "photolithography"
I suspect they would have been able to make a pretty good guess.

~~~
murderfs
I don't think there would be many copyright dates on the chips. They might
think that Texas ruled the world from the TI logo being on everything, though.

~~~
rwmj
Here's a high res picture of the C64 PCB, where you can see the markings on
the chips: [https://myoldcomputer.nl/wp-content/uploads/2015/11/board-326298.jpg](https://myoldcomputer.nl/wp-content/uploads/2015/11/board-326298.jpg)

You can see both copyright dates, and plenty of other English text. While in
1940 this would have represented incredible futuristic technology, it's pretty
obviously made by humans and not a piece of alien magic. It also has
components like resistors and capacitors with markings which would have been
immediately obvious to 1940s electronics experts.

------
agentultra
A neat idea!

I think we could also get started now. Not necessarily de-escalating tech, but
realizing that the fundamental supply of newer, more powerful chips might not
last even with a shift to more plentiful supplies of rare-earth metals due to
our need to get off of fossil fuels, fast. I think it might be useful in the
more immediate term to be able to lock in the minimum set of features that
make the web and Internet useful then distribute that as widely as possible on
low-power, commodity platforms with resilient networks that _could_ survive
super-storms knocking out huge swaths of devices in one fell swoop.

Low-power p2p protocols, mesh networking, recycling devices, component
platforms that allow scavenging and makeshift repairs, etc.

Until we can solve the green energy problem it might be nice to know that even
if your community gets hit with a storm or flood, it's still possible to
restore and maintain network services rapidly in the aftermath. Simply being
able to send a message to someone would be a big deal.

~~~
Spooky23
I think there is a business model for different reasons.

As web client tech stabilizes and telecom regulatory rollback continues, there
may be an opportunity for localized solutions to be landed for all sorts of
different purposes.

~~~
npo9
I don’t see large opportunities in localized “web tech” (I’ll read that as
HTTP-based) business.

The whole reason the web works for business is that you can give something
away to millions of people for ad revenue or sell something to very large
groups for small amounts of money.

Localization reduces your customer base. The price a business has to charge
would be higher. This is on top of the adoption problem.

~~~
Spooky23
That’s one aspect of the business, but you also have companies like Netsuite
or Intuit selling general ledgers and similar solutions, and those lines of
business don’t really benefit from the scale.

------
digitalsushi
I was pontificating about my 401k fund and what I should do if I find it no
longer exists when I need to start using it.

The older gentleman who was polite enough to listen to me said, "It's ok guy,
if that 401k doesn't exist, then neither will you".

And so I think I will not stockpile any computers for later. I do like the
engineering spirit of this however.

~~~
mattnewport
"It's ok guy, if that 401k doesn't exist, then neither will you".

There are a number of ways that can fail to be true without a large-scale
societal collapse. Fraud involving pension funds has happened many times in
the past
(Bernie Madoff, Robert Maxwell being two high profile examples). The last
financial crisis brought a bit more attention to the topic of counterparty
risk - the idea that your "safe" investment is only as safe as the
institutions that are backing it in many cases. It's not necessarily a high
priority concern but I think it's worth at least considering splitting your
retirement savings across more than one account with different institutions.

There are also lots of conceivable larger scale crises with historical
precedent (many in the 20th Century) that would render your retirement savings
largely worthless without leaving you dead. In many of those you would have
more pressing concerns than your 401K but it still seems like not a bad idea
to have some physical things of value that you keep somewhere secure but
accessible (cash, perhaps gold and/or silver).

~~~
nradov
401(k) plans have individual named customer accounts and strict auditing
standards. A Bernie Madoff type scam isn't really possible. Even if Vanguard
or Fidelity goes bankrupt the customer accounts will still exist. There's no
counterparty risk (except for cash actively being transferred in or out).

~~~
mattnewport
Even if you have full confidence in auditors and regulators (Arthur Andersen?)
that only guarantees that you own the contents of the account. What about
what's in those accounts? How familiar are most people with the details of the
various mutual funds and/or ETFs that might be held in their accounts? How
complex are the webs of obligations inside those funds? If one or more large
financial institutions were to have a Lehman Brothers situation, how long
might the assets in some of those funds be tied up in litigation, even if they
were actually still there?

These concerns are all likely fairly low probability but there's certainly a
whole range of possible scenarios between "my retirement funds are completely
secure" and "I'm dead in a global thermonuclear apocalypse".

~~~
bluGill
There are enough people who depend on those accounts for their income that
Congress would do something. I don't know what, but the litigation would be
the top concern of Congress, and that pressure would ensure that things got
wrapped up quickly somehow.

------
frabert
So, let me see if I understand this correctly: this is supposed to run on
z80-based computers after an armageddon comes and all the other "modern"
computers are out of business, so people start building them by scavenging
parts. Ok.

So, first of all, how are you supposed to download this thing onto your
homebrew computer, given that the internet will most likely be down?

"But if the collapse magnitude is right, then this project will change the
course of our history, which makes it worth trying."

Mmmh, I think the author is a bit on the hyperbolic side here. I'm quite sure
that anyone who can design & assemble a Z80 computer can quite comfortably
code some basic utilities by himself just fine. All the others won't care a
bit about your OS. Sorry if I sounded harsh, but I actually was.

~~~
blfr
You're supposed to download it now and keep it around. Just like preppers do with
MREs and ammunition.

~~~
AtlasBarfed
I said this further up the thread, but I'd rather have a raspberry pi, some
screen cribbed from a smartphone, and some flash sticks.

Why plan for less than the raspberry pi level?

~~~
maze-le
Because the Pi might be needed for more demanding workloads (networks,
routing, SDR radio systems). With this solution you can build low-tech control
systems and programmable switches with a bit of logic where a Raspberry Pi
would be overkill.

But doomsday scenarios aside, this is super useful as an educational
device. It can teach people what computers actually are and how they operate
on the lowest possible level.

~~~
MetalGuru
This is what excites me about this project. Another commenter posted the
NandToTetris course, which is along the same lines. Computers/software are so
complicated now that much of the understanding gets lost in the upper levels
of abstraction. Everything just seems like black-box magic.

------
jedberg
For anyone interested in the idea of keeping old computers running, I highly
highly recommend the Living Computer Museum in Seattle [0]. It was started by
Paul Allen and has some of the coolest stuff I've ever seen. Their goal is to
restore old computers to working condition and have them look/feel/smell/work
the same as they did when new. I got to see a ton of old computers that I had
as a kid. I even got to write a program in BASIC on the original IBM PC like I
did when I was a kid! [1]

[0] [https://www.livingcomputers.org](https://www.livingcomputers.org)

[1]
[https://twitter.com/TeriRadichel/status/1164369796307116033](https://twitter.com/TeriRadichel/status/1164369796307116033)

------
hutzlibu
Why can it only be useful after a collapse?

If it can be useful, then it can also be useful today to all the (poor)
tinkerers around the world. There are lots of alternative eco-villages etc.
trying to be self-sufficient, who do all kinds of recycling and improvised
technology. If it is adopted by those people, then it might be useful.

But if they cannot use this today, then I don't see how a broken-down
survivor group could use it.

------
RodgerTheGreat
Forth is the ideal language for bootstrapping a cobbled-together computer from
whatever scraps you can find. Forth gives you a shell, an assembler, a
disassembler, and a rich, extensible programming language in a few kilobytes.
You can peek or poke hardware interactively, or use the REPL as a calculator.
Forth-style assemblers also make cross-compilation very practical.

If I were tasked with bootstrapping a post-apocalyptic computer from junk, a
hard copy of a well-commented Forth implementation would be welcome
assistance.
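
For the curious, the flavor of the thing can be sketched in a few lines. This
is a toy Forth-style interpreter written in Python for readability (not real
Forth, and the word set shown is just a hypothetical minimum): the outer
interpreter is nothing but a loop over whitespace-separated tokens, which is
why a real Forth kernel fits in a few kilobytes.

```python
# Toy Forth-style interpreter: a data stack plus a dictionary of "words".
# ": name body ;" defines a new word in terms of existing ones, which is
# how Forth extends the language from within the language.

def make_forth():
    stack = []
    words = {
        "+":    lambda: stack.append(stack.pop() + stack.pop()),
        "*":    lambda: stack.append(stack.pop() * stack.pop()),
        "dup":  lambda: stack.append(stack[-1]),
        "swap": lambda: stack.extend((stack.pop(), stack.pop())),
        "drop": lambda: stack.pop(),
    }

    def run(source):
        tokens = iter(source.split())
        for tok in tokens:
            if tok == ":":                 # start of a colon definition
                name, body = next(tokens), []
                for t in tokens:
                    if t == ";":           # end of the definition
                        break
                    body.append(t)
                words[name] = lambda b=" ".join(body): run(b)
            elif tok in words:
                words[tok]()               # execute a known word
            else:
                stack.append(int(tok))     # anything else is a number literal
        return stack

    return run

run = make_forth()
run(": square dup * ;")              # define a new word interactively
print(run("3 square 4 square +"))    # → [25]
```

The whole "shell" is that token loop; peeking and poking hardware would just
be two more words in the dictionary.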

~~~
sdegutis
Sounds great! Can we get all that with a less confusing syntax? That’s my main
difficulty with Forth.

~~~
RodgerTheGreat
The syntax is unfamiliar to most, but it's very consistent and learnable. The
syntax is one of many aspects of the design that keeps the whole thing simple.
The use of a stack to pass arguments between words makes the kind of
"pipelines" which are common in CLI shells very natural. I can think of few,
if any, other languages I would want to use interactively as a replacement
shell without complex line-editing assistance.

For those curious as to what a modern machine using Forth on bare metal as an
operating system might feel like, check out Open Firmware:
[https://www.openfirmware.info/Open_Firmware](https://www.openfirmware.info/Open_Firmware)

(If you have an OLPC sitting around in a closet somewhere from the Give-One-
Get-One program years ago, you already have a serviceable and physically
robust Forth machine ready to roll! Same deal for some older Powerbooks and
Sun workstations.)

~~~
drudru11
I wish Open Firmware were the norm. PC BIOS and UEFI are horrible in comparison.

------
albandread
I like the Collapse OS concept. I once typed Fig Forth from a booklet I
ordered from America into a Sharp Z80 computer. Society was, in the main,
computer-free at the time. It was like the post-collapse era. To restart the
software industry from scratch, I recommend etching a modern version of that
Fig Forth booklet onto stone tablets; perhaps provide a Scheme interpreter
written in Forth as well. They will never need anything else.

~~~
drivers99
I was also thinking that if I had to start from scratch, Forth would be a
great way to do it. I'm not familiar with Fig Forth, but I just ran across
these assembly sources for various early CPU architectures:
[http://www.forth.org/fig-forth/contents.html](http://www.forth.org/fig-forth/contents.html)
but they look really long. I wonder if that booklet you
had was simpler than these. (Any idea if it's available online?) I was
thinking of something like Jonesforth
[https://news.ycombinator.com/item?id=10187248](https://news.ycombinator.com/item?id=10187248)
which has a minimal assembly part (which is mostly comments explaining how it
all works) and then quickly moves to implementing Forth in Forth itself.

------
rst
I like the concept, but I'm not sure that the Z-80 is the best implementation
substrate -- it's got a lot of oddball properties and special case instruction
encodings (due in part to the way things were squeezed in around the base 8080
instruction set).

A PDP-8 can be implemented in fewer transistors (original DEC wiring diagrams
are on bitsavers, and github has source for several clones in Verilog), and
DEC already shipped a moderately full software suite for it.

~~~
exDM69
I think the point is that there are so many Z80 chips out there for
scavenging, and that there are plenty of consumer devices that can be used
(like TI calculators). Even if they stop making them today, the supply will
last a long time.

~~~
noir_lord
They are also the cockroach equivalent of a processor: tough as hell,
functional on a ropey power supply, and able to form a simple 8-bit
machine with very few ancillary chips - the reason they ended up in the
Sinclair ZXs.

~~~
cestith
Similar reasons are likely why they were in the TRS-80.

~~~
rst
The TRS-80 was... not built for ruggedness and durability. At least not if you
had a system with any peripherals; the cables were notoriously flaky, to the
point that the problems are on record in the Wikipedia page, as an explanation
for the "Trash-80" sobriquet. (I recall at the time seeing aftermarket
recommendations for expensive cables _and_ doing things like wrapping
components in tinfoil to try to get some extra RF shielding.)

~~~
cestith
The system wasn't built for ruggedness. The chip had to be, because the system
provided that ropey power supply previously mentioned and very few ancillary
chips. It was a relatively inexpensive chip that one could build an even less
expensive home computer around.

~~~
noir_lord
Precisely this. The ZX80 was a masterpiece of "just how close to the wind can
we sail and still have a mostly-functional-for-most-people legit computer".

I mean it was built out of off the shelf cheaply (relatively) available TTL
components and no real RF shielding (something variants in the US had to fix
to comply with the FCC rules of the time).

It's astounding that it was a commercial success, but it cost 80 quid (as a
kit, 100 pre-built) at a time when others were 3-4 to 10 times as expensive
(the average wage back then was around 110 per week).

In a very real sense it democratised computers to something almost anyone
working could afford if they wanted it.

I know that if it hadn't been for the ZX-81/ZX-Spectrum I wouldn't have had a
career in software engineering, nor a lifelong love for computers. I was born
in '80 to working-class parents in the north of England; even in 1987, having
a computer was considered exotic among my cohort. I didn't see another one
outside my family till 1990 (a C64 I lusted after).

------
headcanon
Why does the author think the global supply chain will collapse in the next
ten years? What scenario do they envision?

Climate change? It will cost trillions of dollars and billions of lives, but
will likely play out over the course of several decades. We will be stressing
out about it, but it's not going to be an electronics-ending apocalypse.

Nuclear war? Please. The countries that have the capability are also level-
headed enough to use them to play brinksmanship, despite what the news is
telling us. These countries want deterrence, not to blow stuff up.

Disease? We're too widely distributed and the most successful viruses are ones
that infect but do not kill. Ebola is scary, but it's too destructive for its
own good, which makes it easy to contain. The most successful virus is the
common cold, and possibly HIV which is certainly a serious problem, but
nobody's out there building shelters because of that.

Water/food supply? Fresh water is a function of energy, and if anything is a
plus about climate change, it's that we're gonna have a lot of fresh water
raining down on us from the Earth trying to compensate for higher temps.

Second-order effects from climate change will likely affect arable land and
are worrisome, but they may also open up new areas for growth and will likely
play out over time, so I'm considering this more of a political problem.

The only things I can think of are either:

1) A sudden disappearance of the rare-earth metals needed to make electronics,
which would be massively inconvenient but we'd figure out a way around that,
either by it suddenly becoming more valuable to recycle old electronics or not
needing them in the first place. Besides if this happens we'd just get extra
motivated to start mining asteroids.

2) Celestial events like an asteroid strike or a coronal mass ejection hitting
Earth in the wrong way. The first is mitigated with asteroid tracking, and
we're getting better at that; the second would make for an interesting 6
months, but I'm pretty sure we'd get back on track pretty quickly.

I am all for technology that does not depend on a complex global supply chain
- we will need to manufacture simple but sophisticated tech in space and on
Mars in the future - but this prepper BS is just fantasy driven by a hyper-
apocalyptic news cycle shlepping around clickbait.

What am I not worried about that I should be? What massively apocalyptic event
is going to happen in 10 years to turn us back to the middle ages? Seriously.

~~~
undershirt
> this prepper BS is just fantasy driven by a hyper-apocalyptic news cycle
> shlepping around clickbait

Au contraire, it’s the belief that our system can continue like it’s doing
that is the real hyperbole. Collapse is just the baseline reality of
civilizations.

- HISTORY: Collapse is a property of every civilization we’ve studied. These
people were as smart if not smarter than us, working with societies smaller
and simpler than ours.

- ECONOMY: The way money is created and managed today is an ongoing
experiment that almost ended in 2008, and we are still on uncharted ground. We
can only continue paying for debt by increasing consumption in the following
year, yet our debt keeps increasing through the ever-devaluation of our currency,
requiring more production and consumption. No one is planning on an end to
this model of growth.

- TECH: Most of our infrastructure is built under the incentive of increased
efficiency and profit, not long-term robustness since profit has to be
sacrificed to plan for contingencies like price fluctuations in supply. Short
term tech outcompetes the long term, easy. Strong but fragile. And then
there’s the incentivized inefficiencies from economies of scale: one calorie
of food now requires ten calories of energy from our system to produce.

- COMPLEXITY: “More is different.” As everything becomes interconnected,
things become entrenched into dynamics that become increasingly difficult to
control and even reason about. Rational decision-making must always be
filtered by the interests of the current system, thus there is a loss in
agency in what we can do (read: incentives), and we are stuck trying to find
creative solutions that must accept the framework of what may be a harmful
system, often just making that system more effectively harmful.

- ENVIRONMENT: Some call it the sixth mass extinction. Whatever it is, the
biosphere is changing dramatically. Soil is in a weird zombie state kept alive
by oil. The basic line is that the value of life is diminished through the
lens of our economy, as dead resources. So our model will continue bringing
the real world into consistency with that deadness.

- MYTHS: When we live in a civilization that sanctifies all forms of
advancement and improvement and growth, there is no fertile soil for the
acceptance of limitation. We only have the vocabulary to label it pessimist.
Thus, optimism becomes co-opted for the aspirations of a mythical techno-
utopia beyond all conceivable boundary.

~~~
nwah1
>Collapse is a property of every civilization we’ve studied

How would you define "civilization?" Because sure, every civilization has an
expiration date, but for current computing technology to be lost requires a
worldwide civilizational collapse. Current global civilization is a
decentralized collection of many civilizations which have all shared and
replicated the knowledge of computing.

>our debt keeps increasing

Public and private debt are separate things. Public debt has generally seen a
continuous march upwards. Private debt has been peaky, with no upward trend.
Debts are fine when the debt is incurred for a purpose that has a sufficient
return on investment. Public debts of sovereign currency issuers can always be
repaid, and the yields on those bonds are whatever the currency issuer
decides. And further debts shouldn't be judged as nonviable just because of
the quantity of existing debt. Rather, the question at each point should be
whether the investment is a good one.

> Soil is in a weird zombie state kept alive by oil

Soil is renewable, and can be made even with simple techniques. The terra
preta soil of the Amazon rainforest was largely human-made, and thus the
Amazon itself is largely a human construct. Creating it didn't require any
oil.

>there is no fertile soil for the acceptance of limitation

Malthusian thinking has often been the default, and one of the most popular
modes of thinking since the Enlightenment. The mid 20th century was full of
best-selling Malthusian books by the Club of Rome, Paul Ehrlich, M. King
Hubbert, and EF Schumacher. The entire fields of biology and ecology have been
predicated on Malthusianism. Darwin was explicitly inspired by Malthus.

It has been to the great surprise of the intelligentsia of each successive
generation that there hasn't been mass starvation. We've been able to do more
and more, with less and less. Any serious type of collapse hypothesis needs to
factor in the history of losing bets on that side of the argument, and
internalize why their predictions were wrong. It wasn't just luck every time.

~~~
undershirt
Definitely, the Green Revolution et al. is a solid basis for optimism,
especially with Ehrlich losing his wager on resource scarcity. And I do like
the malthusian lineage you described.

This empirical optimism is also paradoxically irreverent toward the immutable
attrition of complexity. Our creativity has limits; whatever they are, just
pick something. At the risk of sounding flippant, 200 years of “creative
patching” is historically too small a window to say we can continue subverting
this “law” with eternal vigilance (I’ve heard this described as “we are
running out of tricks”). Maybe I’m oversimplifying when I say we would have to
approach the limit of absolute foresight to achieve this, but I think there’s
some truth to it. For example, I like these explanations of our rational
limits, with regard to managing a complex society:

- CHOMSKY[1]: We have in our heads a certain set of possible intellectual
structures. In the lucky event that some aspect of reality happens to have the
character of one of these structures in our mind, then we have a science. And
that doesn’t mean everything is ultimately going to fall within the domain of
science. Quite the contrary… personally I believe that the nature of a decent
society might fall outside scope of possible human science.

- ZIZEK[2]: Hegel says, the owl of Minerva only flies out in the dusk. [owl
being the icon of wisdom] So philosophy can only grasp a social order when
it’s already in its decay.

Particularly unsettling is our reaction to the _blurriness_ of our creative
boundaries—that we insist on walking blindly toward cliffs to find where they
are. Optimism in uncertainty is great, but some projections cannot be certain
until too late.

A final quote that might address your first points:

- OPHULS[3]: Because our own civilization is global, its collapse will also
be global, as well as uniquely devastating owing to the immensity of its
population, complexity, and consumption. To avoid the common fate of all past
civilizations will require a radical change in our ethos—to wit, the
deliberate renunciation of greatness...

Anyway, this debate is covered in the book The Wizard and The Prophet[4]. I
think we can tell which schools we belong to.

[1]:
[https://youtu.be/3wfNl2L0Gf8?t=1748](https://youtu.be/3wfNl2L0Gf8?t=1748)

[2]:
[https://youtu.be/lsWndfzuOc4?t=6703](https://youtu.be/lsWndfzuOc4?t=6703)

[3]:
[https://www.amazon.com/dp/1479243140](https://www.amazon.com/dp/1479243140)

[4]: [https://www.penguinrandomhouse.com/books/220698/the-wizard-and-the-prophet-by-charles-c-mann/9780307961693](https://www.penguinrandomhouse.com/books/220698/the-wizard-and-the-prophet-by-charles-c-mann/9780307961693)

------
mschaef
In the event of a sufficiently large collapse, people will be so far down on
Maslow's hierarchy of needs that an OS will be about the last thing on their
minds.

~~~
yellowapple
The problem is that modern society - and the current size of the world's
population - is dependent on a lot of programmable devices. For example:
agriculture, where tractors and other farm machines nowadays have ECUs/ECMs
(let alone even more programmable bits and pieces). Same for the vehicles used
to actually transport food from farms to the rest of the world. There are
plenty of other examples, too, like medical devices and water extraction and
heating/cooling and other things that are nowadays the difference between life
and death for a lot of people.

Sure, we were able to make do a century or so ago, but not with 8 billion
people and counting. People _will die_ without some way to keep the various
microcontroller-driven systems up and running. It's a long shot that we'd be
able to adequately replace a microcontroller in a tractor ECM or a pacemaker
or an air conditioning system or a water pump, but a slim chance is better
than no chance at all, and the latter is exactly what we'll have unless we're
thinking about and testing out solutions _now_ , while we still have the
resources to easily do so.

~~~
mschaef
> the current size of the world's population - is dependent on a lot of
> programmable devices.

Not to mention the energy supply chain. If the supply chain required to make
electronics collapses, that probably also means the energy supply chain has
collapsed, or has at least been severely disrupted. That seems far more likely
to be damaging, and far more quickly, than a lack of ability to keep a
microcontroller running. If I don't have gas for my car, it doesn't really
matter if I can fix it when it breaks down. (And I run out of gas in a few
hundred miles, but repairs are required on the order of tens of thousands of
miles.)

This is really what I was trying to get at with my first comment. The problems
presented by a lack of ability to make new technology are the sorts of
problems that take months or years to become critical, but in a true collapse
setting, the issues that matter most would unfold in days or weeks.

(I feel like I should point out that I don't think any of this is particularly
likely.)

~~~
igammarays
Electricity generation does _not_ require a global supply chain. Modern
computer manufacturing does.

~~~
mschaef
True as far as it goes, but a couple comments:

* I was referring to the energy supply chain, not just electricity. Energy as a whole is very much a global supply chain. (And even more than that, it's very globally interconnected in terms of pricing, etc.)

* As a thought experiment, consider completely shutting down the computer manufacturing supply chain for two weeks. Then consider the same for the energy supply chain. Which of those has more immediate and profound impact?

Keep in mind that I'm not saying that either of these domains is unimportant.
Just that society would feel, and has felt, the importance of one a lot more
acutely
and a lot more suddenly.

~~~
yellowapple
I think the point of GP's comment, though, is that it's arguably
straightforward to bootstrap some degree of electricity generation without
there necessarily being a working energy supply chain (e.g. building one's own
dynamo with a magnet and some wire and hooking that dynamo to a windmill or
watermill or steam engine or other turbine, or salvaging bits and pieces of
broken solar cells to build a new one from almost-scratch; then it's just a
matter of building capacitors or batteries or flywheels or elevated weights or
whatever to store that electricity). Yes, it'll be absolutely painful (and
will offer nowhere near the energy production/distribution capability to which
we're accustomed as a society), but it's survivable.

It's also possible to bootstrap some degree of computing power without an
electronics supply chain, but it's also much easier to cannibalize from
existing devices (whereas for the current energy supply chain there are fewer
things to be cannibalized, besides perhaps electric motors to turn into
impromptu dynamos).

Realistically, both will probably go hand-in-hand: we'll use primitive,
cobbled-together generators to power primitive, cobbled-together computers;
which we'll use to control more sophisticated generators to power more
sophisticated computers (and the more sophisticated processes for
repairing/building those computers); and so on until we're eventually back to
where we started.

------
carapace
Give me a good slide rule and a manual of practical mathematics, eh?

As for scavenged parts, you're going to need a warehouse of manuals and
datasheets, eh?

Depending on the details of your post-apocalyptic scenario planning, simple
automation driven by relays or clockwork logic will be more likely than e.g.
scavenged microcontrollers.

I applaud the spirit of the project though: I don't want to live on Gilligan's
Island making everything out of coconuts and vines.

~~~
AlEinstein
> As for scavenged parts, you're going to need a warehouse of manuals and
> datasheets, eh?

You're right! As a thought experiment, let's say I download CollapseOS and
then switch off my internet.

I have in my house a normal complement of electronic devices. I have a
soldering iron, some wire etc. I assume if I start taking things apart I'll
find some Z80s. Those Z80s will be living on boards with clock chips and
memory etc. Where do I even start?

------
codeulike
Reminds me of Global Village Construction Set

[https://www.opensourceecology.org/gvcs/](https://www.opensourceecology.org/gvcs/)

 _The Global Village Construction Set (GVCS) is a modular, DIY, low-cost,
high-performance platform that allows for the easy fabrication of the 50
different Industrial Machines that it takes to build a small, sustainable
civilization_

------
unhammer
Long Tien Nguyen and Alan Kay's Cuneiform Tablets seem relevant:
[https://archive.org/details/tr2015004_cuneiform](https://archive.org/details/tr2015004_cuneiform)

------
aquabeagle
_But if someone has a hint about useful prior art, please let me know._

[http://fuzix.org/](http://fuzix.org/) \- lots of 8-bit targets, z80 included

[http://cowlark.com/cpmish/index.html](http://cowlark.com/cpmish/index.html)
\- has a vi-like editor, assembler, and is cp/m compatible so it can run lots
of old cp/m software like various compilers

------
cmrdporcupine
If I were to pick an 8-bit processor for a post-apocalyptic future, it'd be
the single chip version of the Fairchild F8, not a Z80.

It was designed to be extremely simple and reduced in scope to the minimum of
what a processor needed. It went into space. Radiation hardened versions were
made.

The original version had its functionality broken up into multiple chips. That
could allow for easier repairs.

I don't know how many transistors were in it, but I doubt it's more than the
Z80 or 6502.

The RCA 1802 is another one I'd consider. In fact, it will likely outlive the
human race entirely, as it's in the Voyager spacecrafts.

~~~
naasking
> It was designed to be extremely simple and reduced in scope to the minimum
> of what a processor needed. It went into space. Radiation hardened versions
> were made.

But you won't find them in calculators just lying around that you can
scavenge. Remember, the narrative driving this is post-economic/supply chain
apocalypse.

------
rkeene2
Relevant to this goal is Stage0 [0], which is an attempt to bootstrap a compiler
toolchain. It is still a work in progress but the most promising attempt I
have seen.

[0] [https://github.com/oriansj/stage0](https://github.com/oriansj/stage0)

------
guidoism
I love projects like these. One that has me fascinated is the idea of building
a computer that can last centuries. Can it be done?

\- Will the ICs last that long, can they?

\- How will it get electricity if the sockets and voltage standards change?

\- How do you make it durable to dropping, water, dust, etc?

\- What sort of writable storage can last that long without degrading?

\- How do you edit fonts as language changes over time?

\- What sort of libraries and documentation do you include?

\- Should you include some sort of Rosetta Stone for new users?

~~~
coryrc
I have some answers

1\. Yes; a 10°C reduction in temperature means a doubling of life. I've known
Pentiums to last 10 years at 60°C+; just running processors at 30°C instead is
80 years minimum. The main thing is to use leaded solder so you don't get
electromigration problems.

2\. Solar panels and batteries. Battery voltage is chemical and fixed by
physics; nickel-iron batteries can be rebuilt and last forever. Solar panels
can be oversized to provide enough energy even when they degrade over time
and/or the computer can just be used at a lower duty cycle.

3\. Make it big and hard to move in a sturdy box.

4\. Flash can last that long if it is periodically rewritten, kept cool, has
redundancy, and isn't updated often.
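The temperature rule of thumb in point 1 can be sketched in a few lines. This is a rough, Arrhenius-style approximation (the 10°C-per-doubling factor is a rule of thumb, not a datasheet value), and the helper function name is purely illustrative:

```python
# Rough Arrhenius-style estimate: component lifetime roughly doubles
# for every 10 degrees C reduction in operating temperature.
# The factor of 2 per 10 C is a rule of thumb, not a measured spec.

def estimated_life_years(base_life_years, base_temp_c, new_temp_c):
    """Scale a lifetime observed at base_temp_c to an estimate at new_temp_c."""
    return base_life_years * 2 ** ((base_temp_c - new_temp_c) / 10)

# A CPU known to last 10 years at 60 C, run at 30 C instead:
# three 10 C halvings of temperature -> 2**3 = 8x the lifetime.
print(estimated_life_years(10, 60, 30))  # -> 80.0
```
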

~~~
jacobush
5\. Default to a ROM which is made in such a way it is basically immortal.

------
roca
I think a useful adjunct to this sort of project would be a project that
describes a really useful general-purpose CPU that can actually run a lot of
advanced software but that's still as simple as possible --- and work out a
realistic path for bootstrapping its manufacturing. A stripped down 32-bit
RISCV for example only needs tens of thousands of gates but could run most
modern software.

In conjunction with that, it would be good to have an archive of useful
software and data in a durable format where access to that data can also be
bootstrapped. I'm not sure what that format would be...

------
adrianmonk
Neat idea, but I'm not seeing the window of usefulness for this.

If society collapses and recovers relatively quickly, we likely can coast for
10-20 years on the computers that have already been built. This would be what
I'd expect to happen with a point-in-time catastrophe that disrupts everything
but then ends and we can all set to work rebuilding everything. (Like a
massive economic collapse, huge meteor strike, nuclear winter, etc.) Even if
95% of computers become inoperable, there's a lot you can do with the
remaining 5%. Probably more than what you can do with new stuff you build.

Another scenario is that we recover really slowly. This would be due to some
kind of enduring factor that holds back humanity, like a really long-term
famine or global political instability that we somehow cannot reset. In that
case, what's the hurry to develop software that's ready to go? Maximizing
compute capability doesn't seem like it would be the thing that tips the
scales and allows society to get rolling again. For that you need to solve
whatever the root problem is.

TLDR, if we fall, maybe there is nothing holding us down, and we can bounce
back up relatively quickly, in which case we don't need this. Or, if there is
something holding us down, then it seems unlikely that computing is what we
need to solve that.

Maybe there are other scenarios that I haven't thought of, though. Or ways
that computing would help in the above scenarios.

------
alwaysanagenda
The digital version of prepper-style hoarding for the not-far-off-inevitable
apocalypse.

Love the idea to make it run on simple 8-bit CPUs that will be scavenged
Fallout-style, but seems to presume that no 'newer' technology would survive
and be functional.

Wonderful to see, nonetheless.

~~~
ocdtrekkie
His idea presumes not that you can't scavenge newer technology, but that we
can't replicate, repair, and manufacture more of it. Computers are built with
computers. Advanced computers were built with less advanced ones. Sure, we can
try to use some of what survived, but will we be able to rebuild the
manufacturing needed to make a modern processor with what we find that still
works
and the expertise we still have?

If we can't manufacture new smartphones, we need to have a baseline of
computer to develop new computers that can eventually develop computers that
can develop smartphones. Essentially he's proposing that if we lose our
societal ability to compute advanced things, that we be able to fall back to
the Z80 rather than the abacus.

Not sure I'm totally sold on it here, but it's an interesting topic to say the
least.

~~~
cestith
The standards for a clean room to manufacture a 1980s home computer CPU or
modern lower-tech microcontroller are far, far less stringent than the clean
rooms for a 7nm or 10nm part. The photolithography equipment you need is far
less precise and far more common. There are a lot more fabs cranking out IoT
device chips than Xeons.

------
tiborsaas
For anyone wondering why the Z80 chip, just read the FAQ:

[https://collapseos.org/why.html](https://collapseos.org/why.html)

"The z80 has 9000 transistors. 9000! Compared to the millions we have in any
modern CPU, that's nothing!"

~~~
AnimalMuppet
No, it's 9000. That's not nothing. You may be able to scavenge old ones for a
long time, but building new ones will be non-trivial.

~~~
yellowapple
Non-trivial is still better than outright impossible.

~~~
AnimalMuppet
Well, the first transistor was in 1947. The Z80 was first manufactured in
1976. That's 29 years of improvements in manufacturing technology before the
Z80 was manufacturable. So if we have to start over, the Z80 is sure better
than the Pentium (or at least, we can make it first), but it's still outright
impossible for quite a while.

~~~
yellowapple
It's possible to build CPUs with many times the Z80's transistor count by
hand¹, albeit with great effort and taking up a heck of a lot of space (and
probably nowhere near the speeds of a single-chip Z80).

¹ [http://megaprocessor.com/index.html](http://megaprocessor.com/index.html)

------
hybrids
Why z80 and not x86 or AArch, which are both more readily accessible today?
This whole idea reeks of someone trying to reconcile their love of old
computers with their poorly considered death-cult Malthusianism.

~~~
yellowapple
Z80 chips are way more common than even x86, by merit of them having
proliferated in embedded microcontrollers.

~~~
hybrids
By accessibility I didn't really mean "commonality in terms of numbers," I
mean that they are generally easier to configure and work with (on top of
being readily available). x86 PCs and ARM-powered mobile devices are very
plentiful, are fairly modular without requiring more sophisticated tools, and
avoid a high barrier of entry (i.e. deeper EE experience).

The author thinks that when their imagined Mad Max society comes to be,
they're going to be picking up a soldering iron against old Segas and TI-84s.
If for some reason you need to use computers in a developmental capacity
(since the author's OS has an assembler and an `ed` clone) in a "post-collapse
society," I don't think it would be that hard to find some discarded HP
desktop or laptop to work on.

~~~
yellowapple
> I don't think it would be that hard to find some discarded HP desktop or
> laptop to work on.

In the short term, you're probably right. Most modern desktops and laptops
will hopefully last a decade or two (maybe three).

In the medium term, even these will start to break down. One of the key points
of failure will be thermal paste; these modern CPUs run quite a bit hotter
than a Z80 or 8086 or what have you, and the thermal paste has a finite
lifetime (especially the cheaper stuff used in most mass-produced desktops and
laptops). Unless you've got a whole bunch of the stuff stocked up, or you're
able to set up an immersion cooling rig (with a coolant that's non-conductive
and non-corrosive), these PCs will eventually overheat and die. Flash memory
and hard drives both have similarly-finite lifetimes, too, so there goes the
vast majority of mass-produced storage media (thankfully flash memory
longevity is driven by use, so it should be possible to stockpile flash
media).

Older chips like the Z80 or 8080/8086 or 6502 tend to avoid the thermal paste
problem entirely (by not requiring any sort of heatsink at all), and have
simpler memory interfaces (which makes it easier to wire them up to
replacement memory, including potentially hand-wound core memory or hand-wired
SRAM in a worst-case scenario).

In the long term, even these scavenged Z80s will probably eventually wear out.
Hopefully by this time at least some degree of chip fabrication will have been
bootstrapped back into existence, in which case replacement Z80s and
8080s/8086s will most likely be possible much sooner than replacement 386s and
ARMs.

\----

EDIT:

> x86 PCs and ARM-powered mobile devices are very plentiful, are fairly
> modular without requiring more sophisticated tools, and avoid a high barrier
> of entry (i.e. deeper EE experience)

Possibly, from a certain point of view. Apples-to-apples, though, this is very
unlikely to be true. Z80-based computers tend to be electrically simpler (by a
pretty wide margin) than x86-based or ARM-based computers. There's a lot more
supporting circuitry between the CPU and memory/peripherals/etc., which means
more components that can fail (and be difficult to replace, especially given
the tighter electrical and latency tolerances of the average x86 or ARM
motherboard).

------
X6S1x6Okd1st
One way to gain traction (pre-collapse) might be to hold competitions about
getting it to run on challenging "salvaged" systems and demonstrating
impressive ways to copy it from one system to another.

------
bobloblaw45
A bit of a tangent but in the novel "The Windup Girl" they live in a post oil
world, which essentially ended up as more of a total societal restructuring that
pretty much resembled what we'd consider a collapse. Nations fell and in some
places companies took over. Cities collapsed as the population shrank and
technology shifted to focus more on bio engineering to make up for the loss of
all the mechanical/electrical technology that ran our world since powering it
all got a lot more expensive after the oil was gone.

In one part a high security government installation was described with
"ancient" PCs. They couldn't make new ones, so they kept whatever they could
running, and the narrator's mind was blown thinking about how much energy they
wasted.

I think one of the top priorities for a project like this should be making it
easy to implement, considering that practically everything you'd use nowadays
to get help getting it working won't exist. No websites or forums or anything
like that.

------
RantyDave
I've wondered about this a few times and always started by asking myself what
would be left after armageddon.

Android phones. Tens of millions of them. It _must_ be the most ubiquitous
computing platform by now...

------
cubedrone
I had been thinking about similar projects myself. I figure that experience
with the z80, 68000, and the 6502 would give someone a platform for hacking
for at least the next century. There are some dozens of 68000-like chips in a
single car. I/O is as simple as LEDs and toggle switches for the bare
necessities, such as bootstrapping other I/O options. Worked for the Altair
8800. From there one could implement morse-like momentary switch input. In
these (possibly far-fetched) scenarios, going back to things like ticker tape
and printers would make a decent amount of sense. Perhaps spools of wire could
be used as "tape" for programs and data, as wire recorders existed before
plastic tapes were available. I love seeing how home fabrication is
developing, with people making simple silicon in their garage, but there is
value to a basic tool chain that doesn't require as much sophistication and
supply chains. I truly hope we don't live to see such a world, as the
suffering would be immense. That said, I have no idea how complex supply
chains can be expected to persist without fossil fuels.

------
mostlysimilar
This is a fun idea.

I often think about hoarding a collection of software and media for an end of
the world scenario. Then another year goes by and the world is still here.

~~~
rnd0
No law against updating the software collection and the media ...that's what I
do.

------
GeorgeTirebiter
Why not a Rad Hard 8086 [https://www.renesas.com/in/en/products/space-harsh-
environme...](https://www.renesas.com/in/en/products/space-harsh-
environment/rad-hard-digital/rh-microprocessors-
peripherals/device/HS-80C86RH.html) or maybe a Rad Hard RCA CDP1802
[https://www.renesas.com/in/en/products/space-harsh-
environme...](https://www.renesas.com/in/en/products/space-harsh-
environment/harsh-environment/microprocessors-
peripherals/device/CDP1802A.html) ? Those might survive a hydrogen bomb if not
too close.

I've thought about this a little, and I think rebooting vacuum tube technology
from scratch would be easier. Not trivial, but possible. Once you get
reliable triodes, you're on your way.

------
nitoy69
They have _much_ too rosy an idea of the future. Nuclear war is actually the
only thing that will _save_ them! By that time countries will be their own
separate cyber-Fascist states, actually cooperating with each other to keep
their citizens, by that time implanted with chips in their brains and
connected to the central Net, in line. Oh, by the way, they won't be forced to
have the chips implanted, they'll willingly line up for it! So they can get
their Dominos Pizza, and Amazon deliveries, and Google Map directions all with
a thought! Or so they'll be told. Besides, the asymmetry in computing power
between the rulers and the enslaved will be laughable. The government will
have even (more) powerful quantum based machines. Cobbled together 48K Z80
machines will be insignificant. If they're even tolerated. Which they won't
be...

------
insamsu
Everyone is comparing it with the Seed Vault[0]. It would be better to have a
complete ecosystem for computers, but this is a nice forward step: saving the
technology in a small computer gives a basic idea of how to proceed further. If
you start storing everything, you need lots of space underground, which seems
infeasible.

In my opinion, there should be a system in which all the blueprints for the
technology are saved. That machine should be self-sufficient, running on its
own power and memory, and capable enough to educate, or at least give a basic
idea of the structure, so that after the collapse anyone lucky enough to get
this technology can improve it and build a new system.

I like the idea of Collapse OS; in a similar manner, one could create a machine
which can run any software/OS, or at least support the most basic and common
operations.

Same goes with the books as well.

~Nauman

------
jolmg
Why Z80? Is it like the 2nd most common processor type or something? Where
would one find a Z80 when scavenging?

~~~
SwellJoe
It's extremely common, even today. It's probably not the processor I would
pick (the 68000 family, perhaps?), but it's a reasonable choice.
You're going to be able to find it embedded in literally millions of devices
and it's simple enough for one person to build a computer around it.

I suspect the argument against modern Intel chips is just their complexity.
They need an incredibly complicated and somewhat fragile support
infrastructure...you can't build a modern PC motherboard in your garage and
you don't expect modern PCs to last decades. They're very common, though, and
I suspect there will be plenty of PCs to scavenge, at least through our
lifetimes. But, the next generation will probably have trouble keeping them
going...I've got a 40 year old C64 still running with nearly all original
parts, but I am nearly 100% certain my modern laptop will not last even a
decade without repairs using parts that can't be manufactured without modern
infrastructure.

~~~
jolmg
> I suspect the argument against modern Intel chips is just their complexity.

Well that, and the fact that we already have plenty of OSes to run on
x86(-64).

Looking at arch/ in linux's source:

    
    
      alpha  avr32     frv      Kconfig  microblaze  openrisc  score  um
      arc    blackfin  h8300    m32r     mips        parisc    sh     unicore32
      arm    c6x       hexagon  m68k     mn10300     powerpc   sparc  x86
      arm64  cris      ia64     metag    nios2       s390      tile   xtensa
    

I'm surprised that it doesn't have support for Z80 if it's so common.

I'm also surprised that I can't see mention of Z80 in GCC's documentation.

~~~
skykooler
The Z80 can't run Linux due to the lack of an MMU. (Though it's possible to
get around that through emulation: see
[http://dmitry.gr/?r=05.Projects&proj=07.%20Linux%20on%208bit](http://dmitry.gr/?r=05.Projects&proj=07.%20Linux%20on%208bit)
for example.)

~~~
LargoLasskhyfv
There is Fuzix.

[1] [http://www.fuzix.org/](http://www.fuzix.org/)

And Symbos.

[2] [http://www.symbos.de/](http://www.symbos.de/)

I wonder if he knows about them?

------
laudens
Do you know that the situation you described, Mr Virgil Dupras, may be more
possible than you think by 2022? You were thinking about informatics and not
about events on the planet. Around that time (2022), the whole world could
face a negative situation where a solar storm arrives on the planet and
destroys much electrical/electronic equipment. It is called a Carrington event
and happens roughly every 150 years... Please ask NASA, as they know it well.
The US government already started research on how to mitigate it back in 2004.

------
java-man
Is there a rad-hardened version of z80?

~~~
skykooler
I have found many people saying that a rad-hardened Z80 with ferrite-core
memory was used on the Space Shuttle, though I can't find an authoritative
source to back that up.

~~~
yellowapple
The Shuttle did indeed use core memory for a while, but in what was basically a
repackaged and rad-hardened System/360 (so not Z80-based AFAICT):
[https://en.wikipedia.org/wiki/IBM_System/4_Pi](https://en.wikipedia.org/wiki/IBM_System/4_Pi)

------
neurobashing
The manifesto/winter-is-coming bits and overall design reads to me like
TempleOS without the mental illness. E.g., a shell that "compiles", no protected
memory, etc.

~~~
dusted
> without the mental illness.

"But if the collapse magnitude is right, then this project will change the
course of our history, which makes it worth trying."

Let's say the jury is still out on that one? :D

------
LargoLasskhyfv
Seems strange to go for something like the Z80 from the 8-bit home computer
age, when the goal is simple manufacturing. It should go with the PDP-11 line
instead, which had a long history spanning different technologies, going from
large and simple to manufacture to more integrated and miniaturized, faster,
with more memory, and with a standardized set of peripherals and instruction
set architecture.

------
dusted
I'll choose TempleOS as my go-to "paranoid schizophrenic philosophy" daily
driver over CollapseOS any day, and I'm an atheist.

------
akhilcacharya
I really like this idea!

Makes me wonder what possibilities become, er, possible if we up the computing
power a few orders of magnitude to a Pi Zero W or Pi 4.

From what I understand it’s fairly easy to use a Pi as an LTE Router for
longer ranges and WiFi for shorter ranges. I wonder whether, if the right
microSD cards were stockpiled, one would be able to reconnect several
communities in a mesh.

------
harperlee
To me a much more interesting avenue to be able to “bootstrap the pc era”
would be to achieve enough DIY knowledge to recreate TTL in a garage.
Afterwards you can at least create a computer from the '80s, as Ben Eater is
doing on his YouTube channel.

------
notreall1238123
[https://en.wikipedia.org/wiki/Apocalyptic_and_post-
apocalypt...](https://en.wikipedia.org/wiki/Apocalyptic_and_post-
apocalyptic_fiction#Failure_of_modern_technology)

------
Ericson2314
Eh, our current ecosystem is such a mess I wouldn't miss it. I worked on cross
stuff for Nixpkgs in part due to a more optimistic take: Let's bootstrap all
the good stuff _en masse_ onto freer hardware.

------
kotongo
Maybe a system based on ARM or MIPS @ 1GHz, very common in modem-routers. A
Z80 @ 8MHz cannot display even a 640x480 image. It would be wonderful to
recycle modems as stand-alone computers.

------
oneepic
This project is really out of left field, especially as someone who doesn't
tend to wear tinfoil hats. I wouldn't mind if it was marketed more as some
kind of lightweight OS though.

------
brian_herman__
Reminds me of the anime series:
[https://en.wikipedia.org/wiki/Dr._Stone](https://en.wikipedia.org/wiki/Dr._Stone)

~~~
rafaelvasco
Pretty good series btw.

------
akozak
What if computers caused the collapse though? Doesn't this risk reseeding the
problem? Maybe it'd be better in the long run if we have to reinvent.

------
tov_objorkin
In Herbert's Dune, civilization collapsed because of smart machines, so after
the renaissance they were completely banned from future use.

------
voldacar
Could this run on TI calculators? They use the Z80 but I don't see them
mentioned. Looks like a cool project!

------
wailupe2k
Let's think about what technologies are most likely to be around after an
apocalyptic event... Probably something that everyone has, that there are
millions of... something with a screen and a keyboard... very light and easy
to transport... maybe even something with a battery, and signal processing
chips... nah, screw it, let's just use a desktop. :P

~~~
BeefySwain
To be fair, the scope of a project that would require the ability to jailbreak
/ root arbitrary mobile devices running arbitrary OS versions would be
massive, much less creating an OS that would run on all of them.

Smart phones are a lot of things, but general purpose computers are not one of
them.

~~~
wailupe2k
[https://www.kingoapp.com/android-
root/devices.htm](https://www.kingoapp.com/android-root/devices.htm) , I think
there will be a few orders of magnitude more devices lying around that are
easily jailbroken than Z80s.

------
bovermyer
A fascinating thought experiment.

As the author points out, probably useless, but still fascinating.

------
gtirloni
I'm interested in the reasons the author sees for the ~2030 collapse.

------
elchief
Should come with a bicycle to generate power

------
flipgimble
Related to the post, I highly recommend reading "The Knowledge: How to Rebuild
Our World from Scratch" [1] if you are interested in foundational technology
that underpins our civilization. This type of information is often forgotten
or taken for granted in our highly dependent late-capitalism society.

[1]
[https://en.wikipedia.org/wiki/The_Knowledge:_How_to_Rebuild_...](https://en.wikipedia.org/wiki/The_Knowledge:_How_to_Rebuild_Our_World_from_Scratch)

------
joombaga
Does this work for the T80 FPGA core?

------
unixhero
This collapse thing. It's real?

------
hyperion2010
Like other in this thread I have been thinking about this problem on and off
for a while. I think that many of the comments stating that the Z80 is
probably not the best choice are right (I know nothing about the Z80) and
would like to extend some of their thinking.

The primary design requirements for a standalone computer system in a post-*
world are simplicity, maintainability, and debuggability. It must be possible
for a single user to do _everything_ in situ. There are very few existing
systems that meet all three of these criteria across the whole hardware-
firmware-software stack, and modern technology companies are actively moving
away from this.

At all levels this requires extensive and open documentation and
implementations, and ideally a real standard.

The hardware level would probably need a complete rethink. If you want good
peripheral support (e.g. to be able to read whatever data device you come
across), you need a solution that doesn't require a subsystem kernel
maintainer for everything, or you just give up on that. A potential 4th
requirement here is a large supply of parts, since in most scenarios it is
extremely unlikely that anyone will be able to get a fab working again for
hundreds or thousands of years. Maybe radiation-hardened, large-feature-size
ICs or something like that. The alternative would be a zillion RPis (with some
alternate data storage interface), in the hope that some of them survive and
continue to work after hundreds of years, but this seems like a much riskier
bet than actually engineering something to survive for a very long time.
Above the IC level, the ability to replace parts without special tooling
beyond maybe a soldering iron also seems important.

At the software level there are two existing systems that might serve: one of
the Smalltalks, or one of the Lisps (my bias says Common Lisp, despite the
warts). Assembly and C are just not a big enough lever for a single
individual, and other things like Java seem to have been intentionally
engineered to deprive individual users of power. The objective here is not to
be fast; the objective is to retain access to computation at all, so that the
knowledge of how to work with such systems is not lost. Also at the software
level, the requirements pretty much preclude things like browsers, which are
so monstrously complex that no individual could ever hope to maintain a
legacy artifact (or probably even compile one of the monsters) for
interpreting modern web documents.

I do not think that we can expect the current incentive structure around
software and hardware to accidentally create something that can meet these
requirements. If anything it is going in the other direction as large
corporations can employ technology that can _only_ be maintained by large
engineering teams. We are putting little computers in everything, but they are
useless to anyone in a world without a network.

~~~
LargoLasskhyfv
What about Setun, a Soviet ternary computer? See:

[1] [https://en.wikipedia.org/wiki/Setun](https://en.wikipedia.org/wiki/Setun)

[2]
[https://web.archive.org/web/20080207064711/http://sovietcomp...](https://web.archive.org/web/20080207064711/http://sovietcomputing.com/node/47)

[3] [http://www.computer-museum.ru/english/setun.htm](http://www.computer-
museum.ru/english/setun.htm)

It is a stack machine, and it has something like FORTH, in which you can
implement anything else if you absolutely have to. Some have done that with
another stack-oriented system here:

[4]
[https://en.wikipedia.org/wiki/POP-11](https://en.wikipedia.org/wiki/POP-11)

[5]
[https://en.wikipedia.org/wiki/Poplog](https://en.wikipedia.org/wiki/Poplog)

[6]
[http://www.cs.bham.ac.uk/research/projects/poplog/freepoplog...](http://www.cs.bham.ac.uk/research/projects/poplog/freepoplog.html)

And then have some cybernetic monks preach the advantages of something like
TRON

[7]
[https://en.wikipedia.org/wiki/TRON_project](https://en.wikipedia.org/wiki/TRON_project)

applied to all of the above.

~~~
hyperion2010
Will have to look over these more carefully, but your post reminded me that
there is a potential 3rd candidate: FORTH. It seems like it might be just a
bit too bare-bones in some cases, but maybe with an accepted standard library
or something it could work.

------
monsonite
IMHO - the Z80 is probably not the optimum starting point. Its close cousin,
the 8080, traces its ISA back to the 8008, which implemented the instruction
set of the TTL-based processor in early Datapoint terminals; Intel took the
Datapoint logic design and ISA and integrated it into LSI silicon. Anything a
Z80 can do, an 8080 can do - albeit less efficiently.

With an 8080 equivalent running a serial character display terminal based on
an oscilloscope CRT (1940s RADAR tech) you have an input/output device.

This leaves the main job of processing to another cpu, which could be 16-bit
for arithmetic speed and efficiency. The late 70s, early 80s 8-bit machines
were only underpowered because they were doing all of the video output using
the same cpu. Separate computation from video generation and you get a much
faster system.

8-bit CPUs rarely needed an OS. They were really only capable of running a
single application at a time. All an operating system does is separate
hostile C code applications from each other. C is probably not the best
starting point to reboot society using 8-bit systems.

Forth, or some derivative might be better. Charles Moore's original 1968
listings for Forth on an IBM 1130 are available from here:
[https://github.com/ForthHub/discussion/issues/63](https://github.com/ForthHub/discussion/issues/63)
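As a rough illustration of why Forth is such a plausible bootstrap target (my
own sketch in Python, not Moore's 1968 code): the core of a Forth-like system
is nothing more than a data stack plus a dictionary of named words.

```python
# Minimal sketch of a Forth-style interpreter: a data stack and a
# dictionary mapping words to operations. Illustrative only -- this is
# not Charles Moore's design, just the bare execution model.
stack = []

words = {
    "+":    lambda: stack.append(stack.pop() + stack.pop()),
    "*":    lambda: stack.append(stack.pop() * stack.pop()),
    "dup":  lambda: stack.append(stack[-1]),
    "swap": lambda: stack.append(stack.pop(-2)),
    "drop": lambda: stack.pop(),
}

def interpret(source):
    """Execute a whitespace-separated sequence of numbers and words."""
    for token in source.split():
        if token in words:
            words[token]()
        else:
            stack.append(int(token))  # anything unknown is a literal

interpret("3 4 + dup *")  # computes (3 + 4) squared, leaving 49 on the stack
```

Everything else - defining new words, compiling, even an assembler - can be
layered on top of that loop, which is why a single individual can hold the
whole system in their head.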

Remember also that every mid-1970s microprocessor generally relied on a
minicomputer (built from TTL) for its software and logic design. If you go
back 10 years (1965) to the PDP-8 minicomputer, these were built from diode-
transistor logic or DTL - made from discrete diodes, transistors, resistors
and capacitors. This sort of technology could possibly be rebooted more
easily for a post-apocalypse society.

The original 12 bit PDP-8 contained 10,148 diodes, 1409 transistors, 5615
resistors, and 1674 capacitors. See-
[https://www.pdp8.net/straight8/functional_restore.shtml](https://www.pdp8.net/straight8/functional_restore.shtml)

Scale these figures by 1.33 and you have the approximate requirements for a
16-bit architecture.
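As a back-of-the-envelope check, scaling the published straight-8 part counts
by 16/12 (assuming, roughly, that component count grows linearly with word
width):

```python
# Back-of-the-envelope estimate: scale the PDP-8's discrete part counts
# by 16/12 to approximate a 16-bit DTL machine. Assumes component count
# grows linearly with word width, which is only roughly true.
pdp8_parts = {"diodes": 10148, "transistors": 1409,
              "resistors": 5615, "capacitors": 1674}

scale = 16 / 12  # 12-bit word -> 16-bit word
estimate = {part: round(n * scale) for part, n in pdp8_parts.items()}
print(estimate)  # roughly 13,500 diodes, 1,900 transistors, etc.
```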

Whilst over 50 years old, the PDP-8 could run BASIC at speeds not too
dissimilar to the early 8-bit micros that appeared in 1976 - about 10 years
later.

It used a modular construction - and if you did find yourself with an excess
of diodes and transistors, the best approach might be to build a series of
logic modules - loosely based on the 7400 series, but using DTL for
simplicity. If you were to standardise on a footprint similar to a 40 pin DIP,
you could probably recreate about 8 NAND gates in such a device.

Some years ago I looked at the NAND to Tetris CPU, and worked out a bitslice
design based entirely on 2-input NANDs. Each bitslice needed 80 NANDs, so a
16-bit machine would need 1280 gates. Memory would be difficult, but something
could be implemented using shift registers. You could of course revert to
storing charge on a CRT screen - which formed the basis of the 32-word
(1 kilobit) Williams Tube memory of the Manchester Baby machine of 1948.
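Since a 2-input NAND is functionally complete, every other gate in such a
bitslice reduces to NANDs. A quick sketch of the standard constructions (gate
counts per the usual textbook reductions; the 80-NAND bitslice above is a
tally of exactly this kind of composition):

```python
# NAND is universal: NOT, AND, OR, XOR built from 2-input NANDs only.
# The per-gate NAND counts (1, 2, 3, 4) are the standard reductions.
def nand(a, b):  return 1 - (a & b)            # the one primitive
def not_(a):     return nand(a, a)             # 1 NAND
def and_(a, b):  return not_(nand(a, b))       # 2 NANDs
def or_(a, b):   return nand(not_(a), not_(b)) # 3 NANDs
def xor(a, b):                                 # 4 NANDs
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a, b):
    """Sum and carry bits: the seed of an adder bitslice."""
    return xor(a, b), and_(a, b)

# Truth-table check of XOR built purely from NANDs:
assert [xor(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]
```

Chain two half-adders plus an OR into a full adder, replicate it 16 times, and
the adder portion of the bitslice falls out; the rest of the 80 NANDs go to
the logic ops and multiplexing.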

Finally - never underestimate audio-frequency generation and storing signals
as audio tones - something that CPUs are good at. Possibly use a rotating
magnetic drum for storage.

In the summer of 1984, a friend and I, who both owned Sinclair ZX81s, set up
a one-way data link between one machine and the other across our college
dorms - using an FM transmitter bug and an FM radio receiver - over a
distance of 300 feet.

------
AltmousGadfly
Are Z80 systems and the like very common? Seems like a niche hobby.

I'm thinking old phones, tablets, and portable computers will be more common.
I keep several bootable USB drives which have lots of ebooks, audio books,
videos, software, and games along with several old laptops/netbooks which were
free. I also keep some of those files on microSD cards to make them accessible
with tablets.

IMO collapse will be very boring, so lots of books, audio files, video games,
and music would be nice to have, if they can be run off small off-grid solar
setups.

------
jaakl
Ok, but what user programs will we really need it for then, other than
rebuilding technology? Isn't "too much technology" basically the single
biggest root issue of the current state of humankind, and of the planet in
general?

