In agriculture, we have the doomsday seed vault just for this purpose. If we anticipate a collapse of the current economic system or society, I think we should build a doomsday computer vault that keeps everything we need to rebuild the computing industry in an underground bunker. It would store everything in a controlled environment: 8080s, Z80s, m68ks, motherboards, logic chips, I/O controllers, ROM/RAM, generic electronic parts, soldering irons, oscilloscopes, logic analyzers, schematics, documentation, textbooks, and software. We would also keep some complete, standalone computer systems such as desktops and laptops, plus all the parts needed to service them. We would also need to preserve the old semiconductor production lines around the world, although probably not in the same bunker. Even if we fail to build better systems, 8080s are already useful enough!
Meanwhile, in peacetime, we need to form a team of experts to draw up a roadmap for re-bootstrapping computing technology using parts from the bunker, with a step-by-step plan that can be easily followed and executed.
Indeed. It takes a civilisation to build an iPhone.
I don't think people appreciate that even within the highest-tech manufacturing industries there is a lot of tacit knowledge. People shake their fists about "technology transfer" to China, and before that Japan; but it has taken those countries decades to reach parity. And that's with running, copyable examples and all the parts of an existing supply chain widely available. Similarly, the process for making a nuclear bomb can be written up in a short paper, but few countries have successfully replicated it.
"Post-collapse recovery" and "technology transfer" are the same problem, except that post-collapse recovery is cribbing from a dead example rather than a live one and in much worse circumstances.
Collapse recovery is a fun little competence fantasy to play out in your own head. Like "the rapture" for atheists. But within our lifetimes, we have to put in the work to avoid the collapse.
The point here is that we don't need to build an iPhone, we only need a radio. Building an 8080 is much simpler: the USSR did it, East Germany did it, China did it, all around the same time and without too much difficulty. It would certainly be much more difficult if the current civilization collapsed, but I think the author doesn't anticipate a total collapse, just a breakdown of the current economic system, so it should be doable.
> Similarly the process for making a nuclear bomb can be written in a short paper, but few countries have successfully replicated it.
My understanding is that the physics of achieving the nuclear explosion itself is relatively straightforward. The real difficulties are producing the weapons-grade materials needed, and turning the explosion into a useful weapon, all under external sanctions and even sabotage.
Taiwan had a nuclear weapons project in the 1970s, and significant progress was made early on. Had the U.S. not discovered it and dismantled everything, it would have been interesting to see how it turned out.
As the author themselves admits, this requires a very narrow band of Goldilocks catastrophe. Not catastrophic enough that you still have electricity, but catastrophic enough that all the plants on this list are destroyed or rendered unusable or embargoed from you? https://en.wikipedia.org/wiki/List_of_semiconductor_fabricat...
How is that going to work exactly?
No, it's a fantasy disaster, like Day of the Triffids and all the other John Wyndham style "cozy catastrophe" novels.
Real catastrophe is slow and grinding. Think "decline and fall of the Roman Empire" - a multi-lifetime process. Or something like Venezuela, where crumbling power infrastructure took out their aluminium smelter, causing it to freeze solid - unrecoverable without rebuilding the crucibles.
All the fabs on your list are fairly modern, and it is easy to see how they would be impossible to maintain in a Tainter-style collapse.
I would be looking into what you can do with ~1 um processes, which are currently accessible to amateurs. E.g. designs like https://news.ycombinator.com/item?id=11720289
Not sure about the Soviet Union, but East Germany had to "import" complete production lines from West Germany in order to bootstrap their chip production. Of course the West prohibited exporting such sensitive technology to the East, and there are some wild stories around involving the secret service: setting up a proxy chip manufacturing company in West Germany, shutting it down, and "losing" the production lines, which showed up a few months later in East Germany.
But there will be plenty of Z80 and other 8-bit chips to scavenge after a civilizational collapse, and those chips are simple and slow enough to be used in computers built from breadboards and wires.
Some of the stories of how Western machinery was procured are mentioned from time to time in articles like this one:
It seems to have been less "broad sweep style" than I described above, though; more that new products required specialized machines which were only produced by a handful of Western manufacturers, and which were under the CoCom embargo. The "grey-importing" of those machines went through various shadow companies in different countries to cover everything up.
"The vault managers are now taking precautions, including major work to waterproof the 100m-long tunnel into the mountain and digging trenches into the mountainside to channel meltwater and rain away. They have also removed electrical equipment from the tunnel that produced some heat and installed pumps in the vault itself in case of a future flood."
Sorry to ruin this for you, but it is already too late. We are already in the process of damage control.
"it's too late" is a dangerous statement because it suggests actions no longer matter. And if no action is taken, several climate models project >4 °C average warming. The majority of the planet will be become uninhabitable at the current human density levels, with billions of people having to relocate and/or die.
Like building dikes around low-lying areas rather than trying to coordinate a worldwide slowdown of economic activity, which faces the worst outcome of the prisoner's dilemma.
This website got posted to HN in the past, definitely something that should be part of the attempt.
While the focus is on the data, it also necessarily involves access to original hardware at times, which is additionally also often stored for curatorial purposes in its own right.
What you need is a super-deep immortal vault that isn't anywhere near any known fault lines, whose ingress point is sufficiently above ground, which can be accessed safely even if under water, and which supplies its own electricity for thousands of years.
Everything outside of the facility will have to be electrically neutral up to extreme voltages (due to the possibility of plasma storms arising that make the worst thunderstorms on Earth look like a nice day outside).
I kid, but the more secure a vault is the more irretrievable it is, which only makes it useful to aliens, not us.
The unanimous response was "not much". Mostly anomalously concentrated resources. So... if they are right, even though we have no reason to believe there was a developed civilization before us, it's not that crazy to merely entertain the possibility.
Not to imply those were from unnatural processes, more to indicate that basic spacefaring technology implies ability to detect such: https://en.wikipedia.org/wiki/Mass_concentration_(astronomy)
Controlled descent is not necessary if they are built to last.
Especially since the people who restart from a Z80 will hit technological walls as their skills and needs grow.
We faced the same in the '80s: going from 8 bits to 16, then 32, hitting memory addressing problems, being pushed to new and barely compatible architectures, having programs to rewrite...
Then why not leave behind a technology that is still simple, but that will spare them most of these issues?
I feel an 8-bit data bus and a 32-bit address bus would be a good way to make long-lived programs, free of edge cases when extending memory, while keeping the processors and mainboards not too complex. The address bus does not need to be fully wired in the first processor versions, so it can scale over time to more complex processors as skills and needs grow.
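To make the idea concrete, here's a small sketch (a hypothetical model, not any specific chip) of how a 32-bit address could be latched byte-by-byte over an 8-bit bus, with only the low address lines physically wired in early hardware revisions:

```python
# Hypothetical sketch: a 32-bit address latched one byte at a time over
# an 8-bit data bus. Early board revisions wire up only the low address
# lines; unwired high bits simply fall away, so old programs keep
# working unchanged on later, more fully wired revisions.

class AddressLatch:
    def __init__(self, wired_bits=20):
        # Only the low `wired_bits` address lines physically exist.
        self.wired_mask = (1 << wired_bits) - 1
        self.latch = 0

    def load_byte(self, position, byte):
        """Latch one byte of the 32-bit address (position 0 = lowest byte)."""
        shift = 8 * position
        self.latch = (self.latch & ~(0xFF << shift)) | ((byte & 0xFF) << shift)

    def effective_address(self):
        # Bits beyond the wired lines are ignored, as on a partial bus.
        return self.latch & self.wired_mask

bus = AddressLatch(wired_bits=20)                    # a 20-line first revision
for pos, b in enumerate([0x34, 0x12, 0xAB, 0xFF]):  # program asks for 0xFFAB1234
    bus.load_byte(pos, b)
print(hex(bus.effective_address()))                  # high bits fall away
```

The same program, run on a later revision with all 32 lines wired, would see the full address; the software-visible model never changes.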
Besides, it would be smart to leave in the vault a kind of FPGA technology, with the sources needed to flash the FPGA components. That way there is no need to create a production line for many different integrated components: only one output, and the components are specialized by flashing them.
Indeed even microprocessors can be flashed on an FPGA.
Well, just ideas...
A low-tech CPU that supports 32-bit code:
https://en.wikipedia.org/wiki/Motorola_68008 (8-bit data bus, 20 address lines, 32-bit programs, 70,000 transistors)
Microprocessor on an FPGA:
I disagree. If humanity collapses, and a new fledgling civilisation grows out of the ashes, then maybe we should let them follow their own path, and make their own discoveries and mistakes.
It's massive hubris to assume that future generations can only survive if they have access to our knowledge.
Many civilisations have fallen only to be replaced by newer, better ones, even when they had no previous records to refer to.
We're just a stepping stone along the path of human evolution, we are no greater or lesser than the stones before us, or the stones that come after us.
A collapse would be the perfect and likely sole opportunity to ditch the mistakes of the past and forge something completely new with the benefit of a "foresight" forged from hindsight.
It reminds me of an '80s cartoon where Spaniards, aliens, and three children chase after a golden city and destroy it out of greed when they find it.
Ryzen processors have 80MB of L3 cache! IIRC that may have been twice what was needed to run the colorized versions of NeXT.
A lot of "man the old days were all we needed" recollections of Windows and DOS forget how crappy those OSes were...
But NeXT? That was basically a modern OS, with a bit less anti-aliasing. X Window systems STILL look horrible compared to it.
It fits in L3 cache. 5-10 NeXT hard disks fit in RAM!
It had a Web 1.0 browser, TCP/IP networking, all the productivity apps. It had DOOM.
Sure Mach the microkernel was buggy and leaked memory, but kernels... there's lots of kernels now.
I think it would be a great project to start with NeXTSTEP and carefully build it back up with good security and multitasking.
It would be fun to run NeXTSTEP on bare metal with RAMdisks (spare a processor to do the writes to SSD), and compare with other systems, see if it feels ludicrously fast or not.
AROS is the next best OS to use as it has a low memory footprint and can run on X86 systems:
If you want Windows try ReactOS:
OS/2 try OSFree:
BeOS try HaikuOS:
All are low memory OSes.
Will try the other two, I guess. Not holding my hopes up.
Can you elaborate? It's been fairly stable for me.
P.S. It's just Haiku, not HaikuOS
Never heard of AROS...
ReactOS is bloated by definition, since it replicates Windows, isn't it? It probably uses less than mainline Windows, but in my opinion the 1990-1997 period preceded some of the real bloat that arrived with the massive Moore's law gains of the GHz races afterward.
OS/2 ... seems obscure to me but I never used it. But NeXT and Beos had revolutionary capabilities, I think OS/2 was basically just preemptive multitasking for Windows. Is that right?
BeOS, which I've never used, is probably also a good starting spot. HaikuOS probably has more oomph behind it community wise.
That is a very wrong assumption!
Though I suspect you're right; PC DOS is very powerful and lets you do a bunch of stuff.
You really want to go back 40 years to the dumb terminal era?
How long will 7nm chips work before electromigration destroys them?
The new Ryzen "only" has 64 MB L3 cache but it's split into 16 MB chunks per 4 cores. You can add in the L2 cache since it's exclusive (at the cost of more access fragmentation) to get 18 MB/72 MB depending on how you want to count it.
The new Epyc is the same design, just more cores so you get 256 MB. Still "only" 16 MB accessible per group of 4 cores though.
You are correct that one can't snap their fingers and create the community that even FreeBSD has. The last big corporate sponsor opportunity for this was the infant smartphones like the Palm Pre era (they owned Beos IP at that point I think) and early Android.
Sigh, that reminds me that BeOS should have been the foundation of OS X, if not for the exorbitant buyout price they were insisting on.
One of the questions I keep coming back to for such a scenario, and still haven't come up with a great answer for, is how does someone living in a world without the ability to manufacture a computer still have computers that work 100+ years after the last one was made? Even manufacturing transistors without modern methods is non-trivial. Will a Z80 last 100+ years? I mean, maybe, if it's kept dry and not exposed to anything corrosive. I've got a Commodore 64 that's ~40 years old and still works...so, 100 years seems reachable, but there have to be extenuating circumstances to get to that "post-tech" world (though I guess in a post apocalyptic world, the value of computers would be seen as minimal for a few years while survival is the only concern, so just forgetting could be enough).
But the book would be more about the consequences of this - do they eventually take the thing apart and jump-start a silicon chip revolution in the 1950s, or (more likely I think) does the government destroy the machine as the UK government did to the Bletchley machines after WWII, and because there's no ground-up computer theory does it set back computing for decades?
So now ask somebody really smart in that technology, like say Jay Forrester who had just finished inventing core memory, to analyze this magic beige plastic box. He could probably recognize that the PCB provided connectivity between parts, but what are the parts, these little flat plastic tiles? I don't think it would be possible to work out from first principles what the functional contents of a DRAM chip is, let alone the CPU. Even if they x-rayed it, supposing they had x-ray tech with enough resolution to resolve a chip, how could they figure out that those little blobs are transistors? Transistors hadn't been invented!
I think they'd have to concede this is "sufficiently advanced" tech, in Arthur Clarke's phrase, to be indistinguishable from magic.
The RAMs and ROMs would be fairly trivial to figure out, as well.
You might not learn the manufacturing process -- that really did take a couple decades of material science and physics advances. But the principles of the machine would be clear. And then you could take the knowledge of that and scale it to the electronics components available in the era. You'd definitely have a head start simply knowing that these things were _possible_, and getting a boost on knowing how a computer could be structured.
Given that knowledge they can try breaking those pieces of plastic apart to see that it's a housing over some sort of ceramic core. Using spectroscopy and chemistry you can figure out what that core is made out of. Now you know what mix of chemicals allows for really high density data storage and computation.
Using x-rays and microscopes they can figure out that the little ceramic die has some sort of structure etched on it. Maybe remove tiny pieces to see what different parts/layers are chemically composed of.
Now they know that there's something interesting about certain elements deposited on top of silicon using some sort of etching approach. Early transistor research was already well along (and had been patented already in the 20s) so it's likely they would have made the connection. Given all that you can start brute forcing industries and ideas around those materials.
You can see both copyright dates, and plenty of other English text. While in 1940 this would have represented incredible futuristic technology, it's pretty obviously made by humans and not a piece of alien magic. It also has components like resistors and capacitors with markings which would have been immediately obvious to 1940s electronics experts.
From Greer's point of view, the factors that make today's hardware brittle are not technical, but economic. Corporations have to make electronics at a profit, and at a price point that is accessible to the average working-class citizen. This business model would not be sustainable in either a fast-collapse or slow-collapse scenario.
Instead, in the novel, governments take over the tech industry sometime in the second half of the 21st century and treat it as a strategic resource in their struggle not to be left out in the global musical-chairs game of climate change plus resource depletion. They run it at a loss, and put the best minds they can spare to the task of making a computing infrastructure that is built to last.
By the 25th century, when the novel's events take place, humanity has lost the ability to manufacture electronics, but computers built 350 years earlier are kept in working order by a cadre of highly trained specialists (most of whom have the skills of a Geek Squad employee, but still). Common people have maybe heard some wildly inaccurate legend about robots or computers. Wealthy individuals are probably better informed but still cannot own one at any price. The only computers depicted or spoken about are US government property operated at US military facilities (or maybe there was one at the Library of Congress; I don't really recall).
There's one post-collapse hacker in the novel, a secondary character that is part of the protagonist's crew. The author is not an engineering type and dances around the actual skills of this guy, but I'd say he seems able to use a debugger/hex editor and read binaries. His greatest achievement, though, is to fix and set up an ancient printer and recover documents from a disk that was "erased" but not wiped clean.
... you find a controller
... you find their emulation raspberry pi.
... all of a sudden, the world isn't as desolate.
They basically walk you through assembling and programming a full CPU from nothing but NAND gates in a hardware description language, and in the second part even add a compiler and a higher-level language to the stack.
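The first step of that bootstrap is surprisingly small. Here's a toy sketch (in Python rather than an HDL, purely as an illustration) of how every other gate, and then a half adder, falls out of NAND alone:

```python
# Sketch of the nand2tetris starting point: derive every gate from NAND.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):     return nand(a, a)
def and_(a, b):  return not_(nand(a, b))
def or_(a, b):   return nand(not_(a), not_(b))
def xor_(a, b):  return and_(or_(a, b), nand(a, b))

# A 1-bit half adder -- the seed of an ALU -- follows immediately:
def half_adder(a, b):
    return xor_(a, b), and_(a, b)   # (sum, carry)

print(half_adder(1, 1))  # (0, 1)
```

From there it's chaining half adders into full adders, full adders into an ALU, and flip-flops into registers, exactly the ladder the course climbs.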
This, of course, presumes libraries are also mostly gone, since you don't need WikiPedia if you have a library.
The premise isn't that the folks with WikiPedia can rebuild modern society. It's that they literally can't (even if they had better knowledge resources), but would still have a survival advantage from having a little bit of the old knowledge. The fact is that if we lose our modern society, we'll never be able to build it up again. We've dug up all of the easily accessible resources, already. Scavenging from the good old days is the best any post-apocalyptic society can hope for, as bleak as that sounds.
¹ I vaguely recall Wikipedia already provides some way to download all pages in bulk, but I can't seem to find it (if it even exists anymore, or if it ever actually existed instead of me just hallucinating it)
I have libraries and WikiPedia is still pretty useful. Searchability and portability / mobility would be pretty valuable attributes in this type of scenario.
In general, the scenario is that the whole world has broken down, but is full of tech.
So many machines to get back to working. Machines beat muscle at scale.
Most robotics don't work in areas of heavy radiation, because radiation damages electronics.
Even chat is possible for example between two buildings where radio might not penetrate the walls. But sure, at that point if you can lay down cables, then it's simpler to just build a telephone.
There's a nice project page of one here, including an in-depth video about it here. There's a collection of other relay computers here.
I think we could also get started now. Not necessarily by de-escalating tech, but by recognizing that the supply of newer, more powerful chips might not last, even with a shift to more plentiful supplies of rare-earth metals driven by our need to get off fossil fuels fast. In the more immediate term, it might be useful to lock in the minimum set of features that make the web and Internet useful, then distribute that as widely as possible on low-power commodity platforms, with resilient networks that could survive superstorms knocking out huge swaths of devices in one fell swoop.
Low-power p2p protocols, mesh networking, recycling devices, component-platforms that allow scavenging and make-shift repairs, etc.
Until we can solve the green energy problem it might be nice to know that even if your community gets hit with a storm or flood, it's still possible to restore and maintain network services rapidly in the aftermath. Simply being able to send a message to someone would be a big deal.
The simple way would be radio Morse code.
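As a toy illustration of how little machinery Morse actually needs, here's a minimal encoder (the table is a subset of the standard ITU alphabet; the word separator convention is an assumption, though " / " is common practice):

```python
# Minimal Morse encoder -- a reminder of how small a fallback comms
# layer can be. Letter table is the standard ITU alphabet (letters only).

MORSE = {
    'A': '.-',   'B': '-...', 'C': '-.-.', 'D': '-..',  'E': '.',
    'F': '..-.', 'G': '--.',  'H': '....', 'I': '..',   'J': '.---',
    'K': '-.-',  'L': '.-..', 'M': '--',   'N': '-.',   'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.',  'S': '...',  'T': '-',
    'U': '..-',  'V': '...-', 'W': '.--',  'X': '-..-', 'Y': '-.--',
    'Z': '--..',
}

def encode(text):
    # Letters separated by spaces, words by " / " (a common convention).
    return ' / '.join(
        ' '.join(MORSE[c] for c in word)
        for word in text.upper().split()
    )

print(encode("SOS"))  # ... --- ...
```

Keying the output onto any carrier - a lamp, a buzzer, an unmodulated RF oscillator - is then a one-transistor problem.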
As web client tech stabilizes and telecom regulatory rollback continues, there may be an opportunity for localized solutions to be landed for all sorts of different purposes.
The whole reason the web works for business is that you can give something away to millions of people for ad revenue or sell something to very large groups for small amounts of money.
Localization reduces your customer base. The price a business has to charge would be higher. This is on top of the adoption problem.
The older gentleman who was polite enough to listen to me said, "It's ok guy, if that 401k doesn't exist, then neither will you".
And so I think I will not stockpile any computers for later. I do like the engineering spirit of this however.
There are a number of ways that can not be true without a large scale societal collapse. Fraud involving pension funds has happened many times in the past (Bernie Madoff, Robert Maxwell being two high profile examples). The last financial crisis brought a bit more attention to the topic of counterparty risk - the idea that your "safe" investment is only as safe as the institutions that are backing it in many cases. It's not necessarily a high priority concern but I think it's worth at least considering splitting your retirement savings across more than one account with different institutions.
There are also lots of conceivable larger scale crises with historical precedent (many in the 20th Century) that would render your retirement savings largely worthless without leaving you dead. In many of those you would have more pressing concerns than your 401K but it still seems like not a bad idea to have some physical things of value that you keep somewhere secure but accessible (cash, perhaps gold and/or silver).
These concerns are all likely fairly low probability but there's certainly a whole range of possible scenarios between "my retirement funds are completely secure" and "I'm dead in a global thermonuclear apocalypse".
If someday the US looks like Syria, your 401k will be worthless but you’ll probably survive.
The first link on CollapseOS’s announcement called “Winter is Coming” has it all:
Think about a supervolcano erupting, a large meteor strike, a Black Plague-style disease outbreak, a small or medium-sized nuclear war, climate-change-driven global food shortages, late-stage trade wars, and many more.
So, first of all, how are you supposed to download this thing onto your homebrew computer, given that internet will most likely be down?
"But if the collapse magnitude is right, then this project will change the course of our history, which makes it worth trying."
Mmmh, I think the author is a bit on the hyperbolic side here. I'm quite sure that anyone who can design and assemble a Z80 computer can comfortably code some basic utilities themselves just fine. Everyone else won't care a bit about your OS. Sorry if I sounded harsh, but I actually was.
Why plan for less than the raspberry pi level?
But the doomsday scenarios aside, this is super useful as an educational device. It can teach people what computers actually are and how they operate on the lowest possible level.
I've also got 2 desktops from 2003ish (Athlon64 and Pentium III), they probably work, too, although they're stored in a garage (along with some other stuff like VCRs and CRT displays).
Not to mention routers old and new, all running Linux.
Yeah, why plan for less? All I need to do is scavenge around my property :D
Though a garden, livestock and a greenhouse will be a much higher priority. No one needs any sort of computing when they can't eat.
Mechanical parts fail first. The keyboards on those laptops will be gone after a few years of use. The USB ports won't last much longer. How long will the thermal paste and internal fans last?
The point of the z80 is that they're cheap and relatively easy to build from scavenged parts.
And I'm also thinking more of a "repository of knowledge" use, not just simple controllers.
TI calculators, microcontrollers, and more:
> I think it's much easier to scavenge parts for PCs. Plus, laptops are way more fixable than you seem to think, with simple soldering. Fans can be kept going for years with grease.
Collapse OS is talking about timelines of a century or more. Non-mechanical computers are the only ones that will last that long without the supporting infrastructure.
Your reality does make better sense, but doesn't make a good story.
¹ "logic gates" not necessarily being transistor-based, either; one could take a cue from the guy building a 32-bit RISC-V machine with vacuum tubes: https://www.ludd.ltu.se/~ragge/vtc/
I'd guess that those 8500 transistors would be better used to build thousands of much simpler logic controllers to help automate infrastructure that's lost its computer control systems.
There are simpler "transistor computers" that might be more feasible to build from discrete components:
There are multiple ways to skin a cat, though (perhaps literally; this would be the post-apocalypse, after all!), and you're right that there are numerous ways to put transistors to use besides building full-blown CPUs. One of the key advantages of a general-purpose CPU is that it's general purpose and can be made to do all sorts of different things, but there are certainly plenty of cases where that ain't necessary and you'd need a fraction of that capability at most.
Still, they'll probably go hand-in-hand. "Chips" are just discrete components, whether crafted from a single chunk of silicon or itself built out of discrete components and treated as a single discrete unit. Building a whole general-purpose CPU from individual transistors is much easier when those transistors are already arranged on a little board you can plug into your bigger board. Chances are that no matter if someone's building a whole CPU or something more special-purpose and limited, that someone will be doing so in terms of already-assembled-and-composable gates rather than transistors directly, if only for the sake of one's own sanity.
says about 30,000 gates, which is 10,000 fewer than the Motorola 68000, and arguably faster.
at least it made us Amiga and Atari ST geeks envious.
Furthermore, there is the 'Microsequencer' technique, as described there:
and following up from there to, for example,
which is a 6502 softcore with microsequencing applied.
There are many options to choose from according to the available technology, tools & knowledge. One does not have to make an exact copy of something which made sense for arbitrary reasons, which don't necessarily apply when doing it from scratch under different circumstances.
The Z80's at 9,000 transistors (not sure how many gates, but almost certainly a fraction of that transistor count), so even the Acorn would be heavy in comparison. Still doable, though; just takes more time.
In terms of speed, it has less to do with transistor count and more about how close together you can get the transistors. Big, hand-wired CPUs tend to be slower than small single-chip ones just from the sheer latency differences between components.
> One does not have to make an exact copy of something which made sense for arbitrary reasons, which don't necessarily apply when doing it from scratch under different circumstances.
True, and I ain't saying one does. If we're at the point where we have to hand-wire replacements, though, it helps to have at least some degree of compatibility with the thing we're replacing. There are at least some schematics out there for building 8-bit CPUs from TTL chips¹², and I'd imagine those would all be viable candidates if we have to re-bootstrap our computational power and run out of other CPUs to tide us over in the meantime.
Ideally we should be working on CollapseOS equivalents/ports for as many CPUs as possible, so that we know that no matter what we're stuck with, there's always a way to repurpose it. Just as importantly, though, we should be hoarding copies of pinout/wiring diagrams, hardware manuals, etc. to make sure we have the knowhow on the hardware side, too.
¹ http://cpuville.com/Kits/8-bit-processor-kit.html - happens to be bus-compatible with the Z80, though not ISA-compatible as far as I can tell.
² http://mycpu.thtec.org/www-mycpu-eu/index1.htm - more "modern" features like Ethernet and VGA out, so a more likely candidate for general purpose computing if we really do run out of Z80s to scavenge
I’d hazard a guess that 8-bit machines played a part in the author’s young life - first computer, first job, happiest childhood summer, last computer they felt in control of before they got annoyingly complex - something like that. And therefore a collapse ending right when the author would have useful skills but things wouldn’t be too hard, is the most fun one to imagine.
Computing was around 40 years before the 1980s and electricity for a hundred years, but who wants to try and rebuild room sized punched card machines for ballistic trajectory calculations, get greasy fingers on mechanical parts, or deal with HT electrical power supplies safely, yawn, no fun there. rPi the same - by then everyone can do it and author isn’t special, so whatever. It’s not different enough from right now.
(SD cards seem like a good commodity to stockpile here, as he supports them, but they're likely incredibly hard to manufacture post-collapse.)
If it can be useful then, it can also be useful today to all the (poor) tinkerers around the world. There are lots of alternative eco-villages etc. trying to be self-sufficient, doing all kinds of recycling and improvised technology. If this gets adopted by those people, then it might be useful.
But if they cannot use it today, then I don't see how a broken-down survivor group could use it.
If I was tasked with bootstrapping a post-apocalyptic computer from junk, a hard copy of a well-commented Forth implementation would be a welcome assistance.
For those curious as to what a modern machine using Forth on bare metal as an operating system might feel like, check out Open Firmware: https://www.openfirmware.info/Open_Firmware
(If you have an OLPC sitting around in a closet somewhere from the Give-One-Get-One program years ago, you already have a serviceable and physically robust Forth machine ready to roll! Same deal for some older Powerbooks and Sun workstations.)
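For a sense of how small a bootstrap Forth can be, here's a toy sketch of the outer interpreter in Python (an illustration of the idea only: a stack, a word dictionary, and a loop; a real Forth adds compilation, a return stack, and memory access):

```python
# Toy Forth-style interpreter. The part worth keeping in hard copy is
# roughly this small: a data stack, a dictionary mapping words to
# actions, and a read-execute loop over whitespace-separated tokens.

def run(source, stack=None):
    stack = stack if stack is not None else []
    words = {
        '+':    lambda s: s.append(s.pop() + s.pop()),
        '*':    lambda s: s.append(s.pop() * s.pop()),
        '-':    lambda s: s.append(-s.pop() + s.pop()),   # b a -> b-a
        'DUP':  lambda s: s.append(s[-1]),
        'DROP': lambda s: s.pop(),
        'SWAP': lambda s: s.extend([s.pop(), s.pop()]),
        '.':    lambda s: print(s.pop()),                 # pop and print
    }
    for token in source.upper().split():
        if token in words:
            words[token](stack)
        else:
            stack.append(int(token))   # anything unrecognized is a number
    return stack

run("2 3 + 4 * .")   # prints 20
```

Adding `:` and `;` so users can define new words in terms of old ones is what turns this toy into an actual extensible Forth.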
A PDP-8 can be implemented in fewer transistors (original DEC wiring diagrams are on bitsavers, and github has source for several clones in Verilog), and DEC already shipped a moderately full software suite for it.
I mean it was built out of off the shelf cheaply (relatively) available TTL components and no real RF shielding (something variants in the US had to fix to comply with the FCC rules of the time).
It's astounding that it was a commercial success, but it cost 80 quid (as a kit; 100 pre-built) at a time when others were 3-4 to 10 times as expensive (the average wage back then was around 110 per week).
In a very real sense it democratised computers to something almost anyone working could afford if they wanted it.
I know that if it hadn't been for the ZX81/ZX Spectrum I wouldn't have had a career in software engineering, nor a lifelong love for computers. I was born in '80 to working-class parents in the north of England; even in 1987, having a computer was considered exotic among my cohort, and I didn't see another one outside my family until 1990 (a C64 I lusted after).
Climate change? It will cost trillions of dollars and billions of lives, but it will likely play out over the course of several decades. We will be stressing out about it, but it's not going to be an electronics-ending apocalypse.
Nuclear war? Please. The countries that have the capability are also level-headed enough to use them to play brinksmanship, despite what the news is telling us. These countries want deterrence, not to blow stuff up.
Disease? We're too widely distributed, and the most successful viruses are the ones that infect but do not kill. Ebola is scary, but it's too destructive for its own good, which makes it easy to contain. The most successful virus is the common cold, and possibly HIV, which is certainly a serious problem, but nobody's out there building shelters because of it.
Water/food supply? Fresh water is a function of energy, and if anything is a plus about climate change, it's that we're gonna have a lot of fresh water raining down on us as the Earth tries to compensate for higher temps.
Second-order effects from climate change will likely affect arable land, which is worrisome, but they may also open up new areas for growth and will likely play out over time, so I'm considering this more of a political problem.
The only things I can think of are either:
1) A sudden disappearance of the rare earth metals needed to make electronics, which would be massively inconvenient, but we'd figure out a way around it, either by it suddenly becoming more valuable to recycle old electronics or by not needing them in the first place. Besides, if this happens we'd just get extra motivated to start mining asteroids.
2) Celestial events like an asteroid strike or a coronal mass ejection hitting Earth in the wrong way. The first is mitigated with asteroid tracking, and we're getting better at that; the second would make for an interesting six months, but I'm pretty sure we'd get back on track pretty quickly.
I am all for technology that does not depend on a complex global supply chain; we will need to manufacture simple but sophisticated tech in space and on Mars in the future, but this prepper BS is just fantasy driven by a hyper-apocalyptic news cycle shlepping around clickbait.
What am I not worried about that I should be? What massively apocalyptic event is going to happen in 10 years to turn us back to the middle ages? Seriously.
Au contraire, it’s the belief that our system can continue like it’s doing that is the real hyperbole. Collapse is just baseline reality of civilizations.
- HISTORY: Collapse is a property of every civilization we've studied. These people were as smart as us, if not smarter, working with societies smaller and simpler than ours.
- ECONOMY: The way money is created and managed today is an ongoing experiment that almost ended in 2008, and we are still on uncharted ground. We can only keep servicing debt by increasing consumption in the following year, yet our debt keeps increasing through the ongoing devaluation of our currency, requiring ever more production and consumption. No one is planning on an end to this model of growth.
- TECH: Most of our infrastructure is built under the incentive of increased efficiency and profit, not long-term robustness since profit has to be sacrificed to plan for contingencies like price fluctuations in supply. Short term tech outcompetes the long term, easy. Strong but fragile. And then there’s the incentivized inefficiencies from economies of scale: one calorie of food now requires ten calories of energy from our system to produce.
- COMPLEXITY: “More is different.” As everything becomes interconnected, things become entrenched into dynamics that become increasingly difficult to control and even reason about. Rational decision-making must always be filtered by the interests of the current system, thus there is a loss in agency in what we can do (read: incentives), and we are stuck trying to find creative solutions that must accept the framework of what may be a harmful system, often just making that system more effectively harmful.
- ENVIRONMENT: Some call it the sixth mass extinction. Whatever it is, the biosphere is changing dramatically. Soil is in a weird zombie state kept alive by oil. The basic line is that the value of life is diminished through the lens of our economy, as dead resources. So our model will continue bringing the real world into consistency with that deadness.
- MYTHS: When we live in a civilization that sanctifies all forms of advancement and improvement and growth, there is no fertile soil for the acceptance of limitation. We only have the vocabulary to label it pessimist. Thus, optimism becomes co-opted for the aspirations of a mythical techno-utopia beyond all conceivable boundary.
How would you define "civilization?" Because sure, every civilization has an expiration date, but for current computing technology to be lost requires a worldwide civilizational collapse. Current global civilization is a decentralized collection of many civilizations which have all shared and replicated the knowledge of computing.
>our debt keeps increasing
Public and private debt are separate things. Public debt has generally seen a continuous march upwards. Private debt has been peaky, with no upward trend. Debts are fine when the debt is incurred for a purpose that has a sufficient return on investment. Public debts of sovereign currency issuers can always be repaid, and the yields on those bonds are whatever the currency issuer decides. And further debts shouldn't be judged as nonviable just because of the quantity of existing debt. Rather, the question at each point should be whether the investment is a good one.
> Soil is in a weird zombie state kept alive by oil
Soil is renewable, and can be made even with simple techniques. The terra preta soil of the Amazon rainforest was largely human-made, and thus the Amazon itself is largely a human construct. Creating it didn't require any oil.
>there is no fertile soil for the acceptance of limitation
Malthusian thinking has often been the default, and one of the most popular modes of thinking since the Enlightenment. The mid 20th century was full of best-selling Malthusian books by the Club of Rome, Paul Ehrlich, M. King Hubbert, and EF Schumacher. The entire fields of biology and ecology have been predicated on Malthusianism. Darwin was explicitly inspired by Malthus.
It has been to the great surprise of the intelligentsia of each successive generation that there hasn't been mass starvation. We've been able to do more and more with less and less. Any serious collapse hypothesis needs to factor in the history of losing bets on that side of the argument, and internalize why those predictions were wrong. It wasn't just luck every time.
This empirical optimism is also paradoxically irreverent toward the immutable attrition of complexity. Our creativity has limits; whatever they are, just pick something. At the risk of sounding flippant, 200 years of “creative patching” is historically too small a window to say we can continue subverting this “law” with eternal vigilance (I’ve heard this described as “we are running out of tricks”). Maybe I’m oversimplifying when I say we would have to approach the limit of absolute foresight to achieve this, but I think there’s some truth to it. For example, I like these explanations of our rational limits, with regard to managing a complex society:
- CHOMSKY: We have in our heads a certain set of possible intellectual structures. In the lucky event that some aspect of reality happens to have the character of one of these structures in our mind, then we have a science. And that doesn’t mean everything is ultimately going to fall within the domain of science. Quite the contrary… personally I believe that the nature of a decent society might fall outside the scope of possible human science.
- ZIZEK: Hegel says, the owl of Minerva only flies out in the dusk. [owl being the icon of wisdom] So philosophy can only grasp a social order when it’s already in its decay.
Particularly unsettling is our reaction to the blurriness of our creative boundaries—that we insist on walking blindly toward cliffs to find where they are. Optimism in uncertainty is great, but some projections cannot be certain until too late.
A final quote that might address your first points:
- OPHULS: Because our own civilization is global, its collapse will also be global, as well as uniquely devastating owing to the immensity of its population, complexity, and consumption. To avoid the common fate of all past civilizations will require a radical change in our ethos—to wit, the deliberate renunciation of greatness...
Anyway, this debate is covered in the book The Wizard and The Prophet. I think we can tell which schools we belong to.
You could buy a new EPYC server with solid state drives, grind it up and homogenize the whole thing in acid, and the resulting solution would have a smaller percentage of rare earth elements in it than the same mass of ordinary crustal rocks treated the same way.
Computers don't need rare earth elements. Nor do solar panels, nor do most wind turbines.
See for example the "Consumption" section in the USGS 2015 Minerals Yearbook Rare Earths report:
In descending order of global consumption volume, rare earth elements are consumed by the manufacture of catalysts, magnets, glass polishing media, and metal alloys. Everything else is just miscellaneous.
Near miss in July 2012:
Also, while I think a big solar flare would break a lot of stuff, I think we're better prepared for it than many give us credit for. Tens of millions of people might initially be without power; some fraction of them may need to wait a long time (months or even years) to get it back; and various bits of transport and production may get disrupted. Enough to require rationing and a major pain to quality of life, but not enough for any kind of catastrophic chain reaction, IMO.
Either way, I don't think that's a good premise on which to start this (or any) OS. There are much better arguments to be made for preferring a lightweight OS. Intellectual curiosity, for one.
When the product lifecycle changes from 1 year to 10+ years, you'll find that people will just keep their stuff around longer and the demand on the supply chain goes way down.
Plus, there will be a shitload of data centers with capacity that will no longer be necessary (because of reduced devices making requests, segregated internet, less connectivity) in apocalyptic scenarios. Those can probably be re-purposed.
We haven't had to get clever about computer conservation because there's been so much supply.
Also, "middle ages" are going to take a good century at least. Think instead of the collapse of Soviet Union (with some places playing the part of the Balkans / Caucasus), but worse...
Btw, rare earths are not so rare - it's just that the US got rid of this industry.
No large scale technology was lost. If anything, human civilization became more technologically advanced.
A charming novel about this: https://en.wikipedia.org/wiki/The_Black_Obelisk
Sure, we were able to make do a century or so ago, but not with 8 billion people and counting. People will die without some way to keep the various microcontroller-driven systems up and running. It's a long shot that we'd be able to adequately replace a microcontroller in a tractor ECM or a pacemaker or an air conditioning system or a water pump, but a slim chance is better than no chance at all, and the latter is exactly what we'll have unless we're thinking about and testing out solutions now, while we still have the resources to easily do so.
Not to mention the energy supply chain. If the supply chain required to make electronics collapses, that probably also means the energy supply chain has collapsed, or has at least been severely disrupted. That seems far more likely to be damaging, and far more quickly, than a lack of ability to keep a microcontroller running. If I don't have gas for my car, it doesn't really matter whether I can fix it when it breaks down. (And I run out of gas in a few hundred miles, whereas repairs are required on the order of tens of thousands of miles.)
This is really what I was trying to get at with my first comment. The problems presented by a lack of ability to make new technology are the sorts of problems that take months or years to become critical, but in a true collapse setting, the issues that matter most would unfold in days or weeks.
(I feel like I should point out that I don't think any of this is particularly likely.)
* I was referring to the energy supply chain, not just electricity. Energy as a whole is very much a global supply chain. (And even more than that, it's very globally interconnected in terms of pricing, etc.)
* As a thought experiment, consider completely shutting down the computer manufacturing supply chain for two weeks. Then consider the same for the energy supply chain. Which of those has the more immediate and profound impact?
Keep in mind that I'm not saying that either of these domains is unimportant. Just that society would and has felt the importance of one a lot more acutely and a lot more suddenly.
It's also possible to bootstrap some degree of computing power without an electronics supply chain, but it's also much easier to cannibalize from existing devices (whereas for the current energy supply chain there are fewer things to be cannibalized, besides perhaps electric motors to turn into impromptu dynamos).
Realistically, both will probably go hand-in-hand: we'll use primitive, cobbled-together generators to power primitive, cobbled-together computers; which we'll use to control more sophisticated generators to power more sophisticated computers (and the more sophisticated processes for repairing/building those computers); and so on until we're eventually back to where we started.
Not really, but the idea is sound. There's a hierarchy of control in electricity generation.
* At the bottom level you have microcontroller driven control loops sitting within the plants themselves. These operate on a sub-second timescale and do things like balance air/fuel/etc. flow through the plant to keep it safely running and stable.
* The lowest level loops take their setpoints and controls from a higher level set of controls that work at the level of the generating unit. Those work along the lines of 'generator 1 produce 200MW and ramp to 300MW over the next 3 hours.'
* Above that are control loops run by the grid operator that dispatch plants to match generation to load (and do so in a safe and economic way).
* Above that are (can be) a series of nested power markets ranging in duration from real time, daily, monthly, etc.
* Above that are (can be) long term capacity markets that help ensure there's enough capacity within a grid to serve future load needs.
(So there are a lot of things that might qualify as 'macrocontrollers'. :-) )
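The nested setpoint structure in the list above can be sketched roughly like this (a toy Python model; all names, numbers, and the proportional-dispatch rule are made up for illustration, not how any real grid operator's software works):

```python
# Grid-level loop: split total demand across units in proportion to
# their capacity, producing a setpoint for each unit.
def dispatch(demand_mw, capacities_mw):
    total = sum(capacities_mw)
    return [demand_mw * c / total for c in capacities_mw]

# Unit-level loop: each interval, move output toward the setpoint,
# limited by the unit's maximum ramp rate (the 'ramp to 300MW over
# 3 hours' behavior described above).
def ramp(output_mw, setpoint_mw, max_ramp_mw):
    delta = setpoint_mw - output_mw
    step = max(-max_ramp_mw, min(max_ramp_mw, delta))
    return output_mw + step

setpoints = dispatch(500, [200, 300])   # two units share 500 MW of load
out = 100.0                             # unit 1 starts at 100 MW
for _ in range(3):                      # three intervals at 50 MW/interval
    out = ramp(out, setpoints[0], 50)
print(setpoints, out)                   # unit 1 reaches its 200 MW setpoint
```

Below these two layers sit the sub-second microcontroller loops from the first bullet, which this sketch abstracts away entirely.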
As for scavenged parts, you're going to need a warehouse of manuals and datasheets, eh?
Depending on the details of your post-apocalyptic scenario planning, simple automation driven by relays or clockwork logic will be more likely than e.g. scavenged microcontrollers.
I applaud the spirit of the project though: I don't want to live on Gilligan's Island making everything out of coconuts and vines.
You're right! As a thought experiment, let's say I download CollapseOS and then switch off my internet.
I have in my house a normal complement of electronic devices. I have a soldering iron, some wire etc. I assume if I start taking things apart I'll find some Z80s. Those Z80s will be living on boards with clock chips and memory etc. Where do I even start?
The Global Village Construction Set (GVCS) is a modular, DIY, low-cost, high-performance platform that allows for the easy fabrication of the 50 different Industrial Machines that it takes to build a small, sustainable civilization
http://fuzix.org/ - lots of 8-bit targets, z80 included
http://cowlark.com/cpmish/index.html - has a vi-like editor, assembler, and is cp/m compatible so it can run lots of old cp/m software like various compilers
It was designed to be extremely simple and reduced in scope to the minimum of what a processor needed. It went into space. Radiation hardened versions were made.
The original version had its functionality broken up into multiple chips. That could allow for easier repairs.
I don't know how many transistors were in it, but I doubt it's more than the Z80 or 6502.
The RCA 1802 is another one I'd consider. In fact, it will likely outlive the human race entirely, as it's in the Voyager spacecrafts.
But you won't find them in calculators just lying around that you can scavenge. Remember, the narrative driving this is post-economic/supply chain apocalypse.
- Will the ICs last that long, can they?
- How will it get electricity if the sockets and voltage standards change?
- How do you make it durable to dropping, water, dust, etc?
- What sort of writable storage can last that long without degrading?
- How do you edit fonts as language changes over time?
- What sort of libraries and documentation do you include?
- Should you include some sort of Rosetta Stone for new users?
1. Yes; a 10C reduction in temperature means a doubling of life. I've known Pentiums to last 10 years at 60C+; just running processors at 30C instead gives 80 years minimum. The main thing is to use leaded solder so you don't get electromigration problems.
2. Solar panels and batteries. Battery voltage is chemical and fixed by physics; nickel-iron batteries can be rebuilt and last forever. Solar panels can be oversized to provide enough energy even when they degrade over time and/or the computer can just be used at a lower duty cycle.
3. Make it big and hard to move in a sturdy box.
4. Flash can last that long if it is periodically rewritten, kept cool, has redundancy, and isn't updated often.
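The lifetime estimate in point 1 is the usual rule of thumb that every 10C drop in operating temperature roughly doubles component life (a simplified Arrhenius-style approximation, not a datasheet figure; treat the numbers as a sketch):

```python
# Rule-of-thumb sketch: each 10 degC drop roughly doubles component life.
def lifetime_multiplier(t_hot_c, t_cool_c):
    return 2 ** ((t_hot_c - t_cool_c) / 10)

# A CPU known to last ~10 years at 60C, run at 30C instead:
years_at_60c = 10
multiplier = lifetime_multiplier(60, 30)   # 2**3 = 8
print(years_at_60c * multiplier)           # -> 80 years, matching point 1
```

Real failure rates depend on activation energy, voltage, and packaging, so the doubling-per-10C figure is only a first-order planning estimate.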
Would you like us to share notes on the book itself? I've practiced its teachings for the last 3 years or so, and I'd like to chat with someone about it.
Sorry if this comment is a bit irrelevant, but HN doesn't really have a DM system. /shrug
In conjunction with that, it would be good to have an archive of useful software and data in a durable format where access to that data can also be bootstrapped. I'm not sure what that format would be...
If society collapses and recovers relatively quickly, we likely can coast for 10-20 years on the computers that have already been built. This would be what I'd expect to happen with a point-in-time catastrophe that disrupts everything but then ends and we can all set to work to rebuilding everything. (Like a massive economic collapse, huge meteor strike, nuclear winter, etc.) Even if 95% of computers become inoperable, there's a lot you can do with the remaining 5%. Probably more than what you can do with new stuff you build.
Another scenario is that we recover really slowly. This would be due to some kind of enduring factor that holds back humanity, like a really long-term famine or global political instability that we somehow cannot reset. In that case, what's the hurry to develop software that's ready to go? Maximizing compute capability doesn't seem like it would be the thing that tips the scales and allows society to get rolling again. For that you need to solve whatever the root problem is.
TLDR, if we fall, maybe there is nothing holding us down, and we can bounce back up relatively quickly, in which case we don't need this. Or there is something holding us down, then it seems unlikely that computing is what we need to solve that.
Maybe there are other scenarios that I haven't thought of, though. Or ways that computing would help in the above scenarios.
"The z80 has 9000 transistors. 9000! Compared to the millions we have in any modern CPU, that's nothing!"
The author thinks that when their imagined Mad Max society comes to be, they're going to be picking up a soldering iron against old Segas and TI-84s. If for some reason you need to use computers in a developmental capacity (since the author's OS has an assembler and an `ed` clone) in a "post-collapse society," I don't think it would be that hard to find some discarded HP desktop or laptop to work on.
In the short term, you're probably right. Most modern desktops and laptops will hopefully last a decade or two (maybe three).
In the medium term, even these will start to break down. One of the key points of failure will be thermal paste; these modern CPUs run quite a bit hotter than a Z80 or 8086 or what have you, and the thermal paste has a finite lifetime (especially the cheaper stuff used in most mass-produced desktops and laptops). Unless you've got a whole bunch of the stuff stocked up, or you're able to set up an immersion cooling rig (with a coolant that's non-conductive and non-corrosive), these PCs will eventually overheat and die. Flash memory and hard drives both have similarly finite lifetimes, so there goes the vast majority of mass-produced storage media (thankfully flash memory longevity is driven by use, so it should be possible to stockpile flash media).
Older chips like the Z80 or 8080/8086 or 6502 tend to avoid the thermal paste problem entirely (by not requiring any sort of heatsink at all), and have simpler memory interfaces (which makes it easier to wire them up to replacement memory, including potentially hand-wound core memory or hand-wired SRAM in a worst-case scenario).
In the long term, even these scavenged Z80s will probably eventually wear out. Hopefully by this time at least some degree of chip fabrication will have been bootstrapped back into existence, in which case replacement Z80s and 8080s/8086s will most likely be possible much sooner than replacement 386s and ARMs.
> x86 PCs and ARM-powered mobile devices are very plentiful, are fairly modular without requiring more sophisticated tools, and avoid a high barrier of entry (i.e. deeper EE experience)
Possibly, from a certain point of view. Apples-to-apples, though, this is very unlikely to be true. Z80-based computers tend to be electrically simpler (by a pretty wide margin) than x86-based or ARM-based computers. There's a lot more supporting circuitry between the CPU and memory/peripherals/etc., which means more components that can fail (and be difficult to replace, especially given the tighter electrical and latency tolerances of the average x86 or ARM motherboard).
Android phones. Tens of millions of them. It must be the most ubiquitous computing platform by now...
I often think about hoarding a collection of software and media for an end of the world scenario. Then another year goes by and the world is still here.
In one part, a high-security government installation was described with "ancient" PCs. They couldn't make new ones, so they kept whatever they could running, and the narrator's mind was blown thinking about how much energy they wasted.
I think one of the top priorities for a project like this should be making it easy to implement, considering that practically everything you would use nowadays to get help getting it working won't exist. No websites or forums or anything like that.
I've thought about this a little, and I think rebooting vacuum tube technology from scratch is possible more easily. Not trivial, but possible. Once you get reliable triodes, you're on your way.
In my opinion, there should be a system in which all the blueprints for the technology are saved, and that machine should be self-sufficient: running on its own power and memory, and capable enough to educate, or at least give a basic idea of the structure, so that after the collapse anyone lucky enough to get hold of it can improve on it and build a new system.
I like the idea of Collapse OS; in a similar manner, we could create a machine which can run any software/OS, or at least supports the most basic and common operations.
The same goes for books as well.
I suspect the argument against modern Intel chips is just their complexity. They need an incredibly complicated and somewhat fragile support infrastructure...you can't build a modern PC motherboard in your garage and you don't expect modern PCs to last decades. They're very common, though, and I suspect there will be plenty of PCs to scavenge, at least through our lifetimes. But, the next generation will probably have trouble keeping them going...I've got a 40 year old C64 still running with nearly all original parts, but I am nearly 100% certain my modern laptop will not last even a decade without repairs using parts that can't be manufactured without modern infrastructure.
Well that, and the fact that we already have plenty of OSes to run on x86(-64).
Looking at arch/ in linux's source:
alpha avr32 frv Kconfig microblaze openrisc score um
arc blackfin h8300 m32r mips parisc sh unicore32
arm c6x hexagon m68k mn10300 powerpc sparc x86
arm64 cris ia64 metag nios2 s390 tile xtensa
I'm also surprised that I can't see mention of Z80 in GCC's documentation.
I'm somewhat surprised there's no Z80 support for GCC, I recall running a gcc for Motorola's 68HC11 which is a similar processor. That said, most general purpose C compilers are a bad fit for 8-bit processors; you really want to write assembly for these small systems to ensure your code is compact and fast; it's much too easy to write code in C that will be significantly slower than if well written in assembly because of limited memory indexing modes or lack of registers. It's probably possible to carefully constrain your C usage to encourage reasonable assembly output, but then you're not going to be able to use existing C code. You won't have that much code that fits in your roms anyway, so you may as well dig in and write assembly.
The onboard MMU was only introduced with the 68030, which is roughly analogous to and contemporaneous with the Intel 80386.
The 68040 added an onboard FPU, like the 80486. (And like the 486SX, the 68040EC had the onboard FPU removed again.)
That's because it's an 8-bit computer, my dude. Back in the old days, when "the internet" was still a military project and you'd phone up your local BBS at 2400 baud on a POTS line with suction cups, that's all the little people had access to, as recently as the early 80s. And as other people said, there are apparently many of them around, and they run on shit power supplies.
It's a cool idea, but obviously it requires both cheap and dirty hardware implementations and a paper manual. Pretty sure "hacking" will be low on the hierarchy of needs in the event of apocalypse. Also pretty sure something like CP/M would be more useful. I know where there are CAMAC crates with Z80/CPM punched card/paper tape readers that would probably do great in a post apocalyptic environment.
I wonder if he knows about them?
Apparently they discontinued those in 2004. Huh.
Maybe the m68k would be a better target? Actually, if the OP is serious about this project, I would write base kernels for Z80, m68k, armv7, and RV32i. The last one isn't widely available, but it has the advantage of being both a modern architecture and one with open specs for how to construct an implementation from scratch.