Hacker News
Collapse OS (collapseos.org)
753 points by spraak 41 days ago | 303 comments



The digital version of prepper-style hoarding for the not-far-off, inevitable apocalypse.

Love the idea of making it run on simple 8-bit CPUs that will be scavenged Fallout-style, but it seems to presume that no 'newer' technology would survive and be functional.

Wonderful to see, nonetheless.


The idea presumes not that you can't scavenge newer technology, but that we can't replicate, repair, and manufacture more of it. Computers are built with computers. Advanced computers were built with less advanced ones. Sure, we can try to use some of what we find that survived, but will we be able to rebuild the manufacturing needed to make a modern processor with what still works and the expertise we still have?

If we can't manufacture new smartphones, we need a baseline computer from which to develop new computers that can eventually develop the computers that can develop smartphones. Essentially he's proposing that if we lose our societal ability to compute advanced things, we be able to fall back to the Z80 rather than the abacus.

Not sure I'm totally sold on it here, but it's an interesting topic to say the least.


The standards for a clean room to manufacture a 1980s home computer CPU or a modern lower-tech microcontroller are far, far less stringent than the clean rooms for a 7nm or 10nm part. The photolithography equipment you need is far less precise and far more common. There are a lot more fabs cranking out IoT device chips than Xeons.


Maybe a system based on the ARM or MIPS @ 1 GHz that are very common in modem-routers. A Z80 @ 8 MHz cannot display even a 640x480 image. It would be wonderful to recycle modems as stand-alone computers.


Is there a rad-hardened version of the Z80?


There allegedly was one, but I can't find any info on it. Given the relative simplicity of a Z80 compared to newer CPU designs, it should be relatively "easy" to harden it.

There was definitely a rad-hardened version of the 8085 (similar to the 8080, and therefore to the Z80), which was used on the Sojourner rover (among various other NASA and ESA spacecraft). Seems like RISC processors were more common for this, though (looks like most relatively-recent NASA spacecraft - including pretty much all of NASA's Mars landers after Sojourner - use(d) rad-hardened POWER CPUs, e.g. the RAD6000 and RAD750).


I have found many people saying that a rad-hardened Z80 with ferrite-core memory was used on the Space Shuttle, though I can't find an authoritative source to back that up.


The Shuttle did indeed use core memory for a while, but in what was basically a repackaged and rad-hardened System/360 (so not Z80-based AFAICT): https://en.wikipedia.org/wiki/IBM_System/4_Pi



Fairchild F8 was common as a rad-hardened 8-bit CPU in space and military tech. That's simpler than a Z80, and potentially more flexible.


Seems strange to go for something like the Z80 from the 8-bit home computer age when the goal is simple manufacturing. They should go with the PDP-11 line instead, which had a long history spanning different technologies, from large and simple to manufacture to more integrated and miniaturized, faster, and with more memory, all with a standardized set of peripherals and instruction set architecture.


I really like this idea!

Makes me wonder what possibilities become, er, possible if we up the computing power a few orders of magnitude to a Pi Zero W or Pi 4.

From what I understand it's fairly easy to use a Pi as an LTE router for longer ranges and WiFi for shorter ranges. I wonder, if the right microSD cards were stockpiled, whether one would be able to reconnect several communities in a mesh.


To me a much more interesting avenue to be able to "bootstrap the PC era" would be to acquire enough DIY knowledge to recreate TTL logic in a garage. Afterwards you can at least create a computer from the 80s, as Ben Eater is doing on his YouTube channel.



Eh, our current ecosystem is such a mess I wouldn't miss it. I worked on cross stuff for Nixpkgs in part due to a more optimistic take: Let's bootstrap all the good stuff en masse onto freer hardware.


This project is really out of left field, especially to someone who doesn't tend to wear tinfoil hats. I wouldn't mind if it were marketed more as some kind of lightweight OS, though.


Reminds me of the anime series: https://en.wikipedia.org/wiki/Dr._Stone


Pretty good series btw.


The manifesto/winter-is-coming bits and overall design read to me like TempleOS without the mental illness. E.g., a shell that "compiles", no protected memory, etc.


> without the mental illness.

"But if the collapse magnitude is right, then this project will change the course of our history, which makes it worth trying."

Let's say the jury is still out on that one? :D


I'll choose TempleOS as my go-to "paranoid schizophrenic philosophy" daily driver over CollapseOS any day, and I'm an atheist.



What if computers caused the collapse though? Doesn't this risk reseeding the problem? Maybe it'd be better in the long run if we have to reinvent.


In Herbert's Dune, civilization collapsed because of thinking machines. So after the renaissance they are completely banned from future use.


Could this run on TI calculators? They use the Z80 but I don't see them mentioned. Looks like a cool project!


Let's think about what technologies are most likely to be around after an apocalyptic event... Probably something that everyone has, that there are millions of... something with a screen and a keyboard... very light and easy to transport... maybe even something with a battery, and signal-processing chips... nah, screw it, let's just use a desktop. :P


To be fair, the scope of a project that would require the ability to jailbreak / root arbitrary mobile devices running arbitrary OS versions would be massive, much less creating an OS that would run on all of them.

Smartphones are a lot of things, but general-purpose computers are not one of them.


https://www.kingoapp.com/android-root/devices.htm , I think there will be a few orders of magnitude more devices lying around that are easily jailbroken than Z80s.


Z80 doesn't imply a desktop. See e.g. the Cambridge Z88. A Z80-based computer is also significantly easier to repair with rudimentary tools, and to create from scavenged parts, than an Android device. It can also be powered from pretty much anything, making for a great tool in an environment which is at risk of losing reliable widespread power distribution capabilities and which probably cannot create new li-ion batteries.


What on earth are you going to do with a pocket botnet? Without some kind of USB-OTG adapter to break it out into useful GPIO for interfacing with other machinery, it's going to be limited to playing Flappy Bird and failing to send telemetry over the now-destroyed mobile network.


A modern Android cell phone has a camera, screen, light, keyboard, speakers, battery, and a CPU that is many times faster than the Z80 counterpart, plus all the software to go with it. There are also various short-range communication and mesh networking abilities (NFC/Bluetooth/WiFi). It's not too difficult to create a WiFi router out of a phone. You can purchase many USB-to-GPIO boards cheaply online (as a prepper you're stockpiling stuff anyway; you're not going to just find a Z80 lying around either, and there are even adapters for ham radios you can pick up cheaply). I could easily see using a camera network made from cell phones for proximity alerting, mesh networking a few city blocks to create a communications network, and fitting basic note taking and information storage into a few phones' storage, in addition to having a fully functional Linux machine. I'd wager that if you take all the time it takes to create this project and compare its success to the time it takes to buy 30 GPIO adapters, the rootkit library for Android, and some cables, you'd have more success going with the phones. Also, just think about moving around... a backpack of cell phones isn't going to have the failure rate that you'll get with exposed electronics.


I agree on the hardware; we need to etch the instructions for powering up and rooting an Android phone into a stone tablet. Some of the phones are water resistant; a shame the batteries will likely kill most of them.


Something with a life expectancy of 3-5 years


A fascinating thought experiment.

As the author points out, probably useless, but still fascinating.


I'm interested in the reasons the author sees for the ~2030 collapse.


Should come with a bicycle to generate power


Related to the post, I highly recommend reading "The Knowledge: How to Rebuild Our World from Scratch" [1] if you are interested in foundational technology that underpins our civilization. This type of information is often forgotten or taken for granted in our highly dependent late-capitalism society.

[1] https://en.wikipedia.org/wiki/The_Knowledge:_How_to_Rebuild_...


This collapse thing. It's real?


Does this work for the T80 FPGA core?


Like others in this thread I have been thinking about this problem on and off for a while. I think that many of the comments stating that the Z80 is probably not the best choice are right (I know nothing about the Z80) and would like to extend some of their thinking.

The primary design requirements for a stand-alone computer system in a post-* world are simplicity, maintainability, and debuggability. It must be possible for a single user to do _everything_ in situ. There are very few existing systems that meet all three of these criteria across the whole hardware-firmware-software stack, and modern technology companies are actively moving away from this.

At all levels this requires extensive and open documentation and implementations, and ideally a real standard.

The hardware level would probably need a complete rethink, and if you want good peripheral support (e.g. to be able to try to access whatever data device you come across) then you need a solution that doesn't require a subsystem kernel maintainer for everything, or you just give up on that. A potential 4th requirement here could be a large supply of parts, since in most scenarios it is extremely unlikely that anyone will be able to get a fab working again for hundreds or thousands of years. Maybe radiation-hardened large-feature-size ICs or something like that. The alternative would be a zillion RPis (with some alternate data storage interface) so that hopefully some of them survive and continue to work after hundreds of years, but this seems like a much riskier bet than trying to actually engineer something to survive for a very long time. Above the IC level, the ability for someone to replace parts without special tooling beyond maybe a soldering iron also seems important.

At the software level there are two existing systems that might serve: one of the Smalltalks, or one of the Lisps (my bias says Common Lisp, despite the warts). Assembly and C are just not a big enough lever for a single individual, and other things like Java seem to have been intentionally engineered to deprive individual users of power. The objective here is not to be fast; the objective is to retain access to computation at all, so that the knowledge of how to work with such systems is not lost. Also at the software level, the requirements pretty much preclude things like browsers, which are so monstrously complex that there is no hope that an individual could ever maintain a legacy artifact (or probably even compile one of the monsters) for interpreting modern web documents.

I do not think that we can expect the current incentive structure around software and hardware to accidentally create something that can meet these requirements. If anything it is going in the other direction as large corporations can employ technology that can _only_ be maintained by large engineering teams. We are putting little computers in everything, but they are useless to anyone in a world without a network.


What about Setun, a Soviet ternary computer, as in

[1] https://en.wikipedia.org/wiki/Setun

[2] https://web.archive.org/web/20080207064711/http://sovietcomp...

[3] http://www.computer-museum.ru/english/setun.htm

It is a stack machine, and it has something like FORTH.

In which you can implement anything else, if you absolutely have to. Like some have done with another stack-oriented system here:

[4] https://en.wikipedia.org/wiki/POP-11

[5] https://en.wikipedia.org/wiki/Poplog

[6] http://www.cs.bham.ac.uk/research/projects/poplog/freepoplog...

And then have some cybernetic monks preach the advantages of something like TRON

[7] https://en.wikipedia.org/wiki/TRON_project

applied to all of the above.


Will have to look over these more carefully, but your post reminded me that there is a potential 3rd candidate, which is FORTH, though it seems like it might be just a bit too bare-bones in some cases. Maybe with an accepted standard library or something it could work.


IMHO - the Z80 is probably not the optimum starting point. Its close cousin, the 8080, traces back to a TTL CPU built into early Datapoint terminals. Intel took the Datapoint logic design and ISA and integrated it into LSI silicon. Anything a Z80 can do, an 8080 can do - albeit less efficiently.

With an 8080 equivalent running a serial character display terminal based on an oscilloscope CRT (1940s RADAR tech) you have an input/output device.

This leaves the main job of processing to another cpu, which could be 16-bit for arithmetic speed and efficiency. The late 70s, early 80s 8-bit machines were only underpowered because they were doing all of the video output using the same cpu. Separate computation from video generation and you get a much faster system.

8-bit CPUs rarely needed an OS. They were really only capable of running a single application at a time. All an operating system does is separate hostile C code applications from each other. C is probably not the best starting point to reboot society using 8-bit systems.

Forth, or some derivative might be better. Charles Moore's original 1968 listings for Forth on an IBM 1130 are available from here: https://github.com/ForthHub/discussion/issues/63
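To give a feel for how small that lever is, here is a hypothetical sketch in Python (not Moore's listing) of the essence of a Forth-style system: a data stack plus a dictionary of named words, with everything else bootstrapped on top.

```python
# Minimal Forth-style stack interpreter, a toy illustration only.
# The word set and behavior here are my own simplification, not
# Moore's 1968 implementation: a data stack, a dictionary of words,
# and a token loop -- that's the whole machine.

def make_forth():
    stack = []
    words = {
        "+":    lambda: stack.append(stack.pop() + stack.pop()),
        "*":    lambda: stack.append(stack.pop() * stack.pop()),
        "dup":  lambda: stack.append(stack[-1]),
        "swap": lambda: stack.__setitem__(slice(-2, None), stack[-2:][::-1]),
        "drop": lambda: stack.pop(),
    }

    def run(source):
        for token in source.split():
            if token in words:
                words[token]()          # execute a known word
            else:
                stack.append(int(token))  # anything else is a number
        return stack

    return run

run = make_forth()
print(run("2 3 + dup *"))  # (2 + 3) squared -> [25]
```

A real Forth adds word definitions (`: square dup * ;`) on top of exactly this loop, which is why the whole language fits in a few kilobytes.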

Remember also that every mid-1970s microprocessor generally relied on a minicomputer (built from TTL) for its software and logic design. If you go back 10 years (1965) to the PDP-8 minicomputer, these were built from diode-transistor logic or DTL - made from discrete diodes, transistors, resistors and capacitors. This sort of technology could possibly be re-booted more easily for post-apocalypse society.

The original 12 bit PDP-8 contained 10,148 diodes, 1409 transistors, 5615 resistors, and 1674 capacitors. See- https://www.pdp8.net/straight8/functional_restore.shtml

Scale these figures by 1.33 and you have the approximate requirements for a 16-bit architecture.
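Taking that 1.33 factor at face value, the scaled parts list works out roughly as follows. This is a back-of-the-envelope sketch only; control logic doesn't really scale linearly with word size, so treat the results as ballpark figures:

```python
# Scale the straight-8 PDP-8 component counts (from the restoration
# page cited above) from a 12-bit to a 16-bit datapath: 16/12 = 1.33.
# Rough estimate -- only the datapath scales with word width.

pdp8 = {"diodes": 10148, "transistors": 1409,
        "resistors": 5615, "capacitors": 1674}

scale = 16 / 12  # ~1.33

for part, count in pdp8.items():
    print(f"{part}: {count} -> ~{round(count * scale)}")
```

That lands around 13,500 diodes and 1,900 transistors for a 16-bit machine, still well within a determined workshop's reach.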

Whilst over 50 years old, the PDP-8 could run BASIC at speeds not too dissimilar to the early 8-bit micros that appeared in 1976 - about 10 years later.

It used a modular construction - and if you did find yourself with an excess of diodes and transistors, the best approach might be to build a series of logic modules - loosely based on the 7400 series, but using DTL for simplicity. If you were to standardise on a footprint similar to a 40 pin DIP, you could probably recreate about 8 NAND gates in such a device.

Some years ago I looked at the NAND to Tetris CPU, and worked out a bitslice design based entirely on 2-input NANDs. Each bitslice needed 80 NANDs, so a 16-bit machine would need 1280 gates. Memory would be difficult, but something could be implemented using shift registers. You could of course revert to storing charge on a CRT screen - which formed the basis of the memory (32 words of 32 bits) on the Manchester Baby machine of 1948 (Williams tube).
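As a toy illustration of that bottom-up NAND approach, here is a sketch of a 1-bit full adder built purely from 2-input NANDs, with a gate counter. The nine-gate adder is a standard construction, not part of the bitslice design above (a full 80-NAND slice covers an ALU and more, not just addition):

```python
# Build a 1-bit full adder from nine 2-input NAND gates and verify it
# exhaustively against ordinary arithmetic. The gate counter shows the
# cost of each evaluation.

gate_count = 0

def nand(a, b):
    global gate_count
    gate_count += 1
    return 0 if (a and b) else 1

def full_adder(a, b, cin):
    """Standard 9-NAND full adder; t1 is shared between sum and carry."""
    t1 = nand(a, b)
    t2 = nand(a, t1)
    t3 = nand(b, t1)
    s1 = nand(t2, t3)     # s1 = a XOR b
    t4 = nand(s1, cin)
    t5 = nand(s1, t4)
    t6 = nand(cin, t4)
    s = nand(t5, t6)      # s = s1 XOR cin
    cout = nand(t1, t4)   # carry = (a AND b) OR (s1 AND cin)
    return s, cout

# Exhaustive check over all 8 input combinations:
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            s, cout = full_adder(a, b, c)
            assert s + 2 * cout == a + b + c

print("NANDs per full adder:", gate_count // 8)  # 9
```

Chaining 16 of these gives a ripple-carry adder in 144 gates, which makes the 80-gates-per-slice figure for a whole ALU slice look plausible only with heavy sharing.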

Finally - never underestimate audio frequency generation, and storing signals as audio tones - something that cpus are good at. Possibly use a rotating magnetic drum for storage.
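The audio-tone idea can be sketched in a few lines. This is a simplified, hypothetical FSK encoder/decoder in the spirit of the Kansas City cassette standard (1200 Hz for a 0, 2400 Hz for a 1, 300 baud); the framing is stripped down to raw bits, with no start/stop bits:

```python
# Store bits as bursts of audio tone (FSK), then recover them by
# counting zero crossings per bit cell. Frequencies follow the Kansas
# City standard; the sample rate and framing are simplified choices.

import math

SAMPLE_RATE = 8000          # samples per second
BAUD = 300                  # bits per second -> 26 samples per bit
FREQ = {0: 1200, 1: 2400}   # tone per bit value
PHASE = 0.3                 # small offset so no sample is exactly zero

def encode(bits):
    """Return a list of float audio samples encoding the bits as FSK."""
    cell = SAMPLE_RATE // BAUD
    out = []
    for bit in bits:
        f = FREQ[bit]
        for n in range(cell):
            out.append(math.sin(2 * math.pi * f * n / SAMPLE_RATE + PHASE))
    return out

def decode_bit(samples):
    """Classify one bit cell by counting zero crossings (crude demod)."""
    crossings = sum(
        1 for i in range(1, len(samples))
        if (samples[i - 1] < 0) != (samples[i] < 0)
    )
    # 2400 Hz gives ~15 crossings per 26-sample cell, 1200 Hz ~7
    return 1 if crossings > 11 else 0

signal = encode([1, 0, 1, 1])
cell = SAMPLE_RATE // BAUD
print([decode_bit(signal[i * cell:(i + 1) * cell]) for i in range(4)])  # [1, 0, 1, 1]
```

The same scheme works over any audio channel: cassette tape, a magnetic drum, or the FM-bug link described below.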

In the summer of 1984, a friend and I, who both owned Sinclair ZX81s, set up a one-way data link between one machine and the other across our college dorms - using an FM transmitter bug and an FM radio receiver - over a distance of 300 feet.


Are Z80 systems and the like very common? Seems like a niche hobby.

I'm thinking old phones, tablets, and portable computers will be more common. I keep several bootable USB drives which have lots of ebooks, audio books, videos, software, and games along with several old laptops/netbooks which were free. I also keep some of those files on microSD cards to make them accessible with tablets.

IMO collapse will be very boring, so lots of books, audio files, video games, and music would be nice to have if they can be run off small off-grid solar setups.


[flagged]


We detached this flagged, flamewarish subthread from https://news.ycombinator.com/item?id=21185952.

Can you please follow the site guidelines when commenting here? They include: "Eschew flamebait. Don't introduce flamewar topics unless you have something genuinely new to say. Avoid unrelated controversies and generic tangents."

https://news.ycombinator.com/newsguidelines.html


Sorry. I genuinely thought I was stating a scientific fact.

https://grist.org/article/humans-cause-climate-change-do-we-...


This is absurdly wrong. How about instead of trying to get some of the most educated people on the internet (who historically already have low birth rates) to stop reproducing even further, you instead redirect your efforts towards passing more climate-friendly regulations for corporations. That is the biggest source of climate change, not a niche group of people having some extra kids.


Sorry. I genuinely thought I was stating a scientific fact.

https://grist.org/article/humans-cause-climate-change-do-we-...


If that's the price, so much the worse for the climate.


OK, but what user programs will we really need it for, other than rebuilding technology? Isn't "too much technology" basically the single biggest root issue of the current state of humankind, and the planet in general?



