
I don't think that's necessarily the case, but it's a damn good point; I'd posit that many of the advances in modern systems could be retroactively applied to a much less sophisticated "base" system while still maintaining reliability, because they're born out of increasingly sophisticated programming practices that are widely established at this point (unit testing, fuzzing, even formal methods).

I'm no longer sold on "added functionality", though. I want my computers to be dumber, do less, and be fully owned; I want a 32-bit Z80 (or something like it) running at 400MHz on an FPGA, and a graphics subsystem that's primitive and easy to understand and work on. It won't play Crysis - fine. But DOOM? Sure! (There's a killer project implementing a multi-Z80-core CPU on an FPGA; I'll dig up the link later.)

It seems like we made a deal: we get more and more functionality, but it's set us up so that computing can be taken away from us (walled gardens, the modern internet, telemetry, subscription-only software, updates you don't control, etc.).

But you've brought up an interesting point about reliability and its effect on system complexity that I'll need to think about more deeply.



