Hacker News

The problem is that shoving a full modern OS into an appliance means you'll somehow need to deal with that OS's many attack surfaces. Updating it from time to time is probably better than getting hacked, with recovery difficult or impossible?



From the perspective of a distro, I am sure that fixing bugs in the current version takes way less work than building the next version of the distro.

So I suggest it is worth building the next version 5% slower and using the freed-up 5% of resources to fix security issues in the last one.

And in doing so, extending security fixes for the last version to 50 years.


For any distro, “fixing bugs” that affect users will mean upstreaming patches to source in over 99% of cases. If you’re upstreaming patches, you’ll want to run the new version of upstream with those fixes included. Now you’re running new software and you’ve come full circle to a distro upgrade.

The alternative is to maintain your own patches, but that is not remotely sustainable for even the largest commercially supported distributions.

I sidestep all these concerns by running a rolling-release distro on my workstation and deploying serverless code at work. I don't miss the days of trying to get software downloaded off the web to work on a fixed Debian or Ubuntu install.

Edit: nor having my production language runtime constrained by distro!


Backporting and maintaining patchsets is a huge portion of the work of distro maintainers, possibly more so than developing each new version. Why do you think Arch (which very much has a "least effort for maintainers" philosophy) avoids it as much as possible while happily keeping up with the bleeding edge?


> So I suggest it is worth building the next version 5% slower and using the freed-up 5% of resources to fix security issues in the last one. And in doing so, extending security fixes for the last version to 50 years.

If we want to bring math into the conversation, then I don't think that would cut it, since supporting old releases is basically quadratic in human effort. It's why distros ship LTS releases at a different cadence, which makes the problem less bad. But the cost explodes if they do it for any longer than a few releases. It'd be fine if all the sticks in the mud could agree to use one specific old version, like RHEL5 and only RHEL5. But that's basically what projects like the BSDs are already doing, and they're awesome. I wish the people who want Linux to stop changing so rapidly were supporting the BSD folks instead, because they actually live up to preserving UNIX in a very authentic form.
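To make the math concrete, here is a toy back-of-the-envelope model (all numbers invented for illustration, not taken from any real distro): each live release branch needs its own copy of every security backport, so the per-year workload grows linearly with the support window, and the total work over the window grows quadratically.

```python
# Toy model of backport effort. Assumptions (mine, not the poster's):
# a distro releases every R years, supports each release for S years,
# and every security fix must be backported to each live branch.

def concurrent_branches(release_interval_years, support_years):
    # Number of release branches alive at any given moment.
    return support_years // release_interval_years

def backports_per_year(fixes_per_year, release_interval_years, support_years):
    # Each fix lands once upstream, then once per live branch.
    return fixes_per_year * concurrent_branches(
        release_interval_years, support_years
    )

# Debian-like cadence: a release every 2 years, 5 years of support.
print(backports_per_year(500, 2, 5))   # -> 1000 backports/year
# Same cadence, 50 years of support: 25 live branches at once.
print(backports_per_year(500, 2, 50))  # -> 12500 backports/year
```

Under these made-up numbers, stretching support from 5 to 50 years multiplies the yearly backport load by more than 12x, which a flat 5% of freed-up resources cannot cover.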


If we say "security updates only" for the kernel, does that mean we won't support new processors and graphics cards? When Intel and AMD say they have a patch that enables new processors and peripherals, do we decline it? Why?

People will expect the distribution to just work with newer hardware. Or maybe you're saying upgrade to debian 13 if/when you get new hardware but stay with 12 for fifty years on current hardware?

Intel and AMD should fire their marketing if a lot of people feel ok running the same hardware for fifty years, no?


>If we say "security updates only" for the kernel, does that mean we won't support new processors and graphics cards?

Why would you need support for new hardware if the use case is hardware that's going to sit for 50 years doing one thing and doing it well?


Just judging by your karma, you are not new to software development. So how do you imagine it working? Your installation is a point in time; sometimes refactorings need to be done, the library version that fixes the security hole depends on three other libraries in newer versions than your current ones, and off it goes. It's such a common pattern, and it's really hard to avoid, because you can't maintain 20 versions of a product for people who installed at different points in time.
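That cascade can be sketched in a few lines. All package names and versions below are invented for illustration: the patched libfoo 1.4 needs newer libbar and libbaz than the frozen install has, so a single security fix drags in a chain of upgrades.

```python
# Hypothetical dependency metadata (invented for illustration):
# (package, version) -> list of (dependency, required version)
requires = {
    ("libfoo", "1.4"): [("libbar", "2.1"), ("libbaz", "0.9")],
    ("libbar", "2.1"): [("libbaz", "0.9")],
    ("libbaz", "0.9"): [],
}

# What the frozen, point-in-time install currently has.
installed = {"libfoo": "1.3", "libbar": "2.0", "libbaz": "0.8"}

def upgrades_needed(pkg, ver, seen=None):
    """Walk the dependency closure of the fixed package and collect
    every package that must move past its installed version."""
    seen = set() if seen is None else seen
    if installed.get(pkg) != ver:
        seen.add((pkg, ver))
    for dep, dver in requires.get((pkg, ver), []):
        if (dep, dver) not in seen:
            upgrades_needed(dep, dver, seen)
    return seen

# Applying the one security fix forces all three packages forward.
print(sorted(upgrades_needed("libfoo", "1.4")))
# -> [('libbar', '2.1'), ('libbaz', '0.9'), ('libfoo', '1.4')]
```

The point of the sketch: "just backport the one fix" quietly assumes the fixed version is compatible with everything already installed, which it often isn't.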

I wholeheartedly agree with the sentiment, but it sounds like wishful thinking. If you maintain a simple library, do you backport a bugfix to every single previously released version? Because that's what would be required. To every single one of them.


1. People are building the next version of the distro anyway. Even if you want stable, you don't want '80s stable.

2. Is making a new release really more work? Most parts of a distro come from upstream, and so does their development. I don't think most tools have stable branches; you're just expected to upgrade. Backporting security fixes is then actually more work than simply updating and fixing breakage every now and then.



