Eventually you run into issues with your stack being incompatible with the rest of the world.
Your 20-year-old Debian won't be able to provide a TLS stack that's compatible with any modern browser. And you could argue that this is still a "security backport", but backporting a new TLS version isn't like your garden-variety buffer-overflow backport.
The TLS treadmill + certificate authority lists are antithetical to stability.
HTTP + merkle-tree-based integrity checking + having the merkle root in DNS could be way more stable.
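Something like this minimal Python sketch is all the client side would need. (Everything here is made up for illustration: the chunk size, the binary tree shape, SHA-256 as the hash. The point is that the only moving part is one hash function, not a whole TLS stack.)

    import hashlib

    CHUNK = 64 * 1024  # hypothetical chunk size

    def merkle_root(chunks):
        # Binary hash tree: hash each leaf, then pairwise-hash up to the root.
        level = [hashlib.sha256(c).digest() for c in chunks]
        while len(level) > 1:
            if len(level) % 2:
                level.append(level[-1])  # duplicate the odd node out
            level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                     for i in range(0, len(level), 2)]
        return level[0]

    def verify(body: bytes, root_from_dns: bytes) -> bool:
        chunks = [body[i:i + CHUNK] for i in range(0, len(body), CHUNK)] or [b""]
        return merkle_root(chunks) == root_from_dns

A real scheme would want per-chunk proofs so you can verify incrementally while streaming, but that doesn't change the stability argument.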
Even the SHA-1 deprecation would barely affect this: as long as you're not using the hash tree to vouch for content provided by third parties, collision attacks don't matter. A collision requires the attacker to craft both inputs; forging content against an already-published root is a second-preimage attack, which is still hard even for SHA-1.
Ok, but once you have integrity, how do you open a private channel with the website? You do need TLS, or something akin to it, somewhere. And like all crypto stacks, holes get found and change still needs to happen.
What makes you think we'll still be using X25519 in 50 years? I'm pretty sure we'll have human-level AGI by then, and there's a decent chance that large orgs will have quantum computers too.
The ancient server box doesn't need to do that. Publishing the hashes to DNS can happen somewhere else. And the clients would have to trust their local resolver if they don't want to do the validation themselves.
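The publishing side could be as dumb as a TXT record that some other, up-to-date machine keeps current. A sketch of the client lookup, assuming a made-up _root label and dnspython as the resolver library:

    import dns.resolver  # pip install dnspython

    def fetch_root(domain: str) -> bytes:
        # Hypothetical convention: hex-encoded merkle root in a TXT record.
        answer = dns.resolver.resolve(f"_root.{domain}", "TXT")
        return bytes.fromhex(answer[0].strings[0].decode())

Trusting that answer then comes down to DNSSEC or to your local resolver, unless the client does the validation itself.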
Heh, the only thing I use HTTP for that's compatible with your model is browsing Wikipedia, but I've already got a local copy of Wikipedia on my phone, so it's hard for me to imagine why anyone would want to use your model.
> trust their local resolver
No thanks. There was a time when software would trust devices on the LAN. I'm not going back to the dark ages.