
They're lost anyway. The web is on a path to deprecate and remove plain HTTP, and as its usage dwindles even further, to a level Google is comfortable with, they'll announce the end of plain HTTP in Chrome (likely in a tiered approach). We'll likely see warnings on insecure HTTP, followed by a full-page red warning at some point (similar to a misconfigured TLS cert), followed by refusing to connect to HTTP altogether.

This will absolutely happen by the end of this decade, and HTTP will be a distant memory.

If you care about HTTP, you need to ensure the contents of any HTTP-only sites are preserved in some capacity, because one day they will become inaccessible; even using old software will likely stop working at some point.




Doesn't make banning HTTP submissions reasonable by any stretch of the imagination.


You assume that all web content is designed for a browser and uses HTML. This is simply not true. There are very cool tools and tricks, like getting the weather in a terminal: just ‘curl wttr.in’ and bam, a weather report right in your terminal. There are other tools like ‘curl ifconfig.co’. It would make these tools a bit more cumbersome if you had to ‘curl https:// wttr.in’. Unless the maintainers of curl had it default to https.

Edit: had to add an erroneous space because HN was doing something weird with the https link
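As an aside, curl already has a knob for this: the --proto-default option picks the scheme used when a URL is given without one (worth double-checking against your curl version). A minimal sketch:

    # assume https:// for URLs given without a scheme
    curl --proto-default https wttr.in

    # or bake it into your shell session with an alias
    alias curl='curl --proto-default https'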


Or the site owner can set up an HTTP-to-HTTPS redirect. It's a win-win: curl happily works with that when you ask it to follow redirects, 'curl -L ...'.
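For example (site.example is just a placeholder, assuming the server answers plain HTTP with a 301 to the https URL):

    # follow the HTTP -> HTTPS redirect automatically
    curl -L http://site.example/

    # or just peek at the redirect response without following it
    curl -sI http://site.example/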


why would old software not work?


Try connecting to the internet using Windows XP: lots of bits and pieces are broken. Especially the default browser ;) Its TLS stack predates the protocol versions and ciphers most HTTPS sites now require, so much of the modern web simply won't load.


To be fair, though, the default browser in Windows XP was pretty broken even when it was new.



