The purpose of websites is to provide information and functionality to humans. It's up to publishers and developers to decide how to deliver their work in a way that is accessible to their chosen audience.
It's already a high bar for people to write content and build apps.
The higher you raise the bar, the less content and functionality will be available to all of us.
Also, many people who write content are not developers, and will simply publish their writing where it's easiest for them to do so.
Are they now required to audit all the publishing platforms available to them, to see whether or not their content will be allowed on HackerNews? How are they meant to learn how to do that?
And what about web apps that are not just text content, but rich applications and games that could never work via text alone?
Are they now not allowed to be viewed by anyone via HN?
Overall, this idea is not only impractical, it also reeks of a sense of entitlement.
We should show gratitude and encouragement to anyone who makes the effort to produce content or functionality for us to read and use, not punish people for failing to cater exactly to our narrow preferences.
It doesn't work on this article. The initially served HTML page contains no content -- which is more than a little ironic.
The linked post suggests banning all content from a domain/site once something that does not render in basic HTML is submitted, and keeping the ban in place until that site appeals the determination. I don't think that would be a good idea, because 1) it may hide content produced by non-HN users who would not otherwise know to appeal, and 2) it penalizes entire domains for bad content produced by, say, a single user posting to a multi-user site.
If folks do want to move forward with some kind of penalization of content, perhaps it would be better to agree, as a community practice, to vote this content down, or to have a per-submission flag that either adds a tag of shame ("[requires JS]") to the title or removes only that single article, rather than broad domain/site bans that remain permanent until appealed.
My suggested implementation is to institute site bans based on reports / awareness, and to leave those bans in effect until the problem can be verified to be fixed. That is: the system needn’t be perfect, but it should exist, bans should be instituted when requested, and sites themselves must take positive action to see them lifted.
The point is to make noncompliance painful.
If the main body content is feasibly viewable in a GUI and a console browser, that's sufficient.
In the case prompting this request, I was able to view the content successfully, with the additional steps of locating the Markdown source and re-rendering it as HTML via a pandoc pipeline:
pandoc --standalone -f markdown -t html -o - | \
/usr/bin/w3m -T text/html
I wouldn't consider that process sufficient, but the result certainly was.
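For reference, a minimal sketch of the full fetch-and-render pipeline, assuming the raw Markdown is reachable at some URL (the URL below is a placeholder, not the actual article):

# Fetch the raw Markdown (placeholder URL), convert it to HTML with pandoc,
# and read the result in a console browser.
curl -s https://example.com/post.md \
  | pandoc --standalone -f markdown -t html -o - \
  | /usr/bin/w3m -T text/html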
Collateral damage is unfortunate. Responsibility lies with the violator.
Post text: https://pastebin.com/raw/9K5h6c5V
I nominate joindiaspora.com as the first banned site.
You say that like it would be a bad thing!
Web 3.0 doesn't function without JS; Web 2.0 was merely enhanced by it.
Performance is better. Privacy is better. Battery life is much better. Usability is much, much better.
That is, strange things happen when you scroll: content moves around so that when you click on a link, you accidentally click on an ad instead ("ka-ching!").
I started complaining to advertisers, and AnandTech stopped doing this.
It's a key reason why I disable JS by default, and enable it as an exception.
... and will re-disable it if any such behaviours manifest.
b) couldn't you have found a better host for this?
On what type of connection? Does my packet loss matter? How many idle CPUs do I need?
No, HTTP/2 does not solve that.