[flagged] Please ban sites which fail to function using vanilla HTML (joindiaspora.com)
28 points by dredmorbius 9 days ago | 31 comments

How is this different from somebody demanding that Microsoft refuse to allow any app to run on Windows if it only works via a GUI and not also on the DOS CLI?

The purpose of websites is to provide information and functionality to humans. It's up to publishers and developers to decide how to deliver their work in a way that is accessible to their chosen audience.

It's already a high bar for people to write content and build apps.

The higher you raise the bar, the less content and functionality will be available to all of us.

Also, many people who write content are not developers, and will simply publish their writing where it's easiest for them to do so.

Are they now required to audit all the publishing platforms available to them, to see whether or not their content will be allowed on HackerNews? How are they meant to learn how to do that?

And what about web-apps that are not just text content, but rich apps and games that could never possibly work via text alone?

Are they now not allowed to be viewed by anyone via HN?

Overall, this idea is not only impractical, it also reeks of a sense of entitlement.

We should show gratitude and encouragement to anyone who makes the effort to produce content or functionality for us to read and use, not punish people for failing to cater exactly to our narrow preferences.


I have found myself using Firefox's Reader View for nearly every article I read these days. And if FF can't render a site or article in this way I usually skip it.

It doesn't work on this article. The initial served HTML page contains no content -- which is more than a little ironic.


Does this exist in Brave or Chrome?

It does in Chrome, but it's a little hidden away:

https://www.howtogeek.com/423643/how-to-use-google-chromes-h...


As a person who is often on a network that enforces very aggressive active-content blocking, I am very sympathetic to the idea of penalizing content that does not work well without JavaScript. That said, I do not know how such a policy could be reliably implemented.

The linked post suggests banning all content on a domain/site once content that does not render in basic HTML is submitted, and keeping the ban in effect until the site appeals the determination. I don't think that would be a good idea, because 1) it may hide content produced by non-HN users who would not otherwise know to appeal, and 2) it would penalize broad domains for bad content produced by, say, a single user posting to a multiuser site.

I also wonder what would count as not being usable in plain HTML. Sure, there are obvious examples like the article the post links to, which displays zero content, but what about all of the in-between cases? As an example, what if a site has textual content viewable as plain HTML but also very informative figures/demonstrations that require JavaScript to make some of the points in the piece? Would this content be removed if it were only partially viewable and meaningful information were lost?

If folks do want to move forward with some kind of penalization of content, perhaps it would be better to agree to vote down this content as a community practice, or to have a per-article flag that either adds a tag of shame [requires JS] to the title or removes only that single article, rather than broader domain/site bans that are permanent until appealed.

Maybe it's worth trying to force the change. But if enough people are able to view the content and like it enough to vote it up, perhaps we should leave it alone, accept the consequences of choosing not to run JavaScript, and let everybody else view content they find valuable.


From the article:

My suggested implementation is to institute site bans based on reports / awareness, and to leave those bans in effect until the problem can be verified to be fixed. That is: the system needn’t be perfect, but it should exist, bans should be instituted when requested, and sites themselves must take positive action to see them lifted.

(Author)

The point is to make noncompliance painful.

If the main body content is feasibly viewable in a GUI and a console browser, that's sufficient.

In the case prompting this request, I was able to view the content with the additional steps of locating the Markdown source and re-rendering it as HTML via a pandoc pipeline:

    pandoc --standalone -f markdown -t html -o - \
        https://raw.githubusercontent.com/anderspitman/anderspitman.net/master/entries/16/entry.md |
        /usr/bin/w3m -T text/html

In 0.3s, as it happens, for those concerned with efficiency.

I wouldn't consider that process sufficient, but the result certainly was.

Collateral damage is unfortunate. Responsibility lies with the violator.


Probably best to make it clear you are the author at some point.

I'm aware Diaspora itself doesn't render without JS. It's what was available to me; my apologies.

Post text: https://pastebin.com/raw/9K5h6c5V

I nominate joindiaspora.com as the first banned site.


I see what you are going for, and in an ideal world I would love for there to be some way to prevent those types of abuses. But the problem with this proposition is that it would effectively ban every modern SPA and take us back to web 1.0.

> it would effectively ban every modern SPA and take us back to web 1.0

You say that like it would be a bad thing!


yeah, like there was anything wrong with having 6 levels of nested tables and dozens of placeholder gifs all over the page...

Even if that ever was the norm, which it wasn't, it would still be better than what we have today.

Web 2.0*

Web 3.0 doesn't function without JS; web 2.0 was enhanced using JS.


I'd love to ban single-page apps, especially ones that break URLs by forcing all actual information into the #fragment.
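
For those who haven't run into it: everything after the # never leaves the browser, so the server, caches, and crawlers all see the same bare URL regardless of which "page" the SPA is showing. A quick way to see this for yourself (hypothetical URL; any host will do):

    # curl, like every HTTP client, strips the fragment before sending
    # the request, so the server sees a request for "/" no matter what.
    curl -sv 'https://example.com/#/articles/123' -o /dev/null 2>&1 | grep '^> GET'
    # > GET / HTTP/2   (or HTTP/1.1, depending on the server)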


If this were to happen, product teams would simply create more server-side fallbacks for their design patterns. All JavaScript has done is make the web require fewer full-page redraws; we'd be back in the days of server-side HTML rendering. I'm not sure that's a step forward. Stamping out a technology just because it enables a behavior you don't like won't stop the behavior. Meanwhile, you'll punish a lot of good actors.

Video on the web, specifically, has a storied history, which is the bigger reason for what's going on with Smart TVs and how they relate to naked video files. Javascript has nothing to do with it.


What... server-side HTML rendering is vastly superior.

Performance is better. Privacy is better. Battery life is much better. Usability is much much better.


Performance depends on how many refreshes are asked of your connection and browser, but that mostly pertains to forms and login screens.

The purpose of HN is to share interesting information with others, not to lobby for changes in the way web sites are implemented. I personally dislike the gratuitous use of JavaScript, but I don't think I should deprive everyone on HN of interesting articles because of my personal preferences.

I think there are many worse things, e.g. Medium, sites with pop-over menus, cookie popups, etc.

Headers that slide in and hide the content whenever you scroll to read the content in the first place.

I was mystified by some of the strange things I've seen on content sites in the last few months, but then I realized they were doing click fraud.

That is, strange things happen when you scroll, content moves around so that when you click on a link you accidentally click on an ad "ka-ching!"

I started complaining to advertisers, and AnandTech stopped doing this.


Killing JS normally defeats most of those annoyances.

It's a key reason why I disable JS by default, and enable it as an exception.

... and will re-disable it if any such behaviours manifest.
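
For anyone who wants to replicate that default-deny setup, one way (my assumption; no particular tool is named above) is uBlock Origin's switch rules in its "My rules" pane, which disable scripting globally and re-enable it per site:

    no-scripting: * true
    no-scripting: news.ycombinator.com false

The hostname is just an example; any site not excepted this way gets served plain HTML only, which is exactly the failure mode the original post wants surfaced.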


> And yes, I'm aware of the irony that Joindiaspora.com does not function without JS. Please ban Joindiaspora.com as your first site ban instance.

LOL


Ironically this page says:

> This website requires JavaScript to function properly. If you disabled JavaScript, please enable it and refresh this page.


Ironically, the post explicitly references that point, and calls for a ban of Joindiaspora.com as the first HN site ban.

a) that isn't irony

b) couldn't you have found a better host for this?


Isn't submitting your own stuff usually frowned upon here? Or was that Reddit?

SPAs only add about 30 milliseconds to the time required to load content, which is well below the threshold of human perception. If folks really think they can tell the difference, we should get James Randi to set up a challenge.

>30 milliseconds

On what type of connection? Does my packet loss matter? How many idle CPUs do I need?


The ramp-up of the TCP connection to load the JavaScript that renders the page takes longer than that!

No, HTTP2 does not solve that.
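
That's easy to sanity-check: before the first byte of bundle JavaScript can even arrive, a cold connection pays for a DNS lookup, a TCP handshake, and a TLS handshake, each costing at least one round trip, and slow start then throttles the transfer itself. A rough way to measure it (hypothetical URL; the numbers scale with your RTT):

    # Prints how long curl spends on each setup phase before any
    # payload flows; on a ~50 ms RTT link, tcp + tls alone usually
    # exceed the 30 ms claimed above.
    curl -so /dev/null \
        -w 'dns:%{time_namelookup}s tcp:%{time_connect}s tls:%{time_appconnect}s\n' \
        https://example.com/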



