
Can't We Have Anti JavaScript-Only-Sites Search Engine? - vezycash
Am I the only one who wants a search engine that will prioritize sites that work with JavaScript off?

And one that can also be filtered by page size, say under 1MB?

Won't this be a better solution than AMP?
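A minimal sketch of the filter proposed here, assuming all we have is the raw HTML bytes of a page: accept it only if it is under the size cap and still contains readable text once `<script>` content is ignored, i.e. without executing any JavaScript. The 1MB cap is from the post; the 200-character text threshold and all names are illustrative.

```python
from html.parser import HTMLParser

MAX_BYTES = 1_000_000  # "say under 1mb"

class TextOnly(HTMLParser):
    """Collects visible text, skipping <script> and <style> bodies."""
    def __init__(self):
        super().__init__()
        self.skip = 0
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def acceptable(raw_html: bytes, min_text_chars: int = 200) -> bool:
    """True if the page is small enough and readable with JS off."""
    if len(raw_html) > MAX_BYTES:
        return False
    parser = TextOnly()
    parser.feed(raw_html.decode("utf-8", errors="replace"))
    text = " ".join("".join(parser.chunks).split())
    return len(text) >= min_text_chars
```

A real crawler would of course also count the weight of images and other subresources, not just the HTML document itself.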
======
hjek
I use uBlock Origin with scripts and large media blocked by default (which
makes sites load faster, like AMP is supposed to as well).

For some sites, depending on their purpose, it would be unreasonable to expect
them to not use JavaScript. For example
[https://www.openstreetmap.org](https://www.openstreetmap.org) (where the
scripts are even free/libre).

In such a case, it is easy to (permanently) unblock scripts and large media in
uBlock Origin, but it would not be fair for a search engine to deprioritise a
map site for that reason, would it?

Also, most news sites pull in a massive pile of JS, but they still work when
it's blocked. Yet even some blogs fail to show any text when JS is blocked.
So I feel the issue is not so much JS alone as graceful degradation.

Because surely it's fine to add a bunch of non-essential JS? Salon has
a Monero miner on their site[0], but it works fine without it being loaded.

[0]: [https://arstechnica.com/information-technology/2018/02/salon-to-ad-blockers-can-we-use-your-browser-to-mine-cryptocurrency/](https://arstechnica.com/information-technology/2018/02/salon-to-ad-blockers-can-we-use-your-browser-to-mine-cryptocurrency/)

------
zzo38computer
I partially agree. What I want is for content that is only available with
JavaScript not to be indexed; if a page contains some text that is always
available and some that is added by JavaScript, only the text that is
part of the downloaded HTML file should be indexed. (Pages that are empty
except for the JavaScript code would not be indexed at all, although if there
is a page that describes and links to it, and the description does not
require JavaScript, that describing page would still be indexed, so you could
still find it.)
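That indexing rule could be sketched like this: only words present in the downloaded HTML (with no script execution) go into the index, and a page whose static text is empty is skipped entirely. The `extract_static_text` helper and the regex-based tag stripping are crude illustrations, not how a real engine would parse HTML.

```python
import re
from collections import defaultdict

def extract_static_text(raw_html: str) -> str:
    """Visible text of the page as downloaded, ignoring all JS."""
    # Drop <script>/<style> bodies, then strip the remaining tags.
    no_scripts = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", raw_html)
    no_tags = re.sub(r"(?s)<[^>]*>", " ", no_scripts)
    return " ".join(no_tags.split())

def build_index(pages: dict[str, str]) -> dict[str, set[str]]:
    """Map each word to the URLs whose *static* text contains it."""
    index = defaultdict(set)
    for url, raw_html in pages.items():
        text = extract_static_text(raw_html)
        if not text:  # empty without JS: don't index at all
            continue
        for word in text.lower().split():
            index[word].add(url)
    return index
```

A JS-only page is then still findable if some other, static page describes and links to it, since that description gets indexed under the describing page's URL.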

For some webpages it is useful to have some JavaScript, but for most pages
with just text or whatever, you should not need it. Even pages that use it
just to access their internal data should still have descriptions that do not
need JavaScript (whether you want to access the data in a different way, or
just read the description, or whatever other purpose you might have).

------
jamieweb
As hjek has mentioned below, sites that clearly need JavaScript (e.g.
interactive street map) shouldn't be de-prioritised.

However, I definitely agree that it would be nice to be able to filter out
sites that you would expect to contain just plain text and images (e.g. blogs,
news, product pages), yet which show a white screen or an infinite loading
animation when viewed with JS off.

------
nik736
Yes, JavaScript can be annoying. But what matters is the quality of the
content being searched, not whether a website has annoying JavaScript.

Would you rather read a really bad article that has no JS or a very well
written one with JS enabled?

~~~
hjek
> Would you rather read a really bad article that has no JS or a very well
> written one with JS enabled?

That's not always the choice, though. There are lots of good articles, so you
could ask: would you rather read one good article that requires you to run a
(non-free) program in your web browser, or another good article that doesn't
require that? (Or simply read the article elsewhere, if possible.)

Presumably OP doesn't have an issue with sites that have "JS enabled", but
with sites that _don't work at all without JS_. For example, Wiki Wiki Web[0]
displays nothing but a spinning loader animation when JS is disabled.

[0]: [http://wiki.c2.com/](http://wiki.c2.com/)

~~~
zzo38computer
In the case of the c2 wiki, you can at least still download the raw text and
write your own parser.

Although I agree the issue is sites that don't work at all without JS
enabled. (That is why I suggested ignoring JavaScript when indexing.) (The c2
wiki should be made to work without it, or at least, to add a <noscript> which
links to the raw text, perhaps.)

~~~
hjek
> at least, to add a <noscript> which links to the raw text, perhaps.

Yes, even just that would make all the difference.
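For illustration, such a fallback is a tiny addition; a hypothetical version for the c2 wiki (the raw-text URL here is made up) might be:

```html
<noscript>
  <p>This page is rendered by JavaScript.
    <a href="/raw/WelcomeVisitors">Read the raw page text</a> instead.</p>
</noscript>
```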

------
fghtr
I am not an expert, but maybe the free search engine YaCy[0] could be modified
for that purpose.

[0] [https://yacy.net](https://yacy.net)

------
muzani
Oh man, my buttons use JS just to log in. What's the way around this? Making
logins with <form></form>?

~~~
hjek
Exactly. Just like the site you're logged into now.
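For anyone wondering what that looks like, a login that needs no JS at all is just a plain HTML form that the browser submits itself (the field names and the /login action here are made up):

```html
<form method="post" action="/login">
  <label>Username <input name="username" autocomplete="username"></label>
  <label>Password <input name="password" type="password"
                         autocomplete="current-password"></label>
  <button type="submit">Log in</button>
</form>
```

JS can still be layered on top as an enhancement, but the form keeps working when it is blocked or fails to load.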

------
thedevindevops
The term you're looking for is 'progressive enhancement'

