Lightweight versions of websites without all the bloat (github.com)
184 points by pmarin 25 days ago | 96 comments

We need a Web again. A hypertext document browsing system. Update the basic elements like they should have been updated 20+ years ago (sortable tables seem like a no-brainer to include in base HTML, there's got to be a way to update frames so they suck less and fit 90+% of the legit uses of AJAXy page updates, improved form elements like you get in a real GUI toolkit, file upload with preview, and so on) but NO javascript, NO talking to any server behind the user's back. Once the page loads my network connection should be silent until I click a link. I should get a preview of what a form's about to send. Anything but a GET to any server but the one I'm currently on, flatly disallowed. Somewhere you can browse hypertext documents with high performance, in safety, with no spying.

This one can be abandoned to spyware and apps. Whatever. That's fine. But I don't want safe hypertext browsing commingled with the same program that can and constantly does load spyware and other abusive junk.

There's nothing stopping you from browsing that way if it's what you want. Install uMatrix, uBlock, and a couple other extensions, and you can lock down pages as much as you want. You can even take it a step further and use an older browser that doesn't support the modern features you're complaining about.

Unfortunately, what you can't do is force everybody else to participate. HTML3 still works great in modern browsers, but the people making the sites want to use modern features.

> There's nothing stopping you from browsing that way if it's what you want.

There is. Pages that don't work without Javascript.

That was my point. You can control your browser, but you can't control what other people offer up.

And that's what's stopping them from browsing a sane web.

jlarocco 25 days ago [flagged]

Sorry, but nobody's twisting their arm to use those websites.

Making yourself out to be a victim doesn't make it so.

Saying that you have perfect choice when half of the things you need to use employ dark patterns rife with abuse does not make it true either.

>And that's what's stopping them from browsing a sane web.

What, that people have too much freedom to express themselves?

When did hacker culture become so... fascist?

> You can even take it a step further and use an older browser that doesn't support the modern features you're complaining about.

> HTML3 still works great in modern browsers, but the people making the sites want to use modern features.

I'm not disagreeing with your point, but I disagree with words like "modern" and "older".

The most "modern" browser that I'm aware of (not counting the myriad Webkit/Gecko skins) is EWW, which was first released in 2013. It's now a built-in feature of Emacs, a reasonably widely used piece of software. I think this fits the parent's desire for a readable hypertext renderer.

In contrast, Google Chrome dates from 2008. If we include that in our definition of "modern" then I think we should include Netsurf (first stable release in 2007), which is a full graphical browser but might still be acceptable to the parent. If we call Firefox "modern" then I'd say it includes everything back to Netscape Navigator.

If we're instead talking about maintained and up-to-date browsers then w3m, Links and lynx have all had releases in 2019. Again, those are readable hypertext systems like the parent describes.

We might consider "modern" to mean implementing certain Web standards; yet that seems precisely against the nature of this discussion. Plus, many of those described above have support for e.g. HTML5, certain ECMAScript standards, etc.

In contrast, "older" browsers would presumably include bloat like Internet Explorer's support for ActiveX and VBScript; and the IRC client + WYSIWYG editor + mail reader + news reader from Netscape/Mozilla/SeaMonkey. Precisely against what the parent asked for.

I'll go a step further, and say I wish the view were decoupled from the data in a way that my browser controls the view template. The fonts, the column size, the page colors. So whether I browse to CNN, NYTimes, ESPN, etc., the pages look identical.

My browser could have maybe 20 page templates built in, and sites could specify which templates work best for their content (article, video, forum, search results, recipe, howto).

Due to techniques and services like jsdelivr, many sites will prevent you from even seeing any of the content without loading all of the JS they want you to be running.

No problem. Hit the back button and select one of the alternatives from 10k+ search results..

Edit: and don't forget to add the broken website to your "remove from search results" addon, so you never see it again in search results..

Can I change websites for paying my electricity bill? How about accessing government services?

Sure they offer an in person alternative, but due to the prevalence of online usage they've been able to dramatically cut the number of physical agencies.

Try booking a flight and hotel without JS. It's just not possible; you can't simply "use another airline", you have to use an entirely different booking medium.

Inject this HTTP header and you have the web 1.0 back:

     Content-Security-Policy: sandbox allow-forms allow-top-navigation;
Restrict default-src to self and your second requirement is covered.
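A minimal sketch of what that could look like server-side, using Python's stdlib http.server (the handler name and port are illustrative, not from the thread):

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# The policy from the comment above, plus default-src 'self' so the page
# can only talk back to the server it came from.
CSP_POLICY = "sandbox allow-forms allow-top-navigation; default-src 'self'"

class Web10Handler(SimpleHTTPRequestHandler):
    """Serve static files, stamping the lock-down CSP header on every response."""

    def end_headers(self):
        # Inject the header just before the header section is closed.
        self.send_header("Content-Security-Policy", CSP_POLICY)
        super().end_headers()

# To serve the current directory with the header applied:
#   HTTPServer(("127.0.0.1", 8000), Web10Handler).serve_forever()
```

In practice you'd more likely inject this at a proxy or browser extension, since the whole point is applying it to sites you don't control.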

You misunderstand, I think. The problem is that the ecosystem of the modern web is entirely and irreversibly full of spying and user-hostile junk, due to decisions that were made. TL;DR you can't have Javascript with anything close to its current capabilities and also have a safe and user-controlled Web.

I don't want some restricted version of web sites that work OK in current web browsers, provided you're very careful not to leave those sites. I don't want constant fiddling with uMatrix or custom privacy settings that break 90% of the Web. I want a safe hypertext browsing system back, in its own program that will kick open a "real" web browser (with warnings and flashing lights) if you try to link from it to the spyware web, like if you try to open a PDF. As long as I'm in it, I shouldn't have to worry about any links I open trying to spy on me. I shouldn't have to track down which tab in this system is making my nigh-supercomputer laptop's UI stutter (hi, Asana, Trello, Slack, Jira). None of them should break because of my privacy settings—privacy, old-school web style, should be the only way it operates.

I want a safe-only Web that is for browsing hypertext documents. The current web can keep the spying, security vulnerabilities, resource-eating, and apps. I know I'd still have to use it, and that's fine, I just don't want safe, secure, resource friendly browsing to happen alongside this shitshow, because then it's not safe and secure.

> open a "real" web browser

I'd call that launching a Javascript app, which rarely touches the World-Wide Web beyond that single point.

i think we still have it, with wikipedia and personal blogs for example. it's just been largely subsumed in our minds by web apps and the commercial web. but there are plenty of interconnected hypertext documents that are primarily documents to be read and shared using http(s) without an intermediary organization (like facebook or google).

> Somewhere you can browse hypertext documents with high performance, in safety, with no spying.

How do you propose the no-spying part? With laws passed by country? Nothing in your setup prevents spying, and spying is extremely lucrative in a capitalistic society, so people will still find a way to spy.

You are putting all your trust in the server. But those servers are where the spying would move to if it needed to. You cannot create a technological system that prevents spying, since you have not removed the incentive to spy.

Make it illegal to do targeted marketing and then you will have a dramatic reduction in spying. You'll also have a short-term dramatic drop in economic efficiency, something the HN crowd is likely to decry as equal to the end of the world. But I think it's much more likely that we build a sustainable society with good, long-term-growth economics if we put more limits on capitalism.

Edit: Facebook doesn't care if you're reading those documents on some simple text-web. They'll find out anyway, either via your absence from the 'regular web' or any other clue they can find. The spying won't stop just because you're using some primitive technology.

Under what I propose you couldn't do stuff like record a user's entire session, including mouse position and typing-in-progress in text fields, which is commonly and easily done on the modern web for "UX research", with the complicity of the browser itself. The point is that the server may log your requests, sure, but they can't gather tons of extra info without your seeing it (post previews) and they can't effectively stick a camera over your shoulder while you look at their pages. No more abusive, close session tracking. Best you can get is "the user, personally, made this request at this time". Which is way the hell better than where things are at now.

I agree we also need to outlaw mass collection of info about people, or at least put in place such massive penalties for leaks that no sane company without a really compelling reason would dare store such things. Hopefully that'd end the damn credit card company and loyalty card and credit system spying, too, or at least force them to shape the eff up. I'd just also like a hypertext network that cannot constantly spy on me or eat an entire core because "programmer productivity". That's all.

I don't know how perfectly you could do this. If there are going to be arbitrary mouseover effects then JS has to be able to see your mouse. If you want Jupyter notebooks to indent text & do tab completion then JS has to see text you haven't submitted.

That said, I'd like my browser to be much more on my side -- by default disclose very little about my setup, by default partition cookies according to the headline site I visited, and delete them soon. I'd like JS completely off on most sites I visit, and more convenient ways to turn on as much of it as needed on some sites.

> If there are going to be arbitrary mouseover effects then JS has to be able to see your mouse. If you want Jupyter notebooks to indent text & do tab completion then JS has to see text you haven't submitted.

Just don't allow those things. Those aren't hypertext browsing, they're app-things. Wikipedia or Timmy's Ninja Turtle Fan Club or even banking and ordinary ecommerce sites don't really need those things, and having those things on the same platform as all these "apps" get them all mixed up in a spying and vulnerability nightmare for no good reason. I think hypertext browsing is distinct enough from what the current web is, and that it's still a useful enough idea, that it'd be nice to split that off from... whatever this has become.

Indeed. I guess you're picturing two apps, where I'm picturing one app with much easier per-site settings, with much the same goal.

My current plan is to make lots of Fluid-app browsers, for certain sites, and then leave JS off most of the time. But there are enough sites which don't work at all, yet I want to visit, to make this a bit of a pain.

One major benefit of a separate program and network (network of hypertext, that is, not IP network or whatever) is that if you can overcome an initial usage hurdle (big if, I know) it'd provide an audience incentive to get current websites that could work just fine on it (which is lots of them) to provide a second experience on it. The way they do now with web+mobile app, but it'd obviously be cheaper to provide than either of those, by a long shot. Gaining the ability to access banking, ecommerce, and government resources in particular through a hypertext network that's not also a spyware and application delivery platform would be huge for security.

I wonder whether pushing accessibility is another way forwards. Maybe the simple-browser should present itself as a screen reader for the blind, and automatically send threats of ADA lawsuits if a site isn't usable.

I know for instance that UK govt. websites are designed for such use first, which just means sticking close to the original spirit of hypertext. Bank websites I have little hope of... and anyway I trust my bank quite a lot! But commerce would be great.

Great to see a post about lightweight websites having over 100 upvotes.

To me, the "heaviness" of the web is making it much less enjoyable than it could be.

As a user, I regularly read the HN discussion of an article rather than the article itself. Simply because HN is so lean and other websites are usually bloated. So bloated that I prefer to infer the content from the HN discussion just so I can avoid visiting the website.

I run multiple very lightweight websites myself:


I categorically refuse to add heavy assets or frontend libraries.

Many of my developer friends insist that "Normal users don't care!". I am not so sure. I wonder how big the target audience for "Lightweight websites" really is. Would love to see a poll that asks people "How much do you like the typical website? What are the most common reasons that make you dislike a website?".

> Many of my developer friends insist that "Normal users don't care!". I am not so sure.

I am with you. I've spoken to a lot of non-technical users in my life. Their very typical and frequent response on the matter is "it is how it is, I cannot change it, of course it's annoying but I have no alternative" -- this is about ads, heavyweight sites and possible tracking/spying.

So IMO most non-technical people accept it as a fact of life but would rejoice if the Web became lightweight again.

> So IMO most non-technical people accept it as a fact of life but would rejoice if the Web became lightweight again.

Yup. There's also a subset of non-tech people who don't have a basic mental model for how things work; from those, I hear "my computer is slow, probably has viruses, could you help?". I come, and the computer is often fine - it's popular websites that just went through another bloat-up cycle, and the computer struggles with them. Installing uBlock can make such computer live for another year, but ultimately people end up discarding perfectly good hardware just because the web keeps accruing bloat.

>So IMO most non-technical people accept it as a fact of life but would rejoice if the Web became lightweight again.

What most non-technical people want is for the web they're using now to work faster. The problem is, the people who want to make the web lightweight again seem to want to do so by eliminating the modern web entirely. No one seems to be asking how we could make the web that exists, and that billions of people actually use, better.

I am asking. What would you propose?

> The problem is, the people who want to make the web lightweight again seem to want to do so by eliminating the modern web entirely.

Not entirely, though it may seem that way because many proponents of the lightweight web - like myself - are just tired of the modern one, and have comparative experience of how to make a site many times more functional while using a fraction of the resources at the same time.

> No one seems to be asking how we could make the web that exists, and that billions of people actually use, better

Circling back to 'pdimitar's point upthread: people use what they're given. They have no love for it, they just don't have any other choice. This way people adjusted to the growing bloat of the web, and the same way they could adjust back to a leaner web.

And yes, many are asking how to make the web that exists better. Not many want to hear the answers, because they begin with "stop doing the crap you're doing right now". The answers are: stop doing SPAs for pages that fit the document model (which is 90%+ of them). Stop putting useless gimmicks on the frontend - like that Cleave.js thing from yesterday, for example[0]. Stop pulling in all the trackers and ad network scripts. Design your page function-first, form-second.

It's a hard sell, because it goes against several strong social phenomena, like CV-oriented and fashion-driven development, very lucrative surveillance capitalism business models, vendor lock-in and SaaS-ification of everything. That's why at least part of the solution will have to be technological - just asking people to behave isn't going to cut it.


[0] - https://news.ycombinator.com/item?id=19233637

I found this stat - not sure how old it is:

Forrester Research did some research into why people return to web sites. There were 4 main reasons: good content (thank goodness this was number one!): 75%, usability: 66%, download speed: 58%, frequency of updating: 54%. All other reasons were noise compared with these 4.


You're not alone, on both counts: reading HN instead of the article, and wanting lighter websites.

Hey! I just added Gnod tools to the list. I saw your pull-request for Gnoosic, but the other tools also fall in the same bucket.

I recently started a subreddit with a similar goal: https://www.reddit.com/r/SpartanWeb/.compact

Explanation: http://beza1e1.tuxen.de/spartan_web.html

I had started a site with the same goal, but let the domain name lapse: https://lighten-the-web.netlify.com/

It's open source, so feel free to make some PRs if you need a place to add resources or something to go with the subreddit.

This is cool! I would appreciate pull-requests to add these to the repository

Otherwise I will take a look at them later on myself

Submitted a few I found fitting: https://github.com/mdibaiee/awesome-lite-websites/compare/ma...

There is no section for individual blogs.

I wonder if anyone here would agree to pay a small subscription fee, say $2/month, to get a text only version of websites. No to minimal images, no to almost minimal CSS flash/bloat/design. Pretty close to reading things on a newspaper.

Given that browsers can already do that (CSS off, reading mode, etc.) I don't think that would be a popular idea. Usually it's "pay to get more content that wouldn't be accessible freely".

I might pay $5-10 one time for a browser plugin that intercepts requests and redirects me from the heavyweight versions to the lightweight ones on this list...

I wouldn’t pay for that, but I would pay a hell of a lot more than $2 a month to have all major websites I visit properly redesigned to render in terminal-based browsers (such as links or w3m) as first-class citizens, without weird formatting or UX bugs.

What's wrong with the various reader modes that Safari and FF already have? They're great at de-formatting a page to its base article. No cruft. It's very newspaper style. (Chrome has reader plugins that can do this too)

Not sure what your $2 would be doing other than adding an actor between me and the page I wish to read. For privacy purposes, I'll be keeping well clear of such "solutions".

Don't you normally have to wait for the heavy site to load before switching to reader?

This is fantastic. I have a thing for older laptops and many modern websites are too bloated to run on them. I might use some of these on every machine... why waste cycles when I just want to read?

I have a 2018 MacBook Pro, and reddit “lags” in safari unless I use the old version.

The "new" reddit makes my 2015 MacBook sound like it's about to launch into space and the site really chugs in Firefox.

Old reddit is smooth as silk. I honestly don't understand why they changed it.

On my netbooks I have been using the html only version of Gmail because it became faster than the modern web app.

It's faster on my 2018 well-above-base-config Macbook, too. Somewhere along the way the original promise of this whole AJAX thing—faster updates, no page loads—got lost. The page loads are usually faster for real sites in the wild, where we have two versions to compare.

I use it everywhere now. It loads instantly, it's amazing. The only thing I miss is the auto categorization, I have to filter out all the newsletters with my brain now.

You can also set up filters, which are easy to do in Gmail and work better than auto-categorization.

I'm tempted to make the switch myself; is there a way to permanently force this? Otherwise I'm just going to use Gmail from Thunderbird, which can be a behemoth but isn't as damn slow as that UI at the end of the day.

This is off-topic but: reddit uses React for building and rendering its interface. Reddit is the no. 1 forum on the Internet, a huge company with plenty of resources and top-notch engineers. So the conclusion is simple and straightforward: it is not possible to write a complex and at the same time highly performant app using React. Is there any mistake in my reasoning?

Not intending to defend React, but you are neglecting other possibilities, such as that they have not tried, have tried too little, or that performance is not important to their business, and that others targeting different userbases could succeed.

Without a definition for "high performance" the statement doesn't really say much. If we had a definition for it, we'd also have to ask if "high performance" was even a goal for the reddit team. As you stated, reddit is the most popular forum on the internet and it has continued to grow even with the new design, so perhaps the performance concerns were considered premature. I know some would disagree based on their own issues with performance, but it's at least safe to say that most people using reddit don't seem to have a problem with it (if there is one thing redditors love, it's complaining loudly about why reddit is going to shit).

Yes, your sample size is one.

Netlify is also built with React and its homepage loads amazingly fast (<1MB in size).

Are you sure that Netlify homepage uses react? I have react developer tools installed and it indicates no react presence on netlify's (very plain and simple - I mentioned complex apps in my initial message, but that is another thing) homepage.

Another example could be Facebook which qualifies into "complex apps" category and at the same time is very well known for sluggish performance.

Not sure if that's a typo and you meant Netflix or not, but did you know that Netflix actually gave a presentation about removing React from their homepage, and got an incredible performance boost from it? I suspect something similar in the case of Netlify, if you're referring to them.

Try https://reddit.premii.com/ It should work/load quickly on a 7-10 years old computer with the latest browser.

I have an older MacBook Air and reddit doesn't lag at all in Safari. Maybe it's something else..?

I don't like that this conflates bloat with JS.

I have a minimal chat application that relies heavily on websockets and JS; that said, it strives to be data-minimal. The heaviest part is the initial load of emoji images, which get cached. Other than that there are no third-party scripts or data transfer.

I get that JS can suck and be abused, but I don't really understand the complete allergy to it. A websocket implementation is going to have a lot less overhead than page loads for everything.

The article is talking about websites, not webapps --- i.e. document-oriented sites, for which JS is really not necessary. For a realtime interactive application like chat, however, the circumstances are different.

I'm not sure that's the case; there are a number of sites on this list, like Facebook and Reddit, that I definitely wouldn't categorize as document-oriented. It seems more of an issue with both how resource-intensive the JS-heavy sites are, as well as the purpose of the JS: is it used to track me and invade my privacy, or does the JS have a legitimate use (as in a chat application)? Personally I LOVE JS-powered websites/webapps, as it means that as a lone developer I have the ability to create a website as powerful as a desktop application, without developing multiple versions for each OS/app store. So I, too, don't wish to see a massive backlash against JS, only against bad JS.

Facebook we could dispute once we include the chat (though the chat is available as separate Messenger site/app), but without it it's almost entirely a document-oriented experience. Profiles, timeline, posts, comments - those are all just glorified documents with forms. The same is true about the entirety of Reddit - it's document-oriented by nature.

Consider HN as an example of a document-oriented social media site actually implemented as documents.

Even the chat application mentioned previously could be viewed as (and implemented as) "documents with forms". This oversimplification doesn't help us much. It's the powerful, real-time interactivity that javascript provides that is being debated here.

The question is whether the real-time interactivity is desirable. For a chat widget, it almost certainly is. It's less clear for something like Reddit, where the main mechanism for getting updates to the page you're currently viewing is to explicitly refresh.

Go look at TechCrunch and you’ll see why there’s so much hate for JavaScript. That site is awful.

Why is this the top voted story barely 40 minutes after submission? Are people really excited for 20 links, most of which aren't even "lite" websites?

Every now and then there's an odd top story on HN which makes me wonder who actually reads this site. I guess I'm not hip enough!

It's because HN readers have strong feelings against bloat, in software generally and especially on the web. Just mentioning that topic excites strong responses.

Perhaps you could interpret it as untapped demand for lightweight websites?

People probably upvoted prior to reading. I know I did, because the headline immediately told me I'll love the submission.

People also vote on the headline without actually reading.

I'd guess there's high correlation between people who want this and people who use emacs, which is a non-trivial portion of HN readers.

Don't know how one relates to the other, but I for one know a couple of web sites whose UX would vastly improve were they accessed through Emacs instead of the web browser.

Well I am interested in lite websites as well but this list contains about 20 links with roughly half that aren't "lite versions of websites". I am just wondering why this is the topmost story.

I would like to see more development towards features like the "reading mode" in browsers. There is no good reason why the style of text should vary between websites. The same is true for many other "features". Why do tables with data look (and feel) different across websites? Why does every website implement user registration/login in its own unique way? Every website has its own navigation style/concept - instead of allowing me to use my choice for all websites (as a browser setting). The downside, though, is less creativity/innovation.

I'm with you.

I'd like to see a browser come out that does this. Reading mode by default, along with some p2p network of userstyles and templates. Opera sort of did this once, where the community maintained .js patches to make websites work better in the browser. It would be nice to have a set of standard browser templates, and then be able to have an on/off button for "allow most popular community patches to load automatically," those patches basically being CSS tweaks for sites that don't quite fit a template.

The examples on the page are still fundamentally bloated. Take a look at the source of any example and you will find the regular div soup albeit not burdened by 106 cookies and 46 tracking scripts.

The web has HTML5 elements and the CSS grid layout engine, which can be made fully responsive with maintainable CSS variables to look excellent in evergreen browsers. However, hardly anybody codes up a page this way, as there is so much technical debt in the way web development teams work, and so many hacks that can't be dispensed with overnight.

Recently I tried to make a CMS form look pretty. The forms needed to have the simplest markup, e.g. label followed by input, label followed by input, with those inputs having HTML5 niceties such as placeholders and the required fields being styled with simple CSS based on the 'required' attribute being on the input boxes. However, the templates had two divs and spans around the labels and two more divs around the inputs, with a container div holding it all together, everything decorated with untold class attributes. To make this look good and responsive in CSS grid I needed to get rid of all of it apart from the inputs and labels. If I didn't have confidence in what I was doing I would have just added yet more CSS and yet more complexity.

Anyone can make stuff complex, making stuff simple and concise is an entirely different skill. None of the examples on this list have fully grasped what a modern web page should be like. It is as if nobody has taken the time to learn how to write content with the correct elements and then style it up for the evergreen browsers people actually use with modern CSS.

Any suggested resources or examples to learn from?

I do not actually see a concise course out there that puts it all together. I am a big fan of Rachel Andrews, Jen Simmons and Lea Verou, however, at the moment it really requires some DIY effort to get everything learned. You also need a project such as your own portfolio site to have a go with.

Given there is no guide out there I think I had better write one.

I was unaware of reddit.compact, looks even better than old.reddit.

Does anybody know of a good browser extension to apply some default CSS to sites you visit? It took me just 4 lines to turn the CSS-free NPR site into something that I enjoy reading.

Before and after: https://imgur.com/a/EJympO9

font-family, font-size, max-width, what's the fourth one?


iCab lets you select from several stylesheets. Pretty cool

CNN is night and day...

The regular CNN site isn't even usable in Lynx

The web brought us easy access to static content. Then interactive web apps appeared and mixed code with content. Easy access to content ended up restricted by new rules designed by app owners.

So, could a better separation of content from software improve the situation?

The twitter mobile version is also the "new" standard desktop version if you enable it.

Should also check out wiby.me, it is a search engine that only contains lightweight websites.

Is there a plugin that automatically sends me to the lighter version of a site if it exists?

Add +2000 bonus points if uBlock shows (0) blocked, as on text.npr.org or HN.

Why does ddg do a POST request?

HN pedantry of the day: 'lite' vs 'light'. I screamed internally a bit at "a list of lite websites", so naturally I looked up the definition and etymology.

It turns out that lite is basically a modifier for product names. Early in the 20th century it was used for products like "Auto-Lite" and "Adjusto-Lite", apparently as a diminutive form of "light". But then later it was used to connote fewer calories (as in "Miller Lite", "Kikkoman Lite") whereas other products still use the full form ("Yoplait Light"). Lite's other definition is "diminished or lacking in substance or seriousness ... innocuous or unthreatening", implying an almost childish, immature, or unthreatening form.

In this website's case, the list is specifically referring to the "weight" or "heft" of the websites, relative to the content which isn't solely what the user is looking for. It also is not referring to a product name. That falls squarely under the definition for light, and not lite.

But the name of the repo, awesome-lite-websites, can be seen as a product name. So the word lite fits for the name, but not in the description. (indeed, in the README the word "lightweight" is correctly used)

[1] https://grammarist.com/words/lite/

[2] https://blog.apastyle.org/apastyle/2012/10/lite-or-light-whi...

[3] https://www.etymonline.com/word/lite
