
If there is one thing Google services seem to have in common, it is that they're burdened with sluggish JavaScript. Whether it is Google Maps, Google Groups, or the image browser in Google Image search, all of them like to spike the CPU at 100% for a while doing their thing, leaving the user staring at a frozen tab or window. Having been around a bit, I remember where Google Groups came from: it used to be called Deja News. Back in the day I made a 2-pane browser for reading newsgroups, functionally comparable to what Google Groups does nowadays, with one big difference: it loaded instantaneously and was fast. Mind, this was in the time when we counted ourselves lucky with our 4 Mbit fixed line and our 400 MHz Pentium II developer machines, using Netscape 4.x on a Linux 2.x kernel with AfterStep or FVWM (or olvwm for SunOS aficionados).

While these 'modern' services might be more flexible with their largely client-generated UI, I feel that this comes at too high a price. This problem is not just limited to Google services, other sites and services are similarly hampered. Thing is, I'd have expected better performance from Google.




> with one big difference: it loaded instantaneously and was fast

When things were plain HTML, like most things back in the day, things were fast. Frames, maybe iframes, and ready-to-consume HTML.

Does SquirrelMail ring a bell? It was one of the earliest popular webmails, written in PHP, and it ran on every machine I accessed the internet with, including a Pentium I MMX laptop with 96 MB of RAM.

The current trend of outsourcing rendering to the client is not really a gain for anyone: the server side still does a lot of arranging and massaging of the data it serves, when it could just generate HTML. Serve the ready-made HTML, replace parts of the DOM like we did in the early AJAX days, and things would be much faster.
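
A minimal sketch of that pattern in TypeScript (the /inbox?fragment=1 endpoint and the #message-list id are made up for illustration):

    // Ask the server for a ready-made HTML fragment instead of JSON.
    async function refreshInbox(): Promise<void> {
      const res = await fetch('/inbox?fragment=1');
      const html = await res.text();
      // Swap the fragment into the page, early-AJAX style; no client-side
      // templating, no framework, nothing to hydrate.
      document.querySelector('#message-list')!.innerHTML = html;
    }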

Out of curiosity, I installed a lesser-known browser, Dillo. It's dumb: it can parse a minimal amount of CSS and roughly HTML 4; the rest, it drops. And voilà: the internet - well, the sites that still serve a viable minimum of HTML without JS - became instant. Using the word "fast" would be an understatement.

We need to go back to the roots; otherwise we're just using a lot of CPUs as heaters.


I feel like this is a situation where developers are looking back to an internet that doesn't exist anymore. In 2002 there were 665 million internet users; now there are 3.4 billion and growing [http://www.internetlivestats.com/internet-users/].

If we were still stuck with terrible standards implementations and a lack of processing power, I would side with the "let's do everything on the server again!" argument, but we're not. For most use cases only privileged information requires server-side processing, and if that's not a requirement, why not take advantage of the device requesting the service?

Set aside the 5 second delay user experience annoyance, and realize that as a business there is no way to scale your services profitably without offloading as much work as possible to the web client.


> In 2002, there were 665 million internet users, now there are 3.4 billion and growing

A significant part of that 3.4 billion is on mobile, which is even more prone to the JS-overuse problem.

> terrible standards implementations

Ah, you mean IE6. In its prime it was a brilliant browser, standards or not; it was a necessary evil that forcefully moved things ahead. It had XHR, web fonts, a gazillion things none of the other browsers had yet. It did stick around for too long, but that was not the case in its initial state.

> lack of processing power

https://hackernoon.com/10-things-i-learned-making-the-fastes... -> Read the paragraph "#2 Do mobile first. Like, really do it." We do lack processing power on mobile, so for those 2-3 billion people who are using mobile _only_ or mobile first, we need economical solutions.

> why not take advantage of the device requesting the service?

Because you can't assume the device is powerful enough to do so. Why do you think https://mbasic.facebook.com still exists?

> Set aside the 5 second delay user experience annoyance

You go against one of the initial hard rules of the web with that. I wondered whether the importance of speed had changed, but judging by the grouchy voices all around, it has not. People only keep using these services because they were deliberately made hard to leave, or because the providers learned tactics from Microsoft and are constantly eliminating competitors.

> and realize that as a business there is no way to scale your services profitably without offloading as much work as possible to the web client.

Now, this is complete nonsense; there is no money to save there. How is generating all those React apps and shuffling all that data better than generating HTML? :)

Of course there are exceptions, there are always exceptions, but most things would not need mammoth-sized invisible JSONs to be parsed in the browser when pre-rendered HTML could be served and manipulated only where needed.


I think your misunderstanding concerns the amount of processing required to render HTML on the server. If you believe that serving JSON is just as expensive, I'm open to the discussion, but I don't agree.

Usually these applications are compiled once and delivered by CDN. That's a huge savings.

Also, I'm not saying the speed of Google Maps in your browser is acceptable, but with tools like webpack's chunking (code splitting), you can cut even the initial load down to almost nothing. This is a software architecture problem, not a JavaScript problem.
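
For instance, a dynamic import() is enough to make webpack split the heavy part into its own chunk, fetched only on demand (a sketch; ./heavy-map-renderer is a hypothetical module):

    // webpack emits a separate chunk for a dynamic import(), so this
    // module's weight is only downloaded when showMap() actually runs.
    async function showMap(el: HTMLElement): Promise<void> {
      const { renderMap } = await import('./heavy-map-renderer');
      renderMap(el);
    }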


Maybe you're right, but I'm interested in your opinion on the rest of my answer, such as mobile browsers' capabilities.


There is no way you'd want to leak even more information about the machine, such as RAM and CPU power; I feel it's already dangerously easy to de-anonymize users across websites using just what's provided in headers and exposed in the "navigator" global.
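
A taste of that surface (a sketch; deviceMemory is non-standard and Chromium-only, hence the cast):

    // What the navigator global already hands out without asking:
    console.log(navigator.userAgent);           // browser + OS details
    console.log(navigator.language);            // preferred locale
    console.log(navigator.hardwareConcurrency); // logical CPU cores
    // deviceMemory (approx. RAM in GiB) is non-standard, Chromium-only:
    console.log((navigator as any).deviceMemory);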

So with that in mind, I believe that Google Maps is purposefully not optimized for most mobile clients. It's designed as a fallback if you don't install the app (which they are deeply incentivized to peddle).

I also think other web applications that reach ~5 seconds of load time need to rethink their compilation strategy (if at all possible). I mentioned webpack before, but spreading out the work of large payloads is a well-documented problem with many different solutions. If it's the CPU's fault, you should be using Web Workers.
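
The Web Worker pattern is tiny (a sketch; parser-worker.js and the payload variable are hypothetical):

    // main.ts - hand the mammoth payload to a worker so the UI thread stays free.
    declare const hugeJsonString: string; // the big payload, from wherever
    const worker = new Worker('parser-worker.js');
    worker.onmessage = (e: MessageEvent) => {
      console.log('parsed off the main thread:', e.data);
    };
    worker.postMessage(hugeJsonString);

    // parser-worker.ts (compiled to parser-worker.js) - the heavy lifting:
    self.onmessage = (e: MessageEvent) => {
      const parsed = JSON.parse(e.data as string);
      (self as any).postMessage(parsed); // structured-cloned back to the page
    };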

My original point stands, though: processing on the client is free, scalable, and effective, even considering the harm it can inflict on UX, and that harm can be mitigated.


   Read the paragraph "#2 Do mobile first. Like, really do it."
   We do lack processing power on mobile, so for those
   2-3 billion people who are using mobile _only_ or
   mobile first, we need economical solutions.
Or use an older, 'underpowered' (by modern standards) machine for development, like I do. A ThinkPad T42p will give you a 1600x1200 screen, one of the best mobile keyboards available, 5-6 hours of battery life with an extended battery and... the awesome power of a 1.8 GHz Pentium M... well, maybe not so awesome, but if it works on that machine it flies on anything more up to date.


> And voilà: the internet - well, the sites that still serve a viable minimum of HTML without JS - became instant.

I've been using uMatrix to the same effect. There are some sites where, unfortunately, I need JS. Sure, some sites break to the point of not working at all, but overall I find sites faster, more user-friendly, and more responsive than before uMatrix.


The problem is that fewer and fewer websites support JS-less browsing, even though in many (if not most) cases it's completely unnecessary, i.e. what I need from the website (mostly: reading text) could be accomplished with just HTML and CSS.


Dillo, yes. I did quite a bit of development on the previous, GTK-based version. I added tabs, frames, a simple command language and several other features, which I distributed as a patch set because the core developers were somewhat antagonistic toward external developers [1]. I kept this up for more than a year but finally gave up when the Dillo project went dormant after a failed fund drive. Eventually the project got going again around a new code base built on FLTK.

[1] https://en.wikipedia.org/wiki/Talk%3ADillo#Frames.2C_Tabs.2C...


I used to host SquirrelMail on my little $10/month personal server. My users (OK, friends and relatives) were quite happy with it!

There's a lot broken with the recent model of browsing: essentially, applications rather than content.


And I presume Google Maps works great in Dillo?


No, it absolutely doesn't. However, there were maps before Google Maps, and not bad ones either. The greatness of Google Maps: (a) satellite images; (b) near-live traffic data; (c) geotagged images. Oh wait, the third is gone, and the second is not thanks to Google but to a startup they bought ;)


Google Keep has one of the worst start-up delays I've ever experienced in a web app - which is kind of terrible for note-taking.


I have a really modest number of notes in Keep - 25, maybe? The startup and idle JavaScript requirements of Keep are insane.


YouTube also does this... the interface freezes on page navigation for around 2-5 seconds before I can start interacting.


This is why I always use `mpv https://www.youtube.com/watch?v=...` instead of clicking on YouTube links. It bypasses all the JavaScript and plays the raw video stream. (youtube-dl needs to be installed.)


See also the awesome youtube-dl script: https://rg3.github.io/youtube-dl/ - especially useful for videos you plan to watch more than once. It also handles far more than just YouTube, despite the name.


And for even more video services with an emphasis on live and one-time streams, see http://docs.livestreamer.io/


Livestreamer is discontinued. I suggest people try https://github.com/streamlink/streamlink


Does youtube-dl get the ad-free videos for YouTube Red subscribers?


I have never seen an ad on YouTube; are they really integrated into the video itself? Given that I never saw any, either in the browser (with uBlock Origin) or through youtube-dl, I kind of assumed they were distinct videos played by the JS player beforehand - and thus easy to bypass - rather than an integral part of the main video file.

Anyway, to answer your question: youtube-dl can perform the page fetching and the download in the context of an authenticated user (for example, for age-limited videos), so I assume that any advantages associated with a YouTube account would most likely work with it too.


I think in at least some cases they are integrated, because I use uBlock Origin as well and I did see at least some ads before I got YouTube Red. I could be wrong, but that's my impression.


Hmm, you could well be right and I simply didn't watch the right videos.


It just doesn't download the ad at all. You can specifically tell it to try to download the ad, but just don't do that and you get the ad-free video, whether you have YouTube Red or not.


That's your ad blocker. YouTube has started pausing playback for the 12-30 seconds while the ad plays (even if blocked), so you see a black screen.


So THAT'S why all videos appear to freeze up for me? It's been driving me crazy trying to figure out why.


I had the same problem... Luckily, it's easy to fix. Disable your ad blocker and refresh the video until you get an ad that is a survey. Once you finish the survey, you can re-enable your ad blocker and things will be normal again.


Since my reason for running a blocker is to prevent an ad-delivered malware infection, I think I'll stick with the black screens, annoying as they are.


Playback is okay. It's the homepage and subscription pages that freeze up.


DejaNews was great. It may be the first and canonical example of a service being ruined after it was acquired.


Good luck making Google Maps without JavaScript.


I use earlyoom to prevent the godawful slowdown that comes with running low on memory, and I've found that it aggressively kills Gmail. There have even been times I've had to turn earlyoom off just to read my mail.



