Our best practices are killing mobile web performance (molily.de)
220 points by nkurz on May 21, 2016 | 127 comments



Why isn't there a way for devices to indicate to a server if they are on a slow or data-capped connection?

It seems like "responsive" web design is based entirely around the size or pixel-density of a device's screen. But why can't the experience respond to the speed of a user's connection, or the amount of data the user can afford to download?

My phone already knows I have a hard data cap when I'm off wifi. Why can't it include a field in its request to every web server indicating I would like the smallest amount of data possible, so that properly configured servers can respond by omitting large images, videos, font files, etc.? We have srcset for images, but AFAIK choosing which image to use is still based on screen size rather than bandwidth. And we need to get beyond srcset: as a content creator, I want to be able to mark some images and other heavy resources as optional, so I don't reference them at all for bandwidth-sensitive devices.
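For reference, here is a minimal srcset sketch (the filenames are made up); every selector in it speaks to display size or density, and nothing lets me say "skip this on a metered connection":

```html
<!-- The browser picks a candidate by layout width / pixel density.
     Nothing here expresses "I'm on a slow or capped connection". -->
<img src="photo-small.jpg"
     srcset="photo-small.jpg 480w, photo-large.jpg 1200w"
     sizes="(max-width: 600px) 480px, 1200px"
     alt="Example photo">
```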


As much as HN hates adding stuff to browsers, this is something that needs to be controlled by the useragent, not the application.

Have the user tell their browser that the connection is metered, then have the browser request smaller assets, skip preloading, maybe give pages the ability to mark some things as "optional", etc...

And by doing it that way it's easily "backwards compatible". Phones that don't understand the "this image is optional" attribute will just continue to render it, but those that do will know not to show it.


At $JOB-2, the CEO was clear in the application design that there were two classes of network: "free", and "expensive". If it's "free", do whatever; "expensive", minimise everything tenable.


I think this is quite probably the minimum possible distinction.

Either you have /unmetered/ and /fast/ and /low latency/... or you fall back to a design where every byte matters.


> Why isn't there a way for devices to indicate to a server if they are on a slow or data-capped connection?

IP databases can be used to check if the device is on a mobile connection from a telco. There is also the experimental navigator.connection property: https://developer.mozilla.org/en-US/docs/Web/API/Navigator/c...

> I would like the smallest amount of data possible

This is and should always be the default. It's the heavy content experiences that should be explicitly initiated by the user.


> This is and should always be the default. It's the heavy content experiences that should be explicitly initiated by the user.

This is the correct solution. Wanting the phones to send speed and cap info is just ... looking for a bandage.


With my unlimited fibre connection on my MacBook Pro, I'd prefer to have a web experience that looks amazing - even if it's 5MB per page. Saying 'the smallest amount of data possible' is a truism, but it's not particularly helpful.


I'd still like unadulterated HTML with some images sprinkled in. Content > presentation


Yes but the resolution of those "some images" depends on what kind of connection/device you're on. It's not adequate to say 'just send the lowest bandwidth experience', because then every desktop browser on a 5mbps connection would get blurry low-res images and Arial/Times New Roman text intended for iPhone 4 on a 2G connection.


I would always like to have locally-installed fonts in preference to web fonts, regardless of connection speed (Arial and Times New Roman are hardly the only options), and right-sized, well-compressed images (in the appropriate format for the image type) are essentially a solved problem.


> right-sized, well-compressed images (in the appropriate format for the image type) are essentially a solved problem.

An image with retina resolution is going to be pretty bulky if you want it to look crisp, even if you're doing a good job of optimizing.

If you're on a bad connection you probably want non-retina and aggressive compression, even though it can introduce blurs and artifacts. That can give you a file over ten times smaller.

One size does not fit all.


Good luck with that.

You'll fit in better in 2016 if you have leprosy but are killing it with a photo-heavy bootstrap theme + Angular SPA (with, of course, a healthy number of bower modules brought in).


`navigator.connection` only tells you whether the user is on cellular/wifi. While this is certainly helpful, cellular connections themselves can vary enormously.


There is connection.type (bluetooth, cellular, ethernet, wifi, wimax, other, mixed, unknown, none) and connection.downlinkMax (megabits/sec of the first hop) both available.

You can read the W3C draft for more details: https://w3c.github.io/netinfo/

It's not perfect but far better than nothing. It's relatively easy to figure out a "slow" profile based on both device and connection and optimize some of the heavier resources.
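A rough sketch of such a "slow profile" check, assuming the draft NetInfo fields are present; the thresholds and fallback behavior here are my own guesses, not anything from the spec:

```javascript
// Decide whether to serve a "light" experience from NetInfo-style data.
// The 1 Mbps threshold and the "no API means normal profile" fallback
// are assumptions for illustration, not part of any spec.
function isSlowProfile(connection) {
  if (!connection) return false;           // API unavailable: assume normal
  if (connection.type === 'cellular') return true;
  // downlinkMax is megabits/sec of the first network hop (draft spec)
  if (typeof connection.downlinkMax === 'number' && connection.downlinkMax < 1) {
    return true;
  }
  return false;
}

// In a browser you'd feed it the real object:
//   isSlowProfile(navigator.connection || navigator.mozConnection)
```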


But even then there is only so much the application can (and should) do.

I don't want my application making decisions like trading download speed/size for battery life (better compression, etc...), or deciding to sacrifice quality because it thinks I don't want to wait.

We need APIs that let the useragents choose what they want, and let the users configure their useragent (or choose one that aligns with their ideas at the moment).

Like how srcset works, build APIs that let developers offer up several different assets, and let the useragent choose which is best for that moment (combining preferences for battery life, connection speed, connection capping and how close to the cap you are, preferred load time, and whether it should "upgrade" to a better resource at a later time if possible).


Sure, better capabilities reporting plus user control is the ideal. It will take a long time, though, considering how slowly this all moves, so incremental progress is still good until we get there.


Very interesting, but how does requesting an optimized version of seemingly a gazillion different pages by a billion different authors magically make any of that happen? From what I've seen, web devs throw the kitchen sink even on mobile.


I suppose that could be useful, but max downlink is the least useful metric they could have gone with. The times when I need traffic minimization the most also tend to be times when max downlink and average downlink are very far apart.


True, but the poor browser support for connection type and even poorer support for downlink means it cannot be relied on all the time.


Lazy-loading (the practice being somewhat criticized here) is actually halfway to that goal. The BBC (cited in this article) also uses a technique they call 'cutting the mustard', which checks which features are available on the browser to determine which JS to run. So bandwidth can be one of the things tested when doing JS-based optimization like this.

I agree that CSS-based responsive design just comes up inadequate when we want real optimization. We have to use JS-based methods like deferred/lazy loading, detecting features, bandwidth, etc. (With images you can also have a <noscript> tag that loads the image immediately in no-JS scenarios, but it's difficult to figure out alternatives for lazy-loaded fonts and other things.)
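The BBC team's published "cuts the mustard" check is roughly the following; I've wrapped it in a function taking document/window-like objects so it can be exercised outside a browser:

```javascript
// "Cutting the mustard": only run the enhanced JS on capable browsers.
// This mirrors the feature test the BBC News team described; the
// parameters stand in for the real document/window so it's testable.
function cutsTheMustard(doc, win) {
  return 'querySelector' in doc &&
         'localStorage' in win &&
         'addEventListener' in win;
}

// In the page itself:
//   if (cutsTheMustard(document, window)) { loadEnhancedScripts(); }
```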


Have you used the BBC site on a mobile?

It sucks ass and has done for a while. Blank spaces where pictures should be, suddenly fading in a few seconds too late, pixelated pictures that later resolve, elements change height and jump around the screen as it loads. Even though the text is there, you shouldn't bother starting to read it as it's just going to jump around in a few seconds.

It's precisely the sort of shitty site that the article is complaining about.

The Guardian is another terrible site that's "mobile optimised" and has the cheek to give other developers advice on how to code for mobile, even though the site is an absolute nightmare to use. They have expanding sections on the homepage. You click a story, read it, and go back, and the page takes 10 seconds jumping around to sort itself out. And worse still are the shitty adverts suddenly popping in after 20 seconds and causing you to lose your position in the story you're reading.

HN and its basic style are much more usable than either the BBC or the Guardian mobile sites.

It's like the developers don't understand that the reason I'm there is for the text and the pictures, not for the UI. So treat them as king and load them first and never, ever move them once they're loaded.


But blank and pixellated images that get filled out later are what I'd recommend. The alternative is looking at a blank screen while your browser makes upfront HTTP requests for images, which can delay render-blocking resources like a stylesheet from loading quickly (thus seeing no text on the page until everything loads.)

I understand what you mean about if the layout changes while reading then there's little value in rendering text upfront. But the solution is to reduce layout reflow, not to go back to the bad old days where we don't optimize/defer resources.

So for example, if an image is 200px in height and is lazy-loaded, we should let the placeholder be 200px in height. If the placeholder has 0 height and then the image loads and reflows the page, that messes up the UX. Same with other elements that change page layout. (Also caching things once they are loaded, so the example you mention of hitting the back button and waiting for everything to reflow again doesn't happen.)
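A sketch of the sizing rule, assuming the server knows the image's intrinsic dimensions up front:

```javascript
// Reserve vertical space for a lazy-loaded image so the page doesn't
// reflow when it arrives. Returns the height (px) the placeholder
// should occupy at a given rendered width, derived from the image's
// intrinsic dimensions (which must be known ahead of time).
function placeholderHeight(intrinsicWidth, intrinsicHeight, renderedWidth) {
  return Math.round(renderedWidth * (intrinsicHeight / intrinsicWidth));
}

// e.g. a 400x200 image rendered at 320px wide needs a 160px placeholder
```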


> So for example, if an image is 200px in height and is lazy-loaded, we should let the placeholder be 200px in height.

No, you should simply specify the size in the IMG element. People who care about this stuff have been doing this correctly for 15 years or more. Just as people have been advocating for what now has been labeled with the buzzword "responsive design" since then. People who cared told you 15 years ago that fixed-width layouts are stupid because they won't display properly on phones (among many other reasons which were more relevant back then). This has all been a solved problem for a long, long time, if only people cared. But they still don't, still piling on more complexity and more bloat in an attempt to "fix" things.


Yes, by "placeholder" I mean the IMG element that loads with an initial low-res/default "src" attribute. Then the actual, higher-bandwidth image will load by updating the "src" attribute.


No, you don't use an initial low-res/default source that you later switch; you specify the actual damned image in the source and use the width and height attributes.


That makes the end device download a large image, which isn't optimal for bandwidth, data caps, CPU usage, or battery life. It's not as simple as one size fits all, because one size really doesn't fit all.


Please follow the thread of the conversation. I'm talking about how to mitigate the page reflow issues the posted article describes when lazy loading images and other resources.


Except it actually makes the problem worse, whereas the age-old solution actually solves it.

If you specify a placeholder, that placeholder still has to be loaded, and as long as it hasn't been loaded, the dimensions are unknown, leading to reflow when the placeholder has been loaded. Plus, it adds even more bandwidth usage (for the placeholder and the javascript code), more CPU/battery usage (you have to execute javascript and decode an additional image (the placeholder)). Plus, it doesn't work without JS. Lazy loading of images in JS is just a dumb idea.

If anything, that might be a sensible feature of mobile browsers, which would only have to implement it once for it to work on all sites (instead of bloating each and every website with it), which could more easily have access to information about available connectivity, and which could also do it so much more easily, as the browser's rendering engine has a lot more clue which images currently are in the viewport.


> Except it actually makes the problem worse

Nope. On real-world websites, deferring images and fonts and making other optimizations can reduce time-to-first-paint significantly. That drives real UX and business results (imagine clicking a link and waiting 7 seconds to see the heading on the page vs. waiting 1.5 seconds). There's a reason all these websites practice lazy loading and Google takes time-to-first-render seriously enough to impact search ranking.

> If anything, that might be a sensible feature of mobile browsers

Sure, I would be happy if browsers introduced something like a "lazyload" attribute on images/JS/fonts etc. There is something like that for <script> tags (defer, async), but it still downloads the script ASAP. I need something that gets the text and CSS, shows it on the page, and then opens the HTTP connections for other stuff.


Part of this problem is solved via the Client Hints spec (http://httpwg.org/http-extensions/client-hints.html#save-dat...). When the user turns this on in his preferences, Chrome sends this data to the server as a header, and the server can respond accordingly. Your server can decide what images to omit or make other optimisations if it sees the header. But this setting is applied at the browser level and is not per connection.

The NetInfo API can help in this case but the current spec is still not the best way forward. For example, even if the user is on Wifi, he may be on someone's tethered network and might still want to get a light-weight page. Conversely, some cellular connections might have a very high limit and the user might want to get the best possible experience even though he is on a 3G network.

A user-preference per network connection type would be cool to have!
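On the server, honoring the hint can be a simple branch on the header. Everything below except the "Save-Data" header name itself is an assumption for illustration (the variant names are made up):

```javascript
// Pick a response variant based on the Save-Data client hint.
// Chrome sends "Save-Data: on" when the user enables data saving.
// Header keys are assumed lowercased, as Node's http module does.
function chooseVariant(headers) {
  const saveData = (headers['save-data'] || '').toLowerCase() === 'on';
  return saveData
    ? { images: 'low-res', fonts: 'system', video: 'omit' }
    : { images: 'full', fonts: 'web', video: 'inline' };
}

// Responses that branch on the hint should also send "Vary: Save-Data"
// so caches keep the two variants apart.
```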


> "Why isn't there a way for devices to indicate to a server if they are on a slow or data-capped connection?"

Why do you need it? Are your fancy fonts, 1 meg CSS file, or 1 meg of Javascript that essential to the function of your website that you absolutely must have them? And if not, what are they doing there?

I've spent many years writing many websites, including more intense applications that juggle a decent amount of data. There are a select few things where you really do need a lot of code in the backend, but far too often I encounter work done by designers press-ganged into development (bad, but understandable) or work done by lazy developers who need a framework to center an image (bad, not understandable; you should know better) that results in web pages reaching Pinterest levels of bloat.

If you're juggling analytics data or managing the CMS of a large site, yes you'll need a large site in turn to do that. Literally everything else should be as small as possible and load as quickly as possible on whatever device is querying, end of story.


> Why do you need it? Are your fancy fonts, 1 meg CSS file, or 1 meg of Javascript that essential to the function of your website that you absolutely must have them? And if not, what are they doing there?

I wonder this myself every time this discussion flares up.

Stack Overflow answers that tell fledgling devs 'you can use framework x function y to solve problem z' are particularly irksome.


> Why isn't there a way for devices to indicate to a server if they are on a slow or data-capped connection?

There is. I'll update this post if I remember what it is, but I believe there's a spec and Chrome Canary is supporting it. It uses an HTTP header.


Rather than rely on specs or browser support, why not simply measure how quickly data is being downloaded?

That wouldn't help on the first page load, but it would make the information available on subsequent loads.
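A crude sketch of that measurement from Navigation/Resource Timing-style numbers; a single sample is very noisy, so treat this as illustration only:

```javascript
// Estimate download throughput (kilobytes/sec) for a resource from
// timing data: transferSize is bytes on the wire, responseStart and
// responseEnd are millisecond timestamps.
function estimateKBps(transferSize, responseStart, responseEnd) {
  const seconds = (responseEnd - responseStart) / 1000;
  if (seconds <= 0) return null;   // cached response or clock skew: no estimate
  return (transferSize / 1024) / seconds;
}

// In a browser, feed it values from the Navigation Timing API (or the
// newer resource timing entries) and stash the result, e.g. in
// localStorage, to pick an asset profile on the next page load.
```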


Navigation Timing API https://developer.mozilla.org/en-US/docs/Web/API/Navigation_...

I have a feeling there is something else but I have trouble remembering it, too, and I bet I have it bookmarked to study when I have the need to use it.


That's a good one, but it wasn't that.

It's something that literally gets clients to add an HTTP header to specify that bandwidth should be "saved".

Actually, that reminded me of what it was called.. haha, it's https://developers.google.com/web/updates/2016/02/save-data?... .. The header is: Save-Data: on.


It's this: https://developers.google.com/web/updates/2016/02/save-data?... .. the header is: Save-Data: on.


When I opened a topic a few days ago saying that the mobile version of Gmail no longer works with JS toggled off, I got a reply pretending that JS doesn't pose a problem. As if all those analytics and ads were not powered by JS.


The worst is when you start scrolling and an ad "lazily loads" under your finger, using your touch to switch you out of the browser and onto an app store page. It makes you reluctant to even start interacting with a page before you're sure it's done loading.


This infuriates me to no end. I just cannot believe that it's even allowed to kick a user out of their browser and into an app store without their express interaction.


The browser considers your touch (which you intended to scroll the page) an "express interaction". A policy from a simpler time!


I encounter this on page loads frequently, before I've even touched the screen. I also encounter it from ads inside certain news apps (which actually angers me more than the browser, if only a little).

Perhaps the express interaction is using my phone?


The relative rarity (IME) of appstore-jacking causes me to believe it's closely related to ad-network malware delivery. Perhaps a test for more nefarious future ads, but I'm inclined to blame unfixed ad network vulnerabilities.


Why is it any different than redirecting to a different webpage?


They're both obnoxiously shitty. You wanted to load one resource, and then are redirected to a different, unwanted resource by an ad? That's asshole behavior. It's particularly worse when you're redirected out of the app you're using and into another one.

Nobody wants to be redirected by an ad to something they didn't ask for.


No back button.


There is since iOS 9. It also has popups before navigating out of the browser, but yeah, what a web these days.


For YouTube videos, but I noticed suspiciously not for Twitter.


Lifehacker.com is horrible about lazy loading and scroll jank.


All of Gawker's sites are, Kotaku is practically unreadable on my Galaxy S6 on wifi, it's preposterous.


Take the hint then ;P


Also pretty much every townnews-type C-grade whitelabel news site.


I thought touch/scroll events were distinguished thus:

Finger down -> finger up = touch

Finger down -> move -> finger up = scroll

How does a scroll get interpreted as a touch? It sounds like a bug to me.
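The usual heuristic is a movement threshold ("touch slop"): the touch counts as a tap only if the finger barely moved. A sketch, with an assumed 10px threshold (real browsers pick their own values):

```javascript
// Classify a finger-down/finger-up pair as a tap or a scroll based on
// how far the touch point moved. Browsers allow a small "touch slop"
// so slightly shaky taps still register as taps.
const TOUCH_SLOP_PX = 10;  // assumed threshold, not a standard value

function classifyTouch(start, end) {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  const moved = Math.sqrt(dx * dx + dy * dy);
  return moved < TOUCH_SLOP_PX ? 'tap' : 'scroll';
}
```

The failure mode described upthread is nastier: the page moves under a finger that itself never moved, so by this rule the touch still classifies as a tap.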


The ads pop up right underneath your finger in the lag between deciding to scroll and your finger touching the screen to drag. With the page jerking around underneath you that seems to end up being classified as a click rather than a drag, and away you go to whatever piece of crap you really don't care about.


I believe it's because JS is still loading and the scroll isn't registered until it springs to life and sees only a touch.


Pages can override the default interactions using touch event handlers.


That is bad! I get terrible scroll tracking lag, too.


In Europe, the requirement for the annoying cookie compliance popups puts the final nail in the coffin for mobile browsing.


Yes, I look forward to the day that idiotic law is overturned. It sort of almost did die (https://silktide.com/the-stupid-cookie-law-is-dead-at-last/), but the idiots in charge brought it back.


The cookie law is not even relevant to 99% of sites.

It doesn’t apply to session cookies, login cookies, technical cookies, etc.

It only applies to tracking cookies, which most sites just don’t need.


But most sites have ads


Then that’s their issue.

As you might have noticed, this site we’re on doesn’t need to have a cookie banner.


Probably because it's financed by a billion dollar VC.


I'd be curious to know what kind of hardware it takes to run HN. I'd suspect it's something like a one-hand-number-of-fingers AWS instances to actually serve it, although storage for all the comments is probably not insignificant at this point.


Not like it is impossible to show ads without cookies.


It is probably impossible to find an ad network that doesn't use cookies.



Another alternative is the Prebake filter set [1] for uBlock Origin.

[1] https://github.com/liamja/Prebake


Thanks! One less add-on, one more list for me. :)


that won't work for my mobile safari :(


But you can use WIPR to get rid of ads, at least if you are on a 64-bit iOS device.

http://giorgiocalderolla.com/index.html


Yes, that's super annoying even on a PC. But for the rest of the ads, I use Firefox mobile with JS and images toggled off, plus AdBlock/uBlock.


Allow me to introduce you to the I Don't Care About Cookies extension, which is available for Android Firefox: https://addons.mozilla.org/en-US/firefox/addon/i-dont-care-a...


Thanks, I will surely look at it, possibly to try it. I've heard about cookie addons but don't know of any killer cookie addon. Meanwhile, you could describe that extension in a few words; it could save some clicks for the 100 readers who are on a mobile right now.


The extension blocks cookie warning messages on a large number of websites and has the option to report that the user saw a cookie warning on the current website so that it can be blocked in the future.


Thanks, it must be very nice; I didn't realize it's related to the previous reply. I'll install it once I get on the computer.


Do not worry, they will remove it once mobile devices are no longer a thing, and add new obstructions to augmented reality.

Want to bet on the next big tech fear driven law ruining AR?

How about two buttons you must press all the time, to show you are not driving during AR?


Over the years I've developed a bit of a habitual response of revulsion whenever I hear the term "best practices" used as a dogmatic justification for doing anything, often completely missing the actual situation at hand. It basically says "I don't want to think."

Related article: http://www.satisfice.com/blog/archives/27

That said, a lot of what I see in web development seems to be churn and features-for-the-sake-of-features bloat, and I think much of it has to do with the fact that the basics of a useful page --- images, text, hyperlinks, and some styling --- were essentially solved problems many years ago, so all web developers are left with is constantly trying to "reinvent" and "innovate" things that were, albeit not perfect, entirely usable. The "do everything in JS" trend is one of the clearest examples of this, as well as the "appification" of sites that are perfectly fine as a collection of statically linked pages. Blogs are the most common example of this that I come across --- instead of displaying the content using HTML, they require the browser to download and execute a huge blob of JS that often exceeds the size of the post content itself, just to show that exact same content, sometimes even making another (AJAX! It's awesome, so let's try to use it everywhere!) request to fetch that content in a different format, parse it, and then render it back into HTML. It sounds very Rube Goldberg, yet a lot of developers think this massive waste of resources is somehow normal.

Thanks to the Internet Archive, you can try visiting the BBC site from (exactly!) 10 years ago on your mobile device:

http://web.archive.org/web/20050521031013/http://www.bbc.co....

I don't have a mobile device with me at the moment, but it's clearly less "fluffy" than the page today, and loads essentially instantly even from IA's not-so-fast servers.


I think "Best Practices" are a good thing, so long as you understand why they are best practices and in what situations. They are a bad thing when they are simply dogmatic cargo-cult programming without that understanding.

Moreover, I think they're a good thing as a shorthand when thinking about problems. You don't have to think about why this is a best practice for every problem you encounter, because if you did, you'd never get anything done. I.e., why don't we write all of our code in a single main method? Oh yeah, because it becomes a nightmare to find bugs, update to add new features, reuse algorithms inside it... etc. Instead of going through the rationalization every time, the Best Practices rule of thumb generally works out.

Where it doesn't work out is if you've never done the exercise of why they're best practices in the first place. For every best practice I follow, I have understood the justifications and as a result, have developed an intuitive sense for when my situation is the exception to the rule. Or, if my intuition fails me and it gets questioned, I can re-run the exercise and perhaps change my mind.


Gratuitous use of background videos, large JS, large CSS, many fonts, and so on degraded the desktop web performance as well. It's not a mobile-only problem. Compare scrolling on HN with scrolling on a random startup page and consider the provided content.


All kinds of sites are guilty of bloat, without any real reason. Take PayPal: their site is slow enough as is, so why do I need to see a short video of a barista just to log in? That's kinda stupid and has nothing to do with PayPal.

Aiming for a lighter site wouldn't just be good for mobile, it helps everyone. There's currently too much focus on presentation and too little on content. Even if the content is good, sometimes the presentation just gets in the way.


While most of this article is true and has to be considered, in the real world of corporate production it will not work. It is not the user you have to care about. You have to keep your ignorant PM happy, that is all. Unless you are happy to put significant effort into unobtrusive education of your manager or PM, your efforts to enhance user experience are most likely doomed.


That's been my experience as well, and it's so demoralizing. I've worked on many projects where we, the developers, took pains to produce a lean, functional, snappy website, only for the PMs to then order us to implement multiple tracking tools, ad systems, and 'widgets' that turned the site into a bloated abomination and reduced it to a crawl.

The 'funny' part is that the sites would sometimes break without anyone noticing, because every developer and PM would have an ad blocker installed that filtered out the 'extras' that caused the breakage...


Very true.


My favorite is https://www.schwab.com.

If you attempt to type your username and password while the page is loading, it'll keep trying to clear and refocus the username field, forcing you to wait 5 to 10 seconds before you can use it. If you are typing without looking at the page, you will often end up entering part of your password in the clear in the username field. If you're impatient, you might even hit enter accidentally and send your password as your username to the server. Yahoo Mail used to be like that; I haven't used it in a long time.


The biggest "best practice" that needs questioning is the excessive use of client-side JavaScript. 90% of the websites out there would be able to provide all their functionality with server-side rendering, except that's considered unfashionable or something. I think it's partly web designers inventing fads to keep themselves relevant: if you're a web developer, you have a financial incentive to persuade your customers that their perfectly working site needs to be redesigned every 3-5 years to feel "modern".

But as a user, that's not really relevant. I want to browse the web of 2004 on the internet connection of 2016.


So then how do you handle all your dynamic behavior, or are you actually suggesting we go back to the click-wait-click-wait paradigm of stateless web applications?

Having client side JavaScript should not really add much to the overhead. Yes, the initial load time will be larger, but every subsequent request will be quicker because the server doesn't need to generate the markup and send it along the wire on every click.


Using JavaScript rarely seems to fix click-wait-click-wait. It just replaces the browser's spinner with a JavaScript spinner.

Raw HTML provides prefetch hints. That, combined with fast servers and replication to ensure those servers are near your users, are all a fast website needs.
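Those prefetch hints are plain markup; the URLs here are placeholders:

```html
<!-- Hint the browser to fetch the likely next page during idle time -->
<link rel="prefetch" href="/articles/next-page.html">
<!-- Resolve a third-party hostname early -->
<link rel="dns-prefetch" href="//cdn.example.com">
```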

Of course if your website is actually an app then maybe it shouldn't be a website to begin with.


How many completely static sites are hopelessly broken without Javascript? This is a problem.


Can we update that to the web of '07? I had several sites I love open around then.


I wonder if Google altering their SEO rating to test for "page jank" would impact this.

Remember the adage "you get what you measure"? Well, for a while Google has been rewarding time to document ready (or some such) in their page speed test.

What if they started measuring post page load stuff?

SEO teams could start to measure page stability and budget for it.


I doubt it (perhaps smaller players might get hit though). Google apparently measures for page speed but I don't see any evidence that the worst offenders (e.g. major news sites) are losing page rank for having "heavy" pages. It seems that if you have a powerful brand, you can operate above what Google suggests webmasters do. At least that's how I see it from my perspective.


I'm probably the wrong person to ask because I hate most websites which aren't just JavaScript-free pages with text and a few relevant, tasteful images, but don't people who "design" websites actually check what they look like on mobile devices, and how they act at average 3G data speeds?


Yeah, they do check how the sites look on mobile devices. They might also test them at average 3G data speeds, though there are a lot of developers who just use the office wifi and assume it'll work fine everywhere else.


And they should be chastised.

I generally only make "enterprisey" web apps these days, but I will make sure they are usable all the way down to a 2G connection.

And you don't just do this on the $800 top of the line phone, you try it on other OS's, last year's model, a model from 3 years ago, that 2 year old phone from my mother in law that is still crapped up with games and other stuff so it runs extra slow, and then you adjust accordingly.

And the thing is this really doesn't take that much longer, and in the grand scheme of things budgeting for a few older devices to test on is cheap as shit.


Good points, but you should consider the experience of using a better mobile browser. Opera Mini isn't exactly up to date with modern web standards.


I use the latest Chrome on Android and I have pretty much the same experience.


I am very happy to see that somebody finally wrote all of this down!

Perhaps it's time for a wall-of-shame which automatically gathers and displays performance data for famous mobile websites to pressure people into fixing stuff?


Exactly,

"the page is jumping around"

This is the most annoying experience during mobile web browsing.


Just reading the words "jumping around" pisses me off a bit...

But seriously, after 20 years of web development this is what we end up with? Millions and millions of man hours invested in this and the experience is worse than ever.

Somebody should write a "mobile web is a ghetto" rant and then we should start burning all of this down and start over with a fresh list of best practices.


It's important to realize the why behind all this. Every site wants to be lean and fast, but they aren't, because it's ultimately a trade-off between the time, effort, and talent available.

Even the BBC, which has enough funding and a good tech staff, is likely working with some bloated content management system that automatically generates overweight pages. Add in all the ads and other widget functionality, media, and various customizations and themes, and this is what we end up with.

There won't be any major progress until there's a big cause to prioritize this (as in a major loss of traffic or revenue) or there are tools that will automatically reformat and optimize pages (certain CDNs are getting into this heavily now).


No, it's because a handful of bloggers decided doing everything client-side was the proper and true way to develop a web site. Other developers, being easily influenced, have followed that dogmatically instead of using their heads.


Poor developers are covered under "talent" mentioned above.


One ad image could easily dwarf the most convoluted HTML/CSS/JS in size. One autoplaying video could dwarf the hard-disk size on my first PC...


That doesn't change anything. Images are easy to handle while JS/CSS are usually blocking assets that require parsing.

A small but complex JS file can have a bigger effect on latency and page interactivity/performance than many large images.
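The blocking behavior described above comes down to how scripts are included. A minimal sketch (file names are placeholders): a plain `<script>` in the head halts HTML parsing, while `defer` and `async` let the page render first.

```html
<head>
  <!-- Blocks HTML parsing until downloaded and executed: -->
  <script src="blocking.js"></script>

  <!-- Downloads in parallel, executes after parsing finishes: -->
  <script src="app.js" defer></script>

  <!-- Downloads in parallel, executes as soon as it arrives: -->
  <script src="analytics.js" async></script>
</head>
```

This is why a small script can hurt more than a large image: the image downloads in parallel and never blocks parsing.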


I was thinking along the lines of slow/data-limited networks and bandwidth usage.

I can download the complete works of Shakespeare from Project Gutenberg as plain-text with less bandwidth than one of the full-resolution JPEGs my digital camera takes.


And that image can be displayed full-screen on a 5K monitor (which is overkill) with no visible artifacts using a JPEG file smaller than 1MB if it's properly compressed and only has the copyright and contact as EXIF data. Asset preparation is part of the process, and it's not even difficult.


Let's just please stop using custom fonts. I can tolerate the other stuff, but having to wait forever without being able to read anything because I'm waiting for your custom font to load is just stupid, and it happens often.
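A common mitigation, sketched here with a placeholder font name: declare a local fallback stack so the browser always has something readable, and treat the web font as an enhancement rather than a requirement.

```html
<style>
  /* "Fancy Font" is a placeholder; the local fonts after it can
     render immediately even if the web font never arrives. */
  body {
    font-family: "Fancy Font", Helvetica, Arial, sans-serif;
  }
</style>
```

Note that some browsers still hide text while a declared `@font-face` file downloads, so loading the font asynchronously (or skipping it entirely on slow connections) avoids the blank-text wait described above.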


Universal rendering, works without javascript

And not filling the page with crap


> “Unobtrusive JavaScript” is a decade-old technique I have already criticized in an earlier article. JavaScript is considered “unobtrusive” when it adds additional behavior to an existing, self-sufficient HTML structure. Once the JavaScript kicks in, it changes the HTML DOM and adds behavior. Inevitably, this leads to significant content changes and reflows.

I completely agree with the author on the fact that this is a problem, but I don't understand blaming it on "unobtrusive javascript." Adding behavior doesn't require a reflow.


The number of times I have misclicked because the page was still loading and the buttons jumped around. Or lost content I'd already filled in.

And I'm on a fast 4G network with a good phone, an LG G3.


Mobile browsing is awful enough that I avoid it as much as possible. There's a few sites I trust or need that I use (wikipedia, a couple of news sites, train time tables), but otherwise I restrict my web browsing to the laptop.

It wouldn't need to be so, but as long as browsing on my phone makes me angry and frustrated, there's no point in engaging the Internet on my phone. I'll just use my phone for what it was intended for: reading ebooks.


What makes you so angry and frustrated about it?


Not OP, but while the article lists some things that make web browsing on mobile more irritating, the single worst thing about mobile web browsers is unmodifiably short cache times, combined with the fact that switching to another app sometimes, but not always, means you've accidentally closed the app.

Unlike desktop browsers, mobile browsers will not allow you to leave a page open and come back to it without reloading the page, which makes them effectively useless for so many situations. Even just locking the screen while on a page may result in the page being reloaded as soon as you return and unlock the screen.

One of the major original benefits of having an always-on internet connection was the ability to look things up on the fly; to spend 10 seconds checking something on Google before continuing with what you're doing. With my N900, I used to be able to move through a webapp action by action, locking my phone to get out of the metro car, unlocking it on the way up the escalator, locking it for the walk to work, unlocking it on the elevator, and completing what I was doing as I sat down at my desk.

On modern mobile web browsers, this is impossible as far as I can tell. I can't even read the same webpage reliably in a dozen 10-15 second increments over an hour, because many of the times I unlock my phone to look at it, it reloads. If I'm really lucky and don't forget to leave it alone, it may eventually position the scroll back where I was. Of course, that takes up the 10 seconds I had, so now I'll roll the dice again in five minutes when I have 10 more seconds.

This is the single biggest problem I have with mobile browsing: I can't actually use the phone as a web browser the way I use my desktop web browser. The tabs are useless because they don't load and keep content. Multiple-action webapps are useless because the browser won't remember where I was even though I haven't deliberately exited the application. Articles or blog posts longer than 5-6 screens are pointless because I will not be able to read them in the times I have, because the browser assumes that I'm doing nothing else for the next 5 minutes nearly every time I open it, "Wait, wait, let's start over from the top, shall we?"

Infuriating.

TL;DR: the biggest problem with mobile browsing is the browsers, not the content.


That pretty much sums up my experience trying to use an iPad on wifi for browsing. The browser seems to think I'm an idiot; it's too slow, and tabs don't work. If the browser won't load the content of a tab in the background, then I fail to see the point in having tabs. The mobile devices I've used simply don't have the processing power to run a modern browser, in my opinion.

After a few minutes with any mobile device I'm normally tense and angry and just give up and find a computer instead.

I also hate navigating on a touch screen, but that's sort of irrelevant in this context.


Since I switched to the iPad Mini 2 things have improved significantly for me, but even now I too often find myself opening tabs that take ages to load and, worst of all, need so much memory that all the other tabs are unloaded. In this case I'd say it's mainly the content that is the issue, not the browser, but still.


My beef is with responsive design.

It's a great idea, but designers always lop off features that don't scale down well instead of fixing them. Compare a user page on Reddit on mobile vs. desktop. The mobile version is pretty, but the information is lacking.


Another common issue with responsive design is that often the desktop assets are loaded even on the mobile version. Which aside from causing real issues (speed, blown bandwidth caps), just feels wrong and wasteful.
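One partial remedy for this (file names here are illustrative): `srcset` lets the browser pick the smallest adequate image for the viewport, so a phone never fetches the desktop-sized file — though the choice is driven by screen size, not bandwidth.

```html
<img src="hero-400.jpg"
     srcset="hero-400.jpg 400w,
             hero-800.jpg 800w,
             hero-1600.jpg 1600w"
     sizes="100vw"
     alt="Hero image">
```

The `sizes` attribute tells the browser how wide the image will display, and it then multiplies by the device pixel density to choose a candidate from `srcset`.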


The article this discussion is about lists a lot of the points that make it unpleasant for me.

Everything is slow. I don't dare touch the page until it's fully loaded (and I can't always tell that it has from the browser's progress bar) and stops jumping around. Pages are designed for bigger screens (they don't fit on my phone's screen in a useful way) or better eyes (fonts are really tiny). Also, the phone is just a bad device for doing anything complicated, such as following a link (hard to hit tiny links with thick fingers) or looking up information from multiple sources ("tab" handling is abysmal, at least with Chrome on Android). Etc.


Give a system-wide ad-blocker (AdAway) and Opera Mobile a try. You can zoom in and it will reflow the text to make it nicely readable. I wish I could switch to Firefox but the lack of reflow is an incredibly weird anti-feature.


I was very sensitive about this problem, and incredibly angered when text reflow was gradually dropped thanks to Google, and when of course Mozilla happily followed suit. But I must admit that on the web of today it is rarely an issue.

If you want text reflow on touch working on Firefox for Android, you've got this:

Text Reflow by david2097152 https://addons.mozilla.org/en-US/android/addon/text-reflow/

and its always-active fork:

Android Text Reflow by richmoz https://addons.mozilla.org/en-US/android/addon/text-reflow/

This could also come in handy:

Fit Text To Width by Jarrad https://addons.mozilla.org/en-US/android/addon/fit-text-to-w...


Oh, I too thought those would be workarounds but they are not.

The first two say: "This means it will only reflow ONE THING AT A TIME. If the page has 50 paragraphs, you will have to tap once on all 50 to reflow the whole thing."

The third one simply does nothing for me.


Just tap on the next paragraph when scrolling. Scroll, tap, repeat. Your thumb will do that automatically while you're still reading the paragraph above.

EDIT: I just tried the third one by zooming in on a few desktop sites and I'm impressed. Make sure text size is set to "tiny" in the accessibility settings. It definitely works for me™.


> ("tab" handling is abysmal, at least with Chrome on Android)

this can be fixed by going to the settings via the Chrome menu and turning 'merge tabs and apps' off, if that's the behavior you're referring to


The two things that aggravate me most about web sites are (a) important stuff loads last, only for me to find it's not even worth reading, and (b) reflows where page content jumps around as I'm reading it or trying to click a link. Let's also add the inability to scroll, or content freezing for some amount of time, due to crap I don't need. The author of this piece is on point in tackling all three problems. If most sites did, it would prevent most of the headaches I have going to them.

It's not just a gripe either. I often avoid sites that do this stuff in favor of others that might have the content or service I need without the headaches. The ones that have little competition, or that all do stupid stuff, still get my visits. Just fewer of them, due to the headaches. I doubt I'm the only one who reacts this way.


I wish Google would penalize websites based on bloat like they do websites that are not mobile-enabled. Of course they won't/can't since the bloat contributes to their bottom line.


Nice read. Google's been working on AMP (https://www.ampproject.org/) to deal with this. They apply a pretty strict policy to speed up rendering on mobile by requiring pages to declare their layout as specifically as possible up front.
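The core idea is illustrated by explicit dimensions: when an image declares its size, the browser can reserve the space before the file arrives, so nothing jumps. AMP enforces this kind of declaration on embedded resources; the values here are illustrative.

```html
<!-- The browser reserves a 640x360 box during load, so there is
     no reflow when the image finally arrives. -->
<img src="photo.jpg" width="640" height="360" alt="Photo">
```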



