It seems like "responsive" web design is based entirely around the size or pixel-density of a device's screen. But why can't the experience respond to the speed of a user's connection, or the amount of data the user can afford to download?
My phone already knows I have a hard data cap when I'm off wifi. Why can't it include a field in its request to every web server indicating I would like the smallest amount of data possible, and then properly configured servers can respond by omitting large images, videos, font files, etc.? We have srcset for images, but AFAIK choosing which image to use is still based on screen size rather than bandwidth. And we need to get beyond srcset - as a content creator, I want to be able to mark some images and other heavy resources as optional, so I don't reference them at all for bandwidth-sensitive devices.
Have the user tell their browser that the connection is metered, then have the browser request smaller assets, not preload anything, maybe give the ability to mark some things as "optional", etc...
And by doing it that way it's easily "backwards compatible". Phones that don't understand the "this image is optional" attribute will just continue to render it, but those that do and know not to show them won't.
Either you have /unmetered/ and /fast/ and /low latency/... or you fall back to a design where every byte matters.
IP databases can be used to check if the device is on a mobile connection from a telco. There is also the experimental navigator.connection property: https://developer.mozilla.org/en-US/docs/Web/API/Navigator/c...
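A minimal sketch of how a page might consult the experimental NetInfo API. The property names ('type', 'downlink') follow the W3C draft and browser support varies, so the decision logic is written as a plain function that takes the connection object as input; everything here is illustrative, not a settled API.

```javascript
// Decide whether a connection profile looks bandwidth-constrained.
// 'conn' is whatever navigator.connection exposes (or undefined).
function isConstrained(conn) {
  if (!conn) return false;                 // API unavailable: assume unconstrained
  if (conn.type === 'cellular') return true;
  // downlink is an estimated bandwidth in megabits per second
  return typeof conn.downlink === 'number' && conn.downlink < 1;
}

// In a browser, guarded because the API is experimental:
// const lightweight = isConstrained(navigator.connection);
```

Keeping the policy in a pure function makes it easy to test and to swap out once user preferences (rather than guessed heuristics) become available.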
> I would like the smallest amount of data possible
This is and should always be the default. It's the heavy content experiences that should be explicitly initiated by the user.
This is the correct solution. Wanting the phones to send speed and cap info is just ... looking for a bandage.
An image with retina resolution is going to be pretty bulky if you want it to look crisp, even if you're doing a good job of optimizing.
If you're on a bad connection you probably want non-retina and aggressive compression, even though it can introduce blurs and artifacts. That can give you a file over ten times smaller.
One size does not fit all.
You'll fit in better in 2016 if you have leprosy but are killing it with a photo-heavy bootstrap theme + Angular SPA (with, of course, a healthy number of bower modules brought in).
You can read the W3C draft for more details: https://w3c.github.io/netinfo/
It's not perfect but far better than nothing. It's relatively easy to figure out a "slow" profile based on both device and connection and optimize some of the heavier resources.
I don't want my application making decisions like trading download speed/size for battery life (better compression, etc...), or deciding to sacrifice quality because it thinks I don't want to wait.
We need APIs that let the useragents choose what they want, and let the users configure their useragent (or choose one that aligns with their ideas at the moment).
Like how srcset works, build APIs that let developers offer up several different assets, and let the useragent choose which is best for that moment (combining preferences for battery-life, connection speed, connection-capping and how close to the cap you are, preferred load time, and if it should "upgrade" to a better resource at a later time if possible)
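To make the idea concrete, here is a hypothetical sketch of the user-agent side of such an API: the page offers several variants of one asset, and the browser picks one against the user's configured byte budget. All names and numbers are made up for illustration.

```javascript
// Several variants of the same asset, as a page author might declare them.
const variants = [
  { url: 'hero-2x.jpg',  bytes: 900000, quality: 3 },
  { url: 'hero-1x.jpg',  bytes: 250000, quality: 2 },
  { url: 'hero-low.jpg', bytes: 30000,  quality: 1 },
];

// Pick the highest-quality variant that fits the user's byte budget.
// Returns null when nothing fits, i.e. the asset is treated as optional.
function chooseVariant(variants, byteBudget) {
  const affordable = variants.filter(v => v.bytes <= byteBudget);
  if (affordable.length === 0) return null;
  return affordable.reduce((best, v) => (v.quality > best.quality ? v : best));
}
```

The same selection function could later re-run with a larger budget to "upgrade" to a better resource, as described above.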
I agree that CSS-based responsive design just comes up inadequate when we want real optimization. We have to use JS based methods like deferred/lazy loading, detecting features, bandwidth, etc. (With images you can also have a <noscript> tag that loads the image immediately in no-JS scenarios, but it's difficult to figure out alternatives for lazy-loaded fonts and other things.)
It sucks ass and has for a while. Blank spaces where pictures should be, images suddenly fading in a few seconds too late, pixelated pictures that later resolve, elements changing height and jumping around the screen as the page loads. Even though the text is there, you shouldn't bother starting to read it, as it's just going to jump around in a few seconds.
It's precisely the sort of shitty site that the article is complaining about.
The Guardian is another terrible site that's "mobile optimised" and has the cheek to give other developers advice on how to code for mobile, even though the site is an absolute nightmare to use. They have expanding sections on the homepage. You click a story, read it, and go back, and the page takes 10 seconds jumping around to sort itself out. And worse still is the shitty adverts suddenly popping in after 20 seconds and causing you to lose your position in the story you're reading.
HN and its basic style are much more usable than either the BBC or the Guardian mobile sites.
It's like the developers don't understand that the reason I'm there is for the text and the pictures, not for the UI. So treat them as king and load them first and never, ever move them once they're loaded.
I understand what you mean about if the layout changes while reading then there's little value in rendering text upfront. But the solution is to reduce layout reflow, not to go back to the bad old days where we don't optimize/defer resources.
So for example, if an image is 200px in height and is lazy-loaded, we should let the placeholder be 200px in height. If the placeholder has 0 height and then the image loads and reflows the page, that messes up the UX. Same with other elements that change page layout. (Also caching things once they are loaded, so the example you mention of hitting the back button and waiting for everything to reflow again doesn't happen.)
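One common way to reserve that space (at the time of writing) is the intrinsic-ratio trick: give the placeholder wrapper a percentage padding-bottom computed from the image's dimensions, so the box has the right height before the image arrives. A small sketch of that calculation:

```javascript
// Compute the padding-bottom percentage that reserves space for an image
// of the given dimensions. Applied to a zero-height wrapper element, this
// keeps the layout stable while the image lazy-loads.
function placeholderPadding(width, height) {
  return (height / width * 100).toFixed(2) + '%';
}

// A 400x200 image reserves a wrapper with padding-bottom: 50.00%
```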
No, you should simply specify the size in the IMG element. People who care about this stuff have been doing this correctly for 15 years or more. Just as people have been advocating for what now has been labeled with the buzzword "responsive design" since then. People who cared told you 15 years ago that fixed-width layouts are stupid because they won't display properly on phones (among many other reasons which were more relevant back then). This has all been a solved problem for a long, long time, if only people cared. But they still don't, still piling on more complexity and more bloat in an attempt to "fix" things.
If anything, that might be a sensible feature of mobile browsers, which would only have to implement it once for it to work on all sites (instead of bloating each and every website with it), which could more easily have access to information about available connectivity, and which could also do it so much more easily, as the browser's rendering engine has a lot more clue which images currently are in the viewport.
Nope. On real world websites, deferring images, fonts, and making other optimizations can reduce time-to-first-paint significantly. That drives real UX and business results (imagine clicking a link and waiting 7 seconds to see the heading on the page vs. waiting 1.5 seconds). There's a reason all these websites practice lazy-loading and Google takes time-to-first-render seriously enough to impact search ranking.
If anything, that might be a sensible feature of mobile browsers
Sure, I would be happy if browsers introduced something like a "lazyload" attribute on images/JS/fonts etc. There is something like that for <script> tags (defer, async) but it still downloads the script ASAP. I need something that gets the text and CSS, shows it on the page, and then opens the HTTP connections for other stuff.
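Absent browser support, page authors approximate this themselves: mark heavy images with a data-src attribute instead of src, render the text and CSS, then copy the attribute over after load so the download starts late. The helper below is written against a plain object so the swap logic is testable outside a browser; the event wiring in the comment is the usual browser-side usage.

```javascript
// Copy data-src into src so the browser starts the download only now.
// Returns true if the element was hydrated, false if there was nothing
// to do (no data-src, or src already set).
function hydrate(el) {
  if (el.dataset && el.dataset.src && !el.src) {
    el.src = el.dataset.src;
    return true;
  }
  return false;
}

// In a browser, something like:
// window.addEventListener('load', () => {
//   document.querySelectorAll('img[data-src]').forEach(hydrate);
// });
```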
The NetInfo API can help in this case but the current spec is still not the best way forward. For example, even if the user is on Wifi, he may be on someone's tethered network and might still want to get a light-weight page. Conversely, some cellular connections might have a very high limit and the user might want to get the best possible experience even though he is on a 3G network.
A user-preference per network connection type would be cool to have!
I've spent many years writing many websites, including more intense applications that juggle a decent amount of data. There are a select few things where you really do need a lot of code in the backend, but far too often I encounter work done by designers press-ganged into development (bad, but understandable) or work done by lazy developers who need a framework to center an image (bad, not understandable, you should know better) that results in web pages approaching Pinterest levels of bloat.
If you're juggling analytics data or managing the CMS of a large site, yes you'll need a large site in turn to do that. Literally everything else should be as small as possible and load as quickly as possible on whatever device is querying, end of story.
I wonder this myself every time this discussion flares up.
Stack Overflow answers that tell fledgling devs 'you can use framework x function y to solve problem z' are particularly irksome.
There is. I'll update this post if I remember what it is, but I believe there's a spec and Chrome Canary is supporting it. It uses an HTTP header.
That wouldn't help on the first page load, but it would make the information available on subsequent loads.
I have a feeling there is something else but I have trouble remembering it, too, and I bet I have it bookmarked to study when I have the need to use it.
It's something that literally gets clients to add an HTTP header to specify that bandwidth should be "saved".
Actually, that reminded me of what it was called.. haha, it's https://developers.google.com/web/updates/2016/02/save-data?... .. The header is: Save-Data: on.
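On the server side, honoring that header is a one-line check. The parsing is a pure function below; the Express-style handler in the comment is illustrative usage, not a complete server, and the route and template names are made up.

```javascript
// Does the request's Save-Data header ask for a lightweight response?
// Per the linked announcement, the client sends "Save-Data: on".
function wantsSaveData(headerValue) {
  return typeof headerValue === 'string' &&
         headerValue.trim().toLowerCase() === 'on';
}

// Illustrative Express-style usage (names are hypothetical):
// app.get('/article', (req, res) => {
//   const light = wantsSaveData(req.get('Save-Data'));
//   res.render('article', { includeHeroVideo: !light });
// });
```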
Perhaps the express interaction is using my phone?
Nobody wants to be redirected by an ad to something they didn't ask for.
Finger down -> finger up = touch
Finger down -> move -> finger up = scroll
How does a scroll get interpreted as a touch? It sounds like a bug to me.
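The distinction the comment draws can be sketched as a classifier over the pointer positions at finger-down and finger-up: a sequence is a tap only if the finger barely moved in between. The slop threshold here is an assumption for illustration; real browsers use their own tolerance values.

```javascript
const TAP_SLOP_PX = 10; // assumed movement tolerance, not a browser constant

// Classify a down/up pointer pair as a tap or a scroll based on how far
// the finger travelled between the two events.
function classifyGesture(downX, downY, upX, upY) {
  const moved = Math.hypot(upX - downX, upY - downY);
  return moved <= TAP_SLOP_PX ? 'tap' : 'scroll';
}
```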
It doesn’t apply to session cookies, login cookies, technical cookies, etc.
It only applies to tracking cookies, which most sites just don’t need.
As you might have noticed, this site we’re on doesn’t need to have a cookie banner.
Want to bet on the next big tech fear driven law ruining AR?
How about two buttons you must press all the time, to show you are not driving during AR?
Related article: http://www.satisfice.com/blog/archives/27
That said, a lot of what I see in web development seems to be churn and features-for-the-sake-of-features bloat, and I think much of it has to do with the fact that the basics of a useful page --- images, text, hyperlinks, and some styling --- were essentially solved problems many years ago, so all web developers are left with is constantly trying to "reinvent" and "innovate" things that were, albeit not perfect, entirely usable. The "do everything in JS" trend is one of the clearest examples of this, as well as the "appification" of sites that are perfectly fine as a collection of statically linked pages. Blogs are the most common example of this that I come across --- instead of displaying the content using HTML, they require the browser to download and execute a huge blob of JS that often exceeds the size of the post content itself, just to show that exact same content, sometimes even making another (AJAX! It's awesome, so let's try to use it everywhere!) request to fetch that content in a different format, parse it, and then render it back into HTML. It sounds very Rube Goldberg, yet a lot of developers think this massive waste of resources is somehow normal.
Thanks to the Internet Archive, you can try visiting the BBC site from (exactly!) 10 years ago on your mobile device:
I don't have a mobile device with me at the moment, but it's clearly less "fluffy" than the page today, and loads essentially instantly even from IA's not-so-fast servers.
Moreover, I think they're a good thing as a shorthand when thinking about problems. You don't have to think about why this is a best practice for every problem you encounter, because if you did, you'd never get anything done. E.g., why don't we write all of our code in a single main method? Oh yeah, because it becomes a nightmare to find bugs, update to add new features, reuse algorithms inside it, etc. Instead of going through the rationalization every time, the Best Practices rule of thumb generally works out.
Where it doesn't work out is if you've never done the exercise of why they're best practices in the first place. For every best practice I follow, I have understood the justifications and as a result, have developed an intuitive sense for when my situation is the exception to the rule. Or, if my intuition fails me and it gets questioned, I can re-run the exercise and perhaps change my mind.
Aiming for a lighter site wouldn't just be good for mobile, it helps everyone. There's currently too much focus on presentation and too little on content. Even if the content is good, sometimes the presentation just gets in the way.
The 'funny' part is that the sites would sometimes break without anyone noticing, because every developer and PM would have an ad blocker installed that filtered out the 'extras' that caused the breakage...
If you attempt to type your username & password while the page is loading, it'll keep trying to clear & refocus the username field. It forces you to wait 5 to 10 seconds before you can use it. If you are typing without looking at the page, you will often end up entering part of your password in the clear in the username field. If you're impatient, you might even hit enter accidentally and send your password as your username to the server. Yahoo Mail used to be like that; I haven't used it in a long time.
But as a user, that's not really relevant. I want to browse the web of 2004 on the internet connection of 2016.
Raw HTML provides prefetch hints. That, combined with fast servers and replication to ensure those servers are near your users, are all a fast website needs.
Of course if your website is actually an app then maybe it shouldn't be a website to begin with.
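The prefetch hints mentioned above are plain markup in the document head; something like the following (the URLs are placeholders for illustration):

```html
<!-- Ask the browser to resolve DNS and fetch likely-next resources early.
     Hostnames and paths here are placeholders. -->
<link rel="dns-prefetch" href="//cdn.example.com">
<link rel="prefetch" href="/next-article.html">
```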
Remember the adage "you get what you measure"? Well, for a while Google has been using time to document-ready (or some such) as its page speed metric.
What if they started measuring post page load stuff?
SEO teams could start to measure page stability and budget for it.
I generally only make "enterprisey" web apps these days, but I will make sure they are usable all the way down to a 2G connection.
And you don't just do this on the $800 top of the line phone, you try it on other OS's, last year's model, a model from 3 years ago, that 2 year old phone from my mother in law that is still crapped up with games and other stuff so it runs extra slow, and then you adjust accordingly.
And the thing is this really doesn't take that much longer, and in the grand scheme of things budgeting for a few older devices to test on is cheap as shit.
Perhaps it's time for a wall-of-shame which automatically gathers and displays performance data for famous mobile websites to pressure people into fixing stuff?
"the page is jumping around"
This is the most annoying part of mobile web browsing.
But seriously, after 20 years of web development this is what we end up with? Millions and millions of man hours invested in this and the experience is worse than ever.
Somebody should write a "mobile web is a ghetto" rant and then we should start burning all of this down and start over with a fresh list of best practices.
Even the BBC which has enough funding and a good tech staff is likely working with some bloated content management system that automatically generates overweight pages. Add in all the ads and other widget functionality, media and various customizations and themes and this is what we end up with.
There won't be any major progress until there's a big cause to prioritize this (as in a major loss of traffic or revenue) or there are tools that will automatically reformat and optimize pages (certain CDNs are getting into this heavily now).
A small but complex JS file can have a bigger effect on latency and page interactivity/performance than many large images.
I can download the complete works of Shakespeare from Project Gutenberg as plain-text with less bandwidth than one of the full-resolution JPEGs my digital camera takes.
And not filling the page with crap
And I'm on a fast 4g network and good phone, LG G3
It wouldn't need to be so, but as long as browsing on my phone makes me angry and frustrated, there's no point in engaging the Internet on my phone. I'll just use my phone for what it was intended for: reading ebooks.
Unlike desktop browsers, mobile browsers will not allow you to leave a page open and come back to it without reloading the page, which makes them effectively useless for so many situations. Even just locking the screen while on a page may result in the page being reloaded as soon as you return and unlock the screen.
One of the major original benefits of having an always-on internet connection was the ability to look things up on the fly; to spend 10 seconds checking something on Google before continuing with what you're doing. With my N900, I used to be able to move through a webapp action by action, locking my phone to get out of the metro car, unlocking it on the way up the escalator, locking it for the walk to work, unlocking it on the elevator, and completing what I was doing as I sat down at my desk.
On modern mobile web browsers, this is impossible as far as I can tell. I can't even read the same webpage reliably in a dozen 10-15 second increments over an hour, because many of the times I unlock my phone to look at it, it reloads. If I'm really lucky and don't forget to leave it alone, it may eventually position the scroll back where I was. Of course, that takes up the 10 seconds I had, so now I'll roll the dice again in five minutes when I have 10 more seconds.
This is the single biggest problem I have with mobile browsing: I can't actually use the phone as a web browser the way I use my desktop web browser. The tabs are useless because they don't load and keep content. Multiple-action webapps are useless because the browser won't remember where I was even though I haven't deliberately exited the application. Articles or blog posts longer than 5-6 screens are pointless because I will not be able to read them in the times I have, because the browser assumes that I'm doing nothing else for the next 5 minutes nearly every time I open it, "Wait, wait, let's start over from the top, shall we?"
TL;DR: the biggest problem with mobile browsing is the browsers, not the content.
After a few minutes with any mobile device I'm normally tense and angry and just give up and find a computer instead.
I also hate navigating on a touch screen, but that's sort of irrelevant in this context.
It's a great idea but designers always lop off features that don't scale down well instead of fixing them. Compare a user page in Reddit on mobile vs. desktop. The mobile version is pretty, but the information is lacking.
Everything is slow. I don't dare touch the page until it's fully loaded (and I don't always know that has from the browser's progress bar) and stops jumping around. Pages are designed for bigger screens (they don't fit on my phone's screen in a useful way) or better eyes (fonts are really tiny). Also, the phone is just a bad device for doing anything complicated, such as following a link (hard to hit tiny links with thick fingers) or looking up information from multiple sources ("tab" handling is abysmal, at least with Chrome on Android). Etc.
If you want text reflow on touch working on Firefox for Android, you've got this:
Text Reflow by david2097152
and its always-active fork:
Android Text Reflow by richmoz
This could also come in handy:
Fit Text To Width by Jarrad https://addons.mozilla.org/en-US/android/addon/fit-text-to-w...
The first two say: "This means it will only reflow ONE THING AT A TIME. If the page has 50 paragraphs, you will have to tap once on all 50 to reflow the whole thing."
The third one simply does nothing for me.
EDIT: I just tried the third one by zooming in on a few desktop sites and I'm impressed. Make sure text size is set to "tiny" in the accessibility settings. It definitely works for me™.
this can be fixed by going to the settings via the chrome menu and turning 'merge tabs and apps' off, if that's the behavior you're referring to
It's not just a gripe either. I often avoid sites that do this stuff in favor of others that might have content or service I need without the headaches. The ones that have little competition or all do stupid stuff still get my visits. Just fewer of them due to the headaches. I doubt I'm the only one that reacts these ways.