I feel like the 1MB limit is excessively generous, especially for text-only pages. But maybe that's what makes it so damning when pages fail to adhere to it. I know at least one website I maintain fails it spectacularly (though in my defense it's entirely because of that website being chock-full of photos, and full-res ones at that; pages without those are well under that 1MB mark), while other sites I've built consist entirely of pages within a fraction of that limit.
It'd be interesting to impose a stricter requirement on the 1MB Club: one where all pages on a given site are within that limit. This would disqualify Craigslist, for example (the listing search pages blow that limit out of the water, and the listings themselves sometimes do, too).
I also wonder how many sites 1mb.club would have to show on one page before it, too, ends up disqualifying itself. Might be worthwhile to start thinking about site categories sooner rather than later if everyone and their mother starts spamming that GitHub issues page with sites (like I'm doing right now).
My toy project https://k8.fingswotidun.com/static/ide/?gist=ad96329670965dc...
Using math.js is by far the heaviest part but at least your Asm already knows things like solarMass and planckConstant :-). CodeMirror comes in second heaviest but for user-experience-per-byte you're always going to be streets ahead of a gif.
Yet most of the time this is used to load React to make a button clickable.
It's not a bad thing, it's just a thing.
If you have a few megs of images that never show up because they take too long to load, there is no point.
SVG only works in a few places.
No kidding. I just checked and the average text-only page on my blog is well under 100kb. Even the image-heavy front page is under 1MB...
Even the page where I used images came in at juuuust about the limit -- to the point where you'd have to make a ruling on whether the rest of the page counts as gzipped, because over the wire it was technically <1MB but just above that after decompression. The site does specifically say "downloaded", haha
(This is a flexible target, depending on the complexity of a page. E.g. for a "bloated" page, a fully styled video display for competition winners showing 200+ entries and 280+ individual videos in categorized views is about 250K, including a few images, two and a half font families and the Vimeo Player SDK, but excluding the load of any external video streams. However, with compression we still manage the 140K mark.)
Then reality hits: Client insists on a full-width photographic hero image as it's still 2014. Usual controversies about a full-size intro video (autoplay, of course), we must have this highly intrusive chat asset installed, etc, etc… – And we easily blow the 1MB limit.
Let's say you write a daily blog. A single A4 page of text contains on average 3000 characters; your posts average slightly above that, at 4000 characters.
How long until the text content alone is above the 1MB limit?
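Back-of-envelope: at roughly 4,000 bytes of plain text per post, 1,000,000 / 4,000 = 250 posts, so a bit over eight months of daily writing before the accumulated text alone crosses 1MB (assuming it all ends up on a single page, e.g. one long archive page).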
Being more verbose is generally just poor writing. Now, using a separate website per topic seems like a silly limitation, but the more topics being discussed the less relevant the discussion.
And a 500kB image can contain megapixels, which will slow down non-GPU browsers.
Personally I always go for a max 2 sec. limit on all devices.
For context, 1MB is the same order of magnitude as the original Doom which was about 2.4MB in size. 
Even better - weigh fixed site-wide assets (i.e., CSS, JS) against the features actually used; that way, loading an entire framework only to use a small % of its features is penalised.
Google's headline research on the subject says "Improving your load time by 0.1s can boost conversion rates by 8%."
Some additional data, sourced from the Nielsen Group, below:
- Google found that increasing the load time of its SERPs by half a second resulted in a 20% higher bounce rate.
- Google found 53% of mobile visits ended if a page took longer than 3 seconds to load.
- Akamai aggregated data from 17 retailers (7 billion pageviews) and found that conversion rates were highest for pages that loaded in less than 2 seconds; longer load times correlated with 50% drops in conversion rates and increased bounce rates, especially for mobile visitors.
- BBC found that for every extra second of page load time, 10% of users will leave.
Want to sell / fix this?
Here are the best three simple resources illustrating objective third-party results from increasing site speed:
Here is a more compelling deeper look from a leader in the UX space:
Here's a really well written article about how to PROVE to the powers that be that site speed is worth testing:
In e-commerce, the biggest factors (that come to mind) to compete on are—SEO and brand aside—price, offerings, and convenience. Convenience has two dimensions: UX and performance.
If a user has a very clear idea of what they want from your site, they'll probably be patient. If a user is channel surfing (and the vast majority are when it comes to shopping, comparing options, etc.), then every millisecond matters. Every millisecond spent loading is a millisecond not spent selling/convincing.
If I'm spending an hour's pay on something I really want, of course I'll wait 10 seconds for the page to load, or if I can't get it to load at all, I'll make a note to try again later from a different browser or internet connection. I'll manually enter my address details if they didn't auto-fill properly. I'll respond to emails on what I want to declare for customs, and various other efforts.
I, and people I know, feel like we buy very little on a whim. Is that unique? Are there whales who buy everything you can put in front of their face, or a different demographic who searches for something to buy and then changes their mind in precisely 850 milliseconds?
I would accept that, like candy in the checkout lane, the profit is small but worth more than the extra effort it takes to put the offering there, even though the revenue is small compared to the actual stuff people need to buy. But the analytics suggesting that hundreds of people click a link to your page and most close it faster than you can read the headline just seem unbelievable.
Maybe, maybe not, but site load time has a very well documented correlation with bounce rate. And if a user bounces, they're obviously not going to end up spending money.
I believe it's more about keeping the user in a sort of sales/marketing 'funnel'. If I'm just browsing and come across a service I might be vaguely interested in, perhaps I'll click the link to view their page. Now I'm at the top of the funnel. With each link I have to click to follow, there is an opportunity for me to get bored, get distracted by something else, or so on. Maybe I'm not buying anything then, but maybe somewhere along the funnel is a free trial of the service, and if I'm in a pleasant mood, maybe I'll do the trial. But if I bounced because of a 5 sec load time, chances are good the mildly impulsive mood I was in is not going to persist enough to encourage a subsequent visit.
So it might not be an immediate thing, but better engagement does lead to more sales. I like to consider myself a mindful shopper, but there are definitely a handful of $5/month services I am subbed to because I was in a vaguely impulsive mood and stumbled into something I thought might be useful -- and their signup process was silky smooth.
And if you are instead shopping for an item and browsing several different sites, if one of them sticks out because of its slow load times, that's a tab you are going to close.
When I go to ebay I end up on the landing page. Then I have to enter my product name into the search. Then I will go into search settings and filter out auctions (if I want to order immediately). Then I have to open 3-4 different product listings. That's around 6 page loads. If each page takes 10 seconds then I would have to spend 1 minute doing nothing but waiting for the site to load.
> or if I can't get it to load at all, I'll make a note to try again later from different browser or internet connection.
Isn't it far more likely you are going to buy whatever you want from somewhere else?
>I, and people I know, feel like we buy very little on a whim. Is that unique? Are there whales who buy everything you can put in front of their face, or a different demographic who searches for something to buy and then changes their mind in precisely 850 milliseconds?
It's not about buying. It's about which store you visit when buying something. If your site performs poorly then users might not come to your online store. They may go to a competitor.
The thing about those averages is that they are quite unintuitive. Remember they are averaged over thousands or millions of users. There will always be some users with latency spikes because their device is much slower than usual or simply because servers are running at capacity. They are in the minority and have small effects on average metrics but the latency they see could be very high, maybe even up to 5 times the average. If you can decrease average latency by 1 second then the site might be 5 seconds faster for a minority of users.
>I would accept that like candy in the checkout lane, the profit is small but worth more than the extra effort it takes to put the offering there, but the revenue is small compared to the actual stuff people need to buy
You have to consider that each product listing on an online store is basically an ad. You are not necessarily buying all the products you see, but each product listing informs you about what's available in the marketplace. If pages load faster you end up seeing more "ads", which increases the likelihood that you eventually do buy something.
They seem to come from a time when mobile phones were much slower and there was a valid use for AMP pages. A/B test data is very easy to cherry-pick from, and you can see a 10% increase in conversion based on random chance. I can see my marketing team misinterpreting data like that and using it for promotion. I am skeptical.
I improved the initial loading speed of a website so it was 30% faster on an old mobile phone, and it made no difference in conversion based on the analytics.
Most people used faster phones with faster internet, so it really didn't matter. After the first page load, most of the bloated assets were cached, so even those with slow connections were relatively fast navigating further.
There could be some types of websites where speed matters more, like if you promote clickbait articles with high bounce rates. But if you have high-quality content, it isn't going to be that important unless you have really terrible bloat.
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 48 48"><path fill="#662113" d="M27.3 5.7c4.9-3.1 11-3.9 16.8-4.1A162 162 0 0031 10l-3.7-4.5zm4.1 5.6c4.8-3.6 10.1-6.5 15.3-9.6a18 18 0 01-4.3 12.7l-11-3z"/><path fill="#c1694f" d="M44 1.6l3.3-.3-.6.4c-5.2 3.1-10.5 6-15.3 9.6 3.7 1.2 7.3 2.2 11 3l-6 6.2c-3.6-.8-7.4-2.8-11-2.4a70.5 70.5 0 00-17 18.3c2.7.2 5.5.4 8.2.4h.5c-3.3.2-6.6.4-9.9.9 3.6.6 7.2.5 10.8 1-3.7 2.2-8.2 1.7-12.4 1.7-1.6 2.4-2.7 5.3-5 7.1.7-2.8 2.6-5.2 3.8-7.8.1-2-.4-4.1-.5-6.2L6.7 36l.3-.3c5.2-7 10.7-13.7 17-19.7l-1.6-7.7L27 5.4l.3.2 3.6 4.5c4.3-3 8.7-5.9 13.2-8.5z"/><path fill="#d99e82" d="M8 21.2C11.6 16 17 12 22.3 8.3L24 16A178 178 0 007 35.7c-1-4.8-1.3-10 1-14.5zm.5 15.2c4.6-6.8 10-13.4 16.8-18.3 3.7-.4 7.5 1.6 11.1 2.4L33 24.1c-2.5-.2-4.9-.4-7.4-.4l5.4 2.4-3.2 3.5c-3.7-.3-7.4-1-11.2-1l9.7 3a10.7 10.7 0 01-9.5 5.2c-2.7 0-5.4-.2-8.2-.4z"/></svg>
I went with:
<link rel="icon" href="data:;base64,AAABAAEAEBAQAAEABAAoAQAAFgAAACgAAAAQAAAAIAAAAAEABAAAAAAAgAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA">
Performance is important, but if your site is clocking in at 17kb then you're probably doing okay.
But calling larger sites a "cancerous growth on the web" just feels immature to me. Everything is just cost/benefit. There's no need to bring a black-and-white fanatical exaggerated mindset to it.
You can celebrate something you like without having to make the alternative a disease, sheesh.
There is zero benefit to me in loading 20mb of garbage just so I can read 10kb of text.
JS is not the issue and reddit could have easily made a snappy react version of the site. Monetization, ad tech and engagement growth are the issues.
I could also build a CNN clone in pure React (no SSG or SSR) without the ads and the millions of autoplaying videos that would demolish its current performance.
As for Reddit, perhaps SSR would be a better fit indeed. But most of its problems come from the fact that the SPA implementation is dogshit, and not from the architecture itself... Some things plainly don’t work, for example a feature as basic as the comment tree is broken as hell. This mess would be equally possible with SSR though - I would personally screw up more easily a comment tree implementation with AJAX data fetching in SSR templating and vanilla JS than in React.
2) Sometimes things might be useful to people other than you. The world doesn't exist to cater to your needs, and referring to things that aren't exactly what you want as cancerous is childish.
Aside from the marketing monkeys that put it all there in the first place?
Also, what’s with calling people monkeys? I’m sure many coders here produce stuff that is as questionable in value as what ad executives do.
People need to put food on the table and it’s abhorrent to think of yourself as better than another person because you build data pipelines in Scala that the universe doesn’t give a shit about.
>You can celebrate something you like without having to make the alternative a disease, sheesh.
You can, but would you be equally successful at it?
Name          Status  Type       Size
------------  ------  ---------  -------
danluu.com    200     document   2.5 kB
analytics.js  200     script     18.9 kB
collect       200     xhr        67 B
favicon.ico   404     text/html  266 B
Ah, the late 90s and early 00s when we still had user.css (in a meaningful sense).
Does an analogous window exist for QUIC and HTTP/3?
Yes, my goal (very hard to do sometimes) is to fit all of HTML + Push CSS within 14kb
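(For context, and to the QUIC question above: the 14kb figure comes from TCP slow start. With a common initial congestion window of 10 segments, roughly 10 × 1460 bytes ≈ 14.6kB, that's about what a server can send before it has to wait for the first ACK, so a page that fits gets delivered in a single round trip. QUIC's congestion control as specified in RFC 9002 recommends a similar ~10-packet initial window, so an analogous budget applies to HTTP/3 as well.)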
Please use uBlock Origin.
Try uBlock Origin. Don't use stuff like 'Adblock Plus/Pro'.
What I think they mean by this is that you shouldn't link to resources on their website to make it seem like they endorse your (product, website, whatever).
From the legal disclaimer at the bottom:
> linking to this website without written permission is prohibited
"If you have any comments about our WEB page, you can write us at the address shown above. However, due to the limited number of personnel in our corporate office, we are unable to provide a direct response."
I see they have a link to "Berkshire Activewear". Now that's a much much more heavyweight page.
Can it be assumed that this same website has been in place since 1978? Obviously not exactly like it is now, but probably not far off.
I wonder what the logic for the 1978 date is. It's hard for me to believe they had any reasonably connected predecessor of this in 1978.
But, the website looks rather similar to how it did back in 2001 (with the recognizable two-column list of bullet points):
How about you average their subsidiary web pages? Start with DQ.com (Dairy Queen)
In situations like that, it's right and proper to ask who built the site. Then shake your head with absolute contempt.
The correct response is to work out who was responsible for maintaining the site for the past five years.
Includes analytics, chat client, big product screenshot, theming and payment integration, so you can still do a lot well within 1MB.
Assuming kB and dialup, which was 83% of the US at the time (easiest stat to quickly find), that was a difference of about 23s vs 121s load time.
Edit: Seems like the UK would have been about 50/50 around that time according to this article:
Page size does not matter. Render time/time 'til functional does.
Stripe.com is 1.2MB fully loaded but is pretty much all non-blocking, so it feels as fast as a very small site. By the time you've scrolled to the bottom of the homepage it's loaded >2MB. They convert pretty well, and my experience is that Stripe knows what they are doing better than most. So what should we be optimising for? Total page size? Or a lightning-fast experience with non-blocking content which can be as big as we require?
IMO developers need to optimise for building fast sites, not light ones for the sake of being light.
In these situations all I want to say is "Who cares".
Optimization matters when it actually solves a problem, before that it's just wasted effort.
I don't hear the general public yelling from the rooftops "The web is too slow! Developers are building too fast! I wish we could go backwards!"
Meanwhile valley bros keep helpfully reminding me to upgrade my computer to a reasonable 2020-ish one.
It seems a lot of the newer generation just doesn't realise what was already possible before, and is only infatuated with inefficient reinventions.
This is said by someone who is not a "valley bro" and often lives in rural Canada.
YouTube redesign, Twitter redesign, and one other huge site I can't remember at the moment all had plenty of complaints about how much slower --- and less featured --- they were. The average user doesn't know a static page has been replaced with a bloated SPA, but can sure feel the difference.
The web has declined for sure, and it's precisely because of ignorant developers with attitudes like yours.
To any twitter "engineers" on HN, sort it or abort it.
It's deeply disturbing that if they go full-SPA like YouTube did, it seems not a single person working there realises the insanity of somehow needing so much software (and hardware) just to read 140-character long microblogposts, something that would've been possible with hardware and software from 30 years ago. If you look at their minimum browser requirements, then look at the minimum OS requirements from the browser, and go back to hardware requirements from that, you'll probably end up with something that seems ridiculously overpowered for the task, yet with that "minimum supported" system the site will still be slow as molasses and worse than the static page. Seriously, what the fuck!?
Not only due to developers. Designers have a hand in this too. Example: on many sites (Twitter, Imgur, etc.), it is not possible to simply zoom in on an image without jumping through a lot of hoops. You hover the mouse over the image, the pointer becomes a magnifying glass, you click and...
The image blows up to fill the screen until it hits an overly large and useless border that some daft designer thought looked cute. There is no way to zoom in any further; that is blocked. So zooming to see a part of the image in full detail is out of the question. What's worse, if you have a small screen, likely to be the reason you wanted to zoom in the first place, the zoomed image is actually smaller than the original. Great!
This list has a second hidden benefit: Typically, I find that those who have very lightweight pages have more interesting things to say.
That being said, that's not a hard and fast rule, as I both have a lightweight page, and not much that's interesting to say.
I can point to two blogs on either side that have a lot to offer.
Large bundle: https://www.joshwcomeau.com/
Small Bundle: https://www.eugenewei.com/
Definitely. This is just something I've noticed.
> and a straw man.
Wait what? I was just bringing up something I've noticed. How is this a straw man? As I said before, this is by no means a firm rule, just something that I noticed is frequently the case.
PS. Thanks for linking those blogs, I've added both to my reading list.
And just as someone who grew up horrified if individual pages got over 100kb (or whatever the company rule was), the idea of not caring at all about "page" weight is how my old company wound up passing 20+ megs of JSON around just because it was a little easier.
The same thing is not true for the general public.
Except they usually yell "My phone is too slow" when the problem is actually the web that got 3X bloated since they bought their phone.
With the major disclaimer that if your website serves ANY commercial utility for you personally or for your business, website speed is hugely important, with a clear cause/effect impact on engagement, conversion rate, bounce rate, etc.
Makes them feel like the site actually does some calculations, even if everything is cached in the background.
See travel agencies, airline comparators, train booking services, insurance benchmarks or credit simulations.
I mean never mind that my internal page links load in <50ms thanks to prefetching and all the other smart stuff that Gatsby does.
I'll keep listening to my users.
On the other hand, Gmail regularly takes 10s+ to load and yet most users are glued to it.
If you're a SaaS business, your marketing page is probably not your app. The people using your app care about usability, and page load speed is rarely the driving factor. Especially with SPAs.
Better yet, ask your actual customers.
Except on forums. We complain there :P
What I've found is that the users I have that complain are by far the minority of my users. They ask for every little whim, many of them conflicting. If I'd been listening to them my website would be a mess right now. Alexa top 5k. 7 million uniques / day. 550kb. Less than 2 years old.
Every time I implement a new feature, there is no significant change in my site's trajectory. Improving latency by adding an auto-scaler, however, has led to a massive uptick in usage in the past 90 days.
People obviously care because people obviously want their browsers to load content quickly. No one likes waiting for webpages to load.
How do you decide when a website/webapp is fast enough and you've "solved the problem"? On what range of devices and internet speeds do you test on?
As my family's personal IT assistant, I can't tell you how often I get the complaint that "my phone is too slow" or "my laptop is slow and heats up" when all they are doing is browsing popular websites. And we live in a developed nation with decent internet -- I can't imagine how awful it must be when you don't have those luxuries.
I think your comment is lacking severely in perspective, and the mindset you've demonstrated to be particularly bad for the web. Not to mention snide and dismissive.
That’s actually a core reason for slow pages. People in offices with large monitors and huge bandwidth develop those sites
I think it might be like bicycle parts. It is ridiculous that parts are sold with a pricetag PLUS a weight in grams.
I mean, you could save the equivalent of $100 by wearing a short-sleeved shirt.
But those obsessive people end up making most bicycles lighter weight, and thus bike rides for normal people are more enjoyable.
Flippant but I doubt there's any clear correlation between reducing the size of a web page and reducing overall CO2 emissions.
Ironically, you can probably Google around and find it.
"Doom as a tool for system administration" , "What can you do with a slide rule?" , and of course some vintage 90s personal pages... some of which have been actively maintained to this day (!) 
edit: that last page has some incredibly-detailed information about the various cars he's owned 
When I started writing Web sites (late '90s), we'd get spanked for 50K pages.
I'm not sure it's possible, anymore, with a lot of the CSS and JS libraries; let alone the content.
But back then, we didn't have content delivery networks. CDNs make a huge difference.
This was back when Lycos was news to most people and Alta Vista was just launching. It was a poor man’s Yahoo. I don’t think it was a coincidence that the end of its usefulness came so close to the birth of better alternatives. The ungainliness of the status quo is usually what goads someone into trying something new.
It was an observation, and recounting some personal experience.
What is bloat, as far as I'm concerned, is hitting a page with a 4K background video.
If you want a real "I had to walk 2 miles, barefoot" story, I can tell you about writing [C]WAP/WML sites (the old "m.<yourdomain>" sites). I actually considered using XSLT to extract WML from the regular site HTML.
priorities all over the place and major regression in the most crucial things
I love the idea, just needs a better measurement.
I do agree that 1MB is arbitrary, but it's the right kind of arbitrary IMHO.
1. Weight sites by some popularity count. Otherwise, does anyone care if some random small site generates a 0.1k response just to get to the top?
2. Do you plan to continuously verify that the download size is still accurate?
12 requests total, 322kB transferred (914kB uncompressed, so either way), finish in 991ms. Request was made between Texas and AWS us-east-1. Server running on a T3a instance. This is .NET Core 3.1 on Windows Server 2019.
Of the 322kB, blazor.server.js is 39.2k and our JS interop shim is 3k. This represents all that is required to communicate between server and client. The rest of the static source is ace code editor, bootstrap, jQuery and open iconic. All remaining interactions happen as deltas over the websocket connection.
Every day that I work with Blazor, I am further convinced that it is the only way to build our web applications moving forward. I don't have a good answer for Blazor server side in mass-scale public-facing scenarios (a la Netflix), but anyone can come up with an edge case to poke holes. Our 'internal' dashboards are actually accessible from the public web and we have not experienced any DDoS or other performance incidents. If something does come up, I will just pop the VM over to the internal network & require VPN access. That said, I really prefer accessing Blazor server side apps as directly as possible due to the latency concern.
So, instead of sending tons of js on the client side that does the job, let's connect via websocket to the server and let the server return HTML with new content dynamically, yea?
Does that mean LiveView?
I guess we need a term for that since every framework is using a different name for it.
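In client-side terms the pattern looks roughly like the sketch below; the endpoint URL and the { selector, html } message shape are assumptions made up for illustration, not LiveView's or Blazor's actual protocol:

  // Minimal client half of the "server sends rendered HTML over a websocket" idea.
  const socket = new WebSocket("wss://example.com/live");

  // Forward user interactions to the server instead of handling them locally,
  // e.g. <button id="post-42" data-action="upvote">.
  document.addEventListener("click", (event) => {
    const target = event.target as HTMLElement;
    if (target.dataset.action && socket.readyState === WebSocket.OPEN) {
      socket.send(JSON.stringify({ action: target.dataset.action, id: target.id }));
    }
  });

  // The server answers with ready-to-insert HTML; the client only patches the DOM.
  socket.addEventListener("message", (event) => {
    const { selector, html } = JSON.parse(event.data);
    const node = document.querySelector(selector);
    if (node) {
      node.innerHTML = html;
    }
  });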
Single page applications are horrible, and oh-so-common.
With good code splitting, which is supported basically out of the box by Webpack and all the big front-end frameworks, I would argue a well-written SPA transmits less data in the long run than a fully server-rendered app. The initial payload is bigger by a few hundred KB sure, but after that:
- Like server rendered pages, you still only load pages on demand, thanks to code splitting
Yes you’re still making a request for fresh data on every page load, but while the code is cached you need only download the raw data itself. You’d have to be really clumsy to send a JSON payload bigger than a full HTML document containing a rendered representation of the same data.
Of course, this only really applies to apps. Static or infrequently changed content, you’re better off either server rendering or just serving static files.
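For illustration, a minimal sketch of the code-splitting point above; the "./settings" module and renderSettingsPage are hypothetical names, but bundlers such as Webpack or Vite will emit a dynamically imported module as a separate chunk that is only fetched on first use:

  // The dynamic import below becomes its own chunk, downloaded the first time
  // the user actually opens the settings page rather than in the initial bundle.
  async function openSettings(): Promise<void> {
    const { renderSettingsPage } = await import("./settings");
    renderSettingsPage(document.getElementById("app") ?? document.body);
  }

  document.getElementById("settings-link")?.addEventListener("click", () => {
    void openSettings();
  });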
Seriously though, this isn't just a "grammar" thing. If you don't know the difference between millibits and megabytes, you probably shouldn't post on the subject of page size (or anything else relating to bandwidth consumption / network performance).
I still hate it when people flag comments like yours because maybe it sounds "offensive" or something, but you can improve your tone too.
 - https://en.wikipedia.org/wiki/International_System_of_Units
In this case though, it was the context (1MB Club) that gave the clue. So, no.
My blog is pretty lightweight, but I use Japanese in the about page, and that font on that page blows everything else out on download size. I could subset it just for that line, but consider sites containing entirely CJK content...
I'm a huge fan of minimalism, using gopher:// frequently.
But blanket statements like this seem a little too much.
I'm personally aware of some educational tools that non-techy educators are able to use with just a username and password.
A few years ago, in order to deliver tools like that, the process was a physical CD, some key, some installation wizard... and if you had a Mac? Tough luck.
Let's be conscious about bandwidth usage and bloat, but we are no longer in the '90s and, even if imperfect, progress has happened in tech.
Performance is just one piece of the UX puzzle. Users don't really care about bundle size... They care if your site provides value in a reasonable amount of time.
> Min is used by over 65,000 people in 195+ countries, from North Korea to South Sudan to Mongolia to Somalia. Think your software is critical? Try a webapp keeping you alive in a warzone.
Now, whether or not this actually happens is a good question, but it does seem like a plausible possibility.
Can you share some such situations?
Min's rationale further brings to mind things like disseminating info on hospitals, shelters, etc. in war zones, often to people who at most have an ancient (by first-world standards) phone with expensive and slow network connectivity.
Performance metrics are better when put into perspective.