Hacker News new | comments | show | ask | jobs | submit login

Am I the only person who finds it ironic that a web page about keeping web pages small comes with multiple megabytes of Javascript?

It doesn't need it, folks. It's a static blog page.




Couldn't agree more. If you look at what's happening in the network tab of the developer tools, you'll see it's doing a lot more than providing just a static blog page.

Instead, every x seconds it executes another POST request with pretty much all the details they can gather (scroll from top, scrollable height, referrer etc.). As soon as you start moving your cursor, the new requests start adding up very quickly, with lots of new params such as "experimentName: readers.experimentShareWidget" or "key: post.streamScrolled".

It really is collecting every single interaction with this page. As it's provided by Medium I'm sure it's part of their data collection program.
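To make the pattern concrete, here's a minimal sketch of what that kind of interaction beacon might look like. The event key "post.streamScrolled" comes from the thread above; the function name, field names, and endpoint are assumptions for illustration, not Medium's actual code.

```javascript
// Hypothetical sketch of a periodic scroll-tracking beacon.
// buildScrollEvent is a made-up helper; only the "post.streamScrolled"
// key is taken from the requests described above.
function buildScrollEvent(key, scrollTop, scrollableHeight, referrer) {
  return {
    key,                 // e.g. "post.streamScrolled"
    scrollTop,           // pixels scrolled from the top
    scrollableHeight,    // total scrollable height of the page
    referrer,
    sentAt: Date.now(),
  };
}

// In a browser, something like this would fire every few seconds:
// setInterval(() => {
//   const evt = buildScrollEvent("post.streamScrolled",
//     window.scrollY, document.body.scrollHeight, document.referrer);
//   navigator.sendBeacon("/stats", JSON.stringify(evt));
// }, 5000);
```

Each timer tick produces one POST with the reader's current scroll state, which is exactly the "requests adding up" behavior visible in the network tab.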


I wonder if those experimentName parameters have to do with serving different example images to different readers. Below, u/Anhkmorporkian mentions an image of sneakers, but I saw images of a woman and of the Golden Gate Bridge. Did other readers see different images?


Much more likely the experiment is from Medium itself, A/B testing different sharing UI.


It's a Medium article. The author didn't write the platform.


The Faustian bargain that writers of web performance articles make: use a minimal static platform that's fast and get minimal coverage and analytics, or use a slower, fully fledged platform that offers a built-in audience, analytics, and features.


It loads 500 KB of JavaScript files, not including the two CodePen embeds. Also, there is actually functionality on the page; it's not just a static page.


I'm on mobile right now, so it's difficult to look at the source, but wouldn't the JavaScript be cached in most scenarios, while the images would be unique per page? On a side note, I have no JavaScript on my own blogs.


Besides uncached views (which I'm guessing would be a decent amount of the HN traffic for this submission), you also have to consider parse/eval time. Look at the flame charts in Chrome DevTools for some of your favorite sites and you'll probably see a few that take almost a second to parse/eval cached JavaScript. It will be even slower for mobile users.

Just because it can be cached doesn't mean it's free or even necessary.


I wouldn't count on anything being in the cache, especially on mobile. Even on desktop, I've just checked my cache and every Medium resource shows exactly 1 hit. With ~350 MB in my cache (the default value), it's not going to stay long before being evicted.


gotta data mine the plebs


And all the CodePens are also slow. Between clicking the "run" button and the script actually running, there's a gap of about 3 seconds.

Which kind of puts the opening paragraph into question:

> I’m passionate about image performance optimisation and making images load fast on the web.

Like, ok, why aren't you using a simpler blog engine to post? As a plus, if you were in control of everything on the page, you wouldn't even need CodePen; you could just include the JavaScript directly in your page.


Perhaps they don't want to worry about deploying a simpler blog engine?


That's a common attitude; I was just pointing out how it casts doubt on the "passionate" part.

I've noticed the word "passionate" gets thrown around a lot cheaply.

It's seriously starting to lose all meaning.



