Hacker News
What should have been an IMG element became this (twitter.com/csswizardry)
287 points by seapunk on Oct 20, 2019 | 143 comments



The hoops people are going through to justify this are crazy to me. The comparison is not no-JS vs a business-capable website. They could serve a static image tag that would load quickly, and then load the comments/content around it. Progressive enhancement has been a well-known development pattern on the web for a very, very long time.
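
Roughly what that pattern looks like (hypothetical markup and filenames, not Imgur's actual code):

    <!-- the core content ships in the initial HTML -->
    <img src="https://i.example.com/abc123.jpg" alt="the image you came for">
    <!-- everything else enhances the page after the fact -->
    <script src="/js/comments-and-voting.js" defer></script>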

The comparison here is, "serving your content, and then doing whatever the heck you want", and "dynamically fetching all of your core content on the fly after synchronously downloading and executing all of the auxiliary content that >85% of your users do not care about."

I think Youtube is overengineered as well, but at least Youtube has the good sense to prioritize loading the video first, and the recommendations/comments second.

It is surreal to jump from a thread with a Google dev telling me that user-agent scraping is necessary because progressive feature detection would require shipping unnecessary polyfills to modern browsers, to a conversation with an Imgur dev telling me that because 5% of their users need a special feature, everyone should wait twice as long to get core content served to them.

I realize these aren't the same people, and the web development community is diverse, but... I don't know how to reconcile those perspectives. We will jump through so many crazy, horrifying engineering hoops to get bundle sizes down: using statistics/user-behavior to calculate dynamic bundles on the fly, user-agent sniffing, compiling Javascript frameworks, prefetching URLs, HTTP2. But serving an img tag with our core HTML is a bridge too far?


It's cargo cult development. I've seen it everywhere I've worked, and have been a huge offender myself. It's the engineer inserting themselves vs. doing what's best for the end-user: KISS (keep it simple, stupid).

As a webdev I actually target older methods in order to get my work done. Is it as "sexy" as a ton of stuff you mentioned? Nope. But I can get a lot done with 20-50 lines of jQuery (or maybe a bit more verbose w/vanilla JS) by directly working with the DOM vs. roping in frameworks in every situation I find myself in. I try to build those "business capable" websites you're mentioning in the first paragraph.
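
To give a flavour of the kind of thing I mean (class names invented for the example):

    // toggle reply threads without any framework
    document.querySelectorAll('.comment .reply-toggle').forEach(function (btn) {
      btn.addEventListener('click', function () {
        var replies = btn.closest('.comment').querySelector('.replies');
        replies.hidden = !replies.hidden;
      });
    });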

The biggest thing that bothers me is most of us in the industry are "abstraction obsessed" where we would rather have discussions on the complex tools we use vs. how we solve the problem at hand. God help you if you're that one older dev in the back of the room going, "yeah but can't we just throw some vanilla JS/basic jQuery/simple DOM manipulation at that and be done with this?"


Right on. I think some newer devs come out of school not actually knowing how to implement their own, from-scratch application using vanilla JS or jQuery, so this stuff seems frightening to them; others are obsessed with the (possibly true) idea that the thing they build with React will be more extensible in the future, while discounting the poor fit in the present.

Something tells me that you could probably serve a page with almost no JS on it and accomplish the same thing Imgur does, including the comments and monetization stuff, with "server side rendering", which is what we used to just call "CGI serving HTML" until that wasn't trendy enough.


> I think some newer devs come out of school not actually knowing how to implement their own, from-scratch application using vanilla JS or jQuery

I'll go out on a limb and say that most webdevs (regardless of schooling) can't handle basic "from-scratch" work using a language without a framework... it's terrifying. It's a lot of trend-followers who often can't articulate their decision to use one tool over another (framework, lib, or otherwise).

I have a JS interview question which is, "describe to me the common nodes one would often leverage in a DOM?" Really all I'm looking for is a basic understanding of the document, element, attribute, and text nodes, as those are the ones most often interacted with. I'd say about 80% don't have an answer/totally BS one. They still totally get hired though (and I'm seen as the big bad wolf who "thumbs-downs" everyone). This is in Southern California, in a major city where there are multiple FAANG offices =|
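
For what it's worth, the whole answer fits in a console session (the nodeType values come from the DOM spec):

    document.nodeType;                 // 9 -> document node
    document.body.nodeType;            // 1 -> element node
    document.body.firstChild.nodeType; // usually 3 -> text node (whitespace counts)
    document.body.attributes;          // NamedNodeMap of attribute nodes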

Webdev has always had huge variability in skill. You have WordPress "install 50 plugins" "developers" all the way up to top-level programming talent who are just as technically talented as top C/C++ folks (they just target a browser/web server)... and it's incredibly difficult to tell the engineers apart from the "devs" in my experience.

> Something tells me that you could probably serve a page with almost no JS on it and accomplish the same thing Imgur does

Yep - and I'll even argue it'd work better (more performant), and be a much more predictable product to develop on and ship!

> which is what we used to just call "CGI serving HTML" until that wasn't trendy enough.

100% agree and actually had a good chuckle at this =)


A company I know has been replacing some pre-Cambrian "Individual PHP-built HTML pages with the occasional JavaScript enhancement" pages with a React-based app. There is a case for updating the site-- it predates responsive design and several corporate rebrandings-- but I'm not sure new-era web tech is actually doing them any good.

It moved a lot of complexities from the server side to the client side-- management of state and orchestration of data, turning a single request for a finished page into a whole streak of API hits to populate out a template.

They went for an API-based approach with the thought that they could expose it directly for external power users, but the API ends up not even serving their INTERNAL needs well (the information you want for this report is scattered across 12 different tables, but combining them violates the REST spirit).

I'm not sure it will be faster or more stable than the old code even once they finish optimization. The only user-facing benefit appears to be that they replaced the 200-millisecond flash of white as it loads each page with a 5-second spinning beachball in the template as it collects the data a piece at a time.

I strongly suspect a lot of the motivation was the devs wanting to put some trendier technologies on their resumes.


Turning every site into a major SPA is the single biggest problem with webdev today. They should only be used for truly complex webapps that need real-time updates and interactivity.

Otherwise, like you said, it's just rebuilding everything that a capable server-side framework gives you to then run it in the browser with worse UX. I've consistently proven to my team that a server-rendered page with some optional JS on top (Vue is great for this) is 1000x better for users than a giant JS payload just to make API calls and render some HTML.
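
A sketch of what I mean by "optional JS on top" (Vue 2 style; the endpoint and element ID are made up):

    // the server already rendered the page; Vue only takes over one small island
    new Vue({
      el: '#comment-form',   // a server-rendered fragment
      data: { text: '' },
      methods: {
        submit: function () {
          fetch('/api/comments', { method: 'POST', body: this.text });
        }
      }
    });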


To be honest, that's just poor development practice.

We run a React SPA, and our backend for initial page load & API is in PHP.

Frontend devs work with backend devs to ensure the critical path of the page can be gathered in one request. This usually means having some expand query params on the GET request.

We also preload critical API responses in the initial page load to avoid the initial loading indicator.
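
Something along these lines - names illustrative, not our actual code:

    <script>
      // rendered server-side into the page on each request
      window.__INITIAL_DATA__ = { post: { id: 123, title: "..." } };
    </script>
    <script src="/js/app.js"></script>
    <!-- app.js reads window.__INITIAL_DATA__ instead of making a first API call -->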


^^ Also agree on this. I have seen the whole "it has to be RESTful" thing that the parent comment was discussing used to justify huge amounts of HTTP overhead too many times, however.


Thanks for taking the time to share this - because it's exactly what I feel is happening to webdev as a whole. We're adding unnecessary complexity by trying to put everything in the user's browser vs. rendering a sane and functional blob of HTML that just gets the job done, plus sprinkling in JS when needed for more interactive features!

> I'm not sure it will be faster or more stable than the old code even once they finish optimization. The only user-facing benefit appears to be that they replaced the 200-millisecond flash of white as it loads each page with a 5-second spinning beachball in the template as it collects the data a piece at a time.

If it's a ton of individual web requests there's very little chance that it will optimize anything, because it sounds like a lot of HTTP overhead. I've seen this same anti-pattern 1000 times in my career and frankly would rather put the processing on a very fast webserver that's optimized to crank through stuff like that with some SQL/joins/etc. (ie: doing a few joins on the server side and sending a "view" of the data is WAY more appropriate than putting the responsibility of lots of web requests on your client and increasing your HTTP overhead)...

> I strongly suspect a lot of the motivation was the devs wanting to put some trendier technologies on their resumes.

Yep - and often if that's the case they end up creating a mess of an application that has all sorts of crazy constraints that never needed to be introduced.

---

That being said, new-era anything can be good. Heck - I needed to do the "shotgun a whole bunch of HTTP requests" approach for a GIS mapping solution that would load chunks of data in at a time based on the map's viewport. It just needs to be justified, and most of the time I feel like people just wanna do what they're comfortable with vs. what's a best-fit for the problem.


I agree with everything you say. But what does Imgur have to say about it? When it's a business, the question in mind is not what's best for the user, but what's best for the business and the user. After all, that's the reason you do whatever it is you work on, right? They are providing a service.

I personally have no idea why they did this, but I don't presume to know either. However, I can add my two cents from the peanut gallery. When it comes to doing something right or wrong, complex or simple, the bottom line is: what does it cost? If you can't finish it in a reasonable amount of time with reasonable, little, or no tech debt, then it's not sustainable for a business. That's the part that literally every level of the peanut gallery ignores when looking at issues like this. And having said all that, I still can't imagine why they would want to put that much unnecessary load on their servers; perhaps they get something out of it in return? Perhaps they wanted more control over access to their content, and that required this waterfall loading?


Modern Google apps (Gmail, etc.) are incredibly bloated and buggy compared to the ones from 10 years ago, without any (to me) useful new features, so while I laud one engineer’s principles, it’s not clear they are succeeding at following them as an organization.


The new Gmail was a slap in the face vs. what they had. The older version was so much quicker and more performant.


It goes deeper than that. They actually do have plenty of tags in the generated source directly linking to an image, like twitter:image, og:image, image_src, they just don't want you to load the image before you load all their javascript crap.
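
i.e. the page source already contains something like this (filename illustrative):

    <meta property="og:image" content="https://i.imgur.com/example.jpg">
    <meta name="twitter:image" content="https://i.imgur.com/example.jpg">
    <link rel="image_src" href="https://i.imgur.com/example.jpg">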


> I think Youtube is overengineered as well, but at least Youtube has the good sense to prioritize loading the video first, and the recommendations/comments second.

One extremely annoying thing about this is that if your internet is slow, it can take a minute to load the title of the video you're watching. So if e.g. you get linked something without context and you may not actually want to watch it but do want to know what it is, too bad.


Former Imgur engineer here who worked on the desktop site and helped on mobile when I could. A lot of the code that is loaded supports features that are used by a long tail of users [1]. However, they do serve the javascript with appropriate cache-control headers and serve it from Fastly's CDN, so analyzing a cold load is a bit misleading, to say the least. Moreover, as other commenters have mentioned, they optimized more for the subsequent images than the initial pageload (they'd prefetch the next N images).
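
For illustration, long-lived fingerprinted assets can be cached roughly like this (an Express-style sketch with made-up paths and values, not Imgur's actual config):

    const express = require('express');
    const path = require('path');
    const app = express();

    // fingerprinted bundle: safe to cache "forever"
    app.get('/js/app.abc123.js', function (req, res) {
      res.set('Cache-Control', 'public, max-age=31536000, immutable');
      res.sendFile(path.join(__dirname, 'js', 'app.abc123.js'));
    });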

Keep in mind Imgur is not a large company despite their high traffic, even at their peak of employees the engineering team was pretty small (probably about 12-15 people after series A), and the mobile web team in particular was a handful of people, with a handful of people on iOS and Android, and a handful of people on desktop/backend/API (where I worked).

That said, I think Alan does care about these things. I know at some point they did support NoScript and did care about the experience with JavaScript off (and had code to support uploading images and viewing images with no JavaScript at all). But it's hard to have it as your top priority when Reddit and Instagram are trying to eat your lunch.

I'm sympathetic with the page bloat problem and noscript and I do think more effort should be spent on optimizing this stuff, especially because bandwidth is much of their opex.

[1] Posting, voting, commenting, uploading, accounts, tagging, albums, search. There is even a hidden-ish feature to "auto-browse" in a slideshow-like manner which you can find if you crawl around the source code.


> A lot of the code that is loaded supports features that are used by a long tail of users

Bounce rate is 53% according to Alexa. So the majority of Imgur users don't appreciate it, do hit a cold load, etc. A user probably has to be dozens of interactions deep for the initial loading cost to not be so high, but more likely there is no way to ever offset the overhead of all that bloat for any user.

Personally, I use an extension to fix imgur brokenness and extract images from imgur pages without loading anything else.


Or the signal indicates most visitors are a result of an accidental link click.


care to share the extension? i found this, which seems similar: https://greasyfork.org/en/scripts/390194-imgur-redirect


Which extension? Sounds very useful.


> Keep in mind Imgur is not a large company despite their high traffic, even at their peak of employees the engineering team was pretty small (probably about 12-15 people after series A)

12-15 engineers is small? I'd call that a full-size team, for any single project.

"Officially" Google makes an entire web browser with less than double that: https://www.quora.com/How-large-is-the-Google-Chrome-team/an...

Conway's Law in action here. If it were one person, it'd be one IMG tag. When you put 12-15 engineers to work making a social website for serving one IMG, you get this.


Chrome.... definitely has more people working on it than that. It's absolutely ludicrous to say Google only pays 23 people to work on Chrome. Perhaps that Quora answer is being pedantic and only counting people who work on the closed-source, non-Chromium bits?

Regardless, 15 engineers to make a webapp and mobile apps with all of the features mentioned for a site that gets "lots" of views (not sure how many but I'd guess we are counting in hundreds of millions of clicks a day at this point) seems pretty efficient to me?


Chrome is at the top of a giant tower of abstraction. Sure, millions of people built the tower. Imgur is even higher on the tower of abstraction, though. It should be much simpler. That's the whole point of the tower.

15 engineers doesn't seem especially efficient to me for what it does (or ought to). Just because you get a lot of views doesn't mean the software itself has to be terribly sophisticated. It usually means you host a ton of user-generated content. Websites which are relatively straightforward hosting of user content, like Wikipedia [1] and Reddit, tend to have orders-of-magnitude fewer employees than other types of equally popular websites.

[1]: https://meta.wikimedia.org/wiki/Wikimedia_Foundation/Annual_...

If you assume WMF were about 20% engineers back in 2010-2011, as they are today according to their Staff page, that would mean they had about 16 engineers. Is Imgur today as complex as all Wikipedia properties in 2011? That seems rather inefficient to me.


Chrome has hundreds of people working on it, at least. Aside from that Quora link likely being wrong anyway, it's also 7 years out of date.


Even Safari/WebKit has like a hundred people working on it, and the Chrome team is much larger. Probably an order of magnitude more.


> so analyzing a cold load is a bit misleading to say the least. Moreover, as other commentators have mentioned, they optimized more for the subsequent images than the initial pageloads (they'd prefetch the next N images).

I've been opening imgur links on my phone and watched it do nothing for like ten seconds, and I just assumed it was intentionally slow/broken so I'd install the app or something. I'm flabbergasted that it's actually the outcome of a deliberate optimization.


I'm sympathetic to what you say, but optimizing for following pages might be a tad optimistic if I leave after the first failed load attempt.


It also doesn't help privacy-conscious users who clear their cache regularly. I use Firefox Focus on mobile and have Firefox in permanent private browsing mode on desktop, so I always get a cold load.


12 to 15 developers can be significant, if they are experienced and have good leadership. Lacking experience and good leadership, you get what we're seeing here. I'm older, been coding since the '70s, and smaller teams than this wrote many well-known operating systems and many major brand-name applications, and the majority of the classic video games were created by teams 1/2 to 1/4 that size. Looks to me like Imgur has inept engineering management.


Serving less code is more important than serving lots of code from a CDN. All that javascript needs to be parsed too after downloading, and that's probably taking the bulk of the time on mobile devices.


Earning money is important so serving ads is important. The masses don’t care about them including 20 frameworks so serving less code is not important.


I've built 4 adtech companies. Page load speed is directly related to ad revenue. Faster pages provide more ad impressions, not less.


Imgur removed most of those features from the mobile site, apparently.[1]

[1] https://twitter.com/martinbean/status/1185605846933352450


We are considering 12-15 people a "small team" now? I've worked on large enterprise applications and I've never worked on a team that big!


Reddit is trying to eat their lunch?

Imgur sprung up as a fast image host for Reddit, then Imgur turned into a social media site - a competitor for Reddit.

In that process Imgur became shitty at their core function... hosting images!


I find it ludicrous that amongst the hundreds of comments between here and on Twitter, people seem to completely ignore that this is a FREE WEBSITE. But one that is likely incredibly expensive to operate.

How to reconcile this and attempt to make up the shortfall? You're looking at it.

Imgur started as hardly more than an IMG element. It burned and burned and burned money, and their users were happy.

Then it added ads, but it didn't matter because people direct linked from reddit anyway, which was the vast majority of the traffic.

Then imgur realized they need to break their dependency on the reddit social mass, and built their own to apparently great success.

Now people go to imgur for images directly, and stay.

So. All that "bloat" came around for a reason and it was to try to make the company sustainable as a business.

Damned if you do, and if you don't.


There are more options than "just serve the images" and "bloated monstrosity that takes so long to load on mobile that I actively avoid imgur links".

For example, they could have, but didn't, make it a page with adverts and social features that loads in a second or so.


Everyone keeps saying that, but I don't buy it, and I think it's insulting to think we all know better than the imgur employees who have been dealing with this problem for years, and have likely considered all the possible options.

1) They needed a solution that would still be somewhat meaningful with adblock. That means maintaining user engagement to stay with the site and eventually encounter branded content that is not officially an ad but is still a revenue stream

2) Whatever sequence they chose for how they load the page, it is certain that they considered alternatives and a/b tested them, and what they chose was the best balance of user bounce rates and revenue.

It's not the most user-friendly version, but that can't be the only metric for a free site.


Are you accusing Imgur of doing this deliberately? That seems more insulting than assuming that it's the result of ignorance.


I'm surprised image hosting never went aggressively with probabilistic advertising.

80% of the time it loads your image; 20% of the time you get an ad for Pepsi (or an ad overlaid over the bottom third of the image).

You're offering only cheap "branding" CPM advertising, because you can't control how the image will be presented to provide any interactivity or clickability. Technologically, it requires no changes to the embed code, and could work with a "just load the image" view-- it's all server-side magic. But at imgur's scale, that's not small-potatoes money.
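
A sketch of that server-side magic (Express-style; the helpers are hypothetical and the 80/20 split is from above):

    const express = require('express');
    const app = express();

    app.get('/i/:id.jpg', function (req, res) {
      // 20% of requests get the ad-stamped variant, 80% the original;
      // adVariantPath/originalPath are hypothetical helpers returning file paths
      var file = Math.random() < 0.2 ? adVariantPath(req.params.id)
                                     : originalPath(req.params.id);
      res.sendFile(file);
    });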

Importantly, the inconvenience that sort of ad presents to users is modest and manageable. If you can't see the photo, reload it once. It's not going to be 30 seconds of loading dependencies and non-content until you give up BEFORE seeing the photo, and it won't be the photobucket-style "let's just disable hundreds of millions of external embeds AFTER 10 years of people building valuable resources around them".


And in doing so, they made their website awful for the one thing that most people were using it for.


That doesn't matter though if you're losing money. The early imgur was simple and popular, but, as the parent said, lost money.

Who would host a system that loses money? Similarly, I could have a website called yougetadollar.com where people put in their address and I mailed them a dollar. It might be a really simple site that fulfills its core purpose efficiently - but if I don't figure out a way to make money off it, I'm going to have to stop hosting it sometime.

If someone came along with a way to make money off yougetadollar.com and that method involved adding a lot of JavaScript, and registrations, and upvotes, and advertising - you'd be correct that it makes the website worse for what people want to use it for, but if the alternative is losing money and shutting down - I don't see the point of the observation.


Doesn't matter, people still use it. It's not completely awful for what people use it for as long as it stays up.


> So. All that "bloat" came around for a reason and it was to try to make the company sustainable as a business.

Until someone comes along with a simple basic service that just works, exactly like how imgur replaced imageshack.


Imgur now is still multiple orders of magnitude better than what Imageshack and Photobucket were when it disrupted them.


This is a big problem, but calling it over-engineering seems too generous. I'd characterise it as under-engineering: Developers unwilling or unable to give proper consideration to non-functional requirements such as performance and design an optimised solution, and instead just layering on more and more dependencies.

There's a happy medium between serving a single <img> tag and this monstrosity. It would be totally feasible to use PWA techniques and a lightweight JS framework to build an image host that was performant and still provided all the other features Imgur does.
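
For instance, one such technique - ship the <img> in the HTML and defer everything else until the browser is idle (the module name is hypothetical):

    // load the social features only once the browser has nothing better to do
    if ('requestIdleCallback' in window) {
      requestIdleCallback(function () {
        import('./social-features.js'); // comments, voting, etc.
      });
    } else {
      setTimeout(function () { import('./social-features.js'); }, 2000);
    }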

But let's face facts: It's not going to happen, no matter how many angry Twitter threads get posted to Hacker News. The web platform is completely unsuited to mobile, and there are simply too many perverse incentives in commercial web development to expect that most developers will expend the effort to build an optimised PWA when they can slap something together with React, knowing it'll work fine on the CEO's iPhone on office wifi.

The only real fix would be a complete reinvention of the web to fix the myriad design flaws. That is what Google's AMP was supposed to be, but people seem to hate it and see it is proprietary and evil. Maybe it is, but from a technical perspective, it's likely the only kind of solution that has a chance of working.


I think the whole comparison of imgur with a blank page with an image is a bit silly. It's not just the image, it's a whole application with comments, image galleries, video playing ability, and of course a ton of iframes with ads which load a lot of content (more images etc.)

Try to do the same feature set without a Javascript framework and using only plain Javascript or even plain HTML and CSS, and then come and tell us about the results.

Also, a lot of the bloat is probably not under the developers' control and is added by non-technical marketing departments, which want a tracking pixel for this, a tracking pixel for that, etc.

But yes, it's true it's insane and it's getting worse, and 4G and HTTP/2 will not be a solution for most of us anytime soon.


I don't think it's unfair, when 99% of users don't want any of those features. By that logic, any and every possible feature is fair game, and beyond criticism, simply by existing.


But I don't care about any of that. I don't even care if the rest of the app is what keeps the lights on. Everyone* who visits imgur does so following an image link, they want to view that single image, and will not click anywhere else on the page. Optimizing for another use case seems crazy.

*There is a set of users who browse imgur or otherwise use imgur as a kind of "web application"; I'm disregarding that part of their userbase for the sake of this discussion. It would surprise me a lot to learn they are more than 1% of users.


> But I don't care about any of that

OK, fair enough but then it's a different discussion and a different conclusion.

We can't really blame Javascript frameworks for a particular business that has decided to monetize its user base in a different way, by adding more features that are not widely used by their userbase.

Imgur can't exist in a vacuum; it's a business with a whole team behind it.

Now we might question that whole business model as well, and the whole internet-supported-by-ads thing, but that is a whole different kettle of fish that is, in my view, not attributable to Javascript frameworks.


> I think the whole comparison of imgur with a blank page with an image is a bit silly. It's not just the image, it's a whole application with comments, image galleries, video playing ability, and of course a ton of iframes with ads which load a lot of content (more images etc.)

This is precisely the reason it's a good comparison...


It is not difficult to implement all those features using plain Javascript; in fact, I would recommend it as a great learning experience. What is difficult is doing it in a way that is maintainable over a long period of time.

The nature of the language is that without iron code discipline Javascript code inevitably turns into a mud-ball.

Apart from the obvious “many-eyes” benefits, frameworks also provide structure that buys extra time before this happens.


> The nature of the language is that without iron code discipline Javascript code inevitably turns into a mud-ball.

Could you outline what you think makes JS more likely to turn into mud-balls compared to other languages?

In my experience, no matter what language you use, or with what paradigm, unless you model and design your software architecture with discipline, you're gonna end up with a mud-ball.

It has less to do with the language and more to do with the process you use to model your domain.


It probably does apply to all languages, but I prefer to speak from experience, which in my case is mostly JS. Architecture and the related abstraction is definitely the hardest thing about programming in my experience, and what really separates good programmers from the bad.

Loose types, pass by reference, near-limitless mutability, scoping rules (without “use strict” to save the day) and asynchronous nature of the things it is used for are some of the things that come to mind with JS.

All of these are actually very powerful features that make it the language I love, but care is definitely needed. ;)


Yeah, I really wouldn’t link it in any way to the language. It’s just mostly more obvious when it’s in JavaScript, because that’s associated with the user agent having to fetch the code and run it from scratch every time, and as a user you can more easily inspect where the time is going, compared with an app or backend processes, which are both much more opaque to the user.

It’s more about architecture and discipline, and far, far more about simply caring about performance. (And larger teams make it much harder to care about performance.)

For personal background in my firm agreement with capableweb, I work primarily in JavaScript professionally, tend to use Rust privately, and have used a variety of other languages and worked with both good and bad code bases (mostly good); and I care very much about performance.


Don’t get me wrong, I actually love Javascript. I have written code in quite a few languages but JS is the one I am most productive and happy in by far (within the area of network services and web apps that I work in). It has also been very profitable. :)

But the language has such a low threshold that it allows for incredible productivity while writing shitty code. But by the time you realise just what you have done (or someone else realises when they have to work on your codebase) it's too late and it's time for a rewrite.

What makes JS special is how accessible it is and just how much mess you can make before reality bites. That said, I will take a Javascript mud-ball over an enterprise Java one any time! ;)


> pass by reference

I'm not quite sure why so many people think JS is pass by reference.

Everything in JS is passed by value. In fact, unlike some other languages (Swift, C++, etc), the only thing you _can_ do is pass by value. That some of those values are actually references doesn't really have anything to do with evaluation strategy.


Simple types are by value, objects are only by reference - feel free to verify in your friendly local debug console.

More than that, assignment of an object is also only by reference, which is super useful and a great source of potential spaghetti code at the same time.

How it is implemented under the hood I could not care less because it is not exposed to me as a programmer (unlike in C).


Again, everything in JS is passed by value - it's not even debatable. Variables pointing to JS objects and arrays are passed by value the same as variables pointing to strings or numbers. The value of the variable is always copied. However, JS objects and arrays are reference types, so the value that is copied is simply a reference to the same data.

I can't really relate, but I do understand that the distinction between evaluation strategy and reference/value types might be a bit difficult for programmers who've never actually had to deal with either concept to grasp, though.


no - I believe the parent comment is right: it's passed by the value of a reference, as seen here from the console:

    var changeref = function (ref) { ref = {}; };
    var changeable = { prop: "something" };
    changeref(changeable);
    changeable;   // Object { prop: "something" }

    var changeprop = function (ref) { ref.prop = "anything"; };
    changeprop(changeable);
    changeable;   // Object { prop: "anything" }

the call to changeref does not turn changeable into an empty object, because you passed the value of the reference, but the call to changeprop did change the property of the object, because you are changing the prop on the object the reference points to.

at least this is my understanding of how it works. Interestingly enough I totally blew a simple programming test on the phone recently because I forgot about this (well my whole mind went blank and I couldn't do anything so even if I had remembered this it wouldn't have been enough - it was lucky I remembered my name)

on edit: console used, Firefox developer.


This is what I am talking about:

    > function mutate (o) { o.foo = 'bar'; }
    > const o = { foo: 'foo' };
    > o
    { foo: 'foo' }
    > mutate(o);
    > o
    { foo: 'bar' }
I am probably not using terminology correctly, but it is not important as far as Javascript is concerned, since the language does not allow for anything else. What is important is that you can mutate objects you pass into functions and this is useful and dangerous at the same time.


yes, essentially that is the same as my changeprop function so I do understand that - however what the parent comment was arguing about was the actual terminology and wondering why no one ever uses it correctly.

As I noted in another comment I don't think people use the terminology correctly because explaining to people the difference makes explaining an already complicated thing more complicated, and also it makes communication less elegant if more precise. I certainly never say Objects are passed by value of a reference, I just say they are passed by reference and if someone wants to take the time to correct me I would probably say "yes, sorry I was imprecise"


OK, after a bit more reading I now understand the difference, thank you for your patience.

But it remains true that in Javascript you cannot pass an object by value, because the language does not permit creating copies of objects. You can only ever pass a reference to an object (which is indeed only ever done by value), so the usefulness of arguing about the distinction is questionable at best.


Yes, precisely. In a language that supported call by reference, your `changeref` function would actually cause `changeable` to point to an empty object/map (assuming, of course, it had been passed by reference).


Well probably because so many tutorials and books say it is pass by reference in order to clarify a hard to understand point for novices - thereby eliding an even harder to understand point.


> Loose types, pass by reference, near-limitless mutability

This sounds like almost every programming language which is not a functional one actually (it even fits C!).


It is true that C is capable of those things by virtue of being capable of everything, including writing a functional Javascript engine. But you have a choice. You can have strict typing in C, you can pass structs by value, and most C programs work in a synchronous manner. (I actually mentored a good C programmer to teach him JS for working on our team, and asynchronous code was the thing he had most difficulty wrapping his brain around.)

With Javascript you cannot have strict types and you cannot pass objects by value and asynchronous programming is the norm.

C is also a lot less forgiving so it takes more skill to even be able to create a mud-ball of a reasonable size. ;)

In my limited experience, functional programming languages just move a significant amount of difficulty up-front into the code. Perhaps it makes for less difficulty in architecture - I never stuck around for long enough to find out. The difficulty threshold is enough to keep out lesser programmers, which certainly helps code quality. The price is that far fewer people use the language.


> Could you outline what you think makes JS more likely to turn into mud-balls compared to other languages?

Plain JS is hell to refactor. Between arbitrary autocasting, little tooling, and no useful type system, if you don't have a very detailed test suite, you won't find the parts of your code that are broken now. Also, if you do, you now have to reimplement all those tests that did nothing other than check your call signatures.

Ease of refactoring is a major indicator for a good enterprise language, as constant refactoring is basically the main development process. If your devs are afraid to touch certain pieces of code because nobody knows what the ramifications are, your software is already dying.

That's why TypeScript is taking off so nicely. People crave those static guarantees that JS doesn't have.


I've seen exactly the same problems in large TypeScript codebases as in JavaScript codebases. Refactoring becomes a hassle not because of the presence or absence of types, but because there are endless useless unit tests testing implementation details, because the architecture is highly coupled in the wrong places, and because little care was taken as the architecture was dynamically grown (because "agile" or whatever) without consideration.

Having good, smarter tests helps more for ease of refactoring than having everything mapped to types. But I might just have seen bad examples of TypeScript codebases.


Probably this is a side-effect of translating JS unit-testing techniques without adequate thought and reflection. I am sometimes amazed at the extent of trivial tests that do no more than double-check the compiler, which I see from web developers who move up the stack.


When I talk about shitty unit tests, I mean regardless of language (bad unit tests seem as prevalent in JavaScript as in TypeScript, Ruby, Python, Rust, Go, and the list goes on).


When you have reached the stage of mud-ball it is already too late. ;)

With the appropriate amount of skill and discipline, however, this stage can be postponed for longer than the life of the application. Typescript is absolutely not required.


But that's a truism. Good coders write good code. Eureka!

The real gain is to be had in pushing the mudball stage as far into the product's life cycle in absolute terms as possible.

In that sense, nothing is required, but some things help. I argue that TypeScript helps.


If it helps you, great. I’m happy enough with plain JS, thanks. ;)


If it takes ages to load the core content, who’s still left on the site for that other stuff? In his comparison it took 40 seconds to load the image, I’d have closed the tab after 10 and none of that other stuff would matter.


Note that the vanilla 37k image took 2.4 seconds. Most people aren't on a 128 kilobit ISDN line. And there's a good chance you have much of the static content cached.

Anyone can easily build an image-serving website. What's hard is to build an image-serving website that pays for itself. I wouldn't be so quick to judge.


Sure. But he did say this was the mobile site, where you may have connectivity issues or low bandwidth. At home, I would never notice. Hell, I usually have good connectivity on my phone too, BUT often when I’m on the train, I have sketchy speeds and during these times, I do notice slow/overly bloated sites or web apps.

> Anyone can easily build an image-serving website. What's hard is to build an image-serving website that pays for itself. I wouldn't be so quick to judge.

That's very true, but again, you don't need a bloated, slow (comparatively speaking) app to serve advertisements. If the total time had been 10s (ie 7.5 for adverts etc., 2.5 for content), he probably would have thought "eh, slow, but whatever".


I actively avoid clicking imgur wherever I can because I know it fails to load half the time.

I hope the people who still stick around on imgur for the community are enjoying the web application because it's clear people who just want to see an image after following a link aren't considered as their audience anymore.


It's totally doable with vanilla JS. You'll just likely end up with a mess or your own framework and all the associated costs.


A good web developer:

1. Finds a way to make the code serve the main purpose of the page/experience _first_, and brings in the extras after the main thing is settled, disturbing it as little as possible.

2. Uses the same mindset to protect the page/experience from marketing/analytics/whoever. You want more pixels? Sure. I'm not telling you that they will only fire at +3s ContentLoaded, but you're probably not asking, either.


That's out of scope; you can still target and only load the image. It's all about technical design.


For me, this is just the inevitable consequence of the mentality of:

- The network is reliable

- The bandwidth is cheap

- The hardware is also cheap

All three statements are wrong, because everything is fast when the n is small. We're past that point. Network is not reliable, bandwidth and hardware are not cheap. At least in terms of time, and my time is neither cheap nor free.

Developers don't or can't optimize because it's either too much work, or it's already working reasonably fast. But reasonably fast is not fast enough. We're wasting too many resources in cases like this.

<img> old man yells at cloud </img>

</rant>.


The author of the tweet missed the obvious reason for that performance, even though he mentioned that this is the mobile behavior:

They want you to use the app!

The button to open/install the app is clearly visible at an early stage. And since imgur users are the product, and not the customer, you want them in the app where you can retain them more easily (think notifications) and ad-target them more precisely based on all the data you can collect directly on the devices.


That's not obvious at all, and many people would rather leave the site than install an app.


By that logic Reddit does the same: horrible website that pushes users into installing the app.

Hasn't worked with a single person I know.

I tend to correlate bad website with possibly bad app and I suspect others do too.


I think reddit is very obvious about it. They only care enough about their mobile website to add huge, impossible to miss „download our app“ callouts and ignore the rest.


This has little to nothing to do with overengineering. It's because Imgur.com's revenue depends 100% on you clicking the small thumbnails on the sidebar or one of the ads placed there (and on making sure that if you do click a thumbnail, the next image and its comments load really fast to keep you hooked on those dopamine shots), because _that_ is where the money is. They want you to waste your time there, because the longer you stay the more likely you are to click their ads (and download their app and sign up and everything else that may help that goal).


None of that requires 5 MB of cruft or an entire React app to load. It would take a few lines of JS for some dynamic comments. This is the very definition of overengineered.

Compare that to Stack Overflow, which generates completely dynamic pages with more interactivity in 50 ms. Also, making the site faster would generate more revenue, as more users would actually finish loading the page.


"Some dinamic comments" is easy to see you are not understanding all it does here, the comments are more than 100 per image, replies are hidden until clicked, the points of each comment update in (almost) real time, the comments are posted using Ajax to never lost the scroll position, same than all the other interaction such as upvoting, downvoting and report, like everything is over Ajax it needs to keep track of url history itself, all this needs to work on IE10 and other shady browsers (due being one the most popular sites). They don't have much control over some of the assets because they are from the advertiser and have little say what's going on there if they want to win money. Is one of the top 20 sites visited in the United States and I seriously don't think being slightly faster would help much or maybe at all.

And most important of all: the assets are properly cached, so despite bothering thousands of engineers, for most people the load nuisances only happen once, or at most a few times.


We're talking about just the frontend here. I can definitely redo it with substantially less code and no frameworks, including automatic transpiling for different browsers. No need for React + jQuery with megabytes of JS, but if you must have a framework then use Preact or Svelte.

Ads are different but I work in adtech and site speed matters. Faster sites make more money. Remember this is on a mobile page. People don't wait more than 3 seconds for a site to load. You're not getting any ad impressions from visitors who never show up or leave.


Doesn't matter if you can redo it with a couple of jQuery lines; their site has more traffic than any other image host in the world, and they need code that can still be maintained if all the engineers leave tomorrow - and React plus jQuery is an extremely smart decision for that goal.


Maintainability is a product of good engineering practices and documentation. No framework magically solves for that.


> good engineering practices

One of the "good engineering practices" is to use a baseline code widely know and that's what full frameworks like React and Angular buys you, of course you still can create a mess with any of those 2 but is way harder to do that than to do it with in-house custom framework, nothing makes an JS developer run faster from a job position than being told they are using some custom framework that has been growing organically for years, regardless of how good your documentation is.


> "of course you still can create a mess "

That's exactly what happened here, and what this entire post is discussing. The frameworks aren't the problem, aren't really necessary, and there are faster alternatives if they must be used (like Preact and Svelte).


No, that is not what happened here. What happened here is that engineers' voices get amplified by their anger, because engineers value megabytes - they are more aware of their existence than the average joe. For average people none of this matters, as the assets are cached; it doesn't even matter for the developers of the site, because it is doing pretty well as one of the most visited sites in the world.


Again, this page for a mobile site took more than 40 seconds to load a single image. The average user is not waiting, they've already left. You seem to be missing that detail. Performance matters and has been proven by metrics and research from every major internet company.

Site popularity has nothing to do with UX. Imgur is popular because of reddit, and there's plenty of users who dislike Reddit's slow and heavy redesign too.


  > and make sure that if you do click a thumbnail the next
  > image and its comments do load really fast to get you
  > hooked to those dopamine shots
Isn't this what <link rel="prefetch"> and <link rel="prerender"> are for? Or am I missing something?
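
i.e. something like (made-up URLs):

    <link rel="prefetch" href="https://i.example.com/next-image.jpg">
    <link rel="prerender" href="https://example.com/gallery/next">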


> What’s even more even more disgusting is that THIS IS THEIR M-DOT SITE. This was a deliberate strategy for serving mobile devices.

More likely incompetence than malice: designing for mobile in terms of screen size and UI but neglecting to consider network bandwidth and memory+CPU resources. That, and no doubt a fair chunk of that payload relates to advertising in some way.

If it _is_ by design then I suspect it is intended as a way to funnel people into installing their app, much like reddit's user-hostile mobile design (I should thank them, they've saved some time in my day by stopping me from bothering with that site on mobile).


i.reddit.com is the only way I browse on mobile.


Imgur is struggling and gasping for a business model. They do serve a useful purpose, but there's no possible way for them to make money without wrapping images and trying to engage users to stay longer (and eventually get served with an ad).


I think we're at the point where these abominations should be served as application/html rather than text/html. At the end of the day, users come for content, so let them decide to block crap that serves no purpose and wants to reimplement a browser in the browser. I've found that the value of content correlates inversely with the amount of JavaScript on a page.


Is there a static Twitter thread cache somewhere? Loading this fails with "something went wrong" and then "you are rate limited".

If Twitter's dark pattern to force login/app usage were just a little closer to the headline, I would have thought the failure was the punchline of this posting.


This happens _every single time_ I open a link to twitter if it's been >~1 hour since I last viewed a twitter page. Different devices, different networks, etc. It's been this way for at least a year. All I can think of is I don't have some tracking cookie they're using to guess that I'm not a bot. Reloading the page (w/ F5) fixes it.

Presumably logging in or using their app would fix it, but I've got far too much spite for that. Instead I just avoid the site where possible.


This happens to me on iPhone (Safari) as well, and looking for it, it turns out it's been a well-known problem for months (if not years). Astonishing.


You need to reload the page. The refresh button is useless.


Google search results are equally abysmal. Most of the download has little to do with the 10 URLs that we want.


I remember when they introduced ads and it was jarring. After switching to DDG about a year ago, it's very frustrating to try to interpret Google results if I'm ever presented with them.

I have to do this mental dance:

* are these ad links?

* are these knowledge graph links?

* how many results did I get?

It is a bit tiring.


a few years ago i resisted switching to DDG because google was still noticeably better in many ways in terms of search results; the balance has been shifting significantly since, and i use DDG for over half my searches at this point. the main downsides are less 'magic' when using very vague descriptions as queries, and more spam domains. but the image search is easily much better at this point.


> This industry needs some sort of regulatory body.

Yikes.

If you can do it better and make money, then build it and compete with them. What kind of bureaucratic dystopia would even "fix" this?


Yup, the market is a regulatory body.


As someone has pointed out in the thread itself: this is not in any way React's fault. If you want, you can add (p)react for a few KB to your page, serve at most 20-25 KB of JS code with it, and still not match the served image's size.
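
Roughly what that minimal (p)react page could look like (illustrative URL):

    import { h, render } from 'preact';

    // the entire "app": one image, a few KB of runtime
    render(
      h('img', { src: 'https://i.example.com/photo.jpg', alt: 'the image' }),
      document.body
    );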

However, when websites decide to instead track the everliving hell out of their users, this is what happens. Absolutely pathetic. Another similar experience I have with mobile sites is Reddit: it's intentionally throttling the site and tries to turn you towards their app.


> Another similar experience I have with mobile sites is Reddit: it's intentionally throttling the site and tries to turn you towards their app.

That's why I use the .compact version of reddit on mobile. It's as ugly and clunky as it's snappy.


It's funny because their website is pretty basic in features. Also, try dragging multiple pictures to upload: there is no progress bar, no feedback, nothing. It seems to upload one or two of them and then stop. Anyway, good thing you can just copy the image URL and send that one instead.


The entire modern web feels like someone read that joke interview with Stroustrup, didn't realise it's a joke, and concluded you should actually keep piling on complexity and building an infinitely tall job-security tower. Or put another way - it's developers working in a way that optimises for profit and job security (which we all want, duh), but the incentives higher up are aligned such that good simple engineering is never the optimal strategy for maximum profit and job security. There are probably several books on project management and sociology to be written from examining how the mis-engineering of the modern web came to be.


This is a naive assessment. Imgur is running a business and with that comes all the surrounding cruft Harry is complaining about. I hate including marketing and tag management tools in my project at work but that is what the business wants.


OP here. My beef wasn’t and isn’t with the ads or monetisation: it’s with the engineering decision to bury the product’s primary content in a 1.2MB JS application. The image itself just needs to be an `IMG` element in a server rendered page—ads, tracking, comments, upvotes, etc. can and should be all loaded as secondary content.

My beef is with a ‘mobile-optimised’ image hosting platform taking over 40s to display an image on mobile.


Imgur used to be just an img tag. They found out they couldn't make money that way. Eventually the images became the 110th most important thing on the page.

That wasn't an engineering decision - it was a business decision made to force people to switch to the app. Ever try browsing imgur on mobile web? You can't - it keeps popping up the 'download the app' button. Infinite scrolling breaks because of that popup (by design). If you don't want to support a business that tries to force people to use their app by making the mobile web version horrible you don't have to.


>If you don't want to support a business that tries to force people to use their app by making the mobile web version horrible you don't have to.

There's no hypocrisy in using something while complaining that it's absolutely terrible. (Particularly if there isn't a real alternative; "don't ever open any link to any image hosted on imgur even though you'd prefer the person had hosted it somewhere else" is not reasonable.)


You don't care to optimize those? Don't you guys realize that snappiness is one of the core reasons people keep using a webapp?


I've been saying this for years. Capital is distorting science and technology. The React ecosystem succeeded mostly because it was backed by and associated with Facebook. Most popular tools these days are popular because of some social or business connection to a large pile of capital.

There are a lot of amazing tools which are not popular because they are not connected to capital. But those tools are great precisely because they are not connected to capital.

Too much capital invariably leads to overengineering.

For example, consider the PHP HipHop compiler by Facebook. It took years to create. It's an over-engineered solution that did not need to exist. They could easily have rewritten their software in a different language instead if they really cared about saving a bit of CPU. The speedup which HipHop provided was a constant factor. In fact, they could probably have achieved a higher speedup with a simple rewrite even in PHP, without HipHop (with just simple algorithm optimizations).

Corporations are extremely inefficient. Their main purpose is to create useless jobs.


This cycle seems consistent with past popular image hosting services. First, attract users with a simple, no-nonsense service. Second, shove ads in there. Third, try to build some sort of community. More ads. Lastly, fade into obscurity.


How much of that bigness is devoted to exfiltrating data?

If it's non-zero then you must compare to an IMG element design that can also efficiently exfiltrate the same amount of user data. Otherwise you're comparing apples to oranges.


Oh, and your design should also be equally extensible. If I want to add ambient light readings to the data set it should be essentially free to ship the change to the img element design across all browsers.


If you right click on an image you get an option named "open image in new tab". You can just share the direct link to the image.


That does not work most of the time. Browsers send a different HTTP Accept header when an image is opened directly vs. hotlinked in an img tag. imgur looks at the header and redirects direct links to the image page when you click them.
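
A sketch of that behaviour (Express-style; imagePath is a hypothetical helper - not imgur's real code):

    app.get('/:id.jpg', function (req, res) {
      // a top-level navigation sends Accept: text/html,...;
      // an <img> tag sends Accept: image/webp,image/*,...
      if ((req.headers.accept || '').indexOf('text/html') !== -1) {
        res.redirect('https://imgur.com/' + req.params.id); // bounce to the page
      } else {
        res.sendFile(imagePath(req.params.id)); // serve the raw image
      }
    });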



that is possible too, even though they don't encourage it.

what I meant was when you click on the last link, sometimes it redirects to https://imgur.com/vjdfNOe making the direct link pointless.


imgur becoming somehow more bloated and less usable in every way over time is perhaps rivaled only by skype's trajectory


If the dev doesn't use a hundred different javascript frameworks what on earth are they going to put on their CV!


Maybe there is a potential revenue model lurking here - pay extra to get the lighter version of a page? (half joking)


I think you're right in the first part. Image hosts have been relying on ad revenue to survive, but it destroys the user experience of an image host, so then the revenue quickly disappears and the host goes down. This is the life cycle of a traditional image host.

I could see two different models solving the problem. One is a donation model where the image host is collectively paid for by contributors who'd like the host to remain free and independent.

The second model is to have funding per image. When an image is first uploaded, it gets one week of free hosting, and if people want the image to be available for longer, they can donate for that specific image to stay online for X more days, depending on funds donated.

Basically crowdfunded media hosting.


You could offer hotlinking with a twist - instead of the original file you get a modified one, with an ad encoded into the upper 20% of the image.
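
A rough sketch of the server-side piece using the sharp library (assuming the ad banner is pre-sized to the image's width):

    const sharp = require('sharp');

    // composite a banner over the top edge of the original image
    async function withAd(imagePath, bannerPath) {
      return sharp(imagePath)
        .composite([{ input: bannerPath, gravity: 'north' }])
        .toBuffer();
    }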


Bundle it with a URL shortener that sticks the usual Analytics tracking crap (utm_source, etc) and a QR-Code and you're golden!


And then when Leftpad-Azer removes his packages the next time the image won't load at all...


alright, i‘ll make an exception and ask for constructive „criticism“:

what better alternatives are out there?


Anywhere where you can directly link to an image without the host doing any funky redirects.

Usually, an image host starts out being like ^, then slowly transitions to more and more user-hostile patterns.

The only way I found to avoid this is to set up my own Hetzner box (unlimited traffic) running nginx to serve images. I use `scp` to upload them.


> Usually, an image host starts out being like ^, then slowly transitions to more and more user-hostile patterns.

Hilariously, Imgur itself followed this trajectory: it started out as a super-simple image hosting site whose creator was fed up with all the nonsense that other image-hosting sites did[1]. Now, though, Imgur has become the website everyone complains about!

[1]: https://web.archive.org/web/20090227183112/http://www.reddit...


obviously. got my own server as well. but i‘d like to know some side-project kind of thing. those are hard to find for me - in the age of seo-ridden search results trying to sell you everything and their mother.


Somebody should set up a service that links with a GitHub account, makes you a repo, and publishes the images you upload as a GitHub Pages site. This could even be made easy enough to work for non-devs. That would mean absolutely no lock-in, no pages, redirects, ads or other marketing campaigns. You would literally download the image, nothing else.

Nowadays, using Keybase's cloud might work, as it lets you set up direct links with no intervening pages.


>... i‘ll make an exception and ask for constructive criticism

Tangential, do you not usually do that? why?


[flagged]


Would you please stop posting in the flamewar style to HN? We ban accounts that break the site guidelines like this, and you've done it elsewhere recently too.

https://news.ycombinator.com/newsguidelines.html


I think you are saying 2 different things in your definition of constructive feedback:

1.) The feedback should be nice/not brash/not toxic.

2.) The work needs to be done by the person to give more detailed feedback.

While 2.) is not a reasonable expectation, it is pretty simple to achieve 1.) by keeping emotions out of your feedback. For example, I would just go with "The webservice is slow, could you take a look? It is affecting xyz".


Server side rendering


well, my wording. i’m asking for alternative services.



