
Quite sincerely, it's a total failure. I got the chance to try the new interface, and it's so slow that it's barely usable. It's even slower than the old website, which was already painfully slow.

Loading a random profile takes 8 seconds. Opening a messenger discussion takes 6 seconds. It reminds me of the new Reddit website. Facebook was more enjoyable to use 12 years ago.

It's really sad that in 2020, 10k+ engineers can't make a photo, video, post and message sharing website that is not a pain to use. We collectively failed as a profession. If one needs 2MB of CSS for such a website, there is clearly a problem.




I agree with you; however, the key disconnect is that Facebook is not a photo, video, post and message sharing website. It's a marketing platform intended to extract the most value out of you, the viewer, and transfer that value to Facebook and its advertisers.

If you think of it this way, you can see how you may need 2MB of CSS: to battle the bots trying to scrape your information and replicate your network, to sidestep the evil developers of adblocker software that threaten to destroy the sweet value transfer, the JS required to track every single movement you make both online and off, the A/B testing framework that allows you to determine how to most efficiently extract that extra 0.001% of valuable eyeball time, and so forth...

Connecting the world? Well, I guess that could be a nice side-effect...


Imagine this:

A "free" ad-driven social networking site that brings in gigantic revenue, but that has to pay thousands of high-priced engineers to implement all of the cruft you just described.

versus ...

A subscription-based, non-ad-driven social networking site (perhaps operating as a member-owned cooperative?) that brings in much more modest revenue but that also can operate with many fewer engineers because it can be largely cruft-free.

I know there have been a gazillion attempts at the latter and none has succeeded in any way comparable to the "free" sites. It's too bad, because if any of them were to ever achieve Facebook scale, the subscription price would probably be quite modest.


I think the biggest assumption you're making here is that most people care about ads - and they don't. Fundamentally, when I'm scrolling through Facebook or Instagram or Reddit or whatever, I just don't care that I see ads while I'm scrolling. I'm not going to pay $1/2/3/4/5 a month to avoid something I don't care about, and that's really the only value-add of a subscription-based service. I'd also say most folks don't find Facebook's data policies as egregious as the tech community does.


I'd really like to understand how one can not care about ads. To me they're like potholes in the road. It would require terrific willpower to ignore them.


I'm not interested in the ads, so I don't notice them if they're not shoved in front of what I am interested in.

I can see that if you consider ads to be inherently offensive, you would notice them wherever they appear, and be annoyed.

But that's not where most people are.


People do. My dad doesn't install an adblocker because he simply doesn't care. I wouldn't either if it wasn't for the serious performance impact.


Although it's getting harder and harder since many platforms try to blend their ads in with the actual site content, I can avoid them fairly reliably by checking metadata such as the poster, some indication that it's an ad, or whether the content is clearly different from what I was expecting.

I wouldn't say it's as bad as potholes on the road, since ads are more predictable.


I don't think it's just about the ads themselves. It's about the fact that when a site is free, its users become the product, and there are all sorts of user-hostile design decisions made, which leads to all sorts of unnecessary bloat.

Think of it as the difference between a for-profit bank and a credit union. The bank exists to maximize returns for its shareholders. The credit union exists solely for the benefit of its members. So the credit union isn't going to try to employ sneaky fine-print fees, because that's not what the members want.

In a similar vein, you might not care about ads and may have trained yourself to ignore them, but you might care about performance, which is sluggish because of all the cruft, or you might care about privacy, or you might care about being marketed to in more subtle ways than display ads.


Being unwilling to pay for the service suggests to me that the entire experience has no meaningful value.


Economists have done studies where they tried to determine how much someone would need to be paid to stop using Facebook for a year. Maybe people are not willing to pay for the service, but they would rather have the service than $X.


Food for thought- the value of each user is radically different when accounting for geographic (i.e. income) markets.

For subscription to work, you either:

1- undercharge users from wealthier countries

2- price poorer users out of your platform

3- give up on the idea of worldwide adoption (Facebook scale, as you say) entirely

4- attempt to charge different amounts by country of origin, and watch your users cheat the system mercilessly

5- go freemium, and suffer the same fate that news organizations do- find that far too few are willing to pay to go ad-free, stick ads back into the free version, and end up leaking data anyway

I suppose 4 might be the most feasible option, but once it is obvious that some people pay more for the exact same value, they are likely to assume that the product has less value than it actually does, feeling that they are being ripped off.

In short, there is probably a good reason that paid services will never reach Facebook scale.


6. Option 4, but just relax about the outcome.

At ardour.org, we offer 3 tiers of subscriptions ($1, $4 and $10 per month) named to target different economic conditions. We also offer a single fixed payment with a suggested but editable cost based on OECD data about the cost of dining out.

We're not trying to maximise revenue, which is perhaps an important difference between us and, oh, Facebook :)


What you argue here could probably be said about half the companies admitted to any YC batch, and still there are successes. Maybe the glass can be half full too?


I think the catch is focusing on "Facebook scale" in terms of users on a social network. I am not aware of any YC companies at close to their scale (or potential to get there) without being ad driven, freemium or running on borrowed money.

Small social networks are fine for what they are and, I think, have much more flexible options for getting the bills paid.


> A subscription-based, non-ad-driven social networking site (perhaps operating as a member-owned cooperative?) that brings in much more modest revenue but that also can operate with many fewer engineers because it can be largely cruft-free.

I, too, loved App.net. Alas, it seems that people won't even pay a couple bucks a month to see what their friends are eating for lunch.


People instinctively know the value of something when asked to part with their money. :D

Yet they still spend their time on it. Huh.


People know their friends are not on App.net, but are on Facebook.

Or in other words:

https://en.wikipedia.org/wiki/Metcalfe%27s_law


People just don’t want to pay for social networking platforms. We’ve been conditioned to think that these things should be free.


Perhaps it's just a figure of speech, but I feel you should be challenged on the use of 'been conditioned to'. I see conditioning misused as an explanation all the time in tech discussions. This isn't an example of conditioning, but of anchoring, which if you want to view it through the lens of learning theory is an example of modelling rather than conditioning. But really it's better understood through the lens of behavioural economics. A value has been established and normalised, and other values are judged against it.

The reason I bring it up is that in changing user norms, it's much less helpful to think about them in stimulus-response terms than in terms of modelling other users' behaviour - which itself is guided by limitations on our capacity to process information, and by heuristics which, though adaptive, are poorly adjusted to the modern media landscape. So people didn't flock to Facebook initially because they were conditioned to prefer it - they used it because, in addition to offering a peek into the lives of others, it was cool. Conditioning is an important part of why people become addicted to the reward loops of social networking sites, but it's an insufficient explanation for their appeal and is often misapplied.


I think social media is about as social as sitting in a bar where everyone tries to get you to listen to their stories and look at their photos. To respond to someone in the bar you can send a text, which they might respond to sometime in the future. Therefore, I am not planning to pay for it, since it is not adding (much) value to my life.


Well, they are free, so that probably contributes to that perception.


> I know there have been a gazillion attempts at the latter and none has succeeded in any way comparable to the "free" sites. It's too bad, because if any of them were to ever achieve Facebook scale, the subscription price would probably be quite modest.

MeWe is freemium (paid extra stickers and storage) and is actually nice, at least the parts I've seen, a lot like Google+ - a friendly neighborhood full of photographers, chili enthusiasts etc.

Of course most people are going to go with twitter, but if you'd like something more like Google+ or what Facebook could have been you might want to try MeWe.


Facebook's problem is that a subscription fee would drastically limit adoption and prevent them from monopolizing the social graph. They see more potential to make money without charging users directly.


> the subscription price would probably be quite modest

Yes. It would be set to the minimum required to dissuade banned users (e.g. spammers) from continuing to create new accounts. This amount would still likely be well above cost.


These views are aligned. A site that's fast and a pleasure to use advances both agendas. It's still a failure.


Only if there's competition. The network effect ensures that there is no competition. Leaving us where we currently are.


Not even, users can just 'not play' (or play less) if the game's no fun.


There are many users for whom it is not fun, but an addiction. I have spoken to lots of people who, about once a month, say something like "Yeah, Facebook is really bad for me, I just waste time and get upset", but they can't stop checking it every hour and responding to posts that touch them emotionally, in either a good or bad way.

That's by design, of course - it benefits Facebook greatly that its herd is addicted, and unlike people addicted to alcohol, nicotine or other substances - there's not even another supplier they can turn to: It's either feed your addiction or suffer withdrawal symptoms.

And I think the success rate of quitters (as a percentage of those who actually want to quit) is comparable too, at single-digit percentages.


That's probably something that can be measured: if the profile/wall fills in over several seconds, when do the ads appear? First, before everything else? That would be cynical, but I agree that they might monetize the delays by ensuring ads appear before anything else.


The ads are only on the side on profile pages.

The actual ads on FB desktop are the newsfeed ads, which you see as you scroll.


Unfortunately, you're right.


I guess they probably could build a nice and fast website if it was up to them. But there's probably a lot more requirements than just that.

Things like https://twitter.com/wolfiechristl/status/1071473931784212480... are probably not decided on and implemented by the engineering team but are coming down as a requirement from the top. This is probably the case for a lot of other decisions that slow down the page ("We need this A/B test framework", "this needs to be hidden to increase retention",...)


They already did: https://mbasic.facebook.com.


For me on FF this is only a tiny bit faster to load the main page, but way less readable and pictures are so small you have to open them to be able to see what's in there. So not a net positive result imo.


90's versions, like mbasic, of almost every website are better. More content, less noise. Mbasic could still use a flaming <hr> tag though


> 90's versions, like mbasic, of almost every website are better.

I mostly disagree and assume this is slightly hyperbolic, but I take your point. The web used to be documents, even for things that needed to be apps (like email). Now, the web is apps, even for things that should be documents.

Beyond that, the web was not the capitalist/SEO/marketer battleground that it is today, where many sites have so many costs behind them (devs, videos, etc.) that they need a boatload of ads just to try to stay in the black.

In the 90s, you could have a very popular message board with millions of pageviews a month for $30/mo.


> In the 90s, you could have a very popular message board with millions of pageviews a month for $30/mo.

You can still do that. A bare-metal dedicated server with a 4-core CPU, 32 GB of RAM and SSDs can be rented for that price with unmetered bandwidth, and it'll be more than enough to sustain that level of traffic.


>Things like https://twitter.com/wolfiechristl/status/1071473931784212480.... are probably not decided on and implemented by the engineering team but are coming down as a requirement from the top.

I've always suspected that the Div-itis plaguing fb's website is a result of React's dependence on the 𝚘̶𝚟̶𝚎̶𝚛̶𝚞̶𝚜̶𝚎̶ misuse of higher order components.


It’s not.

HoCs don’t add nesting. If they do, you’re doing it wrong.

React <15 (2 years old) did nudge you in the direction of div-itis, because all components that rendered something had to render a single root element. The more recent versions did away with that constraint. HoCs don’t even need to return markup; they’re functions which return functions. The general expectation should be that an HoC behaves like a factory for the components it wraps.
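A plain-JavaScript sketch of that "factory" idea, modeling components as functions from props to rendered output (no React involved; the names are illustrative):

```javascript
// Model a component as a plain function from props to rendered output.
const Avatar = (props) => `<img src="${props.src}">`;

// A higher-order component is just a factory: it takes a component and
// returns a new component. No extra wrapper markup is introduced.
const withDefaultSrc = (Component) => (props) =>
  Component({ src: "/default.png", ...props });

const SafeAvatar = withDefaultSrc(Avatar);

console.log(SafeAvatar({}));                  // <img src="/default.png">
console.log(SafeAvatar({ src: "/me.png" }));  // <img src="/me.png">
```

The wrapper only transforms props before delegating, so the rendered output is exactly what the inner component produces.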


So every React app can be effortlessly refactored to use whatever shiny new latest and greatest architecture abstraction React comes up with?


No, and FB themselves discourage this in their documentation.

Regardless of upgrade paths, React <15 would still let you use HoCs w/out adding excess element nesting.

I don’t contest that the older React tended towards div-itis if you weren’t careful with how you used it. But this thread was about Higher Order Components (or wrapper functions), which don’t have any inherent effect on nesting.


I'm not sure why FB's site has unnecessary DIVs... But to defend React: HoC and render-prop techniques don't have to output DOM. In other words, not every React component maps to a DOM element.


The extra divs are an obfuscation technique, intended to frustrate scrapers and browser extensions.


React components aren't a 1:1 mapping to the DOM, so you could in theory have 50 HoCs wrapping a single component and it still only output one div or whatever.

Also, HoCs have somewhat fallen out of favour over time, with hooks and the child as a function/render prop style becoming more popular. I think the only HoC I consistently use these days is `connect` from `react-redux`.
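The "child as a function" / render-prop style can also be sketched with plain functions, no React required (names here are illustrative): a component owns some state and hands it to a function the caller supplies, and the caller decides what to render with it.

```javascript
// A render-prop-style component: it owns the state (a counter here)
// and calls the supplied function with that state.
const WithCounter = (render) => {
  let count = 0;
  const increment = () => { count += 1; };
  increment();
  increment();
  return render({ count });
};

// The caller decides what the data turns into.
console.log(WithCounter(({ count }) => `Clicked ${count} times`));
// Clicked 2 times
```

Like an HoC, this adds no markup of its own; it only threads data into whatever the caller renders.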


You can use hooks with redux now, so what’s the purpose of using connect instead of useDispatch or useSelector?


Two big reasons: Values passed via connect can be validated by proptypes, and by using connect you can mount your component without redux. That may make sense for building shared components that may be used with redux in one part of your app, but without redux in another part. There are a lot of smaller reasons to prefer connect, but those are the big ones to me. FWIW I use both approaches depending on needs.


Since the <Fragment> component was introduced (React 16?), excessive wrapper divs are fortunately no longer necessary!

Edit: not sure they ever were actually, I think you could just return props.children in most cases?


All these things can be true and yet people will still frequently misuse them.


Heh just yesterday I was trying to click on a zoom meeting link for a Facebook event, and it wasn't working for whatever reason, so I tried to pull the url out of the page with the inspector, and I saw the same thing -- hopelessly deep nesting of the element, with the URL hard to find and its text version obfuscated ... and that's not even an ad!


> Things like ... are probably not decided on and implemented by the engineering team but are coming down as a requirement from the top

Yeah, welcome to the real world. We _all_ have to handle requirements like that, except maybe when we build our portfolio site


The problem there is that '10k+' "engineers" are trying to make the same 'photo, video, post and message-sharing website'.

It's a structural problem and little more: the website (and app) is their main money-maker, so they're going to give it a disproportionate amount of resources.

Imagine you hire ten thousand people to lay one railroad track. [note; see end of post] If any single one of them doesn't contribute directly in some way, you'll fire them. This seems kind of strange, doesn't it? Sure, it probably requires more than a single person to lay a track. But ten thousand people to lay one? How is that supposed to work, mechanically? This would be enough to warrant shareholder revolt.

Now, the railroad track gets broken a few hundred times, maybe they hammer it enough to make it twice as long, whatever. It now no longer resembles a railroad track. Certainly no train could go across it. Send a few hundred people to go ask the managers of this project for a replacement track. Okay, we're now at...maybe a tenth of people having contributed? Repeat this process until everyone's contributed. Maybe the manager gives different groups different materials for the track to fuck with them, whatever. But somehow, every single person manages to not get fired.

What's the outcome look like? You have a single railroad track, probably not even well-fit for the job (sparks fly whenever trains run on it; maybe it causes them to tilt, so on), but it's laid! And ten thousand people are employed!

It's the same thing with a website. You can't put a terabyte onto a user's device every single time they load your website; you just can't. So you have a window of performance you have to hit. Between ten thousand people trying to have things thrown onto user devices? Good luck making anything resembling 'decent'.

It's the same problem that Dave Zarzycki noted in his talk about launchd[1], but worse. Instead of 512 megabytes shared between some random abstract parties you can basically ignore, it's <10MB shared between ten thousand coders, translators, graphic designers, users, managers, etc. Does something seem strange about this?

[note]: This is the appropriate comparison here; at the scale of 'Over ten thousand people working on one program', it's grunt work, not art, science, or even programming. There's a word for implementation-grunts that's fallen out of favor in the past few decades: coders. This was seen as distinct until recently.

[1] https://youtu.be/SjrtySM9Dns?t=255


I don't know how many people FB actually allocates for their main app, but this reminds me of a chapter in The Mythical Man-Month. It is said over 1000 people and 5000 man-years went into OS/360. I don't see it anywhere today.

Instead, the book proposes The Surgical Team, i.e. about 10 people taking specialized roles, with the system being the product of the mind of a few key people. I wonder how well this aged.


Fred Brooks was accurate about most things, yeah. It's hard not to envy him; he got to work with (and write a book with) Ken Iverson. Can you imagine? People back then really had all the luck!

...at least as far as computers go, anyway.


For me the performance seems better. It also seems strange that, if one of their two main publicly stated goals was to increase performance (the other being ease of maintenance), it would slow down. Maybe you have extensions interfering?

Also, the set and scale of features in the Facebook app makes it literally one of the most complex webapps out there. It's far more than just multimedia posts+messaging -- it's a marketplace, dating, games, apps, groups, pages, and more. Nobody's "failing". And the 2MB of CSS was the "before" uncompressed. The "before" compressed was 400 KB, and this update appears to reduce it to under 80KB compressed. That's 96% less than the 2MB you're complaining about, more than an entire order of magnitude.

So Facebook seems to be improving here, no? I fail to see what is a "total failure" or "clearly a problem".


I'm not sure what your setup is, but 8 seconds to load a profile is not my experience. It takes less than a second here.

This is an anecdotal datapoint that is insanely useless in the real world, but the fact that it is the top comment is typical of this site.


Really? I really like it. I almost solely use Facebook to organise Blood Bowl matches, so it’s a lot of group chats and events, and it’s so much better than the old design.

I haven’t noticed it being slower either, it’s certainly not fast, but it’s not really something I notice either.


Sorry, I won’t take blame for that.

Still haven’t found a use case for React/Angular or SASS/whatever.

If I’m guilty of something, it’s not recognizing the validity of those tools, as I’m sure they have some.

But 2MB CSS is simply inconceivable to me.


I agree, React/Angular front ends always seem slow and clunky to me.


Big +1.

I would be really interested to find one, only ONE, website where React/Angular was really bringing a better experience and better final product than a standard pure JS with simple Ajax system.


What makes you think React/Angular are products to increase user experience? Of course you won't find that, they are tools for developers to streamline development and make maintenance easier. Have you worked on a platform that uses nothing but pure JS and fetch calls? I have, and there's no way to make sense of a project that has to deal with so many things.


>they are tools for developers to streamline development and make maintenance easier.

I'm not convinced they are successful at that either.



Why is this downvoted? This link just shows you can create a basic Twitter-like application in 10 minutes with the right tool. Imagine how many hours/days it would take with any JavaScript framework!


Probably because a Twitter clone isn't even close to a complex application. You can build one in React/Vue just as fast as you would without them.


Yeah, it's a common trope. Everyone can create a Twitter/Facebook/Instagram clone in a week or so of intense work. But that's not actually the hard part. What's hard is getting millions of users in the first place and scaling to that magnitude. In that order.


Anytime you need to stream data to or from a client, like audio/video, single-page apps give a much better experience.

From an architecture standpoint, it does also allow a much simpler separation of concerns.

That said, it does get used more often than it should, and can take more time to build than a multi-page app.


Fucking gmail, for example.

The new SPA version is far, far slower than the old HTML version, and uses a truly insane amount of memory. I have 2 Gmail tabs open, and according to Firefox's about:performance page, one is using 140 MB of RAM, the other 95 MB, and both are at the very top of the list in terms of CPU usage. Above even YouTube in both CPU and memory, which is itself fairly bloated.

It is absolutely disgraceful.


Google Cloud Console felt slow to me too, and the user experience could sometimes be better as well. For example: I set filters on a table of items and clicked Next many times to paginate to the 10th page. Then I accidentally clicked an item on the page, and its detail page opened. Clicking the browser's Back button took me back, but the filter was cleared and I was on page #1 again. State was not persisted, so I had to start filtering and paginating all over again. I am not 100% sure, but I think it was the Quota listing. The old, document-style pages used to hold state!


Humans have been using physical and mental systems to manage complexity for eons, I simply can't understand this argument. React doesn't automatically fix spaghetti code if you don't know how to organize a platform-sized code base efficiently in the first place.


This is like saying "cabinets don't help organize a kitchen if you don't put anything in them" -- I mean, duh, you have to use it right. That's not an argument to have no cabinets.

There's a reason nearly ALL major web applications rely on a _framework_....rails, django, laravel, you name it. These exist because it's really hard to organize vanilla code without a framework. React and FE JS are no different.

If you're arguing against using a FE framework to organize FE code, you're basically saying "We don't need frameworks in general! All code should be inherently organized!" That's not realistic. It's just not feasible when you have a large project.


> This is like saying "cabinets don't help organize a kitchen if you don't put anything in them" -- I mean, duh, you have to use it right. That's not an argument to have no cabinets.

Good counterpoint. The parent comment sounds too much like the old "TRUE programmers don't use data structures" meme.


Yeesh, you've badly misinterpreted my comment. Re-reading it, I'm partly to blame here.

>That's not an argument to have no cabinets.

This isn't the point I was trying to make. I didn't mean to piggy back on this part of the parent comment: "Have you worked on a platform that uses nothing but pure JS and fetch calls?"

I meant to respond to this part of the parent: "they are tools for developers to streamline development and make maintenance easier." I often find that the most ardent React fans will see "not React" and jump immediately to "spaghetti of vanilla js and fetch calls," with no further questions asked.

I'm trying to argue against React dogmatism, I'm not arguing in favor of "no framework" dogmatism.


Well, if we take a step back from React and talk about a sort of data-first approach to UIs, I think they are definitely much easier than the old jQuery era.

Mutable state, notably knowing all possible variations of said mutable state and how it relates to everything, is very difficult imo.

Is React the best implementation of this? Definitely not. It will evolve as time goes on. But I don't think I could ever manage state in the jQuery mutate-everything model again.


To be clear I'm not advocating for React on every possible use case. In fact I've grown weary of it over the years, but not for the reasons the comment I was responding to pointed out. My point is that React makes the process of organizing frontend code easier, not that it's the only way to do so.


I've used react for internal administrative tools for configuring UI, and the ability to do a real-time preview of what the UI will look like based on how you've configured it is pretty useful. Also used it a number of times to build form inputs that aren't natively supported ("list of objects all with the same properties where you can select and make a change to a bunch at once" sort of thing).

Basically I've found that if you're doing something useful which could be done with jquery but would probably have had subtle bugs due to a combinatorial explosion of possible states, you can usually use react in that context to make a cleaner, faster, and less buggy version of that same UI that is faster to develop and easier to reason about (and thus better for the end user, since software that works consistently is more valuable than software that mostly works as long as you don't breathe wrong near it).

If you're looking for examples where a single-page app is better than server-side-rendered HTML with some javascript sprinkled in to make certain components more interactive, though, I can't help you. The successful use cases I've seen for react are of the "use it as the way you sprinkle additional functionality into your server-side-rendered HTML" type.


A better experience for whom? For the developer it's a way better experience. For the consumer it often isn't, because of inefficient bloat, but if effort is put into packaging it sensibly it can be better thanks to build pipelines/optimizations. In my experience, there are a lot of apps that would have never gotten written in the first place without the boost from React/Angular/etc. It simply takes way longer (which also means more expensive) to use "pure javascript and ajax."


I've been doing FE development on and off since IE5/6, starting out with pure JS and Ajax, and in my opinion the modern FE developer experience using the major frameworks is a generational improvement over the messes we used to write. There are still frustrations, but it's a lot better than it used to be.

Browser API's and CSS have all improved drastically since then, so pure JS and Ajax isn't as bad. I still avoid frameworks for small things just to keep pages lightweight. But for heavyweight projects, if you don't use an established framework, you just end up with a shitty home-grown framework anyway, because the alternative, teams of developers working on the same site with no frameworks, is even worse.


Agree completely. I'm amazed at how good pure JS is these days. I only use React when I have a non-trivial UI that I need to write. It's just not worth it for simple stuff. That said I don't do a ton of FE work these days, so my opinion is less relevant.


For a certain kind of developer who drinks a certain kind of Kool-Aid, sure. Personally I am a lot less productive when I’m forced to work on a React codebase. It’s kind of like using an ORM, it can feel like it makes things easier, but really you end up fighting with the abstraction more than if you had just learned the underlying technology itself.


The thing is, every place I've worked that was against ORMs or frontend frameworks wound up evolving its own half-documented, half-working framework. I find it's easier to spend as much time learning the abstraction as the underlying technology, and then that knowledge can be ported from system to system.


My experience too. The whole "we don't use a framework" sounds great but it always turns in to some sort of home grown framework that is poorly documented, has bugs, and isn't open source so you can't take it with you when you leave.


Related: "I don't watch or even own a TV" [watches Netflix/YouTube on a laptop six hours a day]


You also miss out on the community aspect of things. Working with a framework like React means that there is a worldwide community of developers that you can tap into when you run into a roadblock.

It is incredibly frustrating to run into an issue with a home grown framework and have to ask around only to discover that the person who wrote the part you're having trouble with left the company 2 years ago and no one else understands it.


I can't help but feel like the people who feel this way are people who don't build, maintain, and modify customer-focused web products for a living.


I updated openEtG to use React instead of PixiJS. https://etg.dek.im https://github.com/serprex/openEtG

Fixed some UI bugs, made it straightforward to implement features like better animations & replays, and improved performance by removing things like a constant timer for the UI and a global mousemove handler tracking mouse coordinates.


Come on man, real programmers use assembly and roll their own browsers.


Assembly is a needless abstraction. Real programmers create binaries with xxd.


Real programmers use a magnetized needle and a steady hand.


It sounds to me like you may have never worked in a codebase with thousands of JS files. On a major application, organizing code in vanilla JS is all but impossible. I agree it's better for a very small case, and I agree react slows things down, but that's a worthy trade-off for having organized, legible code.


These are parallel things. Frontend frameworks let you build more predictable and solid apps than vanilla JS.

Indirectly it leads to better UX since the developers can spend more time on UI tweaking, than doing it vanilla.


Well, one primary benefit seems to be to make loading indicators/spinners into first-class components.


What simple Ajax system?


That 2MB of CSS is legacy crap, most of which they didn't even use. It's the hallmark of Conway's law.


React, Vue, and Angular all use a virtual DOM, which means they'll already be deprecated 5 years from now. A virtual DOM is dumb.

Sass, SCSS, Less, etc. are kind of great though. It sucks that you have to compile them to CSS, but you can do this:

    .App {
      .Topbar {
        .Logo { color: green }
      }
      .Content {
        h2 { color: orange; }
      }
    }
Saves a lot of time and effort.
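For the curious, the nesting is purely a source-level convenience; the nested example above compiles to flat descendant selectors:

```css
/* Compiled output of the nested example above */
.App .Topbar .Logo { color: green; }
.App .Content h2 { color: orange; }
```

Deep nesting like this also produces highly specific selectors, which is part of how stylesheets balloon over time.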


Angular uses an incremental-dom not a virtual dom: https://github.com/google/incremental-dom


https://github.com/google/incremental-dom#usage

Technically correct. The point is, it's not native.
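For anyone who hasn't looked under the hood: a virtual DOM rebuilds a tree of plain objects on every render and diffs it against the previous tree, whereas incremental-dom walks and mutates the real DOM in place, skipping the intermediate tree. A toy sketch of the diff half in plain JS (all names hypothetical, nothing like production React):

```javascript
// Toy virtual DOM: a vnode is { tag, children } or a plain string.
// diff() returns a list of patch operations instead of touching a real DOM;
// a real framework would then apply these patches to the browser's DOM.
function diff(oldNode, newNode, path = []) {
  if (oldNode === undefined) return [{ op: 'create', path, node: newNode }];
  if (newNode === undefined) return [{ op: 'remove', path }];
  if (typeof oldNode === 'string' || typeof newNode === 'string') {
    return oldNode === newNode ? [] : [{ op: 'replace', path, node: newNode }];
  }
  if (oldNode.tag !== newNode.tag) return [{ op: 'replace', path, node: newNode }];
  const patches = [];
  const len = Math.max(oldNode.children.length, newNode.children.length);
  for (let i = 0; i < len; i++) {
    patches.push(...diff(oldNode.children[i], newNode.children[i], path.concat(i)));
  }
  return patches;
}

// Adding one <li> produces exactly one 'create' patch; unchanged nodes cost
// a comparison but generate no DOM work.
const prev = { tag: 'ul', children: [{ tag: 'li', children: ['a'] }] };
const next = { tag: 'ul', children: [{ tag: 'li', children: ['a'] }, { tag: 'li', children: ['b'] }] };
console.log(diff(prev, next)); // one 'create' patch at path [1]
```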


I'm not a Facebook (the company) apologist by any means, and only use it because there's a few groups on it relevant to a company I own.

That being said, I find the new FB to be insanely fast. I don't even block ads on it.

I do agree Facebook was way better 12 years ago (I saw real updates and photos about friends, rather than companies and ads). But speed right now hasn't been the problem.


I can't decide whether I love or hate the UX of this new stack. It certainly feels more like an app now than a website. I like the new basic layout, "app shell" or tier 1 rendering. It feels like the First Contentful Paint has improved and some random layout shifts have been eliminated. It might take a couple of seconds more to load something, but it appears where you expect it to appear.

On the other hand, navigation and clicking around is still sooo slow. My 60-year-old aunt called me and asked if she needs a new PC because Facebook makes her laptop fans spin like crazy. I couldn't explain all this React-Redux-GraphQL stuff to her, and frankly, she doesn't care. All she cares about is that Facebook is slow, and all she does is post photos and talk with friends like she did 10 years ago.


> If one needs 2MB of CSS for such a website

The 2MB was the old site's uncompressed CSS. The new site loads about 20% of that, or roughly 400KB.


Agreed. Using this on the i9 8-core 16" MBP and the thing just isn't fluid. Has anyone at Facebook even bothered testing this on computers people actually use? Like some 2015 Macbook Pro? Or a 2014 Macbook Air or whatever.

I wouldn't even want to know how it runs on those.


I've got a 2015 MBP. FB is pretty sluggish, and I wonder why. When you show someone a list of notifications, it's probably worth getting the data ready in case they click one. And the resource usage is pretty big as well.

The mobile app seems to be just fine though, perhaps they want to push people to use that.
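A sketch of the notification prefetching suggested above: start the request on an early signal (hover, or when the list is shown) and memoize the promise, so the eventual click resolves from cache. `fetchJson` is a hypothetical stand-in for a real network call:

```javascript
// Memoizing prefetcher: the first call for a key starts the request; every
// later call (e.g. the actual click) gets the same in-flight or resolved
// promise, so the data is ready by the time the user asks for it.
function makePrefetcher(fetchJson) {
  const cache = new Map();
  return function prefetch(key) {
    if (!cache.has(key)) cache.set(key, fetchJson(key)); // at most one request per key
    return cache.get(key);
  };
}

// Hypothetical wiring: warm the cache on hover, reuse it on click.
// const prefetch = makePrefetcher((key) => fetch(`/api/${key}`).then((r) => r.json()));
// bell.addEventListener('mouseenter', () => prefetch('notifications'));
// bell.addEventListener('click', async () => render(await prefetch('notifications')));
```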


You’re missing their main goal: having an easier-to-change codebase.

Apparently this is way more economically rewarding than performance for Facebook.

With that in mind, who cares if the site is slow (btw this is the only complaint of your rant). If the software requires a few devs to change and a few eyes to maintain, they can literally scale as much as they want. And actually now they’re probably in a way better position than they would if they had developed a super performant but unmaintainable site.

The quote, premature optimization is the root of all evil, is still very much valid imho.


> It's really sad that in 2020, 10k+ engineers can't make a photo, video, post and message sharing website that is not a pain to use.

Too many cooks spoil the stew.

I might even go so far as to say that 10 engineers would have a larger chance of success than 10k+ engineers.


This was my experience at Facebook. Attempting things with a small team (or heaven forbid by yourself) was heavily frowned upon because it didn't justify manager and director salaries. As a result you ended up with poorly performing, over-engineered codebases that favored complex, expensive systems requiring multiple teams to build, while for whatever reason complexity that would improve performance was frowned upon. I'm sure this is common at many big tech companies. I didn't work on the mainline FB app, but it seemed like part of the culture.


When were you there, if you don't mind me asking?


The 2MB of CSS is needed to justify that headcount.


Where are you getting 2MB from?


"On our old site, we were loading more than 400 KB of compressed CSS (2 MB uncompressed) when loading the homepage, but only 10 percent of that was actually used for the initial render. We didn’t start out with that much CSS; it just grew over time and rarely decreased. This happened in part because every new feature meant adding new CSS."


Read the first four words.


It really is appalling. I'm on a top of the line laptop with Gigabit internet and I can't do anything on Facebook without waiting several seconds for loading. Usually I only open it to check notifications. I just refreshed and counted and it took 9 seconds for the page to load and to show my notifications.


At a company I used to work for, we worked so hard to make sure our web app would load extremely fast… only to lose the battle with the data and analytics team over analytics scripts. They used a tag manager (Tealium) which by itself can be used for good, but it ultimately gave the other team the ability to overload our site with 3rd-party scripts.


Twitter's new design is pretty fast. Sure, it does a lot less than Facebook but it's using similar modern SPA tech.

Facebook is still pretty slow even on a Ryzen 3900X with 32GB of 3600MHz RAM. It's a lot better than it used to be, though.


Yeah, I think that Twitter did a pretty good job. After the initial load, it even works offline, so the actual API calls are the only thing that it's fetching over the network.


After the twitter update, I can't seem to get the initial load of an individual tweet to work. Ever, on any device, on any network. I encounter this issue on my laptop (on both Windows and Ubuntu), on my desktop (also both Windows and Ubuntu) and on my Android phone. It doesn't matter if I'm logged in or not, I always get "something went wrong" when I load the page and have to refresh at least once.


It is not a failure of the profession. There are engineering teams out there that excel at software performance. Granted they may not have billions of users. It is a matter of mindset and core values and those are hard to change.


Facebook was more enjoyable to use 12 years ago

We'll see what the data show. I have been reading comments about Facebook's supposed decline for as long as I've been aware of Facebook and yet their published numbers continually show greater engagement. https://jakeseliger.com/2018/11/14/is-there-an-actual-facebo...


The unquestioning supplication at the altar of 'engagement' (a sterile marketing term if there ever was one) is what led to where we are now in the first place. This is an affliction that pervades the entire consumer internet sector, but the folks at Facebook seem to have refined it to its fullest potential.

The other day I got a facebook notification on my phone, which said something along the lines of "You have 4 new messages". Of course, thinking it was from my friends I opened the app to look at them. 3 of my 4 "messages" were notifications for friend requests from people I had never met. The last one was a photo someone had posted of cake she'd baked (not to me specifically, just in her feed). To someone sitting at her desk at facebook, looking at an engagement metrics chart, the notification would seem to have served its purpose - another data point, another person enticed to open the app in response, engagement maximized. But of course, this was deception. I found this experience distasteful enough to disable notifications entirely - probably another data point for their metrics team - and annoyed enough to complain about in an HN comment.


The new "Person X has posted a photo" notifications are the worst. Their abuse of the notification icon is getting ridiculous. It used to be focused on when someone interacted with something you had done, now it's just used to drive "engagement".


I've seen people in the past make the mistake of correlating "Enjoyment" with "Higher Engagement", but you want to be really careful there.

For example - Flame wars increase engagement, even if people feel drained and frustrated afterward.

I understand why it's a useful metric - It's particularly valuable if your business model depends on time-on-site to sell ads.

But I wouldn't recommend them as a proxy for enjoyment by any means.


There are a lot of fake profiles though, and I think a lot more than they're prepared to admit. Even brazen binary-options trading scam profiles don't get removed; it appears they're happy as long as the numbers are going up.


They're trying to police 2.5 billion accounts with 45 thousand employees (including HR and developers). I'm not surprised they suck. Don't get me wrong, it's not OK for them to suck, but I'm not surprised given the 55,555:1 accounts-to-staff ratio.


>and yet their published numbers continually show greater engagement.

Do you think this might have anything to do with the fact that, as an advertising company, it's crucial that they are able to tell companies that engagement is increasing?


I would be very surprised if engagement wasn't down among Americans under 50 – it may well be counteracted by growth in other markets and in other apps (especially Instagram), but there's _much_ less activity on Facebook.com from my peers than there was five years ago.


Engagement isn't just driven by "enjoyability", so that's not a particularly convincing counter-argument.


Looking at the internet today, I think we need to lower our expectations and be realistic. At least in the US.

We are still driving 60mph on freeways and what trains we have do not travel at 300kph.

Perhaps many of us flipped out when we only had 9600 baud modems, but you could get up, brew some tea, walk the dog, or read a book while waiting for a page to load. We all had so much more patience back then.

Why do we need instant gratification with FB and other social media? Maybe, or maybe not /s.


Because the underlying computing technology has gotten so much faster, unlike the case with cars and roads.


And computing technology is not beholden to oil/car interests and NIMBYs.


Let's not even get started on their mobile apps: horribly large in size, poorly engineered (performance-wise), and with privacy loopholes everywhere (perhaps by design).


Comments like these remind me of an old post on Slashdot, "What makes a good website?" Ask "geeks" what they prefer and it's usually minimalism, no images, consistent text styling. In the end, the ideal format becomes a text file without markup. I think we need to accept that the opinions of techies are increasingly irrelevant in tech. It's like being a fine artist getting paid to design flyers, or a chef making burgers.


Everyone hates slow webpages. Not just geeks. We can all argue over whether minimalism or eye-candy is preferred. But if your site feels like running in mud, it's frustrating regardless of the design.

And all these SPA, client-side-rendered sites seem guilty of this. You navigate to a page, and it loads up "instantly", except you see nothing but gray placeholder images. Then content starts loading in, but haphazardly. You see a link you want to click, and you go to click it, when BAM! it jumps down 37 pixels because some stupid widget just loaded above it on the page.

I really hate the modern web. Not the look of it, or the styling. The mechanics and slowness.
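For what it's worth, that kind of jump is avoidable: if the page reserves a late-loading widget's space up front, nothing below it moves when the widget finally renders. A minimal CSS sketch (class name and sizes hypothetical):

```css
/* Reserve a late-loading widget's final size up front so content
   below it doesn't jump when it renders. */
.late-widget {
  min-height: 250px; /* known final height of the widget */
}

/* Same idea for media: aspect-ratio lets the browser reserve the
   box before the file arrives. */
img, video {
  aspect-ratio: 16 / 9;
  width: 100%;
  height: auto;
}
```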


I hope we someday enter a period of reform where our field begins to apply more rigor to our process. We have the tools to make a fast, sleek web. The process is very, very broken from an engineering perspective. The money seems to never stop, which is why businesses can burn through developers without too much regard to whether they are doing the right things. Maybe that can't be fixed, because the people in charge are going to keep the status quo going if the money is coming in and they're made to look good. Optimization is a risk that companies of even modest size find to be too great.

Or perhaps people involved in a revamp/redesign/unification should be well compensated to the point where they are unlikely to leave in the middle of a multi-year project. I have a feeling based on my experience that a lot of these bad rebuilds are a result of too many engineers and designers coming and going.


But, perhaps ironically, Slashdot was killed because its design updates made it more "designer-y" but much less usable. I remember one update in particular that added a ton of whitespace, gave it a "cleaner" look I guess, but it meant there were like 1/4 of the posts on the screen so it just took longer to peruse the comments. They also f'd up how they showed voting so it became much harder to just scan for popular comments and valuable discussion. I remember going to Slashdot pretty much daily and after that update just said screw this, will have to find something else.

Design updates can be useful, but just like for engineers, "beware lots of highly paid people looking for something to do".


Yea, I left Slashdot after The Great Redesign, and other websites too for similar reasons. It seems to be this inevitable milestone in any website's (or, generally, software's) life:

- V1: Focused, works, fast, lacks some features, but good enough to grow

- V1.1: More features, still performs well, exponential community growth

- V1.2: Adds chat, messages, social, loses focus, performance starts to suffer, linear or slowing growth

- V1.3: Start loading up with ads, things are getting worse, usage plateaus or teeters

[EMERGENCY! HIRE THE DESIGNERS!]

- V2.0: Huge, unnecessary re-design [1] without community input. Most features gone. More ads. Community craters. This is the Fark.com "You'll get over it" phase.

- V2.1: Saturated with ads, founders have moved on, site is on autopilot, a shell of what it used to be.

1: https://www.youtube.com/watch?v=YnVeysllPDI


Wrong. The average user does not give a shit if the page is rendered server-side or if it's an SPA.

Geeks prefer speed, like everyone else. There are plenty of papers showing that a reduction in latency improves conversion rates. And it does not have to be ugly to be fast.


Performance need not be coupled to the idea of minimalist design. A performant website can still look like Facebook and not take 5000ms to load.


First, that's just wrong.

"Geeks", as a class, will tend to focus on technical issues before aesthetic ones. Looking at that fact and immediately equating it to an absurd extreme is a fun game, if you don't care about describing reality. I know some accomplished engineers who are also good designers, and vice-versa.

Second, if my professional opinions are not being taken seriously, that usually means one of two things: I'm too far out over my skis, or am in the wrong place with the wrong people. Especially so if you feel like a chef making burgers.

Of course, if "in total control" and "irrelevant" are the only two states of being one sees, I suppose I see how you get there.


Geeks prefer function over form

Consumers (apparently) prefer form over function (or at least, they are more easily fooled into thinking the more form, the more function)


I prefer form to follow function. Make it as pretty as you can, but not at the expense of the function.


It's Facebook's own standard: "We knew we wanted Facebook.com to start up fast, respond fast, and provide a highly interactive experience.". The person you're responding to is saying they haven't met their own metrics for success based on their experience. You can't really attribute this "preference" solely to them.


Can you please compare the speed of https://yang2020.app/events ?

On the mobile web or desktop, either one. (We're running it off of one server, it might get the HN effect, we'll see.)

We have been building our own, open source social networking platform and we have tried to make a lot of things more efficient while doing so. The site I linked to didn't minify any files or optimize images. However, it loads things on demand as needed, and even lazy-loads entire components.

Is it faster than Facebook? We have our own component system, not React.

Here is a site that did minify and combine all files: https://intercoin.org

And here is the platform we used: https://gitub.com/Qbix/Platform (warning: not all of it is documented, but enough, at https://qbix.com/platform/guide).


I've never been on Facebook so I can't compare but your site responds pretty quickly for me. Also you left out the h in GitHub in your link so it goes to a domain for sale site.


Remember when working at a FAANG company was supposed to be some mark of pride?


All this and the chronological sort mode is still totally broken. The third post in my feed is 3 days old, followed by one from 2 hours ago. Total joke.


They really don't want you sorting chronologically. That negates The Algorithm™.


The new design is so laughably bad. I was trying to send a message to someone on the website, it took 10+ seconds to open the chat window then when typing it couldn't keep up with my typing (I'm not an especially fast typer either). It was like having a 5 second ping over SSH. This is on top of the (pinned) tab regularly needing to be closed as it slowly takes up system resources.

This is all on my 8core/32gb workstation. I can't even imagine how much utterly useless crap they are running in JS to make that kind of experience.

On the bright side it does mean I am weaning myself off as keeping a pinned tab open is a non starter so I can't just have a quick refresh. And I'll be fucked if I'm installing their apps on my phone.

So I guess thanks needs to go to the FB engineers for making their new website so utterly garbage that the tiny dopamine hits driven by the FB algorithms are worth less than the pain of using the site.


I wonder if they fixed the bug where if you visit the site with Safari on an iPad, when you try to type a comment, every space becomes two spaces. Also, I wonder if paste (command-v) is also randomly blocked at times.

I use mbasic.facebook.com as much as possible. Occasionally I'll use m.facebook.com. I've had the mobile apps uninstalled for ages.


I still don't understand how the biggest websites get away with being so unbelievably bloated. My guess is that most people have medium to old phones and PCs that are bogged down with nonsense running in the background and facebook, instagram, twitter etc. run extra slow, but I guess people just put up with it.


people just put up with it


> Quite sincerely, it's a total failure. I got the chance to try the new interface, and it's so slow that it's barely usable.

Were you using a machine with a gigabit connection, 32GB RAM, and 10th gen intel cpu like the devs?


I don't use Facebook, but how do you know it's not BE calls that are creating the slow experience? It sounds like they rewrote the FE, not the entire system.


Does the compensation of those "engineers" reflect that they have "failed"? Perhaps there is another way to evaluate the work, not from the perspective of the user waiting in front of a screen. Do not forget that the money to pay the salaries of those who do this work does not come from users.


I don't know if I have the new stuff or not, but I agree that some parts are currently frustratingly slow to use on desktop. I figured it was because so many people are spending much more time on it (including, tbh, me). But I was trying to message an old friend and simply gave up a few days ago.


Agree wholeheartedly. Facebook is a disgrace of a website and has been for years. Crazy how slow it is to load.


I think you could add Twitter and New Reddit to the list as well.


In France we have a Craigslist-like website. They recently moved to ReactJS: https://www.leboncoin.fr/

The website's features didn't change in the process. It's basically pagination + a search based on radius (so DB-related) + name (so DB-related) + categories (so DB-related).

The complete website could be built in pure HTML + CSS and a bit of JS + Ajax to refresh parts of it.

But no, it's built with ReactJS, and it takes seconds to search for a simple item on it.

To compare, just try the same search on the Dutch equivalent, MarktPlaats, https://www.marktplaats.nl/. The experience is way snappier, way lighter, the features are the same, and it's just HTML + CSS + a bit of JS.

We made a mistake with React/Vue/Angular. And we should really go back and stop using those frameworks.
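The "pure HTML + CSS and a bit of JS + Ajax" approach described above really can be this small: ask the server for a rendered HTML fragment and swap it in, no framework needed. A sketch, with a hypothetical `/search` endpoint and `#results` element:

```javascript
// No-framework search refresh: the server renders the results as an HTML
// fragment; the client just fetches it and swaps it into the page.
function searchUrl(query, radiusKm) {
  const params = new URLSearchParams({ q: query, radius: String(radiusKm) });
  return `/search?${params}`; // server returns an HTML fragment for this query
}

async function refreshResults(query, radiusKm) {
  const res = await fetch(searchUrl(query, radiusKm));
  document.getElementById('results').innerHTML = await res.text();
}
```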


> And we should really go back and stop using those frameworks

I think you are cherry picking.

There are plenty of examples in the modern web of terrible React/etc implementations but that doesn't mean the approach in itself is bad.


> There are plenty of examples in the modern web of terrible React/etc implementations but that doesn't mean the approach in itself is bad.

If the creators of React can't get it right, then what hope is there?


React is bloated and slow, but there are dozens of frameworks other than React.

https://krausest.github.io/js-framework-benchmark/current.ht...


Please name some.


Some good React alternatives?


I've been playing around with lit-element lately and am finding it a much simpler experience overall. Likely a lot faster as well: https://youtu.be/uCHZJy2n8Qs


Lit is excellent.

It's also possible to use htm with Preact for developing without a bundler, if your target users' browsers support ES6.

https://github.com/developit/htm


What's the reason for redoing a website that provides the same functionality? Why use an SPA for basic website features?


They did the same with a local Czech eBay-like website, https://www.aukro.cz/ . Previously it was a fast website. The transition to the webapp was painful; their filtering component wasn't working properly on mobile. It still takes several seconds until the initial white page switches to the rendered DOM. They lost many customers due to this painful transition. I think they've fixed some of the problems (I know the filtering component works OK now), but I basically stopped using it after that transition too.


Thank you for this post. I yell at my screen each time I have to use leboncoin.

It was super ugly, but it did the job. Now it's super ugly, but it steals my focus at every opportunity, refreshes parts of the UI I'm about to click, or has select inputs that love to play hide and seek with my cursor. It's a UX nightmare.

I must say their new payment system is nice. But boy do we have to suffer when looking for something to buy now.


Could not agree more. It is barely usable now. The exact opposite of the original spirit. A complete failure.


Agreed. SPAs only make sense for a few websites like Trello; for most other websites, plain dynamic HTML with a dash of Ajax here and there is much better.


You know it's possible to use any of the modern JS frameworks without SPA, right?


It has 96 .js scripts on a single page.


This used to be a bigger deal before HTTP/2 made the number of concurrent requests virtually unlimited.

Unless I'm missing something, it's "optimal" for a site to ship many split files. If their JavaScript were 1 file, a change to a single character would mean re-downloading every bit of JavaScript. Instead, with 96 files, 95 of them stay cached client-side and only 1 needs downloading.


It looks to me like these scripts are asynchronously-loaded components that load only once they're needed. In this case it looks suspiciously like they're nested and each script download causes another script to be downloaded once the component renders, which would make HTTP/2 a moot point. I can even watch the requests pile up in the dev tools when they're cached, so my guess is if they dumped everything in one file (or even 10 files, just not 95) they'd get noticeably improved performance.
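The waterfall described above is easy to simulate: if component A only requests component B after A has arrived and rendered, total load time is the sum of the round-trips, no matter how many requests HTTP/2 can run in parallel. A toy sketch with a fake loader (`LATENCY` is hypothetical):

```javascript
// Fake module loader: every "download" takes one network round-trip.
const LATENCY = 100; // ms per request (hypothetical)
const load = (name) => new Promise((resolve) => setTimeout(() => resolve(name), LATENCY));

// Nested components: B is only requested after A has arrived,
// so total time is the SUM of round-trips regardless of HTTP/2 concurrency.
async function serial() {
  const t0 = Date.now();
  await load('A');
  await load('B'); // can't start until A is done
  return Date.now() - t0; // ≈ 2 × LATENCY
}

// Flattened bundle (or preloaded chunks): both requests start immediately.
async function parallel() {
  const t0 = Date.now();
  await Promise.all([load('A'), load('B')]);
  return Date.now() - t0; // ≈ 1 × LATENCY
}

serial().then((s) => parallel().then((p) => console.log(`serial ${s}ms, parallel ${p}ms`)));
```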


Twitter and the new Reddit aren't in the same category. Twitter is still somewhat usable.

Reddit, on the other hand, is an absolute clusterfuck from a layout and usability perspective. The choices made were not made for monetary gain; they are simply really bad design choices that for one reason or another have not been fixed.


Aah, the Reddit redesign. I still can't use it. I check up on it once in a while, but my 7-year-old MacBook Pro still doesn't like it. It remains slow, regardless of the number of times Reddit claims to have improved the speed.

On the phone it's even worse: a large number of subreddits now require that you use the app... unless you just go to old.reddit.com.

The point of the Reddit redesign still eludes me. Sure, the old design isn't mobile-friendly, so I can understand that they would want to fix that. Then again, they mainly use the redesign to push the app. And the new design isn't that mobile-friendly anyway; certainly not if you value battery life. Comments are now also hidden by default, which is just weird. But you do get infinite scroll, which seems to be the main selling point. I'm not sure I needed that, though.


Totally agree.

It's incredible that ugly old.reddit.com with RES still provides a better user experience.


I like the redesign actually. The default view is terrible, though. Once you start using classic or compact view it's usable.

Like 80% of my reddit usage is on mobile by now, though. So it doesn't matter that much to me.


Mobile twitter website gives me an “oops, something went wrong” or “you’re doing that too much” error probably about 50% of the time I open a link to it.


I sort of tolerated New Reddit at first, until I experienced first hand their (to the user) cynical reasons for doing it. Namely: throwing mobile app prompts in your face, blocking some reddits unless you're logged in, inserting ads masquerading as proper posts...

Even then, I could cope with some of it, except that they just totally broke the experience with shitty infinite scrolling. You can't click a damn thing and hope to go back to where you left on a post. Sometimes even old.reddit.com will redirect you to the new version now.

These redesigns would suck less if they were more about being functional and not about scraping every last morsel of engagement from unwitting visitors, through whichever devious methods they can imagine.


Yes! I use old.reddit.com and am not a twitter user so I don't have much exposure there.


Yes, in particular, for large discussions, when I get a notification about a comment I made on it, and I click to jump to that comment, it takes a while to load -- and in a way oddly proportional to the size of the discussion and frequency of the posts on that group.

I'll always see the very top post of the group, and its whole page load, and then slowly my discussion will come up and then it will scroll down to that comment; and if I do anything, that breaks the whole process.

It's like no one ever considered the concept of just loading a piece of the discussion, like reddit does.

What's more, there are all kinds of UX nightmares, like how, if I open the messenger in one Facebook tab, it opens in every tab, blotting out content I want to read.

Or how a FB livestream event will just randomly stop playing, giving me no indication that I'm lagging behind the current video -- I've done trivia nights that way and I only find out I'm behind after my team members suggest answers to questions I haven't heard yet.


Not sure why I'm being downvoted, in what way is Facebook's performance not atrocious?


There's too much CSS and JS because of web components and too many teams.

* team for web component A => CSS, JS

* Team for web component B => CSS, JS

And so on with 1000+ components,

It ends up being a big pile of mud where everything may be duplicated, just under a different name, and fails to be optimized away and removed.


I wish they would just give us APIs and let us build our own experience.


I'm curious what your response to the Cambridge Analytica stuff was? Open APIs to build your own experience are like 100 times worse than that, at least the APIs that CA used were limited (such that they didn't provide enough info to actually recreate FB) and required CA to sign a developer agreement with FB to restrict how they could be used.


But then how would they force us to look at ads? How would they keep all their valuable data locked in their walled garden? Sadly I don’t think we’re ever going back to the glory days of open APIs.


There's always m.facebook.com if you want a retro experience.


I haven’t tested but I doubt the problem is with the CSS, is it?


Where did you get 2MB from?


Facebook isn’t trying to maximize your enjoyment, they’re trying to maximize their profits. By that measure they are doing vastly better than 12 years ago.



