The Google home page is 500K (web.dev)
323 points by tomcam 4 months ago | 250 comments



A more intriguing fact is that Google's search result page will often run 2-2.5MB in size (!). And sub-500ms results are a thing of the (distant) past. More often than is comfortable, the user will wait between 1.5 and 2 seconds for results to load. [1]

This has gotten out of hand for a company with so many resources, one that literally depends on page speed for revenue. It is incomprehensible that the size (which is probably the main reason for the latency) cannot be reduced.

At Kagi Search [2] (I'm the founder) our home page (for logged in users, not the marketing landing page) is a mere 68kb, everything works even without javascript, and 90th percentile result pages load in <1s, with the SERP page averaging 100-200kb total.

[1] https://pagespeed.web.dev/report?url=https%3A%2F%2Fwww.googl...

[2] https://kagi.com


Honestly, if I have to be logged in by default on a search page I'm not gonna use it.

Assuming that people don't clear their cookies is wrong nowadays. Even when your website is a little snowflake, the rest of the web isn't.

I honestly would recommend a Web Extension that adds your search as a search provider, or something like an Open Search Description.
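For reference, an OpenSearch description is just a tiny XML file the site links from its homepage; browsers then pick it up and offer the site as a search provider. A minimal sketch (the kagi.com query URL here is my assumption, not confirmed):

  <?xml version="1.0" encoding="UTF-8"?>
  <OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
    <ShortName>Kagi</ShortName>
    <Description>Kagi Search</Description>
    <Url type="text/html" template="https://kagi.com/search?q={searchTerms}"/>
  </OpenSearchDescription>

The homepage then advertises it with <link rel="search" type="application/opensearchdescription+xml" href="/opensearch.xml" title="Kagi"> in its head.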

Also: the landing page of GitHub annoys the shit out of me. Every damn time. I don't wanna register with three clicks if I already have an account.

Make a landing page easier for regular users to login, and prioritize login over signups...otherwise those regular users won't be happy. Any company should focus on keeping the regular users happy (UX wise) and not focus on one time trial registrations that will be a burden on infrastructure.


I get the impression that startups get something like a 10x multiplier in kudos, stock options, chances for a next round (pick your reward) for signing up a new user compared to a repeat visitor logging in. The former means growth; the latter doesn't. Hence it's always easier to find the signup button than the login button.

Fucked up incentive structures like this are why we cannot have nice things.


Having users that want to return is objectively good. We've just drawn the wrong conclusions from it.

I strongly suspect that this has to do with attribution and shallow "data driven" choices - there is a heavy bias towards things that are easier to measure. Especially in startups with limited resources.

Measuring retained users regardless of anonymous/known state is very hard and lossy. So the nearest easy-to-measure analog that often gets picked is logged-in users, and we start the fucked up cycle you mention.

Similar things happen around all metrics that have to do with retention. We know that a better experience through higher performance and better UX is likely to create happier users that return... but we cannot prove it (easily) so we focus on optimizing other things.


I've been in more product meetings than I care to count where someone uttered a phrase like "we will work out what to do with all these users later". It really is a startup goal to sign up as many users as possible. In most cases it is literally a metric for the next funding round ("Last quarter we had 100k signups!" is a very attractive soundbite at the round table).


The minute something becomes a KPI it is no longer a useful KPI.


That's what extreme shortsightedness looks like.

The number of users you lose by doing that is smaller than the number you gain, as long as you don't yet have enough users (and spend enough on publicity). In most cases it is also the one thing that limits the number of users a company will ever have.


I don't even mind logging in, but it does seem weird to force it before letting potential customers interact with the product. Why not just prompt for registration when personalization features are used?

What I do mind is the price...if you read the FAQ, they want to charge as low as $10/month, $30/month for families once it exits beta. ("As low as" means "possibly higher")

That's very high. The family price equates to 16 Office 365 Family seats, which includes 16TB of cloud storage.

In my opinion, the price for this kind of product should be more in the realm of $50/year. I'm thinking a price closer to what password managers charge.

This price also will need to be adjusted for global customers if that is the eventual intention. Spotify in India costs less than $1.50 a month.


The goal of Kagi is not to be a mass market product, but a premium search engine offering a best-in-class search experience with top privacy guarantees (as the business model does not require fiddling with your data in bad ways). Search is arguably one of the most important activities we do every day, in both personal and work settings. Unique Kagi features should be well worth the cost.

It is also an independent company, bootstrapped by the founder, without a ton of VC money to subsidize the cost. And at $10/mo per user it would be barely breaking even, if at all. So there are a lot of factors at play, and I completely understand why $10/mo may sound excessive given all the free alternatives out there. It is just one of the many uphill battles we are fighting ;)


tbh what you're saying doesn't make sense to me regarding breaking even, unless there is some significant manual element to your search indexing. If that isn't the case, the revenue from new customers should far outpace the marginal cost of supporting them, unless you have significant fixed costs for some reason. I am interested in your product and will sign up for the beta, and if the quality is flawless I'd even be willing to pay the price, but I'd suggest doing some analysis of how many users you would need to reduce the price closer to something like $5 a month, which is far more palatable to me.

Right now you're charging nearly as much as something like Netflix, so your product has to offer nearly a Netflix's worth of additional value over DuckDuckGo (rather than Google, as DuckDuckGo is closer to your value proposition) for me to rationally justify the price.


I definitely understand where you're coming from. One can make the argument that a search engine is used with an incredible amount of frequency for most people – dozens of times per day. It should be worth a lot.

One thing that's great about search engines in terms of entrepreneurship fundamentals is that it is such a massive market. You only need a fraction of a percent to be a viable business, I'm sure.

I can also understand how underpricing a product can be a big mistake. Somebody who wants this kind of product has already made the conscious decision to pay. Pricing a product too low can be a signal that it's not a good product.

Perhaps one might say that this is an example of how complexity and cost of production is a bit detached from the customer's willingness to pay. The most complex software product, the operating system, is essentially given away for free. People are mostly not willing to pay for it, even businesses don't want to anymore. On the other end of the spectrum, there are SaaS products with very low levels of complexity pulling in a lot more than $0.

If it's truly barely breaking even at $10/month, that fascinates me. It must mean that search engines have to do a lot of computing.

In any event, I wish you the best, because the features do look cool. I like the idea of being able to set site preferences (e.g., when I search for a movie I might like to see Wikipedia over IMDB).


> Perhaps one might say that this is an example of how complexity and cost of production is a bit detached from the customer's willingness to pay. The most complex software product, the operating system, is essentially given away for free.

Spot on. Not that long ago we used to pay for an OS, even beta versions of an OS. Nowadays the most complex software products imaginable - OS, browser, search engine, social networking - are all free, and over the years we as a society have slowly been drawn into expecting them to be free. Of course someone or something is paying for it, as all of these also "magically" generate hundreds of billions of dollars of revenue. And when someone shows up with an idea to actually sell such a product, and tries to create a business where the user is also the customer and the incentives between the user and the business are aligned, they are usually looked at with a raised eyebrow.

I am personally of the opinion that you always get what you pay for and that one should not shy away from paying and supporting products they love. I hope we are building one such product, that some people (and admittedly you need to be a little different to use and pay for Kagi) will fall in love with.


If it really starts to take off I think you should consider lowering the price. Initially I understand a higher price is required to cover expenses. A price low enough not to matter would lure more customers into the fold (if the product delivers on its promise).


The more expensive it is the better it will have to be relative to free alternatives though.

Is it that you're doing a large amount of compute per search which yields better results but higher cost per search?


Have you considered adding business or enterprise subscriptions[0]? B2B users are a lot less price sensitive[1] than consumers especially when you're competing against a free product.

[0] https://news.ycombinator.com/item?id=2781133

[1] https://news.ycombinator.com/item?id=13843743


Not really in a serious way, partly because I have no idea where I would start. Any ideas?


> Not really in a serious way, partly because I have no idea where I would start. Any ideas?

On-premise search. Businesses have lots of resources on their internal websites, but no easy way to search them in one place. Add role-based search: when I'm logged in, search only shows me resources I have access to.


> What I do mind is the price...if you read the FAQ, they want to charge as low as $10/month, $30/month for families once it exits beta. ("As low as" means "possibly higher")

$10 a month is less than one lost hour of work.

I hope they price it around there somewhere so that many will use it but even at $20 it is a no-brainer for me.

The results are in another league compared to what I get elsewhere and by now it also feels like an insult every time I realize Google has altered my query: do you guys really think I don't know what I am doing?

Google probably works better for illiterate people these days, but in the process they have alienated me. I have lost too many hours searching for things I knew used to be there only to be ignored and corrected by their AI. It is rather patronizing at this point.


> $10 a month is less than one lost hour of work.

This is a pretty feeble argument, especially for a product where the user's perception is that it should be free and multiple free options already exist.

While "one hour of work" might not seem a lot, a different way of looking at that is that for most people there are 160 hours of work in a month, and so this thing needs to be worth 1/160th of their life.

It's not so hard to make that choice for a couple of things that are of obvious value, but when they start adding up in your life for basic things, people will question the value of it to them, and I'd argue for most people, search just isn't worth that much to them when there are so many good free options.


For people who cannot afford the $10, I can unironically suggest bing. It actually responds to long queries much better than google.

You should still use the website's search function first. Github issues/SO search often turns up results that google will ignore.


It’s unfair to compare pricing between two totally different products, especially when one of them is owned by one of the biggest companies in the world with millions of paying customers and a vast suite of products.


>That's very high. The family price equates to 16 Office 365 Family seats, which includes 16TB of cloud storage.

Offtopic: that must be one of the best deals on online storage, excluding Google's enterprise unlimited storage option, which always felt like they could pull the rug out from under you any day.


Whitelist github.com from auto cookie clearing. It takes less time than a rant on SO. I had a guy at work ask me to store the entire state of an internal web site in the URL. He went so far as to get his change request approved by management, with resources and time allocated. When it was passed to me to implement and I dug into why he wanted this, it turned out he was auto-clearing his cookies. I told him to whitelist our internal website from his cookie-clearing paranoia and stop wasting my time and company money.


Wow, this guy must have some pull in the company for this to happen


> Make a landing page easier for regular users to login, and prioritize login over signups...otherwise those regular users won't be happy.

This is one of the first "asshole design" moves I notice, and unfortunately most sites use it. If the "Sign up" button (which will be used once by each user) is highlighted more than the "Sign in" button (which will be used hundreds of times), I know that your company doesn't really care about its users.

I'm surprised how common it is.


> the landing page of GitHub annoys the shit out of me. Every damn time. I don't wanna register with three clicks if I already have an account.

?

If you're not signed in, `Sign in` is available on literally every page in the upper right and one click takes you to the sign in box. If one click is too many just bookmark github.com/login.

If you don't want to sign in the search box is also in the upper right on every page to take you wherever you want to go.


It is expected that if you are logged in, you stay logged in. Only a minority of users clear all their cookies at the end of every session. Sorry, but that is true. Thus the landing page is optimised for the most common visitor, that is, the one with no account yet, not the loudest complainer on a tech geek forum.


> just bookmark github.com/login

This will get me detected as a bot, and then I see captchas everywhere. I used that approach for a while.

A better alternative would be content identity through URLs.

If your marketing page is at github.com/index.html, then make something like github.com/dashboard.html for logged in users, and redirect them to /login.html when they are required to be logged in.

This way users can easily bookmark the page that they actually want to see.
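A minimal sketch of that split as Express (TypeScript) routes; isLoggedIn here is a placeholder for whatever session check the site actually uses:

  import express from "express";

  const app = express();

  // Placeholder session check - a stand-in for a real auth layer.
  const isLoggedIn = (req: express.Request): boolean =>
    Boolean(req.headers.cookie?.includes("session="));

  // Marketing page for anonymous visitors, dashboard for known users.
  app.get("/", (req, res) => {
    if (isLoggedIn(req)) return res.redirect("/dashboard");
    res.sendFile("index.html", { root: "public" });
  });

  // A stable, bookmarkable URL for the logged-in view.
  app.get("/dashboard", (req, res) => {
    if (!isLoggedIn(req)) return res.redirect("/login");
    res.sendFile("dashboard.html", { root: "public" });
  });

  app.listen(3000);

Because /dashboard is a stable URL, a bookmark to it survives logouts: you just bounce through /login and land back where you wanted to be.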


Github already has my public ssh key so I can push/pull/clone without typing a password. Why the heck don't they support ssl client authentication so the same thing works when I use a browser? SSL client certs are an actual thing yet no web site I have ever seen uses them. Why?
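For what it's worth, the server side of this is very little code. A minimal sketch with Node's built-in https module (TypeScript; the file paths and CA setup are assumptions):

  import * as fs from "node:fs";
  import * as https from "node:https";
  import type { TLSSocket } from "node:tls";

  const server = https.createServer(
    {
      key: fs.readFileSync("server-key.pem"),
      cert: fs.readFileSync("server-cert.pem"),
      ca: fs.readFileSync("client-ca.pem"), // CA that issued the client certs
      requestCert: true,                    // ask the browser for a certificate
      rejectUnauthorized: false,            // handle the no-cert case ourselves
    },
    (req, res) => {
      const socket = req.socket as TLSSocket;
      if (socket.authorized) {
        // Identified by certificate - no password ever typed.
        res.end(`Hello, ${socket.getPeerCertificate().subject.CN}`);
      } else {
        res.statusCode = 401;
        res.end("No valid client certificate presented");
      }
    }
  );

  server.listen(8443);

The usual answer to "why not?" is UX rather than server effort: browsers' client-cert install and picker dialogs are clunky enough that almost nobody ships this for consumers.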


Using google search without being logged in and clearing cookies regularly means it will constantly harass you with a very intrusive "cookie banner". Quotes because it's much more than that.

The sole purpose of these is to get you to globally log into google and stay logged in.

So I don't think it really makes much of a difference.


The sole result for me has been a pretty insistent reminder that I should not be using this service.


Agree. The routine of opting out of that every day is very important. It is a constant reminder that I absolutely need to further reduce the use of google services.


That's true in Europe but not in the US, I believe. It is indeed one of the more obnoxious cookie consent banners I've seen.


It's annoying as fuck but tbh I got used to it, it's second nature to me now.


"I don't care about cookies" + "Cookie Autodelete" add-ons solve that problem easily for me.


> Honestly, if I have to be logged in by default on a search page I'm not gonna use it.

Curious: how would you handle authentication for a paid product on the web today?


Not the person being asked, but unless it's for work I just don't use it. There's more content online than I can ever consume, and if it's a tool I need for a side project I might as well build it myself, since I work on side projects solely to keep my tech skills up.

Edit: I do authenticate on several paid products, but I was always introduced to them in some hassle free manner, and only got to the point of paying and dealing with adding another app to deal with maintaining after I’ve grown to like the service


You’re authenticated here? What’s the difference?


Did it after years of lurking and finally feeling like it’s worth it. This was in response to the application in question requiring authentication from the get go.

Ultimately it’s laziness, but dealing with authentication is annoying and I can’t be assed to do it for something I’m not already valuing


Not just laziness, but also increased attack surface for compromised credentials. Just as physical things weigh us down (spring cleaning or donating old stuff is liberating), so does having another login somewhere. I wish I knew all the sites I have logins to, but I don't, and it's frustrating, risky and worrying.


I have two periods in my life. Before password manager and after password manager.

After I made the password manager a part of my routine that I trust absolutely that I never skip, all of that anxiety went away. I still have logins from the before times that I wish I could round up, but for the last ten or so years I can guarantee I know every site that I’ve signed up for.


That, and data being sold: haveibeenpwned gives me a larger list as the years go on, from all the "secure" services that weren't.

I feel bad for people making new services now who have to deal with this, but the internet isn’t new and they have to overcome our experience with all the bad behavior of shysters who operated like these apps and ended up lying about the service


If I had to be logged in to view HN the very first time I arrived here, I wouldn’t have bothered.


(I'm not the person being asked.)

As of now, I log in only when needed, and usually to a single account at a time, at most two.

I am willing to try new models for web search if the provider does not log any information. I'd prefer it to be stateless (other than account settings and payments).


> Make a landing page easier for regular users to login, and prioritize login over signups...

This. I suppose it's become so prevalent due to A/B tests finding it more efficient to optimize for new users, but at some point in a company's growth it would make sense to put the focus on user retention. Then again, it wouldn't be that hard to make both signing up and logging in easy and attractive.


I've tried Kagi for the last two or three weeks, and the page speed is a nice bonus, but the real advantage it has over DDG and Google is that it finds the pages I look for, ranks them reasonably well, and doesn't mix in pages that don't contain my search terms.

It is just like being back in 2007 or something when Google worked this way and it is wonderful.

I've used DDG for the past few years and I still hope they too will succeed, but with DDG the quality is the same or (slightly) lower than Google and the bang operator comes in handy many times a week.

With Kagi I think I have only used the bang operator once (yes, they have it). It is that good: when I look at the results I know immediately that there is no way Google or DDG can do better except by pure luck, and my results are already there on the first or second try, so why bother shopping around?

Bonuses:

- bangs work like in DDG (I think, I don't use them anymore)

- doublequotes actually work unlike in Google and DDG

- even without doublequotes it doesn't stray very far from my query

- I can add lenses (?) or something (I have not bothered yet, the defaults are good enough)

Yep, I'm a walking billboard now, but I'm not paid in any way and don't know any of the people behind kagi; it just happens to be the biggest improvement I have seen since Google replaced everything else some twenty years ago.

Edit: yes, it is a paid product but it is free during beta and so good that I would consider Kagi gift cards for friends in tech if it was paid, just like I'd consider gifting WhatsApp subscriptions before Facebook bought and subverted it (yes, I was considering worse words).

Edit 2: I'll do this for your product too if I love it. I did for WhatsApp and Google+ and in real life for the insurance company I have used the last few years.


I'm having great results with Kagi as well. I had actually forgotten I signed up for the beta a few weeks back until I saw this post, and started testing it out this morning. Thus far, I'm quite impressed - similar sentiments to what you have outlined.


This testimonial is the best Christmas gift the team could have hoped for. Thank you very much!


Good to hear!

Let's be clear that this goes both ways: Kagi is a really nice Christmas present. So far this is the biggest software improvement I've seen in a number of years, on the same scale as when generics were introduced to Java (and that was a big deal for me), or in fact even on the same scale as when I was introduced to Google. I'll have to keep using it for a few months to see exactly how big it is, but it feels like a really big deal so far :-)

Wishing you a merry Christmas whenever you celebrate it (we are celebrating tonight).


> - doublequotes actually work unlike in Google and DDG

This drives me nuts. If you're going to have a competitor to Google (ddg) at least give me proper functionality to refine my searches


Yep! How this passes QA, and why everyone else who knew old Google isn't crying out loudly, is beyond my understanding.


Don’t you think they have run tests that trade off search speed for additional features and determined the features increase engagement / revenue?

There is a lot more going on than 10 blue links these days, including continual scroll on mobile. Perf versus features is always a matter of trade offs.


I think it's more that they have a monopoly on search in the current zeitgeist. So no, I don't think they've done that, and if they did, there's no pressure for them to optimize for speed, so they'd optimize for something bloated that helps someone get a promotion. This is why, for example, everything Microsoft Office does is a bloated trash fire, and how Apple gets away with telling a whole generation of phone purchasers that they were holding it wrong.


Even with a monopoly, they make more money the more often it’s used.


> and how Apple gets away with telling a whole generation of phone purchasers that they were holding it wrong.

Okay. I'll bite. What does this mean?

I hold my iPhone 13 the exact same way I held a Nokia brick in 2003. What did Apple tell us?


11 years ago, if your finger bridged the two antennae together on an iPhone 4, you lost signal strength. Half a percent of iPhone users complained about it, Apple offered a free cover to anyone affected.

They were HIT WITH 18 LAWSUITS about it OMG.

They settled all the lawsuits by ... offering a free cover or $15 refund.[1]

Depending on your viewpoint this is either one of many blips along the way of manufacturing tens of billions of products over a dozen generations, or a way for "mactards [to] realize Steve Jobs are out to milk their clueless a$$es with his oh so clever marketing gimmicks" (Yahoo News comment).

Bonus points if you can find anywhere Apple said "you're holding it wrong" or blamed users. Steve Jobs wrote "Just avoid holding it [the iPhone] in that way.", which is somewhat pragmatic, and rather what you'd expect someone to say if they weren't filtered by a PR firm. Probably what I would say. Apple haters can't face this.

Macworld.co.uk said:

> "We don't really think there was anything substantial to the problem, and the iPhone 4 sold tremendously well. Customers didn't seem to notice, let alone mind, and Apple repeatedly pointed out that other mobile phones suffered the same effect."

[1] https://www.cnet.com/news/settlement-reached-in-iphone-4-ant...

[2] https://www.macworld.co.uk/feature/antennagate-11-biggest-ap...


Jesus relax man your favourite trillion dollar company can survive someone remembering the shitty stuff they do. I'm sure they appreciate your help though.


If you're going to get hurt feelings and need to turn to swearing and ad-homs when your comments are put in context with citations, maybe don't post them?

Apple aren't my favourite trillion dollar company, and just because they have money doesn't mean everything they do is bad, or that the fundamental attribution error applies differently (the same goes for other companies, groups, and organisations).


No, they have money and they use it to do bad; that's what makes them bad.


The iPhone 4 had a bad antenna design. Jobs blamed the users. It was hilarious.


:facepalm: of course!


Gmail has a fucking loading screen now. Have I switched? Not yet, but the reason I switched to gmail in the first place was for the minimal, fast interface.


Gmail was the original heavy webmail interface. All its competitors at the time were basic HTML (and provided only 5mb of space). Heavy JS to provide a smooth interface was what made Gmail such a big deal, other than the high storage limits.


Gmail always had a loading screen.


I suppose you are right. I can only guess that RPU is the primary KPI and somehow more scripts allow for more revenue extraction even if the performance is lower.


I think you’re overthinking this.

You’re looking at cold-start download sizes, but that’s not what people actually see after everything is cached following the first load.

Google pages load plenty fast on every device I’ve used recently, including some very old android tablets. Most page loads don’t transfer much data at all due to caching. Combined with the high speeds of both browsers and devices today, there wouldn’t be much real change in a highly stripped down Google page relative to what they’re serving up today.


But it is not just a cold start problem. Check your network tab on a "warm" search.

Here is what I get:

https://imgur.com/a/Dl2B5Zo

2 seconds to load and 1.12MB transferred, all on "warm" start.


Using Chrome Dev Tools I'm seeing around 150-200KB actually transferred on each search page result (2MB resources, though most are cached).

I'm also seeing content loaded happen in under 1 second.


Can't tell from your UI screenshot, but quite often loading a page with devtools open will disable the cache. I forget if this is the default behavior or not.


And the top half of the results is all ads! I don’t know how people stand Google.


Or maybe just give it away to ppl and see if they like using it to find stuff?

Where did the love go....


I am sure that is the case, but I am not sure it is good for the user: Google adds a lot of ads and behavior tracking that probably is for their benefit, not yours.


> determined the features increase engagement / revenue?

This. Nothing else matters.


> Google's search result page will often run 2-2.5MB in size

While it is true, for almost all of the users most of it is cached. For me it is using 2.5 MB resources with 77kB transferred.


That's important of course, but the cached code will have to be refreshed when there's a change (could be done partially with code splitting), so how long the cached data lasts depends on how often the Search page is updated. For Google that probably isn't happening daily, but it could be very often. Hopefully Google have split out the hot code that changes a lot into a small bundle so the rest can sit in the cache for a while.
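The standard trick is exactly that: give each bundle a content-hashed filename and mark it immutable, so only the small HTML shell ever gets revalidated. A sketch in Express (TypeScript), assuming a build step that already emits hashed names like app.3f9c1a.js:

  import express from "express";

  const app = express();

  // Content-hashed bundles never change in place, so cache them "forever".
  app.use("/static", express.static("dist/static", {
    immutable: true,
    maxAge: "1y",
  }));

  // The HTML shell is revalidated on every visit and points at whichever
  // hashed bundles are current.
  app.get("/", (_req, res) => {
    res.setHeader("Cache-Control", "no-cache");
    res.sendFile("index.html", { root: "dist" });
  });

  app.listen(3000);

When the hot code changes, only its (renamed) bundle is re-downloaded; everything else stays in the cache.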


That cached code (presumably) still needs to be executed by your computer, so it's still wasteful, slower, and somewhat mind-boggling as to why that page needs 2.5mb worth of html/css/js.


That's a lovely landing page and a nice domain you have there.

I really like the idea of a premium search engine too, I'll give it a go for sure.

I'm a little uncomfortable with having to be logged in though... Ideally if I'm going to pay for a search engine I would like to do that as anonymously as possible.

Best of luck with the project, I'd love to see some competition in search again. Google has been getting worse for years and they have no real incentive to improve given their dominance and lack of disruption in the space.


At least Google gets you a search page, rather than a sign-up prompt.


I can't try kagi before signing up.

I'm not sure that I want to sign up and give you data when your premise is to not mine the data. Is that your premise? How long do you retain knowledge of what had been searched?

The personalisation features put me off. The level of personalisation I want is coarse location, i.e. online shops in my country, or if I search for something in Liverpool, pick the right Liverpool in the world. Both of these are relative to where I am, and an IP geo lookup is roughly good enough, with an override in the worst case.

I'm not sure precisely what your differentiation is (again, can't find it without signing up)... but I want to be very sure before I sign up to anything that you treat user data as a radioactive resource. Thankfully, given that GDPR exists and everything done on search is related to a user profile, I'm assuming that you're petrified of storing and using data... but I don't know, as the site didn't reassure me on that front and I didn't see a privacy policy at the foot of the home page.

I want to like what you're doing so much... But you've designed it to have friction to that discovery.


> I'm not sure precisely what your differentiation is (again, can't find it without signing up)...

I'd echo this. I actually even went to the site to see what was so good about your search, then realised I can't even test with a single query without signing up, and lost interest.


> I can't try kagi before signing up.

Perhaps it is not obvious that Kagi has not launched yet. We are in closed beta, and email is used to invite people who want to be beta testers, nothing else. Once we officially launch, everyone will have the ability to try it.

>I'm not sure precisely what your differentiation is (again, can't find it without signing up).

The FAQ has a lot of information you may find useful https://kagi.com/faq


I was enjoying trying out kagi but it served me a page to the effect of "you exceeded your number of searches" and that immediately turned me off of the product. Will that limitation be lifted at some point?

My other feature request is that I still like google maps, and OSM is just not a good replacement. The important feature for me here is: google a place name, click the map square that shows up in the Answer for that place to open it in gmaps. Both kagi and brave search use OSM which is not clickable, this really breaks my UX and alone is enough to make me want to go back to google search. Using !m sometimes seems to work but I think ideal would be a "which maps provider would you like" feature which just lets a user choose.


Yep, sorry for that. We introduced a query limit to prevent abuse and control cost during the (free) beta, but initially set it very low for our target userbase (50 queries/day). We have since increased it to 200/day after receiving similar feedback, and of course once we launch there will be no limit.

Kagi Maps is still a work in progress (beta, like everything else) and we just haven't gotten to shipping this feature. We are not using only OSM though, and many users have already told us the map is better than Google Maps (for the features we have shipped).


I looked at it, and I'm uncomfortable with the fact that I have to be logged in, so you know all my search results, know my exact identity (through the credit card), and can analyze them and store them forever (you can say you don't do that, but you cannot prove it). At least with Google I can delete cookies, disable fingerprinting and use a VPN, and hope they don't have my search history. Furthermore, you are in US jurisdiction, so the government can request access to your data and you can never tell anybody about that.

What about a version of that where I pay a few cents for each search, using Lightning Network, for example?


I'd give your project a shot but the signup funnel for your beta is a little funny -- you should let people choose to leave the textbox-based questions blank and still get an invite.

If you just made those parts drop-downs with the most popular answers and had an 'Other' option that reveals a text field, you would probably get more data, and more valuable data. I would not be surprised if you had some amount of dropoff in your beta signup funnel due to questions that make potential users feel like they're being 'put on the spot'.


Just did a search and it took 550ms. 1.5 to 2 seconds sounds like an exaggeration.


I hope you won't be requiring logins in the future. Big deal breaker for me.


I don't care much about page speed when I'm searching for some deep information. Sadly, Google is getting significantly worse at indexing.


Interesting business but I can't help but be a bit worried. You promise not to monetize usage data and such but as a paid search engine, you are in the perfect position to do so. Even more so than Google or DDG since neither of those have my name and other PII. You would have my payment information AND valuable insight into my interests.


Yes, and zero incentives to misuse that information, as the entire business literally depends on the trust between us and the user, and is entirely supported by user subscriptions. Incentives are perfectly aligned too - we have to constantly keep improving the product or you will jump to a (free) alternative.


I admit I might be more paranoid than the average person. This could work if we knew each other IRL or something like that but as it stands what you said is a nicely worded version of "Source: dude trust me."

Accepting crypto payments could be a solution, but I'd totally get why you'd want to avoid that.


There is nothing else we can tangibly offer, and you are right to have high standards. Note also that we never say a version of "dude trust me", but lay down the facts of how the business operates, what the vision is, and why we are doing it.

The best we can do really is have a business model that aligns incentives with the users - and weirdly enough, this is already an innovation in this market. Then, create the best possible product that users will love. Gaining trust is a process that cannot be skipped.


> that literally depends on page speed for revenue.

I don't think that's an accurate view of the business model of Google. For them consumers are mostly locked in and their business model is in tracking users and mining as much information as possible from every facet of the experience.


Unless you are using an ad blocker or have disabled it in the settings, Chrome precaches, so for the majority of people there probably is no 1.5-2 second load time. But I expect the majority of people browsing Hacker News would have it disabled, and so would see the slower load time.


Looks really nice. If I had any suggestion, it would be to use a different search for each example; right now Magic: The Gathering is used twice. I can imagine potential customers will love to see what the final product looks like before purchasing.

Best of luck, looks really ambitious!


> a company with so much resources

That is the problem: resources show up in abundance, not in scarcity, and usually only small or poor creators get to work under the latter.

Conway's law at work, I guess.


I don't see how your paid service is any different from Google, beyond the interface and the inability to use it for free/without signup to compare.

Or I can use duckduckgo without needing to sign up and get your promised security with 1-2 second searches for free. Maybe you'll have enough takers to get off the ground but I'll rely on other tools.


Kagi looks interesting, can I have an invite?


I registered for the beta, I must say I have low expectations, but I'm ready to be positively surprised ;)


Google isn't as bad as you are saying. Doing a search for "test", the result page is about 130 KiB. This is an order of magnitude smaller than what you claim. It also loads in between 900ms and 1 second the majority of the time.


When I code in Python or React, I get proud of my 900kb SPA web pages. I respect your site so much, in fact, that I would like to apply to work at your company. Where do I apply?


> 100% free of ads and tracking.

> forces you to login


Don’t really want to login to search.


How can you search 'with privacy'/anonymously if it is tied to an account?


Privacy and anonymity are not the same. We do not claim anonymity, although if you read our privacy policy you will see that it is very close to it (as we do not log searches at all, so it is not possible to tie them to a user).


> literally depends on page speed for revenue.

Turns out, this might not be true at all.


Please don't spam your paid walled garden.


Performance only matters if you are in competition with others. You can win a marathon by crawling if you’re the only participant.

Who are Google’s serious competitors? Sure there are other search engines but I’d argue that none are serious or even marginal threats.

People will tolerate sub-optimal experiences if they perceive they don’t have any option; just ask anyone who has stood in line at the DMV.

The first sign of a monopoly is a rapid performance degradation which can’t be explained by economies of scale and which did not exist in prior versions of a product. Free market forces and incentive structures no longer apply to monopolies, which is why many large corporations can get away with sloppy work and treating their customers poorly, yet do not see much if any consequence. Behavior that would have been disastrous in a startup is tolerable in a large corporation operating as a monopoly.

A product owner or dev at Google would be committing career suicide if they started to care about performance at this point, because that would not result in any appreciable changes or outcomes consistent with the incentive structures at Google.


I would say performance only matters if it's debilitating; outside of that, it's something few notice and even fewer care about. Also, I guess it can impact first impressions.

But once I'm using a tool for something, unless the performance is actually preventing me from using it, I'm not going to bother switching. So no need for a monopoly: if your competitive edge is that you are faster than the competition, unless the competition is dog slow it's not a selling point to me.

E.g. I won't look for VS Code alternatives just because it's Electron-based and the competition has higher performance; VS Code is fast enough for me. I will switch to a different compiler or even language if it shaves a percentage off build time that makes iteration meaningfully faster, or if the tooling addresses pain points.


Or, say, you do your average of 7 to 15 searches a day as an office worker. So you've waited a bit on 5 to 10 of them. No big deal.

But then the guy next to you is a coding intern. He averages 150 to 400 searches a day. Maybe one every couple of minutes, or every five minutes.

That guy would be pissed. All the time.


> People will tolerate sub-optimal experiences

Implying the Google Search homepage loading experience is sub-optimal? Chrome starts loading it before you finish typing google.com on your keyboard.


I think most products from Microsoft are relevant in this context. Teams is horrible and every week another thing breaks. Almost every page by Microsoft takes more than 10 seconds to load on my side. Yet everyone including me uses it.


captive audience!


I don’t think the actual experience is sub optimal. It’s not quite there yet, but give it another 10 years and it might be.

However, from a pure engineering perspective, I think most people agree that the page weight has increased at a rate that is unjustified considering the core features and functionality have been relatively static. So why the bloat? Engineers can tell, of course. The margins of acceptance have gone way up thanks to faster devices and connections, but that still doesn’t explain why they don’t seem to care to optimize anymore. By optimize, I mean it from a pure engineering perspective: what is the minimum amount of data needed to do the job of displaying a list of search results? Some people have forgotten that a list and some simple formatting is all there is to it. It’s not a web app; it has minimal interactivity and relatively basic features. The fact that it has ballooned to 500k just proves that even Google is not immune. Maybe more resistant than other giant tech companies whose software just keeps getting larger, but not immune.

Remember that code is data too. 500k isn’t a lot in today’s world, but multiply it by how much traffic they receive and it is. I’m not saying Google search isn’t a well-designed product or that it doesn’t work well, but from a pure first-principles perspective it does seem quite bloated compared with what it used to be.


I'm convinced that Google's massive push into NLP has a lot to do with this shift. Their next chapter is turning the search engine into a smart assistant, and they seem to be willing to sacrifice quality in the short term for this broader goal.


I have never felt like google search was slow. So I would say they are within the range of acceptable times.


I propose this exact scale, in the interest of productive discussion:

15-75 ms GOOD: (no thoughts had in that time)

75-150 ms OK: oh, there it is

150-250 ms NOT OKAY: wait, did I…

250-500 ms VERY NOT OKAY: could I have typed something better?

>500 ms ABSOLUTELY DEPLORABLE: is this thing on, is it working still?


The DMV has some motivation to be efficient because people who have to use it will make noise when it sucks. But because it's the only shop in town, it may not pursue efficiency unless pressured.

In Oregon they seem to care. I've never spent more than 25 minutes in line and often been out the door in under 20 minutes even with complex forms to complete. They often have someone near the door to check your needs and get you the right form as you take a number so when you get to the window you're set to submit your form, pay any fees, then leave vs sitting down again to start the form or something. Oregon has it figured out.

Washington, though, has private offices running all aspects in separate locations, e.g. emissions testing up the highway, licensing across town, and vehicle registration down south. It took me over 3 hours to register my car when I moved, and the DMVs there only got worse from there.


>Performance only matters if you are in competition with others. //

Or if 'resources' are limited, like carbon emissions.


> Performance only matters if you are in competition with others.

I'm not sure that's right. There's a strong link between performance and repeat usage. Even in the complete absence of competition, Google search being fast could result in more searches per user being conducted and more revenue as a result.


> Who are Google’s serious competitors? Sure there are other search engines but I’d argue that none are serious or even marginal threats.

Google seems to disagree given how much they shell out to be default search engine on different devices and browsers


Google does this for the same reason Coca-Cola advertises during the Super Bowl. It's not because they're afraid of competitors. It's to reinforce their position in your mind as a household name.


But if there are no alternatives to Google or Coca-Cola, why do they need to reinforce themselves as a household name?

Truth is, if either brand stops fighting for its top position, it will lose it to competitors.

The walls are being pushed from all sides, but both Coca-Cola and Google have the strength to hold those walls up. That doesn’t mean the push is insignificant, though.


Comparing the Google home page against a few competitors —

  1,326 KiB  Bing             https://www.bing.com/
    829 KiB  Yahoo!           https://www.yahoo.com/
    506 KiB  Google           https://www.google.com/
    358 KiB  DuckDuckGo       https://duckduckgo.com/
      8 KiB  DuckDuckGo Lite  https://lite.duckduckgo.com/


Out of curiosity, how much of that Bing load is the lazy-loaded background image? I would guess that the rest of the page is very light but maybe I’m wrong.

On my mobile device, Google feels slow because they have to throw up a spinner for 2 seconds while they figure out what local news to share with me. Bing manages that seemingly instantly on my device.

For full disclosure, I’m a Microsoft employee. Not trying to shill for Bing. I’ve just been surprised how slow Google feels lately due to their (slowly) lazy-loaded content.


I was looking at the mobile results that PageSpeed Insights reports by default. Bing on mobile uses ~170 KiB for the background image while Bing on desktop uses ~1,000 KiB for a higher resolution version of the same image.


Ok. So the picture isn’t the bulk of it. Thanks


174 KiB (419 KiB without gzip/deflate) https://search.brave.com/


Oh, lite.duckduckgo.com is great on my somewhat underperforming old mobile. Wasn't aware of that. The URL could be more touchscreen friendly.


Since they own duck.com, you wonder why l.duck.com doesn't redirect to the correct URL (or lite.duck.com).


I didn't know lite.duckduckgo.com existed! It's amazing. I've now set it as my default search engine on my phone.


377 kb Yandex


I have started using Yandex for image search; it has all the features Google removed to keep from getting sued. Excellent results and great tools.


I'm sure someone has coined a term for it, but isn't it just a tendency for rich companies to keep employing more and more people who add less and less value? They invent roles, departments, and imaginary needs and requirements, so everything gets more expensive, more complicated, and slower.

What do you do if your current front page actually works really well with some basic PHP or whatever, but you employ x thousand of the world's best developers? You invent new, more complicated frameworks that solve one problem and create 10 more; you change things so they can be automated even though you can easily afford for people to press a few buttons.

And worst of all, the need to "stay relevant", which is a sad but real pressure in marketing and can involve really expensive design consultants (new font, Twitter?) and sometimes produces something that is not only very different from before but much worse.


They hire people so that they don’t go to the competitor.


Let me ask every single web developer a straightforward question.

If huge tech companies like google, facebook, airbnb etc are unable to make fast websites... then why should we use their frameworks and tools?

I really feel like the emperor has no clothes here and it's insane for smaller companies to try and ape them.


> If huge tech companies like google, facebook, airbnb etc are unable to make fast websites... then why should we use their frameworks and tools?

You don’t have to at all. Lots of folks do because “fast websites” is one requirement out of many to consider.

What about secure? Stable? Reliable? Maintainable? Reusable?

I personally build excessively lightweight and simple web UIs for my personal projects, but I’m not managing infra that >1bn people use daily and is literally used to answer “does my internet work?” because of its expected reliability.


> It's insane for smaller companies to try and ape them

In my cynical view, when you think about engineers as people who want to stay up to date with the 'latest and greatest' so that they are not left behind in the market, the use of these frameworks makes sense. Also, the frameworks do solve key problems in their own ways (and avoid re-inventing the wheel).

So the truth lies somewhere between those two sentences. There's nothing we cannot achieve without frameworks; however, maintaining such a solution is a pain, more so when working on a bigger team.


Their frameworks and tools also generally come with extensive documentation...which is quite rare in projects that are proprietary balls of mud written by a single programmer who left the company several years ago.


> then why should we use their frameworks and tools?

Because job postings demand X years experience in Y framework instead of looking for actual skills, and many investors look for hot things that are jumping on Z hip new tech buzzword and not all look at what they actually solve.

Plus to a certain extent they can be used to get a project off the ground faster. Although at a certain point a good number of projects that are successfully taking off would benefit from a nice rewrite--but that's not as sexy and marketable as adding another new feature in that same time span.


Very naive to think these companies wouldn’t care about speed/perf if it truly affected their revenue.


AirBNB has a fantastic marketing team and can afford to have a dog slow website.

Why should the rest of us give two fucks what their eslint config settings are?

Merry Christmas.


While I’m sure there’s some portion of developers who are ready to jump to google and Facebook tools right away, I doubt it’s a majority.

There have been a lot of misses from these companies. Some big wins for sure, but it’s definitely not a perfect record.

I do think there are a lot of smart people working on hard problems (user engagement… not cancer) with large teams at these companies though. Which means they are bound to come up with some percentage of good solutions. I’m glad they share them even when they flop.


Does anyone have a link to this story? I remember it as:

Way back when, a Google dev started to receive frequent cryptic emails. Each email contained a single number. They struggled to figure out what it represented, if anything. Somehow, it was realized the number represented the word count on their homepage.

This story comes to mind when I see promotions, statements and other things there. Perhaps someone should start emailing a byte count.



> Posted by Marissa Mayer, VP Search Products & User Experience

The years really do fly by. Looked her up and she’s currently a cofounder at a company that sells an iOS contacts app.


For anyone that isn't aware, she was also a very early Google employee (wikipedia says #20) and CEO of Yahoo for several years.


And she absolutely ran it into the ground.


Kinda hard to say it was just her. Yahoo! was already pretty much dead when she arrived.


For her first year, Yahoo was growing in visits. One year into her tenure, Yahoo had more visits than Google. The next four years were downhill. To me, that looks like Yahoo was on a path to recovery and she sank it.


I was under the impression that running it into the ground was her job -- to turn it into a controlled crash rather than a completely unmanaged failure.


Yup, time hasn't stopped yet. Been a while since she's been in the press post-Yahoo sale. I guess that makes me kind of old.


How the mighty have fallen...


Pinboard has been trying to warn us about page bloat for years. Props to Maciej for still running one of the fastest, most useful, least creepy social anti-social media sites on the web.


Pinboard has got very slow for me, esp. at night.


It’s solar powered.


I think the link is meant to be https://pagespeed.web.dev/report?url=https%3A%2F%2Fgoogle.co... - a perf report on the Google homepage. When I click the link from HN it pulls up a performance report on the performance report service (which is also bad.)

It's fascinating that according to the report, someone on a Moto G4 would have to wait over 6 seconds for the google search page to be responsive so they could type in a query.


I wonder how accurate the results really are, or maybe I wasn't interpreting the numbers correctly. Accessing Google.com from my Moto G7 Power (a phone about 30% faster than the G4 in terms of CPU performance) shows no seconds-long delay. In fact it feels almost instant.

On the other hand, it seems to give a fair critique of other sites that are known to be bloated, such as the one below.

https://pagespeed.web.dev/report?url=https%3A%2F%2Fnzherald....


They simulate slow wireless data, which I suspect is the cause of the gap. It's certainly the case that a big % of the world's population has bad mobile data even if they do have a relatively fast smartphone.


Wonder if it's due to a warm cache. I never use Google (hence it won't have google.com cached), and accessing it from Firefox on my high-end Android S21 takes a few seconds to load via my NZ fibre connection. Certainly not instant. DuckDuckGo on the other hand loads nearly instantly (although that's probably a warm cache on my part). To be more fair I tried Bing (which I also never use), which seems to be on par with Google, I suspect due to Bing loading a photogenic but ultimately useless background image.


Tried it again in desktop Chrome with mobile simulation turned on and cache turned off. The google.com front page still loads in about 1 second, and it honestly feels identical on an actual phone.

Despite what other comments here seem to suggest, the Google landing page is actually pretty clean and not very heavy on JavaScript.


dang informed me that HN stripped off the second half of the URL; it routinely distills URLs down to the canonical version. I feel slightly better about myself, and dang restored it.


Appreciate that. Not sure how I managed to screw that up so royally.



This is a neat tool.

Hacker News hits 100%

Reddit fails w/ a score of 23

Looking forward to using it to test against my sites.


Reddit is intentional though. They really, really, really want you to use their app instead.

[Daily Standup]

“Hey guys, you know how we need more user data to monetize but no one wants to use our crappy app? Why not just make the website horrible and constantly nag them via dark patterns to install our app?”

PROMOTED!


I always use old.reddit.com - I find the default site virtually unusable


The classic forum style is part of the appeal to me. It walls off typical social media users in favor of people willing to embrace the platform for what it is, not what they are trying to make it via IPO.

If the communities I am part of (or built) weren't so entrenched and interesting I'd be happy to move from the front page of social media back to the front page of the internet.


There are two alternative reddit frontends, both of which should be smaller/faster:

https://github.com/digitalblossom/alternative-frontends#redd...


Ah, the familiar Facebook approach.

For years reddit was unusual in not offering an official app.


On mobile (iOS) I use an app. Just not their app.

Apollo is the best thing since reddit bought Alien Blue and shut it down.


Reddit can't even handle copy-pasting into the comment box unless the cursor is at the end of what's typed already. It starts to run up my i7 if I have just a few of those comment boxes open at once.

Navigating back to the list of submissions, which are already in the DOM and are literally visible through a transparency, takes like nearly a second.

Of course the back button doesn't work in most cases. It's pretty embarrassing if it's not intentional.


A blog post I wrote a few months ago is apparently fast in general, but I get a bad score on pagespeed insights because I'm pulling in 3rd party scripts. The scripts I'm using? An embedded youtube video and some fonts from google fonts. I'm not sure why pagespeed insights says first paint is delayed behind youtube's javascript loading. Sounds like something I should fix.

Shame on me for using google's other services in the recommended way I suppose.

[1] https://pagespeed.web.dev/report?url=https%3A%2F%2Fjosephg.c...


Hah. Kind of the same. I ran Lighthouse tests on my project and the evaluation complains about gtag and Google Sign-In being slow to load.


I'm shocked they are allowed to report this. Interesting that they are allowed to be independent and not obfuscate the fact that the other Goog services are so slow.


That's always been the case. Having too many Google Ads can even downrank you in search results for being a bad user experience.

For Youtube embeds, I prefer Lite Youtube Embed[1] by the magnificent Paul Irish.

[1] https://github.com/paulirish/lite-youtube-embed
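Usage is a drop-in custom element that renders just a thumbnail until clicked (roughly as in the project README; the videoid value is a placeholder):

  <link rel="stylesheet" href="/lite-yt-embed.css" />
  <script src="/lite-yt-embed.js"></script>

  <lite-youtube videoid="VIDEO_ID"></lite-youtube>

The real iframe and YouTube's JavaScript only load on interaction, which keeps them off the critical path for first paint.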


Oh that’s excellent. I’m surprised YouTube’s official embed code doesn’t use these tricks.


PageSpeed even moans about Google Analytics scripts!


I have old.reddit.com at ~40 (varies between 35 and 45) on mobile, and 80 on the desktop.

Apparently the reddit home page is getting hit hard by all the award icons, which google really doesn't like:

* they're really large, some are animations and they're 48x48 full-color (rendered at 12x12), meaning some of them are close to 100k

* they're (a)png which google dislikes

* they're unsized on the page (they're sized in CSS) which is dinged on grounds of layout shift

* they're not deferred so the awards below the fold are still fetched upfront

Google's estimates also seem over-optimistic, e.g. they advertise potential savings of 94K (out of 95.2) by using next-gen formats for https://www.redditstatic.com/gold/awards/icon/Illuminati_48...., but plugging that into an AVIF converter yields a 27K file with neither the animation nor the transparency working (no idea if it's an issue with ezgif's converter or Firefox's support).


I had HashBackup's web site hosted at Google Sites for 10 years. The recent migration to Sites V2 looked really bad, so I redid it all in Asciidoc.

I downloaded a static HTML-only version of the old site from Google and a cat of all the HTML files was 25.3MB. The new static site is only 1.7MB of HTML, with more than a third of that being the release log. The average HTML page size is 19K, whereas with Google V1, the average was 450K. And it's way easier to maintain now with a text editor and rsync vs pointing and clicking in a browser text window.

The other major problem with the V2 Sites is that the HashBackup release log took 15 seconds to load. It didn't do that with V1 Sites, so this was a new "enhancement". With the Asciidoc site it takes a little over a second. Google has a Report Migration Issues button, but of course they never responded to anything I sent.


Thanks for adding the right URL, kevingadd. Have no idea how I screwed that up. As they said, it's supposed to be https://pagespeed.web.dev/report?url=https%3A%2F%2Fgoogle.co...


It says 5.5s to first interactive. Yikes. Though on my iPad Air 2 it’s significantly faster than that.


Yeah - bear in mind this is testing on a Moto G4 with a simulated horrible cellular connection.

It’s supposed to be a “worst case scenario” sort of test.


I have a Moto G4 and a sometimes patchy internet connection. This might be the prompt I need to upgrade!


I think it has more to do with simulating a 3G network (which is being decommissioned now).


The performance of the device is actually a huge factor for TTI specifically.


Also, I wonder who actually goes to the Google home page to search these days, instead of typing directly into the address bar?


I do. Google is not my default search engine. In the rare case I want to use it, I must visit the google website to search.


A few of the other search engines let you do "!g" in your query which automatically takes you to a google search instead.


For example: DuckDuckGo.


You can set Google as a search engine in Chrome; then when you type "google.com" followed by <space> or <tab>, it switches the search engine for that one query. You can edit the exact keyword and add whatever search engine you want under Settings -> "Search engines" in the browser.


I got an extension that just adds a "search on Google" option at the top of any DuckDuckGo search results page. For most things DDG suffices; when it doesn't, it's just one extra click.

This feels like a better flow to me than having to remember when I explicitly need to use Google in the address bar, or having to go to google.com and search there.


You can just click the Google icon in your search box and it will then search with Google.


Especially on mobile.

Worth noting that the bad score is only on mobile, where I consistently get 45-50; on the desktop test I get 99.


But does this really matter? When I pull up google.com on mobile the search bar is instantaneous; what isn't instantaneous is all the superfluous junk that nobody visits google.com for…


People googling Google, which is a lot.


Which is huge for a page that displays an input box and a button to send the query, which should require only a few bytes. Add a few KB for the suggestions.

Those few bytes are actually about what you get when browsing Google with w3m. By the way, I just noticed they now have a cookie banner that works in w3m without JS? Now this is some zeal!


None of us know what the Google home page is actually doing. It reminds me of GNU coreutils, where the program "true", which simply returns a successful status code when run, is hundreds of lines of C. I could do that in 3 lines! Except I'd be ignoring all of the weird use cases and architectures that the coreutils have to support.

The Google homepage works on a _remarkable_ number of devices. You could install a fresh copy of Windows XP and the built-in IE works flawlessly with Google, while almost any other page is broken.

And that's forgetting all the analytics they need. I know it's a dirty word on HN, but this kind of stuff is essential for a product of this scale. Throw in some extras for accessibility, etc.


> gnu coreutils where the program "true" which simply returns a successful status code on run, is hundreds of lines in C

Looking at [0] it seems to be 80 lines, as it's doing argument-parsing (for --help and --version) and stream-management.

[0] https://github.com/coreutils/coreutils/blob/master/src/true....


If I remember correctly, GNU true violates various specs because it does too much (help and version) when it should ignore all parameters. As you say, a valid implementation can be done in three lines.


The code is made way more complex by combining both true and false into the same binary and then using the name of the output binary to decide which return code it outputs.

It seems overly complex to me but I have no idea what kind of requirements they have.


As soon as you start typing your search, it starts making search suggestions. That's a complex feature which must involve lots of code.

Also, the search page is not just the search page, it's the portal for all Google services and has corresponding links and menus at the top.

So we should expect the page to be much bigger than "a few bytes", and it is.


Website obesity crisis progressing. http://idlewords.com/talks/website_obesity.htm

A webform requiring truckloads of stuff and gear.


Most of it should be cached. I think it’s a trade off between server load and client experience. Millions of 10KB requests << thousands of 500KB requests (also potentially with lots of compressed stuff).


It's really only about 43k and it loads instantaneously for me. The rest is scripts and other stuff that's cached. I think this is the right tradeoff for an ultra-popular site.


Yeah, I show just under 60KB transferred because it's cached and compressed.

Size is a useful metric, but actual UX is what matters.


This is kind of sad.

Going back more than a decade, google.com was brutally optimized. Anything unnecessary was stripped to reduce load times and network traffic. For example, attribute values were never quoted if they didn't have to be, and closing tags like </html> were omitted. It was kind of fascinating to read how many terabytes of network traffic omitting </html> saved.

But then came the ill-fated boondoggle that was Google+, and with Google+ came the OneBar. This was the black bar across the top of the screen with links to key Google properties but, more importantly, the G+ login. The cost of this was really high.

We don't have the black bar anymore but a subset of links remain as does the login box, even if it isn't for G+ anymore. The Google home page just isn't the lean page it once was.

That doesn't make me mad, unlike say AMP, which makes me incredibly angry. But still I can't help but take it as a sign of lost values.


Even more sad is that 500k is incredibly tiny compared to most websites that shit every JS-based telemetry package under the sun at you.


Yep. Surprised the hell out of me too.


I wouldn't be surprised if Google serves different HTML based on location, ISP, user agent, etc. I.e., if you're on a modern browser with a high-speed ISP you might get a heavier page than if you're on dial-up or mobile in a third-world country.


Doing anything too complex there would be pretty surprising to me. If they are, it seems they didn't do too good a job, since the test is emulating an old phone that should probably get a simpler/smaller page.


There's a CSS media detection feature called `prefers-reduced-data` that may be graduating from a Chrome feature flag Soon™, although I'm not sure if Google is using that anywhere on their sites today.

https://bugs.chromium.org/p/chromium/issues/detail?id=105118...


They absolutely do serve legacy versions to old browsers. I'm pretty sure you can still get a version with the black bar on the top.


Does it matter anymore? I haven't used the Google homepage to use Google in years. It's integrated into the search bar now...

Also, the Google home page loads dependent services (single sign on, lots of JS for XHR searches, etc.)


The page suggests removing unused JavaScript.

In the past I measured Google search (and some other web apps): only around a third of the JavaScript loaded was actually used, even though, I suppose, Google's JavaScript is optimized by the Closure Compiler: https://www.youtube.com/watch?v=IoFleV1ybxE

JavaScript could probably benefit from an optimisation like this: https://github.com/avodonosov/pocl


500k what? Link says nothing about 500k...


The title was confusing for me as well. Apparently its size is about 500KiB.


I think page load times like this are like 30fps vs 60fps in console gaming.

60fps is perceptibly better and for many of us, me included, is usually required or it is palpably uncomfortable.

But for the overwhelming majority of people: they don’t notice or don’t care.


Just use duckduckgo: https://duckduckgo.com/

Their homepage is 5722 bytes not including images.


There's also lite.duckduckgo.com if you want to go even more minimal.


1431 bytes :-)


From a company like Google, this is shameful. I don't think it ever would have happened under Marissa Mayer's watch.


Agreed, and SO thrilled that Yahoo hasn't changed one iota since her departure. Imagine what the world would be like today had she been at Lycos or AltaVista!


Am I missing some irony?


No, just history: https://googleblog.blogspot.com/2008/07/what-comes-next-in-t...

(linked by sadjad upthread)


Pretty certain that Google will lower the payload on expensive connections. These things get A/B tested to death.


How the mighty have fallen, how far, how sad.

Once the Google home page was a master class in good design.


Has it ever been? Last I checked, the page used center tags and a mountain of spec-violating stuff, but it technically works in all browsers.

Using the Google page as a reference would be like ripping a few pages from a physicist's notebook as an example to learn math.


Sergey used to demand that the payload of the home page be tiny (I think it had to be < 100K, if I recall -- or maybe even smaller). That was part of the appeal of the simple search page, nothing but a logo and a search field. Now it looks minimal-ish, but contains a bunch of crap that serves only Google, not the end-user.


So what is this 244KB of JS, which takes 1300ms to execute on the simulated G4, actually doing? It mostly contains parts of the Closure Library (https://developers.google.com/closure/library/).


500k *Sobs* and goes off to drown my sorrows with an early Christmas Negroni


Google should consider converting its website to AMP. It will definitely improve their ranking


Google services are crap, what a surprise.


500K what?


Gmail is a lot worse.



