This has gotten out of hand for a company with so many resources, one that literally depends on page speed for revenue. It is incomprehensible that the size (which is probably the main reason for latency) cannot be reduced.
Assuming that people don't clear their cookies is wrong nowadays. Even when your website is a little snowflake, the rest of the web isn't.
I honestly would recommend a Web Extension that adds your search as a search provider, or something like an OpenSearch description.
Also: the landing page of GitHub annoys the shit out of me. Every damn time. I don't wanna register with three clicks if I already have an account.
Make the landing page easier for regular users to log in, and prioritize login over signup...otherwise those regular users won't be happy. Any company should focus on keeping regular users happy (UX-wise) rather than on one-time trial registrations that will be a burden on infrastructure.
Fucked up incentive structures like this are why we cannot have nice things.
I strongly suspect that this has to do with attribution and shallow "data driven" choices - there is a heavy bias towards things that are easier to measure. Especially in startups with limited resources.
Measuring retained users regardless of anonymous/known state is very hard and lossy. So the nearest easy-to-measure analog that is often picked is logged-in users, and we start the fucked up cycle you mention.
Similar things happen around all metrics that have to do with retention. We know that a better experience through higher performance and better UX is likely to create happier users that return... but we cannot prove it (easily) so we focus on optimizing other things.
The number of users you lose by doing that is smaller than the number you gain, as long as you don't have enough users (and spend enough on publicity). This is also the one thing that limits the number of users a company will ever have in most cases.
What I do mind is the price...if you read the FAQ, they want to charge as low as $10/month, $30/month for families once it exits beta. ("As low as" means "possibly higher")
That's very high. The family price equates to 16 Office 365 Family seats, which includes 16TB of cloud storage.
In my opinion, the price for this kind of product should be more in the realm of $50/year. I'm thinking a price closer to what password managers charge.
This price will also need to be adjusted for global customers if that is the eventual intention. Spotify in India costs less than $1.50 a month.
It is also an independent company, bootstrapped by the founder, without a ton of VC money to subsidize the cost. And at $10/mo per user it would be barely breaking even, if at all. So there are a lot of factors at play, and I completely understand why $10/mo may sound excessive given all the free alternatives out there. It is just one of the many uphill battles we are fighting ;)
Right now you're charging nearly as much as something like Netflix, so your product has to deliver nearly a Netflix's worth of additional value over DuckDuckGo (rather than Google, since DuckDuckGo is closer to your value proposition) for me to rationally justify the price.
One thing that's great about search engines in terms of entrepreneurship fundamentals is that it is such a massive market. You only need a fraction of a percent to be a viable business, I'm sure.
I can also understand how underpricing a product can be a big mistake. Somebody who wants this kind of product has already made the conscious decision to pay. Pricing a product too low can be a signal that it's not a good product.
Perhaps one might say that this is an example of how the complexity and cost of production are a bit detached from the customer's willingness to pay. The most complex software product, the operating system, is essentially given away for free. People are mostly not willing to pay for it; even businesses don't want to anymore. On the other end of the spectrum, there are SaaS products with very low levels of complexity pulling in a lot more than $0.
If it's truly barely breaking even at $10/month, that fascinates me. It must mean that search engines have to do a lot of computing.
In any event, I wish you the best, because the features do look cool. I like the idea of being able to set site preferences (e.g., when I search for a movie I might like to see Wikipedia over IMDB).
Spot on. Not that long ago we used to pay for an OS, even beta versions of an OS. Nowadays, the most complex software products imaginable (OS, browser, search engine, social networking...) are all free, and over the years we as a society have been slowly drawn into an expectation that these should be free. Of course someone or something is paying for it, as all of these also "magically" generate hundreds of billions of dollars of revenue. And when someone shows up with an idea to actually sell such a product and tries to create a business where the user is also the customer, and incentives between the user and the business are aligned, they are usually looked at with a raised eyebrow.
I am personally of the opinion that you always get what you pay for and that one should not shy away from paying and supporting products they love. I hope we are building one such product, that some people (and admittedly you need to be a little different to use and pay for Kagi) will fall in love with.
Is it that you're doing a large amount of compute per search which yields better results but higher cost per search?
On-premises search. Businesses have lots of resources on their internal websites, but no easy way to search them in one place. Add role-based search on top: when I'm logged in, search only shows me resources I have access to.
$10 a month is less than one lost hour of work.
I hope they price it around there somewhere so that many will use it but even at $20 it is a no-brainer for me.
The results are in another league compared to what I get elsewhere and by now it also feels like an insult every time I realize Google has altered my query: do you guys really think I don't know what I am doing?
Google probably works better for illiterate people these days, but in the process they have alienated me. I have lost too many hours searching for things I knew used to be there only to be ignored and corrected by their AI. It is rather patronizing at this point.
This is a pretty feeble argument for something, especially for a product where the user perception is that it should be free, and there exist multiple free options already.
While "one hour of work" might not seem a lot, a different way of looking at that is that for most people there are 160 hours of work in a month, and so this thing needs to be worth 1/160th of their life.
It's not so hard to make that choice for a couple of things that are of obvious value, but when they start adding up in your life for basic things, people will question the value of it to them, and I'd argue for most people, search just isn't worth that much to them when there are so many good free options.
You should still use the website's search function first. Github issues/SO search often turns up results that google will ignore.
Offtopic: that must be one of the best deals on online storage, excluding Google's enterprise unlimited storage option, which always felt like they could pull the rug out from under you any day.
This is one of the first "asshole design" moves that I recognize, that unfortunately most sites use. If the "Sign up" button (which will be used once by each user) is highlighted more than the "Sign in" button (which will be used hundreds of times), I know that your company doesn't really care about its users.
I'm surprised how common it is.
If you're not signed in, `Sign in` is available on literally every page in the upper right and one click takes you to the sign in box. If one click is too many just bookmark github.com/login.
If you don't want to sign in the search box is also in the upper right on every page to take you wherever you want to go.
This gets me detected as a bot, and afterwards I see captchas everywhere. I was using that approach for a while.
A better alternative would be content identity through URLs.
If your marketing page is at github.com/index.html, then make something like github.com/dashboard.html for logged-in users, and redirect them to /login.html when they are required to be logged in.
This way users can easily bookmark the page that they actually want to see.
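A minimal sketch of that routing rule, using the hypothetical paths from the comment above (/index.html, /dashboard.html, /login.html are illustrative, not GitHub's actual URLs):

```python
# Hypothetical routing rule: public pages are served as-is; any other
# page requires login, and anonymous visitors are redirected to /login.html.

PUBLIC_PAGES = {"/index.html", "/login.html"}

def route(path: str, logged_in: bool) -> str:
    """Return the page to serve (or the redirect target) for a request."""
    if path in PUBLIC_PAGES:
        return path              # marketing and login pages are always public
    if not logged_in:
        return "/login.html"     # bounce anonymous users to the login page
    return path                  # e.g. /dashboard.html stays bookmarkable
```

With stable URLs like these, a bookmark to /dashboard.html survives being logged out: the user lands on /login.html and can then continue to the page they actually wanted.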
The sole purpose of these is to get you to globally log into google and stay logged in.
So I don't think it really makes much of a difference.
Curious: how would you handle authentication for a paid product on the web today?
I do authenticate on several paid products, but I was always introduced to them in some hassle-free manner, and only got to the point of paying (and adding yet another account to maintain) after I'd grown to like the service.
Ultimately it’s laziness, but dealing with authentication is annoying and I can’t be assed to do it for something I’m not already valuing
After I made the password manager a part of my routine that I trust absolutely that I never skip, all of that anxiety went away. I still have logins from the before times that I wish I could round up, but for the last ten or so years I can guarantee I know every site that I’ve signed up for.
I feel bad for people making new services now who have to deal with this, but the internet isn’t new and they have to overcome our experience with all the bad behavior of shysters who operated like these apps and ended up lying about the service
As of now, I log in only when needed, and usually to a single account at a time, at most two.
I am willing to try new models for web search if the provider does not log any information. I'll prefer it to be stateless (other than account settings and payments).
This. I suppose it's become so prevalent due to A/B tests finding it more efficient to optimize for new users, but at some point in a company's growth it would make sense to put the focus on user retention. Then again, it wouldn't be that hard to make both signing up and logging in easy and attractive.
It is just like being back in 2007 or something when Google worked this way and it is wonderful.
I've used DDG for the past few years and I still hope they too will succeed, but with DDG the quality is the same or (slightly) lower than Google and the bang operator comes in handy many times a week.
With Kagi I only think I have used the bang operator once (yes, they have it). It is that good: when I look at the results I know immediately that there is no way Google or DDG can do it better except by pure luck and also my results are already there on first or second try so why bother shopping around?
- bangs work like in DDG (I think, I don't use them anymore)
- doublequotes actually work unlike in Google and DDG
- even without doublequotes it doesn't stray very far from my query
- I can add lenses (?) or something (I have not bothered yet, the defaults are good enough)
Yep, I'm a walking billboard now, but I'm not paid in any way and don't know any of the people behind Kagi; it just happens to be the biggest improvement I have seen since Google replaced everything else sometime around twenty years ago.
Edit: yes, it is a paid product but it is free during beta and so good that I would consider Kagi gift cards for friends in tech if it was paid, just like I'd consider gifting WhatsApp subscriptions before Facebook bought and subverted it (yes, I was considering worse words).
Edit 2: I'll do this for your product too if I love it. I did for WhatsApp and Google+ and in real life for the insurance company I have used the last few years.
Let's be clear that this goes both ways: Kagi is a really nice Christmas present. So far this is the biggest software improvement I've seen in a number of years, on the same scale as when generics were introduced to Java (and that was a big deal for me), or in fact even on the same scale as when I was introduced to Google. I'll have to keep using it for a few months to see exactly how big it is, but it feels like a really big deal so far :-)
Wishing you a merry Christmas whenever you celebrate it (we are celebrating tonight).
This drives me nuts. If you're going to have a competitor to Google (ddg) at least give me proper functionality to refine my searches
There is a lot more going on than 10 blue links these days, including continual scroll on mobile. Perf versus features is always a matter of trade-offs.
Okay. I'll bite. What does this mean?
I hold my iPhone 13 the exact same way I held a Nokia brick in 2003. What did Apple tell us?
They were HIT WITH 18 LAWSUITS about it OMG.
They settled all the lawsuits by ... offering a free cover or $15 refund.
Depending on your viewpoint this is either one of many blips along the way of manufacturing tens of billions of products over a dozen generations, or a way for "mactards [to] realize Steve Jobs are out to milk their clueless a$$es with his oh so clever marketing gimmicks" (Yahoo News comment).
Bonus points if you can find anywhere Apple said "you're holding it wrong" or blamed users. Steve Jobs wrote "Just avoid holding it [the iPhone] in that way.", which is somewhat pragmatic, and rather what you'd expect someone to say if they weren't filtered by a PR firm. Probably what I would say. Apple haters can't face this.
> "We don't really think there was anything substantial to the problem, and the iPhone 4 sold tremendously well. Customers didn't seem to notice, let alone mind, and Apple repeatedly pointed out that other mobile phones suffered the same effect."
Apple aren't my favourite trillion-dollar company, but just because they have money doesn't mean everything they do is bad, or that the fundamental attribution error applies differently (the same goes for other companies, groups, and organisations).
You’re looking at cold-start download sizes, but that’s not what people actually see after everything is cached following the first load.
Google pages load plenty fast on every device I’ve used recently, including some very old android tablets. Most page loads don’t transfer much data at all due to caching. Combined with the high speeds of both browsers and devices today, there wouldn’t be much real change in a highly stripped down Google page relative to what they’re serving up today.
Here is what I get:
2 seconds to load and 1.12MB transferred, all on "warm" start.
I'm also seeing content loaded happen in under 1 second.
Where did the love go....
This. Nothing else matters.
While that is true, for almost all users most of it is cached. For me it is using 2.5 MB of resources with 77 kB transferred.
I really like the idea of a premium search engine too, I'll give it a go for sure.
I'm a little uncomfortable with having to be logged in though... Ideally if I'm going to pay for a search engine I would like to do that as anonymously as possible.
Best of luck with the project, I'd love to see some competition in search again. Google has been getting worse for years and they have no real incentive to improve given their dominance and lack of disruption in the space.
I'm not sure that I want to sign up and give you data when your premise is to not mine the data. Is that your premise? How long do you retain knowledge of what had been searched?
The personalisation features put me off. The level of personalisation I want is coarse location, i.e. online shops in my country, or if I search for something in Liverpool, pick the right Liverpool in the world. Both of these are relative to where I am, and an IP geo lookup is roughly good enough, with an override in the worst case.
I want to like what you're doing so much... But you've designed it to have friction to that discovery.
I'd echo this. I actually even went to the site to see what was so good about your search, then realised I can't even test with a single query without signing up, and lost interest.
Perhaps it is not obvious that Kagi has not launched yet. We are in closed beta, and email is used to invite people who want to be beta testers, nothing else. Once we officially launch, everyone will have the ability to try it.
>I'm not sure precisely what your differentiation is (again, can't find it without signing up).
The FAQ has a lot of information you may find useful https://kagi.com/faq
My other feature request is that I still like google maps, and OSM is just not a good replacement. The important feature for me here is: google a place name, click the map square that shows up in the Answer for that place to open it in gmaps. Both kagi and brave search use OSM which is not clickable, this really breaks my UX and alone is enough to make me want to go back to google search. Using !m sometimes seems to work but I think ideal would be a "which maps provider would you like" feature which just lets a user choose.
Kagi Maps is still a work in progress (beta, like everything else) and we just didn't get to shipping this feature. We are not using OSM only, though, and many users have told us the map is already better than Google Maps (for the features we have shipped).
What about a version of that where I pay a few cents for each search, using Lightning Network, for example?
If you just made those parts drop-downs with the most popular answers and had an 'Other' option that reveals a text field, you would probably get much more valuable data, and more of it. I would not be surprised if you had some amount of drop-off in your beta signup funnel due to questions that make potential users feel like they're being put on the spot.
Accepting crypto payments could be a solution, but I'd totally get why you'd want to avoid that.
The best we can do really is have a business model that aligns incentives with the users and weirdly enough, this is already an innovation in this market. Then, create the best possible product that users will love. Gaining trust is a process that can not be skipped.
I don't think that's an accurate view of the business model of Google. For them consumers are mostly locked in and their business model is in tracking users and mining as much information as possible from every facet of the experience.
Best of luck, looks really ambitious!
That is the problem: resources show up in abundance, not in scarcity. Only small or poor creators usually can do the latter.
Conway's law at work, I guess.
Or I can use duckduckgo without needing to sign up and get your promised security with 1-2 second searches for free. Maybe you'll have enough takers to get off the ground but I'll rely on other tools.
> forces you to login
Turns out, this might not be true at all.
Who are Google’s serious competitors? Sure there are other search engines but I’d argue that none are serious or even marginal threats.
People will tolerate sub-optimal experiences if they perceive they don’t have any option; just ask anyone who has stood in line at the DMV.
The first sign of a monopoly is a rapid performance degradation which can’t be explained by economies of scale and which did not exist in prior versions of a product. Free-market forces and incentive structures no longer apply to monopolies, which is why many large corporations can get away with sloppy work and treating their customers poorly, yet do not see much if any consequence. Behavior that would have been disastrous in a startup is tolerable in a large corporation operating as a monopoly.
A product owner or dev at Google would be committing career suicide if they started to care about performance at this point, because that would not result in any appreciable changes or outcomes consistent with the incentive structures at Google.
But once I'm using a tool for something unless the performance is actually preventing me from using it - I'm not going to bother switching. So no need for a monopoly - if your competitive edge is that you are faster than competition, unless the competition is dog slow it's not a selling point to me.
E.g. I won't look for VS Code alternatives just because it's Electron-based and the competition has higher performance; VS Code is fast enough for me. I will switch to a different compiler or even language if it shaves a percentage off build time that makes iteration meaningfully faster, or if the tooling addresses pain points.
But then the guy next to you is a coding intern. He averages 150 to 400 searches a day. Maybe one every couple of minutes, or every five minutes.
That guy would be pissed. All the time.
Implying the Google Search homepage loading experience is sub-optimal? Chrome starts loading it before you finish typing google.com on your keyboard.
However, from a pure engineering perspective, I think most people agree that the page weight has increased at a rate that is unjustified considering the core features and functionality have been relatively static. So why the bloat? Engineers can tell, of course. The margins of acceptance have gone way up thanks to faster devices and connections, but that still doesn’t explain why they don’t seem to care to optimize anymore. By optimize, I mean it from a pure engineering perspective: what is the minimum amount of data needed to do the job of displaying a list of search results? Some people have forgotten that a list and some simple formatting is all there is to it. It’s not a web app; it has minimal interactivity and relatively basic features. The fact that it has ballooned to 500k just proves that even Google is not immune. Maybe more resistant than other giant tech companies whose software just keeps getting larger, but not immune.
Remember that code is data too. 500k isn’t a lot in today’s world, but multiply it by how much traffic they receive and it is. I’m not saying Google search isn’t a well-designed product that works well, but from a pure first-principles perspective it does seem quite bloated compared to what it used to be.
15-75 ms GOOD: (no thoughts had in that time)
75-150 ms OK: oh, there it is
150-250 ms NOT OKAY: wait, did I…
250-500 ms VERY NOT OKAY: could I have typed something better?
500+ ms ABSOLUTELY DEPLORABLE: is this thing on, is it still working?
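Those buckets can be written down as a simple lookup (a sketch; the labels are shortened versions of the ones in the list above):

```python
def perceived(ms: float) -> str:
    """Map a response latency in milliseconds to the perception buckets above."""
    if ms < 75:
        return "good"            # no conscious wait at all
    if ms < 150:
        return "ok"              # "oh, there it is"
    if ms < 250:
        return "not okay"        # "wait, did I..."
    if ms < 500:
        return "very not okay"   # "could I have typed something better?"
    return "deplorable"          # "is this thing on?"
```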
In Oregon they seem to care. I've never spent more than 25 minutes in line and often been out the door in under 20 minutes even with complex forms to complete. They often have someone near the door to check your needs and get you the right form as you take a number so when you get to the window you're set to submit your form, pay any fees, then leave vs sitting down again to start the form or something. Oregon has it figured out.
Washington though, they have private offices running all aspects in separate locations ex emissions test up the highway, licensing across town, and vehicle registration down south. It took me over 3 hours to register my car when I moved and the DMVs there only proceeded to be worse.
Or if 'resources' are limited, like carbon emissions.
I'm not sure that's right. There's a strong link between performance and repeat usage. Even in the complete absence of competition, Google search being fast could result in more searches per user being conducted and more revenue as a result.
Google seems to disagree, given how much they shell out to be the default search engine on different devices and browsers.
Truth is, if either brand stops fighting for its top position, it will lose it to competitors.
The walls are being pushed from all sides, but both Coca-Cola and Google have the strength to hold those walls up. That doesn’t mean the push is insignificant, though.
1,326 KiB Bing https://www.bing.com/
829 KiB Yahoo! https://www.yahoo.com/
506 KiB Google https://www.google.com/
358 KiB DuckDuckGo https://duckduckgo.com/
8 KiB DuckDuckGo Lite https://lite.duckduckgo.com/
On my mobile device, Google feels slow because they have to throw up a spinner for 2 seconds while they figure out what local news to share with me. Bing manages that seemingly instantly on my device.
For full disclosure, I’m a Microsoft employee. Not trying to shill for Bing; I’ve just been surprised how slow Google feels lately due to their (slowly) lazy-loaded content.
What do you do if your current front page actually works really well with some basic PHP or whatever but you employ x thousand of the world's best developers? You invent new frameworks that are more complicated to solve one problem and create 10 more, you change things so they can be automated even though you can easily afford for people to press a few buttons.
And the worst of all, the need to "stay relevant", which is a sad but real pressure in marketing and can involve really expensive design consultants (new font Twitter?) and sometimes something that is not only very different than before but sometimes much worse.
If huge tech companies like google, facebook, airbnb etc are unable to make fast websites... then why should we use their frameworks and tools?
I really feel like the emperor has no clothes here and it's insane for smaller companies to try and ape them.
You don’t have to at all. Lots of folks do because “fast websites” is one requirement out of many to consider.
What about secure? Stable? Reliable? Maintainable? Reusable?
I personally build excessively lightweight and simple web UIs for my personal projects, but I’m not managing infra that >1bn people use daily and is literally used to answer “does my internet work?” because of its expected reliability.
In my cynical view, when you think about engineers as people who want to stay up to date with the 'latest and greatest' so that they are not left behind in the market, the use of their frameworks makes sense. Also, the frameworks do solve very key problems in their own ways (and avoid reinventing the wheel).
So the truth lies somewhere between the two. There's nothing we can't achieve without frameworks; however, maintaining that solution is a pain, more so when working on a bigger team.
Because job postings demand X years of experience in Y framework instead of looking for actual skills, and many investors look for teams jumping on Z, the hip new tech buzzword, without looking at what it actually solves.
Plus to a certain extent they can be used to get a project off the ground faster. Although at a certain point a good number of projects that are successfully taking off would benefit from a nice rewrite--but that's not as sexy and marketable as adding another new feature in that same time span.
Why should the rest of us give two fucks what their eslint config settings are?
There have been a lot of misses from these companies. Some big wins for sure, but it’s definitely not a perfect record.
I do think there are a lot of smart people working on hard problems (user engagement… not cancer) with large teams at these companies though. Which means they are bound to come up with some percentage of good solutions. I’m glad they share them even when they flop.
Way back when, a Google dev started to receive frequent cryptic emails. Each email contained a single number. They struggled to figure out what it represented, if anything. Somehow, it was realized the number represented the word count on their homepage.
This story comes to mind when I see promotions, statements and other things there. Perhaps someone should start emailing a byte count.
The years really do fly by. Looked her up and she’s currently a cofounder at a company that sells an iOS contacts app.
It's fascinating that according to the report, someone on a Moto G4 would have to wait over 6 seconds for the google search page to be responsive so they could type in a query.
On the other hand it seems to give a fair critique to other sites that are known to be bloated such as the one below.
Hacker News hits 100%
Reddit fails w/ a score of 23
Looking forward to using it to test against my sites.
“Hey guys, you know how we need more user data to monetize but no one wants to use our crappy app? Why not just make the website horrible and constantly nag them via dark patterns to install our app?”
If the communities I am part of (or built) weren't so entrenched and interesting I'd be happy to move from the front page of social media back to the front page of the internet.
For years reddit was unusual in not offering an official app.
Apollo is the best thing since reddit bought Alien Blue and shut it down.
Navigating back to the list of submissions, which are already in the DOM and are literally visible through a transparency, takes like nearly a second.
Of course the back button doesn't work in most cases. It's pretty embarrassing if it's not intentional.
Shame on me for using google's other services in the recommended way I suppose.
For Youtube embeds, I prefer Lite Youtube Embed by the magnificent Paul Irish.
Apparently the reddit home page is getting hit hard by all the award icons, which google really doesn't like:
* they're really large, some are animations and they're 48x48 full-color (rendered at 12x12), meaning some of them are close to 100k
* they're (a)png which google dislikes
* they're unsized on the page (they're sized in CSS) which is dinged on grounds of layout shift
* they're not deferred so the awards below the fold are still fetched upfront
Google's estimates also seem over-optimistic, e.g. they advertise potential savings of 94K (out of 95.2) by using next-gen formats for https://www.redditstatic.com/gold/awards/icon/Illuminati_48...., but plugging that into an AVIF converter yields a 27K file with neither the animation nor the transparency working (no idea if it's an issue with ezgif's converter or Firefox's support).
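As a rough illustration of the "oversized icon" problem above: a PNG's intrinsic dimensions sit at a fixed offset in its IHDR chunk, so flagging icons served far larger than their rendered size takes only a header read (a sketch; the 2x threshold is an arbitrary choice, not anything Lighthouse actually uses):

```python
import struct

def png_dimensions(data: bytes) -> tuple:
    """Read width/height from a PNG's IHDR chunk (bytes 16-24 of the file)."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG")
    # Layout: 8-byte signature, 4-byte chunk length, b"IHDR", then
    # big-endian 4-byte width and height.
    return struct.unpack(">II", data[16:24])

def oversized(data: bytes, rendered: tuple, factor: int = 2) -> bool:
    """Flag images whose intrinsic size exceeds factor x the rendered size."""
    w, h = png_dimensions(data)
    return w > rendered[0] * factor or h > rendered[1] * factor
```

A 48x48 award icon rendered at 12x12 would be flagged here, matching the complaint in the bullet list.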
I downloaded a static HTML-only version of the old site from Google and a cat of all the HTML files was 25.3MB. The new static site is only 1.7MB of HTML, with more than a third of that being the release log. The average HTML page size is 19K, whereas with Google V1, the average was 450K. And it's way easier to maintain now with a text editor and rsync vs pointing and clicking in a browser text window.
The other major problem with the V2 Sites is that the HashBackup release log took 15 seconds to load. It didn't do that with V1 Sites, so this was a new "enhancement". With the Asciidoc site it takes a little over a second. Google has a Report Migration Issues button, but of course they never responded to anything I sent.
It’s supposed to be a “worst case scenario” sort of test.
This feels like a better flow to me than having to remember when I explicitly need to use google in the address bar or having to go to google.com and search it there.
Worth noting that the bad score is only on Mobile, I consistently get 45-50, but on the Desktop section I get 99.
Those few bytes are actually mostly what you get when browsing Google with w3m. By the way, I just noticed they now have a cookie banner that works in w3m without JS? Now this is some zeal!
The google homepage works on a _remarkable_ number of devices. You could install a fresh copy of windows XP and the built in IE works flawlessly with google while almost any other page is broken.
And that's forgetting all the analytics they need. I know it's a dirty word on HN, but this kind of stuff is essential for a product of this scale. Throw in some extras for accessibility, etc.
Looking at  it seems to be 80 lines, as it's doing argument-parsing (for --help and --version) and stream-management.
It seems overly complex to me but I have no idea what kind of requirements they have.
Also, the search page is not just the search page, it's the portal for all Google services and has corresponding links and menus at the top.
So we should expect the page to be much bigger than "a few bytes", and it is.
A webform requiring truckloads of stuff and gear.
Size is a useful metric, but actual UX is what matters.
Going back more than a decade ago, google.com used to be brutally optimized. Anything unnecessary was stripped to reduce load times and decrease network traffic. For example, attribute values were never quoted if they didn't have to be and closing tags like </html> were omitted. It was kind of fascinating to read how many terabytes of network traffic omitting </html> saved.
But then came the ill-fated boondoggle that was Google+ and with Google+ came the OneBar. This was the black bar across the top of the screen with key properties but, more importantly, the G+ login. The cost of this was really high.
We don't have the black bar anymore but a subset of links remain as does the login box, even if it isn't for G+ anymore. The Google home page just isn't the lean page it once was.
That doesn't make me mad, unlike say AMP, which makes me incredibly angry. But still I can't help but take it as a sign of lost values.
Also, the Google home page loads dependent services (single sign on, lots of JS for XHR searches, etc.)
60fps is perceptibly better and for many of us, me included, is usually required or it is palpably uncomfortable.
But for the overwhelming majority of people: they don’t notice or don’t care.
Their homepage is 5722 bytes not including images.
(linked by sadjad upthread)
Once the Google home page was a master class in good design.
Using the Google page as a reference would be like ripping a few note pages from a physicist's workbook as an example to learn math.