Hacker News
Show HN: A website speed test tool to compare uBlock Origin with plain Chrome (webtest.app)
192 points by aberforth123 59 days ago | hide | past | web | favorite | 57 comments

YouTube sends more requests with uBlock Origin enabled? [1][2]

My theory: without the ads, there is more space in the webview to load video thumbnails, and each item representing a video in the HTML document probably requires a handful of HTTP requests to load its thumbnail and pre-fetch its metadata. I would not be surprised if other websites react the same way. I hope this is the case, but I am suspicious enough that I will investigate further how the ad blocker affects these websites.

[1] https://webtest.app/?url=https://www.youtube.com

[2] https://i.imgur.com/HBB4TkK.png

This is the case. With an addon that deletes most but not all homepage items or sidebar recommendations, YouTube does some noticeable spinning trying to fill the space which keeps getting emptied.

If you look at the thumbnail of the web page, it seems like this would be the case.

You need to filter the URLs that are accepted to avoid security problems - if you had a contact address on your profile here, or on the site, I'd have disclosed this more privately.

But consider this case:


In short, you should restrict URLs to the `http` and `https` protocols, and even then you should filter based on IP. You don't want people to view http://localhost/server-status, etc.
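A minimal sketch of that kind of filtering in Python (hypothetical, not the site's actual code) could look like this:

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_safe_url(url: str) -> bool:
    """Reject non-HTTP(S) schemes and URLs that resolve to private/loopback IPs."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False
    if not parsed.hostname:
        return False
    try:
        # Resolve every address the hostname maps to; any bad address fails it.
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            return False
    return True
```

Even this is not airtight: DNS rebinding can still defeat a resolve-then-fetch pattern, so ideally the resolved IP is pinned and reused for the actual request.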

Finally you need to make sure you avoid recursion:


> https://webtest.app/?url=file:///etc/passwd

I find it funny that the app reports even this file loads faster with uBlock.

Thanks, good find, I fixed it.

Great idea. Kudos for launch. Bookmarked.

What struck me is JSHeapTotalSize; I never really thought ads were eating so much RAM and other resources (it makes sense, though). In The Guardian's case this accounts for 50% more allocation.

Minor typo on the website: "or to proof to others they should use an ad blocker" (should be "as proof" or "to prove").

Also, suggestion: don't require typing http:// or https://

The Guardian is such a shit site without an adblocker. During the Australian election, I turned off my adblocker because "they're an independent news source" and I wanted to support their advertisers. I got bombarded with a full-width banner at the top and full-height banners down both sides for the national Greens party.

Now don't get me wrong, I don't hate the Greens party, but god damn, my eyes.

It's much better to donate, at least the correct people get the bulk of the money.

My wife and I decided to donate one year. Guess what happened next? They sold our data to their partners, and we ended up receiving so much spam mail over the following few years, we couldn't believe it. These organizations generate so much paper waste, it made us so angry we simply stopped our subscriptions. But we had to wait until we moved out for the spam to stop.

I'm not sure why the code isn't open source on this one. If the OP is hoping that news websites (etc.,) will pay for this service, I wouldn't hold my breath.

This site only highlights all the bad things happening on those sites - and the marketing teams there have most likely already been told by their developers what including 1 million cookies & 50 million tracker APIs will do to the performance. They want this gunk in there so that revenue targets can be met.

So the best bet is to put this up on GitHub where folks like me could learn from the code :-D

You could say the same thing about open sourcing it, if the OP is hoping for useful PRs they would be ill advised to hold their breath. Meanwhile OP now gets spammed with low effort installation and support requests.

Maybe open source the worker code, but the whole website?

In what way do you think news sites would pay for this service? They want full-throttle ads everywhere, right? I'm not sure why this service would be of any use to them. They just think: the more header-bidding partners putting cookies everywhere, the more money.

Just to be super clear, I was not trivialising the project itself! I think it's a very cool idea, and very useful for users like me =)

I do not think it is 'not useful'. I was - not very clearly - posing a question on what your idea was to put it into 'practice'. I.e., whether you were hoping to turn this into a business or an open source project by posting here =D

Merely curious! Thank you for sharing!

News sites could use this as a benchmark to ensure that the experience is worse with uBO.

Will open source soon, need some refactoring first :).

Thank you:)

Nice way to see if EU sites are GDPR compliant: since you haven't given consent when first hitting a website, those websites should not load any external tags yet. You can see, for example, that nos.nl or lemonde.fr are fairly clean, but bbc.com or bild.de are not, even though there is no consent.
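As an illustration of that kind of check, here is a hypothetical sketch that flags third-party request hosts from a captured request list (the request URLs are invented for the example):

```python
from urllib.parse import urlparse

def third_party_hosts(page_url, request_urls):
    """Return the set of request hosts that don't belong to the page's own site."""
    site = urlparse(page_url).hostname
    hosts = set()
    for u in request_urls:
        h = urlparse(u).hostname
        # Treat subdomains of the page's own host as first-party.
        if h and h != site and not h.endswith("." + site):
            hosts.add(h)
    return hosts

reqs = ["https://nos.nl/app.js", "https://cdn.nos.nl/logo.png",
        "https://tracker.example.com/pixel.gif"]
assert third_party_hosts("https://nos.nl", reqs) == {"tracker.example.com"}
```

A real compliance check would need a proper public-suffix lookup rather than simple suffix matching, but the idea is the same: before consent, this set should be (close to) empty.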

Question for the developer: any particular reason you're using Chrome with xvfb rather than just headless? It's been stable for a while.

You cannot load extensions in headless Chrome, so I had to jump through some hoops here.
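One common shape of that workaround (a sketch; the extension path and flags below are placeholder assumptions, not necessarily what webtest.app uses) is to launch a real, non-headless Chrome under a virtual X server:

```python
# Build the command line for running a full Chrome under Xvfb so an
# extension like uBlock Origin can load. Paths are hypothetical.
def chrome_command(url, extension_dir="/opt/extensions/ublock-origin"):
    return [
        "xvfb-run", "--auto-servernum",       # virtual display, no real screen
        "google-chrome", "--no-sandbox",
        f"--load-extension={extension_dir}",  # unsupported in headless mode
        "--remote-debugging-port=9222",       # lets the test harness drive Chrome
        url,
    ]

cmd = chrome_command("https://example.com")
assert cmd[0] == "xvfb-run"
```

Passing the list to `subprocess.run(cmd)` on a machine with Xvfb and Chrome installed would then launch the instrumented browser.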

Nice, I like this.

Would be nice if there was a way to display/download all cached results. Would be a nice dataset for visualisation or a dashboard.

I found a bug: when I enter a URL into the form it seems to URLencode the characters, but this doesn't work on the site. That is, [1] works, [2] spins on "status: queued" forever. As far as I know I don't have any particular settings or extensions that would cause the URL to be unexpectedly encoded. Edit: it's not doing it right now, which is weird. It was just a moment ago, but maybe it was fixed? I had assumed the backend was down.

[1] https://webtest.app/?url=https://bbc.co.uk

[2] https://webtest.app/?url=https%3A%2F%2Fbbc.co.uk
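For what it's worth, a standard query-string parser already percent-decodes values, so the encoded form of [2] could be accepted server-side; a minimal Python illustration (not the site's actual code):

```python
from urllib.parse import urlparse, parse_qs

# The encoded form from [2]: the ?url= value is percent-encoded.
raw = "https://webtest.app/?url=https%3A%2F%2Fbbc.co.uk"
qs = parse_qs(urlparse(raw).query)
target = qs["url"][0]  # parse_qs percent-decodes values automatically
assert target == "https://bbc.co.uk"
```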

Why does the Reddit page get bigger with uBlock Origin? From 2.73 MB on plain Chrome to 8.48 MB with uBlock Origin.


I'm noticing that DomContentLoaded, Processing Time, and Load Time are all blank for Reddit without uBO as well.

I'm going to make a mild suggestion that the author double-check their implementation logic. Is it possible that the ad-encumbered page isn't loading before it times out? Or is it possible that without ads the page is loading more content?

Even showing something like a comparison screenshot at the end might help in odd cases like this.

The screenshots didn't even load because my tiny server ran out of memory when testing Reddit.

This was an old result from when the server didn't have enough memory for big sites (especially the version without uBO). I reran it for Reddit, you can see the result on the same URL.

Quite often I’m getting 2-3x the load times for ublock.


It's the same 7 requests, probably a transient effect. E.g. https://webtest.app/?url=https://news.ycombinator.com/item?i... is 0.48 vs 0.5.

I don't think it lets you retest either, the first run is cached for a long time.

It would be very hard to retest, not just because of traffic, but also because some websites like amazon.com will throttle you to a halt or start showing captchas if they think you are scraping.

Great work.

Without uBO, Forbes loads 151 cookies.

https://news.ycombinator.com takes a second longer with uBlock

That means HN is doing things the right way :)

How is it so?

Suppose page A spends X seconds loading content and Z seconds loading ads. uBlock's overhead is Y seconds, but it saves Z seconds by blocking the ads, so the total time with uBlock is X+Y-Z. If a page has zero ads, the time is X+Y-0, which is greater than X+0.
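The arithmetic above can be sketched with made-up numbers:

```python
# Toy model of the comment's X + Y - Z reasoning; all numbers are invented.
def load_time(content_s, ads_s, ublock_overhead_s, ublock_enabled):
    if ublock_enabled:
        return content_s + ublock_overhead_s  # ads blocked, overhead added
    return content_s + ads_s

# Ad-heavy page: uBlock wins despite its overhead.
assert load_time(2.0, 3.0, 0.1, True) < load_time(2.0, 3.0, 0.1, False)
# Ad-free page (like HN): uBlock's overhead makes it slightly slower.
assert load_time(0.5, 0.0, 0.1, True) > load_time(0.5, 0.0, 0.1, False)
```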

It's also probably entirely random fluctuations in this case. uBlock doesn't add anywhere near that much latency normally.

Just to clarify, this is desirable because "fewer/no ads" is preferred over "faster page load times"?

Yes. Pages with ads are unreadable. I'm horrified every time I see friends navigate with no adblockers. I always recommend one. I'm going to start recommending Firefox again, as I did at the times of IE6.

Presumably because uBlock's baseline overhead is approximately 1 second. Therefore on other sites there is enough overhead from ads or tracking that it is still worth it to use uBlock.

Here's evidence[1] that uBlock adds less than 100ms to "an atypically large page from a nice site" (i.e. one that doesn't have any ads). Some other effect is at work here, it's probably just random network latency differences.

[1] https://github.com/gorhill/uBlock/wiki/Doesn't-uBlock-Origin...

So it has 1 second of startup cost? I have a hard time believing uBlock would add that much time to every request. Ideally this would make a more realistic measurement that doesn't include startup cost.

Or maybe it's just random variation.

Thanks for sharing this gem.

Did you consider computing the Speed Index? [1]

It would help assessing the performance impact these ads have.

[1] https://sites.google.com/a/webpagetest.org/docs/using-webpag...
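For reference, Speed Index (as described in the WebPageTest docs) is the area above the visual-completeness curve over time; a discrete sketch with invented frame data:

```python
# Speed Index: integral of (1 - visual completeness) over time.
def speed_index(frames):
    """frames: list of (timestamp_ms, completeness in [0, 1]), sorted by time."""
    si = 0.0
    for (t0, c0), (t1, _) in zip(frames, frames[1:]):
        si += (t1 - t0) * (1 - c0)  # rectangle rule between frames
    return si

frames = [(0, 0.0), (500, 0.5), (1000, 1.0)]
assert speed_index(frames) == 750.0
```

A page that paints most of its content early scores lower (better) than one that stays blank while ads and trackers load, even if both finish at the same time.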

Fun, I tried it with a regional newspaper and it doesn't show the processing and load time in the versions without adblock.

Also I never realized it was SO BAD without adblock


Regarding: "Also I never realized it was SO BAD without adblock". That's part of the problem. We techies have no idea what's happening without an ad blocker because we all use one. This service wants to show the unseen.

Excellent way to show the weight of ads and trackers, very clean side by side comparison.

Interesting to see many sites load faster without uBlock because DOM content loaded is faster... it would be nice if we could run more scientific measurement to get a better understanding of the load time differences.

I'm stuck in a queue. More workers please, this is awesome.

No sites are loading for me, including the one in the Show HN URL, which ought to be cached. I think we broke the site.

Yes sorry, it's tough!

Or maybe provide the ability to run it locally. Is it open source and does it have deployment instructions?

When I test your own site (webtest.app), it doesn't work. I was hoping for some introspection!

According to other replies this was a recent change to prevent recursive calls and attempts to access local files on the server.


Not using something like ublock is simply bad hygiene.

Adblockers in Chrome are slow. Fast pages go from having no noticeable delay to having one; several times the usual latency is not uncommon. Sure, this is only a problem for really fast pages, but it's still really annoying.

Check this service; it proves you wrong.
