
Adblockers Performance Study - kkm
https://whotracks.me/blog/adblockers_performance_study.html
======
magicalhippo
FWIW, Chromium devs have just responded[1] to the massive amount of feedback
they got on the mailing list, like [2] and [3].

One of the main pain points raised was the lack of any way to dynamically add
rules, as well as the low maximum number of rules allowed (30k). Seems they've
decided to support dynamic rule addition, as well as increasing the number of
rules, though probably not by orders of magnitude by the sound of it.

Proof is in the pudding though.

[1]:
[https://groups.google.com/a/chromium.org/forum/#!topic/chrom...](https://groups.google.com/a/chromium.org/forum/#!topic/chromium-
extensions/WcZ42Iqon_M)

[2]:
[https://groups.google.com/a/chromium.org/forum/#!topic/chrom...](https://groups.google.com/a/chromium.org/forum/#!topic/chromium-
extensions/hQeJzPbG-js)

[3]:
[https://groups.google.com/a/chromium.org/forum/#!topic/chrom...](https://groups.google.com/a/chromium.org/forum/#!topic/chromium-
extensions/veJy9uAwS00)

~~~
pythux
Hey, disclaimer: I worked on this study. Thank you for your comment.

To me this reaction from the Chromium devs is missing one of the most
fundamental issues. I'm not against the declarative API because of technical
limitations; I am against it because it is a strong innovation lock. The
current extension ecosystem is flexible enough to allow hundreds (maybe
thousands) of people to actively work on privacy-enhancing extensions
(ad-blockers, anti-tracking, etc.), and the technologies, heuristics, and
solutions to protect users' privacy on the Web are constantly evolving. The
APIs are not used today the same way they were used 2 years ago. If Chrome
decides to "freeze" the blocking capabilities of the browser into a
declarative API that no one but Chrome devs can improve, they will be
preventing people from finding new solutions to tracking and advertising (at
least from extensions). It does not matter if they replicate 100% of the
capabilities of today's ad-blockers; as long as the API does not allow
evolution and adaptation, it will become obsolete. There is precedent in this
matter: Safari has a similar API, and it has been a _huge pain_ for ad-blocker
developers. The reason is simple: Apple and Google do not have the same strong
incentives that we have to continuously improve the blocking capabilities of
the user agent. My fear is that this declarative API will be an OK replacement
for today's content blockers, but will not allow the same kind of fast-paced
development we benefit from today in the space of privacy extensions.

~~~
saagarjha
Speaking as a user (and early developer) of Safari's content blockers: I have
almost never run into an issue with them. What kind of development do you fear
will be stifled by Apple and Google not having incentives to improve the
blocking (which I find somewhat strange in the former case, anyway)?

~~~
pythux
What I'm afraid of is the following:

* The blocking engine operated by either Safari or Chrome is a black-box and independent devs will have a harder time understanding it, tweaking it, improving it, debugging it.

* Chrome devs are now playing nicely, gathering feedback and proposing some improvements to the APIs, but there is no guarantee this will happen again, or that they will invest time/energy in improving this part of the browser in the future.

* It's harder to work with this API than a JavaScript code-base you control.

* Chrome seems a bit better here but for Safari the documentation is pretty poor.

* You also don't get feedback regarding the rules which matched on a page and this makes it harder to debug or give nice insights to users.

That's only a few points from my personal experience but I discussed multiple
times with developers of other privacy-enhancing extensions/apps and we shared
similar feelings.

~~~
zaro
> Chrome devs are now playing nicely, gathering feedback and proposing some
> improvements to the APIs, but there is no guarantee this will happen again,
> or that they will invest time/energy in improving this part of the browser
> in the future.

I think this is especially true. It is somewhat similar to many other Google
products, like Maps and Translate: they start as good free products, but as
soon as they gain enough traction, the rules change. I think once this
declarative API is the standard for ad blockers in browsers, Google will start
exercising its control over it for its own benefit.

~~~
takeda
This is their long game. To me, all the push Google did with HTTPS,
certificate pinning, etc. makes much more sense now. I was wondering why they
were pushing it so hard.

I mean, after they essentially blocked ways to use a proxy to filter content,
the next logical step is to restrict the API.

~~~
jefftk
If you want to proxy your HTTPS traffic you add a local CA, and Chrome does
not apply certificate pinning. Pinning is only for certs that chain back to
the default CAs, specifically so people who need to proxy can do so.

(Disclosure: I work for Google, though not on Chrome)

~~~
takeda
Sure, but then you're still at the mercy of the browser.

The API change is totally unnecessary, yet it is happening despite many
protests.

The stated concern was performance and privacy, which looks like total BS
(even according to the link we are discussing).

Extensions are installed by the user, so why not let them decide what to do
with their browser? If it's really a concern, I don't think anyone would
oppose Google educating users about which APIs a given extension uses.

------
Dahoon
So Ghostery benchmarks and is the fastest? Fine. But Ghostery does not belong
in that test at all. Of course it can be faster: it misses a lot of the most
important features, like custom lists. Totally an ad.

~~~
r3bl
Scroll to the footer and you'll see that this is Ghostery's website.

I wouldn't really call promoting their own product on their own webpage an ad.

~~~
kakarot
A rose by any other name would smell as sweet. Ghostery doesn't want their
bias to be obvious to the non-discerning user because it's obviously an
attempt to reclaim market share using dirty tactics.

~~~
tyingq
Fighting one dirty tactic (manifest v3) with another, though. I'm more alarmed
about the former.

~~~
kakarot
If you're alarmed about Manifest v3, we welcome you into the Firefox fold.

Of course we've had our own issues with WebExtensions, but since they're not
politically motivated, these issues will hopefully be resolved.

------
move-on-by
Thanks for the interesting read! Right off the bat, I’m a little cautious of
benchmarks done by Ghostery that happen to show Ghostery is incredibly fast.
Not that anything else stood out to me as suspicious; just a comment. Perhaps
they are the fastest _because_ they benchmark and have fixed the bottlenecks.

Beyond all that, I’m a huge fan of ad blockers and use them as an attempt to
reduce tracking and targeting, and it’s abundantly clear to me that they
greatly speed up many webpages. Given the amount of junk that so many sites
load, no doubt they even save me data on my data plan.

It seems like common sense to not trust a company whose main income is based
on advertising to be making decisions on ad blocking.

~~~
pythux
Thanks a lot for your comment. The project is open source, and everything
needed to run the study's comparisons is available there as well:
[https://github.com/cliqz-
oss/adblocker/tree/master/bench/com...](https://github.com/cliqz-
oss/adblocker/tree/master/bench/comparison)

We would be really happy to see people run the same benchmarks and try to
reproduce the results!

~~~
alextooter
Hi, thanks for this fast ad-blocking software. After giving it a try, I found
it does not provide an option to let me subscribe to EasyList or anything
similar. Then I opened a website and lots of ads were still there; right-
clicking on an ad image gives no option to remove them.

From a user's point of view, it may not be ready for everyday use.

------
mirashii
> This work was motivated by one of the claims formulated in the Manifest V3
> proposal of the Chromium project: "the extension then performs arbitrary
> (and potentially very slow) JavaScript", talking about content-blockers'
> ability to process all network requests. From the measurements, we do not
> think this claim holds, as all popular content-blockers are already very
> efficient and should not incur any noticeable slow-down for users.

There are a few issues with the conclusion here. First, the article measures
and discusses only the time required to block a single request. Modern web
pages issue many, many more requests than that, like the 35 that this page
issues. At median timings, that would put the DuckDuckGo blocker at almost
300ms, well within what humans can notice.
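That back-of-the-envelope arithmetic is easy to check; a quick sketch (the per-request medians below are illustrative assumptions inferred from the figures in this thread, not the study's exact numbers):

```javascript
// Cumulative blocking overhead for a page, given a per-request median.
// The figures below are illustrative assumptions: DuckDuckGo's is
// back-computed from "~300ms over 35 requests"; the fast-blocker value
// just stands in for the sub-millisecond class.
const medianMsPerRequest = {
  duckduckgo: 8,     // assumed: ~280ms / 35 requests
  fastBlocker: 0.01, // assumed: sub-millisecond per request
};

function cumulativeOverheadMs(perRequestMs, requestCount) {
  return perRequestMs * requestCount;
}

const requestsOnThisPage = 35; // per the comment above
console.log(cumulativeOverheadMs(medianMsPerRequest.duckduckgo, requestsOnThisPage)); // 280
```

Even with these rough numbers, the gap between the slowest and fastest blockers spans several orders of magnitude, which is the crux of the disagreement.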

The second is that this API is not used solely by the popular content-blocking
extensions, but by a variety of other extensions. The Chrome team's
performance concerns likely stem from the fact that a user won't be able to
differentiate between the browser slowing down and an errant extension slowing
down network requests, and there are examples of extensions that use this API
to issue additional network requests or otherwise slow things down. If you
cherry-pick the good citizens of this API to show that performance isn't a
problem in general, you're not showing that performance can't or shouldn't be
the reason, just that it isn't the reason for the fast good citizens. What
this data could be used to argue is that imposing strict deadlines on the
execution time of these extensions would allow the content blockers that the
community cares about to continue to function as they do today, while also
placing a performance cap on bad extensions.

~~~
pythux
Thanks for the reply. Disclaimer: I worked on this study.

> There are a few issues with the conclusion here. First, the article measures
> and discusses only the time required to block a single request. Modern web
> pages issue many, many more requests than that, like the 35 that this page
> issues. At median timings, that would put the DuckDuckGo blocker at almost
> 300ms, well within what humans can notice.

That is true, but on the other hand the DuckDuckGo blocker is the exception
here, and they could likely improve this performance if it became their focus
(one way would be to use one of the faster open-source alternatives). If you
consider uBlock Origin, Adblock Plus or Ghostery, we see that even blocking
100 requests would not take much time (probably around 1 ms with Ghostery).

> The second is that this API is not used solely by the popular content-
> blocking extensions, but by a variety of other extensions. The Chrome team's
> performance concerns likely stem from the fact that a user won't be able to
> differentiate between the browser slowing down and an errant extension
> slowing down network requests, and there are examples of extensions that use
> this API to issue additional network requests or otherwise slow things down.
> If you cherry-pick the good citizens of this API to show that performance
> isn't a problem in general, you're not showing that performance can't or
> shouldn't be the reason, just that it isn't the reason for the fast good
> citizens. What this data could be used to argue is that imposing strict
> deadlines on the execution time of these extensions would allow the content
> blockers that the community cares about to continue to function as they do
> today, while also placing a performance cap on bad extensions.

There are indeed examples of extensions doing bad things: collecting private
data, etc. But we are talking about diminishing the potential privacy
protection of _all_ users to prevent some abuse. On the other hand, Manifest
V3 will not prevent extensions from being slow or doing bad things. Extensions
will still be able to use content scripts, inject arbitrary content into
pages, or send any private data home. WebRequest listeners will also still be
accessible (only not in blocking mode), which still allows data collection.

So yes, I think these changes could in theory prevent some cases of abuse, but
I strongly believe that they will overall weaken the privacy protection of
users, and that this is not an acceptable trade-off.

~~~
saagarjha
> On the other hand, Manifest V3 will not prevent extensions from being slow
> or doing bad things. Extensions will still be able to use content scripts,
> inject arbitrary content into pages, or send any private data home.
> WebRequest listeners will also still be accessible (only not in blocking
> mode), which still allows data collection.

I can choose not to install extensions that do these things, while using ones
that only have a manifest list…

~~~
pythux
I agree with you, and that's why I would like to see this declarative API
being an addition to the current WebRequest API. This way there could be
extensions using it exclusively and users could decide to pick these if they
offer sufficient privacy protection for their taste. On the other hand, you
would still have the option of installing more powerful extensions using the
dynamic APIs (allowing things which will _never_ be possible with the
declarative API).
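The difference between the two models can be sketched with a toy example (this is not the real chrome.* API surface; the rule shape, URLs, and heuristics below are all made up for illustration):

```javascript
// Declarative style: the rule is static data, and the engine can only
// ever do what the rule schema anticipated (a substring match here).
const declarativeRule = {
  id: 1,
  action: { type: 'block' },
  condition: { urlFilter: 'tracker.example' }, // hypothetical pattern
};

function declarativeDecision(rule, url) {
  return url.includes(rule.condition.urlFilter) ? 'block' : 'allow';
}

// Dynamic style: the listener is arbitrary code, so it can apply any
// heuristic, including ones invented after the browser shipped
// (here, a made-up check for 32-hex-char user IDs in query strings).
function dynamicDecision(url) {
  const looksLikeTracker =
    url.includes('tracker.example') || /[?&]uid=[0-9a-f]{32}/.test(url);
  return looksLikeTracker ? 'block' : 'allow';
}
```

The point of the sketch: the declarative engine can only evaluate what its rule schema anticipated, while a listener can ship a new heuristic in an extension update without waiting for the browser to grow a new rule field.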

~~~
saagarjha
Yup, I’m not saying that the old API should be killed; it’s just that it’s
convenient to have the new one and be able to “trust” the extension to not be
able to slow down my browsing, steal information from the page, etc. but also
be able to fall back to something I _do_ trust for what gets through if
necessary.

------
seanwilson
> This work was motivated by one of the claims formulated in the Manifest V3
> proposal of the Chromium project: "the extension then performs arbitrary
> (and potentially very slow) JavaScript", talking about content-blockers'
> ability to process all network requests. From the measurements, we do not
> think this claim holds, as all popular content-blockers are already very
> efficient and should not incur any noticeable slow-down for users. Moreover,
> the efficiency of content-blockers is continuously improving, either thanks
> to more innovative approaches or using technologies like WebAssembly to
> reach native performance.

Not commenting on whether the proposed V3 changes are a good idea or not, but
aren't the changes supposed to prevent badly written extensions from slowing
everything down?

It might be the case that these adblockers are well written, but you can't
guarantee that for all extensions. If a user doesn't have an obvious way to
know it's the fault of an extension, then Chrome gets the blame.

By the way, I have a Chrome extension that from its own tab requests many
pages from arbitrary domains to examine their content. I need to modify the
request headers going out and observe the response headers coming in, but only
for requests made from the extension (so it's not impacting other tabs at
all). I'm guessing V3 impacts this only if the request header modifications
are dynamic and can't be added via the fetch API?
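For the static part of that use case, request headers can indeed be attached per-request through the fetch API; a minimal sketch (the header name and URL are made-up examples, and reserved headers such as Cookie or Origin cannot be set this way):

```javascript
// Building a request with custom outgoing headers via the fetch API,
// from the extension's own page, with no webRequest involvement.
const probe = new Request('https://example.com/page', {
  headers: { 'X-Crawler-Probe': '1' }, // hypothetical custom header
});
console.log(probe.headers.get('x-crawler-probe')); // "1"

// Response headers are then readable from the fetch() result, e.g.:
//   const res = await fetch(probe);
//   res.headers.get('content-type');
```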

~~~
pythux
> It might be the case that these adblockers are well written, but you can't
> guarantee that for all extensions. If a user doesn't have an obvious way to
> know it's the fault of an extension, then Chrome gets the blame.

It has been suggested in another comment that Chrome could give visual
indications of the performance of extensions. They already do some of this to
track the memory used.

------
dasuisa
Perfect way of showing that Google's performance argument is bullshit: just
measure. And congrats on being faster than uBlock Origin, not an easy feat.

~~~
clouddrover
> _congrats on being faster than uBlock Origin_

I'd like to see a comparison against the WebAssembly build of uBlock Origin
though. uBlock Origin uses WebAssembly in Firefox to speed up some functions.
I'm not sure if Chrome allows add-ons to use WebAssembly yet.

~~~
dasuisa
Ah, that's an interesting point; I did not know that it could not use
WebAssembly in Chrome yet ([https://github.com/WebAssembly/content-security-
policy/issue...](https://github.com/WebAssembly/content-security-
policy/issues/7)). It would indeed be interesting to measure, although the
amount of wasm code there seems to be minimal so far.

~~~
pythux
As far as I know WebAssembly in uBlock Origin is currently used for two
things:

1. Matching the $domain option using an optimized trie data structure

2. Parsing domains using the public suffix list, also based on a trie data
structure

I would love for someone well-acquainted with the uBlock Origin code-base to
update the benchmark so that we can compare.

Edit: All the code to create the dataset (as well as the dataset used for the
study itself), run the benchmark, and analyze the results (create the plots,
etc.) is available in the repository and should be reasonably easy to run
locally.

~~~
gorhill
The same code used to match the `domain=` option is also used to match all
filters which are essentially just a plain hostname, i.e. `||example.com^` --
which is a majority of filters found in filter lists.
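One common way to match this class of filters is a trie keyed on reversed domain labels, so that a filter like `||example.com^` also covers any subdomain; a minimal sketch in that spirit (the class and its internals are invented for illustration, not uBlock Origin's actual code):

```javascript
// Hostname-anchored filter matching via a trie over reversed domain
// labels: "example.com" is stored as com -> example, so lookups walk
// the hostname right-to-left.
class HostnameTrie {
  constructor() { this.root = new Map(); }

  add(hostname) {
    let node = this.root;
    for (const label of hostname.split('.').reverse()) {
      if (!node.has(label)) node.set(label, new Map());
      node = node.get(label);
    }
    node.set('$', true); // terminal marker: a filter ends here
  }

  // Matches the hostname itself or any subdomain of a stored filter,
  // mirroring how `||example.com^` also covers ads.example.com.
  matches(hostname) {
    let node = this.root;
    for (const label of hostname.split('.').reverse()) {
      if (node.get('$')) return true; // a shorter filter already matched
      node = node.get(label);
      if (node === undefined) return false;
    }
    return node.get('$') === true;
  }
}
```

Lookup cost is then bounded by the number of labels in the hostname, independent of how many thousands of filters are loaded.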

------
ac130kz
Ghostery does not support custom external lists, which is why it is useless.

~~~
alextooter
True, I can't use a local ad list; it doesn't work like ABP or uBlock, which
is not user-friendly.

------
pjc50
Background:
[https://bugs.chromium.org/p/chromium/issues/detail?id=896897...](https://bugs.chromium.org/p/chromium/issues/detail?id=896897&desc=2#c23)
: a proposed change to Chrome that would limit adblock extensions.

~~~
tedivm
It also screws over people who use Tampermonkey.

------
kacamak
Worth keeping in mind here is that Ghostery is proprietary, while uBlock is
free software. You should never trust proprietary extensions with your data.

~~~
lordlimecat
Unless you audit the source of every piece of software you use as well as that
of every compiler used in making their binaries, you aren't in a position to
make that sort of absolute statement. OSS has benefits but for most users they
are delegating the code review to someone else, which makes it very similar to
proprietary code.

There is an argument to be made about motives with free vs non-free software,
but open vs closed source is for most people much less important.

~~~
commoner
Transparency in software releases isn't all-or-nothing. The parent comment
makes a good point about preferring more transparency (free and open source
software) over less transparency (proprietary software) as a consumer.

However, the parent comment is a bit misplaced since Ghostery appears to be
released under the Mozilla Public License 2.0.

[https://github.com/ghostery/ghostery-
extension/blob/master/L...](https://github.com/ghostery/ghostery-
extension/blob/master/LICENSE)

------
zaro
This is great news, but I think most people already using ad blockers know
that even a slow ad blocker improves the loading speed of web pages. Even if
blocking a request takes 10ms, it's still a win when it blocks something like
a tracker which records every click and mouse movement on the page.

------
ra7
Can we not have an editorialized title, please?

~~~
QuercusMax
Yeah... that's barely even what the article is about, based on a brief skim.
They seem to disagree with what Manifest v3 is trying to accomplish, but the
article doesn't even use the word "bullshit".

(Disclaimer: I work for a non-Google Alphabet company.)

~~~
btown
Agree with the need not to editorialize, but that is indeed what the article
is about. The entire impetus for measuring the performance impact of ad
blockers is to disprove one of the two cited reasons for Manifest v3, namely
that there are real-world measurable performance benefits to preventing ad
blockers from intercepting web requests.

------
saagarjha
> This work was motivated by one of the claims formulated in the Manifest V3
> proposal of the Chromium project: "the extension then performs arbitrary
> (and potentially very slow) JavaScript", talking about content-blockers'
> ability to process all network requests. From the measurements, we do not
> think this claim holds, as all popular content-blockers are already very
> efficient and should not incur any noticeable slow-down for users. Moreover,
> the efficiency of content-blockers is continuously improving, either thanks
> to more innovative approaches or using technologies like WebAssembly to
> reach native performance.

I don't think it's valid to debunk this claim without testing the speed of
manifest/content blocker list-based blocking.

~~~
loeg
I believe this article _does_ debunk the Google performance claim. It doesn't
really matter if the "manifest" system is perfectly fast (exactly 0 seconds);
this article shows that the current blockers are fast enough that they are
indistinguishable from zero. There is limited room for improvement here (so
limited as to be effectively none).

~~~
mirashii
Again, it doesn't debunk it, because the claim isn't "current content blockers
are not performant"; it is that the API allows extensions to do things which
cause performance issues, and there's a long tail of extensions that use this
API and do create human-noticeable delays. If you cherry-pick performant
examples and then try to debunk the whole landscape, it simply doesn't work.

~~~
loeg
> Again, it doesn't debunk it because the claim isn't "Current content
> blockers are not performant", it is that the API allows extensions to do
> things which cause performance issues, and there's a long tail of extensions
> that use this API and do create human noticeable delays.

Both of these arguments are easily rebutted and have already been in this
thread. As others have pointed out, the modified API still allows extensions
to do things which cause performance issues, just not in that particular path.
(Also, preventing ad load can improve page load performance so much that even
a "slow" adblocker may make up the difference anyway.)

> If you cherry pick performant examples

I don't think these examples are cherry-picked; they're among the most popular
adblockers in the landscape:

* [https://www.tomsguide.com/us/pictures-story/565-best-adblock...](https://www.tomsguide.com/us/pictures-story/565-best-adblockers-privacy-extensions.html)

* [https://www.digitaltrends.com/web/best-ad-blockers-for-chrom...](https://www.digitaltrends.com/web/best-ad-blockers-for-chrome/)

As others have pointed out, you can measure and break or shame poorly
performing blockers without punishing the ones that work well. So:

> and there's a long tail of extensions that use this API and do create human
> noticeable delays.

There's a long tail of extensions that are poorly behaved _in general_. You
can punish those ones without killing the ones that don't suck.

------
userbinator
I'm curious how filtering proxies like Privoxy, Proxomitron,
Proximodo/Proxydomo, etc. compare --- it's an extra (local) hop of network
latency, but those are pure native code. I've been using one for a long time
(ever since I heard of them) and it effectively works across all the browsers
on the system, even those built into other apps (often only for the purpose of
showing ads...). Even for those who don't routinely use multiple browsers,
given how increasingly user-hostile and unconfigurable browsers are becoming,
I think it makes sense to move filtering into its own application.

~~~
15DCFA8F
The problem with those external HTTP filtering proxies is that the usage of
SSL/TLS is pervasive nowadays.

~~~
userbinator
At least some of those support MITM using a local CA.

------
lohszvu
Ghostery is owned by an advertising company. It does not allow custom lists.

------
brianpgordon
I'm surprised that Brave performs so poorly. Isn't the whole point of having a
dedicated privacy-aware browser supposed to be that the ad/tracker blocking
code can be written directly in C++, without having to run in JavaScript and
talk over plugin APIs?

Maybe these results are only applicable to the desktop version of Brave?

~~~
bbondy
Some things are very wonky with the experiment:

Brave is intentionally slow on parsing and does as much work there as
possible, because it doesn't parse from client code; it only uses
already-parsed lists from memory.

"The memory usage of Brave could not be evaluated using the devtools and thus
is not included in this section." That doesn't make sense; I wonder if it's
maybe using a very old version based on the old Muon code base? If you can get
the memory from Chrome, you can get it from Brave.

No information was given about the versions that were tested.

The total number of parsed rules is too small.

~~~
pythux
> Some things are very wonky with the experiment:

Thank you for taking the time to read this study. We do not think we claimed
anything false in this study (although the scope might not be as wide as some
would expect or desire); that is not a reason to be dismissive. We ourselves
have a lot of respect for the work done at Brave.

> Brave is intentionally slow on parsing and does as much work there as
> possible, because it doesn't parse from client code; it only uses
> already-parsed lists from memory.

That was indeed one of the things measured, but not the most important one. In
fact, we explicitly say that this is a one-time operation and does not
necessarily matter, especially if, as you suggest, you can perform this work
backend-side and ship the serialized version to clients. What is more
interesting is the time it takes to match requests.

> "The memory usage of Brave could not be evaluated using the devtools and
> thus is not included in this section." That doesn't make sense, I wonder if
> it's maybe using a very old version based on the old muon code base? If you
> can get the memory from Chrome you can get it from Brave.

If we got this wrong, we would be very happy to update the results with the
correct measurements. The version we used was the latest version from
`master` on the following repository: [https://github.com/brave/ad-
block](https://github.com/brave/ad-block)

> No information was given about versions that were tested.

This is indeed unfortunate and we will be correcting it. The measurements were
performed last week with the latest version of each project, but we should
definitely indicate the exact versions used.

> Total parsed rules is too small.

Too small for what exactly? EasyList is one of the most popular lists and it
is pretty common to use it as a baseline for comparison. It is trivial to
re-run the experiment with different lists, given that all the code is open
source.

------
Tsubasachan
You know what slows down browsing the internet?

Ads.

~~~
pmarin
Javascript.

------
DeepYogurt
TIL: DDG has an adblocker. Good to know and I hope they can improve it.

------
lvs
Only part of the argument for ad blockers is speed. The rest of the argument
is trust. So, how do you make money?

As far as I know, uBlock Origin has no profit motive whatsoever. Correct me if
I'm wrong.

------
StreamBright
Also, how about blocking at the DNS level? Why is this article concerned with
in-browser ad blocking only? I bet resource utilisation is much lower if you
do it in the lower layers.

~~~
thetinguy
DNS adblocking sucks because it takes too long to turn off when a website
breaks because of ads.

~~~
StreamBright
On the contrary, I do not want to use any website that breaks because of ads.

~~~
kkm
Yes, that's another way to look at it. But it is not necessarily true for all
the users.

~~~
StreamBright
I understand; I never implied that it is for all users. I was just mentioning
that there are other options than JS ad filtering.

Btw, DNS-based filtering works for non-tech-savvy users like my parents, who
would certainly fall for malicious ads, which are frequent even on Google. And
guess how ransomware spreads.

[https://www.zdnet.com/article/skype-served-up-malware-
throug...](https://www.zdnet.com/article/skype-served-up-malware-through-in-
app-malicious-ads/)

~~~
kkm
Yes, I think there are certain domains, especially spyware and malware, that
need to be blocked altogether, and DNS blocking is the optimal way to do that,
especially considering the wide adoption of IoT devices, where extensions
cannot run.
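As a concrete illustration (domains are placeholders), network-level blocking boils down to resolver rules rather than extension code; e.g. a hosts-file entry covers one machine, while a dnsmasq rule can cover a whole network including subdomains:

```
# /etc/hosts entry: blocks this exact hostname on one machine
0.0.0.0 tracker.example.com

# dnsmasq rule: answers 0.0.0.0 for the domain and all of its
# subdomains, for every device pointed at this resolver (IoT included)
address=/tracker.example.com/0.0.0.0
```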

------
codedokode
> All benchmarks were ran on an X1 Carbon 2016 (i7 U6600 + 16 GB) in Node.js
> 11.9.0.

They should test on average hardware and not on top-end hardware. Take a
2013-era Celeron or Atom with 2 GB of RAM and an HDD and test on that.

Regarding privacy, I think the code that blocks network requests could run in
an isolated environment so that it cannot send information about the requests
outside.

~~~
barrkel
I think it should be run on a desktop-class machine. I'm less concerned about
absolute numbers and more with variance. Laptops generally cannot dissipate
all the heat from full CPU usage for more than a minute or two. Benchmark
runs, if they exercise the CPU hard for any length of time, get thermally
limited, and the results end up with substantially more variance.

Even things as simple as build times, I've seen vary by 20+%.

~~~
pythux
Benchmarking is always a hard problem - no such thing as spherical chickens in
a vacuum. That said, we'd love to at least standardize the setup for other
people running the benchmarks; that is why we opened up all the code and data
as a starting point.

For the study, measurements were run on one of our personal laptops (an X1
Carbon from 2016 with an i7 U6600 CPU and 16 GB of RAM, which is indeed a
pretty powerful machine). We tried very hard to limit the impact of frequency
throttling due to the limited thermal dissipation of the device during the
long-running benchmarks. In fact, for the measurements we put the laptop
outside at 0 degrees Celsius, and we could observe that the CPU temperature
did not go beyond 60 degrees (which is pretty low).

Do you have any suggestions on how we could improve this setup? We welcome all
contributions.

------
Fnoord
I don't see Firefox mentioned once in this whole article, so I assume it isn't
meant for the sole browser I am using. Therefore this performance study is
useless for me.

------
jachee
Faster still is DNS-based, network-wide blocking with Pi-hole.

~~~
darkpuma
If that works for you, that's great. But it's a pretty crude tool. Kind of
like wood carving with a chainsaw: some people swear by it and it's undeniably
efficient, but other tools give you more delicate control. I wouldn't go
without uBlock Origin's cosmetic filters, which I use to block many things
that aren't ads, such as those annoying floating bars at the top and bottom of
the screen that a lot of websites use just to annoy users (sorry, _"improve
conversion"_) and waste vertical screen space. uMatrix (and, to a lesser
degree, advanced mode in uBlock Origin) also lets you differentiate between
blocking first-party and third-party requests. For instance, I block YouTube
on every website except youtube.com.

------
ajobforme
Why is AdBlock not included?

~~~
pythux
Good point, and we should certainly have mentioned it. AdBlock uses the same
underlying engine that Adblock Plus develops (adblockpluscore), which means
they have the same performance when it comes to matching requests (there could
still be differences in the extensions themselves, but that was not measured
in the study).

Source: [https://github.com/betafish-inc/adblock-
releases#intro](https://github.com/betafish-inc/adblock-releases#intro) and
[https://en.wikipedia.org/wiki/AdBlock](https://en.wikipedia.org/wiki/AdBlock)

------
Dahoon
Anyone notice who posted this? Ghostery did.

Look at that submission history:

[https://news.ycombinator.com/submitted?id=kkm](https://news.ycombinator.com/submitted?id=kkm)

They should be shamed. I'll make sure to tell anyone I see mention them to
stay clear.

------
g45y45
Google, the world's largest advertising company, abuses its market share to
crush those that would stand in its way. I stopped using Chrome when this was
proposed, and you should too!

The article provides evidence that Google's pretext of performance doesn't
hold water.

------
bronlund
This has to be one of the stupidest performance reviews I have ever seen.

