
Firefox 57 delays requests to tracking domains - bzbarsky
https://www.janbambas.cz/firefox-57-delays-requests-tracking-domains/
======
codezero
I work for an analytics company, this will affect my snippet – I don't think
it is a big deal, but I'm going to benchmark it on a few sites. From what is
described here, it may actually help us. When folks throw tons of tags into
their site, we're all competing against each other as well as the site's own
loading – it makes sense to prioritize the site's rendering.

I think of this as no more invasive than pre-caching stuff in a page when you
see links that are likely to be clicked, or any other browser optimization.

I may be pounding my fist later if the results don't look great, but
ultimately, I think Mozilla's heart is in the right place.

I can think of tons of ways to work around this if I thought it was a real
issue, I don't. People who want to block analytics, trackers, ad networks,
etc... know how, and this doesn't feel like it is targeted at those people, or
even squarely at the tracking pixels.

HN-pre-emptive-defend-myself: My company doesn't share or sell data with third
parties (it's in our TOS – and could obviously change in some Orwellian
future, but we'd have to update the TOS), it's the customer's data, they can
use it in the same way they'd use any other data they collect. We don't do
cross-domain tracking, or aggregate behavioral information across customers,
and the way we've built it, it makes it pretty hard to tie any two visitors
together across sites if you were to hypothetically acquire our dataset. We're
also working on building tools to comply with GDPR and wipe out data per-
request. Anyways, I think a lot of very bad actors have made the space have a
bad reputation and I don't blame anyone for blocking everything, or even for
hating my company, or generally all analytics.

~~~
Sophistifunk
It's a grey area. But I am interested, do your ToS forbid your customers from
on-sharing the data with cross-site aggregators?

~~~
anfedorov
"on-sharing the data with cross-site aggregators"?

The ToS and Privacy Policy of an analytics company disclose what that
company may do with its clients' users' data, not what the client can do
with their own users' data.

There's another Privacy Policy between the client and the user where the
client discloses what they may and may not do with user data and the user
decides whether to become / remain a user.

There are also laws which further dictate data handling on behalf of large
groups of users, e.g. GDPR for EU citizens and, on this side of the pond,
interesting Supreme Court cases being decided about what kind of data users
could have property rights to:
[https://en.wikipedia.org/wiki/Carpenter_v._United_States](https://en.wikipedia.org/wiki/Carpenter_v._United_States)

------
marten-de-vries
Interesting approach. I hope more browser vendors will adopt it. It comes with
an additional advantage: if a website still has to function when tracking
domains are delayed by a second or so, it will likely also function with those
domains completely disabled (e.g. using extensions). Sounds like a win from a
user perspective.

~~~
michaelmior
Do you mean that browsers implementing this will force website owners to make
their sites work with tracking domains delayed? This does seem like a nice
side benefit.

~~~
marten-de-vries
Yes, that's my hope. Collectively, website owners' income is best served if
they all block their sites until the trackers are fully loaded. But if only a
few do so, users might avoid their sites, so individuals are still encouraged
to support delayed loading.

I imagine Mozilla is in a similar situation: if too many websites block, they
will have to disable the delay or start whitelisting trackers if they do not
want to lose users. That is, unless other browsers follow their lead here.

Prisoner's dilemmas all around. It's going to be fun to see what we'll end up
with, although I'm optimistic as I haven't heard complaints about this yet,
but then again I wouldn't notice any difference myself as I'm already using
NoScript.

~~~
yjftsjthsd-h
I think it's not quite a normal prisoner's dilemma, because it's not boolean.
The browser can be less aggressive initially, pushing against only the worst
cases, then become more aggressive over time to make sites behave better.

------
korethr
While I like the idea, a potential problem comes to mind. A list of what
domains are tracking domains will need to be maintained. The need to maintain
that list will possibly move the cat-and-mouse game of ads vs ad-blockers to
the tracking list as well. I could totally see Alphabet paying Mozilla to
remove analytics.google.com from the tracking domain list (because "it's just
analytics; it doesn't track you or infringe on your privacy at all") similar
to advertisers paying to be on Adblock Plus' "Acceptable Ads" list.

~~~
deckar01
> using data of the Tracking Protection database

This database is already being maintained and used for private browsing. I
find it unlikely that an analytics CDN will play cat and mouse with Mozilla
just so its script will load a few milliseconds sooner.

~~~
rihegher
Where can we find this list?

~~~
KozmoNau7
The list is supplied by Disconnect.me

------
kevin_thibedeau
> Google’s A/B testing initially hides the whole web page with opacity: 0

Sounds like a violation of Google's own policies.

~~~
vim_wannabe
Now try Google's sweetheart project
[https://www.ampproject.org/](https://www.ampproject.org/) or any amp-powered
site. They contain this gem of a boilerplate hiding page content for 8 seconds
or until whatever script is finally loaded that overrides it:

    
    
        body {
          animation: -amp-start 8s steps(1,end) 0s 1 normal both; 
        }
        @keyframes -amp-start {
          from { visibility: hidden; }
          to { visibility: visible; }
        }

------
qwerty456127
Isn't it anti-net-neutrality? Is it OK to build it into a major browser? I
actually block all the tracking, malware, ad, social network, fraud,
gambling, porn (except a couple of porn sites I like :-)), etc. domains I
could find information about, but this is my personal, conscious choice: I
manually installed extensions for this and built the lists. Shouldn't other
people do the same themselves if they choose to, or leave everything as is
if they don't actually feel they need to change anything?

~~~
paulgb
For me, using Firefox _is_ the personal conscious choice you refer to. Their
brand is that the browser is on your side, I'm happy to have the defaults set
accordingly.

(Edit: I should add that I think you're asking a very valid philosophical
question that my opinion as a user of Firefox doesn't fully address)

~~~
qwerty456127
From the practical point of view, I am afraid the trackers will respond in
some way if such a move is made at a global level. Nobody was fighting
AdBlock+ when it was only used by geeks, but now that it has gained so much
attention there are a lot of sites that won't work if you use AdBlock+.
Introducing the "do-not-track" header was a great idea, but it has been
completely ruined by major browsers turning it on by default, so the trackers
have legitimately chosen to ignore it.

~~~
fro0116
> Introducing the "do-not-track" header was a great idea, but it has been
> completely ruined by major browsers turning it on by default, so the
> trackers have legitimately chosen to ignore it.

This seems to imply that doNotTrack would have been successful for its
intended purpose had it not been a default setting at one point, which feels
like wishful thinking.

Sure, that aspect made it impractical for uncharacteristically privacy-minded
companies in the space to support the header, but the vast majority of
companies in the business of tracking and advertising would have ignored it
anyways, because there's zero consequence for not doing so.

------
jrochkind1
Much of this discussion is missing the point that the stated goal of this
change is actually to _help_ sites that use lots of tracking scripts, not to
penalize them.

It has become common to use so many tracking scripts that the perceived page
load time (time to display/interactivity) is actually significantly slowed
down. I actually first installed an ad-blocker myself when I realized some
sites were taking like 5+ seconds to load, and loaded quicker with the ad-
blocker. (But this change isn't a _blocker_ of scripts, it's trying to change
order and timing of execution to speed up page load while _keeping_ the
scripts).

The intent of this change is to delay load of those scripts (which are already
being loaded with code that loads them async, that is, without spec guarantees
of load order or timing) until after the page UI is loaded and operative, to
_improve_ perceived load time.

I'm not sure if people are missing this point, or don't believe the stated
goal and think it's secretly a plan to hurt these sites instead. I believe
the stated goal (whether people like or hate the idea!). As the OP says though,
there are certain pages that _may_ be unintentionally harmed by the change, if
they were relying on quick load of scripts that they should not have been
relying on because they were already being loaded async (that is, with no
guarantees of load order or timing, already).
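Well-built async snippets already tolerate an arbitrary delay. The usual defense is the command-queue stub pattern (the approach used by e.g. Google Analytics' async snippet); here's a minimal illustrative sketch, not the exact shipped code:

```javascript
// Command-queue stub: the page defines a tiny placeholder function that
// buffers calls until the real (async, possibly delayed) script loads
// and drains the queue.
var ga = function () { (ga.q = ga.q || []).push(arguments); };

// Page code can call the API immediately, long before analytics.js arrives:
ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');

// Two commands are now buffered; a delayed script load just means the
// queue drains later, with no functional breakage.
console.log(ga.q.length); // 2
```

Pages that instead call into the tracker's API directly, assuming the script has already executed, are the ones that may be harmed by the change.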

If this ends up being a non-trivial number of pages, and those pages/tracking
frameworks don't fix themselves to accommodate, then I predict the change will
be considered unsuccessful and unfortunately rolled back.

It is meant to _help_ pages that use a lot of tracking scripts, not hurt them.
Although I guess the assumption is that actual load time of interactivity is
prioritized over making sure your tracking scripts are in immediately. If site
owners actually prefer to slow down their pages non-trivially in order to
guarantee tracking scripts immediately, then I guess they wouldn't see it as
help. shrug.

I think the OP author is probably regretting his post title. It maybe should
have been "Firefox 57 speeds up load time to interactivity of pages with lots
of tracking scripts", heh.

------
mehrdadn
Does anyone know where I can find the list of sites that are affected?

EDIT: Seems to be here: [https://github.com/mozilla-services/shavar-prod-
lists](https://github.com/mozilla-services/shavar-prod-lists)
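If you want a flat domain list rather than the raw JSON: the Disconnect-format blacklist nests domains under category → company → homepage URL (that structure is my assumption from similar lists; check the repo for the exact shape). A small node sketch to flatten it:

```javascript
// Flatten a Disconnect-style blacklist into a bare, deduplicated domain list.
// Assumed shape: { categories: { Category: [ { Company: { "http://...": [domains] } } ] } }
function flattenBlocklist(blocklist) {
  const domains = [];
  for (const entries of Object.values(blocklist.categories || {})) {
    for (const entry of entries) {               // one object per company
      for (const company of Object.values(entry)) {
        for (const value of Object.values(company)) {
          // Non-array values (e.g. metadata flags) are skipped.
          if (Array.isArray(value)) domains.push(...value);
        }
      }
    }
  }
  return [...new Set(domains)].sort();
}

// Tiny inline sample in the assumed format:
const sample = {
  categories: {
    Advertising: [
      { ExampleCo: { 'http://example-co.test/': ['ads.example.test', 'px.example.test'] } },
    ],
  },
};
console.log(flattenBlocklist(sample)); // [ 'ads.example.test', 'px.example.test' ]
```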

~~~
anfedorov
Here it is newline-separated in case you want to import it into the Adblock
DNS filter for iOS:
[https://gist.githubusercontent.com/anfedorov/1fa7dc8871b20da...](https://gist.githubusercontent.com/anfedorov/1fa7dc8871b20dab92f856f1e8bddaff/raw/9dd5f81068cac4c90129c8c2ebb218a6df03fa5b/gistfile1.txt)

~~~
mehrdadn
Thanks! I wasn't but I'm sure others will find it useful.

~~~
anfedorov
Just realized the list contains facebook.com as well as mail.google.com, so
perhaps a little overly broad there.

------
bo1024
I love to see software that is actually acting on behalf of the users running
the code rather than companies serving it.

This does point to some problematic directions though.

If web standards were simpler and it were easier to build competing browsers
(and/or if we could trust plugins more), then it wouldn't be much of a concern
if some browsers choose to experiment with these kinds of protections.

------
feelin_googley
This raises the question: Why delay instead of block?

Assumption is that user wants page to load faster _but does not object to
tracking_.

What if user wants page to load faster _and_ objects to tracking?

Source of Firefox tracking protection is list at disconnect.me?

credit: eco
[https://news.ycombinator.com/item?id=15964393](https://news.ycombinator.com/item?id=15964393)

Basic gethostbyname()->HOSTS file blocking:

    
    
       #!/bin/sh
       exec curl https://disconnect.me/trackerprotection/blocked \
       |sed -n '/<\/br>/!{/\./s/^/255.255.255.255 /;};/\./p' >> /etc/hosts;
    
       User may want to add these too:
       https://disconnect.me/trackerprotection/unblocked
    

Beyond HOSTS file, authoritative DNS gives more flexibility, e.g. logging all
requests, using wildcards, etc. For example, tinydns:

    
    
       #!/bin/sh
       curl https://disconnect.me/trackerprotection/blocked \
       |sed -n '/<\/br>/!{/\./s/.*/.&\
       =&:255.255.255.255:1/;};/\./p' >> _root.zone/root/data;
       if cd _root.zone/root/data;then exec tinydns-data;fi
    

Or dnscache:

    
    
       #!/bin/sh
       curl https://disconnect.me/trackerprotection/blocked \
       |sed -n '/<\/br>/!{
       /\./s/.*/echo 255.255.255.255 > _dnscache\/root\/servers\/& /;};
       /\./p' > block.sh;
       if sh -c ./block.sh;then exec rm block.sh;fi
    

Note I am not recommending this particular block list. Compared to the one I
use, it seems incomplete. I prefer to create my own list from DNS logs of my
own network traffic. Use of wildcards can shorten a long list like this
substantially.
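As a sketch of that wildcard idea: collapse hostnames that share a parent domain into a single `*.` entry. This naive version treats the last two labels as the parent, which is wrong for ccTLDs like .co.uk; a real tool would consult the Public Suffix List.

```javascript
// Collapse hostnames sharing a parent domain into one wildcard entry when
// enough siblings exist; otherwise keep the hosts as-is.
function collapseToWildcards(hosts, threshold = 2) {
  const byParent = new Map();
  for (const host of hosts) {
    // Naive "parent": last two DNS labels (no Public Suffix List handling).
    const parent = host.split('.').slice(-2).join('.');
    (byParent.get(parent) || byParent.set(parent, []).get(parent)).push(host);
  }
  const out = [];
  for (const [parent, members] of byParent) {
    if (members.length >= threshold) out.push('*.' + parent);
    else out.push(...members);
  }
  return out.sort();
}

console.log(collapseToWildcards([
  'a.tracker.test', 'b.tracker.test', 'cdn.lonely.test',
]));
// [ '*.tracker.test', 'cdn.lonely.test' ]
```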

~~~
Vinnl
Blocking is already supported: [https://support.mozilla.org/kb/tracking-
protection#w_how-to-...](https://support.mozilla.org/kb/tracking-
protection#w_how-to-turn-tracking-protection-on)

Mozilla doesn't want to enable it by default because it is the funding model
of the web, and no viable alternative funding model for most of the web's
content has arrived yet. (At least, that's the public statement. I don't know
if e.g. potential lawsuits, etc. also play a role.)

~~~
bigbugbag
Tracking is the funding model of online advertising and investor storytime but
it's not the funding model of the web. I think there's no such thing as a
funding model of the web.

The web was exclusively non-commercial; it was later opened to commercial
activity, and later still to advertising, which added tracking because its
business model was broken and advertisers needed something to show their
customers and investors to gain/keep their trust that online advertising was
not a total scam, as its physical-world counterpart mostly is.

The unexpected side effect is the creation of a global totalitarian
surveillance state just to show ads:
[https://www.hooktube.com/watch?v=iFTWM7HV2UI](https://www.hooktube.com/watch?v=iFTWM7HV2UI)

------
paulie_a
Ive wondered if it would be possible to tumble the requests via P2P network.
Completely destroy the analytics

~~~
yjftsjthsd-h
Use TOR?

~~~
paulie_a
That doesn't accomplish my goal. I don't want to hide, I want to ruin the
data collected by making it inaccurate and useless.

EDIT: Judging from the low quality of online marketing, it isn't hard to do.

~~~
bigbugbag
Something like adnauseam ?

[https://adnauseam.io/](https://adnauseam.io/)

------
dewiz
I’d prefer to be paid to be tracked. E.g. the browser could send my wallet
address as a signed header, and upon payment verification the delay (or any
other penalty) would be lifted for 60 minutes.

------
beedogs
I just block tracking domains. And ad domains. My browser, my rules. The sites
that can't handle it aren't worth the bother.

------
659087
Tracking scripts should be blocked by default, not just delayed. If a user has
some strange desire to be watched, give them the option to opt-in to their
browser unblocking tracking scripts/domains.

~~~
bzbarsky
> Tracking scripts should be blocked by default, not just delayed.

There is in fact a preference to do that: Settings -> Privacy -> Tracking
Protection, set to always.

It breaks some reasonably popular sites, which is why it's not on by default
across the board right now.

~~~
659087
Yes, there's an option to enable blocking. I think it should be the other way
around, which was kind of my whole point. That's the only way to encourage
site owners to stop designing sites that break when they can't spy on users.

~~~
bzbarsky
If the tracking protection thing were being created before the sites, I agree.

As things stand, it's a hard sell to have a browser do something that breaks
sites that used to work. Most users tend to not be happy about it. This is
definitely an area worth pushing on; the question is how best to do it.

------
sundarurfriend
> To conclude on how useful the tailing feature is – unfortunately, at the
> moment I don’t have enough data to provide (it’s on its way, though.)

I wonder if this post was rushed to publication, to manage Mozilla's public
image and reestablish it as a friend to user privacy, after the 'Looking
Glass' fiasco a few days ago.

(I'm not exactly opposed to such PR efforts, as long as they're accompanied by
actual internal change in the company.)

~~~
bigbugbag
Much of mozilla communication is PR and marketing.

Maybe this is damage control, but somehow I think they are not well organized
enough to have this kind of thing happen.

Besides how is this helping with privacy ? Trackers are still loaded and
tracking.

~~~
ta12348765
"By Deeds, Not Words", Mozilla. They've had a few too many strikes lately.
Their actions seem particularly greedy and tone-deaf given their supporters,
and no longer aligned with what I want out of a company. Despite their
constant words and apologies, they keep pulling this stuff; as a 10+ year
user of Firefox I'm pulling the plug on it - it just can't be trusted any
longer.

Silently installing a plugin and doing an end-run around any policies in place
etc. is just clown-school level.

------
fenier
Any idea if a detailed notice from Mozilla will be issued about this at
[https://developer.mozilla.org/en-
US/Firefox/Releases/57](https://developer.mozilla.org/en-
US/Firefox/Releases/57) or similar?

------
ariana82
This is nothing new to me, as I use a chain of local proxies (Privoxy and
Squid) on my PC with Firefox to block ads and trackers. The problem is that
Firefox has a memory leak issue: it consumes ~2GB of RAM while browsing
Facebook or using add-ons like Ghostery or Adblock.

~~~
bigbugbag
Why would you use Ghostery, which was a tool to study people who want to
block ads so they can be better served ads?

Why would you use Adblock, which is outdated, when uBlock Origin has been
available for a while?

It's strange that you take extra steps to block ads and trackers but peruse
Facebook, whose sole purpose is tracking and profiling you and, through you,
your family, friends, and acquaintances.

~~~
ariana82
I don't use Ghostery or AdBlock. I've seen my friends use them and how
Firefox consumes tons of their PCs' memory.

I use different browsers to access different categories of websites: Brave
for social media and Firefox for news reading or research. Both browsers use
the same local proxy connection on my PC. I installed Privoxy and Squid on
the same PC as the browsers, so both appear as the same user agent (e.g.
Chrome).

When I access [http://www.janbambas.cz](http://www.janbambas.cz), of course my
browser loading faster as my Privoxy allow/block these domains:

    
    
      2017-12-21 00:55:32.780 00000f04 Request: www.janbambas.cz:443/
    
      2017-12-21 00:55:34.632 00001194 Crunch: Blocked: fonts.googleapis.com:443
    
      2017-12-21 00:55:34.670 00000e3c Crunch: Blocked: secure.gravatar.com:443
    
      2017-12-21 00:55:34.697 00000e8c Crunch: Blocked: www.google.com:443
    
      2017-12-21 00:55:34.711 000013e8 Crunch: Blocked: secure.gravatar.com:443
    
      2017-12-21 00:55:34.741 000012cc Crunch: Blocked: secure.gravatar.com:443
    
      2017-12-21 00:55:35.680 00000ce4 Crunch: Blocked: secure.gravatar.com:443
    
      2017-12-21 00:55:35.721 00000978 Crunch: Blocked: secure.gravatar.com:443
    
      2017-12-21 00:55:35.736 000010e0 Crunch: Blocked: www.google.com:443
    
      2017-12-21 00:55:35.759 00000104 Crunch: Blocked: secure.gravatar.com:443
    

As you can see, my Privoxy blocked 9 connections across 3 domains.

What makes me sick is that this person, Honza Bambas, doesn't give a real
solution to our browsing problems. _He only makes us die slowly._

~~~
Dylan16807
Can you explain why you think gravatar is a problem?

Or google fonts?

~~~
ariana82
By blocking both of these domains we reduce page load time. The browser
doesn't have to wait _forever_ for those connections to complete, as they are
blocked outright. Moreover, I need only the article, not a picture of his
avatar or fancy fonts. Remember, Google Fonts are files on Google's web
servers, and every request to a web server is written to its log file (e.g.
the browser's user agent, HTTP referer, IP address). Like suspicious OCSP
requests (in disguise), it's a kind of tracker, right?

------
jancsika
Is there a config setting for the delay period in double precision units? :)

~~~
TD-Linux
If you're alluding to setting the delay to infinitely large, you can do that
by going to Settings -> Privacy -> Tracking Protection and setting it to
"always".

------
fiatjaf
Google search pages sometimes take a long time to load on Firefox, and after
a while they often fail altogether.

Does that happen with anyone else?

------
b0rsuk
Unfortunately this can give ammunition to the anti-neutrality lobbyists. "See,
everyone is doing it anyway."

------
pb000
Well, love them or hate them, but tailing them is not Net Neutrality ;)

------
fooyc
Is this net neutrality compliant ?

~~~
iscoelho
Net neutrality is in regards to the internet carrier, not the browser (or
client). The browser is irrelevant to net neutrality.

~~~
pbhjpbhj
>The browser is irrelevant to net neutrality. //

It's not the Net Neutrality currently being spoken of, but it's relevant to
the neutral carriage of data over the internet. If browser companies pick and
choose whose data to delay (or otherwise alter), then they have the power to
bias the web - Firefox _could_ systematically delay scripts from every
company but Google, for example, in order to preference their business
associate. They presumably aren't informing users, or requiring users to
enable the function.

In short it seems highly pertinent in the net neutrality debate, to me,
despite perhaps not having reached problematic levels and despite not being
the specific form of neutrality that has erstwhile been grabbing the
headlines.

~~~
notatoad
it's an issue with similar consequences to net neutrality, but it's very much
not the same issue. The network is one layer, the client is a different layer.
Let's not start forcing clients to conform to network rules, or vice versa.

additionally, the reason net neutrality is so important is because there is no
consumer choice in the ISP market for many users. Even when one browser is
dominant, there's a lot more choice among browser vendors, so pushing out
regulation to them is less important.

But let's not muddy the waters of net neutrality by injecting separate issues
into the debate. Yes, browsers should treat all traffic equally, but that
isn't net neutrality.

~~~
nasredin
I think the anti-net-neutrality people are running out of fresh, not yet
discredited arguments.

In OSI model IIRC:

Application (software) is layer 7.

Physical connection is layer 1.

I am sure somebody will correct me.

~~~
pbhjpbhj
Not sure who you're claiming is anti Net Neutrality, but if that's levelled at
me it's quite wrong.

See my other comment in the thread, but in short: I don't think users really
care that the action happens at a different level of the conceptual OSI
model; they care whether they can consume a given piece of media. If NN
legislation shifts things so that the browser blocks media from a particular
server instead of that server's upstream ISP, I don't think users are going
to be applauding much.

~~~
brokensegue
They should be applauding though. We can always fork Firefox if it's not
acting in our best interest. We cannot fork Comcast.

~~~
ankushnarula
Sure we can:
[https://en.wikipedia.org/wiki/Breakup_of_the_Bell_System](https://en.wikipedia.org/wiki/Breakup_of_the_Bell_System)

~~~
brokensegue
Well, "can't" vs. "won't" is sometimes hard to distinguish. There's little
practical difference here, though.

------
jjordan
This doesn't seem like it should be built in at the browser level. The
browser's job should be to process requests and render webpages as quickly as
possible. If a user wants to intervene in that process, that is their
prerogative, but it definitely should remain at the plugin level.

~~~
bzbarsky
I agree the browser's job should be to process requests as quickly as
possible.

If the web page asks for 150 scripts to be loaded (e.g. that's what
[https://www.nytimes.com](https://www.nytimes.com) does right now), what is
the fastest way to load them? "All in parallel" is not the right answer, so
you end up prioritizing. At that point, maybe you want to prioritize the 40
non-tracking scripts (how many get loaded if I enable tracking protection in
Firefox on that site) over the 110 tracking ones.
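The prioritization described above can be sketched as a toy two-tier scheduler: dispatch non-tracking requests first and hold tracker requests back until that tier drains. This is only an illustration of the idea (the tracker set below is hypothetical), not Firefox's actual network scheduler.

```javascript
// Toy two-tier load scheduler in the spirit of "tailing": non-tracking
// requests dispatch first; tracker requests are held until that tier drains.
function scheduleLoads(urls, isTracker) {
  const normal = urls.filter((u) => !isTracker(u));
  const tailed = urls.filter(isTracker);
  return [...normal, ...tailed]; // dispatch order
}

// Hypothetical tracker set; Firefox actually consults its Tracking
// Protection list.
const trackers = new Set(['www.google-analytics.com']);
const isTracker = (url) => trackers.has(new URL(url).hostname);

console.log(scheduleLoads(
  [
    'https://example.test/app.js',
    'https://www.google-analytics.com/analytics.js',
    'https://example.test/ui.js',
  ],
  isTracker,
));
// analytics.js ends up last, after app.js and ui.js
```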

------
abalone
Problem here is Firefox is making big and sometimes wrong assumptions about
what these so-called "tracking domains" do. That's not in any W3C spec --
that's Mozilla attaching specific assumptions to specific parts of the
Internet. Case in point:

 _> One example is Google’s Page-Hiding Snippet, which may cause a web page to
be blank for whole 4 seconds... Both the analytics.js and the test script are
loaded from www.google-analytics.com, a tracking domain, for which we engage
the tailing delay._

Clearly if a domain is serving up an A/B test script then it is doing more
than tracking.

 _> Simply said some sites may need to be fixed to be able to adopt this
change in scheduling._

I have a problem with the term "fixed" here. Seems like Firefox's assumptions
are what's broken here.

------
privacywall
I built PrivacyWall to block all Firefox telemetry urls. It may make your
browsing even faster since it blocks all unwanted data collection that happens
in the background at the OS level. That means it is more effective than an
extension running on top of Firefox, and Firefox cannot surreptitiously send
data without your knowledge anymore. If you are working on sensitive projects
and this is something you are worried about, I am making it available for free
for non-commercial users at
[http://www.privacywall.org](http://www.privacywall.org)

I received a lot of emails from loyal Firefox users telling me they are
worried about their privacy when using Firefox 57 after the Looking Glass
debacle, so I decided to make it available for free. If there are tracking
domains or sites you think should be blocked due to suspicious behavior, tell
me the URLs and I will evaluate them for inclusion. Please feel free to
submit them as a comment in this thread or via the form submission field on
the PrivacyWall homepage.

~~~
Vinnl
Fun fact: you can turn off telemetry yourself in Firefox. And it's open
source, so you (or someone else) can check that it's actually off.

~~~
privacywall
For the sake of Firefox users, I hope you are right. There have been
complaints that it flips back on after Firefox updates, so your privacy is at
the whims of Firefox.

Fun fact: Firefox just pushed out the Looking Glass add-on to users without
notice or consent this past weekend.

~~~
Vinnl
In the case of it flipping back on, you're at the whims of bugs - just like
with all software. If there's any organisation I'd trust, it's Mozilla - if
only because if they make mistakes, there is a lot of pressure for them to
correct this (case in point: Looking Glass).

Note that Mozilla pushes code without explicit consent for all parts of it all
the time - they're called software updates. The problem in this case was that
it was for a potential feature that very few people cared for, and that it
showed up as a scary extension in the extension list. That definitely should
not have happened, but it's not a privacy violation.

------
cookiecaper
This is the ground on which the war against Google, Facebook, et al is fought,
but open-source solutions like Firefox are going to need to find a better way
to monetize if they want to win. All this does is firm up the resolve of user
identity resellers like FB and Google to see that Firefox gets destroyed, and
it demonstrates why tech eventually all boils down to platform control.

With Facebook code intruding into every nook and cranny of the web via React
and Google's position as digital Sauron, monitoring and parsing out your
search history, watch history, all phone activity, emails, SMSes, and more,
user identity resellers are a formidable foe, and Mozilla is in its familiar
position as the David of noble, non-moneyed interests challenging the user-
hostile Goliath.

Firefox may find unlikely allies in Microsoft and Apple, since these two
companies still make at least _some_ money selling an actual product instead
of just slurping up information about their users and repackaging it. The best
thing Mozilla could do is convince Apple and Microsoft to give up their
independent lackluster browser implementations and ship Firefox as the default
instead.

~~~
marten-de-vries
I agree it would be great if other browsers implemented this as well, but you
lose me at:

> The best thing Mozilla could do is convince Apple and Microsoft to give up
> their independent lackluster browser implementations and ship Firefox as the
> default instead.

Multiple independent implementations are of vital importance to the web. We've
seen it with IE6 before, and having two browsers is cutting it too close.

~~~
cookiecaper
The "independent implementations" aren't worth much if nobody uses them.
Chrome already has a majority of the market share at ~60%, with Safari, its
nearest competitor, trailing far behind at ~15%. Firefox is sitting at 9.3%.
[0]

Combining IE/Edge, Safari, and Firefox still leaves Firefox at half of the
market share of Google, and that's still a distinct disadvantage, but it's a
much stronger fighting position than single-digit market share. Google and
Facebook are not naive upstarts and it won't be easy to quash them, especially
not when little old Mozilla is running on comparative fumes compared to a
couple of the best-capitalized companies on the planet.

Google and Facebook's interests are aligned as both base their business model
on profiling and reselling data derived from user behavior, so it's unlikely
that Google will feel inclined to implement desired user protections against
that business model.

It's very important that we have strong competition representing a diversity
of interests, especially when the dominant player's business model is just
"slick spyware". Without a competitor in the same league, consumers don't
really have an option.

[0]
[https://www.w3counter.com/globalstats.php?year=2017&month=11](https://www.w3counter.com/globalstats.php?year=2017&month=11)

------
peterwwillis
Well, <expletive>. Looks like I can't rely on either Chrome or Firefox not to
<expletive> with websites.

I would really love to just stop using the web entirely and go back to Gopher.

~~~
marten-de-vries
From the article:

> Scripts are delayed only when added dynamically or as async. Tracking images
> are always delayed. _This is legal according all HTML specifications_ and
> it’s assumed that well built sites will not be affected regarding
> functionality.

(emphasis mine)

~~~
peterwwillis
I don't know about you, but I don't browse the World Wide Implements-The-
Specification-Perfectly Web.

~~~
mquander
If you aren't complaining because it violates the spec, what are you
complaining about? Is any change to any detail about how a browser works
necessarily bad?

~~~
mehrdadn
I understand the complaint. Let me put it this way: how would you feel if your
ISP was delaying your connections to a subset of websites for a few seconds?
It wouldn't violate any specs, as far as I know. But a _lot_ of people have
expressed the sentiment that they don't want middlemen messing with websites.
It's not clear to me that Firefox qualifies as an exception to the rule,
especially if this becomes something other browsers adopt.

~~~
peterwwillis
It's not even a NN-type middleman issue for me, though that is exactly what's
going on here. The bigger problem for me is causing regressions for users. On
top of that they're causing regressions just because they don't like X
traffic, and they're not even exclusively affecting X traffic - they're
affecting unrelated traffic too. It's just an incredibly arrogant, annoying,
bad thing to do to users who never requested this to begin with.

~~~
yjftsjthsd-h
Should only affect pre-broken code. It's like complaining that a compiler did
something with undefined behavior other than what you wanted: I get that it's
annoying, but maybe fix your code so it's not a problem?

~~~
peterwwillis
This isn't even a broken code issue. This is a totally unnecessary
functionality regression issue. Instead of just loading a page, they're
waiting four seconds to load the page, because the page uses an asset on a
domain they flag as a tracking domain.

This is like if the compiler generated loops with 4000ms sleeps because the
app links a library the compiler thinks is annoying.

Technically the compiler never said it _wouldn't_ add random sleeps into
loops. _It's totally in spec! What's the big deal?_

Meanwhile, my app is slow now. Or in the case of some apps, actually broken
for active use cases where it used to work fine. Which, again, is a total
regression by any QA standard.

~~~
bzbarsky
> they're waiting four seconds to load the page

You make it sound like Firefox is just adding a wait for no reason.

The reality is that the page is asking Firefox to download dozens or hundreds
of scripts [1]. Firefox needs to prioritize those loads somehow, because it
generally doesn't want to open that many connections to the server in
parallel. So it prioritizes the non-tracking bits over the tracking ones. If
all the non-tracking bits are done loading, the trackers start loading at that
point.
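
The prioritization described above can be sketched as a two-level queue. This is a toy model under my own assumptions, not Firefox's scheduler (the real implementation also caps parallel connections and uses a timeout, both omitted here):

```javascript
// Toy two-level load scheduler: non-tracking requests always drain
// first; tracking requests only start once the high-priority queue
// is empty. Nothing is dropped, only reordered.
class LoadScheduler {
  constructor() { this.high = []; this.low = []; }
  enqueue(url, isTracker) { (isTracker ? this.low : this.high).push(url); }
  next() { return this.high.length ? this.high.shift() : this.low.shift(); }
}

const s = new LoadScheduler();
s.enqueue("https://tracker.example/pixel.gif", true);
s.enqueue("https://site.example/app.js", false);
s.enqueue("https://site.example/style.css", false);
console.log(s.next()); // app.js and style.css come out before pixel.gif
```

The tracker still loads; it just waits its turn behind the content the user actually asked for.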

> This is like if the compiler generated loops with 4000ms sleeps

No, it's more like if your OS scheduler decided to prioritize some
applications over others based on how much it thinks you care about them (e.g.
based on whether they're showing any UI, or based on whether they're being
detected as viruses by the virus scanner).

[1] For example, [http://www.cnn.com/](http://www.cnn.com/) shows 93 requests
for scripts in the network panel in Firefox. If I enable tracking protection,
that drops to 37 requests.

Or for another example, [http://www.bbc.co.uk/news](http://www.bbc.co.uk/news)
has 67 script requests and only 20 with tracking protection enabled.

Or for another example, [https://www.nytimes.com/](https://www.nytimes.com/)
has 150 script requests and only 40 with tracking protection enabled.

------
rotrux
:[ I'm afraid this might be a "too-little-too-late" attempt to stave off the
downfall of net-neutrality.

Net-Neutrality means that the financing of the internet is driven by site-
owners receiving funds from advertisers (or from subscriptions... but that
seems comparatively negligible).

Getting rid of net-neutrality will mean ISPs (despite their common-carrier
designation) will control this whole model... meaning they will control the
internet.

The largest argument in support of repealing net-neutrality involves taking
power away from advertisers...so does this Mozilla policy.

~~~
Rudism
I'm not sure what article you read, but this change in Firefox is simply a
change in the timing of requests to certain domains that are known to be for
tracking/analytics in an attempt to improve the load time of websites.

If anything this move is anti-Net Neutrality since it's prioritizing non-
tracking domains over tracking domains as opposed to treating all domains
equally.

~~~
chowells
It cannot possibly be anti-net-neutrality, because it is an action being taken
by the user agent, not a transport.

~~~
rotrux
@Rudism actually convinced me you're correct (in spirit) unless there is some
other piece of information we're missing. However, a client application is
not the same thing as a user agent. Just because their software is on my
laptop does not mean I'm in control... What makes you think client-side
requests preclude this being about net-neutrality?

Latency is additive... regardless of what layer we're talking about. Increases
in transport latency can be compensated for at the physical or application
layer... time is time; the OSI model isn't involved.

