This was one of my favorite examples of how there was much more to NoScript than most people assumed, and it had a depth of features that could not be matched by "alternatives" like uMatrix. But that feature was killed by the mass extension breakage in Firefox 57: https://github.com/hackademix/noscript/issues/133#issuecomme...
So in a way, this feature can be seen as more than three years overdue.
The truth is everyone trusts these particular extensions, and so they should have more privileges and deeper integration. For example, Google's new extension APIs actually make a lot of sense: they allow extensions to do useful things without actually looking at user data. This is a big improvement and it should be imposed on all extensions in their store. It's just that uBlock Origin is so important that it shouldn't be subjected to these limitations. That's why I say it might as well become a browser feature.
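To illustrate the "useful things without looking at user data" point: the APIs in question are presumably Manifest V3's declarativeNetRequest, where an extension declares its blocking rules up front and the browser enforces them, so the extension never sees request contents. A minimal rule set might look like this (the tracker domain is a placeholder):

```json
[
  {
    "id": 1,
    "priority": 1,
    "action": { "type": "block" },
    "condition": {
      "urlFilter": "||example-tracker.com^",
      "resourceTypes": ["script", "xmlhttprequest"]
    }
  }
]
```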
I think I'd pay 50/year or so for Firefox, even if Chrome is free. In fact I already donate about half that. But the point, of course, is getting a lot of people to do that so they can achieve independence from Google.
I'm already paying with all the data Mozilla and Google extract from me. In fact, they should pay me. I haven't asked for or wanted any of the new browser features which have appeared since, say, 2001. They all serve somebody else, not me.
> I haven't asked for or wanted any of the new browser features which have appeared since, say, 2001.
Netscape 4.80 is available for download here: http://www.oldversion.com/windows/netscape/
IE 6 is available here:
Yeah, what's up with these experiments? Sometimes I run htop and notice lots of Chromium processes with field-trial-id parameters. I don't think I ever signed up for this!
I have donated to the Servo project under the Linux Foundation, and I'll possibly pay for Firefox VPN when it becomes available if I know the money goes to the corporation and not the foundation. (Yep, weird, but the corporation is where the browser gets developed, and money only flows from the corporation to the foundation, so if I want to support the development of the browser I guess that's how it has to be.)
IMO companies that aren't getting direct money from ad businesses deserve my donations more.
There's no guarantee that reaching independence from Google will stop Firefox from getting Google money, disabling features that made Firefox different or cutting jobs.
Firefox development is done by the Mozilla Corporation (which doesn't accept any donations, AFAIK). A part of its earnings is given to its owner, the Mozilla Foundation, which you are encouraged to donate to; that may not be a bad idea, but it does not finance Firefox development in any way.
Or to put it another way: browser developers cannot imagine every possible use case that may come out of browsers, nor are they always the best judges of what is important and what is not. It is just a matter of limited human imagination. The combined imagination of all potential extension authors is much greater than the combined imagination of whoever makes decisions about the features in a single browser, and extension authors do not have to convince anyone to add those features to the browser; they can just throw them at the wall (users) and see what sticks.
For a similar case, see X11 vs Wayland, where the latter has to add application-specific extensions for functionality that programs built on the former have relied on since practically forever.
I agree. I'm not saying we shouldn't have extensions. The entire ecosystem should be healthy, varied and with a low barrier to entry. I'm saying some extensions turned out to be so incredibly important that they really ought to be installed by default for every user. The only thing that stops uBlock Origin from being a browser feature is the fact it is an extension.
I installed uBlock Origin not only in my own browsers but also in the browsers of every single computer I have ever used. Sometimes people even comment on how much nicer the whole web browsing experience has become and they can't explain why when I ask them. People also seem to magically become immune to malware since malicious ads are no longer being shown and malware domains are being blocked.
When an extension has such an immensely positive impact on your users, browser developers need to recognize that fact and integrate it into the browser. At the very least they should ship the extension with the default browser package.
Open source has an advantage when it sets itself up as basic infrastructure that can be tailored to many roles. It is notable that Brave, founded by a former Mozilla CTO with extensive experience there, went with Chromium as its browser base, for whatever reason.
Maybe if Firefox hadn't damaged its extension ecosystem, Brave's niche could instead have been filled by extensions. Who knows. The former userbase has been delivering powerful votes of no confidence in Firefox for a decade now.
That'd be great!
Someone here (long-ago thread) suggested uBlock Origin but it doesn't come anywhere near the functionality of uMatrix.
I'll continue using uMatrix and it continues to work perfectly but if Mozilla ever breaks it with incompatible changes, I'm at a loss what to do. Keeping fingers crossed it works for a long time.
I'd be happy to pay substantial money for something like uMatrix.
Eg my static filters start with:
Eg the first rule in the GH section says that github.com is allowed to make websocket and XHR requests to s3.amazonaws.com. If that line wasn't there, the very first line's rule would've blocked it.
Notice that 1p JS appears to be enabled by the fourth line, but I actually have dynamic rules to prevent JS by default, unless enabled per site:
no-scripting: * true
no-scripting: github.com false
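For readers unfamiliar with the syntax, the static rules described above could be expressed in uBlock Origin filter syntax roughly like this (an illustrative reconstruction, not the commenter's actual list):

```
! Block all third-party requests by default
*$3p
! Exception: allow github.com to make XHR and websocket requests to s3.amazonaws.com
@@||s3.amazonaws.com^$xhr,websocket,domain=github.com
```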
The only thing that uM does and uBO doesn't is cookies, so I still use uM for that.
Which means that this is not a proper replacement.
Not sure if you can easily substitute arbitrary scripts (would probably be placing too much trust into filter lists) but the resource library seems to be quite extensive: https://github.com/gorhill/uBlock/wiki/Resources-Library#url...
No, that just makes the name more confusing!
But it experienced the best kind of scope creep: it gained the ability to block other dangerous web features (eg. Flash and other plugin objects, web fonts, etc.), gained features to make life easier when blocking scripts (ie. the surrogate scripts feature), gained other security features for blocking evil actions by the scripts that are permitted (XSS blocking, clickjacking protection), and helped pioneer some security measures that weren't related to scripting (HSTS, ABE as a precursor to and superset of CORS).
This is a bit of a PITA when one tries to make the site "work well" with JS both enabled and disabled; or provide _alternatives_ for when the user-agent isn't running JS.
Those work really well when the user-agent is blocking JS globally, but not for NoScript: broken behaviour everywhere.
But if you are doing progressive enhancement correctly you should not need any <noscript> so this becomes a moot issue.
It's not like you downloaded a Mozilla executable one day and expected to see a Flaming Canine instead of a web browser.
Edit: Not you specifically, but someone.
It's whack-a-mole, but better whack-a-mole than learn-to-love-the-mole.
Another way to think of it, besides the futility of whack-a-mole, is that it's pushback, resistance, sand in the gears. It's making an undesired behavior less valuable. Yes, you didn't stop sites from including analytics; yes, tomorrow Google will have some counter-move; but that doesn't mean the effort was pointless. If you can exert a 5% pressure on some system and maybe only get a 5% reaction, that's perfectly fine.
It can also be used like the traditional GTM model, where it loads the primary GTM script browser-side, then that loads additional browser-side scripts based on the tags you implement (GA, Facebook, chat systems, map widgets, whatever). But the default GA support built into it avoids the browser loading anything from Google's domains directly. And it isn't even subject to the CNAME-cloaking protections that ITP has implemented, since it isn't using the "CNAME to third party" technique that's typical for these sorts of things to gain first-party access/privileges; it's actually running on your infrastructure.
I imagine the opposite is true, in that they hold so much power they can do as they please.
The GA specific shim:
Folks advocating for the use of self-hosted analytics instead of GA are correct ... but that's not what most people will do. It's just simpler to add a one-line GA tracker to your code and call it a day. And these people will see Firefox usage drop to 0.
We have already seen “this site works best/only on Chrome”, especially on Google products like Inbox. Expect to see more of that as the web becomes a Chromium/Safari duopoly, according to analytics.
I get that people would like to know as much as possible about who visits their website. Sometimes even for legitimate reasons, and not just out of an obsession with collecting as much data as possible. But this analytics madness has gone too far. Pretty much every website you visit ships a bunch of data about you to multiple third parties, often without consent. Just stop doing that. It's not a hard thing to do.
No, but it’s what people use. Let’s not ignore reality.
Bad developers already only test in Chrome regardless of what GA is telling them. This won't have much impact there.
Sarcasm aside, sites breaking or not working when analytics scripts are blocked is nuts. Is there a Wall of Shame for such sites (it would probably be the size of a search engine index)?
Imagine if calling into a library randomly failed in Python. Or random apps directly inserted data into your SQLite database. Or users regularly injected code into your iOS at runtime to remove or change views.
One classic example we ran into was DOM that we'd just rendered suddenly had a different structure because Google Translate would insert new DOM nodes. So after a.appendChild(b); b.parentNode would be some random value instead of a. As a coder that's hard, you need some certainty to build on top of.
Experienced devs can develop intuition about stuff that breaks. But it's hard to be exhaustive. And there isn't a great deal of tooling available for fuzz testing this kind of stuff.
Gosh, this continues to be one of the problems I have with all of the major frameworks. Rather than assuming the DOM is a mutable, shared resource like it actually is, they treat unexpected DOM changes as undefined behavior, and will usually break at the slightest attempts by browsers or extensions to help the user. The Google Translate issue is still largely unsolved, and I'm always frustrated whenever I attempt to translate a blog post on development from Chinese and find out that it's not working because they've decided to render it client-side with React.
Expecting every website developer to code defensively for every single operation is unsustainable. A better solution might be to just build APIs for common cases; like creating new nodes that are anchored to existing DOM nodes.
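One defensive pattern along those lines is to re-check and repair anchoring before relying on it, rather than assuming the DOM is still as you left it. A minimal sketch (using a stand-in node class so it runs outside a browser; real code would use actual DOM nodes):

```javascript
// Stand-in for a DOM node, so the sketch runs outside a browser.
class FakeNode {
  constructor() {
    this.children = [];
    this.parentNode = null;
  }
  appendChild(child) {
    if (child.parentNode) {
      const siblings = child.parentNode.children;
      siblings.splice(siblings.indexOf(child), 1);
    }
    child.parentNode = this;
    this.children.push(child);
    return child;
  }
}

// Re-attach `child` under `parent` if something (e.g. Google Translate)
// has re-parented it since we rendered. Returns true if a repair happened.
function ensureAttached(parent, child) {
  if (child.parentNode !== parent) {
    parent.appendChild(child);
    return true;
  }
  return false;
}
```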
That used to be more of a thing. A big iOS app I was indirectly involved in eventually added jailbreak detection. Not because they wanted to block it, but to log it so they could track down some fun bugs caused by random tweaks that changed the UI in impressively hacky ways.
window.ga && ga("user likes socks");
Obviously if jQuery is critical to your site working at all there’s not much you can do, but for any dependencies that are not critical or only critical to a small portion of features, it’s a much better UX to degrade only those features.
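That guard can be generalized into a tiny wrapper (a sketch; the call signature is illustrative), so a blocked analytics script can never take core features down with it:

```javascript
// Call analytics only if it loaded; never let its absence (or a throw
// from inside it) break the feature that triggered the call.
function track(...args) {
  try {
    if (typeof window !== "undefined" && typeof window.ga === "function") {
      window.ga(...args);
    }
  } catch (e) {
    // Analytics is non-critical: swallow and move on.
  }
}
```

Usage: `track("send", "event", "user likes socks");` is a silent no-op when ga is blocked.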
It's like missing a test case that covers 15% of your user base.
I have no idea. They don't show up in my analytics...
Check your own metrics. It's for sure not zero.
It's actually reasonable for sites to be able to estimate the proportion of their population who block their analytics by looking at say, the proportion of signups or conversions or sales or whatever that come from 'untracked' sessions. But that is confounded by the fact that the population who uses ad and script blockers is not necessarily similar in behavior to the population who don't.
If 2% of signups to my newsletter come from sessions that don't show up in google analytics, does that mean 2% of my site traffic is using an ad blocker, or are they actually 10% of my site traffic - but those users are just 5 times less likely to give me their email address?
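That back-of-envelope relationship can be written out explicitly (illustrative math with an assumed relative conversion rate, not real measurements):

```javascript
// Back out the true share of untracked traffic from the untracked share
// of signups. If f is the untracked traffic share and untracked visitors
// convert at k times the tracked rate, the untracked signup share is
//   S = f*k / (f*k + (1 - f))
// which solves to f = S / (k*(1 - S) + S).
function untrackedTrafficShare(signupShare, relativeConversion) {
  return signupShare /
    (relativeConversion * (1 - signupShare) + signupShare);
}
```

With 2% of signups untracked and those users assumed to convert at a fifth of the rate, the true untracked traffic share comes out around 9%, matching the intuition above; with equal conversion rates it is simply 2%.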
I'm not just talking "can't log in"/"add to cart" break (obvs), but like, fail-to-catch-the-exception-thrown-by-localStorage-in-render()-so-completely-blank-white-page break.
Now I work around the terrible exception-throwing behavior of localStorage by leaving cookies on, but using the Cookie Autodelete extension.
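A wrapper along these lines avoids the blank-page failure (a sketch: in browsers, blocked storage makes `window.localStorage` access throw, and this falls back to an in-memory map instead of crashing the render):

```javascript
// In-memory fallback for when localStorage access throws
// (e.g. storage/cookies blocked by the user or an extension).
const memoryStore = new Map();

function safeGetItem(key) {
  try {
    return window.localStorage.getItem(key);
  } catch (e) {
    return memoryStore.has(key) ? memoryStore.get(key) : null;
  }
}

function safeSetItem(key, value) {
  try {
    window.localStorage.setItem(key, value);
  } catch (e) {
    memoryStore.set(key, String(value));
  }
}
```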
This is wrong and will break things. If there are bad behaviors, like the cookie usage, then changing the rules to prevent them is great; but adding special cases and replacing selected scripts is a horrible way to go.
The first reason is that Google will obviously try to work around that rule by changing its script, or with nasty tricks like script proxies.
The second is that if Google Analytics is blocked by name, then other tracking services will take its place, and users will lose anyway.
They had private implementations only supported by Google webservers and Chrome.
Or maybe just a publicity sham since Firefox by default already sends all the links we visit to Google.
You can argue it is bad engineering, but it isn't exactly the tag manager's fault any more than it is the CDN's fault.
Please don't suggest it loads other meaningful things.
May as well claim torrenting is used for downloading Linux ISOs, so it's not a piracy problem.
May as well claim the internet is used for something other than piracy so it's not a piracy problem.
See how silly that is?
May as well claim it's silly to not try to prove negatives.
B) Here is another computer communication protocol that can be used for many things. One of those things is infringing copyright.
The internet (the IP protocol) and the BitTorrent protocol built on top of IP are each described above.
Differentiate A from B, identifying which is IP and which is BitTorrent.
This demonstrates the silliness of the argument made. Either you have principles and rules applied equally or you don't. I'm very, very much for the former, it being an utterly essential foundation of a functioning democracy, the rule of law, and opposition to the tyranny of government by whim.
It's used to add tracking and advertising. But I've also seen it used for chat bots and chat agents. I've even seen it used for bug fixes that designers wanted to get out the door quickly.
But I also have gtm black holed.
It analyzes your log files instead of client-side tracking.
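As a sketch of what log-based analytics looks like (a hypothetical parser for the common Apache/nginx "combined" log format, counting successful page views per path; no client-side script involved):

```javascript
// Match the start of a "combined"-format access log line:
// ip ident user [timestamp] "METHOD path ..." status ...
const LINE = /^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+) [^"]*" (\d{3})/;

function countViews(logText) {
  const counts = {};
  for (const line of logText.split("\n")) {
    const m = LINE.exec(line);
    // Only count 2xx responses as page views.
    if (m && m[3].startsWith("2")) {
      counts[m[2]] = (counts[m[2]] || 0) + 1;
    }
  }
  return counts;
}
```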
If you want to try one, I am building https://userTrack.net, you can PM me on Twitter if you need a discount.
My meta-recommendation is actually to opt for zoho's "one" suite, which gives you basically everything they offer for $30/mo/full-time employee: https://www.zoho.com/one/pricing/
I use Firefox because of their strong pro-privacy stance. If they started charging for "real" privacy, it would damage that image - "privacy for those who can afford it" would be a bad slogan.
Also, Firefox has made it clear that they don't want our money, as seen in their continuous refusal to accept donations.
They are absolutely on the right track with Mozilla improving the actual -browser- with all of these new privacy features and core improvements. This could put them in a position to create a revenue stream independent of Google, where people would actually be willing to pay to have a browser wholly decoupled from these ad companies.
Right now there's some kind of equilibrium by having alternative browsers/engines, on Windows especially, plus Google still gets traffic from millions of Firefox users by being default search engine, which makes them $$$.
I look forward to that day - until then, all decisions Mozilla makes are impacted by that fear.
Why shouldn't a site owner know you've visited their site? How will they do their job if they don't know where people come from, what content they enjoy, what devices they should optimize for, the general demographics of their audience, etc.?
These are all things a restaurant owner would know about their customers, for example. But no one seems to have a problem with that.
While I am not as concerned about big tech's data siloing as some, I can see why it's worrying.
Unfortunately, not only is GA the best totally free analytics solution that any marketeer will know how to use, but many ad blockers also nuke ALL analytics scripts, even if they have nothing to do with Google.
That's because you don't need analytics scripts to see if people visit your site; you have the original page request for that. Analytics scripts collect additional information beyond that, which the users who block them have deemed unacceptable.
I think people in general have gotten so sick of ad powered big tech they are having a bit of an over-reaction against analytics in general, not just google's product.
They would just know implicitly by observing their customers who are right in front of their eyes. (At least pre-covid)
The restaurant I visit most often "knows" only my first name plus my mobile phone number. I suppose if they really tried they could probably collect data on my approximate height, build, eye and hair colour, and the fact that I have multiple kids. That's it (since I pay them in cash).
Oddly enough, they don't worry about tracking their customers and instead focus on delivering an excellent product with excellent service. They're known in the region for that, and they're usually busy, so one might think their strategy is working(?)
All together, that's more data than a Google Analytics user knows about any of their visitors.
Also, feels like a bit of an arbitrary boundary.
And why should a random page I visit get to know my demographics, interests and where I come from? How can you portray avoiding that as taking privacy too far?
I don't care if that makes it harder to optimize your business. Find another way or perish.
You don't have a problem with them knowing that.
Of course in any case Google knows and they choose how much they want to tell you.
Nevertheless, I don't feel iffy about my Interests profile participating in aggregate data available to the sites I visit. Since virtually all sites are free of charge, giving some of that insight back seems like a fair trade.
That said, having ALL that data available to Google without anonymization is a bit more worrying, although I haven't seen many examples where it hurt someone in real world.
Web site owners can analyze their web server's log, which has at least client's IP address, user agent, timestamp and the URL. Already too much if you ask me.
> general demographic of their audience
This is not useful for improving a product unless it's combined with proper research into that demographic, which most people don't do. They just apply their own biases and make their product _worse_.
So many people are making all their decisions based on shallow data like this and never do a simple usability test that yields massively more impact.
Put differently, people use this data to try to focus in on specific traits of their audience before even testing that their software works for "humans".
Also, you have zero control of what code any client executes on their machine. Zero say, whatsoever.
So what about sites that claim that certain cookies are necessary for operation of the site, when that's at best bending the truth and at worst an outright falsehood?
Many sites work perfectly well and - amusingly - become blazingly fast once you block all scripting and cookies. No annoying GDPR notices, no annoying ads, no (client-side) tracking. So much for "necessary" cookies :/
Also if you have an Android phone, you could try to install LineageOS without gapps, or go with /e/.
If you use Drive, Sheets etc, you could try ONLYOFFICE. They use GA, but your decked out Firefox should already block that.
For more alternatives, there are different resources that try to be helpful, like this one:
you have to keep a safe distance from computers/smartphones in order to effectively avoid it.
I mean, devs will see a Firefox market share way lower than it actually is because of this, and will stop bothering with FF?
Or am I misunderstanding what this feature does?
I guess the reason to block GTM until load is that it's used to show some personalized ads/pricing/buttons.
Google Analytics can't cross-reference data from other sites in the browser, because it's not a third-party cookie now... what's the problem?
One DNS lookup you’re doing anyways.
Power to the people
Google Analytics doesn't have anything to do with building an advertising profile around users, correct?
I know it's popular to hate on Google but does this achieve anything against tracking users across sites in order to build advertising profiles? I was under the impression all the profile-building people object to was done via the pages with ads themselves.
Is Google Analytics actually an evil tool? Or is just "evil by association" because Google ads track users across sites, and Google Analytics also does "tracking" albeit a different kind? I'm just wondering if this is actually anything substantive, or if it's more symbolic.
Edit: wow those were some FAST downvotes. I'm just asking some basic questions to understand how meaningful this is, folks. Hopefully nobody's taking offense.
Even if it didn't, it still tells Google what you're visiting: with so many sites using it, they can get a pretty much complete view of your browsing history, just like Google Fonts.
See that's the thing, I keep seeing this asserted but when I search for any evidence, I can't find a single article that demonstrates this to be true.
If you own multiple sites you can enable analytics across them, but that's all.
And if Google wants to know what you're visiting to build advertising profiles, they have so many options -- not just Fonts, but DNS, Chrome, ads... it's not like GA by itself is making any substantive difference. But again, just because it could be used for this doesn't mean it is.
So I don't get how this is actually helping. I worry it's a distraction from actual achievements.
Nope, nothing will break. I am blocking GA in the following ways: NoScript, PrivacyBadger, Windows HOSTS file. I see the thing being called, and nothing gets through, and websites work properly.
Edit: the bugzilla article mentions both GA and googletagmanager.com, which (both) I have been successfully blocking in the above ways for many years. I never had any website not working because of those two pieces.
Someone built the unsubscribe mechanism in a way that FORCES the user to be tracked by GA. That someone is... Xfinity? By Comcast? Hahahaha!!
One.Website. So we should yield to those **(profanity)? That one website does not deserve our respect.
Edit: "we are scum. We want to ** your privacy any way we can. We just got hit with a multi-million fine, so we will continue. You want to unsubscribe? Sure, be tracked a bit more while at it." /end-rant