> Chrome and Safari are the two most prevalent browsers in our data, with Chrome being associated to about 43% of the ad transactions and Safari to about 38%. About 73% of the ads shown on a Safari browser do not have a cookie associated, whereas on Chrome this is the case about 17% of the time.
> The difference is probably due to different default tracking settings across the two browsers, with Safari impeding, by default, third-party tracking cookies being set on the user’s machine (the user has to explicitly allow the usage of third-party cookies)
Google gets most of its revenue (~70%) from its first-party sites, and products like AdSense could be made to work without cookies; given Google's size in the market, publishers would switch to whatever ad-embedding format it required.
But smaller ad networks won't have that power, and don't have huge first-party sites either. So if Google jumps on this bandwagon in Chrome, it could be accused of doing it to strengthen its own position, the same way adopting Apple's extension/ad-blocking restrictions in Chrome led people to accuse it of trying to sabotage ad blockers, rather than trying to rein in a toxic hellstew of malware from overly permissive extensions.
That being said, I defended Google’s choice of implementing ad blocking extensions using an approach similar to Apple’s because I inherently don’t trust random third party extension makers that can intercept all of my web browsing. I also don’t trust VPN providers to protect my privacy but that’s a rant for another day.
I’m very careful about what gets installed on my work Windows laptop, and I rarely use my home computer. I’m usually on my iPhone or iPad; there, I install any random app because I know the permission model only allows so much.
(Usual disclaimer: Googler here, opinions are my own.)
Why doesn't Firefox block all third-party cookies by default? That would be a huge win for privacy.
Yes, some sites would break. But if Apple can do it with Safari, Mozilla can do it with Firefox.
Be brave! Do it!
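For what it's worth, you can already flip this yourself: Firefox exposes its cookie policy as a single about:config pref, settable via a `user.js` file in your profile directory. (Pref name and values match recent releases as far as I know, but they do change between versions.)

```javascript
// user.js — dropped in the Firefox profile directory, applied at startup.
// network.cookie.cookieBehavior values:
//   0 = accept all cookies (the historical default)
//   1 = block all third-party cookies (what the parent is asking for)
//   2 = block all cookies
user_pref("network.cookie.cookieBehavior", 1);
```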
Vista's added security measures like UAC (https://en.wikipedia.org/wiki/User_Account_Control) broke a lot of poorly coded Windows applications that didn't bother to follow the rules of the platform.
Who did the users blame when those apps broke? Microsoft.
Why? "It worked fine until I upgraded to Vista!"
I think they did improve the UAC situation with Vista SP1, if memory serves.
I seem to recall Vista was way more sluggish than 7 on the same machines.
But then again, I guess I'm not an average user.
The surveillance internet was allowed to evolve through a lack of foresight by the standards authors in their desire to build a rich web. We should put an end to it as soon as possible. The browser should not leak unique identifiers in any situation, and all standard browsers should be as fingerprint-resistant as the Tor Browser.
In my view, leaking data that is critical to the user's privacy is as bad as, if not worse than, an improper implementation of TLS. It takes only two modestly skilled, collaborating webadmins (one running a "shameful" site and one running a site used with authentication) to possibly destroy a person's life, reputation, and family. A practical attack on SSL 2.0 is orders of magnitude more difficult.
Let's put responsibility for privacy back where it should be: on the browsers and web standard authors.
[Whitelist INC saasapp.com From mydomain.com]
Or they would just not support Firefox.
Sadly, I think the moment has passed for Mozilla to make moves like this. They don't have enough marketshare.
So from that perspective, how is ad-tech a problem for privacy?
Safari has enough market share that web developers must work within Apple's guidelines if they want their websites to work on iOS.
On a desktop, if Firefox breaks, people will just switch to Chrome or something else. Plenty of people want things to "just work" out of the box.
Maybe the better solution is to have a guided install process. When installing firefox, ask the user if they want it to "just work" or be "privacy conscious"? Warn them of the tradeoffs of both and let them decide once.
I suspect we're just training people to click the Whatever button. As that's most “normal” people's response (I am not normal), the courts will eventually rule that a first-use roadblock doesn't produce informed consent, so doesn't excuse you under the GDPR. Hopefully, then, websites will stop performing a superficial impression of caring about privacy.
(Explicit informed consent is the GDPR's last-resort excuse, which you only need if your use of personal data falls outside all of the GDPR's automatically-acceptable justifications for using and storing personal data.)
Personally, I always hoped they would simply copy Privacy Badger and develop an algorithm, instead of relying on a blacklist.
As an alternative approach I would suggest empty settings to begin with, forcing the user to think about their preferences on first use.
This gives me the impression that Mozilla is trying to pull in a bunch of new recruits. Also, does this mean upgrades or repeat downloads will not have this privacy by default?
HOW MANY TIMES MUST I UNCHECK WHAT TO SYNC TO MY ACCOUNT? YOU WOULD THINK THAT IS SAVED PERSISTENTLY.
I think my quality of life on Firefox would be improved greatly if a notice popped up saying some of my settings were reset to defaults because of breaking (or minor) changes. Like they give a crap.
If Mozilla knows that, then Private Browsing Mode isn't as private as it could be.
It'd be a lot more acceptable if there was an option to show me "This is the exact telemetry payload we want to send to Mozilla." And even then you are taking someone's word for it that there isn't some other piece of data hidden in a hash or something, or that the browser isn't secretly sending data.
I'm not quite paranoid enough to do the full monitoring of all network traffic, but how do I reasonably know what is going on without listening to traffic/watching memory at all times? In the end, I'm trusting a faceless corporation that is attempting to put on a facade of trustworthiness.
The only trustworthy computer is an unnetworked one.
There is! Navigate to about:telemetry in Firefox.
Actually you aren't. It's quite easy to build your own Firefox and use that, or use say Ubuntu's build if you trust them more. You wouldn't want to audit the Firefox source code yourself, but if Mozilla was intentionally sneaking backdoors into the source code, someone could (and probably would) eventually find that out and you'd be able to verify it by examining your source archives... which means Mozilla would have to be stupid/crazy to try that.
At the end of the day, if you want Mozilla to make rational decisions about which features to implement, they need to collect some data. That fact alone can't be a problem for anyone who wants Firefox to be a competitive browser.
Eventually, you have to just get on with your life and trust people.
I can certainly respect that it bothers you—but may I ask what browser you intend to switch to? I'm skeptical that you'll find a better, usable option. That's a sad state of affairs for sure, but also the way of things right now.
By default, Firefox:
- Collects a bunch of telemetry data via several mechanisms and ships them to Mozilla HQ
- Provides Mozilla with remote code execution privileges on your machine via the Shield (or Normandy, or whatever they are calling it these days) mechanism, which can install and uninstall extensions and certificates, change browser settings, etc.
- Uses Google as the default search engine, and search suggestions leak private data to Google
- Uses Google Location Services for their geolocation thingy, which - unsurprisingly - phones home to Google
- Ships closed source third party add-ons
- Comes with a bunch of "about:config" settings configured in sub-optimal ways, privacy wise - battery API enabled by default, accept all cookies by default and so on
Sure, Chrome is worse, but bringing that up is like arguing that your pile of manure is better because it doesn't smell as bad: in the end, you are still arguing about shit.
Mozilla is very up-front about exactly what telemetry data they're collecting and what it's used for; there's even a pop-up when you first install the browser telling you what's collected and how to disable it if you want to. And then when Mozilla makes decisions based on telemetry, like removing features that 2% of people use, the people who disabled telemetry complain that Mozilla is ignoring their opinions.
The optional syncing service is end to end encrypted so Mozilla can't see the data you're syncing.
Shield is a valid complaint, I am not a fan of it being opt-out.
Search suggestions are disabled by default in private browsing mode and probably a feature most people want anyway. Your query gets sent to the search engine when you hit enter either way.
The battery API was completely removed from Firefox two and a half years ago, so that particular complaint is very outdated. Firefox has been blocking tracking cookies by default for a while now too. Stricter cookie policies would just annoy the vast majority of users.
You can see the telemetry data that engineers look at themselves (https://telemetry.mozilla.org/new-pipeline/dist.html). It's not very detailed.
I consider myself relatively technically inclined. When I started using Firefox, I absolutely did not know about
- Normandy as an RCE engine to install arbitrary extensions and customize random settings
- Google Location Services as the location backend
- Which about:config settings I need to change for a reasonable expectation of privacy
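For anyone in the same position, here's a sketch of the kind of `user.js` that addresses the points above. These pref names match recent Firefox releases as far as I know, but Mozilla does rename them, so verify against your version:

```javascript
// user.js — opt out of the mechanisms listed above at browser startup.
user_pref("app.normandy.enabled", false);                     // disable Shield/Normandy remote studies
user_pref("toolkit.telemetry.enabled", false);                // stop telemetry collection
user_pref("datareporting.healthreport.uploadEnabled", false); // stop telemetry upload
user_pref("geo.enabled", false);                              // disable geolocation (and its Google backend)
user_pref("privacy.resistFingerprinting", true);              // Tor-Browser-derived fingerprinting resistance
```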
With a remote code execution engine, someone could hack into their backend and then start running malicious code on thousands or millions of machines. If they compromise a software update, at least there is a chance it can be caught before it gets to me.
That said, every auto-update system is essentially an RCE system. For highly exposed and security-sensitive applications like browsers, the auto-update is a net win in many deployment scenarios.
Telemetry and auto-updates are important enough that having them on by default isn't wildly unreasonable.
The nice thing is that you don't have to ask. You can look for yourself. Mozilla's pretty transparent about what they have and what is in it.
Turns out telemetry is good for things like finding / addressing crashes and seeing if updates have gone out properly.
Also, I seem to recall being explicitly asked if I wanted to participate. But my memory could be failing me.
Once your browser is competitive at those table stakes, only then can you give it a "privacy focus" to differentiate from Chrome.
In reality no-one outside Mozilla is auditing updates (other than black-hats reverse-engineering security fixes to catch the people who don't update immediately). I don't think the situation for other browser vendors is any different.
The privacy points you raise are significant but at no point discredit the very real privacy efforts made by Mozilla and in no way make it comparable to Google.
Mozilla does a lot of good things for privacy, but HN needs to see past their blind loyalty to Mozilla and criticize them where necessary.
Doesn't Apple do the same thing?
I'd like some clarification on that. Does Chrome send automated telemetry reports about my browsing to Google when you are not logged in with your Google account? Does Chrome give remote code execution privileges to Google (yes, via the updater, but that doesn't really count)? I searched but found nothing on the net about Chrome telemetry.
I never use Firefox because it has these horrible defaults, and keeping up on all the about:config switches I need to toggle to be able to use it is just too much.
Yes. And if you sign into Chrome (which now automatically happens if you sign into any Google website, IIRC), it uploads your full browsing history.
> Does Chrome give remote code execution privileges to Google (yes, via the updater, but that doesn't really count)?
I don't see how you could have done honest research into this and arrived at the conclusions you have.
Maybe it's just me, but Google's privacy notice seems a lot more reasonable. And if it's accurate, they ask whether they may collect usage statistics during installation, while Firefox does not.
Either the vendor can remotely change your software and its configuration without user intervention, or it cannot. For any software that supports unattended updates, it can. End of story.
I don't like how the opening line of the article exploits the fact that the average person does not know the cost of an average online ad to make it appear that tracking has basically no value.
>... data about you was transmitted to dozens or even hundreds of companies, all so that the website could earn an additional $0.00008 per ad.
For the reader to be able to accurately understand how much money this is they need to know the percentages.
Very roughly (this varies widely based on the country and website), the average online ad only costs ~$0.0005, so that insignificant $0.00008 is around 10-20%. If the article had presented the exact same information but instead framed it as revenue available to pay employees at an online company dependent on advertising, it would sound very different while really conveying the same concept.
Edit: I read the linked study, and the data they used had an average cost per ad of $0.001, putting the difference around 4%. This is smaller than I would have predicted. I would still rather they had led with this number.
Even if the relationship was - you get the article, we get the above plus we'll collect some bits of information about you that we explicitly list. The ad itself will give us ~$0.00008, and the collected data another ~$0.000007 - that would be ethical imho.
But the real model is: we give you an article, and in return you sign a blank document that allows us to collect all the data we possibly can and to maximize the money we make from it. You step into it today, but we are not comfortable putting any price on this agreement, because we are banking on making more in the future as we tighten our grip on understanding user behavior and improve our ability to monetize it in any way, however unethical.
The reason companies hide the nature of the relationship is that their business models are built around the assumption that data collection will generate increasing amounts of revenue in the future.
And since there's no way for you to understand the relationship, or step out of it, you're entering it at an information disadvantage, and there's no turning back.
I hope you can see how this approach is, by design, hostile to users and to the Internet as a public space.