When it comes to privacy, default settings matter (blog.mozilla.org)
454 points by nachtigall 43 days ago | 127 comments

Mozilla didn't mention this in the article, but the study they referenced had an astounding statistic proving their point about default settings.

> Chrome and Safari are the two most prevalent browsers in our data, with Chrome being associated to about 43% of the ad transactions and Safari to about 38%. About 73% of the ads shown on a Safari browser do not have a cookie associated, whereas on Chrome this is the case about 17% of the time.

> The difference is probably due to different default tracking settings across the two browsers, with Safari impeding, by default, third-party tracking cookies being set on the user’s machine (the user has to explicitly allow the usage of third-party cookies)


[Googler, but this is just my own musing] Here's a theoretical question: if all third-party tracking cookies were blocked, wouldn't that strengthen Google's position in the ad market and weaken all of the third-party ad networks?

Google gets most of its revenue (~70%) from its first-party sites, and stuff like AdSense could be made to work without cookies; given Google's size in the market, people would switch to whatever ad-embedding format it required.

But smaller ad networks wouldn't have that power, and they don't have huge first-party sites either. So in a way, if Google jumps on board this bandwagon in Chrome, it could be accused of doing so to strengthen its own position, the same way adopting Apple's extension/ad-blocking restrictions in Chrome led people to accuse Google of trying to sabotage ad blockers, rather than trying to rein in a toxic hell stew of malware from overly permissive extensions.

I am not a Googler and have been a long-time critic of any business model that is not “I give you money and you give me stuff.”

That being said, I defended Google’s choice of implementing ad blocking extensions using an approach similar to Apple’s because I inherently don’t trust random third party extension makers that can intercept all of my web browsing. I also don’t trust VPN providers to protect my privacy but that’s a rant for another day.

I don't trust random third party extension makers either. That's why I don't go around randomly installing extensions.

Even if you do trust the extension maker at the time you installed it, how many times have malware makers bought formerly trustworthy extensions?


I’m very careful about what gets installed on my work Windows laptop, and I rarely use my home computer. I’m usually on my iPhone or iPad. There, I install any random app because I know the permission model only allows so much.

I generally agree, though I do trust uBlock Origin with the coveted blocking webRequest API. It has some pretty powerful features that I don’t think could be implemented otherwise, block-list limitations aside. I kind of hope Firefox does not get rid of the blocking webRequest API for that reason alone.

(Usual disclaimer: Googler here, opinions are my own.)

What's AdSense going to switch to that works without cookies? And whatever that is, what makes you think Firefox and other non-Chrome browsers won't block it?

Seems like pay-per-click would work without third-party cookies? The redirect is first-party.

Almost all of the privacy issues are about data being collected and used when the user doesn't interact with the ad.

This step is overdue and i applaud Mozilla for doing it. But:

Why doesn't Firefox block all 3rd-party cookies by default? That would be a huge win for privacy. Yes, some sites would break. But if Apple can do it with Safari, Mozilla can do it with Firefox.

Be brave! Do it!

When sites break, people leave Firefox. No amount of explaining or media changes that: the number one (by far) reason people leave a browser is because a site is broken.

Yes, this. Anyone who was around for the launch of Windows Vista could see this effect in action.

Vista's added security measures like UAC (https://en.wikipedia.org/wiki/User_Account_Control) broke a lot of poorly coded Windows applications that didn't bother to follow the rules of the platform.

Who did the users blame when those apps broke? Microsoft.

Why? "It worked fine until I upgraded to Vista!"


Further evidence: Windows 7 wasn't even that much of a change from Vista. The main differences were 1) software had adjusted to deal with UAC and 2) new laptops were more powerful. But most of the good new architectural features had premiered with Vista.

I think they did improve the UAC situation with Vista SP1, if memory serves.

MS changed UAC for Windows 7 so much, in an effort to stop annoying Windows users, that people complained it defeated the point.


The main UAC change in Windows 7 was just that it didn't pop up for changes to settings. But UAC on settings wasn't the problem (in fact, macOS basically has this and it's a non-issue); the problem was UAC on legacy Windows apps that had only been tested on Windows 9x, or on administrator accounts on NT systems.

> Further evidence: Windows 7 wasn't even that much of a change from Vista.

I seem to recall Vista was way more sluggish than 7 on the same machines.

W7 actually has lower minimum requirements than Vista. I've gotten 7 running on plenty of devices that Vista wouldn't touch.

I second this. I had a secondary computer that I switched between Vista and 7, back and forth, for various reasons, and it was much, _much_ slower on Vista than 7.

Most of the anecdotal evidence was comparing a laptop running with years' worth of bloated software to a fresh install. No surprise that Windows 7 came out ahead.

I was referring to my own experience of fresh installs of both.

I don't remember, did Vista let users turn off UAC, as Windows 7 through 10 do?


I've used this setting (blocking all 3rd-party cookies) for years, and the number of sites that are unusable is rather small. I usually go to another site when it happens. I think in the last 2 or 3 years there were only three occasions where I bothered to change the browser settings because I absolutely had to access the broken site. I think Disqus is one of the notable services that is semi-broken due to this.

But then again, I guess I'm not an average user.

Yeah, this exact experiment was already run. The user attrition stats were dire.

Yup. They don't have the advantage of being preinstalled and the default on all Apple machines. They really can't afford to break functionality.

Also, Apple can break sites and people can’t leave mobile Safari. Sites are forced to test on Safari. If a site doesn’t work with Firefox - oh well.

I don't like it, but there is another option, with its own set of trade-offs: force mobile users to download your app.

I strongly agree. There is no legitimate functionality that can't be implemented without 3rd-party cookies. Some sites will break at first, until they are fixed, and we could keep a whitelist for a while, while making it very cumbersome to be added to that list. New sites would simply need to work around the lack of 3rd-party cookies and accept that user tracking is technically illegal.

The surveillance internet is something that was allowed to evolve through a lack of foresight from the standard authors in their desire to build a rich web. We should put an end to it as soon as possible. The browser should not leak unique identifiers in any situation, and all standard browsers should be as fingerprint-resistant as the Tor Browser.

In my view, if you leak data that is critical to the privacy of the user, it's as bad as, if not worse than, an improper implementation of TLS. It takes only two modestly skilled, collaborating webadmins (one "shameful" site and one used with authentication) to possibly destroy a person's life, reputation and family. A practical attack on SSL 2.0 is orders of magnitude more difficult.

Let's put responsibility for privacy back where it should be: on the browsers and web standard authors.

I'm genuinely curious about your assertion. Let's say I have an application for my business and I want to embed a report on one of my pages from a separate application. This other application is a SaaS application, so I can't just "mount" it at mydomain/some-sub-path. How do I have secure reports without third-party cookies? Store cookie-like stateless tokens in my wrapping app and send them in the URL?

Assuming I'm understanding your question correctly, yes, you would have something like:

  <iframe src="https://saasapp.com/report?url=
(spaces added for readability)
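Assuming the SaaS app supports verifying some kind of signed token (everything below is illustrative, not any real SaaS API), the "cookie-like stateless tokens in the URL" idea from the question could be sketched like this:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared secret, provisioned out of band between
# the wrapping app and the SaaS app.
SECRET = b"shared-secret-between-apps"

def make_embed_token(user_id, report_id, ttl=300):
    """Wrapping app: build a short-lived, HMAC-signed token for the iframe URL."""
    payload = base64.urlsafe_b64encode(json.dumps({
        "user": user_id,
        "report": report_id,
        "exp": int(time.time()) + ttl,
    }).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_embed_token(token):
    """SaaS app: check signature and expiry; return the claims, or None."""
    try:
        payload, sig = token.rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return claims if claims["exp"] >= time.time() else None

# The wrapping page would then embed something like:
#   <iframe src="https://saasapp.com/report?token=..."></iframe>
token = make_embed_token("user-42", "monthly-report")
```

No third-party cookie is involved: the token rides in the URL, is scoped to one report, and expires on its own. (In practice you'd use HTTPS-only delivery and an established scheme like JWT, but the shape is the same.)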

Cool, if third party tracking cookies are "defeated", this will be the next step in the arms race.

This site includes content from saasapp.com:

[Whitelist INC saasapp.com From mydomain.com]

[Blacklist saasapp.com]

[Allow once]


That could work well because if the message shows the domain that looks like the content they want to see they will allow it but if its a random tracker domain they will block it.

It's also important to not include [Whitelist INC i.imgur.com], only [Whitelist INC i.imgur.com FROM example.com], because trackers will adapt to exploit the first one as soon as it's common enough to be noteworthy.
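A minimal sketch (in Python, names purely illustrative) of that difference: a pair-keyed whitelist grants a third party access only in the context of the specific first-party site the user approved, not globally:

```python
# A plain set of third-party domains would let a tracker, once whitelisted
# anywhere, ride along on every site; keying on (third-party, first-party)
# pairs limits the grant to the context the user actually approved.
pair_whitelist = set()

def allow(third_party, first_party):
    """Record 'Whitelist INC <third_party> FROM <first_party>'."""
    pair_whitelist.add((third_party, first_party))

def is_allowed(third_party, first_party):
    return (third_party, first_party) in pair_whitelist

allow("i.imgur.com", "example.com")
```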

I'd be happy with that.

If browsers break 3rd party cookies and they are not fingerprintable, the arms race stops: there is no way to track a client across multiple sites, other than weak correlations with IP, interests, etc.

If it's inside your business, you can use subdomains like site1.domain.com and site2.domain.com and set the cookie's Domain attribute to domain.com; then you will not have a problem, because the cookie will be visible across all the sites of your domain.
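For illustration, here's how such a parent-domain cookie header can be built with Python's standard `http.cookies` module (the domain names are placeholders):

```python
from http.cookies import SimpleCookie

# A session cookie scoped to the parent domain, so both
# site1.domain.com and site2.domain.com receive it as a
# first-party cookie.
cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["domain"] = "domain.com"
cookie["session"]["path"] = "/"
cookie["session"]["secure"] = True

# The header a server on either subdomain would send:
header = cookie["session"].OutputString()
print("Set-Cookie:", header)
```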

> New sites would simply need to workaround the lack of 3rd party cookies and accept that user tracking is technically illegal.

Or they would just not support Firefox.

Sadly, I think the moment has passed for Mozilla to make moves like this. They don't have enough marketshare.

You can switch to firefox.

Technically, ad tracking is not an invasion of privacy per se, and does not identify a person. It is always connected to a client, and it is illegal to connect the client's browsing data to a real person.

So from that perspective, how is ad-tech a problem for privacy?

> Yes, some sites would break. But if Apple can do it with Safari, Mozilla can do it with Firefox.

Safari has enough market share that web developers must work within Apple's guidelines if they want their websites to work on iOS.

Plus users do not have an alternative, at least on iOS. They all pass through webkit.

On a desktop, if Firefox breaks, people will just switch to Chrome or something else. Plenty of people want things to "just work" out of the box.

Maybe the better solution is to have a guided install process. When installing Firefox, ask the user if they want it to "just work" or be "privacy conscious". Warn them of the trade-offs of both and let them decide once.

Well, there is a real danger the web would split into two groups: those who care about privacy and use Firefox - so are a threat to several business models that are en vogue in SV right now, and those who use Chrome and are therefore better candidates for milking. Website owners would do everything to discourage people from using their websites with Firefox. Instead of seeing your favorite website in Firefox, you would see "Welcome to Oath family" or similar bullshit.

s/Firefox/Europe/g and this is already happening.

I suspect we're just training people to click the Whatever button. As that's most “normal” people's response (I am not normal), the courts will eventually rule that a first-use roadblock doesn't produce informed consent, so doesn't excuse you under the GDPR. Hopefully, then, websites will stop performing a superficial impression of caring about privacy.

(Explicit informed consent is the GDPR's last-resort excuse, which you only need if your use of personal data falls outside all of the GDPR's automatically-acceptable justifications for using and storing personal data.)

I get what you mean and yes, it would definitely be a good move in terms of showing the way to sane defaults, however I would have a hard time calling it a "huge win" considering Firefox + Safari make up <20% market share. Maybe I'm just being cynical.

Well, i guess i meant a huge win for Firefox users who use default settings.

Seems like they're in for a treat then :


Good question. It would have made everything so much easier. How much work was done on this selective cookie blocking implementation?

Personally I always hoped they would simply copy Privacy Badger and develop an algorithm, instead of relying on a black list.

As of right now their UI to allow third-party cookies is pretty bad. Chrome shows an icon in the address bar for it. In Nightly there is a section in tracking prevention, but it's not in normal Firefox yet.

Safari does not block all 3rd party cookies by default last I checked. It does something a lot more complicated that still allows 3rd party cookies in various cases.

[comment retracted]

It's just cookies recognized to be tracking cookies, isn't it? These companies are earning money by tracking you. They have a strong incentive to bypass these mechanisms.

Oh, hmm, it looks like you're right ("known third party trackers"). That's what I get for headline reading.

(Maybe put "Edit:" in your message; for a minute I thought this was your reply, implying that the parent comment should retract their statement or something, but given the replies to your comment, I'm guessing you want to indicate that you changed your mind.)

When it comes to the greater public, default settings might as well be the only available setting. Apart from a few 'techies' most people will never even touch the default settings out of the naive belief that "the default setting is what's best for me".

As an alternative approach I would suggest empty settings to begin with, forcing the user to think about their preferences on first use.

That would only work if every browser implemented it. For the average user choosing between a blank-slate approach, where they have to parse terminology they don't understand, and an alternative offering "sensible" defaults, I suspect most would just pick the easier latter option.

Perhaps there's a middle ground. Give users a range of options (say 3 to 5) that aggregate the settings, ranging from "I don't really care about privacy" to "I wear a tinfoil hat to bed", along with pointers to where and how they might wish to delve deeper into more detailed settings. It can't be that hard...?

If you put a big scary decision as the first thing users see, many will just close the browser because they don't know what they should pick. When they open a different browser that doesn't present them with that choice, they may conclude that it's not a problem on that other browser.

"Today marks an important milestone in the history of Firefox and the web. As of today, for NEW USERS who download and install Firefox for the FIRST TIME, Enhanced Tracking Protection will automatically be set on by default, protecting our users from the pervasive tracking and collection of personal data by ad networks and tech companies."

This gives me the impression that Mozilla is trying to pull in a bunch of new recruits. Also, does this mean upgrades or repeat downloads will not have this privacy by default?

Probably. This is generally the correct way to handle an upgrade while minimizing breaking changes to the user. Changing a user's settings during an upgrade will erode users' trust in doing so in the future, even if it's "good" for them.

The article covers this, they have further plans for existing installs.

So they're killing the Google relationship? I mean something called "Enhanced Tracking Protection" would have to disable any sending of data to Google (or anyone, except the server as required to get the data requested), surely?!?

Google might not be happy about this move, but historically Google has paid Mozilla to be the default search engine in Firefox, which doesn't require Firefox sending any data to Google (apart from actual search queries, obviously).

There's some discussion of how this ties into Mozilla's mission at https://blog.mozilla.org/blog/2019/06/04/the-web-the-world-n...

The thing that absolutely pisses me off is how I try to be actively aware of what settings I disagree with and disable things I don't like - and then an unseen update resets things to default.


I think my quality of life on Firefox would be improved greatly if a notice popped up saying some of my settings were reset to defaults because of breaking changes (or minor ones). Like they give a crap.

This is a slightly difficult problem, but it has been solved in other places. When I update my Linux machine, sometimes I get a message saying that upstream has changed a config file that I have also modified, and it asks whether I want to keep my version, keep upstream's version, or open the two in an editor.

Not sure why the downvotes; that would sure piss me off. I'm very conscious of what I want synced, and it only takes a single bug like that to throw away all the effort to keep things separated.

Perhaps you disabled settings sync ;)

> In fact, nearly 25% of web page loads in Firefox take place in a Private Browsing window.

If Mozilla knows that, then Private Browsing Mode isn't as private as it could be.

The total amount of loaded pages in Private Browsing doesn't really have to be private?

As long as the data is aggregated and not tied to individual users, I'm ok with it.

Mozilla is going even further and implementing technologies like Prio [0] to provide cryptographic guarantees of privacy for collected telemetry data.

[0] https://hacks.mozilla.org/2018/10/testing-privacy-preserving...

That's exactly what Firefox telemetry data is. It collects things like % of requests over HTTPS, % over IPv6, etc. and sends anonymized stats to Mozilla.

But at the end of the day, I'm just taking someone's word for it that this is all they send, and assumes that it won't change over time in a browser that regularly updates itself.

It'd be a lot more acceptable if there was an option to show me "This is the exact telemetry payload we want to send to Mozilla." And even then you are taking someone's word for it that there isn't some other piece of data hidden in a hash or something, or that the browser isn't secretly sending data.

I'm not quite paranoid enough to do the full monitoring of all network traffic, but how do I reasonably know what is going on without listening to traffic/watching memory at all times? In the end, I'm trusting a faceless corporation that is attempting to put on a facade of trustworthiness.

The only trustworthy computer is an unnetworked one.

> It'd be a lot more acceptable if there was an option to show me "This is the exact telemetry payload we want to send to Mozilla."

There is! Navigate to about:telemetry in Firefox.

> But at the end of the day, I'm just taking someone's word for it that this is all they send, and assumes that it won't change over time in a browser that regularly updates itself.

Actually you aren't. It's quite easy to build your own Firefox and use that, or use say Ubuntu's build if you trust them more. You wouldn't want to audit the Firefox source code yourself, but if Mozilla was intentionally sneaking backdoors into the source code, someone could (and probably would) eventually find that out and you'd be able to verify it by examining your source archives... which means Mozilla would have to be stupid/crazy to try that.

Firefox is open source, and even the telemetry server code is on github, so if you wanted to really dig in you could.

At the end of the day, if you want Firefox to make rational decisions about which features to implement, they need to collect some data. That fact alone can't be a problem for anyone who wants Firefox to be a competitive browser.

You don't know there isn't a government official placing a secret camera across the street from you, watching your computer screen through the window, either.

Eventually, you have to just get on with your life and trust people.

Check the source code and then build it from source?


I'm not particularly bothered by the idea that Firefox can see how often private browsing is used. If you don't know what features people use, you can't prioritize development resources. Mozilla doesn't know who is using private browsing, or what they are looking at.

I can certainly respect that it bothers you—but may I ask what browser you intend to switch to? I'm skeptical that you'll find a better, usable option. That's a sad state of affairs for sure, but also the way of things right now.

Can you explain how you know that Firefox collects the data versus just asking people in a survey?

What if the numbers are based on opt-ins only, and extrapolated?

The irony is strong with this one.

By default, Firefox:

- Collects a bunch of telemetry data via several mechanisms and ships them to Mozilla HQ

- Provides Mozilla with remote code execution privileges on your machine via the shield (or normandy, or whatever they are calling it these days) mechanism, which can install and uninstall extensions and certificates, change browser settings, etc

- Uses Google as the default search engine, and search suggestions leak private data to Google

- Uses Google Location Services for their geolocation thingy, which - unsurprisingly - phones home to Google

- Ships closed source third party add-ons

- Comes with a bunch of "about:config" settings configured in sub-optimal ways, privacy-wise: battery API enabled by default, accept all cookies by default, and so on

Sure, Chrome is worse, but bringing that up is like arguing that your pile of manure is better because it doesn't smell as bad: in the end, you are still arguing about shit.

There are some valid privacy complaints about Mozilla but I think they are severely overblown by a lot of people.

Mozilla is very up-front about exactly what telemetry data they're collecting and what it's used for; there's even a pop-up when you first install the browser telling you what's collected and how to disable it if you want to. And then, when Mozilla makes decisions based on telemetry, like removing features that 2% of people use, the people who disabled telemetry complain that Mozilla is ignoring their opinions.

The optional syncing service is end to end encrypted so Mozilla can't see the data you're syncing.

Shield is a valid complaint, I am not a fan of it being opt-out.

Search suggestions are disabled by default in private browsing mode and probably a feature most people want anyway. Your query gets sent to the search engine when you hit enter either way.

The battery API was completely removed from Firefox two and a half years ago; that particular complaint is very outdated. Firefox has been blocking tracking cookies by default for a while now, too. Stricter cookie policies would just annoy the vast majority of users.

> Mozilla is very up-front about exactly what telemetry data they're collecting and what it's used for

You can see the telemetry data that engineers look at themselves (https://telemetry.mozilla.org/new-pipeline/dist.html). It's not very detailed.

> Mozilla is very up-front about exactly what telemetry data they're collecting and what it's used for,

I consider myself relatively technically inclined. When I started using Firefox, I absolutely did not know about

- Normandy as an RCE engine to install arbitrary extensions and customize random settings

- Google Location Services as the location backend

- Which about:config settings I need to change for a reasonable expectation of privacy

Didn't you already trust Mozilla to execute their code on your machine when you installed the browser, in the first place? And to do it remotely with auto-updates.

There is a big difference between them being able to open a connection to my machine at their whim and execute code, versus me downloading their software or an update at a time of my choosing, especially since, if I am very security conscious, I can wait until an update has been audited or tested.

With a remote code execution engine, someone could hack into their backend and then start running malicious code on thousands or millions of machines. If they compromise a software update, at least there is a chance it can be caught before it gets to me.

There's a config-flag to turn it off. You could even deploy that enterprise-wide.
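For the enterprise-wide case, my understanding is that this can be expressed through Firefox's policy engine (a `policies.json` in the installation's `distribution` directory); as far as I know, `DisableFirefoxStudies` is the key that governs the Shield/Normandy studies mechanism:

```json
{
  "policies": {
    "DisableFirefoxStudies": true,
    "DisableTelemetry": true
  }
}
```

The per-profile equivalent is, if I recall correctly, the `app.normandy.enabled` preference in about:config.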

That said, every auto-update system is essentially an RCE system. For highly exposed and security-sensitive applications like browsers, the auto-update is a net win in many deployment scenarios.

Isn’t it kind of ironic that you mention a user flag to turn off telemetry that is on by default on a post about “defaults matter”?


Telemetry and auto-updates are important enough that having them on by default isn't wildly unreasonable.

Auto updates yes for security. But why would telemetry be important to the end user - especially for a “privacy focused browser”?


The nice thing is that you don't have to ask. You can look for yourself. Mozilla's pretty transparent about what they have and what is in it.

Turns out telemetry is good for things like finding / addressing crashes and seeing if updates have gone out properly.

Also, I seem to recall being explicitly asked if I wanted to participate. But my memory could be failing me.

No browser is really "privacy-focused". Performance, security, stability and Web compatibility are all table stakes for Web browsers. If you aren't competitive at those, it doesn't matter what else you do, your product isn't viable. And telemetry data is really valuable for achieving all those; without it, you'll waste a lot of resources fixing the wrong things. Mozilla certainly can't afford to do that.

Once your browser is competitive at those table stakes, only then can you give it a "privacy focus" to differentiate from Chrome.

If you're security-conscious then you'll install updates immediately, before you get compromised by whatever attack it might be fixing.

In reality no-one outside Mozilla is auditing updates (other than black-hats reverse-engineering security fixes to catch the people who don't update immediately). I don't think the situation for other browser vendors is any different.

The thing is, at the end of the day, there is not much difference between default Firefox and Chrome regarding data being sent to Google.

*If you use Google search with Firefox.

In that fecal analogy, Mozilla is a fresh cup of coffee from squirrel-digested coffee beans, while Chrome is a neck deep swim in human sewage.

The privacy points you raise are significant but at no point discredit the very real privacy efforts made by Mozilla and in no way make it comparable to Google.

This is absolutely the truth. This blog post is an exercise in hypocrisy.

Mozilla does a lot of good things for privacy, but HN needs to see past their blind loyalty to Mozilla and criticize them where necessary.

I get annoyed when I remove most of the default search providers, and then an update brings Google back. I specifically removed some of those providers so my searches would not be predicted with those services.

You forgot "Actively encourages search engines to track which pages users click on in search results"

Source: https://www.bleepingcomputer.com/news/software/mozilla-firef...

Search engines are tracking that already and essentially always have. Enabling 'ping' doesn't change the privacy situation at all.

You don't even mention Pocket or the stories they put on the default home page.

How is that a privacy concern? You can also hide the stories if you don't want to see them.

Any unwanted, unrequested connections to 3rd parties counts as a privacy concern. If I don't explicitly click a link or enter a URL in the address bar I don't expect traffic to be sent anywhere or any content at all downloaded.

Mozilla owns Pocket, meaning it's a second party rather than a third. You and others may still choose to regard it as a concern, of course.

>- Uses Google as the default search engine, and search suggestions leak private data to Google

Doesn't apple do the same thing?

Apple is not the one making a blog post about privacy-centric defaults.

There's literally an article on the front page talking about how Apple is really a privacy as a service company based on all of their marketing talking points about privacy.

Still, Apple did not make a blog post about sane defaults. Mozilla did. Apple is beside the point here.

I think the implied point is that everyone (except Microsoft) has Google as the default.

It's been a while since I set up a new Firefox, but isn't the Search Suggestions feature opt-in?

No, this feature is opt-out. Just checked on a fresh profile, and whatever you type in the URL bar triggers requests like https://www.google.com/complete/search?client=firefox&q=quer... for each keystroke.

What's your Firefox version? I just checked and "browser.urlbar.suggest.searches" still defaults to false on mine, and it doesn't show search suggestions.

Version 67.0 on Ubuntu, in Europe (maybe that makes a difference?). And this same preference is set to true for me in about:config.

I'm running v67.0.1 on Windows in the US and my preference is also set to true and says it is the default.

No, it's still opt-in. There's a prompt which asks you whether you'd like to enable search suggestions.

As I said in the other comment: on my 67.0 installed on Ubuntu, from Europe (if that makes any difference), I get no prompt on a new profile and the suggestions are enabled by default. Not sure if that changes per region, operating system, or for other reasons, to be honest.

> Sure, Chrome is worse

I'd like some clarification on that. Does Chrome send automated telemetry reports about my browsing to Google when I am not logged in with my Google account? Does Chrome give remote code execution privileges to Google (yeah, via the updater, but that does not really count)? I searched but found nothing on the net about its telemetry.

I never use Firefox because it has these horrible defaults, and keeping up on all the about:config switches I need to toggle to be able to use it is just too much.

> Does Chrome send automated telemetry reports about my browsing to Google when you are not logged in with your Google account?

Yes. And if you sign into Chrome (which now automatically happens if you sign into any Google website, IIRC), it uploads your full browsing history.

> Does Chrome give remote code execution privileges to Google (Yeah, via the Updater, but that does not really count)?


I don't see how you could have done honest research into this and arrived at the conclusions you have.

You can compare https://www.google.com/chrome/privacy/whitepaper.html and https://www.mozilla.org/en-US/privacy/firefox/

Maybe it's just me, but Google's privacy note seems a lot more reasonable. And if it is correct, they ask whether they may collect usage statistics during the installation, while Firefox does not.

Why does "Updater" "not really count" as "remote code execution privileges"?

Either the vendor can remotely change your software and its configuration without user intervention, or it cannot. For any software that supports unattended updates, it can. End of story.

[Warning: advertising SWE insider]

I don't like how the opening line of the article exploits the fact that the average person does not know the cost of an average online ad, to make it appear as though tracking has basically no value.

>... data about you was transmitted to dozens or even hundreds of companies, all so that the website could earn an additional $0.00008 per ad.

For the reader to be able to accurately understand how much money this is, they need to know the percentages.

Very roughly (this varies widely based on the country and website), the average online ad only costs ~$0.0005, so that insignificant $0.00008 is around 10-20%. If the article had presented the exact same information but instead framed it as the revenue available to pay employees at an online company dependent on advertising, it would sound very different while really conveying the same concept.

Edit: I read the linked study, and the data they used had an average cost per ad of $0.001, putting the difference around 8%. This is smaller than I would have predicted. I would still rather they had led with this number.
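A quick sanity check of those ratios, using only the figures quoted above (the $0.00008 from the article, and the two average-cost estimates):

```python
tracking_value_per_ad = 0.00008  # from the article: extra revenue per ad from tracking
avg_cost_rough = 0.0005          # the rough average quoted above
avg_cost_study = 0.001           # the average in the linked study's data

print(f"{tracking_value_per_ad / avg_cost_rough:.0%}")  # → 16%, share under the rough estimate
print(f"{tracking_value_per_ad / avg_cost_study:.0%}")  # → 8%, share under the study's figure
```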

My personal problem with the model is not how much they make, but rather the intentionally hidden relationship they develop with the user. If the relationship is - you get the article, we get to show you an ad that gives us a chance to sell you something and make money on it, which is ~$0.00008 - that would be clear.

Even if the relationship was - you get the article, we get the above plus we'll collect some bits of information about you that we explicitly list. The ad itself will give us ~$0.00008, and the collected data another ~$0.000007 - that would be ethical imho.

But the real model is - we give you an article, and in return you sign a blank document that allows us to collect all the possible data and try to maximize the amount of money we can make on it. You step into it today, but we are not comfortable putting any price point on this agreement because we bank on the idea that in the future we'll make more as we increase our grip over understanding of user behavior and improve our ability to monetize it in any, potentially unethical way. The reason companies hide the nature of the relationship is because their business models are built around the assumption that the data collection will generate increasing amount of revenue in the future. And since there's no way for you to understand the relationship, or step out of it, you're entering it with information disadvantage, and there's no turning back. I hope you can see how this approach is by design hostile to users and the Internet as a public plane.

When it comes to everything, default settings matter.

I'm amused by this because when I called Mozilla out a few days ago, I got a bunch of downvotes [1]. Plus one of the top comments was a subtweet of mine.

[1] https://news.ycombinator.com/item?id=20055322

> Please don't comment about the voting on comments. It never does any good, and it makes boring reading.

