A quick "grep -r iridiumbrowser.de" of the source reveals that they replace calls home to Google with calls home to various hostnames of the form "trk-NNN.iridiumbrowser.de", where NNN is a three-digit number. Presumably these hosts act as proxies. For example, lines 37-38 of chrome/browser/history/web_history_service.cc:
const char kHistoryQueryHistoryUrl[] =
"Replace URLs to Google services by URLs to our own server, so as to analyze where we still have to patch the browser to make it stop blurting data out."
That's an acceptable excuse in a debug branch, but there's no reason for this kind of privacy-impacting debug code to reach a public build.
Phoning home without permission goes against the entire concept of a secure browser.
Have you ever seen the underhanded C contests? There are far too many ways for something like this to turn nasty. Especially given there's inherent plausible deniability.
I don't claim it's necessarily malicious. The explanation given in the git log makes sense, though there are better ways to accomplish the stated goal, such as replacing Google's tracking URLs with invalid ones. But leaving those changes in public releases of software supposed to increase privacy seems negligent.
And from a person assumed to have no privacy rights by the NSA (since I'm not an American citizen) there's some benefit of that phoned home data existing in a non US legal jurisdiction too... Even if you _don't_ assume Google is currently doing evil, we _do_ know they're subject to PRISM and NSLs...
They used to say that you have to trust the compiler. But not even that is true these days. Perhaps you've read about the secret HDD partitions that the NSA was using.
These days it goes like this: open-source hardware, open-source firmware, open-source compiler, open-source software.
Only when an entity can follow this path to build their own trusted stack can we begin to move with high confidence.
You can't trust the supply chain any more, you can't even trust the silicon unless you made it (and all the machines you used to make it) yourself.
Are you _sure_ that network chip, usb controller, or flash ram you built your open source hardware out of isn't exploited?
If you're on the fence about whether you're "too paranoid" or "not paranoid enough", make sure you've read what some people do for free, just for fun/curiosity/geek-cred/reputation:
Then imagine what their grey-suited counterparts on 200K salaries at the NSA with (for all intents and purposes) unlimited research budgets might be up to.
Before somebody hops up, yes, I am as aware as the next guy that today's "path that can't be reached" is tomorrow's path that can, and yes, of course outright removal would be better. It's absolutely fair to raise an eyebrow and ask sharp, pointy questions under the circumstances. But it's not fair to say a browser "phones home" if in fact it doesn't.
Even if it's not phoning home now (and I'll assume it isn't, in good faith), it's far too easy for a seemingly-trivial change later to enable it.
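The point about a seemingly-trivial change can be made concrete. A hypothetical sketch (this is illustrative Python, not Iridium's actual code; `SEND_METRICS`, `report_metrics`, and `transmit` are invented names): the transmission code still ships, and a one-line change to a default revives it in a later release.

```python
# Hypothetical sketch, not Iridium's code: a "dead" phone-home path that a
# one-line change to a default would quietly revive in a later release.
SEND_METRICS = False  # the seemingly-trivial change: flip this to True


def report_metrics(payload: dict) -> bool:
    """Returns True if the payload would be transmitted."""
    if not SEND_METRICS:
        return False  # the transmission is unreachable today...
    # transmit(payload)  # ...but the upload code still ships in the binary
    return True


print(report_metrics({"page": "example"}))  # False under the shipped default
```

Auditing the build today tells you nothing about the next auto-update, which is the crux of the objection above.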
To be fair, that's a very tiny leak: it only indicates (maybe) that someone is running your browser at the source IPv6 address. But if it's tiny, it might as well continue going to Google Public DNS, since it's much more likely to get lost in the noise there (Google Public DNS probably gets a fair bit of IPv6 traffic from users, not to mention all the existing Chrome / Chromium users).
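For contrast, a leak-free way to answer the same question ("does this host have IPv6?") is possible. A minimal sketch, assuming the probe discussed above is an IPv6 connectivity check (this is illustrative Python, not any browser's actual code): a UDP `connect()` sends no packets on the wire, it only asks the kernel whether a route exists, so unlike a real DNS query it reveals nothing to Google Public DNS.

```python
import socket

# Sketch: check for an IPv6 default route without sending any traffic.
# connect() on a UDP socket performs only a local route lookup.
GOOGLE_DNS_V6 = "2001:4860:4860::8888"  # Google Public DNS (IPv6)


def has_ipv6_route(target: str = GOOGLE_DNS_V6, port: int = 53) -> bool:
    try:
        with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as s:
            s.connect((target, port))  # route lookup only; no datagram sent
            return True
    except OSError:
        return False  # no IPv6 support or no route


print(has_ipv6_route())
```

Whether that matches what the browser actually does is a separate question; the point is that the "tiny leak" isn't strictly necessary for the feature.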
Possibly something in the future; or possibly something they could not neuter as easily as they would like?
Even if the fork doesn't add bugs, you are now relying on the fork's maintainer to push security updates. Will they be as good at this as Chrome's team? This is unfair, of course: no startup or small project is ever going to have Chrome's resources. But when it comes to security, speed of updates really does matter.
I think it depends on who and what you are most eager to secure yourself from. If you think hackers are the greatest online threat, perhaps you should go with Chrome (if you're choosing between these two browsers). If you don't trust Google to stay classy when it comes to privacy and data collection, perhaps you could consider running an Iridium build a version or two behind Chrome. Personally, I use Firefox. I prefer not to use a browser from a company that lives on data collection.
Chrome might be kosher now (to be honest, I don't know), but a decision at headquarters can change that at the next automatic update.
Google's _security_ record is ridiculously good. Their _privacy_ record is at best questionable, and I can certainly see where people might be interested in a privacy-centric fork of Chrome.
That said, I was under the impression that all of Chrome's "phone-home" features can be turned off via settings.
We're right back to the 'government backdoor' argument, only in this case it's the 'Google backdoor.'
I seem to recall a recent HN post that made FF sound like a bit of a privacy disaster in its own right, specifically on the topic of addons (they all seem to phone home).
Random software installed from the Internet can be harmful, who'd have thought!
In the meantime, Chromium 42 has been released: http://googlechromereleases.blogspot.de/2015/04/stable-chann..., including a bunch of security fixes.
That may be true now that Mozilla has utterly destroyed Firefox Sync's security, but it didn't used to be.
And it's still true that most of Sync's design at least tries to keep your privacy…private, whilst Chrome firmly believes that Google is all-loving, all-trustworthy and all-dependable, and thus deserves to have everything about you.
Keeping up with vulnerabilities in browser codebases is a full-time job and there are very few teams in the world who can fund it, so odds are, forked browsers are going to need creative ways to piggyback on their upstream.
Disable transmission of partial queries, keywords, metrics to central servers without consent.
Builds reproducible, modifications auditable.
I will pass.
Don't get me wrong, I think security/privacy is HUGE and I don't want to sound defeatist, but come on... I'd love it if everything were open source and I could inspect/debug everything I run, but that's not the world we live in, and you would have to go back to the "dark ages" of computing in order to live by that standard. It's similar to people with Android phones on stock ROMs who tell me they prefer Android because it's open source. Oh really? Because all I see is a binary blob in your hand, just like mine. One or two of them play with custom ROMs and such, but NONE of them go so far as to install F-Droid (or whatever the FOSS marketplace is) and ONLY use apps from it.
At some point you have to say "Yes, I know I can't be 100% sure of the security of this app/device" and STILL use it. It's that or be a hermit; I don't like it, but that's how it is.
The Chrome extensions from https://chrome.google.com/webstore/category/extensions seem to work just fine in Iridium.
Your comment suggests questionable motives, and I wonder why you don't get downvoted.
I think this is something we all do. We all have "wants" and "needs" in our products; I want all my software to be secure and open source, but I also want something usable that lets me interact easily with my peers. I believe both iOS and Android do this, and I believe both are equally closed. Sure, Android has an open source base, but let's not pretend that's what the VAST majority of people are running. For the handful of people running custom ROMs with no closed code and a FOSS app store, I salute them. Good for them; I can't do it, and the vast majority of us can't. I'm not going to cut off my nose to spite my face. IMHO, the "Android" that 99.999% of people use is no better than iOS when it comes to openness/knowing that the software you are running does what you think it does.
I've got to weigh the pros and cons, in which both my time and my desire for secure products factor. A lot of the time, my options are to use closed source software, or to take open source software (that may or may not exist) and extend it to do what the closed source software does. Since the latter would take substantially more time (especially considering I only know a handful of languages), I choose the closed source software. So yes, I trade time/energy for my security from time to time. Put very simply, I'm not going to choose to play life in hard mode; I've got enough going on that I'm not going to add endless coding to the list. (Note: I code plenty in my free time just to keep up and learn more.) If that makes me a monster, then so be it. I'm genuinely interested to know how other people handle this.
I'd really be interested in someone who ONLY uses FOSS blogging about their whole setup. There are some tools for my job that I simply have to use that are closed source; how do other FOSS advocates deal with this?
Since you seem to be trolling, I won't analyze your comment further.
However, I use a custom ROM and no gapps. Do you have specific questions?
In the spirit of friendship and goodwill, my name is Christian Weinz.
I didn't compile my operating system or the apps that I run. I run CyanogenMod without gapps, and I am preparing to compile my own version to remove some CyanogenMod features I don't like. I use a firewall to block most apps' access to the internet. I trust that the people behind the F-Droid app manager strongly oppose spying on their users.
The point is that privacy is not possible outside of free software.
If you'd rather have convenience and comfort over privacy, that's your choice, but you're probably renouncing your freedoms.
They advertise their product as "a secure browser" without making any significant changes under the hood. As "unicornporn" said: "privacy" != "security". Especially when you replace one villain with another.
All the browser does is prevent phoning home to Google, which is preferable if you've decided to permaban Google from your Internet traffic. Google is so tightly woven into Chrome that it's a huge privacy risk.
On the other hand, you could route all your vanilla privacy-waiving stuff through Chrome and use Firefox to do real surfing. Excuse the bias here, but I know my way around the web, and Chrome likes to think I don't. I suspect Chrome is some sort of Fisher-Price browser designed for non-tech-savvy folk.
So use Chrome for Facebook, Youtube, other Google products. But don't use it for actually surfing the web.
I personally find Aviator to be more trustworthy at this point, though.
* Enabling Do-Not-Track by default: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...
This is widely considered a questionable plan, and it violates the Internet-Draft (section 6.3: "It MUST NOT transmit OPT-IN without explicit user consent."). Are they asserting that merely choosing Iridium over Chromium constitutes explicit user consent?
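Part of why this matters: DNT is nothing more than an HTTP request header, and honoring it is entirely voluntary on the server side, which is why the spec insists the signal reflect an explicit user choice. A minimal sketch in Python (illustrative only; building the `Request` object performs no network I/O, and `example.com` is just a placeholder):

```python
import urllib.request

# DNT is just a header attached to each request; the server may ignore it.
# Constructing the Request does not contact the network.
req = urllib.request.Request("https://example.com/", headers={"DNT": "1"})

# urllib stores header keys capitalized, so "DNT" is retrievable as "Dnt".
print(req.get_header("Dnt"))  # prints "1"
```

Since the header carries no enforcement, its only value is as a genuine expression of user preference, which a browser-wide default arguably destroys.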
* Disabling hyperlink auditing: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...
* Increasing the default client certificate (?) length to 2048 bits from 1024 bits: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...
Given how much Google's been yelling about 1024-bit server certs, this seems like an obvious thing to change upstream. Has it been submitted / is there a reason they haven't changed it in Chromium?
* Disabling globally-installed NPAPI plugins on OS X, but still allowing those installed in your homedir: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...
Why? (There's probably a reason, I just have no idea what it might be.)
* Emptying the list of CAs allowed to sign EV certs: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...
Why? As far as I can tell, the only effect is that EV certs will show up as normal certs (green lock, instead of bar showing the organization name). What does this have to do with improving security or privacy?