Iridium – Secure Browser (iridiumbrowser.de)
95 points by fcambus on May 3, 2015 | 58 comments



The developers of Iridium don't reveal that their browser phones home to their servers, and that's cause enough for distrust here.

A quick "grep -r iridiumbrowser.de" of the source reveals that they replace calls home to Google with calls home to various hostnames of the form "trk-NNN.iridiumbrowser.de", where NNN is a three-digit number. Presumably these hosts act as proxies. For example, lines 37-38 of chrome/browser/history/web_history_service.cc:

    const char kHistoryQueryHistoryUrl[] =
        "https://trk-139.iridiumbrowser.de/history.google.com/history/api/lookup?client=chrome";
Edit: The git log for this change:

https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

"Replace URLs to Google services by URLs to our own server, so as to analyze where we still have to patch the browser to make it stop blurting data out."

That's an acceptable excuse in a debug branch, but there's no reason for this kind of privacy-impacting debug code to reach a public build.


This is really great feedback. Reading this and the other comments makes clear that we need to improve the documentation of what we changed and why. All the trk-xxx.iridiumbrowser.de hosts are there to find connections we have not yet been able to disable. All of them end up at nothing (404 Not Found) and are not proxied in any way. Essentially, Iridium should never contact them; if it does, that is a code path we have missed, and a bug.


Then replace them with a crash or popup ("Line <x> was reached but shouldn't be reachable. Please report this." or whatever)
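Something along these lines would do it - just a rough standalone sketch of the kind of guard I mean, with made-up names (Chromium already has NOTREACHED()-style macros for exactly this):

    #include <cstdio>
    #include <cstdlib>

    // Hypothetical guard: instead of pointing the code at a placeholder host,
    // make the unexpected code path fail loudly so testers report it.
    // (Illustrative only; not Iridium's actual code.)
    #define REPORT_UNEXPECTED_PHONE_HOME(endpoint)                        \
      do {                                                                \
        std::fprintf(stderr,                                              \
                     "%s:%d reached: would have contacted %s. "           \
                     "This should be unreachable; please report it.\n",   \
                     __FILE__, __LINE__, (endpoint));                     \
        std::abort();                                                     \
      } while (0)

    // Example use at a call site that formerly built a Google history URL.
    const char* HistoryQueryUrl() {
      REPORT_UNEXPECTED_PHONE_HOME("history.google.com/history/api/lookup");
      return nullptr;  // never reached
    }

    int main() {
      HistoryQueryUrl();
    }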

Phoning home without permission goes against the entire concept of a secure browser.

Have you ever seen the underhanded C contests? There are far too many ways for something like this to turn nasty. Especially given there's inherent plausible deniability.


I think you want to reach out to them and find out their reason for this. Obviously they are revealing what's going on because it's in the publicly available source and commits.


Iridium's website makes a selling point of the fact that Chrome contacts Google servers, but fails to mention that it may phone home to servers of its own. Having to search the source to discover that large a privacy difference isn't proper disclosure to my mind.

I don't claim it's necessarily malicious. The explanation given in the git log makes sense, though there are better ways to accomplish the stated goal, such as replacing Google's tracking URLs with invalid ones. But leaving those changes in public releases of software supposed to increase privacy seems negligent.


FWIW, whatever Iridium might be doing with "phoned home" data, it's spectacularly unlikely they're capable of getting anything like the datamining utility out of it that an assumed-currently-doing-evil Google could. Google Analytics across 60% of the web, AdWords, the other end of a large proportion of my email content even if I don't use Gmail (because it's sitting there in my correspondents' Google-hosted mailboxes) - Iridium don't have any of that to correlate my browser's behaviour with.

And for someone the NSA assumes has no privacy rights (since I'm not an American citizen), there's some benefit to that phoned-home data existing in a non-US legal jurisdiction too... Even if you _don't_ assume Google is currently doing evil, we _do_ know they're subject to PRISM and NSLs...


I'll also mention that - in a completely unfair-to-you way - I just went and scanned through your comment history to see if I could quickly identify NSA shilling... Such is the nature of the discourse these days. Sorry. (Or, if you _are_ an NSA stooge, congratulations, you do a reasonable job of hiding it...)


Hahaha, distrust goes both ways though! Suppose you (the NSA) are trying to coerce people into using this browser, knowing you've already snuck in some backdoors?

They used to say that you have to trust the compiler. But not even that is true these days. Perhaps you've read about the secret HDD partitions that the NSA were using.

These days it goes like this: open-source hardware, open-source firmware, open-source compiler, open-source software.

Only when an entity can follow this path to build their own trusted stack can we begin to move with high confidence.


Even open source hardware isn't sufficient against an attacker as seriously resource rich (both cash and technical ability-wise) as the NSA.

You can't trust the supply chain any more, you can't even trust the silicon unless you made it (and all the machines you used to make it) yourself.

Are you _sure_ that network chip, usb controller, or flash ram you built your open source hardware out of isn't exploited?

If you're on the fence about whether you're "too paranoid" or "not paranoid enough", make sure you've read what some people do for free, just for fun/curiosity/geek-cred/reputation:

http://travisgoodspeed.blogspot.com.au/2012/07/emulating-usb...

http://www.bunniestudios.com/blog/?p=3554

Then imagine what their grey-suited counterparts on 200K salaries at the NSA with (for all intents and purposes) unlimited research budgets might be up to.


I wholeheartedly agree, but that is what I meant by 'own trusted stack'. From the ground up (literally), you have to control each part of the process.


Have you verified that the browser actually phones home via network sniffing? If they have in fact successfully used that commit to remove all cases of phoning home, then it is an unfortunate oversight in the commit history of a browser that still doesn't ever phone home, and in practical terms, nothing more. If the code paths cannot be reached, they cannot be reached.

Before somebody hops up, yes, I am as aware as the next guy that today's "path that can't be reached" is tomorrow's path that can, and yes, of course outright removal would be better. It's absolutely fair to raise an eyebrow and ask sharp, pointy questions under the circumstances. But it's not fair to say a browser "phones home" if in fact it doesn't.


Have you seen the underhanded C contest?

Even if it's not phoning home now (and I'll assume it isn't, in good faith), it's far too easy for a seemingly-trivial change later to enable it.


People's ability to echo my own points back at me as if I somehow failed to consider them and they are somehow correcting me never ceases to amaze me.


In the following commit, the IPv6 probe is changed from Google Public DNS's IPv6 address to that of iridiumbrowser.de:

https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

To be fair, that's a very tiny leak: it only indicates (maybe) that someone is running your browser at the source IPv6 address. But if it's tiny, it might as well continue going to Google Public DNS, since it's much more likely to get lost in the noise there (Google Public DNS probably gets a fair bit of IPv6 traffic from users, not to mention all the existing Chrome / Chromium users).
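For context, that probe is just a connectivity check: open a UDP socket and connect() it to a known global IPv6 address to see whether the OS has a route. Here's a rough standalone sketch of the idea (plain POSIX sockets, not Chromium's actual code, which I haven't traced in detail):

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdio>

    // Rough sketch of an IPv6 reachability probe. connect() on a UDP socket
    // normally sends no packets; it just asks the kernel whether it has a
    // route to the target, which is why the leak (if any) is so small.
    bool ProbeIPv6(const char* probe_address) {
      int fd = socket(AF_INET6, SOCK_DGRAM, 0);
      if (fd < 0) return false;

      sockaddr_in6 addr{};
      addr.sin6_family = AF_INET6;
      addr.sin6_port = htons(53);  // DNS port, like the upstream probe
      if (inet_pton(AF_INET6, probe_address, &addr.sin6_addr) != 1) {
        close(fd);
        return false;
      }

      bool reachable =
          connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) == 0;
      close(fd);
      return reachable;
    }

    int main() {
      // Upstream Chromium probes Google Public DNS (2001:4860:4860::8888);
      // the linked commit swaps in iridiumbrowser.de's address instead.
      std::printf("IPv6 globally reachable: %s\n",
                  ProbeIPv6("2001:4860:4860::8888") ? "yes" : "no");
    }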


All the links in that commit result in a 404, so if they are collecting data it's purely off whatever ends up in their access logs.

Possibly something in the future; or possibly something they could not neuter as easily as they would like?


Given that Chrome (and Google in general) has possibly the best defensive security team in the world, it's hard for me to take these security-oriented forks too seriously. Indeed, the last "secure Chromium fork" I heard about, WhiteHat Aviator, turned out to introduce a bunch of new vulnerabilities:

https://plus.google.com/+JustinSchuh/posts/69qw9wZVH8z

Even if the fork doesn't add bugs, you are now relying on the fork's maintainer to push security updates. Will they be as good at this as Chrome's team? This is unfair, of course: no startup or small project is ever going to have Chrome's resources. But when it comes to security, speed of updates really does matter.


> Given that Chrome (and Google in general) has possibly the best defensive security team in the world, it's hard for me to take these security-oriented forks too seriously.

I think it depends on who and what you are most eager to secure yourself from. If you think hackers are the greatest online threat, perhaps you should go with Chrome (if you're choosing between these two browsers). If you don't trust Google to stay classy when it comes to privacy and data collection, perhaps you could consider running Iridium, even if it's a version or two behind Chrome. Personally, I use Firefox. I prefer not to use a browser from a company that lives on data collection.

Chrome might be kosher now (to be honest, I don't know), but a decision at headquarters can change that with the next automatic update.


Hmm, on a second look I see that Iridium appears focused on privacy, not security. In that case they should call it "private browser", not "secure browser". These words mean very different things.

Google's _security_ record is ridiculously good. Their _privacy_ record is at best questionable, and I can certainly see where people might be interested in a privacy-centric fork of Chrome.

That said, I was under the impression that all of Chrome's "phone-home" features can be turned off via settings.


The thing is, privacy is security: I'm not secure if someone else has access to my private information.

We're right back to the 'government backdoor' argument, only in this case it's the 'Google backdoor.'


>Personally, I use Firefox.

I seem to recall a recent HN post that made FF sound like a bit of a privacy disaster in its own right - specifically on the topic of add-ons (they all seem to phone home).


Yeah, the extension situation is supposedly questionable. I think you should always be wary with extensions. I usually limit myself to popular and open-sourced extensions.


Those add-ons are things you install yourself on top of the browser. What does that have to do with Firefox's privacy? Nothing.

Random software installed from the Internet can be harmful, who'd have thought!


Except that the browser is missing a lot of expected features and has historically positioned itself as "just pile extensions on top of FF to get features we won't add or have removed".

Ad blocking? Disabling JavaScript? Mouse gestures? A download manager? Privacy protection? Duplicate tab? These and more are all extensions because Mozilla refused to implement those features or removed them.


The Git tree is already outdated: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/log/, last commit 2015-04-09.

In the meantime, Chromium 42 has been released: http://googlechromereleases.blogspot.de/2015/04/stable-chann..., including a bunch of security fixes.


Apart from the obvious (it's branched off Chromium 41, whereas stable is 42 and contains security fixes), they turn off automatic updates, so it certainly doesn't seem like it could be a "secure browser". I agree with your other comment that it could be a "private browser", although (of course) those are not entirely orthogonal.

https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...


Yes. And Aviator is what happens when you put an unusually security-conscious team in charge of one of those forks.


> Given that Chrome (and Google in general) has possibly the best defensive security team in the world

That may be true now that Mozilla has utterly destroyed Firefox Sync's security, but it didn't use to be.

And it's still true that most of Sync's design at least tries to keep your privacy…private, whilst Chrome firmly believes that Google is all-loving, all-trustworthy and all-dependable, and thus deserves to have everything about you.


The one question you really want answered from any "secure" or "private" browser fork of Chromium or Firefox is: exactly how, in excruciating detail, do they track upstream security fixes? Are they getting notification of issues alongside the browser vendor, or do they find out only when the public does, when the embargo on disclosure is lifted?

Keeping up with vulnerabilities in browser codebases is a full-time job and there are very few teams in the world who can fund it, so odds are, forked browsers are going to need creative ways to piggyback on their upstream.


Every time I read a story like this, I'm reminded of Iron: http://neugierig.org/software/chromium/notes/2009/12/iron.ht...


Okay, so their current release is based on chrome 41, supposedly with some security improvements. You know what else made security improvements over chrome 41? Chrome 42.


None of these "secure browsers" seem to actually do any real hardening.


"But it does call home to Google." This doesn't seem to be that much more secure if it still has to contact their servers.


Yet Another Chromium Fork. "Iridium has various enhancements where it forces strict security to provide the maximum level of security without compromising compatibility." -- what does that mean, exactly?


The homepage says:

Disable transmission of partial queries, keywords, metrics to central servers without consent.

Builds reproducible, modifications auditable.


Plus you get security fixes and new features late because of the "secure" additions to the code.

I will pass.


Couldn't those recommendations be pushed to the Chromium project, without the need to create yet another brand of Chromium?


I second that; "Chromium" is what's used in most Linux distros anyway.


All Chromium forks seem rather useless IMHO. They don't support Chrome extensions AFAICT, and at best they are a few hours behind Chrome in shipping updates, at worst days/weeks/months/years. Sounds like I have to trade quite a bit for "reproducible builds" - which I'm not saying is nothing, it is something, just not something I'm super interested in giving up so much for. We ALL use code every day that we can't see the full source for (I seriously don't believe that anyone actually working in tech and staying up to date runs an OS plus all their software fully open source, I just don't believe it), so adding one more piece doesn't seem like that big of a deal.

Don't get me wrong, I think security/privacy is HUGE and I don't want to sound defeatist, but come on... I'd love it if everything was open source and I could inspect/debug everything I run, but that's not the world we live in, and you would have to go back to the "dark ages" of computing in order to live by that standard. It's similar to people with stock-ROM Android phones who tell me they prefer Android because it's open source. Oh really? Because all I see is a binary blob in your hand, just like mine. One or two of them play with custom ROMs and the like, but NONE of them go as far as installing F-Droid (or whatever the FOSS marketplace is) and ONLY using apps from it.

At some point you have to say "Yes I know I can't know 100% the security of this app/device" and STILL use it. It's that or be a hermit, I don't like it but that's how it is.


> All chromium forks seem rather useless IMHO. They don't support chrome extensions [...]

The Chrome extensions from https://chrome.google.com/webstore/category/extensions seem to work just fine in Iridium.


Then I stand corrected, I had read somewhere before that they didn't work.


If you say that privacy is huge and you don't demand only free software then you are inconsistent. You don't put your money where your mouth is.

Your comment makes questionable motives visible and I wonder why you don't get downvoted.


> If you say that privacy is huge and you don't demand only free software then you are inconsistent. You don't put your money where your mouth is.

I think this is something we all do. We all have "wants" and "needs" in our products: I want all my software to be secure and open source, but I also want something usable and something that allows me to interact easily with my peers. I believe both iOS and Android do this, and I believe both are equally closed. Sure, Android has an open-source base, but let's not pretend that that's what the VAST majority of people are running. For the handful of people running custom ROMs with no closed code and a FOSS app store, I salute them. Good for them; I can't do it, and the vast majority of us can't. I'm not going to cut off my nose to spite my face. IMHO, the "Android" that 99.999% of people use is no better than iOS when it comes to openness/knowledge that the software you are running does what you think it does.

I've got to weigh the pros and cons, and both my time and my desire for secure products factor into that. A lot of the time my options are: use closed-source software, or take open-source software (which may or may not exist) and extend it to do what the closed-source software does. Since the latter would take substantially more time (especially considering I only know a handful of languages), I choose the closed-source software. So yes, I trade away some security for time/energy from time to time. Put very simply, I'm not going to choose to play life on hard mode; I've got enough going on that I'm not going to add endless coding to the list. (Note: I code plenty in my free time just to keep up and learn more.) If that makes me a monster then so be it; I'm genuinely interested to know how other people handle this.

I'd really be interested for someone who ONLY uses FOSS to blog about their whole setup. There are some tools for my job that I simply have to use that are closed source, how do other FOSS advocates deal with this?


You are saying that there is more than security and privacy when it comes to technology. I hear you. Still, saying "security/privacy is HUGE. (...) [but features and social pressure are more important.]" sounds inconsistent. Maybe start next time with "features and social pressure are HUGE".

Since you seem to be trolling I won't analyze your comment any further.

However, I use a custom ROM and no gapps. Do you have specific questions?


I'm honestly not trolling. Hell, I'm using my own name to post all of this under. I'm genuinely interested: do you compile the ROM yourself, and every app you use, or do you trust the people behind the ROM/app store?


The reason I suspected you of trolling was that you said Android is no more open than iOS, while in fact it is, and I assume everybody here knows that. For example, one can install Firefox on Android, while on iOS one can only install a different UI for the Apple-delivered browser. Also, on most Android devices you can easily replace the operating system, while on devices with iOS you mostly can't. Apple imprisons its customers.

In the spirit of friendship and goodwill, my name is Christian Weinz.

I didn't compile my operating system or the apps that I run. I run CyanogenMod without gapps, and I am preparing to compile my own version to remove some CyanogenMod features I don't like. I use a firewall to block most apps' access to the internet. I trust that the people behind the F-Droid app manager are strongly opposed to spying on its users.


You seem to be confused.

The point is privacy is not possible outside of free software.

If you'd rather have convenience and comfort over privacy, that's your choice, but you're probably renouncing your freedoms.


I would have picked a different name since there is already an Iridium in the tech space (satellite phone company). Not sure if it matters in Germany.


Yes, it matters (at least to me); I was initially confused about whether it might be some browser for the Iridium sat network.


yeah that's weird..


I don't trust them. They have no contact information at all and no imprint (Impressum) on their website, which is in fact illegal since their website is hosted in Germany and uses a German top-level domain.

They advertise their product as "a secure browser" without making any significant changes under the hood. As "unicornporn" said: "privacy" != "security". Especially when you replace one villain with another.


Another big issue I found is that it does not really start with a clean slate. It copies over your existing Google profile to make the setup seamless. I think that must be part of their Debian packaging, but the profile path it is using (/home/nemo/.config/iridium/Default) is freshly created in my filesystem, and yet I can see the history from my current Chrome profile in there.


It should start with a clean slate. Please file a bug at https://github.com/iridium-browser/iridium-browser-ubuntu/is...



There are lots of other 'flavours' of Chromium out there. Try to avoid the closed-source binary blobs like Comodo Dragon and others. I like this one because at least we can inspect the source (https://iridiumbrowser.de/development). Rather than download from their site, I would much prefer to build this from the source code they provide.

All the browser does is prevent phoning home to Google, which is preferable if you've decided to permaban Goog. from your Internet traffic. Google is so tightly woven into Chrome that it's a huge privacy risk.

On the other hand, you could route all your vanilla privacy-waiving stuff through Chrome and use Firefox to do your real surfing. Excuse the bias here, but I know my way around the web and Chrome likes to think I don't. I suspect Chrome is some sort of Fisher-Price browser designed for non-tech-savvy folk.

So use Chrome for Facebook, Youtube, other Google products. But don't use it for actually surfing the web.


It would be nice to see how Iridium fares against WhiteHat Aviator https://www.whitehatsec.com/aviator/

I personally find Aviator to be more trustworthy at this point, though.


How do extension updates happen in Iridium? I noticed that in Chromium, they check for updates frequently by connecting to the Google store. Does the same happen in Iridium? If not, how?


There are a bunch of things I don't understand in the patches. I wish they'd link to a bugtracker or something. (Incidentally, Chrome/Chromium has a public bugtracker: Iridium seems to have a Trac that nobody used apart from creating two tickets.)

* Enabling Do-Not-Track by default: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

This is widely considered to be a questionable plan, and it violates the Internet-Draft (section 6.3: "It MUST NOT transmit OPT-IN without explicit user consent."). Are they asserting that merely having chosen Iridium over Chromium is explicit user consent?

* Disabling hyperlink auditing: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

As the HTML spec (https://html.spec.whatwg.org/multipage/semantics.html#hyperl...) points out, the behavior of hyperlink auditing in terms of privacy impact is already achievable in several ways, like server-side redirects, JavaScript, etc. The goal with the feature was to make performance and user experience better, while not changing the privacy standard. Is it being changed in Iridium for privacy reasons or for other reasons?

* Increasing the default client certificate (?) length to 2048 bits from 1024 bits: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

Given how much Google's been yelling about 1024-bit server certs, this seems like an obvious thing to change upstream. Has it been submitted / is there a reason they haven't changed it in Chromium?

* Disabling globally-installed NPAPI plugins on OS X, but still allowing those installed in your homedir: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

Why? (There's probably a reason, I just have no idea what it might be.)

* Emptying the list of CAs allowed to sign EV certs: https://git.iridiumbrowser.de/cgit.cgi/iridium-browser/commi...

Why? As far as I can tell, the only effect is that EV certs will show up as normal certs (green lock, instead of bar showing the organization name). What does this have to do with improving security or privacy?
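On the EV point, for what it's worth: my understanding is that Chromium keeps a built-in table mapping EV-capable root CAs to the policy OIDs they issue under, and a certificate only gets the EV treatment if its chain ends at one of those roots with a matching OID. Empty the table and nothing ever qualifies, so EV certs fall back to the ordinary green lock. A simplified standalone sketch of that mechanism (the names only approximate Chromium's EVRootCAMetadata; this is not the real code):

    #include <cstdio>
    #include <string>
    #include <vector>

    struct EVMetadata {
      std::string root_fingerprint;  // fingerprint of a trusted EV root
      std::string policy_oid;        // EV policy OID that root issues under
    };

    // Upstream this table has dozens of entries; the Iridium patch
    // effectively empties it, so no chain can ever be recognized as EV.
    static const std::vector<EVMetadata> kEVRootCAMetadata = {
        // {"<sha256 fingerprint of an EV root>", "2.16.840.1.114412.2.1"},
    };

    bool IsEVChain(const std::string& root_fingerprint,
                   const std::string& policy_oid) {
      for (const EVMetadata& entry : kEVRootCAMetadata) {
        if (entry.root_fingerprint == root_fingerprint &&
            entry.policy_oid == policy_oid) {
          return true;  // known EV root with a matching policy OID
        }
      }
      return false;  // empty table: every cert gets the plain, non-EV UI
    }

    int main() {
      std::printf("EV treatment? %s\n",
                  IsEVChain("ab:cd:...", "2.16.840.1.114412.2.1") ? "yes"
                                                                  : "no");
    }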


Wow, these guys seem to be using an ancient and vulnerable version of cgit.



