Cryptocat Considered Harmful: The Root Cause (datavibe.net)
52 points by _pcwg on Dec 31, 2013 | hide | past | favorite | 66 comments



So make something better that people will actually use--then the question of what to use will become a no-brainer. "Just use Foo." The "best" alternative to something like Cryptocat is Pidgin/Adium+OTR plugins, and you can't seriously claim they're as usable (nor are their implementations actually perfect.) If not that, then help to fix whatever issues the popular tools have. (They're open source, after all.)

Make formal security proofs, implement them, open source your prototypes, and have them vetted by as many cryptographers as possible (so one or two if you're lucky.) Then figure out how to market your product.

By far the hardest aspect of cryptography engineering is getting people to use your software in the first place. It doesn't matter how good you are at crypto if your software is never used.

It's very easy to criticize. Much harder to actually make more secure, more usable alternatives. (And, ironically, the people who ought to be doing this the most are much more hesitant to do so since they know of many more subtle ways to make mistakes.)


> By far the hardest aspect of cryptography engineering is getting people to use your software in the first place.

I think perhaps a neglected aspect of the problem is how to turn difficult social/political problems (e.g. nobody uses PGP, and people think you're a weirdo if you try to persuade them to) into tractable technical problems (the kind cryptographers mostly talk about). I sometimes think a world where everybody had public and private keys and knew how to use them, but the crypto was no better than ROT13, would be preferable to the current situation, where the crypto is pretty good but getting people to use it is nearly impossible.

I also think the emotive "bad crypto puts lives at risk" argument only really makes sense if you're talking about crypto for the military or a small number of political activists, who will in any case benefit if their encrypted transmissions are buried among everybody else's. Those people need to be more careful than the rest of us with our more quotidian privacy concerns. I would rather have more bad (but tractable) crypto than great crypto that is used by nobody.

Hopefully somebody will persuade me I am wrong about this so I can stop feeling like a crypto heretic.


Cryptographic communication tools have a network effect (just like any other communication system), so it's kinda pointless if only the few high-profile activists use them. Also, that would make them stick out, thus reducing their security in some ways. If you can detect the important people by the communications protocol that they are using, you already have the most important part of the information without any need to decrypt anything. Them being buried among bad cryptography most likely won't work - making cryptography indistinguishable is one of the hard parts, so it's one of the properties that bad tools are unlikely to have.

Also, a part of the social/political problem is that people tend to not know that the crypto they are using is bad, and political activists tend to not necessarily be cryptography experts either, so how would they know that they are in danger when everyone around them tells them that the broken crypto they are using is the thing to use?


Your first paragraph seems to be more or less agreeing with what I said - maybe I misunderstand.

> Also, a part of the social/political problem is that people tend to not know that the crypto they are using is bad, and political activists tend to not necessarily be cryptography experts either, so how would they know that they are in danger when everyone around them tells them that the broken crypto they are using is the thing to use?

But there is always going to be a problem with telling people "use our software and you can organise the overthrow of your government without fear". There is no way around the fact that people who are doing that need to understand the risks better than most people do.

(How are you supposed to blockquote text on HN?)


I at least did not intend to agree ;-) - my point is that cryptographic communication tools are kind of useless for activists unless they are used by a large number of people in general, both because otherwise the activists are isolated and cannot actually communicate with anyone using the system (except a few other activists), and because using the tool would make them stick out. That is also why the tools are indirectly somewhat useless for the general public: part of the function of a secure cryptographic communication tool is to give the general public the benefits of the activism by having the general public help hide the activists. So, I guess my point kind of is that "crypto for <x>" doesn't actually work, because using it marks you as <x> - secure crypto has to be used by everybody in order to protect those who need it the most.

As for the fact that people who do risky things need to be a bit more cautious anyway: well, yes, but that does not mean they would not benefit if everyone knew which crypto tools are secure and how to use them. And in contrast to most physical security, there really is not much need to distinguish between "professional" and "end customer" tools - proofing your vault against bombs might be a bit more expensive than proofing it against a burglar, but secure cryptography does not need more expensive computers or anything like that.

And also no clue how to quote here ... ;)


Surely you notice that the same "argument" would apply to you?

Also, suppose some new appliance regularly killed its users due to bad electrical insulation. Would you use the same argument when someone criticizes the manufacturer of that appliance? Is doing things in a way that harms others beyond criticism unless you yourself can do better? Would you not complain if your doctor treated you incompetently unless you could do it better yourself?

Also, your basic premise is flawed: making valid criticism is not "very easy"; it often requires considerable expertise, which in turn takes considerable work to acquire. But that doesn't matter anyhow: criticism either points out actual problems or it doesn't; how much work went into it is completely irrelevant to its validity.


You sure are extracting a lot of things from my comment that I actually didn't say.

I don't think anyone is above criticism, nor do I think truly understanding how a piece of software works is "very easy." All I'm saying is that we see criticism of Cryptocat over and over, yet, here we are, with people still using Cryptocat.

The author wants Cryptocat shut down, but if that happens, what will the people using Cryptocat do? Communicate in plaintext? Isn't it irresponsible (and in line with your own reasoning about putting people in danger) to not present the users with a better alternative first?


I don't see that he wants it shut down - all he wants is to make sure that people are aware of the risk. If your alternatives are to communicate in plaintext-equivalent (that is, broken crypto) while thinking you are protected, or to communicate in plaintext knowing that eavesdropping is possible, the latter is the better alternative! And there are even more alternatives already, like meeting people in person, or using PGP, or any number of things. Even using Cryptocat while knowing that it is not reliable is better. There are tons of options right now, and the worst of them is to use unreliable protection while thinking it is protecting you, hence the criticism.


> All I'm saying is that we see criticism of Cryptocat over and over, yet, here we are, with people still using Cryptocat.

People make poor decisions all the time. The fact that people use Cryptocat might indicate that it has good marketing; it might indicate that it has a good UI; it might indicate that people are responding to network effects in communications. What it doesn't do is contradict the security criticism of Cryptocat. It's irrelevant to the question of whether or not Cryptocat is secure.

> The author wants Cryptocat shut down, but if that happens, what will the people using Cryptocat do? Communicate in plaintext?

They are already effectively communicating in plaintext; it's better for them to have to do so, and be forced to recognise the fact. Someone who lives in an oppressive regime and incorrectly believes his communications secure may very well betray himself; someone who lives in an oppressive regime and believes his communications insecure is less likely to do so.

> Isn't it irresponsible (and in line with your own reasoning about putting people in danger) to not present the users with a better alternative first?

It's more irresponsible to give them a false sense of security, and lead them into deadly danger.

It really is quite simple: at some point, Cryptocat's bad marketing will cost more human beings their lives than good marketing would; at some point, Cryptocat's bad design will cost more human beings their lives than good design would; at some point, Cryptocat's bad implementation will cost more human beings their lives than a good implementation would. Those lives are IMHO far more important than the warm-and-fuzzy convenience of easy-to-use but insecure communications.


> They are already effectively communicating in plaintext

Got anything to back up this statement, or is this what you're inferring from the post and the analysis of the group chat component a while back? Are you saying that the OTR implementation in Cryptocat leaks the plaintext? That would be very serious.

Also, I don't disagree about misleading messages, but take a look at https://crypto.cat/ and tell me if the content on there is misleading compared to the messaging of many other security software companies.


> > They are already effectively communicating in plaintext

> Got anything to back up this statement, or is this what you're inferring from the post and the analysis of the group chat component a while back?

That flaw meant that key-guessing was easy, and with an easily-guessed key even the best-encrypted data becomes plaintext.
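
To make that concrete, here is a toy Python sketch (not Cryptocat's actual code; the generator size and the derive_key function are invented for illustration). If a key is derived from a generator with only a small number of possible states, an attacker can enumerate them all, and the strength of whatever cipher uses the key afterwards is irrelevant:

    # Toy illustration (not Cryptocat's actual code): a key derived from a
    # low-entropy generator can be recovered by brute force, no matter how
    # strong the cipher that later uses it is.
    import hashlib
    import secrets

    SEED_SPACE = 2 ** 16  # pretend the flawed generator has only 65,536 possible states

    def derive_key(seed: int) -> bytes:
        # Hypothetical key derivation: hash the (tiny) seed into a 256-bit key.
        return hashlib.sha256(seed.to_bytes(4, "big")).digest()

    victim_seed = secrets.randbelow(SEED_SPACE)  # "victim" draws from the weak generator
    victim_key = derive_key(victim_seed)

    # The attacker simply enumerates every possible seed.
    for guess in range(SEED_SPACE):
        if derive_key(guess) == victim_key:
            print(f"key recovered after {guess + 1} guesses")
            break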

Given the numerous flaws so far found in Cryptocat and the quality of its code, I wouldn't trust my treasure, freedom or life to it.


Indeed, the PRG was flawed, but IIUC this applied only to the RSA key generation used for group chat, not OTR, which is what Cryptocat is mainly used for.

People reference "numerous flaws" a lot, but it all seems to lead back to the criticism of group chat from a while back. I'm not saying you're wrong--just be careful of the echo chamber.


> The author wants Cryptocat shut down, but if that happens, what will the people using Cryptocat do? Communicate in plaintext? Isn't it irresponsible (and in line with your own reasoning about putting people in danger) to not present the users with a better alternative first?

First of all, I personally think that if you have to use Cryptocat, you might want to exhaust all other options first.

Second of all, if you're already in a compromised situation, do you want to use a compromised communication medium? It doesn't seem sensible.

Lastly, there are alternatives to Cryptocat:

https://otr.cypherpunks.ca/

This is actually created by someone with a clue and isn't full of cutesy icons and faux Amiga designs.


Ironically, OTR is what Cryptocat uses. But for the sake of argument, let's compare it to the OTR plugin for Pidgin, another IM: It's just not as usable. I'm sorry. Look at the very website you linked!

But... it's possible to make it just as easy to use! Or even better: To make a minimal client that accomplishes the same as Pidgin without presenting as large of an attack surface.

Glenn Greenwald nearly missed out on the biggest national security story of the past decade because he couldn't figure out how to get PGP to work. Yes, we can expect people to put a little more effort into protecting themselves if they genuinely believe they're at risk, but we can't expect them to do things they can't do. Not everyone is a techie.


OTR is only used in one-on-one communications in CC; group chat mechanisms are custom, and may now converge towards the mpOTR draft but that's still a pretty big risk.


Your response only works if everyone could plainly see how the appliance works and the manufacturer was openly looking for contributions to the design.

Of course, if that was the case, the edge of your response is blunted, because then responsibility for that failure is more distributed.


Crypto can't be easy and safe. You need to check fingerprints (attempts to make this "friendly" with images and the like have tended to have vulnerabilities). Beyond that, what's your issue with OTR? A separate network has the advantage of not having to do the negotiation step of OTR-over-jabber/AIM/etc, but it will have bootstrapping problems (or maybe not; maybe a well-designed app that just used OTR all the time, on a new network, would be enough).
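
For anyone unfamiliar: "checking fingerprints" just means comparing a short hash of the other side's public key over a channel you already trust, such as in person or over the phone. Here is a minimal, hypothetical Python sketch of the idea; real OTR defines its own fingerprint format, so this is only an illustration of the workflow:

    # Minimal sketch of out-of-band fingerprint verification (illustrative
    # format, not OTR's actual one). The fingerprint is compared over a
    # channel you already trust, e.g. in person or over the phone.
    import hashlib

    def fingerprint(public_key: bytes) -> str:
        digest = hashlib.sha256(public_key).hexdigest().upper()
        # Group into 4-character blocks so a human can read it aloud.
        return " ".join(digest[i:i + 4] for i in range(0, 40, 4))

    received_key = b"...peer's public key bytes as received over the network..."
    print("Read this to your contact out of band:")
    print(fingerprint(received_key))
    # If the value they read back differs, the session may be intercepted.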


> Crypto can't be easy and safe.

It's not that black and white. Yes, usability tends to carry with it some measure of sacrifice in security, but Skype used to have a lot more security, and was as easy to use as it is today. They're not absolutes. You can have "quite usable and very secure" and "very usable and quite secure", things none of the apps we're discussing are.

I don't have any big concerns with OTR (aside from the inability to do offline messaging,) just the implementations, mainly OTR in Adium. OTR in Pidgin appears to be decent, but hasn't received a lot of review, as far as I know, and Pidgin has its own problems/provides its own attack surface.


Skype was never meaningfully secure; the code wasn't open and you had to trust their servers to handle authentication. (And IIRC the best-guess reverse engineering of the crypto looked to be RC4, which while not outright broken is not a massively secure cipher).


I didn't say it had a satisfactory level of security, just that it was a lot more secure than it is now, and still had the same level of usability that it does today.


I think it's misleading to say "more secure". It wasn't secure and I can't think of any realistic attack scenario that it would have been secure against (but now isn't); for the NSA to intercept everyone's messages while it was more distributed would have required releasing a version update and waiting for the supernodes to pick it up, but that's not a high bar.


> I think it's misleading to say "more secure".

I think it's misleading to insinuate that security is an absolute, but, again, you're missing my point. I'm not telling anyone to use Skype. I'm saying that the added security didn't come at a cost to usability.

> for the NSA to intercept everyone's messages while it was more distributed would have required releasing a version update and waiting for the supernodes to pick it up, but that's not a high bar.

I would say that your bar is very high, then. This would defeat nearly everything that exists and is in use today.

Also, there's probably no one in the world that can actually defend themselves against the NSA if the NSA is determined to know what they, specifically, are doing.


> I'm not telling anyone to use Skype. I'm saying that the added security didn't come at a cost to usability.

What added security? I didn't say it's impossible to be easy and as safe as older versions of skype (which is to say, not very). I said it's impossible to be easy and safe. (In particular, you need some kind of key fingerprint checking, and no-one's found an effective, user-friendly way to do that).

> I would say that your bar is very high, then. This would defeat nearly everything that exists and is in use today.

It wouldn't defeat GPG, or OTR-based systems used in reasonably popular open-source clients.

> Also, there's probably no one in the world that can actually defend themselves against the NSA if the NSA is determined to know what they, specifically, are doing.

Sure. But let's look at a realistic threat model, and at what's actually happened: the NSA did intercept all communications channels run by individual providers, including skype. The NSA was prepared to demand these providers deploy new backdoors into software they distributed that didn't currently have them, as we saw with lavabit and RSA, and when lavabit refused they were shut down. The NSA were not terribly effective at compromising open, respected standards (they did succeed in getting a broken algorithm standardized, but the main reason this wasn't noticed is that hardly anyone was using it, and even then questions were being raised in the crypto community), and did not compromise GPG or similar open-source projects. Nor did they tap users of those systems indirectly by compromising their email clients or similar. Observe that Snowden, with inside knowledge, chose to use PGP to communicate with journalists, and this did in fact provide sufficient security.

Meaningful security is possible. Skype isn't it.


> It wouldn't defeat GPG, or OTR-based systems used in reasonably popular open-source clients.

Until you consider where GPG and OTR are used, e.g. Enigmail or Pidgin, addons or clients which both autoupdate or ask to be updated.

There are very, very, very few pieces of software that either don't need to be updated, or can't trivially be backdoored by the vendor itself through updates.

You keep going back to "Skype didn't have security"--and I can't tell if you're trolling, or what--but you can't seriously harp on it for auto-updating. So does Chrome, and it's lauded for auto-updates (the downside of not updating is obviously that security issues aren't fixed, arguably a much bigger risk than the vendor backdooring the software in later updates.)


> Until you consider where GPG and OTR are used, e.g. Enigmail or Pidgin, addons or clients which both autoupdate or ask to be updated.

I can't speak to those; I use KMail and Kopete, neither of which auto-updates. My OS does ask to update those packages, but it will only do so with my explicit intervention, there's a code signing process in place (and any bad updates would be traceable to individuals rather than an institution), and the people who run it are based outside the US.


https://heml.is/ should be on the way. I've been keeping an eye on it. It'll be interesting to see what the security community thinks of it after release.

There is also TextSecure (https://whispersystems.org/), but it requires text messaging.


Definitely holding off until the open source release (and hopefully it won't just be dumps of old versions like Silent Circle's.)

Agree that TextSecure and Redphone are great tools, albeit in different categories, and as far as I can tell their implementations are sound.


Nadim's ego has led him down a path where he believes that what he is doing is infallible and his critics do not deserve any level of praise--and it is reinforced by those who do not know any better than he does. You can see this in any project or startup, but in the case of Cryptocat, we have a situation where lives are potentially at risk and there is a likelihood that someone has already been compromised due to his actions.

The "cutesy" icons and flashy colours that Cryptocat displays are really nothing more than lipstick on a pig.


As a passive observer of all cryptography discussions on HN, I can't help but think if security researchers spent as much time on creating usable, secure software as they did in proving that others' implementations were flawed we'd be in a much better place.

As a user, I just want to be able to message another person, over the internet without having to worry about setting up plugins or setting up any kind of keys. I want to add them to my friend list, click their name, send them a message and be comfortable in the fact that my communication cannot be intercepted.


As another passive observer, I am grateful for all of the time that security researchers spend on proving that many cryptographic implementations found in the wild are flawed. If they did not do this, all implementations would be closed source ROT13 with a $50k pricetag.

I'm still waiting for simple, usable crypto - but I'm not willing to settle for evidence-free, warm assurances of safety from borderline incompetents in the field while I wait for it.

>I can't help but think if security researchers spent as much time on creating usable, secure software as they did in proving that others' implementations were flawed we'd be in a much better place.

Can this comment be part of the HN crypto thread drinking game? It's in every thread x10. I can't help but think if commenters spent as much time learning how to use good crypto as complaining about the time researchers spend picking apart bad crypto, all of their issues with the current implementations would disappear.


> I can't help but think if commenters spent as much time learning how to use good crypto as complaining about the time researchers spend picking apart bad crypto, all of their issues with the current implementations would disappear.

Not really, because each commenter here interacts with tens if not hundreds of people who have little to no chance of learning to use good crypto.


Trying to evaluate and attack systems is a vital part of building secure systems. Many of these researchers do work on their own systems (think tor, redphone, ...) while also engaging with the crypto community at large to find flaws and improve methodology.


> As a user, I just want to be able to message another person, over the internet without having to worry about setting up plugins or setting up any kind of keys. I want to add them to my friend list, click their name, send them a message and be comfortable in the fact that my communication cannot be intercepted.

The problem isn't that nobody understands that, the problem is that that's a very difficult (arguably impossible) problem to solve.


I thought tptacek's response to my question here was interesting, WRT crypto dick waving:

https://news.ycombinator.com/item?id=5776111

Edit: another thought, looking at the incentives involved:

* If you, as a security guy, spend your time breaking stuff, you win some points for that if you score a 'hit'. If you don't, well no one is really paying attention. Pretty much all systems - even those written by really bright guys like cperciva - have flaws, so if you look enough, you'll probably find some.

* If you write your own system, you attract the attention of all the people out to break it. And eventually they probably will find some problem and write Comic Book Guy style posts about how the system is badly flawed. And your reputation will suffer.


Ironically, tptacek spends a large amount of time criticizing crypto, but a lot of the time it's useful criticism (i.e. actually provides some kind of a solution.)

I have nothing against criticism. If there's a flaw in something, let's talk about it. But I don't care for "I know how to do it better," and nothing more.


tptacek's comments are usually interesting and insightful, and he seems like a good person who does his best to interact in meaningful ways here, rather than simply channeling the Comic Book Guy as is the custom in crypto dick waving forums.


Would you rather use a piece of communications software which purported to be cryptographically secure, but wasn't, and not know it because no security researchers spent any effort attempting to prove that its crypto implementation was flawed?


> Would you rather use a piece of communications software which purported to be cryptographically secure

...than communicate in plain text? Yes.

Where's the alternative? We can have Cryptocat shut down, which is what the author is suggesting, but then what are we (and by that I really mean people who currently use Cryptocat) going to do?


So, let me put that a bit more clearly:

You would prefer to communicate in plaintext-equivalent, where you think nobody can read it even though in fact everybody can, over communicating in plaintext, where you know everybody can read it?


I wouldn't prefer that, but I'm also not as convinced that Cryptocat is as "clearly broken" (i.e. plaintext is trivially recoverable) in its current state as a lot of people on here are. Most of the attacks that I've seen so far were against the group chat implementation, which, granted, is significant, but not against the primary component, the OTR chat.

I think it is somewhat naive to believe that any mechanism other than a one-time pad will absolutely keep your communications safe, and that it's a little dangerous to insinuate that Cryptocat leaks information about the plaintext but X or Y doesn't.
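
For reference, the one-time pad mentioned above is the one scheme with a proof of perfect secrecy, and it also shows why "perfect" is not "practical": the key must be truly random, as long as the message itself, and never reused. A toy sketch in Python:

    # Toy one-time pad: perfectly secret only if the key is truly random,
    # exactly as long as the message, kept secret, and never reused.
    import secrets

    def otp(message: bytes, key: bytes) -> bytes:
        assert len(key) == len(message), "key must be as long as the message"
        return bytes(m ^ k for m, k in zip(message, key))

    message = b"meet at the usual place"
    key = secrets.token_bytes(len(message))  # may never be reused for another message
    ciphertext = otp(message, key)
    assert otp(ciphertext, key) == message   # decryption is the same XOR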


Where is the author suggesting that Cryptocat be shut down?

He makes the very valid point that advertising an insecure product as secure to people who need security but don't understand it is very wrong. He's asking that the advertising be corrected.

We're not worrying about your credit card getting ripped off - this software claims to solve life or death problems.


He wants it taken off the app stores.

I don't disagree with the point that you shouldn't pretend, but literally no one is in a position to make absolute promises like that, yet everyone's doing it.

I'm not saying it's okay, but let's keep in mind that they say on their frontpage (http://crypto.cat) that you shouldn't trust it with your life. That's better than most security software marketing.


Yeah, but they're not saying it on the app stores, hence wanting the text changed or the app removed.


I have no answer. I don't think anyone can argue that the research is not valuable, but what would be more valuable is a peer-reviewed piece of software created by experts in this field, so we would not even be having this discussion (re: Telegram, Cryptocat, etc.)


OTR is as close to this as you'll get for IM at the moment, and even the most popular implementations like OTR for Pidgin (or Cryptocat!) haven't been vetted very thoroughly (not to mention Pidgin itself has a fairly spotty security track record.)

The people at heml.is are looking at implementing something like TextSecure as an IM, and its interface does look quite attractive, but it's a no-go as long as they're not open source (especially given that none of its authors are cryptographers), and thus can't receive a thorough review. (Actually getting qualified people to perform a review once it's open source is a challenge in itself, but perhaps people will spend more of their energy on that than writing blog posts saying they could do it better.)


But that would require implementing and marketing a significant piece of software. The users who are risking their lives using Horribly Insecure Piece of Software X are important and we should save them.. but they're not that important.

/s


Guy creates a blog and his very first post is to discourage someone truly trying to innovate in the cryptography space (though admittedly more in the usability aspects).

After listening to Glenn Greenwald at the CCC, it was quite clear that cryptography that is easier to use than PGP is really needed in this world (he almost lost the Snowden story because of it). I think that Nadim needs to be encouraged. Sure, point out any flaws, but aim for constructive feedback.

The points here centre on it being "not good enough". This is a bit of a chicken-and-egg problem and isn't really helpful.


Don't implement your own crypto. Better people than you have tried and failed. Everyone should know this by now. If you can innovate on the usability, that's great, and we really do need that - but build it on top of a well known, peer-reviewed protocol like OpenPGP. It's not like it's even any harder than rolling your own.
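
As a purely illustrative example of building on an existing, reviewed implementation rather than rolling your own, a client can shell out to GnuPG and never touch a primitive directly. The sketch below assumes `gpg` is installed and on the PATH, and that the recipient's public key is already imported and trusted; error handling is omitted:

    # Sketch: lean on GnuPG for the cryptography instead of implementing any of it.
    # Assumes `gpg` is on the PATH and the recipient's public key is already
    # imported and trusted; error handling is omitted.
    import subprocess

    def pgp_encrypt(plaintext: bytes, recipient: str) -> bytes:
        """Return an ASCII-armored OpenPGP message encrypted to `recipient`."""
        result = subprocess.run(
            ["gpg", "--encrypt", "--armor", "--recipient", recipient],
            input=plaintext,
            capture_output=True,
            check=True,
        )
        return result.stdout

    print(pgp_encrypt(b"hello", "someone@example.org").decode())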


Definitely. I'm really interested in the progress of OpenPGP.js. It could possibly replace a lot of the sketchier parts of Cryptocat.


Even if it does, it still won't help. Crypto in the browser is like playing soccer in a minefield: either you don't move or you lose a leg. Either way, your game is hosed.

The issues are, to put it mildly, insurmountable. The environment is simply too toxic to trust: standard Web security flaws, timing attacks (what happens when one context can detect the timing of another? Remember, the code is slow, so your resolution doesn't have to be good), inadequate random number generators, an inability to securely manage memory (you don't want key material floating around), and so on.

I'd rather trust Bob's Discount Car And Certificate Authority than JS crypto.
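
On the inadequate-RNG point specifically, the classic mistake is the same in every language: drawing key material from a general-purpose PRNG (Math.random in the browser) instead of the platform CSPRNG (window.crypto.getRandomValues). A Python analogue of the same distinction:

    # The same mistake exists in every language: key material must come from
    # the OS CSPRNG, not from a general-purpose, seedable PRNG.
    import random
    import secrets

    # Wrong for keys: the Mersenne Twister is statistically fine but predictable
    # once an attacker sees enough of its output (or learns the seed).
    weak_key = bytes(random.getrandbits(8) for _ in range(32))

    # Right: draw key material from the operating system's CSPRNG.
    strong_key = secrets.token_bytes(32)

    print(weak_key.hex())
    print(strong_key.hex())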


Unfortunately, after the recent revelations this is how I feel about computers in general :)


Well a blog is probably going to have a first article :\

I agree that the "world" could benefit from an easier-to-use cryptography product than PGP (even though I'm fine with PGP), and I think that this post is valid criticism.

Disclaimer: Not a cryptography expert in any way, nor annoyed by the fact that cryptography is hard and will probably benefit from processes like peer review.


I understand the problem here: don't experiment with crypto with your users' safety in the balance, claiming all the while that they're safe. The sad reality is that none of his users will ever know that there's a problem until it's too late.

Slightly off-topic, but this is one of those areas that bugs the hell out of me, and I don't know the solution. On one hand, security and cryptography people tell lawmakers and those in authority that crypto is math, anyone can do it, it's silly to try to regulate it, etc. On the other hand, these same experts tell the "anyones" of the world not to implement their own crypto, mistakes are easy to make, correct implementations are hard ...

Here's the kicker for me: If you absolutely should never release another piece of software that might have bugs that could endanger someone's life, then you'll never release another piece of software. You can become the greatest cryptographic implementor on the planet, implement to the current state of the art, and, in a couple years, still have your work completely obliterated by a new attack against a cryptosystem that you are using correctly.


I don't think your two examples are contradictory. It is silly to try to regulate export of strong crypto, and it is difficult to get crypto right.


Note that this article simply shits all over Cryptocat without giving any concrete examples: "has had myriad errors in implementation" and "After being berated by dozens, repeatedly, because of the myriad flaws". I kept waiting for Paul to substantiate his criticism or at the very least link to some of the implementation flaws he keeps trumpeting, but he doesn't. Pointing out that Cryptocat has tried multiple encryption schemes isn't really evidence in itself, either.

For all I know this guy could be totally right about Cryptocat, but this is absolutely not the way to make this kind of statement. It isn't well-reasoned and it sure as shit isn't informative.


A nice quote from Phil Zimmermann, from a comment on a post by Schneier which was posted in a comment on this post:

"I remember a conversation with Brian Snow, a highly placed senior cryptographer with the NSA. He said he would never trust an encryption algorithm designed by someone who had not earned their bones by first spending a lot of time cracking codes. That did make a lot of sense. I observed that practically no one in the commercial world of cryptography qualified under this criterion. "Yes", he said with a self assured smile, "And that makes our job at NSA so much easier." A chilling thought. I didn't qualify either. "

https://www.schneier.com/blog/archives/2011/04/schneiers_law...

edit:

By the way, I think that Jeffrey Paul has a relevant point, and I think it deserves to be taken into account. I understand his words can hurt Nadim Kobeissi; nevertheless, from my point of view, they carry no such intent.


[deleted]


I voted you up due to sentiment, and I don't mean to sound like a grumpy member of the Cabal of Crypto Criticizers, but what's the point in Nadim developing the application if he won't respond well to honest peer review and wakes up every day and decides to fuck around with the basic structure of the application, and not only that, but makes fatal flaws when doing so?

I'm all for learning and experimentation, but not when some dude from Syria is literally Tweeting you and saying, "Hey, thanks!", which is the point of the article. I think end-point spying is probably bad enough at this point that we really don't need a broken protocol too. I don't really see the point in constantly changing out crypto primitives like he does, and it has introduced major security vulnerabilities in the past. He should find something, stick with it and then get it audited and reviewed, and then I think fewer people would bitch.

I support the goal of making crypto more usable.


I think you responded to my comment that I somehow managed to delete, but re-posted here: https://news.ycombinator.com/item?id=6990738

I agree with you. I don't think Cryptocat or how the implementation is handled is perfect by any means, but people have been complaining about that for a long time, and we still don't have good alternatives. What I'm saying is that if somebody makes something that is technically sound and is as attractive for regular users, in Syria and elsewhere, then this whole debate becomes relatively moot.

If Nadim is as unwilling to cooperate as you're implying (which I don't think is totally true, but I will grant you that the underlying constructions for the group chat component were switched out recklessly and ignorantly), then surely complaining about it in blog posts will be very ineffective.


I don't think he's unwilling to cooperate, and I don't mean to pick on the guy, because he's basically become a whipping boy. I think the reason he reacts so adversely to the criticism is the obvious use of him by bloggers and "crypto pundits" to bolster themselves. It's not constructive, and I don't wish to encourage that.

That said, security issues are still security issues, and my tl;dr as someone no one on this site cares about is for Nadim to just stick with a set of crypto primitives and protocol design that work, fix the problems that arise and stay there until he's more confident in what he's doing.


It would be useful to get the opinion of the Crypto Cabal on what good, safe software is accessible to the general public and deserves to be promoted and well marketed.

Any suggestions?


I'm not a member of any "Crypto Cabal", and I use the term sardonically, directed at the people who come out of the woodwork to pimp/gratify themselves and their businesses by posting bullshit about crypto, especially on HN (e.g., the "USE BCRYPT USE BCRYPT USE BCRYPT" guy who claimed that the problem with an RSA exponent of 1 was that "1 is a prime number" (?), or ironic articles of the "I JUST LEARNED ABOUT [crypto topic] SO DON'T EVEN THINK ABOUT ENCRYPTION" variety).

My opinion, as someone who is not important, is that most crypto software is bad, and most software that is fun to use has bad crypto. The Silent Circle stuff looks good, as does the Whisper Systems stuff, and I personally use Pidgin + OTR, which is crap from a UI standpoint.

I totally understand the design/UI motivations behind Cryptocat, but IMO Nadim needs to stick with a protocol design and crypto primitives that work, fix any flaws and then leave it alone until he's more comfortable (perhaps he's done that already).


The best general answer to this question is probably "Use Tails: https://tails.boum.org/ ." Especially for activists in Syria/regular users who have a genuine concern for their lives.

(Tails uses Pidgin with the OTR plugin.)


Things that use OpenPGP or libOTR. At the level of an organization where you can run your own CA, X.509 might be easier (it's integrated into Outlook, IIRC).

I don't know what the accessible frontends are and no doubt there's work to be done there, but the basic primitives are a solved problem, and I'm pretty sure a better frontend on top of either would be very welcome.


Considered Harmful considered harmful, please use the active voice.


"Considered Harmful" is supposed to be humorous, please do not attempt humour on HN.


See?



