With those in mind, and that Signal already exists, it's worth considering if there's actually something for the DoD to gain by sinking millions of dollars and years of time into developing and maintaining something new that might not be an improvement. The answer might legitimately be "No, that's not worth it".
The problem with the military using an off-the-shelf system like Signal isn't the tangible features (encryption, usable UX). The problem is one of control. What happens when the government gets into a dispute with Signal (say, over a warrantless wiretap) and Signal decides in protest that it will no longer have any DoD-affiliated customers? What happens when, through a series of shell companies and legal loopholes, a controlling stake in Signal ends up in Chinese or Russian hands?
From the PoV of the DoD, owning the system from end to end is a feature they shouldn't compromise on, even if the other features suffer and it costs more than it otherwise would.
(I’d add here that I’m a big fan and user of matrix, while being aware of its limitations as a marketed product.)
Matrix has shipped some high-profile products to clients like national governments, but there's little hope for traction with the general public when the riot.im signup process is so unpolished, and when the most common entry point into the Matrix system brands itself with a name other than Matrix.
Signal uses the Signal website, the Signal app for iOS, and the Signal app for desktop, and refers to the Signal protocol in all its literature. Matrix could take a leaf out of their book!
(Perhaps licensing the trademark would be a good start?: Riot -> Matrix Riot, Synapse -> Matrix Synapse, and move the riot.im functions into a domain that has matrix in it?)
Perhaps, but those are risks I expect the DoD has considered. As far as I know, Signal is open source and no commercial arrangement is required to make use of it, putting the DoD in a position to respond in a technical capacity should there be a conflict.
In legal terms, I believe the DoD could stop any acquisition on national security grounds. The US government, like many national governments, reserves that power. Signal is developed and run by a 501(c)(3), making this a somewhat unlikely series of events.
Again, those are great points. I'm just not sure they wind up being any risks the DoD can't readily control.
Lots of things are contracted out but still run in DoD data centers with government oversight.
Me = Former DoD software engineer of 12 years.
But even so, there are a shit ton of DoD software people.
It’s called the Defense Information Systems Agency. And they’re not even the biggest in that space.
I worked there for about five years.
"Oh don't worry about TLS interception for the office firewall, we can break RSA anyway" is not how a bureaucracy thinks. Come on.
Conspiratorial bureaucracies operate that way. Trust no one, allow nothing that can be used to undermine...
Read Memoirs Found in a Bathtub.
'we cannot SURPASS the existing (and free) solution already in the market'
This seems more like an endorsement of the level of integrity and robustness of the product and a recognition that taxpayer dollars could be better spent not re-inventing the wheel.
Additionally, off-the-shelf products mean they could spend far fewer resources ensuring the product continues to meet and exceed the necessary standards, rather than building a new thing.
> Squad: the #1 small unit combat App
>What's New in 3.14: Enhanced Scout/Recon route setting- select up to 7 types of perimeter security
>In-App Purchase: Unlimited "Friendlies" List - avoid unwanted friendly fire incidents.
And accessories like cases and cables from Kagwerks: https://kagwerks.com/collections/smartphone-cases / https://kagwerks.com/collections/cables/products/usb-3-0-to-... (aside: their training courses are A+)
This directive, however, likely has nothing to do with combat. It's more likely about trying to maintain some semblance of OpSec when things that shouldn't be sent over these cell phones end up being sent over them, and about keeping every other private from texting mom/dad/gf things that will then be immediately leaked, by reminding them of the "seriousness" of the situation.
Why don't they have some MDM like Intune or Knox, with the apps they need pre-installed and the phone locked down to prevent any other apps from being installed?
Signal already offers encrypted backups with a password you see one time when setting up. I imagine you could easily modify that to lock the app down with the Android device administrator shenanigans, and then periodically upload incremental backups.
The messages would of course remain encrypted, but the key would be in the IT administration's hands.
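A toy sketch of that escrow idea (this is not Signal's actual backup format; the SHA-256 counter keystream here is demo-grade, and `admin_key` stands in for whatever key IT would hold):

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Demo-grade CTR-style keystream from SHA-256(key || nonce || counter).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_chunk(admin_key: bytes, chunk: bytes):
    # Each incremental backup chunk gets a fresh nonce; only IT's key decrypts it.
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(chunk, keystream(admin_key, nonce, len(chunk))))
    return nonce, ct

def decrypt_chunk(admin_key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ct, keystream(admin_key, nonce, len(ct))))

admin_key = secrets.token_bytes(32)          # held by IT administration
nonce, ct = encrypt_chunk(admin_key, b"incremental backup chunk")
assert decrypt_chunk(admin_key, nonce, ct) == b"incremental backup chunk"
```

The phone uploads `(nonce, ct)` pairs; the device itself never needs to be able to decrypt old chunks.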
I believe the end-to-end part of Signal is very interesting if you consider the whole NETWORK to be hostile, but that both ENDS are friendly once authenticated.
I can see a very FoA-friendly implementation of this.
It always seemed kind of stupid to me that they would spend billions of dollars reinventing the wheel. It's like saying "Physics is good and we could add some more research and engineering for our case, but no, physics is open source, so let's make our own physics".
Not exactly the same but ...
But I'll disagree with you (while agreeing with you) on this
> Everyone here is speaking like there HAS to be a backdoor
The DoD __does__ have a backdoor. It is the cellphone, not Signal. There doesn't have to be a backdoor in Signal for them to have full access to these communications.
How? By compromising the baseband, and then pivoting to compromise the AP? But what about IOMMU? Or is that pwnable?
This is the huge advantage that Signal has over mail, the default mode in Telegram, and pretty much anything else there is: it doesn't matter if the NSA, FSB, MI5, Mossad, Google, and Facebook all have root on a server that all the traffic passes through. To the best of our knowledge - as long as they don't compromise one of the endpoints - the only thing they'll get is metadata, and the only thing they can do is disrupt the service.
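As a toy illustration of that threat model (a one-time pad standing in for Signal's real Double Ratchet; the "server" only ever sees ciphertext):

```python
import secrets

def xor_otp(data: bytes, pad: bytes) -> bytes:
    # One-time pad: XOR with a random pad shared only by the two endpoints.
    # Information-theoretically secure if the pad is random and never reused.
    assert len(pad) >= len(data)
    return bytes(a ^ b for a, b in zip(data, pad))

pad = secrets.token_bytes(64)      # shared out of band between the endpoints
msg = b"meet at 0600"
ciphertext = xor_otp(msg, pad)     # this is all a hostile relay ever handles

assert xor_otp(ciphertext, pad) == msg   # the receiving endpoint recovers it
```

A server with root can drop or delay `ciphertext` and log who sent it when (metadata), but without an endpoint's pad it learns nothing about the content.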
Also, Wickr is open source: https://github.com/WickrInc
We fly troops on Delta and American too. I bet a few even flew in Boeing 737 Maxes. Perfect? No. Excellent decision among some number of possibly better decisions, that could have been made at the time? Yes.
Unless it's classified - and it would be very easy to make the case that all of the communication sent on Army-issued cell phones would be Confidential at a minimum. The important stuff all gets copied into OPORDs and will be available for declassification someday.
E2E encryption is tight until you hack a client or system on either side, but at least it makes it nearly impossible to attack the middle.
We aren't. Exploits are found in decades-old software that we once believed was relatively well vetted, and in Signal's case their desktop app, for example, had a rather nasty XSS about 14 months ago: https://www.cvedetails.com/cve/CVE-2018-11101/
That doesn't mean there isn't a huge difference between Signal and everyone else:
WhatsApp keeps uploading unencrypted backups to places known to be within reach of the NSA, so if the NSA is in your threat model, forget about WhatsApp. They have also been sloppy when it comes to accepting remote client swaps, and they've had at least one nasty vulnerability.
Telegram, AFAIK, has managed to annoy most leading cryptographers, so it seems they don't care to put much effort into verifying it.
so $700B in defense spending can't match some motivated, talented FLOSS devs? that's rich.
With the exception of:
The motto seems to be: "Open Source is BAD! How are we going to get support?! Let's just buy a product or solution from a vendor." Even though they have people/teams who were hired as developers. I have so many horror stories, it's not even funny.
DoD mindset re: digital acquisitions is changing.
- KR dudes are great and will probably unfuck the AF's tech if they have enough wiggle room and command support against Lockheed and co. Hiring at GS-12s, few weeks approval, quick clearances, other unheard of comp strategies to get good civ talent.
- Army futures command is ..... near retired E9s and O6s in cargo shorts and underarmour polos, hiding out in the Austin Wework
KR stood up by working closely with Pivotal who supplied both the Pivots to pair program with the comparatively inexperienced AF devs as well as the deployment platform.
While the means are debatable, the ends that units supplying devs to KR had to face were not. Those units got back programmers completely reliant on Pivotal Cloud Foundry. You would get devs who had no concept of what happens to their code after they run `cf push`, and the units had to face the reality that their devs were ineffective without PCF, which cost tens of millions to purchase and maintain by a team of Pivotal engineers.
Obviously Pivotal is a company that exists to make a profit, but smaller units that supplied devs to KR largely felt taken advantage of. After that, things like SpaceCamp, LevelUp, and Platform1 - very similar to KR, just without the heavy reliance on Pivotal or their products - started popping up left and right.
Now that it's gone on for so long, even leadership in KR is getting pressured to actually produce a production-ready app for all the money that's been dumped in. They have plenty of MVPs, but AFAIK nothing to big AF's satisfaction.
At least from a lowly enlisted programmer's perspective, you get to live in Boston in civilian clothes for 6 months.
While we do have Spring applications running on PCF, we have a significant and growing number of services and applications that are not built on PCF, and PCF isn't required to build a production application and be CATO compliant. In fact, almost all of the applications and services in my branch are not.
Not really accurate anymore. Maybe it was true a decade ago, but when I worked in the DoD space almost every single project was attempting to use only FOSS where possible and to move off of companies' proprietary stuff.
Unfortunately most of the projects I've seen ended up failing and Palantir usually sweeps in and gets a contract because the users really like it.
There is a huge appetite inside the DoD for FOSS; it's just mortally wounded by others who want systems like Palantir, or by simple incompetence.
In my experience, anyway.
I'd love to see some data on FOSS usage across the DoD but that type of audit is likely impossible.
There are also various parties working hard to promote more OSS use in DoD, two examples: GAO, OSSI. Not to mention the DoD is mandated to release at least twenty percent of its own custom software as open source. Like anything gov, it's a slow process.
Also, the requirement to release at least 20 percent of custom-developed code as OSS is not a sole DoD directive.
They are just enforcing the Federal Source Code Policy from: https://sourcecode.cio.gov/OSS/
Not in my experience. The problem is not technical talent, it's culture.
That's not to say it's impossible to solve these issues, only that a culture of top-down "get it done" does not tend to mesh well with the rigid discipline needed to make a secure product. Ever had to say "no" to a general?
On any system built in house, there will be management feature requests that force security compromises. For example, "We must archive the data of this comms network - we have a legal requirement to do that!!" or "We need the ability to access users' data for internal/external investigations", etc.
Solving these issues in a secure manner is incredibly hard, and IME it can be incredibly difficult to explain to someone non-technical why "just do X" will harm the security posture. More often than not, a developer (with their salary paid by the boss) will be forced into "just doing X" by someone who does not truly understand how much that compromises the system. Boss will be happy, thinking they "pushed it through" and non-crypto developer will be happy thinking "it has some authentication applied so must be secure" while cryptographer will be largely ignored or misunderstood. (Note: not a cryptographer, but I am an expert in other domains and have worked with enough to see their pain first hand)
Most non-cryptographers on the project start to get confused, typically thinking all of the compromises are OK because they only open doors for the DoD and that is who the product is for, without having the training or knowledge to realize how problematic this thinking can be. IMO - listen to your cryptographer, you hired them for a reason.
I don't think the problem is "top talent". I think the problem is how people think about software, and therefore how they budget for and manage it. And I don't think the problem is unique to government. I've heard about plenty of private sector giant-project boondoggles where enormous sums were wasted in similar fashions. It's just that those don't make the newspapers, but instead get gossiped about by tech people over beers.
Think about it: if you want to implement an echo server that can't be cracked to view previous things it echoed, you could probably do it quite well. Now imagine I bring a billion dollars, and some more requirements, and I hire a few hundred people to work with you on this, and some of them are security consultants, and some are compliance guys who want the echoed text saved for compliance, and some others are privacy experts who want the echo text inspected by DLP software so that you aren't sending PII.
I rate former you more likely to succeed than latter you. In the first problem, it's technical. In the second, it's organizational.
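For what it's worth, the "simple version" really is small. A minimal sketch of such an echo server in Python (stdlib only; names are illustrative):

```python
import socket
import threading

def run_echo_server(host="127.0.0.1"):
    # Minimal TCP echo server: send back whatever one client sends.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))              # port 0: let the OS pick a free port
    srv.listen()

    def serve():
        conn, _addr = srv.accept()
        with conn:
            while data := conn.recv(4096):
                conn.sendall(data)   # echo the bytes straight back

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()         # (host, actual_port)

host, port = run_echo_server()
with socket.create_connection((host, port)) as client:
    client.sendall(b"hello")
    print(client.recv(4096))         # b'hello'
```

Everything the billion-dollar version adds - archival, DLP inspection, audit trails - lands on top of those twenty lines, and that's where the organizational problem lives.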
There are not a lot of people who can do crypto well, and they tend to be too weird to make good government or megacorp employees. The NSA will have people capable of doing it, but they undoubtedly have more important things to do than write a chat app.
They could hand a billion dollars to Microsoft or Amazon, who would happily take it, but would that produce a better result? Or any results at all?
Or simply do not want to work for the government or a megacorp.
If the NSA can make it secure against themselves, that's not a bad start.
A VPN will protect data leakage from other applications, and potentially help out a little against DPI and dns logs.
> So why use these apps?
Because (theoretically) Signal/Wickr cannot read your messages. Meanwhile, Facebook has access to all Facebook Messenger conversations, even if you used a VPN.
The VPN provides security for the LTE or Wifi connection. The network that the VPN terminates on may not be a "trusted" network either. Think of an office LAN. You usually don't require a VPN to join the office Wifi or wired ethernet. The LAN provides a level of trust sufficient to access the printer or something similar, but you still secure a connection to your server with SSH.
But they probably don't fully trust their vpn, good spycraft assumes that the adversary has (partial) access to your systems.
The problem is that the phone is likely used for other things too, and you don't want [insert important app here] to leak information that could've been protected with a VPN. Also, things like DNS logs and the IP connection will still be revealed without a VPN, so an attacker will be able to tell that you're using Signal and when you're using it, and with DPI they might learn when you send/receive a message. Collect this information across a wide enough network, compare the times, and you may be able to tell who is messaging whom and when.
With a VPN there's no risk of other apps leaking data; it hides your DNS queries and the end connection. This means it will be difficult to tell you're using Signal at all, and it should make DPI at least more difficult.
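A crude sketch of that timing-correlation idea (made-up timestamps; real traffic analysis is far more sophisticated):

```python
def correlation(send_times, recv_times, window=1.0):
    # Fraction of A's send events followed by a receive event at B within
    # `window` seconds -- a crude "A is probably talking to B" score.
    hits = sum(any(0 <= r - s <= window for r in recv_times) for s in send_times)
    return hits / len(send_times)

alice_sends = [10.0, 42.5, 73.2, 120.8]
bob_recvs   = [10.3, 42.9, 73.5, 121.1]   # consistently ~0.3 s after Alice
carol_recvs = [5.0, 60.0, 99.0, 140.0]    # unrelated traffic

print(correlation(alice_sends, bob_recvs))    # 1.0
print(correlation(alice_sends, carol_recvs))  # 0.0
```

A VPN doesn't eliminate this class of attack, but by mixing all the apps' traffic into one tunnel it denies the observer the per-app timestamps the score depends on.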
> But they probably don't fully trust their vpn
The VPN isn't necessarily in charge of keeping the messages themselves secure; it's there to prevent other data leaks. The messaging application is responsible for that part, and they had to use something like Signal, because (theoretically) Signal cannot read their messages. Slack/MS Teams/Facebook Messenger/etc. still have access to your messages regardless of any VPN.
There is a reason the DoD shells out millions to Microsoft to provide XP support past EOL.
The Military might not care about people knowing they are talking, just what they are talking about. In that case, using a VPN makes sense because they can tunnel traffic to a server that is exposed to a private network and use a central chat server there or something. All traffic to that central server would be encrypted.
You'd have to use a technology like TOR to get closer to real anonymous communications on the internet (although even that isn't perfect).
Good, though most AD have been using WhatsApp anyway for comms back home.
IMHO, Keybase is adequate to protect against spying from ISPs, ad agencies, journalists, etc., but doesn't offer resistance against a formidable foe like a state actor - but then, neither do most of the devices on which Signal is run.
Why would you want something that had to send traffic outside of the VPN?
EMP, signal jamming, GPS spoofing etc.
It is an all inclusive term. So this includes things like radio.
At first glance it seems like a reasonable choice. US based and end-to-end encrypted seems to be what the military wants in this case.
Saying it's not like Signal is just silly.
*have not investigated myself, but found their repo which implies as much. https://github.com/WickrInc/wickr-crypto-c
Wickr and Signal have protocols with similar goals (in that Signal invented the goals/service model of all modern secure messaging protocols). Wickr's is clunkier and based on NIST primitives. Signal's protocol has received peer review "organically"; it's important enough to be an academic target. Wickr has paid to have competent cryptographers review it.
I'd have a strong preference for Signal Protocol over Wickr's but, unlike something like Telegram, I wouldn't see the cryptographic gap between the two as dispositive in and of itself. Wickr is cryptographically "fine".
The bigger concern for me --- and this applies to all secure messaging systems, not just Wickr --- is that Signal remains unique in its fanatical avoidance of serverside metadata. Signal has deferred core messaging features simply to avoid collecting user data in a database. User profiles are a good example, now resolved, but the canonical one is "usernames instead of phone numbers", which looks like it'll be resolved this year.
Compare that to something like Wire, which effectively keeps a database of every person that any other person on Wire has talked to. It's not doing that because Wire is evil, but rather because Wire wants things like contact lists to work seamlessly across devices. That's understandable but is a major privacy tradeoff.
As I understand it, Wickr is better than Wire in this regard, but worse than Signal.
He is not, in truth, an anarchist. He is a crypto-zealot, most definitely eccentric, and opposed to a lot of government intrusion. In today's world, I guess that's an anarchist
He's also the creator of the Anarchist Yacht Club and has produced voice content for Audio Anarchy audiobooks. In his documentary about sailing (Hold Fast, it's really good) he flies a black flag.
Just saying, there is more evidence towards his anarchism than just zealousness towards crypto.
WhatsApp was (maybe still is?) the most popular messaging application in the world. Getting it E2E encrypted is a huge coup.
The upshot is I worry that E2E encryption has given folks a false sense of security.
> Media and messages you back up aren't protected by WhatsApp end-to-end encryption while in Google Drive.
WhatsApp has gotten more people to use E2E encryption than any other product in human history. By many orders of magnitude. If you believe that people should use E2E encryption there is no better product to work on.
It's not obvious that intent is sarcastic when it expresses a commonly held position, unless the sarcastic person is well known to have different views - not so easy to discern on a site like HN. To be obvious to the casual reader, you need to articulate a position that goes to absurd extremes, e.g.:
A bit alarming to see the government embracing an end-to-end chat system with perfect forward secrecy, considering that it impedes economic growth in our identity-theft and espionage industries.
Of course, that might not be recognizable as sarcasm in some countries where such activities are of greater strategic importance.
I think it's kind of silly to use for that reason.
If anything, this is a really good thing, as the utility of publicly available comms are embraced by government entities, it should somewhat guarantee that us consumers will also have access to this.
I largely agree with you but a more cynical take would be that it allows the following argument to be advanced:
"If the 82nd Airborne uses it, then it is a weapon of war. Why should civilians have access to the same communication equipment as a soldier needs in a time of war."
That is, they associate the technology with combat and the military. Then, they use that association as a way to argue the technology is only inherently useful for those planning to engage in combat.
It also allows governments to advance the argument that if it is useful for our soldiers then it should be denied to enemy soldiers, and thus it should be tightly controlled (see night vision goggles, or cryptography classified as a weapon).
Nah. Let's redefine "arms" to mean "guns", assume for some reason the constitution needed to give the government freedom from itself, and ignore the whole thing as archaic and outdated.
Because nobody achieves power by controlling the existence of toilets? There are people who will argue the sky is green if it fits their narrative — bending reality is what they will do, if we let them.
I doubt any would argue that un-breakable (at least, theoretically) comms would provide no tactical advantage. If anyone would, I'd refer them to the second world war as a prominent example. This argument could help build a bipartisan pro-cryptography group by pulling in support from the 2A crowd. I support cryptography remaining fully legal (even if classified as a munition or armament) for the same reason I support the 2A, practical security considerations aside.
It's the whole idea behind Tor. Interests sometimes align.
Now the government has an incentive to contribute to the open source project to ensure that its communications are secure.
You're right that it could "weaken" signal because more people are trying to attack it which means a flaw is more likely to be found. But it can also strengthen Signal because the need for it to be strong and improve faster is higher. More people looking at it (especially with a government interest) means there are more trying to patch those flaws as well.
Additionally, it can add to the popularity of Signal. Which again adds to both edges laid out above.
Honestly I can't see this as anything but good. It is __GOOD__ when the government's interests align with that of the public.
1. IF the government comes to rely heavily on the security of signal, THEN the government will want to make signal as secure as possible.
2. The approach the government may take to make signal stronger could be to regulate signal. For instance the government might require background checks on everyone authorized to push out signal updates.
3. Such regulation is likely to have the unintended effect of making signal less secure by causing the designers of signal to abandon the project.
Now this is a pessimistic take. An optimistic take could be that the tool the US government uses to strengthen signal is to fund developers and give out grants for security research on signal. As another poster pointed out the US government funds TOR.
This is a good thing
> 2. ...
This is against the interest of the DoD. It may be in the interest of other agencies, but not the DoD. The DoD's interest is that __all__ communications of their soldiers go through a secure channel. It matters if a soldier texts their buddy back home "blah blah I'm taking a shit in this shack outside the base. You should see what it's like out here blah blah." That has security implications for them. Soldiers are going to text their buddies, significant others, and family by any means they can. So your choice is 1) through a secure channel or 2) through an insecure channel. The DoD is obviously in favor of option 1. Increased regulation, such as you're suggesting, is counterproductive and only encourages option 2.
> 3. ...
The code is open source. Moxie is also pretty adamant about keeping it open source. The DoD also has a vested interest in keeping it open source.
> As another poster pointed out the US government funds TOR.
This user was me?
Since the conversation is exactly the same as the one we were having the other day on another thread I am literally going to reference that thread. I think the pessimism comes from "anti-government" thinking, but also a lack of understanding how agencies work. You can probably tell from my chat history that I'm not super pro strong gov and pro privacy. But these agencies have differing agendas and this has to be understood. The intelligence community has a split incentive when it comes to Tor/Signal/encryption. The part that handles protecting their spies is pro Tor and wants other users on it because they don't want spies found because only spies connect to Tor nodes (or only spies/pedos/terrorists even). Conversely, those in charge of finding spies (or spies and pedos) don't like Tor (and that's why they'll claim that only spies/pedos/terrorists use Tor/encryption. It is pushing __their__ agenda).
It really comes down to "what agenda makes their life easier?" So it should be no surprise that a defense agency is... in favor of defense. It should also not be a surprise that agencies in charge of attacking (let's avoid debate about who they are attacking) is... anti defenses. Those that are in charge of protecting kings are pro castles because it is easier to defend your king behind a castle. Those in charge of killing kings are anti castle because it is harder to kill a king behind a castle. These people may work for the same king, but they don't really talk to each other that often.
I actually think the government will start taking a more pro encryption stance in the future (we've seen some of it already), especially as tensions rise. Those that worry about foreign interference have an incentive to protect everyone's communications from foreign adversaries. It is harder to manipulate those that you have no information on. Anti encryption sentiment comes from when we are in a stronger position and less worried about being attacked. Now as we're transitioning into a period where we're becoming more concerned about defense we have a much higher incentive (before it was ambivalence) to improve defenses.
If your spy agency controls the vast majority of the Tor nodes, you can see what everybody is doing with Tor, and nobody else can. Whereas if somebody else's spy agency controls the vast majority of the Tor nodes, they get that power.
When you're both working hard to get more Tor nodes, the Tor network is made better for everybody, and unless you achieve that vast-majority control you get no benefit for your effort - still, never give up.
Suppose the Russians have 100 Tor nodes, the Americans have 100 Tor nodes, the Chinese have 100 Tor nodes and random good Samaritans run 100 more. Nobody can snoop on Tor, it works really well with 400 nodes. The Americans buy 200 more nodes. The Russians don't like that and nor do the Chinese! They each buy 200 more nodes too. Now there are 1000 Tor nodes, it works even better, and nothing changed for user security.
God Bless Open Source.
I highly doubt that personally, but that's what I read from nimbius's comment.
There have always been separate rules applied to the people and the government; the issue comes down to the simple matter that elected officials should always be considered public while in office.
Why is it alarming? I don't think the FBI and CIA are advocating backdoors for government encryption. The fact that the military endorses it for their own operational security is weak evidence that Signal is more secure than other apps.
Just because they're both government agencies doesn't mean they are colluding against the people. Different agencies have different agendas and prerogatives. It's also well known that agencies often fight over things and don't exactly work together well.
Mildly off-topic, but every time I read this, it makes me ill.
Why is this a difficult concept to grasp? All agencies are pro "agenda that makes my agency's job easier".
I wasn't disagreeing with you, and grasp the concept better than most.
> All agencies are pro "agenda that makes my agency's job easier".
They didn't used to be. These agencies used to work, with pride, for country and people. Not their own sorry selfish fucking asses.
The book Team of Teams is pretty good for that https://www.amazon.ca/Team-Teams-Rules-Engagement-Complex/dp... - when the best of the best are beaten by people who barely know how to use guns.
You can see exactly what Signal is able to provide to a legal request: the time of last connection, tied to a phone number.
I also didn't know it was closed-source and privately-owned in the U.S., which should have been a red flag for dark market users.
They don't do anything to dispel the notion of being a honeypot and basically do the exact opposite.
I wanted to try jami to do some p2p voip on windows. It crashes when I create an account.
It really sounds like secure alternatives are "disappearing" or just not usable or mature enough, in favor of mainstream apps that are known to be monitored. Not to mention the call quality of Skype/Discord can suffer because they route through servers.
I don't need 'Perfect', I do need 'Better'.
OS independent: *BSD, Linux, Windows, Android, iOS
I'm willing to run my own server, but the wife-unit needs something "easy" (hence the Windows requirement).
What are some reasonably secure E2E IM platforms?