82nd Airborne unit told to use Signal or Wickr on government cell phones (militarytimes.com)
361 points by danso 25 days ago | 237 comments



Given how much of that $700B is likely to go towards developing such a system - likely somewhat less than all of it - and the general care towards usability in DoD-built systems? I think it might not be a wildly unreasonable position to take. Doubly so when you consider that DoD procurement and development practices are not widely renowned for being quick.

With those in mind, and that Signal already exists, it's worth considering if there's actually something for the DoD to gain by sinking millions of dollars and years of time into developing and maintaining something new that might not be an improvement. The answer might legitimately be "No, that's not worth it".


I suppose I'll take a shot at a counterargument.

The problem with the military using an off the shelf system like Signal isn't the tangible features (re: encryption, usable ux). The problem is one of control. What happens when the government gets into a dispute with Signal (say, over a warrantless wiretap) and Signal decides in protest that it will no longer have any DoD affiliated customers? What happens when, through a series of shell companies and legal loopholes, a controlling stake in Signal ends up in Chinese or Russian hands?

From the PoV of the DoD, owning the system from end to end is a feature they shouldn't compromise on, even if the other features suffer and it costs more than it otherwise would.


It feels like we’ve taken a step backwards. I remember years ago when the UK government could only communicate over BlackBerrys with custom certs/encryption keys. Now the UK government/cabinet seem to share everything with each other via WhatsApp groups. I don’t really see that as progress.


The progress is made in convenience, the regression is made in security.


These are good points. The DoD should settle on Matrix. So should everyone else.


Matrix is like a bright teenager: they are full of promise and to be encouraged at every opportunity, but they also need to cut their hair and stop wearing CND T-shirts before they can be taken seriously.

(I’d add here that I’m a big fan and user of matrix, while being aware of its limitations as a marketed product.)

Matrix have shipped some high profile products to clients like national governments, but there’s little hope for traction with the general public when the riot.im signup process is so unpolished, and when the most common entry point into the Matrix system brands itself with a name other than Matrix.

Signal uses the signal website and signal app for iOS and signal app for desktop and refers to the signal protocol in all its literature. Matrix could take a leaf out of their book!

(Perhaps licensing the trademark would be a good start?: Riot -> Matrix Riot, Synapse -> Matrix Synapse, and move the riot.im functions into a domain that has matrix in it?)


We’ve spent ages working on the onboarding flows for Riot; which bit is unpolished?


Those are very wise questions!

Perhaps, though, they are ones the DoD has already considered. As far as I know Signal is open source and no commercial arrangement is required to make use of it, putting the DoD in a position to respond in a technical capacity should there be a conflict.

In legal terms, I believe the DoD could stop any acquisition on national security grounds. The US government, like many national governments, reserves that power. Signal is developed and run by a 501(c)(3), making this a somewhat unlikely series of events.

Again, those are great points. I'm just not sure they wind up being any risks the DoD can't readily control.


Given that it is open source, it seems odd that they wouldn't run it on their own servers.


Just because I know how a car works mechanically doesn't mean I'm inclined to build a fleet of my own.


The DoD absolutely has the technical ability to run their own shit.


The trend for the last couple of decades (since Reagan?) is: if it’s not used to kill people, contract it out. Systems that used to be run by DoD personnel are now run by Oracle/Microsoft in the cloud and Northrop/Raytheon and various small companies for on-premise.


Sorta but sorta not.

Lots of things are contracted out but still run in DoD data centers with government oversight.

Me = Former DoD software engineer of 12 years.

But even so, there are a shit ton of DoD software people.


There’s an entire Agency of them.

It’s called the Defense Information Systems Agency. And they’re not even the biggest in that space.

I worked there for about five years.


Even the "killing people" part is also contracted out.

https://en.wikipedia.org/wiki/Private_military_company


Yes, but he's suggesting that DoD might not care to, which I can believe. Whether that's out of laziness or time constraints is another question.


You are assuming Signal is not already subverted by US Gov.


Like virtually everyone literate in these kinds of systems.


You don't use covert intelligence capabilities to satisfy enterprise IT policy controls.

"Oh don't worry about TLS interception for the office firewall, we can break RSA anyway" is not how a bureaucracy thinks. Come on.


>not how a bureaucracy thinks. Come on.

Conspiratorial bureaucracies operate that way. Trust no one, allow nothing that can be used to undermine...

Read Memoirs Found in a Bathtub.


I agree; I took the comment to be rather precisely worded:

'we cannot SURPASS the existing (and free) solution already in the market'

This seems more like an endorsement of the level of integrity and robustness of the product and a recognition that taxpayer dollars could be better spent not re-inventing the wheel.

Additionally, off the shelf products mean that they could spend far fewer resources ensuring it continues to meet and exceed the necessary standards rather than building a new thing.


Was the URL changed? I see nothing about $700B in the militarytimes article.


And also consider the US Fed Gov's role in funding Signal. Hopefully, the best outcome will be seeing how stupid the backdoor and encryption debate is.


Are we witnessing the start of the combat iPhone? These devices are staggeringly useful, I guess it just makes sense that they find their way into this aspect of human life as well.

> Squad: the #1 small unit combat App

>What's New in 3.14: Enhanced Scout/Recon route setting- select up to 7 types of perimeter security

>In-App Purchase: Unlimited "Friendlies" List - avoid unwanted friendly fire incidents.


Already exists (well, not iPhone, but...) https://www.samsung.com/us/business/solutions/industries/gov...

And accessories like cases and cables from Kagwerks: https://kagwerks.com/collections/smartphone-cases / https://kagwerks.com/collections/cables/products/usb-3-0-to-... (aside: their training courses are A+)


A combat iPhone is OK only in asymmetric warfare. A near-peer adversary will quickly figure out how to exploit it. Russia has already successfully targeted artillery strikes using the location of Ukrainian soldiers' phones (Android, but the difference is not very important).


We've been using iPhones in combat since at least 09.


CGI Federal/Raytheon has been working on something like this (AFATDS) for the last 5-6 years.


I'd argue that ATAK is closer to a "combat iPhone" system than AFATDS, which is uniquely focused on field artillery coordination.

https://en.wikipedia.org/wiki/Android_Tactical_Assault_Kit


I've worked on ATAK. It is a very capable combat tool.


How do they secure baseband?


Price: $10k per user. Fair enough if we’re billing the army for an app with a huge bug bounty.


This is old news. Look up ATAK.


This has been rolling out for a couple years across parts of the US government. The primary objective, as I understand it, is to eliminate the pervasive use of WhatsApp and Gmail for unclassified communication for security reasons.


The ironic part is the military lives on regular text messaging as it is.


ATAK already has several hundred thousand end users, and is likely going to continue growing in popularity in the domestic environment. Extensions of the FBCB2/BFT concept have been nonstop since the '90s, really. The tactical cell phone has already found its place on the modern battlefield.

This directive, however, likely has nothing to do with combat. It's more likely related to trying to maintain some semblance of OpSec when things which shouldn't be sent over these cell phones end up being sent over these cell phones, and to help keep every other private from texting mom/dad/gf things which will then be immediately leaked, by reminding them of the "seriousness" of the situation.


"told to"

Why do they not have some MDM like InTune or Knox and have the apps they need pre-installed and the phone locked down to prevent any other apps from being installed?


the acquisitions procedure to do that would be a nightmare


I can already imagine a three-year waiting list to get new apps remotely installed, which then inevitably fails under common international travel circumstances.


Everyone here is speaking like there HAS to be a backdoor for the messages to be auditable, forgetting that they could modify the application to abide by some sort of device management strategy that uploads a backup to some server when connected to a LAN, for example.

Signal already offers encrypted backups with a password you see one time when setting up. I imagine you could easily modify that to lock the app down with the Android device administrator shenanigans, and then periodically upload incremental backups.

The messages would of course remain encrypted, but the key would be in the IT administration's hands.

I believe the end-to-end part of Signal is very interesting if you consider the whole NETWORK to be hostile, but that both ENDS are friendly once authenticated.

I can see a very FOIA-friendly implementation of this.
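To make that concrete, here's a minimal sketch of the managed-backup idea, assuming an escrowed key and an internal archive endpoint. The URL, key handling, and device IDs are all hypothetical; this is not anything Signal or an MDM product actually ships.

    import requests
    from cryptography.fernet import Fernet

    # In practice the key would be issued and escrowed by the IT administration,
    # not generated on the device; generate_key() is only here to have a value.
    ADMIN_ESCROWED_KEY = Fernet.generate_key()
    ARCHIVE_URL = "https://archive.example.mil/backups"  # hypothetical internal endpoint

    def upload_backup(backup_bytes: bytes, device_id: str) -> None:
        """Encrypt the local message backup and push it to the unit's archive server."""
        ciphertext = Fernet(ADMIN_ESCROWED_KEY).encrypt(backup_bytes)
        requests.post(ARCHIVE_URL, data=ciphertext,
                      headers={"X-Device-Id": device_id}, timeout=30)

    # The transport and the archive server only ever see ciphertext; the ability
    # to read the archive rests with whoever holds the escrowed key.
    upload_backup(b"<local signal backup blob>", device_id="82nd-abn-0042")

The point being that auditability would come from key escrow plus device management, not from weakening the wire protocol.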

It always seemed kind of stupid to me that they would spend billions of dollars to reinvent another wheel. It's like saying "Physics is good and we could add some more research and engineering for our case, but no, physics is open source so let's make our own physics"

Not exactly the same but ...


I think the idea of a backdoor comes because CIA/FBI/NSA has incentives to be anti-encryption. It is their job to break it. So they want their lives to be easier. But the DoD has an incentive to have strong encryption. It is their job to defend their communications.

But I'll disagree with you (while agreeing with you) on this

> Everyone here is speaking like there HAS to be a backdoor

The DoD __does__ have a backdoor. It is the cellphone, not Signal. There doesn't have to be a backdoor in Signal for them to have full access to these communications.


> The DoD __does__ have a backdoor. It is the cellphone, not Signal.

How? By compromising the baseband, and then pivoting to compromise the AP? But what about IOMMU? Or is that pwnable?


There's a lot of comments here about Signal being better than Wickr, but little information on why yet. A quick Google search didn't give me any decent answers, so can anyone fill in why Signal is better?


Signal is open source https://github.com/signalapp


"Open-source"... but don't you dare actually try modifying the code. Moxie Marlinspike (speaking for the Signal project) has stated that forks of the client aren't authorized to use the official servers -- and since there's no federation, that makes modified clients effectively unusable.

https://github.com/LibreSignal/LibreSignal/issues/37#issueco...


Even their server code?



But how can you be sure there's no additional code being run on their server?


It doesn't matter, as the messages are end-to-end encrypted and the way it is done is continuously verified by multiple leading/up-and-coming cryptographers, as far as I understand.

This is the huge advantage that Signal has over mail, the default mode in Telegram, and pretty much anything there is: it doesn't matter if the NSA, FSB, MI5, Mossad, Google and Facebook all have root on a server that all the traffic passes through. To the best of our knowledge - as long as they don't compromise one of the endpoints - the only thing they'll get is metadata and the only thing they can do is disrupt the service.


But Signal requires your phone number to work.

Also, Wickr is open source: https://github.com/WickrInc


Wickr is not open source. There is a review-only-licensed copy of a C-language implementation of their encryption protocol on Github, and little else.


Turns out the US military also buys lots of commercial off the shelf stuff, including vast quantities of #10 letter envelopes for their mail. The US Defense establishment has thoroughly accepted the fact that they can't out-innovate commercial companies in a commercially viable space. Using the COTS top 5 or 10 is entirely reasonable in such cases.

We fly troops on Delta and American too. I bet a few even flew in Boeing 737 Maxes. Perfect? No. Excellent decision among some number of possibly better decisions, that could have been made at the time? Yes.


“ Electronic communications and text messages sent as part of official government business are part of the public record, and should be accessible via a Freedom of Information Act request.”

Unless it’s classified and it would be very easy to make the case that all of the communication being sent on Army-issued cell phones would be Confidential at a minimum. The important stuff all gets copied into OPORDS and will be available for declassification someday.


I’m curious why Wire wasn’t recommended. I prefer it because you don’t need a phone number to use it.


Signal talked about how this is actually a security flaw. I believe they are working on implementing the feature though. It is about being able to create social graphs. So like if you get a new phone you can keep your social network (also that if you discard a phone an adversary can't pick up your identity trivially). With the current implementation you'd lose your social graph every time you got a new device. I didn't check very hard but I think it is this blog post [0]. They don't have many posts, so I'm sure you can find it if this isn't the one.

[0] https://signal.org/blog/private-contact-discovery/



Wire got bought out, apparently: https://blog.privacytools.io/delisting-wire/


to a US company though; DoD wouldn't mind that


Everyone's making comments about this indicating there could be a backdoor. Wouldn't that be impossible for Signal without putting it in the open source app? Otherwise wouldn't it have to be a protocol-level vulnerability?


How do you know the binary in the App Store is actually compiled from the published source. Is it possible to reproduce iOS binaries?


Great question. I would assume Signal has some sort of signed reproducible builds, but I have no idea what the Apple store process is like. Do you not just submit a binary to the store?


Didn't Bezos' phone get hacked through a WhatsApp message? I know they're not talking operational comms, but how confident are we these two apps are exploit-free?


We are not. It’s software.

E2E encryption is tight until you hack a client or system on either side, but at least it makes it nearly impossible to attack the middle.


No application is exploit free. Freedom from basic framework- and language-level vulnerabilities isn't something any modern messaging protocol, secure or not, can give you.


> but how confident are we these two apps are exploit-free?

We aren't. Exploits are found in decades-old software that we once believed was relatively well vetted, and in Signal's case their desktop app, for example, had a rather nasty XSS about 14 months ago or so: https://www.cvedetails.com/cve/CVE-2018-11101/

That doesn't mean there isn't a huge difference between Signal and everyone else:

WhatsApp keeps on uploading unencrypted backups to places known to be within reach of the NSA, so if the NSA is in your threat model, forget about WhatsApp. They have also been sloppy when it comes to accepting remote client swaps, and they've had at least one nasty vulnerability.

Telegram, AFAIK, has managed to annoy most leading cryptographers, so it seems they don't care to put much effort into verifying it.


Maybe they're confident that WhatsApp does have backdoors because they put them there...


> “I don’t have confidence that DoD could build a unique texting system with proper security protocols that would beat any commercial, off the shelf, version,” the former official said.

so $700B in defense spending can't match some motivated, talented FLOSS devs? that's rich.


That's the sad reality. Not just on the DoD side, but from the entire federal IT workforce/teams.

With the exception of:

18F https://18f.gsa.gov/

USDS https://www.usds.gov/

The motto seems to be: "Open Source is BAD! How are we going to get support?! Let's just buy a product or solution from a vendor" Even though they have people/teams who were hired as developers. I have so many horror stories, it's not even funny.


Back when I was in the military industrial complex we loved Red Hat because we could buy Linux from them so it wasn't "freeware" and we could use it.


Check out Kessel Run as well. https://kesselrun.af.mil/

DoD mindset re: digital acquisitions is changing.


How each branch is implementing tech commands is fascinating.

- DDS in general seems like a great program.

- KR dudes are great and will probably unfuck the AF's tech if they have enough wiggle room and command support against Lockheed and co. Hiring at GS-12s, few weeks approval, quick clearances, other unheard of comp strategies to get good civ talent.

- Army futures command is ..... near retired E9s and O6s in cargo shorts and underarmour polos, hiding out in the Austin Wework


Sadly it's hard to argue the effect that KR is having is "positive".

KR stood up by working closely with Pivotal who supplied both the Pivots to pair program with the comparatively inexperienced AF devs as well as the deployment platform.

While the means are debatable, the ends that Units supplying devs to KR had to face were not. Those Units got back programmers completely reliant on Pivotal Cloud Foundry. You would get devs that had no concept of what happens to their code after they run `cf push`, and the Units had to face the reality that their devs were ineffective without PCF, which cost tens of millions to purchase and maintain by a team of Pivotal engineers.

Obviously Pivotal is a company that exists to make a profit, but smaller units that supplied devs to KR largely felt taken advantage of. And after that you had things like SpaceCamp, LevelUp, and Platform1, which are very similar to KR just without the heavy reliance on Pivotal or their products, popping up left and right.

Now that it's gone on for so long, even leadership in KR is getting pressured to actually produce a production-ready app for all the money that's been dumped. They have plenty of MVPs but afaik nothing to big AF's satisfaction.

At least from a lowly enlisted programmer's perspective, you can live in Boston in civilian clothes for 6 months.


This is no longer the case.

While we do have spring applications running on PCF, we have a significant and growing number of services and applications that are not built on PCF and PCF isn't required to build a production application and be CATO compliant. In fact almost all of the applications and services in my branch are not.


> The motto seems to be: "Open Source is BAD! How are we going the get support?! Let's just buy a product or solution from a vendor"

Not really accurate anymore. Maybe it was true a decade ago but when I worked in the DoD space almost every single project was attempting to use ONLY FOSS where possible and to try and move off of companies proprietary stuff.

Unfortunately most of the projects I've seen ended up failing and Palantir usually sweeps in and gets a contract because the users really like it.

There is a huge want inside of the DoD for FOSS, it's just mortally wounded by others who want systems like Palantir or through simple incompetence.

In my experience, anyway.


Maybe from your experience with one agency/department but the DoD is a BIG organization. But it's still a thing in most federal agencies. Some not as bad as the others.


Oh for sure! When I worked as a contractor I worked with many departments but even the groups I worked with didn't represent the entire department they were in. But it was a very common theme with every single one that I worked with.

I'd love to see some data on FOSS usage across the DoD but that type of audit is likely impossible.


OSS is used extensively in DoD.


Can you give examples? And do they also actively participate in the upstream?


Most of what's used in enterprise generally is used, so no examples are really needed. Upstream activity tends to be minimal, but it happens, usually indirectly via contractors. There are many stories online that can be found by searching, in regards to specific systems and some software.

There are also various parties working hard to promote more OSS use in DoD, two examples: GAO, OSSI. Not to mention the DoD is mandated to release at least twenty percent of its own custom software as open source. Like anything gov, it's a slow process.


I guess department wise, DoD/Pentagon is leading the front in OSS.

Also, the requirement to release at least 20 percent of custom-developed code as OSS is not a sole DoD directive.

They are just enforcing the Federal Source Code Policy from: https://sourcecode.cio.gov/OSS/


The Tactical Service Oriented Architecture[1] has a Battle Command Display module that is built on top of Liferay[2]. There's other C2 systems that use Cesium[3]. I don't think the TSOA devs are pushing their code upstream, but that's just a guess.

[1]https://www.candp.marines.mil/Programs/Focus-Area-4-Moderniz...

[2]https://www.liferay.com/

[3]https://cesium.com/cesiumjs/


Quote is about beating not matching. If the FLOSS software matches all of their requirements then there isn't a way to beat it and it's already made.


Not once has that ever stopped an RFP from being put out or a contract from being awarded.


Lots of RFPs are awarded to companies with existing commercial technology, or to solution providers who bid on a plan to integrate open-source software and maybe add a little flavor.


Hopefully this is the first time. Honestly the government needs to make more open decisions like this (hard because of tracking budgets and how contracts are awarded and defined as complete). A bit more flexibility and open decisions with easy input from their users would do our country a HUGE service and save a metric fuck ton of money.


Even if there was a FLOSS option that checked every box on features, there would still need to be an RFP for implementation. You can't create and manage the infrastructure of a system at that scale without spending enough money to require an RFP.


Always fun to get downvoted for voicing a statement of fact: an RFP is required, if I remember correctly, for any purchase in excess of $25,000. It's not a high bar, and would easily be surpassed by having to pay a contractor to implement and integrate even the simplest of tools if it's to be used across the entire DoD Enterprise.


> so $700B in defense spending can't match some motivated, talented FLOSS devs? that's rich.

Not in my experience. The problem is not technical talent, it's culture.

That's not to say it's impossible to solve these issues, only that a culture of top-down "get it done" does not tend to mesh well with the rigid discipline needed to make a secure product. Ever had to say "no" to a general?

On any system built in house, there will be management feature requests which force security compromises. For example, "We must archive the data of this comms network - we have a legal requirement to do that!!" or "We need to ability to access user's data for internal/external investigations", etc.

Solving these issues in a secure manner is incredibly hard, and IME it can be incredibly difficult to explain to someone non-technical why "just do X" will harm the security posture. More often than not, a developer (with their salary paid by the boss) will be forced into "just doing X" by someone who does not truly understand how much that compromises the system. Boss will be happy, thinking they "pushed it through" and non-crypto developer will be happy thinking "it has some authentication applied so must be secure" while cryptographer will be largely ignored or misunderstood. (Note: not a cryptographer, but I am an expert in other domains and have worked with enough to see their pain first hand)

Most non-cryptographers on the project start to get confused, typically thinking all of the compromises are OK because they only open doors for the DoD and that is who the product is for, without having the training or knowledge to realize how problematic this thinking can be. IMO - listen to your cryptographer, you hired them for a reason.


Crypto is a tiny community and doing it right is very hard. It is not surprising to me that access to talent is worth more than raw dollars.


The first iteration of healthcare.gov cost $1B iirc and it was a broken mess. The task was herculean but no doubt a lot of that money filled bureaucrat pockets. The public sector, and contractors just don't attract top talent like the private does, nor are they spearheaded by people who earned their position.


Bureaucrats in the US rarely end up with full pockets. As with a great deal of government IT work, it's private companies that are raking in the money. Healthcare.gov in particular was outsourced.

I don't think the problem is "top talent". I think the problem is how people think about software, and therefore how they budget for and manage it. And I don't think the problem is unique to government. I've heard about plenty of private sector giant-project boondoggles where enormous sums were wasted in similar fashions. It's just that those don't make the newspapers, but instead get gossiped about by tech people over beers.


The French company Thales basically re-invented BlackBerry for the exclusive use of their government https://www.thalesgroup.com/en/worldwide/security/telephony-...


Well, that doesn't sound impossible. After all, it's what the dollars are attached to.

Think about it: if you want to implement an echo server that can't be cracked to view previous things it echoed, you could probably do it quite well. Now imagine I bring a billion dollars, and some more requirements, and I hire a few hundred people to work with you on this, and some of them are security consultants, and some are compliance guys who want the echoed text saved for compliance, and some others are privacy experts who want the echo text inspected by DLP software so that you aren't sending PII.

I rate former you more likely to succeed than latter you. In the first problem, it's technical. In the second, it's organizational.


Yes, that sounds about right.

There are not a lot of people who can do crypto well, and they tend to be too weird to make good government or megacorp employees. The NSA will have people capable of doing it, but they undoubtedly have more important things to do than write a chat app.

They could hand a billion dollars to Microsoft or Amazon, who would happily take it, but would that produce a better result? Or any results at all?


>and they tend to be too weird to make good government or megacorp employees.

Or simply do not want to work for the government or a megacorp.


fwiw RHEL is all over Army systems under the hood


This is true for DoD, NATO, etc...


Not due to budget, but due to incentive structure. Low ranked military would avoid potential liability at all costs and just stay put if no order is issued.


Maybe not DoD, but perhaps NSA.

If the NSA can make it secure against themselves, that's not a bad start.


According to the article, they also require a VPN. So why use these apps? Is the VPN purely optional or do they not trust the inner network?


A VPN is completely different from the messaging applications.

A VPN will protect against data leakage from other applications, and potentially help out a little against DPI and DNS logs.

> So why use these apps?

Because (theoretically) Signal/Wickr cannot read your messages. Meanwhile facebook has access to all Facebook Messenger conversations, even if you used a VPN.


The NSA guidance for security is to act as if any network is compromised.


You can't trust that the endpoint of a call is on a secure network. If you need to interact with multiple stakeholders from an organizational POV, you may not trust identity of the other parties either, so a traditional Skype/Teams/Jabber type thing won't get it. Signal lets you do that ad-hoc, at least for low trust level validation.

The VPN provides security for the LTE or Wifi connection. The network that the VPN terminates on may not be a "trusted" network either. Think of an office LAN. You usually don't require a VPN to join the office Wifi or wired ethernet. The LAN provides a level of trust sufficient to access the printer or something similar, but you still secure a connection to your server with SSH.


Signal supports local encryption doesn't it? Maybe they are worried about physically losing their phones?

But they probably don't fully trust their vpn, good spycraft assumes that the adversary has (partial) access to your systems.


Let's assume that Signal is fully encrypted and there's no feasible way to read the messages.

The problem is that the phone is likely used for other things too, and you don't want [insert important app here] to leak information that could've been protected with a VPN. Also, things like DNS logs and the IP connection will still be revealed without a VPN, so an attacker will be able to know that you're using Signal and when you're using it; potentially, with DPI, they might get to know when you send/receive a message. Collect this information across a wide enough network and compare the times, and you may be able to tell who is messaging whom and when.
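As a rough illustration of that correlation idea (all device names and timestamps below are made up, not from any real capture), something as simple as this over per-phone connection logs would do:

    from itertools import combinations

    # Seconds at which each phone was observed contacting the chat service
    # (visible from DNS/IP metadata when there's no VPN in the way).
    observations = {
        "phone_A": [100.0, 161.2, 245.5, 300.1],
        "phone_B": [100.4, 161.5, 245.9, 512.0],
        "phone_C": [77.3, 400.8, 733.1],
    }

    def correlated(times_a, times_b, window=1.0):
        """Count events in times_a matched by an event in times_b within `window` seconds."""
        return sum(any(abs(a - b) <= window for b in times_b) for a in times_a)

    for (user_a, ta), (user_b, tb) in combinations(observations.items(), 2):
        hits = correlated(ta, tb)
        if hits >= 3:  # arbitrary threshold for the sketch
            print(f"{user_a} and {user_b}: {hits} near-simultaneous events, possibly talking to each other")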

With a VPN there's no risk of other apps leaking data, and it hides your DNS queries and the end connection. This means it will be difficult to tell you're using Signal specifically at all, and it should make DPI at least more difficult.

> But they probably don't fully trust their vpn

The VPN isn't necessarily in charge of keeping the messages themselves secure, but of preventing other data leaks. The messaging application is responsible for that part, and they had to use something like Signal because (theoretically) Signal cannot read their messages. Slack/MS Teams/Facebook Messenger/etc. still have access to your messages regardless of any VPN.


It doesn’t have any mode that’s not encrypted.


SMS?


the app supports SMS for convenience, but that's not "using signal" as in the protocol


Wouldn't you sort of expect the military to implement zero trust networking where possible?


As a prior technology acquisitions professional, the contract for that capability will touch so many different '4 letter' offices and can arguably be bid under each service's acquisition arm that the pain and dollar amount associated with it simply isn't high on the priority list.

There is a reason the DoD shells out millions to MSFT to provide XP support past EoL.


Trusting VPN is bad practice.


Can you expand on this? Trusting a VPN is equivalent to trusting the VPN provider. As long as you believe the VPN provider is more trustworthy than the network you're tunneling through, I don't see why trusting VPN is bad practice.


I've been under the impression that your VPN traffic is always "you". This could be a bad assumption tho.


This is a good assumption. VPNs do not protect your identity. At best a public VPN will mix your traffic with other users at some exit IP, but it's very hard to trust that the VPN itself is not logging your traffic (or at least the metadata).

The Military might not care about people knowing they are talking, just what they are talking about. In that case, using a VPN makes sense because they can tunnel traffic to a server that is exposed to a private network and use a central chat server there or something. All traffic to that central server would be encrypted.

You'd have to use a technology like TOR to get closer to real anonymous communications on the internet (although even that isn't perfect).


It's just "Tor".


So, the whole point of a vpn is to route all your traffic through the vpn. To websites, all your traffic looks like vpn's traffic. The trust point is to the vpn, because they can only forward your requests if they know it's "you".


AG Barr, what now? Signal/Wickr use only allowed with govt employment?

Good though, although most AD have been using WhatsApp anyways for comms back home


The irony that Edward Snowden also recommends Signal is so thick. Isn't this a gateway drug for other whistleblowers?


In what sense would it be a gateway? Because of the possibility of data exfiltration?


This allows people to communicate securely with certain people, say journalists, if you have something you want to share in private. If you don't know about this kind of app you might be fearful of doing that. I meant gateway drug as a joke but hopefully you get the point.


Why not keybase?


Keybase comes with social tools to verify identity. Would be pretty silly to see a military personnel profile in public view.


Good point, it's been a while since I signed up and I forgot about the whole "verification" process. Thanks.


Put it in context: is it ever acceptable to have all your activity publicly logged? Keybase isn’t forward secure, so disclosure of your keys later leads to complete availability of your cleartext.


According to https://keybase.io/blog/chat-apps-softer-than-tofu ephemeral messages are indeed forward secure, but normal messages are not, strictly speaking.

IMHO, Keybase is adequate to protect against spying from ISPs, ad agencies, journalists, etc., but it doesn't offer resistance against a formidable foe like a state actor; then again, neither do most of the devices on which Signal runs.


The US military has no need for Stellar.


Keybase is miserable to use currently and it's only generally gotten worse. Was an early user and have continued to be disappointed.


I'd love to hear more as it doesn't match my experience. Among the oh-so-many IM apps I use, Keybase is my favorite as it's transparently cross platform, isn't tied to a phone [number], and the usability is good IMO.


DoD already has a PKI, it has no need for peer-to-peer identity/trust.


Why wouldn't they just install and use a boring old corporate style XMPP system with auditing? Everything would be open source and entirely under their control.

Why would you want something that had to send traffic outside of the VPN?


"[...] where adversaries can exploit American communications systems, cell phones and the electromagnetic spectrum." (emphasis mine)

Wait, what?


>the electromagnetic spectrum

EMP, signal jamming, GPS spoofing etc.


This is why TEMPEST is critical.


> the electromagnetic spectrum.

It is an all inclusive term. So this includes things like radio.


Good to know they are only US-backdoored then


Signal's auto-delete feature is optional and off by default.


It could still be problematic if "With regards to transparency and records keeping requirements, Foote said he 'cannot confirm if any personnel have Signal or Wickr settings which allow auto-delete of messages at this time.'"


Wickr makes little sense. Maybe they meant Wire?


Why? I'm genuinely curious. I've never used their product, but their Wikipedia page says they are a San Francisco based company with end-to-end encrypted messaging and video conference calls[1]. And their home page[2] says "Fully encrypted. Enterprise-ready. Private."

At first glance it seems like a reasonable choice. US based and end-to-end encrypted seems to be what the military wants in this case.

[1] https://en.wikipedia.org/wiki/Wickr

[2] https://wickr.com/


Why does that make little sense? Isn't Wickr just like Signal?


Wickr is not just like Signal.


Can you explain how? That's why I asked the question. From my limited knowledge they both seem to be end-to-end encrypted chat.


Just looking at the website, it looks like Wickr might be a little more like slack.

Saying it's not like Signal is just silly.


I mean, they're both open source* e2e protocols. The primary difference is signal has a lot of cryptanalysis cred. It basically comes down to your opinion on which you prefer.

*have not investigated myself, but found their repo which implies as much. https://github.com/WickrInc/wickr-crypto-c


Wickr is not open source. An implementation of its crypto protocol is available on Github, for review only.

Wickr and Signal have protocols with similar goals (in that Signal invented the goals/service model of all modern secure messaging protocols). Wickr's is clunkier and based on NIST primitives. Signal's protocol has received peer review "organically"; it's important enough to be an academic target. Wickr has paid to have competent cryptographers review it.

I'd have a strong preference for Signal Protocol over Wickr's but, unlike something like Telegram, I wouldn't see the cryptographic gap between the two as dispositive in and of itself. Wickr is cryptographically "fine".

The bigger concern for me --- and this applies to all secure messaging systems, not just Wickr --- is that Signal remains unique in its fanatical avoidance of serverside metadata. Signal has deferred core messaging features simply to avoid collecting user data in a database. User profiles are a good example, now resolved, but the canonical one is "usernames instead of phone numbers", which looks like it'll be resolved this year.

Compare that to something like Wire, which effectively keeps a database of every person that any other person on Wire has talked to. It's not doing that because Wire is evil, but rather because Wire wants things like contact lists to work seamlessly across devices. That's understandable but is a major privacy tradeoff.

As I understand it, Wickr is better than Wire in this regard, but worse than Signal.


why not backdoored whatsapp? it would be perfect government endorsement.


This is probably because Signal and Wickr are actually intelligence operations.


Definitely not Wickr.


Why?


As one who has studied the protocol state of many Internet-based protocols: Wickr isn't as closely guarded a protocol as Signal is with regard to being able to do MitM (between a specific two protocol states). This is from a theoretical analysis; no time to do actual jerryrigging.


DARPA literally invented the Internet for this kind of use case and now they can't even write their own secure messaging app?


And they now recommend an app developed by anarchists?


Source? I didn’t know that.


Moxie Marlinspike (owner of the world's greatest name) is the lead of the Signal project and is often described as an anarchist (as in this Wired article: https://www.wired.com/2016/07/meet-moxie-marlinspike-anarchi..., which is better than its headline).

He is not, in truth, an anarchist. He is a crypto-zealot, most definitely eccentric, and opposed to a lot of government intrusion. In today's world, I guess that's an anarchist


>He is not, in truth, an anarchist.

He's also the creator of the Anarchist Yacht Club and has produced voice content for Audio Anarchy audiobooks. In his documentary about sailing (Hold Fast, it's really good) he flies a black flag.

Just saying, there is more evidence towards his anarchism than just zealousness towards crypto.


Not to mention that many of the product screenshots they use for Signal reference prominent anarchists throughout history.


An anarchist who collaborates with WhatsApp (Facebook), that's rather comical :o)


An anarchist who works to get all the major messaging platforms end-to-end encrypted, so that normal people who don't think to explicitly opt-in to encryption get the benefits by default? Makes perfect sense to me.

WhatsApp was (maybe still is?) the most popular messaging application in the world. Getting it E2E encrypted is a huge coup.


true, but storing all conversation histories locally and in the cloud is an anti-coup. Not that Moxie had any influence on these decisions one way or the other (to the best of my knowledge).

The upshot is I worry that E2E encryption has given folks a false sense of security.


I'm probably not here for the accelerationist argument that the most popular messaging application in the world should be kept unencrypted in the hopes that it would get people to move to better applications.


E2E means it is encrypted before transmission, so wouldn't the data remain encrypted if it does get stored in the cloud?


Not according to this announcement by WhatsApp.

> Media and messages you back up aren't protected by WhatsApp end-to-end encryption while in Google Drive.

https://faq.whatsapp.com/en/android/28000019/?category=52452...

https://www.zdnet.com/article/whatsapp-warns-free-google-dri...


Why?

WhatsApp has gotten more people to use E2E encryption than any other product in human history. By many orders of magnitude. If you believe that people should use E2E encryption there is no better product to work on.


Is this surprising? Anarchists are not exactly hard to find these days.


The military uses various MMHS implementations for all messaging that needs to fulfill military messaging requirements, which is very different from Signal's protocol.


But why do that when there are already solutions available?


A bit alarming to see the government embracing an end-to-end chat system with perfect forward secrecy, considering both the head of the FBI and the CIA have confirmed that the lack of a back door would impede the fight against ISIS.

https://www.theguardian.com/technology/2015/jul/08/fbi-chief...


You really need an /s on this one


Kind of sad people can’t distinguish obvious sarcasm these days without an “/s”. Ruins the beauty of sarcasm.


Timing is everything; the narrower the bandwidth of the communications channel, the more important context becomes if you want to avoid falling victim to Poe's law.

It's not obvious that intent is sarcastic when it expresses a commonly held position, unless the sarcastic person is well known to have different views - not so easy to discern on a site like HN. To be obvious to the casual reader, you need to articulate a position which goes to absurd extremes, eg

A bit alarming to see the government embracing an end-to-end chat system with perfect forward secrecy, considering that it would impede economic growth in our identity-theft and espionage industries.

Of course, that might not be recognizable as sarcasm in some countries where such activities are of greater strategic importance.


To be fair, I think language barriers often make this much more difficult for folks.


Recognizing sarcasm generally requires context on the author. You can make assumptions that lead to this looking like sarcasm or you can take it on face value; both are fairly likely to have some users that would write the comment.

I think it's kind of silly to use for that reason.


Sarcasm in text never worked well though. You also have to remember that a large percentage of HN readership are not native English speakers.


No, they don't need to modulate their speech patterns to fit awkward, trendy and ephemeral language quirks.


Why is that alarming? The FBI and CIA requests for backdoors are alarming. Unless you're positing that these apps may have secretly complied with CIA and FBI requests?

If anything, this is a really good thing, as the utility of publicly available comms are embraced by government entities, it should somewhat guarantee that us consumers will also have access to this.


>If anything, this is a really good thing, as the utility of publicly available comms are embraced by government entities, it should somewhat guarantee that us consumers will also have access to this.

I largely agree with you but a more cynical take would be that it allows the following argument to be advanced:

"If the 82nd Airborne uses it, then it is a weapon of war. Why should civilians have access to the same communication equipment as a soldier needs in a time of war."

That is, they associate the technology with combat and the military. Then, they use that association as a way to argue the technology is only inherently useful for those planning to engage in combat.

It also allows governments to advance the argument that if it is useful for our soldiers then it should be denied to enemy soldiers, and thus it should be tightly controlled (see night vision goggles[0] or cryptography as a weapon[1]).

[0]: https://en.wikipedia.org/wiki/Night-vision_device#Legality

[1]: https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...


You would think that the constitution would have included a passage about the people's right to keep and bear armaments being necessary to ensure the country stayed free...

Nah. Let's redefine "arms" to mean "guns", assume for some reason the constitution needed to give the government freedom from itself, and ignore the whole thing as archaic and outdated.


Cryptography has always been used by militaries, and was legally considered a munition back in the 1990s. Since then, two courts have ruled that cryptographic source code is protected by the First Amendment.

https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Criminal_i...


The army also uses toilets. Why is it then that nobody argues toilets are weapons of war?

Because nobody achieves power by controlling the existence of toilets? There are people who will argue the sky is green if it fits their narrative — bending reality is what they will do, if we let them.


If they make that argument, it shouldn't be much of a stretch to say that the 2nd amendment should apply to this technology then.


There's certainly a case here. If Barr goes forward with his nutty anti-encryption push, I hope the opposition uses this argument. Google lists definition #1 for "arms" as "weapons", and definition #2 of "weapons" as "a means of gaining an advantage or defending oneself in a conflict or contest."

I doubt any would argue that un-breakable (at least, theoretically) comms would provide no tactical advantage. If anyone would, I'd refer him to the second world war as a prominent example. This argument could help build a bi-partisan pro-cryptography group by pulling in support from the 2A crowd. I support cryptography remaining fully legal (even if classified as a munition or armament) for the same reason I support the 2A, practical security considerations aside.


Well they made that argument about cryptography being a munition and I don't see anyone advancing a 2nd Amendment challenge against those laws.


The funny thing is, the basis for upholding the National Firearms Act restrictions on short barreled shotguns and rifles (challenged in the '30s) was that they weren't weapons of war.


Because they'll also want to encrypt soldiers communications back home, not just between each other. When the soldiers are talking to their friends and family. It isn't E2E if friends and family aren't using the same software.

It's the whole idea behind Tor. Interests sometimes align.


I'm not accusing them of advancing a consistent or defendable position, just that there is always a danger in the military adopting a technology. To add another example, what if Signal has a flaw which allows a foreign adversary to read all Signal messages, and this ends up harming US national security? The government now has an incentive to regulate Signal as "critical telecommunication infrastructure" necessary for the security of the US.


You could read this another way too.

Now the government has an incentive to contribute to the open source project to ensure that its communications are secure.

You're right that it could "weaken" signal because more people are trying to attack it which means a flaw is more likely to be found. But it can also strengthen Signal because the need for it to be strong and improve faster is higher. More people looking at it (especially with a government interest) means there are more trying to patch those flaws as well.

Additionally, it can add to the popularity of Signal. Which again adds to both edges laid out above.

Honestly I can't see this as anything but good. It is __GOOD__ when the government's interests align with that of the public.


I explained myself poorly. My argument was that:

1. IF the government comes to rely heavily on the security of signal, THEN the government will want to make signal as secure as possible.

2. The approach the government may take to make signal stronger could be to regulate signal. For instance the government might require background checks on everyone authorized to push out signal updates.

3. Such regulation is likely to have the unintended effect of making signal less secure by causing the designers of signal to abandon the project.

Now this is a pessimistic take. An optimistic take could be that the tool the US government uses to strengthen signal is to fund developers and give out grants for security research on signal. As another poster pointed out the US government funds TOR.


> 1. ...

This is a good thing

> 2. ...

This is against the interest of the DoD. It may be in the interest of other agencies, but not the DoD. The DoD's interest is in securing __all__ communications of their soldiers. Because it matters if a soldier texts their buddy back home "blah blah I'm taking a shit in this shack outside the base. You should see what it's like out here blah blah." That has security issues for them. Soldiers are going to text their buddies, significant others, and family by any means that they can. So your choice is 1) through a secure channel or 2) through an insecure channel. DoD is obviously in favor of option 1. Increased regulation, such as you're suggesting, is counterproductive and only encourages option 2.

> 3. ...

The code is open source. Moxie is also pretty adamant about keeping it open source. DoD also has an invested interest in keeping it open source.

> As another poster pointed out the US government funds TOR.

This user was me?

Since the conversation is exactly the same as the one we were having the other day on another thread I am literally going to reference that thread[0]. I think the pessimism comes from "anti-government" thinking, but also a lack of understanding how agencies work. You can probably tell from my chat history that I'm not super pro strong gov and pro privacy. But these agencies have differing agendas and this has to be understood. The intelligence community has a split incentive when it comes to Tor/Signal/encryption. The part that handles protecting their spies is pro Tor and wants other users on it because they don't want spies found because only spies connect to Tor nodes (or only spies/pedos/terrorists even). Conversely, those in charge of finding spies (or spies and pedos) don't like Tor (and that's why they'll claim that only spies/pedos/terrorists use Tor/encryption. It is pushing __their__ agenda).

It really comes down to "what agenda makes their life easier?" So it should be no surprise that a defense agency is... in favor of defense. It should also not be a surprise that agencies in charge of attacking (let's avoid debate about who they are attacking) is... anti defenses. Those that are in charge of protecting kings are pro castles because it is easier to defend your king behind a castle. Those in charge of killing kings are anti castle because it is harder to kill a king behind a castle. These people may work for the same king, but they don't really talk to each other that often.

I actually think the government will start taking a more pro encryption stance in the future (we've seen some of it already), especially as tensions rise. Those that worry about foreign interference have an incentive to protect everyone's communications from foreign adversaries. It is harder to manipulate those that you have no information on. Anti encryption sentiment comes from when we are in a stronger position and less worried about being attacked. Now as we're transitioning into a period where we're becoming more concerned about defense we have a much higher incentive (before it was ambivalence) to improve defenses.

[0] https://news.ycombinator.com/item?id=22114149


Tor is designed to benefit from an arms race by spy agencies.

If your spy agency controls the vast majority of the Tor nodes, you can see what everybody is doing with Tor, and nobody else can. Whereas if somebody else's spy agency controls the vast majority of the Tor nodes, they get that power.

When you're both working hard to get more Tor nodes, the Tor network is made better for everybody, and unless you achieve that vast-majority control you get no benefit for your effort. Still, never give up.

Suppose the Russians have 100 Tor nodes, the Americans have 100 Tor nodes, the Chinese have 100 Tor nodes and random good Samaritans run 100 more. Nobody can snoop on Tor, it works really well with 400 nodes. The Americans buy 200 more nodes. The Russians don't like that and nor do the Chinese! They each buy 200 more nodes too. Now there are 1000 Tor nodes, it works even better, and nothing changed for user security.
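Running the numbers above (purely illustrative), a quick sketch:

    def shares(nodes):
        total = sum(nodes.values())
        return {who: count / total for who, count in nodes.items()}

    before = {"US": 100, "RU": 100, "CN": 100, "volunteers": 100}
    after  = {"US": 300, "RU": 300, "CN": 300, "volunteers": 100}

    for label, nodes in (("before", before), ("after", after)):
        print(label, {who: f"{frac:.0%}" for who, frac in shares(nodes).items()})

    # before: every party sits at 25% of 400 nodes; after: each agency holds 30%
    # of a network 2.5x the size, and nobody is anywhere near the majority needed to snoop.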


So then people will just use the hundred other published implementations of Moxie's encryption algo used in Signal.

God Bless Open Source.


My interpretation is that he meant it's alarming to see the government suddenly openly suggesting an e2e messenger because it might hint to the agencies having discovered a hole in the Signal protocol.

I highly doubt that personally, but that's what I read from nimbius comment.


So then why would they want the military to use it for internal discussion? If the govt is aware of a hole, then other nations might be aware of the same or a different issue.


they aren't using it for classified conversations so it doesn't really matter


It does matter. Soldiers talk about sensitive stuff all the time. You don’t want schedules, personnel, whatever, leaking through inane conversations to people we’re at war with.


Unfortunately, classified comms of various classification levels spill over into publicly available channels all the time, both accidentally and out of laziness, but sometimes also out of immediate necessity.


Because it's only crackable by the 1kqb quantum computer located in Mt. Rushmore under Lincoln's invisible top hat.


I need me some of this invisible top hat quantum computer.


I think nimbius was being sarcastic?


Pointing out a contradiction of policy and interest between civilian and military agencies, which each have independent concerns and capabilities, doesn't necessarily imply sarcasm. I took nimbius to mean that it will be interesting (as in get your popcorn) to see whose influence in the halls of power wins out when such a contradiction exists.


I see little conflict; there is little reason the military would observe restrictions on the battlefield. Now, they could limit its use to only while deployed, or simply run it under an exception-only policy, as in: you get to use it for this period of time and not otherwise.

There have always been separate rules applied to the people and the government; the issue comes down to the simple matter that elected officials should always be considered public while in office.


It is sarcasm


To me it's a relief. Finally the information warmongers can't argue that it has no benefit to national security.


I think it was sarcasm? Or am I completely missing the point?


It’s just for this sort of turn of events that hn should allow a limited number of memes and emojis.


They do allow some unicode chars like: ⏩⏪⏫⏬ and ⏰, I guess they are validating up to some unicode code point. Because most emoji are not shown. I believe this chap: ◿, the lower right triangle, U+25FF, is the last one allowed.




> A bit alarming to see the government embracing an end-to-end chat system with perfect forward secrecy, considering both the head of the FBI and the CIA have confirmed that the lack of a back door would impede the fight against ISIS.

Why is it alarming? I don't think the FBI and CIA are advocating backdoors for government encryption. The fact that the military endorses it for their own operational security is weak evidence that Signal is more secure than other apps.


Privacy for me, not for thee....


This might not count as much, but the DoD is a different organization than the FBI or the CIA.


It should count. The DoD has different interests than the FBI and CIA. It should make sense that the DoD wants encryption; it makes their jobs easier. It should also make sense that the FBI and CIA don't want encryption; a lack of it makes their jobs easier.

Just because they're both government agencies doesn't mean they are colluding against the people. Different agencies have different agendas and prerogatives. It's also well known that agencies often fight over things and don't exactly work together well.


> The DoD has different interests than FBI and CIA.

Mildly off-topic, but every time I read this, it makes me ill.


Why? Here's an analogy. People that are in charge of defending kings are pro castles (it helps them defend their kings). People that are in charge of killing kings are anti castles (it is easier to kill a king if they aren't in a castle).

Why is this a difficult concept to grasp? All agencies are pro "agenda that makes my agency's job easier".


> Why is this a difficult concept to grasp?

I wasn't disagreeing with you, and grasp the concept better than most.

> All agencies are pro "agenda that makes my agency's job easier".

They didn't used to be. These agencies used to work, with pride, for country and people. Not their own sorry selfish fucking asses.


eh,

the book Team of Teams is pretty good for that https://www.amazon.ca/Team-Teams-Rules-Engagement-Complex/dp... when the best of the best is beaten by people that barely know how to use guns.


E2E encryption and perfect forward secrecy doesn't matter if the protocol is leaking copious amounts of metadata about who you're talking to and when.


That's what OPSEC is for. A bit hard in enemy territory, but definitely less noisy than whatever else they were currently using.


The Signal server doesn't even know who the sender of a message is, just the recipient.


Even just traffic analysis makes any kind of communication potentially leaky. Encryption matters, but it doesn't protect against everything. Even the lack of communication leaks information.


What data is known exactly? Phone number? IP?


https://signal.org/bigbrother/

You can see exactly what Signal is able to provide to a legal request. Time of last connection related to a phone number


This is one of the coolest exchanges I’ve seen between a company and the government. Class act.


Just the IP, IIRC, but it doesn't know who you are. There are more details on their blog in one of the latest blog posts.


Well, maybe they finally got their backdoor baked in and they now want to spy on their own people?


I wonder what Barr thinks of this...


I don't trust Wickr


I never heard of Wickr before this post.


It's a go to for dark market comms these days, but other than that, this is the first I've seen it mentioned on clear net.

I also didn't know it was closed-source and privately-owned in the U.S., which should have been a red flag for dark market users.


It's a red flag for many dark market users. When I see people offer their customer support on Wickr, I think "is this guy serious? everyone is really confident in this person?" and never contact anyone on Wickr.


Why not?


Closed source, a compellable US-based company, and a lack of emphasis on being installable on privacy OSes like Tails and Whonix.

They don't do anything to dispel the notion of being a honeypot and basically do the exact opposite.


I'm not them but I imagine it is because Wickr is closed source.


Well, Signal is not supported on non-phone Android devices (there was some option in the desktop dev build but it's not supported anymore, so it's not even usable on desktop). I've heard that this limitation increases security but I have a hard time understanding why. So I can't really vouch for Signal, though I wish I could. I guess requiring a physical Android phone makes it hard to create bogus accounts?

I wanted to try jami to do some p2p voip on windows. It crashes when I create an account.

It really sounds like secure alternatives are "disappearing" or just not usable or mature enough, in favor of mainstream apps that are known to be monitored. Not to mention the call quality of skype/discord can suffer because it's using servers.


While I appreciate that there are only 2 CVEs for Signal [1], that doesn't appear to tell the whole story.[2][3][4]

I don't need 'Perfect', I do need 'Better'. OS independent: *BSD, Linux, Windows, Android, iOS

I'm willing to run my own server, but the wife-unit needs something "easy" (hence the Windows requirement).

What are some reasonably secure E2E IM platforms?

[1]https://www.cvedetails.com/vulnerability-list/vendor_id-1791...

[2]https://www.google.com/amp/s/www.forbes.com/sites/daveywinde...

[3]https://www.fastcompany.com/90444005/hackers-could-have-expl...

[4]https://www.scmagazineuk.com/researchers-reveal-easily-signa...


Hmm. Define "reasonably secure". Your link [3] is about WhatsApp and [4] is a about a compromised platform. No app is secure once you hack the phone.


Signal works on Windows desktops.


www.Riot.im



