Shameless plug: our team developed an open-source tool that helps organisations visualize security issues (like missing HTTPS) on a map, so it is easy for citizens (and government employees) to see if, for example, their municipality is doing a bad job, and it helps those citizens (and government organizations) improve their security.
If you see the "announcement" linked in the article, you'll notice that it's referencing this page: https://home.dotgov.gov/, which is about the HSTS preload list.
The US government probably has more software than any non-tech company out there.
Non-tech being very important here. The US government is not a technology company. So think of the worst enterprise software you've seen and the US government has it beat somewhere on some system they are still using.
How does this happen? Contracts: they award contracts to the lowest bidder to build their software, and then the contractor picks the greenest/cheapest software engineers to do the work in a mix of waterfall, agile, and whatever's older than waterfall.
On top of that, they've got mission-critical software that runs on this stuff. A silly little change like upgrading paper Notices to Airmen (NOTAMs) requires YEARS of planning, safety analysis, and buy-in from representatives of every airport, airline, and air traffic controller across the US. Multiple unions, etc.
At its worst, the US government is slower than the slowest enterprise company you've seen.
On top of all that, there is a review board for EVERY (mostly open source) library you want to import, to make sure there aren't bad actors in it. This review process is slow and pushes back on innovation. The worst thing you can do as a contractor is miss your deadline, so many of them use tried-and-true technologies to get the job done but don't innovate.
Is it by decision? Yes: the decision to pick the cheapest contractors, the decision to implement security in a way that actually slows things down, and a myriad of decisions that make innovating on government software really hard.
Is it by incompetence? Meh, yes and no. You have people even on these contracts who are railing and screaming against this. There is competence in all these places. But the pressure to slow down and do things a specific way pushes back so hard that it's really difficult to accept new risk.
And this was just my experience on one contract. Another government contract was super innovative and fun.
Government policies. Legacy code. Cascade of updating vendor contracts and waiting for them to update their product/service. The Webmaster frequently doesn't have the autonomy to be able to prioritize the project they think is most critical.
The switch from HTTP to HTTPS is almost never as simple as "call Let's Encrypt and switch the port number". If you have an existing HTTP site that includes images, scripts, styles, or frames from other HTTP resources, switching the top of the page to HTTPS may break functionality, because the browser's security settings can deny HTTP resources from being loaded on the newly HTTPS page.
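To make the mixed-content problem concrete, here's a rough sketch (my own illustration, not anyone's production tooling) that scans a saved copy of a page for http:// subresources a browser would block or warn about once the page itself moves to HTTPS; the filename is a placeholder:

```python
from html.parser import HTMLParser

# List subresources that would become mixed content if this page were HTTPS.
class MixedContentFinder(HTMLParser):
    # tag/attribute pairs that pull in subresources browsers block or warn about
    WATCHED = {("img", "src"), ("script", "src"), ("link", "href"),
               ("iframe", "src"), ("source", "src")}

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (tag, name) in self.WATCHED and value and value.startswith("http://"):
                print(f"insecure {tag} resource: {value}")

# "page.html" is a placeholder for a saved copy of the HTTP site in question.
with open("page.html", encoding="utf-8") as f:
    MixedContentFinder().feed(f.read())
```

Even a clean scan doesn't guarantee anything (scripts can inject http:// URLs at runtime), which is part of why these migrations take longer than "switch the port number".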
That said, there's really no reason any government shouldn't prioritize these fixes... except we are in the middle of a COVID-pocalypse where budgets are likely to get slashed. Sometimes it's "incompetence" of management (not seeing the risk after it's described to them, or their underlings not communicating the risk), but mostly it's the structural issues above.
If I understand things like EARN IT correctly, they want to limit end-to-end encryption. I don't think they have a problem with HTTPS or any of the other transport encryption schemes.
I know it's a bit of a nitpick, but it's important that they know we understand the laws they want to push through and STILL disagree with them.
Inertia, forgetfulness, lack of decision. Which can be construed as incompetence if it goes on for too long. If it's for a website that started within the last three years, it's direct incompetence.
>>> If people don't want to see my site with random trash inserted into it, they can choose not to access it through broken and/or compromised networks.
I regret reading this article. Damn, it's just constantly "But my site doesn't need HTTPS". So many of the responses to criticism don't address the criticism at all and just parrot "but I don't need it".
Make a valid argument and I'll listen to you, but parroting "but I don't need it" does nothing to add to the conversation at all.
Is adding HTTPS a 100% required critical sev0 ship-blocking bug on every site in the world? No. Is it a valuable improvement to add HTTPS? I'd argue that it is.
I don't believe they are making encryption illegal. Rather, they would likely require any org doing TLS for them to provide lawful intercept, logging, audit and compliance. If a government org is managing its own network stack, then requiring TLS would be a non-issue, as they can provide their audit orgs with all the data they want, as they likely do today. From all the bills I have read, there is nothing that makes encryption illegal. They want the ability to access data from the servers after it has been decrypted. This includes a way to intercept what users perceive to be end-to-end encryption.
The only problem I have run into with various government orgs is the lack of knowledge around implementing intermediate certificates. They will often try to talk people into installing their certs rather than installing the intermediate certs correctly. I always point them to testssl.sh [1].
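For what it's worth, the symptom is easy to spot even without testssl.sh. A minimal sketch using Python's standard ssl module (the hostname is a placeholder); a server that doesn't send its intermediate certificate typically fails verification even though its leaf cert is perfectly valid:

```python
import socket
import ssl

# Verify a server's chain against the system trust store. A server that omits
# its intermediate certificate usually fails with
# "unable to get local issuer certificate".
def check_chain(host, port=443):
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                print(f"{host}: OK ({tls.version()})")
    except ssl.SSLCertVerificationError as e:
        print(f"{host}: FAILED ({e.verify_message})")

check_chain("example.gov")  # placeholder hostname
```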
I am having trouble parsing your comment. What do you mean by,
> I don't believe they are making encryption illegal....They want the ability to access data from the servers after it has been decrypted. This includes a way to intercept what users perceive to be end-to-end encryption.
This is contradictory, no? The data is not encrypted between 2 users if a 3rd user can read it. It sounds like every bill you read makes encryption illegal, by your own explanation of the bill? I just don't understand what you mean here.
Maybe it's my personal understanding of 'encryption'? If Alice encrypts a message to Bob, and also gives the keys to Eve, but doesn't tell Bob that Eve has the keys, then I do not personally consider that message encrypted at all. Do you? What about the bills?
A message encrypted with a key held by a potential attacker is not an encrypted message any more.
Perhaps this is a terminology issue. Encryption being illegal implies to me that if someone uses encryption, they are violating a law and will be penalized. Providing lawful intercept (backdoors) does not preclude the use of encryption, but rather, renders it non-existent for those with the lawful intercept access. Where this gets really sketchy is if companies give users the perception that they have end-to-end encryption but also have a way to intercept that traffic. So I think we are in effect saying the same thing; I am just saying that encryption is not illegal, just rendered non-existent for specific parties. This is actually already in place today in many of the big centralized service providers. I believe these proposed laws are just a way to make this official, so that companies are protected and less likely to resist requests for information for fear of legal repercussion.
> but rather, renders it non-existent for those with the lawful intercept access
And for plenty of people who don't have the lawful intercept access. This downgrades encryption from "encrypts stuff" to "obfuscates stuff"; while technically incorrect, it's probably better to describe this as "no longer encryption" because it's clearer to non-techies.
No, encryption still encrypts stuff. There is no obfuscation being suggested in these bills. Rather, some groups will have the ability to see data without encryption, as they do today, but with easier access, less red tape, and less reliance on inside deals, people looking the other way, or NSLs. Please don't get me wrong, I am not in support of these bills. I think it is risky to incorrectly describe what is happening, especially for non-technical people.
I believe the right way to describe this to non-technical people would be: "Today, some people can see your traffic even if it is encrypted. If one of these bills (there are a few versions) is approved, then more groups will be able to see your data, more easily and quickly."
If you want to get people more interested in speaking up, then the best lessons learned would be from the interview between John Oliver and Edward Snowden. John interviewed many people on the street about the NSA having access to all their phone data. Nobody cared until he clarified and said, "So you would be OK with government employees looking at your significant other's nude pics / sexting pics". At that point, most of the people interviewed said they would be furious.
After observing the mountains of opposition to EARN IT on large tech sites like Reddit and HN, I am starting to believe the "banning encryption" meme was started by tech companies to cripple the bill, even though the word "encryption" isn't mentioned anywhere in the bill text. Guessing they don't like the fact that the bill would make sites liable for illegal user-submitted content on their websites:
"(3) MATTERS ADDRESSED.—The matters addressed by the recommended best practices developed and submitted by the Commission under paragraph (1) shall include—
(A) preventing, identifying, disrupting, and reporting child sexual exploitation;
(B) coordinating with non-profit organizations and other providers of interactive computer services to preserve, remove from view, and report child sexual exploitation;
(C) retaining child sexual exploitation content and related user identification and location data;
(D) receiving and triaging reports of child sexual exploitation by users of interactive computer services, including self-reporting;
(E) implementing a standard rating and categorization system to identify the type and severity of child sexual abuse material;
(F) training and supporting content moderators who review child sexual exploitation content for the purposes of preventing and disrupting online child sexual exploitation;
(G) preparing and issuing transparency reports, including disclosures in terms of service, relating to identifying, categorizing, and reporting child sexual exploitation and efforts to prevent and disrupt online child sexual exploitation;
(H) coordinating with voluntary initiatives offered among and to providers of interactive computer services relating to identifying, categorizing, and reporting child sexual exploitation;
(I) employing age rating and age gating systems to reduce child sexual exploitation;
(J) offering parental control products that enable customers to limit the types of websites, social media platforms, and internet content that are accessible to children; and
(K) contractual and operational practices to ensure third parties, contractors, and affiliates comply with the best practices."
From the EFF (a civil liberties organization, not a tech company) coverage:
> We’ve explained how the EARN IT Act could be used to drastically undermine encryption. Although the bill doesn’t use the word “encryption” in its text, it gives government officials like Attorney General William Barr the power to compel online service providers to break encryption or be exposed to potentially crushing legal liability.
> The bill also violates the Constitution’s protections for free speech and privacy. As Congress considers the EARN IT Act—which would require online platforms to comply with to-be-determined “best practices” in order to preserve certain protections from criminal and civil liability for user-generated content under Section 230 (47 U.S.C. § 230)—it’s important to highlight the bill’s First and Fourth Amendment problems.
> could be used to drastically undermine encryption
That doesn't mean that is what's going to happen. The potential actions to be taken would still take in recommendations from provider input, as stated in the bill. All the chatter online seems to speak definitively, as if "could be used" were "will be used".
I don't disagree that there is only a probability, but I think you are ignoring the very likely possibility that this is part of a multi-faceted pressure campaign on social media companies by this president, his attorney general, and the Congress critters who are sponsoring this legislation.
Using vague wording in legislation is a "bad code smell". So is giving the authority to the DoJ, which is headed by a partisan, nominated official and is very biased in the prosecution of those who are accused of breaking this law. I don't put any credence in "still take in recommendations from provider input"; I assume Congress and the DoJ can ignore any and all feedback if they choose to make a partisan show of a specific case/issue (and they have, when it comes to demanding encryption be broken to unlock the phones of terrorists, child sexual abusers, and drug dealers).
I have a friend in the employ of the federal government who was raised Catholic. One time we were having a conversation that started about the Catholic Church. He said that some of the decisions of the Church seem odd and slow because it makes decisions on the scale of a hundred years. They intend to still be around by then.
The USA Feds seem to be on a similar plan, but it's more like a 10 year lag, and it finally seems to them like this HTTPS thing isn't just a fad.
Funnily enough the official website for the Holy See is also not HTTPS encrypted. (http://www.vatican.va)
This is a really good point. Organizations that play a stabilization role tend to benefit from this kind of dampened response to the various proddings of rapidly evolving tech.
From my experience as a US government-adjacent worker, there are a few agencies that are really working hard to drag the govt into the present. The General Services Administration's digital efforts (login.gov, Digital Strategy, etc.) are fighting inertia to try to modernize the gov't. Additionally, the public research being done at NASA/NIH/other labs requires a lot of infrastructure, and that's spilling over into other labs and agencies.
However, it's all fighting the insane bureaucratic system that requires mountains of paperwork to do anything, as well as requirements to contract out jobs to expensive and shitty contractors.
This has nothing to do with any administration; governments move slowly. They have a lot of internal special interests (and I'm not talking about contractors) that fight to keep the status quo, largely because change can lead to people's jobs becoming irrelevant, etc. Then there are heaps of channels and regulations you have to go through to get anything done, even if you are an independent agency. Anyone that has worked in a large corporation or government knows how it works. The bigger it is, the harder it is to implement change... even if it is the most basic change imaginable. This is one reason I currently would rather work for smaller companies where my voice can be heard and changes are more agile. Granted, if a big tech giant like Microsoft offered me a job I wouldn't decline... but working at some other corporations/governments, especially old media companies, is like trying to roll a boulder up a muddy hill in terms of getting anything done, and I'd want nothing to do with it.
Hi, this is Boris Ning from United States Digital Service.
Can you go into specifics of the "wonky certificate settings"? I can probably help you out with that or at least bring it to the attention of the team here at VA.
They're likely referring to different parts of the federal government maintaining separate PKI. For example, the DoD has a separate certificate structure (https://public.cyber.mil/pki-pke/), and these certificates aren't commonly pre-installed on platforms used by US citizens.
Most federal agencies have their own internal PKI systems, and the DoD's is probably more unusual than others' because its infrastructure is bigger, older, and governed by different regulations.
Most civilian agencies, such as the VA (as opposed to the DoD), should utilize public PKI / public CAs for their certificates.
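To illustrate why that matters in practice, a minimal sketch (the hostname and bundle path below are placeholders, not official locations): a site whose chain terminates in an agency-run root fails verification against the default system trust store and only verifies once you supply the separately distributed roots yourself.

```python
import socket
import ssl

# Illustration only: an internal site signed by an agency-run CA won't verify
# against the default system store; it verifies only if you point the client at
# the agency's root bundle, which you have to obtain and install yourself.
def verify(host, cafile=None):
    ctx = ssl.create_default_context(cafile=cafile)
    try:
        with socket.create_connection((host, 443), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return "trusted"
    except ssl.SSLCertVerificationError as e:
        return f"not trusted: {e.verify_message}"

print(verify("internal.example.mil"))                      # system store: fails
print(verify("internal.example.mil", "agency_roots.pem"))  # agency roots: works
```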
I don't know if calling out the VA specifically is particularly fair on my part – it's possible my issue has been solely when attempting to access DoD sites secured by DoD certificates. Does any other government org in-house its certificates for internal sites in a way that is completely divorced from other root authorities?
Feedback and comments are always welcome, at least I welcome them :D.
I can't speak for all government agencies, but generally there is an internal CA for hosting internal sites. I remember reading a comment from the Federal PKI guide that this sort of infrastructure goes back to before 2004.
"Prior to 2004, some agencies had already deployed and invested in their own PKI and CAs. Some of these agencies opted out of migrating to the SSP Program and continued to manage their existing infrastructures. These Federal Agencies Legacy operate one or more CAs that are cross-certified with a Federal PKI Trust Infrastructure CA."
- https://fpki.idmanagement.gov/ca/
Here's a very short list of public CA certificates from Treasury, and it lists out public key certificates for many other agencies as well.
- https://pki.treas.gov/crl_certs.htm
even better would be to kick google (and microsoft to a lesser extent) out of any and all government services, including gmail, analytics, captcha, android, forms, js, fonts, apis, storage, etc. it maddens me to have to turn over sensitive data to google to get government services nowadays. they've crept into every dark corner and crevice of government.
the governments we have are already wanting to surveil us plenty, we don't need a shadow entity doing it too.
I think the government trying to build these tools and services themselves, or trying to procure them from a traditional government services company, would be an absolute disaster and colossal waste of money and time. No thanks.
Government and the military should absolutely be encouraged to write their own in-house tools. That would dramatically improve quality of service and simultaneously reduce government expenses.
> That would dramatically improve quality of service...
I find this doubtful.
> ... and simultaneously reduce government expenses.
I find this almost tautologically impossible. By and large the government (my corner of it anyway) doesn't pay extra for licenses to commercial software. The idea that we could somehow find and hire enough developers to rewrite the entire Microsoft ecosystem, and somehow do it for less than it costs us to pay for Office 365 every year, requires some serious justification.
> By and large the government (my corner of it anyway) doesn't pay extra for licenses to commercial software
No, but governments often do (under policies with a built-in, explicit preference for COTS or MOTS solutions because [in terms of public rationale] they are presumed to be especially cost-effective) pay for commercial software that's not well adapted to the specialized circumstances of government agencies, which, even when broadly similar to private actors, are often significantly governed by unusual or even sui generis considerations applying only to government, or to that particular government jurisdiction, or even only to that particular agency of that particular government. And then they pay extra (either in custom modifications or in business process adaptations to the poorly-aligned software) for the mismatch.
In some cases this has knock-on effects, because managing the government's cost ends up with the government requiring or encouraging trading partners (explicitly, or practically because of the requirements of interfacing with the proprietary system), including intergovernmental ones with the same kinds of business-fit issues, to use COTS/MOTS solutions from the same vendor.
Maybe I misunderstood. What tools do you think the government should be building in house? If MS Word isn't something you're advocating that we rewrite in house, then I don't understand what you're advocating that we don't already do.
You are really just complaining about the security model that the browser has chosen. When you visit a website, you choose to trust the content the owner sends back and all of the transitive trust that the owner embodies by importing images, scripts, styles, frames, forms, and hyperlinks.
And to the extent you are talking about how government works (online or offline), they manage to choose some terrible vendors. You don't get to opt out of those offline vendors, so what's the point of complaining about just a few online ones?
> "You are really just complaining about the security model that the browser has chosen."
no, i'm not. everything we do is intimately tied together with huge slatherings of trust, browsing included. it's how societies and civilizations fundamentally work.
however, some entities like google have proven themselves less trustworthy. so let's move up the vendor ladder of trustworthiness, rather than staying stuck at the bottom.
> "...so what's the point of complaining about just a few online ones?"
with that kind of defeatism, what's the point of anything?
every improvement helps, particularly when it involves one of the, if not the, largest privacy-invading, data harvesting machines on the planet.
AFAIK, my corner of the government doesn't use Google or Microsoft services for any public-facing services. That said, we do use Google and MS (and AWS and GitHub and ...) internally, but with a huge caveat. Absolutely all of those services run exclusively on government infrastructure. None of the data hosted there is in any way shared or accessible to the companies providing the software.
awesome! please convince your colleagues to do the same. maybe it will trickle down to california, and LA in particular, where gov is love-nuzzling with google.
Ah, one of the downsides of working for the Federal government is that you forget that "government" isn't just the big players. Although I can't say I've encountered this in my interactions with the California government either.
I would say that maybe there's a role for the Federal government to play in providing data hosting for smaller governments who can't afford to do it themselves, but with the general distrust of the Feds these days, I bet most here (including me, frankly) would rather give their data to MS and Google.
i'm personally no fan of the gov-corp false dichotomy--we can neither completely trust nor distrust any institution. we the people use them as tools for a better society, not as tribal members, and need to stay vigilant of any concentration or expression of power, like surveillance. it's our duty.
but yes, fed-gov providing smaller govs enterprise services seems like a potentially good and more efficient use of tax dollars, though concentrating surveillance is a concern.
To be fair, not all of the United States government wants to be on HTTPS (all that quickly) and not all of the government wants to backdoor encryption. I think it's important to point out that some of the US Government has a more tempered approach to the issue.
The NSA tends to ignore the domestic quibbles about encryption because they tend to have much more flexibility in how they work around encryption hurdles. The DOJ and local police departments scream bloody murder about encryption every time there is one phone they can't access, and a few Congress critters use that call as a political wedge issue.
Proponents of backdoors are beating the drum louder, harder, and more often as time goes on. They are playing the long game. It seriously concerns me that morons want to take us in the wrong direction.
one is that our network is obscenely open and used in weird ways.
public ips handed out to all the things via dhcp. dynamic hostnames (generated from the dhcp request) on a subdomain of our .gov for all the things. similarly static ips and top level dns records on our .gov are passed out like candy.
the border is heavily firewalled, and all networks are heavily sniffed and monitored, but everyone has a public ip with a .gov hostname. the network users consist of thousands of academics and scientists who use the network in fun and interesting ways, frequently without tls.
changing this culture is likely way more difficult than making config changes on bind and dhcpd
I've slowly learned to stop asking, and just try to keep my sobbing down during calls
You make it sound like the US government is one homogenous voice with one opinion. A republican senator introduced one bill that affects HTTPS, while a completely separate non-legislative government group is advocating for using HTTPS on the sites they control.
>Which is it
It's both: there are two separate groups advocating for two separate things here.
The government being able to use HTTPS on their sites is not logically inconsistent with a requirement that the government be able to get through any encryption.
There's a pretty significant difference between "no HTTPS" and "mandatory decryption for consumer electronic devices with a warrant", even if the latter is still a bad policy.
It's free and easy to set up for your own country if you're interested: https://websecuritymap.org/
Example of the site running for the Netherlands: https://basisbeveiliging.nl/