Hacker News
[dupe] Why doesn't Linux apt use HTTPS? (whydoesaptnotusehttps.com)
44 points by thecodeboy 4 months ago | 17 comments




> Furthermore, even over an encrypted connection it is not difficult to figure out which files you are downloading based on the size of the transfer2. HTTPS would therefore only be useful for downloading from a server that also offers other packages of similar or identical size.

This nonsensical argument again.

Eavesdropping on HTTP: inspect the request and see which package and version is requested. That's it.

Eavesdropping on HTTPS: 1) Build up a database of package sizes per version. 2) Reassemble HTTPS traffic to figure out which requests were made. 3) Account for randomized padding lengths and packages of similar sizes (what if a minor security fix results in the same package size?). 4) Perform a lookup of the package version in your sophisticated database.

It's not even in the same ballpark of complexity. Sure, dedicated, targeted, semi-sophisticated attackers can still eavesdrop on your HTTPS connections, but HTTPS sure as heck protects against casual snoopers. Which do you really think is more relevant in the real world? And furthermore, what kind of attacker achieves the level of sophistication needed for such a lookup mechanism, yet doesn't have the sophistication to screw you over in some other way? There is zero understanding of economics or real-world attacker motivations in this argument.
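To make step 4 concrete, the "sophisticated database" amounts to a reverse lookup from observed transfer size to package. A toy sketch (all package names and byte counts here are invented for illustration):

```shell
# Map an observed on-the-wire transfer size back to a package guess.
# Sizes are made up; a real attacker would crawl the mirror to build this.
lookup_by_size() {
  case "$1" in
    51234) echo "openssl 1.1.1g" ;;
    51240) echo "openssl 1.1.1g-secfix" ;;  # near-identical size: ambiguous
    98760) echo "curl 7.68.0" ;;
    *)     echo "unknown" ;;
  esac
}

lookup_by_size 51234
```

Even this toy shows the fragility: two entries a few bytes apart collapse into a guess, and any padding breaks the mapping entirely.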

It boggles my mind that there are people so stubborn - or who think they're so clever - that they'd rather set up a dedicated website with a "well, actually" argument based purely in technology. They do this instead of thinking critically and working towards giving people sane defaults.


Let's ignore the issue of integrity and look at confidentiality:

• Browsers will reuse the same TCP connection when downloading multiple resources. Does apt not do this? This seems like it would make inferring package versions and names difficult.

• Is it impractical to standardize on a fixed block size that works for most packages, and just add noise as required to 'top up' the size of the payload to match the same size as all the others?
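The second idea above can be sketched in a few lines: top up every payload to the next multiple of a fixed bucket size, so transfers of similarly sized packages become indistinguishable by length. The bucket size and file here are illustrative, not anything apt actually does:

```shell
# Pad a payload up to the next multiple of a fixed bucket size.
BLOCK=4096                             # illustrative bucket size
FILE=$(mktemp)
head -c 1000 /dev/urandom > "$FILE"    # stand-in for a package payload

SIZE=$(wc -c < "$FILE")
PAD=$(( (BLOCK - SIZE % BLOCK) % BLOCK ))
head -c "$PAD" /dev/zero >> "$FILE"    # noise/filler to reach the bucket

wc -c < "$FILE"                        # size is now a multiple of BLOCK
```

The cost is bandwidth: every download grows to the next bucket boundary, which is presumably part of why mirrors resist it.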

I found these articles interesting:

https://tools.ietf.org/html/draft-pironti-tls-length-hiding-...

https://hal.inria.fr/hal-00732449/document

Also, is there an actual PoC for any of these size-related side channel attacks? I'd take it all a lot more seriously if there was one.


Yet again, the perfect gets treated as the enemy of the good.


I'm not sure when this site was last updated, but I'm guessing this is being posted here due to the edits added to the top of the post this year, most notably the link to CVE-2019-3462[0] which was reported in January.

The last time I read whydoesaptnotusehttps.com the tone of the article seemed disappointingly in favour of the status quo. The intro to the article now seems much more open to change.

(this site isn't on the Wayback Machine, so I'm going on memory—not sure how significantly the article has actually changed)

[0] https://lists.debian.org/debian-security-announce/2019/msg00...


No one serious uses HTTPS as the authentication for raw packages - Google, Apple, and Microsoft all sign updates/software/whatever with separate keys.

They also aggressively pin those connections.

However, because they’re serving over HTTPS, a MITM can only DoS the update system: they can’t change the update or dependency lists, they can’t insert malicious content into those responses, and they can’t add cookies to the requests and responses.

Privacy can also be fixed if you simply pull multiple resources over the same connection (which is also faster).

Just use https.
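The "sign the content, serve it over anything" model described above can be sketched with openssl; file names here are made up, and apt itself signs its Release files with GPG rather than this exact mechanism:

```shell
# Repository side: generate a signing key and sign the package index.
WORK=$(mktemp -d); cd "$WORK"
openssl genpkey -algorithm RSA -out repo.key 2>/dev/null
openssl pkey -in repo.key -pubout -out repo.pub

echo "Packages index contents" > Release
openssl dgst -sha256 -sign repo.key -out Release.sig Release

# Client side: a client shipping repo.pub can verify integrity
# no matter what transport (or untrusted mirror) delivered the files.
openssl dgst -sha256 -verify repo.pub -signature Release.sig Release
```

Signing covers integrity and authenticity; HTTPS layered on top adds the freshness, privacy, and header-protection points from the parent comment.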


In my corporate environment, we are prohibited from using HTTP, but we do not require our certificates to be up to date. Since the proxy does not allow Internet access except to a whitelist of hosts, we have to do something like this in order to take an Ubuntu ISO from a vendor and convert it into an OS template that my company can use:

echo "Acquire::http::Proxy \"http://personal-cntlm-proxy:3128\";" > /etc/apt/apt.conf

apt-get install -y apt-transport-https

echo "deb [trusted=yes] https://someserver/somedir bionic main universe multiverse" > /etc/apt/sources.list

echo "deb [trusted=yes] https://someserver/somedir bionic-updates main universe multiverse" >> /etc/apt/sources.list

echo "deb [trusted=yes] https://someserver/somedir bionic-security main universe multiverse" >> /etc/apt/sources.list

echo "Acquire::https::Verify-Peer \"false\";" > /etc/apt/apt.conf.d/80ssl-exceptions

echo "Acquire::https::Verify-Host \"false\";" >> /etc/apt/apt.conf.d/80ssl-exceptions

apt-get -y install ca-certificates # and now the server is trusted finally

echo "deb https://someserver/somedir bionic main universe multiverse" > /etc/apt/sources.list

echo "deb https://someserver/somedir bionic-updates main universe multiverse" >> /etc/apt/sources.list

echo "deb https://someserver/somedir bionic-security main universe multiverse" >> /etc/apt/sources.list

rm /etc/apt/apt.conf.d/80ssl-exceptions

Probably not anywhere near the prescribed way to do this, but everything in corporate America has a few extra dance steps.


Not to pound on you digitalsushi, I don't imagine you have a say in the policy, but .. "we are prohibited from using HTTP but we do not require our certificates to be up to date"

WHAT'S THE F%@#ING POINT if you do not require a proper certificate?! I've been saying for many years that HTTPS-everywhere is not a good idea and would end up in a false sense of security. It's like green-washing; everybody pretends to do it, except not really...

Luckily, in the case of apt it does not matter much, but there are other cases of blindly-https-all-the-things where it does.


No offense taken, there's a large city's worth of people at this company. I definitely have nothing to do with the TLS governance board's policies.

There are so many competing, parallel efforts to solve problems, that generally everything is a bit mediocre in terms of what is available.

It requires a bit of a forced-optimist persona to remain creative and upbeat. On the other hand, when I go back to a small company, the sense of technical freedom should feel remarkable.


At my previous job I could not install the package libelf1 through apt because "f1" was banned on the company firewall.


Because packages are signed, which gives the same protection against tampering as an HTTPS certificate (or even better - think how weak a guarantee a Let's Encrypt certificate provides) but allows hosting to be delegated to any number of untrusted mirrors.


Haven’t we been through this a few times before?


In fact I’d go further... there is too much emphasis on the idea that https == trusted && safe. It doesn't.


whydoesthisurlexist.com


doesntanyurlyoucomeupwithexist.com


The only way to be 99% sure your packages aren't tampered with is to use a source-based package management tool. Even then there's no guarantee, as you are placing your trust in the maintainers and contributors of that particular package.


You might consider reading up on reproducible builds. The majority of Debian packages can now be bit-for-bit rebuilt to verify the maintainer actually ships what they say they are.
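The check at the heart of reproducible builds is trivially simple; what's hard is making builds deterministic enough to pass it. An illustrative sketch (two copies of the same bytes stand in for two independent builds):

```shell
# "Reproducible" means an independent rebuild is bit-for-bit identical
# to the shipped artifact, so anyone can audit what the maintainer ships.
DIR=$(mktemp -d)
printf 'compiled package contents' > "$DIR/maintainer-build.deb"
printf 'compiled package contents' > "$DIR/independent-rebuild.deb"

cmp -s "$DIR/maintainer-build.deb" "$DIR/independent-rebuild.deb" \
  && echo "bit-for-bit identical"
```

In practice the rebuild must pin toolchain versions, timestamps, and build paths, which is what the Debian reproducible-builds effort standardizes.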



