
Why doesn't Linux apt use HTTPS? - thecodeboy
https://whydoesaptnotusehttps.com/index.html
======
dalf
Previous post:
[https://news.ycombinator.com/item?id=18958679](https://news.ycombinator.com/item?id=18958679)

------
DCKing
> Furthermore, even over an encrypted connection it is not difficult to figure
> out which files you are downloading based on the size of the transfer.
> HTTPS would therefore only be useful for downloading from a server that also
> offers other packages of similar or identical size.

This nonsensical argument again.

Eavesdropping on HTTP: _inspect the request and see which package and
version is requested_. That's it.

Eavesdropping on HTTPS: 1) build up a database of package sizes per version.
2) Reassemble HTTPS traffic to figure out what the HTTPS requests are. 3)
Account for randomized padding lengths and packages of similar sizes (what if
a minor security fix results in the same package size?). 4) Perform a lookup
of the package version in your sophisticated database.
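The lookup mechanism described above can be sketched roughly like this (a toy
illustration; all package names and sizes are invented, and real traffic
analysis would work on TLS record lengths, not clean byte counts):

```python
# Toy sketch of a size-fingerprinting attack against encrypted downloads.
# All package names and sizes below are invented for illustration.

# Step 1: the attacker's pre-built database mapping size -> candidate packages.
SIZE_DB = {}
for name, size in [
    ("openssl_1.1.1-1", 1_532_480),
    ("openssl_1.1.1-2", 1_532_480),  # minor fix, identical size: ambiguous
    ("vim_8.1-3", 987_230),
]:
    SIZE_DB.setdefault(size, []).append(name)

def guess_package(observed_bytes, slack=32):
    """Steps 3-4: match an observed transfer size, tolerating some padding."""
    candidates = []
    for size, names in SIZE_DB.items():
        if abs(observed_bytes - size) <= slack:
            candidates.extend(names)
    return candidates

print(guess_package(987_240))    # unique match: ['vim_8.1-3']
print(guess_package(1_532_480))  # collision: both openssl versions
```

Even in this idealized setting, identically sized versions collapse into one
bucket, which is exactly the ambiguity the parent comment points out.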

It's not even the same ballpark of complexity. Sure, dedicated,
semi-sophisticated targeted attackers can still eavesdrop on your HTTPS
connections, but HTTPS sure as heck protects against casual snoopers. _Which_
do you really think is more relevant in the real world? And furthermore, what
kind of attacker achieves the level of sophistication needed for such a
lookup mechanism but doesn't have the sophistication to screw you over in
some other way? There is zero understanding of economics or real-world
attacker motivations in this argument.

It boggles my mind that there are people so stubborn - or who think they're
so clever - that they would rather set up a dedicated website with a "well,
actually" argument based purely on technology than think critically about
this and work towards giving people sane defaults.

------
lol768
Let's ignore the issue of integrity and look at confidentiality:

• Browsers will reuse the same TCP connection when downloading multiple
resources. Does apt not do this? This seems like it would make inferring
package versions and names difficult.

• Is it impractical to standardize on a fixed block size that works for most
packages, and just add noise as required to 'top up' the size of the payload
to match the same size as all the others?
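The "top up" idea in the second bullet amounts to rounding every payload up
to the next fixed bucket boundary. A minimal sketch, with an invented bucket
size:

```python
import math

BUCKET = 256 * 1024  # hypothetical fixed block size: 256 KiB

def padded_length(payload_len: int) -> int:
    """Round a payload up to the next multiple of BUCKET, so an observer
    only learns which size bucket the package falls into."""
    return max(BUCKET, math.ceil(payload_len / BUCKET) * BUCKET)

# Two differently sized packages become indistinguishable on the wire:
assert padded_length(100_000) == padded_length(200_000) == 262_144
```

The trade-off is bandwidth: on average half a bucket of noise per transfer,
which is presumably why mirror operators would push back on it.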

I found these articles interesting:

• [https://tools.ietf.org/html/draft-pironti-tls-length-hiding-01](https://tools.ietf.org/html/draft-pironti-tls-length-hiding-01)

• [https://hal.inria.fr/hal-00732449/document](https://hal.inria.fr/hal-00732449/document)

Also, is there an actual PoC for any of these size-related side-channel
attacks? I'd take it all a lot more seriously if there were one.

------
crooked-v
Yet again, the perfect gets treated as the enemy of the good.

------
lucideer
I'm not sure when this site was last updated, but I'm guessing this is being
posted here due to the edits added to the top of the post this year, most
notably the link to CVE-2019-3462[0], which was reported in January.

The last time I read whydoesaptnotusehttps.com the tone of the article seemed
disappointingly in favour of the status quo. The intro to the article now
seems much more open to change.

(this site isn't on the Wayback Machine, so I'm going on memory—not sure how
significantly the article has actually changed)

[0] [https://lists.debian.org/debian-security-announce/2019/msg00010.html](https://lists.debian.org/debian-security-announce/2019/msg00010.html)

------
olliej
No one serious uses https as the authentication mechanism for raw packages -
Google, Apple, and Microsoft all sign updates/software/whatever with separate
keys.

They also aggressively pin those connections.

However, because they’re serving over https, a mitm can only DoS the update
system: they can’t change the update or dependency lists, they can’t insert
malicious content into those responses, and they can’t add cookies to the
requests and responses.

Privacy can also be fixed if you simply pull multiple resources over the same
connection (which is also faster).
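A rough illustration of why batching helps (toy numbers, invented package
sizes): an on-path observer sees only the total byte count of the reused
connection, and several different download sets can explain the same total.

```python
from itertools import combinations

# Hypothetical package sizes, invented for illustration.
SIZES = {"liba": 100, "libb": 200, "libc": 300, "libd": 400}

def combos_matching(total):
    """All package sets whose combined size equals the observed byte count."""
    names = list(SIZES)
    return [set(c) for r in range(1, len(names) + 1)
            for c in combinations(names, r)
            if sum(SIZES[n] for n in c) == total]

# One package per connection: a 200-byte transfer identifies libb uniquely.
assert combos_matching(200) == [{"libb"}]

# Two packages over one reused connection: the observer sees only 500 bytes,
# which at least two different download sets explain.
assert len(combos_matching(500)) == 2
```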

Just use https.

------
digitalsushi
In my corporate environment, we are prohibited from using HTTP, but we do not
require our certificates to be up to date. Since the proxy does not allow
Internet access except to a whitelist of hosts, we have to do something like
this in order to take an Ubuntu ISO from a vendor and convert it into an OS
template that my company can use:

echo "Acquire::http::Proxy \"http://personal-cntlm-proxy:3128\";" > /etc/apt/apt.conf

apt-get install -y apt-transport-https

echo "deb [trusted=yes] https://someserver/somedir bionic main universe multiverse" > /etc/apt/sources.list

echo "deb [trusted=yes] https://someserver/somedir bionic-updates main universe multiverse" >> /etc/apt/sources.list

echo "deb [trusted=yes] https://someserver/somedir bionic-security main universe multiverse" >> /etc/apt/sources.list

echo "Acquire::https::Verify-Peer \"false\";" > /etc/apt/apt.conf.d/80ssl-exceptions

echo "Acquire::https::Verify-Host \"false\";" >> /etc/apt/apt.conf.d/80ssl-exceptions

apt-get -y install ca-certificates # and now the server is trusted finally

echo "deb https://someserver/somedir bionic main universe multiverse" > /etc/apt/sources.list

echo "deb https://someserver/somedir bionic-updates main universe multiverse" >> /etc/apt/sources.list

echo "deb https://someserver/somedir bionic-security main universe multiverse" >> /etc/apt/sources.list

rm /etc/apt/apt.conf.d/80ssl-exceptions

Probably not anywhere near the prescribed way to do this, but everything in
corporate America has a few extra dance steps.

~~~
ldng
Not to pound on you, digitalsushi - I don't imagine you have a say in the
policy - but... "we are prohibited from using HTTP but we do not require our
certificates to be up to date"

WHAT'S THE F%@#ING POINT if you do not require proper certificates?! I've
been saying for many years that HTTPS-everywhere is not a good idea and would
end up in a false sense of security. It's like green-washing: everybody
pretends to do it, except not really...

Luckily, in the case of apt it does not matter much, but there are other
cases of blind-https-all-the-things-stupidly where it does.

Luckly, in the case of Apt it does not matter much but there are other cases
of blind-https-all-the-thing-stupidly where it does.

~~~
digitalsushi
No offense taken, there's a large city's worth of people at this company. I
definitely have nothing to do with the TLS governance board's policies.

There are so many competing, parallel efforts to solve problems, that
generally everything is a bit mediocre in terms of what is available.

It requires a bit of a forced-optimist persona to remain creative and upbeat.
On the other hand, when I go back to a small company, the sense of technical
freedom should feel remarkable.

------
dvh
At my previous job I could not install the package libelf1 through apt
because "f1" was banned on the company firewall.

------
LoSboccacc
Because packages are signed, which gives the same level of trust against
tampering as an https certificate (or even better - think how weak a
guarantee a Let's Encrypt certificate is), but allows delegating hosting to
infinite untrusted mirrors.
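This is why untrusted mirrors work: apt pins hashes in signed metadata, so
the mirror never has to be trusted. A heavily simplified sketch of that hash
chain (real Release/Packages files carry many more fields, and the GPG
signature on the Release file itself is omitted here):

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# The Packages index in turn lists hashes of the actual .deb files.
packages_index = b"Package: hello\nSHA256: <hash of hello.deb>\n"

# The (GPG-signed) Release file pins the hash of the Packages index...
release_file = {"Packages": sha256(packages_index)}

# ...so any mirror, trusted or not, can serve the index: the client just
# recomputes the hash and compares it against the pinned value.
def verify_index(received: bytes) -> bool:
    return sha256(received) == release_file["Packages"]

assert verify_index(packages_index)             # untampered copy passes
assert not verify_index(packages_index + b"x")  # any modification fails
```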

------
edf13
Haven’t we been through this a few times before?

~~~
edf13
In fact I’d go further... there is too much emphasis that https == trusted &&
safe. It doesn’t mean that.

------
overcast
whydoesthisurlexist.com

~~~
strictfp
doesntanyurlyoucomeupwithexist.com

------
0x8BADF00D
The only way to be 99% sure your packages aren’t tampered with is to use a
source-based package management tool. Even then there’s no guarantee, as you
are placing your trust in the maintainers and contributors of that particular
package.

~~~
somepig
You might consider reading up on reproducible builds. The majority of Debian
packages can now be rebuilt bit-for-bit to verify that the maintainer
actually ships what they claim to.
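The check itself is conceptually simple: two independent builds of the same
source must produce bit-for-bit identical artifacts. A toy sketch (the
"compiler" here is a stand-in, not a real build):

```python
import hashlib

def build(source: bytes, timestamp: int) -> bytes:
    # A reproducible build ignores environment details like the build time.
    return b"ELF" + hashlib.sha256(source).digest()

def non_reproducible_build(source: bytes, timestamp: int) -> bytes:
    # Embedding the build time breaks reproducibility.
    return build(source, timestamp) + timestamp.to_bytes(4, "big")

src = b"int main(void) { return 0; }"

# Independent rebuilds at different times agree, so anyone can verify the
# published binary by rebuilding and comparing hashes.
assert build(src, 1000) == build(src, 2000)

# With a timestamp baked in, rebuilds differ even though the source didn't.
assert non_reproducible_build(src, 1000) != non_reproducible_build(src, 2000)
```

Stripping exactly this kind of embedded nondeterminism (timestamps, paths,
locales) is most of the practical work of the reproducible-builds effort.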

