Hacker News

> Along with the recent Linux Mint hijack, this really illustrates the need for people to verify programs they download. Though I think most people can't be bothered to verify the checksum on a file every time they download it.

Barring a situation where a CDN hosting the download is compromised but the main site isn't hosted on that CDN, it's extremely unlikely that someone would have the ability to inject malware into the download but not the ability to make the checksum match. Posting checksums is actually pretty useless: it used to be a way to deal with the possibility of malicious mirrors, but it provides no security against MITM attacks (unless the main site is secure but the downloads aren't, which is idiotic by 2016 standards anyway), the site getting hacked, etc.
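To make that concrete, the manual check under discussion is just a hash comparison; a minimal sketch with made-up filenames (and note that anyone who can replace the file on the server can regenerate the checksum list right along with it):

```shell
# "Publisher" side: compute and post the checksum list.
echo "pretend this is an installer" > download.bin
sha256sum download.bin > SHA256SUMS

# "User" side: verify the download against the posted list.
sha256sum -c SHA256SUMS   # prints "download.bin: OK"
```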

Digital signatures are a little better if the key is kept safe, since hacking the site and replacing the binary won't allow a random person to produce a valid signature (although the ability to modify the source code would still let someone introduce backdoors into the next version). But there's still a huge problem: you need some way to determine which key was supposed to sign the binary in the first place, so just posting a signature on a website is also basically useless.

Digital signatures can work if there's some sort of centralized distribution method, or for safely updating software that's already installed.




In Debian and Ubuntu at least, all published files containing binary executables (ISOs, .deb packages, etc.) are hashed, and the hash list is signed by a well-known PGP key pre-installed on the system.

Given trust in the protection of the private key used to sign the hash-list file, the integrity of the executable content can be proved (assuming that creating a useful SHA-1 collision is prohibitively expensive).

Coincidentally I was writing a Bash script this weekend to auto-install (Ubuntu) releases into LVM volumes and it includes the following code to verify the download:

  set -e
  # ...
  ISO="${NEW_DIST}-desktop-${ARCH}.iso"
  for F in SHA1SUMS SHA1SUMS.gpg "${ISO}"; do
    if [ ! -r "$F" ]; then
      wget "http://cdimage.ubuntu.com/${FLAVOUR}/daily-live/current/$F"
    fi
  done

  if ! gpg --verify --keyring /etc/apt/trusted.gpg SHA1SUMS.gpg SHA1SUMS; then
    echo "Error: failed to verify the hash-list signature; files may have been tampered with" >&2
    exit 2
  fi
  if ! grep "${ISO}" SHA1SUMS | sha1sum -c; then
    echo "${ISO} is corrupted; please try again" >&2
    exit 1
  fi


Shouldn't you be using HTTPS for all downloads and grabbing the sha1s and image from different mirrors?


That is a good idea, but note the use of GPG: unless the attacker can forge that signature, the verification step will fail on a tampered download.


Right, I missed that.


Great idea.


You'd think there would be some sort of global torrent network that simultaneously distributes binaries and signatures.

Doesn't seem like a horrible idea to me: you could just add the developer's key to your client, have your client broadcast interest, receive a _signed_ list of available software with appropriate magnet info... Download servers could serve as initial trackers until enough information has propagated through the network for downloads to be trackerless. Checksums? Guaranteed. Signatures? Acquired. Checking? Performed automagically.

Granted, this just moves the point of failure to the developer's key. (Key acquisition needn't necessarily take place on the developer's site, a friend in the network could pass you a link containing the dev's key and the application's magnet info.)


> (unless the main site is secure but the downloads aren't which is idiotic by 2016 standards anyway), the site getting hacked, etc.

It's not idiotic at all. You let anyone who wants to help spread the load provide downloads, but you use checksums, served over https, to ensure those downloads can be trusted.


I thought only apps signed by "identified developers" are run by default on Macs with Gatekeeper now. Shouldn't code-signing have prevented this? Unless they inserted the malware before the signing process.


>I thought only apps signed by "identified developers" are run by default on Macs with Gatekeeper now. Shouldn't code-signing have prevented this?

"By default". Most developers don't bother to register, and lots of people change the default (and after that, they can right click to open the app and bypass the warning).


Anyone can sign up for the Apple Developer Program to become an "identified developer", so there's nothing that stops an attacker from signing their malware.


And according to the analysis [0], this is exactly what they did. They used a different cert to sign their malware.

I have to admit that Windows' UAC is better in that regard, as it shows the signer's name. But of course this is only useful if you know the "right" name.

[0] http://researchcenter.paloaltonetworks.com/2016/03/new-os-x-...


Yeah, I think this is a major issue on OS X. For the average user it is impossible to tell who signed an app, whether it is sandboxed, and what permissions it has. Hell, using the codesign command to extract entitlements from all binaries in a package is hard even for advanced users...

(There is a third-party tool named RB App Checker which does make these tasks a bit easier, though.)
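For what it's worth, the inspection can be done from the terminal with Apple's own tools; a sketch, with /Applications/Transmission.app standing in for any bundle:

```shell
# Show the certificate chain that signed the bundle (codesign writes
# its details to stderr, hence the redirect):
codesign -dv --verbose=4 /Applications/Transmission.app 2>&1 | grep Authority

# Dump the entitlements embedded in the main binary:
codesign -d --entitlements :- /Applications/Transmission.app

# Ask Gatekeeper whether it would accept the app, and on what basis:
spctl --assess --verbose /Applications/Transmission.app
```

The "Authority" lines are where the unfamiliar developer ID in the KeRanger case would have shown up, but nothing surfaces them to an ordinary user.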


Well, I guess that’s at least one advantage for apps that use Installer.app¹ to install; Installer.app makes it really easy to see the certificate².

――――――

¹ — https://en.wikipedia.org/wiki/Installer_(OS_X)

² — http://f.cl.ly/items/1s1E3n19273M1l3i3S2X/developer_id_insta...


Hold down control, right click and choose Open. Then it will run unsigned binaries (after a warning).


A small nit: just plain right click, or hold down control and left click, which is the same as right click.


The malware version was signed with the Transmission developer key.


No, it wasn’t:

The two KeRanger infected Transmission installers were signed with a legitimate certificate issued by Apple. The developer ID in this certificate is “POLISAN BOYA SANAYI VE TICARET ANONIM SIRKETI (Z7276PX673)”, which was different from the developer ID used to sign previous versions of the Transmission installer. In the code signing information, we found that these installers were generated and signed on the morning of March 4.

From: http://researchcenter.paloaltonetworks.com/2016/03/new-os-x-...


What. That's interesting -- Polisan is a relatively well-known paint company in Turkey. I don't think they have a part in this -- maybe they did not store their private keys well enough?


> which was different from the developer ID used to sign previous versions of the Transmission installer

and that didn't ring any alarm bells?


For the end user? No, it wouldn’t. As thesimon and jakobegger, respectively, said:

And according to the analysis, this is exactly what they did. They used a different cert to sign their malware. I have to admit that Windows' UAC is better in that regard, as it shows the signer's name. But of course this is only useful if you know the "right" name.

Yeah, I think this is a major issue on OS X. For the average user it is impossible to tell who signed an app, whether it is sandboxed, and what permissions it has. Hell, using the codesign command to extract entitlements from all binaries in a package is hard even for advanced users... (There is a third-party tool named RB App Checker which does make these tasks a bit easier, though.)

…in this comment thread: https://news.ycombinator.com/item?id=11234966


It did actually, but only for in-app updates [0].

[0]: https://forum.transmissionbt.com/viewtopic.php?f=4&t=17835


It could cause a failure for updates but not fresh installs.

Many people would uninstall it and download it again when running into that kind of error message.



There is also the web of trust for PGP, which sort of solves the problem of needing a central key store. It does require being inside the web, though (bootstrapping). But once you are, you can work out how much to trust a key belonging to someone you haven't met.


I really sort of expect a signing-keys-on-download-server announcement any time now.


>unless the main site is secure but the downloads aren't which is idiotic by 2016 standards anyway

https adds a performance hit. The security of "checksum over https and actual file over http", if the checksum is checked, is the same as "actual file over https", barring preimage attacks.


This attitude is completely irresponsible.

HTTPS will ensure integrity of your download automatically with no action required from the user.

Checksums require the user to perform the integrity check manually, and 99.9% of users won't bother.

Please don't put your users at risk just to save a negligibly small number of CPU cycles.


Granted, the project I saw this reasoning on (https://www.whonix.org/wiki/Download_Security) is one where users are especially likely to do security checks, and they generally aren't satisfied with the security of SSL anyway.

> just to save a negligibly small number of CPU cycles

The link above says they can't afford the additional cost. If it's so negligible, would you sponsor the cost of those extra cycles? I'm sure they would host on SSL if someone covered the cost.


Quote from a Google engineer in 2010 (it's only gotten cheaper in the last 6 years w/ advances in CPU tech) regarding SSL overhead:

> On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead. Many people believe that SSL takes a lot of CPU time and we hope the above numbers (public for the first time) will help to dispel that. [0]

[0]: https://www.imperialviolet.org/2010/06/25/overclocking-ssl.h...


None of this affects the point they're making, which is that they can't find SSL mirrors that aren't more expensive. If you find one, let them know and I'm sure they'll be happy to switch over.


It is 2016. SSL is not slow anymore. The only case where it could be deemed slow is a webpage where the browser has to download a ton of small files like images: each image would require a new connection, and each connection a full SSL handshake. Even then the fix is not to drop SSL but to bundle all the images/files into one.


Maybe not computationally, but if you're 100ms, 200ms, 300ms or more away from the rest of the internet, all the SSL handshakes really add up.


Keep in mind the topic at hand is downloading a single large file, the TLS handshake is a rounding error of the total time, regardless of where you are in the world.


This is a persistent myth: https://istlsfastyet.com


I've seen whonix say this:

https://www.whonix.org/wiki/Download_Security

>Practically it is difficult to provide SSL protected downloads at all. Many important software projects can only be downloaded in the clear, such as Ubuntu, Debian, Tails, Qubes OS, etc. This is because someone has to pay the bill and SSL (encryption) makes it more expensive. At the moment we don't have any mirror supporting SSL. We're looking for SSL supported mirrors to share the load.

Is it not true that mirrors supporting SSL are more expensive?


No, it's not true anymore. From the link you replied to:

"On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10 KB of memory per connection and less than 2% of network overhead. Many people believe that SSL/TLS takes a lot of CPU time and we hope the preceding numbers will help to dispel that." - Adam Langley, Google

Getting an SSL certificate used to be a cost, but that's taken care of now by https://letsencrypt.org/.


So can you recommend a mirror for them that supports SSL?

There are multiple named projects there that aren't using SSL, and I don't think it's just laziness. If you know of a way for them to use SSL mirrors for no additional cost, I'll work on getting them to switch over.


Debian, Ubuntu, Qubes, and others are on https://mirrors.kernel.org.

I suspect that wiki page you linked might be out of date. It seems like all of the Whonix download links on their website are over https, like the VirtualBox images https://www.whonix.org/download/12.0.0.3.2/Whonix-Workstatio....

Whonix also runs a tor mirror, which has significantly more overhead than TLS.


I know the last time I played with Whonix it was http, so I think you're right that it's a recent change.

For tails: https://tails.thecthulhu.com/. It appears to be the same server behind http://dl.amnesia.boum.org/ based on the TLS cert.

Actually using https for all of these projects is still messy, but I think the issue now is organization rather than overhead.


Huh. The language there seems to be from 2013.

I seem to remember downloading whonix from their site over HTTP around a year ago.

Do you see a tails HTTPS mirror?


This is only true for Intel and AMD x86_64 servers that have hardware-accelerated AES via the AES-NI instruction set. Software implementations of AES and the other ciphers are much, much slower. RC4 was the fastest decent software cipher for a while, but it has been found insecure and its use is discouraged. The fastest possible replacement would probably be ChaCha20, but that cipher is not widely supported yet. The other software ciphers are very slow, and certainly wouldn't be considered "fast yet".
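The gap is easy to demonstrate on a given box; a sketch (Linux-specific, and the OPENSSL_ia32cap mask is the commonly used one for hiding the AES-NI capability bit from OpenSSL):

```shell
# Does the CPU advertise AES-NI at all? Prints "aes" if so.
grep -m1 -ow aes /proc/cpuinfo

# Benchmark AES through the accelerated EVP code path...
openssl speed -elapsed -evp aes-128-cbc

# ...and again with AES-NI masked out, forcing the much slower
# software implementation:
OPENSSL_ia32cap="~0x200000200000000" openssl speed -elapsed -evp aes-128-cbc
```

On AES-NI hardware the second run is typically several times slower, which is the difference the comment above is describing.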





