Fully Homomorphic Encryption: Secret Key Homomorphic Encryption Over Integers | 142 points by madrafi 9 months ago | 45 comments

 ```
 (c mod p) mod 2 = (p*q + 2*r + m mod p) mod 2 = (2*r + m) mod 2 = m
 ```
 This breaks if 2*r + m >= p. Even if you choose r to be small during encryption, the r values accumulate with each homomorphic operation and will eventually be too big. The only restriction stated is that r is "from a different interval than the private key one". This should be made clearer.
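A minimal sketch of the identity above, assuming the ciphertext form c = p*q + 2*r + m from the article (toy parameters; the `encrypt`/`decrypt` names and value ranges are illustrative, not the article's actual code):

```python
import random

p = 1001  # odd secret key (toy size, NOT secure)

def encrypt(m, p, r_bound=10):
    # c = p*q + 2*r + m, with q large and random, r small random noise
    q = random.randrange(100, 1000)
    r = random.randrange(r_bound)
    return p * q + 2 * r + m

def decrypt(c, p):
    # (c mod p) strips p*q, then (mod 2) strips 2*r, leaving m
    return (c % p) % 2

for m in (0, 1):
    assert decrypt(encrypt(m, p), p) == m  # correct while 2*r + m < p

# Once the noise term 2*r + m reaches p, the first reduction wraps and
# decryption breaks:
big_r = (p + 1) // 2           # 2*big_r = p + 1 > p
c_bad = p * 7 + 2 * big_r + 1  # "encryption" of m = 1 with oversized noise
print(decrypt(c_bad, p))       # 0, not 1
```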
 > Even if you choose r to be small during encryption, the r values accumulate with each homomorphic operation and will eventually be too big

 As @tuxxy points out, there is a metaphor for describing this: a "noise ceiling". To see the phrase used in context, see, for example: https://eprint.iacr.org/2011/277.pdf
 I agree. The noise ceiling was only hinted at when the post started explaining Gentry's Bootstrapping method.
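To make the "noise ceiling" concrete, here is a rough sketch (toy parameters, illustrative names): homomorphic addition sums the noise terms and multiplication roughly multiplies them, so repeated multiplication pushes the noise past p fairly quickly. The exact depth at which decryption fails depends on the random noise drawn.

```python
import random

p = 10007  # odd secret key (toy size)

def encrypt(m):
    q = random.randrange(10**3, 10**4)
    r = random.randrange(1, 4)  # fresh noise in {1, 2, 3}
    return p * q + 2 * r + m

def decrypt(c):
    return (c % p) % 2

# Shallow circuits stay under the ceiling:
a, b = encrypt(1), encrypt(1)
assert decrypt(a + b) == 0  # XOR: noise terms add (still small)
assert decrypt(a * b) == 1  # AND: noise terms multiply (still small)

# Repeated multiplication roughly squares the noise each level, so the
# ciphertext soon stops decrypting correctly:
c = encrypt(1)
depth = 0
while decrypt(c) == 1 and depth < 30:
    c = c * encrypt(1)  # still "1 AND 1", but the noise keeps growing
    depth += 1
print("decryption first failed after", depth, "multiplications")
```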
 Top of article mentions 'In the previous post...' (which is an intro to homomorphic encryption). That post is here: https://radicalrafi.github.io/posts/homomorphic-encryption/
 It was posted on HN 14 days ago.
 Thanks
 Homomorphic encryption is interesting from a mathematics perspective, but in practical terms it seems like an awful lot of effort being invested to move even more computing off of your own devices and onto the "cloud."
 Homomorphic encryption + smart contracts = an immortal, non-subpoena-able lawyer holding your signing keys in trust, who can continue to act “as you” even after your death. That’s useful, no?

 Add a cryptocurrency wallet for which the signing keys are the private keys, and now it’s more like an immortal, non-subpoena-able corporation executing your desires using its treasury assets. (Which can include hiring real people to do real-world things, and even, given that you could provision crypto mining capacity, making income to keep said corporation’s actions [relatively] self-sustaining.)
 William Gibson called; I believe he wishes to discuss a new novel with you. But seriously, while amazing in potential, having semi-intelligent contracts running around indefinitely would eventually cause havoc, either through the “Paperclip Maximizer” effect [1] or just from the accumulated weight of accrued smart contracts dragging down the economy.
 Daniel Suarez wrote it already. Check out Daemon https://amzn.com/0451228731 .
 Seconded! And also the sequel, Freedom. The basic premise of the story seems entirely plausible in the near future to me.
 Yes, though if I understood things correctly, that one had the option to add capabilities using fMRI-vetted coders.
 or an unregulatable and opaque shadow economy
 Note that the results of computations performed over FHE ciphertexts are encrypted, so you won't get much benefit out of a "FHE lawyer" without having the secret key to decrypt its results.
 Zero Knowledge Proofs are better in this scenario.
 Efficient homomorphic encryption (and my understanding is that homomorphic encryption is still far from being efficient) has benefits for physical security as well, such as removing the need to keep decrypted data in RAM.
 I think this is the biggest benefit: it enables offloading sensitive data into the cloud.
 If you're arguing that no computation should be proprietary, ever... okay. Not opening that discussion right now. But, assuming that one does, ever, want someone else to do a proprietary computation on one's data, it'd be real nice if that didn't require giving up plaintext.
 I would like to point out that although HE isn't efficient, multi-party computation is very promising.
 We already have an awful lot of data in the cloud. If Google and Facebook adopted homomorphic encryption, it would be a huge improvement.
 But they won't. The reason they're willing to store and serve all the information for us is so they can mine it for revenue. They're not going to adopt homomorphic encryption and lock themselves out from their primary (only) revenue source.
 Yes!
 I used HE in my informatics B.Sc. thesis to do privacy-preserving surveillance: store data on multiple servers and require a server majority to reconstruct (non-HE), perform face recognition on encrypted data, and then use (S?)HE to query a database for whether that face is in it, of course without the db learning the content of the face data. So, it turns out just throwing some math at the problem works (I just applied some previous work; you know what they say about the shoulders of giants) and gives you the advantages of surveillance with less potential for abuse, but the necessary computational power is absurd :( (And yeah, HE "noise" is a pain)
 Another interesting scenario for HE, closely related to this one, is accountability __without__ transparency, which throws a wrench in the old concept of 'no accountability without transparency'.
 Here it is: https://www.dropbox.com/s/9zv0xjmf4bz602a/thesis_web.pdf?dl=...

 We did some basic research on that for a seminar, and I wanted to figure out how far this could be pushed in a more "real life" setting with the four roles mentioned in the introduction. I ran into slight time problems since the implementation was more difficult than I expected, plus some out-of-university-related stuff. Anyway, maybe things have changed in the past 6 years and my conclusion from back then doesn't hold anymore, so this could be more feasible now. Let me know what you think ;-)
 I have to dig it up, I'm afraid, but I have it somewhere. Keep in mind it's only a B.Sc. ;)
 > c is odd if m = 1, c is even if m = 0 (Yes, 0 is even).

 If c is the ciphertext, then can't someone simply mod 2 and "decrypt" it?
 Is this the same technology that Numerai uses, or is it multi-party computation (https://mortendahl.github.io/2017/04/17/private-deep-learnin...)?
 It is a virtual certainty that Numerai is lying about using homomorphic encryption. The only known FHE algorithms require specialized techniques to perform computations on the encrypted data. You can't just run a standard neural net over some homomorphically encrypted data and expect an interpretable result. Yet Numerai claims that this is exactly what's possible with their data. This is clearly false. They are probably obfuscating their private signals in some extremely trivial way.
 There's also the unlikely possibility of discovering a truly homomorphic encryption scheme with no constraints on operations.
 About as likely as discovering Fermat's purported proof.
 Unclear tbh, the only thing I could tell from numerai is that the data is time series. The evaluation of the predictions isn't public afaik so you can't tell.
 I am unclear on this: it looks like it operates 1 bit at a time, so if I have a sequence of encrypted bits I can do frequency analysis. Clearly that is not the case, so what bone-headed misunderstanding am I making?
 P stays the same, but Q and R are randomly chosen for each bit. That wasn't clear to me at first, but see the commented-out definition of the "encrypt" function in the code block.
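A quick sketch of why that randomization defeats frequency analysis, under the same toy ciphertext form c = p*q + 2*r + m (parameters and names are illustrative): since p is odd, the raw parity of c is (q + m) mod 2, and a fresh random q scrambles it for every bit.

```python
import random

p = 1001  # fixed odd secret key, shared across all bits

def encrypt(m):
    q = random.randrange(10**4, 10**5)  # fresh per bit
    r = random.randrange(1, 10)         # fresh per bit
    return p * q + 2 * r + m

def decrypt(c):
    return (c % p) % 2

bits = [1, 1, 1, 0, 1, 1]
cts = [encrypt(b) for b in bits]

# Raw parity leaks nothing: c mod 2 = (q + m) mod 2, and q is random.
print([c % 2 for c in cts])       # looks random, unlike `bits`

# Only reducing mod the secret p first recovers the plaintext:
print([decrypt(c) for c in cts])  # [1, 1, 1, 0, 1, 1]
```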
 Nice catch. I wanted to make the values small so the reader can follow along with pen and paper, since that was the way I learned how it works.
 So to be clear: do you need to store all those q's and r's somewhere to decrypt, or does it still work if you throw them away because of all the mods?
 No, the point of the secret key here is that you only need p to encrypt and decrypt; in the code example, q and r are random in each run.
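A minimal check of that point, assuming the same toy scheme (names and sizes illustrative): q and r exist only inside `encrypt` and are discarded; decryption uses nothing but p, because the two reductions strip p*q and 2*r without knowing q or r.

```python
import random

p = 2017  # odd secret key; the only value you keep

def encrypt(m, p):
    q = random.randrange(10**3, 10**6)  # discarded after encryption
    r = random.randrange(1, 8)          # discarded after encryption
    return p * q + 2 * r + m

def decrypt(c, p):
    # (c mod p) removes p*q without knowing q; (mod 2) removes 2*r without r
    return (c % p) % 2

msgs = [0, 1, 1, 0]
cts = [encrypt(m, p) for m in msgs]  # the q's and r's are already gone
print([decrypt(c, p) for c in cts])  # [0, 1, 1, 0]
```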
 The best way to encrypt any data is to make the adversary think the data does not exist. For everything else, rubber-hose cryptanalysis will work.
 If your adversary decides the easiest way to get your data is to kidnap & torture you, you're probably doing a really good job. This sort of thing is called "security through obscurity", and the consensus is that it doesn't work. It can be a deterrent to adversaries lacking in skill or motivation, but it isn't a very strong layer. Attackers quickly discover that one company looks much like the next, and they develop an intuition for what sort of data a company needs to collect to accomplish its tasks. Additionally, there's tons of sensitive data that companies in certain industries are required to collect & retain, and those laws are a matter of public record. You aren't going to outfox attackers into thinking you don't have blueprints of your widgets and evaluations of your employees on file somewhere; focus on keeping them away from that data.
 Obscurity is a valid layer, it’s just not valid as a sole means of security.
 Don't think of it as a security layer; it won't serve you. Don't link to your admin interface from your homepage, that's asking for trouble, but don't expend additional effort to create obscurity thinking you're strengthening your outermost defenses. You'll waste your time while creating a false sense of security. The problem with obscurity is that it doesn't really impose asymmetric costs on the attacker. You know how much effort you spent creating a layer of obscurity, but there is no way to know how much effort the attacker has to spend to break it. Do they find your secret URL by accident? Were they an ex-employee who simply knew? Are you just not nearly as tricky as you imagine you are? You can easily work yourself to the bone creating a "layer" which is as effective as the Maginot Line.