> The computation cannot be parallelized

Is there a proof of this somewhere? Operations that seem vaguely similar, like doubling repeatedly, can be easy to "parallelize".

 The puzzle is based on a 1996 paper by Rivest, Shamir and Wagner: https://people.csail.mit.edu/rivest/pubs/RSW96.pdf

Each step depends on the result of the previous step. I don't know of any claim that the crypto/math behind the paper doesn't hold.

You can use an FPGA or build an ASIC, but as far as I know, as long as the crypto in this paper holds, it's not parallelizable.
 > Each step depends on the result of the previous step.

But this doesn't mean anything on its own. If I said to take a number and add 1 eighty trillion times, each step would depend on the result of the previous step, but obviously it would be unnecessary to actually compute each step.
 The reason this isn't the same is that addition is associative, so you can parallelize it like this: `((1+1)+1)+1 = (1+1)+(1+1)`.

A fast way to calculate something like `g^x` for some large `x` is to iteratively square:

    g^1, g^2, g^4, ..., g^(2^i)

and the puzzle is designed so that `i` is large (`79_685_186_856_218`).

Of course, this doesn't prove that computing `g^x` is sequential, but it does show why it is different from computing `1+1+...+1`.
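To make the contrast concrete, here is a minimal Python sketch of the puzzle's core loop (the function name and values are mine, toy-sized, not the real LCS35 parameters): each squaring consumes the previous result, which is exactly the dependency chain being discussed.

```python
def sequential_squarings(g, t, n):
    """Compute g^(2^t) mod n by t modular squarings in sequence."""
    x = g % n
    for _ in range(t):
        x = (x * x) % n  # each step needs the previous step's output
    return x

# Unlike 1+1+...+1, there is no known way to combine partial results
# computed in parallel here (short of factoring n to find a shortcut).
print(sequential_squarings(2, 5, 1000))  # 2^(2^5) mod 1000
```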
 Doesn't this suggest that computing 2^i is just as hard?

You can compute g^(2^i) by just multiplying g's, in the same way that you can compute 2^i by adding 1s, and multiplication is as associative as addition is. Specifying that you want g^(2^i) rather than g^k for some arbitrary k looks like a way of ensuring that a partial product computed in parallel (say, g^3) will be of nearly no value, because every partial product that helps compute the total will already be produced by the single agent doing squares.

So why compute g^(2^i) rather than just 2^i? Is 2^i in fact hard to compute? (In binary representation it certainly isn't, but maybe 3^i is?)
 Using the doubling trick you can compute `2^i` in `log i` time, so you can compute `2^(2^i)` in `log 2^i = i` time. Let `i` be big enough, and you've got a "difficult" problem.
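The "doubling trick" here is square-and-multiply: `O(log e)` multiplications for `b^e`. A minimal Python sketch (the function name is mine, for illustration):

```python
def fast_pow(b, e, mod=None):
    """Square-and-multiply: compute b^e (optionally mod `mod`) in O(log e) steps."""
    result = 1
    while e > 0:
        if e & 1:  # current low bit of the exponent is set
            result = result * b if mod is None else (result * b) % mod
        b = b * b if mod is None else (b * b) % mod  # the "doubling" step
        e >>= 1
    return result
```

This is why `2^i` on its own is cheap: only `log i` squarings are needed. The puzzle's exponent is itself a power of two, `2^t`, so reaching `g^(2^t)` this way still takes `t` squarings.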
 Read the paper. It tells you why they think it's better than that, and I think that argument has stood so far.
 > Read the paper. It tells you why they think it's better than that.

OK, I read the paper. This is the entire discussion of the inherent difficulty of repeated squaring:

> More importantly, repeated squaring seems to be an "intrinsically sequential" process. We know of no obvious way to parallelize it to any large degree. (A small amount of parallelization may be possible within each squaring.) Having many computers is no better than having one. (But having one fast computer is better than one slow one.) The degree of variation in how long it might take to solve the puzzle depends on the variation in the speed of single computers, and not on one's total budget.

I see two problems here:

(1) This is a purely conclusory argument, suggesting that a better response would have been a simple "no, there's no proof".

(2) It does not even attempt to address the question I posed, which was whether it is necessary to compute the intermediate results in order to compute the final result.

In fact, we know that the intermediate results are not necessary to compute the final result, because that is how the message gets enciphered in the first place: it can be done by factoring the modulus, and obviously there is no proof that factoring is difficult.

But beyond that, it also isn't immediately obvious that it must necessarily be impossible to compute the final result directly by means that don't require factoring the modulus. It is possible that somewhere in the literature on Blum Blum Shub this question has been analyzed, but that goes a bit beyond your suggestion to "read the paper".

So, I'm stumped. What did you see in the paper that you thought would answer my question?
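The shortcut being referred to (whoever knows the factorization can encipher without doing the squarings) can be sketched in Python. The primes, `g`, and `t` below are made-up toy values, not the puzzle's actual 2048-bit modulus; the trick is Euler's theorem, which lets the holder of `p` and `q` reduce the huge exponent `2^t` modulo `phi(n)` first.

```python
p, q = 1009, 2003            # hypothetical toy primes; the real modulus is 2048-bit
n = p * q
phi = (p - 1) * (q - 1)
g, t = 2, 1000               # solvers must do t sequential squarings

# Slow path: t sequential modular squarings, no known parallel shortcut.
x = g % n
for _ in range(t):
    x = (x * x) % n

# Fast path, available only with the secret factorization: reduce the
# exponent 2^t mod phi(n), then do one O(log) modular exponentiation.
# (Valid by Euler's theorem since gcd(g, n) == 1.)
e = pow(2, t, phi)
shortcut = pow(g, e, n)
assert shortcut == x
```

This is what grounds the point above: the final value is reachable without the intermediate squarings, so the open question is only whether any route avoids both the squarings and the factoring.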
 > In fact, we know that the intermediate results are not necessary to compute the final result, because that is how the message gets enciphered in the first place. It can be done by factoring the modulus, and obviously there is no proof that that is difficult to do.

Well, there's no mathematical proof. But physicists and judges would accept that there's lots of money and fame to be made breaking e.g. RSA, so countless people have tried and keep trying, but haven't come up with much so far.
 The answer to your question is, "No, there is no proof."

This is similar to the answer to, "Is there any proof that RSA is uncrackable?"

The security is based upon the assumed hardness of certain mathematical problems.
 I hate it when people think they know better than the experts. Do you really think that if they could have made it run in parallel they wouldn't have? Get off your high horse and solve it yourself if you're so smart.
 Seems to me like a good-faith effort to understand. And respect for the experts isn't the same as unquestioningly accepting all of their assumptions.
