
The reason this isn't the same is that addition is associative, so you can parallelize it like this: `((1+1)+1)+1 = (1+1)+(1+1)`.
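As a rough sketch of that split (the chunking into four workers here is purely illustrative), each partial sum could run on its own core and the results combine at the end:

  # Associativity lets you sum chunks independently and combine the partials.
  def chunked_sum(values, workers=4):
      step = (len(values) + workers - 1) // workers
      partials = [sum(values[k:k + step]) for k in range(0, len(values), step)]
      return sum(partials)  # combining the partials gives the same total

  print(chunked_sum([1] * 16))  # 16, same as ((1+1)+1)+... done serially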

A fast way to calculate something like `g^x` for some large `x` is to iteratively square:

  g^1
  g^2
  g^4
  ...
  g^(2^i)
and the puzzle is designed so that `i` is large (`79_685_186_856_218`).

Of course, this doesn't prove that computing `g^x` is sequential, but it does show why it is different from computing `1+1+...+1`.
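A minimal sketch of that sequential loop (toy parameters below, chosen only for illustration; the actual puzzle uses a 2048-bit modulus):

  # Repeated squaring: every step needs the previous result,
  # so the t squarings can't be spread across machines.
  def sequential_square(g, t, n):
      x = g % n
      for _ in range(t):       # t = 79_685_186_856_218 in the puzzle
          x = (x * x) % n      # after k passes, x == g^(2^k) mod n
      return x

  # Toy values, not the puzzle's parameters.
  print(sequential_square(2, 10, 1_000_003))  # 2^(2^10) mod 1_000_003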




Doesn't this suggest that computing 2^i is just as hard?

You can compute g^(2^i) by just multiplying g's in the same way that you can compute 2^i by adding 1s, and multiplication is as associative as addition is. Specifying that you want g^(2^i) rather than g^k for some arbitrary k looks like a way of ensuring that a partial product computed in parallel (say, g^3) will be of nearly no value, because every partial product that helps compute the total will already be produced by the single agent doing squares.

So why compute g^(2^i) rather than just 2^i? Is 2^i in fact hard to compute? (In binary representation, it certainly isn't, but maybe 3^i is?)


Using the doubling trick, you can compute `2^i` in `log i` time, so you can compute `2^(2^i)` in `log 2^i = i` time. Let `i` be big enough, and you've got a "difficult" problem.
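A quick sketch of that doubling trick (just the standard square-and-multiply loop; Python's built-in `pow` does the same thing for integer arguments):

  # Square-and-multiply: about log2(e) squarings to compute b^e,
  # so 2^i takes ~log i steps, but 2^(2^i) still takes ~i steps.
  def fast_pow(b, e):
      result = 1
      while e:
          if e & 1:
              result *= b
          b *= b
          e >>= 1
      return result

  print(fast_pow(2, 79) == 2 ** 79)  # True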



