
For growing hashes, we can limit the growth of the search complexity by choosing to accept a more constrained range of hashes in each iteration.

Suppose H is an L-bit hash function and let H_n be the commit hash at the nth iteration. If we seek to generate H_{n+1} subject only to the constraint H_{n+1} > H_n, the search complexity grows exponentially as you describe above.

But suppose we search subject to the additional constraint H_n < H_{n+1} < min(H_n + k, 2^L) for some constant k. So long as 2^L - H_n > k, the search complexity for H_{n+1} depends only on the constant k and the hash width L.
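A minimal sketch of that windowed search, using a toy 32-bit hash (top bits of SHA-256) rather than real git SHA-1s; `toy_hash` and `mine_next` are illustrative names, not anything from the thread's projects. Acceptance probability per attempt is about k / 2^L, so expected work is about 2^L / k per commit, independent of H_n while the window fits:

```python
import hashlib

L = 32  # toy hash width in bits (real SHA-1 is 160); small so the search is fast

def toy_hash(data: bytes) -> int:
    """First L bits of SHA-256, interpreted as an integer."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big") >> (256 - L)

def mine_next(prev_hash: int, payload: bytes, k: int) -> tuple[int, int]:
    """Find a nonce so that prev_hash < H < min(prev_hash + k, 2**L).

    Expected attempts ~ 2**L / k, constant per commit while 2**L - prev_hash > k.
    """
    hi = min(prev_hash + k, 2 ** L)
    nonce = 0
    while True:
        h = toy_hash(payload + nonce.to_bytes(8, "big"))
        if prev_hash < h < hi:
            return h, nonce
        nonce += 1
```

With k = 2^20 and L = 32 this averages roughly 2^12 attempts per commit, and each accepted hash advances by at most k, so the chain can absorb on the order of 2^L / k commits before the window hits the ceiling.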

Heh, you made me consider if kudeki-chain also restricted the range (though not to a constant):

One could read the title as requiring exactly n leading zeros, but of course it just means the hash has at least n leading zeros (it may have more), and the py code is implemented that way. So, for example, a hash that starts with 16 leading zeros could be used for any of the first 16 commits.

  # "work" = number of leading zeros in the hex digest
  work = len(sha1) - len(sha1.lstrip('0'))
  assert commit_num <= work, "Not enough work done."
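Wrapping that check in a function makes the "at least n zeros" reading concrete; `has_enough_work` is a hypothetical name, not part of kudeki-chain itself:

```python
def has_enough_work(sha1_hex: str, commit_num: int) -> bool:
    """True if the digest has at least commit_num leading zero hex digits."""
    work = len(sha1_hex) - len(sha1_hex.lstrip('0'))
    return commit_num <= work

# A digest with 16 leading zeros satisfies commits 1 through 16, but not 17:
sixteen_zeros = "0" * 16 + "a" * 24  # 40 hex chars, like a SHA-1 digest
```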

BTW, cute that assert is exactly the right tool for the CI tests that decide whether a commit is accepted.

Sure, it works, but it doesn't have the fun of the problem starting easy and getting harder over time.

Setting k = sqrt(2^L - H_n) would probably be a fun middle ground, though I'm too lazy to analyze its properties.
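A rough back-of-the-envelope for that variant, assuming the same windowed search as above: with k = sqrt(2^L - H_n), acceptance probability per attempt is about k / 2^L, so expected attempts per commit are about 2^L / sqrt(2^L - H_n). The cost starts at sqrt(2^L) and climbs as H_n approaches the ceiling:

```python
import math

L = 32  # toy hash width; same assumption as the earlier sketch

def expected_attempts(h_n: int) -> float:
    """Expected search attempts for the next commit when k = sqrt(2^L - h_n).

    Acceptance probability per attempt ~ k / 2^L, so expected attempts
    ~ 2^L / k = 2^L / sqrt(2^L - h_n).
    """
    k = math.sqrt(2 ** L - h_n)
    return (2 ** L) / k
```

At h_n = 0 this is 2^(L/2) attempts, and by the time only 2^16 of the range is left it has grown to 2^24, so the difficulty curve is gentle early and steep late, which seems like the middle ground being described.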
