
I avoided the stack limit by using itertools.chain: https://github.com/DavidBuchanan314/millipds/blob/15727d474c...

(this is for iterating over JSON-like objects, which are just weird trees)
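Sketching the idea (a minimal illustration of the trick, not the millipds code itself - the linked URL is truncated above): keep one flat iterator, and whenever you hit a container, chain its children onto the front. Depth-first order falls out naturally, and the only limit is heap memory rather than the recursion limit.

    import itertools

    _DONE = object()  # sentinel marking an exhausted iterator

    def iter_json_values(root):
        # One flat iterator replaces the call stack: chaining a node's
        # children onto the front of `pending` gives depth-first order
        # without any recursion.
        pending = iter((root,))
        while (node := next(pending, _DONE)) is not _DONE:
            yield node
            if isinstance(node, dict):
                pending = itertools.chain(node.values(), pending)
            elif isinstance(node, list):
                pending = itertools.chain(node, pending)

    doc = {"a": [1, {"b": 2}], "c": 3}
    print(list(iter_json_values(doc)))  # every node and leaf, no RecursionError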


> "Why not just use recursive functions"

One great reason not to use recursive functions for traversing trees is that you can allocate your own stack data structure rather than relying on the call stack itself. In most languages/runtimes, the call stack has a maximum depth which limits the depth of trees you can process, usually on the order of thousands of stack frames.

Managing your own stack usually produces weirder-looking code (personally I find "naive" recursive approaches more readable) - but having it as a first-class language feature could solve that!
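To make the trade-off concrete, here's a minimal sketch of both styles side by side (the `Node` type is hypothetical, purely for illustration):

    # Hypothetical tree node, just for illustration.
    class Node:
        def __init__(self, value, children=()):
            self.value = value
            self.children = list(children)

    def visit_recursive(node):
        # Readable, but every level of depth costs a call-stack frame,
        # so deep trees raise RecursionError (CPython's default limit is 1000).
        yield node.value
        for child in node.children:
            yield from visit_recursive(child)

    def visit_explicit(node):
        # Same pre-order traversal, but the "stack" is a plain list on
        # the heap, so depth is limited only by available memory.
        stack = [node]
        while stack:
            node = stack.pop()
            yield node.value
            stack.extend(reversed(node.children))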


> it is a shame that they gave up

They didn't really give up, though - the domain verification still stands and is just as powerful as ever.


It would be nice to be able to see the final results of questions I voted on

If you repeat the test a few times, it'll average out.

That's why there are 20 rounds, I guess. If you could just press "all the same", it wouldn't have to be as many.

Interesting. I only got 15/20, and previously considered myself "above average" at colour distinction tests but based on other replies that's not an especially good score. I'll try again, going more carefully.

Ah, yes, 19/20 the second time (only the last one wrong).

The first time I kept my eyes fixed in the same place, roughly in the middle, which clearly wasn't a good idea. On the second attempt I glanced between each circle in turn, trying to discern the difference over two points in time rather than two points in space.


I have poor color discrimination, but excellent flicker detection (?). This last skill was discovered by the senior devs when I was doing GPU driver debug, and “we” were looking for an extremely transient high-refresh rate tile clear issue. The issue only occurred at 120Hz (or higher) refresh rate with solid clear color on a large screen, with nearly identical colors. About one 4x8 pixel tile every minute or so. That was a boring few days, let me tell you.

If we assume cryptographically-relevant quantum computers will one day exist, you don't just need to worry about certs being cracked before they expire, but also the ECDH-established session keys being cracked. These keys are ephemeral, but if you store the ciphertexts long-term, you can crack them at any point in the future (aka https://en.wikipedia.org/wiki/Harvest_now,_decrypt_later).

Perfect forward secrecy means "harvest now, decrypt later" does not apply to signature algorithms when ephemeral keys are used, and TLSv1.3 mandates ephemeral keys. If the ephemeral keys are cracked, that would be the fault of the key agreement algorithm, not the signature algorithm.

> If we assume cryptographically-relevant quantum computers will one day exist

One day could be 10,000 years in the future, so what meaning is there to such an assumption? You need to assume much more than that such machines will be constructed one day to suggest that there is a need for action. The industry is switching to hybrid key agreement algorithms out of an abundance of caution that it is not just one day that such a machine will be made, but one day in our lifetimes. It is not certain that will actually happen, but if it does, having adopted hybrid key exchange algorithms years in advance is enough. There is no need to switch signature algorithms from ECC until the creation of such a machine is imminent. Thus it is fine to proceed with EdDSA adoption in PKI.


The Eccfrog512ck2 curve can be used for both signatures and key agreement.

The industry is mostly pivoting to hybrid schemes, and it's sensible to want a higher-security curve to pair with a higher-security PQ algorithm.

The pivot is occurring on both key agreement and signatures. Hybrid schemes currently only exist for key agreement. Perfect forward secrecy means that as long as the key agreement schemes are secure against Shor's algorithm, we can afford a much more leisurely rollout of PQ signing algorithms in PKI. Whether people will opt for "hybrid" signatures is yet to be seen.

They say coefficient b is determined via BLAKE3, but unless I'm missing it, they don't actually say how?

They also claim that the prime modulus was chosen "carefully", and enumerate its favourable properties, but do not elaborate on how it was chosen. Presumably they had some code that looped until they found a prime that gave them all the right properties, but it would be good if they shared that process.
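For what it's worth, publishing that process could be as simple as a loop like the one below (a hypothetical sketch, not the authors' actual procedure; the criteria are placeholders for whatever properties the paper enumerates):

    from sympy import isprime  # any deterministic primality test would do

    def find_prime(start, checks):
        # Scan downward from `start` until a prime passes every check;
        # publishing this loop makes the modulus choice reproducible.
        p = start
        while True:
            if isprime(p) and all(check(p) for check in checks):
                return p
            p -= 1

    # Placeholder criterion: p ≡ 3 (mod 4) makes square roots cheap.
    criteria = [lambda p: p % 4 == 3]
    # p = find_prime(2**512 - 1, criteria)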



And even with the constant `b = BLAKE("ECCFrog512CK2 forever")` there is an open question. It's not as problematic as it is with the NIST & SEC curves, but it's the issue covered in "How to manipulate curve standards: a white paper for the black hat" [1].

I'm surprised they didn't include the constant in the paper and at least a short justification for this approach, despite stating "This ensures reproducibility and verifiable integrity" in section 3.2, whereas several other curves take the approach of 'smallest valid value that meets all constraints'.

Really they should answer the question of "Why can't `b` be zero... or 1" if they're going for efficiency, given they're already using GLV endomorphisms.

Likewise with the generator, I see no code or mention in the paper about how they selected it.

[1]: https://eprint.iacr.org/2014/571.pdf
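Pinning down the derivation would only take a few lines; something like the sketch below (the seed string is the one quoted above, but the encoding, output length, and reduction step are all guesses, since the paper doesn't specify them - which is exactly the gap being criticized):

    from blake3 import blake3  # pip install blake3

    SEED = b"ECCFrog512CK2 forever"
    P = 2**512 - 1  # PLACEHOLDER: substitute the paper's actual prime modulus

    # Guessed derivation: hash the seed to 64 bytes with BLAKE3's XOF,
    # read it as a big-endian integer, and reduce mod p to get b.
    b = int.from_bytes(blake3(SEED).digest(length=64), "big") % P
    print(hex(b))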


Agreed. I have a draft article (far from finished) with my own attempt to explain ECC, and the opening diagram is the classic "pretty pictures" with a big red cross through them. They have surprisingly little relevance in the overall picture of ECC.


