
No, that would be silly - TCP handles "infinite" length streams. The problem is that the cryptographic handshake carries so much extra data that the latency of establishing the connection is meaningfully higher and performance degrades by a meaningful amount. That's all.

I don't know why OP brought in MTU and packet sizes, since that doesn't really apply here. The most you could say is that the size exceeds the TCP window, requiring an explicit ACK, but that's unlikely (windows are quite big), and everything I've read only talked about the handshake latency being caused by the much larger data exchange needed (i.e. where a classical TLS key share is 32 bytes each way, Kyber and friends need to send on the order of 1 KiB each way [1]).

[1] https://blog.cloudflare.com/post-quantum-for-all/
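To put rough numbers on the size difference: a back-of-the-envelope sketch using the published X25519 and ML-KEM-768 (Kyber768) parameter sizes, not a model of the actual TLS wire format.

```python
# Illustrative sizes only (from the published parameter sets),
# counting just the key-exchange material, not TLS framing.
X25519_SHARE = 32            # bytes, classical key share each way
KYBER768_PUBKEY = 1184       # bytes, encapsulation key (client -> server)
KYBER768_CIPHERTEXT = 1088   # bytes, ciphertext (server -> client)

classical_total = 2 * X25519_SHARE
pq_total = KYBER768_PUBKEY + KYBER768_CIPHERTEXT

print(classical_total)  # 64
print(pq_total)         # 2272
```

So the key-exchange material alone grows by roughly 35x, which is where the handshake latency cost comes from.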




> I don’t know why OP brought in MTU and packet sizes since that doesn’t really apply here.

It does apply. TCP exposes streams as the API, but the underlying data is still sliced into packets of size up to the MTU.
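Concretely, a sketch of how a handshake message gets sliced into packets, assuming a typical 1500-byte Ethernet MTU and minimal IPv4/TCP headers (real connections vary):

```python
import math

MTU = 1500                 # typical Ethernet MTU, in bytes
IP_TCP_OVERHEAD = 40       # IPv4 (20) + TCP (20) headers, no options
MSS = MTU - IP_TCP_OVERHEAD  # max payload per segment: 1460 bytes

def packets_needed(payload_bytes: int) -> int:
    """Number of TCP segments needed to carry the payload."""
    return math.ceil(payload_bytes / MSS)

print(packets_needed(32))    # classical 32-byte key share: 1 packet
print(packets_needed(2272))  # Kyber768 pubkey + ciphertext (~2272 B): 2 packets
```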


regardless of all that, I must now think about the correlation between cryptographic strength and the amount of information that necessarily has to be transmitted

is there such a correlation? why? how does it work? i don't even....


I'm not enough of an expert to cut through your confusion. If I recall correctly, the cryptographic signature size has always been tied to the size of the key, which determines the strength: the larger the key, the larger the signature and the "more cryptographic strength". What's new here, aside from the signatures being an order of magnitude larger than before and growing differently as key size increases?
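For a rough sense of scale, approximate signature sizes from the published parameter sets of a few classical and post-quantum schemes (illustrative numbers, not a benchmark):

```python
# Approximate signature sizes in bytes, per the published parameter sets.
sig_sizes = {
    "Ed25519": 64,
    "ECDSA P-256 (raw r||s)": 64,
    "RSA-2048": 256,
    "ML-DSA-44 (Dilithium2)": 2420,
    "SLH-DSA-128s (SPHINCS+)": 7856,
}
for name, size in sorted(sig_sizes.items(), key=lambda kv: kv[1]):
    print(f"{name}: {size} bytes")
```

The post-quantum schemes are one to two orders of magnitude larger at comparable classical security levels.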


the involvement of quantum computing devices?

which is something else I must admit I cannot really fit together with what I imagine I understand about "classical" computing

but in information theoretic terms does it matter whether you use quantum or typical computers??? I would think that it does not matter but I may be wrong and I couldn't really explain why


The involvement of the quantum computer is only that it's an adversary that can break asymmetric encryption with different complexity constraints than a classical computer. For example, take two random prime numbers as a secret and publish the result of multiplying them. Recovering the two primes given only the product is a hard problem known as integer factorization: the best known classical algorithms take super-polynomial time in the bit length, so enlarging the primes makes the attack cost explode. The theoretical quantum computer model, though, can factor in polynomial time, so doubling the bit length of the number only increases the compute needed to recover the prime factors by a modest polynomial factor instead of blowing it up. The algorithm for doing this is known as Shor's algorithm [0], and because of how reductions work in complexity theory, the same machinery also solves discrete logarithm problems; factoring and discrete logs are the underpinnings of RSA, DSA, and ECDH key exchanges.
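A toy illustration of the classical side of the problem, using trial division (nowhere near the best classical factoring algorithm, but it shows why the published product doesn't give away the secret primes cheaply):

```python
def trial_division(n: int):
    """Classical factoring by trial division: the work grows roughly with
    sqrt(n), i.e. exponentially in the bit length of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n is prime

p, q = 1009, 1013          # two small "secret" primes
n = p * q                  # the published product
print(trial_division(n))   # recovers (1009, 1013)
```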

These quantum-resistant algorithms are based on mathematical problems believed to be hard even for a theoretical quantum computer - when you double the size of the problem, it again takes exponentially more time, quantum computer or not. These are of course unproven beliefs, but that's true of the classical assumptions too. So no, in that sense it doesn't matter where you run the cryptographic algorithm; it remains computationally difficult to solve the problem without knowing the secret. The quantum computer is critically important for the attacker's ability to crack the classical problems, though - without it, all of this post-quantum cryptography is unnecessary.

[0] https://en.wikipedia.org/wiki/Shor%27s_algorithm

[1] https://arxiv.org/pdf/2212.12372
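A toy sketch of the kind of lattice problem (Learning With Errors) that schemes like Kyber rest on. Parameters here are illustrative only and far too small to be secure; the point is the structure:

```python
import random

random.seed(0)
q, n, m = 97, 4, 8                                   # modulus, secret length, sample count
s = [random.randrange(q) for _ in range(n)]          # the secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]    # small random noise

# Each public sample is <A_i, s> + e_i (mod q).
b = [(sum(a * x for a, x in zip(row, s)) + err) % q
     for row, err in zip(A, e)]

# Public: (A, b). Without the noise e you could recover s by Gaussian
# elimination; the small noise is what makes recovering s believed to be
# hard, classically and quantumly.
print(len(b))  # 8 noisy samples published
```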


Interesting tidbit from the paper: in one of their time-complexity proofs they suggest "ignoring" the O(log(n)^(3/2)) term and treating it as equivalent to O(log(n)), even though strictly speaking that is not correct.



