
No, we could not have waited to start doing number theory until after we discovered that it's useful for cryptography. It took thousands of years to get to the point where we understood it well enough to use it. Its usefulness would never have occurred to us if we had not developed it beforehand. That's completely the wrong causal direction.

What completely undiscovered branch of mathematics do you think we should explore based on an immediate need that we have right now? Not so easy, is it?




What specific, useful things in cryptography would never have happened if we had not been studying number theory for thousands of years? Even if there are some examples, would we be significantly behind in useful capability if we didn't have those specific results?

It's more efficient to work backwards from the problems you have and build out the math. That's what they did with a lot of linear algebra and functional analysis when quantum mechanics came about. I am not saying discovery-based exploration would never work; I am saying it's inefficient if the goal is technological progress.


It just feels like asking which bits of a human wouldn't have been possible without having evolved for billions of years. It's an interconnected body of work that made cryptography possible at all. So ... all of it? I know it sounds like I'm copping out of the question, and maybe I am, because it's a really complicated question you're asking. I just don't know how you're imagining humanity came up with the ideas for:

- Diffie-Hellman key exchange without a deep understanding of quotient groups (and their properties, and proofs of their properties), large (co)prime numbers, and computability (there's a toy sketch after this list)

- Quotient groups and their applicability to this problem without a deep understanding of group theory, equivalence classes, isomorphisms, etc.

- Large (co)prime numbers without work by Euler, calculations of GCD, proofs of the infinitude of primes, understanding of their density and distribution on the number line, etc.

- Computability without tons of work by Turing, von Neumann, Church, Babbage, Gödel, etc., relying on ideas about recursion, set theory, etc.

- Ideas about recursion and set theory without work on the fundamental axioms of mathematics, Peano arithmetic, etc.

- Group theory without modular arithmetic, polynomials and their roots, combinatorics, etc.

- Polynomials and their roots without a huge body of work going back to 2000 BC

- Calculations of GCD without work by Euclid
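
To make the first and last items concrete, here's a minimal toy sketch (my own illustration in Python, using the standard textbook values p = 23, g = 5; nothing here is secure or historically faithful): Euclid's GCD by repeated remainders, a naive primality check, and Diffie-Hellman as modular exponentiation in the multiplicative group of integers mod a prime.

    import random

    def gcd(a: int, b: int) -> int:
        # Euclid's algorithm: keep taking remainders until one vanishes.
        while b:
            a, b = b, a % b
        return a

    def is_prime(n: int) -> bool:
        # Naive trial division; fine for toy-sized numbers.
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))

    # Classic textbook parameters: prime modulus p, generator g of (Z/pZ)*.
    p, g = 23, 5
    assert is_prime(p)
    assert gcd(1071, 462) == 21  # e.g. 1071 = 2*462 + 147, 462 = 3*147 + 21, 147 = 7*21

    # Each party picks a private exponent and publishes g^x mod p.
    a = random.randrange(2, p - 1)  # Alice's secret
    b = random.randrange(2, p - 1)  # Bob's secret
    A = pow(g, a, p)                # Alice sends A
    B = pow(g, b, p)                # Bob sends B

    # Both sides compute the same shared secret, g^(ab) mod p.
    assert pow(B, a, p) == pow(A, b, p)
    print("shared secret:", pow(B, a, p))

The point isn't the code; it's how directly the ingredients map onto the list above: primes, modular arithmetic, the group structure that makes exponentiation well-behaved, and the computability/complexity question of why the discrete logarithm is hard to invert.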

Most of these generalized abstractions came about by thinking about the more specific problems: e.g. Group Theory only exists at all because people were thinking about equivalence classes, roots of polynomials, the Chinese remainder theorem, modular arithmetic, etc. Nobody would have thought of the "big idea" without first struggling with the smaller ideas that it ended up encompassing.

You can't just take out half of these pillars of thought and assume the rest would have just happened anyway.


I agree that it's hard to imagine an alternate history, since what actually happened was a mixture of pure and application-motivated work. In each example, though, I can see how people could have arrived at these notions with an application-driven mindset (transformation groups, GCD through simplifying fractions during calculations, solving polynomial equations that come up in physics calculations). Computability and complexity, in the flavor of Turing's and subsequent work, I already see as application-driven: they were building computing machines at the time and wanted to understand what those machines could do.

Related to this topic: I highly recommend this speech/article, "The Mathematician", by von Neumann: https://www.zhangzk.net/docs/quotation/TheMathematician.pdf



