
We can make primality machines that do not know anything about the factors of n - ColinWright
https://mathlesstraveled.com/2018/01/03/a-tale-of-three-machines/
======
l0b0
I'm pretty sure I've read[1] that primality machines in normal computers rely
on a _heuristic_ to determine whether the factors produced are very, very
likely to be prime, and that security-focused organisations subsequently use
large amounts of computation (on the order of several days on a supercomputer)
to verify that the factors are, in fact, prime. Or is there a class of
super-fast algorithms which _guarantee_ primality, as the author implies?

[1] Citation needed. This was years ago :/

~~~
johncolanduoni
There is AKS[1], which is deterministic and runs in polynomial time. The most
common need for a primality test is the generation of RSA keys, where a
mistake will result in a key that almost certainly doesn't work (i.e. you
can't decrypt encrypted data even if you have the correct half of the key).
For these cases a faster probabilistic test is still the most popular choice.

[1]:
[https://en.m.wikipedia.org/wiki/AKS_primality_test](https://en.m.wikipedia.org/wiki/AKS_primality_test)
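
For illustration, here's a minimal Miller-Rabin sketch in Python (the function
name and round count are my own choices, not from any particular library; real
implementations such as OpenSSL's add trial-division sieving and other details):

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test.

    A composite n passes any single round with probability at most 1/4,
    so 40 rounds bound the error probability by 4**-40.
    """
    if n < 2:
        return False
    # Quick check against a few small primes.
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)  # built-in modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True
```

This is the shape of test typically run during RSA key generation: fast,
and wrong only with vanishingly small probability.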

~~~
wging
From what I've read in the past (and this is borne out by the article you've
linked), you wouldn't necessarily use AKS if you need determinism, because
there are other deterministic algorithms with worse asymptotics that perform
better on smaller inputs. Has this changed in the last several years?

From Wikipedia

> While the algorithm is of immense theoretical importance, it is not used in
> practice. For 64-bit inputs, the Baillie–PSW primality test is deterministic
> and runs many orders of magnitude faster. For larger inputs, the performance
> of the (also unconditionally correct) ECPP and APR tests is far superior to
> AKS. Additionally, ECPP can output a primality certificate that allows
> independent and rapid verification of the results, which is not possible
> with the AKS algorithm.
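
The 64-bit determinism mentioned in that quote can also be had from plain
Miller-Rabin: it's a known result that the first 12 primes form a sufficient
witness set for all n below roughly 3.1 * 10**23, comfortably above 2**64.
A sketch under that assumption (this is not BPSW itself, just a fixed-base
variant that is likewise deterministic in the 64-bit range; names are mine):

```python
def is_prime_u64(n):
    """Deterministic primality for n < 2**64: Miller-Rabin with the
    first 12 primes as witnesses, a set known to be sufficient for
    all n below ~3.1e23 (well beyond 2**64)."""
    if n < 2:
        return False
    witnesses = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in witnesses:
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in witnesses:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses compositeness
    return True
```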

~~~
YomiK
You are correct.

AKS (v6, Voloch, or Bernstein) is O(log^6(n)): a nice polynomial, but with
both large constants and a larger exponent than we'd like.

APR-CL is O(log^K(n)) where K = C*log(log(log(n))), which for practical input
sizes puts the exponent in the range 3-5, hence lower than AKS's. As n goes to
infinity its runtime does eventually exceed AKS's, but at that point n is so
large as to not be practically computable.

ECPP is conjectured O(log^5(n)) or O(log^4(n)) depending on the algorithm
used. Both the Primo and ecpp-dj implementations show O(log^4(n)) growth,
though the latter doesn't scale well past 1000 digits due to a limited
polynomial dictionary. Primo scales very well and has generated results for
primes over 30k digits -- much larger than the others. I expect Pari/GP's ECPP
implementation to eventually compete nicely when it's ready.

Miller-Rabin and BPSW are both O(log^(2+c)(n)), where c is between 0 and 1
depending on the multiplication method, so exponent 2 to 3. But of course
these are probabilistic.
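
To show why the stronger Miller-Rabin condition matters for a probabilistic
test: the plain Fermat test can be fooled by every coprime base at once on
Carmichael numbers, whereas for an odd composite n at most 1/4 of bases are
strong (Miller-Rabin) liars. A small demonstration (function name is mine):

```python
from math import gcd

def fermat_liar_count(n):
    """Count bases 1 < a < n, coprime to n, for which composite n
    passes the Fermat test a**(n-1) ≡ 1 (mod n)."""
    return sum(1 for a in range(2, n)
               if gcd(a, n) == 1 and pow(a, n - 1, n) == 1)

# 561 = 3 * 11 * 17 is the smallest Carmichael number: every base
# coprime to it is a Fermat liar, so the Fermat test only detects
# compositeness if it happens to hit a shared factor.
print(fermat_liar_count(561))  # 319, i.e. all coprime bases in range
```

Miller-Rabin has no analogue of Carmichael numbers, which is why its error
bound holds for every composite input.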

To my knowledge this has not changed recently. There haven't been substantial
improvements to AKS since Bernstein's 2003 paper, which still results in an
exponent of 6 but lowers the constant factors by many orders of magnitude.
In 2006, Bernstein published a randomized version of AKS which runs in
O(log^4(n)): the same exponent as ECPP, but with the same "downside" of using
randomization, and Bernstein notes that it was still slower than ECPP.

