The title is a bit misleading, though they do explain it in the text.
It seems they cannot show right now that this is the fastest possible way to multiply. It's faster than any previously known way, and they hope to prove in the future that there is no faster way.
I found this interesting because, coming from cryptography, I know that it's very hard to prove any lower bounds on algorithm speed, which is why we can't have provably secure cryptography (yet?).
In general I find mental arithmetic like this easiest if you start from the most significant digits. So you just need to keep track of a running sum and which multiplications you still need to add.
Here's the calculation as I would do it in my head, more or less:
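The most-significant-digits-first approach can be sketched in code; this is just my hypothetical illustration of the running-sum idea, not anything from the article:

```python
def msd_multiply(a: int, b: int) -> int:
    """Multiply a * b the 'mental arithmetic' way: peel off a's digits
    from most significant to least, shifting the running sum by one
    decimal place each step and adding the next partial product."""
    total = 0
    for digit_char in str(a):
        # Shift what we have so far, then add (current digit) * b.
        total = total * 10 + int(digit_char) * b
    return total

# e.g. 47 * 86: first 4 * 86 = 344, then 344 * 10 + 7 * 86 = 4042
print(msd_multiply(47, 86))
```

The running sum is all you need to remember at each step, which is what makes this tractable in your head.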
According to the article, Schönhage and Strassen introduced the use of fast Fourier transform to integer multiplication in 1971, giving n × log n × log(log n). "The technique has been the basis for every fast multiplication algorithm since."
"Over the past decade, mathematicians have found successively faster multiplication algorithms, each of which has inched closer to n × log n, without quite reaching it. Then last month, Harvey and van der Hoeven got there."