
Rethinking floating point for deep learning - rbanffy
https://arxiv.org/abs/1811.01721
======
Nomentatus
I'm astonished that floating point ever gets mentioned in the context of AI -
I switched to fixed point (integer multiply and bit-shift) in the 1980s for
neural nets. I wasn't able to sell the idea to Hinton way back then, but that
was because he didn't give me the time; he was busy tossing me out of his
office for mentioning a new parallel-processing computer.
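The scheme the commenter describes can be sketched in a few lines. This is a minimal, hypothetical illustration (not the commenter's actual 1980s code): real numbers are scaled by 2^FRAC_BITS and stored as integers, so multiplication becomes an integer multiply followed by a right shift to renormalize. The Q8.8-style format and the helper names are assumptions for the example.

```python
# Hypothetical fixed-point sketch: values scaled by 2**FRAC_BITS (Q8.8-like).
FRAC_BITS = 8

def to_fixed(x: float) -> int:
    """Convert a real number to its fixed-point integer representation."""
    return round(x * (1 << FRAC_BITS))

def to_float(x: int) -> float:
    """Convert a fixed-point integer back to a real number."""
    return x / (1 << FRAC_BITS)

def fixed_mul(a: int, b: int) -> int:
    """Multiply two fixed-point values: integer multiply, then bit-shift."""
    return (a * b) >> FRAC_BITS

a = to_fixed(1.5)    # 384 in Q8.8
b = to_fixed(0.25)   # 64 in Q8.8
print(to_float(fixed_mul(a, b)))  # 0.375
```

On hardware without a floating-point unit, this trades a little precision for much cheaper arithmetic, which is why the same idea resurfaces today as integer quantization for neural-net inference.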

