>But if we have 2 * 0 = 0, the inverse of that would be 0 / 0 = 2, which is just not true
By the article's own definition, there are two inversions for each operation, so why is only one of them given as "proof"? Taking the other inversion, we have 0 / 2 = 0, which is perfectly correct and contradicts the claim that inversions "prove" why we can't divide by zero.
Why can't 0 / 0 be defined to be any number you wish? The inversions would work fine.
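For what it's worth, the usual answer is that the inversions work *too* well: since a * 0 == 0 for every a, every number is an equally valid candidate for 0 / 0, so no single choice is consistent. A quick sketch of that, plus how Python and IEEE 754 each handle it (purely illustrative, not from the article):

```python
# Since a * 0 == 0 holds for every a, inverting gives 0 / 0 = a
# for every a at once, so 0 / 0 can't be pinned to one value.
for a in [2, 5, -7, 3.14]:
    assert a * 0 == 0  # each a is an equally good "answer" for 0 / 0

# Python refuses to pick a value and raises instead.
try:
    0 / 0
except ZeroDivisionError as e:
    print("Python refuses:", e)

# IEEE 754 floating point returns NaN for 0/0; NaN even compares
# unequal to itself, reflecting that no single value is consistent.
nan = float("nan")
print(nan == nan)  # False
```

Defining 0 / 0 as some fixed k would break the inversion rule itself, since k * 0 = 0 no more than any other number does.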
Reminds me of Young Sheldon's fleeting break of faith in zero (https://www.youtube.com/watch?v=xI5ukcG-9pQ&ab_channel=It%27...).
Isn't zero just nothingness? If you multiply x by nothing, it becomes nothing. If you divide x by nothing, shouldn't it just be x?