Glad you follow! But I'm not sure we agree yet, as there's still something puzzling when you say:
> But don't use field theory to justify convention 2, because it's mathematically incoherent
> defining that division and calling the consequent system a field when it's not a field
> a direct consequence of defining any division by 0 is that you cease to have an algebraic field
If you go back to my comment (the one you're replying to), both of the functions f and g assume a field F, and they are well-defined functions on F×(F\{0}) and on F×F respectively. (Do you agree?) For example, F may be the field of real numbers. Forget about the word “division” for a moment: do you think there is something about the function g that makes F not a field?
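To make the two conventions concrete, here is a minimal sketch over the rationals. The original comment leaves g's value at y = 0 unspecified, so I'm assuming one common choice, g(x, 0) = 0; nothing about the argument depends on that particular value.

```python
# Sketch of Convention 1 (partial division f) vs Convention 2 (total g)
# over the field Q of rationals. The value g(x, 0) = 0 is an assumed
# convention, not something the discussion above pins down.
from fractions import Fraction

def f(x: Fraction, y: Fraction) -> Fraction:
    """Convention 1: defined only on F x (F \\ {0})."""
    if y == 0:
        raise ValueError("f is undefined at y = 0")
    return x * y ** -1  # x times the multiplicative inverse of y

def g(x: Fraction, y: Fraction) -> Fraction:
    """Convention 2: a total function on F x F agreeing with f off y = 0."""
    return f(x, y) if y != 0 else Fraction(0)  # assumed value at y = 0

# g agrees with f wherever f is defined, and is total:
assert g(Fraction(3), Fraction(2)) == Fraction(3, 2)
assert g(Fraction(3), Fraction(0)) == Fraction(0)
```

Note that defining g does not touch the field operations + and × on Q at all; no multiplicative inverse of 0 is introduced, so the field axioms are untouched.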
To put it differently: I agree with you that it is a direct consequence of defining a multiplicative inverse of 0 that you cease to have an algebraic field. But the only way this statement carries over when we use the word “division” is if we already adopt Convention 1 (that “division” means the same as “multiplicative inverse”).
Again, I think you are implicitly adopting Convention 1: you're saying something like “if we adopt Convention 2, then x/y means the function g and includes the case when y=0, but [something about multiplicative inverses, implicitly invoking Convention 1], therefore there's a problem”. But there's no problem!
It is not a direct consequence of defining the function g(x,y) that something ceases to be a field: it is a consequence only if you also insist on Convention 1, namely if you try to assign a multiplicative inverse to 0 (which everyone here agrees is impossible).
Let me emphasize: whether we adopt Convention 1 or Convention 2, there is no problem; we still have the same field F.