A major part of its training set is Wikipedia. Given the impenetrability of maths Wikipedia, I am not at all surprised that it's bad at maths. Usually when I read maths articles on Wikipedia, I get worse at maths.
On a serious note, I wonder how well the training works when so much of maths is represented symbolically. For example, when they feed it books, do they convert the maths to LaTeX? MathML? Or just leave it out?
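To make the question concrete, here's the same expression (the quadratic formula) in the three candidate representations; the MathML is hand-abbreviated, so treat it as a sketch of the format rather than a complete encoding:

    Plain text:  x = (-b ± sqrt(b^2 - 4ac)) / 2a
    LaTeX:       x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
    MathML:      <math><mi>x</mi><mo>=</mo><mfrac>...</mfrac></math>

Each of these tokenises very differently, which presumably matters for what the model ends up learning.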