Hacker News

Well, I mean it’s a pretty fair accusation. ChatGPT was demonstrably bad at math. I think it was only recently that GPT-4 was said to have been trained on math. Furthermore, consider what it means to apply the transformer architecture to math problems. I think the tool is a mismatch for the problem. You’re relying on self-attention and emergent phenomena to fake computational reduction as symbol transformations. It can probably do some basic maths (all the way up to calculus, even) because, in the scope of human life, the maths we deal with are pretty boring. But that’s also what they made the Wolfram plug-in for.

I really think people attribute powers to GPT beyond what it really is: a colossal lookup table with great key aliasing.
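To make the "lookup table with key aliasing" framing concrete, here's a minimal toy sketch of single-query attention as a soft key-value lookup. This is an illustrative analogy only (made-up keys and values, not anything from GPT): similar keys share softmax weight, so a near-miss query still retrieves a blend of values rather than an exact entry.

```python
import numpy as np

def soft_lookup(query, keys, values):
    # Scaled dot-product similarity between the query and each stored key.
    scores = keys @ query / np.sqrt(len(query))
    # Softmax turns scores into lookup weights; this is the "aliasing":
    # keys similar to the query all receive some weight.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # The result is a similarity-weighted blend of values, not an exact fetch.
    return weights @ values

# Two stored "table entries" with associated values.
keys = np.array([[1.0, 0.0],
                 [0.0, 1.0]])
values = np.array([[10.0],
                   [20.0]])

# A query close to key 0 retrieves mostly value 10, blended with some of 20.
out = soft_lookup(np.array([0.9, 0.1]), keys, values)
```

An exact hash table would return 10 or fail; the soft lookup interpolates, which is exactly the behavior that looks like "faking" computation on inputs near the training data.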
