I don't think it's a problem per se, but it will cease to be a good example of a break in GPT because once it's "fixed", people will point to it and say "nuh-uh".
When really, the "fix" is "put the answer in the model". GPT didn't learn anything. It didn't generate the solution on its own. It's not indicative of GPT being able to solve that class of problem, just that one problem.
Which seems to be the entire thrust of GPT in general. It can't solve types of problems; it can only solve specific problems that already have existing solutions.