One of the surprising things is that LLMs regularly "fix" conflicts that no other system can, like when we both add the same sentence to a doc. It's interesting stuff.
With that said, I'm not sure this specific LLM is providing the "right" answer. It seems like AN answer! But I think the real solution might be to ask the user what to do.
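To make the "same sentence added twice" case concrete, here's a minimal sketch of a classic three-way merge over lines. All names and the example text are hypothetical; the point is that a rule-based merger can only do one of two things: treat identical concurrent edits as one edit, or punt to the user. Anything fuzzier than exact equality is where an LLM would come in.

```python
# Hypothetical three-way merge: base document plus two concurrent edits.
base = ["The cat sat."]
ours = ["The cat sat.", "It purred loudly."]    # we added a sentence
theirs = ["The cat sat.", "It purred loudly."]  # they added the SAME sentence

def merge(base, ours, theirs):
    # If both sides made the identical change, keep one copy of it.
    if ours == theirs:
        return ours
    # If only one side changed anything, take that side.
    if ours == base:
        return theirs
    if theirs == base:
        return ours
    # Genuinely divergent edits: signal a conflict and ask the user.
    return None

print(merge(base, ours, theirs))  # → ['The cat sat.', 'It purred loudly.']
```

Note the deliberate `None` branch: once the edits aren't byte-identical, there's no principled automatic answer, which is exactly the "ask the user" fallback argued for above.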