The problem is that he wasn't comfortable with my solution even though it was clearly faster and could be tested instantly. It's a mental block for him, and for a lot of people in this industry.
You say you don't advocate blindly trusting LLMs. Neither do I, and of course I test whatever they spit out.
Testing usually isn't enough if you don't understand the solution in the first place. Testing is a sanity check for a solution you already understand. Testing can't prove correctness; it can only find (some) errors.
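To illustrate the point, here's a minimal Python sketch (the function `buggy_max` is hypothetical): a handful of passing sanity checks can coexist with a real bug that only an understanding of the logic would catch.

```python
def buggy_max(nums):
    # Intended to return the largest element, but it compares each
    # value to the FIRST element instead of the running best -- a bug.
    best = nums[0]
    for n in nums[1:]:
        if n > nums[0]:  # bug: should be `n > best`
            best = n
    return best

# Sanity checks that happen to pass:
assert buggy_max([1, 2, 3]) == 3
assert buggy_max([5, 1, 2]) == 5

# An input the checks above never exercise exposes the bug:
# buggy_max([1, 3, 2]) returns 2, not 3.
```

The tests pass, yet the function is wrong: without understanding why the comparison should track the running best, you have no way to know which inputs your tests failed to cover.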
LLMs are fine for inspiration in developing a solution.