If they convinced me of their helpfulness, and their output is actually helpful in solving my problems... well, if it walks like a duck and quacks like a duck, and all that.
This is true, but part of that convincing is actually providing at least some amount of response that is helpful and moving you forward.
I have to use coding as an example, because that's 95% of my use cases. I type in a general statement of the problem I'm having and within seconds, I get back a response that speaks my language and provides me with some information to ingest.
Now, I don't know for sure if every sentence I read in the response is correct, but let's say that 75% of what I read aligns with what I currently know to be true.
If I were to ask a real expert, I'd likely already know or understand about 75% of what they told me as well, with the remaining 25% taken on trust until I understood it.
But whether it comes from AI or a real expert, for coding at least, that 25% is easily testable. I go implement it and see if it passes my test. If it does, great. If not, at least I've tried something and gotten farther down the road in my problem solving.
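That "test the 25%" step can be as lightweight as a few assertions. Here's a minimal sketch; the specific claim being checked (that Python's sorted() is stable) is just an illustrative stand-in for something an AI might tell you:

```python
# Hypothetical example: suppose an AI response claims that Python's
# sorted() is stable (items with equal keys keep their original order).
# Rather than take that on trust, write a quick test.

data = [("b", 2), ("a", 1), ("b", 1), ("a", 2)]
result = sorted(data, key=lambda pair: pair[0])

# If the claim holds, the two "a" items and the two "b" items
# should each appear in their original relative order.
assert result == [("a", 1), ("a", 2), ("b", 2), ("b", 1)]
print("claim verified")
```

A minute of writing a check like this either confirms the untrusted 25% or tells you exactly where the response went wrong, and either way you've moved forward.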
Since AI generally does that for me, I'm convinced of its helpfulness, because it moves me along.