To be honest, I catch myself doing this too. I’ll ask an LLM first instead of searching, even though I know it’s not a reliable source of truth. It just feels faster and more convenient.
I’m aware of the hallucination risks and that it’s a bad habit, but the workflow is so smooth that it’s hard to break. When it matters, I do double-check with real sources, but for casual stuff, I’ll probably keep using AI as my first stop.