This is the crux of it, and where I'm wondering if I'm missing something. Can it, today? My understanding is that it cannot discern reality from fiction, hence "hallucinations" (a misnomer, because the term implies an awareness these probability models lack).
What gets poorly named as hallucination is the generation, from a provided prompt, of ideas that are not grounded in reality. It isn't a mistaken judgment about whether the provided prompt itself reflects reality.