There is no dream-reality separation in an LLM, nor really any conception of dreams or reality at all, so I don't think the term makes sense. "Hallucination" works fine to describe the phenomenon. LLMs work by coalescing textual information, and they hallucinate when that coalescence is faulty or inappropriate, which is roughly analogous to what happens in actual hallucinations.