This might be one of the stupidest things I've read on here. LLM/generative "AI" is not capable of what you're suggesting. They do not possess personalities, intelligence, creativity, or any sort of sentience, even if you think "hallucinations" are indicative of this. Hallucinations are merely a symptom of a problem foundational to all LLMs: they are just really, really complicated programs that produce mathematically likely output given an input, and since they don't "know" anything, they can, and often do, get it wrong. That's what hallucinations are.
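
To make "mathematically likely output" concrete, here's a minimal sketch (assuming the Hugging Face transformers library, PyTorch, and the public GPT-2 weights; the prompt is just an example): the model scores every token in its vocabulary and the generated text is simply drawn from the most probable continuations. Nothing in the loop checks truth.

```python
# Minimal sketch: an LLM assigns a probability to every possible next token
# and generation just picks from the likeliest ones. Assumes `transformers`
# and `torch` are installed and GPT-2 weights can be downloaded.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of Australia is"  # example prompt, nothing special about it
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(input_ids).logits           # a score for every vocabulary token at every position
probs = torch.softmax(logits[0, -1], dim=-1)   # probabilities for the single *next* token

# Show the five most probable continuations.
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx)!r}: {p:.3f}")

# The output is whatever is statistically likely given the training data.
# There is no step where the model verifies the continuation is true,
# which is exactly where hallucinations come from.
```

That's the whole mechanism: rank continuations by probability, emit one, repeat.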