Hallucinations are a problem because people are using LLMs as knowledge oracles, which they are not. I doubt the problem will ever be solved unless a new type of model is invented for it.
With current models it cannot be solved. Maybe it cannot be solved at all: humans make things up all the time (often without even knowing they're saying something untrue), and humans are the best example of intelligence we have.