It's a bit offensive, tbh. Hallucination happens because LLMs can't think and don't operate on facts. To even suggest engineering is that haphazard shows a clear lack of understanding of engineering. I've had LLMs hallucinate entire libraries into existence, complete with examples of how to use them. Others have had worse. If a person did that, they'd be no engineer.
That's not exactly true either. Hallucinations happen because LLMs aren't trained to be truthful; they're trained to be plausible, which usually coincides with the truth.
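To make that concrete, here's a minimal sketch of the standard next-token training objective, using a toy setup I've made up (random logits and targets, not any real model's code). The point is just that the loss only rewards assigning high probability to whatever token actually came next in the training text; there's no term anywhere that checks whether the continuation is factually true.

```python
# Toy illustration (assumed setup): next-token prediction via cross-entropy.
# "Plausible" means close to the training distribution; truth never appears
# in the objective.
import torch
import torch.nn.functional as F

vocab_size = 1000
batch, seq_len = 4, 16

# Stand-ins for a model's output scores and the tokens that actually followed.
logits = torch.randn(batch, seq_len, vocab_size)
targets = torch.randint(0, vocab_size, (batch, seq_len))

# Cross-entropy over next tokens: minimized by matching the training text,
# whether or not that text (or the model's continuation) is true.
loss = F.cross_entropy(logits.view(-1, vocab_size), targets.view(-1))
print(loss.item())
```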
Humans have better training, but I've worked with someone who had LLM levels of bullshitting. He straight up made shit up, and it worked. I'm still not quite sure why, but everyone else was either too polite or too oblivious to call him out (apparently that's not an unusual response to proper bullshitters).
Fortunately, those people are pretty rare, but it's really a question of degree. The latest LLMs bullshit far less than they used to. Less than that guy I worked with, even.