I strongly recommend the book Normal Accidents by Charles Perrow. It was written in the '80s, and its central argument is that some systems are so complex that even the people operating them don't really understand what's happening, making serious accidents inevitable. I wish the author were still around to opine on LLMs.