Good points, but language models are usually then fine-tuned for specific tasks. Contextual reasoning in NLG is a specific research area that addresses some of the points raised, and BERT certainly wasn't pretrained to solve every task in NLP/NLG.
I had a positive interaction with a chatbot yesterday that serves as a good example of what you describe. My home internet had an outage, so I went to my cable company's website to report the incident and perhaps get an ETA for when service would be restored. I interacted entirely with a customer service chatbot, which handled the whole situation flawlessly.