You’re right. But for some types of learning it’s not necessary to have a teacher that’s always right. If GPT-4 occasionally feeds me hallucinations about JavaScript or Python, I’ll find out soon enough when my programs fail to run or when web searches fail to confirm what it says. Being able to ask it questions and get right answers almost all the time more than makes up for the occasional error.
Another example where GPT-4 can be used productively for learning despite its fallibility is with human languages. It doesn’t always explain grammar correctly, and it can spout hallucinated facts. But its ability to interact almost as if it were a real person, along with its ability to explain the meanings of words and sentences quite accurately (for major languages, at least), is of enormous value to language learners. Few human language teachers are as knowledgeable and accurate as it is.