Your judgment that present technology is inadequate is based on the assumption that computers need to learn to read the human thoughts.
What about the inverse: that humans learn how to think in a way that a computer understands? That would be much easier, as humans learn much better than computers, and also much safer: I would have complete control over which of my thoughts the computer can detect and interpret.
The human learning to adapt to the machine has been the way EEG-based brain computer interfaces have been made for a couple of decades. Using machine learning to adapt the machine to the human is a much more recent development.
It is possible today to make EEG-controlled devices. They typically differentiate between a small number of real or imagined movements in the user. This is awesome, because it can allow severely paralyzed people to communicate, control a wheelchair, etc. Nevertheless, the algorithms used to do this are useless when it comes to distinguishing whichever words the user is internally vocalizing.
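To make the "small number of imagined movements" point concrete, here is a minimal sketch of the kind of classifier such a device relies on. Everything in it is a synthetic stand-in (made-up band-power numbers, a toy nearest-centroid classifier), not real EEG data or a real BCI pipeline, but the shape of the problem is the same: a handful of classes, a couple of features, nowhere near the capacity needed to pick out arbitrary words.

```python
# Toy two-class "motor imagery" classifier. Real BCIs exploit the fact that
# imagining a hand movement suppresses mu-band power over the opposite motor
# cortex; the two features below stand in for band power at electrodes C3/C4.
# All distributions here are invented for illustration.
import random

random.seed(0)

def sample(mean_c3, mean_c4, n):
    # Draw n fake (C3 power, C4 power) feature pairs around the given means.
    return [(random.gauss(mean_c3, 1.0), random.gauss(mean_c4, 1.0))
            for _ in range(n)]

left = sample(5.0, 2.0, 100)   # trials of imagined left-hand movement
right = sample(2.0, 5.0, 100)  # trials of imagined right-hand movement

def centroid(trials):
    # Mean feature vector of a list of trials.
    return tuple(sum(v) / len(trials) for v in zip(*trials))

# Train on the first 80 trials of each class.
c_left, c_right = centroid(left[:80]), centroid(right[:80])

def classify(x):
    # Assign the trial to the nearest class centroid (squared distance).
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
    return "left" if dist(c_left) < dist(c_right) else "right"

# Evaluate on the held-out 20 trials per class.
test_set = [(x, "left") for x in left[80:]] + [(x, "right") for x in right[80:]]
accuracy = sum(classify(x) == label for x, label in test_set) / len(test_set)
print(f"accuracy: {accuracy:.2f}")
```

With two well-separated classes this trivially beats the 50% chance level, which is exactly why it works for wheelchair-style control and exactly why it says nothing about decoding a vocabulary of thousands of internally vocalized words.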
The keyboard is not very good at determining which words I'm internally vocalizing either, yet it still seems to work.
The point I'm trying to convey is that maybe we can learn to transmit words using some form of brain reader, but that measures something else than vocalizing.