> how often is significant numbers of bit errors a problem, or when does that come up?
It depends on the transport. If you are going over something like TCP you will get a perfect bitstream or nothing at all, so your codec doesn't have to tolerate bit errors or loss. If you are pushing raw bits over the air with GMSK modulation and no error correction, you'll have to tolerate a lot of errors.
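To make that concrete, here's a toy Python sketch (nothing MLow-specific, and the 2% bit error rate is just an illustrative value I picked) of what a codec actually sees on that second kind of link:

```python
import random

def corrupt(payload: bytes, ber: float) -> bytes:
    """Flip each bit of `payload` independently with probability `ber`."""
    out = bytearray(payload)
    for i in range(len(out)):
        for bit in range(8):
            if random.random() < ber:
                out[i] ^= 1 << bit
    return bytes(out)

frame = bytes(range(20))
# Over TCP the receiver gets `frame` intact or nothing at all; over a raw
# GMSK link with no error correction the codec sees something like this:
noisy = corrupt(frame, ber=0.02)
errors = sum(bin(a ^ b).count("1") for a, b in zip(frame, noisy))
print(f"{errors} flipped bits out of {len(frame) * 8}")
```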
In real-world applications you almost always have to weigh the trade-offs between what you leave to the codec and what you leave to the transport layer.
At very low bitrates, the overhead required to create reliability and tolerance for errors or omissions becomes significant enough that the performance of the entire system matters a great deal. That is to say, the codec and transport have to be designed to be complementary to one another to achieve the best final result.
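As a toy illustration of what that reliability overhead buys (this is the simplest packet-loss FEC there is, a single XOR parity packet per group, not a claim about how MLow does it): protecting a group of four packets against any one loss costs 25% extra bandwidth, and tolerating heavier loss costs proportionally more.

```python
from functools import reduce

def xor_parity(packets: list[bytes]) -> bytes:
    """Byte-wise XOR across equal-length packets."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
parity = xor_parity(group)  # one extra packet per four = 25% overhead

# If pkt2 is dropped in transit, XOR the three survivors with the parity:
recovered = xor_parity([group[0], group[1], group[3], parity])
assert recovered == group[2]
```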
In the presentation they show us MLow at 6 kbps and then again at 14 kbps with 30% packet loss (effective data rate 9.8 kbps). They don't say whether the loss is random bit errors or entire packets, but let's not worry about that. Let's just assume both of these end up producing final audio of about the same quality. That means MLow has some mechanism to deal with errors on its own, but it's spending an obscenely high overhead (133%) to accomplish it. They also don't let us hear how it actually degrades when exposed to other types of transport errors. These numbers and the apparent performance just aren't very good compared to other codecs/systems in this space.
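For reference, the arithmetic behind those figures, measuring overhead against the 6 kbps clean-channel operating point:

```python
gross_kbps = 14.0         # what MLow transmits in the lossy demo
loss = 0.30               # fraction of packets dropped receiver-side
baseline_kbps = 6.0       # the clean-channel demo it's compared against

effective_kbps = gross_kbps * (1 - loss)                 # bits that actually arrive
overhead = (gross_kbps - baseline_kbps) / baseline_kbps  # cost of the redundancy
print(f"{effective_kbps:.1f} kbps effective, {overhead:.0%} overhead")
# -> 9.8 kbps effective, 133% overhead
```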
They also address something similar in the body: "Here are two audio samples at 14 kbps with heavy 30 percent receiver-side packet loss."