I'm open to the argument that Python 3's handling of string encoding is not ideal. But after using Python 3, I'm not willing to go back to a language that allows the programmer to freely mix encoded and decoded strings and uses the same type to represent both. That way lies insanity, and I'm glad to be rid of it.
In Python 2, I could never seem to figure out how to do string encoding properly. After trying to reason about the code and put the calls to str.encode() and str.decode() in what I thought was the right place and inevitably failing, my only recourse was to sprinkle random calls to str.encode() and str.decode() throughout my code like magic pixie dust until errors stopped happening. In Python 3, if I do it wrong, I get an easy-to-debug TypeError. For bonus points, if I use Mypy, I can get that error without even having to run the code.
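To make that concrete, here's a minimal sketch of the bytes/str divide (the variable names are mine, purely illustrative):

    # Encoding and decoding are explicit conversions between two distinct types.
    data = "héllo".encode("utf-8")   # encoded text: a bytes object
    text = data.decode("utf-8")      # decoded text: a str object

    try:
        broken = data + text  # type: ignore[operator]  # deliberately mixing types
    except TypeError as exc:
        # Python 3 fails loudly instead of silently producing mojibake:
        print(exc)  # can't concat str to bytes

    # Without the "type: ignore" comment, Mypy flags the same line statically:
    #   error: Unsupported operand types for + ("bytes" and "str")

That's the whole trade: one explicit conversion boundary, enforced by the type system, instead of implicit coercions that blow up only on non-ASCII input.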