So, take the Python 3 string changes as an example.
Python 2 scripting on Unix was great! Python just adopted the Unix tradition of pretending everything is ASCII up until it isn't, and then breaking horribly. And then the Linux world said "just use UTF-8 everywhere!" and really meant "just keep assuming things are ASCII, or at least one byte per code point, and break horribly when it isn't!"
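To make that concrete, here's a rough Python 2 sketch of the "works until it doesn't" pattern (Python 2 only; the string is just illustrative):

    # -*- coding: utf-8 -*-
    # Python 2: `str` is a byte string, and everything "works" right up
    # until a non-ASCII byte shows up.
    name = "café"         # 4 characters, but 5 UTF-8 bytes
    print len(name)       # 5 -- the one-byte-per-code-point assumption is already wrong
    print name.upper()    # "CAFé" -- upper() only touches the ASCII range
    name.decode("ascii")  # UnicodeDecodeError -- the "breaking horribly" moment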
This was great for people writing command-line scripts and utilities. This was a nightmare for people working in domains like web development.
Python 3 flipped the script: now, the string type is Unicode, and a lot of APIs broke immediately under Python 3 due to the underlying Unix environment being, well, kind of a clusterfuck when it came to locales and character encoding and hidden assumptions about ASCII or one-byte-per-character. Suddenly, all those people who had been using Python 2 -- which mostly worked identically to the way popular Linux distros did -- were using Python 3 and discovering the hell of character encoding that everybody else had been living in, and they complained loudly about it.
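For a concrete sketch of that breakage (the file name and its contents are hypothetical), compare text-mode reads across the two versions:

    # Python 3: text-mode I/O decodes eagerly, using the locale's encoding.
    # A file containing stray latin-1 bytes (e.g. b'\xe9') now fails up
    # front under a UTF-8 locale; Python 2 happily returned the raw bytes.
    with open("legacy.log") as f:   # hypothetical file with non-UTF-8 bytes
        data = f.read()             # raises UnicodeDecodeError
    # Opening with open("legacy.log", "rb") gets the old behavior back.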
But for growing from a Unix-y scripting language into a general-purpose language, this change was absolutely necessary. Programmers should have to think about character encoding at their input/output boundaries, and in a high-level language they should not be thinking of text as a sequence of bytes. But this requires some significant changes to how you write things like command-line utilities.
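Here's a minimal sketch of that boundary discipline in Python 3 (hard-coding UTF-8 is an assumption; a real tool would consult the locale or take a flag):

    import sys

    # Decode at the input boundary, work on real text in the middle,
    # encode at the output boundary -- bytes never leak into the program.
    raw = sys.stdin.buffer.read()                    # bytes from outside
    text = raw.decode("utf-8", errors="replace")     # explicit decision at the boundary
    result = text.upper()                            # genuine text processing
    sys.stdout.buffer.write(result.encode("utf-8"))  # explicit encode on the way out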
This is an example of a "yes" to one group being a "no" to another group. Or, at least, of it feeling that way.
Also, saying that Python was a great Unix-y language is not equivalent to "Python only ran on Unix and never supported Windows at all", and you know that, so it was kind of dishonest of you to try to start an argument from the assumption that I said the latter when really I said the former. Don't do that again, please.
You wrote:
> saying that Python was a great Unix-y language is not equivalent to "Python only ran on Unix and never supported Windows at all"
Let me elaborate further. I recall Mark Hammond saying at one of the Python conferences around 2000 that Python had the best Windows support of any language not developed at Redmond. Hammond did much of the heavy lifting to make that happen.
I didn't mean to sneak in a dishonest argument.
I am under the genuine impression that Python worked well on Windows, with the narrow Unicode build matching the UCS-2 encoding that Windows used, and that the experience was comparable to developing with Python on Unix.
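To illustrate what I mean by the narrow build matching Windows (this is from memory; the output depends on the build):

    # On a narrow Unicode build (e.g. Python 2.x on Windows), a code point
    # outside the BMP is stored as a UTF-16 surrogate pair -- the same unit
    # of measure the Windows APIs used.
    s = u"\U0001D11E"   # MUSICAL SYMBOL G CLEF
    print(len(s))       # 2 on a narrow build; 1 on a wide build or 3.3+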
Similarly, I thought the native Mac support under, say, OS 9 was solid and matched the Mac environment.
I'm not saying that there weren't problems, and I agree that web development is one of the places where those problems came up.
Rather, I'm saying that I think the native Unix, native Windows, and native Mac support were roughly comparable, such that I don't think it's right to say that there was a really strong bias towards Unix.
What I said: Python was a great language for writing traditional Unix-y things like shell scripts, daemons, sysadmin tools, and here's an example of it adopting something that made that much easier.
What you are trying to twist that into saying: Python somehow didn't run on or wasn't used on or was terrible on operating systems not explicitly named "Unix".
I don't see any way to assume good faith on your part given you've repeated that attempt at putting words in my mouth while demonstrating knowledge that indicates you understand perfectly well what it was I really said. I'm going to ignore you now.
> Python 2 continued an earlier tradition of saying "yes" to almost everything from one particular group of programmers: people working on Unix who wanted a high-level language they could use to write Unix utilities, administrative tools, daemons, etc. In doing that, Python said "no" to people in a lot of other domains.
> Python 3 switched to saying "yes" to those other domains much more often.
I would like to know why you singled out Unix when it seems like Python also said "yes" to MS Windows.
Of course Python developers said "no" to other domains. Every language says "no" to some domains. I thought you were trying to make something more meaningful about a specific bias towards Unix.
For example, as I recall, the Perl implementation was biased towards Unix and was difficult to compile under Windows; its glob syntax called out to the shell.
Honestly, I was expecting you to point out a difficulty that Python had with non-Unix OSes, specifically with MS Windows, which has since been remedied with Python 3.
I didn't expect this response at all, nor do my attempts to explain myself seem to have made a difference.
I still don't know why you singled out Unix in your earlier comment. And it seems I will never know.
"Unix-y" is a paradigm or design philosophy, not an operating system. You can write unixy things for any OS. That's what the parent is talking about, not an operating system. https://en.wikipedia.org/wiki/Unix_philosophy
Your wording wouldn't have caused me to raise an eyebrow.
But ubernostrum seemed to be making a stronger statement that Python favored "one particular group of programmers: people working on Unix ... to write Unix utilities, administrative tools, daemons, etc."
Meanwhile, I know that I used Python 2.x with the win32 extensions to write daemons for MS Windows, and to write an ActiveX extension for Excel.
That's why I wanted clarification on the basis for ubernostrum's statement, with pointers to why I thought Python was well-supported on other OSes.