frequent changes to the language cause pain for implementors of alternate implementations (Jython, IronPython, PyPy, and others probably already in the wings) at little or no benefit to the average user (who won't see the changes for years to come).
If this isn't a good reason to leave a language's semantics mutable via compile-time macros, I don't see what is...
There is a lot of value in mutable semantics. However, when you go to share code, there is a lot of room for bad interactions between packages as well.
The principle that only one person gets to be clever at a time holds in spades when you start changing semantics and/or syntax.
This problem is lessened quite a bit with a language like Lisp, which has a uniform syntax. It is at its worst with a language like Perl, with a very non-uniform syntax. (Yes, there actually are several ways in Perl to change the semantics of the language on the fly.) However, even with Lisp you can encounter issues, particularly if people are using reader macros. (This is one of the reasons that the Lisp community uses reader macros so sparingly.)
when you go to share code, there is a lot of room for bad interactions between packages as well
There's absolutely no reason for syntax changes to be global. On the contrary, Python already has a way to change syntax locally in a module, through imports from __future__. For example, "x = print" is invalid syntax in Python 2.6, unless you include "from __future__ import print_function" at the beginning of the file.
It is at its worst with a language like Perl with a very non-uniform syntax.
Perl 5 did not have compile-time macros (to my knowledge), so whatever bad rep it got for ugliness isn't coming from those. The compile-time macros in Perl 6 are very clean, IMO: http://en.wikipedia.org/wiki/Perl_6#Macros
Whatever downside there is to confusing people by creating local anaphoric aif-s and awhile-s is completely dwarfed by the benefits of not freezing the syntax.
That said, my point about Perl seems to have been unclear. I'm not saying that Perl has a reputation for ugliness because people change the syntax on the fly. I'm saying that layering macros in Perl is problematic because they interact badly with the syntax.
This is why wise Perl programmers avoid source filters and modules that use source filters. Experience has shown that they are very fragile, and don't play well together when you start stacking them. (I'm guessing that fragility is one of the reasons why http://search.cpan.org/~elizabeth/ifdef-0.07/lib/ifdef.pm limited itself to stuff embedded in POD.) Unfortunately some things can't be done in Perl without core support or source filters. For example it wasn't until Switch.pm was pulled into the core that it could be done without source filters.
"For example it wasn't until Switch.pm was pulled into the core that it could be done without source filters."
Not true, actually - it could comfortably be done with Devel::Declare - we just haven't had time to yet.
(I wrote Devel::Declare and in fact have been trying for about a year to persuade somebody to write me the smartmatch logic for perl5 v8 with a promise that I'll do the given/when syntax if somebody does ...)
Source filters fail because they're line-by-line; Devel::Declare is still effectively a filter in some respects but it's one that co-operates with the tokenizer so it tends not to interact badly with anything. If you've seen places it does, failing tests would be very welcome.
That said, my point about Perl seems to have been unclear. I'm not saying that Perl has a reputation for ugliness because people change the syntax on the fly. I'm saying that layering macros in Perl is problematic because they interact badly with the syntax.
Ah, very good point, and well explained :) sorry for underestimating your first iteration.
For example, "x = print" is invalid syntax in Python 2.6, unless you include "from __future__ import print_function" at the beginning of the file.
Well, you're not changing any syntax there. Print moves from a statement in 2.6 to a function in 3.x, and if it's a function you can just assign it like any other function. In other words, after doing "x = print" you still could not call "x 'hello world'", you'd have to use "x('hello world')".
Statements and expressions are different syntactic elements. So "x = print" will throw a syntax error in 2.6:
>>> x = print
File "<stdin>", line 1
x = print
^
SyntaxError: invalid syntax
But not if you "from __future__ import print_function" first. This is also why __future__ imports need to be at the very top of each file they are included in (unlike other imports).
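To make that concrete, here is a minimal sketch: with the future import as the first statement in the file, Python 2.6 parses print as an ordinary name rather than a statement keyword (on Python 3 the import is a harmless no-op):

```python
from __future__ import print_function  # changes the grammar on 2.6; no-op on Python 3

# With the future import in effect, print is an ordinary function,
# so it can be bound to a name and called like any other callable:
x = print
x("hello world")
```

Without the import, the 2.6 tokenizer treats print as a keyword and the assignment is a SyntaxError before any code runs, which is exactly why the import has to sit at the top of the file.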
I suspect it works how it always does for a BDFL (as I am for DBIx::Class in the perl world): whoever likes an idea proposes it; if I like it I try and argue for it; and only if there's a majority in favour does it get accepted anyway.
Except that if there's no consensus decision after a long discussion I get to make the choice. And no matter how any decision was come to, it's my fault if it's wrong.
Guido, Linus, and the various other BDFLs with important projects and actual talent seem to run things much the same way - I can't speak for their approaches but they largely inspired mine so I suspect I'm not completely wrong :)
Then you're not paying attention. No one (except a broken minority) has said they wouldn't want to get rid of the GIL, so long as single-threaded performance is not degraded. Guido himself has stated this time and time again. The one thing missing from all the bright ideas and suggested ways of doing it?
Calling the GIL an implementation detail is like calling tail call elimination an implementation detail: for most software, it may be so, but there are idioms one may or may not be able to use depending on that detail. It's a visible implementation detail.
Tail call elimination is a language feature. The GIL is an implementation detail of the CPython interpreter. Jython and IronPython have no GIL, and one day PyPy and Unladen Swallow won't either.
Well, in other implementations like Jython and IronPython the GIL doesn't exist, so I think most people would consider it just an implementation detail.
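The "visible" part is easy to observe in CPython. A minimal, unscientific sketch comparing serial and threaded CPU-bound work (exact timings vary by machine; on a GIL-free implementation the threaded version could approach a 2x speedup, while under the GIL it typically doesn't):

```python
import threading
import time

def count_down(n):
    # Pure-Python CPU-bound work; under the GIL only one thread
    # at a time can execute this bytecode.
    while n > 0:
        n -= 1

N = 2_000_000

start = time.perf_counter()
count_down(N)
count_down(N)
serial = time.perf_counter() - start

threads = [threading.Thread(target=count_down, args=(N,)) for _ in range(2)]
start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
threaded = time.perf_counter() - start

print("serial: %.2fs, threaded: %.2fs" % (serial, threaded))
```

That's the sense in which the GIL is a leaky implementation detail: the program's *results* are the same everywhere, but which concurrency idioms actually pay off depends on the interpreter.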
Tail-call is an implementation detail. The fact that some people think it's as fundamental to programming as the Ten Commandments is to the Judeo-Christian tradition is... well, perhaps more a problem with programmers than with language implementations.
(and yes, it's visible, and that's one reason why I have issues with it -- it's a classic leaky abstraction, except in this case the "leak" is more like a torrential flood)
If your `while'-loops would crash after a certain number of iterations (and use linear memory before that), would you call that an implementation detail or a leaky abstraction? I'd call it a bug.
And special case constructs for linear recursion like `for' and `while' are only a necessary band-aid in languages that do not treat functions properly.
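In CPython that is exactly what happens: a tail-recursive call still consumes one stack frame per level, while the equivalent `while` loop runs in constant stack space. A minimal sketch:

```python
import sys

def count_down_rec(n):
    # This is a tail call, but CPython does not eliminate it:
    # each level consumes a stack frame, so deep inputs raise
    # RecursionError.
    if n == 0:
        return "done"
    return count_down_rec(n - 1)

def count_down_loop(n):
    # The same computation as a while loop: constant stack space,
    # works at any depth.
    while n > 0:
        n -= 1
    return "done"

depth = sys.getrecursionlimit() * 10

try:
    count_down_rec(depth)
except RecursionError:
    print("recursive version blew the stack")

print(count_down_loop(depth))
```

So whether linear recursion is a usable idiom at all is decided by this "detail" of the implementation.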
From what I have read, GvR and other core Python people are in favor of removing the GIL, but any solution must not hurt single-threaded CPython performance and must not cause any regressions. Some have worked on replacing the GIL, but have not finished the work.
GvR talked about retirement for a while, I see this as the official announcement. It's unlikely that in N years when the freeze ends GvR will continue to moderate the discussion on Python evolution.
This could also signal the beginning of Python as "the" mainstream language, since the freeze makes it a lower-risk proposition for vendors to embrace.
Google's Unladen Swallow targets LLVM, and Apple is now sponsoring work on LLVM, which they might be using to target the iPhone... so it could become a reality in the near future :)
If Adobe can use LLVM to compile ActionScript for the iPhone, then it seems very likely work could be done to get Python to compile to the iPhone as well.
That's not a compiler for Python. It's a compiler for a limited subset of Python that works via source-to-source translation targeting C++; you could write a version of it targeting Objective-C if you wanted.
Great way to kill the momentum and innovation. The only reason Ruby is still relevant today is directly due to its dynamism and continuous evolution. If Python declares a moratorium, they'll be shooting themselves in the foot.
Basically, this means not adding any new language features for about 2-3 years, instead focusing on making improvements to the standard library, and to performance and correctness of the core. The moratorium sounds like a great idea to me: it will help reassure people who are porting their code from 2.x to 3.1 that they won’t have to make too many more changes to keep their stuff working on 3.2 & 3.3, and other implementations (Jython, IronPython, PyPy(?)) will be able to catch up to 3.x features.
I disagree. The last thing Python needs right now is to make Python 3.x even more different from the 2.x line. As a strategic move, I think it makes a lot of sense to stabilize 3.x and wait for it to gain momentum before going on to more syntax changes.
Though I have to say, I'm quite disappointed that PEP 380 is getting left behind, since I find the alternative to be pretty ugly and heavy-weight.
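For context, PEP 380's `yield from` removes the boilerplate of delegating to a sub-generator by hand. A minimal sketch of both styles (`yield from` eventually landed in Python 3.3, after the moratorium):

```python
def inner():
    yield 1
    yield 2

def outer_manual():
    # The "ugly and heavy-weight" alternative: re-yield by hand.
    # (A complete version would also have to forward send(), throw(),
    # and the sub-generator's return value -- hence the heaviness.)
    for value in inner():
        yield value

def outer_delegating():
    # PEP 380: one line, with all the forwarding built in.
    yield from inner()

print(list(outer_manual()))      # [1, 2]
print(list(outer_delegating()))  # [1, 2]
```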
That's because Ruby folks don't work on that as a community. We're all working on our pet projects, instead of helping get ruby1.9 into the popular OSes -- there's no one-click installer for Windows or OSX, and packages for various linux distros are lagging.
(Yes, I'm pointing the finger at myself too... if only I could stop wasting my time on random stuff, I could spend a bit of time each week helping out.)
There's more to momentum and innovation than piling on more syntax.
In fact, putting restrictions on things (such as not being able to add syntax or break backwards compatibility) can engender more innovation than letting people do whatever.
Haha, I was just thinking this rocks for Ruby and other upcoming languages. If people don't have new Python features to play with, they can spend the time learning new languages.
I hope the proposal gets accepted, because I don't like python (personal taste, not trying to start a flame war) and I hope it stops showing up in all the systems I work with.
Oops, got below 0 for the first time. Sorry, I was joking a bit, didn't mean to be offensive.
I didn't mean to offend python (this time). I think learning different languages helps the mind, and I try to experiment with at least 1 new language every year. So I was thinking if more people have time to learn new languages, everyone will be better off. I hope this makes more sense.
Doesn't it depend on your goals? For example, I expect Haskell to be a hotbed for experimentation for a long time to come. In fact I think people choose Haskell because it is continuously evolving.
I know that I picked Clojure recently not just because of its features - I picked it because it _wasn't_ stable, it _wasn't_ all figured out. In fact, as part of the community, you can actually contribute a hand to its future.
It's just that Python has matured to the point where people would rather experiment with its implementation than with its syntax/expressivity. It also has to do with the fact that this looks like part of a growing effort to make Python _immensely_ popular, going head to head with the likes of C++/Java.
As far as I'm concerned that's a good thing for Python.
But it does mean if you're looking for new ideas in PLs you'll have to look elsewhere. But it's not like there's a lack of excellent and popular candidates these days.
I find Scheme quite a bit uglier than Common Lisp. The simplistic design makes it much more unpleasant to use than CL. For learning programming it's fine, and smaller programming tasks are fine too, but anything interesting is just a pain. Scheme then provides much on top with SRFIs (and similar extensions); unfortunately the base language does not scale.
(And I would not want to do loop in Scheme. That's just too imperative. Interestingly, Haskell's special syntax for monads and its laziness make the need for most macros / special forms go away (though not for all, and Haskell has plenty of syntax on the surface).)
Consider that Paul Graham, the author of the fine book "On Lisp", chose PLT Scheme to implement Arc on top of.
This is potentially pretty big news. While this means that we'll probably see blazingly fast/cool implementations, Python as a laboratory for language innovation is coming to an end.