Hacker News

This is happening mostly because Guido left, right? The take that CPython should be a reference implementation, and thus slow, always aggravated me. No other implementation can compete, because every package depends on CPython quirks, to the point that we're now removing the GIL from CPython rather than migrating to PyPy, for example.
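For readers unfamiliar with what "removing the GIL" buys: here's a minimal, illustrative sketch of CPU-bound work in threads (the function and numbers are made up for demonstration). Under the classic GIL these threads take turns on one core; free-threaded CPython builds (PEP 703) can run them in parallel.

```python
# Illustrative sketch: CPU-bound threads. With the GIL, only one thread
# executes Python bytecode at a time, so this gets no parallel speedup;
# in a free-threaded (PEP 703) build the two threads can run concurrently.
import threading

def spin(n, out, i):
    total = 0
    for k in range(n):   # pure-Python CPU-bound loop, holds the GIL
        total += k
    out[i] = total

out = [0, 0]
threads = [threading.Thread(target=spin, args=(200_000, out, i))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(out)  # both threads computed the same sum
```

The results are identical either way; only the wall-clock behavior changes between GIL and free-threaded builds.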



Partly, yes, but do note he is still very much involved with the Faster CPython project via Microsoft. Google "faster cpython" and "van Rossum" to find some interviews and talks. You can also check out the faster-cpython project on GitHub to read more.


It's fascinating to me that this process seems to rhyme with the path PHP took, with HHVM being built as a second implementation, proving that PHP could be much faster -- and the main project eventually adopting similar approaches. I wonder if that's always likely to happen with languages as big as these. Can a new implementation ever really compete?


Probably. Without a second implementation proving it out, the bureaucracy can write it off as not possible, and demand may be lower simply because users don't know what they're missing.


Similar with Vim vs. Neovim as well.


There is a significant portion of Neovim users, though.


Can you elaborate? Genuinely interested.


Not the parent, but here's a blog post about it: https://geoff.greer.fm/2015/01/15/why-neovim-is-better-than-...


Guido is still involved, but he's no longer the BDFL.



> no longer the BDFL

the irony :/


They should have ceremonially defenestrated him from a ground floor window.


“Dictator” means you get to arbitrarily change the rules. In his case, to end his own term early.


And the "for life" part?


He decided it was for life, just not his.


He dictated it wasn’t the case anymore


PyPy actually works really well. It could probably get even farther if more people knew about it.

Test your packages on PyPy, people.


The only trouble is C packages, which are really, really common, right? So you take a performance hit or something (or so I heard).

I guess that now that the GIL is going away, PyPy will become better at handling packages with native code, like numpy.


In my experience, PyPy works with basically everything these days. I remember having some struggles with a weird Fortran-based extension module a few years ago, but it might work now too.

Most C extension modules should work in PyPy; there's just a performance hit depending on how they're built (cffi is the most compatible).

https://doc.pypy.org/en/latest/faq.html#do-c-extension-modul...
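To make the FFI idea concrete: cffi is PyPy's preferred interface, but here's a stdlib-only `ctypes` sketch of the same pattern, calling straight into the C library. The library lookup and the C signature declaration are assumptions about a typical Unix system.

```python
# Sketch of calling native code from Python via the stdlib ctypes module.
# (On PyPy, cffi is the recommended and best-optimized FFI, but the idea
# is the same.) Assumes a Unix-like system where libc can be located.
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature so ctypes converts arguments and results correctly
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # calls the C library's strlen directly -> 5
```

cffi's ABI mode looks very similar (`ffi.cdef("size_t strlen(const char *);")` plus `ffi.dlopen`), which is why PyPy can optimize it so well: the declared signatures let the JIT compile the call down to a plain C call.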


The point of using C extensions is better performance. Python is already slow in general; code that uses such extensions typically depends on that performance to not be unbearable (data science, for example).

People wanting to use PyPy usually do so because they want better performance, so taking a performance hit while using PyPy is disconcerting.

I was speculating that, in the future, C extensions in PyPy would be faster, but I now see that the GIL is actually unrelated to this performance hit. Anyway, it's really a pity.


I get what you are saying, but normally this wouldn't matter too much, right? You will have a small number of calls into the C extension that together do a lot of work, so as a percentage the FFI overhead is small.
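A pure-Python illustration of that amortization, using the C-implemented builtin `sum` as a stand-in for a call into a C extension (an assumption for demonstration, not real FFI): one boundary crossing that does bulk work versus per-element work at the interpreter level.

```python
# Sketch: one call into C code doing bulk work, versus doing the same
# work step-by-step in the interpreter. Builtin sum() runs its loop in C,
# standing in here for a single call into a C extension.
import time

data = list(range(1_000_000))

t0 = time.perf_counter()
total_loop = 0
for x in data:          # one interpreter-level operation per element
    total_loop += x
t1 = time.perf_counter()

total_bulk = sum(data)  # a single call; the loop happens in C
t2 = time.perf_counter()

assert total_loop == total_bulk
print(f"loop: {t1 - t0:.4f}s  bulk: {t2 - t1:.4f}s")
```

The bulk call is typically much faster on CPython because the per-element interpreter overhead dominates; the same logic applies to a numpy-style extension that is handed a whole array at once.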


C extensions are fast if you use cffi. It's when PyPy has to emulate CPython's C API to run extensions written against CPython that things get slow.

Please do not phrase that as a failure of PyPy. That is so weird.


It's fine with C packages these days; it's increasingly rare to find libraries it won't work with.

That said, it has happened often enough that I'm cautious about where I use it. It would suck to depend on PyPy's excellent performance and then find I can't do something due to a library incompatibility.


This is way better than before, when no C packages worked.

Now a lot of C packages work, and where they don't, it's worth raising bugs: with PyPy, but also in the downstream project, since occasionally they can use something else if it looks like the fix will take a while.


Seems a bit silly to think that. Guido is still involved with Python, and in fact he heads the Faster CPython project at Microsoft, which is responsible for many of these improvements.


As a compiler, Python is an optimized (C/kernel) implementation of parse generation. A JIT (as in PyPy) is a method that parses a "trace" variation of the grammar instead of the syntax, where the optimizer compiles source objects that are not limited to Python code.

Its goal is to create as many "return" instructions as it can decide to.

The GIL is merely a CPython problem, but synchronization can also be a compilation problem.



