
I don't quite see why it's being downvoted. Like it or not, that's what is happening: Python's 2.4-2.7 success was largely due to its simplicity. It more or less lived up to the values defined in `import this`. It wasn't particularly performant or "safe/foolproof", and it wasn't even that powerful as a language. It lacked (and still lacks) a good error-reporting system. But it was lean, easy to get started with, agile enough, expressive enough. It was always possible to write something unintelligible using reflection and magic methods, but that was easily distinguishable from "how stuff should be written", and otherwise you could be quite sure there wouldn't be any surprises.

This is why Python is still used for its purposes: CLI utils, bots/crawlers, tf/pandas/scikit-learn, REST APIs on top of Flask.

Then hard times came. First, the years-long story (still not concluded) of the transition between 2 and 3. Then all this stuff. Sure, there is such a thing as progress; stuff gets invented for a purpose. But now, in 2016 and at v3.6, Python isn't what it was loved for. Not easy to start with, not simple. The return/yield fuck-up shown in the article is a huge deal, for example, and it is not about asyncio per se. Async stuff is always complicated; it wouldn't be that bad if that were all this was about.
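For readers who haven't seen the article: a minimal sketch of the mechanism at issue (my own illustration, not the article's exact example). In a generator, `return x` is sugar for raising `StopIteration(x)`, the same signal that plain iteration uses for exhaustion, and that conflation is what generator-based coroutines kept tripping over:

```python
# `return value` inside a generator surfaces as StopIteration(value),
# so "return" and "yield" end up sharing one exception-based channel.

def gen():
    yield 1
    return "done"  # equivalent to: raise StopIteration("done")

g = gen()
print(next(g))          # 1
try:
    next(g)
except StopIteration as e:
    print(e.value)      # done
```

Any stray StopIteration escaping code called inside a generator looks identical to a deliberate return, which is why PEP 479 later changed it to raise RuntimeError instead.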

If some 5 years ago someone would pick Python just because "why not, I just need to get stuff done", now it's quite likely that after struggling with all these micro-nuisances they'd go with golang/js/php/whatever instead.




The thing about Python is that unlike most other languages, you don't have to deal with that complexity. You can still write Python 2.7 style code, no problem.

This approach to async, though, is just a language feature that's becoming mainstream right now. C# has it, ES7 has it, C++ has a working paper on it, etc. Python actually had the benefit of watching how things worked out elsewhere before implementing it all.


The idea that complexity doesn't matter if you don't use it sets Python on the path to becoming the next C++. Pick the subset of the language that you like, then carefully watch your libraries, because if they use a different subset, you still have to deal with what's in there (and Python is a library-heavy language!). It goes against everything Python stands (stood) for. Definitely not what I'm looking for.


This is not at all a new problem for Python. In fact, it's possibly more of a problem for Python than it is for C++, because Python's dynamic nature and exposure of many internal mechanisms makes it possible to do some pretty crazy stuff in the libraries.

For example, speaking of async - even 2.7 already has Twisted, and an ecosystem of libraries around it.

The only two ways I can see it being solved are either by making it more of a toy language (which is great if you're just writing short scripts, but that's not really what it's supposed to be about), or by having very centralized "best practices" enforcement that basically forces libraries to conform through peer pressure, like Java, which has its own disadvantages aplenty.


No, you can't. Unicode by default is a complete pain in the butt when working with libraries that want bytes. It's just a complete mess.


That's because it's not unneeded complexity. Making you stop and think about what you're doing when you say "just treat these bytes as a string" (or vice versa) prevents buggy code that only works properly with Latin-1, and similar kinds of bugs.

If some library messes it up, and requires you to provide bytes that are semantically a string, it's a design flaw in the library.
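A tiny illustration of the point (mine, not the commenter's): Python 3 makes the str/bytes boundary explicit, refusing to mix the two implicitly and forcing one deliberate encoding choice.

```python
data = "héllo"               # str: a sequence of Unicode code points
raw = data.encode("utf-8")   # bytes: one explicit encoding decision

try:
    "prefix-" + raw          # implicit str/bytes mixing is a TypeError
except TypeError as err:
    print("refused:", err)

print(raw.decode("utf-8") == data)  # True: round-trips cleanly
```

In Python 2, `"prefix-" + raw` would silently succeed for ASCII data and blow up (or corrupt text) only once non-Latin-1 input showed up in production.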


Okay, try using any networking library such as zmq and see what a pain in the butt it is; it most definitely refutes the idea that "you can write Python 2.7 code if you want". You're going to be polluting your code with .encode/.decode all over the shop. Same with xrange and dict access. All sorts of cruft has been introduced in the pursuit of "seriousness". And now we have this post (and most of the comments) saying that 3's flagship win, asyncio, is actually a dog. I reluctantly moved to 3 last year and, honestly, I cannot find any wins whatsoever. I'm seriously considering Lua, although it has its own 5.2 vs 5.3 problems, not to mention the LuaJIT "fork".


> You're going to be polluting your code with .encode .decode all over the shop.

Well, do those encode/decode calls serve a purpose? Does encoding matter?

If so, they're not cruft - they do something that needs to be done. You may dislike the fact that it's extra complexity, but any speaker of a language for which ASCII (or Latin-1) is not sufficient will rejoice knowing that you can no longer write code that breaks when we try to use it. I remember how much of a hassle this kind of stuff was back in the DOS/Win9x days, and even in the early 00s; and I also remember how a hard push for Unicode as the string encoding in mainstream languages like Java and C# did a lot to rectify that.

If not, then why would you need to encode something just to decode it later, or vice versa? Why not just pass bytes/bytearray around? Again, if the library requires doing so (because it demands the data be passed as a str specifically, even though it's never actually treated as a string), then that's a bug in said library, and you should complain to/about its authors instead.
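And when the calls do serve a purpose, a common way to keep them from spreading "all over the shop" is to confine them to a single I/O boundary. A generic sketch (hypothetical function names, not zmq-specific):

```python
def handle_message(text: str) -> str:
    # core logic works purely on str; no encoding concerns here
    return text.upper()

def on_wire(payload: bytes) -> bytes:
    # the only place bytes meet str: decode on the way in, encode on the way out
    return handle_message(payload.decode("utf-8")).encode("utf-8")

print(on_wire("héllo".encode("utf-8")))
```

Everything inside the boundary then reads like the Python 2 code the grandparent misses, with the encoding decision made exactly once.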


As if JS isn't becoming more complicated by the year, and PHP didn't add a ton of features in 5. Anyway, Python ranks in the top 4 of popular-language rankings on multiple sites, and plenty of people are still learning it.



