Python 3.11 (python.org)
202 points by wetpaws on Oct 4, 2021 | 146 comments



> When printing tracebacks, the interpreter will now point to the exact expression that caused the error instead of just the line. For example:

  Traceback (most recent call last):
    File "distance.py", line 11, in <module>
      print(manhattan_distance(p1, p2))
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "distance.py", line 6, in manhattan_distance
      return abs(point_1.x - point_2.x) + abs(point_1.y - point_2.y)
                             ^^^^^^^^^
  AttributeError: 'NoneType' object has no attribute 'x'
This is such a great quality of life improvement.


This is an INCREDIBLE improvement. I'm so excited for it for myself, and I think it will also do wonders for getting new programmers comfortable with the language. Seeing exactly where in the line your error is, is amazing! IMO this 100% cements Python as the single most newbie-friendly language, if there was any question before now (or at least, before the release of 3.11).


You can already get this information for Python 3.6+ with https://friendly-traceback.github.io/docs/index.html


I already split expressions across separate lines since this information is currently missing. This is great indeed. (Although splitting might still make sense for debugging.)


It would sure be great if they made it easier to break into the debugger in case of an unhandled exception, like (setq debug-on-error t) in Emacs Lisp.
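
For reference, a rough equivalent you can bolt on today - a minimal sketch using only the standard library (the usual excepthook recipe):

    import pdb
    import sys
    import traceback

    def debug_on_error(exc_type, exc, tb):
        # Print the traceback as usual, then drop into pdb at the crash site.
        traceback.print_exception(exc_type, exc, tb)
        pdb.post_mortem(tb)

    sys.excepthook = debug_on_error

There's also `python -m pdb script.py`, which ends up in post-mortem mode when the script dies with an unhandled exception, but a first-class switch would be nicer.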


This can be done in some IDEs. Although it can get confusing if you're working in an app framework that does some sort of catch-all exception handling.


Yeah it seems to me that regular Python should be able to do it though.


Well, if you like this, you are going to love 3.10, due tomorrow, because it's packed with a lot of error message improvements.

Everybody talks about the pattern matching, but I'm excited about this. I'm going to upgrade ASAP just for it.


Oh wow, I completely missed this in the release notes, probably because I stopped reading them carefully a few versions ago. This is amazing!


Really wish they'd just go and add a way to replace better-exceptions[0]. I don't use it anymore, and thus don't really like maintaining it these days; I'd love to see this in mainline.

[0] https://GitHub.com/qix-/better-exceptions



This is all exciting stuff, but 3.11 comes out in 2022. 3.10 is supposed to come out tomorrow though; you can check out its "What's New" here:

https://docs.python.org/3.10/whatsnew/3.10.html

3.10 release schedule:

https://www.python.org/dev/peps/pep-0619/


Python is deceptive. It lures you in with its simplicity, and then one day you want to deploy your software and you realize that it's more complicated than any other programming language ever conceived of.


Couldn't say it better myself. Everything you gain from fast coding you lose tackling performance issues, the GIL, dependency management... things that are harder to solve once you're committed to production.


It depends very much on what you are doing. Python for big complicated applications is a no for me as well, but Python for extraction jobs in an ETL pipeline or for data science tasks is a huge yes.


I don't know; I've just written something in .NET/C# and it's pretty bloody difficult to deploy.


.NET can export a self-contained single-file executable that doesn't require any runtime on the target computer/server.

Python has solutions like PyInstaller, but it's not built-in and requires building on the same system. In .NET, you can use the built-in tools to export binaries for Linux from Windows.
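
Roughly, the two workflows look like this (flags and the runtime identifier are illustrative; check the docs for your versions, and myscript.py is a placeholder):

    # .NET: cross-compile a self-contained single file from any OS
    dotnet publish -c Release -r linux-x64 --self-contained true -p:PublishSingleFile=true

    # Python: has to run on (a machine like) the target OS
    pip install pyinstaller
    pyinstaller --onefile myscript.py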


Except when you have some specific dependencies/targets that make this difficult. I'm sure I'll get there eventually, but I chose .net for this thinking it would be dead simple, and yet here I am wading through documentation and stackoverflow posts to get to my self-contained single-file executable.


Most Windows users have at least 4/5 versions of Linux installed on their machine, so it's a non-issue. They have WSL, a base installed partition, Vagrant, a VM, and probably Docker containers, so running one extra command to generate a Windows package isn't a big deal.


.net/C# was one of the easiest to deploy in my experience. Just build a self-contained executable and you're done.


Can you be more specific about your claims? What is wrong with Python?

> more complicated than any other programming language ever conceived of

is definitely untrue. Even if you take only mainstream languages, there are many that are more annoying to deploy, more complicated, and so on.


Dependency management in python is much worse than most mainstream languages.


Here I agree, but it is quite far away from

> more complicated than any other programming language ever conceived of


It was far easier for me to grok monads in Haskell (and it took less time), than to learn how to properly deploy a Python application. Given that monads are often a complaint and testament to the "complexity" of Haskell, I think it's unfair to make the claim that Python is uncomplicated. If you use a language for some period of time, it is very likely that you will have to deal with the tooling at some point.


I think the "it's" in "it's more complicated" refers to deploying software, not the language itself nor its support packages.


When you say "deploy" do you mean "put it on a webserver"? I've deployed python Web services on to very limited environments that have nothing more than an interpreter. Nowadays I use pip-tools and docker which makes updating trivial.

Do you perhaps mean distribution as an app? I've had luck with pyinstaller. There can be fiddly bits but overall it's pretty smooth. It's easier if you are targeting a system with an interpreter, like a Linux distro.

Overall I just think you need to know what you're doing. I don't really think this is a bad thing. Maybe you could elaborate on what your difficulties are? From here I just don't see how it's more difficult than anything else.



Thanks, I know all of that. What made you think I don't?


Yes - this is the reason I've switched to Go for writing command-line apps. I don't like Go as much as Python (Go is simpler, but not as easy), but being able to compile a self-contained executable is huge. It's also great to have access to GIL-free parallelism.

I don't expect Python to support either of those things any time soon.


This is a solved problem. The community provides a handy system overview[^1] that should be good enough for most cases.

[^1]: https://xkcd.com/1987/


Didn’t even need to open the link to know what it was, thanks for the laugh!


It's probably the most voiced complaint these days; unless the core devs are deaf, I bet ten solid dollars they're gonna address it soon.


It’s been like this for over a decade. I wouldn’t hold my breath. By comparison, I’m pretty sure the Ruby community adopted bundler over a decade ago.


Maybe they were busy on other things: Py3k, async. Now that these are settled, I don't see what other defect needs to be worked on.


What kind of issues do you get during deployment? I think if you're running on the right version then there should not be any issues.


https://stackoverflow.com/questions/52187362/how-to-deploy-p...

Some of these approaches work for Windows, others for Mac, some for Linux, depending on versions of Python that are installed, etc. You can get around some of the problems with pyenv and a bunch of other tools.


I originally used Python (learned it in my first job back in 2001!) and these exact issues are the reason I eventually moved to other languages - Python as a language is great, but the tooling/ecosystem isn't (or at least wasn't at the time; that's also at least a decade+ ago by now).

My impression at the time was that using it for server-side stuff was fine (I control the server and simply install all the right things/versions myself), but packaging it up for end users to install on their desktops/mobile was a huge pain. There was no real easy/simple way to just export a self-running installer that you double-click (on Windows/Mac), or an IPA/APK (on iOS/Android), and end up with a running program the way users were used to from e.g. a bundled .NET (at the time using Mono; now MS provides cross-platform .NET implementations themselves) or a native C/C++ executable.


I recently wrote a Python script to handle serial communication and uploading of a bare-metal binary to my RPi3; this saves me so much time debugging now that I don't need to swap SD cards continually. Excellent tool.
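
The core of it is tiny - something like this sketch, assuming pyserial (the port, file name, and length-prefix protocol are placeholders; a real bootloader needs its own handshake):

    import serial  # pip install pyserial

    PORT = "/dev/ttyUSB0"  # placeholder: whatever your UART adapter enumerates as

    with serial.Serial(PORT, 115200, timeout=2) as ser:
        with open("kernel8.img", "rb") as f:  # the bare-metal binary
            data = f.read()
        ser.write(len(data).to_bytes(4, "little"))  # tell the bootloader the size
        ser.write(data)                             # then stream the binary
        print(ser.readline().decode(errors="replace"))  # bootloader's reply, if any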


Because of dependency management, or?


Because of having to know what WSGI and ASGI are and picking the right one. Then picking which implementation of the above you want to use. Then configuring it correctly for your app. And finally making it play nicely with nginx/apache/whatever.
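
To be fair, the WSGI contract itself is tiny - the pain is everything around it. A minimal app, as a sketch:

    # app.py - the entire WSGI interface is one callable
    def application(environ, start_response):
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Hello, world\n"]

Serve it with e.g. `gunicorn app:application` and you're immediately into exactly the picking-and-configuring described above.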

If on the other hand you're trying to deploy a desktop app, that is an entirely different kettle of 'fun'.


That's not really a criticism of Python, is it? What do you want? One Way To Do Things with No Configuration with magical compatibility with nginx/apache/whatever ?


Well, according to the Zen of Python, "there should be one-- and preferably only one --obvious way to do it" :-P


> Because of having to know what WSGI and ASGI are and picking the right one. Then pick which implementation of the above you want to use. Then configuring it correctly for your app. And finally make it play nicely with nginx/apache/whatever.

None of the Python I deploy has to deal with any of that; all of it applies in one very narrow application domain. Even if it were a source of valid criticism, it would only be criticism of using Python for that specific use, not of Python generally.


<dream> There is a very fast Gunicorn replacement in Rust that accepts a WSGI/ASGI module and a static dir and starts serving both. It breaks performance barriers for pure-Python servers. It can automagically create a systemd service and an nginx site, or just exist as a Docker entry point.


Have you ever tried to deploy an Erlang/OTP application? I find it much harder to do.


No, but Go and C++ are much easier to deploy.


This comment is deceptive. This is a solved problem with virtualenvs. It is why Piku (https://github.com/piku) can deploy isolated apps with ease (and actually devotes less than 80 lines of code to the actual Python deployment).


Thanks for taking the opportunity to advertise your own product, but it doesn't solve the problem.


It’s not a product. It’s an example of why it isn’t a problem, and it was actually _designed_ around virtualenvs, which is why it’s just a single script (again, not a product).


This is just a changelog; it's better to link to the 3.11 "What's New": https://docs.python.org/3.11/whatsnew/3.11.html


Between this link and the OP, we're seeing the first improvements to drop from Mark Shannon's Microsoft-funded full-time work on CPython optimization, alongside Eric Snow and Guido van Rossum:

https://www.theregister.com/2021/05/19/faster_python_mark_sh...

https://github.com/markshannon/faster-cpython/blob/master/pl... - indicates that this is step 2 of 4, where the forthcoming steps actually plan to introduce JIT compilation!

From the OP Changelog:

- bpo-44590: All necessary data for executing a Python function (local variables, stack, etc) is now kept in a per-thread stack. Frame objects are lazily allocated on demand. This increases performance by about 7% on the standard benchmark suite. Introspection and debugging are unaffected as frame objects are always available when needed. Patch by Mark Shannon.

And from the What's New:

- “Zero-cost” exceptions are implemented. The cost of try statements is almost eliminated when no exception is raised. (Contributed by Mark Shannon in bpo-40222.)

- Method calls with keywords are now faster due to bytecode changes which avoid creating bound method instances. Previously, this optimization was applied only to method calls with purely positional arguments. (Contributed by Ken Jin and Mark Shannon in bpo-26110, based on ideas implemented in PyPy.)

Really, really exciting beginnings. I can't wait to see what's next!


7% is nothing, and the benchmark suite is notoriously unreliable. As usual, whenever a company that has done nothing at all for Python in 30 years attaches its name to the product of others, people fall over themselves to praise it.

Real improvements made by individuals for the last decade are taken for granted and aren't mentioned. The corporations are credit thieves.


> Between this link and the OP, we're seeing the first improvements to drop from Mark Shannon's Microsoft-funded full-time work on CPython optimization, alongside Eric Snow and Guido van Rossum

Oh good. After the proposals to speed up Python, I hadn't heard of any organisation stepping up to sponsor them. It's great to find that Microsoft did.


With Python style favoring exceptions, I could see just the zero-cost try alone being a meaningful improvement.


In fact, for loops are try/except under the hood.
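
Roughly this, as a sketch (what the interpreter effectively does, not literal CPython source):

    # for x in [1, 2, 3]: print(x)   ...approximately desugars to:
    it = iter([1, 2, 3])
    while True:
        try:
            x = next(it)
        except StopIteration:  # loop termination is signalled by an exception
            break
        print(x)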


According to PEP 619[1], the Python 3.10 release is expected tomorrow, so it's no wonder work has started on 3.11...

[1]: https://www.python.org/dev/peps/pep-0619/


Development on a new branch begins at the time of beta 1 of the previous branch, which is the feature freeze point. ("Development on a new branch" is metaphorical, there's no new git branch created at that point, just the master branch.)


I'm waiting for Python for Workgroups 3.11


Close...

  Python 3.8.10 (default, Jun  2 2021, 10:49:15)
  [GCC 9.4.0] on linux
  Type "help", "copyright", "credits" or "license" for more 
  information.
  >>> print(3.11 - 3.1)
  0.009999999999999787
  >>>


I’m not holding my breath for Python NT however.


Considering what happened between 2 and 3 ... I think everyone agrees that no one ever wants to reach 4 (NT).


Some nice changes:

> “Zero-cost” exceptions are implemented. The cost of try statements is almost eliminated when no exception is raised. (Contributed by Mark Shannon in bpo-40222.)

> Method calls with keywords are now faster due to bytecode changes which avoid creating bound method instances.


Just 3 more updates until Pithon 3.14


one more after that for 3.14.1


Then they just need to get the patch release up to 3.14.15 for maximum enjoyment.

It's doable! 2.7 got all the way up to 2.7.18 (though that was a special case).


> though that was a special case

I think this qualifies as an even more special case.


pi-thon


I really enjoyed reading the discussion about the updates to TimSort, at https://bugs.python.org/issue34561 .


In https://docs.python.org/3.11/whatsnew/3.11.html :

> A new function operator.call has been added, such that

    operator.call(obj, *args, **kwargs) == obj(*args, **kwargs).
Isn't this the same as the old apply function?

https://docs.python.org/2/library/functions.html#apply

Ah, it was discussed:

https://bugs.python.org/issue44019#msg400821

> Python2's apply has different semantics: it takes non-unpacked arguments, i.e.

    def apply(f, args, kwargs={}): return f(*args, **kwargs)
> rather than

    def call(f, *args, **kwargs): return f(*args, **kwargs)
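
A quick illustration of the new form (3.11+ only; the function here is made up for the example):

    from operator import call  # new in Python 3.11

    def greet(name, *, punctuation="!"):
        return f"Hello, {name}{punctuation}"

    # call() takes already-unpacked arguments, unlike Python 2's apply(),
    # which took an args tuple and an optional kwargs dict.
    assert call(greet, "world", punctuation="?") == greet("world", punctuation="?")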


The deprecation of lib2to3 has occasioned some difficulties for formatters. Neither `black` nor `yapf` can correctly format a script using the `match` syntax introduced in Python 3.10, even though Python 3.10 is formally released tomorrow. It seems that transitioning to the new PEG parser isn't trivial work.

Certainly glad to see Python 3.11 getting so many performance improvements.


Just spent about 4 hrs getting Python 2 and pip set up on my Mac. Any time I have the misfortune of needing to use Python I get cold sweats at the thought of the environment stuff.

Why is this still such a massive problem?


4 hours? Yeesh. Here's how to do it in 5-10 minutes.

Use pyenv to install any Python version and pyenv-virtualenv to manage virtual environments.

https://github.com/pyenv/pyenv

https://github.com/pyenv/pyenv-virtualenv
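
The whole flow is roughly (version and env name are just examples, and pyenv-virtualenv also wants its init hook in your shell profile):

    brew install pyenv pyenv-virtualenv
    pyenv install 2.7.18            # or any 3.x
    pyenv virtualenv 2.7.18 myenv   # "myenv" is a placeholder
    pyenv activate myenv
    pip install <whatever>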


For me, pyenv on Mac is about 15 minute compile time with a bunch of environment variables required to make sure tkinter works properly.


The compile time depends largely on dependencies. If you have OpenSSL, zlib, and whatnot pre-installed (either via Homebrew or self-compiled and specified with environment variables), the compile time is close to 5 minutes. If you don’t, by default pyenv (actually python-build) compiles all those from scratch, and can easily spend north of 20 minutes (mostly for OpenSSL).


Use miniconda. It's much better.


The licensing is not compatible with my company's legal team.


    brew install pyenv


Anaconda is also great for beginners - GUI, lots of normal packages precompiled (also numpy, scipy, pandas, scikit-learn). If you do the full install, it comes with lots of packages pre-loaded.


Using Python 2 is a big part of the problem. Modern Python 3 is a lot nicer. Also you're much better off if each Python project has its own virtual environment using the venv module.


While I also quirked an eyebrow at the exact version, is 3 better at package management & environments?


Yes, virtual environments are baked into the language. `python3 -m venv venv` will make an environment named venv. `source venv/bin/activate` activates it. `pip install package` in an activated environment installs the package only into the active environment.
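
Spelled out (the directory name venv is just convention):

    python3 -m venv venv          # create it
    source venv/bin/activate      # activate it (current shell only)
    pip install requests          # lands in ./venv, not system site-packages
    deactivate                    # leave it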


It's still a mess, honestly. They tried to standardize on venv but since it doesn't deal well with Python libraries that depend on non-Python components, conda is still better for many purposes.


Can you give a specific example?

I've had nothing but pain with conda in mixed linux/mac/win environment at work, and actively worked to get it deprecated. We're on plain venvs now, with the occasional setup.py

Curious if I missed some useful case for it.

- very slow dependency resolvers

- using conda in docker is annoying

- the worst thing: inconsistencies in downloading binaries and other resources between win/mac/linux when installing packages

- conda insisting on messing with bashrc to run their 'activate' crap

- beware the fool who runs 'sudo conda whatever' on a multiuser system


> Curious if I missed some useful case for it.

Conda will package and version things like compilers and C libraries etc in a sane way. I can 'conda install' a package and start using it, but if I 'pip install' the same package I'm still left having to hunt down and install a bunch of additional dependencies. For example 'conda install cython' gives me a fully working and consistent cython on all platforms in a way that 'pip install cython' won't.

There are still several computational science and data science packages where 'pip install X' on windows simply doesn't work (and 'conda install X' does) due to missing dependencies and other weirdness.

Conda can version and manage R as well as Python, so if you work on projects that use both then you only need one tool.

Conda can also install dev tools like Spyder, VSCode or Jupyter(labs) making it easier to offer a single point of entry for an entire dev environment.

But basically the big one is simply that Conda will install a lot more numeric packages with a lot fewer headaches on win, linux and mac than pip. Although in pip's defense it has gotten a lot better over the past 2-3 years. And I do agree that the dependency resolver can take a ridiculously long time in some situations.


I think this is an interesting case where people just want different things, and so the thing that they want is "better".

Some people really just want Python+extras and don't want to worry about getting a fortran compiler. Conda is great for this.

Other people really don't want a big meta-install system that bundles a bunch of dependencies. Maybe they're on Linux, they target a single platform, or they want to use a package not in conda that works with a more recent version of the dependencies that conda bundles.


> I think this is an interesting case where people just want different things

For some people Python is really more of an end user application. They don't want to develop and distribute and deploy software, they want to write scripts and analyze and transform data. They expect the script a colleague emailed them to just work and expect their colleague to be able to open and run their Jupyter notebook as easily as they can open and run an Excel document. For those people Conda comes much closer to solving their problems.


Here's an example, let's say you're developing GPU code that needs to work with different CUDA versions on different machines. Look at the `cupy` install instructions for example:

https://docs.cupy.dev/en/stable/install.html

With conda this installation tends to just work, conda will also install the non-python cudatoolkit for you. With pip you have to either make all your python developers set up their c++ environment the same way as well, to install from source, or set a fixed cuda version that all users have to be on.

Now `cupy` is just one python library that has non-python dependencies. If your project has several dependencies like this, where conda is a one line install and pip means you have to mess around with your c++ environment, conda is probably the right choice for you.

All of the pain you mention with conda is totally correct, though, it's just a question of which sort of pain happens to be worse for the packages you are going to use.


Except when Conda plays hard to get, like in any corporate environment with connection inspection, custom root CAs, or not quite admin privileges. Then Conda is an absolute dog and hard to work with, yet regular old python is fine.


I use the Anaconda distribution. I can create environments containing any version of Python, like this:

    conda create -n myEnv python=2.7
    conda activate myEnv
I think the problem is that unless someone already knew Python, they probably wouldn't understand environments, so their first thought would be to somehow install another version of Python and replace the system Python, or to use a switcher to switch between versions. That would be my first intuition too. (and I believe that's how Ruby works, via RVM).

But Python works just a little bit differently in that it advocates for the use of its own environments to isolate dependencies. I'll have to admit it is not super intuitive (I already have a system package manager (apt, brew, etc.) and I have Docker containers -- why do I need another yet another environment?) but over time I've just accepted it for what it is. It's a Python convention.


I mean, conda is installing another version of Python, and it provides a switcher to switch between environments. Conda and rvm are very similar to use.


Folks have given a number of good options here, but I thought I'd mention some other ones in case they're helpful for your use case.

1) docker! If you only plan on tinkering with something and don't want to install a bunch of random things on your machine, this command will start a python 2 image with access to the current directory: `docker run --rm -it -v "$PWD:/app" python:2.7 bash` (IIRC, on my phone atm!) (Windows users will need to write `"/$PWD://app"` if using mingw/etc). Once inside, you can pip install, etc, and it'll all go away once you exit the terminal! If you want it to stick around, don't use --rm and add a --name to give it a name you can remember for later. It does require learning some docker, but it's a good investment :) I use the same technique to try out/play with other languages, versions, packages, etc. without making complicated mods to my actual computer. And there might even be docker images that come with certain packages/etc pre-installed for certain use cases!

2) Pycharm! A good python IDE, with virtualenv support baked in if you don't feel like playing around with a bunch of commands at all. Although disclaimer, I can't fully remember how hard it was to get things setup (or how easy it is to work with multiple python versions), so YMMV.


I don't think Pycharm will install different versions for you, but if you have multiple versions installed, it is indeed super easy to switch between them. Though if you have a ton of packages you'll have to reinstall them all for the right version, and if you're dealing with lots of venvs + system installs, you have to make sure to choose the right thing from the dropdown and you might get somewhat lost and confused if you don't know what you're doing.

The right menu is File -> Settings -> Project: <your project name> -> Project Interpreter, and then you pick whatever you want by Python Interpreter.

In Windows I use Chocolatey to manage Python versions; on Mac, I don't know.


It does install them too - however I have the pro version, and I'm not sure if that's a pro feature or not.


It installs versions of Python, or versions of packages? I can't find anything in the docs about installing versions of Python, just packages (which I've done on occasion, when I'm really confused about why my environment isn't working; I also have the pro version now, but I'm pretty sure I'd done it when I had the community version too).


Ah, just checked. I thought it installed python versions as well, but it can create environments (virtualenv, conda, system pythons, pipenv, docker), and can install packages in those environments.

I do not see a way to install a whole new python version (only the ability to add to the list, which requires you to select the interpreter from an open file dialog)


Interesting, I don’t use Python much anymore, but my project setup was always something like:

    virtualenv -p interpreter venv 
    source venv/bin/activate
    pip install . . .

These days, I just use nix


Replace the first line with

    python3 -m venv venv
And you have the current state of things.


Install miniconda, it’s amazing. Takes roughly 30 seconds to install. After using conda I’ve never had environment issues


Please, enough of this. Try spending 2 minutes googling it. Hint: pyenv


If you spend 2 minutes googling it (or even just reading HN comments here), you'll get various people recommending venv, pyenv, poetry, conda, pyflow and probably a whole bunch of deprecated stuff, where it's totally unclear to outsiders which one to use.


Fwiw if you are an outsider, having gone through this exact issue myself recently start with venv: https://docs.python.org/3/library/venv.html

It is a standard library module and serves as the base functionality; most of the other things you list are third-party tools that build on the venv concept in various ways.

Once you grok venv, you will be in a position to understand if those tools can provide value to your particular situation.


I wanted to install Python on a new Mac last night and just couldn't get up and running. Closest I got was with Anaconda, but although I could get the right env in the terminal, I couldn't get either PyCharm or VS Code to pick up the correct interpreter/packages etc.

I’d almost guarantee there was something simple I was missing but for someone not familiar with the ecosystem getting python set up to do more than just hello world is a shitshow.


virtualenv, pyenv, venv.. I'm sure they all work well, but for a newbie, the sheer choice is a killer, especially when they all work almost same, but not quite.

Now compare that with cargo or npm, where having a canonical way makes bringing newbies up to speed so much easier.


Took the words out of my mouth. Let’s not over complicate things.


1. Python has a low barrier to entry and is a popular first language, so most users don't realise how bad it is.

2. Python was originally popular with old-school sysadmins, Debian types, and a lot of its package management is based around that philosophy of carefully hand-tended servers shared by multiple users.


> a lot of its package management is based around that philosophy of carefully hand-tended servers shared by multiple users.

Is it? I'd love for apt to support virtual environments or `apt install --user` like pip does, let alone populating venvs with requirements.txt.


The pile of debootstrap chroot stuff that you get told to use works pretty similarly to virtualenv. It's pretty clunky and bolted-on, but so is the Python version.


(s)`chroot` and `debootstrap` existed long before docker did for this purpose.


Care to elaborate on what is bad? How bad are they compared to other languages? (Assuming you are always accepting some tradeoffs when moving from one ecosystem to another.)


It's painfully archaic compared to languages that have all this sorted out.

Pip is a recipe for disaster, indicated by the huge amount of churn in the Python packaging sphere. It's constantly the worst part of my day whenever I pick a Python project up. Conflicting dependencies, non-deterministic installs, etc.

I used to cope with this crap fest until I tried Elixir and experienced the beauty of modern package management and build tools. One tool, Mix, that handles task running and dependency management with a proper resolver.

I honestly think even Node has a better package management and tooling story.

Also: virtual environments are a hack and a pain. Python is moving forward in this aspect with the recent PEP for the pypackages folder, but we're still a long way from adoption.

All of this stuff is painful for beginners. Virtual environments might seem easy to us, but I've had ridiculous amounts of trouble explaining why they're necessary to beginner developers. Then you have to explain `poetry` or `pip freeze`.

Even I have trouble coming back to older projects of mine and I'm an experienced Pythonista: I usually waste an hour or two trying to sort the pip dependencies out with whatever conflicts the resolver comes up with this time.

Python package management is not okay, we're all just used to coping with it. Other languages put it to shame.


Package management is pretty fubar in every language I've tried, but Python is one of the worst.


> non-deterministic installs

Things continue to improve. For example, Pip's resolver has been deterministic since around November 2020. Bringing something minimal into Python that provides deterministic builds (e.g. pip-tools) would help packaging a lot.
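
For reference, the pip-tools flow is roughly (the .in/.txt file names are the tool's convention):

    pip install pip-tools
    # requirements.in lists only your direct dependencies
    pip-compile requirements.in   # writes a fully pinned requirements.txt
    pip-sync requirements.txt     # makes the environment match the pins exactly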


Off the top of my head:

- What you get when you import a library depends on state that's scattered all over the system: system-managed packages, pip-managed system-global packages, pip-managed per-user packages, which virtualenv is currently active, which directory you're currently in, which directory the program you're running is in, whatever it is that conda does....

- There's no concept of reproducible builds or dependency pinning. There's "pip freeze" but that's a one-time operation that you can't then reverse, so it's only usable for leaf applications. If you're developing a library, you'd better get used to having your transitive dependencies changed on you all the time. And since the whole ecosystem is built that way, even if you use some tool that lets you make stable releases of your library, that doesn't help you develop at all.

- Virtualenvs are stateful and attached to whatever terminal you were in at the time. This interacts hilariously with the previous point: if you accidentally run "cd myproject && pip install -r requirements.txt" in the wrong terminal, you permanently, irreversibly fuck up that virtualenv. All you can do is wipe it out and try to recreate it - but, per the previous point, it probably won't come out the same as before.

- You're supposed to use pip to manage which python version each project is using. But you're supposed to use the installer for it that's distributed with the python runtime. But only certain versions of the python runtime...

- There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPi. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.

It's really a lot worse than other languages. If you build a real system (like, a couple of libraries and applications) in another language (not, like, C/C++ - but even Perl or TCL will prove the point) and then come back to Python, you'll find yourself hating it all the time.


> There's "pip freeze" but that's a one-time operation that you can't then reverse, so it's only usable for leaf applications.

What do you mean by reverse? You can certainly upgrade all packages locally and then run pip freeze again if you want to set up newer versions, or like manually change versions in your requirements.txt and `pip install --upgrade` to update them. For stronger reproducibility guarantees there's also `--require-hashes`, although admittedly freeze doesn't appear to support easily writing hashes to a requirements.txt.

> per the previous point, it probably won't come out the same as before.

I don't see how this follows. If you have frozen dependencies, it will.

> You're supposed to use pip to manage which python version each project is using. But you're supposed to use the installer for it that's distributed with the python runtime. But only certain versions of the python runtime...

No you use venv for that. Venv which has been available since python 3.3, in 2012, and on pip, ensurepip has been since 3.4, in 2014. If you're using a version of python that's 10 years old, I don't really know what to say.

> There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPi. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.

I mean you can install from git directly: https://pip.pypa.io/en/stable/topics/vcs-support/#vcs-suppor..., and there is documentation on spinning up a local repository (which for packages, is just files in a known directory structure): https://packaging.python.org/guides/hosting-your-own-index/


> What do you mean by reverse? You can certainly upgrade all packages locally and then run pip freeze again if you want to set up newer versions, or like manually change versions in your requirements.txt and `pip install --upgrade` to update them. For stronger reproducibility guarantees there's also `--require-hashes`, although admittedly freeze doesn't appear to support easily writing hashes to a requirements.txt.

You need to know, and maintain, both what you intended to depend on and what you physically ended up depending on. So in more sensible ecosystems you will have, e.g., Gemfile and Gemfile.lock. pip freeze is, effectively, the way you create Gemfile.lock, but it forces you to destroy Gemfile to do it. And so you can't really use it.


> So in more sensible ecosystems you will have, e.g., Gemfile and Gemfile.lock. pip freeze is, effectively, the way you create Gemfile.lock, but it forces you to destroy Gemfile to do it.

No, it doesn't.

  pip freeze -r requirements.txt > requirements.lock.txt
There's plenty of legitimate criticism of Python in general and pip in particular, but much of yours seems to be criticism of things that are factually untrue.


> There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPi. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.

https://pypi.org/project/pypiserver/

Also, pip will install from a git repository.
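
Both of which look roughly like this (URLs and package names are placeholders):

    # straight from a git repo
    pip install "git+https://example.com/yourorg/yourlib.git@v1.2.0"

    # from a self-hosted index such as pypiserver
    pip install --extra-index-url https://pypi.example.com/simple/ yourlib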


> - There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPi. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.

You can set up a proxy such as Nexus, which allows hosting your own libraries: https://help.sonatype.com/repomanager3/formats/pypi-reposito...


You should check out poetry. It has greatly simplified almost all of what you have mentioned.


What are you trying to do? It generally is not a problem unless you are using old or obscure packages. Sure, some projects need those things, probably a lot of them, but I don't think that is really Python's fault. Not that this couldn't be done better in Python. What issues did you encounter?


If you use pyenv, it shouldn't be that difficult.


I don't know how it's actually possible to spend 4 hours getting Python 2 and pip set up at all. The issues deal with managing upgrades, not initial setup.


This too is a solved problem. First of all, the bundled Python 3 in Big Sur is perfectly OK and works fine (it’s become my default after years of pyenv).

Second, brew and pyenv "just work" if you need any other version. I've been developing in Python for decades, and this is one of the cleanest and easiest approaches to running multiple runtimes I've used in any language.


I sympathize.

The problem is Apple. They are (understandably) refusing to ship an up-to-date global Python interpreter with macOS. Not only that, they aren't removing the default 2.7 that they ship.

macOS Big Sur ships Python 2.7. Ha. Ha. Ha.

None of the proposed "virtual environment" solutions are going to rescue you when you operate globally at the OS level and not in a sandbox.

Been there.


1. macOS ships a /usr/bin/python3 that is up-to-date at the time of the feature freeze of a major macOS release, e.g. macOS 11 ships Python 3.8.2 from February 2020.

2. /usr/bin/python being a symlink for python2.7 is entirely consistent with PEP 394 recommendations, especially the version before June 2019: https://raw.githubusercontent.com/python/peps/214736457f6d61...

  * Unix-like software distributions (including systems like Mac OS X and
    Cygwin) should install the ``python2`` command into the default path
    whenever a version of the Python 2 interpreter is installed, and the same
    for ``python3`` and the Python 3 interpreter.
  * When  invoked, ``python2`` should run some version of the Python 2
    interpreter, and ``python3`` should run some version of the Python 3
    interpreter.
  * If the ``python`` command is installed, it should invoke the same version of
    Python as the ``python2`` command (however, note that some distributions
    have already chosen to have ``python`` implement the ``python3``
    command; see the `Rationale`_ and `Migration Notes`_ below).
The 2019 update https://github.com/python/peps/commit/ae932bd6fd2c493d7d64ce... allowed more flexibility, but the old handling is still entirely okay.


> not only that, they aren't removing the default 2.7 that they ship.

Yes they are:

    $ python

    WARNING: Python 2.7 is not recommended. 
    This version is included in macOS for compatibility with legacy software. 
    Future versions of macOS will not include Python 2.7. 
    Instead, it is recommended that you transition to using 'python3' from within Terminal.
    
    Python 2.7.16 (default, Aug 30 2021, 14:43:11) 
    [GCC Apple LLVM 12.0.5 (clang-1205.0.19.59.6) [+internal-os, ptrauth-isa=deploy] on darwin
    Type "help", "copyright", "credits" or "license" for more information.
    >>>


> they are (understandably) refusing to ship an up-to-date global python interpreter with macOS

What is the understandable reasoning?


It's pretty common to have kept `/usr/bin/python` as Python2. This is the most conservative option and won't break anything that's already running. Anyone who is using Python currently will have already installed Python 3 (or knows how to do this), and is probably using `#!/usr/bin/env python3`. As an OS vendor with many (many...) systems in the wild, you don't want to switch Python versions on people unexpectedly. Especially when it's still pretty painless to keep python2 around even if there is an installed python3.

For anyone that's still using python2 (as a nearby comment mentioned), there is now at least a warning to avoid using the default Python 2.7 (/usr/bin/python). For an OS vendor, this transition takes a long time... but at least it's in process.


Oh, sure, it's definitely not a good idea to change what the unqualified python binary refers to. But I interpreted "refusing to ship an up-to-date global python interpreter with macOS" as meaning that Python 3 isn't shipped at all (with the binary name "python3").

Is that the case? If so, I still don't understand the reasoning.


I don’t think anything in macOS (the base OS) requires Python, does it? If not, then, I don’t see why Apple would want to ship any Python with the OS. Let people install the Python version they want and then the users will have to maintain it.


The comment I asked the question of said:

> the problem is apple. they are (understandably) refusing to ship an up-to-date global python interpreter with macOS

I.e. the claim is that it's causing a problem for users. I don't have an opinion about this point, but it is the premise of my question.


Because Python 2 has been unsupported for more than a year.


No such problems on Linux, which is free and open-source. I think 'open-source' is the key.


I think you mean you don't have such problems on Linux; there are plenty of people who trip over the fact that the default Python on RHEL <= 7 was comically old and unintuitive to update...

https://www.ibm.com/support/pages/work-around-frustrating-py...

https://unix.stackexchange.com/questions/468620/how-to-chang...


You are comparing apples and oranges, obviously. Perhaps some people are just unconsciously too enamored with apples.

OP's problem was not configuring default Python version as in your second link. Your first link is about some IBM stuff, and IBM is proprietary, so no comments there.

Neither was OP's problem about setting up Python on an old version of the Mac OS.

In any case, on old RHEL, you can just do something like this: https://www.2daygeek.com/install-python-3-on-centos-6/

Installing Python on Linux is much easier than you're trying to say.

For the love of apples.


Configuring the default python, also referred to as installing a new Python, is exactly what OP was doing.

The process in the supplied link, which is the same as the process you weirdly called IBM-proprietary even though it has nothing to do with IBM, is essentially the same as the process on macOS. Realize that the OS Python is not what you want, enable a non-default package repository with a few shell commands, install Python.

Prejudice about anything to do with Apple aside, glad to see we agree that RHEL and MacOS have similar install processes.


Python dependency management is rubbish, and many Python projects have incomplete or missing requirements.txt files. I dislike Python as well (why on earth is whitespace relevant??), but unfortunately if you want to tinker with any of the newest deep-learning projects, you'll need Python.

I use Anaconda for package management on Ubuntu, which helps a lot, but it's still not my preferred programming language.


> why on earth is whitespace relevant??

To eliminate the need for braces and semicolons, of course. It's part of what makes python so easy to read and write. Is it really such a problem?

I honestly can't imagine disliking Python. It's my go-to for anything where performance doesn't matter, due to its unmatched ergonomics and vast libraries. Golang comes close, but for some reason having to implement really common and basic things yourself is part of the language's design philosophy. Plus it's compiled, so you can't just drop into the interpreter for a slick one-liner.

What's your preferred quick and dirty scripting/programming langage?


I prefer Ruby for quick scripting - in my opinion it beats Python in readability (sugarcoating and hiding the guts, like not needing self littered everywhere), and it seems to have more logical consistency in its calls (e.g. array.append(element) but len(array) in Python vs. array.append(element) and array.length in Ruby, amongst many other things).

It's definitely a preference, and Python has a lot of library support that the Ruby community doesn't (yet, hopefully), but grandparent's comment about headaches setting up Python with a simple project are pretty common.

Rarely do I encounter a Python project that "just works" - if your setup isn't exactly the same as the repo owner's (and often even they don't know why their setup is the way it is), then there's often a lot of fiddling and package adjustment needed. Not a deal breaker normally, but enough to make it more painful to quickly drop in and start experimenting.


> bpo-44590: All necessary data for executing a Python function (local variables, stack, etc) is now kept in a per-thread stack. Frame objects are lazily allocated on demand. This increases performance by about 7% on the standard benchmark suite. Introspection and debugging are unaffected as frame objects are always available when needed. Patch by Mark Shannon

That’s pretty massive! My impression of CPython was there were few performance improvements globally since 3. I’m having trouble finding benchmarks for all the minor versions.


7% is massive for C, nothing for Python, and the benchmark suite is unreliable.

I'd wait for independent measurements in real world applications. Python core developers are very good at writing bloated benchmark suites that are so convoluted that no one knows what is going on.

For more than two decades, big projects like "The need for speed" and pythonspeed.com have popped up periodically without anything happening. But apparently that's enough to get folks excited.


The new error messages look amazing.


Love seeing all the optimizations. Is this largely the new Microsoft work?


I noticed on https://www.python.org/downloads/ I only see up to Python 3.9. Where are the 3.10 and 3.11 downloads? What's the best way to install Python?


3.10 will be released tomorrow. 3.11 is still very much work in progress and is expected to drop in 2022.


Thanks, this is what I was looking for. Don't know why I was down-voted for asking.



