Hacker News

Python 3 progress so far:

  [x] Async.
  [x] Optional static typing.
  [x] Threading.
  [ ] JIT.
  [ ] Efficient dependency management.



Not sure what this list means; there are successful languages without these features. Also, Python 3.13 [1] has an optional JIT [2], disabled by default.

[1] https://docs.python.org/3.13/whatsnew/3.13.html

[2] https://peps.python.org/pep-0744/


The successful languages without efficient dependency management are painful to manage dependencies in, though. I think Python should be shooting for a better package management user experience than C++.


If Python's dependency management is better than anything, it's better than C++'s. Python has pip and venv. C++ has nothing (you could say less than nothing, since you also have ample opportunity for inconsistent builds due to mismatched #defines, plus using the wrong binaries for your .h files and nothing remotely like type-safe linkage to mitigate human error. It also has an infinite number of build systems, where each pile of makefiles or CMake files is its own build system with its own conventions and features). In fact, Python is the best dependency management system for C++ code, when you can get binaries built from C++ via pip install...


> If Python's dependency management is better than anything, it's better than C++'s.

That’s like the lowest possible bar to clear.


Agreed, but that was the bar set by the comment I was replying to, which claimed Python doesn't clear it.


On bro


C++ has apt-get etc. because the libraries do not change all the time. Also, of course there are vcpkg and conan.

Whenever you try to build something via pip, the build will invariably fail. The days when NumPy built from source off PyPI are long over. In fact, at least 50% of attempted package builds fail.

The alternative, binary wheels, is flaky.


> C++ has apt-get

That's not a development dependency manager. System package management is a different kind of issue, even if there's a bit of overlap.

> because the libraries do not change all the time

That's not true in practice. Spend enough time with larger projects or do some software packaging and you'll learn that the pain is everywhere.


That was the entire point, that C++ is the absolute worst.


I pip3 installed something today. It didn’t work, at all.

I then yum installed a lib and headers, it worked well.

C++ on an msft platform is the worst. I can't speak for Mac. C++ on Linux is quite pleasant. Feels like most comments like yours are biased for unstated reasons.


This has nothing to do with languages. You can yum install python packages and expect them to work fine. You can install C++ files using an actual dependency manager like vcpkg or conan and have issues.

You're pointing out differences between software package management styles, not languages.


C++ on Linux is indeed pleasant if you use only distro-provided library versions. Some specialized library with a specific version is also no big deal. Need an upgraded version of a widely-used library? Get containerized or prepare for real pain.


If I had a penny for every time I gave up on compiling C++ software because there's no way to know what dependencies it needs, I'd be a millionaire. Python at least lists them.


Is that because the compiler failed with "foo.h not found" or the build system said "libfoo not found"? CMake is most common and it will tell you. Worst case it's difficult to derive the package name from the name in the diagnostic.

It's not great, but usually not a big deal either, IME. Typically a couple of minutes to e.g. find that required libSDL2 addon module or whatever, if there is that kind of problem at all.


Yes it is, and it's usually such a big deal for me that I just don't use that software. I don't have time to go through a loop of "what's the file name? What package is it in? Install, repeat". This is by far the worst experience I've had with any language. Python has been a breeze in comparison.


I’m not going to refute your points. If you’re going to wear rose-tinted glasses about all of the bad parts about python, that’s fine, I also like python.


What's rose-tinted about "one of them downloads dependencies automatically, the other one doesn't"?


I mean this is a documentation problem. It's pretty common for python to import something it doesn't say it depends on too, btw...


If I had a penny every time I heard something like that on sites like this, I’d be a billionaire :)


Have you tried vcpkg on msft (it works on Linux and Mac too, btw)? I found it to be much better than the pip3 and venv nonsense.


Mac has Homebrew, which is sort of like apt-get or yum.


Python's dependency management sucks because they're audacious enough to attempt packaging non-python dependencies. People always bring Maven up as a system that got it right, but Maven only does JVM things.

I think the real solution here is to just only use python dependency management for python things and to use something like nix for everything else.


Julia's package manager (for one) works great and can manage non-Julia packages. The problem with Python's system is that rejecting semver makes writing a package manager basically impossible, since there is no way to automatically resolve packages.


Could you clarify what you mean? pip and every other Python package installer absolutely do automatic package resolution, and the standard (PEP 440) dependency operators include a compatible-release operator (~=) that's predicated on SemVer-style version behavior.
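
For illustration, here is a rough sketch of what PEP 440's compatible-release operator expands to (compatible_bounds is a made-up helper for this comment, not pip's actual resolver code):

```python
# Rough sketch of PEP 440's compatible-release operator: "~= X.Y" behaves
# like ">= X.Y, == X.*" -- the final release segment is allowed to vary.
def compatible_bounds(spec: str) -> tuple[str, str]:
    version = spec.removeprefix("~=").strip()
    parts = version.split(".")
    prefix = ".".join(parts[:-1])  # everything but the last segment
    return (f">={version}", f"=={prefix}.*")

print(compatible_bounds("~=2.2"))    # ('>=2.2', '==2.*')
print(compatible_bounds("~=1.4.5"))  # ('>=1.4.5', '==1.4.*')
```

So "~= 1.4.5" accepts 1.4.6 but not 1.5.0, which is exactly the SemVer-style "compatible" guarantee.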


There is no way to manage non-hosted dependencies, though, in a cross-platform way. Something attempting it is often worse than nothing, on a distro that has different assumptions — e.g. every package manager that downloads a dynamic executable will fail on NixOS, and gives no easy way to hook into how those get executed.


I agree that attempting it is worse than nothing, because you now have expectations that may fail at awkward times. But they've gone and done it, so here we are.

NixOS is a stark contrast to Python here. It makes things that can't be done deterministically difficult to do at all. Maybe this sounds extreme from the outside, but I'd rather be warned off from that dependency as soon as I attempt to use it, rather than years later when I get a user or contributor that can't make it work for some environmental reason I didn't foresee, and now everything rests on finding some hacky way to make it work.

If Nix can be used to solve Python's packaging problems, participating packages will have to practice the same kind of avoidance (or put in the work to fix such hazards up front). I'm not sure if the wider python community is willing to do that, but as someone who writes a lot of python myself and wants it to not be painful down the line, I am.


This is what we used to have and it was much worse. Source: lived that life 10-15 y ago.


15y ago I was using apt-get to manage my c++ dependencies with no way of keeping track of which dependency went with which project. It was indeed pretty awful.

Now when I cd into a project, direnv + nix notices the dependencies that that project needs and makes them available, whatever their language. When I cd into a different project, I get an entirely different set of dependencies. There's pretty much nothing installed with system scope. Just a shell and an editor.

Both of these are language agnostic, but the level of encapsulation is quite different, and one is much better than the other. (There are still plenty of problems, but they can be fixed with a commit instead of a change of habit.)

The idea that every language needs a different package manager, and that each of those needs to package everything that might be useful when called from that language whether or not it is written in that language... it just doesn't scale.


Valid points. I was also thinking about apt install-ing some C library, trying to pip install a package, having it fail to build, looking for headers/debug packages in apt, setting LD_LIBRARY… you know. It hurts, a lot.

And yet, python is a fantastic language because it’s the remote control to do the heavy, complex, low-level, high-performance stuff with relative ease.


I agree. I want to help save python from its packaging problems because it has so much traction out there (for good reason). People in this forum might be ok jumping ship to rust or whatever but most python users are students, scientists, business types... They're not going to learn rust and I don't think we benefit from leaving them behind.

I wish there was a standard interface that tools like pip could use to express their non-python needs such that some other tool can meet those needs and then hand the baton back to pip once they are met. Poetry2nix is an example of such a collaboration. (I'm not trying to be a nix maximalist here, it's just that it's familiar).

The python community is large enough to attempt to go it alone, but many other language communities are not. I think we'd see faster language evolution if we asked less of them from a packaging perspective:

> focus on making the language great. Provide a way to package deps in that language, and use _____ to ask for everything else.

Apt and brew and snap and whatever else (docker? k8s?) could be taught to handle such requests in a non-alter-your-whole-system kind of way.


Not sure this is still a valid criticism of Python in 2024.

Between pip, poetry and pyproject.toml, things are now quite good IMHO.


I guess that depends on your perspective. I'm not a Python developer, but like many people I do want to run Python programs from time to time.

I don't really know Rust, or Cargo, but I never have trouble building any Rust program: "cargo build [--release]" is all I need to know. Easy. Even many C programs are actually quite easy: "./configure", "make", and optionally "make install". "./configure" has a nice "--help". There is a lot to be said about the ugly generated autotools soup, but the UX for people just wanting to build/run it without in-depth knowledge of the system is actually quite decent. cmake is a regression here.

With Python, "pip install" gives me an entire screen full of errors about venv and "externally managed" and whatnot. I don't care. I just want to run it. I don't want a bunch of venvs, I just want to install or run the damn program. I've taken to just using "pip install --break-system-packages", which installs to ~/.local. It works, shrug.

Last time I wanted to just run a project with a few small modifications, I had a hard time. I ended up just editing ~/.local/lib/python/[...] Again, it worked, so whatever.

All of this is really where Python and some other languages/build systems fail. Many people running this are not $language_x programmers or experts, and I don't want to read up on every system I come across. That's not a reasonable demand.

Any system that doesn't allow non-users of that language to use it in simple easy steps needs work. Python's system is one such system.


"I don't want a bunch of venvs"

That's your problem right there.

Virtual environments are the Python ecosystem's solution to the problem of wanting to install different things on the same machine that have different conflicting requirements.

If you refuse to use virtual environments and you install more than one separate Python project you're going to run into conflicting requirements and it's going to suck.

Have you tried pipx? If you're just installing Python tools (and not hacking on them yourself) it's fantastic - it manages separate virtual environments for each of your installations without you having to think about them (or even know what a virtual environment is).


Managing a farm of virtualenvs and mucking about with my PATH doesn't address the user-installable problem at all. And it seems there's a new tool to try every few months that really will fix all problems this time.

And maybe if you're a Python developer working on the code every day that's all brilliant. But most people aren't Python developers, and I just want to try that "Show HN" project or whatnot.

Give me a single command I can run. Always. For any project. And that always works. If you don't have that then your build system needs work.


"Give me a single command I can run. Always. For any project. And that always works."

    pipx install X


Right, so I'll try that next time. Thanks. I just go by the very prominent "pip install X" on every PyPI page (as well as "pip install .." in many READMEs).


Yeah, totally understand that - pipx is still pretty poorly known by people who are active in Python development!

A few of my READMEs start like this: https://github.com/simonw/paginate-json?tab=readme-ov-file#i...

    ## Installation

    pip install paginate-json

    Or use pipx (link to pipx site)

    pipx install paginate-json

But I checked and actually most of them still don't even mention it. I'll be fixing that in the future.


Out of the 3 things I wanted to install, 2 don't work. Both of these seem to be bugs in pipx, so I reported one, but the feedback was borderline hostile and it ended up being closed with "unclear what you want". I'm not even going to bother reporting the other bug.

So whatever the goals are, it doesn't really work. And in general pipx does not strike me as a serious project.


Pipx is great! Although, I always seem to have to set up PATH, at least on windows?


I could say the exact same stuff about Node.js, C++, Go, Rust, PHP, etc. All of these are easy to use, debug, and "install easily" when you know them and use them regularly, and the opposite if you're new. Doubly so if you personally don't like that language or have some personal pet peeve about its choices.

Guys, let's not pretend this is somehow unique to Python. Until just a few years ago it was incredibly difficult to install and use npm on Windows. Arguably the language ecosystem with the most cumulative hipster-dev hours thrown at it, and it still was a horrible "dev experience".


That does not match my experience. I've been working with Python for a year or so and the packaging problems come up every now and then still.

I've installed/built a few packages written in Go and Rust specifically and had no problems.


That single command is pipx.


Python's venvs are a problem masquerading as a solution to the dependency problem. Consider the following: it is not possible to relocate venvs. In what universe does this make sense? Imagine a C++ or Rust binary that would only run when placed in /home/simonw/.


Normal users who just want to run some code shouldn't need to learn why they need a venv or any of its alternatives. Normal users just want to download a package and run some code without having to think about interfering with other packages. Many programming languages' package managers give them that UX, and you can't blame them for expecting the same from Python. The added step of having to think about venvs with Python is not good. It is a non-trivial system that every single Python user is forced to learn, understand, and then continually remember every time they switch from one project to another.


This is correct. The whole application installation process, including the creation of a venv, installing stuff into it, and registering it with some on-PATH launcher should be one command.

BTW pyenv comes relatively close.


I agree with that. Until we solve that larger problem, people need to learn to use virtual environments, or at least learn to install Python tools using pipx.


    sudo apt install pipx
    pipx install package_name

Takes care of the venv, and the script/app path is added to the system PATH.


I reject the virtual environments and have no issues. On an untrusted machine (see e.g. the recent token leak):

  /a/bin/python3 -m pip install foo
  /b/bin/python3 -m pip install bar

The whole venv thing is overblown, but a fertile source for blogs and discussions. If C extensions link to installed libraries in site-packages, of course they should use RPATH.


This is mostly a curse of Python’s popularity. The reason you can’t pip install with system Python is that it can break things, and when your system is relying on Python to run various tools, that can’t be allowed. No one (sane) is building OS-level scripts with Node.

The simplest answer, IMO, is to download the Python source code, build it, and then run make altinstall. It'll install in parallel with the system Python, and you can then alias the new executable path so you no longer have to think about it. Assuming you already have gcc's toolchain installed, it takes roughly 10-15 minutes to build. Not a big deal.


It's more probable that you are trying to install the deps into the system Python. Using pip install xxxxx --user will install them in your user directory rather than the system. I'm pretty sure modern Ubuntu warns you against doing that now anyway.

If you're installing for a small script, then doing python -m venv little_project in your home dir is straightforward; just activate it afterwards [1]

I'm using Rye[2] now and it's very similar to Rust's Cargo: it wraps a bunch of the standard toolchain and manages standalone Python versions in the background, so it doesn't fall into the trap of Linux system-Python issues.

[1] https://docs.python.org/3/library/venv.html [2] https://rye.astral.sh/


Maybe I am biased, because I learned these things so long ago that I don't realize it's a pain to learn. But what exactly is so confusing about virtualenvs?

They're really not that different from any other packaging system like JS's or Rust's. The only difference is that instead of relying on your current directory to find the libraries/binaries (and thus requiring you to wrap binary calls with some wrapper to search a specific path), they rely on you sourcing an `activate` script. That's really just it.

Create a Virtualenv:

    $ virtualenv myenv

Activate it, now it is added to your $PATH:

    $ . myenv/bin/activate

There really is nothing more in the normal case.

If you don't want to have to remember it, create a global virtualenv somewhere, source its activate script in your .bashrc, and forget it ever existed.


Only Python demands that you source an activation script before doing anything.


Yes, though just to illustrate that it's a matter of taste: I do prefer the solution of virtualenvs requiring you to source a script that appends to PATH, rather than a solution requiring a wrapper that executes things in its own PATH.

I never remember how to run JavaScript binaries. Is it npm run? npm run script? npx? I always end up running the links in node_modules/.bin


Do you have a problem with Node.js too because it creates a node_modules folder, or is the problem that it is not handled automatically?


I don't care about the internals. I care about "just" being able to run it.

I find that most JS projects work fairly well: "npm install" maybe followed by "npm run build" or the like. This isn't enforced by npm and I don't think npm is perfect here, but practically speaking, as a non-JS dev just wanting to run some JS projects, it works fairly well for almost all JS projects I've wanted to run in the last five years or so.

A "run_me.py" that would *Just Work™" is fine. I don't overly care what it does internally as long as it's not hugely slow or depends on anything other than "python". Ideally this should be consistent throughout the ecosystem.

To be honest I can't imagine shipping any project intended to be run by users and not have a simple, fool-proof, and low-effort way of running it by anyone of any skill level, which doesn't depend on any real knowledge of the language.


> To be honest I can't imagine shipping any project intended to be run by users and not have a simple, fool-proof, and low-effort way of running it by anyone of any skill level, which doesn't depend on any real knowledge of the language.

This is how we got GH Issues full of inane comments, and blogs from mediocre devs recommending things they know nothing about.

I see nothing wrong with not catering to the lowest common denominator.


Like people with actual lives to live and useful stuff to do that's not learning about and hand-holding a dozen different half-baked build systems.

But sure, keep up the cynical illusion that everyone is an idiot if that's what you need to go through life.


I didn’t say that everyone is an idiot. I implied that gatekeeping is useful as a first pass against people who are unlikely to have the drive to keep going when they experience difficulty.

When I was a kid, docs were literally a book. If you asked for help and didn’t cite what you had already tried / read, you’d be told to RTFM.

Python has several problems. Its relative import system is deranged, packaging is a mess, and yes, on its face needing to run a parallel copy of the interpreter to pip install something is absurd. I still love it. It’s baked into every *nix distro, a REPL is a command away, and its syntax is intuitive.

I maintain that the relative ease of JS – and more powerfully, Node – has created a monstrous ecosystem of poorly written software, with its adherents jumping to the latest shiny every few months because this time, it’s different. And I _like_ JS (as a frontend language).


This is the truth right here. The issues are with people using (not officially) deprecated tools and workflows, plus various half baked scripts that solved some narrow use cases.


All is well, then, one day, you have to update one library.

Some days later, in some woods or cave, people will hear your screams of rage and despair.


Been using python for 15 years now, and these screams were never heard.

Dev/test with relaxed pip installs, freeze deployment dependencies with pip freeze/pip-tools/poetry/whateveryoulike, and what's the problem?


Same here. Been using Python/pip for 10+ years and this was never a problem. In the Java world there is JAR hell, but it was never a crippling issue, just a minor annoyance once a year or so.

In general, is dependency management such a massive problem as it's made out to be on HN? Maybe people here are doing far more complex/different things than I've done in the past 20 years.


Guessing that folks who write such things are lacking sysad skills like manipulating paths, etc.

It does take Python expertise to fix other issues on occasion, but they are fixable. That's why I think flags like pip's --break-system-packages are silly: it's an optimization for non-users over experienced ones.


Deps in CPython are more about the .so/.dll problem; not much can be done, since that stuff happens outside Python itself.


The shitshow that is python tooling is one of the reasons I prefer java jobs to python jobs when I can help it. Java got this pretty right years and years and years earlier. Why are python and javascript continuing to horse around playing games?


Optional static typing, not really. Those type hints are not used at runtime for performance. Type hint a var as a string, then set it to an int: that code is still going to execute.


> Those type hints are not used at runtime for performance.

This is not a requirement for a language to be statically typed. Static typing is about catching type errors before the code is run.

> Type hint a var as a string then set it to an int, that code still gonna try to execute.

But it will fail type checking, no?


The critique is that "static typing" is not really the right term to use, even if preceded by "optional". "Type hinting" or "gradual typing" maybe.

In static typing the types of variables don't change during execution.


If there’s any checking of types before program runs then it’s static typing. Gradual typing is a form of static typing that allows you to apply static types to only part of the code.

I’m not sure what you mean by variables not changing types during execution in statically typed languages. In many statically typed languages variables don’t exist at runtime, they get mapped to registers or stack operations. Variables only exist at runtime in languages that have interpreters.

Aside from that, many statically typed languages have a way to declare dynamically typed variables, e.g. the dynamic keyword in C#. Or they have a way to declare a variable of a top type e.g. Object and then downcast.


'dynamic' in C# is considered a design mistake and pretty much no codebase uses it.

On the other hand F# is much closer to the kind of gradual typing you are discussing.


Python is dynamically typed because it type-checks at runtime, regardless of annotations or what mypy said.


Is Java dynamically typed as well, then? It erases generics, so only some of the code is type-checked; at what level is it statically/dynamically typed?


Yeah Java is partially dynamically typed. I think we can safely conclude that all practical languages are hybrids where some of the type checking is done by the compiler and some of the type checking is deferred to the runtime.

Then when we call a language statically typed we mean most of the type checking is usually done statically. Dynamic type checking is the exception, not the rule.

Even in dynamically typed languages like Python, some of the type checking may be done by an optimizer in the compilation stage. The runtime type check guarding some operations may be removed, because the optimizer decides it knows the types of the values involved once and for all.


That’s not really “the definition”. Different type systems can express different properties about objects, and there are absolutely cases that something changes about a type.

E.g. in structural typing, adding a new field will change the type to a subtype. Will that make any structurally typed language non-static?


Efficient dependency management is coming; the good people of Astral will take care of that with the uv-backed version of Rye (initially created by Armin Ronacher, with inspiration from Cargo). I'm really confident it'll be good, just like ruff and uv were.


Rye's habit of insisting on creating a .venv per project is a deal-breaker. I don't want .venvs spread all over my projects eating up space (made worse by the ML/LLM-related mega-packages). It should at least respect activated venvs.


A venv per project is a very sane approach. Put them in your ignore file. Hopefully they could also live elsewhere in the tree.


Well, that's good for you, but you're in the minority, and Rye will end up being a standard anyway, just like uv and ruff, because they're just so much better than the alternatives.


I think uv's use of a global cache means that having several .venv with the same packages is less of a problem.


I don't get how this optional static typing works. I had a quick look at [1], and it begins with a note saying that Python's runtime doesn't enforce types, leaving the impression that you need third-party tools to do actual type checking. But then it continues as if Python itself did the checking. Consider that I'm not a Python programmer, but the main reason I stay away from it is the lack of a proper type system. If this is going to change, I might reconsider it.

[1] https://docs.python.org/3/library/typing.html


The parser supports the type hint syntax, and the standard library provides various type hint related objects.

So you can do things like “from typing import Optional” to bring Optional into scope, and then annotate a function with -> Optional[int] to indicate it returns None or an int.

Unlike a system using special comments for type hints, the interpreter will complain if you make a typo in the word Optional or don’t bring it into scope.

But the interpreter doesn’t do anything else; if you actually return a string from that annotated function it won’t complain.

You need an external third party tool like MyPy or Pyre to consume the hint information and produce warnings.

In practice it’s quite usable, so long as you have CI enforcing the type system. You can gradually add types to an existing code base, and IDEs can use the hint information to support code navigation and error highlighting.
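
For a concrete sketch of what that looks like (default_port and bad are made-up illustration functions, not from any real library):

```python
from typing import Optional

def default_port(scheme: str) -> Optional[int]:
    """Return a well-known port, or None for unrecognized schemes."""
    return {"http": 80, "https": 443}.get(scheme)

print(default_port("http"))    # 80
print(default_port("gopher"))  # None

# The interpreter happily runs this despite the bad return type; only an
# external checker such as mypy or pyright would report an error here.
def bad() -> Optional[int]:
    return "oops"

print(bad())  # oops -- no runtime complaint
```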


> In practice it’s quite usable

It would be super helpful if the interpreter had a type-enforcing mode though. All the various external runtime enforcement packages leave something to be desired.


I agree. There are usable third-party runtime type checkers though. I like Beartype, which lets you add a decorator @beartype above any function or method, and it’ll complain at runtime if arguments or return values violate the type hints.

I think runtime type checking is in some ways a better fit for a highly dynamic language like Python than static type checking, although both are useful.


At MPOW most Python code is well-type-hinted, and mypy and pyright are very helpful at finding issues, and also for stuff like code completion and navigation, e.g. "go to the definition of the type of this variable".

Works pretty efficiently.

BTW, TypeScript also does not enforce types at runtime. Heck, C++ does not enforce types at runtime either. That does not mean their static typing systems don't help during development.


> BTW, TypeScript also does not enforce types at runtime. Heck, C++ does not enforce types at runtime either. That does not mean their static typing systems don't help during development.

Speaking of C here, as I don't have web development experience. The static type system does help, but in this case it's the compiler doing the check at compile time to spare you many surprises at runtime. And it's part of the language's standard. Python itself doesn't do that. Good that you can use external tools, but I would prefer it if this were part of Python's spec.

Edit: these days I'm thinking of having a look at Mojo, it seems to do what I would like from Python.


https://github.com/python/mypy

> Python itself doesn't do that

The type syntax is Python. MyPy is part of the Python ecosystem; it's maintained under the python GitHub organization. MyPy is not part of CPython because modularity is good, the same way that ANSI C doesn't compile anything: that's what gcc, clang, etc. are for.

Mojo is literally exactly the same way, the types are optional, and the tooling handles type checking and compilation.


> Mojo is literally exactly the same way.

No, because in Mojo, type checking is part of the language specification: you need no external tool for that. Python defines a syntax that can be used for type checking, but you need an external tool to do that. GCC does type checking because it's defined in the language specification. You would have a situation analogous to Python only if you needed GCC + some other tool for type checking. This isn't the case.


You're really splitting hairs here, all to prove some sort of "type checking isn't included with Python" property. Even if you're technically right, half the world really doesn't care, and most code being churned out in Python is type-hinted, type-checked, and de facto enforced at design time.

It's honestly winning the long-term war, because traditional languages have really screwed things up with infinite and contrived language constructs just to satisfy some "language spec" and "compiler", while still trying to be expressive enough for what developers need and want to do safely. Python side-stepped all of that, has the perfect mix of type-checking and expressiveness, and is currently proving that it's the way forward, with no stopping it.


I'm not saying that no one should use Python, I'm just saying why I don't like it. But if you instead like it I will not try to stop you using it.

This said, if most people use type hints and the proper tooling to enforce type checking, I would say this would be a good reason to properly integrate (optional) static typing in the language: it shows that most programmers like static typing. The problem I focused on in my example isn't the only advantage of a type system.


Third party tools (mypy, pyright, etc) are expected to check types. cpython itself does not. This will run just fine:

  python -c "x: int = 'not_an_int'"

My opinion is that with PEP 695 landing in Python 3.12, the type system itself is starting to feel robust.

These days, the python ecosystem's key packages all tend to have extensive type hints.

The type checkers are of varying quality; my experience is that pyright is fast and correct, while mypy (not having the backing of a Microsoft) is slower and lags on features a little bit -- for instance, mypy still hasn't finalized support for PEP 695 syntax.
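For illustration, here is a generic function in the pre-3.12 `TypeVar` spelling, with the newer PEP 695 form shown in a comment (the function name is invented):

```python
from typing import TypeVar

T = TypeVar("T")

def first(items: list[T]) -> T:
    # On Python 3.12+, PEP 695 lets you drop the TypeVar boilerplate:
    #   def first[T](items: list[T]) -> T: ...
    return items[0]

print(first([1, 2, 3]))   # 1
print(first(["a", "b"]))  # a
```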


Optional static typing is basically a comment (the real term is annotation) on the input variable(s) and return value. No optimization is performed. Running a tool such as mypy in a CI/CD process technically enforces types, but the interpreter ignores the annotations entirely, unless they contain a syntax error.


A language server in your IDE kicks in much earlier, and is even more helpful.


I haven't used an IDE that has that but it is still just giving you a hint that there is an error and the interpreter is not throwing an error which was my point.


> I haven't used an IDE that has that

You don’t need an IDE for this, an LSP plugin + Pyright is sufficient to get live type checking. For instance, Emacs (Eglot), Vim (ALE), Sublime (SublimeLSP) all support Pyright with nearly no setup required.


That's true of most compiled languages. Unless we are talking about asserts, reflection, I think type erasure, and maybe a few other concepts, language runtimes don't check types. C does not check types at runtime. You compile it and then rely on control of invariants and data flow to keep everything on rails. In python, this is tricky because everything is behind at least one layer of indirection, and thus virtually everything is mutable, so it's hard to enforce total control of all data structures. But you can get really close with modern tooling.


>> and the interpreter is not throwing an error which was my point.

> That's true of most compiled languages

True of most statically typed languages (usually no need to check at runtime), but not true in Python or other dynamically typed languages. Python would have been unusable for decades (prior to typehints) if that was true.


That's just reflection. That's a feature of code, not the language runtime. I think there are some languages which in fact have type checking in the runtime as a bona fide feature. Most won't check anything unless you do something like isinstance().
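A minimal sketch of that distinction: runtime checks happen only where the code explicitly asks for them (the `double` function here is invented):

```python
def double(x):
    # The interpreter itself would accept any x here; this check is
    # plain reflection that the code opts into.
    if not isinstance(x, (int, float)):
        raise TypeError(f"expected a number, got {type(x).__name__}")
    return x * 2

print(double(21))  # 42
try:
    double("21")
except TypeError as exc:
    print(exc)     # expected a number, got str
```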


> "I haven't used an IDE that has that but it is still just giving you a hint that there is an error and the interpreter is not throwing an error which was my point."

At this point, I'm not sure how one is to take your opinion on this matter. Just like me coding some C# or Java in notepad and then opining to a Java developer audience about the state of their language and ecosystem.


Nope. Type annotations can be executed and accessed by the runtime. That's how things like Pydantic, msgspec, etc, do runtime type enforcement and coercion.

There are also multiple compilers (mypyc, nuitka, others I forget) which take advantage of types to compile python to machine code.
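A sketch of what that runtime access looks like; the `enforce` helper below is a toy stand-in for what libraries like Pydantic do, not their actual API:

```python
from typing import get_type_hints

def greet(name: str, times: int) -> str:
    return " ".join(["hello", name] * times)

# Annotations are ordinary runtime objects (a mapping from parameter
# names to type objects), not comments:
print(get_type_hints(greet))

def enforce(func, *args):
    # Toy validator: compare each positional argument against the
    # declared annotation before calling. Real libraries handle far
    # more (generics, coercion, nested models).
    hints = [(p, t) for p, t in get_type_hints(func).items() if p != "return"]
    for value, (param, expected) in zip(args, hints):
        if not isinstance(value, expected):
            raise TypeError(f"{param} should be {expected.__name__}")
    return func(*args)

print(enforce(greet, "world", 2))  # hello world hello world
```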


The interpreter does not and probably never will check types. The annotations are treated as effectively meaningless at runtime. External tools like mypy can be run over your code and check them.


It checks types... it just doesn't check type annotations.

Just try:

  $ python
  >>> 1 + '3'


Python's typing must accommodate Python's other goal as a quick scripting language. The solution is to document the optional typing system as part of the language's spec and let other tools do the checking, if a user wants to use them.

The other tools are trivially easy to set up and run (or let your IDE run for you.) As in, one command to install, one command to run. It's an elegant compromise that brings something that's sorely needed to Python, and users will spend more time loading the typing spec in their browser than they will installing the type checker.


I think static typing is a waste of time, but given that you want it, I can see why you wouldn't want to use Python. Its type-checking is more half-baked and cumbersome than other languages, even TS.


I used to think like that until I tried.

There are areas where typing is more important: public interfaces. You don't have to make every piece of your program well-typed. But signatures of your public functions / methods matter a lot, and from them types of many internal things can be inferred.

If your code has a well-typed interface, it's pleasant to work with. If interfaces of the libraries you use are well-typed, you have easier time writing your code (that interacts with them). Eventually you type more and more code you write and alter, and keep reaping the benefits.
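A small sketch of that approach (the names are invented): annotate the public boundary and let inference handle the inside.

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int

def parse_user(record: dict[str, str]) -> User:
    # Only the boundary is annotated; a checker infers the types of
    # the locals below from the signature.
    name = record["name"]
    age = int(record["age"])
    return User(name=name, age=age)

print(parse_user({"name": "Ada", "age": "36"}))
```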


This was the thing that started to bring me around to optional typing as well. It makes the most sense to me as a form of documentation - it's really useful to know what types are expected (and returned) by a Python function!

If that's baked into the code itself, your text editor can show inline information - which saves you from having to go and look at the documentation yourself.

I've started trying to add types to my libraries that expose a public API now. I think it's worth the extra effort just for the documentation benefit it provides.


This is what made me give it a shot in TS, but the problem is your types at interface boundaries tend to be annoyingly complex. The other problem is any project with optional types soon becomes a project with required types everywhere.

There might be more merit in widely-used public libraries, though. I don't make those.


I shouldn't have said it's a waste of time period, cause every project I work on does have static typing in two very important places: the RPC or web API (OpenAPI, gRPC, whatever it is), and the relational database. But not in the main JS or Py code. That's all I've ever needed.

I did try migrating a NodeJS backend to TS along with a teammate driving that effort. The type-checking never ended up catching any bugs, and the extra time we spent on that stuff could've gone into better testing instead. So it actually made things more dangerous.


Typescript is pretty much the gold standard, it’s amazing how much JavaScript madness you can work around just on the typechecking level.

IMHO Python should shamelessly steal as much of TypeScript's typing as possible. It's tough, since the Microsoft TypeScript team is apparently amazing at what they do, so for now it's a very fast-moving target, but some day…


Well, the TS tooling is more painful in ways. It's not compatible with some stuff like the NodeJS profiler. Also there's some mess around modules vs "require" syntax that I don't fully understand, but TS somehow plays a role in it.


A type checker is only going to add limited value if you don't put the effort in yourself. If everything string-like is just a string, and if data is not parsed into types that maintain invariants, then little is being constrained and there is little to "check". It becomes increasingly difficult the more sophisticated the type system is, but in some statically typed languages like Coq, clever programmers can literally prove the correctness of their program using the type system. Whereas a unit test can only prove the presence of bugs, not their absence.


I instead think that the lack of static typing is a waste of time, since without it you can have programs that waste hours of computation due to an exception that would have been prevented by a proper type system ;)


python will never be "properly typed"

what it has is "type hints", which is a way to have richer integration with type checkers and your IDE, but will never offer more than that as is


> what it has is "type hints", which is a way to have richer integration with type checkers and your IDE, but will never offer more than that as is

Python is strongly typed and its interpreter is aware of the types of its variables, so you're probably overreaching with that statement. Because Python's internals are type aware, folks are able to create type checkers like mypy and validators like pydantic, both written in Python. Maybe you're thinking of TS/JSDoc, which is just window dressing for IDEs to display hints as you described?


I don't think you can say that a language is strongly typed if only the language's internals are. The Python interpreter prevents you from adding an integer to a string, but only at runtime, when in many cases it's already too late. A strongly typed language would warn you much sooner.


Your example is bang on when describing a "strongly typed" language. That said, strongly typed is different from "static typing", which is what you described later in your post. Python is both strongly typed and dynamically typed. It is all rather confusing and just a big bowl of awful. I have to look up if I haven't referenced it in a while, because the names are far too similar and there aren't even good definitions around some of the concepts.

https://stackoverflow.com/questions/2690544/what-is-the-diff...

https://wiki.python.org/moin/Why%20is%20Python%20a%20dynamic...
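The two axes being distinguished here can be demonstrated in a few lines:

```python
# Strong typing: no implicit coercion between unrelated types.
try:
    1 + "3"
except TypeError as exc:
    print(exc)  # unsupported operand type(s) for +: 'int' and 'str'

# Dynamic typing: a name's type is a property of the value it
# currently holds, checked only at runtime.
x = 1
print(type(x).__name__)  # int
x = "one"
print(type(x).__name__)  # str
```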


s/will never be/already is/g

https://github.com/mypyc/mypyc

You can compile python to c. Right now. Compatibility with extensions still needs a bit of work. But you can write extremely strict python.

That's without getting into things like cython.


It is properly typed: it has dynamic types :)


Then we have very different ideas of what proper typing is :D Look at this function, can you tell me what it does?

  def plus(x, y):
    return x+y
If your answer is along the lines of "it returns the sum of x and y", then I would ask you who said that x and y are numbers. If they are strings, it concatenates them. If instead you pass a string and a number, you get a runtime exception. So not only can you not tell what a function does just by looking at it, you can't even know whether the function is correct (in the sense that it will not raise an exception).
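The behaviours described above are easy to verify with a few call sites:

```python
def plus(x, y):
    return x + y

print(plus(1, 2))      # numbers: addition -> 3
print(plus("1", "2"))  # strings: concatenation -> '12'
print(plus([1], [2]))  # lists: concatenation too -> [1, 2]
try:
    plus("1", 2)       # mixed: fails, but only once this line runs
except TypeError as exc:
    print(exc)
```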


It calls x.__add__(y).

Python types are strictly specified, but also dynamic. You don't need static types in order to have strict types, and indeed just because you've got static types (in TS, for example) doesn't mean you have strict types.

A Python string is always a string, nothing is going to magically turn it into a number just because it's a string representation of a number. The same (sadly) can't be said of Javascript.


> It calls x.__add__(y)

Your answer doesn't solve the problem, it just moves it: can you tell me what x.__add__(y) does?


Whatever it's defined to do, and nothing else.

Dynamic typing, but strong typing.

There's no magic going on here, just an attribute lookup. It's still possible to write terrible Python code -- as it is in any language -- and the recommendation is still "don't write terrible code", just as it is in any language. You don't have to like it, but not liking it won't make it any different.

The older I get, the more I like writing statically-typed code. I wrote a lot more Python (for my own use) in my youth, and tend towards Rust nowadays. Speaking of which: if you dislike the dynamic typing of Python then you must hate the static typing of Rust -- what does

    fn add<T:Add<U>, U>(a: T, b: U) -> T::Output { a + b }
do?


Yeah and even with static typing, a string can be many things. Some people even wrap their strings into singleton structs to avoid something like sending a customerId string into a func that wants an orderId string, which I think is overkill. Same with int.
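Python has a lightweight version of that wrapper pattern in typing.NewType; a sketch with invented names:

```python
from typing import NewType

CustomerId = NewType("CustomerId", str)
OrderId = NewType("OrderId", str)

def cancel_order(order: OrderId) -> str:
    return f"cancelled {order}"

print(cancel_order(OrderId("ord-42")))   # fine
# At runtime a NewType is just the underlying str, so this executes,
# but a type checker flags it as passing the wrong kind of id:
print(cancel_order(CustomerId("cus-7")))
```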


In theory it's nice that the compiler would catch those kinds of problems, but in practice it doesn't matter.


It's very hard to write long-running daemons in Python, partly for this reason: a typo in a variable or method name in an uncommon code path only surfaces when that path runs.


It can matter in practice too. Once I was trying some Python ML model to generate images. My script ran for 20 minutes, only to terminate with an exception when it got to saving the result to a file. The reason is that I wanted to concatenate a counter to the file name, but forgot to wrap the integer in a call to str(). 20 minutes wasted on an error that other languages would have spotted before running the script.
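That failure mode takes one line to reproduce, and with an annotated signature a checker flags it before any compute time is spent (save_image here is a hypothetical stand-in for the real save call):

```python
def save_image(path: str) -> None:
    print(f"saving to {path}")

counter = 3
try:
    # Forgetting str() only fails when this line finally executes:
    save_image("result_" + counter + ".png")
except TypeError as exc:
    print(exc)

# The intended call; mypy/pyright would have flagged the line above
# statically, long before a 20-minute run.
save_image("result_" + str(counter) + ".png")
```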


Both Haskell and OCaml can raise exceptions for you, yet most people would say that they are properly typed.

The plus function you wrote is not more confusing than any generic function in a language that supports that.


> you can't tell what a function does just by looking at it

You just did tell us what it does by looking at it, for the 90% case at least. It might be useful to throw two lists in there as well. Throw a custom object in there? It will work if you planned ahead with dunder add and radd. If not, fix, implement, or roll back.


> You just did tell us what it does by looking at it, for the 90% case at least

The problem is that you can't know if the function is going to do what you want it to do without also looking at the context in which it is used. And what you pass as input could be dependent on external factors that you don't control. So I prefer the languages that let me know what happens in 100% of the cases.


Protection from untrusted input is something that has to be considered in any language.

Not yet been a real world concern in my career, outside webforms, which are handled by framework.


> Protection from untrusted input is something that has to be considered in any language

Sure, but some languages make it easier than others. And that was just one example, another example could be having a branch where the input to your function depends on some condition. You could have one of the two branches passing the wrong types, but you will only notice when that branch gets executed.


When is the last time you had a bug IRL caused by passing the wrong kind of thing into plus(x, y), which your tests didn't catch?


It never happened to me, because I don't use Python ;)

On a more serious note, your comment actually hints at an issue: unit testing is less effective without static type checking. Let's assume I would like to sum x and y. I can extensively test the function and see that it indeed correctly sums two numbers. But then I need to call the function somewhere in my code, and whether it will work as intended depends on the context in which the function is used. Sometimes the input you pass to a function comes from some external source outside your control, and if that's the case you have to resort to manual type checking. Or use a properly typed language.


This isn't an actual problem people encounter in unit testing, partially because you test your outer interfaces first. Also, irl static types often get so big and polymorphic that the context matters just as much.


And if you specify that they are numbers then you lose the ability of the function to generalize to vectors.

Indeed assuming it adds two things is correct, and knowing that concatenation is how Python defines adding strings is important for using the language in the intended way.


What about the case of passing a number and a string?


Python 3.13 introduces a little bit of a JIT. Also, there is always PyPy.

For efficient dependency management, there are now Rye and uv. So maybe you can check all those boxes?


Rye is pretty alpha and uv is young too, and neither is part of "core" Python or under the Python Foundation umbrella (the way e.g. mypy is).

So there's plenty of well-founded hope, but the boxes are still not checked.


You forgot:

    [X] print requires parentheses


Fair. But it was importable from __future__ back in 2.7.


print was way better when it was a statement.


Idk why but python 2 print still pops up in my nightmares lol on bro


The conda-forge ecosystem is making big strides in dependency management. No more are we stuck with the abysmal pip+venv story.


I definitely like some aspects of conda, but at least pip doesn't give me these annoying infinite "Solving environment" loops [0].

[0] https://stackoverflow.com/questions/56262012/conda-install-t...


That issue is fixed by using the libmamba resolver:

https://www.anaconda.com/blog/a-faster-conda-for-a-growing-c...


I'm eager to see what even a simple JIT can bring in computing energy savings for Python apps.


I'd wager the energy savings could put multiple power plants out of service.

I regularly encounter python code which takes minutes to execute but runs in less than a second when replacing key parts with compiled code.



