
I think golang is used because you can easily create a single static binary, which is incredibly easy to distribute. I often find non-trivial CLI tools written in Python cumbersome because of the dependency wrangling necessary.



I think one of the advantages of a script is that you can quickly check what it is doing by simply opening it - an executable won't afford that.


Plus it can be run on any machine, while golang needs to be compiled for the specific architecture you'll be running it on. No messing about trying to get the right build.


Go is child's play to build (and cross-compile).

Simpler than having to worry about Python versions, let alone dependencies.


I actually think it's less of a problem than many imagine. If you have different architectures, it's actually better and more predictable because it's compiled, and it's incredibly easy to compile even for noobs.

I spent a weekend going through all my old python scripts with Gemini and ChatGPT, rewriting them to Go just because of this.

Most of them were so old that I would have had to skip like 3 generations of package managers to get to the one that's used this year (dunno about next year) if I wanted to upgrade or add dependencies.

With Go I can just develop on my own computer, (cross)compile and scp to the destination and it'll keep working.
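For illustration, a rough sketch of that workflow (the tool name, the line-counting task, and the target platform are all made up): the whole thing is one main.go, built with something like "GOOS=linux GOARCH=amd64 go build -o mytool ." and copied over with scp.

  // mytool.go -- a minimal, hypothetical stand-in for one of those old scripts.
  // Cross-compile and ship with, e.g.:
  //   GOOS=linux GOARCH=amd64 go build -o mytool . && scp mytool user@host:
  package main

  import (
      "bytes"
      "fmt"
      "os"
  )

  func main() {
      // Count the lines in the file named on the command line.
      if len(os.Args) < 2 {
          fmt.Fprintln(os.Stderr, "usage: mytool FILE")
          os.Exit(1)
      }
      data, err := os.ReadFile(os.Args[1])
      if err != nil {
          fmt.Fprintln(os.Stderr, err)
          os.Exit(1)
      }
      fmt.Println(bytes.Count(data, []byte("\n")))
  }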


> I often find non-trivial CLI tools written in Python cumbersome because of the dependency wrangling necessary.

I'm thinking of trying out Mojo in large part because they say they're aiming for Python compatibility, and they produce single-file executables.

Before that I was using PyInstaller, but it was always a little fragile (I had to run the build script a couple of times before it would successfully complete).

Currently I'm using pipx and Poetry, which seems pretty good (100% success rate on builds, and when my 5-line build script fails it's because of an actual error on my part).

Which is a roundabout way of asking everyone:

Does anyone have any other good way(s) to build single-file executables with Python?


Fun fact: you can use the D language for compiled scripting via rdmd, with powerful, modern programming features, and it compiles much faster than comparable C++ and Rust [1]. The default GC makes it more intuitive and Pythonic for quick scripting than Go. Its recent native support for C, the OS lingua franca, is the icing on the cake [2].

From the website, "D's blazingly fast compilation allows it to be used as a high level, productive scripting language, but with the advantages of static type checking" [3].

[1] Why I use the D programming language for scripting (2021): https://news.ycombinator.com/item?id=36928485

[2] Adding ANSI C11 C compiler to D so it can import and compile C files directly: https://news.ycombinator.com/item?id=27102584

[3] https://dlang.org/areas-of-d-usage.html#academia


Scriptisto is an underrated tool: https://github.com/igor-petruk/scriptisto

It can do the Python venv stuff behind the scenes for you and it just looks like a single Python file.


You could try Nuitka [1], but I don't have enough experience with it to say if it's any less brittle than PyInstaller.

[1]: https://nuitka.net/


I've been waiting for a single-executable interpreter for Ruby for a while now,

like Deno or Bun, but for Ruby.

Artichoke Ruby is the closest we've got.


Bootstrapping, different behavior across versions, and not being able to use the dependency ecosystem really make it a lot more difficult than people realize if you’re trying to “script” at scale.

I’ve used Rust for this task, but people get mad that I’m calling it a “script”. “That’s not a script, that’s a program”, which… sure. So maybe we need another term for it? “Production scripts” or something.

My experience is rewriting Ruby and bash buildpacks for the open spec CNCF Cloud Native Buildpack project (CNB) https://github.com/heroku/buildpacks

I agree that Ruby is easier to start with and grow in complexity; that would be a good place to start.


This complaint comes up enough that I'm surprised nobody's created the Ruby equivalent of GraalVM, to compile a Ruby script, all its deps, and a WPOed subset of the Ruby runtime, into a native executable.


Given Ruby's lackluster type system, the number of assumptions a compiler can make is significantly reduced. Moreover, analyzing the code at a simple IR level would prevent it from understanding what is actually referenced vs. what isn't, short of making compile times take an eternity and performing complex flow analysis with simulation.

Even with a good type system, a trimmer/linker has to be enlightened about many special idioms and patterns and perform flow analysis, and in the case of dynamically typed languages or languages with reflection, to analyze the reachability of reflectable members and trim otherwise spacious metadata. It took a significant amount of work on .NET's ILLink for it to be able to produce binaries/libraries as compact as it does today with .NET's AOT build target, and it still required a degree of metadata compression and dehydration of pointer-rich data structures (something which, funnily enough, Go doesn't do, resulting in worse binary size).


It's not quite what you're describing, but TruffleRuby is Ruby on GraalVM:

https://github.com/oracle/truffleruby

Unlike GraalVM Java, as far as I can tell TruffleRuby doesn't provide a bundler that can create a single executable out of everything, but in principle I don't see why it couldn't.


Worth noting that the GraalPython implementation does support creating a single binary.

https://www.graalvm.org/latest/reference-manual/python/stand...

I'm not sure I'd try replacing shell scripts with natively compiled Python binaries. That said, I use a Kotlin Scripting based bash replacement in my own work that has many useful features for shell scripting and is generally much more pleasant. You have to "install" it in the sense of having it extracted somewhere, but it runs on Win/Mac/Linux and can be used without root etc.


I wasn't so much imagining each shell script being replaced with a binary, as I was imagining deploying a single static binary "sysadmin-DSL interpreter binary" — where that "interpreter" is just "Ruby with a custom prelude, all packed together" — such that I could then name that interpreter in the shebang line for my admin scripts written in that DSL.

Graal can create an executable from a Ruby program as well, with TruffleRuby and Native Image (both part of the general Graal project).

WPO?


While Program Optimization, in this case mostly meaning dead-code elimination for any runtime code not called by the Ruby code.


I think you mean Whole Program Optimization.


The ability to type check and unit test your code is also valuable. This is possible with many languages but with Go it requires basically zero configuration.
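As an example, a minimal sketch of what that looks like (the package and function names are hypothetical): drop a *_test.go file next to the code, and a plain "go test" compiles, type-checks, and runs it with no build files or test-runner configuration.

  // mathutil_test.go -- minimal sketch; the function under test is defined
  // here only to keep the example self-contained.
  package mathutil

  import "testing"

  // Add is the (made-up) code being tested.
  func Add(a, b int) int { return a + b }

  // TestAdd is discovered and run by a plain `go test` invocation.
  func TestAdd(t *testing.T) {
      if got := Add(2, 3); got != 5 {
          t.Errorf("Add(2, 3) = %d, want 5", got)
      }
  }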


Bundle it into a pex and distribute that. It's still way large, but it's easy to distribute.


Pex was also the solution I landed on after evaluating several non-container options for distributing a Python project to arbitrary Linux hosts.

It works well but with one huge caveat: although you bring the stuff required to reconstitute the venv with you, you’re actually still using the system’s python executable and stdlib!! So for example if you want to make a project targeting all supported Ubuntu LTS versions, you have to include the wheels for every possible python version you might hit.

Ultimately this boils down to there not really being a story for statically compiled python, so in most normal cases you end up wanting a chroot and at that point you’re in a container anyway.


Nuitka has worked for me for everything I've tried (in-house dev tools). I didn't end up using it for work because I can rely on a pristine system Python with the right version, so pex makes more sense.

There are other options I didn't look too much into, e.g. BeeWare.


I wish an easy cross-platform PEX or shiv [1] were a thing. Binary dependencies are the biggest reason I prefer the new inline script metadata spec (https://packaging.python.org/en/latest/specifications/inline...) and `pipx run`. Luckily, they're pretty great. They have changed how I write Python scripts.

The way inline script metadata works is that your script declares arbitrary dependencies in a structured top comment, and a compliant script runner must provide them. Here is an example from a real script:

  #! /usr/bin/env -S pipx run
  # /// script
  # dependencies = [
  #   "click==8.*",
  #   "Jinja2==3.*",
  #   "tomli==2.*",
  # ]
  # requires-python = ">=3.8"
  # ///
pipx implements the spec with cached per-script virtual environments. It will download the dependencies, create a venv for your script, and install the dependencies in the venv the first time you invoke the script. The idea isn't new: you could do more or less the same with https://github.com/PyAr/fades (2014) and https://github.com/jaraco/pip-run (2015). However, I only adopted it after I saw https://peps.python.org/pep-0722/, which PEP 723 replaced and became the current standard. It is nice to have it standardized and part of pipx.

For really arbitrary hosts with no guarantee of recent pipx, there is https://pip.wtf and my venv version https://github.com/dbohdan/pip-wtenv. Personally, I'd go with `pipx run` instead whenever possible.

[1] I recommend shiv over PEX for pure-Python dependencies because shiv builds faster. Have a look at https://shiv.readthedocs.io/en/stable/history.html.


All else being equal, I’d probably prefer Poetry for the broader project structure, but that would definitely be compelling for single-script use cases.

You can also combine the two. Something I have done is declare script dependencies in inline script metadata and have dev dependencies (Pyright and Ruff) managed by Poetry.

Most shell-ish scripts probably use no dependencies and will not be picky about exact versions.


Mmm. I’d argue that all shell scripts use a ton of dependencies that are different across Unix/Linux.

E.g., ‘sed -i’ works differently outside GNU sed. Same with ‘grep -P’, which is GNU-only.


I took my parent post as meaning the app-language dependencies: pip packages, or Ruby gems, or "must be exactly Python 3.12".

Correct, but if you're in a situation where this is an issue, you probably know about it and can use POSIX versions that are more portable.

Otherwise nobody thinks of it because most likely it is not being distributed.


This has bitten me many times


It’s why, if my bash turns into more than a page or two, I start re-evaluating it and turn it into Python.


Tons of scripts rely on standard utilities (sed, awk, grep, head) to manipulate data.

All of those have wildly different behavior depending on their "flavors" (GNU vs Busybox vs BSD) and almost all of them depend on libc being installed.


That's not my experience at all. Shell is often glue between different utilities and unless it's being run in a controlled environment like a docker container, you have no idea what's on the base machine.

I only touch Go when using tools CNCF projects decided to write in Go.

Other than that, OS scripting is done in traditional UNIX tools, or PowerShell.


Golang's not-so-secret weapon.


Hard to miss those >100 meg 'single binaries'




Every compiled language can do it, until you run into issues with glibc vs. musl, or OpenSSL versions, or network stack defaults, and remember you weren't as static as you thought.


