
The benefit for me is knowing what installs where, so that I can later uninstall everything without having to rely on an uninstaller that leaves bits behind. Of course, I'm talking about a different scenario where I'm already assuming the script and the binaries it installs are safe.
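
One low-tech way to get that visibility, assuming the project's make install honours DESTDIR (the paths here are just examples):

  # stage the install into a scratch directory first
  make install DESTDIR=/tmp/stage
  # record exactly which files would land where
  (cd /tmp/stage && find . -type f | sort) > install-manifest.txt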



I'm old enough to remember when you had a makefile or, even better, an rpm or deb.

It was so much better than the Windows way of running a random program with little visibility, and it's sad that so much software has moved away from that approach.


To be fair, the package must be provided by your distribution; otherwise it's just another random download, only in a different format.

Even if you're adding a third-party package repository, you're just changing the downloader, although you do usually get signature checking for free with that step.
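
For the third-party repo case, the usual apt setup pins the repository to the vendor's signing key, so every later update is verified against it. A rough sketch, with placeholder URLs and names:

  # fetch the vendor's signing key (URL is a placeholder)
  curl -fsSL https://example.com/apt/key.gpg \
    | sudo gpg --dearmor -o /usr/share/keyrings/example-archive-keyring.gpg

  # add the repository, tied to that key
  echo "deb [signed-by=/usr/share/keyrings/example-archive-keyring.gpg] https://example.com/apt stable main" \
    | sudo tee /etc/apt/sources.list.d/example.list

  sudo apt update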

And therein lies the issue: It's not the job of the software developers to provide packages for your distribution, but the job of your distribution maintainers.

So if your distribution isn't going to backport $newver of some tool to your release, you're left with installing it by other means.

If you find yourself always installing software like this because your package repositories don't contain what you want, it may be a good idea to just check out different distros.


There's a big difference there: even if you do install the odd program from source, the distro likely has you covered for all the libraries needed to run it in the first place.

That's a very different scenario from having to download the dependencies as well.


Having to download dependencies is very much a Linux desktop problem. On every other platform (Windows, macOS, iOS, Android) it is customary for installers to come with all their dependencies. Most programs also statically link non-system libraries.


Nothing stops you from bundling dependencies on Linux.

The reason it's a "very Linux problem" is exactly that relying on shared, distro-provided dependencies is the sane thing to do in a pool of insane systems.

There's absolutely no reason to ship your 1k executable with 1GB of libraries that are already used by 10 other programs.
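
To make that concrete: a dynamically linked binary just records which shared libraries it expects, and you can inspect that with ldd; static linking bakes them into the binary instead. A quick sketch (hello.c is a stand-in, and a static libc has to be installed for the second part):

  # shared libraries a binary will load at runtime
  ldd /bin/ls

  # the bundled alternative: link the libraries in statically
  gcc -static hello.c -o hello
  ldd ./hello   # reports "not a dynamic executable"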


If your 1k executable uses 1GB of libraries, that's on you.


An unsigned deb is better than an unsigned script: it keeps track of which files are deployed, and you get a version number in a central list from dpkg -l.

Sure, something bad can still do things, but it tends not to splat files everywhere with no trace.
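
Concretely, that bookkeeping looks like this (sometool is just an example name):

  dpkg -l sometool           # is it installed, and which version?
  dpkg -L sometool           # every file the package put on disk
  dpkg -S /usr/bin/sometool  # which package owns this file?
  sudo dpkg -r sometool      # remove it using that same file list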


In theory yes, it's more secure.

In practice, personally auditing the installation script for every program you're going to use, and again for every update, is grossly impractical, for the same reason nobody reads EULAs. In the end it still boils down to trust.


I used to use checkinstall for this; it builds a little apt package for you that you can cleanly uninstall later.
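
For reference, the checkinstall flow is roughly this, assuming a standard make-based build (the package name is whatever you give it):

  ./configure && make
  sudo checkinstall          # runs "make install" and wraps the result in a .deb
  sudo apt remove sometool   # later, removed like any other package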



