
Why is there still software asking users to run sudo on something downloaded from the internet? Why even bother with that script when there is an easier, safer option with a smaller footprint, built into every distro?

Just as a comparison, Alpine's micro package is 4 MB (installed), while the statically compiled binary it downloads from GitHub is 11 MB. The repo's package would make it more micro.
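
For reference, a minimal sketch of the repo route on a few distros (I'm assuming the package is simply named `micro`; run with root/doas/sudo as appropriate):

    # install from the distro's own repository instead of the curl script
    apk add micro        # Alpine
    apt install micro    # Debian/Ubuntu
    dnf install micro    # Fedora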




> Why is there still software asking user to run sudo on something downloaded from the internet?

Does it do this?

From https://github.com/zyedidia/micro#quick-install-script

    Quick-install script

    curl https://getmic.ro | bash

    The script will place the micro binary in the current directory.
    From there, you can move it to a directory on your path of your choosing
    (e.g. `sudo mv micro /usr/bin`).
It points out you need `sudo` to move the binary to `/usr/bin` - if that's even what you want to do - but... yeah? So?


This is just a demonstration of how unfriendly Linux is to third-party software. You are supposed to either use software from your distribution's repository or compile it yourself. And if you are a software developer, you are on your own to decide how to support hundreds of different distributions (the most popular way is to statically compile everything into a single binary).

Such scripts remind me of DOS-era "installers", and they are awful:

- it is unknown whether your distribution is supported or not

- this script is probably not tested on many distributions and can break something

- it is unsafe, because you are running code that some random person posted on the Internet, without any sandboxing


No, it is not a demonstration of anything. I'm not saying there is no fragmentation on Linux; there is.

There are many solutions that try to fix that: AppImage, Flatpak, Snap and what not, all of them with some degree of success, and pitfalls. I, for example, am a happy user of Flatpak, running Steam games on an Alpine distro without many problems.

But for this specific thing I just don't get why there is a curl/bash command. It is a widely available piece of software that you can easily install using your distro's repo. Don't want to use the terminal? Just download any "mainstream" distro and you will have a store-like application available: type micro, click install, and done.

Is there fragmentation in Linux? Yes. Is it hard to guarantee it will work on every distro? Yes, and a lot worse for software you can't compile. But micro IS NOT the demonstration of it.

There is no reason whatsoever micro should have "curl | bash" on the front page.


It says when you run the installer:

    Note that you must install micro to a directory accessible to all users when doing
    this, typically /usr/bin. cd to that directory before running this script.
    
    E.g.:
    
      $ cd /usr/bin
      $ curl https://getmic.ro/r | sudo sh
    
    or
    
      $ su - root -c "cd /usr/bin; wget -O- https://getmic.ro | GETMICRO_REGISTER=y sh"


You conveniently left off the lines of text above that, which explain what "doing this" is:

    getmicro can use update-alternatives to register micro as a system text editor.
    For example, this will allow `crontab -e` open the cron file with micro.

    To enable this feature, define the GETMICRO_REGISTER variable or use the URL
    `https://getmic.ro/r`.

    Note that you must install micro to a directory accessible to all users when doing this...
Yes, if you want to install `micro` as a system text editor, something you can do (but don't have to), you need privs.

Again, so?


I know why it is asking for sudo and that was not my point.

Again, my point is: you should never ask the user to run sudo on something from curl (or actually, don't curl to bash at all). There are plenty of ways to exploit that. Again, I know it needs root to move the binary to a root-owned folder or to update the default editor, but, again, you should never ask (or suggest) the user to pipe shit from the internet; it is a security risk.
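
One concrete illustration of why the piped form is riskier than it looks (hypothetical script fragment, not from getmic.ro): the shell executes lines as they arrive, so a connection dropped mid-line can truncate a command into a very different one.

    # hypothetical fragment of a piped install script
    # intended line:
    rm -rf /tmp/micro-install-tmp
    # if the stream cuts off right after "/tmp", the shell can end up running:
    rm -rf /tmp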

Also there is not even a reason for it; micro is available in every distro.


Hear, hear!

It is completely irresponsible to encourage piping random Internet sources to sudo or to (ba)sh, and I'm quite tired of hearing the ridiculous justification that "everyone does it".


So these are all bad ways of installing?

https://bun.sh/

https://www.rust-lang.org/tools/install

https://deno.land/manual@v1.36.1/getting_started/installatio...

I'm not disagreeing with you, as a beginner, I'm just trying to learn.


"It depends". The answer for this has multiple parts. Though my preference is using my OS package manager (so I'm biased towards that), I'll try to explain my reasoning as best as possible.

And I'm using "OS" in a liberal sense, to also mean "different linux distros" and "BSDs".

---

One, you have to also think about what happens after installing. You have to consider upgrades, uninstalls, and also errors happening during those processes. With these scripts, if something goes wrong you're mostly on your own (though yes, you can go to the specific support channels for the specific software to fix the specific problem for your specific situation).

Your Deno link at least is versioned so it would be less bad to debug if something goes wrong, but that's only relatively speaking.

A package manager deals mostly with static archives, and keeps track of every file it controls. It is trivial to know whether a particular file belongs to a specific package or not. Upgrades can be centrally managed. Since it knows what programs it has installed, it can print you a list of names and versions of everything, so you can compare it against a list of CVEs or something.
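
As a rough illustration of that bookkeeping (exact commands differ per package manager; these assume apk or dpkg):

    apk info --who-owns /usr/bin/micro   # which package owns this file? (Alpine)
    apk list --installed                 # names and versions of everything installed
    dpkg -S /usr/bin/micro               # same question on Debian/Ubuntu
    dpkg -l                              # installed packages with versions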

Custom installation scripts are custom, and each has their own way to do these things.

---

Second, those instructions tell you to run a random script without even verifying its integrity first. No versioning for that script, no audit trail, no signature to guarantee that the script is indeed the same one the author uploaded at some point in the past and has not been tampered with since.
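
About the most you can do by hand is fetch a checksum or signature alongside the artifact, and only if the project publishes one at all (file names below are hypothetical):

    # verify a downloaded release against a published checksum file
    sha256sum -c micro-linux64.tar.gz.sha256

    # or, if there is a detached signature and you already trust the signing key
    gpg --verify micro-linux64.tar.gz.sig micro-linux64.tar.gz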

A package manager deals (usually) with static archives whose signatures are verified before trying to do anything with them. An install operation is a simple, boring, "extract archive". Maybe with a small post-install script (that was part of the signed archive).

And the public keys used by the package managers are (usually) already in your local machine, so for most operations you are using an already-known public key to validate signatures.

(Note: Presence of a signature does not imply the software is trusted. It only means that the person that signed the thing had access to the private key.)

---

Third, though this sometimes causes problems (but I have never experienced them, or at least nothing comes to mind), the maintainer of your OS package repository usually knows your OS better than the upstream developer, and can apply patches to the packaged software that can be bug fixes, or even disabling/removing telemetry that the upstream software has enabled by default.

And with a package manager you know there's at least one other person (the maintainer of the package) that has used this program in this specific OS. There may be the case that some versions are just packaged without actual testing (just "git pull & run packaging scripts" or the equivalent), but in those cases there's the escape hatch of installing a previous, working version of the package.

---

With that said. If you were to ask me if there would be a situation where I would use those scripts, then _maybe_ (big maybe) it would be (1) inside a container where I don't care about any garbage left behind because I can nuke the whole thing at once; or (2) in an isolated environment where I don't even care if the script needs root privileges (this second case implies I would not do this in Docker).
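
For completeness, a sketch of that throwaway-container case, using the URL from this thread (the image tag is just an example):

    # everything, garbage included, disappears with the container
    docker run --rm -it ubuntu:24.04 bash -c \
      'apt-get update && apt-get install -y curl \
       && curl https://getmic.ro | bash \
       && ./micro --version'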


do you read the script sections of all the packages you install?


No, it is infeasible (at least for me) to validate every single package. So at some point I have to trust someone/some party, and I choose to trust the maintainers of my distro more than a person asking me to run sudo from curl.


And some people trust the software authors more than some random repo maintainers who don't have enough time to even make sure the packages they update are actually still compatible with each other.


You do not need to trust anyone if the program is run inside a sandbox. That kind of sandboxing has been supported in hardware since the 80386, but Linux doesn't make proper use of it.
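
For what it's worth, here is a rough sketch of what a manual sandbox looks like today with bubblewrap (namespaces/seccomp rather than anything automatic), assuming a statically linked micro binary in the current directory:

    # run the downloaded binary with no network and only the current dir writable
    bwrap --ro-bind /usr /usr \
          --ro-bind /etc /etc \
          --proc /proc \
          --dev /dev \
          --tmpfs /tmp \
          --bind "$PWD" /work \
          --chdir /work \
          --unshare-all \
          --die-with-parent \
          /work/micro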


In Linux, to install a virus or malware, you need to download, compile, and install the virus manually, OR the user can install it using `curl ... | bash`, which is a much simpler method. Even newbies can install new malware using `curl ... | bash`.


Yes, or I get my packages through a trusted package system, such as pkgsrc.


sudo, yes. piping to a shell is fine imo provided the install script is easy enough to read if you are so inclined. i have seen lots of generated install scripts that are mind boggling.

the reality is that you run code you haven't personally read and vetted every day. so the idea that you need some stranger to bless a package for it to be trustworthy doesn't make a whole lot of sense.
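
fwiw the non-piped version is barely more typing (the url is the one from the thread, the file name is mine):

    # fetch, read, then run deliberately
    curl -fsSL https://getmic.ro -o getmicro.sh
    less getmicro.sh      # or your editor of choice
    sh getmicro.sh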


It would be interesting if there were a DSL whose only purpose was installing software.
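
Distro packaging recipes arguably already are something like that. A rough, hypothetical sketch in the Arch PKGBUILD style (fields, version and URL are illustrative, not a real recipe):

    pkgname=micro
    pkgver=2.0.13
    pkgrel=1
    pkgdesc="A modern and intuitive terminal-based text editor"
    url="https://micro-editor.github.io"
    license=('MIT')
    arch=('x86_64')
    source=("https://github.com/zyedidia/micro/releases/download/v$pkgver/micro-$pkgver-linux64.tar.gz")
    sha256sums=('SKIP')   # a real recipe pins the checksum here

    package() {
      install -Dm755 "micro-$pkgver/micro" "$pkgdir/usr/bin/micro"
    }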


Personally, I don't want a middleman to be inserted between the application developer and me.

I want to get my applications directly from the application developer, because that way I can make sure that I'm using the officially supported version of the app and that no one has meddled with the code.

If I don't trust the application developer, I simply don't install the application.


> because that way I can make sure that I'm using the officially supported version

No, they (the developers) can’t guarantee that, nor is it their job.

You would trust them to develop reliable software, and you would trust the maintainer to guarantee that it will be properly compiled for a specific OS version, has the proper permissions set, and has all the dependencies it requires to run.

To ask the software developer to package it and make sure it works for every distribution would be a huge amount of work that is beyond their knowledge (which is not an issue). Also, it seems they are not aware of how to do it properly when they come up with not-so-bright ideas like “curl | bash”.

It is not that you don’t trust the developer; it is that there are people who know better how to distribute it.

Note: this is in the context of micro and of FOSS in general that is easily available and distributable.



