This could be a direct response to the concerns you voiced in the Pow thread a couple days back. And your input is on word choice? Jeesh, someone's hard to please :)
This doesn't answer your question, but this seems like a good place for me to spell out what actually happened between the Pow thread and here:
* Sam Stephenson posted Pow with instructions to install it by piping curl http:// into sh.
* I posted a comment calling that installer directive "borderline irresponsible", while at the same time trying to convey that I was impressed with Pow itself. I did not execute that message gracefully.
* Because I have better name recognition on HN and also an unasked-for but not unappreciated status as HN resident security dork, my comment shot up to the top of the thread.
* Because HN is a community of nerds, a bunch of people jumped on to say (in effect) "hey wait, people are also at risk when they install software from Rubygems, but you don't call Rubygems out".
* I responded to some of these comments. I was careful at first because wow is this a boring argument, and more careful later because wow did that comment thread ever spin out of control.
* At this point, 2/3rds of the comments on the thread are nerds arguing about (or, more accurately, piling on to one side of the argument or the other) curl|sh installers.
* Here I decide to edit my comment to note that it sucks that this argument is taking over the thread, because Pow is pretty cool. I intend the edit as an appeal: please downvote my comment, because it isn't germane to Pow.
* Sam Stephenson posts a comment thanking someone else for praising Pow, because he's happy not to see FUD about Pow. I now feel very bad, because I can see why he feels FUD'd. I'm having a bad week, so I don't communicate this very gracefully.
* Sam, obviously still stung (or amused) by the HN debacle, posts "gosh", a program that makes fun of the notion that you shouldn't pipe web pages into sh.
Here we are a little stuck. I don't know Sam, but I know the team he works for and admire it. On the other hand, I do know that curl|sh is a bad idea and am not going to say it's a good idea just because HN commenting dynamics, whether my fault or not, spun out of control. It doesn't matter how much I like 37signals. I still have to rely on my judgement. I may be wrong, but from what I can perceive now, curl|sh is evil.
I have respect for both Tom and Sam, but Tom is in his domain here (even if he's only willing to commit to a vague feeling of unease...I trust those feelings from certain people).
Curl sends the page body to stdout regardless of the HTTP response code. So if the URI on the command line is typoed, or the install script has been moved, or the webserver config is borked, the error page gets sent to the shell for execution. The same goes if an ISP proxy or some other evil bit of infrastructure in the middle is broken or wants you to authenticate to it.
Most of the time it'll be harmless, but it's a factor that is completely out of the control of the developer of the software you intended to run.
And sometimes it might not be harmless. A custom 404 page with inline CSS, or even helpful text, could easily contain something the shell parses as a command after a newline or a semicolon. Hijinks could ensue.
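To make that failure mode concrete, here's a minimal sketch (the host is made up). Without --fail, curl streams whatever body the server returns, error page or not, straight into the shell; with --fail it exits non-zero on an HTTP error and writes nothing to stdout:
# typo in the path: the 404 page body is handed to sh and parsed line by line
$ curl http://get.example.invalid/isntall.sh | sh
# with --fail (-f), curl exits 22 on the HTTP error and sh receives no input
$ curl -fsS http://get.example.invalid/install.sh | sh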
PS: yes, Pow is super cool and I already love it. But the recommended installation method is too clever for its own good. Damn you Ximian.
Thanks for the citation. I'll include that comment at the bottom of this one for reference.
There are two other points I neglected to make in detail previously.
* curl does not ship root certs by default, so curl https:// is no good. In fact, it won't even work (and rightly so):
$ curl https://google.com
curl: (60) SSL certificate problem, verify that the CA cert is OK.
Details:
...
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
Going through the process to generate a cert bundle is much harder than installing RubyGems or any other package installer that supports signed packages.
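For reference, once you do have a bundle, this is the shape of the workaround (the path and host are illustrative; you can also point the CURL_CA_BUNDLE environment variable at the bundle instead of passing the flag every time):
# hand curl an explicit CA bundle so it can verify the server cert
$ curl --cacert /path/to/ca-bundle.crt https://get.example.invalid/ | sh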
* XSS or similar failures that don't involve complete control of the server lead to code execution on the client.
You don't want to make common security flaws on the server result in instant compromise of the client.
While I think the author of "gosh" may consider this a fun little joke, he's picked the wrong side of this issue. Where's the tool making fun of JS crypto? Now that's security theater.
Previous comment with other points:
* No transport security. As many people mention, at least adding HTTPS would help with this. However, most non-browser SSL clients (wget, curl) don't include any root certs by default so even switching to SSL would not help this method. Firesheep, sslstrip, etc. automatically generate a self-signed cert which would look no different to wget than a real cert.
* No persistence. If you download any installer package once and then reuse it on multiple machines, you get the benefit of knowing that the same code was installed on each machine (good or bad). With this method, users may catch the site in the middle of an update and get multiple versions of the package.
* No authentication. Even with SSL, you only get strong transport security. You would know strongly that ".pow.cx" sent you some code, but not how that code got put on the server. With package-signing, typically done on the developer's end system, you know that it was protected even before it was uploaded to some site.
* Easier to trojan than binaries. Inserting a few extra shell commands in a single HTTP(S) session (say, targeting a single client IP) is much easier than building a custom binary package. Consider how hard it is to even compile Firefox with all the dependencies. Now do that work and insert a trojan and upload a separate 10 MB binary that needs to be stored somewhere on the server while waiting for that one client to visit the site. Compare this to keeping a two-line patch to a shell script (easily done in RAM, maybe even by hotpatch).
* It trains users that all of the above is ok. The popularity of this "| sh" install method is relatively new. (Yes, I know about shar scripts in the past, but those died out by 1996 or so with the advent of real package managers.) It is absolutely impossible to retrofit "| sh" to be secure, whereas it is definitely feasible to add package signature verification support to gem or yum or apt or whatever (in fact, all of those already support it).
The fact that many installers aren't signed today doesn't make it ok to drive this process back to the '80s. We should be moving toward a future where package signing is a required part of being a software developer. Too hard? Then build tools to make it easier!
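RubyGems is a decent existence proof here; the plumbing is rough around the edges, but it is real. A sketch of what the two ends look like today (gem name and email are placeholders):
# developer: generate a signing cert, then reference it from the gemspec when building
$ gem cert --build dev@example.com
# user: explicitly trust that cert, then refuse unsigned or tampered gems
$ gem cert --add gem-public_cert.pem
$ gem install somegem -P HighSecurity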
It may be possible that I am just insane right now. It's been... a week. I think I'm going to take this message board argument and drown it in a bottle of rye.
Next thing you know they'll be running binaries without sending them through IDA Pro.
(Which isn't to say that I disagree with the security concerns raised about curl|sh. Just that, of course, many people don't vet their various source code/shell scripts/executables. Nonetheless, you should give them the opportunity to; a tarball and a detached signature seems to be a pretty friendly approach.)
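For what it's worth, the tarball-plus-detached-signature flow is about two commands on each end; something like this (file names are placeholders):
# developer: publish an armored, detached signature alongside the tarball
$ gpg --armor --detach-sign release.tar.gz    # writes release.tar.gz.asc
# user: fetch both files, verify, then read and run the installer at leisure
$ gpg --verify release.tar.gz.asc release.tar.gz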
My problem isn't with the claim that this is an insecure method of installing software (it most certainly is), just that people are acting like it's an order of magnitude worse than what most people do regularly: download and execute software from unauthenticated/unencrypted websites. I would wager that many of the people complaining are guilty of that as well.
“Concerned about theoretical man-in-the-middle attacks when piping scripts from curl to your shell?”
This case is where you know what's at the URL you specified, but since it was/will be downloaded over HTTP, in principle a MITM could change the content when it's downloaded by curl, so you're running a different script than the one you reviewed/uploaded.
Yeah, but if you pipe the curl output to your shell anyway, this is the same thing.
Unless of course it's some sort of "timing attack" (advance apology to purists if I'm using this term incorrectly) and the server knows you've downloaded this once (with gosh perhaps) and then sends you the malignant stuff the second or subsequent times.
Edit: sorry, the above makes no sense because you will have reviewed it with gosh and probably wouldn't download it again.
There are lots of ways you can get bad data from a given server over curl, even if using HTTPS and the attacker does not have full control over the server:
* Typo in URL and typosquatter sends you whatever commands they want
* XSS in server-side scripts leads to injection of commands via unescaped tags
There may be others I haven't thought of. Perhaps a DNS glue record can be used to inject shell metacharacters where the server has an error handler that reports the client hostname? The fact that any of these are even possible shows how fragile this mechanism is.
Unless your tinfoil hat is nailed firmly to your head, and you're worried about state-sponsored CA attacks that break HTTPS; or if you're running code from a site you don't trust, for... some reason.
No tinfoil hat needed here. States aren't the only ones who can compromise CAs; script kiddies and hackers can too. Since any CA can produce a certificate for any site, only one of them has to be weak or exploitable and you're screwed.
HTTPS is no replacement for proper code signing and verification. The attacker in the latest CA fiasco also produced a certificate for "plugins.mozilla.org", for example.
Less funny, however, is that curl doesn't work with https without a lengthy process of building a CA bundle from Mozilla's root store, which happens to be much harder than installing a proper package manager that supports signed packages.
I don't know Ruby, but if I did I'd change the sha256 stuff to GPG. It could support searching local GPG keys by URL and/or the username pulled out of the GitHub URL, ask which key to use if multiple matches are found, and remember the choice. If there isn't a local key already, it could support querying specific known keyservers; MIT's and Ubuntu's come to mind.
Might poke around with this in bash in a week or two.
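In the meantime, the keyserver half of that is basically stock gpg; something like this (keyserver choices and identities are just examples, and the key id is a placeholder):
# look for a key matching the author's identity on a known keyserver...
$ gpg --keyserver hkp://pgp.mit.edu --search-keys author@example.com
# ...or, once you know which key to trust, fetch it by id
$ gpg --keyserver hkp://keyserver.ubuntu.com --recv-keys 0xDEADBEEF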
My two-hundredths-of-a-dollar on this whole 'debate' is that if you can't modify the curl pipe command so that it writes a file instead of running it immediately (as someone other than root. Right? RIGHT?) then you really deserve that Trojan you just piped into your shell.
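And if you're going to save it to a file anyway, pinning the digest of the copy you reviewed costs almost nothing; roughly (host and file names are illustrative, with shasum -a 256 standing in for sha256sum on a Mac):
# first machine: save the script, read it, record its digest
$ curl -fsS -o install.sh http://get.example.invalid/install.sh
$ less install.sh
$ shasum -a 256 install.sh > install.sh.sha256
# later machines: re-download, check against the pinned digest, only then run
$ curl -fsS -o install.sh http://get.example.invalid/install.sh
$ shasum -a 256 -c install.sh.sha256 && sh install.sh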
I am cursed with the affliction of seeing both sides in many situations. Sam is without a doubt right, in that an attack on the users of pow, who are presumably small in number, is unlikely. Thomas is also right, in that a situation like this is almost the epitome of low-hanging fruit for an attacker with the means and motivation to attack someone installing pow.
I think both parties need to be cut some slack. Sam is in a position where he's just trying to get some things done and make it easy on the user to run some great software. Laudable, without a doubt. Thomas is in a situation where he sees the evil that men do, and just wants to point out a tweak that could potentially head off problems for people wanting to opt in to said great software. Also laudable, without a doubt.
Where I will come down on one side is the release of gosh, which is difficult to interpret as anything but an attempt to mock the other party's position. The adjective "theoretical" is perhaps one of the sticking points. The problem is that the transformation of this threat from theoretical to actual is unfortunately just a couple of hours of coding on my part, and I say this with full knowledge that most participants at HN far exceed my skill level. I would use bog-standard tools, all of which are already installed on my laptop, even though I am not in the habit of doing such things. For a myriad of reasons, the least of which being industry health, it shouldn't be necessary for me to pull an Eric Butler in the next few hours for this topic to go from theoretical to actual threat.
At the heart of things, there is a disconnect between those in the security industry and those who aren't. If you attempt to be totally secure you'll find yourself in a recursion loop that never exits. If you attempt to just get things done, you can find yourself employing practices that are quite simply horrifying to those who are stuck in said recursion loop. If you attempt to take a moderating view, 9 out of 10 times you'll find yourself agreed with yet your suggestions will mostly go unfollowed. Until some common exploit comes about, at which point those same 9 out of 10 folks will mention that this vulnerability has been known about since the beginning of time.
In my view, we all need to meet on some common ground. Sure, if you don't have http-->sh executions going on there are still 10^10 other attack vectors out there. But for right now, that's more or less the only solution the security industry has to offer. Keep plugging away at low hanging fruit. It raises the bar.
Bottom line here is that Thomas doesn't seem like too bad a guy to me, and I doubt he's looking to tarnish the reputation of a great piece of software. But he's bringing up a good point that is refreshingly actionable. It's an opportunity to make things just a wee bit better with a minimal amount of disruption. I'd suggest that you mock it at everyone's peril.