> Presumably programs like apache filter out environment variables properly. But unfortunately, when validating input data, they fail to do so correctly because they don't expect that data starting with "() {" will be interpreted by their bash child processes. If there's a bug, it's not in bash, but in apache and the other internet-facing programs that call bash without properly validating and controlling the data they pass to it.
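
For context, a minimal sketch of the mechanism being described, assuming a pre-patch, Shellshock-era bash on PATH; the variable name HTTP_USER_AGENT is just an illustrative stand-in for any attacker-controlled CGI variable:

    import os
    import subprocess

    # On a vulnerable bash, any environment variable whose value begins with
    # "() {" is parsed as an exported function definition at startup, and the
    # text after the closing brace is executed immediately.
    env = dict(os.environ)
    env["HTTP_USER_AGENT"] = "() { :; }; echo injected command ran"

    # The -c ':' body does nothing itself; any output comes from the trailer.
    subprocess.run(["bash", "-c", ":"], env=env)

On a patched bash the variable is just inert data.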

Bullshit. This may well be perfectly valid data, and web servers are not in the business of shielding shells from their own misfeatures.

And if they were, where would they stop? It would require that web servers do arbitrary context-sensitive analysis of everything going through them, in case this value is a malformed JSON string triggering a bug in GSON, that one is data injected unescaped into an SQL query, and the other one is too big for an underlying C buffer.

You can only end up with a web server refusing to do anything, because some idiotic application somewhere may misuse or misunderstand anything it lets through.




Your comparison doesn't make any sense. It's an obvious requirement for a JSON parser that it be able to parse input from arbitrary sources, including malicious ones. It's not so obvious that a shell should have to deal with malicious environment variables, for the reasons outlined in the original post.


'It's obvious' isn't an argument. It certainly didn't seem obvious to some people in the case of YAML [ http://www.kalzumeus.com/2013/01/31/what-the-rails-security-... ], to exactly the same effect.

edit:

"A brief description: Ruby on Rails makes extensive use of a serialization format called YAML, most commonly (you might think) for reading e.g. configuration files on the server. The core insight behind the recent spat of Rails issues is that YAML deserialization is extraordinarily dangerous. YAML has a documented and 'obvious' feature to deserialize into arbitrary objects. Security researchers became aware in late December that just initializing well-crafted objects from well-chosen classes can cause arbitrary code to be executed, without requiring any particular cooperation from the victim application."

So what was 'obvious' then is the opposite of what is 'obvious' now.
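
To make the YAML point concrete, here is a rough Python analogue of the Rails problem, assuming PyYAML (>= 5.1) is installed; the payload and the echoed command are purely illustrative:

    import yaml  # PyYAML, assumed installed

    # YAML's documented arbitrary-object feature: this tag asks the loader to
    # construct the result by calling os.system with the given argument list.
    payload = '!!python/object/apply:os.system ["echo arbitrary code ran"]'

    # The unsafe loader happily does so, i.e. deserialisation becomes execution.
    yaml.load(payload, Loader=yaml.UnsafeLoader)

    # The safe loader rejects the same document outright.
    try:
        yaml.safe_load(payload)
    except yaml.constructor.ConstructorError:
        print("safe_load refused the document")

Same shape as the bash issue: a deserialiser treating 'obviously' well-formed data as instructions.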


Except the JSON parser may be using specially formatted input[0] as parsing directives[1], the XML library has various custom directives and is probably sensitive to billion-laughs attacks anyway, the SQL library doesn't correctly handle part of the DB's dialect, etc… In the same way, you've got a shell somebody decided should smuggle code through environment variables, without realising it was implicitly executed out of the box as the cherry on the cake.

> It's not so obvious that a shell should have to deal with malicious environment variables, due to the reasons outlined in the original post.

Which I don't care about; my point is that the web server cannot wipe the ass of every bug or misfeature implemented by the shit put behind it. It's just not possible.

[0] e.g. "magic" object keys or key sets; most libraries expose ways to hook into object deserialisation to do exactly that, but they could do it by default as well, and I'm sure some do

[1] which is exactly what bash does here


It's a requirement for the JSON parser, not for apache. It's like saying apache should quote SQL strings automatically so that SQL injections can't happen. This is not apache's job!
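
To make that analogy concrete, a sketch using Python's bundled sqlite3: the quoting (here, parameter binding) happens in the application that owns the query, with the web server nowhere in the picture; the table and value are made up for illustration.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")

    # Request data exactly as the web server forwarded it; neutralising it is
    # the job of the code that builds the query, done via parameter binding.
    untrusted = "Robert'); DROP TABLE users;--"
    conn.execute("INSERT INTO users (name) VALUES (?)", (untrusted,))
    print(conn.execute("SELECT name FROM users").fetchall())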


It's the job of the CGI program, which is the same code that would have responsibility for sanitizing environment variables before calling bash.


There's nothing for the CGI caller to sanitise; how could it, and why should it, know how arbitrary programs are going to misinterpret what it forwards? The CGI script could be Python eval()'ing it, or Ruby interpreting it as a local file to display or delete, and it's no business of the CGI caller that they do.

All the CGI caller can and should do is forward correct data as defined by RFC 3875; the rest is not its job.

> sanitizing environment variables before calling bash.

The CGI caller may not even be calling bash; then what? Should it remove anything that looks like valid PHP code because it's calling a PHP CGI? Oh, but now the PHP CGI uses system(), which creates a subshell that is still holed, and we're back to: if it becomes the CGI caller's job to clean up data which could be misinterpreted by application code, the only thing it can do is stop working entirely.

Now if you want a mod_bash_is_retarded prefilter, feel free to implement one, but it most definitely is not mod_cgi's job to fix that crap. mod_cgi's job is to correctly implement RFC 3875, and the number of times bash is mentioned in RFC 3875 is zero.
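
For reference, "forwarding correct data as defined by RFC 3875" amounts to little more than copying request metadata into environment variables verbatim. A rough sketch of that handoff follows (variable names follow the RFC; the script path and the spawning details are simplified placeholders):

    import subprocess

    def run_cgi(script, method, path, headers):
        # RFC 3875 meta-variables: request metadata, passed through as-is.
        env = {
            "GATEWAY_INTERFACE": "CGI/1.1",
            "SERVER_PROTOCOL": "HTTP/1.1",
            "REQUEST_METHOD": method,
            "SCRIPT_NAME": path,
        }
        # Protocol-specific variables: each request header becomes HTTP_<NAME>.
        for name, value in headers.items():
            env["HTTP_" + name.upper().replace("-", "_")] = value
        # The gateway neither knows nor cares what the child does with these.
        return subprocess.run([script], env=env, capture_output=True)

    # A User-Agent of "() { :; }; ..." is perfectly valid data at this layer;
    # only a bash further down the chain turns it into code.
    run_cgi("/usr/lib/cgi-bin/example.cgi", "GET", "/example",
            {"User-Agent": "() { :; }; echo oops"})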


I think you're in violent agreement with the comment you responded to in this case (especially judging by what he's written elsewhere in this thread).

He's saying that if Apache passes a request to mod_cgi, which spawns "someapp", it is not Apache but "someapp" that should sanitize the environment before it calls bash.

(and of course if the developer/admin has chosen to write their script to be run by bash, that's their mistake)
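
A sketch of what that sanitisation could look like inside "someapp" itself (Python here; the whitelist is an arbitrary example): rather than guessing which values a shell might misparse, hand bash only a small, known-clean environment.

    import os
    import subprocess

    # Pass through only what the shell command actually needs; everything
    # else, including attacker-controlled HTTP_* values, never reaches bash.
    ALLOWED = {"PATH", "HOME", "LANG"}

    def call_bash(command):
        clean_env = {k: v for k, v in os.environ.items() if k in ALLOWED}
        return subprocess.run(["bash", "-c", command], env=clean_env)

    call_bash("echo hello from a scrubbed environment")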


A shell certainly is not supposed to evaluate random environment variables.


Exactly. It's impossible for bash to know the difference between wanted variables and possibly-malicious ones. According to Postel's Law, it's Apache that should be more careful with its output, not bash that should be suspicious of its input.



