
General sanity aside, the whole exploit hinges on the fact that they used string parsing to check for the prefix "http". This wouldn't have been exploitable if they had used a proper URL library.
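Something like this would have done it in C# with System.Uri (a sketch; IsAllowedDownloadUrl is a made-up name, not from the decompiled code):

    // Parse with a real URL library, then allowlist the scheme, instead of
    // doing string-prefix checks on the raw location string.
    static bool IsAllowedDownloadUrl(string location)
    {
        return Uri.TryCreate(location, UriKind.Absolute, out Uri uri)
            && uri.Scheme == Uri.UriSchemeHttps;
    }

Uri.TryCreate rejects malformed input outright, and the scheme comparison is an allowlist, so a leading space or an "http"-but-not-"https" prefix can't slip through.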



Honestly, they could have just used a whitelist instead of a blacklist.

One could just as easily fuck up usage of a library. Common sense is required.

Attempting to ban "http" as a method of ensuring "https" is obviously less ideal than ensuring "https"... by checking for "https".
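In code, that's a one-line allowlist (a sketch, reusing the file.Location field from the decompiled snippet quoted further down the thread):

    // Require https explicitly instead of banning http. Trim() matters:
    // the bypass here was a single leading space.
    bool allowed = file.Location.Trim()
        .StartsWith("https://", StringComparison.OrdinalIgnoreCase);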


URL parsers also have bugs (or at least don't all agree on one parsing if you rely on more than one parser). Just take a look at https://i.blackhat.com/us-18/Wed-August-8/us-18-Orange-Tsai-... for some fun examples.


I watched that talk a while ago. It convinced me of one thing: you should only have one URL parser in a project, and you shouldn't pass a URL to anything that may parse it differently.

It also made it clear that trying to use a URL to restrict stuff is a bad idea. Like, the Dell updater could instead only load signed payloads, which means an attacker would have to get Dell's private signing key.
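A sketch of what that gating could look like (the names and key handling here are invented for illustration; this isn't Dell's actual mechanism):

    using System.Security.Cryptography;

    // Only accept a downloaded payload if it verifies against a public key
    // pinned inside the updater; then the URL it came from barely matters.
    static bool IsTrustedPayload(byte[] payload, byte[] signature, RSA pinnedPublicKey)
    {
        return pinnedPublicKey.VerifyData(payload, signature,
            HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
    }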


The sane thing would have been to not use an HTTP server at all. This part is pure laziness. It is trivial to communicate with a Windows service locally through named pipes.
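For reference, a minimal named-pipe server in C# (the pipe name and the one-byte protocol are made up for illustration):

    using System.IO.Pipes;

    // Local IPC without an HTTP server or a listening TCP socket; access can
    // be locked down further with a PipeSecurity ACL.
    using (var server = new NamedPipeServerStream("SupportAssistPipe", PipeDirection.InOut))
    {
        server.WaitForConnection();            // block until a client connects
        int request = server.ReadByte();       // toy protocol: one request byte...
        server.WriteByte((byte)(request + 1)); // ...and one response byte
    }

The client side is just new NamedPipeClientStream(".", "SupportAssistPipe", PipeDirection.InOut) followed by Connect().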


I think you're referring to the SupportAssist Client being an HTTP server - while it is weird that they exposed all those other routes, the driver install route allows for drivers to be installed from a website (which a named pipe would not).

I wouldn't characterize it as "pure laziness" - more of a questionable feature.


The whole process starts with the installation of software to identify the computer. The vulnerable service is part of that. The list of drivers could just as well be shown by a local GUI that is started by the browser through a URL handler registered in the system. There would be no need for any of this frankly stupid Rube Goldberg website/webserver interaction. It would be one less TCP server socket in the system.
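Registering such a handler is a single registry key; a sketch (the "dellsupport" scheme and the GUI path are invented for illustration):

    using Microsoft.Win32;

    // After this, a link like dellsupport:list-drivers launches the local GUI
    // directly - no localhost web server involved.
    using (RegistryKey key = Registry.CurrentUser.CreateSubKey(@"Software\Classes\dellsupport"))
    {
        key.SetValue("", "URL:Dell Support Protocol");
        key.SetValue("URL Protocol", "");
        using (RegistryKey cmd = key.CreateSubKey(@"shell\open\command"))
            cmd.SetValue("", "\"C:\\Program Files\\Dell\\SupportGui.exe\" \"%1\"");
    }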


Do you think that a proper URL library would have protected against a MITM'd DNS attack?


If that library had allowed them to enforce connecting via HTTPS, then yes.


It doesn't even need a library. A simple regex would have prevented this.


A naive regex would just as easily have exactly the same issue, e.g. "^http:"


That wouldn't have the same issue since the space at the beginning would fail that regex.


I believe that's the parent's point. "https:" would be OK and "http:" would be rejected, but " http:" (with a leading space) would also be _accepted_. They looked for "http://" at the start of the string, instead of requiring "https://".

Replacing:

    bool flag2 = file.Location.ToLower().StartsWith("http://");
with:

    bool flag2 = Regex.IsMatch(file.Location.ToLower(), "^http:");
doesn't help: it's the same prefix blocklist, just written as a regex. You have to actually require "https://", not merely check what the string starts with.
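To make the failure concrete (the attacker URL is a made-up example):

    using System.Text.RegularExpressions;

    string location = " http://attacker.example/payload";  // note the leading space

    bool a = location.ToLower().StartsWith("http://");     // false: prefix check misses the space
    bool b = Regex.IsMatch(location.ToLower(), "^http:");  // false: ^ anchors at the space too
    // Both blocklist checks conclude "not plain http", and the download code
    // downstream tolerates the whitespace (.NET's Uri trims it), so the fetch
    // still goes out over unencrypted HTTP.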



