Snowden Used Low-Cost Tool to Best N.S.A (nytimes.com)
95 points by ColinWright on Feb 8, 2014 | 48 comments



To quote Julian Assange: "The more secretive or unjust an organization is, the more leaks induce fear and paranoia in its leadership and planning coterie. This must result in minimization of efficient internal communications mechanisms (an increase in cognitive "secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption. Hence in a world where leaking is easy, secretive or unjust systems are nonlinearly hit relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance."


The above quote is from Assange's "Governments as Conspiracies" essays: http://cryptome.org/0002/ja-conspiracies.pdf

They're interesting reads and pretty short.


Didn't WikiLeaks require staffers to sign an NDA?

http://www.geek.com/news/wikileaks-hypocritical-confidential...


Needing privacy/confidentiality doesn't mean you are necessarily evil...yet. It's almost certainly a sign that one is trying to concentrate power or keep it to oneself.


Gosh I love this statement from Snowden:

"Through his lawyer at the American Civil Liberties Union, Mr. Snowden did not specifically address the government’s theory of how he obtained the files, saying in a statement: “It’s ironic that officials are giving classified information to journalists in an effort to discredit me for giving classified information to journalists. The difference is that I did so to inform the public about the government’s actions, and they’re doing so to misinform the public about mine.”"

It's pretty on point. I'm going to believe he wrote it himself, as it's just the right amount of factual, snarky prose that I'd expect from a sysadmin!


Contrary to the current HN title, the article points out:

Evidence presented during Private Manning’s court-martial for his role as the source for large archives of military and diplomatic files given to WikiLeaks revealed that he had used a program called “wget” to download the batches of files. That program automates the retrieval of large numbers of files, but it is considered less powerful than the tool Mr. Snowden used.

So the tool wasn't wget. curl, perhaps?


Having done this type of work before for a legitimate purpose, it is almost certainly a python or perl script with a nice library in front of it that makes it easy to follow links.

wget is too brittle, not extensible enough, and not as maintainable as a nice python script.
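For illustration, a minimal sketch of that kind of script in plain standard-library Python. The start URL and the 100-page cap are made up, and a real job would also need authentication handling and throttling:

    # Minimal link-following downloader; start URL and limit are hypothetical.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkParser(HTMLParser):
        """Collects the href of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start, limit=100):
        host = urlparse(start).netloc          # stay on the starting host
        seen, queue = set(), [start]
        while queue and len(seen) < limit:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                page = urlopen(url).read().decode("utf-8", "replace")
            except OSError:
                continue                       # skip pages that fail to fetch
            parser = LinkParser()
            parser.feed(page)
            for link in parser.links:
                target = urljoin(url, link)
                if urlparse(target).netloc == host:
                    queue.append(target)
        return seen

    for url in sorted(crawl("http://example.com/")):
        print(url)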


I believe Manning actually used Windows batch scripting to automate wget, or so the government alleged from forensic evidence presented at trial. (I observed a couple of days of the trial.)

Manning didn't have Snowden's tech skills, though; she wasn't necessarily doing things in the most effective or elegant ways, but it worked.


Probably; but it could be something like lftp. Its name belies its capabilities.

Or maybe Kermit? Half-smiley; only if he's a masochist. http://www.kermitproject.org/ckscripts.html


Wget is also single-threaded, which makes it a slow way to download lots of pages.


that's what xargs -n x is for


Can you elaborate?
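xargs reads a list of URLs from stdin and, with -n and -P, hands them out in batches to several wget processes running at once, e.g. "xargs -n 10 -P 4 wget < urls.txt". For comparison, a rough Python equivalent of the same parallel-download pattern (urls.txt is a hypothetical input file):

    # Rough Python equivalent of "xargs -P" parallelism for downloads.
    import os
    from concurrent.futures import ThreadPoolExecutor
    from urllib.parse import urlparse
    from urllib.request import urlretrieve

    def fetch(url):
        # Save under the URL's basename, roughly wget's default behavior.
        name = os.path.basename(urlparse(url).path) or "index.html"
        urlretrieve(url, name)

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    with ThreadPoolExecutor(max_workers=4) as pool:   # 4 downloads at a time
        list(pool.map(fetch, urls))                   # iterate to surface errors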


The other day I had the task of batch-downloading product pictures from a website. Every picture had a session ID in the URI, so I couldn't do a simple wget of the images. I wrote a simple Python script that generated a shell script with a lot of "wget -E -H -k -p" followed by "sleep 30", and ran it through a cloud server for a couple of days. After that, some simple scripts for renaming the pictures, some regular expressions here and there, and voila: 250k perfectly named pictures for my product catalog. (It's for an intranet, so I guess I won't have copyright problems.)
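A minimal sketch of that generate-a-shell-script trick; the wget flags and the 30-second sleep are from the comment above, the file names are made up:

    # Emit a throttled shell script of wget calls, one per picture URL.
    with open("picture_urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    with open("fetch_pictures.sh", "w") as out:
        out.write("#!/bin/sh\n")
        for url in urls:
            out.write(f"wget -E -H -k -p '{url}'\n")
            out.write("sleep 30\n")   # pause so the server isn't hammered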


FYI, you have exactly the same copyright issues on an intranet. You're just less likely to get caught, I guess.


curl is just a library with a slim command-line interface. It can't scrape pages by itself. Perhaps you're thinking of curlmirror? Even then, I doubt it can be considered more powerful than a good wget configuration.


Nutch/Solr could provide a way to do a crawl, refine parameters, and then feed into a tool to download the actual resources.


[I can imagine some bureaucrat somewhere is saying]

Woah!

You mean he could just automatically download without even using a browser?? And he didn't need an expensive tool to do it? On a normal computer? People like that are dangerous, maybe we should be licensing that kind of knowledge...


> maybe we should be licensing that kind of knowledge...

Or regulate it until it's illegal.


I'm sure Stephen Heymann and his ilk are working on a theory of how wget, curl, and some spiders are "burglarious tools."


In the Bradley Manning case, the prosecution laid a computer fraud charge on top of the espionage and other charges, based on Manning's use of wget.† They argued that, although Manning was authorized to access the files he disclosed, the fact that he used wget to download them constituted computer fraud, since he was not authorized to have wget on his computer.

http://www.washingtonpost.com/blogs/worldviews/wp/2013/07/30...


"These people of the deepweb are crazy!"


The most newsworthy aspect of this post is that the NSA didn't compartmentalize their information. This spidering should have been impossible at the network level, even for an admin.


I used to work on software purchased by the DoD, and the amount of access control, logging, auditing, and other related security features required to fulfill the requirements was astounding.

I totally agree with you that the apparent openness of the intelligence community's information systems is the most newsworthy aspect of this. I don't think journalists are really knowledgeable at all about how the intelligence community is supposed to handle compartmentalization. They think the NSA handles things just like they do at their paper.

(The sad part is that apparently, the journos are right.)

The problem of these loopholes allowing classified information to leak was exposed with Bradley Manning. Yet even in that case, I didn't hear any calls from the press or public to see heads further up the chain roll.

But, that should have been enough to highlight the holes in our networks. Apparently it wasn't, and exactly the same thing has happened again.

I want to see heads roll right up the chain for this dereliction of duty with respect to protecting our national secrets.

By the way, none of the above is intended to imply that I think what the NSA has been doing is legitimate. I am glad that Snowden came public with this information. However, how do we know that other, more critical secrets haven't been sold directly to the Chinese or North Koreans? We only found out about the Snowden leaks because he made them public.


It's worth noting that there was a lot of noise about too much siloing in the intelligence community having allowed 9/11 to happen. Arguably, this kind of openness is a consequence of that.


This was my first thought as well, besides the fact that the NSA wants to make it appear that Snowden was a "hacker" or a "mastermind" of some kind.

Crawlers tend to die when challenged with authentication. Right... But then they make it worse by saying that if he hadn't used this supposed "crawler", their ultra-fancy security would have detected the intrusion.

Really? Really.

I call bullshit. He had access to all these servers; all he had to do was "net use" some Windows shares and fill up his thumb drive. Probably took a total of 45 minutes.

I feel bad for them. (wha?) No I don't!
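For what it's worth, the "just copy it" route really is about this short. A hedged sketch, with made-up share and drive paths:

    # Once the share is reachable (e.g. after "net use"), bulk copy is one call.
    import shutil

    shutil.copytree(r"\\fileserver\secret-docs", r"E:\dump")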


SELinux, MAC, RBAC, SCIF, Trusted Solaris, Cisco AON, and... xcopy??


Agency officials insist that if Mr. Snowden had been working from N.S.A. headquarters at Fort Meade, Md., which was equipped with monitors designed to detect when a huge volume of data was being accessed and downloaded, he almost certainly would have been caught.

This, to me, is the most laughable and amazing part of the article. How did they think they had any security at all if this was the case?

"He would have been caught, if he had done it right in front of us!"


They just want to point out that the majority of NSA employees and contractors can't leak documents the same way Snowden did.


Actually, he is claiming they cannot do this. How, exactly, would we verify this claim? I'm sure he would have made the claim that Snowden could not have done something like this before Snowden did it.


The title is wrong. FTA, Bradley Manning used "wget" and Snowden used a site scraper (maybe httrack?).

What's striking is how easy it was for him to get access to all these documents. It probably means that most of what we hear in the news is already old news to other big governments and criminal organizations.


The media appears to be fascinated with wget for some reason. Perhaps it's the allure of command-line utilities in a society where such interfaces are seen by the vast majority of people as arcane, or stigmatized as ones used by computer criminals.

The Post did an article on Manning and wget back in July 2013: http://www.washingtonpost.com/blogs/worldviews/wp/2013/07/30...


What an annoying website. I can't pinch-zoom. When I double-tap, it (after a pause) changes the font size. JavaScript is becoming the new Flash: an enabler of annoying user experiences.


An amazing new level of awful. Double-tap cycles between three font sizes. The useless hovering nav-bar won't go away. There's some JavaScript trying to make page down do the right thing despite said nav-bar.

Look, NYT: I want to read some text, you want to put some ads next to it. There has to be some solution that is less bad for both of us.


Every time the media gets technical details wrong, I wonder: aren't they supposed to check the facts and ask someone? It's like hearing my mom talk about tech.


Most technical articles that I have some background understanding of are so wrong. It makes me wonder why I should trust all the articles on subjects that I don't know anything about.


The programmer says, man, they totally screw up tech stories, but I love their politics section.

The Congressional staffer says, wow, their coverage of politics is laughable, but I really enjoy their insightful business analysis.

The businessman says, sheesh, they can't get anything right about the world of business. I sure like their great tech coverage, though.

I'm quite sure you should deeply mistrust all media reporting on everything. I see no reason to think that tech reporting is somehow different in how wrong they get it.


That's called the Murray Gell-Mann Amnesia effect.

http://www.patheos.com/blogs/geneveith/2011/08/the-murray-ge...


The common theme I've found over time is that both news and Hollywood depictions of anything you're familiar with are grossly wrong in significant details.


Are you kidding? You talk like they are journalists instead of a bunch of lazy hacks.


Seems like the NYT is trying to create a dangerous, terrorist aura around tools that access the internet through any means but a "standard" browser.

And by referring to whatever it is as a "low-cost tool", they're probably just making up facts. It's far more likely that Snowden used one of the many no-cost tools that are used every day by people who work with and on computers.

This article is pretty much a no-op other than creating a dangerous mystique around people who access the internet through obscure tools that don't have an icon on the Mac or Windows stock desktop.


It's not the NYT trying to do that, it's the government. (The NYT may be enabling them, though. Yeah, welcome to US journalism.)

In Pvt. Manning's trial, the government's case was that Manning's wget use was a CFAA violation. Manning was convicted of a CFAA violation.

And yeah, at the Manning trial, the government lawyers repeatedly and explicitly brought up the fact that wget was an unusual tool that doesn't have an icon on the desktop, trying to make it seem sinister.

And yeah, it is frightening.

(I sat in on several days of the trial as an observer).


Manning used wget, Snowden used a basic web crawler; the headline is wrong.


I have an obvious question: if the NSA cannot secure its systems from an outside contractor, what indicates that it can fulfill its duty to secure US computer systems outside its own offices?

Seems to me we'd be better off hiring Google for such a job rather than the NSA; we might even get better ethics as a side benefit.


At Manning's trial, the government alleged that use of wget for scraping was a CFAA violation. Manning was convicted of the CFAA violation.

https://www.eff.org/deeplinks/2013/07/manning-verdict-and-ha...


"Once you are inside the assumption is that you are supposed to be there"

Almost exactly the same words used by Al Pacino in the movie The Recruit, where Colin Farrell plays an agent who steals information from the CIA.


Keep a copy of this article so next time your website gets defaced you can serve it as an excuse: "but even the NSA can't protect its stuff" :o


Pretty good article detailing the leak.

Goes to show that a lot of corporate security is so focused on keeping outsiders out that it leaves a wide gaping hole for insiders.


maybe nutch?



