FBI arrests author of NanoCore after it was pirated and abused by hackers (thedailybeast.com)
418 points by djug on March 31, 2017 | 291 comments



I can only hope they actually find a smoking gun implicating Taylor as a true conspirator in this case. Because the picture painted by the article puts a whole new meaning on 'chilling effect'.

Did he pick the wrong place to advertise his code? HackForum could just as easily have been Hacker News. According to the article, Taylor actively worked to defend against malicious use of his software: deactivating accounts he found were using the software to launch attacks, and eventually removing functionality like password scraping and keylogging, which ultimately proved too alluring to black hats.

The tool has obvious non-infringing use. That it can also be used maliciously cannot be a factor. If NanoCore is a criminal conspiracy, I'd hate to think what the FBI thinks of Metasploit or Tor.

The article puts forward a good theory for how the FBI might have found themselves in this position. They are used to barging into these guys' houses, crawling through all their equipment, and finding actual evidence of collusion with black hats. They are used to pressing these guys to turn state's witness, and in the past it's worked out great, as when a trusted malware provider landed them 100 convictions. I am shocked, shocked let me tell you, that the FBI would, upon not finding any real evidence of a conspiracy, press on with charges against a sole developer with $60,000 to their name.

Taylor Huddleston might not exactly be Aaron Swartz, but if the truth is anything like how The Daily Beast is telling it, Taylor is going to need a lot of help and a lot of support to get through this, and I hope he gets it.


You say that the tool has obvious non-infringing use and the article claims that security experts who have examined NanoCore say there’s nothing in the code to disprove Huddleston’s claim that he intended it for lawful use.

I looked at a youtube video of NanoCore [1] and it's immediately obvious that all of the above is bullshit. This is just a modern version of Sub7. [2]

Some features that NanoCore offers:

* Disable webcam indicator light

* Lock computer with a password of your choosing and show a message on the computer. The youtuber says it's for ransom.

* Swap mouse button functions

* Open CD tray

* Keylogger

* Extract passwords of various applications

* Send SYN floods from all your controlled computers

What exactly is the legitimate use of disabling the webcam indicator remotely? Combine this with the fact that NanoCore was originally launched on HackForums and I'd say this is a slam dunk case of a tool being purpose built for illegal activity.

Now whether someone should be held accountable for building such tools without using it themselves is an interesting question. However please don't try to act like this tool was built for anything other than malicious activity.

--

[1] https://www.youtube.com/watch?v=J1uzu6hzSQQ

[2] https://en.wikipedia.org/wiki/Sub7


>>Disable webcam indicator light

Schools and corporations do this all the time for theft reaction (taking a photo of the thief without them knowing); it is a feature they want. Some schools have gotten in trouble for turning it on and catching children in their rooms.

>Lock computer with a password of your choosing and show a message on the computer. The youtuber says it's for ransom.

Again, a legitimate theft-reaction feature.

> Swap mouse button functions ... Open CD tray

Is that really malicious? Enough for jail time?

>* Keylogger

Plenty of corporations have keyloggers on their systems; some even go as far as 24/7 keylogging and screen recording while the system is on.

>>* Send SYN floods from all your controlled computers

That one you may have a case for... the rest all have legitimate purposes, in use today by enterprises worldwide.


Even the SYN flood could be used for testing DDoS resistance against an internal server for benchmarks. There have been far more capable technologies where malicious intent rests entirely with the end user.

The reference to blaming a gun manufacturer for a crime is spot on. Especially when the people abusing it are pirating the software, you can't even cast the author as their arms dealer; they stole the product and are using it with malicious intent.


It is more about the total package. nmap and ssh could be used to build a distributed SYN flood tool quite easily. But when you combine all of these features into one tool it shapes a picture of intent that gets harder and harder to argue about.

I have great concern as a person who has built and released software whose only real purpose is to perform MiTM on network traffic. On the other hand, my software isn't popular with criminals and I break software professionally. It would take a lot of effort to package most infosec and computer tools into easy to use hacking tools.

We tread a difficult line here, but at some point there is no charitable interpretation for a software package. I think at the end of the day I still lean in this guy's favor, but he makes it really hard. It was for-profit software, and I bet if we had the whole backstory of evidence it would become even more difficult to defend the author. Intent matters, even with software.


What that sounds like to me is that you would consider dozens of individual, potentially malicious packages to be benign, but brought under one umbrella they become malicious?

Every feature I've read that is included in that software suite has a good use case with zero malicious intent, and oftentimes can be very useful to white hat hackers, system administrators, and security analysts alike. I still don't believe it is the fault of the author that black hat hackers are pirating and abusing a useful software suite, especially when it isn't being advertised exclusively towards them and the author has in many ways attempted to mitigate or limit harmful uses and users.

Like a gun manufacturer who offers weapons as a believer in home defense and the right to bear arms, only to have criminals steal merchandise and use it to rob a bank. Or the guy who invented dynamite, which has great uses such as tunneling through mountains, only to have it used for derailing and looting trains. You can kill a man with a pencil; that doesn't mean a 20-pack of pencils was produced with malicious intent. Dangerous use cases don't necessarily mean that is their purpose.

I agree it is a difficult line to tread, and, in my opinion, it really boils down to his involvement in the criminal activity itself.


I liken this to weaponizing dynamite. It is a step beyond a simple tool. But still just a tool. The criminal activity matters. Also, this software was marketed toward the black hat community based on other threads here and my own understanding of how this software got its popularity.


You don't need a remote access tool to test DDoS. That would be a silly use case for something better accomplished with ssh or a remote desktop tool.


Can you name a major corporation that does "24/7 keylogging"?


I don't know about major corporations, but smaller businesses or government institutions (such as my old high school) use stuff like this.


I hope high schools like that keep getting sued.

Also really glad I'm not in high school anymore.

http://www.pcmag.com/article2/0,2817,2386599,00.asp


Can you point out where it's described in NanoCore as "24/7 keylogging" as opposed to a feature that can be enabled when needed (in the case of theft or suspected misdeeds, for instance)? If not, you're building a strawman.


It's not "building a strawman," it's a direct response to a claim by the parent. It's open to some amount of interpretation how much of a bearing it has on the broader discussion, but if the claim is false it absolutely deserves push-back - especially insofar as these understandings help determine norms.


I know of a call center that used to do it; the keylogs were stored along with screen-capture and audio-recording data.

Basically everything the employee did while interacting with the customer was recorded.


Some of the tax prep and strict financial institutions do.


I've done work for gigantic investment banks, hedge funds, two of the world's largest retail banks, several insurance companies, and three major trading exchanges. None of them keylog. Can you give me a more specific example of "strict financial institution" that does keylog all its employees?

At every F-500 company I've ever been posted to, the logs produced by a keylogger would be considered a far greater threat than anything the keylogger itself might detect. I can't imagine the regime you'd have to come up with to protect those logs.


It would be a nightmare having to classify this flood of data, store it, manage its lifecycle, identify (or de-identify) it, understand its risk properties from legal, privacy, and insurance perspectives, manage its domicile(s)... Hard to imagine what benefit would outweigh all the cost and risk.


Bloomberg.


Comcast.


> Send SYN floods from all your controlled computers

Seems potentially useful for stress testing. Especially if you're trying to make your service more resistant to DDoS attacks.


This is about as credible as saying that the Michelangelo virus was really intended as a remote wipe tool for IT departments.


There's a long tradition there; wasn't BO2K non-ironically marketed as a system admin tool?


No, it was ironically marketed as a system administration tool, for exactly the subtextual reason this story suggests one might.


Thanks for posting the video, it's a nice demo of the tool. This is actually a really awesome program I could see myself using to control and monitor my own machines.

Yes, this clearly has the kitchen sink of functionality thrown in. In many ways this looks like a platform for experimentation, with the developer clearly building this as a labor of love and learning as much as, if not more than, a commercial enterprise.

> What exactly is the legitimate use of disabling the webcam indicator remotely?

In a context menu labeled 'Swiss Army Knife' there's an option to enable/disable the webcam light. There are also options like making the computer speaker beep. By the way, most video recording software provides options to toggle the webcam light in advanced menus.

Under an option called 'NanoStress' you can send several traffic patterns. Yes, it even has a bit of iPerf built in.

Frankly this is a really neat tool with tons of useful applications. It's highly distressing that this could get you raided and charged in Federal court with conspiracy.


You can see yourself installing a RAT with a keylogger, webcam light disabler, and SYN flood feature to manage your own computers?


Certainly, this program has a number of features which I needed on a daily basis in my WiFi testing lab. We had hundreds of headless machines running in isolation chambers which needed automated tools for remote controlling all aspects of the system.

Back in the day, we programmed our own agent to do things like provide remote program execution, file system access, NDIS/OLE/DCOM control, traffic generation, packet capture, system profiling, key-press and mouse-click macros for UI automation, etc. We had many of the same options for automating how the agent was installed, such as customizing the build for automatic deployment across various environments. We had automated PXE combined with a Ghost program where we could snapshot and deploy images to the machines straight from a TCL API. We had ways to throw up screens on the UI to indicate tests were in progress and lock the machine against interactive use.

About 15 years ago I actually spent several man-years building much of the functionality which is now contained within NanoCore. And while we didn't provide a SYN flood specifically, we wrote wrappers for iperf to be able to load the executable onto the machine, and a TCL API around running iperf in server or client mode and capturing and parsing the output. We also wrote our own L2 traffic generator which trivially could have been used to generate SYN floods, although we were particularly interested in generating "pure" traffic patterns to find the synthetic maximum possible throughput, as well as ideally sized packets for WiFi range and ACI testing.
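For what it's worth, the shape of that kind of iperf wrapper is pretty mundane. Here's a minimal Python sketch of the same idea, purely illustrative and not the original TCL tooling (the host name and the output-parsing regex are assumptions): run classic iperf in client or server mode via a subprocess and pull the reported bandwidth out of its summary line.

    # Illustrative sketch only -- not the Azimuth code. Assumes classic iperf
    # (iperf2) is on PATH; host names are placeholders.
    import re
    import subprocess

    def run_iperf_client(server_host, seconds=10):
        """Run iperf in client mode; return the reported bandwidth in Mbits/sec."""
        out = subprocess.run(
            ["iperf", "-c", server_host, "-t", str(seconds)],
            capture_output=True, text=True, check=True,
        ).stdout
        # iperf prints a summary line such as:
        #   [  3]  0.0-10.0 sec   113 MBytes  94.6 Mbits/sec
        match = re.search(r"([\d.]+)\s+Mbits/sec", out)
        if not match:
            raise RuntimeError("could not parse iperf output:\n" + out)
        return float(match.group(1))

    def run_iperf_server():
        """Start iperf in server mode in the background; the caller terminates it."""
        return subprocess.Popen(["iperf", "-s"])

    if __name__ == "__main__":
        print(run_iperf_client("testbed-node-01"))  # hypothetical lab host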

The Azimuth WSC -- as it was called -- met every definition of a modern-day "RAT", except that of course its official purpose wasn't malware.


FWIW, Apple's Remote Desktop tool does almost everything this guy's RAT does.

Except: You can't disable webcam lights with it. And you can't SYN flood (directly) with it. It is trivial to run a few shell commands and install tools that let you SYN flood; why would a RAT include that by default? I think it will come down to a couple of the small, hard-to-justify features, coupled with the overall packaging and history of the software, that really spell out a story of intent from the author that lands him in jail.

Intent, and the story that gets told in court, really matters. Plus we don't have the totality of the evidence.


You should think about writing up a statement and sending it to his legal defense.


Why not? It could be used to see if you have been hacked and something/someone else is using your machine, no? A different way to monitor the logs of a system.


FWIW, I specifically wanted to disable the webcam light on my own computer and was both frustrated and relieved to discover it was - at least - difficult.


Are the features selectively installed, installed opt-in, or downloaded when configured, as with most modern software?


He can see himself claiming that to make a point, anyway.


One very simple use case that would apply to almost any sort of hack, no matter how bad, is to lawfully use it to show someone that it can be done. Say a teacher using this at school on a sample laptop as a PSA to let students know the danger of leaving actively connected webcams in their room and not trusting the light indicator on them to show when it is working. Yeah, you could just tell the students, but sometimes showing them the hack in action works far better to convince them.


Educational use is definitely interesting. That's largely how I've become an expert at security myself, because there is so much security related information & resources available under the disclaimer of "educational purposes only".

However I do think that there are some interesting corner cases. Looking at extreme cases, what about selling nuclear bombs for educational purposes? There are certainly scientific tests that could be done with the bomb, and humanity would be better off for having done them. However I think that selling nuclear bombs without restrictions, or launching the sale campaign in the middle of Raqqa, would lead to undesirable results.

This leads me to believe that we as a global society certainly aren't ready for every tool to be available unrestricted for educational purposes. What's more, I don't think we're even ready for every tutorial to be available unrestricted, because the cross-section of people who can follow the tutorial and also want to end civilization as we know it is still too large.

I also don't like censorship or the idea of hindering scientific progress. I would definitely like to progress towards a world where nuclear bombs can be sold at WalMart and nobody would cause problems with them.

We're not there yet and I'm not sure how exactly we can even get there. Until we do, as much as I hate to say it, even educational purposes will have to be sacrificed for the greater good. Where exactly we draw the line is a tough nut to crack and I personally don't hold a strong opinion of a specific line yet.


I think there's a lot of room to draw the line between nuclear bombs and a RAT.


Looking at Stuxnet [1], the distance between those two is smaller than most think, and the distance is only decreasing. More important is the takeaway that it's getting easier and easier for a misguided teenager to cause industrial-scale harm. So the classic problem of a punk kid breaking a window gets amplified.

Of course there are other options besides banning software to improve the situation. Among them is increasing awareness of the possible threats, and that software like NanoCore makes mischief easier to execute.

--

[1] https://en.wikipedia.org/wiki/Stuxnet


Stuxnet didn't kill hundreds of thousands of people. I think that while the distance has narrowed, perhaps it isn't as narrow as you think.


What a peculiar thing to argue, but in the same spirit most nukes haven't killed a single person. The ones that did had really good delivery mechanisms, which don't come prepackaged with the nuke.


Are you really arguing that nuclear weapons and stuxnet are similar because most nukes haven't been used on people? Why?

I get that you're saying Stuxnet is an example of programming having real-world, physical effects, but this is a very strange argument, because a lot of things that we have no moral or legal issue with anyone owning have the potential for outsized physical effect. Nuclear weapons have been used to kill hundreds of thousands of people, so, as for the line we spoke about earlier, they belong on the 'not for everyone and ideally not for anyone' side of it, along with chemical and biological weapons.

A RAT doesn't quite seem to go that far.


> Are you really arguing that nuclear weapons and stuxnet are similar because most nukes haven't been used on people?

Definitely not arguing that they're similar. More so about the difference decreasing at a greater rate than people seem to realize.

Taking a step back and talking in more general terms: nukes are dangerous because they allow one person to do harm to masses. The same statement is increasingly true in the software world. I feel like this isn't understood well enough (or is ignored?) by most people.

As an example, we're putting more and more software into cars, internet connected software even. If this software follows the security practices of almost any other software, then it won't take long until malicious users will move from opening CD trays to car doors.


Usually educational-use tools leave a lot as an "exercise" for the reader. They don't come with features that make malicious use easy. And you usually frame the intent of the software around that, not as a for-pay tool. So that argument does have validity... in an open source project or white paper describing some technique.

There is a reason researchers often leave some details of an exploitable-vulnerability write-up as an "exercise" for the reader.


And you think that teacher would go buy it on a forum devoted to hacking?


I've only read the article linked here and had never heard of this tool before today so grain of salt and all that.

But the article did mention he was hoping "anxious parents" would use it to monitor their kids activity.

As a parent of two teenagers and a twenty-something I could see myself wanting to turn on a web cam without them knowing it. I hate to say it but my kids get up to some crazy stuff on the web and have defeated a lot of my efforts to monitor/block that activity. In the end I've had better luck just sitting down and talking openly with them about it. But there was a time where I was frustrated enough that I might have sunk to plain old spying on them...

Anyway, that's just one use I could see for disabling the cam indicator.

And the article does mention youtube videos and how frustrated he was to see what people were doing with the tool. He even added a "feature" where the user's license number was displayed in the program so if he could see it in such videos he could disable their copy. Did you see the actual author in any youtube videos?


I'm also a parent, and I am of the opinion that being a teenager doesn't mean you should have your computer hacked to be monitored like this. There's all kinds of legitimate admin tools you can use to monitor their activity. If they deliberately circumvent them, physically take the computer away. Turning on a webcam surreptitiously to monitor them is not cool.


Or you could, you know, just teach them the consequences. Eg, talking to catfish, child porn charges, etc.


I think I said as much in my comment.


>As a parent of two teenagers and a twenty-something I could see myself wanting to turn on a web cam without them knowing it. I hate to say it but my kids get up to some crazy stuff on the web and have defeated a lot of my efforts to monitor/block that activity.

What the fuck. It's well past time to give them real responsibility and freedom. This sort of behavior can and will cause long-term damage, pain, and resentment in your own children. And for what, to selfishly assuage your own anxiety?

IMO, riding roughshod over a basic fundamental need for privacy is child abuse, and it ought to be more widely considered as such.


How many children do you have? And how many times have they caused threatening letters from ISPs to be sent to your house for downloading copyrighted material? How many times have you intercepted drugs in your mailbox that were bought off the internet? How many times have you had to worry that one of these days their social activities were finally going to catch up with them and get them into real trouble?

For the record to all those that replied, I never actually spied on my children through a web cam hack. I said in my comment that I had been frustrated enough in my life that had I read this article then I may have considered it.

And, no, children living in my home using my internet connection are not entitled to the same level of privacy that you and I are. Just as they are not entitled to drink, drive, vote, or join the military. They need boundaries and guidance. To the extent that they are doing well in school and socially and leave me little room to worry, I'd guess they enjoy more freedom than most of their friends. When they abuse that trust the reins get drawn in, and you can darn well bet I'm going to do everything I can to make sure they don't get into further trouble.

And lest you draw any conclusions about the twenty-something I mentioned, I was speaking about when she was a teenager and living at home; I didn't try to block or monitor her habits once she became a well-rounded adult, one who, now that she's also a parent, looks back on her own behaviour a bit sheepishly.


I'm sorry, I definitely read more into your post than is there. I've found that abusers often use more legitimate overt goals as narrative cover for covert abusive techniques. "Protecting your children from things" is often a pretext for fulfilling the parent's need for control and dominance. Or for a more graphic example, people who handle their anger or frustration through physical abuse will claim that they spank their children for disciplinary reasons.


> What exactly is the legitimate use of disabling the webcam indicator remotely?

I want to do that on a Pi camera at a remote location so animals aren't disturbed, or the people aren't aware they're being filmed.

> ...a tool being purpose built for illegal activity.

Intent isn't determined by how others use something, or what a YouTube video says the intent is.


> I want to do that on a Pi camera at a remote location so animals aren't disturbed

Put tape over it.


Turning off the light seems easier and more thorough to me.


Just removing the light that has no practical purpose in that case seems more thorough yet.


Is SYN flooding also to prevent disturbing animals?


No, those are to win http://ipv6tree.bitnet.be/


>> "... a tool being purpose built for illegal activity."

Attempting to discern the author's design intent ex post facto strikes me as very difficult.

Also, this does not admit the possibility of a tool being designed for a nefarious purpose, but later turning out to be useful for beneficial purposes.

E.g. A rifle designed to kill people later turns out to be great for deer hunting (trivial example but you get the idea).


> * Disable webcam indicator light ... What exactly is the legitimate use of disabling the webcam indicator remotely?

Cheaper than opening the laptop frame and cutting/shorting the LED; it allows you to take pictures of people trying to log into your computer without them noticing. My company computer does this on every invalid login and makes me review them when I log in successfully.

They also make sure when I enter my password I'm not reading it from anything (like a phone or a text file), and when I use the 2FA card that there aren't any wires sticking out from it. I also imagine it takes random pictures, and I'm glad the light is disabled because it would probably annoy me to have it blink all the time.

> * Lock computer with a password of your choosing and show a message on the computer.

My company laptop does this if someone steals it, telling them that this computer is stolen and has embedded GPS and 3G in it so the owners know where it is.

> * Swap mouse button functions

Left-handed people. Sinister, I know.

> * Open CD tray

Before DVD and Blu-Ray made it common to fit everything on a single disc, many applications would copy files from multiple CD discs and would open the tray as a prompt to swap the disc at various stages of the installation process.
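Purely as an illustration of that old pattern, here's a tiny Python sketch of how an installer might do the disc-swap prompt on a modern Linux box (it assumes the stock 'eject' utility is on PATH; the disc labels are made up):

    import subprocess

    def prompt_disc_swap(next_disc_label):
        # Pop the tray open so the user knows it's time to change discs.
        subprocess.run(["eject"], check=True)
        input("Please insert '%s' and press Enter..." % next_disc_label)
        # Pull the tray back in (-t closes it, on drives that support it).
        subprocess.run(["eject", "-t"], check=True)

    for disc in ("Install Disc 1", "Install Disc 2"):
        prompt_disc_swap(disc)
        # ...copy this disc's files here...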

> * Keylogger

One of the environments I work in has strict audit policy: Everything is logged to make sure nobody is passing notes in some other way (chat, a text file on a shared server, a drafts folder, etc). Indeed, someone was using a keystroke-stuffing USB device to transfer files into the network and the keylogger was the tool that detected it.

> * Extract passwords of various applications

I use the Apple Keychain. It has an option to let me get my password. This is useful when I log in via the website, which saves a generated password, but then I log in via the app and the idiot programmer has made their own password prompt (instead of using the iCloud-enabled password API).

Sometimes I want to go the other way, and I'm lucky that most programmers are idiots and just wang cleartext passwords into a text file or a sqlite file, or trivially "encrypt" them[1], because otherwise I might not be able to recover my access.

[1]: https://www.unix-ag.uni-kl.de/~massar/bin/cisco-decode

> * Send SYN floods from all your controlled computers

I usually use hping to do this because I want to know that I can protect my applications and services from anyone who will spend less than they can make by knocking my services offline. I wish more programmers did this, but I'm frequently disappointed by bad engineering.
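For anyone curious what that kind of controlled test looks like, below is a rough sketch of the same idea using scapy rather than hping (the target address is a placeholder; this is only meant to be pointed at a host you own):

    # Rough SYN stress sketch with scapy (not hping); aim only at your own lab host.
    from scapy.all import IP, TCP, RandShort, send

    TARGET = "192.0.2.10"   # placeholder lab address
    PORT = 80

    # flags="S" builds a bare SYN; RandShort() varies the source port so each
    # packet looks like a fresh half-open connection attempt.
    send(IP(dst=TARGET) / TCP(sport=RandShort(), dport=PORT, flags="S"),
         loop=1, inter=0.001, verbose=False)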

Honestly, that someone is so willing to judge someone they don't even know as malicious, simply because of their own ignorance and lack of imagination, is what I find most disappointing of all.

> Combine [all of] this with the fact that NanoCore was originally launched on HackForums and I'd say this is a slam dunk case of a tool being purpose built for illegal activity.

It probably is, but if that happens I hope it will be overturned like some other slam dunk cases[2].

[2]: http://legal-dictionary.thefreedictionary.com/Jim+Crow+Laws

> Now whether someone should be held accountable for building such tools without using it themselves is an interesting question.

I've never pondered it before, and I feel like spending my time arguing about it could be better spent elsewhere, so in my view, it is the exact opposite of an interesting question.

> However please don't try to act like this tool was built for anything other than malicious activity.

Only if you don't try to act like this tool was built for malicious activity, because to be completely frank: You don't know this person, or this tool, or this space, and that the US government agrees with you isn't evidence that you're smart or right, nor will I respect your prejudices for it.


> so willing to judge someone they don't even know as malicious simply because of their own ignorance and lack of imagination

> You don't know this person, or this tool, or this space

I've been involved in the security space for 17 years now. I'm very well aware of how these tools and features are used in practice. That's the primary reason why these superficial excuses don't work on me. This isn't abstract theory for me; I'm talking from direct experience.


I've been programming for nearly forty years at this point, and consulting for more than the last twenty.

I've seen each and every one of these "superficial excuses" in companies with 1000 employees or more.

If you haven't seen them, then "being involved in the security space" isn't making you as experienced as you think you are.


Can you name any company with 1000 employees or more that has used NanoCore? Or are you talking about other tools? That's really the crux of the situation, the packaging & intent - not the individual features in a vacuum. I've seen plenty of people open CD trays and swap mouse buttons for left-handed use, but none of them use popular trojans like NanoCore or Sub7 to achieve these tasks remotely.


> are you talking about other tools? That's really the crux of the situation,

No, it's not the crux of the situation. You said:

> > > Now whether someone should be held accountable for building such tools without using it themselves is an interesting question.

So you're clearly talking about any such tool. If now you want to just talk about NanoCore, then you're moving the goalpost, but it's still not going to work:

> > > > I looked at a youtube video of NanoCore [1] and it's immediately obvious that all of the above is bullshit.

...because your source is a youtube clip, and not the admission of any personal or experiential knowledge of the tool; to wit, you list a number of features outlined in the youtube clip as specific evidence that the software was designed for illegal purposes only, to which I argue convincingly that those features are not evidence, because I have used those features in large companies.

Don't be a troll. You can be better than this.


I've never moved the goalpost; your interpretation may have changed, though. Regarding accountability, I'm talking about any tool which contains all those features [1] packaged together in high concentration. So NanoCore, Sub7, Zeus, etc. Beyond that I've talked specifically about NanoCore. You said the accountability question isn't interesting to you and commented plenty on the specific features. It increasingly felt to me that you're building a case for every feature separately, which is why I brought it explicitly back to the package. I've never argued for the features being inherently malicious in a vacuum, so arguing that with me seems like talking past each other more than anything else.

Regarding describing personal experience around software designed for illegal purposes, I don't feel like the benefits are worth it for me right now. So you'll have to live with parallel construction. [2]

Unrelated to NanoCore, just as a friendly suggestion, you should cut down on the ad hominem. In every reply you've made to me you've managed to slap on a personal attack. First you called me ignorant and having a lack of imagination. Next you call into question my experience. Now to top it off, you've moved from ad hominem to straight up name calling, accusing me of being a troll. [3] Tactics like these don't help me understand your arguments any better, and I would bet they don't help others reading either.

--

[1] I want to be even more clear in that when I say "all those features" I also mean the truckload of botnet controlling & deployment features that I didn't list in my comment but exist in NanoCore and other competing software.

[2] https://en.wikipedia.org/wiki/Parallel_construction

[3] Name calling being the lowest point in Paul Graham's excellent essay about disagreement. http://www.paulgraham.com/disagree.html


Does it matter that the CIA and FBI are building these tools all the time? It takes a few characters to blow away a drive on the Linux command line; should those characters be illegal because they're potentially dangerous?

Banning software you don't like is a slippery slope; researchers publish proofs of concept far worse all the time.


Is this any different than the tools CIA employees and contractors built which were then used to conduct espionage against lawful American companies?


"Do as we say, not as we do"


"Well, when the president does it, that means it is not illegal." --Nixon


Do you also think the Metasploit authors should have their homes raided, and criminal charges raised?


Also? I haven't been a proponent of any raids. I'm not even sure if in-the-mail charges are in order for creating these trojans without using them offensively. I'm talking about the intent and purpose of the application, not what should be done about it.


> What exactly is the legitimate use of disabling the webcam indicator remotely?

We want our webcam light to blink when there are connection issues, or transcoding issues during a conversation and we had to degrade QoS.

We use blinking to indicate degraded service, and multiple light colours to indicate level of service.


On first pass I was pretty alarmed and thought this was a scary precedent being set. Further down in the comments I believe it was likened to arresting a gun manufacturer for someone pulling an armed robbery, and that the major vendors are getting a pass for the same functionality.

However, I disagree. The functions listed above are not mainstream use cases for legitimate software. I also think of it as the same thing as arresting someone who hosts a child porn or Silk Road type of website. You may not be doing the crime, but you certainly are facilitating it in a big way.

What I also think, however, is that this is probably a misguided kid with aspirations that exceeded his business savvy. He probably could be mentored into using some of his skills for good, so I hope the FBI doesn't proceed to ruin his life.


> The functions listed above are not mainstream use cases for legitimate software.

Who decides what is "legitimate" software? Do you want to live in a world where you have to get the government's approval before writing code?

How could a feature be "mainstream" if it isn't included in software? Should we have arrested Steve Jobs for the Macintosh because GUI wasn't "mainstream" when it came out?


For that matter... if it's not legitimate, then why does the hardware have the ability to do it? Why does camera hardware allow the light to be disabled during recording? I mean, SYN is useful, and a SYN flood might be useful for systems testing... that said, there are other tools for that, and a RAT probably isn't the right place for it.

In any case, this is definitely a slippery slope, as there are "security" companies that provide software that does all of this and act as vendors to US, local, and other governments.


>Do you want to live in a world where you have to get the government's approval before writing code

I'm not saying it would be a good idea, but there is a very clear comparison here to building permits.

For example, you may want to remodel some part of your house and remove/replace some walls. To do this, you must get the government's approval.


It is simple: the law decides. Whether the law is sane or sensible is a different matter.


Which law provides an unambiguous definition of "legitimate software"?


"legitimate software" sounds a bit like "legitimate ideas"


Slippery slope.


It seems that functionality is provided by plugins, though? You can see in the menu that he has at least "MultiCore" and "Nanonana" installed. I think the default version of the program does not come with these; e.g., here is a forum thread about someone asking how to disable the camera light and getting pointed to a plugin:

https://webcache.googleusercontent.com/search?q=cache:RGNVD_...


Good point. The only dubious built-in feature is the "recover passwords" one; the others that are clearly black hat in nature, like "recover steam file", "open meatspin", "SYN flood", and "lock computer with password", are from plugins.


Why are indicator lights software controllable?


It varies by manufacturer/model. Some have a hardware/firmware-only solution that can't really be turned off from the OS. Other times it's controlled by the driver. Often there isn't an API for controlling it, but you can memory-patch the driver. As for why, I'd say it's just another case of security design by people who don't fully understand security. Similar to why toy bears are accessible from the internet. [1]

--

[1] https://arstechnica.com/security/2017/02/creepy-iot-teddy-be...


While I'm not going to sit here and pretend that this RAT was built for lawful activity, metasploit payloads don't provide any indication to the 'victims' that their system has been infected either.


Ergo we should immediately arrest all gunmakers because it's hard to think of another use for guns besides killing things or practicing the art of killing things, no?


Just because you don't have a legitimate use case doesn't mean someone else doesn't. Your mentality can be very dangerous with regards to eroding our cherished liberties.


Is there any legitimate use case for triggering a SYN flood?


Stress-testing your servers (although you could argue this could be done better with a non-remote tool).


I hope you understand they were third-party external plugins, not created by Huddleston.


Some of these features remind me of a "tool" back in the mid/late '90s.


I am inclined to agree with you, but man, it still scares me.


Please visualize and define various malicious activity.


> What exactly is the legitimate use of disabling the webcam indicator remotely?

Employer monitoring. Parental monitoring.


If you're shocked, it just means you haven't been keeping up with how the US treats its so-called citizens.



Aha good one.


It couldn't be "hacker news"... because Hacker News has nothing to do with unethical hacking, which is a primary topic of the place where he advertised and supported his product.

If the product was merely talked about there, used there, etc., that would be different. The fact that he actively engaged in that community as the author of this software demonstrates pretty clearly which of the "dual use" sides he was intending to cash in on...


>Did he pick the wrong place to advertise his code? HackForum could just as easily have been Hacker News.

Spoken like someone who hasn't spent a lot of time on HackForum. 99% of the content is super low quality and/or obviously criminal. Lots of money changes hands.


I guess this is why we have people like Satoshi, and now the MimbleWimble (https://www.youtube.com/watch?v=XiUGu48JTd0) team, all working under Potter-esque pseudonyms.


I imagine their argument would be that his claim that the tool was not for illegal use was just a subterfuge, given the forum. Isn't this kind of the same reason a head shop doesn't want you to even mention anything but tobacco?


It is a bit of a red flag that "handgun" was the analogy of choice. It says to me that there is unlikely to be a morally legitimate use, only a possibly legally legitimate use.


Most branches of moral philosophy and ethics consider self defense morally legitimate.


Sure. The main question I'm asking is, "why that particular analogy"? I mean, he could've even said hunting rifles (where legitimate use is more obvious). Why handguns?

IMO, it's a matter of explicit vs implicit functionality. The explicit function of a handgun is to do violence to other humans. Self-defense is only implicit/secondary.

So, to analogize this software to a handgun is to acknowledge that the most obvious use for the software is to hurt others, while "self-defense" is only secondary.


Because it's come up recently, and Congress voted it down and the courts dismissed it. It's established case law and congressional law that supports his case strongly.

This software is used for self-defense. If you install it on your laptop and someone steals it you can lock them out and take pictures of them, etc. (it is your property after all).


I generally think it's crazy to hold someone responsible for the software they wrote, even if it has no theoretical "good use". The person using the software should be responsible. That being said, it's even crazier if there is a legitimate use (network monitoring etc.), which seems to be the case here. Where exactly do you draw the line? If an attacker uses Windows or Linux... is that evil software? If they phish with some mail tool, is that evil software? Etc.


Totally agree. It's a freedom of speech issue. Just like you can't hold the Beatles responsible for the Manson murders, you can't hold someone expressing themselves in code responsible for malicious use. See: NRA lobby white papers.


It isn't a freedom of speech issue. Free speech doesn't cover using words to commit a crime. Saying "hand over the cash or I'll slit your throat" is still robbery.

The law handles this fairly well in theory. You are only responsible for aiding a crime if you acted to assist the crime with the required mental state. The required mental state for most crimes is recklessness.

So you can sell someone a gun but you can't sell someone a gun if they say "I'm going to murder my wife."

Doing something totally legal to assist a crime is a crime if you were acting intentionally to assist the crime. You can be charged for giving the Gettysburg Address if the point was to get people in one place to bomb it.

Here the FBI is wrong. As of now, there is no evidence he intended to assist in these crimes.

But if you created some software designed to hack a company you hated and then released it widely hoping that someone would use it? That is a crime.


Robbery is a physical action.

The words are not robbery.

What if they were uttered in a play? What if you're joking around with your friends? There are valid, legal uses for them.

In the absence of criminal actions such as robbery or assault following those words, a court would need to prove criminal intent.


Words can definitely be the actus reus of a robbery. You also need the mens rea.

There are very few actions that are always criminal. Sex with a minor is the only major one. Things like speeding if you count small fines.


The standard for when words are criminal is much narrower than that in the US. It isn't just any words implicated in crimes. Off the top of my head, there are quite narrow interpretations of libel, slander, and incitement to imminent lawless action. Even in cases such as insider trading, it's the acting that makes the crime, not the fact that someone said something.


Absolutely agree. I hope he has attempted to contact the EFF over this for counsel.

As I understand it, under similar policies they could arrest a mapping-product programmer because some nefarious third parties used the product to plan terrorist attacks.


> they could arrest a mapping product programmer for some nefarious third parties who plan terrorist attacks.

What if the programmer distributed and supported the product on a terrorism planning site, and worked to ban anyone who mentioned using the mapping product for terrorism? And who knew of a plugin for his software designed to calculate bomb shrapnel distribution?

The guy sells his software through a hacking site, provides a plugin API, but states that it shouldn't be used for hacking. The situation seems similar to ROM sites that say "only download if you own the original game", smoking stores that sell bongs "for use with tobacco products only", etc. He'd have a better defense if he sold the software through a forum for Windows enthusiasts, or something.


Let's hope they don't throw 50 years of prison or a plea deal at him, as they tend to do, and that he doesn't take the plea deal. Since software has already been defined as free speech, I think there's a very good chance he wins this.


At the very least, if we're holding people responsible for their software, we should be holding car manufacturers responsible for crashes and firearms makers responsible for murders. "Oh, you were drunk and crashed your Prius? Put the engineering team in jail!"


Well, some actually try. Don't give them ideas.


I know, I know. I'm in the uncomfortable position of being both pro gun, and pro gun-control, so I get the worst of both worlds. On one side people think you should be free to transfer a gun without a scrap of record keeping, and the other thinks that if you beat someone to death with a cell phone, you should sue Samsung.

The total systemic paralysis is real.


It seems like the Feds' case depends pretty heavily on the fact that he initially advertised the software on a forum that was devoted to malicious computer hacking.


And the educational factor of "it can be done" is essentially all the justification needed to supply a PoC with bug reports.

I mean, take:

http://seclists.org/oss-sec/2017/q1/675

"I am able to crash a RHEL7'ish system with the above PoC quickly."

So, someone takes down some critical system running RHEL7 with this (even if it is just a crash) - and the author is on the hook because the only use for the code was educational and "crashing a system"?


The issue is that this person wrote the software and then profited from it and advertised it specifically for malicious uses. That implies intent, and intent matters.


Except there is opposing evidence of intent: he disabled the software of people who implied they were using it for illegal purposes.

And there is an explanation that moves his advertisement there from intent to negligence.


If Smith & Wesson promoted their guns in a forum known for plotting murders, do you think they wouldn't be held accountable?

If Ford promoted the F-150 as being able to drive through 50 people without slowing down, would they not be accountable?

The software isn't the problem; the software has legitimate uses... however, it has nefarious uses as well, and he promoted that side of it.


I mean, there is such a thing as ethics. A programmer writing software with malicious intent or with the explicit purpose to defraud, undermine, or otherwise harm another person and/or their property should absolutely be held responsible for the code they write.


> A programmer writing software with malicious intent or with the explicit purpose to defraud, undermine, or otherwise harm another person and/or their property should absolutely be held responsible for the code they write.

Then I think Cisco, Microsoft, and all the other NSL-complying, backdoor-inserting, government roll-over companies count in this category too. Do you not see how slippery the slope you are arguing for is?


The code or the actions?

I'm all for holding people accountable for bad actions, but I don't think that writing code is enough to show intent to harm.


I don't know if it's that simple -- if I print out fake dollar bills it'll probably be treated differently if I sell them as movie props than if I sell them as counterfeit money you could pass off as real.


If the printed money is identical? No, you'd be treated the same for both.

If you're talking about different bills then your analogy obviously doesn't apply.


Are you sure about that? http://www.omaha.com/townnews/crime/prop-money-used-in-movie...

> Owning prop money in itself is not a crime. But it's a crime if people try to pass the prop bills off as real money, said Capt. Jim Duering of the Grand Island Police Department.

Seems like the same principle would apply if you were selling it for the purpose of enabling fraud.


You're not allowed to print fake money that looks like too much like real money. See https://www.marketplace.org/2015/03/10/business/tricky-busin....

From the article: "Essentially what this law says is that bills must be either 75% smaller than or 150% larger than the size of a real bill and one color, one side."

So printing fake money could be a crime even if you don't attempt to pass it as real.


The action makes it wrong, not the code itself. Writing code cannot be evil; it's the user's decision to exploit the code for their own selfish gain.

E.g., see people using cars or knives or guns as weapons to commit crimes; we don't throw Toyota or H&K executives in jail. Just because an item was used by someone, the inanimate object by itself does not display intent; intent is something that can only be held in the mind of a person.


Intent is what matters. Writing code can most definitely be evil when written with evil intentions, or with the expectation of having the code used in malicious ways.

It's a very different thing to write code with good intentions, only to have it repurposed by others for nefarious intent.

It's obviously not a black and white matter, or we wouldn't be having this discussion. The important thing to remember, though, is intent. What was that code or feature originally intended to do? If it was intended to be malicious, then why write the code?

I actually do not believe this is really a legal issue, and is instead a moral and ethical issue on a personal level. It is only becoming a legal issue because we have no other way to deal with it, since not every person operates on the same moral code. If more software developers went out of their way to hold themselves to a higher ethical standard, maybe fraud wouldn't be as prevalent as it is today, and we wouldn't need to have this discussion in the first place.


code itself cannot show intent. the code is executed when the person chooses to run it to fulfill their intent which can be judged to be bad in hindsight if it caused harm to others.

a gun by itself doesn't show intent to commit assassinations. only when it's taken by a revolutionary and aimed at the heart of capitalist pigs will the intent be realized.

the gun maker cannot be held responsible for creating a device that intended to kill someone important.


Yes, but if the gun maker started advertising on the Underground Assassins Network it might be a different story.


I guess you are trying to tie this back to HackForums as a way to suggest intent, this is a pretty weak argument. Unless there's private messages like "oh hell yeah my nanocore is going to be loved by criminals, I'm going to code the best keylogger ever.", there's no intent whatsoever to pin here.


Evidently the prosecutors don't think it's so weak an argument.


You think security research should be illegal?


> I mean, there is such a thing as ethics. A programmer writing software with malicious intent or with the explicit purpose to defraud, undermine, or otherwise harm another person and/or their property should absolutely be held responsible for the code they write.

Congratulations. You've just publicly stated you should remove your game from Steam and from anyone else who distributes DRM rootkits. Rootkits are re-purposable into malware too.

Will you? No?

Color me surprised. /s


I'm confident there is an extremely large gap between a game developer with an open-source video game on Steam (which does not use Steam's DRM feature or any 3rd party DRM) and a developer who writes a malware rootkit.

Like... a HUGE gap.

wtf.


You are aware Steam distributes rootkits, yes?

If you use it, you are just as "guilty" as the logic used in the OP.


Agreed, but selling software you wrote to a person while knowing they intend to commit a crime with it on the other hand...

Not sure that happened here but I don't have a problem with that being illegal.


Borderline industries such as drug and gun production are heavily licensed and controlled by the state. I believe this case should fall into a similar category.

>Where exactly do you draw the line?

You can kill with a hammer too, but no one will hold the hammer maker complicit.


No, they are controlled by lobbyists working on behalf of the industry to limit entry into the market. It's about profit, not safety.


"No" what? Could you explain what do you mean by that. Just to be clear, I didn't make any statements about control.


You say that, e.g., the gun industry is "controlled by the state". This is wrong; the industry regulates itself by means of lobbying.


How could that possibly work? Gun manufacture requires machinery and resources, and drug production requires specific knowledge that's not widely disseminated. Anyone can learn how to write software.


Realistically, more people could learn to run a CNC mill to make AR-15 parts than could learn to write code of similar efficacy.


Well, I've heard of quite a few malware pieces. I've never heard of hand-crafted black market AR-15s.


https://ghostgunner.net/

"As shipped, Ghost Gunner manufactures mil-spec AR-15 and AR-308 lower receivers to completion. With simple tools and point and click software, the machine automatically finds and aligns to your 80% lower to get to work. No prior CNC knowledge or experience is required to manufacture from design files. Legally manufacture unserialized AR rifles in the comfort and privacy of your home."


That's just another level of indirection.

"I'm not selling guns, I'm selling machines that automatically make guns."

"I'm not selling malware. I'm selling random bitstrings. Here's the email of someone who sells a different kind of random bitstrings, and here's a URL where you can download an AES implementation."

You still need someone with the technical know-how to make that machinery, and the commercialization of such machinery is very easy to regulate because pretty much only industries use CNC machines.

Computers, on the other hand, are everywhere.


Sales would be the point of control.

Gun manufacturing equipment is not prohibitive. Zip guns and 3-d printers facilitate it.

Drug production for cocaine, heroin, cannabis and others is as simple as a garden.

Not that I agree the kid should be arrested. I simply perceive the feasibility of software prohibition via government license to be as effective as it is for guns and drugs.


>I simply perceive the feasibility of software prohibition via government license to be as effective as it is for guns and drugs.

In other words, not effective at all against the people you're attempting to target (criminals) and only negatively affecting the people who don't deserve or need to be targeted (law abiding citizens)?

Yeah, that'll work great. /s


> Sales would be the point of control.

Do you mean sales of the software itself? There's no way that could work. Secretly transferring a bitstring is one of the easiest things to do. At best you could catch the money laundering.

> Drug production for cocaine, heroin, cannabis and others is as simple as a garden.

I was thinking of drugs in general. Including for example antibiotics and cancer drugs. AFAIK, practically no one knows how to make stuff like that.


> Do you mean sales of the software itself? There's no way that could work.

Loaning a gun is even easier, but it seems regulated too. I've seen a friend verbally say in front of others that he was transferring full ownership of a gun when a friend asked to borrow one for the range, because that's what the laws of his state (WA) required. He then informed his friend that he would certainly appreciate the gun being transferred back in a week, but he had no recourse if not.


>Loaning a gun is even easier,

Really? You're going to make that claim?

Prove it: Just encrypt a gun with my public key and send it to me over email.


I mean, are we talking about just regulating or about actually enforcing regulations?


The tool he developed is called NanoCore. Licences were sold for $25. Below [0] is the latest Wayback mirror. Sadly, the "Terms of Services" didn't get archived.

Quote from the website, section "Remote Surveillance":

[...] remote surveillance via Remote Desktop, Remote Webcam, and Audio feeds. [...] file and process surveillance.

I wonder how many legit use cases for such a tool are out there. Not everyone wants to use Teamviewer, and, for example, the remote task manager feature seems useful for debugging/support. Though a google search for "https://nanocore.io/Download.rar" reveals another picture [1]. Also, why is the rar archive "protected" with a passphrase (hovering "Download" reveals "Password: NanoCore")?

[0] https://web.archive.org/web/20170315201655/https://nanocore....

[1] https://encrypted.google.com/search?hl=en&q=https%3A%2F%2Fna...

*Edit: Formatting


> Also why is the rar-archive "protected" with a passphrase

Many AV suites recognize common and innocuous things in code as viruses (e.g. compression, heavily used in demoscene production). Putting a password on prevents the AV from scanning the file and blocking it outright.


I've lost count of how many 4k and 64k intros have disappeared from my collection.


Once I made the mistake of using demo-related tools to craft my own gamedev tools and engines... (for example using kkrunchy to pack my stuff).

One day I decided to install anti-virus and... whoooops, everything was nuked (the AV deleted without asking permission, didn't even bother with quarantine).

And back then I had no source control...


All the creative code in demos drives detectors bonkers sometimes.


Speaking neutrally, the legit use-case for that kind of tool is stated in the article -- for local network monitoring by budget-conscious network admins. Now, I personally think that the use of such tools is unethical, but the general opinion on HN seems to be that on a corporate network, anything the management wants to do on your machine is okay.


why does a budget conscious network admin need audio and video surveillance through people's computers? wouldn't the business just install some cameras?

this tool was made for sextortion, marketed to predators, and the author is shocked when the feds show up...


> why does a budget conscious network admin need audio and video surveillance through people's computers?

I don't know why organizations make decisions I would find unconscionable, but that doesn't mean they don't do it.

IT departments often become the enablers of surveillance.

For example: https://en.wikipedia.org/wiki/Robbins_v._Lower_Merion_School...


> why does a budget conscious network admin need audio and video surveillance through people's computers? wouldn't the business just install some cameras?

https://www.youtube.com/watch?v=U4oB28ksiIo


Congratulations to the FBI for borrowing a page from the handbook of the mullahcracy of the Islamic Republic of Iran.

https://en.wikipedia.org/wiki/Saeed_Malekpour

(Have you noted dear reader that thugs act and look the same no matter where they are from?)


Seeing how many people here are defending the FBI really makes me question whether HN is still the place it used to be and whether it is a place I want to continue in. I have noticed a large increase here in appeasers of totalitarian approaches to software and hardware over the last few years.

Whatever happened to the hacker spirit of freedom of knowledge, information, data, and the ability to write and read code as you see fit? I still see agreement on GPLv3 now and then, but HN seems increasingly infected by Silicon Valley business types who want to pretend to be hackers and don't understand or subscribe to the core concepts that enabled the computing revolution in the first place.


There is definitely a spirit of "have the state come and rescue us from ourselves" here in HN and it is getting worse.


“It’s like saying that if someone buys a handgun and uses it to rob a liquor store, that the handgun manufacturer is complicit.”

I think that's a good analogy.


Gun manufacturers are explicitly immune from liability for the actions of their consumers, under the federal Protection of Lawful Commerce in Arms Act.

en.m.wikipedia.org/wiki/Protection_of_Lawful_Commerce_in_Arms_Act


The only reason this became law is because so many people tried going after the gun manufacturers.

With all the terror attacks in Europe using trucks as weapons, people might start going after the truck manufacturers. Then the EU will pass some law saying truck manufacturers are not responsible for people using them as weapons.

Cause and effect. It's sad, but some people try to blame the existence of the weapons. As if anything can't be a weapon! You can pick up a rock and attack with deadly force. Who do you sue then? God? This law exists because some people will always try to shift the blame.


The EU will not pass the truck-related law, because it is not how law works in the EU.

In the US (common law) high-profile cases and precedents are the source of the law. This brings more power to the people, but also makes the legal system infinitely more complicated.

In the EU, we have civil law, which relies on first principles and lawmakers authority to interpret existing laws, with much less attention given to individual cases. This simplifies the law and filters away some insane legal tests like this one, at the cost of giving more legal power to the government and less to the people.

Both systems have their pros and cons, of course.


Right. That's why the Commission is already, after ONE case, proclaiming that it is unacceptable that WhatsApp is encrypted with no police access, and they are going to prepare a directive to address it. Your view of the EU is idealistic and at odds with how it really works.


> Gun manufacturers are explicitly immune from liability for the actions of their consumers, under the federal Protection of Lawful Commerce in Arms Act.

Which was only lobbied for by the gun lobby because generally-applicable legal liability principles exposed them to liability for certain of those actions.


Handgun manufacturers are strictly controlled by the state: you'll need a licence at the very least. If you produced handguns in the garage and one accidentally got out, you probably wouldn't be so surprised to see the FBI at your doorstep.


You actually can manufacture guns in your own home for your own use completely legally in the US. Just for your own use, though; you can't ever sell or gift them. There's a whole market that sells 80% complete billets with most of the more difficult parts complete, such that all someone needs is a drill press to complete the firearm.


What you're referring to are called "80% lowers" and they're the part of an AR-15 rifle that contain the serial number. AR-15s come in a lot of parts, so the "lower" is considered the actual gun that has a serial number and requires a license. So manufacturers are selling machined lowers that are 80% complete, considering this as just a piece of metal, not a gun. The ATF has already raided at least one maker of these parts and taken their customer list.

Personally I think it's a grey area. At what point do you go from "it's a bunch of metal, screws, and springs" to "firearm"? It's the sorites paradox, the paradox of the heap. And if you want to stay out of jail, stay away from these. All it takes is a zealous DA who decides it's a violation of federal firearms law and suddenly your life is hell.


There are other 80% complete frames available and have been for a long time. The most popular are the AR-15 and 1911. It seems like the line is fairly well established. There are other issues like felons and other people who can't have a gun using this to get around the background checks that can lead to raids and questions about the makers of these billets but in general if you can own a gun you're perfectly safe to buy an 80% billet and machine it out yourself.

Are you talking about the CNC shop that tried to claim that merely starting or touching the machine was enough to count as the customer, rather than the company, manufacturing the lower? That's a whole other issue about what it means for a person to have manufactured the weapon, and I'm pretty comfortable saying that company was well on the wrong side of that line.

For most 80%s they're pretty clearly not a gun because there's absolutely no way to use this [1] without modification and assemble it into a gun. This basic design has been around for a looong time.

[1] https://cdn.shopify.com/s/files/1/0218/5770/products/DSC0545...

Edit: found a pretty in depth Q&A with the ATF about 80% lowers that defines and clears up a lot of things like exactly what would make an incomplete lower count as a firearm: https://www.ammoland.com/2014/11/atf-answers-questions-on-80...


It's not that much of a grey area. The ATF has published regulations that require the fire control pocket be created before a lower receiver is legally considered a firearm. The people who have gotten in trouble here have created lowers where the fire control pocket has been started in some way. In one example, a company created a "biscuit" of one color plastic in the shape that needed to be milled out and then shot the rest of the lower around that biscuit. The ATF argued that the fire control pocket was created in this process and the end user was just removing a plug that was inserted into that pocket. In other cases, the manufacturers made index markings showing where to mill things out.

If you stick to an 80% where the fire control pocket has not been started and you select a lower that requires a separate jig to mill, you'll be fine.


While I buy the plug argument, sort of, I can't really fathom how index markings are supposed to change whether or not something is a gun.


You actually can transfer them, but not for profit, or with any regularity. It cannot be a business.


What happens to the weapon when the maker/owner passes away?


Under federal law, they can be bought and sold like any other gun.

The issue is that you need a federal firearms manufacturing license to be in the business of manufacturing firearms. The occasional sale of homemade firearms you no longer want is legal as long as it's infrequent and you're not manufacturing the weapons to make money.


That I'm not entirely sure on and a cursory search doesn't bring up anything helpful. It probably gets passed down like other effects? But that's just a guess.


Right, so it's more like if someone buys an axe and uses it to rob a liquor store.


I call metaphor fight.

We don't need a metaphor to understand this news story. A guy made a convenient remote access program. He released it into a lightweight "hacker" forum. He tried to prevent people from using it in ways he didn't like which includes criminal ways. The FBI wants to nail him more because they can reach him than because they truly believe he did anything wrong. There are grounds for debate depending on how you weight the facts we've been given, to say nothing of facts we don't have. Etc. These metaphors are not accomplishing anything.


I have to disagree.

An axe is a multipurpose tool, and I think we agree that its main purpose is chopping wood (and/or wild wood, also known as trees), not robbing liquor stores.

The question here would be: what was the tool's main purpose? Is it closer to a gun, which is mainly designed to make living things less alive[1], or closer to an axe, which people use all the time in law-abiding ways? Because the first one has tons of restrictions (and penalties) that the second one doesn't.

Of course he'll say "I manufactured and sold tools", the prosecution will say "he manufactured and sold guns", and the courts will decide. While I'm not saying he's definitely guilty, I think the case is nuanced enough to warrant a trial.

[1] Yes, I know most gun owners go to shooting ranges instead of robbing liquor stores. That doesn't change what guns were designed for.


RATs are also commonly used for IT support and law enforcement.


> RATs are also commonly used for IT support and law enforcement.

That's a damn good point. I bet if his defense team looked hard enough, they could find an RFP/RFQ out there somewhere, where the FBI themselves were seeking to purchase something like NanoCore. Hell, I'd go one further and suggest that in discovery they ask the FBI for a list of any RAT tools they use, including NanoCore!

Proving that the FBI themselves are a customer for tools like (and possibly including) NanoCore should be enough for any half-intelligent judge to throw the case out without further discussion.


The point I was responding to was relying on the status of firearms as items whose manufacture is already severely restricted by law; on the other hand, as with axes, you don't need a license to make software.


Buys an axe from the liquor store robbing section of the liquor store robbery forum, and then proceeds to rob a liquor store with it?


Unless, of course, the gun was more or less designed and marketed to would-be liquor store robbers.


And that type of marketing makes a gun illegal anyway. Look up the "street sweeper" 12 gauge. Not a very capable or dangerous gun, quite terrible actually.


Guess he should have marketed it to law enforcement and state spy agencies. They buy this sort of stuff all the time.


This is crazy. A vast majority of people working in my industry (infosec, most of us write tools that can be used for good/bad) should be behind bars if you follow that logic.


So, when is the FBI going to arrest the distributors of Kali Linux or Linus Torvalds, the head of a massive association of associates who create and support the de facto operating system of hacking that Kali is built upon?

Sometimes I think it might be better to not be in the technology industry where outsiders can only see what you do as magic and declare you "A Witch!" and come at you with rakes, pitchforks and BearCats.


If Linus were advertising and supporting his software on a forum with a heavy focus on non-ethical hacking, that might be an apt analogy.

He made the app for unethical purposes... his advertising in said forums is a clear indicator of that. Let's not rally behind people who are actively trying to reduce security on the web...


Linux is advertised all the time on a site called Hacker News. This isn’t only "hacking" in the innocent "innovative coding" sense of the word. The participants in this particular forum regularly discuss computer intrusion, some academically, others practically.

(sarcastic variation of a similar sentence from the article)


Linux was originally advertised on evil Usenet... I hear there are all kinds of bad things there.

Get him


He's using vertical-align and flex-direction. Arrest him!


Yay one other person saw that. :D


If that's all that is to the story: Poor guy.

Could have been me, had I not gotten a slap on the wrist when I was 16. That made me leave the "hacker forums" and go in another direction.


Not sure how long it's been, but I think they punish hacking far more heavily now. This change reminds me of the harsh punishment Samy Kamkar received for the Myspace worm[0] which, while it did deserve a punishment, far exceeded what I would expect for the "damage" done.

Samy was hit under an extension of the Patriot Act. For sending friend requests to strangers via XSS vuln.

It's the War on Drugs effect: hysteria and concerns over the implications of a hack push lobbyists and citizens into demanding heavier punishments (which, as with the WOD, probably leave the convicted even worse off).

[0] https://en.wikipedia.org/wiki/Samy_Kamkar#Samy_worm


What happened to you?


Ahem... Spent too much time in these circles, doing what people do there. Some of it not really legal. Authorities noticed, took my hardware and told me that I maybe should think hard about going back there. Got new hardware and changed course.


Mr. Anderson...


We should mass-report RATs like TeamViewer to the FBI. It's not fair that some get a taste of democracy and others don't.


I used to do malware analysis. We should also report MS Office macro functionality, AutoIT, Metasploit, and VNC. Maybe go after RSA too for crypto used in ransomware.

If this goes through it will set a horrible precedent. While I wouldn't be too happy having certain software classified as cyber arms, if that's what needed to happen to get the same protections as weapon manufacturers, so be it.


Reprising a Lobste.rs comment:

I pulled the indictment from PACER. The story is oversimplifying the case.

The indictment is far more concerned with Huddleston’s affiliation with Zachary Shames, who was convicted (apparently dead-to-rights) for selling a keylogger called “Limitless”. The indictment mentions Limitless more than it mentions NanoCore. Shames wasn’t very smart: the DOJ has records of him providing tech support to users who were clearly using his keylogger to harm people.

Huddleston has two big problems. The first is that he sold licensing software to Shames for the Limitless keylogger. The second is that the DOJ apparently has Huddleston and Shames in a Skype group together talking about this stuff.

The Beast article snarks about the indictment mentioning HackForums repeatedly. But the Beast article doesn’t think it’s important for you to know about the HackForums Skype group Huddleston and Shames shared; in fact, Shames himself gets only a tiny sliver of the article, despite being the fulcrum of the indictment.

RAT software theoretically has legitimate uses. But, obviously, we all know that most RAT software isn’t legitimate. NanoCore sure wasn’t. It has a DDoS botnet tab, for Christ’s sake. Huddleston’s attempts to position it as legitimate software are about as compelling as the “no copyright claimed” comments on a Youtube video.

But having said that: it’s unlikely Huddleston would be in the amount of trouble he is in had he simply written a malicious RAT. His problems are his connections to a criminal conspiracy that got busted.


Yes, I wish tech news organizations would actually have provided the whole story. Thank you for this additional information (both in regards to Zachary and the DDoS Botnet tab in NanoCore) as having those facts paints this situation in a much different light.


This is a case where German legislation surpasses US law in stupidity. In Germany there is the so-called Hackerparagraph, § 202c, which makes it a criminal offense to procure or distribute access codes to protected data, as well as to produce or use tools that are useful for this purpose.

I am astonished by the number of comments here that believe this kind of legal proceeding against software tools helps improve security. Do you really want to live in a world where you have to hide an nmap CD somewhere in your backyard? And no, that is not a different situation, as it is impossible to draw a line.


If this goes through, can we sue car, knife, and every other kind of manufacturer? They sometimes use shady lots to sell cars, don't tell me everything about the car, and I can run someone over. Many people die each year from cars! This case might set a precedent...


> If this goes through, can we sue car, knife, and every other kind of manufacturer?

Suing (and sometimes winning, more to the point) manufacturers and retailers who knowingly profit from unlawful acts of their customers, often when they had reason to know, is already possible under the law.

More than 20 years ago when I worked at Radio Shack the employee orientation included a piece on this.


>If this goes through, can we sue car, knife, and every other kind of manufacturer?

If they make purpose specific tools, then yes.

Someone selling "Rape-o-matic sleeping pills" for example, will go to jail, even if they raped no one.


Reminds me that some time ago there was a post on HN from the author of Shadowsocks [1] about being prosecuted/arrested by the Chinese authorities.

Maybe one day one could be arrested for making keyboards or phone screens. Who knows.

[1] shadowsocks.org


> Maybe one day one could be arrested for making keyboards or phone screens. Who knows

Only if the keyboards and screens don't send their data to the authorities.


trying to sell dual-use RAT software on hack forums and calling it legitimate business seems like riding the line. As the article says, you're not going to find corporate IT managers on that site. Seems like the guy was knowingly selling to hackers and then crying foul when he "discovered" they were using it for hacking and disabling the license.


Not hard to imagine that a kid on a hacker forum grows up & gets a tech job. Today's hack forum user is tomorrow's corporate IT professional.


yeah, but once you're a corporate IT professional you don't go back to the blackhat hacking and scamming forum to pick up some software for administering your network. Suggesting that the majority of customers for such software on such a forum are legitimate is a leap.


This isn't directed at you, but at corporate culture and cultures outside of it: Why is "corporate" a legitimizer?

If corporations use it, does that make it okay? why?

A corporation is a legal entity (a social fiction, if you want to be dramatic) for isolating activities from the individual. That puts it on shaky ground already. History contains myriad examples of bad corporate behavior to corroborate my accusation of shadiness.

To use this history of illegal behavior under the veil of isolation as an argument for legitimizing another activity under questioning is circular.

The preferential treatment for commercial activity over freedom is evidence of citizens-as-products via rigid plutocracy.


This article is seriously clickbait. The authors of legitimate IT security tools aren't advertising on skid websites; the very fact that you advertise on these websites shows that you intend to let people misuse your tool.


I don't know anything about the particular website he was advertising on, but I could easily see someone in the HN community developing some security software, advertising it via a "Show HN" post, and then being judged for posting it on "Hacker News".


the site described is very much a blackhat hacker forum.


I read the article but it isn't so clear about that. It says the forum contains many non-hacking related subjects. To a total outsider with little understanding of the world of computer security, Hacker News might look and sound like a blackhat site too. There are frequent posts about vulnerabilities, hacking news, and security tools.


Yeah but anyone who actually opens a discussion on hackforums will realize it's a forum made for skids.


as a previous user of the forum in question, I can confirm it's not for the "curious programmer" type of hacker. I realise that the term is ambiguous, but I think that's more due to the "hacker news" type hacker clinging onto an old term after the public understanding of the term has moved on to "hacker = computer criminal". The public at large, and journalists, know what a hacker is - and it's not a VC-driven startup entrepreneur.


Would you argue that phrack magazine is also not for the "curious programmer" - or would you say that the dividing lines are different now than they used to be?

I wonder because I read a lot of stuff in phrack as a "curious programmer"...


Hackforums and phrack are not even close to being in the same category.

Phrack contains a lot of technically interesting articles, which can be of use to anyone interested in IT security.

Hackforums is full of kids giving tips about hacking FB accounts and selling stolen steam accounts...


Yeah. The analogy I'd draw is selling guns is legal. Radical Islamism is legal. A gun store that only advertises on radical Islamist internet fora is at the very least acting recklessly, and thus ought to be held legally responsible for the actions of its customers.


Pick some other niche that most people don't find so reprehensible and your reasoning falls on its face.

Should a retailer that sells bongs "for use with tobacco" and advertises on a weed forum be held accountable for people illegally smoking weed with it?

A century ago you could have used the same analogy to say that some business that has an advertising deal with a gay bar is responsible for enabling them.

Both those examples are businesses acting "recklessly." Should they be held accountable? That is a slippery slope to biased enforcement.


The reasoning doesn't fall on its face just because you pick an activity that shouldn't be outlawed in the first place.

Interpreting 'should' in the legal sense, then yes holding the bong seller accountable is a valid argument that can be made. And the second scenario is possibly valid too, depending on some details like what they actually sell and what specifically the illegal activity is.

It's not a particularly slippery slope.

I don't really like that kind of enforcement, but it can be done in a reasonable enough manner.


The difference is that marijuana use and gay sex, if illegal, are "victimless crimes." Hacking into people's computers to take over their webcam and steal their credit cards and passwords is not.


That's no different legally, only ethically. The FBI does not enforce ethics.


Well, gay sex isn't illegal (and has never been illegal at the federal level).

But in any case I'm not offering my legal opinion, I'm offering my ethical one of how I think the law ought to be.

IANAL, so I have no idea whether the government's case in this instance is likely to succeed or whether this application of the relevant statute(s) would withstand a court challenge.

But, if I were America's benevolent-dictator-for-life, I would hold a business and any responsible employees legally responsible for its customers' illegal uses of their product, in the case it is intentionally and recklessly marketed to persons whose interest in the product, any reasonable observer would assume, is likely to be for said illegal use.

And, neither here nor there, but I'd legalize less-harmful drugs (pretty much all except opiates and maybe cocaine/crack), treat addiction as a medical rather than criminal issue, and keep all consensual relations between adults legal.


> Hacker who hacked no-one

Okay, what did they do, then?

> Made software abused by hackers

Oh, that's unfortunate. It was just a legitimate exploit tool, right?

> RAT

Okay, that's a potentially legitimate type of software—

> advertised on HackForums

…the case pretty much writes itself. Nobody who has heard of that site can claim with a straight face that software marketed there has innocent intent.

The article is sympathetic and argues the author had innocent intentions. Perhaps that's the case, but the problem is that it would be very difficult to persuade people of that, given that someone who in truth did have malintent might act exactly the same (e.g. the disclaimers attached to other such hacking tools).


> Perhaps that's the case, but the problem is that it would be very difficult to persuade people of that

Nobody has to be persuaded that he had innocent intentions. In the US, we base our legal system on presumption of innocence. The onus is on the State to prove that he committed a crime. And if "intent" is what makes his actions a crime (or not), then the State has to persuade a jury that he had malicious intent. The presumption is that he did not.


>>>The article is sympathetic and argues the author had innocent intentions. Perhaps that's the case, but the problem is that it would be very difficult to persuade people of that

So you support the idea of Guilty until proven innocent, combined with Guilt by Association. What a terrible combo

Most people believe it is up to government (or anyone making the claim it was not for legit use) to prove their claim, it is not up to the defendant to prove they had innocent intentions.


> So you support the idea of Guilty until proven innocent, combined with Guilt by Association.

Do I? I'm not saying they're automatically guilty.


No, you say he has to prove he made the tool for legit purposes... that is guilty until proven innocent.

You have assumed he has malicious intent, and have shifted the burden of proof on to the defendant to disprove your claim


It seems like he just stumbled on HackForums and grew comfortable there. It was his place, and having a lot of bad actors around was just normal, even though he wasn't one of them


IBM participated in apartheid[0] in South Africa, providing hardware and software to the government to run the passbook system which enabled widespread racial profiling. They have yet to be held accountable[1], despite having participated directly.

Say what you will about the author of NanoCore, he participated in the supposed crimes less than IBM did in apartheid.

[0]: http://www-cs-students.stanford.edu/~cale/cs201/apartheid.co... [1]: http://hrp.law.harvard.edu/areas-of-focus/previous-areas-of-...


This trial might be a good thing ultimately to set good laws. Similar to the times they tried to sue tobacco companies and gun manufacturers for the deaths of people. Those cases rightfully died in court and set precedents to block future wasteful cases.

Even using the gun analogy, this wasn't even selling a full gun, just a part of one - the tool is useless in isolation for hacking, and it covers probably one of the easiest parts of hacking. Getting access is typically the expensive, risky part.

There's a far better case against zero-days being sold, but even those have plenty of legitimate use cases for red teaming. Selling zero-days is still closer to selling loaded guns and needs to be done carefully - not so much in this case - and from a social-good perspective, not through regulation (which I think is a bad idea, such as the one regulating zero-days in Europe).

It's just too bad this guy has to go through hell over this. Hopefully he sets up a legal donation page to get the best team he can. Gun/tobacco companies typically have legal teams; this is just a small-time ISV. It makes me very upset that this is being done to such a vulnerable person, and it could very likely create bad laws.


The real question is: if someone makes a product that deliberately makes it easier to break the law, is he guilty of breaking the law, or complicit? Deliberate is the key word in this case. Complicity depends on knowledge of what they're doing. I think perhaps criminal negligence is all he could be charged with, but I don't know. Sounds like he wasn't an angel, but not totally dubious either, based on how he advertised the product and what it actually did.

Makes me wonder though, the software developed by pen-testers could be stolen and used for nefarious purposes. Do the pen-testers get held responsible for everything they do?

Or are we judging based on this individual's spirit? He wanted to make money, so he enabled bad people to do bad things, advertising on bad sites. Even though this article is framing it like he was an angel because SJWs need fake news to feel like heroes?


As a pen tester, I largely use open source and "nefarious derived" software...


Not to fault him for lying through his teeth... you gotta defend yourself...

But he is lying. He supported and advertised the software on an unethical hacking site. That's extremely clear intent on his part... he's hosed, and he deserves it. Why people here are standing up for unethical hacking is... beyond me...


"unethical" <> "illegal". And even if he did violate some bogus statutory bullshit posing as "law", hackers (of all stripes) should be supporting him because of the precedent setting effect, and the chilling effect, of holding software makers liable for the actions of downstream users of that software. This is not a path we want to start down.


I want people who make software designed to do harm to be held accountable. Since he promoted and supported it in a place focused on harmful hacking, his intent was very clear. Him being held accountable is a good precedent to set.


> I want people who make software designed to do harm to be held accountable

I don't. I want people who cause harm to be held accountable. I might barely buy your position IF the software had ONLY "non legitimate" uses, but that's clearly not the case here.


The "Unix Terrorist" (author of the sniffer used in the TJ Maxx theft) was sentenced to prison for writing software he didn't use either; they just used chat logs to prove he was part of a conspiracy, and I'm sure the FBI probably has similar PMs from this guy replying to potential customers or informants "proving" conspiracy.


And at the same time, on this very website, we see social scientists saying that "we are losing the information war". No wonder; when institutions behave like that, why should we have any trust in them?

Of course when you're used to people trusting and obeying you, it becomes so natural that you stop thinking about it. That's a problem for you, and the rest of us, the day it no longer happens.

That day is getting closer and closer, thanks to idiots like the FBI officers in the article. For a lot of people it is already too late. In a world where you can choose your news, people make up their minds at most once on a subject and then they are set for life — and they will have an effect on the opinions of their family and friends, which will do further harm to the trust in your institution.


Just wondering when they will start arresting knife manufacturers?


If knife manufacturers are promoting and supporting their product on a forum devoted to the illegal use of knives... they probably could (and should)


Would it be illegal to write say an excellent grepping or sorting tool that I really want someone to use as part of a criminal activity, even if it had heaps of other uses?


Apparently, it's all in the Marketing. You can make a keylogger and get the cops to sell it for you, so long as it's "for the children".

https://www.eff.org/deeplinks/2014/09/computercop-dangerous-...


The FBI is claiming that if you sell your grep tool via a site called HackForums, it is.


Would HackerNews be an acceptably named place to drum up some publicity? Politicians know the difference between Hacker and Cracker, right? ;)


Hacker News is operated by a VC firm, and most articles talk about new technology and how to monetize it.

HackForums is a place to talk about cracking software, distributing unwanted malware, and other generic "skiddie" stuff. Like the article says, it's shades lighter than the "dark web", and there are probably dozens of legitimate reasons to visit that place, but the vast majority of people there are looking for means to circumvent permissions/licensing/trust on other computers or someone else's software.

Hell, Ansible is a great RAT. It logs onto my servers, installs software, reconfigures settings, restarts crashed services. It can handle a network of thousands of boxes at a time. It's just not marketed as something that can turn on the webcam of a user, hide from antivirus software, and silently let a disgruntled creep keep tabs on their ex's bedroom.
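
To show how mundane that capability is, here's a minimal sketch in Python using the third-party paramiko SSH library (this is not Ansible's actual implementation; the hostnames, username, key path, and command are all made up) that does the boring-but-legitimate "RAT" work of logging in and restarting a service across a fleet:

    # Minimal remote-administration sketch over SSH with paramiko.
    # Hostnames, username, key path, and command are hypothetical.
    import paramiko

    def run(host, command):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username="admin",
                       key_filename="/home/admin/.ssh/id_ed25519")
        _, stdout, _ = client.exec_command(command)
        output = stdout.read().decode()
        client.close()
        return output

    for host in ["web-01.example.com", "web-02.example.com"]:
        # install software, fix a config, or restart a crashed service
        print(run(host, "sudo systemctl restart nginx"))

The plumbing is the same; what differs is whether the machine's owner consented and what the tool is marketed for.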


So why isn't the FBI knocking at Bill Gates' door, for facilitating hacking, spyware and botnets with the Windows Operating System?


Of course I'm assuming you asked this question rhetorically, but I really feel the answer is worth stating explicitly :)

Bill Gates is an extremely rich and politically connected individual. Technicalities rarely apply to them.

To be arrested as a massively wealthy individual, you need to be committing crimes on the scale of Madoff, Enron, or murder with a weapon other than a drone :)


Unfortunately this is true.

Massive wealth disparity has created different classes of people, including an Elite class, the wealthiest, who are treated like royalty.


Couldn't agree more, my friend.

What will be even scarier and chilling is when they find a way to either fine, imprison, sue us for libel, or impugn us for pointing out their crimes using our freedom of speech! :)


Yeah, seriously, go after the person or corporation that has the most power in solving the problem. Who allows all those computers to be hacked so easily? The companies developing and servicing the systems. Fix the broken systems, don't go after the hackers. If there's nothing to hack they will go away anyway.


Intent.


This guy's intent was to create a piece of software and sell it. His clients' intent is another story.


His intent was to create software to sell at hackforums which also sells pay-per-installs. I doubt there is even a single user on hackforums who bought his software who would use it for legitimate purposes.


Reminds me of some virus that existed back in 1998, more or less, that would allow you to take control of a computer, see the screen, open the CD-ROM tray, and so on.

It was used as a base for some software that would replace VNC or some other kind of remote management for Windows (when Microsoft's solution was too expensive or too poor).


Are you thinking of NetBus? https://en.wikipedia.org/wiki/NetBus


Perfect, I couldn't remember the name. Thanks!

Some people say that a management suite developed in Brazil was made by taking the code and refactoring it. Indeed, some functions at that time were quite similar.

And it was a great product, helped a lot at the company I worked at. Don't know if it's still that good, but surely I would recommend it back in those days.


> “It’s like saying that if someone buys a handgun and uses it to rob a liquor store, that the handgun manufacturer is complicit.”

Except it's really more like someone stealing a handgun and then using it to rob a liquor store, and then blaming the handgun manufacturer.


If someone uses a hammer to bludgeon, then is the hammer maker responsible? No. I thought people suing gun companies solved this, saying the manufacturer is NOT responsible!


This is so much bullshit. Here's hoping the EFF and others can come to his aid. Anybody know if there's anywhere to contribute to his legal defense fund?


Pirated implies proprietary software. Lesson: anything that is even remotely related to security should be written as open source.


This week in the US: stupidity keeps prevailing.


I don't get it. Building a road doesn't make you an accomplice to crime, if a criminal happens to drive on it


Intent matters. Roads aren't built with the intent of facilitating crime. A firearm is a better comparison, because it's built with the intent to kill. Indeed this is the defense route that many are choosing here for this app. However there's an additional problem, in that this app was originally launched on HackForums, which the author also frequents. HackForums is a cesspool of criminals, so while there are some legitimate use cases that can be conjured up for this app, none of those legitimate users would visit HackForums.


The gun analogy is good, but shouldn't there be more than just intent? Otherwise, a gun owner merely frequenting a section of town known for criminal activity would suffice as "intent".


I like how the code included in the graphic is CSS, as if he is at large for hideous styles.


Do we hold manufacturers of guns responsible for the people who use them?


Potentially, if they deliberately sold to criminals with knowledge (or a situation where they should have known) that the weapons would be used for criminal activity (as a primary purpose).

However, this case seems a bit complex (though the article is pretty biased in making its case, I'll take it at face value). The guy was dumb (my opinion) to sell it on the forum he did, but he did (apparently) make efforts to prevent its use for criminal purposes. Still, selling on a forum where one of the primary discussion channels is about criminal hacking, it should've been obvious that was a bad place to advertise this tool.


Not really disagreeing, but if I start a gun shop in a bad neighborhood (and then naturally a higher % of my customers will be more criminal in nature) does that make me liable?


Potentially, if you have knowledge of their intended use for the weapons. You likely won't know, or can put yourself in a position where you can plead ignorance (meet at least the legal standards for sales with background checks and such).

You have no obligation to sell in the first place. If a man comes in talking about their intent to shoot up a place, and then comes to you at the counter and asks to buy a gun, you can always turn him away. That's not likely to happen, but that's essentially what happened on the forums. People discussed how to (with malicious intent) hack other computers, and then purchased this guy's software (or pirated it). Now, again per the article, he tried to prevent that use (which is why I say in this case he seems to be just dumb), but it's still a highly questionable group of people to sell and advertise to.


Just a guy wearing a hoodie on a computer in a Starbucks. ha joking


What's the legitimate use for remote keylogging?


Parents keeping an eye on their kids, schools monitoring students, small offices keeping an eye on staff, etc.

That's my guess at least.


Those uses of keylogging are about as legitimate as hidden cameras in hotel bedrooms, imo.


If the bedroom was in the middle of an office, maybe.


Not sure about the first one, but the latter two are illegal in a lot of countries.


Does anyone know if this kid has a defense fund?


scary... but also lol at the css in the background image


Whatever you think about this case, the article is written in a deceptive way. For the first half of the article, you are told that he is being prosecuted for making a tool that was used by hackers against his intentions. But then you find out that that is just his side of the story. According to the feds, his intention was to help people commit crimes. And they have a pretty good reason to suspect that this was his intention in that he announced the software on a site for hackers.


Oh no, a website for hackers. Is Hacker News a website for hackers? How about Slashdot? If you are responsible for what people use your software for, even when you actively tell people not to do that, then it should also be true for guns and alcohol (kids drink your Bud beer, so arrest Budweiser execs?).

This is one of those fundamental questions, similar to whether you own your physical devices or just run the software on them, like the tractors. It should be that you aren't responsible for what people do with your software, if there is any legitimate use. The prosecutorial power of the US govt is very large, and they can coerce people to plead guilty to crimes where they weren't really responsible.


The article makes it pretty clear that this was "hack" in the criminal sense. The nature of the forum is a question of fact, and it sounds like the prosecution will have no difficulty in showing that lawbreaking was a major topic of conversation. With "Hacker News" on the other hand the hypothetical defense could easily demonstrate that it was for "hack" in the positive sense. Dictionaries routinely list both senses of the word.


Perhaps the forums he sold it on were focused on illegal activity; the article was mixed on that. But let's get back to this forum - I am sure that there is discussion of how to subvert DRM on Hacker News (search far enough and you'll see it), and that's illegal. So if you were on trial and they used your posting on Hacker News against you, they'd definitely call it "evil scary bad hacker news".


And then later in that same article, we see clear artifacts (messages, remote disabling of the software, etc) of the author's attempts to stop malicious use of his software.


Oh. Look at the pretentious press trying to make a criminal look innocent. This guy knew exactly what he was doing. Trust me.



