Life of Brian (Krebs) (hackerfactor.com)
184 points by Garbage on Feb 19, 2014 | 50 comments



But wouldn't it be interesting if, in the dark of night, Brian Krebs actually was a serious cyber-criminal, and he used his reputation and knowledge to frame himself from time to time, just to throw off the fuzz? He's certainly smart enough to pull it off. (No, I don't actually think he's a cyber-criminal :-) )


The old-school version of this conspiracy is that the AV software makers were the ones funding the virus creators.


I heard a rumor back in the day that the Chernobyl virus [1] (also Win.CIH, I think) was released by Microsoft within pirated versions of Windows. At the time I was in high school and just believed it, because Microsoft was the big, evil corporation back then. Now of course I'm more skeptical, though I still wouldn't be surprised.

[1]: https://en.wikipedia.org/wiki/CIH_(computer_virus)


Thanks for reminding me of that one.

Some bogus design decisions by Packard Bell (no jumper/switch to lock out BIOS flash access) meant two of my machines were written off.

No, they weren't running pirated Windows, though I was on day 204 of my 31 day Paint Shop Pro evaluation.


A friend of mine had a pbell that was killed off by Chernobyl. I was on the phone with him when he booted it up. "I'm going to boot." "Don't do it man." "Nah I'm gonna boot. Here it goes. See it's fine." "Sweet" "Uh oh" "Uh oh?" "Uh oh"

And that was the end of that machine lol


I managed to get replacement motherboards for both under warranty, including engineer call out.

You'd almost think the virus shipped with the computer, the way they responded.


My personal conspiracy theory is that AV software is actually a set of backdoors for the international three-letter agencies. For my taste, the market is unusually fragmented, with many of the players coming from the typical spy-power nations. Plus it would be the sweetest access door, as AV is almost always whitelisted and gets access to the internet regularly.


Kind of like the religious nutcases being in cahoots with the heathens. Many conglomerates own both health food and junk food brands.


Ha! Like the Dexter of the cyber world. I sense a TV series idea coming on...


I don't think there's a screenwriter in the world that knows how to do hacking and security stuff realistically.


The recent House of Cards (S02E03) is actually refreshingly good. There are a few holes that pretty much have to exist for the plot, but overall they actually talk about real things, not random gibberish.



I don't think that any sane screenwriter would want to do hacking and security stuff seriously even if they could - the pacing and dynamics of how hacking is currently shown are a much better fit for what works in film as a narrative medium; showing it realistically would generally hurt the movie and go unappreciated by the vast majority of viewers.

It's the same with any other time-consuming thinking activity, such as invention/tech research, crime/plot investigation, or medical diagnosis - these are much more successful on screen when the screenwriters/directors misrepresent the reality of 'how it happens', so they do so intentionally.

That's how the medium works - if you want a realistic portrayal of hacking and security, then it (like many, many other topics) is quite suitable for a written narrative, but not really for a two-hour movie or 30-minute TV episode format.


> Let's say that these intimidation techniques make Brian give up journalism. Maybe he becomes too scared to write.

Advocates against anonymity do not get this. I guess they think the bad guys get arrested and the good guys have nothing to hide (asymptotically). They do not consider that the world is not black-and-white and that you may need to protect yourself against people acting technically within the law.

Other than that, I did not know Krebs (though I had heard about his successful battle against spam), and it makes me want to know more. The website hackerfactor.com has a nice design and potentially interesting content.


That site looks like something from the '90s. Has design regressed so much that the '90s look is cool again?


"If the only thing protecting your security is a lack of others knowing the secret, then you have no practical security."

My understanding is that "security by obscurity" is a condemnation of making algorithms and code part of the secret. A system reliant on others not knowing your (say) private SSH key may lack defense in depth but isn't "security by obscurity".


It's possible that "security by obscurity" has a different meaning in the programming world. When talking about computer or network security it is used to describe any mechanism that attempts to defend a system by hiding something. One of the classic examples of security by obscurity is when a person disables the broadcasting of the SSID on a wireless router.


For what it's worth, Wikipedia disagrees:

"Security through obscurity is generally a pejorative term referring to a principle in security engineering, which attempts to use secrecy of design or implementation to provide security."

http://en.wikipedia.org/wiki/Security_through_obscurity

"When talking about computer or network security it is used to describe any mechanism that attempts to defend a system by hiding something."

That is again a definition which would include ssh keys, which is to say a useless definition.

"One of the classic examples of security by obscurity is when a person disables the broadcasting of the SSID on a wireless router."

I'm not sure that's an example at all, since hidden SSIDs can be readily sniffed from a publicly documented protocol.
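To make that concrete, here's a minimal sketch (my own illustration, not anything from the article) of how the name of a "hidden" network still leaks in cleartext 802.11 management frames. It assumes scapy is installed and a wireless card is already in monitor mode; the interface name "wlan0mon" is just a placeholder:

```python
# Minimal sketch: recover a "hidden" SSID from cleartext 802.11 management
# frames. Assumes scapy and an interface ("wlan0mon") already in monitor mode.
from scapy.all import sniff, Dot11, Dot11Elt, Dot11ProbeReq, Dot11ProbeResp

def show_ssid(pkt):
    # Probe requests and responses carry the network name in information
    # element 0, even when the AP's beacons omit it.
    if pkt.haslayer(Dot11ProbeReq) or pkt.haslayer(Dot11ProbeResp):
        elt = pkt.getlayer(Dot11Elt)
        if elt is not None and elt.ID == 0 and elt.info:
            print(pkt[Dot11].addr2, elt.info.decode(errors="replace"))

sniff(iface="wlan0mon", prn=show_ssid, store=False)
```

The moment any client probes for the network, its name is on the air for anyone listening.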


>That is again a definition which would include ssh keys, which is to say a useless definition.

For those of us who work in security, it's taken for granted that when we are talking about security through obscurity, we aren't talking about passwords and cryptographic keys. I should have been clearer.

>"Security through obscurity is generally a pejorative term referring to a principle in security engineering, which attempts to use secrecy of design or implementation to provide security."

That's the exact same thing that I said, worded in a different way.

>I'm not sure that's an example at all, since hidden SSIDs can be readily sniffed from a publicly documented protocol.

And that's the entire point. Security through obscurity alone is not a good thing. Starting from the second sentence, and continuing through to the second and third paragraphs, the article elaborates on this.

>A system relying on security through obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that if the flaws are not known, then attackers will be unlikely to find them.

>Security through obscurity has never achieved engineering acceptance as an approach to securing a system, as it contradicts the principle of simplicity. The United States National Institute of Standards and Technology (NIST) specifically recommends against security through obscurity in more than one document. Quoting from one, "System security should not depend on the secrecy of the implementation or its components."[1]

>It is analogous to a homeowner leaving the rear door open, because it cannot be seen by a would-be burglar.


It worries me a tiny bit that you work in security, and yet your thinking seems this muddled here. I'm hoping you haven't had your coffee or it's a failure of communication or something...

"it's taken for granted that when we are talking about security through obscurity, we aren't talking about passwords and cryptographic keys."

I agree that passwords and cryptographic keys are not what we're talking about - the question is why. Confronted with a proposed setup that relies on the secrecy of X for its security, should X be seen as analogous to keys (and thus not a case of "security through obscurity"), or as more analogous to a protocol or algorithm (and thus a case of "security through obscurity")? There are, of course, other concerns as well.

"That's the exact same thing that I said worded in a different way."

SSID is neither design nor implementation of a wireless network.

"Security through obscurity alone is not a good thing."

I wholeheartedly agree that security through obscurity is not a good thing, but not everything that is bad is "security through obscurity". "Don't broadcast secrets from radios in cleartext" is a rather simpler (and stronger!) rule violated here if you are trying to use SSID as a secret.


>SSID is neither design nor implementation of a wireless network.

As it relates to the physical implementation of a network, part of the implementation includes configuring the devices. Security through obscurity can be something as simple as manually assigning a non-standard port to a service (such as HTTP); see the hypothetical sketch below. Disabling the broadcast of an SSID is also a valid example of security through obscurity.
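To illustrate the non-standard-port point (a purely hypothetical sketch, not something from the thread - the port 8123 and the use of Python's built-in HTTP server are my own placeholders):

```python
# Hypothetical sketch: serving HTTP on a non-standard port (8123 is arbitrary).
# This "hides" the service only from someone who doesn't bother to scan;
# a full port sweep will still find it immediately.
from http.server import HTTPServer, SimpleHTTPRequestHandler

HTTPServer(("0.0.0.0", 8123), SimpleHTTPRequestHandler).serve_forever()
```

Nothing about the service is actually protected; an attacker just has to look a little harder to find it.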

You seem to have an issue with me using the SSID as an example because it's so easy to defeat that it would be a waste of time to even try. Like I said before, that's the entire point. It's obvious to us now, in 2014, but for years people were told by 'experts' to disable it as a security measure. Some individuals actually used to attempt to secure their networks by hiding the SSID instead of using a password. The prevalence of this kind of negligence is one of the key reasons that disabling the broadcast became the prototypical example used to explain why security through obscurity doesn't work.

In short, it's not an example I made up on my own out of a lack of understanding of security; it's an example that's used in textbooks and security-specific certification study guides. It's an example that's taught in introductory IT and IT security classes. I'll try to find some more specific citations to put your mind at ease; it's been quite a while since I've taken an introductory course on the subject.

https://www.owasp.org/index.php/Avoid_security_by_obscurity

>not everything that is bad is "security through obscurity".

And I agree with this, but my example most certainly is a valid case of security through obscurity.

Update: Better Sources

From Hacking Exposed: Wireless, 2nd Edition, p. 80

From the first paragraph on security through obscurity:

>Many wireless networks today operate in hidden or nonbroadcasting mode. These networks don’t include their SSID (network name) in beacon packets, and they don’t respond to broadcast probe requests. People who configure their networks like this think of their SSID as a sort of secret. People who do this might also be prone to enabling MAC address filtering on the AP.

From Enterprise Security: A Data-Centric Approach, Chapter 7

>In order to provide security through obscurity, methods such as hidden SSIDs and MAC address filtering have been employed to keep the wireless network invisible to eavesdroppers and more difficult to connect to for an unknown host.

From Certified Wireless Design Professional Official Study Guide, Section titled SSID hiding.

>In order to provide security through obscurity, methods such as hidden SSIDs and MAC address filtering have been employed to keep the wireless network invisible to eavesdroppers and more difficult to connect to for an unknown host.

These are just a few examples out of dozens, if not hundreds.

I'd sincerely appreciate it if you would educate yourself before implying that other people are incompetent in the future.


"As it relates to the physical implementation of a network, part of the implementation includes configuring the devices."

Arguable, but then what makes keys and passwords "not a part of the implementation"?

"You seem to have an issue with me using the SSID as an example because it's so easy to defeat that it would be a waste of time to even try."

That is not my objection. I object to SSID as example of "security through obscurity" because I think it doesn't get at the core of what's wrong with security through obscurity and it has other far more glaring issues - so someone can fully grok that keeping SSID secret is not worthwhile and see "why" but without that "why" generalizing to (say) the company saying "We use our own proprietary cryptosystem. We can't show anyone the algorithm; it's more secure that way!"

"In short, it's not an example I made up on my own out of a lack of understanding of security; it's an example that's used in textbooks and security-specific certification study guides."

I wasn't questioning your understanding because you picked the example, but because you seem to persist in conflating things that seem clearly (and meaningfully) distinct. If textbooks and security-specific certification study guides use SSID as a canonical example of security through obscurity, I'm more worried, not less.


>Arguable, but then what makes keys and passwords "not a part of the implementation"?

They are part of the implementation of a network, but the issues involving passwords and cryptographic keys are so complex that they generally aren't lumped into a conversation about security through obscurity. I made the statement in response to your claim that someone obtaining your SSH key would render SSH an example of security through obscurity, if my definition and the author's definition were used. With that line of thinking, the entire cryptography industry is merely security through obscurity and we should just give up. In practice, cryptographic technologies aren't considered to be security through obscurity unless the technologies are provably broken to a point where only an idiot would rely on them (such as WEP or LANMAN). As a caveat to this, when a cryptographic algorithm is hidden from the public in order to remain unbroken, it does become an example of security through obscurity. I just object to the idea that this is the only meaning of the phrase.

> I object to SSID as example of "security through obscurity" because I think it doesn't get at the core of what's wrong with security through obscurity and it has other far more glaring issues - so someone can fully grok that keeping SSID secret is not worthwhile and see "why" but without that "why" generalizing to (say) the company saying "We use our own proprietary cryptosystem. We can't show anyone the algorithm; it's more secure that way!"

The issue we began with was that you stated that it was not an example of security through obscurity at all. That was an incorrect statement and it still is. If you now want to argue that it's not the best example, I would agree with you.

As an introductory example it does a good job of explaining the concept to a layperson or even a technical person that's still a bit inexperienced. Could you find a better example to give to a room full of newbies? Probably, but that wasn't the issue we disagreed on.

It's also important to note that in a room full of security experts, you would be able to delve into the more complex aspects of the issue without losing the audience, so of course the example would be different.


Facepalm.

"The issue we began with was that you stated that it was not an example of security through obscurity at all."

If you read up, that is not where this began. You initially responded to my objection to the phrase (from the article): "If the only thing protecting your security is a lack of others knowing the secret, then you have no practical security."

I still fully and full-throatedly stand by that objection - particularly since in a security context "secret" is often used to mean "key or password or...".

I did eventually state that I wasn't sure hidden SSIDs were actually an example of security through obscurity - I'm still not sure they are. Maybe I'm wrong about that. It's still appearing to me that you aren't really understanding my objections, though.

"As an introductory example it does a good job of explaining the concept to a layperson or even a technical person that's still a bit inexperienced."

I still say SSID is - at best - a piss poor choice of something that is barely an example of the phenomenon. It does not do a good job of explaining the concept to anyone - newbie or otherwise - because it is poor security for more obvious reasons that have nothing to do with security through obscurity, and is therefore likely to lead to confusion.


>If you read up, that is not where this began.

You're right; the first thing we disagreed on was your claim that the article's pretty much industry-standard definition of security through obscurity was incorrect because, in your opinion, it's too ambiguous. I thought we had already resolved that, considering the article you linked said pretty much the same thing that I did. It was your second post where you stated that the SSID isn't an example of security through obscurity.

>I still say SSID is - at best - a piss poor choice of something that is barely an example of the phenomenon.

Considering what you listed as your definition of security through obscurity, it isn't surprising that you think the example is piss-poor. Your personal definition is itself a very specific type of security through obscurity. Therefore, anything that deviates from that specific example is going to seem wrong to you.


... I can't resist. Since when does "secret" in a security context exclude things like keys and passwords? Usually it means things like keys and passwords. You can't say "it's implicit to 'security through obscurity'" because the article was defining security through obscurity. If the audience is expected to be sufficiently unfamiliar to need that definition, they aren't going to know that there's this mystical blessing of keys as "not a part of that". Seriously, where do you work anyway?


First, a failure to communicate doesn't have to mean that either of us is dumb. It just happens sometimes. Second, the thought that you might be trolling has crossed my mind as well, so it now seems unlikely that either of us are. Third, there really isn't any need to repeatedly resort to insults. I have no idea what you do for a living but I wouldn't be so ready to dismiss you as incompetent as you seem to be with others.

Let me try to clarify some of the things, because I often forget to add details that others might find important.

Security by obscurity is generally used to describe very bad attempts to secure something by hiding either the item intended to be secured, or something else that is used to secure it.

To clarify, a password CAN potentially be secured so inadequately that it represents an example of security by obscurity. For example, if a person tapes a copy of his password to the bottom of his keyboard.

Unfortunately, many of the best security mechanisms we currently have still use passwords, or RSA tokens, or something else that we have in our possession. In some cases, we don't yet have (or haven't popularized) anything better, so rather than lump some really useful, yet imperfect, technologies into the same "security through obscurity" category as the jackass who tapes his password to his keyboard, we exclude passwords/tokens that have been properly secured from the conversation.

To clarify even more, if a SWAT-team storms a secure facility, murders the guards, and spends several hours cutting open a hardened safe that contains cryptographic keying material, most security experts aren't going to look at the situation and say "Well, what else do you expect when you rely on security through obscurity? I knew someone would find that password." The situations aren't comparable.

So when I say that passwords aren't included in security through obscurity, what I mean is that the entire concept of having a password should not be considered an example of security through obscurity, because the phrase is generally reserved for practices that are considered very poor. In the distant future, if we find a way to eliminate passwords completely, to the point where only an idiot would depend on one, then things would change and the entire concept would be considered security through obscurity. Similar things have happened in the past. For example, WEP used to be considered secure; now only a fool would use it.

When I say that cryptographic algorithms aren't included, once again what I mean is that the concept of obfuscating your data through the use of a cryptographic algorithm itself is not supposed to be considered security through obscurity, because although most algorithms will eventually be broken, right now some of them are considered to be extremely safe. We aren't ready to abandon our most secure cryptographic protocols, and so the concept of cryptography itself doesn't qualify as a very poor security practice.

There are certain situations where a cryptographic system can serve as an example, such as the example you gave where the algorithm is hidden in order to 'keep it secure.' Another example would be when advances in computing power allow a previously secure protocol to be broken through brute force. When it becomes trivially easy to do so, use of the specific technology would be accurately described as a form of security through obscurity.

So in practice, when two knowledgeable people are having an academic conversation about the theory of security through obscurity, we take for granted that passwords should be properly secured. We also take for granted that we shouldn't be using broken or unproven cryptographic protocols. If we had to stop and explain the semantics behind security through obscurity to other security professionals, it would be like a couple of senior software engineers having to remind each other what a global variable is every single time they want to talk about a complex engineering issue.

I hope I've done a better job of explaining the statements I made, I can see how they may have been overly vague, but I was focusing on your claim that the SSID broadcasts aren't a valid example of security through obscurity.

>Seriously, where do you work anyway?

This is extremely unnecessary and unproductive.


"First, a failure to communicate doesn't have to mean that either of us is dumb. It just happens sometimes."

Agreed. I'd meant my other comment to better indicate that those were potentially transient attributes. Also, smart people are perfectly capable of lacking understanding about something in particular.

"Second, the thought that you might be trolling has crossed my mind as well, so it now seems unlikely that either of us are."

That doesn't really follow, but okay.

"Third, there really isn't any need to repeatedly resort to insults. I have no idea what you do for a living but I wouldn't be so ready to dismiss you as incompetent as you seem to be with others."

I apologize where I was overly rude or confrontational.

"Security by obscurity is generally used [...]. I hope I've done a better job of explaining the statements I made, I can see how they may have been overly vague, but I was focusing on your claim that the SSID broadcasts aren't a valid example of security through obscurity."

It still seems to me that you're more or less using it to mean "weak security", as opposed to a principle which can be used to guide what to fix. It's quite possible that the meaning has weakened/broadened since I first learned about it (plausibly here[1], though I feel like it was earlier), but if so I think that's unfortunate - I think the tie to Kerckhoffs' Principle is important.

">Seriously, where do you work anyway?

This is extremely unnecessary and unproductive."

You brought your profession into the argument. I'm not sure it's extremely unnecessary to establish it, or for the argument to reflect on your profession. Neither am I entirely sure that it was actually appropriate, though.

[1]: https://www.schneier.com/crypto-gram-0205.html#1


Yeah, I'm giving up on this until someone else weighs in. At least one of us is lacking in at least one of understanding of the issues, English language reading comprehension, and ability to write clearly, and I am running out of the patience to figure out which. Presuming, I hope correctly, that you are not simply trolling.


My only "beef" with him, as such, is that he stoops to their level and releases the home address and contact information of people -- frequently young people, if I remember correctly. I don't think these people should be free from consequences, but that doesn't warrant vigilante justice. I'm sure his judgment might be clouded given that he has been personally targeted, though, and it's hard to know what I would do in that situation.


There was a recent post in which he refused to release PII due to the person being a minor: http://krebsonsecurity.com/2014/02/the-new-normal-200-400-gb...

``` “I don’t see what a wall of text can really tell you about what someone does in real life though,” said Rasbora, whose real-life identity is being withheld because he’s a minor. ```


These Black Hats make their living off of companies and people with poorly implemented information security. Brian Krebs makes his living exposing Black Hats who themselves have poor information security. There is some delicious irony in the Black Hats' retaliation against Krebs.


> hit by a truly huge attack that averaged 200-400 Gbps

I take it he's talking about that[1] DDoS attack? So whoever it was launched the biggest attack in history to get at one guy?

[1]: http://blog.cloudflare.com/technical-details-behind-a-400gbp...


Note that the resources required to launch that attack were relatively small. From the Cloudflare article:

>it is possible that the attacker used only a single server running on a network that allowed source IP address spoofing

So perhaps one technically adept attacker, which isn't quite so surprising.
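As a rough back-of-the-envelope (my own numbers - this assumes the attack was the NTP reflection technique the Cloudflare post describes, with the commonly cited amplification factor of roughly 200x for the "monlist" command):

```python
# Back-of-the-envelope: attacker bandwidth needed for a 400 Gbps reflection
# attack. The ~200x amplification figure is an assumption (roughly what's
# cited for NTP "monlist" reflection), not something from the article.
attack_gbps = 400
amplification = 200
print(attack_gbps / amplification)  # -> 2.0 Gbps of spoofed requests
```

A couple of Gbps of spoofed traffic is well within reach of a single well-connected server, which is why the quote above isn't far-fetched.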


>Technically adept attacker

It was a 15-year-old kid: http://krebsonsecurity.com/2014/02/the-new-normal-200-400-gb...


That the attacker was 15 doesn't preclude them from being technically adept.


No, it's a different attack. The one on Krebs' site was about 200 Gbps and lasted 10 minutes; it was done by some kid who wanted to impress a gang or something. The 400 Gbps one was against an unnamed site and lasted a couple of days.


Moderators changed the title from "Life of Brian" to "Life of Brian Krebs." I honestly think it's a poorer title, and they could try to acquire a sense of humor.


It was also misleading. I didn't click on the link at first because I assumed it had something to do with the movie.


They could've at least made it "Life of Brian (Krebs)" or something.


Looks like they have now


Not the main point of the article, but I'm glad that he's going after malware authors. I don't think I've ever felt so simultaneously enraged and helpless as when I got infected.


The description of security by obscurity in this article reads a lot like Kerckhoffs' principle, which, when employed correctly, is actually a virtue. Not to defend cybercrime, but completely covering your tracks (digitally or otherwise) is a very tricky problem - one that people have long tried to solve with both malicious and benevolent intent - and failings in that vein aren't necessarily of the level of amateurishness that the term implies.


To make the article more interesting to read, you should've put the ending ("Barking up the wrong tree") at the beginning. As it is, while reading what you wrote I kept trying to guess where it was heading. If you'd written it in reverse, it would be more compelling and convincing.

Newspapers do the same thing; they write what is most important at the top and add details and examples further into the article.

Good thought and interesting examples though. :)


>Newspapers do the same thing; they write what is most important at the top and add details and examples further into the article.

Inverted pyramid (this style of newspaper writing) has historical roots. The idea was that a typeset story--often wire service copy--could be (literally) cut off at just about any arbitrary paragraph break to fit in the available space.


The modern world is not so terribly different. Most modern readers probably won't make it past the first paragraph.


Interesting. The writing style was dictated by the constraints of newspaper layout and the technology's limited ability to meet those constraints. Does that mean that this writing style is obsolete now, or are there other reasons to prefer it?


One big reason is that it gives people the most important info they need up front. If they care about details, they can keep reading.


There's an undercurrent in the article that is somewhat dismissive of Krebs--i.e. that all his scoops come from tips, and he gets good tips because he's so well known. It even says that if it weren't Brian, it would be someone else.

The facts are, though, that Brian started in the Washington Post mail room and has since worked his way up to a column at the Post, and now to success as an independent blogger. Neither of those was easy or assured for him.

So I would argue that there is something about Brian and/or his work that is special or noteworthy.


I wonder how many times the heroin trick has worked?



