So an American government agency destroyed an American business merely as collateral damage of trying to persecute an unrelated guy who'd revealed their wrongdoing. And they won't face any consequences for doing this. Something is very wrong with this.
> So an American government agency destroyed an American business merely as collateral damage of trying to persecute an unrelated guy who'd revealed their wrongdoing.
I'd argue they destroyed it because it doesn't align with their view of how the country should work. The government hasn't moved much from treating encryption like munitions; they probably viewed Lavabit as an existential threat.
I'm curious, from anyone who knows about this stuff: is the argument in https://www.xkcd.com/504/ valid? I.e., could crypto be protected under the Second Amendment? To my IANAL eyes, it seems to be a completely legitimate argument.
In the alt text of that comic, Munroe says "Jefferson would have been all about crypto."
A quick Google search confirms: not only would he have used crypto, he did use crypto:
> Jefferson had used ciphers before with official as well as unofficial correspondence; letters to James Madison, John Adams, James Monroe, Robert Livingston, among others include communication in cipher. It was a way to keep "matters merely personal to ourselves" as well as a way to "have at hand a mask for whatever may need it."
The Second Amendment only applies to stuff that happens within the US. You have to file paperwork to transmit things like gun schematics over national borders, which is why plans for the Liberator pistol got pulled from the main site within a couple of days and are now shared mostly through torrent sites.
However, if crypto is not a munition, a good argument could be made that it is speech, and therefore falls under the much more thoroughly litigated First Amendment, so restricting crypto could be considered a prior restraint.
Even without this cross-border thing, there are tons of regulations about what kind of weaponry you can have, and I don't think anyone argues that the 2nd amendment would allow anyone unlimited access to bazookas or nuclear weapons. Presumably if cryptography were treated as munitions, they'd argue that you can have access to some cryptography, but not "military grade cryptography".
Right but that logic could be used to prevent mass adoption of better crypto. Brand new crypto is not in "common use". We could, for instance, see the state ban the use of quantum-resistant algorithms by private citizens if citizens are only allowed to use the crypto that is in common use.
The law has not treated new weapons in private hands (e.g., explosives, automatic weapons, etc.) favorably.
RSA-2048 will be the small arms equivalent of a musket in 2040. Since the US government controls so much of the global brainpower for crypto, that's not a standard that is acceptable.
I will politely disagree. RSA-2048, implemented properly, has a significant security margin on its own today, and it's unlikely that hardware advances over the next 25 years will erode that, even at a superbly funded first-world intelligence agency.
I am not saying that they won't be able to pwn you; just that they won't do so by directly cracking 2k+ RSA.
The point is that crypto is only evaluated as "secure" at a given point in time.
Even if RSA-2048 is secure in 25 years, it will almost certainly not be in the same place in its lifecycle. Even today, NSA guidance recommends RSA-3072, or RSA-2048 with an accelerated migration to quantum-resistant crypto in the future. (https://www.nsa.gov/ia/programs/suiteb_cryptography)
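For context on why the guidance moved to RSA-3072, the rough symmetric-equivalence estimates published in NIST SP 800-57 Part 1 can be sketched in a few lines (the numbers are the standard published estimates; the function name is just for illustration):

```python
# Approximate symmetric-equivalent security levels for common RSA
# modulus sizes, per NIST SP 800-57 Part 1.
RSA_SECURITY_BITS = {
    1024: 80,    # disallowed for new use
    2048: 112,   # acceptable today, but near the floor
    3072: 128,   # the level NSA/NIST guidance now points to
    7680: 192,
    15360: 256,
}

def security_bits(modulus_bits: int) -> int:
    """Return the estimated symmetric-equivalent strength in bits."""
    return RSA_SECURITY_BITS[modulus_bits]

print(security_bits(2048))  # 112
print(security_bits(3072))  # 128
```

So RSA-2048 sits at roughly the 112-bit level: not broken, but the size with the least headroom among those still considered acceptable.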
My fear is that there won't be an "AES 2", and the commercial/individual world will return to a place where we don't have access to quality, trustworthy crypto for communications and commerce. Without security there is no trust, and without trust many of the advances in productivity and collaboration that we've gained in the last 20 years will be substantially weakened.
The second amendment applies to arms, not to munitions directly. At the time of its authoring, "arms" was understood to mean devices used by one person for the purpose of targeting one person or thing.
It has never, for example, applied to ordnance (hence, the asinine argument "where does it end - should we allow everyone to have nukes?!" doesn't make any sense, at least politically).
So, I've not been able to actually dig up an answer to this:
Were cannons protected by the 2nd amendment? Or merely not prohibited in a legal environment in which the government claimed the authority to prohibit them?
I can't find any reference to discussion of cannons as "bearing arms."
I've always believed (and have understood to be the consensus from legal references I've seen in the past) that the definition of taking up arms was related to their use to defend oneself against an oppressive opponent. Is that not how it is seen in the courts?
Makes sense. So, hypothetically, if crypto was used within US borders (e.g. I encrypt my hard drive) and no algorithms were shared across national borders, then one could attempt a Second Amendment defense (assuming it was still labeled as munitions)?
I can see how recognizing it as speech and protecting it under the First Amendment would be superior.
>the Ninth Circuit Court of Appeals ruled that software source code was speech protected by the First Amendment and that the government's regulations preventing its publication were unconstitutional
So at least the implementation of crypto in software seems to be protected under the First Amendment.
Didn't the MPAA/DeCSS case (that happened after this) just route around this whole precedent using the DMCA and still end up making that source code illegal to distribute or use?
I remember some of the expert testimony presenting versions of the source code as all kinds of speech (someone even wrote a song I think) to make the point that code should be protected, but in the end they still lost the overall case.
It's one thing to classify code as speech, it's another thing to give it complete protection from censorship or criminalization in any scenario.
According to John Gilmore, when the EFF published the book about the "Deep Crack" DES cracker hardware titled "Cracking DES: Secrets of Encryption Research, Wiretap Politics, and Chip Design" [1], the US Government would not allow them to publish the software source code on a floppy or CDROM along with the book, because the export control laws considered DES software in machine readable form to be a munition.
However, they figured out that the First Amendment protected their free speech to the extent that they were allowed to publish human-readable listings of their source code in the book, even though a floppy disk was right out.
So just for that book, they invented a machine-readable, easy-to-scan "paper floppy disk" system that printed hex checksums of each line in the left column (which coincidentally looked enough like line numbers that it flew under the radar), so you could scan and OCR the source code and checksums from the book, then validate the OCR output against the checksums to correct all the scanning errors.
> "Cracking DES" has been published only in print because US export controls on encryption make it a crime to publish such information on the Internet, but the book is designed to be easy to scan into computers. (EFF is also sponsoring a lawsuit by Professor Daniel Bernstein to overturn the law and regulations that make Internet publication of such research results illegal. The case now rests with the Ninth Circuit Court of Appeals.) [2]
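The per-line-checksum trick described above can be sketched in a few lines. This is not the book's actual scheme (I'm using CRC-32 purely for illustration, and `line_checksum` / `verify_listing` are hypothetical names), but it shows how printed checksums let you pinpoint exactly which OCR'd lines need correction:

```python
import zlib

def line_checksum(line: str) -> str:
    # Toy scheme: an 8-hex-digit CRC-32 of the line's bytes, printed
    # next to the listing. (The book's real checksum may differ.)
    return format(zlib.crc32(line.encode()) & 0xFFFFFFFF, "08x")

def verify_listing(lines, checksums):
    """Return indices of scanned lines that fail their printed checksum."""
    return [i for i, (ln, ck) in enumerate(zip(lines, checksums))
            if line_checksum(ln) != ck]

original = ["key = des_expand(userkey)", "block ^= subkey[round]"]
checksums = [line_checksum(ln) for ln in original]

# Simulate an OCR error: 'l' misread as '1' in the second line.
scanned = ["key = des_expand(userkey)", "b1ock ^= subkey[round]"]
print(verify_listing(scanned, checksums))  # [1]
```

Only the flagged lines need human attention, which makes re-typing an entire book's worth of source practical.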
At this point I'm rooting for one of these existential threats to actually come through. Let's start over fresh with a government that isn't thoroughly corrupted and useless.
"Lavabit has developed a system so secure that it prevents everyone, even us, from reading the email of the people that use it."
That's the entire unique selling proposition of this American business (and a direct quote from their website), and it was a bald-faced lie. The DOJ didn't "destroy" this business; it revealed it for the sham it was. It could have continued operating long after DOJ remanded its keys; it was no less secure after than it was before.
The model Lavabit used is the same security model 99% of cloud services use even today.
It's the same model that iCloud, Google, and others use: all of them control the encryption of data stored on their servers, and all of them will turn over data (rather than shut down like Lavabit did) when asked by the government.
Apple has never refused, and will likely never refuse, to turn over data stored on its servers.
Where did you get the information that the statement was untrue? The site was legit and shut itself down to protest that kind of power play.
The whole thing is worth a read but here's the gist
"Unlike the design of most secure servers, which are ciphertext in and ciphertext out, this is the inverse: plaintext in and plaintext out. The server stores your password for authentication, uses that same password for an encryption key, and promises not to look at either the incoming plaintext, the password itself, or the outgoing plaintext.
The ciphertext, key, and password are all stored on the server using a mechanism that is solely within the server’s control and which the client has no ability to verify. There is no way to ever prove or disprove whether any encryption was ever happening at all, and whether it was or not makes little difference... The operator can at any time stop averting their eyes, an attacker who compromises the server can log the password a user transmits, and an attacker who can intercept communication to the server can obtain the password as well as the plaintext email."
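A toy sketch of the "plaintext in, plaintext out" pattern the quote describes (all names are mine, and the XOR "cipher" is deliberately simplistic; the point is that the server handles both the password and the plaintext, so its promise not to look is pure policy):

```python
import hashlib

class PlaintextInServer:
    """Toy model: the server receives the password and the message,
    derives the encryption key itself, and merely promises not to peek."""

    def __init__(self):
        self.stored = {}

    def receive(self, user: str, password: str, message: bytes):
        # The server sees the plaintext and password right here --
        # nothing but policy stops it from logging either one.
        key = hashlib.sha256(password.encode()).digest()
        pad = key * (len(message) // 32 + 1)
        self.stored[user] = bytes(m ^ k for m, k in zip(message, pad))

    def read(self, user: str, password: str) -> bytes:
        key = hashlib.sha256(password.encode()).digest()
        ct = self.stored[user]
        pad = key * (len(ct) // 32 + 1)
        return bytes(c ^ k for c, k in zip(ct, pad))

server = PlaintextInServer()
server.receive("alice", "hunter2", b"meet at noon")
print(server.read("alice", "hunter2"))  # b'meet at noon'
```

The client has no way to verify that `receive` encrypts at all, or that a modified server isn't logging the password; that is exactly the criticism being quoted.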
If I understand what you are saying, then the same could be said about the iPhone 5C: that it was built with a backdoor/security flaw, namely the ability to install a new OS without a passcode, allowing brute-forcing of a simple 4-digit passcode.
You could use a secure password in lieu of the 4-digit one, though, couldn't you? My understanding was that the software running on the device to enforce the retry limit functioned as a backup measure against passcodes vulnerable to brute force in the first place. If the password was strong, the brute force would be futile.
Re: use of a complex password would have made the iPhone more secure - that's a fair point, but prior to Touch ID, passcodes longer than 4 digits were pretty rare in practice (I suspect < 1% of users). Apple could have avoided this entirely by putting a delay in hardware that couldn't be changed without the passcode.
The nuance I'm trying to get at here is that the fundamental security flaw / backdoor is already present on the iPhone 5C; the FBI is just trying to utilize it. Something as simple as a mandatory hardware-enforced increasing timeout would have made brute-forcing the passcode prohibitively expensive. (50 ms, 100 ms, 250 ms, 500 ms, 1 sec, 2 sec, 5 sec, 10 sec, 30 sec, 1 min, etc...)
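Back-of-the-envelope arithmetic shows why such a hardware-enforced schedule works. Using the delays from the comment above (with the last delay repeating as a one-minute cap), exhausting all 10,000 four-digit passcodes takes roughly a week:

```python
def bruteforce_time(attempts: int, schedule) -> float:
    """Total enforced delay, in seconds, for `attempts` sequential
    guesses; once the schedule is exhausted, the last delay repeats."""
    return sum(schedule[min(i, len(schedule) - 1)] for i in range(attempts))

# Escalating per-guess delays, capped at 1 minute.
schedule = [0.05, 0.1, 0.25, 0.5, 1, 2, 5, 10, 30, 60]

days = bruteforce_time(10_000, schedule) / 86_400
print(round(days, 1))  # ~6.9 days to exhaust every 4-digit passcode
```

Nearly a week of continuous guessing for a 4-digit code, and the cost scales by 100x per extra pair of digits, which is why a non-bypassable delay alone would have blunted this attack.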
So much so that one might be led to suspect a coordinated effort, or a "conspiracy"? It's not only western countries, Japan for example is eagerly part of the program.
Possibly an ad-hoc conspiracy; at that level they all talk to each other and watch what the others are doing. The permanent political class is a problem for all democracies, I think.
I suppose that's kind of what I had in mind when I wrote "conspiracy" in quotes. I don't mean secret societies (nor to rule that out either) - but it seems pretty clear that some groups are seeking to extend their power and it's a coordinated effort.
He destroyed his own business. I do respect his morals, but I'm not clear on what exactly he hoped to accomplish.
Could you legally reveal the target of an active phone wiretap? Intercepting email metadata for a specific person under investigation seems like a reasonable thing for law enforcement to attempt. Is his problem with it the action, or the idea of a secretive court allowing this?
The other issue is having to turn over encryption keys. This is much more of a grey area, and I can see why he would shut down rather than set that precedent. This is why if you want your data safe you should have the keys.
His moral stand wasn't for Snowden. He surmised the court would give the key to the Feds and they would use it to decrypt things beyond the scope of the order. He stood on principle against the court's insistence that he trust the plaintiff to use the tool within the court's orders. The court's naiveté is cringeworthy.
No, that's not what happened. The government initially asked Lavabit to intercept and provide data for one user only. Lavabit had done this in the past for the FBI on other cases. However, the owner dragged his feet in responding to this particular request for no legally defensible reason, asked for compensation, et cetera, and so the government demanded the SSL keys to do it themselves.
I read a comment somewhere (I think it was here, or perhaps another site) that Snowden was to HN what the SCO case was to Slashdot back in the day. That is - times were happier until it seemed like politics took over.
I'm not trying to knock either site. What I'm trying to get at is that there's a certain amount of group-think (group-angst in this case?) and if you immerse yourself exclusively in that, you start taking on that view. This 'existential angst' (I really can't think of a better term, though I know this isn't wholly adequate for what I'm trying to describe) is contagious.
HN became my daily news source when I finally got tired of Slashdot's political bent. After Snowden, HN has taken on a somewhat similar flavor. Don't get me wrong - I agree with a lot of it. I also know that sometimes I just have to step away from it, for my own happiness.
It's not about willful ignorance - it's about picking your battles and guarding your own interpretation and mood against the thoughts and mood of the group.
It makes me sad as well that the average person has no idea or understanding of the importance of the battle going on behind the scenes about their own privacy. These power-mad governments are more concerned with spying on their own people and removing any chance of privacy than actually doing good for the people who voted them in.
It just makes me sad... There's so much good that could come from computers, but our own governments just want to use them to spy on people. :-(
Ignorance, in the sense of ignoring the world around you, is bliss. I sometimes envy those who can live in their own world where nothing seems to affect their happiness.
But more seriously, if you correlate happiness with ignorance and pass that off as something you don't have the power to change, that is a mistake. Happiness and your attitude towards things is something you can work on.
Intelligence, "smartness", and ignorance are all orthogonal.
Even very intelligent and very smart people can approach an issue with extreme bias and blatantly ignore (get it? ;) news, facts, etc. that contradict that bias.
I would argue that William Dembski[1] is both smart and intelligent, but I would also call him ignorant.
[Edit: my point being that people who choose to ignore the things that contradict their bias are happier as a result of their ignorance]
Apologies, I guess that was more of a reference appropriate for Reddit than HN.
> I too am very smart.
is a reference to the subreddit at https://www.reddit.com/r/iamverysmart which points out people describing themselves as very smart, often in ironic ways. I realize now that it may have looked like I was actually making a claim about my own intelligence; I was not.
"It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied. And if the fool, or the pig, are of a different opinion, it is because they only know their own side of the question. The other party to the comparison knows both sides."
Ignorance is bliss. Personally I'd rather be informed, that way when I do have the power to exert a little influence (such as in the voting booth) I'm more likely to make a decision that will steer away from more mess. Even so that almost always feels like decisions between various grades of bad rather than something good.
After September 11, 2001, I stopped listening to the news for about 5 years. I was tired of hearing one horrible thing after another. While I wasn't living under a rock, the things I did hear about -- Blackwater, Abu Ghraib, "Mission Accomplished" -- did nothing to make me want to pay attention again. In retrospect, I think it's too complacent a stance, but I still think a daily dose of bad news is excessive. Which is why I read up on the presidential primaries at a weekly interval.
A casual Google search says that "happy" == "feeling or showing pleasure or contentment".
The more informed you are, the more problems you will see and the less content you will be with the state of things. Problems aren't particularly conducive to happiness.
That said, ignorance is not necessarily bliss. It is better to be informed and be able to protect oneself from possible problems than it is to continue to pretend everything is fine and fall victim to eventual problems.
Even though I very much agree with Lavabit, I think it's more honest to say that the owner decided to shutter the business rather than comply with the spirit of the order.
If you force a company that sells encryption to break that encryption for all existing customers, you pretty much destroy the company. What the owner finally did doesn't matter anymore.
Uh didn't Lavabit destroy themselves? They offered an impossible product and refused to comply with lawful (and not unreasonable) court orders. They essentially lied about what they were capable of. Hint: If you're receiving SMTP from the world, you always have access to the content of that email. Even if you think you're clever by encrypting it "right away".
They promised secure communications and when the government made it impossible for them to legally continue to offer that service, they shut down rather than betray their users.
As far as I can see, they delivered on their promise.
No, he promised secure communications, developed a dumb design that tied all users to one secret, and folded rather than divulge it. Fortunately, others learned from his mistake in designing better services.
No, in foresight. It was so dumb that secure messaging systems from the '80s and '90s did better in various ways. High-assurance designs had already settled on proxies, guards, and per-message crypto because of the many attacks possible. Hushmail put the needed functionality in an applet to avoid relying on just web tech.
Then there was the Lavabit founder acting snobby, talking about how he used the same security as banks. What might that be? HSMs for keys, link encryption, and per-user policies? No, one SSL key protecting everyone, like a generic website. Ohhh....
It was definitely foresight on my part when I avoided that and similar services. Not to mention I thought they'd 0-day the servers. Still not sure why they didn't. You don't need the US government in your threat profile to know top-notch hackers like the Chinese, Russians, and Israelis might target you to see who you're hiding with "bank style" security.
Note: I protect my Hacker News posts with bank-grade security. I require a login, name, and HTTPS on all posts. I feel safer already.
Wikipedia says Lavabit complied with previous search orders. They just chose to take a stand.
They did not create a secure communications platform. Any email sent to a user would be delivered plaintext (SMTP), and nothing stopped Lavabit from intercepting that info other than them saying so.
This is why (afaik) Silent Circle stopped offering their "secure" email system, because Lavabit made it clear that such a service is not really secure.
As a side note, Snowden also used Cryptocat so the fact he used Lavabit isn't necessarily a positive endorsement.
This is the same kind of government we're expected to trust with private keys and source code from companies like Apple and Google. Of course, there's been a long history of the government not knowing how to secure IT infrastructure, including a data leak that exposed the identities of its own undercover spies: http://money.cnn.com/2015/09/30/technology/china-opm-hack-us....
In the latest testimony, Comey tried to brush off Apple's statements, saying that it is not Apple's business or place to judge whether or not the FBI is doing a good job keeping citizens' data safe.
I cannot imagine a statement more detached from reality than that. If companies like Apple, which sell millions of products to customers and rely on customer trust, are not a good indication or source of information on whether the government is doing a good job, then I don't know who is...
FBI wasn't responsible for the data being stolen by China, that was OPM[1], a completely separate and NON-security focused branch of government.
From my point of view, whenever "we the people" demand more "answers/solutions" from the government, what ends up happening is more bureaucracy that ends up failing us all.
I'd rather we be precise in these type of conversations instead of using the general 'they'. Corporate personhood was a decision by the supreme court, a separate branch of government than the FBI. The FBI doesn't seem to like the Bill of Rights whether it applies to people or corporations.
If the mountain of papers is high enough, someone will mess up. The funny thing is that this is symmetrical: the same kind of math that underlies "three felonies a day" applies to court filings like these. The more opportunities to mess up, the bigger the chance that someone will.
That said I don't think that there is anybody out there who is shocked by this confirmation, it was as far as I'm concerned a certainty, the timing would have been too much of a coincidence.
EFF's Privacy Badger does the same for me. After looking at the list of 10 different blocked tracking sites, I just gave up. It's too bad; I'd otherwise be happy to visit their fine website.
Someone shared this a long time ago on an HN thread. Put this code into the URL of a bookmark, then click it to get rid of annoying popups and grey-outs. It works on Chrome at least:
Ha ha! Yes.... I read the article before noticing it was on Wired. Then I wondered why I was actually getting to read the article. Then I realized I was accidentally using the wrong browser -- one without an ad-block enabled. sigh
I was thinking the same thing. Particularly because it was human error which released the information. If there was genuine human error on Ladar's part then I guess there would not be a lot of leniency. I don't know how the government expects human beings to keep secrets from everybody they ever know, until the end of time.
The number of times little things slip out in conversation when you don't mean them to makes me think it is inevitable that somebody, someday, will make a mistake similar to the government's here. I don't think they will be treated lightly when that happens, either.
It's worth noting that the same bunch of people responsible for this mistake are those who want the public to believe they're sufficiently responsible and trustworthy to:
a) Safely store and analyse the results of mass public surveillance.
b) Hold 'master' keys to encrypted systems.
Of course no-one in the know seriously believes either claim, but this is a great counter-example to put to the general public.
And that right there is the central issue with the whole concept of a "trusted third party" in encryption. Three people can only keep a secret if two of them are dead.
The biggest problem is not this mistake but the total control and surveillance they are seeking and the persecution of those that try to defy them like Snowden.
It was an open secret that Lavabit was targeted because of Snowden's account there, but it's a really good thing to know for sure, just for the sake of transparency.
Even if in this case, it was non-intentional transparency.
While I also use an ad blocker, it seems odd to be upset that they block you from viewing their site. You've (we've) decided that you (we) don't want their ads. They've decided they don't want readers who don't take their ads.
I've reread the GP post several times and I am only seeing a stated fact, not evidence of any specific emotion. The GP might be upset, but his post is sparse enough that his point isn't really clear. While the post doesn't add much to the discussion, neither does a non sequitur about fairness or using pronouns in weird, parenthetical ways.
But I find some value in the GP post. It tells us that Wired is taking an active stance against ad blockers. This information can be used when considering sources for submission or deciding whether to click a link. Many of the responses to the GP provide value as well by shedding light on which ad blockers are more effective and/or are triggering the issue.
Problem is: there are thousands of comments like this on HN. They don't carry any entropy anymore. It is just an expression of being upset about this "discrimination".
Indeed. The appropriate response when someone renders their own site unusable is to close the tab and go on about your day. Hopefully their analytics tools will help them refine their design.
Personally I use an adblocker and got the door slammed in my face as well. I found it frustrating but closed the tab and went about my business. Trouble is, this is allowing this very non-webby behavior to normalize itself.
There should be a backlash against this idiotic adblocking-blocking, I'm just not sure what form it should take.
We already know that some people use an ad blocker. We also know that some pages try to get people to disable those blockers. What I, at least, don't know is why some people still think they have to tell us that they use an ad blocker.
Or in other words: what does your blurb have to do with the content of this article?
The contents of the article is invisible to anyone who blocks outgoing connections to domains ending in "d.com" as well. That doesn't mean that people discussing the article need to hear about how you can't read it.
The banner politely asks you to deactivate the ad blocker. You just need to deactivate it on that page and you can read the article, and you get a thank-you message. I think Wired is very polite in its anti-ad-blocker policy.
You should read a Terms of Service sometime. You would be shocked at the shortage of the word "please". And most upsetting is that there is no "Just let me ignore this!" button next to the "Accept" button.
You can defeat their ad-blocker blocker by disabling javascript.
In Chrome, go to DevTools (CMD+Option+I). Press F1 for settings then check "Disable JavaScript" and reload the page. There are similar ways to do it in other browsers too.
More and more, I find myself using "Inspect Element" in Firefox to bypass ad blockers in order to read a page (or less!) of text. Today was no different.
I do this too, manually edit CSS on some sites to improve my reading experience. I do the same for removing fixed header menus that reduce the height of readable content. And sometimes fonts. Today I found an extension (on Chrome) that lets me keep my CSS changes per site - probably there's something similar for Firefox.
NoScript and Self Destructing Cookies (firefox extension) makes the internet much better overall. The first handles all but the most aggressive paywalls (the really nasty ones use <meta refresh> inside noscript blocks). The second handles the sites that you've turned scripts on for, clearing their cookies as soon as you navigate away. Just whitelist the sites you want to stay logged in to.
It's eye-opening how much of the internet relies on JavaScript that you don't notice - and how much works totally fine without it.
uMatrix is even better than NoScript in many ways. It blocks all cross-site requests aside from CSS and images by default, and can integrate with uBlock Origin to provide a visual indication of which requests are to known ad sites.
Wow, this spawned lots of debate. For my part, I am far happier to temporarily disable an ad blocker than to get the "5 articles per month" nonsense from the Economist or NYT.
Interesting. I get the same banner, but I'm running an anti-malware installer script. Maybe I should upgrade to one that downloads their malware then just /dev/nulls it.
This is important information, but man, I really applaud the writer for stretching the story "they forgot to redact his email" and repackaging it into 10-15 paragraphs.
As I recall, the FBI contacted Lavabit and asked for information on a single user. That would have entailed altering their system at the behest of the FBI, and they said no. Then the FBI asked for the TLS private key, so the FBI could intercept users themselves. Lavabit said no, and ended up shutting down.
Is anyone else surprised that the government wanted information about Ed_Snowden@lavabit.com rather than cincinnatus@lavabit.com? Maybe they wanted both and just redacted cincinnatus@lavabit.com properly. I thought that was the account he was using for distributing information about the NSA documents he took.
>IT’S BEEN ONE of the worst-kept secrets for years: the identity of the person the government was investigating in 2013 when it served the secure email firm Lavabit with a court order demanding help spying on a particular customer.
We all knew it, but it hadn't been confirmed by the government until now. See [0] for example:
>The name of the target is redacted from the unsealed records, but the offenses under investigation are listed as violations of the Espionage Act and theft of government property — the exact charges that have been filed against NSA whistleblower Snowden in the same Virginia court.