That page is not available to me - I can see 113 and 115, but not 114. Can someone tell me what it's about?
Edit: Found a link for those that cannot see it either: http://litru.ru/book/?p=214644&page=24 . No idea why Google decided I cannot read that particular page.
Even though I just re-read that excerpt (lovely book indeed), for a moment there I thought you were talking about Teller from Penn & Teller, and how much I'd love to see those three talking or playing tricks on one another :) (well, without Teller talking, obviously)
No, he just kept on pulling and then another paper would be within reach until the drawer just about emptied. Similar to how folded napkins are all separate but you can pull one out of a container and then the next one will be pulled into place by the previous one. That's on purpose and this is accidental but the mechanism would be much the same.
> Pity they didn't just pack up and go home after that.
Indeed. The public radio station in my town (in the U.S.) airs a special on the Christmas Truce every year around Christmastime. It's supposed to be uplifting, but I find it almost unbearably sad. The Europe out of which that spirit came was demolished during the war.
I suspect those messages were never official, but more the local operators making the best of being posted during holidays. Thus they were likely never recorded anywhere.
[[[ To any NSA and FBI agents reading my email: please consider ]]]
[[[ whether defending the US Constitution against all enemies, ]]]
[[[ foreign or domestic, requires you to follow Snowden's example. ]]]
Also, 2016 is almost here. Don't forget to post your annual "You are hereby notified that you are strictly prohibited from disclosing, copying, distributing, disseminating, or taking any other action against me with regard to this profile and the contents herein." status update to Facebook.
commit ac468f3fab9f7092a430eedfd69ee1fb2e23c944
Author: Martin Thomson <martin.thomson@gmail.com>
Date: Fri Jun 14 13:14:02 2013 -0700
Exercising editorial discretion regarding magic.
diff --git a/draft-ietf-httpbis-http2.xml b/draft-ietf-httpbis-http2.xml
index 58bbc27..f1e570d 100644
--- a/draft-ietf-httpbis-http2.xml
+++ b/draft-ietf-httpbis-http2.xml
@@ -385,9 +385,9 @@ Upgrade: HTTP/2.0
The client connection header is a sequence of 24 octets (in hex notation)
</t>
<figure><artwork type="inline">
-535441202a20485454502f322e300d0a0d0a52540d0a0d0a</artwork></figure>
+505249202a20485454502f322e300d0a0d0a534d0d0a0d0a</artwork></figure>
<t>
- (the string <spanx style="verb">STA * HTTP/2.0\r\n\r\nRT\r\n\r\n</spanx>) followed by a
+ (the string <spanx style="verb">PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n</spanx>) followed by a
<xref target="SETTINGS">SETTINGS frame</xref>. The client sends the client connection header
immediately upon receipt of a 101 Switching Protocols response (indicating a successful
upgrade), or after receiving a TLS Finished message from the server. If starting an
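For anyone who doesn't want to decode the hex by hand, here's a quick sketch (plain Python standard library, using nothing beyond the octets shown in the diff above) of what the change amounts to:

    # Decode the old and new 24-octet connection prefaces from the diff.
    old = bytes.fromhex("535441202a20485454502f322e300d0a0d0a52540d0a0d0a")
    new = bytes.fromhex("505249202a20485454502f322e300d0a0d0a534d0d0a0d0a")

    print(old)  # b'STA * HTTP/2.0\r\n\r\nRT\r\n\r\n'  ("START" split across the CRLFs)
    print(new)  # b'PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n'   ("PRISM" split across the CRLFs)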
"... The disclosures were published by The Guardian and The Washington Post on June 6, 2013. Subsequent documents have demonstrated a financial arrangement between ... "
I had the pleasure of briefly working with Martin at LMAX. He definitely knows how to get performance out of a system, and although it's not been updated recently I'd recommend his old blog for those that haven't seen it yet and are interested in where the nanoseconds go :
A funny joke, but a telling statement. Words so commonly used reflect attitudes and history. For instance, most of the English-speaking world says Goodbye ("God be with you") while the French use the more neutral Au Revoir (until we see each other again). Web browsers now greet websites with "PRISM", the equivalent of "the walls have ears".
This is a pretty good way to remind ourselves ("Never Forget") why some things are the way they are. It stands out and is perhaps unexpected, giving a chance to have a discussion around the things we want to avoid or combat. That it shows up in the protocol gives relevance to a FAQ entry on the topic, vs just being a by-the-way FAQ entry that is easier to ignore.
It could just as well have been ECHELON [0], but PRISM has more recent, documented meaning and more mindshare specifically related to domestic spying.
While both "the dress" and PRISM appeared in the news, let's not forget some news is actually newsworthy and some is forgettable mind candy. PRISM needs to be remembered, and including it in the protocol message ensures it will be discussed for much longer than the dress -- as it should be.
"And the experimental data we have (what there is of it) suggests that we need to make this look like an unknown HTTP/1.1 method (or two)."
Anyone know what experimental data this refers to, and why this helps? This gets encapsulated inside TLS; nothing should know about it except the endpoints, both of which need to understand HTTP/2.
HTTP/2 is not TLS-only, even though major browsers will only use it over TLS. And even when over TLS, the TLS connection might terminate at a separate machine, with traffic going cleartext the rest of the way.
> HTTP/2 is not TLS-only, even though major browsers will only use it over TLS.
Which in practice will (hopefully) make it TLS-only. As I recall, one of the original motivations to require TLS, and one of the reasons browsers plan to mandate TLS (apart from the obvious), was specifically to avoid broken "transparent" proxies.
> And even when over TLS, the TLS connection might terminate at a separate machine, with traffic going cleartext the rest of the way.
If the server uses a TLS frontend device and passes cleartext to a backend, then apart from that being a really bad idea from a security perspective, they should know better than to allow that to pass through a broken transparent proxy.
1) We can try to add more encryption to fight back.
2) We can recognize that there need to be hooks for duly authorized access.
3) We can change or at least influence the political objectives.
Personally, I'd have assumed option one is the obvious answer, in addition to increasing adoption of encryption in other areas (though that may be outside the scope of their project). Unfortunately the author seems to conclude that there need to be (presumably CALEA-style) hooks for "duly authorized access". It is almost unbelievable that they are openly suggesting implementing backdoors in communications protocols. I expect this from LE and politicians, but I don't expect it from FreeBSD committers.
This change was committed on Jun 14th 2013, about a week after the PRISM programme had been leaked by Edward Snowden. It is a clear reference to the programme but by this point it was already public knowledge.
The author isn't implying that the NSA intentionally injected the word "PRISM" into the header of every HTTP/2 connection. Rather, that the members of the IETF that collaborated on the standard chose to change the existing dummy text to match those initials once the existence of the program was leaked to the public, presumably as a sort of inside joke/protest.
^ I figure I'd clarify that in case anyone else gets as confused as I did about what possible political or technical objectives the NSA might achieve by including a reference to a top-secret spy program in every HTTP/2 message (although it does seem like the sort of thing a comic-book evil intelligence organization would do :P).
Hold on. AES is not intrinsically vulnerable to known plaintext (like every modern cipher, probably going back all the way to DES), but that doesn't mean AES constructions can't be attacked through known-plaintext runs.
This particular text isn't a vulnerability (after all, HTTP is full of known plaintext), but I think you've overplayed your argument. :)
yeah, if knowing the leading 192 bits of the plaintext (24 octets, per article) lets an attacker break a session encrypted with 256-bit AES, you've got bigger problems than a known-plaintext attack, methinks. :)
Well, it eliminates most of the possible keys. Out of 2^256 possible keys, only 2^64 are now possible.
This would be serious if someone found a way to easily identify which keys were still possible. From there, you could find the real key simply by a process of elimination. It would be somewhat lengthy, but not unreasonably so. But so far as I know, nobody yet knows how to easily identify which keys are still possible.
Note: I'm ignoring IVs here, partly because I don't know if HTTP2 uses them. They may make the process of elimination impossible (at least in realistic amounts of time).
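A minimal sketch of why the IV/nonce matters here, assuming the third-party Python "cryptography" package (the key, nonces and plaintext below are invented for illustration): with a fresh random nonce per message, the same fixed prefix encrypts to unrelated-looking ciphertext every time, so there is nothing constant to correlate across sessions.

    # Same plaintext prefix, fresh random nonce each time -> unrelated ciphertexts.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    preface = b"PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n"   # the known 24-octet prefix

    aesgcm = AESGCM(key)
    ct1 = aesgcm.encrypt(os.urandom(12), preface, None)
    ct2 = aesgcm.encrypt(os.urandom(12), preface, None)
    print(ct1 == ct2)   # False: identical plaintext, no visible relationship in the ciphertexts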
That isn't supposed to happen; it would be a break of AES if it were shown to be possible. However, a fixed plaintext prefix means any such break would be instantly exploitable, rather than being mitigated by defense in depth requiring multiple independent failures. Our crypto systems should not be so fragile...
This may be true for a single session, but it would seem that having every session begin with the same long text could allow correlations to be drawn on large data sets, potentially leading to full recovery of the sessions.
Do you know of the proof that rules out such a possibility?
No, that is not a realistic vulnerability. I am now regretting nerding out about AES upthread, since I seem to have created the impression that there is some risk to having every HTTP request start with a fixed string. There is not.
> Why not? What allows AES to escape such an attack?
What attack? A known plaintext attack is just a type of attack. There aren’t any such known attacks against AES currently. http://crypto.stackexchange.com/a/10837/7264 is related.
Fair enough. Maybe there is some disagreement between the participants of this thread in what "known-known" means. I would say that without a proof that a known-plaintext attack cannot exist, one cannot say "There is not (a known-plaintext attack)." One could say, however, "I do not know of a known-plaintext attack against AES, however I do not know of a proof that rules one out."
Also, HTTP/2 traffic may be encrypted using another scheme (which may be vulnerable to large constant blocks in known locations). I don't think HTTP refers to or deals with encryption at all. I'm not sure about HTTP/2. My understanding is that they are application protocols only and that message integrity protocols operate at a different layer.
> Also, HTTP/2 traffic may be encrypted using another scheme (which may be vulnerable to large constant blocks in known locations).
This isn’t really a situation worth caring about when designing a protocol. If you’re choosing to encrypt something with weak encryption, you had better be darn sure that its vulnerabilities don’t apply. (And… why would you do that to begin with? No compelling performance reasons, even.)
If you use randomness (like in say CBC mode) that should suffice to give you IND-CPA security, no? And I believe GCM mode gives you IND-CCA2 security as well.
Actually CBC is a great mode for demonstrating how constructions built on secure primitives can introduce vulnerabilities. We can prepend a specific block to one with known plaintext in CBC mode to end up with garbage + plaintext we control.
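To make the CBC point concrete, here is a rough sketch of the standard bit-flipping variant of that trick (assuming the Python "cryptography" package; the key, filler block and target string are made up for illustration): XOR a chosen mask into the preceding ciphertext block and, on decryption, that block turns to garbage while the known-plaintext block becomes text of the attacker's choosing.

    # CBC bit-flipping: garbage in block 1, attacker-chosen text in block 2.
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key, iv = os.urandom(32), os.urandom(16)
    known = b"PRI * HTTP/2.0\r\n"                 # 16-byte block the attacker knows
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ct = enc.update(b"block-one-filler" + known) + enc.finalize()   # two exact blocks, no padding

    target = b"GET / HTTP/1.1\r\n"                # what we want block 2 to decrypt to
    mask = bytes(a ^ b for a, b in zip(known, target))
    tampered = bytes(a ^ b for a, b in zip(ct[:16], mask)) + ct[16:]

    dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    print(dec.update(tampered) + dec.finalize())  # garbage block, then b'GET / HTTP/1.1\r\n'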
Not actually requiring something to start with a fixed string doesn't really mean much when every message begins with that fixed string anyway, simply because that's the way it's done in practice.
I'd believe it might be the most common (though I'd guess "GET /favicon.ico HTTP/1.1" to be a real contender). I highly doubt either is more than a small fraction of total HTTP requests, though.
As shown in the commit itself[1] (and implied by the original article), the connection header always started with a constant text block. It was just changed to include "PRISM" instead of "START".
I downvoted you, but I thought I'd explain why - I don't think it's reasonable to characterise this as childish. Government surveillance is something which our industry is the best positioned to speak out against. This type of thing seems to be clearly political speech, and not a prank.
Legit question: why (in this case)?
If you mean general lack of professionalism, sure, but in this case encoding this as a matter of protest seems appropriate.
Yes; the entirety of our internet infrastructure is...almost...hopelessly insecure. The OpenSSL team refuses to even use SSL/TLS on their website because "they don't want anyone to get the slightest illusion that it's secure" and want everyone to manually verify the SHA hashes.
2) The OpenSSL source code is stored in a git repo on GitHub. While this doesn't ensure that the code hasn't been tampered with, git does make it substantially easier to detect tampering than other VCSs do.
3) All of the release tarballs are PGP signed. Verification of the authenticity of these files is just about as automatic as it gets.
Have they posted their concerns regarding SSL/TLS? It would make interesting reading. I am assuming the issue is the certificate issuance hierarchy and correspondent lack of transparency, but that's just a guess.
Of course, if those hashes are also served over plaintext, then comparing them doesn't matter either, and using them as verification is akin to praying not to be compromised.
As part of the in-group I can think it's kind of funny in a "stick it to the man" kind of way. But considering that much of what lets the NSA do what it does stems from the failure of organizations like the IETF (and of bystanders like ourselves, who don't put more pressure on them) to make secure technologies prevalent, it's not that funny anymore.
This is exactly what the NSA wants. That you feel like you're on the right side and "sticking it to the man" while they laugh all the way to their long term data storage.
It might be political speech, but it's also childish.
What is it intended to accomplish? What is the actual statement being made, and who is the intended audience? Not the government - they already know what PRISM is. Also not the common internet user. If it had the potential to be effective at influencing or censoring political dialogue, I would be upset at the attempt to bake propaganda (however sympathetic the tech community might be to it) into what should be a politically neutral protocol. But it does seem more like a prank than anything else.
I disagree! I think there's a pretty good case to be made that if the IETF is not in fact comprised of immature hobbyists, its overall behavior is well described by a reversion to that lowest common denominator.
There are some great IETF success stories (mostly from a long time ago), but most of the best IETF products are the result of forceful and expert engineering done by just one or two people, carefully documented. You can see how much better that kind of work is than the IETF collaborative process by looking at the v2's and v3's of standards that started out the other way.
It's not a crazy observation to make. Think about the culture of heavily-engaged Wikipedia volunteers. Wikipedia is an amazing resource, probably the most impressive thing on the entire Internet, but most HN commenters would have no trouble calling it out for immaturity, insularity, and frivolity.
and then you go on to make a point that agrees with OP, pointing out that IETF members are very mature engineers who are trying to solve hard problems.
No. You've misread me. I do not like the IETF, and feel like even though there are lots of well-intentioned and highly-competent engineers working within it, its process and current overall membership produces results that are sometimes indistinguishable from "immature hobbyists".
Specifically, I think the argument that the success of the modern Internet is evidence that the IETF is "mature" is dubious and worth debating.
(Hey: just to be totally clear: I do not think the string "PRISM" is evidence of immaturity.)
The Internet is apparently in its "infantile" brain stage. Would we notice when the Internet's "memory" (the Web) becomes a cantankerously irreverent mentally unstable adolescent?
The relevance is that this text could ostensibly be used as known plaintext to aid in breaking the encryption; however, it's not going to help much anyway.
For example, with HTTP/1.1, it is pretty likely that an HTTP response would start with "HTTP/1.1 200 OK", or that it would contain strings such as "Content-Type" or "Content-Length". 24 more known bytes won't make it much worse.
When you have an encrypted file (or a stream, for that matter) and you know the first N bytes of the decrypted data, it might be easier to compute the encryption key if the algorithm uses blocks smaller than N.
...and how much have you contributed to HTTP/2? Are you in any position to criticize Martin based on his avatar of all things?
edit: I apologize for the ad hominem nature of this comment. I just get frustrated when people attack major contributors to open source projects. We need to keep in mind how important these projects are and how thankless contribution can be.
Martin has done a lot of work on HTTP/2 and WebRTC. If somebody wants a different message in the HTTP/2 handshake that doesn't evoke immaturity, then that person can contribute enough to gain stature in the community and then change that message. Criticizing Martin's avatar is just dragging us into deeper waters for nothing.
Fair enough, I admit that comment was not one of my finest and should have focused the argument on the substance of HTTP/2 itself instead of its contributors.
In-group behavior is not a good indicator in terms of being a meritocracy. Rather it indicates that people don't expect anyone to scrutinize them.
Say there's a summit on internet governance and someone is making a point about this statement against PRISM and wants to link to his commit message. Say that someone is part of the US government and the audience is Middle Eastern businessmen. Who is taking away from the meritocracy now?
If you're attempting to slyly suggest something, everyone would be much better off if you'd just come out and say it. Please don't waste people's time with innuendo.
That's so unprofessional, my bow-tie came off! He should be hanged! You do still have hangings in the colonies, right? Nothing gets the rabble in line like a good hanging, I always say. Oh, I'm late for my croquet game. Toodles.
Heh. Ironically, the HTTP WG (including Martin) went through a period where many of us wore bow ties, etc. to meetings. Trying to raise the sartorial standards of the IETF (under the guidance of the ever-well-turned-out Hasan).
Why would you not expect programmers with strong political leanings? What kind of world do you live in where top researchers all share moderate views and modes of expressing those views?