We can still improve this by caching across CDNs.



You mean updating browsers to recognize the same file across different domains (e.g. by an MD5 hash)?
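Something like this, maybe (just a sketch; "content-hash" is a made-up attribute, and the URL and hash value are placeholders):

    <!-- hypothetical: the browser may satisfy this from any cached
         file whose contents hash to the given value, regardless of
         which origin it was originally fetched from -->
    <script src="https://cdn.example.com/jquery.min.js"
            content-hash="sha256-PLACEHOLDER"></script>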



Shipping Google Chrome with a pre-downloaded static cache of all these popular JS hosting services' URLs and contents would probably help the most, and would be really easy to implement.
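Roughly: ship the browser with a seed manifest mapping popular CDN URLs to files bundled with the install, something like (an illustrative format I'm making up; the URLs are real Google CDN paths, the local paths are placeholders):

    {
      "https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js":
          "seed/jquery-1.7.1.min.js",
      "https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js":
          "seed/jquery-ui-1.8.16.min.js"
    }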


The reasons this hasn't been done are:

a) it would require all servers and browsers to be updated for what is a marginal gain

b) it's a privacy nightmare


It would require browsers to be updated (to support checking the hash), but not servers.


a) No it wouldn't; it would be totally optional. b) Get over it; there is no difference between this and CDNs.


> Get over it; there is no difference between this and CDNs.

Except the part where you can track people across sites.

What I am saying is not speculative. This has been proposed previously and shot down. There is a reason why it hasn't happened.


Where would you be tracking people across sites?


If you trawl through the IETF and, more recently, WHATWG mailing lists, you will find that every time caching comes up (whether around Last-Modified or when ETag was being ratified) a proposal for cross-origin caching also comes up, and is rejected.

The browser vendors just spent the past 4-5 years locking down cross-origin access in the DOM because of all the security and privacy implications that come up. Corporate profiles and ISO standards don't even accept running the code, let alone caching it (I've worked on plenty of corporate projects where you aren't even allowed to use Google-hosted JS; it just won't run due to AD policies).

To give you but one example of what arises with this new vector: say I were to go to the top 50 banking sites and take a sample of a JavaScript file that is unique to each site. I would then take the hash of each file and, in a single web page, include a script element with the hash of each of those files. When a visitor hits that page and the browser attempts to load each script element, if one loads in 4ms then I know the file was in the cache and that the visitor is a customer of that bank.
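Concretely, the probe page could be as simple as this (a sketch under the hypothetical hash-addressed scheme; "content-hash" is a made-up attribute, and the 5ms threshold just mirrors the 4ms figure above):

    // time how long a hash-addressed script takes to "load"
    function probe(bankName, scriptUrl, hash) {
      var el = document.createElement('script');
      var start = performance.now();
      el.onload = el.onerror = function () {
        var ms = performance.now() - start;
        // a ~4ms load never touched the network: the file was in
        // the cache, so the visitor has been to that bank's site
        if (ms < 5) console.log(bankName + ': visitor is a customer');
      };
      el.src = scriptUrl;                     // any URL; the hash is the key
      el.setAttribute('content-hash', hash);  // hash of the bank's unique JS
      document.head.appendChild(el);
    }

Note that no exploit against the bank itself is needed; the leak comes entirely from the shared cache's timing.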


Why would the banks enable hashing on their unique files? That would be their security flaw, not a flaw in the design of the system.


Well, that is just one example, and as I mentioned, cross-origin requests and access have been further locked down recently, not opened up. With all the different versions and variants of libraries, you are implementing and exposing a lot to save very little; you wouldn't even save 1% of web requests. As mentioned, this isn't a new idea. I'm just telling you the reasons why it hasn't happened and won't happen.



