Hacker News
Cdnjs - the missing cdn (cdnjs.com)
88 points by thomasdavis on July 31, 2011 | 39 comments



This seems to once again miss the point. The main point of a CDN is to speed up content delivery. Having all of these different CDNs only increases load time, because you are adding more DNS queries for the client, and on top of that they are adding HTTPS support, which increases load time even further because of the handshake that goes on. That being said, if this CDN is the only one you are using, then great, that works out fine. But a CDN dedicated only to JavaScript just seems kind of silly, other than for passing around example HTML files showing how to code something.

EDIT: Another pitfall is that you can't concatenate all your JS into one big file, which would also save time on requests.


The other point of a (public) CDN is that many websites load the same resources (JS/CSS files) from the same CDN URLs, which ultimately benefits users: the browser can reuse the cached version of a resource across all the different sites that use the public CDN.


But having fewer open HTTP connections is very likely going to result in better overall performance no matter how good this JavaScript CDN is.


Yes, but having no HTTP connection is better than having fewer HTTP connections. If the resource is already in the cache (and its Expires header hasn't passed), the browser won't open another HTTP connection at all.
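For illustration (these header values are my own sketch, not anything cdnjs is confirmed to send), a versioned library URL can be served with a far-future cache lifetime, so on a repeat view the browser makes no request at all:

```
HTTP/1.1 200 OK
Content-Type: application/javascript
Cache-Control: public, max-age=31536000
```

Because the URL contains the library version, the file never changes, so a one-year `max-age` is safe.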


Is this speculation or proven? It's just as easy to surmise that moving the assets closer to the client will overcome any slowdown from an extra connection.


It's speculation to make this claim. The burden of proof should be on them. I'd need to see a benchmark showing that making a new HTTP connection (DNS resolution and all) for a single <100KB file beats sending it over your already-open HTTP keep-alive connection.
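A back-of-envelope version of the tradeoff being debated here (every number below is an assumption for illustration, not a measurement):

```python
# Cost of fetching one small JS library two ways (illustrative numbers only).
dns_lookup_ms = 50        # cold DNS lookup for the CDN hostname (assumed)
tcp_handshake_ms = 40     # opening a new connection to the CDN edge (assumed)
cdn_transfer_ms = 60      # ~30 KB library from a nearby edge server (assumed)
origin_transfer_ms = 180  # same file over the existing keep-alive connection
                          # to a more distant origin server (assumed)

cdn_total = dns_lookup_ms + tcp_handshake_ms + cdn_transfer_ms
origin_total = origin_transfer_ms
print(cdn_total, origin_total)  # → 150 180
```

With these made-up numbers the CDN wins, but flip the DNS lookup to 150 ms (a slow resolver) and it loses, which is exactly why a benchmark rather than intuition is needed here.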


Asynchronous script loaders like LAB.js (http://labjs.com/) also help alleviate the latency cost of having multiple resource requests by loading them asynchronously.


It all depends on the size of the file and how fast the user's DNS server is.


Your objections are misguided. I can't say they are outright wrong, but...

A (good) CDN speeds up content by having distribution points close to the client. An added benefit of a shared CDN (like this) is that resources can be shared by multiple pages, which will mean the cache will be shared by multiple pages. DNS lookup can be a factor in page load time, but parallelizable HTTP requests (especially to a location that is close by) can usually make up for it.

Google's recommendation is to use between 1 and 5 hosts per page: http://code.google.com/speed/page-speed/docs/rtt.html#Minimi.... For the type of clients this CDN is aimed at (people who don't have their own CDN) it is unlikely they have more than one or two hosts already used, so this CDN is likely to be useful for them.

HTTPS support is required for domains served over HTTPS. If you aren't using HTTPS then don't use it.


I'm confused. In most cases, the libraries will arrive faster from the CDN than each user's web server. CDNjs' reason for existence is specifically that these libraries are not available in other CDNs. HTTPS must be supported on HTTPS pages to avoid security warnings. Why would you use this to pass around how-to HTML files?


It might arrive faster from a CDN, but the time it takes the user's computer to resolve the CDN's hostname may outweigh the benefit if the user has a slow DNS server, which is a fairly common problem nowadays. If the site used a single CDN instead of all these miscellaneous CDNs, you'd have only one DNS lookup and it would be worth it. But a CDN dedicated to a single file type kind of defeats the purpose, unless your website has no images, CSS, or anything else.


I wasn't aware that a single DNS lookup was that expensive. Aren't they frequently cached? And with more people using cdnjs, there's a better chance the DNS entry as well as the assets might already be cached, leading to an even greater performance gain.

When you say "all these misc CDNs" you're talking about just one, CDNjs, right?

I suppose the CDN could host other assets if it made sense. But I can't think of any other assets with the type of commonality that JavaScript libs have.


"... the user has a slow DNS, which is a fairly common problem nowadays"

I would find this somewhat surprising given the increased use of hosted DNS providers for both authoritative and recursive. Can you point to any data that corroborates this (not saying it isn't true, just looking for hard data)?


The real benefit of using one repository across multiple sites is that it increases the likelihood the library is available from the browser's cache. DNS lookup and retrieval will need to happen the first time a visitor goes to a site with a CDNJS library. If that same visitor visits another site that uses the same library then the library can be fetched from the browser's cache without any DNS or download latency. In other words, the more sites that use CDNJS, the bigger the benefit to the whole CDNJS community.
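The "bigger the community, bigger the benefit" claim can be sketched with a toy model (the formula and adoption figures below are my own assumptions, not data about cdnjs):

```python
# Toy model: if a fraction p of sites serve a library from the same shared CDN
# URL, the chance a visitor already has it cached after visiting n other sites
# is 1 - (1 - p) ** n.
def cache_hit_chance(p, n):
    return 1 - (1 - p) ** n

print(round(cache_hit_chance(0.05, 10), 2))  # low adoption  → 0.4
print(round(cache_hit_chance(0.30, 10), 2))  # wide adoption → 0.97
```

The model ignores cache eviction and differing library versions, but it shows why adoption compounds: each new site using the shared URL raises the hit rate for every other site.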


Could one solve this problem with URLs of this form?

http://199.27.135.101/ajax/libs/prototype/1.7.0.0/prototype....

CloudFlare doesn't seem to support direct access based on IP, but there's no reason it couldn't, right? Do browsers support this?


I don't think it's possible. A CDN, by its very nature, delivers content with low latency to users in different geographical locations by spreading its servers across those locations. Users in different parts of the world may see the same CloudFlare URL resolve to different IP addresses based on where they live, so that they get their content from the nearest CloudFlare CDN server.



CloudFlare uses Anycast. However, we shift sites around between IPs frequently in order to, for example, isolate a site under DDoS from the rest of our network.

While it may be possible to create an IP-based CDN, my guess is the tradeoffs in terms of not being able to thoroughly defend yourself against attacks would not be worth the few milliseconds saved on the initial DNS lookup.


There is a balance between the DNS cost and the benefit of loading resources from a different domain: current browsers use 4-6 connections per hostname, so if all of them are busy you have to wait until one frees up. With multiple domains you can load more resources in parallel.
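A crude "waterfall" model of that per-hostname limit (the model and resource counts are assumptions for illustration, not browser measurements):

```python
import math

# Assume a browser opens up to conns_per_host parallel connections per
# hostname, so resources load in sequential "rounds" of hosts * conns_per_host.
def load_rounds(num_resources, hosts, conns_per_host=6):
    return math.ceil(num_resources / (hosts * conns_per_host))

print(load_rounds(24, hosts=1))  # single domain   → 4 rounds
print(load_rounds(24, hosts=2))  # CDN adds a host → 2 rounds
```

This ignores the extra DNS lookup and handshake for the second host, which is exactly the cost the thread is weighing against the parallelism gain.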


True, and Javascript files are the smaller assets of a typical website - the bulk of it is usually HTML code and images. There might be some small speed-up because using a CDN doesn't count against the limited number of connections a browser makes to the page-serving domain, but I also suspect this performance advantage may be offset by DNS lookups and connection overhead. Another aspect to consider is the way a website can quickly double its potential points of failure from one to two by relying on some external CDN service.


> True, and Javascript files are the smaller assets of a typical website - the bulk of it is usually HTML code and images.

As the number of SPA-type "sites" grows, the ratio tilts towards JS and dynamic communications (XML, JSON, etc...). Though overall I don't think it matters much.

A more interesting (I think) potential effect, should a given CDN capture a significant share of the overall serving market, is the increased likelihood of a cache hit for the relevant library.


The HTTPS support is required so that you don't get the mixed content warnings from some browsers.

It's a must have.


so that you don't get the mixed content warnings

And so that, y'know, you don't compromise the entire page, making the use of HTTPS completely worthless.


Pro tip: leave off the protocol to support both http and https. Example:

<script src="//cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>

Source: http://html5boilerplate.com/


Secondary protip: use a DNS prefetch for even faster 3rd party asset delivery.

<link rel="dns-prefetch" href="//cdnjs.cloudflare.com">

More at: http://html5boilerplate.com/docs/#DNS-Prefetching


If you're loading the JavaScript on the same page, surely you're going to see no benefit at all, right?


The browser can parallelize the DNS lookup, whereas running it serially and delaying the RTT for the asset will be much slower. So there is significant benefit to be had regardless. More: http://www.chromium.org/developers/design-documents/dns-pref...


Neat trick! Thanks!


Protocol-relative URLs are awesome, but be aware: this is only good for scripts and images. Stylesheets on such links are downloaded twice in Internet Explorer (http://www.stevesouders.com/blog/2010/02/10/5a-missing-schem...)


I might be alone here, but I would never use a CDN that isn't backed by a company with a strong vested interest in preventing security breaches of its servers.

Consider the case where the community CDN is compromised: if a file gets replaced with a different JS file, you've now provided an attacker an XSS hole into _every_ page using the CDN.

I have a reasonable trust in Google to secure their own servers against such a compromising attack, but have no similar reason to put faith in smaller companies/services.


Note that <script src>'ing from some random website means that site can XSS you any time it wants to. (I'm not saying the owners intend for this; a third party could just as well hack their site and do the same.)


You should host this on a separate domain, because your clients are setting cookies on your domain, and when I hit one of your hosted JS files I see all of those cookies.

For example, I can see TechCrunch's cookies and a metric fuckton of Google Analytics cookies.

Separate this from the CloudFront clients, or you have a potential security problem and a definite privacy breach.


Kudos on the recently added https support. And on CloudFlare now (vs Amazon)...interesting.


HTTPS was there from day 1. They were just using CloudFront, so you had to use the CloudFront URL instead of the pretty one (which doesn't matter, because visitors would never see it).


Really like the idea of this community-driven CDN.


Are there any statistics regarding the reliability and speed from different locations?


It's hosted on CloudFlare... just Google around for CloudFlare statistics.



Cool. Looks like times are gradually improving.



