[dupe] DynoSRC - Eliminate HTTP requests for JavaScript files (dinosrc.it)
92 points by selamis on Nov 28, 2013 | 38 comments



Ilya Grigorik talks about how the Gmail devs keep all their JS modular, so that when a change really needs to be made on the live web app, only one small file needs to be affected. Another huge JS file does not need to be sent down the wire. If you modify a massive file, you inadvertently DDoS your own server.

He talks about it here → http://www.youtube.com/watch?v=E9FxNzv1Tr8#t=600


My first instinct is "how bad is the motivating problem, really?". How often do you have a single massive JS file where a small number of lines change?

I'm not saying it's a bad idea, but starting with a case study would be nice rather than jumping in with "we've completely changed everything about how script files are delivered, it's amazing and we have cool graphics to prove it". The dinosaur looks a bit hand-wavy.

There are a lot more moving parts, a lot more surface area for things to go wrong, etc. Yes, downloading huge JS files is a problem, but you're going to have to download them one way or another.


I imagine it's probably fairly common. Lots of sites will minify and concatenate their JS, after all.


> Lots of sites will minify and concatenate their JS, after all.

Most with absolutely zero data to back up that it provides any sort of useful improvements to justify the difficulties it introduces in debugging.

Considering how almost all web servers and browsers support HTTP compression, minification is mostly a cargo-cult ritual.


FWIW, I'm making a site at the moment and very similar thoughts entered my head, so I tested it out. Minification + gzip (at best compression level) beat out just gzip by ~ 20%. I don't actually have that much JS so it's not really that important, but just thought it was worth a mention.


As much as I would like your statement to be true:

    jquery-2.0.3.js       236K
    jquery-2.0.3.js.gz     70K
    jquery-2.0.3.min.js    82K
    jquery-2.0.3.min.js.gz 29K
Minification pays off a lot, even with compression.


To be fair, this doesn't show a performance improvement, it shows a file size reduction. Of course you can assume one leads to the other, but it doesn't fully prove the point.


    time curl http://code.jquery.com/jquery-2.0.3.js > /dev/null
    real	0m0.433s

    time curl http://code.jquery.com/jquery-2.0.3.min.js > /dev/null
    real	0m0.210s
Is it proven now?

Edit: Of course I didn't run these requests only once.


The obvious next question is "can't you achieve the same thing with gzip":

    time curl http://code.jquery.com/jquery-2.0.3.js > /dev/null
    236k bytes
    real	0m0.432s

    time curl http://code.jquery.com/jquery-2.0.3.min.js > /dev/null
    83612 bytes
    real	0m0.323s

    time curl http://code.jquery.com/jquery-2.0.3.js --compressed > /dev/null
    87509 bytes
    real	0m0.273s

    time curl http://code.jquery.com/jquery-2.0.3.min.js --compressed > /dev/null
    34066 bytes
    real	0m0.215s
I think that people should be more skeptical of the complexity introduced by things like minification, but the advantages are pretty clear, even with compression.


This still doesn't show that actual site performance is improved. Also, gzip is not enabled on that server, so I think it's a less realistic scenario. I'm certainly being pedantic, but I've run into similar surprises before.


> Most with absolutely zero data to back up that it provides any sort of useful improvements

While it depends on a lot of factors, generally speaking minification is a good thing especially for mobile. There's really no downside in doing it (see next point).

> justify the difficulties it introduces in debugging

What difficulties? You should use a build process that minifies for your production site but not for testing. If you need to debug the live, minified files you can use source maps[1].

1: http://www.html5rocks.com/en/tutorials/developertools/source...


I never gave much thought to whether or not to minify JS, thanks for that insight. I'd definitely still concatenate files though; I have one project that has something like 150 JS files and another 100 HTML templates (single-page web app), and there's a very tangible performance difference concatenated vs. unconcatenated.


This isn't a problem if you generate source maps. Of course, if you have to support and debug old browsers, it's a different story.

A quick Google search got me to SO (http://stackoverflow.com/a/807161); the results seem pretty significant. Besides, the worse your connection is, the more you will appreciate these techniques.


Sprockets comes as standard with Rails. In development the files are served separately, for easy debugging, but they're minified/concatenated automatically for production.


Wouldn't that screw up the diff as well?


You don't commit the packaged and minified code to your repo. Instead, you leave it in whatever number of source files you want to use (which are what you keep in the repo).

The packaging and minification can be handled either as a filter at the server level or during the build and deploy if your stack supports that metaphor. These have been the common practices in my experience.


Not an expert on this, but could something like rsync be implemented to treat the file as binary (i.e. minified or non-minified wouldn't matter)?


If the minification involves symbol substitution, the substitutions are going to depend on the content of the whole file.

So unless you re-implement the minification algorithm to be optimised for minimising diffs, you might find that every variable's name has changed, and the size of the file diff might start to approach (or exceed) the size of the file itself.
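To see why, consider a toy minifier that renames identifiers in order of first appearance (real minifiers like UglifyJS or Closure Compiler are far smarter, but rename in a similarly content-dependent way). Introducing one variable near the top shifts every later name, so a diff of the minified outputs touches nearly every token:

```javascript
// Toy "minifier": renames identifiers to a, b, c, ... in order of first
// appearance. Illustrative only -- a crude regex scan, not a real parser.
function toyMinify(src) {
  const names = [...new Set(src.match(/\b[a-z][A-Za-z]+\b/g))];
  let out = src;
  names.forEach((name, i) => {
    out = out.replace(new RegExp(`\\b${name}\\b`, "g"), String.fromCharCode(97 + i));
  });
  return out;
}

const v1 = "total = price + tax; shipped = total + fee;";
const v2 = "discount = 5; total = price + tax - discount; shipped = total + fee;";

console.log(toyMinify(v1)); // a = b + c; d = a + e;
console.log(toyMinify(v2)); // a = 5; b = c + d - a; e = b + f;
```

One added variable (`discount`) renames every identifier after it, so even a token-level diff between the two minified versions is almost as big as the file.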


Agreed this is a problem. Although perhaps it's worth not minifying your JS to gain the speedup from the use of diffs?


rsync requires a lot of communication back and forth which is precisely what you're trying to avoid. You want to fire off a single diff in a single roundtrip.


How well will "Differential Updates" work for minified files?


Good question, definitely needed to know before using.


I see the repo the GitHub docs point to is a fork of https://github.com/DynoSRC/dynosrc; which one is actually the canonical repo?

BTW, how do differential updates fare under minification? Don't the diffs usually end up replacing entire functions?


> BTW, how do differential updates fare under minification

Don't minify. Minifying is a kludge. This (and gzip) actually looks like a properly engineered solution.


Isn't that what the browser cache is for?


Maybe they use localStorage for incremental updates.
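If so, the client side might look roughly like the sketch below. This is the general idea only, not DynoSRC's actual API or patch format (both are invented here), and a plain object stands in for window.localStorage so it also runs under Node:

```javascript
// Sketch of a localStorage-based incremental-update client.
// `store` stands in for window.localStorage (same getItem/setItem shape).
const store = {
  data: {},
  getItem(k) { return this.data[k] ?? null; },
  setItem(k, v) { this.data[k] = v; },
};

// Apply a trivial line-based patch: each entry replaces one whole line.
// (A real implementation would use a proper binary/text diff format.)
function applyPatch(source, patch) {
  const lines = source.split("\n");
  for (const { line, text } of patch) lines[line] = text;
  return lines.join("\n");
}

// First visit: the server inlines the full script; the client caches it.
store.setItem("app.js@v1", "var a = 1;\nvar b = 2;");

// Later visit: the server sees (e.g. via a cookie) that v1 is cached
// and inlines only a patch from v1 to v2.
const patch = [{ line: 1, text: "var b = 3;" }];
const v2 = applyPatch(store.getItem("app.js@v1"), patch);
store.setItem("app.js@v2", v2);
console.log(v2); // var a = 1;
                 // var b = 3;
```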


This seems like it would really complicate use of a CDN, given that page content is now different based on what users have cached. You could look at cookie values and still get some caching, but this would hurt your cache hit rate. It also bars you from using a lot of the cheaper CDNs, which won't let you inspect the request deeply enough to efficiently cache based on the cookie.

So if you can't use a CDN, or your cache hit rates are much lower, you might end up with slower page loads with this technique. But of course I have no data to back this up.


What does this mean for local storage requirements if every site does this?


It'd be nice if it did something about images too.


And CSS, of course. Cram the entire page in a single HTTP request.


This really looks like the wrong layer to do that at... It could be interesting to extend HTTP/ETag to do something similar globally. Instead of "tell me nothing changed, or send a new version", the browser could request "tell me nothing changed, or send a patch from version XXX".
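HTTP actually took a stab at this once: RFC 3229 ("Delta encoding in HTTP") defines `A-IM`/`IM` headers and a 226 status code, though it never saw wide deployment. The server-side decision could be sketched as a pure function (the field names and return shape below are invented for illustration, not RFC 3229's actual wire format):

```javascript
// Sketch of a patch-aware conditional GET decision. `patches` maps
// "oldVersion->newVersion" keys to precomputed diffs (hypothetical format).
function chooseResponse(currentVersion, clientVersion, patches) {
  if (clientVersion === currentVersion) {
    return { status: 304 }; // nothing changed, send nothing
  }
  const key = `${clientVersion}->${currentVersion}`;
  if (clientVersion && patches[key]) {
    return { status: 200, body: patches[key], delta: true }; // send a patch
  }
  return { status: 200, body: "/* full file */", delta: false }; // fall back
}

const patches = { "v1->v2": "/* diff v1..v2 */" };
console.log(chooseResponse("v2", "v2", patches)); // { status: 304 }
console.log(chooseResponse("v2", "v1", patches)); // patch response
console.log(chooseResponse("v2", null, patches)); // full-file response
```

The nice property is that the fallback is always the full file, so a client or intermediary that doesn't understand patches degrades to ordinary caching.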


I agree the "send me differences" model is appealing. A version based on rsync was created back in 1999, but patents killed it. I think the patent in question runs out pretty soon, though...

The Tragedy of the RProxy: http://ozlabs.org/~rusty/rproxy.html


Are there any benchmarks? I have a hard time seeing this make any significant improvement over minified, concatenated JavaScript files. And are you really updating your JavaScript files often enough that it's a problem?


Sorry, I'm quite confused. When I reload the page, the request for the JS file still seems to exist. Shouldn't the browser cache the JavaScript file normally the second time the page is refreshed?


Yes, browsers do cache it.

But this is for the case where you have a bigger JS file (e.g. for a web app) and your customer visits again, so you only have to transfer a small piece of data.

But I'm not sure the performance/traffic advantage is high enough that DynoSRC justifies the effort. And it's one more possible source of errors, so I don't think many developers will use it.

The idea is good, but the solution should work automatically in the background at the transport layer, between browser and server, not at the app-developer level.

@Jason Anderson: Ask the Google Chrome developers to implement it in their browser, and write an Apache plugin! :-)



> Eliminate HTTP requests for JavaScript files

Great! We already do this manually for mobile sites so I will definitely try it out.

> and serve differential updates

I didn't understand this part.


Differential updates mean that only the lines that changed in your JavaScript have to be downloaded, instead of entire files.



