Hacker News
Ask HN: Which tools do you use to optimize your webapps?
41 points by nixy on Aug 6, 2009 | hide | past | favorite | 16 comments
I always try to keep the size of the images I publish on the web to a minimum. I had been using the smush.it service a lot, but since Y! took over the service it hasn't been working too well -- for example, I can't smush locally stored images any more.

Also, minifying and merging JS and CSS files is a good trick for keeping load times down.

Now I'm interested to know -- which webapps, tools or tricks do you use to optimize your web content?




1) The YSlow presentations.

2) YSlow itself.

3) Cheating creatively. For example, the front page of Bingo Card Creator has one middle-weight screenshot on it, which links to a hefty full-sized version in a lightbox, accompanied by some text exhorting users to download or sign up now. I want opening that lightbox to feel faster than instant, but the screenshot is huge. So I cheat: by the simple expedient of placing <img src="/blah.jpg" style="display: none;" /> somewhere on the front page, everybody who hits it starts loading the image more or less immediately. Then, when they trigger the lightbox (some seconds later), the image displayed in it (<img src="/blah.jpg" />) doesn't come from my server at all -- it comes straight from their browser cache and renders almost instantly.
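The hidden-preload trick above can be sketched as a small helper. The `doc` parameter (normally `document`) is injected so the helper can be exercised outside a browser; the name is made up:

```javascript
// Preload a heavy image by appending a hidden <img> to the page, so the
// browser fetches and caches it well before the lightbox ever opens.
function preloadImage(src, doc) {
  const img = doc.createElement("img");
  img.style.display = "none"; // invisible, but the browser still downloads it
  img.src = src;
  doc.body.appendChild(img);
  return img;
}
// In a browser: preloadImage("/blah.jpg", document);
```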

Related trick: I load the critical assets for my purchasing page on every page on the site. Thus, someone accessing the purchasing page (which I have clear reason$ to want to be fast) will almost always see it stupidly fast, despite it having a fair bit of Javascript and whatnot to render the cart.

I also abuse the browser cache for "live" previews in my web app. Switching an image's source to one that hasn't been downloaded yet is a bad idea, because it typically causes both a delay and unsightly flicker. So instead of switching the visible image's source directly, you switch the source of a hidden image, then use a JavaScript onload callback to point the visible image at the new source once the hidden image has finished loading.
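The flicker-free swap can be sketched as a small helper. It only touches `src` and `onload`, so plain objects can stand in for real images (the function name is hypothetical):

```javascript
// Load the new source into an off-screen image first, and only point the
// visible image at it once the download has finished.
function swapWhenLoaded(visibleImg, hiddenImg, newSrc) {
  hiddenImg.onload = function () {
    // By now the bytes are in the browser cache, so this swap is instant.
    visibleImg.src = newSrc;
  };
  hiddenImg.src = newSrc; // kicks off the download
}
```

In a browser you'd pass a real visible `<img>` and a hidden one (e.g. created with `new Image()`).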


For YSlow users, a buddy of mine (actually, he's kind of a jerk) is working on an interesting project that lets you run automated YSlow tests (using Django, Firefox, and assorted other wizardry):

https://wiki.mozilla.org/Webdev:Cesium

/shameless plug :)



- Host your CSS, JS and images on a CDN -- serve them from a different domain. Otherwise every asset request sends your cookies in the headers: wasted overhead for end users.

- Combine as many JS and CSS files as possible. You'll also want to minify and compress them.

- Use memcached to cache backend results, so pages load faster and you hit the database less.

- If you aren't using a CDN, at least use a reverse proxy so you don't create unnecessary load on your webserver.

- Get a profiler for your app and see which areas are slower than others. Find a database profiler and fix the slow queries. If you're using MySQL, use the slow query log to see where the holdups are.

- Make sure you have the proper indexes on your database. Overdoing it can waste memory.

- Use gzip if possible. Most browsers support it and it gives a noticeable speed boost.
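The memcached bullet above is the classic cache-aside pattern: check the cache first, fall back to the expensive lookup on a miss, and store the result for next time. A minimal sketch, with an in-memory Map standing in for a real memcached client (names hypothetical):

```javascript
const cache = new Map(); // stand-in for memcached

function cachedFetch(key, computeFn) {
  if (cache.has(key)) return cache.get(key); // cache hit: skip the database
  const value = computeFn();                 // cache miss: do the slow work
  cache.set(key, value);
  return value;
}
```

With real memcached you'd also set a TTL on each entry and invalidate keys when the underlying data changes.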


*- Host your CSS, JS and images on a CDN -- serve them from a different domain. Otherwise every asset request sends your cookies in the headers: wasted overhead for end users.*

One thing you have to worry about here is serving these files over HTTPS. A lot of CDNs either (a) don't support HTTPS or (b) charge a monster fee for it.


Another good one -- which may or may not be worth the effort -- is splitting media across domain names: browsers limit the number of open connections per hostname (currently between 2 and 8, I believe).
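Domain sharding can be as simple as hashing each asset path to a hostname, so a given file always maps to the same host (keeping caches warm) while requests spread across hosts. A sketch with hypothetical hostnames:

```javascript
// Deterministically map an asset path to one of several media hostnames.
function shardedUrl(path, hosts) {
  // Simple stable hash: running sum of character codes, mod host count.
  let hash = 0;
  for (const ch of path) hash = (hash + ch.charCodeAt(0)) % hosts.length;
  return "http://" + hosts[hash] + path;
}

// e.g. shardedUrl("/img/logo.png", ["media1.example.com", "media2.example.com"])
// always returns the same URL for a given path.
```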


A little trick from the FLOSS Podcast's session with John Resig: instead of serving jQuery from your own domain, use the Google-hosted copy: http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.j... -- you benefit from Google's CDN, and there's a fair chance it's already in the user's browser cache.
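In practice that's a single script tag; jQuery 1.3.2 is used here as an example version (Google hosts the other versions under the same URL pattern):

```html
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
```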


Another way to help Google track which websites you visit -- and to force your users into the same. CDNs are great, but please consider the privacy implications of the ones provided "for free" by the big players.


Since the YSlow recommendations and practices are almost universally used, wouldn't it be possible to automate the process? Could someone write a servlet filter that incorporated the optimization rules and automatically applied them to outgoing content on the fly? And you could make this an Apache module so that it could sit at the web server layer instead of at the container layer.

This would also address issues that arise when you're using a product or framework that assembles the page dynamically, and you're not in control of how the css links are constructed.


Yes -- this is exactly the kind of thing I'm looking for! A nice batch process that takes care of everything when I publish my static content. It shouldn't be too hard to build something like that; my problem is finding the right tools (pngcrush, a JS minifier, a JPEG EXIF stripper, etc.) and the proper settings for each and every one of them. It's a jungle.


I've found YSlow very useful in identifying problems.

My lazy, three-step optimization process:

1. turn on gzip encoding in apache

2. verify that caching headers (Etag/last-modified) are being set

3. tail -f mysql-slow.log
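Step 1 above can be a one-liner in the Apache config, assuming mod_deflate is loaded (the MIME type list is illustrative -- yours may differ):

```apache
# Compress text responses on the fly with mod_deflate (Apache 2.x)
AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript
```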


* in Photoshop - "save for web" and let it figure out what it thinks is optimal

* YSlow for Firebug to see if anything is really loading slowly / being gross

* Like you mentioned, YUI Compressor to minify/compress CSS and JS

* We use Symfony so the Symfony dev toolbar to profile slow SQL queries and generally see what is causing pages to render slowly.

* Lately, Firebug's NET panel to monitor what requests are taking the most time.


I do the following:

1. Use a Python profiler and look for problems, which usually means rewriting db calls to trade storage space for CPU time, and/or front-loading tasks so they're heavier on write than read

2. Write memcache layers wherever possible

3. Run OptiPNG, or cut the backgrounds out of GIFs

4. Clean up redundant css / js

5. Minify

6. I set really long cache-expiry headers, but put a version number in my static files' query strings that I can increment whenever a file changes
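Point 6 above is easy to sketch; the helper name and version value are made up:

```javascript
// Cache-busting sketch: serve statics with far-future cache headers, and
// bump a version number in the query string whenever a file changes so
// every browser re-fetches it; unchanged files stay cached indefinitely.
function versionedUrl(path, version) {
  return path + "?v=" + version;
}
// e.g. versionedUrl("/js/app.js", 17) -> "/js/app.js?v=17"
```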


Google Page Speed is also worth a look. It will point out problem areas and recommend best practices to speed them up.


For images, I haven't found a better solution than firing up Photoshop, running a batch process on all the images, and using Save for Web. The quality looks the same, and the images are WAY smaller.

I've used pngcrush before, but only for PNGs, and it didn't come close to what Photoshop was giving me.

Photoshop does cost ~$700, but if you already have it, definitely give it a whirl.


WebPageTest.org is pretty good - almost YSlow/Firebug-like, but hosted.



