Page Speed Service (googlecode.blogspot.com)
192 points by handraiser on July 28, 2011 | 61 comments



The settings docs page [1] shows what this service does:

  Combine CSS
  Combine JavaScript
  Image Optimize
  Image Resize
  JavaScript Optimize
  Move CSS to head
  Proxy CSS
  Proxy Images
  Proxy JavaScript
You can see demos of the Page Speed Service doing each of these things at [2].
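
For a concrete sense of what "Combine CSS" means, here's a rough before/after sketch. The filenames, and the exact rewritten URL format, are made up for illustration (mod_pagespeed uses a similar naming scheme, but I haven't verified what the hosted service emits):

  <!-- before: three separate stylesheet requests -->
  <link rel="stylesheet" href="/css/reset.css">
  <link rel="stylesheet" href="/css/layout.css">
  <link rel="stylesheet" href="/css/theme.css">

  <!-- after: one combined stylesheet, rewritten to a proxy-served URL -->
  <link rel="stylesheet" href="/css/reset.css+layout.css+theme.css.pagespeed.cc.HASH.css">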

The dashboard docs page [3] shows that there's some analytics but not a great deal.

[1] https://code.google.com/speed/pss/docs/settings.html

[2] http://www.pssdemos.com/

[3] https://code.google.com/speed/pss/docs/dashboard.html


Seems like strong competition to Cloudflare (https://www.cloudflare.com/). I always thought that was their greatest risk: someone with a huge, worldwide, already-paid-for CDN stepping in and offering the same service.

They still have the statistics and security features going for them, though...


Cloudflare does a LOT more than this. In fact, the area where they cross over is the area that interests me the least: I am quite capable of making a website balls-to-the-wall fast without either of their products, but Cloudflare leverage their size to accomplish things I literally could not accomplish without them. I initially wrote Cloudflare off because of this; they don't do a great job of telling you about the coolest things they do.


Yeah, I agree, there seems to be some overlap with CloudFlare. I wanted to sign up a client of mine who is running a WordPress site to CloudFlare, but wasn't able to do so because it required moving their complete zone file over to CloudFlare, which they weren't able to do for multiple reasons. CloudFlare don't seem to provide a solution where you can just point an A record to their servers (currently).

I wonder if this Google Page Speed service can provide a similar service to CF without needing to move the zone file across completely. Mind you, CloudFlare's basic service is free.


We will have CNAME pointing available in the very near future.


Cloudflare does have the ability to let you define CNAMEs for your site and manage those in your own DNS outside of Cloudflare. I believe it is in limited beta right now, but we are testing it and it works great.

Contact them for more details.


We actually have the backend all built for this. It is just a matter of doing a little more testing before releasing it into the wild. We've used a handful of beta testers to help us identify potential problems.


I'm a little hazy on my DNS knowledge. I understand up to zone files, DNS and A records, but haven't quite understood how I would achieve this with a CNAME. I've tried many times to understand what CNAMEs are but never got my head around it.

Would you mind elaborating a little? I did contact them about a month ago and explained the problem, and they just told me it wasn't possible at this moment in time(?)


A CNAME points a name to another name. In the case of Cloudflare, Google, or others, they need to know what CNAME you are using (say, www.example.com CNAME some.proxyservice.com) so they can map the request to your content.
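
To make that concrete, here's what the two setups look like as zone-file records (the names and the IP are placeholders):

  ; A record: the name resolves directly to your own server's IP
  www.example.com.  IN  A      203.0.113.10

  ; CNAME record: the name is just an alias for the proxy's name;
  ; the proxy then fetches your content and serves it from its edge
  www.example.com.  IN  CNAME  some.proxyservice.com.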


Care to elaborate? I've been wondering what the advantages are but haven't had the time to look into it in more depth. What's been your experience?


Well, I can do all the speed improvement things myself, so I'm interested in some of the weirder things. They can detect bots quite reliably because with a network that large it's pretty damn obvious when something is crawling a site - and they can then strip out or mangle email addresses from your HTML before reaching those bots, so you can have email addresses in plaintext without having them harvested. That's a pretty imaginative solution!

Another thing they can do with their size is automatically identify and react to new attacks, so when a DDoS occurs it very quickly routes the traffic away and updates its ruleset to always be protected against that attack from then on, for all sites behind them.

I don't actually have any experience though as I've never used the service, this is all from their website and from talking to their staff on HN :)


CloudFlare does proxying and caching for static and dynamic pages too, and adds some security stuff. Not sure they minify and compress static resources, though. So the two services look quite different: CloudFlare will speed up a dynamic page that loads slowly; Google's service won't.


Cloudflare user here...

Cloudflare does have JS and CSS minify features, as well as some asynchronous JS support.


CloudFlare does have a feature to minify some resources, called Auto Minify. http://blog.cloudflare.com/an-all-new-and-improved-autominif...


Does this mean that by going with Google, you'll also mitigate the risk of DDoS attacks? Because I know that was one of CloudFlare's advantages, too.


Also http://www.strangeloopnetworks.com (who have some great resources on web performance optimization: http://www.strangeloopnetworks.com/resources/ )


Google have strong statistics/reporting products in other areas, and I can't see it being long until they add this into the mix for Page Speed (if they haven't already).


We actually talk about the announcement today on our blog here: http://bit.ly/nLkE9Y

We also do far more than just act as a CDN.


It's pretty interesting what direction Google is pushing in.

Since all the requests are handled by Google (you have to change your DNS servers to point to Google), this will give them complete website profiles: what surfers visit, and how they can rank websites better. They could even adapt their crawling based on that: no need to crawl websites that a human has never visited.

I would also consider that they might evaluate the logs from their open DNS servers to rank websites and decide what's important and what's not.

It's like having a cookie in AdSense or Analytics, just with nobody noticing it.


I think I just don't get it... The optimized version is actually slower than the normal one... http://www.webpagetest.org/result/110728_KN_1bd8e23bea97037c...

I must be missing something obvious...


The same with our website (http://www.webpagetest.org/result/110728_2Y_cf94015a7345bbed...): ours is 8% faster than Google's version. Yay!

To be fair, we have done a lot of optimizations on the site, but it still feels like I may be missing something obvious.


Getting similar results. I don't really get it; the optimized version takes a few seconds before anything other than my background is displayed. By the time the rest of the 'optimized' site is being rendered, my original site has already loaded.

(This is from watching the video in my test results)


I don't think we're the target audience.

I suspect the biggest gains will be sites based on proprietary products which may be in quite a state.


> Page Speed Service fetches content from your servers, rewrites your pages by applying web performance best practices, and serves them to end users via Google's servers across the globe

I don't get it. Given that most websites have some dynamic content, and that static assets can be updated, edited or otherwise altered at any time, Google must be fetching content on a request-by-request basis, i.e. acting as a proxy. How can this be faster? You've got the existing latency of fetching your content, PLUS the extra latency of Google zipping and concatenating assets and forwarding the result on to the client.

Doesn't make any sense.


Didn't make sense to me either. Now I realize it optimizes static content (CSS, JavaScript and images) on the first run, then serves the optimized versions until you update them. There is some latency in fetching dynamic content, and that's the reason optimized sites are slower in this test.
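
In other words (illustrative headers, not anything from Google's docs): a cacheable static asset only costs the origin round-trip once, while an uncacheable dynamic page costs it on every request through the proxy:

  # static asset: optimized once, then served from Google's edge cache
  GET /css/site.css
  Cache-Control: public, max-age=31536000

  # dynamic page: can't be cached, so every request still hits your origin
  GET /
  Cache-Control: private, no-cache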


The cache-and-optimize-offline strategy that Google Page Speed is using is fraught with problems for dynamic sites. We tried this at first with some CloudFlare optimizations and it was a disaster. We rewrote the system from scratch to do on-the-fly optimization so it works even with dynamic sites. Here's a blog post about it from some time ago:

http://blog.cloudflare.com/an-all-new-and-improved-autominif...


Many business sites still consist of mainly static content.


If anyone on HN thinks their site might be sub-optimal on the front-end, contact me on whyslow@zofrex.com and I'll take a look, and let you know what you can do to speed things up :)


If anyone could give me feedback on why the above is being downvoted, I'd appreciate it. Ta!

Edit: I'd actually like feedback please... it went down to 0, then up to 1, then down to 0 again.. and then back up to 2 after I posted this reply. I wasn't asking for upvotes, I want to know what it is that is making some people downvote this. Is it not in keeping with the etiquette here, is there some unspoken rule I'm violating?


It could be interpreted as spam. People often think that "I want to help you" is "I want you to pay me" in disguise :)


People should realize that by using this service, all sensitive data posted to your website will be readable by Google.


Can end-users somehow opt-out of the pages they are viewing passing through Google?

It should be technically possible but I doubt they will offer that.


It is not technically possible because to use the service, a web site has to change their hosting provider to Google. There are no longer any servers outside Google hosting the content, if I understand the bit about DNS change correctly.


I believe you still run your own servers and Google will fetch your content from them and distribute it through its CDN.


Why would you want to?


Some users are uncomfortable with the idea that even if they avoid Google entirely by not using their search, OS, email, or maps, Google can still have a profile of them and of which sites they visit.

If you don't like the company and want to avoid them, you will now have to choose to not visit websites served through their DNS. That's much more difficult than just using a different email provider.


If one wants to avoid Google, they also need to make sure anyone they e-mail isn't using GAFYD. How many anti-Google folks do that?


When Eric Schmidt allows me to search through his wallet and home, he can collect the web pages I browse and the cookies, sessions and httpauth data attached to them.


How about you just don't go to websites that use this? That's like me saying I hate Level 3 and I never want my web traffic to go through them, is there a way I can make sure my data packets never peer with a Level 3 connection?

You should worry more about your ISP than Google trying to make the web faster.


Tried their test. It was slower on my site...

http://blog.craigrcannon.com/post/8170801499/my-site-tested-...


Absolutely great. What are people waiting for? It is a free service that helps improve your site. I ran the Google Page Speed test plus WebPagetest and received a list of errors in my project http://www.webdesign-angebote.com. Especially images and JS caused problems.

And the suggestions are okay, really helpful: {background-color: expression(this.runtimeStyle.backgroundColor = 'none')} and "Compress Images", but that was my fault. I applied them immediately and hope for improved user and ranking results.


I really don't see the point of passing it through them. Instead I'd rather they just tell me how to optimize it so I can apply the improvements myself.


They offer that as well. http://pagespeed.googlelabs.com/


Page Speed is on the soon-to-be-closed Google Labs (http://googleblog.blogspot.com/2011/07/more-wood-behind-fewe...).


They have already announced that Page Speed will not be closed and will be moved off labs.


Good stuff, but I'm not sure if optimizing a website consisting of static pages this way really makes sense. If the website was done by an idiot then yes, but most users probably won't even notice any change in speed if the website was developed correctly in the first place.

Also I probably wouldn't trust some other company enough to proxy my stuff to my users for the price of a slight optimization.


Anyone notice all the testing locations are AWS Regions?

I was assuming WebPagetest was a Google property and figured they would use their own data centers.


Testing from within their own datacenters would be biased _very_ heavily toward their own service, don't you think? These tests are only meaningful if performed on an external ISP. Note also that if you do the test it says "US East (Virginia) - IE 8 - DSL", suggesting that it's on a regular DSL connection, not an EC2 box.


WebPageTest originally came out of AOL as an open source project. Patrick Meenan was the lead on that. He recently took a job at Google, but WebPageTest is still open source and has contributions from lots of companies who care about performance. For example, my company Torbit sponsors a WebPageTest instance in Ireland.


AWS regions and the locations Google uses are common to many service providers: these are typical peering locations. Google does have the benefit of having caches local to various ISPs.


That doesn't prove much though, does it? "Northern Virginia" is a pretty broad region.


Weird that their default test browser is IE8, too. Hello, Chrome?


A dead link by Canopy below says:

"WebPagetest is not run by Google."

and points to their about page. Apparently, he's right.


I tried it on www.google.com! Their optimized version is slower than their normal version :) Check it out for yourself: http://www.webpagetest.org/result/110728_TR_d83c788b1387dab0...


+1 useful. On a related note, does anyone know of something similar that can diagnose and optimize a browser's connection to the internet (the other side of end-to-end performance)?


Does this work with dynamic web pages, like Facebook or news.ycombinator.com?


I haven't checked, but I assume it works fine if you've got all your Cache and Vary headers set properly.
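
For example (values are illustrative), headers along these lines tell an intermediary cache what it may store, for how long, and which request headers to key the cached copy on:

  Cache-Control: public, max-age=3600
  Vary: Accept-Encoding, Cookie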


Just out of curiosity I checked one of my web-based applications:

- original: load time 5 seconds *

- optimized: infinity...

I think they have to optimize their optimizer...

* : all JS, CSS and image files total about 900KB, and the connection is 1Mbit ADSL.


Just another example of Google trying to improve the world.

I would personally be a bit worried about the level of control this would mean giving up. But I just like to tinker. I know plenty of people for whom this is perfect.


Seems hard to believe their motivation for seeing a ton of consumer web traffic on third party sites is to improve the world. I mean, obviously google analytics isn't some altruistic attempt to help people track their web usage better.


Deprecated in 5, 4, 3, ...



