Hacker News
How Google Pagespeed works (calibreapp.com)
179 points by benschwarz on July 3, 2019 | hide | past | favorite | 45 comments



> In July, the SEO ranking algorithm was updated to include page speed as a ranking factor for both mobile pages and ads.

What's one supposed to do when PageSpeed only points to Google's own asset delivery services as being "the" big problems which cause a lower score?

- "serve images in next gen formats" - complains about images served from tpc.googlesyndication.com

- "efficiently encode images" - same as above

- "reduce javascript execution time" - cdn.ampproject.org; securepubads.g.doubleclick.net; www.googletagservices.com

.. ?

I can easily get rid of those issues, "simply" by not showing any ads. The site will then be really fast, and... I'll get the ability to display ads again?!

Shame the ad-related revenue will then be 0, which unfortunately isn't a good thing for the site.

Why is Google themselves not doing things "right" in order to ensure a speedy execution?

A while back, one of the reasons for a low score was that the GPT.js code (required to show ads) had a short expiration time. Dammit. How am I supposed to fix that, Google?


The left hand does not always know what the right hand is doing in large corps. I'm assuming the SEO team is not in daily contact with the ad delivery team(s). The SEO team is just reporting on how they see the page load, without concern for where the slowdowns occur. It is quite funny/sad/ironic that it is the same letters of the Alphabet that are the problem.

I had the same experience with Apple when the iTunes team were invalidating ProRes files created from Final Cut Pro.


> The left hand does not always know what the right hand is doing in large corps.

On the other hand, it's a good thing to not give a free pass for bad behavior in products you produce.


I wish people would remember this phenomenon when they are engaging in wild theories of intricate (and often malevolent) corporate strategy.

Such as: buying GitHub is unlikely to be Microsoft's attempt to trap OSS; Apple's early adoption of new ports probably isn't motivated by selling an extra $12 dongle for every $2000 Macbook; a journalist criticising Facebook is rather unlikely to be acting on corporate orders to sully the competition, etc.


I get an excellent PageSpeed score by hosting my own stuff on Azure. CDNs make sense only for sites with huge traffic. And at that point, maybe you could afford to pay for extra bandwidth and still not use a CDN. And I've mostly stopped using PageSpeed at all, because the metrics are obvious: total download size matters.

All the arguments for "but your user might already have such-and-such-version-of-jQuery cached" just have not had a good enough impact on total experience. There are so many versions of whatever you might be importing, and HTTP keep-alive works really, really well.

Of course, I also don't have any tracking setup, say nothing of the 50 dozen systems most sites use. I've never gotten anything out of them that I couldn't have figured out on my own. They always come back to "users bounce when shit is slow". Thanks, tracking scripts, for contributing to my bounce-rate.

So there you go, do you want to create a good user experience, or do you want to sell your users out for a pittance?

Have your first user experience in your site be super optimized. Static HTML. Host all assets on your own. Size images correctly. Minimize, then inline the CSS and JS. There are plenty of design-time scripts to automate it. And if you're making SPAs, you can implement all the dynamic loading you want past the initial page because users clicking to open your app are already engaged. But get that first page on-screen in less than a second on a 2G network.
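The "minimize, then inline the CSS" step above can be sketched as a tiny build-time helper. This is a hedged sketch, not the poster's actual setup: `naiveMinifyCss` and `inlineCss` are hypothetical names, and a real build would use a proper minifier (csso, esbuild, etc.) rather than regex surgery:

```javascript
// Build-time sketch: minify a stylesheet and inline it into static HTML
// so the first paint needs no extra stylesheet request.
function naiveMinifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')  // strip comments
    .replace(/\s+/g, ' ')              // collapse whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1') // trim around punctuation
    .trim();
}

function inlineCss(html, css) {
  // Replace the external stylesheet link with an inline <style> block.
  const style = `<style>${naiveMinifyCss(css)}</style>`;
  return html.replace(/<link[^>]*rel="stylesheet"[^>]*>/, style);
}

const html = '<html><head><link rel="stylesheet" href="main.css"></head><body></body></html>';
const css = `
  /* main styles */
  body { margin: 0;  color: #222; }
`;
console.log(inlineCss(html, css));
// → <html><head><style>body{margin:0;color:#222;}</style></head><body></body></html>
```

The same idea extends to critical JS; anything needed for first paint ships in the initial HTML response.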


Hi! Author of the post here.

Lighthouse definitely has some improvements to make for the audit suggestions that offer advice to improve the score. I think people get a little bit frustrated/blinded by the advice when there's nothing that can be done about third parties (I know I do!).

Our product Calibre (mentioned at the bottom of the post) has an ad-blocking/third-party blocking feature that allows you to compare your pages with or without third-party scripts gunking up performance metrics (here's the release post: https://calibreapp.com/blog/release-notes-apr-2019/).

We've seen really good success in separating the two because it sends a clear signal of what a site WOULD score without ads. This is good because it makes it clear to PMs, management and decision makers that tracking tools incur a really visible cost.


You realise that calibre is also the name of a very well known ebook management app that's been around forever? I thought this was a post from them, initially. I'd imagine your website would get a lot of bounces due to this.

https://calibre-ebook.com/


… but to answer your question better. It sucks that Google's own services don't live up to the performance expectations of tools made in other parts of Google.


I imagine the various teams at Google see each other as competition more than we might realize. How many chat-like applications are there? How about the Java/Kotlin Android teams versus the Flutter team? All of the GPL die-hards looking at the non-GPL (MIT, BSD, and Apache2) Fuchsia project wondering just how committed Google is to the Linux kernel.


Yes, different teams have different priorities, but there isn't direct competition between a performance-measuring tool and ad-serving.

In this case it's probably more the performance team wanting to do unbiased measurement. As they should.


If Google could gain a market advantage similar to the one they got from Linux (for free), they would do so immediately, no matter the "cost" to Linux/OSS/the-world-at-large.


So, I'm curious.

How does https://calibreapp.com/ compare to https://www.webpagetest.org/ and https://www.sitespeed.io/ and https://gtmetrix.com/ ?

Why would someone want to pay you for your service, as opposed to using any of the above? Do you support browsers other than Chrome? Do you support a wide variety of simulated hardware? Do you support a wide variety of testing locations? Do you support any page speed test profile other than Lighthouse and Google Page Speed? Do you support running your software in a Docker container of my own choice?

Because I can get all those things via one or more of the competitors in this space, and I'm not seeing any compelling reason to even look at Calibre.

Please feel free to convince me that I'm wrong.


Depends on your use case - I recommend Calibre to some clients, SpeedCurve to others and self hosted WebPageTest or Sitespeed.io to others.

One of the key deciding factors is whether you want to run the service yourself (e.g. VMs, containers, in multiple regions) or make that someone else's problem and get on with running your business; most of my clients are in the latter camp.

Both Calibre and SpeedCurve have clean UIs that allow performance to be tracked over time and from multiple locations, with integration into other services via webhooks, APIs, etc.

GTMetrix is old and outdated IMV; it still relies heavily on YSlow rules, which are well past their sell-by date.

WebPageTest is my favourite (SpeedCurve uses it under the hood) but it has no ability to track over time; Sitespeed.io is really nice too, but is self-hosted.

With the exception of some real devices in Dulles, all of the products rely on emulation for mobile, and Safari is a real gap for them all too.


> "serve images in next gen formats" - complains about images served from tpc.googlesyndication.com

I knew PageSpeed was bunk when that recommendation came up on a web page I was working with. All of the "next gen" formats that PageSpeed recommended weren't supported by any mainline browsers — not even Chrome.

That may have changed by now, but my opinion of PageSpeed hasn't.
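For what it's worth, the usual workaround for the "next gen formats" audit on images you actually control (unlike the ad-served ones) is content negotiation: browsers that support WebP advertise it in the Accept request header. A minimal sketch with a hypothetical helper name; a real server would also set `Vary: Accept`:

```javascript
// Pick an image format based on what the browser says it accepts.
// Chrome sends "image/webp" in its Accept header; browsers that don't
// get the JPEG fallback.
function pickImageFormat(acceptHeader) {
  return /image\/webp/.test(acceptHeader || '') ? 'webp' : 'jpeg';
}

console.log(pickImageFormat('text/html,image/webp,*/*')); // → 'webp'
console.log(pickImageFormat('image/png,*/*'));            // → 'jpeg'
```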


Well I think the good news is that all of your competitors have to suffer the same problems, so you'll all be punished equally.


Find a real business model instead of serving cancer & malware to your visitors?


It's hard to take this blog post seriously when there are not one but THREE open GitHub issues which pose a serious question about the credibility of this whole PageSpeed circus. It's just fundamentally wrong to claim that the PageSpeed score is based on the Lighthouse speed score when hundreds of people are reporting otherwise and even the Lighthouse team is suspiciously silent on this matter.


Hi, I'm the author :-)

I've left a significant level of detail about PageSpeed on the Lighthouse issue tracker over the last few months. It's really frustrating to see the Lighthouse team being blamed for something that isn't operationally even theirs… but that's a big company for you.

The biggest contributing factor to unstable PageSpeed scores is the power of the machines that run the tests. Last time I checked, the benchmark scores for PageSpeed were all over the place. … All the same, it's really important to understand how it all works. Hopefully the post had something for you?


> The speed of your site on mobile will affect your overall SEO ranking.

> If your pages load slowly, it will reduce your ad quality score, and ads will cost more.

We've deliberately excluded mobile devices from our ad campaigns. Do I understand correctly that even though we do not serve any mobile visitors at all, we're still judged by the page speed of the mobile page, not just the desktop one?


>we do not serve any mobile visitors at all //

How is that possible, if people know your domain name they'll type it in to their phone/tablet/watch/gamepad (or search from their shared history, or whatever). Do you reject connections that indicate they're from non-desktop devices??

I can see "we don't target mobile", but users don't often care what you're targeting. To my surprise, when I recently had to review mobile usage, I found that it's higher in general than desktop across the board, and my sites (small local websites and personal sites, mind you) followed that trend.


These are adwords landing pages. We're in super MVP-mode, trying to establish some contacts with potential customers, they leave behind an email address, we set up a call, etc.

I understand what you're saying, but we can literally see from Google Analytics that we have 0 mobile visitors. As would be expected.

It just seems weird to be judged by our mobile experience in this case, but it would be good to know if this were the case.


> we do not serve any mobile visitors at all

> > How is that possible

> These are adwords landing pages. We're in super MVP-mode

In your specific case, google pagespeed ranking is probably irrelevant.

If you truly have 0 mobile visitors, and these are just adwords landing pages, then people are probably not "Googling" your company, so search ranking is probably totally irrelevant at this point too.


As far as I understood from the article, it also impacts the adwords quality score. Or did I misunderstand this?


That's what I took from it yep.


Too bad Google's own page describing Pagespeed and Lighthouse fares abysmally on their own tests. (https://imgur.com/a/56FVOwK)

I'm sick and tired of Google playing the role of the global regulator of what's acceptable and what's not. Not everyone can afford their own data centers and CDNs to serve content. Not every single page can benefit from SSL.

And the web's not the only area they are regulating through monopoly: the number of hoops one has to jump through these days to run an e-mail server, just to have its e-mail accepted by the holy Google e-mail servers, is silly.


> Not every single page can benefit from SSL.

huh?


You don't need "SSL" (TLS, https, etc.) to serve static, uncontroversial content


That only covers the "encryption" part of SSL/TLS.

If there is any incentive to fake your content (e.g. for phishing or other attacks, or even simply spreading false information), you'll also want the "authenticity" part of TLS for preventing middlemen from arbitrarily changing content.


That's absolutely untrue. For example a man in the middle could intercept and modify the page.

https://www.troyhunt.com/heres-why-your-static-website-needs...


This was a good article, I concede the point.


With Let's Encrypt you really have no reason not to, though. And what's not controversial in one country may be in another.


http://objective.st gets a 100% score. No SSL, no CDN. Hosted on a single Digital Ocean droplet.

Mind you, also virtually no traffic and very little and lightweight content, but still...


Nice article, but is it just me who thinks it's weird that Google ranks pages higher when they're speedier in Google's own browser?

Antitrust here?


This is curious; I'd like to see some study on this point. Do you think the ranking would vary a lot depending on which browser was used to measure the speed-related signal? Any anecdotal data about sites that are super slow on Chrome but blazing on other browsers?


Just that the ranking criterion is how fast it is in Chrome. Some of the micro-benchmarks, like Time To Interactive, look really browser-dependent.


It’s just one of many factors so probably not.


Reading the definition for TTI is also helpful in narrowing down your page speed problems: https://github.com/WICG/time-to-interactive#definition
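The linked definition boils down to a search: starting at First Contentful Paint, look for a 5-second window containing no long tasks; TTI is then the end of the last long task before that window (or FCP itself if there were none). A simplified sketch of that search, with network quiescence omitted and function/parameter names my own:

```javascript
// Simplified TTI search per the WICG definition: scan forward from FCP
// for a 5-second quiet window with no long tasks; TTI is the end of the
// last long task seen before that window.
function timeToInteractive(fcp, longTasks, quietWindow = 5000) {
  // longTasks: array of { start, end } in ms, sorted by start time.
  let candidate = fcp;
  for (const task of longTasks) {
    if (task.start - candidate >= quietWindow) break; // quiet window found
    if (task.end > candidate) candidate = task.end;   // push TTI past this task
  }
  return candidate;
}

// Example: FCP at 1000ms, long tasks at 1500-1700ms and 2000-2400ms.
console.log(timeToInteractive(1000, [
  { start: 1500, end: 1700 },
  { start: 2000, end: 2400 },
])); // → 2400
```

This makes it clear why late-loading third-party scripts hurt TTI so much: any long task they cause restarts the quiet-window search.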

Also, this seems like the right time for me to brag about the impact we were able to have on our site's responsiveness. I know a lot of people will think a lot of this is BS being pushed by Google, and I agree in some cases, but overall our site is incredibly fast now (perceptibly, significantly faster), and that's thanks to Lighthouse and its suggestions.

This is without HTML caching, so still some room for improvement :) https://i.imgur.com/d7KCTmj.png

This is a full Vue SPA with chunking enabled.

Edit: oops, I was behind a VPN when I ran the test which impacted my ping. All metrics are down to 0.3s max :)


It’s astounding to me that web development is in such a state that it takes Google dinging search placement for people to start thinking that speed is important.

Speed is a feature. Speed is your most important feature. It’s a gazillion times more important than all the other cool features that developers and product managers think are important.


But react is all I know how to write...!


React doesn't make your site slow.


> In July, the SEO ranking algorithm was updated to include page speed as a ranking factor for both mobile pages and ads.

July 2018, according to both linked sources.


If the weight of "Estimated Input Latency" is zero, why is it even mentioned?


Because it's in the code and it's factually correct. It felt weird to not mention it!


Hi, do you have any figures for this subject?

> If your pages load slowly, it will reduce your ad quality score, and ads will cost more.

Thanks.


You can check for the page speed user agent and do some optimizations to satisfy it.
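For context, Lighthouse identifies itself in its User-Agent string (it contains "Chrome-Lighthouse"), so the check is a one-liner. A sketch with a hypothetical function name; whether serving the auditor a special page is wise is another matter, since you'd be optimizing the score rather than the site:

```javascript
// Detect a Lighthouse/PageSpeed audit run by its User-Agent marker.
function isLighthouseRequest(userAgent) {
  return /Chrome-Lighthouse/.test(userAgent || '');
}

console.log(isLighthouseRequest(
  'Mozilla/5.0 ... Chrome/74.0.3694.0 ... Chrome-Lighthouse'
)); // → true
console.log(isLighthouseRequest('Mozilla/5.0 (Windows NT 10.0)')); // → false
```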



