wubsitesgood's comments

Good interview question, fun experiment.


How to make responsive HTML video remain responsive after page load, using web components.
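
Roughly, the idea looks like this (a minimal sketch of one way to do it, not the exact component from the post; the element name and details are illustrative): a custom element wraps the <video>, watches its <source media="..."> queries with matchMedia, and reloads the video when a different query starts matching so the browser re-runs source selection.

    // Minimal sketch, not the component from the post: re-run <video> source
    // selection whenever one of its <source media="..."> queries changes.
    class ResponsiveVideo extends HTMLElement {
      connectedCallback(): void {
        const video = this.querySelector("video");
        if (!video) return;
        this.querySelectorAll("source[media]").forEach((source) => {
          const query = window.matchMedia(source.getAttribute("media") ?? "");
          query.addEventListener("change", () => {
            const wasPlaying = !video.paused;
            const time = video.currentTime;
            video.load();             // browser re-runs <source> selection
            video.currentTime = time; // keep the playback position
            if (wasPlaying) void video.play();
          });
        });
      }
    }

    customElements.define("responsive-video", ResponsiveVideo);

You'd use it by wrapping the usual <video> markup (with media attributes on its <source> elements) in <responsive-video>.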


Responsive video is a web standard again, and with patches in Firefox and Chromium, cross-browser support lands this week. This article talks about how to use media queries to deliver HTML video responsibly.
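
For anyone who hasn't seen the pattern, the relevant bit is the media attribute on <source> inside <video>. A rough sketch via DOM APIs (the file names and breakpoints here are made up):

    // Made-up file names and breakpoints; the browser picks the first
    // <source> whose media query matches when it loads the <video>.
    const video = document.createElement("video");
    video.controls = true;

    const small = document.createElement("source");
    small.src = "video-small.mp4";       // narrow viewports
    small.media = "(max-width: 599px)";

    const large = document.createElement("source");
    large.src = "video-large.mp4";       // no media attribute: the fallback

    video.append(small, large);
    document.body.append(video);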


Different moments in the user experience need to be measured differently. There are too few resources on testing the perceived performance of navigation after the initial page load. Let's fix that.
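
For example (a sketch, not from any particular article; navigate() is a hypothetical stand-in for whatever the app's router does), User Timing marks around a client-side navigation at least make the "tap to content painted" duration show up in the browser's performance tooling:

    // Sketch: measure a soft navigation from tap to (roughly) the next paint.
    declare function navigate(url: string): Promise<void>; // hypothetical router call

    async function timedNavigate(url: string): Promise<void> {
      performance.mark("nav-start");

      await navigate(url); // swap the new route's content into the DOM

      // Double requestAnimationFrame as a rough "after the next paint" signal.
      await new Promise<void>((resolve) =>
        requestAnimationFrame(() => requestAnimationFrame(() => resolve()))
      );

      performance.mark("nav-end");
      performance.measure("soft-navigation", "nav-start", "nav-end");
    }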


Notably, the examples in the OP article are slower by many thousands of ms, not just hundreds.

Either way, the post isn't saying you have to abandon app-like experiences. It's only about improving initial HTML delivery regardless of what you do after that.


For what it's worth, Twitter has gone back and forth on this. They did serve useful HTML up-front in the recent past. It was faster for initial delivery and I'd imagine helped the odd person who was sent a link to a tweet and hadn't loaded the site in a while.

Other sites in the article, like CNN and FedEx, fit a traditional website model where folks commonly land on pages from search results, but they're still very JS-dependent for content.


As the article says, the first example in the post does include the time it takes to go out and fetch the static HTML that swaps in, and that added about a second to the server response in the experiment run that doesn't show up in the control run. For a big distributed site, a second may be more time than it would really take to put together a dynamic response.

Even with that 1-second additional delay included, though, the improvement in that first test is still large (over 8 seconds faster in those tests). If the experiment took a few seconds longer on the server, it would still be 5 or 6 seconds faster to render content than the control.


I agree that server-rendered would be faster; it's just that the article presented the absolute best-case scenario for the server. That said, there could be other tradeoffs at play. Maybe loading the JavaScript and then requesting a small amount of JSON per page means that loading the initial page and scrolling 10 pages down is faster than having the server render each page and append it to the end.
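
Concretely, the client-side version of that tradeoff usually looks something like this (the endpoint and field names are made up):

    // Sketch of the "small JSON per page" approach: when a sentinel near the
    // bottom of the list scrolls into view, fetch the next page and append it.
    const list = document.querySelector("#timeline")!;
    const sentinel = document.querySelector("#load-more-sentinel")!;
    let nextPage = 2;

    const observer = new IntersectionObserver(async (entries) => {
      if (!entries.some((entry) => entry.isIntersecting)) return;

      // Hypothetical paginated endpoint returning a small JSON payload.
      const response = await fetch(`/api/items?page=${nextPage}`);
      const items: { html: string }[] = await response.json();

      for (const item of items) {
        const li = document.createElement("li");
        li.innerHTML = item.html;
        list.append(li);
      }
      nextPage += 1;
    });

    observer.observe(sentinel);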


Some of the tests in the article are run on Desktop Chrome using a "cable" connection speed instead of 4G, which looks to have about a 6x faster round-trip time than their 4G setting does. Those results are a little less impactful but still significant (still many seconds faster in some metrics).

Testing in more environments would show bigger or smaller differences depending on conditions, as you'd expect.

In ideal browsing conditions the impact will be smaller, and on the spotty, barely-one-bar connection you mention, the difference would be much more dramatic than in the 4G examples in the post.

