
Something I have had at the back of my mind for a long time: in 2017, what's the correct way to present optional resources that will improve the experience of users on fast/uncapped connections, but that user agents on slow/capped connections can safely ignore? Like hi-res hero images, or video backgrounds, etc.

Every time a similar question is posed on HN, someone says "If the assets aren't needed, don't serve them in the first place", but this is i) unrealistic, and ii) ignores the fact that while the typical HN user may like sparsely designed, text-orientated pages with few images, this is not at all true of users in different demographics. And in those demos, it's often not acceptable to degrade the experience of users on fast connections to accommodate users on slow connections.

So -- if I write a web page, and I want to include a large asset, but I want to indicate to user agents on slow/capped connections that they don't _need_ to download it, what approach should I take?




This seems like the kind of thing we'd want cooperation from the browser vendors on, rather than everyone hacking together some JS to make it happen. If browsers could expose the available bandwidth as a media query, it would be trivial to have different resources for different connections.

This would also handle the situation where the available bandwidth isn't indicative of whether the user wants the high-bandwidth experience. For example, if you're on a non-unlimited mobile plan, it doesn't take that long to load a 10MB image over 4G, but those 10MB chunks add up to overage charges pretty quickly, so the user may want to set his browser to report a lower bandwidth amount.
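
For illustration, if such a media feature existed it might be used roughly like this (the feature name and asset path below are made up; an unrecognised media query simply never matches, so it degrades safely):

    // Hypothetical sketch: gate heavy assets on a (made-up) bandwidth media feature.
    // window.matchMedia with an unknown query just returns matches === false.
    var fastConnection = window.matchMedia('(min-bandwidth: 5mbps)').matches;

    if (fastConnection) {
      var hero = document.querySelector('.hero');
      if (hero) {
        hero.style.backgroundImage = 'url(/img/hero-highres.jpg)'; // placeholder path
      }
    }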


Here in Greece, the internet is plenty fast (in bandwidth), but everything is far away, so there's lots of latency in opening every page. Going to the US on a trip, it's striking how much faster every website loads, there's no 300ms pause at the start anywhere.

Because I got frustrated at fat websites downloading megabytes of useless things, I decided to start an informational site about this very thing:

http://www.lightentheweb.com/

It's not ready yet, but I'm adding links to smaller alternatives to popular frameworks, links to articles about making various parts of a website (frontend and backend) faster, and will possibly add articles by people (and me) directly on the site. If anyone has any suggestions, please open an issue or MR (the site is open source):

https://gitlab.com/stavros/lighten-the-web/issues


Very interesting.

I would suggest swapping the current structural aesthetic of "come in and look around" for the somewhat more widespread approach of having one or more calls to action and making the homepage fully sketch out the points you want to make.

FWIW, I say this out of pragmatism. I don't mind the "welcome! browse!" approach myself, but it won't appeal to the demographic you're trying to reach: people who themselves are being paid to eat/sleep/dream modern web design.

Another thing I would recommend is using every single trick in the book to make the site fast. For example you could get ServiceWorkers caching everything for future visits (with maybe AppCache on top just because) and use the HTML5 history API so you can preload all the site text (say, in an XHR that fires after page load) and use that to make it feel like navigation is superhumanly fast.
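
A bare-bones cache-first service worker along those lines might look something like this (just a sketch; the asset paths are placeholders, and a real one needs cache versioning and cleanup):

    // sw.js - minimal cache-first service worker (sketch; paths are placeholders)
    var CACHE = 'site-v1';
    var ASSETS = ['/', '/css/site.css', '/js/site.js'];

    self.addEventListener('install', function (event) {
      event.waitUntil(caches.open(CACHE).then(function (cache) {
        return cache.addAll(ASSETS);
      }));
    });

    self.addEventListener('fetch', function (event) {
      event.respondWith(
        caches.match(event.request).then(function (cached) {
          return cached || fetch(event.request);
        })
      );
    });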

TL;DR, use this as your playground to learn how to make sites load better. Voila, the site will be stupidly fast, and it will self-describe too, which is kind of cool. And you'll wind up with a bunch of knowledge you could use for consulting... and then you could use the site as the home base for that, which would be even cooler.

(I realize you just started this project, and that the above suggestions are in the "Rome wasn't built in a day" category)


It's funny that you mention that, because I just wanted to have a site I could optimize to hell, and it seemed apt to make an informational site about optimization for that. AppCache is obsolete and harmful now (yes, already), and I should link to the articles that talk about that; thanks for reminding me.

As for the "come browse" approach, you're definitely right, and I don't intend the finished site to look like this, but I'm also not sure how to structure the content. What do I send the user to first? Maybe I'll write a tutorial hitting all the bullet points with links, though (eg add caching to static media, bundle them, don't use heavy libraries, load js async if you can, etc etc).

Thank you very much for your feedback!


I've wanted to play around with some similar ideas for a while too, actually. I have a few loose high-level ideas - I know I want it to feel like an app, but I want to use plain JS; I want to leverage everything modern browsers can support, while remaining backward-compatible (!); I want to try odd things like using Lua inside nginx for everything (or even writing my own web server), or programmatically reorganizing my CSS and JS so runs of similar characters are grouped together and gzipping has the best effect. I also have a hazy idea of what I want the site to be about (not a content site, some sort of interactive thing) but I haven't resolved all the "but if you make it about X, it doesn't really nail all those bits about Y you wanted" stuff yet. Anyway.

Thanks for the note that AppCache is now out of the picture. I actually think I remember reading something vaguely about it being not the greatest, but I didn't know it was actively harmful. Do you mean in a security sense or it just being bad for performance?

I wasn't sure what to say about the content structure thing at first, but then I thought: take the pragmatic approach. Gather piles and piles and piles of actual content and dump it either on the site itself or your dev version. Notions about structure, presentation and content will likely occur in the process of accumulating (or writing) what's on the site.

As for what kind of content to put up, I would suggest focusing heavily on links to (and/or articles about) pragmatic, well-argued/well-reasoned arguments for lightening page load, and the various kinds of real-world metrics that are achieved when people make the investment to do that.

An obvious example: it's one thing to say "I recommend http://vanilla-js.com!", it's quite another to say "YouTube lightened their homepage weight from 1.5MB to 98KB and made it possible for brand new demographics in 3rd-world countries to experience video playback (http://blog.chriszacharias.com/page-weight-matters). Also, the reason the site feels so fast now when you click from one video to the next is that the platform only pulls in the new video URL, comments and description - the page itself never reloads."

Regarding where to start, I was thinking that a mashup/ripoff halfway between https://developers.google.com/web/fundamentals/ and MDN might be an interesting target to aim for. I'm definitely not saying to [re-]do that much work (although I wouldn't be protesting if someone did... some of those Fundamentals tutorials are horribly out of date now), I'm just saying, the way that info is presented could do with cleanup and you can always run rings around them in various ways (layout, design, navigational hierarchy) because of bureaucracy blah blah... but you could do worse than aiming for something that feels like those sites do. Except you'd be focusing on making everything as lightweight as possible, and you would of course make the site your own as time went by. Maybe what I'm trying to get at here is that nobody's done a full-stack (as in, "bigger picture") top-to-bottom "here's how to do everything lightweight, and here are a bunch of real resources" sort of site yet, and I'm suggesting the lightweight-focused version of Google Web Fundamentals... :/

On a related note, I've come across a few websites that are nothing more than a bunch of links and a tiny bit of text describing some technical/development issue or whatever. They almost feel like spam sites, except they talk about legitimate issues and are clearly written by a person.

I'm sure these people mean well, but the low-text high-link format (or the "I'm going to rewrite what's in this link in my own words" approach) doesn't work for blog sites (possibly because of WordPress's 10000-clicks-to-get-a-high-level-overview browsing model...tsk) and similar. I'm trawling for actual text when I'm on a site like that; if you give me a link, I'm not even on your website anymore.

You've probably seen sites like that too. (Note that I'm slightly griping here, I don't see your site as similar at all. I think I got a bit off track, I was trying to demonstrate the exact opposite of the direction I would suggest you go in. :P)

Also, I just thought of https://www.webpagetest.org and https://jsperf.com. Arguably microoptimization-focused, but I thought I'd mention them anyway.


> Do you mean in a security sense or it just being bad for performance?

It is bad for performance. I was going to link you to the article, but I figured I'd add it to the site :) http://www.lightentheweb.com/resources/

> Regarding where to start...

Ah, good idea, thank you. Yes, I'm not really in a position to rewrite content that exists (it would take too much time), but I would like to at least index it sensibly.

> nobody's done a full-stack (as in, "bigger picture") top-to-bottom "here's how to do everything lightweight

That's exactly what I'm aiming for, with clear steps and possibly a checklist (good idea!) on what to do.

> I've come across a few websites that are nothing more than a bunch of links

I think that's hard to avoid when making an informational site, but the links could possibly be embedded into article-style copy, making it not look as spammy. I'll keep that in mind, thank you.

> I was trying to demonstrate the exact opposite of the direction I would suggest you go in

Haha, yes, I know what you mean, and the links will be the "read more" material. I'd like to add some original content and non-time-sensitive guides to the fundamentals.

> Arguably microoptimization-focused, but I thought I'd mention them anyway.

Those are great, thank you!


Can I suggest you add caching to the css, javascript and logo?


I will, it's still early so I hadn't taken a look. To be honest, since this is hosted on Netlify I was kind of assuming they'd be sending caching headers for all static files, but I see that they aren't.

I'll look into it, thank you!

EDIT: Netlify would be setting everything properly if I hadn't turned that off... Luckily they have great support!


Whole thing's super fast now, loads in 55ms. I assume most of that is ping (I'm in Australia).


Fantastic, thanks!


Thanks for the links to lightweight CSS and JS libs; I actually need exactly that right now for a project.

Gitlab link is 500-ing, unfortunately.


Ah, oops :/ Seems to be okay now, let me know if there's something else you need! I'm looking for ideas on how to organize the docs/site at the moment.


>If browsers could expose the available bandwidth

I don't know why this seems like such an imposition, but I think I'd be uncomfortable with my browser exposing information about my actual network if it didn't have to. I have a feeling way more people would be using this to track me than to considerately send me less data.

That said, browser buy-in could be a huge help, if only to add a low-tech button saying, "request the low-fi version of everything if available." This would help mobile users too -- even if you have lots of bandwidth, maybe you want to conserve.


Indeed; as a user, I don't want the site to decide what quality to serve me based on probing my device. It'll only lead to the usual abuse. I want to specify that I want a "lightweight" version or a "full experience" version, and have the page deliver an appropriate one on demand.


I remember when websites used to have "[fast internet]" or "[slow internet]" buttons that you could use to choose if you wanted Flash or not. Even though I had a high-speed connection, I chose slow because the site would load faster.


It doesn't have to be your actual bandwidth. The values could be (1) high quality/bandwidth, (2) low quality/bandwidth, (3) average. The browser can determine that automatically with an option to set it if you want to (e.g. for mobile connections).

That should solve most problems without giving away too much information. But an extra button would probably just confuse people.


Progressive resources would help a lot here. We have progressive JPEGs and (I might be wrong) PNGs; you could set your UA to low-fi mode and it would only download the first scan of the JPEG.


I think if someone wants to track you, the bandwidth is not the first thing they'll be looking at.

It's just another signal, but there are already a few tens of them, so adding one more is not going to make a significant difference.


If you consider every identifying piece of information as a bit in a fingerprint, it makes more than a significant difference; it makes an exponential difference. Consider the difference between 7 bits (128 uniques) and 8 bits (256 uniques) and then 15 bits (32K uniques) and 16 bits (65K uniques). Every additional bit makes a difference when fingerprinting a browser.


This sounds just like the idea that the website should be able to know how much power you had left on your device, so it could serve a lighter version of the webpage.

I think the arguments against are pretty much the same.


You can get the network type or downlink speed in Firefox and Chrome using the NetworkInformation interface: https://developer.mozilla.org/en-US/docs/Web/API/NetworkInfo...

Then you can lazy load your assets depending on that condition.
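
For example, roughly (support varies, and the property names differ a bit between browsers, so feature-detect):

    // Sketch: only pull in heavy extras when the connection looks fast.
    var conn = navigator.connection || navigator.mozConnection || navigator.webkitConnection;

    // Treat a missing API as "no information"; skip the extras on 2g/slow-2g
    // effective types or when the user has turned on data saving.
    var okToLoadExtras = !conn || (!conn.saveData && !/2g/.test(conn.effectiveType || ''));

    if (okToLoadExtras) {
      var img = new Image();
      img.src = '/img/background-highres.jpg'; // placeholder for a heavy asset
      document.body.appendChild(img);
    }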


One hacky way to do it would be to load it via JavaScript. For example, see this Stack Overflow answer [0]. Obviously not a great solution, but it works if you're dying for something.

I bet people w/ slow connections are much more likely to disable javascript, though.

    let someArbitraryNumber = 2000; // ms; tune to taste
    let loadTime = window.performance.timing.domContentLoadedEventEnd - window.performance.timing.navigationStart;
    if (loadTime > someArbitraryNumber) {
        // Disable loading heavy things
    }
[0] http://stackoverflow.com/questions/14341156/calculating-page...


It's too late to disable loading heavy things at that point - the loading has already started.

Do the opposite, start loading heavy things if the page loaded quickly.

A clean way would be to set one of two classes, connection-slow or connection-fast, on the body element. Then you could use those classes in CSS to choose the correct assets for background images, fonts and so on.
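
Roughly like this, say (the 1s threshold is arbitrary and would need tuning):

    // Sketch: classify the connection by how long the initial load took, then
    // let CSS rules under .connection-fast / .connection-slow pick the assets.
    window.addEventListener('load', function () {
      var t = window.performance.timing;
      var loadTime = t.loadEventStart - t.navigationStart;
      document.body.classList.add(loadTime < 1000 ? 'connection-fast' : 'connection-slow');
    });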


Well, I meant to leave the heavy things out of the HTML you send over to the browser and then inject them only if they can be loaded.

So, yeah totally agree with you. Should have been clearer.


>start loading heavy things if the page loaded quickly

...and not loaded from cache. You need a way to determine this reliably. AFAIK there's no way to determine this for the main page itself.


There is a proposed API for that.

https://wicg.github.io/netinfo/

And like most such APIs, it has been kicked around for a long time and it has only been adopted by Chromium on Android, ChromeOS and iOS. It'd be great if it were more widely adopted...


Yay, more browser fingerprinting data points!


Well, to be fair, as the spec notes, you can already fingerprint on speed by timing how long an AJAX call takes.

Also, "on a shit home DSL connection" doesn't really distinguish me from millions of other UK residents.


Yes it does, when used in combination with the other data points.

That said, I always get "unique" on those fingerprinting tests. You can't be "extra unique," so I guess I don't mind it.


> Every time a similar question is posed on HN, someone says "If the assets aren't needed, don't serve them in the first place", but this is i) unrealistic, and ii) ignores the fact that while the typical HN user may like sparsely designed, text-orientated pages with few images, this is not at all true of users in different demographics. And in those demos, it's often not acceptable to degrade the experience of users on fast connections to accommodate users on slow connections.

This is prejudice. People use Craigslist, for example. If the thing is useful, people will use it. If there's a product being sold, and if it's useful to the potential clientele, they'll buy it. Without regard to the UI.

In the past ten years while my connection speed increased, the speed at which I can browse decreased. As my bandwidth increased, all the major websites madly inflated.

> So -- if I write a web page, and I want to include a large asset, but I want to indicate to user agents on slow/capped connections that they don't _need_ to download it, what approach should I take?

Put a link to it with (optionally) a thumbnail.


> People use Craigslist, for example. If the thing is useful, people will use it. If there's a product being sold, and if it's useful to the potential clientele, they'll buy it. Without regard to the UI.

Craigslist achieved critical mass in the 90s, so it's not a good example. Many useful products disappear because they can't attract enough users to become sustainable. A nice UI can affect users' credibility judgments and increase the chance that they'll stick around or buy things.

[1] http://dl.acm.org/citation.cfm?id=1315064


Random idea: Get the current time in a JS block in the head, before you load any CSS and JS, and compare it to the time when the dom ready event fires. If there's no real difference, load hi-res backgrounds and so on. If there is a real time difference, don't.


Wouldn't that be measuring latency more so than bandwidth? You'd run the danger of confusing a satellite internet connection (high(ish) bandwidth, high latency) with a third-world, low bandwidth connection.


Satellite ISPs have low data caps and/or charge a lot per GB of transfer. Avoiding unnecessary downloads seems like the correct behavior in this case.

I think the best solution would be an optional http header. That way, the server could choose to send a different initial response to low-bandwidth users. If connection speed is solely available via JavaScript API or media query, then only subsequent assets can be adapted for users on slow connections.
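
Chrome's Save-Data client hint works roughly like this already. Server-side handling could look something like the following (just a sketch, assuming a Node/Express app; the responses are placeholders):

    // Sketch (assumes Node/Express): branch the initial HTML on the Save-Data
    // request header that some browsers send when the user enables data saving.
    var express = require('express');
    var app = express();

    app.get('/', function (req, res) {
      var saveData = (req.get('Save-Data') || '').toLowerCase() === 'on';
      res.set('Vary', 'Save-Data'); // keep shared caches from mixing the two versions
      // Placeholder responses; a real app would render a lite vs. full template here.
      res.send(saveData ? '<p>lite page</p>' : '<p>full page with heavy assets</p>');
    });

    app.listen(3000);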


In the ideal future, FLIF [0] would become a standard, universally supported image and animation format. Almost any subset of a FLIF file is a valid, lower-resolution FLIF file. This would allow the browser - or the user - to determine how much data could be downloaded, and to display the best-quality images possible with that data. If more bandwidth or time became available, more of the image could be downloaded. The server would only have one asset per image. Nice and simple.

[0] http://flif.info/


We outsource this to CloudFlare and their Mirage service: https://support.cloudflare.com/hc/en-us/articles/200403554-W...


I think this is an important question.

Like another reply to your comment, I thought about having a very small JS script in the header putting `Date.now()` in a global, then on page load, having another script check the amount of time that had passed to see if it was worth downloading the "extra" at all. But then again, where do you put the threshold? Has anyone tried this with some degree of success?


Design your UX so that any large assets can be requested at will by the user, and indicate the file size? That way it's the user's choice if they want to load that large video over their slow network, etc.


Most users on fast connections are not going to enjoy explicitly clicking to download every background image, font, etc. For videos it might make more sense, but there are many more optional assets to deal with.


Background images and fonts are examples of things probably not needed at all. I already have fonts on my computer, I don't need yours.


I have to agree with the original comment here:

> Every time a similar question is posed on HN, someone says "If the assets aren't needed, don't serve them in the first place", but this is i) unrealistic, and ii) ignores the fact that while the typical HN user may like sparsely designed, text-orientated pages with few images, this is not at all true of users in different demographics.


I think you're underestimating how perceptibly slow the internet has become for a lot of people. They don't realise they're downloading 2MB of JavaScript; they don't realise what JavaScript or CSS are. They'll say things like "I think my computer's getting old" or "I think I have a virus". More often than not this is just because their favourite news site has become so slow, and they can't articulate it any better than that. All they want to do is read their text-oriented news sites with a few images.


I don't think there is an easy way to tell the browser not to download something because the connection is slow. Progressive enhancement can work well for giving users a basic page that loads quickly with minimal assets while also downloading heavier content in the background that renders later. That's still different than putting a timer on a request to download the content (which would require JS to detect the slow connection).

If you make a page well, it should render quickly under any network condition, slow or fast. As an example, you could try serving pictures quickly by providing a placeholder picture which uses lossy compression to be as small as possible. It could be base64 encoded so it's served immediately even over a slow connection. Then, after the page is rendered, a request could go out to download the 0.5MB image and a CSS transition could fade the image in over the placeholder. People on fast connections wouldn't notice a change because it would load right away, while people on a 40kbit/s 2G connection would be OK with your page too.

The requests to download larger content will still go out over a slow connection but the user won't suffer having to sit through seconds of rendering. Maybe similar to how people have done mobile-first responsive design, people could try doing slow-first design. Get everything out of the critical rendering path and progressively enhance the page later.
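
The placeholder swap could be as simple as something like this (a sketch; the data attribute and class name are made up, with a CSS transition on .loaded doing the fade):

    // Sketch: after load, fetch the full image named in a data attribute and
    // fade it in over the tiny inline placeholder once it has downloaded.
    window.addEventListener('load', function () {
      document.querySelectorAll('img[data-full-src]').forEach(function (img) {
        var full = new Image();
        full.onload = function () {
          img.src = full.src;
          img.classList.add('loaded'); // CSS transition on .loaded handles the fade
        };
        full.src = img.getAttribute('data-full-src');
      });
    });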


I think srcsets are a reasonable proxy for this. Serve extremely optimized images to mobile devices, and the full images to desktops.

It isn't perfect - you'll get some mobile devices on wifi that could have consumed the hero images and some desktop devices still on dial up, but it's still a lot better than doing nothing.
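
For what it's worth, the attribute values are the interesting part; a quick sketch with made-up filenames:

    // Sketch: the browser picks the smallest srcset candidate that satisfies the
    // layout width and device pixel ratio, so small screens skip the big file.
    var img = document.createElement('img');
    img.src = 'hero-480.jpg'; // fallback for browsers without srcset support
    img.srcset = 'hero-480.jpg 480w, hero-1080.jpg 1080w, hero-2400.jpg 2400w';
    img.sizes = '(max-width: 600px) 100vw, 50vw';
    img.alt = '';
    document.querySelector('.hero').appendChild(img); // assumes a .hero container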


Server-side logs and sessions. You should be able to tell users' devices/browsers and what bandwidth they get, then calculate the average speed they see. You could then create a tiered service where media quality and JS features can be adjusted, and periodically process the logs to make sure the grouping is still correct. As an additional feature, users could choose in their settings which tier they want to use.

On the client side, you can achieve some of this with heavy use of media queries: https://msdn.microsoft.com/en-us/library/windows/apps/hh4535... You can basically adjust the resolution of assets, or disable them, based on screen quality. This is under the assumption that someone with a retina screen will have decent internet.


There are two ways:

1. Serve the AMP version of your page (https://www.ampproject.org) which uses lazy loading and other optimizations

2. Use the Network Information API (https://developer.mozilla.org/en-US/docs/Web/API/NetworkInfo...) to load heavy assets only on high bandwidth connections.


Here's a proxy: buy a 1st-gen iPad, turn JS off, and then use it to browse the site.

If it crashes the device, you're way off.

If it's appreciably slow or clunky, find ways to improve it.

Iterate until it's fast and usable.


Which demographic likes large images that are not necessary for the task at hand?


Simple. Design your webpages to only load the functional lighter weight essential stuff by default. Then use javascript to add in all the large assets you want. Users with slow connections can browse with javascript turned off.


I like this solution.

Unfortunately there is little consistency in how well browsers accommodate JS-less browsing. Firefox removed the ability to switch off JS from its standard settings panel. Brave lets you switch JS on/off for each domain from the prominent lionface button panel.



