> Maybe we should be buying slower computers so we feel the pain. Facebook has been intentionally slowing down its office internet once a week to help build empathy with users in countries with third-world internet speeds (coughAustraliacough).
Yep. I've just got back from spending over a month on a ship in the middle of the Atlantic, with two very flaky 256 kbps satellite connections (each capped to around 125 kbps as well), shared between roughly 50 people. Our policy is to let anything through (it's a research ship, so being able to look things up on the internet is a must) but to aggressively apply QoS rules and caching to the network. With the sheer number of applications that assume a low-latency, high-bandwidth connection they can open hundreds of TCP connections over, just tweaking the firewall / bandwidth shaper / application control is a full-time job in itself.
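For a sense of what that tweaking looks like, here's a minimal sketch of the kind of prioritization involved, assuming a Linux gateway and the stock `tc` HTB shaper. The interface name, rates, and class layout are illustrative, not our actual config:

```python
# Sketch: carve one ~125 kbps uplink into priority classes with Linux HTB.
# Assumes a Linux gateway; run as root. Interface and rates are illustrative.
import subprocess

WAN = "eth0"  # assumed uplink interface

def tc(*args):
    subprocess.run(["tc", *args], check=True)

# Root HTB qdisc; unclassified traffic falls into class 1:30.
tc("qdisc", "add", "dev", WAN, "root", "handle", "1:", "htb", "default", "30")
# Parent class pinned to the real uplink capacity so we queue, not the satellite.
tc("class", "add", "dev", WAN, "parent", "1:", "classid", "1:1",
   "htb", "rate", "125kbit", "ceil", "125kbit")
# Interactive traffic gets a guaranteed slice and top priority.
tc("class", "add", "dev", WAN, "parent", "1:1", "classid", "1:10",
   "htb", "rate", "50kbit", "ceil", "125kbit", "prio", "1")
# Everything else shares what's left.
tc("class", "add", "dev", WAN, "parent", "1:1", "classid", "1:30",
   "htb", "rate", "30kbit", "ceil", "125kbit", "prio", "3")
# Steer DNS into the interactive class so name lookups stay snappy.
tc("filter", "add", "dev", WAN, "parent", "1:", "protocol", "ip",
   "u32", "match", "ip", "dport", "53", "0xffff", "flowid", "1:10")
```

The point of the parent class is that you shape to slightly under the real link rate, so the queueing (and the prioritization decisions) happen on your box rather than in the satellite modem's buffer.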
First observation, relevant to Slack: before we locked down the firewall, Slack running on two laptops was by far the highest-traffic application, even though the few Slack users just had the client open but minimized, not actually using it. It was persistently using 20 kbps, all day every day, until we blocked it outright. What the hell else does it have to do other than download text!? Slack, and other devs shipping Electron apps, ought to test them on extremely slow shared internet.
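To put that 20 kbps in perspective, a back-of-the-envelope calculation (assuming kbps means 1000 bits/s):

```python
# What a persistent 20 kbps idle stream costs on a link like ours.
idle_bps = 20_000      # Slack's observed idle traffic, bits/s
uplink_bps = 125_000   # one capped satellite uplink, bits/s

per_day_mb = idle_bps * 86_400 / 8 / 1e6
share = idle_bps / uplink_bps

print(f"{per_day_mb:.0f} MB/day")    # 216 MB/day, from an idle chat client
print(f"{share:.0%} of one uplink")  # 16% of one uplink, before anyone uses it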
Second comment, relevant to Facebook intentionally slowing down office internet: the mobile Facebook site (m.facebook.com) was by far the nicest to use on our severely degraded uplinks. No timeouts, pages took under 10 seconds to load, and we didn't have to stare at a blank page for several minutes while yet another F%$£ing web font downloaded (Arial, but with a slightly more aesthetically pleasing letter T). No massively parallel loading of resources. Good use of caching.
It struck me that a lot of our problems were problems because devs never have to deal with that kind of crappy internet; it's not an edge case that gets tested. The result is people unable to load important stuff like Google, Gmail, and Outlook Web App, and those products losing users. No matter how much we tweak our network internally, even restricting internet use to a single computer no longer results in usable web apps. This is probably something we share with third-world internet users, a much larger market than our "research ship" niche.
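Any dev who wants to feel this pain without a satellite can emulate it on a Linux test box with netem. A hedged sketch; the numbers only approximate our conditions, and the interface name is illustrative:

```python
# Sketch: emulate a flaky, high-latency satellite link with Linux netem.
# Requires root; interface name and numbers are illustrative.
import subprocess

DEV = "eth0"

subprocess.run([
    "tc", "qdisc", "add", "dev", DEV, "root", "netem",
    "delay", "600ms", "100ms",  # geostationary-ish latency, with jitter
    "loss", "1%",               # flaky link
    "rate", "125kbit",          # one capped uplink's worth of bandwidth
], check=True)

# Undo with:
#   tc qdisc del dev eth0 root
```

Load your own product through that for an afternoon and the font-blocking, timeout, and retry behaviour becomes very obvious very quickly.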
Finally: people who put company logos in their email signatures hate other people. Base64-encoded PNG images inflate a one-line email reply by a factor of 100 or more.
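The arithmetic checks out even for a modest logo (the 30 KB size here is an assumed, illustrative figure):

```python
# Base64 overhead vs. the actual content of a one-line reply.
import base64

reply = b"Sounds good, see you Monday.\n"  # ~29 bytes of actual content
logo = bytes(30_000)                       # stand-in for a 30 KB signature PNG
encoded = base64.b64encode(logo)           # base64 adds ~33% on top

print(len(reply))                   # 29 bytes
print(len(encoded))                 # 40,000 bytes riding along in every reply
print(len(encoded) // len(reply))  # inflation factor: ~1379x for this example
```

So "a factor of 100" is, if anything, generous: a one-liner with an embedded logo is easily three orders of magnitude bigger than its text.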